% -*- texinfo -*-
% @deftypefn {Function File} {[@var{p}, @var{s}] =} wpolyfit (@var{x}, @var{y}, @var{dy}, @var{n})
% Return the coefficients of a polynomial @var{p}(@var{x}) of degree
% @var{n} that minimizes
% @iftex
% @tex
% $$
% \sum_{i=1}^N (p(x_i) - y_i)^2
% $$
% @end tex
% @end iftex
% @ifinfo
% @code{sumsq (p(x(i)) - y(i))},
% @end ifinfo
% to best fit the data in the least squares sense. The standard errors
% on the observations @var{y}, if known, are given in @var{dy}.
%
% The returned value @var{p} contains the polynomial coefficients
% suitable for use with the function polyval. The structure @var{s} contains
% the information needed to compute the uncertainty in the model.
%
% To compute the predicted values of @var{y} with uncertainty, use
% @example
% [y,dy] = polyconf(p,x,s,'ci');
% @end example
% You can see the effects of different confidence and prediction
% intervals by calling the wpolyfit internal plot function with your fit:
% @example
% feval('wpolyfit:plt',x,y,dy,p,s,0.05,'pi')
% @end example
% Use @var{dy}=[] if the uncertainty is unknown.
%
% You can use a chi^2 test to reject the polynomial fit:
% @example
% p = 1-chisquare_cdf(s.normr^2,s.df);
% @end example
% p is the probability of seeing a chi^2 value higher than the one
% observed, assuming the data are normally distributed around the fit.
% If p < 0.01, you can reject the fit at the 1% level.
%
% You can use an F test to determine if a higher order polynomial
% improves the fit:
% @example
% [poly1,S1] = wpolyfit(x,y,dy,n);
% [poly2,S2] = wpolyfit(x,y,dy,n+1);
% F = (S1.normr^2 - S2.normr^2)/(S1.df-S2.df)/(S2.normr^2/S2.df);
% p = 1-f_cdf(F,S1.df-S2.df,S2.df);
% @end example
% p is the probability of observing the improvement in chi^2 obtained
% by adding the extra parameter to the fit. If p < 0.01, you can reject
% the lower order polynomial at the 1% level.
%
% You can estimate the uncertainty in the polynomial coefficients
% themselves using
% @example
% dp = sqrt(sumsq(inv(s.R'))'/s.df)*s.normr;
% @end example
% but the high degree of covariance amongst them makes this a questionable
% operation.
%
% @deftypefnx {Function File} {[@var{p}, @var{s}, @var{mu}] =} wpolyfit (...)
%
% If an additional output @code{mu = [mean(x),std(x)]} is requested, then
% the @var{x} values are centered and normalized prior to computing the fit.
% This gives more stable numerical results. To compute a predicted
% @var{y} from the returned model, use
% @code{y = polyval(p, (x-mu(1))/mu(2))}.
%
% @deftypefnx {Function File} wpolyfit (...)
%
% If no output arguments are requested, then wpolyfit plots the data,
% the fitted line, and the polynomials defining the standard error range.
%
% Example
% @example
% x = linspace(0,4,20);
% dy = (1+rand(size(x)))/2;
% y = polyval([2,3,1],x) + dy.*randn(size(x));
% wpolyfit(x,y,dy,2);
% @end example
%
% @deftypefnx {Function File} wpolyfit (..., 'origin')
%
% If 'origin' is specified, then the fitted polynomial will go through
% the origin. This is generally ill-advised. Use with caution.
%
% Hocking, R. R. (2003). Methods and Applications of Linear Models.
% New Jersey: John Wiley and Sons, Inc.
%
% @end deftypefn
%
% @seealso{polyfit,polyconf}
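%
% A minimal end-to-end sketch, assuming the three-output and 'origin' forms
% described above, using the same synthetic data as the plotting example:
% @example
% x = linspace(0,4,20);
% dy = (1+rand(size(x)))/2;
% y = polyval([2,3,1],x) + dy.*randn(size(x));
% [p,s,mu] = wpolyfit(x,y,dy,2);     % centered/normalized fit
% xn = (x-mu(1))/mu(2);              % rescale x before predicting
% [yf,ci] = polyconf(p,xn,s,'ci');   % predicted y with confidence interval
% p0 = wpolyfit(x,y,dy,2,'origin');  % force the fit through the origin
% @end example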