Regression Analysis and Curve Fitting

Posted By: AlexGolova

Regression Analysis and Curve Fitting by John Homan, Betsy Homan, Dr. Sidney Homan
English | May 14, 2016 | ASIN: B01FPVD1PE | 444 pages | AZW3 | 2.53 MB

It is hard to describe the excitement that I feel about mathematics. It is like taking a long journey and suddenly seeing a huge mountain peak come into view. You catch your breath and take in the majestic beauty of the discovery, but it is more than that: you feel a humbleness in its presence. There is a moment of revelation that you have not yet reached the summit to discover the mystery it holds. For a moment, you imagine yourself at the top; it is yours to conquer now.
Curve fitting is the process of constructing a curve or mathematical function that best fits a series of data points. The defining input points can represent something as simple as comparing the heights of individuals with the heights of their parents, as done by Francis Galton in nineteenth-century England, or as complicated as predicting when an asteroid will hit Earth, destroying life as we know it. The subject of regression analysis should not be taken lightly, since it may well provide answers to many of mankind's problems.
Curve fitting can involve either interpolation, where an exact fit to the data is required (as in tracking an asteroid), or smoothing, where only an approximate fit is required (as in Galton's research).
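To make that distinction concrete, here is a minimal sketch (not taken from the book) using NumPy with made-up data: a degree-4 polynomial through five points is an exact interpolant, while a straight-line fit only smooths them.

# Hypothetical data; illustrates exact interpolation vs. approximate smoothing.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.7, 5.8, 6.1, 8.9])   # noisy observations

# Interpolation: a degree-4 polynomial passes exactly through all 5 points.
exact = np.polyfit(x, y, deg=len(x) - 1)

# Smoothing: a straight line only approximates the points,
# minimizing the sum of squared residuals.
smooth = np.polyfit(x, y, deg=1)

print(np.polyval(exact, x) - y)   # residuals essentially zero (exact fit)
print(np.polyval(smooth, x) - y)  # nonzero residuals (approximate fit)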
Regression analysis and curve fitting (regression) are terms that are largely similar but have some differences. Regression analysis addresses questions of statistical inference, such as how much uncertainty is present in a curve fitted to data observed with random error, while the equations themselves define the actual regression. In the asteroid example, the regression path is important, but even more important is the accuracy of the calculations (the regression analysis): any error in the input data (distance, time, or velocity) could cost the lives of millions of people.
It is obvious that regression analysis must be applied correctly and with caution; it is not just a matter of plugging data into an equation and expecting valid results. In Chapter 3, an example is given of an experiment performed by chemists from the Riverside Cement Company to determine the proper mixture of ingredients in making cement, modeled with a multiple regression equation. In the final chapter (Concluding Remarks), a medical study is cited in which the movement of the wrist is modeled by regression equations formed using an equiangular spiral.
This book has been written to include actual applications of regression analysis in the major fields of society, both social and scientific.
Angle and curvature constraints are also considered in Chapter 5, which deals with spline regression. "Splines" are piecewise polynomials that bridge interruptions (jumps) in curves. Spline fits require the use of derivatives (from calculus) to handle slope conditions (first derivative) at the end points as well as curvature conditions (second derivative) at the places where the curve pieces join.
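As an illustration of those continuity conditions (this is a sketch with invented data, not an example from the book), SciPy's CubicSpline joins cubic pieces with matching first and second derivatives at the knots; "natural" boundary conditions pin the second derivative to zero at the two ends.

# Cubic spline through made-up data points, showing slope and curvature.
import numpy as np
from scipy.interpolate import CubicSpline

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, 0.8, 0.9, 0.1, -0.8])

# 'natural' boundary conditions: second derivative is zero at both ends.
spline = CubicSpline(x, y, bc_type='natural')

xs = np.linspace(0, 4, 9)
print(spline(xs))       # fitted values along the curve
print(spline(xs, 1))    # first derivative (slope)
print(spline(xs, 2))    # second derivative (curvature)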
All regression equations presented in the book are developed by the method of least squares. This algebraic approach, first developed by Carl Friedrich Gauss around 1800, is a standard throughout the mathematical community. The least-squares approach also incorporates concepts of probability found in the normal distribution (bell curve). It starts from a data set consisting of pairs (xi, yi), i = 1, 2, 3, …, N, where N is the number of points, xi is the independent variable, and yi is the dependent variable. The fitting process minimizes the sum of squared residuals, a residual being the difference between the observed value yi and the fitted value provided by the chosen model (linear, quadratic, cubic, etc.). Least-squares problems fall into two categories: (1) linear and (2) non-linear. Both categories are explained in the text (Chapters 3 to 8) by using differentials to set up the defining equations.
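For the linear case, a minimal sketch of this idea (the data pairs below are invented for illustration) is to build a design matrix for a straight-line model and solve the normal equations, which minimizes the sum of squared residuals directly.

# Ordinary least-squares line fit via the normal equations (A^T A) b = A^T y.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Design matrix for the model y = a + b*x.
A = np.column_stack([np.ones_like(x), x])

# Solve the normal equations for the intercept a and slope b.
beta = np.linalg.solve(A.T @ A, A.T @ y)

residuals = y - A @ beta
print("intercept, slope:", beta)
print("sum of squared residuals:", np.sum(residuals**2))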