[This is a really old page, a bit of what I was doing in 1995. For references to it from Synapse9 see: Reconstructing the flow of individual events or Dynamic Synchrony between Economic Measures]

**REFERENCE TO DR ANALYTICAL SOFTWARE**

*Intro & Correspondence: Philip F. Henshaw, id at synapse9.com*

Programming Notes/Command List

- Integral - INT plots Sum(dy*dt) or Sum(dy*Y), the area under the curve or its growth, with the first point placed at the average of the second and third, or at a user-picked or entered starting value. The auto-scaling option sets the peak at 5/4 of that of the source.
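As a rough sketch of what an INT-style command computes (the function name, argument names, and the exact handling of the start value are illustrative assumptions; only the running-sum idea, the average-of-second-and-third default, and the 5/4 auto-scaling are taken from the note above):

```python
def int_curve(y, dt=1.0, start=None, autoscale=False):
    # Running integral of the curve: cumulative Sum(y[i]*dt).
    # If no start value is given, place the first point at the
    # average of the second and third source values (per the note above).
    total = start if start is not None else (y[1] + y[2]) / 2.0
    out = []
    for v in y:
        total += v * dt
        out.append(total)
    if autoscale:
        # Auto-scaling option: set the integral's peak at 5/4 of the source peak.
        scale = 1.25 * max(abs(v) for v in y) / max(abs(v) for v in out)
        out = [v * scale for v in out]
    return out
```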

RELATED WORK IN ECONOMICS

__Econometric Time-series Modeling__

The conventional starting point for direct mathematical modeling of a time-series in economics uses simple polynomial equations adjusted to fit the data by least squares regression, combined with a statistical component. The relative lack of success in making predictions or defining a theoretical basis for this approach, and the current absence of anything better, is responded to in three basic ways. There are simple mathematical models accepted because they are easily made and understood, complex and mathematically sophisticated models that offer some improvement in results but add a burden of complexity, and mixed methods that pick up whatever seems useful. Interesting current discussion of each of these general approaches is found in the papers by Tiao & Tsay (1994), Young (1994), Zellner (1994) and Peña (1995) in the Journal of Forecasting and elsewhere.

The most common of the practical methods is the ARIMA model, standing for Auto-Regressive Integrated Moving Average (see Chatfield 1975; Harvey 1981, 1993). Following this method, known trends are first removed from the data, then the successive differences between points are taken (giving derivative curves) until the result seems completely random. Then a straight line is fitted to the random data, reintegrated and combined with a statistical variance to make a stochastic function representing the behavior of the system. Peña (1995) notes that one reason twice-differenced and reintegrated models have better short term predictive results is that the procedure gives the greatest weight to the most recent data point, where fewer integrations have the effect of putting greater weight on prior data points at a greater distance. Zellner (1994) takes a somewhat arbitrary but practical approach, taking a less than perfect ARIMA and mixing in factors from the index of leading indicators. This remains simple enough to understand and is significantly better at making predictions. Tiao and Tsay (1994) demonstrate that the performance of forecasting models depends partly on whether the number of time periods ahead for which they were optimized is the same as the number of periods ahead they are used to forecast.
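The differencing-and-reintegration step at the core of the procedure described above can be sketched minimally (this is an illustration of the general idea, not the software discussed in the references):

```python
def difference(y, d=1):
    # Take successive differences d times (approximate derivative curves).
    for _ in range(d):
        y = [b - a for a, b in zip(y, y[1:])]
    return y

def reintegrate(dy, start):
    # Undo one level of differencing: cumulative sum from a known start level.
    out = [start]
    for v in dy:
        out.append(out[-1] + v)
    return out
```

Differencing a series twice, modeling the (hopefully random-looking) result, then reintegrating from the last observed levels is what makes the most recent points dominate the forecast.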

The sophisticated methods start from this approach and add variations such as piecewise optimization, statistical and frequency filtering, and Fourier and Hamiltonian function series analysis. Tiao and Tsay (1994) propose a method of constructing an autoregression that is piecewise linear in the space of a threshold variable, so that the regression groups the expansion and contraction periods for separate optimization and then recombination. Also demonstrated is the use of Bayesian inference with Gibbs sampling to filter statistically atypical values out of the data. Young (1994) tackles the problem of statistical non-uniformity in the data using time variable parameters found by recursive estimation based on the frequency distribution of the statistical residuals. The method is quite difficult to understand but produces fully continuous descriptive functions which display a remarkable level of dynamic synchrony between the trends of data from distinctly different sources.
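The threshold idea can be sketched in miniature: split the lagged observations into regimes by a threshold variable (here, the sign of the last change, a stand-in for expansion vs. contraction), and fit a separate linear autoregression to each regime. The splitting rule and AR(1) form are illustrative assumptions, not Tiao and Tsay's actual specification:

```python
def fit_ar1(pairs):
    # Least-squares fit of y[t] = a + b*y[t-1] over (y[t-1], y[t]) pairs.
    n = len(pairs)
    sx = sum(x for x, _ in pairs); sy = sum(y for _, y in pairs)
    sxx = sum(x * x for x, _ in pairs); sxy = sum(x * y for x, y in pairs)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def threshold_ar(y, threshold=0.0):
    # Group (lagged, current) pairs by whether the preceding change was
    # above or below the threshold, then fit each regime separately.
    expand, contract = [], []
    for t in range(1, len(y) - 1):
        pair = (y[t], y[t + 1])
        (expand if y[t] - y[t - 1] > threshold else contract).append(pair)
    return fit_ar1(expand), fit_ar1(contract)
```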

__Economic Factor Theory Models__

The models based on economic factor theory, the theoretical structural relationships between technology, education, money, population, resources, public policy, etc., are also very actively pursued. One area of strong new interest is endogenous growth theory, in which models are constructed using feedback, a structurally different kind of mathematics than the fitting of polynomials as in conventional practice. This is the same structural difference as between the formulas of classical physics and the models of physical systems being developed in the study of physical system dynamics known as chaos or bifurcation theory.

In endogenous growth theory one of the interesting ideas is that growth rates might increase with increasing population, rather than decline. Having more people available to invent things that everyone can take advantage of might explain why the larger societies in history have had higher rates of growth (Kremer 1993; Schulstad 1993). In economic theory this new approach marks a significant step toward abandoning the concept of perfect competition, as discussed by Romer (1994), Grossman & Helpman (1994), Solow (1994) and Pack (1994). Following this path the interest is to provide compelling reasoning to guide theory, policy and business practice. The construction of successful behavioral models from factor theory equations that fit or predict the data well is currently out of reach.

The large institutional models such as the Kent model and others used by governments and universities may incorporate thousands of separate time-series, related according to a combination of empirical and structural factor theory equations. Most are optimized to provide one, two and three quarter business projections. There may be many kinds of innovative experiments in modeling taking place in these circles. There are also large scale resource models, as opposed to business models, used by the UN and others. One of some note that is compelling but has been treated as requiring too many assumptions is the limits to growth study originally commissioned by the Club of Rome (Meadows 1972, 1992). This work is quite out of character with the majority of economic modeling efforts in that it made a serious effort to project the behavior of the world economy as a whole life support mechanism, and for a century into the future.

__A Comparison of Results:__
__Derivative Reconstruction & Time Variable Parameter methods__

The reason for this section is a remarkable similarity between the results of analysis of the same data by derivative continuity reconstruction (DR) and the time variable parameter (TVP) method of Young (1994). The important matter is that the two distinctly different methods found the same structural relationship in the data, demonstrating rather clearly that there is another level of stable structural information in time-series data that has not been made visible by other methods. It also serves to confirm the validity of each method of exposing it.

Figure B1 shows the DR reconstruction and 1st derivatives of log_{e}(GNP) and log_{e}(UN) for the same data used by Young, derived from the data in figure JOF-1. The derivative trends of the stock market were also calculated and are presented for comparison.

Figure B2 shows the 2nd derivative of these variables, presented in the same manner as Young’s results, shown below in Figure JOF-11, the 2nd derivative of the filtered GNP and Unemployment TVP trend reconstruction.

The remarkable feature of both is an apparent tight symmetric synchrony between the underlying accelerations in GNP and Unemployment rates, despite the appearance that the original measures describe entirely different kinds of behavior.

__The data used__

Young used 160 quarters of US GNP, Unemployment rates and other aggregates covering 1948(1) to 1988(4) (figures JOF-1 and JOF-4).

For derivative reconstruction these charts were digitally scanned and converted from TIF to DXF vector graphic files. The vectorized scans were then corrected by hand to join broken segments and trim out spurious vertical segments, making them suitable for derivative reconstruction using the functions of CURVE.

__Analysis Method__

Young’s method uses a recursive spectral density filter called IRWSMOOTH to produce a time-series trend. It is based on the unobserved component models (see Harrison and Stevens 1971, 1976; Kitagawa 1981; and Harvey 1984) in which the parameter variations are described by a higher-dimensional, vector random-walk-type model. Though the details of its construction are not presented, and its application is quite complex, one mentioned feature that might cause it to have results similar to DR is the reported similarity of the state-space algorithms used to the optimization technique known as *regularization*, which includes constraints on the rates of change of the variables (see Young 1994 p. 181). The IRWSMOOTH trend series was then differenced twice to produce figure JOF-11.
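IRWSMOOTH itself is not specified in detail, but the regularization idea just mentioned can be sketched as a smoother that trades fidelity to the data against roughness in the trend's second differences. This is a generic Whittaker-style penalized smoother standing in for the integrated-random-walk trend; the function name and penalty weight `lam` are assumptions:

```python
import numpy as np

def irw_smooth(y, lam=100.0):
    # Penalized least-squares trend: minimize ||y - s||^2 + lam*||D s||^2,
    # where D takes second differences, i.e. constrain the trend's rate of
    # change as in regularization. Solve (I + lam*D'D) s = y.
    y = np.asarray(y, dtype=float)
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)  # (n-2) x n second-difference operator
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
```

A straight-line input passes through unchanged (its second differences are zero), while noisy oscillations are flattened toward a smooth trend as `lam` grows.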

In this application the derivative reconstruction steps were oriented to examining more detailed fluctuations than those used for the analysis of the entire history of GNP presented in the preceding paper. The steps began with derivative interpolation (din) to reinforce the short term events. Then nearly identical stages of derivative smoothing, inflection point bridging and further derivative smoothing were applied to each curve. The two levels of trend bridging for the unemployment figures are clearly visible in figure B1. The larger scale trends were then differenced to produce the neat mirror symmetry of fluctuating first derivative rates of the implied behavior underlying the measures of GNP and Unemployment. A second differencing and scale adjustment produced Figure B2.
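The sequence of operations above can be sketched in outline. The 3-point smoothing kernel and pass count here are illustrative assumptions standing in for the CURVE routines, which are not reproduced; only the overall order (log transform, smooth, difference twice) follows the description:

```python
import math

def smooth(y, passes=1):
    # Simple 3-point moving average as a stand-in for derivative smoothing.
    for _ in range(passes):
        y = [y[0]] + [(a + b + c) / 3.0 for a, b, c in zip(y, y[1:], y[2:])] + [y[-1]]
    return y

def diff(y):
    # First differences, the discrete derivative of the trend.
    return [b - a for a, b in zip(y, y[1:])]

def dr_outline(series, passes=2):
    # Outline of the DR steps: smooth the log of the series, then take
    # first and second differences to expose underlying rate changes.
    logs = [math.log(v) for v in series]
    trend = smooth(logs, passes)
    d1 = diff(trend)   # first derivative (growth rate)
    d2 = diff(d1)      # second derivative (acceleration)
    return d1, d2
```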

__Discussion__

The question, of course, is whether this kind of remarkably clear pattern is real. It seems to have been hard for Young’s readers to accept it. Young provides a careful response to the question of the relationship being just an artifact of his unusual spectral frequency filter. He also points out that there is "more than just a similar frequency content in the series: even subtle temporal variation in the cycles can be discerned in both series. Moreover, similar filtering operations applied to some of the other series (in figure JOF-4) do not reveal nearly so clear relationships" (Young 1994 p 203).

The aspect of the pattern of close synchrony between underlying turning points that seems particularly convincing concerns one of its implications about the general system structure of the economy. Neither the reversals of trend in GNP nor unemployment are leading factors; both turn simultaneously. This seems only to be possible for measures of a system that acts as a uniform whole. If either one were the consequence of the other then a consistent time lag should be evident. Thus the synchrony of the turning points suggests that the two factors are not causally related, but are both indicators of a dynamic of the whole system that is similarly expressed in each.
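The no-consistent-lag argument can be checked directly: if one series drove the other, their cross-correlation would peak at a nonzero lag. A minimal sketch (the function is hypothetical and the series in the usage test are synthetic stand-ins, not the GNP/Unemployment data):

```python
def best_lag(x, y, max_lag=8):
    # Find the lag of y relative to x that maximizes their correlation.
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((p - ma) * (q - mb) for p, q in zip(a, b))
        va = sum((p - ma) ** 2 for p in a)
        vb = sum((q - mb) ** 2 for q in b)
        return cov / (va * vb) ** 0.5
    scores = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[: len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[: len(y) + lag]
        scores[lag] = corr(a, b)
    return max(scores, key=scores.get)
```

A consistent causal ordering would show up as a repeatable nonzero best lag; a best lag of zero is what a uniformly-acting whole system would produce.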

The DR stock market trend derivative shown in figure B1 bears some similarity to the GNP movements, but sometimes leads and sometimes follows in an irregular fashion. It therefore appears to be only loosely tied to the underlying behavior of the business system as a whole. This is just what one might expect considering the strong influence of volatile and self-fulfilling investor expectations in setting the directions of the market.

All in all, what seems demonstrated are two lenses with slightly different focus and lens distortions, providing clear views of the same surprisingly systematic behavior.