(edit 4/00) This is an assortment of things that may vary in quality, and that tend to drift into philosophical issues. Still, they represent concerns that I think might be of general interest to those with at least a beginning grasp of the real conceptual and methodological problems being raised.
6/20/01 cut from foreword......It can be hard to follow and enjoy the creative work of independent-minded scientific explorers. Some are well worth it, but heading off on one's own in science is inherently hazardous. Scientific ideas normally evolve through extensive interaction with others. Independent explorers may then produce nothing but interesting isolated explorations, if that. At worst, of course, and there seem to be many, independent explorers regularly fall prey to their own fantasies, a "what if I'm right" flowering of grandiose thinking, producing work of single-minded imbalance and irrelevance. Then, there are a few others.
This work follows from the observation that on the most elemental level, 1) nature's organization is localized (remote causation relies on locally developing organization it could not have caused), 2) natural organization develops through a process of accumulation (visible in time traces), and 3) the critical test of a working natural mechanism is resilience and tolerance, not determinism or control.
4/25/99.......Determinism vs. Synthesis: maybe if there's one thing determinism should be it's determinate, but it's not. Determinism fails terminally, interminably, bogging down in the details every time. Every knowable thing carries with it uncertainty, and uncertainty is sequentially additive. As a result deterministic models are all but perfectly indeterminate, like weather prediction models, which always need to be recalibrated every time they are run. The one possibility that allows determinism to survive, in logic at least, is the possibility of everything having, inherently and unknowably, some absolute certainty. Things still seem individually very certain, even if the proposition of absolute certainty violates everything we know about the details, and requires a totally unsupported major leap of faith. The certainty of familiar things is no illusion, just not the result of controlling circumstances. Nature simply does not develop an 'intention' and then seek to bring it about by means of 'control'; only we do that. It is purely a matter of grand chauvinism that we have imagined nature as working in our own image.
What's missing is a suitable alternative that makes sense of the problematic duality. We find that reducing uncertainty improves predictability while never yielding certainty about anything. I think the answer is that nature does not operate by control but by synthesis. It is quite prolifically evidenced in the visible local synthetic mechanisms appearing at the beginning and end of virtually every kind of time-limited event. They appear as what the physicists call 'rate equations in disturbed equilibrium', reflecting locally original distributed mechanisms which become temporarily closed loops of interaction that multiply, propagating some small pattern of locally evolved organization. It's how nature makes simple work of exceedingly improbable outcomes. It only produces one outcome because that is the one that developed, that swept up the local potential. There is some predictability, but only some, because the way things synthesize is, very reasonably, different every time. Still, it produces flawlessly constructed results, because that is the only thing that can happen in a nearly perfectly indeterminate world. It also explains time's arrow, the directionality of time, by providing a model for change as accumulative synthesis and decay, new things developing from what developed before. Determinism is the work of controlled closed systems, always fleeting, and synthesis the work of uncontrolled open systems, like the one we have.....
clip.......Real things are definite, not uncertain, and ever more complexly detailed the closer you look. One of the objects is to provide a disciplined new way to just marvel at the work of creation. The detailed study of individual events gives you an enriched quality of learning in, say, physics, paleontology or economics, like hands-on experience in a trade gives to a student in business management. What you find out is that the theory may be our best and most useful tool for making use of natural processes, for predicting outcomes, but it is certainly not how they work.
clip....... Recognizing a continuous succession of events from scattered bits of information is something the human mind is especially good at. We may call it 'visualization' or 'intuition' or just 'awareness' or 'perception', but I think our natural skill has a lot to do with reading the flows of events around us by expecting a continuity of rates of change. That's what DR does mathematically. We have so little information about what goes on around us, bits and pieces of disconnected observations, and such relatively sophisticated awareness of ongoing events. At the very least we seem to have some marvelous technique for filling in where pieces of the puzzle are missing, putting together our bits of information and the telltale signs of which ones are connected. That was my observation some years ago. One of the available methods for connecting scattered bits of data, without any theory, is reading and interpolating the flow. I think we do that, habitually, without thinking, as essential to making our images of a continuous world believable, and for recognizing when some single observation represents a change of direction in the whole and some new origin of events. DR may be a somewhat crude imitation, but it is also somewhat effective.
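The "interpolating the flow" idea can be sketched in code. This is only a toy illustration of the general principle, not the actual DR method, and the function name is my own invention: a missing measurement is estimated by assuming the local rates of change vary smoothly, here by fitting a polynomial through the neighboring known points.

```python
def interpolate_flow(ts, ys, t):
    """Estimate a missing value at time t by assuming continuity of
    rates of change: fit a Lagrange polynomial through the known
    neighboring points (ts, ys), so the first and higher differences
    of the filled-in series vary smoothly rather than jumping."""
    total = 0.0
    for i, (ti, yi) in enumerate(zip(ts, ys)):
        basis = 1.0
        for j, tj in enumerate(ts):
            if j != i:
                basis *= (t - tj) / (ti - tj)
        total += yi * basis
    return total

# A gap in an accelerating series: y = t**2 observed at t = 0, 1, 3,
# with the t = 2 reading missing.
print(round(interpolate_flow([0, 1, 3], [0, 1, 9], 2), 6))  # -> 4.0
```

Note that nothing here assumes a theory of what produced the data, only that its rate of change is continuous, which is roughly the postulate the text describes.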
clip......... The practical technique involved is essentially an image restoration technique, treating data as a compressed image and expanding it. For the sake of rigor it is treated as a hypothesis-generating method. It technically requires postulating a continuity of change in an underlying physical system, though one might have excellent statistical or circumstantial reason to believe it to be true. Much of the serious work also concerns unfamiliar subjects, for which neither continuity nor even representation in the data can be assumed. Another major concern is that information about many scales and kinds of events may be superimposed in any one series of measures. The response is to treat data as composed of a complex hierarchy of continuous processes corresponding to the scales of variation present, and to separate them for individual study. The results are curves of organic shape that give you one view of the structures of the underlying process in their natural state, and stimulate better questions.
Something of this kind would probably already be in common use if normal data were well behaved. It is not. Normal sequential measures are usually very spotty, and reflect many influences which are not at all the subject of investigation. Economic measures display seasonal cycles and the effects of crop losses, wars and panics, local, regional and global business cycles, and many other things. Ecological, biological and climatological processes are similarly encumbered by a complexity of effects. Data sets reflecting both large and small scale physical and chemical processes are no less burdened by a superimposition of related and unrelated effects on many scales. These are what DR helps to tease apart and find connections within, so that the underlying mechanisms at each level can be seen more clearly.
The statistical method of science treats this normal complexity of data as noise for the sake of fitting errors and equations. DR works when these can be treated as having been produced by overlapping continuous processes, for the purpose of completing and separating their patterns of regular progression. This is not always possible or practical, but frequently is.
The DR method does depart from a variety of conventional scientific practices. Still, good data is good data in exactly the same way; it's just that DR finds more of the available information to be useful. What is usually produced from normally complex data is a set of curves, one for each scale of fluctuation. The first reconstruction is a smooth curve passing through all the data points. The second is the smooth curve about which the first would appear to dynamically fluctuate, etc. This is as if separately describing the shapes of the ocean as ripples on waves on swells on the tide. The underlying processes of change are not always independent and superimposed in this way, but quite often are, and for those situations it helps to have a method that treats the data that way. Where a process described in this way can be identified, the detailed description of its dynamics provides very useful information about its structure and accurate markers for correlating it with others.
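The "ripples on waves on swells on the tide" separation can be roughly illustrated as follows. This is a sketch of the general idea only, using plain moving averages rather than DR's actual curve construction, and all the names are my own: each pass smooths the remaining trend, and the fluctuation peeled off at each pass becomes one layer of the hierarchy.

```python
def smooth(y, half_width):
    """Centered moving average; the window shrinks at the edges."""
    n = len(y)
    out = []
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        out.append(sum(y[lo:hi]) / (hi - lo))
    return out

def separate_scales(y, half_widths):
    """Peel a series into layers of fluctuation, finest first.
    Each layer is (previous trend) minus (its smoothed version);
    the final, broadest trend is appended last.  By construction
    the layers sum back to the original series exactly."""
    layers, trend = [], list(y)
    for w in sorted(half_widths):      # smooth a little, then more
        smoother = smooth(trend, w)
        layers.append([a - b for a, b in zip(trend, smoother)])
        trend = smoother
    layers.append(trend)
    return layers
```

Because the layers add back to the data, nothing is discarded, only sorted by scale, which is roughly the property the text describes: each curve is what the previous one appears to fluctuate about.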
clip.....Finding out why there's any issue at all might begin by probing the riddle that the laws of physics appear to apply to all the events in the universe, except for any individual event. Why is it, for example, that when the rain has stopped, it is statistically still raining, or how is it that we know when to use common sense, rather than the structure of rules given us by the sciences, to tell the difference? The laws apply to ideal classes of events, 'Platonic ideals', and to this point we are on our own in interpreting their results in every individual event.