On LinkedIn – Global Foresight – Future of Western Civilization.
Nicholas Beecroft • I’d like to invite you to take a look at an emerging series of interviews with inspirational leaders at the evolutionary edge of our culture.
Phil Henshaw • Would you like a real discussion of how and why our economies became designed for a different kind of planet than the one we live on?
Nicholas Beecroft • Yes, go on, Phil…
Phil Henshaw • Well, it strains the credulity of intellectuals more than of informal thinkers. There’s a very interesting property of informal language: the same words readily refer either to the objects of nature that are their subjects, or to the cultural meanings those words raise in people’s minds. For example, “apple” is easily understood as both the “thing in your hand you might eat” and the “idea in your mind of giving one to the teacher this morning”.
- Normal language is about real things; the economy is about abstract $’s
The language of the economy isn’t like that. What our economy is designed to do, and what conflicts with its being a physical system, is to behave like numbers: to be infinitely adjustable, defined and controlled by rules. Nature, by contrast, works physically through the development of organization in complex processes, which science has so far found “indefinable” and studies only through substitute mathematical models.
So, the mismatch I see results from the economy being ruled by rules about numbers, the artificial reality we devised for it, which people can connect to “data” about nature but which loses any information about the natural relationships and can’t even be used to refer to “things”. So the versatility of natural language, its ability to address abstract and natural subjects at the same time, is lost in science. I find that holds up rather generally.
There’s an interesting way to overcome that, which helps illustrate the problem too. Once you understand that abstract thinking and models need to be based on information, the validity of a model becomes subject to the “missing information” that may exist in the observer’s view of their environment, as much as to the presence of “confirmed information”.
A large case in point is how observers are unable to see what goes on inside much of any self-organizing system, like the mind of another person or someone else’s culture. Systems that work by themselves have their organization in internal loops of relationships, which “outsiders” to those loops strongly tend to be unable to observe. The problem itself is quite observable, though, and that’s the beginning of practical solutions.
I have short blog posts taking that approach at the top of my blog site this week: