Date: Thu, 21 Dec 2000 10:16:51 -0500
From: Paul Fernhout <pdfernhout@kurtz-fernhout.com>
Reply-To: unrev-II@egroups.com
Organization: Kurtz-Fernhout Software
To: unrev-II@egroups.com
Subject: Is "bootstrapping" part of the problem?
John-
[Below is a summary of my views drawn from the dialog to date, including this letter.]
Great comments [in your letter of December 20, 2000]. Let me see if I can summarize them (hopefully without caricaturing them too much; let me know if I inadvertently did):
John "sb" Werneken wrote:
Desires, intelligence, values, - rings true, up to a point. But it's sort of like id, ego, superego. A way of naming and thus discussing something about mind or personality that seems real to many of us, but not terribly informative about how we function or could or should function.
I suggest that if we are affirming values (widely shared or otherwise) we ought to affirm Respect for Human Autonomy. The idea that one individual may better determine that individual's needs, wants, and means than another. The idea that one may be mistaken, especially about other people.
1) An important core value is "Respect for Human Autonomy".
Agreed.
However, there are phases in a person's life when they need assistance to grow into a member of a particular society, or to survive in that society despite calamity. If they do not get that assistance, they do not become a productive member of that society. One value to society of helping the downtrodden is in part that it creates a better social dynamic for everyone (i.e. people don't live in fear of poverty, and thus are not driven to certain extreme behaviors such as financial obesity).
This is especially true for children, naturally. However, it may still be true for adults who grew up in different cultures and did not gain the requirements (ideas, skills, capital, outlook) to participate in the society under discussion. (This is not to say that individuals from disadvantaged backgrounds may not bring other valuable things with them from their heritage.)
We need to distinguish the desire to create opportunity from forcing people to act on the opportunity. I would not interpret "Respect for Human Autonomy" to mean let the "weak" (however that is currently defined) die from neglect. And that's what we are talking about to an extent with 840 million malnourished people.
The people trying to take away autonomy (IMHO) are more likely those Congress people passing stuff like:
http://slashdot.org/article.pl?sid=00/12/20/1620204&mode=thread&threshold=2
...where bill HR46 may allow the seizure of your computer if it is suspected of being "used in ... intellectual property theft" before you are convicted or even charged (maybe someday with IP theft interpreted as web surfing past a site with an unauthorized use of a Disney character, perhaps?)
John "sb" Werneken wrote:
2) Humans have built-in evolved ways of operating in a "marketplace" situation with some individuals rising and some falling based on their exploits.
Agreed. However, I think the behavior of a world economy because of the huge quantitative difference in number of actors is also qualitatively different than face-to-face human situations. Thus the emergent behavior from a world economy may be greatly different than in a tribal situation.
John "sb" Werneken wrote:
3) Statistical changes emerging from individual choices are likely to be better than changes mandated from a central authority.
I'd say yes and no.
Yes, because this is a model of decentralized decision making, which in general I like.
Also yes, because individuals may create their own new playing fields (for example, new industries as yet unregulated, or new living spaces in the ocean or in space where different laws may be enacted, as was the case with the Pilgrims fleeing to America to avoid religious persecution).
However, also No. We do have government mandates -- for example laws derived from the U.S. constitution that are based on shared values. We also have individual choices within the mandated framework. The point is that the structure of the playing field (and changes in that structure) are something that large organizations impact, governments in terms of laws, and corporations in terms of the work environment and permissible work.
John "sb" Werneken wrote:
4) Some views of the DKR are authoritarian (such as ones that talk about relative priorities for the world as a whole).
Possibly.
However, in the lipstick example I didn't mean people should be forced to support starving African villages instead of buying cosmetics. I meant that even if it came down to deciding to go without one lipstick tube (for an otherwise very affluent person) in order that a village of 10,000 people in Africa could avoid starvation for all of eternity, there is no guarantee that the village will be helped.
I am not advocating forcing people to spend their money a certain way (our government already taxes people to do that, thank you). However, I am suggesting the possibility that some individuals could invest their own resources in creating a possible alternative framework (including a knowledge base) that may be of more benefit both to themselves and others in daily life.
John "sb" Werneken wrote:
5) All systems have limits, even those growing exponentially. There is not a pressing need to deal with any threats from exponential growth.
I agree that systems have limits, and typically exponential growth becomes an S-curve at some point.
We obviously do disagree some here, though. I feel what is at issue is whether there will be enough clear water left in the machine-intelligence lily pond over the next few decades for survival. And I feel strongly that this should be a major focus of serious attention, rather than advocating business as usual up until day 39.
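For readers unfamiliar with the "day 39" reference: it comes from the classic lily-pond riddle, where a patch of lily pads doubles in size every day and covers the whole pond on day 40. A minimal sketch of the arithmetic (my own illustration, not from the original letter):

```python
# Lily-pond riddle: coverage doubles daily and reaches 100% on day 40.
# So on day d, the covered fraction is 2^(d - 40).

def fraction_covered(day: int, full_day: int = 40) -> float:
    """Fraction of the pond covered on `day`, doubling daily until `full_day`."""
    return 2.0 ** (day - full_day)

print(fraction_covered(39))  # 0.5   -- half the pond is already gone on day 39
print(fraction_covered(35))  # ~0.03 -- five days earlier the pond still looks nearly clear
```

The point of the metaphor is that exponential growth looks harmless almost until the end: on day 35 the pond is about 97% clear, yet only five doublings remain.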
John "sb" Werneken wrote:
6) Rising technological progress lifts all boats.
Agreed to an extent.
However, in terms of, say, the time parents spend with children in the U.S., the last twenty years of technological progress has been a step backwards. So too have been the unforeseen side effects of technology (like plastics producing estrogen mimics). And so too might be the implications of advanced warfare and machine intelligence.
I personally would agree more if we said that technological progress lifts a lot of boats, whereas realistically some will get swamped. The boats that do get swamped need our charity.
Also, I do think that if things like food or water got cheaper, charity aimed at helping people survive or bootstrap themselves to self-reliance would be cheaper too, and so charitable people could accomplish more (perhaps 840 million people's worth on a billion dollars, which would certainly be feasible to raise as a charity today if it would for sure mean the immediate end of basic suffering).
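As a quick sanity check on that figure (my own back-of-the-envelope arithmetic, not a claim about actual aid costs):

```python
# Rough arithmetic behind "840 million people's worth on a billion dollars":
# how much a one-billion-dollar fund amounts to per malnourished person.
people = 840_000_000        # malnourished people, the figure cited in the text
fund = 1_000_000_000        # dollars, the hypothetical charity fund
per_person = fund / people
print(f"${per_person:.2f} per person")  # about $1.19 each
```

That works out to only about a dollar per person, which is why the claim hinges on the cost of basics like food and water falling dramatically.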
John "sb" Werneken wrote:
(1) We need more energy. Probably solar energy converted cheaply in space to microwaves to then become electricity on the ground. Perhaps thermonuclear or others not yet understood.
(2) We need more material resources, both as input materials and as a means of disposing (at least temporarily) of wastes. Asteroid bodies come to mind as one practical target here.
(3) We also need better ways of organizing and utilizing the energy and material we do have. Nano-technology, genetic technology, and computational technology are examples.
7) Innovation is especially needed in areas of energy supply, material acquisition and disposal, and advanced material processing.
Agreed. However, I think that some of these innovations are somewhat redundant. That is, with enough energy, everything is recyclable. With better manufacturing processes, you need less energy and materials.
John "sb" Werneken wrote:
On the politics of meeting human needs, it has been shown that famine is a distribution effect, not a supply and demand effect. Nonetheless a top-down redistribution would kill the golden goose of competitive autonomous innovation.
8) World hunger is due to distribution problems, but attempting to correct this directly would lead to other more serious problems.
Reservations about tinkering with a system that has produced so much certainly sound reasonable. But I think in the case of distribution we are talking more politics than anything else. The greatest number of starving people (outside of, say, America) are starving due to the consequences of civil war or selfish dictators (or a related culture of dictatorship).
Some of this gets down to feelings about innovation, which is often cited as the justification for competition. I respectfully disagree on this. I think people and communities are fundamentally innovative. Historically, we saw the development of fire, "alphabet technology" (to quote Rod), and the plow even before extensive modern-day competitive capitalism. I'm not saying competition can't spur innovation, just that it may not be strictly required.
John "sb" Werneken wrote:
In Bangladesh, the Grameen Bank has pioneered micro-lending managed at the village level. Through the efforts of individuals empowered by its micro-loans, extreme poverty has been reduced from 1/3 of villagers to 10%.
9) Social innovations like micro-credit can solve much of the problem of poverty in a capitalist setting.
Agreed.
However, it is not clear whether this will scale if these people cannot keep up as the rest of the economy continues to move ahead. That is, the market for handicrafts may only be so large in the face of other goods.
John "sb" Werneken wrote:
10) It would be useful to have a DKR containing development success stories.
Agreed.
John "sb" Werneken wrote:
11) Linking the DKR to changing human nature will cause it to fail.
Agreed. However, human nature is variable, and I think creating DKRs that allow other aspects of human nature (e.g. charity) to flourish is worthwhile.
12) (Implied) Making statements about supporting alternate lifestyles or economics rather than modern-day capitalism is equivalent to a demand for "re-arranging human character". Basically, the economic and technical system as it stands is working as well as could be expected, and the world is getting better because of it, now and in the future.
I hope I'm not distorting your points too much, especially with this last one. I would disagree with the implication that choosing anything other than our economics as it is requires changing human nature. And I should make clear I am not advocating changing human nature. I am suggesting perhaps more diversity in its expression in terms of economics. I am also suggesting we consider creating alternative technical infrastructures (and related libraries of tools and knowledge) that may work in more harmony with certain aspects of human nature (e.g. the "Buddhist Economics" mentioned in "Small Is Beautiful" -- in this context, for example, work as spiritual growth).
I think the current economic system is in for a radical disruption whether desired or not -- both in the elimination of the need for "supply chains" and in the final implications of machine intelligences designed to win competitive arms races (economically for corporations, or militarily for the DOD). So, further, I'm suggesting that the current unguided technological explosion may not have room for any humans at all unless we act positively to innovate in such a way as to make room for many or all people. Conversely, if we do act positively, there may be room for trillions of more people (and AIs and augmented people, etc.) in the solar system.
To reiterate my main points, which perhaps are getting lost in the debate over future economics:
-Paul Fernhout
Kurtz-Fernhout Software