Vain Notion Confirmed as Hurdle

 

Technology's Gotta Be Free to Obsolete You and Me

Personal computers took off in the 1990s after they became sufficiently cheap, powerful and easy to use to accomplish business tasks efficiently. Then the allure of the Internet impelled many individuals to acquire PCs and learn how to use Web browsers and email. While Windows and Macintosh user interfaces make performing common tasks more understandable than, say, typing Unix commands or programming in C, what goes on inside their beige boxes has always baffled most users. Now that more applications are Internet-enabled and becoming distributed and componentized, even hardcore geeks get frustrated installing, running and removing innovative applications, often because the source of a problem may lie in the operating system, another application, or even another computer.

To make modern software reliable and interoperable across platforms and networks, broad consensus concerning hardware architectures, network protocols, software interfaces, and database access is increasingly required. IT researchers and engineers are forced to devote more time and energy to developing standards in these areas, which innovation proceeds to render obsolete almost as quickly as they can be codified. For example, the World Wide Web wouldn't exist without the HyperText Markup Language (HTML) and HyperText Transfer Protocol (HTTP) standards, yet these are now considered old hat and detrimental to the Web's progress.

One way to evolve more flexible and robust IT standards is to devise self-documenting meta-standards. The XML language, now widely adopted for ecommerce and vertical industry Web solutions, is a good example. In terms of expressiveness and generality, XML is to HTML as algebra is to arithmetic. XML-encoded messages enable computer programs to communicate and interpret complex data meaningfully. Any suitably equipped set of programs that communicate over the Internet can "converse" in XML, assuming each is familiar with the domain of the information it receives. XML thus makes the data of one application available to the expertise of another, wherever they might be, whether friends or foes.
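A small sketch in Python may make the point concrete. The <order> vocabulary below is invented purely for illustration; what matters is that any program familiar with that vocabulary can parse the message with an ordinary XML library and act on the data, no matter where it originated.

    # One application emits its data as XML ...
    import xml.etree.ElementTree as ET

    message = """
    <order id="1042">
      <customer>Acme Corp</customer>
      <item sku="WIDGET-7" quantity="3" price="19.95"/>
    </order>
    """

    # ... and another, possibly on a different machine and platform,
    # interprets it using nothing but the shared vocabulary.
    order = ET.fromstring(message)
    item = order.find("item")
    total = int(item.get("quantity")) * float(item.get("price"))
    print(order.get("id"), order.findtext("customer"), round(total, 2))

Neither program needs to know anything about the other's internals; the tags carry enough structure for each to make sense of what it receives.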

Other approaches to making systems more adaptive and responsive to change tend to lead toward self-modifying, distributed software and knowledge databases with no central control over their design, content, or behavior. Some techniques also remove programmers' control over what their software does and how it does it. Rather than being "coded", such programs are "trained" or "bred".

For some time, cognitive and computer scientists have digitally mimicked brain synapses to recognize visual patterns and other stimuli presented via cameras or other sensors. Such logical circuits, called Artificial Neural Nets, are now routinely used in various applications of machine learning. Neural Nets are partly self-organizing and are trained rather than programmed; presented with known inputs and desired outputs, a Neural Net soon learns to respond appropriately to variants of familiar stimuli. What's fascinating and rather frightening is that the software developer doesn't know -- and may not even care -- how the program does it.
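The toy example below, written from scratch with Python and NumPy (not drawn from any particular system), illustrates what "trained rather than programmed" means: a tiny two-layer network is shown four input patterns and the outputs we want (the XOR function), and gradient descent nudges its randomly initialized weights until its responses come out right. Nobody writes the rule; the weights that end up embodying it mean little to a human reader.

    import numpy as np

    rng = np.random.default_rng(0)

    # Training data: four input patterns and the desired outputs (XOR).
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    # Randomly initialized weights and biases for a 2-4-1 network.
    W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for step in range(10000):
        # Forward pass: stimulus in, response out.
        hidden = sigmoid(X @ W1 + b1)
        output = sigmoid(hidden @ W2 + b2)

        # Backward pass: nudge the weights to shrink the error
        # (plain gradient descent on squared error).
        grad_out = (output - y) * output * (1 - output)
        grad_hidden = (grad_out @ W2.T) * hidden * (1 - hidden)
        W2 -= 0.5 * hidden.T @ grad_out
        b2 -= 0.5 * grad_out.sum(axis=0)
        W1 -= 0.5 * X.T @ grad_hidden
        b1 -= 0.5 * grad_hidden.sum(axis=0)

    # After training the outputs are typically close to 0, 1, 1, 0 --
    # but exactly how the weights encode that rule is anyone's guess.
    print(output.round(2))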

What may be the most extreme software design paradigm doesn't involve programmers at all: Genetic Programming harnesses the power of evolution itself to randomly breed competing programs until one of them successfully meets predefined objectives. Evolving a workable solution can take many software generations, so the approach isn't practical for time-critical problems or ones well-defined enough to code directly. Before long, though, computers will grow powerful enough to make evolving their own software an attractive option.
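As a rough illustration of the breed-score-select loop (a toy of my own devising, not any production technique), the Python sketch below evolves tiny arithmetic expressions, represented as nested tuples, until one of them reproduces the target function x*x + x over a range of test inputs.

    import random

    OPS = {'+': lambda a, b: a + b,
           '-': lambda a, b: a - b,
           '*': lambda a, b: a * b}

    def random_expr(depth=3):
        """Build a random expression tree of ('op', left, right) tuples."""
        if depth == 0 or random.random() < 0.3:
            return random.choice(['x', random.randint(-2, 2)])
        op = random.choice(list(OPS))
        return (op, random_expr(depth - 1), random_expr(depth - 1))

    def evaluate(expr, x):
        if expr == 'x':
            return x
        if isinstance(expr, int):
            return expr
        op, left, right = expr
        return OPS[op](evaluate(left, x), evaluate(right, x))

    def fitness(expr):
        """Lower is better: summed error against the target x*x + x."""
        return sum(abs(evaluate(expr, x) - (x * x + x)) for x in range(-5, 6))

    def mutate(expr):
        """Replace a random subtree with a fresh random one."""
        if isinstance(expr, tuple) and random.random() < 0.7:
            op, left, right = expr
            if random.random() < 0.5:
                return (op, mutate(left), right)
            return (op, left, mutate(right))
        return random_expr(2)

    population = [random_expr() for _ in range(200)]
    for generation in range(50):
        population.sort(key=fitness)
        best = population[0]
        if fitness(best) == 0:
            break
        # Breed the next generation from mutated copies of the fittest half.
        survivors = population[:100]
        population = survivors + [mutate(random.choice(survivors)) for _ in range(100)]

    print(generation, fitness(best), best)

No one tells the system what a solution should look like; it is simply scored, culled, and bred until something that works turns up, which is exactly why the approach can be slow.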

The direction is clear. In order for information technology to evolve as quickly as the marketplace allegedly demands, software must be free to confer with its peers, to improvise solutions, to reconfigure itself, to mutate on its own (one company has in fact trademarked "Digital DNA"). As software becomes more autonomous, Microsoft and many other vendors will progressively lose control of what their products are doing, despite their best efforts. Unfortunately, so will users, governments, and societies. Technology will progressively decide what's best for us, though we'll pretend we're in control. Innovation will burst free of human constraints and build a better world. Should we not like what is wrought, we can always leave.

At least one leading technologist has expressed reservations about creating highly autonomous and adaptable systems. At one point in a controversial essay, Bill Joy (Chief Scientist and a founder of Sun Microsystems) notes:

Perhaps it is always hard to see the bigger impact while you are in the vortex of change. Failing to understand the consequences of our inventions while we are in the rapture of discovery and innovation seems to be a common fault of scientists and technologists; we have long been driven by the overarching desire to know. That is the nature of science's quest, not stopping to notice that the progress to newer and more powerful technologies can take on a life of its own.

Joy took a lot of heat for his essay, "Why the Future Doesn't Need Us" (Wired, April 2000). Many of his critics felt he exaggerated the potential dangers of robotics, genetic engineering, and nanotechnology. Others took offense at his solution -- voluntarily relinquishing certain lines of research and development -- calling it impractical, if not a violation of the human spirit. High-tech professionals generally regarded the piece as loony and the author as traitorous. Regardless, no one can demonstrate that the dangers Joy warns against (harmful mutations, destruction of the biosphere, and pandemics instigated by out-of-control technologies) are insignificant or completely improbable.

 

Copyright © 2001 by Geoffrey Dutton. All rights reserved.
