SAN FRANCISCO -- JOHN SEELY BROWN, a scientist at the Xerox Palo Alto Research Center, turns to the German philosopher Martin Heidegger for a description of the next frontier in software: a blind man is not conscious of the cane he uses as a separate entity, Heidegger wrote, but regards it as an extension of his hand.
Dr. Brown led the legendary Xerox laboratory in the late 1980's and early 90's, when researchers there pioneered the idea of pervasive computing, in which connected microprocessors are embedded into virtually every office tool. For him, Heidegger's vision is a powerful guide for future software designers.
It is also a daunting challenge for a software industry that has long been criticized for the complexity of its products. And it remains a puzzle, for although the next generation of software is clearly upon us, its outline is still maddeningly vague.
Over the last 20 years, the Internet has allowed millions of computers to be connected to one another. More recently, the spread of high-speed Internet service has deepened those connections and allowed software to take the first baby step toward that vision.
Some software designers call it moving off the desktop and into the cloud. Traditionally, software has been a product: code stored on a disk or CD and loaded onto an individual machine. Software companies large and small are now working to transform it into an array of floating services, available through a global web of computers woven together by high-speed links of copper, fiber-optic glass and radio waves.
The less poetic description of the process is that software is becoming distributed - a term that describes how a single program may span tens, hundreds or thousands of computers.
The idea is capturing the imagination of software designers who are trying to move beyond the traditional desktop graphical user interface. In fact, distributed computing is a powerful idea that most of the computer industry - from academic laboratories to industrial giants - is chasing.
Perhaps the best example of the power of distributed computing is the Internet's domain name system. It is, in effect, a vast database spread across many servers that instantly translates human-readable names into the numeric addresses computers use, for any machine connected to the Internet. Beyond that, the range of distributed applications that may emerge within a decade and affect society is almost limitless, stretching from energy management to traffic control systems.
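A few lines of Python suggest how routine that distributed lookup has become. This is only a sketch: the host name is arbitrary, and the standard library simply hands the question to the system's resolver, which consults the distributed database on the program's behalf.

    import socket

    # Ask the domain name system - a database spread across many servers -
    # for the numeric address behind a human-readable name.  No single
    # machine holds the complete answer; the local resolver forwards the
    # question through the distributed hierarchy.
    address = socket.gethostbyname("example.com")
    print(address)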
"Fundamentally, you want something to be centrally located so you can get at it from everywhere," said Danny Hillis, a leading computer designer who recently founded Applied Minds, a start-up in Glendale, Calif. He also argues that the transition to software as a service is inevitable. "Nobody wants to own software," he said. "People want to use software. Owning software is a disadvantage because you have to maintain it, update it and move it from computer to computer."
Over the last two years, a host of start-up companies raced to make that possibility into a dot-com reality. Calling themselves application service providers, they tried to cajole industries to, in effect, log on and leave the software to them. Many of them are gone. Other services live on, though generally in humbler form.
Today, three of the computer industry's leaders, I.B.M., Microsoft and Sun Microsystems, are developing software that is intended to lay the groundwork for a broader and more permanent move off the desktop.
In each case, the new systems are part of overall efforts to advance each company's current strategy: Microsoft is hoping to maintain its monopoly on desktop software by linking programs to distributed services; by contrast, I.B.M. and Sun are hoping to undermine that link by persuading developers to write their programs to Internet standards that would operate independently of the desktop programs that Microsoft still dominates.
Even more ambitious distributed computing efforts are under way at research laboratories, including those at the University of California at Berkeley, the University of California at San Diego and the Massachusetts Institute of Technology, where researchers are trying to build radically distributed computing systems that look beyond today's Internet. They are hoping, along with Dr. Brown of Xerox, that the result will be a new class of software that augments human activity and intelligence. But Dr. Brown sees the biggest obstacle as the shift in thinking the idea requires.
"People have picked up the technology part of the vision, but they haven't understood the philosophical part of the idea," Dr. Brown said. "If things are ready at hand, you don't even notice you use them."
It has always been difficult to predict the future of software. For one thing, the outlines of the next killer app - a program like the VisiCalc spreadsheet or the Netscape Navigator Web browser - are almost never accurately predicted, and even when they are, such programs tend to emerge in unexpected ways.
Distributed computing is arriving in the same unpredictable fashion. Walk the downtown streets of virtually any major city in the world and you will find them torn up so that tens of thousands of miles of fiber-optic cable can be laid underneath. From one perspective, the cables are simply faster data links; seen through the lens of distributed computing, they are the visible construction of a vast computational fabric that will stretch around the globe.
"From the point of view of the science- fiction writers, this is the mundane aspect of constructing a global brain," said Richard F. Rashid, the head of research at Microsoft.
Dr. Rashid points out that the paradigm of distributed computing has been around since the earliest days of the Internet, in the 1960's and 70's. But the rise of the personal computer and client-server computing in the 80's diverted software designers' attention.
Moreover, the advent of distributed computing will entail an order-of-magnitude increase in complexity compared with today's single-machine software systems.
"It turns out that distributed computing is really hard," said Eric Schmidt, the chairman of Google, the Internet search engine company. "It's much harder than it looks. It has to work across different networks with different kinds of security, or otherwise it ends up being a single-vendor solution, which is not what the industry wants."
The problem of speeding up software design is particularly acute in the business world because programmers have not created tools that let them keep pace with the advances in hardware. In fact, programmer productivity has been virtually stagnant for the last two decades.
And the complexity challenge is sharpest where it matters most - security - in a world where potentially hostile programs continuously arrive over the Internet to run on potentially vulnerable computers.
William H. Gates, the chairman of Microsoft, acknowledges the problem. In an interview several months ago, he noted that the computer security aspects of his company's distributed computing initiative, called .Net, were so complicated that even he does not fully understand them.
"What's my trust model? I trust Butler," he said, referring to Butler Lampson, a longtime software designer and one of Microsoft's chief computer security designers.
The complexity issue is crucial for Microsoft because the company is building an elaborate "operating system in the sky" that will allow its software developers to write programs for hundreds of computers linked by the Internet instead of writing for individual Windows machines. Since the company announced its .Net initiative last summer, most of the attention has been focused on one aspect called Hailstorm, which represents a set of software building blocks for developing computer services like the company's Passport identification system and its Hotmail e-mail service.
Meanwhile, Microsoft is busy building other services that will always be available to programmers. One striking example is the way the company is transforming its TerraServer world map database into a TerraService, which will provide a wide range of mapping features automatically available to .Net programs.
All of these services are based on a new way of describing information on the Web known as XML, or Extensible Markup Language, which makes it possible for Internet programs to exchange data automatically, even when their internal formats are incompatible.
In the future, Dr. Rashid said, topographical maps, weather information and street map data will be integrated into any program that taps into the .Net system.
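A short Python sketch suggests how such an exchange works in practice. The element names below are invented purely for illustration and correspond to no actual .Net or TerraService interface; the point is only that a second program can read the tagged data without knowing anything about the first program's internal data structures.

    import xml.etree.ElementTree as ET

    # One program emits a small XML document describing a map request.
    # (The tag names are hypothetical, chosen only for illustration.)
    request = ET.Element("mapRequest")
    ET.SubElement(request, "city").text = "Glendale"
    ET.SubElement(request, "state").text = "CA"
    ET.SubElement(request, "layer").text = "topographic"
    wire_format = ET.tostring(request)  # what actually travels over the network

    # A second program, written independently, parses the same bytes
    # without sharing any code or file formats with the sender.
    received = ET.fromstring(wire_format)
    print(received.find("city").text, received.find("layer").text)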
Microsoft's archrival, Sun Microsystems, recently introduced its own plan to shift toward the Internet, called the Sun Open Net Environment. But where Microsoft is building an elaborate system, Sun is also going in the opposite direction, offering a radically simplified one called JXTA.
William Joy, Sun's chief scientist, says the idea is to take advantage of the open-source software model, which makes the program's code freely available to all programmers, and allow people to quickly and easily experiment with distributed computing concepts. The system also incorporates an approach called peer-to-peer computing that for now is most associated with the music-sharing service Napster, which lets users swap files among their hard drives. By tapping into these two movements, Mr. Joy said, Sun is hoping that JXTA will grow in much the same way that the Unix operating system did, from a set of simple ideas and a community of programmers.
"Pervasive and persistent networks have been slow to emerge, but we believe that there will be a world of millions of connected devices," Mr. Joy said - too many for any monolithic approach to control.
One of the first JXTA projects is EPocketCash, an online payment system that is designed to work on any device connected to the Internet, whether it is a computer or a wireless phone.
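The peer-to-peer model behind such projects can be conveyed in a few dozen lines of Python. The sketch below illustrates only the general idea - every machine acts as both client and server, with no central index - and bears no relation to the actual JXTA protocols; the port number and the choice to share the current directory are arbitrary.

    import json
    import os
    import socket
    import threading
    import time

    PORT = 8765  # arbitrary port, chosen only for this sketch

    def serve_index(directory):
        # Server half: answer any peer that asks which files we share.
        listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        listener.bind(("", PORT))
        listener.listen(5)
        while True:
            conn, _ = listener.accept()
            conn.sendall(json.dumps(os.listdir(directory)).encode())
            conn.close()

    def ask_peer(host):
        # Client half: ask another peer for its list of shared files.
        with socket.create_connection((host, PORT), timeout=5) as conn:
            return json.loads(conn.recv(65536).decode())

    if __name__ == "__main__":
        # Every machine runs both halves, so there is no central server.
        threading.Thread(target=serve_index, args=(".",), daemon=True).start()
        time.sleep(0.5)  # give the listener a moment to start
        print(ask_peer("localhost"))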
I.B.M. has taken a third approach: building a distributed software system that focuses on applications for large corporate users. Known as MQSeries, the I.B.M. system also relies on XML to permit distributed programs to share information.
Customers of I.B.M. like the Bank One Retail Group have used MQSeries to build new distributed applications that stretch across the bank's 2,000 branches in 14 states, among them a call-center program that handles as many as five million transactions a month.
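That style of integration rests on queued messages rather than direct connections between programs. The toy Python sketch below captures the pattern in a single process - one application drops XML-tagged messages onto a queue and another drains it at its own pace - though middleware of the kind I.B.M. sells runs such queues on dedicated servers across a network, and the account numbers here are invented.

    import queue
    import threading

    work = queue.Queue()

    def branch_application():
        # The sender drops each message onto the queue and moves on.
        for account in ("1001", "1002", "1003"):
            work.put("<transaction><account>%s</account></transaction>" % account)

    def call_center_application():
        # The receiver picks messages up whenever it is ready.
        while True:
            message = work.get()   # blocks until a message arrives
            print("processing", message)
            work.task_done()

    threading.Thread(target=call_center_application, daemon=True).start()
    branch_application()
    work.join()  # wait until every message has been handled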
I.B.M. executives said they thought that the distributed computing transition would take 20 to 25 years and fundamentally transform most United States corporations.
"People are just beginning to re-engineer their business processes to take advantage of the Internet as a platform," said Scott Hebner, the director of marketing for I.B.M.'s Websphere software group.
The question, indeed, may be only a matter of timing. But in that context, it is wise to remember the counsel of Paul Saffo, a consultant and futurist who frequently reminds technology-dazzled audiences in Silicon Valley: "Never mistake a clear view for a short distance."