Computer System Paradigms

In the history of computers, one can observe an alternation of opposite tendencies: collective vs. individual work. The first electronic computers were huge instruments requiring heaps of support personnel, but basically, the operator was a god who manually programmed the registers from the control panel and determined what to compute and when. Years later, computers became corporate assets intended to process data for different people, and the operator became an intermediary, feeding punched cards and tapes into the input devices and sorting the line-printer output into individual user boxes. With the first steps of multitasking, the users effectively gained control over a portion of the computer's resources allocated to their specific jobs, as if one big computer were dynamically split into a number of small computers individually exploited by the users. Personal computers were a materialization of that trend. However, the era of data shared over the network brought forth the client-server paradigm, in which a central computer (the server) is collectively used by terminal users (the clients). This paradigm is still the basis of most computer systems today. Within it, waves of adherence to "thick" or "thin" clients can be traced, but the main framework has remained the same.

The functional distinction between clients and servers has led to software specialization, and even to the appearance of different operating systems for servers and workstations. Commercial software is often designed for server installation, especially in collaborative applications.

However, there are signs of further development that could revive the spirit of individual computing without reducing the effectiveness of the client-server architecture. I mean the peer-to-peer approach, which connects distant computers in an arbitrary manner, without any predetermined functional distinction. Within one collective process, any computer can act as a central data supplier, while working as a client in a different distributed computation. The same piece of software can be installed on any of these computers, and all the other computers in the network can use it remotely, as the sketch below illustrates.
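To make the idea concrete, here is a minimal sketch (in Python) of a peer that plays both roles at once: it serves data to any other peer while also requesting data as a client. The addresses, port number, and the toy text exchange are illustrative assumptions of mine, not part of any particular system.

import socket
import threading
import time

def serve(host: str, port: int) -> None:
    # Server role: answer any peer that asks this machine for data.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            with conn:
                request = conn.recv(1024).decode()
                # Reply with whatever this peer holds for the requested key.
                conn.sendall(f"data-for:{request}".encode())

def fetch(host: str, port: int, key: str) -> str:
    # Client role: ask another (functionally identical) peer for data.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((host, port))
        cli.sendall(key.encode())
        return cli.recv(1024).decode()

if __name__ == "__main__":
    # Every peer runs the same program: a serving thread plus client calls.
    threading.Thread(target=serve, args=("127.0.0.1", 9000), daemon=True).start()
    time.sleep(0.2)  # give the serving thread a moment to start listening
    # For demonstration the peer queries itself; in a real network the
    # address would belong to some other peer running the same code.
    print(fetch("127.0.0.1", 9000, "example-key"))

The point of the sketch is simply that no predetermined role is built into the software: whether a machine acts as a supplier or a consumer of data is decided per interaction, not per installation.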

Of course, reality will combine the elements of different computation paradigms, and huge servers will remain in use for quite a while. However, I believe that the functional difference between servers and clients (both in hardware and software) will gradually dissolve, so that a particular kind of functionality could be achieved by merely adding the necessary components, without any need to reconfigure the whole system. This will require a much higher level of unification in hardware, and more standardized communication protocols. All computers will speak the same language, becoming members of the same computer society.

