Sunday, December 27, 2015

What Are Computer Generations?

Based on the characteristics of the various computers developed from time to time, they are categorized into generations: first, second, third, fourth and fifth generation computers.

First Generation Computers

The first generation of computers is said by some to have started in 1946 with ENIAC, the first 'computer' to use electronic valves (i.e. vacuum tubes). Others would say it started in May 1949 with the introduction of EDSAC, the first stored-program computer. Either way, the distinguishing feature of the first generation computers was the use of electronic valves.

My personal take on this is that ENIAC was the world's first electronic calculator and that the era of the first generation computers began in 1946, because that was the year when people consciously set out to build stored-program computers (many won't agree, and I don't intend to debate it). The first past the post, as it were, was the EDSAC in 1949. The period closed about 1958 with the introduction of transistors and the general adoption of ferrite core memories.
OECD figures indicate that by the end of 1958 about 2,500 first generation computers were installed world-wide. (Compare this with the number of PCs shipped world-wide in just the third quarter of 2006, quoted as 59.1 million units by research company Gartner).

Two key events took place in the summer of 1946 at the Moore School of Electrical Engineering at the University of Pennsylvania. One was the completion of the ENIAC. The other was the delivery of a course of lectures on "The Theory and Techniques of Electronic Digital Computers". In particular, the lectures described the need to store the instructions that manipulate data in the computer along with the data itself. The design features worked out by John von Neumann and his colleagues, and described in these lectures, laid the foundation for the development of the first generation of computers. That just left the technical problems!
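
To make the stored-program idea concrete, here is a minimal sketch in C of a toy machine whose instructions and data share one memory, which is the arrangement those lectures described. The opcodes and memory layout are invented purely for illustration.

#include <stdio.h>

/* A toy stored-program machine: instructions and data share one memory,
 * the defining feature of the design described in the Moore School
 * lectures. Opcodes and addresses here are hypothetical. */

enum { HALT, LOAD, ADD, STORE };   /* invented opcodes */

int main(void) {
    /* One memory array holds both the program and its data.
     * Each instruction is a pair: opcode, operand address. */
    int mem[16] = {
        LOAD,  10,   /* acc = mem[10]   */
        ADD,   11,   /* acc += mem[11]  */
        STORE, 12,   /* mem[12] = acc   */
        HALT,  0,
        0, 0,        /* padding         */
        2, 3, 0      /* data at addresses 10, 11, 12 */
    };

    int pc = 0, acc = 0, running = 1;
    while (running) {               /* fetch-decode-execute cycle */
        int op   = mem[pc];
        int addr = mem[pc + 1];
        pc += 2;
        switch (op) {
        case LOAD:  acc = mem[addr];  break;
        case ADD:   acc += mem[addr]; break;
        case STORE: mem[addr] = acc;  break;
        case HALT:  running = 0;      break;
        }
    }
    printf("mem[12] = %d\n", mem[12]);  /* prints 5 */
    return 0;
}

Because the program sits in ordinary memory, it can be loaded, replaced or even modified like any other data, which is exactly what distinguished these machines from fixed-program calculators like the original ENIAC.
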
One of the projects to commence in 1946 was the construction of the IAS computer at the Institute for Advanced Study at Princeton. The IAS computer used a random access electrostatic storage system and parallel binary arithmetic. It was very fast when compared with the delay line computers, with their sequential memories and serial arithmetic.

The Princeton group was liberal with information about their computer, and before long many universities around the world were building their own close copies. One of these was the SILLIAC at Sydney University in Australia.

I have written an emulator for SILLIAC. You can find it here, along with a link to a copy of the SILLIAC Programming Manual.

Second Generation Computers

The transition from the first to the second generation of computers was not abrupt. There was all-round development in technology, design and programming languages. Diode and transistor technology formed the basis of the electronic switches, and the switching time came down to around 0.3 microseconds.

Computers like the TRADIC and TX-0, built in 1954, used this technology. During this span, the superior magnetic core memory came into use. Significant innovations of this era were floating-point units for real-number calculations and index registers for controlling loops, which spared programmers the ordeal of writing self-modifying code and made access to successive elements easy.
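
In modern terms, the index-register pattern is simply a loop whose counter addresses successive elements. A minimal C sketch of the idea (hypothetical, for illustration only):

#include <stdio.h>

int main(void) {
    double v[5] = {1.0, 2.0, 3.0, 4.0, 5.0};
    double sum = 0.0;

    /* The loop variable i plays the role of an index register: the same
     * "add element to sum" instruction is reused for every element, with
     * only the index changing. Without index registers, first-generation
     * programs had to rewrite the address field of the instruction itself
     * on each pass (self-modifying code). */
    for (int i = 0; i < 5; i++)
        sum += v[i];

    printf("sum = %f\n", sum);  /* 15.000000 */
    return 0;
}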

In the field of programming languages, there were important introductions: FORTRAN (1956), ALGOL (1958) and COBOL (1959). The second generation also witnessed the development of two supercomputers, i.e. the most powerful machines of their day. These two were the Livermore Atomic Research Computer (LARC) and the IBM 7030. These machines overlapped memory operations with processor operations and had a primitive form of parallel processing. Important commercial machines of this era were the IBM 704, 709 and 7094, the last of which introduced I/O processing.

Third Generation of Computers 

In this era, there were several innovations in various fields of computer technology. These include integrated circuits (ICs), semiconductor memories, microprogramming, various patterns of parallel processing, and the introduction of operating systems and time-sharing. In integrated circuits, progress was gradual: first came small-scale integration (SSI) circuits, with around 10 devices per chip, which evolved into medium-scale integration (MSI) circuits, with around 100 devices per chip. Multi-layered printed circuits were also developed.


Parallelism became the trend of the time, with abundant use of multiple functional units, overlapping of CPU and I/O operations, and internal parallelism in both the instruction and the data streams. Functional parallelism was first embodied in the CDC 6600, which contained 10 simultaneously operating functional units and 32 independent memory banks. This machine of Seymour Cray's delivered 1 million floating-point operations per second (1 MFLOPS). Five years later came the CDC 7600, the first vector processor, also developed by Cray, which boasted a speed of 10 MFLOPS. The IBM 360/91 was a contemporary machine roughly twice as fast as the CDC 6600, while the IBM 360/195 was comparable to the CDC 7600.

In languages, this era witnessed the development of CPL, the Combined Programming Language (1963). CPL had many difficult features, so in order to simplify it Martin Richards developed BCPL, the Basic Combined Programming Language (1967). In 1970 Ken Thompson developed a further simplification and called it B.
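
The MFLOPS figure quoted for these machines simply counts floating-point operations completed per second. A rough, hypothetical C sketch of measuring that metric (not a serious benchmark; real measurement has to be far more careful about timers, pipelines and compiler optimization):

#include <stdio.h>
#include <time.h>

int main(void) {
    const long n = 50 * 1000 * 1000;
    double a = 1.000000001, x = 0.0;

    clock_t t0 = clock();
    for (long i = 0; i < n; i++)
        x = x * a + 1.0;            /* 2 floating-point ops per pass */
    clock_t t1 = clock();

    /* MFLOPS = floating-point operations / seconds / 10^6 */
    double secs = (double)(t1 - t0) / CLOCKS_PER_SEC;
    printf("x = %g, about %.1f MFLOPS\n", x, 2.0 * n / secs / 1e6);
    return 0;
}

By this yardstick a modest modern machine reaches thousands of MFLOPS on one core, which puts the CDC 6600's 1 MFLOPS in perspective.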

Fourth Generation of Computers 

In this generation, there was development of large-scale integration, or LSI (about 1,000 devices per chip), and very-large-scale integration, or VLSI (about 10,000 devices per chip). These developments enabled an entire processor to fit onto a single chip; in fact, for simple systems, an entire computer with processor, main memory and I/O controllers could fit on a single chip.

Core memories were now replaced by semiconductor memories, and high-speed vector machines dominated the scene. A few such machines were the Cray-1, Cray X-MP and Cyber 205. A variety of parallel architectures were developed too, but they were mostly still experimental.
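
Vector machines like the Cray-1 were built around loops of one particular shape; the classic example is SAXPY (y = a*x + y), shown here as a plain C sketch of a loop that a vector unit would execute across many elements at once:

#include <stdio.h>

/* SAXPY, the archetypal vector operation: y[i] = a * x[i] + y[i].
 * On a vector machine the whole loop maps onto vector registers and a
 * chained multiply-add instruction stream, rather than one scalar
 * operation per iteration. */
void saxpy(int n, float a, const float *x, float *y) {
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

int main(void) {
    float x[4] = {1, 2, 3, 4};
    float y[4] = {10, 10, 10, 10};
    saxpy(4, 2.0f, x, y);
    for (int i = 0; i < 4; i++)
        printf("%g ", y[i]);        /* prints: 12 14 16 18 */
    printf("\n");
    return 0;
}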


As far as programming languages are concerned, there was development of high-level languages like FP (functional programming) and PROLOG (programming in logic). These languages were based on a declarative programming style, in which a programmer can leave many details to the compiler or runtime system. Languages like Pascal and C, by contrast, used an imperative style. Two other conspicuous developments of this era were the C programming language and the UNIX operating system. Dennis Ritchie, the creator of C, and Ken Thompson together used C to rewrite UNIX for the DEC PDP-11, and this C-based UNIX was then widely used on many computers.
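
The stylistic contrast can be hinted at even within C. Here the same function appears twice: an imperative loop that spells out state changes step by step, and a recursive definition that, in the spirit of the declarative languages above, states what the result is and leaves the evaluation to the compiler (a hypothetical illustration, not drawn from FP or PROLOG themselves):

#include <stdio.h>

/* Imperative style: an explicit sequence of state changes. */
long fact_imperative(int n) {
    long r = 1;
    for (int i = 2; i <= n; i++)
        r *= i;
    return r;
}

/* More declarative style: a definition of what factorial *is*,
 * with no visible mutable state. */
long fact_declarative(int n) {
    return n <= 1 ? 1 : n * fact_declarative(n - 1);
}

int main(void) {
    printf("%ld %ld\n", fact_imperative(10), fact_declarative(10));
    return 0;
}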

Another noteworthy event was the publication in 1982 of a report by Peter D. Lax, sponsored by the US Department of Defense and the National Science Foundation. The Lax report, as it was called, emphasized the need for initiatives and coordinated national attention in the arena of high-performance computing in the US. The immediate response to the Lax report was the establishment of the NSF Supercomputing Centers. Centers that came up later were the San Diego Supercomputer Center, the National Center for Supercomputing Applications, the Pittsburgh Supercomputing Center, the John von Neumann Center and the Cornell Theory Center. These institutes have been instrumental in providing supercomputer time to students, training them, and helping in the development of software packages.

Fifth Generation of Computers

In this period, computer technology made further advances in parallel processing, which had until then been limited to vector processing and pipelining; now hundreds of processors could all work on various parts of a single program. Systems like the Sequent Balance 8000 were introduced, which connected up to twenty processors to one shared memory module.

This machine was comparable to the DEC VAX-780 in that it ran a general-purpose UNIX system, with each processor working on a different user's job. The Intel iPSC-1, or Hypercube as it was called, took a different approach: it connected each processor to its own memory and used a network interface to connect the processors. With this distributed-memory design, memory posed no further bottleneck, and the largest iPSC-1 was built with 128 processors. Towards the end of the fifth generation, another kind of parallel processing was introduced: data-parallel, or SIMD, machines, in which all the processors operate under the instruction of a single control unit.
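
The shared-memory model of the Sequent Balance survives directly in today's threading APIs. Here is a minimal POSIX-threads sketch in C, where several workers operate on slices of a single shared array; the names and sizes are invented for illustration:

#include <stdio.h>
#include <pthread.h>

/* Shared-memory parallelism in the style of the Sequent Balance: every
 * worker sees the same memory and takes its own slice of the data. A
 * message-passing machine like the iPSC-1 would instead give each
 * worker private memory and exchange data over the network. */

#define N 8
#define WORKERS 4

static double data[N] = {1, 2, 3, 4, 5, 6, 7, 8};  /* one shared array */
static double partial[WORKERS];

static void *worker(void *arg) {
    long id = (long)arg;
    double s = 0.0;
    /* each thread sums its own contiguous slice */
    for (int i = id * (N / WORKERS); i < (id + 1) * (N / WORKERS); i++)
        s += data[i];
    partial[id] = s;
    return NULL;
}

int main(void) {
    pthread_t t[WORKERS];
    for (long id = 0; id < WORKERS; id++)
        pthread_create(&t[id], NULL, worker, (void *)id);

    double total = 0.0;
    for (long id = 0; id < WORKERS; id++) {
        pthread_join(t[id], NULL);
        total += partial[id];
    }
    printf("total = %g\n", total);  /* 36 */
    return 0;
}

Compile with the pthread library (e.g. cc file.c -lpthread). The convenience of one shared memory is exactly what limited scaling: contention for that memory is why designs like the 128-processor iPSC-1 went distributed.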

In this generation, semiconductor memories became the standard and were developed vigorously. Other developments were the increasing use of single-user workstations and the widespread use of computer networks. Both wide area networks (WANs) and local area networks (LANs) developed at an incredible pace and led to a distributed computing environment. RISC technology (a particular technique for the internal organization of the CPU) and the plunging cost of RAM ushered in huge gains in the computational power of comparatively cheap servers and workstations. This generation also witnessed a sharp increase in both the quantitative and qualitative aspects of scientific visualization.


Networking technology spread rapidly, and one of the most conspicuous developments of this generation of computer technology has been the huge growth of WANs. For regional networks, T1 became the standard, and the national "backbone" uses T3 to interconnect the regional networks. Finally, the rapid advancement and high level of awareness regarding computer technology owes a great deal to two pieces of legislation. Like the Lax report of 1982, the High Performance Computing Act of 1991 and the Information Infrastructure and Technology Act of 1992 have strengthened and ensured the scope of high-performance computing. The former established the High Performance Computing and Communications Program (HPCCP), and the latter reinforced the necessity of making leading-edge technologies available to students from kindergarten through to graduate level.
