
Monday, December 12, 2016

Computer Overview

What is a Computer?

A computer is an electronic device that manipulates information, or data. It has the ability to store, retrieve, and process data. You may already know that you can use a computer to type documents, send email, play games, and browse the Web. You can also use it to edit or create spreadsheets, presentations, and even videos.

Short History About Computers

The computer as we know it today had its beginning with a 19th-century English mathematics professor named Charles Babbage.
He designed the Analytical Engine, and it is on this design that the basic framework of today's computers is based.
Generally speaking, computers can be classified into several "generations". Each generation lasted for a certain period of time, and each gave us either a new and improved computer or an improvement to an existing one.
These generations are:
The First Generation (1943-1958): This generation is often described as starting with the delivery of the first commercial computer to a business client. This happened in 1951 with the delivery of the UNIVAC to the US Bureau of the Census. This generation lasted until about the end of the 1950's (although some stayed in operation much longer than that). The main defining feature of the first generation of computers was that vacuum tubes were used as internal computer components. Vacuum tubes are generally about 5-10 centimeters in length and the large numbers of them required in computers resulted in huge and extremely expensive machines that often broke down (as tubes failed).
The Second Generation (1959-1964): In the mid-1950's Bell Labs developed the transistor. Transistors were capable of performing many of the same tasks as vacuum tubes but were only a fraction of the size. The first transistor-based computer was produced in 1959. Transistors were not only smaller, enabling computer size to be reduced, but they were faster, more reliable and consumed less electricity.
The other main improvement of this period was the development of computer languages. Assembler languages, or symbolic languages, allowed programmers to specify instructions in words (albeit very cryptic words) which were then translated into a form that the machines could understand (typically a series of 0's and 1's: binary code). Higher-level languages also came into being during this period. Whereas assembler languages had a one-to-one correspondence between their symbols and actual machine functions, higher-level language commands often represent complex sequences of machine codes. Two higher-level languages developed during this period (Fortran and Cobol) are still in use today, though in a much more developed form.
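The one-to-one idea can be sketched with a toy example. The mnemonics and opcodes below are invented purely for illustration (they do not belong to any real machine); the point is only that a symbolic assembler translates each mnemonic directly into exactly one machine code, whereas a higher-level language statement may expand into many:

```python
# Toy symbolic assembler: each mnemonic maps to exactly one machine
# opcode -- the one-to-one correspondence described above.
# (These mnemonics and 4-bit opcodes are invented for illustration.)
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

def assemble(program):
    """Translate a list of symbolic instructions into binary opcode strings."""
    return [OPCODES[mnemonic] for mnemonic in program]

# One symbolic instruction in, one machine code out:
print(assemble(["LOAD", "ADD", "STORE"]))  # one opcode per mnemonic
```

A higher-level statement such as `c = a + b`, by contrast, would typically be compiled into a whole sequence of such codes (load, add, store), which is what the paragraph above means by commands representing "complex sequences of machine codes".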
The Third Generation (1965-1970): In 1965 the first integrated circuit (IC) was developed, in which a complete circuit of hundreds of components could be placed on a single silicon chip 2 or 3 mm square. Computers using these IC's soon replaced transistor-based machines. Again, one of the major advantages was size, with computers becoming more powerful and at the same time much smaller and cheaper. Computers thus became accessible to a much larger audience. An added advantage of smaller size is that electrical signals have much shorter distances to travel, and so the speed of computers increased.
Another feature of this period is that computer software became much more powerful and flexible and for the first time more than one program could share the computer's resources at the same time (multi-tasking). The majority of programming languages used today are often referred to as 3GL's (3rd generation languages) even though some of them originated during the 2nd generation.
The Fourth Generation (1971-present): The boundary between the third and fourth generations is not very clear-cut at all. Most of the developments since the mid 1960's can be seen as part of a continuum of gradual miniaturisation. In 1970 large-scale integration was achieved where the equivalent of thousands of integrated circuits were crammed onto a single silicon chip. This development again increased computer performance (especially reliability and speed) whilst reducing computer size and cost. Around this time the first complete general-purpose microprocessor became available on a single chip. In 1975 Very Large Scale Integration (VLSI) took the process one step further. Complete computer central processors could now be built into one chip. The microcomputer was born. Such chips are far more powerful than ENIAC and are only about 1cm square whilst ENIAC filled a large building.
During this period Fourth Generation Languages (4GL's) have come into existence. Such languages are a step further removed from the computer hardware in that they use language much like natural language. Many database languages can be described as 4GL's. They are generally much easier to learn than are 3GL's.
The Fifth Generation (the future): The "fifth generation" of computers was defined by the Japanese government in 1980, when it unveiled an optimistic ten-year plan to produce the next generation of computers. This was an interesting plan for two reasons. Firstly, it was not at all clear what the fourth generation was, or even whether the third generation had finished yet. Secondly, it was an attempt to define a generation of computers before they had come into existence. The main requirement of the 5G machines was that they incorporate the features of Artificial Intelligence, Expert Systems, and Natural Language. The goal was to produce machines that are capable of performing tasks in similar ways to humans, are capable of learning, and are capable of interacting with humans in natural language, preferably using both speech input (speech recognition) and speech output (speech synthesis). Such goals are obviously of interest to linguists and speech scientists, as natural language and speech processing are key components of the definition. As you may have guessed, this goal has not yet been fully realized, although significant progress has been made towards various aspects of these goals.
To read more about computer history, see Jeremy Meyers or Moddy.
