Friday, September 30, 2011

History of Computers



            Early computers were large machines that could fill entire rooms. Many were built from vacuum tubes, which performed the switching and amplifying roles later taken over by transistors. To operate such machines, punch cards were used, an idea famously pioneered by the Jacquard loom.

            In 1822 Charles Babbage designed his Difference Engine, an early mechanical calculator.

            In the 1830s he conceived the Analytical Engine, a general-purpose design that combined his calculating machinery with the punch card idea.

Here are some computers that came and went in the history of computing.

            ENIAC

            A behemoth of a machine weighing 27 tons, ENIAC stood for Electronic Numerical Integrator and Computer.

            The ENIAC used thousands of vacuum tubes and a punch card mechanism.

    •  Although it was designed to calculate artillery firing tables, its first task was performing calculations for the hydrogen bomb.

    •  Working out the programming on paper took weeks, and performing the necessary wiring took days. The ENIAC saw service until October 2, 1955.

            Macintosh

    •  First introduced by Apple in 1984, the Macintosh was the first commercially successful computer to use a mouse and graphical user interface (GUI) rather than a command line interface.

    •  In the face of the IBM PC's dominance, the Macintosh found its niche primarily as a desktop publishing tool.

    •  However, due to the immense cost of porting command line programs to the GUI, software development for the Macintosh was initially slow.
            IBM PC

    •  The father of all current personal computers, the IBM PC was introduced in 1981.

    •  It was capable of running three different operating systems at launch, the most popular being PC DOS.

            The IBM PC introduced the concept of the BIOS (Basic Input/Output System). Although the BIOS was proprietary at the time, it has since been reverse-engineered and is considered the de facto standard in firmware interfacing. Because of the PC's success, many manufacturers were encouraged to create clones with the same feature set, the ancestors of the PC-compatible computers we use today.

Generations of Computers



Computer history is divided into five generations:
        1st Generation
        2nd Generation
        3rd Generation
        4th Generation
        5th Generation

1st Generation

From 1940 to 1956

  •             The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms.

  •             They were very expensive to operate.

  •             They used a great deal of electricity and generated a lot of heat, which was often the cause of malfunctions.

First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
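To make "machine language" concrete, here is a toy sketch in Python. The opcodes and the two-field instruction format are invented for illustration, not taken from any real first-generation machine, but the essential point is the same: a program is nothing but a sequence of numbers, such as might be punched onto a card.

```python
# Toy illustration of machine language: a program is just numbers.
# The opcodes below are hypothetical, made up for this example.

LOAD, ADD, HALT = 0x01, 0x02, 0xFF  # invented numeric opcodes

def run(program):
    """Execute a list of (opcode, operand) pairs on a one-register machine."""
    accumulator = 0
    for opcode, operand in program:
        if opcode == LOAD:      # put a value in the accumulator
            accumulator = operand
        elif opcode == ADD:     # add a value to the accumulator
            accumulator += operand
        elif opcode == HALT:    # stop; the answer is in the accumulator
            break
    return accumulator

# A "punched card" for computing 2 + 3: purely numeric instructions.
card = [(LOAD, 2), (ADD, 3), (HALT, 0)]
print(run(card))  # -> 5
```

Like the machines described above, this toy computer solves exactly one problem per run: to compute anything else, you prepare a different "card".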

2nd Generation
From 1956 to 1963

    •             Transistors replaced vacuum tubes and ushered in the second generation of computers.

    •             The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s.

    •             The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors.

Though the transistor still generated a great deal of heat that could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.


3rd Generation
From 1964 to 1971

  •             The development of the integrated circuit was the hallmark of the third generation of computers.

  •             Transistors were miniaturized and placed on chips of silicon, a semiconductor, which drastically increased the speed and efficiency of computers.

    • Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory.

    •  Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
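The idea of a central program running many applications "at one time" can be sketched with a toy round-robin scheduler. This is not how a real third-generation operating system was implemented; it is a minimal illustration, using Python generators as stand-ins for applications, of one program interleaving several others.

```python
# Toy sketch of an OS-style central program interleaving several "applications".
from collections import deque

def app(name, steps):
    """A toy application that yields control back to the scheduler each step."""
    for i in range(steps):
        yield f"{name} step {i}"

def scheduler(apps):
    """Run each application a little at a time until all have finished."""
    ready = deque(apps)
    log = []
    while ready:
        current = ready.popleft()
        try:
            log.append(next(current))   # let the app run one step
            ready.append(current)       # then put it back in line
        except StopIteration:
            pass                        # the app has finished
    return log

log = scheduler([app("editor", 2), app("compiler", 3)])
print(log)
```

Each application runs one step per turn, so the output interleaves "editor" and "compiler" steps rather than running one program to completion before starting the next.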

4th Generation
From 1971 to Present

  •             The microprocessor brought in the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip.

What in the first generation filled an entire room could now fit in the palm of the hand.

    •  In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh.

  •             Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.

    •  As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet.

    •  Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.

5th Generation





Present and Beyond

    • Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today.
    • The use of parallel processing and superconductors is helping to make artificial intelligence a reality.
    •  The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.





Basic Definitions of Computers



  •             A computer is a programmable machine

  •             Receives input through input devices

  •             Processes the information through processing devices

  •             Stores the information on storage devices

  •             Provides output through output devices
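The input → process → store → output model listed above can be sketched in a few lines of Python. The functions here are plain stand-ins invented for illustration (there is no real device access); the point is only the flow of data between the four roles.

```python
# Minimal sketch of the input -> process -> store -> output model,
# with plain Python functions standing in for the hardware devices.

storage = []                 # stand-in for a storage device

def receive_input(data):     # stand-in for an input device (e.g. keyboard)
    return data

def process(data):           # stand-in for the processor
    return data.upper()

def store(result):           # write the result to "storage"
    storage.append(result)

def output(result):          # stand-in for an output device (e.g. screen)
    return f"OUTPUT: {result}"

data = receive_input("hello")
result = process(data)
store(result)
print(output(result))        # -> OUTPUT: HELLO
```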

Computer and its Peripherals


  • Definition of a Computer.
            A device that computes, especially a programmable electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information.
           
  • Definition of a Peripheral.
            A piece of electronic equipment connected by cable to the CPU of a computer; "disk drives and printers are important peripherals".
Computer: A machine that can receive and store information and change or process it.

Information: Knowledge that is communicated.

Data: The representation of information in a formalized manner suitable for communication, interpretation and processing, generally by a computer system.


            Note: the term ‘raw data’ refers to unprocessed information.









Thursday, September 22, 2011

The Computer Generations


In the beginning ...
        A generation refers to the state of improvement in the development of a product.  The term is also used for the different advancements of computer technology.  With each new generation, the circuitry has gotten smaller and more advanced than in the previous generation.  As a result of the miniaturization, the speed, power, and memory of computers have proportionally increased.  New discoveries are constantly being made that affect the way we live, work and play.
The First Generation:  1946-1958 (The Vacuum Tube Years)
        The first generation computers were huge, slow, expensive, and often undependable.  In 1946 two Americans, J. Presper Eckert and John Mauchly, built the ENIAC, an electronic computer which used vacuum tubes instead of the mechanical switches of the Mark I.  The ENIAC used thousands of vacuum tubes, which took up a lot of space and gave off a great deal of heat, just like light bulbs do.  The ENIAC led to other vacuum tube computers like the EDVAC (Electronic Discrete Variable Automatic Computer) and the UNIVAC I (UNIVersal Automatic Computer).
        The vacuum tube was an extremely important step in the advancement of computers.  A descendant of the incandescent light bulb pioneered by Thomas Edison, the vacuum tube worked very much like a light bulb.  Its purpose was to act as an amplifier and a switch.  Without any moving parts, vacuum tubes could take very weak signals and make them stronger (amplify them), and could also stop and start the flow of electricity instantly (switch).  These two properties made the ENIAC computer possible.
        The ENIAC gave off so much heat that it had to be cooled by gigantic air conditioners.  However, even with these huge coolers, vacuum tubes still overheated regularly.  It was time for something new.
The Second Generation:  1959-1964 (The Era of the Transistor)
        The transistor computer did not last as long as the vacuum tube computer, but it was no less important in the advancement of computer technology.  In 1947 three scientists, John Bardeen, William Shockley, and Walter Brattain, working at AT&T's Bell Labs, invented what would replace the vacuum tube forever.  This invention was the transistor, which functions like a vacuum tube in that it can be used to relay and switch electronic signals.
        There were obvious differences between the transistor and the vacuum tube.  The transistor was faster, more reliable, smaller, and much cheaper to build than a vacuum tube.  One transistor replaced the equivalent of 40 vacuum tubes.  These transistors were made of solid material, such as silicon, an abundant element (second only to oxygen) found in beach sand and glass, so they were very cheap to produce.  Transistors were found to conduct electricity faster and better than vacuum tubes.  They were also much smaller and gave off virtually no heat compared to vacuum tubes.  Their use marked a new beginning for the computer.  Without this invention, space travel in the 1960s would not have been possible.  However, a new invention would even further advance our ability to use computers.
 The Third Generation:  1965-1970 (Integrated Circuits - Miniaturizing the Computer)
       Transistors were a tremendous breakthrough in advancing the computer.  However, no one could predict that thousands, even millions, of transistors (circuits) could be compacted into such a small space.  The integrated circuit, sometimes referred to as a semiconductor chip, packs a huge number of transistors onto a single wafer of silicon.  Robert Noyce of Fairchild Corporation and Jack Kilby of Texas Instruments independently discovered the amazing attributes of integrated circuits.  Placing such large numbers of transistors on a single chip vastly increased the power of a single computer and lowered its cost considerably.
        Since the invention of integrated circuits, the number of transistors that can be placed on a single chip has doubled every two years, shrinking both the size and cost of computers even further and further enhancing their power.  Most electronic devices today use some form of integrated circuits placed on printed circuit boards -- thin pieces of Bakelite or fiberglass that have electrical connections etched onto them -- sometimes called a motherboard.
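The doubling described above compounds dramatically. As a back-of-the-envelope sketch (the 2,300-transistor starting point is the approximate count of Intel's first microprocessor in 1971, and the exact doubling period is an idealization), Python can project what "doubles every two years" implies over three decades:

```python
# Back-of-the-envelope projection of "transistor count doubles every two years".
# Starting point: roughly 2,300 transistors on a 1971 chip (an approximation).

start_year, start_count = 1971, 2300

def projected_transistors(year):
    """Transistor count if it doubles every two years from the starting point."""
    doublings = (year - start_year) / 2
    return start_count * 2 ** doublings

# 30 years (15 doublings) later: tens of millions of transistors.
print(round(projected_transistors(2001)))
```

Fifteen doublings multiply the count by 2^15 = 32,768, turning a few thousand transistors into tens of millions, which is the right order of magnitude for chips around the year 2000.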

        These third generation computers could carry out instructions in billionths of a second.  The size of these machines dropped to the size of small file cabinets. Yet, the single biggest advancement in the computer era was yet to be discovered.
The Fourth Generation:  1971-Today (The Microprocessor)
        This generation can be characterized by both the jump to monolithic integrated circuits (millions of transistors put onto one integrated circuit chip) and the invention of the microprocessor (a single chip that could do all the processing of a full-scale computer).  Putting millions of transistors onto one single chip gave computers more calculating power and faster speeds.  Because electricity travels about a foot in a billionth of a second, the smaller the distances inside the chip, the greater the speed of the computer.
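The "a foot in a billionth of a second" figure is easy to verify: a signal travelling at (or near) the speed of light covers roughly 30 cm in one nanosecond. A quick check:

```python
# Verifying that a signal at the speed of light covers about a foot
# (0.3048 m) in one nanosecond.

SPEED_OF_LIGHT = 299_792_458   # metres per second
NANOSECOND = 1e-9              # seconds

distance_m = SPEED_OF_LIGHT * NANOSECOND
print(f"{distance_m:.3f} m, i.e. {distance_m / 0.3048:.2f} feet")
```

So in one clock tick of a 1 GHz processor, a signal can travel at most about 30 cm, which is why shrinking the chip shortens signal paths and directly raises the achievable speed.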
        However, what really triggered the tremendous growth of computers and their significant impact on our lives was the invention of the microprocessor.  Ted Hoff, employed by Intel (Robert Noyce's new company), invented a chip the size of a pencil eraser that could do all the computing and logic work of a computer.  The microprocessor was made to be used in calculators, not computers.  It led, however, to the invention of personal computers, or microcomputers.
        It wasn't until the 1970s that people began buying computers for personal use.  One of the earliest personal computers was the Altair 8800 computer kit.  In 1975 you could purchase this kit and put it together to make your own personal computer.  In 1977 the Apple II was sold to the public, and in 1981 IBM entered the PC (personal computer) market.
        Today we have all heard of Intel and its Pentium® processors, and now we know how it all got started.  The computers of the next generation will have millions upon millions of transistors on one chip and will perform over a billion calculations in a single second.  There is no end in sight for the computer movement.