Chapter 1 Introduction to Computer Science
1.1 History of Computer Science
The beginnings of the modern science that we call “Computer Science” can be traced back to ancient times. In Asia, the Chinese were heavily involved in commerce with the Japanese, Indians, and Koreans, and merchants needed a way to tally accounts and bills. Out of this need, the abacus was born: the first true precursor to the adding machines and computers that would follow. For over a thousand years after the Chinese invented the abacus, little progress was made in automating counting and mathematics. The Greeks produced numerous mathematical formulae and theorems, but all of the newly discovered mathematics had to be worked out by hand. Most of the tables of integrals, logarithms, and trigonometric values were worked out this way, their accuracy unchecked until machines could generate the tables in far less time and with greater accuracy than a team of humans could ever hope to achieve.
Blaise Pascal, a noted mathematician, thinker, and scientist, built the first mechanical adding machine in 1642, based on a design described by Hero of Alexandria for adding up the distance a carriage travelled. The basic principle of his calculator is still used today in water meters and modern-day odometers. This first mechanical calculator, called the Pascaline and shown in Fig.1-1, had several disadvantages. Although it offered a substantial improvement over manual calculation, only Pascal himself could repair the device, and it cost more than the people it replaced! In addition, the first signs of technophobia emerged, with mathematicians fearing the loss of their jobs to the new machine.
Fig.1-1 Pascal calculating machine
The Arithmometer, as shown in Fig.1-2, was the first mechanical calculator strong enough and reliable enough to be used daily in an office environment. This calculator could add and subtract two numbers directly and could perform long multiplications and divisions effectively by using a movable accumulator for the result. Patented in France by Thomas de Colmar in 1820 and manufactured from 1851 to 1915, it became the first commercially successful mechanical calculator.
Fig.1-2 Arithmometer
While Thomas de Colmar was developing the first commercially successful calculator, Charles Babbage realized as early as 1812 that many long computations consisted of operations that were regularly repeated. He theorized that it must be possible to design a calculating machine that could perform these operations automatically. He produced a prototype of this “difference engine” by 1822 and, with the help of the British government, started working on the full machine in 1823. It was intended to be steam-powered; fully automatic, even to the printing of the resulting tables; and commanded by a fixed instruction program. The machine used the decimal number system, and the prototype was powered by cranking a handle. The British government was interested, since producing tables was time-consuming and expensive, and it hoped the difference engine would make the task more economical[3].
In 1833, Babbage ceased working on the difference engine because he had a better idea: an “analytical engine.” The analytical engine was to be a true parallel decimal computer, operating on words of 50 decimal digits and able to store 1,000 such numbers. The machine would include a number of built-in operations, such as conditional control, which allowed the instructions for the machine to be executed in a specific order rather than strictly in numerical order. The instructions for the machine were to be stored on punched cards, similar to those used on a Jacquard loom.
A step toward automated computation was the introduction of punched cards, which were first successfully used in connection with computing in 1890 by Herman Hollerith, working for the US Census Bureau. He developed a device which could automatically read census information that had been punched onto cards. Surprisingly, he did not get the idea from the work of Babbage, but rather from watching a train conductor punch tickets. As a result of his invention, reading errors were greatly reduced, work flow increased, and, more importantly, stacks of punched cards could be used as an accessible memory store of almost unlimited capacity; furthermore, different problems could be stored on different batches of cards and worked on as needed. Hollerith’s tabulator became so successful that he started his own firm to market the device; this company eventually became International Business Machines (IBM).
Hollerith’s machine, though, had limitations: it was strictly limited to tabulation, and the punched cards could not be used to direct more complex computations. In 1941, Konrad Zuse, a German engineer who had already developed a number of calculating machines, released the first programmable computer designed to solve complex engineering equations. The machine, called the Z3, was controlled by perforated strips of discarded movie film. As well as being controllable by these celluloid strips, it was also the first machine to work on the binary system, as opposed to the more familiar decimal system. Binary representation proved to be important in the future design of computers, which took advantage of a multitude of two-state devices such as card readers, electric circuits which could be on or off, and vacuum tubes.
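To see why two-state devices suffice, note that any number can be represented by a row of elements that are each either on (1) or off (0). The short Python sketch below is purely illustrative (it is not modeled on the Z3’s actual circuitry): it prints a decimal number in binary and then rebuilds the value from its individual on/off states.

    number = 13
    print(format(number, "b"))   # prints 1101, the binary form of 13

    # Rebuild the value from its on/off states (most significant bit first).
    bits = [1, 1, 0, 1]
    value = sum(bit * 2 ** i for i, bit in enumerate(reversed(bits)))
    print(value)                 # prints 13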
By the late 1930s, punched-card machine techniques had become so well established and reliable that a large automatic digital computer, called the Harvard Mark I, was constructed. It could handle 23-decimal-place numbers and perform all four arithmetic operations; moreover, it had special built-in programs, or subroutines, to handle logarithms and trigonometric functions. Meanwhile, the British mathematician Alan Turing had written a paper in 1936, entitled On Computable Numbers, in which he described a hypothetical device, the Turing machine, which presaged programmable computers. The Turing machine was designed to perform logical operations and could read, write, or erase symbols written on the squares of an infinite paper tape. Its control unit was a finite state machine: at each step in a computation, the machine’s next action was determined by matching its current state and the symbol under its head against a finite instruction list.
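A minimal simulation makes the idea concrete. The Python sketch below defines a hypothetical example machine (not one from Turing’s paper): its finite instruction table maps each (state, symbol) pair to a symbol to write, a head movement, and a next state, and this particular table simply flips every bit on the tape before halting.

    # Transition table: (state, symbol read) -> (symbol to write, move, next state)
    transitions = {
        ("flip", "0"): ("1", +1, "flip"),
        ("flip", "1"): ("0", +1, "flip"),
        ("flip", " "): (" ", 0, "halt"),   # blank square: nothing left, stop
    }

    def run(tape_input):
        tape = dict(enumerate(tape_input))   # sparse tape; blank elsewhere
        state, head = "flip", 0
        while state != "halt":
            symbol = tape.get(head, " ")
            write, move, state = transitions[(state, symbol)]
            tape[head] = write
            head += move
        return "".join(tape[i] for i in sorted(tape)).strip()

    print(run("10110"))   # prints 01001

Any table of this form, however long, still lists only finitely many (state, symbol) cases, which is exactly the sense in which the machine’s control is finite even though its tape is not.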
Back in America, John W. Mauchly and J. Presper Eckert of the University of Pennsylvania built the giant ENIAC machine. ENIAC contained 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and around 5 million hand-soldered joints. It weighed more than 30 short tons (27 t), was roughly 8 by 3 by 100 feet (2.4m × 0.9m × 30m), took up 1800 square feet (167m2), and consumed 150kW of power[4]. This led to the rumor that whenever the computer was switched on, the lights in Philadelphia dimmed. ENIAC is generally acknowledged to be the first successful high-speed electronic digital computer; it was efficient in handling the particular programs for which it had been designed, and it was productively used from 1946 to 1955.
In 1945, the mathematician John von Neumann contributed a new understanding of how practical fast computers should be organized and built; these ideas, often referred to as the stored-program technique, became fundamental for future generations of high-speed digital computers and were universally adopted. The primary advance was twofold: a special type of machine instruction called conditional control transfer, which permitted the program sequence to be interrupted and reinitiated at any point, and the storing of all instruction programs in the same memory as data, so that instructions could be arithmetically modified just as data could. As a result, frequently used subroutines did not have to be reprogrammed for each new problem but could be kept intact in “libraries” and read into memory when needed. The computer’s control unit served as an errand runner for the overall process. The first-generation stored-program computers required considerable maintenance, attained perhaps 70% to 80% reliable operation, and were used for 8 to 12 years. Typically, they were programmed directly in machine language, although by the mid-1950s progress had been made in several aspects of advanced programming. This group of machines included EDVAC and UNIVAC, the first commercially available computers.
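The toy interpreter below, written in Python with a made-up instruction set (not that of EDVAC or UNIVAC), illustrates both ideas at once: instructions and data share one memory, and a conditional control transfer lets execution jump back instead of proceeding in strict numerical order. The program counts down from a data word stored alongside the code.

    memory = [
        ("load", 7),        # 0: acc <- memory[7], the data word below the code
        ("add", -1),        # 1: acc <- acc - 1
        ("store", 7),       # 2: memory[7] <- acc (instructions rewrite memory)
        ("print", None),    # 3: output the accumulator
        ("jump_if_pos", 0), # 4: conditional control transfer back to address 0
        ("halt", None),     # 5: stop
        None,               # 6: unused cell
        3,                  # 7: a data word, stored in the same memory
    ]

    acc, pc = 0, 0
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "load":
            acc = memory[arg]
        elif op == "add":
            acc += arg
        elif op == "store":
            memory[arg] = acc
        elif op == "print":
            print(acc)
        elif op == "jump_if_pos" and acc > 0:
            pc = arg
        elif op == "halt":
            break
    # Prints 2, then 1, then 0, and halts.

Because the loop body lives in the same memory as the counter it updates, a program could in principle modify its own instructions, which is what made reusable subroutine libraries practical.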
BASIC (Beginner’s All-purpose Symbolic Instruction Code) was originally developed in 1963 by Thomas Kurtz and John Kemeny; it was designed to provide an interactive, easy method for aspiring computer scientists to program computers. By this time, a number of other specialized and general-purpose languages had been developed. A surprising number of today’s popular languages have actually been around since the 1950s. FORTRAN, developed by a team of IBM programmers, was one of the first high-level languages, in which the programmer does not have to deal with the machine code of 0s and 1s. It was designed to express scientific and mathematical formulas. COBOL was developed in 1960 by a joint committee. It was designed to produce applications for the business world and took the novel approach of separating the data descriptions from the actual program. In the late 1960s, a Swiss computer scientist, Niklaus Wirth, released Pascal, which forced programmers to program in a structured, logical fashion and to pay close attention to the different types of data in use.
Operating systems are the interface between the user and the computer. Windows is one of the numerous graphical user interfaces that allow the user to manipulate their environment using a mouse and icons. Other examples of Graphical User Interfaces (GUIs) include the X Window System, which runs on UNIX® machines, and Mac OS X, the operating system of the Macintosh. An application is any program that a computer runs that enables you to get things done. This includes things like word processors for creating text, graphics packages for drawing pictures, and communication packages for moving data around the globe.
The Web was developed at CERN (the European Organization for Nuclear Research) in Switzerland in the late 1980s. As a new form of communicating text and graphics across the Internet, it makes use of the HyperText Markup Language (HTML) as a way to describe the attributes of the text and the placement of graphics, sounds, or even movie clips. Since it was first introduced, the number of users has blossomed, and the number of sites containing information and searchable archives has been growing at an unprecedented rate.