Computer Programming

by Jntu Heroes
GE6151 - COMPUTER PROGRAMMING

UNIT I - INTRODUCTION

Generation and Classification of Computers - Basic Organization of a Computer - Number System - Binary - Decimal - Conversion - Problems. Need for logical analysis and thinking - Algorithm - Pseudo code - Flow Chart.

LECTURE NOTES

GENERATIONS OF COMPUTERS

The Zeroth Generation

The term "zeroth generation" refers to the period of development of computing that predated the commercial production and sale of computer equipment; it might be dated as extending from the mid-1800s. In particular, this period witnessed the emergence of the first electronic digital computers, such as the ABC (Atanasoff-Berry Computer). The EDVAC was especially significant, since it was among the first machines to fully implement the idea of the stored program and serial execution of instructions, and its development set the stage for the evolution of commercial computing and operating system software. The hardware component technology of this period was the electronic vacuum tube.

The actual operation of these early computers took place without the benefit of an operating system. Early programs were written in machine language, and each contained the code for initiating operation of the computer itself. This arrangement was clearly inefficient and depended on the varying competence of the individual programmers, who also served as operators.

The First Generation, 1951-1956

The first generation marked the beginning of commercial computing and was characterized by the high-speed vacuum tube as the active component technology. Operation continued without the benefit of an operating system for a time. This mode was called the "closed shop" and was characterized by the appearance of hired operators who would select a job to be run, perform the initial program load of the system, run the user's program, and then select another job, and so forth.

Programs began to be written in higher-level, procedure-oriented languages, and so the operator's routine expanded. The operator now selected a job, ran the translation program to assemble or compile the source program, combined the translated object program with any existing library programs it might need as input to the linking program, loaded and ran the composite linked program, and then handled the next job in a similar fashion.

Application programs were run one at a time and were translated with absolute computer addresses. There was no provision for moving a program to a different location in storage for any reason. Similarly, a program bound to specific devices could not be run at all if any of those devices were busy or broken.

At the same time, the development of programming languages was moving away from basic machine languages: first to assembly language, and later to procedure-oriented languages,
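Picking up the number-system topic listed in the unit overview, the short C sketch below shows the standard digit-by-digit method for converting a binary string to its decimal value. The function name bin_to_dec and the example values are choices made for this illustration; they are not prescribed by the syllabus.

    #include <stdio.h>

    /* Convert a string of '0'/'1' digits to its decimal value by the
       standard positional method: double the running total, add the bit. */
    long bin_to_dec(const char *bits)
    {
        long value = 0;
        for (const char *p = bits; *p != '\0'; p++)
            value = value * 2 + (*p - '0');
        return value;
    }

    int main(void)
    {
        printf("1011     -> %ld\n", bin_to_dec("1011"));     /* 11  */
        printf("11111111 -> %ld\n", bin_to_dec("11111111")); /* 255 */
        return 0;
    }

Each step doubles the value accumulated so far and adds the next bit, which is exactly the positional expansion 1011 = 1x8 + 0x4 + 1x2 + 1x1 = 11.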
the most significant being the development of FORTRAN.

The Second Generation, 1956-1964

The second generation of computer hardware was most notably characterized by transistors replacing vacuum tubes as the hardware component technology. In addition, some very important changes in hardware and software architecture occurred during this period. For the most part, computer systems remained card- and tape-oriented; significant use of random-access devices, that is, disks, did not appear until towards the end of the second generation.

Program processing was, for the most part, provided by large centralized computers operated under mono-programmed batch-processing operating systems. The most significant innovations addressed the problem of excessive central processor delay due to waiting for input/output operations. Recall that programs were executed by processing the machine instructions in strictly sequential order. As a result, the CPU, with its high-speed electronic components, was often forced to wait for the completion of I/O operations involving mechanical devices (card readers and tape drives) that were orders of magnitude slower.

These hardware developments led to enhancements of the operating system. I/O and data-channel communication and control became functions of the operating system, both to relieve the application programmer of the difficult details of I/O programming and to protect the integrity of the system. The operating system also provided improved service to users by segmenting jobs and running shorter jobs first (during "prime time") while relegating longer jobs to lower priority or night-time runs. System libraries became more widely available and more comprehensive as new utilities and application software components were made available to programmers.

The second generation was a period of intense operating system development; it was also the period of sequential batch processing. Researchers began to experiment with multiprogramming and multiprocessing.

The Third Generation, 1964-1979

The third generation officially began in April 1964 with IBM's announcement of its System/360 family of computers. Hardware technology began to use integrated circuits (ICs), which yielded significant advantages in both speed and economy. Operating system development continued with the introduction and widespread adoption of multiprogramming, marked first by the appearance of more sophisticated I/O buffering in the form of spooling operating systems. These systems worked by introducing two new system programs: a system reader to move input jobs from cards to disk, and a system writer to move job output from disk to printer, tape, or cards. The spooling operating system in fact performed multiprogramming, since more than one program was resident in main storage at the same time.

Later, this basic idea of multiprogramming was extended to include more than one active user program in memory at a time. To accommodate this extension, both the scheduler and the dispatcher were enhanced. In addition, memory management became more sophisticated, in order to assure that the program code for each job, or at least the part of the code being executed, was resident in main storage.
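A minimal sketch in C of the reader-writer arrangement described above, assuming a toy fixed-size queue standing in for the disk spool (the names spool, reader_enqueue, and writer_dequeue are invented for this illustration; real spooling systems were far more elaborate):

    #include <stdio.h>

    #define SPOOL_SLOTS 8   /* capacity of the toy disk spool */

    /* A circular queue of job names standing in for job images on disk. */
    static const char *spool[SPOOL_SLOTS];
    static int head = 0, tail = 0, count = 0;

    /* "System reader": moves an input job from the card reader to disk. */
    void reader_enqueue(const char *job)
    {
        if (count == SPOOL_SLOTS) { printf("spool full, %s waits\n", job); return; }
        spool[tail] = job;
        tail = (tail + 1) % SPOOL_SLOTS;
        count++;
        printf("reader: spooled %s to disk\n", job);
    }

    /* "System writer": moves the next job's output from disk to printer. */
    void writer_dequeue(void)
    {
        if (count == 0) { printf("writer: spool empty\n"); return; }
        printf("writer: printing output of %s\n", spool[head]);
        head = (head + 1) % SPOOL_SLOTS;
        count--;
    }

    int main(void)
    {
        reader_enqueue("JOB1");   /* the CPU can run user programs while */
        reader_enqueue("JOB2");   /* these slow card and print transfers */
        writer_dequeue();         /* proceed independently               */
        writer_dequeue();
        writer_dequeue();         /* spool is now empty */
        return 0;
    }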
Users shared not only the system's hardware but also its software resources and file-system disk space. The third generation was an exciting time, indeed, for the development of both computer hardware and the accompanying operating systems. During this period, the topic of operating systems became, in reality, a major element of the discipline of computing.

The Fourth Generation, 1979 - Present

The fourth generation is characterized by the appearance of the personal computer and the workstation. Miniaturization of electronic circuits and components continued, and Large Scale Integration (LSI), the component technology of the third generation, was replaced by Very Large Scale Integration (VLSI), which characterizes the fourth generation. Improvements in hardware miniaturization and technology have evolved so fast that we now have inexpensive workstation-class computers capable of supporting multiprogramming and time-sharing. Hence the operating systems that support today's personal computers and workstations look much like those that were available for the minicomputers of the third generation. Examples are Microsoft's DOS for IBM-compatible personal computers and UNIX for workstations.

However, many of these desktop computers are now connected as networked or distributed systems. Computers in a networked system each have their operating system augmented with communication capabilities that enable users to remotely log into any system on the network and transfer information among the machines connected to it. The machines that make up a distributed system operate as a single virtual processor system from the user's point of view; a central operating system controls, and makes transparent, the location in the system of the particular processor or processors and file systems that are handling any given program.

CLASSIFICATION OF COMPUTERS

There are four classifications of digital computer systems: super-computer, mainframe computer, minicomputer, and microcomputer.

- Super-computers are very fast and powerful machines. Their internal architecture enables them to run at speeds of tens of MIPS (Million Instructions Per Second). Super-computers are very expensive and for this reason are generally not used for CAD applications. Examples of super-computers are the Cray and the CDC Cyber 205.

- Mainframe computers are built for general computing, directly serving the needs of business and engineering. Although these computing systems are a step below super-computers, they are still very fast and will process information at about 10 MIPS. Mainframe computing systems are located in a centralized computing center with 20-100+ workstations. This type of computer is still very expensive and is not readily found in architectural/interior design offices.

- Minicomputers were developed in the 1960s as a result of advances in microchip technology. Smaller and less expensive than mainframe computers, minicomputers run at several MIPS and can support 5-20 users. CAD installations throughout the 1960s relied on minicomputers due to their low cost and high performance. Examples of minicomputers are the DEC PDP and the VAX 11.
- Microcomputers were invented in the 1970s and were generally used for home computing and dedicated data-processing workstations. Advances in technology have improved microcomputer capabilities, resulting in the explosive growth of personal computers in industry. In the 1980s many medium and small design firms were finally introduced to CAD as a direct result of the low cost and availability of microcomputers. Examples are IBM, Compaq, Dell, Gateway, and Apple Macintosh.

The average computer user today uses a microcomputer. These types of computers include PCs, laptops, notebooks, and hand-held computers such as Palm Pilots. Larger computers fall into the mini or mainframe category. A minicomputer is 3-25 times faster than a micro; it is physically larger and has a greater storage capacity. A mainframe is a larger type of computer still and is typically 10-100 times faster than the micro. These computers require a controlled environment for both temperature and humidity. Both the mini and the mainframe will support more workstations than will a micro. They also cost a great deal more than the micro, running into several hundred thousand dollars for mainframes.

PROCESSOR

The term processor denotes the sub-system of a data processing system that processes received information after it has been encoded into data by the input sub-system. These data are processed by the processing sub-system before being sent to the output sub-system, where they are decoded back into information. In common parlance, however, "processor" usually refers to the microprocessor, the brain of the modern computer.

There are two main types of processors: CISC and RISC.

CISC: A Complex Instruction Set Computer (CISC) is a microprocessor Instruction Set Architecture (ISA) in which a single instruction can specify several low-level operations, such as a load from memory, an arithmetic operation, and a memory store. The term was coined in contrast to Reduced Instruction Set Computer (RISC). Examples of CISC processors are the VAX, the PDP-11, the Motorola 68000 family, and the Intel x86/Pentium CPUs.

RISC: Reduced Instruction Set Computing (RISC) is a microprocessor CPU design philosophy that favors a smaller and simpler set of instructions that all take about the same amount of time to execute. Most types of modern microprocessors are RISCs, for instance the ARM, DEC Alpha, SPARC, MIPS, and PowerPC.

The microprocessor contains the CPU, which is made up of three components: the control unit, the arithmetic-logic unit (ALU), and the registers.
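To make the CISC/RISC contrast concrete, the short C program below annotates a single statement with the rough instruction sequence each style of machine might use. The assembly shown in the comments is simplified, illustrative x86 and MIPS-style code, not exact compiler output.

    #include <stdio.h>

    int counter = 0;   /* a global variable living in memory */

    int main(void)
    {
        /* On a CISC machine (e.g. x86), the statement below can compile
           to a single read-modify-write instruction, roughly:

               addl $5, counter     ; load, add, and store in one instruction

           On a RISC machine (e.g. MIPS), the same statement becomes a
           sequence of simple, fixed-format register instructions, roughly:

               lw   $t0, counter    ; load word from memory
               addi $t0, $t0, 5     ; add immediate in a register
               sw   $t0, counter    ; store the result back               */
        counter += 5;

        printf("counter = %d\n", counter);
        return 0;
    }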
