Online Computer Science Degree Guide: Programming History
Computer programming is the phase of software engineering concerned with software development. Software engineers design software such as computer games, business applications, operating systems, and network control systems. They possess expert knowledge of computing theory, software structure, and the nature of hardware. Programming, sometimes referred to as coding, requires a broad knowledge of computer science along with specialized training and extensive knowledge of algorithms and logic. The work includes designing and writing code in specific programming languages in order to create programs that behave in a certain manner. After the initial design is created, a programmer must rigorously test and debug the program, correcting any errors before making it available to other users.
History of Computer Programming
The history of modern computer programming is, of necessity, brief, since the computer wasn’t invented until WWII. However, the idea of programming a computational machine in order to make it perform a certain task is much older.
The first calculating machines were invented by Blaise Pascal and Wilhelm Schickard in the 17th century, but since they could only perform very specific calculations, these machines cannot really be considered programmable. The first person to conceive a machine that could be programmed was Charles Babbage, whose Analytical Engine was imagined in 1835. The Computer History Museum calls Babbage a pioneer for his work on what was, strictly speaking, a calculator. Still, the CHM stresses that the designation is well-deserved, for Babbage’s Engine contained many features still used in today’s computers. It was programmable with punched cards and had a “store” where intermediate results were held, a “mill” where the mathematical calculations were processed, and what the CHM calls “facilities” for the input and output of data.
It wasn’t until over a century later, in 1941, that the first example of a modern computer was built by engineer Konrad Zuse in Germany. His Z3 was a fully functional, program-driven digital computer, but it did not survive WWII.
In the United States, Howard Aiken led a team of scientists from Harvard and IBM in creating another programmable computer, the Mark I, which was finished in 1944. Among its first programmers was Grace Hopper, who would later lead the effort to develop the programming language COBOL.
In 1945, John Von Neumann developed two key programming techniques. First, his “shared-program technique” stated that the actual computer hardware should be simplified to eliminate the need to hand-wire it for each program. He advocated for complex instructions that would control the simple hardware, allowing for faster reprogramming.
Second, his “conditional control transfer” introduced small blocks of code that could be jumped to in any order, instead of the computer having to work through a single set of chronologically ordered steps. Von Neumann argued that computer code should be able to branch according to logical IF-THEN statements.
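To put the idea in modern terms, here is a minimal sketch in Python (chosen purely for illustration; no language from the era is implied) of a program that branches on a condition rather than running as one fixed sequence of steps:

```python
# Conditional control transfer in miniature: the program takes a
# different path depending on the data, instead of executing one
# fixed, linear list of instructions.
temperature = 104  # hypothetical input value

if temperature > 100:
    print("Warning: reading is above the threshold")
else:
    print("Reading is within the normal range")
```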
In 1949, Short Code appeared. This language required programmers to manually change its statements into 0s and 1s, but it was nonetheless the precursor of today’s complex languages. In 1951, Hopper wrote the first compiler, a program that translates a language’s statements into 0s and 1s for the computer to understand. This led to measurably faster programming.
By the mid-’50s, engineers at MIT were experimenting with keyboard-command input, and in 1957, a new programming language called FORTRAN (short for FORmula TRANslation) was developed so that a computer could perform a repetitive task from a single set of instructions by using loops. In 1960, a programming language called LISP was designed for writing artificial intelligence programs. Soon after, a British computer scientist named C. A. R. Hoare developed a sorting algorithm called Quicksort.
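Quicksort works by picking a “pivot” element, partitioning the remaining items into those smaller and those larger than the pivot, and then sorting each part the same way. Here is a minimal sketch in Python; this copy-based version is written for clarity and is not Hoare’s original in-place formulation:

```python
def quicksort(items):
    """Sort a list by recursively partitioning around a pivot."""
    if len(items) <= 1:
        return items  # a list of 0 or 1 items is already sorted
    pivot = items[0]
    smaller = [x for x in items[1:] if x <= pivot]
    larger = [x for x in items[1:] if x > pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([5, 2, 9, 1, 7]))  # prints [1, 2, 5, 7, 9]
```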
In 1962, ASCII (the American Standard Code for Information Interchange) was created to allow machines from different manufacturers to exchange information. An easy-to-learn programming language called BASIC was created in 1964. The ’70s saw the development of Unix and VisiCalc, which were quickly followed by MS-DOS. In the mid-’80s, C++, a descendant of the C language, emerged as the dominant programming language.
Modern Programming Languages
Different styles of programming are called programming paradigms. Programmers take into account many different factors when choosing a language to use. For instance, company policy might dictate use of one language over another, while certain languages are better suited to some tasks than others. Additionally, personal preference may influence a programmer’s choice.
Languages range from low-level to high-level. Low-level languages execute faster because they are closer to the machine, while high-level languages are easier to use but slower in execution. According to How to Think Like a Computer Scientist by Allen Downey, all major languages contain the same set of basic instructions. These include the following (a short example after the list shows all five at once):
1. Input: getting data from a keyboard, a file, or another device
2. Output: displaying data on the screen or sending it to a file or device
3. Arithmetic: performing basic mathematical operations such as addition and multiplication
4. Conditional execution: checking for certain conditions and executing the appropriate code
5. Repetition: performing some action repeatedly, usually with some variation
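As a rough illustration, the following short Python program (a hypothetical example, not taken from Downey’s book) touches all five of these basics:

```python
# Input: read a line of numbers from the keyboard
line = input("Enter numbers separated by spaces: ")

total = 0
for word in line.split():     # Repetition: handle each entry in turn
    value = float(word)       # (assumes each entry is a valid number)
    if value < 0:             # Conditional execution: skip negatives
        continue
    total = total + value     # Arithmetic: accumulate a running sum

print("Sum of non-negative entries:", total)  # Output: show the result
```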
Determining the best program for a given job is not always easy, but there are some general guidelines for evaluating the quality of a program.
1. Reliability. A program can be judged on how often it produces correct results and how few errors it makes.
2. Robustness. This term refers to a program’s ability to anticipate problems that are beyond the programmer’s control, such as corrupt data, lack of memory, or operating system issues; a sketch after this list shows one way to handle such problems.
3. Usability. A program needs to be clear, complete, and intuitive so people can use it easily.
4. Portability. This refers to how many operating systems and hardware platforms a program is compatible with.
5. Maintainability. How easily can a program be modified for the future?
6. Efficiency. The fewer system resources a program utilizes, the better.
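As a concrete example of the robustness criterion above, a program can anticipate corrupt data instead of crashing on it. A minimal Python sketch, assuming readings arrive as lines of text that should each contain a number:

```python
def parse_readings(lines):
    """Convert text lines to numbers, tolerating corrupt entries.

    Rather than failing on the first malformed line, the function
    skips it and reports how many lines were unusable.
    """
    readings, bad = [], 0
    for line in lines:
        try:
            readings.append(float(line))
        except ValueError:  # corrupt data: not a valid number
            bad += 1
    return readings, bad

values, skipped = parse_readings(["3.2", "oops", "7.5"])
print(values, skipped)  # prints [3.2, 7.5] 1
```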
According to the Bureau of Labor Statistics, job prospects in computer programming are excellent, and the field is expected to be among the fastest-growing careers through 2018. A bachelor’s degree is considered the minimum qualification for most programming jobs, although some require further education.
Pick a Tutorial provides links to tutorials and articles on many different programming and scripting languages. If you’re interested in the history behind the machine, check out the Computer History Museum’s Timeline of Computer History.