Do not fold, bend, spindle or mutilate
An overview of important inventions in the development of computers
The software we use today, in our apps, on the internet and in games, has come a long way. Insurance companies played a significant role in the development of computer systems. We would like to take you on a journey through the history of the computer.
It begins with the punched card: this card was initially used to store information, for instance to register votes, save bank details or count populations. Specially designed machines were used to count the data on the cards or to make other calculations that would take a long time by hand.
The first punched cards existed as far back as 1725 and they came in many sorts and sizes, but in 1928 IBM introduced a standard-sized card that came into general use: the IBM 80-column card. Every column represented a single parameter, and this parameter could be given a value by punching a hole at a certain level (row) in the column. If the card represented information about a person with three children, and column 1 stated ‘number of children’, then a hole was punched on the third row from the top. A punched card only consisted of ten rows, so only the numbers 0 to 9 were possible. To get a range of 0 to 99, two columns were needed.
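The two-column encoding described above can be sketched in a few lines of Python. This is a modern illustration only, of course; nothing like it ran on the card machines themselves.

```python
# Minimal sketch (not the actual IBM card format): a two-digit number is
# stored across two ten-row columns, with one punched row per column.

def encode_number(value):
    """Return the punched row (0-9) for each of the two columns."""
    if not 0 <= value <= 99:
        raise ValueError("two columns can only hold 0-99")
    tens, ones = divmod(value, 10)
    return [tens, ones]  # row index punched in each column

# A person with 3 children, stored in a two-column field:
print(encode_number(3))   # [0, 3]
print(encode_number(42))  # [4, 2]
```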
Example of a punched card.
On top of every column, the value to be punched is printed in ink. This card shows how different characters were represented on the card.
In the beginning it was impossible to store negative numbers, but a solution was quickly found: by punching an extra hole above the first row, one could indicate that the column contained a negative number. This was called overpunching. In 1931, IBM added support for even more possibilities, such as letters and special characters, by making combinations of holes in the same column.
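The letter encoding can be illustrated with a simplified zone-plus-digit scheme in Python. The exact row assignments below are an assumption, loosely modelled on later IBM card codes, in which a letter was a combination of one ‘zone’ punch and one digit punch in the same column.

```python
# Simplified zone-plus-digit letter code (the assignments are an
# assumption for illustration, not the exact historical card code):
# each letter is one zone punch plus one digit punch in a single column.

ZONES = {"12": "ABCDEFGHI", "11": "JKLMNOPQR", "0": "/STUVWXYZ"}

def letter_punches(letter):
    """Return (zone_row, digit_row) for an uppercase letter."""
    for zone, letters in ZONES.items():
        if letter in letters:
            return (zone, letters.index(letter) + 1)
    raise ValueError("not a letter in this simplified code")

print(letter_punches("A"))  # ('12', 1): zone 12 plus digit 1
print(letter_punches("J"))  # ('11', 1): zone 11 plus digit 1
```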
It eventually became possible to use all of the binary combinations of a column on the punched cards, theoretically resulting in 2^12 = 4096 possible values per column.
From counting to programming
The data on the punched cards served as input for calculations, but the calculations themselves were not included on the card. These early computers were designed to execute a specific calculation: changing this calculation meant rewiring the machine. It could take up to three weeks to wire a new programme and have it up and running.
In 1945, John von Neumann wrote his ‘First Draft of a Report on the EDVAC’, in which he proposed a new architecture for computers. His proposal was revolutionary because both the data and the calculation instructions were input for the computer. The punched cards now contained both the data and calculation instructions, recorded in so-called machine language. As a matter of fact, the von Neumann architecture is still used, for example in the computer on which you are reading this text.
By this time, computers were able to produce punched cards as the output of a calculation. Because the cards could now contain calculation instructions, computers were able to generate programmes that, in turn, could be executed. This is how Assemblers came into existence. An Assembler translates standardized Assembly language into binary instruction codes for that particular computer. From then on, programmers could punch Assembly language instead of machine language onto the cards, after which the computer would translate it into binary instructions. The first programming language was born. Until then, every computer had its own team of programmers, but because Assembly was a standardized language, programmes could be executed on any computer with an Assembler. Nevertheless, this language was still complex and unorganized. The instructions were cryptic, and using conditional instructions (if-statements) quickly made a programme incomprehensible.
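What an Assembler does can be sketched with a toy example in Python. The mnemonics and opcodes below are invented, not those of any historical machine: each standardized mnemonic is simply looked up and replaced by the binary instruction code of the target computer.

```python
# Toy assembler sketch (invented mnemonics and opcodes): translate
# 'MNEMONIC operand' lines into numeric instruction codes.

OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def assemble(lines):
    """Translate assembly lines into (opcode, operand) pairs."""
    program = []
    for line in lines:
        mnemonic, operand = line.split()
        program.append((OPCODES[mnemonic], int(operand)))
    return program

source = ["LOAD 10", "ADD 11", "STORE 12"]
print(assemble(source))  # [(1, 10), (2, 11), (3, 12)]
```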
To make computers easier to program, compilers were invented. These made it possible to program in a language that people could easily read. The Formula Translating System (Fortran), whose compiler was completed in 1957, is an example of such a language: it converts formulas like ‘c=a+b’ and ‘print *,c’ to Assembly. Fortran was the first higher-level programming language to come into general use. For decades, it was the most widely used programming language for technical, physical and econometric problems. In fact, a new version of the language was released as recently as 2003.
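The compiler step can likewise be sketched as a toy translation from a readable formula to assembly-like instructions. The instruction names below are invented, and a real Fortran compiler is of course far more elaborate; the point is only the change of layer, from human-readable formula to Assembly.

```python
# Toy compiler sketch (invented output format, not real Fortran output):
# a readable formula such as 'c=a+b' becomes assembly-like instructions,
# which an assembler would then turn into binary codes.

def compile_assignment(statement):
    """Compile 'target=left+right' into a list of assembly-like lines."""
    target, expression = statement.replace(" ", "").split("=")
    left, right = expression.split("+")
    return [f"LOAD {left}", f"ADD {right}", f"STORE {target}"]

print(compile_assignment("c=a+b"))  # ['LOAD a', 'ADD b', 'STORE c']
```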
The modern era
In the early 1990s, object-oriented programming became mainstream. In this paradigm, parts of the code that logically belong together are placed in objects. Every object has its own functions and responsibilities. The calculations within an object are separated from other objects, and the communication between objects is fixed in interfaces. This way of working makes it easy for programmers to work together or to continue someone else’s work.
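These ideas can be illustrated with a small sketch; the Policy and Premium classes below are illustrative examples, not taken from any real system.

```python
# Sketch of the object-oriented ideas described above: each object has
# its own responsibilities, internal calculations stay inside the object,
# and communication goes through a fixed interface.

class Premium:
    """Responsible only for premium arithmetic."""

    def __init__(self, base_amount):
        self._base_amount = base_amount  # internal state, hidden from others

    def yearly(self, risk_factor):
        # The calculation is encapsulated here; callers only see the result.
        return self._base_amount * risk_factor


class Policy:
    """Responsible for policy data; delegates calculation to Premium."""

    def __init__(self, holder, premium):
        self.holder = holder
        self._premium = premium

    def yearly_premium(self, risk_factor):
        return self._premium.yearly(risk_factor)


policy = Policy("A. Hollerith", Premium(100.0))
print(policy.yearly_premium(1.2))  # 120.0
```

Because Policy only talks to Premium through its interface, the premium calculation could be replaced without touching the rest of the code, which is exactly what makes collaboration easier.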
In the mid-1990s, the concept of meta-modelling was introduced. Just as in the steps from machine language to Assembly and from Assembly to compiler, an extra layer was added: it became possible to generate object-oriented model definitions in an object-oriented language. An example of this is ProductXpress. In ProductXpress Designer, you define classes/objects as you would in an ordinary object-oriented language. The output serves as the input for the ProductXpress calculator, which in the end calculates your insurance premiums, for instance.
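The meta-modelling idea can be sketched generically. To be clear, this is not the ProductXpress API: the model format and the generator below are invented for illustration, and only show the principle of one layer of code producing the definitions that the next layer executes.

```python
# Generic meta-modelling sketch (invented model format, NOT the
# ProductXpress API): a model definition describes classes and their
# attributes, and class code is generated from it, which a calculator
# could then load and execute.

model = {
    "InsuranceProduct": ["name", "base_premium"],
}

def generate_class(class_name, attributes):
    """Generate Python source code for a class from a model definition."""
    params = ", ".join(attributes)
    body = "\n".join(f"        self.{a} = {a}" for a in attributes)
    return (
        f"class {class_name}:\n"
        f"    def __init__(self, {params}):\n"
        f"{body}\n"
    )

source = generate_class("InsuranceProduct", model["InsuranceProduct"])
namespace = {}
exec(source, namespace)  # the generated definition becomes a usable class
product = namespace["InsuranceProduct"]("Home", 12.5)
print(product.base_premium)  # 12.5
```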
At the moment your premium is calculated, the ProductXpress calculator initializes the insurance model as an object-oriented model. The calculator converts the calculations to be executed into a language resembling Assembly, but more modern. Eventually, your calculations are converted into a machine language that is executed by the processor of your insurance company’s computer. And this entire process takes place in a few thousandths of a second.
The wish to quickly process census data has grown into the computers we know today. Time and time again, we came up with a programming language that was closer to our own language and further away from the computer’s. We now need to ‘speak’ the computer’s language less and less; the computer understands our programming language and translates it into its own.
What will the future bring?