What can be said about today’s word that hasn’t already been said? Regardless of whether you refer to our current time as the Information Age, the Digital Age, or any of a number of different titles, today’s word is largely responsible for it. It’s what makes a smartphone “smart”, what powers much of our entertainment, and what really makes globalization possible. Naturally, this can only mean that today’s word is computer; yet, for all that this word means to our daily lives, few people can envision the term having the history that it does.
The word compute comes to us from the French computer, from the Latin computare, meaning ‘to sum up, count, or reckon together’. By adding an -r to the end of compute, we have our word, computer, which literally denotes ‘someone who calculates or makes arithmetical calculations’.
In fact, it was this definition referring to an individual person that was responsible for the first recorded usage of the term. Richard Braithwaite, in his 1613 book The Yong Mans Gleanings, states: “I have read the truest computer of Times, and the best Arithmetician that ever breathed, and he reduced the days into a short number.”
Considering this definition, the argument could be made that anyone who tabulates numbers, such as an accountant or bookkeeper, could also be considered a “computer” (i.e. “one who calculates”), but what about something a bit more “modern”, like a machine?
Strictly defined as a machine or device for performing or facilitating calculation, the computer in this sense is first mentioned in English in the 1800s. The concept was pioneered by the English mechanical engineer and polymath Charles Babbage, who designed an automatic mechanical calculator known as a difference engine to aid in navigational calculations, and who later conceived the first programmable design, the Analytical Engine. The first modern analog computer was invented by Sir William Thomson in 1872, and the first recorded media mention came in 1897, when the journal Engineering described an actual device, noting: “This was..a computer made by Mr. W. Cox. He described it as of the nature of a circular slide rule.” Outside of English, though, computing devices date back to Classical antiquity, notably the abacus, the astrolabe, and the Antikythera mechanism.
While these devices eventually became electronic (electromechanical relay computers appeared in the 1930s, and the twentieth century welcomed the fully automatic digital computer), they remained, for virtually the next 50 years, what we would consider calculators: fixed-program machines strictly for mathematical purposes. After World War II, however, computers finally began branching out into more applications, slowly becoming the multifunctional processing machines that are responsible for so much of our modern world.
The real explosion of computer usage occurred when computers started to become smaller, more powerful, and more widely available (as well as affordable), such as with the introduction of the Bendix G15 in 1956 or the LGP-30, which was priced at the modern equivalent of EUR 373,000. The 1977 introduction of the Commodore PET and, three months later, the Apple II of Steve Jobs and Steve Wozniak brought the fully assembled, mass-marketed computer into our homes, opening the floodgates to a multitude of software and application development. Time magazine’s 1982 “Machine of the Year” has since become progressively smaller, more powerful, more portable, and more indispensable.
Indeed, computers have come a long way in the 60 years they’ve been widely available, but this is only the beginning. From yoga outfits that can determine the correctness of your posture to wearable devices that monitor your health to contact lenses that adjust to meet your vision needs, computers are becoming the new backbone of health and wellness products. Artificial intelligence and machine learning are changing many aspects of our daily lives, from how we produce goods and heat our homes to how we drive our cars and communicate with one another (regardless of language differences). Blockchain also deserves a mention: though to date it has mostly been used for financial transactions, it has the potential to completely disrupt the data security industry, among others. Of course, with all of this disruptive potential, it’s worth remembering that development comes at a cost, and that social and legal norms often struggle to keep pace with technology.