We are on the verge of revolution. A revolution that will make the industrial revolution seem trivial. We stand at the focus, the culmination, the precipice, of a fundamental re-thinking of our shared social fabric. Actions today will reverberate for centuries. Technologies like Bitcoin are early evidence of this prophecy.
We understand the past to understand the present. We understand the present to understand the future. The past is the key to the future. This information and cultural revolution has its roots in the free culture and computing movements of recent decades. A few dedicated men had the foresight to understand the capabilities of the machines they were building. Thanks to their legacy of an ethical framework for computing, we find ourselves in this empowering position.
Origins of computing
Modern computing traces its ancestral roots to the early 1930s, but there were earlier sparks signalling the age of computers.
In the early 1800s, Charles Babbage set out to construct a general-purpose computing device: the Analytical Engine. While constructing his earlier Difference Engine, Babbage realised that he could go beyond a mechanical calculator to a programmable computer: a machine capable of being programmed to perform a wide multitude of calculations. His design was an echo of the modern computer, with branching conditionals, looping and memory storage.
Sadly, Babbage was an irascible character, constantly arguing with his lead engineer and tinkering with his designs. His collaborator, Ada Byron Lovelace, begged him to let her handle the business side, leaving him free to focus on the technical details, by “allowing myself and such parties to conduct the business for you.” Ada wrote that although she was his best friend, she could not and would not support him acting in ways that were “not only wrong in themselves, but suicidal.”
After two decades of Babbage burning through vast amounts of cash with only a small prototype to show for it, the UK government pulled the plug. By then Babbage was making new designs, constantly revising them without settling on anything. To this day, there are no consistent or complete plans for an Analytical Engine because of this constant revision. Babbage kept dreaming of grander, bigger and better designs.
Ada wrote of the building of the Analytical Engine that she “scarcely anticipate[s] it ever will be [completed], unless someone really exercises a strong co-ercive influence over him [Babbage]. He is beyond measure careless and desultory at times.”
Had the Analytical Engine been built in the 1800s, it would have been more advanced than the early computers of the 1940s: digital, programmable and Turing-complete.
And yet, Babbage’s machine fell into obscurity. The early pioneers of modern computing in the 1930s and 1940s knew nothing of Babbage’s or Ada’s work. Their architectural innovations had to be re-invented. Modern computing developed independently of Babbage’s and Ada’s thorough and assiduous work.
A brilliant cacophonic flash in the damp brown days of Victorian industrialisation followed by a long enveloping darkness. And 100 years later, a slow glimmer of activity. A growth that speaks of confidence and vision.
Ada realised something Babbage did not. To Babbage, his engines were improved calculating machines for performing algebra and manipulating plus and minus signs: bound by number, manipulators of quantity.
Ada Lovelace saw beyond that. She saw that numbers could represent entities other than quantity. Once you had a machine for manipulating numbers, you had a machine capable of manipulating symbolic representations of things according to rules: numbers that could stand for letters, musical notes and other entities.
She was the first person to make this mental leap, and the only person at the time who understood this potential in Babbage’s work. Looking back through history, we see the first explicit transition from calculation to general-purpose computing made by Ada in her 1843 paper. She spoke across the ages of a future 150 years later. Ada was the first person to cross the intellectual threshold from conceptualising computing as mere calculation, to computing as it exists today: an exercise in abstraction and symbolic representation.
“The calculus of operations [i.e. programming] is likewise in itself a topic of so much interest, … Whether the inventor of this engine had any such views in his mind while working out the invention, or whether he may subsequently ever have regarded it under this phase, we do not know; but it is one that forcibly occurred to ourselves on becoming acquainted with the means through which analytical combinations are actually attained by the mechanism.” ~ Ada Lovelace
Whereas Babbage designed the machine, Ada designed programs for Babbage’s machine. Thanks to this, she is popularly considered the world’s first computer programmer. In her pivotal 1843 paper, she wrote the first known computer program for calculating a sequence of Bernoulli numbers.
Babbage was solely concerned with the suitability of his designs for arithmetic calculations. Ada went further. She imagined wider applications. Her 1843 article made prescient comments about such a machine being used to compose complex music, produce graphics, and serve scientific and practical purposes in people’s daily lives. Ada Lovelace envisioned the future of computing through Babbage’s invention.
“Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.” ~ Ada Lovelace
Reading their correspondence, exchanged over multiple mail deliveries per day, you get a sense of the excitement, passion and frequent frustration: Ada’s at Babbage’s carelessness, and Babbage’s at Ada’s rebukes. Despite this they remained professionally cordial, even as Ada’s workdays often stretched to 18 hours.
Ada believed that mathematical science was more than a “vast body of abstract and immutable truths”, and that its beauty and symmetry hold a “deeper interest for the human race”. This science has a language that can be used to express the natural world and its relationships. She likened mathematical truth to a way for men to perceive creation. Her world-view and belief system thus permitted her to connect the dots and envision the Analytical Engine at a higher level of conception.
“The bounds of arithmetic were however outstepped the moment the idea of applying the cards had occurred; and the Analytical Engine does not occupy common ground with mere “calculating machines.” It holds a position wholly its own; … A new, a vast, and a powerful language is developed for the future use of analysis, in which to wield its truths so that these may become of more speedy and accurate practical application for the purposes of mankind … Thus not only the mental and the material, but the theoretical and the practical in the mathematical world, are brought into more intimate and effective connexion with each other. We are not aware of … a thinking or of a reasoning machine.” ~ Ada Lovelace
Computing’s first true incarnation and stirrings began in the 1940s among academics. For various reasons, people started building thinking machines to carry out repetitive, time-consuming tasks. Where these thinking machines differed was in their ability to run programs, or a ‘calculus of operations’ as Ada called it.
Donald Michie was a British researcher in artificial intelligence. Hoping to study the Japanese language in wartime Britain but finding the course full, he ended up enrolling in cryptography. Displaying an intuition for the subject, Michie contributed to the war effort by breaking the German Lorenz ciphers. While working at Bletchley Park alongside another renowned scientist, Alan Turing, the potential of these thinking machines dawned on Michie. A passion for computing grew in him. Despite going on to pursue genetics after the war, he was pulled into the world of artificial intelligence, driven by wild ideas about social computing.
Michie understood that computers could go beyond running simplistic programs to calculate some result or optimise some obscure problem. He understood that programmers instruct computers in a crude way, and that by developing intelligent software, people could instruct computers far more naturally. The intelligence of the computer could inter-operate with human intelligence.
Michie pioneered the field of artificial intelligence. Though now commonplace and no longer controversial, at the time many saw the field as fool’s gold and a mirage.
In 1973, the mathematician Sir James Lighthill published a now-famous document for the UK government. The “Lighthill report” was critical of AI research, chiefly robotics and language processing. It effectively terminated funding by the British government, ushering in what is popularly referred to as the “AI winter.”
That same year, a well-publicised TV debate took place between Lighthill and three prominent AI researchers, including Donald Michie. Lighthill was scornful of AI research. He claimed that language translation by machines was an impossibility and argued for the uniqueness of man’s mind in front of a receptive audience; that all expenditure in the field was as worthless as ancient efforts to transmute lead into gold.
Michie uncomfortably failed to adequately rebut Lighthill, falteringly presenting a vision of a rich and developing field while maintaining the veneer of a dispassionate academic. However, Michie never hid his ambitions to create learning machines that would one day achieve human-level intelligence.
Whereas others failed to share Michie’s visions, he remained unfaltering in his lifelong obsession with achieving AI. Michie quoted, with wry approval, a report submitted on Nov 15th 1876 to the president of the US telegraph company, which dismissed Mr GG Hubbard’s predictions in a belittling tone, calling Bell’s device “hardly more than a toy” and “a laboratory curiosity”. The report continues: “Mr AG Bell, the inventor, is a teacher of the hard of hearing. And this ‘telephone’ may be of some value for his work, but it has too many shortcomings to be seriously considered as a means of communication.”
Michie took the idea of the computer as a generic platform a step further. From his work in AI, and the increasing level of automation in the 70s, Michie extended the image of a generic platform to generic programs that could intelligently perform a wide range of actions.
General-purpose computers are quite different from, and wholly more amazing than, any other form of appliance before them. Every appliance (save computers) is built for a specific, designed purpose. Outside of an appliance’s designed domain, it performs poorly, if at all.
There were many 20th-century attempts to combine car and aeroplane into a flying car. All failed. They /can/ be made to work. Sort of. But a good plane is not a good car, and a good car is not a good plane. The two appliances do not mix well, as each is built for the highly specialised task of driving or flying.
General purpose computers are a different class of appliance. They offer the freedom and openness to utilise them to run a wide variety of software programs specified by a programmer. Computers are generic platforms with no biases or prejudices towards right or wrong.
“This revolution could lead to terrible consequences, or it could lead to the greatest advances ever for the human race. Which of these things are to happen is up to us.” ~ Donald Michie
This freedom allows creators to envision uses for these machines beyond what the engineers imagine. It is why we have a rich ecosystem of software gigantic in scope and size, and worth trillions in man-hours. Ada saw this potential when she began tentatively designing her simple programs to calculate number sequences.
Humans and computers interact in a holy trinity of code, flesh and metal. Computers, with their metallic calculating minds, are instructed through a highly specific sequence of operations called ‘machine language’. Machine language is devoid of the ambiguities and fuzziness of our own human languages. However, machine language is far too intricate and mathematical for any human mind to master in a meaningful way. The sheer number and intricacy of operations in any reasonably sized piece of software written in machine language makes it impossible to examine and understand.
Humans and computers therefore meet at a middle road: a halfway point between human language and machine language, aptly called a ‘programming language’. A human with an aptitude for a programming language is called a programmer. There are amateur weekend hobbyists who dabble in the art of programming, and there are professional master scribes of programming.
And like human languages, there are many regional dialects and variations of programming languages – this author is fluent in the C++ programming language and dabbles in the Python programming language.
A piece of text written in a programming language is called source code. That is: the source of the code that will be translated by a helpful computer tool into machine language.
[DIAGRAM HERE: Human -> Source code --[translated by a tool called a "compiler"]--> Machine language]
I can read source code the way someone else can read Spanish or Russian. It is a mutual language through which I share communion with a computer. Conversing in this language allows me to enter the mind of the machine. To know its thoughts and feelings.
And it is through this source code that I can understand how the software on my computer operates. With access to source code, I am an empowered computer user, able to vet every detail, liberated by the knowledge that the software on my computer works as advertised, without pretense and uncorrupted by an external party.
Donald Michie saw this, and extended it further. If computers are generic platforms running specialised software applications, what is to stop programmers creating generalised intelligent software applications? He saw software programs not merely as a means to an end, and a useful productivity enhancer for industry. Michie went further. Michie saw software as an end in itself – “As significant as when the first wheel rolled.”
“Today computers are mindless slaves. They are becoming immensely powerful and versatile systems. This is bringing about the greatest revolution that the human race has ever known. A revolution which will enhance the value of human talent, and diminish the value of unskilled human labour.” ~ Donald Michie
Michie believed software would one day play a pivotal and socially impactful role in shaping 21st century society.
“I think that I would like if I might be allowed to, to utter a small warning here dredged up from the remote past nearly 100 years ago. It is in fact, a very short excerpt that I want to read from a report. It was a report submitted on Nov 15th 1876, to the president of the US telegraph company. And he goes as follows: “Mr GG Hubbard’s handsome predictions, while they sound very rosy, are based upon wild-eyed imagination and a lack of understanding of the technical facts of the situation. And a posture of ignoring the obvious technical limitations of his device. Which is hardly more than a toy. A laboratory curiosity. Mr AG Bell, the inventor, is a teacher of the hard of hearing. And this ‘telephone’ may be of some value for his work, but it has too many shortcomings to be seriously considered as a means of communication.”
~ Donald Michie
“The other concept which I think comes under the same term of general purpose and may be confused with it is the notion of versatility. By which one means the ability to reinstruct, re-educate almost, a device rather quick and rather easily and rather conveniently from the point of view of the human user.”
~ Donald Michie