A computer is more than just a glorified calculator, and yet we still call it a “computer,” the first of many job descriptions that migrated from people to machines.
Likewise, while typing keyboards have remained a consistent method for entering data into computers, from the early days of card punch machines to terminal emulators and current graphical environments such as Zowe, they have never been the exclusive way to use or interact with a computer. And yet, without the advent of electric typewriter keyboards, none of this would have been possible.
Originally, typewriter keyboards were manual. In fact, the maddeningly counterintuitive arrangement of letters in the recognizable “QWERTY” configuration dates back to the earliest manual typewriters. Those machines didn’t have the physical capacity to keep up with the natural speed of human typists, so this pathological arrangement of letters was chosen. It had the effect both of slowing typists down and of keeping frequently typed characters far enough apart to minimize the likelihood of keys jamming when adjacent letters were struck in quick succession.
Ah, the power of legacy. Why am I reminded of JCL (Job Control Language)? Yes, once that configuration got established, even though keyboard technology advanced to allow for quicker typing, it was easier to train people to type more quickly than to change the order of keys on a standardized typewriter keyboard. While more efficient alternatives such as the Dvorak keyboard have been tried, none have been able to overcome the inertia of incumbency.
The era of speed typing competitions was born. Various manufacturers of manual typewriters fielded expert typists to prove that their particular technology could let people type faster than any other, QWERTY keyboards and all. This didn’t turn out as one might expect.
You might think that the consistent winner of such competitions would automatically become the leading manufacturer of typewriters. Yet, as I have previously asserted, the history of the Underwood Typewriter Company is a cautionary tale about being too much better than one’s competitors. Underwood went undefeated from 1906 to 1930, when the competitions were canceled just in time for the Great Depression; but instead of becoming the brand of choice for all consumers, Underwood found that its competitors simply left the playing field to look for other marketing strategies.
To add insult to injury, during the five-year gap before competitions resumed, IBM introduced market-ready electric typewriters. By 1940, electric typewriters were allowed in the competition, and the consequent increase in typing speeds changed everything, ending the competition era.
As I pointed out in my above-cited article, it appears that the manufacturers of consumer electronics computing products chose a similar approach when faced with the insuperable competitive strengths of the IBM mainframe: they chose to ignore them and focus on more advantageous playing fields, such as individual choice and commodity pricing. However, because the mainframe was so thoroughly established in running the world economy, that very “legacy” kept it from going away.
Here’s where the story takes an interesting twist. Just as Underwood was acquired by Olivetti and its brand retired, the state of the technology changed, and electric typewriters (including IBM’s) became the technology of choice for business use, while manual typewriters remained a more affordable option for consumers. Eventually, even manual typewriters became collectible artifacts for hobbyists as computers took over, though the interface remained recognizably similar.
During those intervening decades, IBM remained a provider of state-of-the-art electric typewriters, and then of keyboards for card punches, consoles, terminals, and even PCs. And one day, we woke up to find that the remaining users of manual typewriters were mostly writers, IT people, and others with sentimental hobbies.
That’s the thread. Now here’s the rope: since its debut on April 7, 1964, the IBM System/360 and its successors have been the essential differentiator in business computing, just as electric typewriters transformed the office landscape before them. During the intervening decades, while consumer electronics computing options flooded the market with illustrations of what happens when you put cost ahead of quality, and business decision-makers got distracted by the hype, the mainframe kept improving. Today, in a continuum of improvement without a foreseeable horizon, its strengths appear ready to take on all the apparent advantages that other platforms have offered, from convenient size and affordable slices of capacity to ease of use and management.
It would be disingenuous to say something like, “it began with…” because there are decades of prior art leading up to these watershed milestones. Likewise, it would be short-sighted to portray them as any kind of culmination, given that, from a distant historical perspective, they’re likely to be just a few more steps in an indefinitely long journey.
Still, while many of the great mainframe innovations have increased its capacity, and Moore’s Law has inevitably contributed to its decrease in size, the fact is that the mainframe is suddenly an unassailable contender for cloud computing, if quality of service is considered. Among the latest examples of steps in this direction are the Cinderella footprint of the z15, which fits on a standard raised-floor tile; the Telum processor’s on-chip artificial intelligence, which makes it possible to evaluate 100% of activities, such as credit card transactions, for fraud; quantum-resistant encryption in the face of ubiquitous identity theft and other data compromise; and now, the rack-mountable z16, which turns a mainframe into a direct competitor to the vast array of commodity servers.
As I sit here, typing on my computer keyboard, the past successes of competitions, consumer technologies, and business technologies are harbingers of things to come. And I’m excited to see us entering an era when the definitive computing technology that has underpinned business processing since its beginning is about to be rediscovered not merely as the winner, but as the optimal playing field for the future of quality business computing.
Reg Harbeck is the Chief Strategist at Mainframe Analytics, with a B.Sc. in Computer Science and an M.A. in Interdisciplinary Humanities (focused on the humanity of the IBM mainframe). He has worked with operating systems, networks, security, and applications on mainframes, UNIX, Linux, Windows, and other platforms. He has also traveled to every continent where there are mainframes and met with and presented to IT management and technical audiences, including at SHARE, Gartner, IBM zSeries, CMG, GSE, CA World, and ManageTech user conferences. He has held many roles at SHARE, from speaker and volunteer to member of the SHARE Board of Directors. He has published many articles, blog posts, and podcasts (available online) and taught many mainframe courses. Since 2020, Reg has also been recognized as an IBM Champion for Z.