The abacus, one of the oldest computing devices – developed perhaps as long as 4,500 years ago – was the calculator of choice for merchants and accountants until the advent of mechanical calculators in the 19th century. Although now a relic, the abacus lives on in niche areas of mathematics teaching. It originally served as a basis for number systems and aided rapid, complex calculation.
Every year, experts and industry analysts come up with new acronyms and concepts for the next big trend in computing. The hype drives both innovation and fear.
Examined closely, these trends are often based on concepts and technologies that lack a catchy name or don’t seem exciting enough, so no one notices them. Some move from mere hype to mainstream; others fade away.
There are several ways to examine these trends and determine whether a company should jump on the bandwagon – whether the trend is real or just the next fad. Is it being used by early adopters, or merely by companies conducting R&D to see if there is any real value? Most of the time, these trends are built on ‘non-exciting stealth technologies’ – i.e. technologies that fly under the radar.
“The rise and fall of technology is mainly about business and not technological determinism,” said Richard S. Tedlow, a business historian at the Harvard Business School. What are some of the commonalities among these technologies?
Mostly, it seems, there is a core technology dependency: the older technology holds a key advantage that the new one either cannot entirely supplant or requires as a foundation. But it is the business decision that matters most. Technology is all about adoption, familiarity and having the skills available to support it.
To survive, technologies must evolve, much as animal species do in nature. “Technologies want to survive, and they reinvent themselves to go on,” as Paul Saffo, a technology forecaster in Silicon Valley, puts it.
The surviving technologies also build their own ecosystem of technical foundations, people skilled in their use and business culture. Changes in the economic environment can also lead to the renaissance of an older technology as part of a rebranded new one.
The weight of legacy is underestimated, according to John Staudenmaier, editor of the journal Technology and Culture, because innovation is so often portrayed as a bold break with the past.
There is a well-known Jeff Bezos observation that people rarely ask what isn’t going to change in the next ten years – yet those constants are exactly what businesses should be built around and optimised for. Many old building blocks have been key ingredients in the new so-called technology transformations.
IBM’s System/360, designed in the 1960s, introduced technologies still in use today. Mainframes stand as testimony in the larger story of survivors, both as technologies and as business applications.
Industry pundits predicted that these technologies would be long gone, but they have proven otherwise. Some become the basis for the next generation, and many find a sustainable, profitable future.
Television, for example, was supposed to kill radio – and movies, for that matter. Cars, trucks and planes could have spelled the death of railways. The current cloud, with its virtualised, multi-tenanted, containerised architecture, has borrowed heavily from the mainframe – and mainframes themselves are now virtualised as cloud platforms.
FTP (file transfer protocol), a backbone of the internet, celebrated its 44th birthday in April. Originally specified in RFC 114, published on 16 April 1971, FTP is arguably even more important today than when it was born.
Even though young upstarts such as P2P networks are now available, it is FTP that links many cloud-based services and applications. Its secure variants are also deemed safer than P2P – an essential trait for online banking and other sensitive traffic.
EMV, in the payment industry, is portrayed as last decade’s technology, but it has become the de facto chip-based security standard of today. Developed as a fraud-resistant payment standard by a consortium of card networks – Europay, MasterCard and Visa – the EMV system governs point-of-sale payments in nearly every corner of the globe, except the United States, where implementation is due by the end of this year.
Much of government’s long-term use of technology is driven not by bureaucracy but by cost. For decades, the IRS kept its tax records as flat files on tape, accessed via COBOL programs.
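A flat file of this kind is simply a sequence of fixed-width records, with each field occupying a known byte range – the layout a COBOL program would declare in a copybook. As an illustrative sketch (the field names and positions below are invented, not the IRS’s actual layout), fixed-width parsing looks like this in Python:

```python
# Hypothetical fixed-width record layout, in the spirit of a COBOL
# copybook: each field occupies a fixed character range within a line.
RECORD_LAYOUT = [
    ("taxpayer_id",  0,  9),   # positions 1-9
    ("name",         9, 29),   # positions 10-29, space-padded
    ("year",        29, 33),   # positions 30-33
    ("amount_cents", 33, 43),  # positions 34-43, zero-padded integer
]

def parse_record(line: str) -> dict:
    """Slice one fixed-width line into named fields."""
    rec = {name: line[start:end].strip() for name, start, end in RECORD_LAYOUT}
    rec["amount_cents"] = int(rec["amount_cents"])
    return rec

# Build a sample 43-character record and parse it.
sample = "123456789" + "DOE JOHN".ljust(20) + "2014" + "0000123456"
rec = parse_record(sample)
```

The format’s durability comes from exactly this simplicity: any language that can slice a string can read the records, decades after the original programs were written.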
Modern NoSQL databases promote keeping data in simple columns and rely on the huge storage and processing power today branded as big data. How many people recall Donald E. Knuth, whose seminal work on algorithms and data structures sparked a generation of thinkers? Big data would not exist but for those algorithms and data structures.
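The row-versus-column idea behind such stores is easy to sketch. The toy example below (data invented, not any particular database’s format) shows why a columnar layout suits big-data aggregates: a scan touches only the single field it needs, rather than every record.

```python
# Toy illustration of row-oriented vs column-oriented storage.
rows = [
    {"city": "London", "year": 2014, "rainfall_mm": 557},
    {"city": "Mumbai", "year": 2014, "rainfall_mm": 2386},
    {"city": "Lima",   "year": 2014, "rainfall_mm": 13},
]

# Transpose the row store into a column store: one list per field.
columns = {key: [r[key] for r in rows] for key in rows[0]}

# An aggregate now reads one contiguous list instead of whole records.
total_rainfall = sum(columns["rainfall_mm"])
```

Columnar engines add compression and vectorised scans on top, but the core layout decision is no more than this transposition.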
When it comes to weather prediction, satellites and Doppler radar are publicised widely, but the National Weather Service still relies on old-fashioned balloons – and much of the data sent from those balloons is still processed by legacy IBM PC/XT machines with 640KB of RAM, dating from the 1980s.
The current generation of mobile phones boasts great processing capability and improved battery life. Much of the licensing for these technologies traces back to Transmeta Corp.’s Crusoe processor, designed in 2000. With great power comes great heat, short battery life and heavy electricity consumption – hence Transmeta’s goal of a low-power processor that would be hard to beat.
We rarely think about the enormous storage capacity in our hands. Thanks to Toshiba’s NAND flash memory, introduced in 1989, it is a reality. Before flash memory came along, the only way to store what then passed for large amounts of data was magnetic tape, floppy disks and hard disks. Now NAND flash is a key piece of every gadget – cellphones, cameras, music players and, of course, the USB drives that techies love to wear around their necks.
We are all accustomed to listening to music on portable MP3 players. This would not be possible had Texas Instruments not created its TMS32010 digital signal processor in 1983. The TMS32010 wasn’t the first DSP – that was Western Electric’s DSP-1, introduced in 1980 – but it was surely the fastest of its day.
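What made chips like the TMS32010 matter is the multiply-accumulate loop at the heart of digital filtering – the operation a DSP performs once per clock cycle. A minimal FIR filter in plain Python gives the flavour (purely illustrative; real DSP code runs in fixed point on dedicated hardware):

```python
# Direct-form FIR filter: each output sample is a weighted sum of
# recent input samples. The inner loop is the multiply-accumulate
# (MAC) operation that DSP chips execute in a single cycle.
def fir_filter(signal, taps):
    """Convolve `signal` with filter coefficients `taps`."""
    out = []
    for n in range(len(signal)):
        acc = 0.0
        for k, tap in enumerate(taps):
            if n - k >= 0:
                acc += tap * signal[n - k]  # the MAC step
        out.append(acc)
    return out

# A 3-point moving average smooths an abrupt step in the input.
smoothed = fir_filter([0, 0, 3, 3, 3], [1/3, 1/3, 1/3])
```

Audio codecs such as MP3 lean on exactly this kind of kernel, which is why a fast MAC unit made portable music practical.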
We could go on and on about the unsung contributions still at play. A winner will always have to combine these into an offering that can transform and improve lives. No amount of hype becomes reality but for the contributions of these inventions, technologies and inventors.
Sourced from Subramanian Gopalaratnam, Xchanging