Battle in Print: What is innovation for?

Dr Norman Lewis, 27 September 2006

I have argued on previous occasions that the extraordinary attempt in contemporary society to promote innovation in business, government and the media actually represents a sign of its absence. We are not living in an era of unprecedented innovation, but rather an era of risk-averse short-termism that has reduced research and development (R&D) to an adjunct of market-driven instrumentalism (Lewis 16.11.2005). I do not want to repeat these points, although they remain very relevant to answering the question of what innovation is for.

The question of innovation is intimately linked to that of technology. The development of technology by human society has never simply been about the provision of necessities, such as food and shelter. Human beings achieved these basics very early. Rather, the development of technology has been part of the social evolution and progress of human society. Unlike animals, which remain atechnical and thus survive merely by living in the natural world, human beings have developed technologies to embellish their mere existence. The history of human evolution has been a continuous redefinition of the idea of necessity to include more. And that has meant that technological innovation has been, and remains, a fundamental part of what it means to be human. As David Nye (2006: 2) puts it in his highly illuminating study Technology Matters, ‘Necessity is often not the mother of invention. In many cases, it surely has been just the opposite, and invention has been the mother of necessity’. He goes on to make the key point that when humans have created tools they have excelled at finding new uses for them: ‘The tool often exists before the problem to be solved. Latent in every tool are unforeseen transformations’. In short, social mediation transforms technologies into acceptable and socially useful adjuncts to human life. Social circumstances determine precisely how a technology will be adopted, used or rejected.

This is an important point to establish at the outset because it reveals that innovation, or the spur to innovation, is the practical application of knowledge to existing problems. Innovation is about human problem solving. Practical problem solving has led to a quest for knowledge about the natural world, which in turn nurtured an increased ability to solve new problems facing society. Many people assume, quite erroneously, that technological innovation has resulted from applied science. Through most of history, science has arisen after practical experimentation posed intellectual questions about engagement with matter and mechanisms. Thomas Edison built his electrical systems without the help of the mathematical equations that explained electricity; Thomas Newcomen, the creator of the first practical steam engine, was an artisan who built his engine by trial and error; the Wright Brothers were bicycle mechanics with no formal training in the science of aerodynamics and flight. In most cases, the science and knowledge that followed enabled a refinement of the inventions and gave rise to an enormous wealth of innovations that have defined the modern world. This interaction between the impulse to solve practical problems and the knowledge and experience it subsequently gives rise to, which can then be applied to solve new and more complex problems, and so on, is what might be termed the social process of innovation. At its base is the fundamental idea of individuals who actively engage in the daily creation and recreation, production and reproduction of the world in which they live.

In trying to theorise this process it is critical to distinguish between invention, innovation and diffusion across society. Invention, the idea or concept for a new product or process, which can then be implemented as a useful product (innovation), always arises from within existing ideas and social circumstances. Inventions and innovations do not exist in isolation. Each is an open-ended set of problems and possibilities. And each is an extension of human lives: the inventor who thinks about a problem, the person who takes this idea and makes something, the person who buys it, the people who oppose it, and the many who will use and interpret it. In other words, technology adoption, or the dissemination of innovation, is never a one-way process but a constantly evolving interaction. Context will shape innovation just as innovation can change and shape context. Technological innovation and adoption is therefore a process that entails change in both directions, a constantly evolving iteration that certainly cannot be easily predicted or pre-determined.

The modern idea of R&D has to some extent attempted to codify these distinctions into connected but separate processes. Research is thought of primarily as invention, while development is thought of as the translation of research into products and services that can be successfully commercialised. The added problem we have to contend with today is the notion that R&D has produced such compelling innovations that their adoption and dissemination are foregone conclusions. When people argue, for example, that the adoption of the internet, or radio, or television was ‘inevitable’, the assumption is that these results of R&D were so appealing to consumers that adoption was guaranteed, a natural outcome of the free market. But history tells a different story: in the cases of the telegraph, telephone, phonograph and personal computer, creating demand was far more difficult than inventing the technologies. Samuel Morse had difficulty getting anyone to invest in his telegraph, taking five years to persuade the US Congress to pay for the first substantial telegraph line; Alexander Graham Bell could not find anyone to invest in his patent on the telephone, so he was reduced to marketing it himself; Edison found few commercial applications for his phonograph and only belatedly, after numerous attempts to develop bizarre products such as speaking dolls, other toys and speaking clocks, did he stumble upon the phonograph as a musical instrument, almost as an afterthought; and in the 1970s, when a prototype personal computer was shown to a group of MIT professors, they were uninterested because they could not think of many uses for it. Thus, four technologies that were among the most important inventions in the history of communications were initially understood as curiosities. While the inventions were difficult to comprehend, the innovations they spawned were more easily assimilated. The ‘unforeseen transformations’ were realised through social interactions, many of which were unanticipated.

In fact, the history of technological innovation is a history of unforeseen transformations. While R&D spending did result in a proliferation of remarkable inventions, products and processes, many technologies invented or conceived for one clearly defined use have acquired other, unexpected uses over time. More often than not, the answers engineers, inventors or companies give to the question ‘What is it for?’ prove to be wildly at variance with what subsequently happens to their inventions or products in the marketplace. Think about the internet itself: conceived as a military and scientific communication system, designed in a decentralised fashion so that it could not be destroyed by power failures, failed computers or war, it has spawned a global communications and entertainment revolution built upon an end-to-end architecture that no one can control. The designers of the internet never anticipated the development of an eBay, an Amazon, a Google, a social networking website like MySpace with over 100 million registered users, the proliferation of pornography or massively multiplayer online games. More recently, the evolution of texting over mobile phones provides an even starker example: mobile operators never anticipated the potential of this dimension of the technology they provided. But consumers did. From the phonograph and radio to telephones and video recorders, the household fridge and mobile phone ringtones, many of the innovations that have improved the quality of life for millions, and that we now take for granted, have resulted from unanticipated, consumer-driven behaviours.

Thus, when we ask ‘What is innovation for?’, the answer depends very much upon the perspective from which the question is being asked. If it is a broad enquiry as to whether a specific innovation solves some real problem, it is a legitimate question, indeed a necessary one. Unfortunately, this is not the general context within which it is being posed. Since the 1980s there has been a marked shift in the source and aim of funding for R&D across the globe. As government R&D spending has fallen and private sector funding has risen, funding has come to follow the business cycle, with its focus upon short-term results. As the report of the Task Force on the Future of American Innovation (2005: 10) makes clear, ‘71 per cent of these private funds were for development, not basic research’. More recent trends are consolidating this instrumentalism: for example, the outsourcing of R&D. After years of squeezing costs out of the factory floor, back office and warehouses, CEOs are increasingly looking at their once-protected R&D operations, with the result that many are now locating R&D in low-wage countries like India and China. Boeing, for example, is co-developing software for its navigation systems and landing gear with India’s HCL; pharmaceutical giants like GlaxoSmithKline and Eli Lilly are teaming up with Asian biotech research companies in a bid to cut the average $500 million cost of bringing a new drug to market (Business Week 21.3.2005). An even more recent trend, ‘crowdsourcing’, which attempts to leverage the participation of networks of experts from across the globe in the development of products, is rapidly changing the face of R&D. Again, Eli Lilly is at the forefront: it funded InnoCentive as a network to connect brainpower outside the company to help develop drugs and speed them to market (Howe 2006).

On one level there is nothing wrong with introducing new processes that help reduce the costs of research and speed up time to market for new products. This is part of the innovation impetus. But many leading innovators are themselves beginning to raise concerns about corporate R&D moving away from the kind of fundamental research that wins Nobel prizes towards a narrow focus on business goals (see Rose 2006). However, this concern should not be fetishised, for the reasons outlined above. The real problem with asking ‘What is it for?’ is the attempt to guarantee outcomes, which, as I have argued, is precisely what cannot be stipulated at the outset. It reveals a mindset that regards technology as a constant subject to be exploited, and human beings and their choices as objects to be manipulated. Ultimately, this elevates technology into an autonomous force with an apparently inexorable uptake, while the human subject is diminished to the role of passive consumer. But as we have seen, the history of innovation has been the remarkable interaction between mankind and machines, between human pragmatism and scientific knowledge. Both sides have been enriched through this creative interaction and fusion.

Thus, the problem goes far beyond the proportion of spending devoted to research as against development, now or in the future. If unpredictability and creativity, and thus human agency, are removed from the process, no amount of spending on research can produce the ideas and inventions that can solve the problems facing humanity in the twenty-first century. Instead, we will be reduced to an ever-diminishing, dumbed-down consumer culture. As Thomas Hughes (2004: 6) eloquently puts it in his wonderful Human-Built World:

...in a secular age dedicated to a consumer culture, we do not see technology in the grand perspective suggested by Leonardo [Da Vinci] and Goethe. We are content to let inventors and entrepreneurs, energised by market forces, lay claim to the laurels of creativity… Goethe’s Faust would hardly have asked the moment of creation to linger, if the result was simply one more consumer good.

Dr Norman Lewis is the Director of Technology Research for the Orange Home, UK. He is writing a book on digital children and their encounter with innovation in a risk-averse culture, to be published next year by Continuum Books.

References

Business Week (21.3.2005). ‘Special Report on Outsourcing Innovation’. Business Week: 46-53.

Howe, J. (2006). ‘The Rise of Crowdsourcing’. Wired Magazine, 14.06, June.

Hughes, T.P. (2004). Human-Built World: How to Think about Technology and Culture. Chicago and London, The University of Chicago Press.

Lewis, N. (16.11.2005). ‘Innovation in an Era of Caution’. spiked.

Nye, D.E. (2006). Technology Matters. Cambridge, MA, MIT Press.

Rose, F. (2006). ‘Moore’s Life and Law Revisited’. Wired Magazine, 14.03, March.

Task Force on the Future of American Innovation (2005). The Knowledge Economy: Is the United States Losing Its Competitive Edge? Benchmarks of Our Innovation Future. 16 February.
