We are constantly bombarded with new technologies described as revolutionary in terms of their impact. Currently, the hot technology names are “big data” and “machine learning,” which used to be buried in something called “artificial intelligence.”
Collecting business data and analyzing it is clearly valuable, and machine learning refers to computer software that learns to perform specified tasks from data rather than by explicit programming. The concepts behind both areas are not new; what is new is the degree of publicity given to what were once specialized areas of activity.
“Big data” is now promoted as a creator of massive corporate value because mining such data is said to yield remarkable business insights. We also hear a lot about the revolutionary “internet of things,” which has little to do with the internet but is simply the idea of connecting physical sensors to monitor industrial processes. Each, we hear, promises to create billions of dollars of business value.
“Machine learning” is now promoted as the key to replacing an ever larger number of humans in industrial activity, because computer programs will outperform people in many areas — including driving cars.
Such a highly promoted technology buzz carries big risks for corporate management. Consultants beat down doors to offer their services. Are such investment programs justified? Answering such questions requires an understanding of what the technology can do and what results it actually delivers — and the ability to work with skilled technologists to focus clearly on the desired business results. Without that understanding, senior management is left in a reactive mode that generally produces poor business results. Unfortunately, few industrial leaders are equipped to manage or assess the impact of such technologies on their business. The problem is attributable to the fact that even the best colleges do not require science of all students.
The issue was brought home to me in a recent interview. “Have you ever heard of Isaac Newton?” I asked a young applicant for a research analyst position in private equity technology investment. “Sure,” he answered. “In our high school science class we learned how a falling apple led him to discover gravity.” “Then perhaps you can tell me something about Albert Einstein and why he is considered the greatest scientist since Newton?” Silence. This young man had graduated from one of the most prestigious colleges in the United States with a major in economics, but he had never taken a science course. “Why?” I asked. “Because such courses are too hard,” he answered, “and I needed to keep a very high grade average to get a job on Wall Street.”
The education of this young man is, unfortunately, common. I once hired a Harvard College graduate who had majored in physics and asked her why so few of the obviously very bright students majored in the sciences. “Unless you want to become a scientist or an engineer, there is no professional motivation to put in the hard work,” she said. “You propeller heads will work for us eventually” was the common remark from classmates who had chosen far easier liberal arts or economics majors offering attractive career prospects in finance and management.
Science classes are avoided because they require rigorous thinking in solving problems based on an understanding of fundamentals. To avoid the pain of real learning, college science courses are commonly diluted down to teaching general concepts. “Science for poets” is one way of describing such superficial courses.
The result of this stress-free college education is troubling. Here is a recent headline from the Wall Street Journal (June 6, 2017): “Many colleges fail in teaching how to think — a test finds students often gain little ability to assess evidence, make a cohesive argument.” Such students will find their way into industrial, financial and political leadership in a world driven by technological change without a basic understanding of what really drives innovation. For example, we hear talk of “Moore’s Law” driving the dramatic year-to-year improvement in computing power, but I have rarely found financial analysts who know that the driving component is the transistor. Such ignorance may not matter day to day, but when the next exciting new technology that might replace transistors in computers — for example, “quantum computers” — emerges, such industry analysts fall into the common trap of touting its virtues without understanding its practical limitations or commercial value. The writings of such analysts can drive public perceptions of corporate value.
Now imagine the situation of corporate CEOs who face investment decisions but cannot even evaluate the opinions of technologists because they lack a basic understanding of the physical principles behind the technologies.
Unfortunately, many lack the kind of education needed to cope with a technology-driven age. This is not a new problem and I believe it accounts for many missed business opportunities.
Here is a bit of personal history.
I spent the first part of my career at the RCA Laboratories as a scientist. At that time, the company was a world leader in consumer and military electronics as well as broadcasting and communications. Given the diversity of the business, senior management was constantly faced with investment decisions in various business sectors. However, the top management of the company was in the hands of people without technical training.
As I rose into corporate management as a vice president, it became clear to me that the corporate CEO and his staff, who came from financial backgrounds, were baffled by the technological issues facing the company. For example, I found myself trying to recommend investments in a chip plant to people who had no idea what the core technology issues were or why they mattered to the company. All they saw was risk, not opportunity.
Over time, management gravitated toward investments in consumer products and neglected the core strength of the company, which was electronic technologies. As a result, instead of investing in its core capabilities, RCA launched in the 1970s a series of acquisitions of “low-tech” businesses deemed low risk, such as packaged foods, rental cars, financial services and carpets. Meanwhile, it neglected investment in areas like flat-panel liquid crystal displays (invented at the RCA Laboratories) that eventually came to dominate consumer electronics and computer systems.
The company was eventually acquired by General Electric after it had lost its strategic way under successive CEOs who focused on consumer products instead of technology, because the risk/reward analysis of technology investments was beyond their understanding.
Today the importance of technology as a driver of economic growth is increasing. It is not simply an issue of corporate investment decisions, but also of the government’s role in funding research and development. At the national level, US investment in research and development in core materials and electronic technologies has declined. I believe a big reason is that the national political leadership is so poorly schooled in the fundamentals linking such investment to defense capabilities and economic growth.
What has remained constant is investment in medical research through the National Institutes of Health. Is it because the NIH keeps funding a possible cure for Alzheimer’s, a disease that congressmen universally fear?
Dr. Henry Kressel is a technologist, inventor and long-term private equity investor. He formerly headed corporate research and development at the RCA Laboratories in Princeton, NJ. He is the author of five books, including (with Norman Winarsky) If You Really Want to Change the World (Harvard Business Review Press, 2015).