Shiny Objects and Professionalism
An interesting question arose at one of my customer meetings recently. We were discussing innovation; in particular, how important it is for technology staff to stay abreast of new technologies to see whether those technologies could add value to the enterprise. We agreed that current best practices suggest conducting rapid experiments to test ideas, and that such small, low-risk experiments should become part of everyday IT practice. But, someone asked, how can we then make sure that IT employees are not just wasting their time continually grasping at shiny new objects? The question is even more vexing given that IT vendors and consultants are continuously defining and promoting new buzzwords and product categories. Should enterprise technologists be spending (wasting?) time now on digital twins, blockchain, quantum computing, and deep learning?
How can we foster a culture of innovation and still avoid wasting time on every bright, shiny new object that comes along?
I think this is a really interesting and subtle issue. Let me begin by saying that I am a bit uncomfortable with the framing of the question. The fear that technologists will waste time on fancy new technologies, whether or not those technologies benefit the business, is an old one, and, as I have pointed out in my books, it rests on stereotypes that probably no longer hold. Technologists should be expected to show good judgment in choosing which new technologies to invest time in, and how much time to invest, based on a realistic assessment of whether the technology might be relevant. And, in general, I believe that they are capable of doing so.
Because the cost and risk of experimentation are so low, and the potential benefit in a fast-moving, competitive business landscape is so high, most enterprises should probably err on the side of more exploration, considering even new technologies whose business value is not yet evident. For example, when AWS releases an interesting new service, it is quick to spin up a sandbox environment to try it out, and because enterprises pay only for what they consume, they can try the service very cheaply as long as they do so at a small scale. A company that wants to create digital advantages through innovation has to accept the cost of trying out new ideas, even if a percentage of the experiments do not lead to major breakthroughs. It can take a portfolio approach: invest very small amounts in a number of possible technologies, then double down on the ones that seem promising.
But there is a deeper issue at play. Technologists, I would argue, are professionals, and like other professionals they have an obligation to keep current with the knowledge in their fields. I expect my doctors and nurses to be current in their knowledge of medical best practices, new drugs and treatments, and new diagnostic procedures. Professionals are not just executors; they are experts, advisors, and decision makers. IT technologists are much more than knowledge workers: they are trained experts in their domains, and the enterprises that hire them expect them to be, and to remain, current in their skills.
IT technologists have an obligation to keep learning and to make sure that they know enough about every major technological development relevant to their area of practice. Yes, most of us must know enough about blockchain, quantum computing, deep learning, and yesterday's other new developments to be able to make reasonable decisions and recommendations on behalf of our enterprises. It is impossible to keep up with every development (I am guessing that the same is true in medicine), but IT professionals have a responsibility to be familiar with the important ones, including those that are not directly relevant to what they are doing today.
Given this, I am not satisfied with the current state of play. In many IT organizations, employees do not have a basic familiarity (a literacy, you might say) with today's tools of the trade: the cloud, DevOps, automated infrastructure, and functional programming, for example, let alone a familiarity with what might be coming tomorrow. This might very well be the consequence of a long-held view that IT folks should focus only on what is immediately relevant to the business, or a fear that anything they touch will lead to extra costs for the enterprise. It could also be the result of unsustainable demands on IT employees' time (late nights, constant on-call urgencies) that make it hard for them to spend time studying new developments. But this way of thinking just will not do as we move into the continuous innovation environment of the digital world.
This of course raises the question of whether IT professionals should be advancing their knowledge on their own time or the company's time. Here it may be hard to draw comparisons with other professional fields, where many practitioners are self-employed and perhaps don't really distinguish between the two. But in general, I think, enterprises need to make sure that their IT employees have opportunities to refresh their skills, whether through training or simply time set aside for hands-on experimentation. The enterprise will benefit from it. And this includes skills whose benefit to the company might not be immediately apparent, because that is precisely what innovation means: new ideas that would not have been relevant to what the company did yesterday.
Clearly, the enterprise has to avoid grabbing at every shiny object. But if it wants to be innovative, it must be open to a wider variety of shiny objects and must employ IT professionals who are current enough in their knowledge to know which shiny object might turn out to be gold.
A Seat at the Table: IT Leadership in the Age of Agility
The Art of Business Value
War and Peace and IT: Business Leadership, Technology, and Success in the Digital Age (now available for pre-order!)