When evaluating new technologies, companies typically focus on the direct financial impact of the purchase: specifically, return on investment (ROI) and total cost of ownership (TCO). But when it comes to the shift to cloud computing, these traditional calculations, while valuable, may not reveal the full impact of the transition.
According to a recent report by MIT Technology Review Insights, additional criteria such as competitiveness, productivity, and new revenue opportunities come into play. Though difficult to measure, these factors have a clear impact on revenue growth and profitability. Indeed, there are already demonstrable productivity growth gaps between companies using legacy systems and those using cloud-based emerging technologies, including artificial intelligence (AI), machine learning (ML), the Internet of Things (IoT), and blockchain. Low-productivity industries such as healthcare, retail, and education typically wait longer to adopt new information technologies, while industries with high productivity growth, such as oil and gas extraction, media and communications, and agriculture, are continuously adopting new technologies for automation and product development, investing five times more than organizations in low-productivity industries. The takeaway: future productivity gains depend on digital (read: cloud) capabilities.
Research by the McKinsey Global Institute estimates that 60 percent of productivity-boosting opportunities during the next decade will be digital, yet US and European companies are currently capturing less than 20 percent of that potential. One CIO calls this gap a "technological debt" that will have to be paid sooner or later, either by playing catch-up with more technologically advanced peers or by suffering competitive disadvantages.