Slowly but steadily, pressure is building for industrial companies to come to grips with the fact that technology-enabled disruptive change is here. This creates tension in most companies, because the culture in many industries is quite conservative when it comes to innovation. Product designers may be keen to innovate in their traditional areas of expertise, yet still be reluctant to consider new designs that rely on software and enhanced connectivity. Production groups find it difficult to justify even the time it takes to consider new ways of operating, and making do with aging production systems and techniques is the norm. So when a company's visionaries talk about the 'Internet of Things', 'Industrie 4.0', AI, or other terms for technology-enabled digital transformation, they are taking a risk. But are they right? To be sure, there's plenty of hype in the market. It's easy to find incredible claims about "xx billions of connected devices" or "yy trillions of dollars." On the other hand are the naysayers who dismiss the trends, saying "we've been doing it for years" or "we'll never do that." You can even find surveys from respectable firms that say "half of manufacturing executives don't know about IoT" or some other comforting statistic. We're not the only ones, you think. Maybe it's not real. And if it is, it looks like I've got plenty of time to figure it out. Wrong. That building pressure comes from the growing gap between two forces: the exponential growth of computing technologies, and the linear thinking and inertia of humans and human systems. And that pressure is akin to tectonic pressure building at a fault line: the changes are barely noticeable for a long time, until a seismic shift unleashes a massive change. This pressure derives from what Ray Kurzweil calls the law of […]
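To make that gap concrete, here is a minimal sketch in Python (the doubling period and the linear gain are illustrative assumptions, not figures from the article) of how a capability that doubles on a fixed cadence pulls away from one that improves by a constant amount each year:

```python
# Illustrative sketch only: the doubling period and the linear gain are
# assumptions chosen for the example, not figures from the article.

def exponential(start: float, years: float, doubling_period: float = 2.0) -> float:
    """Capability that doubles every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

def linear(start: float, years: float, gain_per_year: float = 0.1) -> float:
    """Capability that adds a fixed 10% of its starting value each year."""
    return start * (1 + gain_per_year * years)

for years in (2, 5, 10, 20):
    print(f"after {years:2d} years: "
          f"exponential {exponential(1.0, years):7.1f}x vs "
          f"linear {linear(1.0, years):4.1f}x")

# after  2 years: exponential     2.0x vs linear  1.2x
# after  5 years: exponential     5.7x vs linear  1.5x
# after 10 years: exponential    32.0x vs linear  2.0x
# after 20 years: exponential  1024.0x vs linear  3.0x
```

The absolute numbers matter less than the shape of the curves: the gap is barely noticeable for the first few years, and then it is enormous – the same dynamic as the fault-line analogy above.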
The mixture of happenstance and geostrategy that helped make a small European country key to the global semiconductor market is depicted in delicious detail in Chris Miller's "Chip War." The book couldn't have been timed better. Miller, an associate professor of international history at the Fletcher School, uses a colorful cast of characters to tell the story of a truly pivotal industry's formation, and to explain why altering it in a meaningful way seems unlikely any time soon – regardless of mounting geopolitical pressure. Chips are coveted not least for the role they play in artificial intelligence tools seemingly poised to shake things up for just about everyone. The more we want them, though, the more expensive and difficult they are to make. It's all gotten very complicated. Take the Dutch niche in the supply chain, for example – it's based on one company's machine "that took tens of billions of dollars and several decades to develop," Miller writes, and uses light to print patterns on silicon by deploying lasers that can hit 50 million tin drops per second. It's an industry full of such mind-bending extremes. In an interview with the Forum's Radio Davos podcast, Miller marvelled at having recently visited a facility in the US being built with the "seventh-biggest crane that exists in the world," which will eventually assemble chips mounted with transistors "roughly the size of a coronavirus." Nvidia, the company now most closely identified with chips powering artificial intelligence, features prominently in Miller's book. The company traces its roots to a meeting at a 24-hour diner on the fringes of Silicon Valley, he writes. At a certain point it realised that its semiconductors used for video-game graphics could do a good job of training AI systems. Earlier this year, its market value increased by $184 billion in a single day. Nvidia's chips aren't made anywhere near […]
Just like the world at large, the world of work shifts and changes over time. The future of work refers to an informed perspective on what businesses and other organisations need to know about how work could shift (given digitisation and other trends), plus how workforces and workplaces can prepare for those changes, big and small. When you think of the future of work, what do you picture? Offices that look more or less like today’s? Factories full of robots? Or something else entirely? While no one can predict the future with absolute certainty, the world of work is changing, just as the world itself is. Looking ahead at how work will shift, along with trends affecting the workforce and workplaces, can help you prepare for what is next: One in sixteen workers may have to switch occupations by 2030. That is more than 100 million workers across the eight economies studied—and the pandemic accelerated expected workforce transitions. Job growth will be more concentrated in high-skill jobs (for example, in healthcare or science, technology, engineering, and math [STEM] fields), while middle- and low-skill jobs (such as food service, production work, or office support roles) will decline. A few job categories could see more growth than others. The rise of e-commerce created demand for warehouse workers; investments in the green economy could increase the need for wind turbine technicians; aging populations in many advanced economies will increase demand for nurses, home health aides, and hearing-aid technicians; and teachers and training instructors will also continue to find work over the coming decade. But other types of jobs may be at risk: for example, as grocery stores increasingly install self-checkout counters, there may be a need for fewer clerks, and robotics used to process routine paperwork may lessen demand for some office workers. The future of work was shifting […]
Two management and technology experts show that AI is not a job destroyer, exploring worker-AI collaboration in real-world work settings. This book breaks through both the hype and the doom-and-gloom surrounding automation and the deployment of artificial intelligence-enabled—"smart"—systems at work. Management and technology experts Thomas Davenport and Steven Miller show that, contrary to widespread predictions, prescriptions, and denunciations, AI is not primarily a job destroyer. Rather, AI changes the way we work—by taking over some tasks but not entire jobs, freeing people to do other, more important and more challenging work. By offering detailed, real-world case studies of AI-augmented jobs in settings that range from finance to the factory floor, Davenport and Miller also show that AI in the workplace is not the stuff of futuristic speculation. It is happening now to many companies and workers. These cases include a digital system for life insurance underwriting that analyzes applications and third-party data in real time, allowing human underwriters to focus on more complex cases; an intelligent telemedicine platform with a chat-based interface; a machine learning system that identifies impending train maintenance issues by analyzing diesel fuel samples; and Flippy, a robotic assistant for fast food preparation. For each one, Davenport and Miller describe in detail the work context for the system, interviewing job incumbents, managers, and technology vendors. Short "insight" chapters draw out common themes and consider the implications of human collaboration with smart systems.
Companies believe smart manufacturing is key to their business' future success, The State of Smart Manufacturing Report reveals. Plex Systems, a Rockwell Automation company specialising in cloud-delivered smart manufacturing solutions, has announced the results of its 7th annual study, "The State of Smart Manufacturing Report." The research reveals that smart manufacturing adoption accelerated by 50% globally in 2021, and these new technologies are now solving the industry's critical challenges. Asia-Pacific (APAC) organisations (93%) view smart manufacturing as "very" or "extremely" important to future success, compared to North America (NA) (84%) and Europe, Middle East, and Africa (EMEA) (75%). Reflecting their optimistic attitude towards smart manufacturing, most APAC companies reported a shorter timeframe – "within the next 7 to 11 months" – for adoption of smart manufacturing solutions, compared with NA and EMEA's majority response of "within the next 1-2 years". The global study surveyed more than 300 manufacturers in a variety of industries including automotive, aerospace, food and beverage, electronics, consumer goods, plastics and rubber, precision metal forming, and more. This year's report, conducted in collaboration with Hanover Research, offers valuable insights into trends, challenges, and future plans for global manufacturers. The report aims to help manufacturers benchmark their technology strategy and incorporate smart manufacturing best practices to stay competitive and thrive. A scalable technology strategy makes it possible to adopt solutions incrementally and achieve value quickly. As manufacturers look to streamline processes and solve today's challenges, they are placing significant value on using smart technology to address and improve actual business outcomes. The report's findings show the pandemic both exposed and exacerbated pre-existing conditions in manufacturing. Skilled worker shortages, competition, and supply chain disruptions are the top three challenges Asia-Pacific companies face because of the pandemic. With that in mind, nine in 10 APAC respondents (93%) believe smart […]
Companies at the forefront of the technology frontier are empowering their workers with digital technologies—and the skills they need to use them. For many members of the world's workforces, change can sometimes be seen as a threat, particularly when it comes to technology. This is often coupled with fears that automation will replace people. But a look beyond the headlines shows that the reverse is proving to be true, with Fourth Industrial Revolution (4IR) technologies driving productivity and growth across manufacturing and production at brownfield and greenfield sites. These technologies are creating more and different jobs that are transforming manufacturing and helping to build fulfilling, rewarding, and sustainable careers. What's more, with 4IR technologies in the hands of a workforce empowered with the skills needed to use them, an organisation's digital transformation journey can move from aspiration to reality. While there is a common perception that digitisation and automation are a threat to the world's workers, companies at the forefront of the technology frontier have actually created jobs—different, new roles that are much more high tech than the roles of the past. And with the current labor mismatch being felt in many countries, the time is now to further engage workers for a digitally enabled future. This focus is backed by growing research proving that workforce engagement is key. Over the last several years, the World Economic Forum, in collaboration with McKinsey, has surveyed thousands of manufacturing sites on their way to digitising operations and identified about 90 leaders. These are the lighthouses—sites and supply chains chosen by an independent panel of experts for leadership in creating dramatic improvements with technology. Together they create the Global Lighthouse Network, committed to sharing what they've learned along the way. A common theme among these sites is their worker centricity—they are supporting the […]
By Chris Miller An epic account of the decades-long battle to control what has emerged as the world's most critical resource—microchip technology—with the United States and China increasingly in conflict. You may be surprised to learn that microchips are the new oil—the scarce resource on which the modern world depends. Today, military, economic, and geopolitical power are built on a foundation of computer chips. Virtually everything—from missiles to microwaves, smartphones to the stock market—runs on chips. Until recently, America designed and built the fastest chips and maintained its lead as the #1 superpower. Now, America's edge is slipping, undermined by competitors in Taiwan, Korea, Europe, and, above all, China. Today, as Chip War reveals, China, which spends more money each year importing chips than it spends importing oil, is pouring billions into a chip-building initiative to catch up to the US. At stake is America's military superiority and economic prosperity. Economic historian Chris Miller explains how the semiconductor came to play a critical role in modern life, how the U.S. became dominant in chip design and manufacturing, and how it applied this technology to military systems. America's victory in the Cold War and its global military dominance stem from its ability to harness computing power more effectively than any other power. But here, too, China is catching up, with its chip-building ambitions and military modernization going hand in hand. America has let key components of the chip-building process slip out of its grasp, contributing not only to a worldwide chip shortage but also to a new Cold War with a superpower adversary that is desperate to bridge the gap. Illuminating, timely, and fascinating, Chip War shows that, to make sense of the current state of politics, economics, and technology, we must first understand the vital role played by chips.
– Adam Sharmer, senior partner, Dsifer
As the pace of digital innovation accelerates, and organisations look to exploit the financial and strategic advantages of digital technology, automation, and AI, data is increasingly becoming the lifeblood of organisational decision ecosystems. Whilst the collection and analysis of data to inform operational decision-making is not a new concept, advances in the connectivity of technology, AI and machine learning, as well as ERP and MES solutions, have driven exponential gains in the availability, quality, and integration of data in manufacturing organisations. Spend a few moments on LinkedIn and it's easy to become overwhelmed by the level of technology available and the visions of possible futures promised by its adoption. However, whilst there is no doubt that Industry 4.0 technologies represent a potential step-change in manufacturing performance, the reality is that the CAPEX required for investments in, for example, lights-out warehousing or production automation is beyond the reach of many organisations. As a result, the journey from the current state to the Industry 4.0 utopia can seem an insurmountable one. The good news is that, whilst the strategic adoption of technology should be on the business plan of every organisation that wants to remain competitive, a lot of value can be unlocked within the current context, or with low levels of investment. In our experience, most organisations have increased their maturity in the collection and storage of data across all or most organisational processes. However, the true value of this data is not yet being realised. A deliberate focus on the process, system, analytics, and cultural dimensions of becoming a data-driven organisation can unlock the latent value in current systems and data, and build the foundation from which to optimise Industry 4.0 technology. From our experience, data-driven organisations share the following characteristics: They collect […]
1. Flexible Batteries
Standard rigid batteries may soon be a thing of the past as thin, flexible batteries – made of lightweight materials that can be twisted, bent and stretched – reach the market. This new generation of battery technology – expected to hit a market value of $240 million by 2027 – has applications across medical wearables, biomedical sensors, flexible displays and smart watches.

2. Generative Artificial Intelligence
This year's list would not be complete without mentioning generative AI – a new type of AI capable of generating new and original content by learning from large datasets – which was catapulted into public dialogue at the end of 2022 with the public release of ChatGPT. Evolving rapidly, generative AI is set to disrupt multiple industries, with applications in education, research and beyond.

3. Sustainable Aviation Fuel
With 2%-3% of annual global CO2 emissions coming from aviation, and no sign of long-haul electric flights, sustainable aviation fuel produced from biological (e.g. biomass) and non-biological (e.g. CO2) sources could be the answer for decarbonising the aviation industry in the short to medium term.

4. Designer Phages
Phages are viruses that selectively infect specific types of bacteria. Equipped with increasingly sophisticated genetic engineering tools, scientists can now reprogramme phages to infect the bacteria of their choosing, allowing them to target one type of bacteria within a complex community of co-existing types, such as in plant, animal and human microbiomes. Though many of the near-term applications will be in research, there are signs these "designer" phages could eventually be used to treat microbiome-associated diseases or eliminate dangerous bacteria in food supply chains.

5. Metaverse for Mental Health
Responding to the growing mental health crisis, product developers are starting to build shared virtual spaces to improve mental health. Video games are already being used to treat […]