Slowly but steadily, pressure is building for industrial companies to come to grips with the fact that technology-enabled disruptive change is here. This creates tension in most companies, because the culture in many industries is quite conservative when it comes to innovation. Product designers may be keen to innovate in their traditional areas of expertise, yet still be reluctant to consider new designs that rely on software and enhanced connectivity. Production groups find it difficult to justify even the time it takes to consider new ways of operating, and making do with aging production systems and techniques is the norm. So when a company’s visionaries talk about the ‘Internet of Things’, ‘Industrie 4.0’, AI, or other terms for technology-enabled digital transformation, they are taking a risk. But are they right? To be sure, there’s plenty of hype in the market. It’s easy to find incredible claims about “xx billions of connected devices” or “yy trillions of dollars.” On the other hand are the naysayers who dismiss the trends, saying “we’ve been doing it for years” or “we’ll never do that.” You can even find surveys from respectable firms that say “half of manufacturing executives don’t know about IoT” or some other comforting statistic. We’re not the only ones, you think. Maybe it’s not real. And if it is, it looks like I’ve got plenty of time to figure it out. Wrong. That building pressure comes from the growing gap between two forces: the exponential growth of computing technologies, and the linear thinking and inertia of humans and human systems. That pressure is akin to tectonic pressure building at a fault line: the changes are barely noticeable for a long time, until a seismic shift unleashes massive change. This pressure derives from what Ray Kurzweil calls the law of […]
The mixture of happenstance and geostrategy that helped make a small European country key to the global semiconductor market is depicted in delicious detail in Chris Miller’s “Chip War.” The book couldn’t have been timed better. Miller, an associate professor of international history at the Fletcher School, uses a colorful cast of characters to tell the story of a truly pivotal industry’s formation, and to explain why altering it in a meaningful way seems unlikely any time soon – regardless of mounting geopolitical pressure. Chips are coveted not least for the role they play in artificial intelligence tools seemingly poised to shake things up for just about everyone. The more we want them, though, the more expensive and difficult they are to make. It’s all gotten very complicated. Take the Dutch niche in the supply chain, for example – it’s based on one company’s machine “that took tens of billions of dollars and several decades to develop,” Miller writes, and uses light to print patterns on silicon by deploying lasers that can hit 50 million tin drops per second. It’s an industry full of such mind-bending extremes. In an interview with the Forum’s Radio Davos podcast, Miller marvelled at having recently visited a facility in the US being built with the “seventh-biggest crane that exists in the world,” which will eventually assemble chips mounted with transistors “roughly the size of a coronavirus.” Nvidia, the company now most closely identified with chips powering artificial intelligence, features prominently in Miller’s book. The company traces its roots to a meeting at a 24-hour diner on the fringes of Silicon Valley, he writes. At a certain point it realised that its semiconductors used for video-game graphics could do a good job of training AI systems. Earlier this year, its market value increased by $184 billion in a single day. Nvidia’s chips aren’t made anywhere near […]
-Robert Le Busque, Regional Vice-President, Asia Pacific, Verizon Business

In Australia, the rapid development of artificial intelligence (AI) has coincided with the Government’s announcement of the $15 billion National Reconstruction Fund (NRF), $1 billion of which will be dedicated to reinvigorating, accelerating, and building up advanced manufacturing capabilities. As businesses have already started adopting AI for various use cases, the Government has also recently launched a public consultation to seek advice on how to mitigate potential risks and ensure the technology is applied responsibly. One of these use cases is advanced manufacturing, a broad term encompassing technologies applied to any manufacturing facility, including application software and AI tools that augment and generate higher-value manufacturing output. Advanced manufacturing is also likely to use AI to embed faster and safer means of production and better operations management for industrial processes.

Generating speed, safety and enhanced production

Part of removing human error to make manufacturing safer and faster means reducing reliance on human observation. Machinery drivers and operators are among the top three occupation groups with the highest rates of work-related injuries in Australia, at 6.5 percent in the 2021-2022 financial year, which makes a strong case for AI analytics in augmenting workplace health and safety. High-precision tracking, for example, detects faults in machinery micro-components at a scale previously unachievable; this improves productivity, output, and profitability for manufacturing companies. AI analytics can also map workers’ physical positions in relation to how they interact with machines, particularly heavy industrial machinery, enabling a more comprehensive understanding of safety and injury risk in workplace operations. AI software can also manage workflows more effectively by identifying and allocating tasks.
This can be taken one step further, with large site and heavy industrial manufacturing operations using digital twins to optimise these workflows – […]
Just like the world at large, the world of work shifts and changes over time. The future of work refers to an informed perspective on what businesses and other organisations need to know about how work could shift (given digitisation and other trends), plus how workforces and workplaces can prepare for those changes, big and small. When you think of the future of work, what do you picture? Offices that look more or less like today’s? Factories full of robots? Or something else entirely? While no one can predict the future with absolute certainty, the world of work is changing, just as the world itself is. Looking ahead at how work will shift, along with trends affecting the workforce and workplaces, can help you prepare for what is next: One in sixteen workers may have to switch occupations by 2030. That is more than 100 million workers across the eight economies studied—and the pandemic accelerated expected workforce transitions. Job growth will be more concentrated in high-skill jobs (for example, in healthcare or science, technology, engineering, and math [STEM] fields), while middle- and low-skill jobs (such as food service, production work, or office support roles) will decline. A few job categories could see more growth than others. The rise of e-commerce created demand for warehouse workers; investments in the green economy could increase the need for wind turbine technicians; aging populations in many advanced economies will increase demand for nurses, home health aides, and hearing-aid technicians; and teachers and training instructors will also continue to find work over the coming decade. But other types of jobs may be at risk: for example, as grocery stores increasingly install self-checkout counters, there may be a need for fewer clerks, and robotics used to process routine paperwork may lessen demand for some office workers. The future of work was shifting […]
Two management and technology experts show that AI is not a job destroyer, exploring worker-AI collaboration in real-world work settings. This book breaks through both the hype and the doom-and-gloom surrounding automation and the deployment of artificial intelligence-enabled—“smart”—systems at work. Management and technology experts Thomas Davenport and Steven Miller show that, contrary to widespread predictions, prescriptions, and denunciations, AI is not primarily a job destroyer. Rather, AI changes the way we work—by taking over some tasks but not entire jobs, freeing people to do other, more important and more challenging work. By offering detailed, real-world case studies of AI-augmented jobs in settings that range from finance to the factory floor, Davenport and Miller also show that AI in the workplace is not the stuff of futuristic speculation. It is happening now to many companies and workers. These cases include a digital system for life insurance underwriting that analyzes applications and third-party data in real time, allowing human underwriters to focus on more complex cases; an intelligent telemedicine platform with a chat-based interface; a machine learning system that identifies impending train maintenance issues by analyzing diesel fuel samples; and Flippy, a robotic assistant for fast food preparation. For each case, Davenport and Miller describe in detail the work context for the system, interviewing job incumbents, managers, and technology vendors. Short “insight” chapters draw out common themes and consider the implications of human collaboration with smart systems.
Companies believe smart manufacturing is key to their business’ future success, The State of Smart Manufacturing Report reveals. Plex Systems, a Rockwell Automation company specialising in cloud-delivered smart manufacturing solutions, has announced the results of its 7th annual study, “The State of Smart Manufacturing Report.” The research reveals that smart manufacturing adoption accelerated by 50% globally in 2021, and these new technologies are now solving the industry’s critical challenges. Asia-Pacific (APAC) organisations (93%) view smart manufacturing as “very” or “extremely” important to future success, compared to North America (NA) (84%) and Europe, Middle East, and Africa (EMEA) (75%). Reflecting their optimistic attitude towards smart manufacturing, most APAC companies reported a shorter timeframe – “within the next 7 to 11 months” – for adoption of smart manufacturing solutions, compared with NA and EMEA’s majority response of “within the next 1-2 years”. The global study surveyed more than 300 manufacturers in a variety of industries including automotive, aerospace, food and beverage, electronics, consumer goods, plastics and rubber, precision metal forming, and more. This year’s report, conducted in collaboration with Hanover Research, offers valuable insights into trends, challenges, and future plans for global manufacturers. The report aims to help manufacturers benchmark their technology strategy and incorporate smart manufacturing best practices to stay competitive and thrive. A scalable technology strategy makes it possible to incrementally adopt solutions and achieve value quickly. As manufacturers look to streamline processes and solve today’s challenges, they are placing significant value on using smart technology to address and improve actual business outcomes. The report’s findings show the pandemic both exposed and exacerbated pre-existing conditions in manufacturing.
Skilled worker shortages, competition, and supply chain disruptions are the top three challenges being faced by Asia-Pacific companies because of the pandemic. With that in mind, nine in 10 APAC respondents (93%) believe smart […]
Companies at the forefront of the technology frontier are empowering their workers with digital technologies—and the skills they need to use them. For many members of the world’s workforces, change can sometimes be seen as a threat, particularly when it comes to technology. This is often coupled with fears that automation will replace people. But a look beyond the headlines shows that the reverse is proving to be true, with Fourth Industrial Revolution (4IR) technologies driving productivity and growth across manufacturing and production at brownfield and greenfield sites. These technologies are creating more and different jobs that are transforming manufacturing and helping to build fulfilling, rewarding, and sustainable careers. What’s more, with 4IR technologies in the hands of a workforce empowered with the skills needed to use them, an organisation’s digital transformation journey can move from aspiration to reality. While there is a common perception that digitisation and automation are a threat to the world’s workers, companies at the forefront of the technology frontier have actually created jobs—different, new roles that are much more high tech than the roles of the past. And with the current labor mismatch being felt in many countries, the time is now to further engage workers for a digitally enabled future. This focus is backed by growing research proving that workforce engagement is key. Over the last several years, research by the World Economic Forum, in collaboration with McKinsey, has surveyed thousands of manufacturing sites on their way to digitising operations and identified about 90 leaders. These are the lighthouses—sites and supply chains chosen by an independent panel of experts for leadership in creating dramatic improvements with technology. Together they form the Global Lighthouse Network, committed to sharing what they’ve learned along the way. A common theme among these sites is their worker centricity—they are supporting the […]
Stephen D. King With the title clearly a take on the Lionel Shriver book, “We Need to Talk About Kevin”, one is immediately warned to approach Stephen King’s book with trepidation. Like Kevin, inflation in this book is bad, particularly if out of control. And like Kevin’s parents, those responsible for keeping inflation in check should read and understand the warning signs early enough and act decisively to tame it. Otherwise, it all ends in tragedy. Unless of course we learn the lessons – 14 of them according to King, and not just from the recent crises but going back 2,000 years. Then we can live happily ever after and in one piece, unlike the characters in Lionel Shriver’s tale. Well, that at least is the conclusion one reaches when reading King’s book. There are some lighter moments, like the recurring theme of the love/hate relationship between Richard Burton and Elizabeth Taylor to characterise what goes on between monetary authorities and the fiscal side of the equation. Periods of harmony are great. Pulling in opposite directions proves disastrous. And King offers lots of historical examples of when things went well on the inflation front and when they didn’t. There are some interesting conclusions, though. In King’s view, the argument used by central banks against tightening, that inflation was all just ‘transitory’, was fuelled by central banks having presided over a low-inflation environment for nearly two decades, falling victim to their own anti-inflation propaganda. As a result, serious mistakes were made when circumstances suddenly changed. We saw the impact of the supply and demand imbalances when Covid restrictions ended, and then the extra pressures on energy and food prices with the war in Ukraine. But in King’s view the reappearance of inflation in the last couple of […]
By Chris Miller An epic account of the decades-long battle to control what has emerged as the world’s most critical resource—microchip technology—with the United States and China increasingly in conflict. You may be surprised to learn that microchips are the new oil—the scarce resource on which the modern world depends. Today, military, economic, and geopolitical power are built on a foundation of computer chips. Virtually everything—from missiles to microwaves, smartphones to the stock market—runs on chips. Until recently, America designed and built the fastest chips and maintained its lead as the #1 superpower. Now, America’s edge is slipping, undermined by competitors in Taiwan, Korea, Europe, and, above all, China. Today, as Chip War reveals, China, which spends more money each year importing chips than it spends importing oil, is pouring billions into a chip-building initiative to catch up to the US. At stake is America’s military superiority and economic prosperity. Economic historian Chris Miller explains how the semiconductor came to play a critical role in modern life and how the U.S. became dominant in chip design and manufacturing and applied this technology to military systems. America’s victory in the Cold War and its global military dominance stem from its ability to harness computing power more effectively than any other power. But here, too, China is catching up, with its chip-building ambitions and military modernization going hand in hand. America has let key components of the chip-building process slip out of its grasp, contributing not only to a worldwide chip shortage but also to a new Cold War with a superpower adversary that is desperate to bridge the gap. Illuminating, timely, and fascinating, Chip War shows that, to make sense of the current state of politics, economics, and technology, we must first understand the vital role played by chips.