Slowly but steadily, pressure is building for industrial companies to come to grips with the fact that technology-enabled disruptive change is here. This creates tension in most companies, because the culture in many industries is quite conservative when it comes to innovation. Product designers may be keen to innovate in their traditional areas of expertise, yet still be reluctant to consider new designs that rely on software and enhanced connectivity. Production groups find it difficult to justify even the time it takes to consider new ways of operating, and making do with aging production systems and techniques is the norm. So when a company’s visionaries talk about the ‘Internet of Things’, ‘Industrie 4.0’, AI, or other terms for technology-enabled digital transformation, they are taking a risk. But are they right? To be sure, there’s plenty of hype in the market. It’s easy to find incredible claims about “xx billions of connected devices” or “yy trillions of dollars.” On the other hand are the naysayers who dismiss the trends, saying “we’ve been doing it for years” or “we’ll never do that.” You can even find surveys from respectable firms that say “half of manufacturing executives don’t know about IoT” or some other comforting statistic. We’re not the only ones, you think. Maybe it’s not real. And if it is, it looks like I’ve got plenty of time to figure it out. Wrong. That building pressure comes from the growing gap between two forces: the exponential growth of computing technologies, and the linear thinking and inertia of humans and human systems. And that pressure is akin to tectonic pressure building at a fault line: the changes are barely noticeable for a long time, until a seismic shift unleashes a massive change. This pressure derives from what Ray Kurzweil calls the law of […]
The mixture of happenstance and geostrategy that helped make a small European country key to the global semiconductor market is depicted in delicious detail in Chris Miller’s “Chip War.” The book couldn’t have been timed better. Miller, an associate professor of international history at the Fletcher School, uses a colorful cast of characters to tell the story of a truly pivotal industry’s formation, and to explain why altering it in a meaningful way seems unlikely any time soon – regardless of mounting geopolitical pressure. Chips are coveted not least for the role they play in artificial intelligence tools seemingly poised to shake things up for just about everyone. The more we want them, though, the more expensive and difficult they are to make. It’s all gotten very complicated. Take the Dutch niche in the supply chain, for example – it’s based on one company’s machine “that took tens of billions of dollars and several decades to develop,” Miller writes, and uses light to print patterns on silicon by deploying lasers that can hit 50 million tin drops per second. It’s an industry full of such mind-bending extremes. In an interview with the Forum’s Radio Davos podcast, Miller marvelled at having recently visited a facility in the US being built with the “seventh-biggest crane that exists in the world,” which will eventually assemble chips mounted with transistors “roughly the size of a coronavirus.” Nvidia, the company now most closely identified with chips powering artificial intelligence, features prominently in Miller’s book. The company traces its roots to a meeting at a 24-hour diner on the fringes of Silicon Valley, he writes. At a certain point it realised that its semiconductors used for video-game graphics could do a good job of training AI systems. Earlier this year, its market value increased by $184 billion in a single day. Nvidia’s chips aren’t made anywhere near […]
-Robert Le Busque, Regional Vice-President, Asia Pacific, Verizon Business In Australia, the rapid development of artificial intelligence (AI) has coincided with the Government’s announcement of the $15 billion National Reconstruction Fund (NRF), $1 billion of which will be dedicated to reinvigorating, accelerating, and building up advanced manufacturing capabilities. As businesses have already started adopting AI for various use cases, the Government has also recently launched a public consultation to seek advice on how to mitigate any potential risks and ensure the technology is applied responsibly. One of these use cases is advanced manufacturing, a broad term that encompasses technologies applied to any manufacturing facility, including application software and AI tools to augment and generate higher-value manufacturing output. Advanced manufacturing is also likely to use AI to embed faster and safer means of production and better operations management for industrial processes.
Generating speed, safety and enhanced production
Part of removing human error to make manufacturing safer and faster means reducing reliance on human observation. Machinery drivers and operators are among the top three occupation groups with the highest rate of work-related injuries in Australia, at 6.5 percent in the 2021-2022 financial year, making a clear use case for AI analytics in augmenting workplace health and safety. High-precision tracking, for example, detects faults in machinery micro-components at a scale previously unachievable; this improves productivity, output, and profitability for manufacturing companies. AI analytics can map workers’ physical position in relation to how they interact with machines, particularly heavy industrial machinery, facilitating a more comprehensive understanding of safety and injury risk in workplace operations. AI software can also manage workflows more effectively by identifying and allocating tasks.
This can be taken one step further, with large site and heavy industrial manufacturing operations using digital twins to optimise these workflows – […]
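The fault-detection idea described above can be illustrated with a minimal sketch. This is not from the article and the sensor names and figures are hypothetical; it shows a simple z-score check over machinery sensor readings, the kind of elementary statistical building block on which production AI analytics platforms layer far more sophisticated models.

```python
# Illustrative sketch only: flag machinery sensor readings that deviate
# sharply from the rest of the series (a crude stand-in for the
# fault-detection analytics discussed in the article).
from statistics import mean, stdev

def flag_anomalies(readings, threshold=2.0):
    """Return indices of readings more than `threshold` standard
    deviations from the mean of the series."""
    mu = mean(readings)
    sigma = stdev(readings)
    if sigma == 0:
        return []  # a perfectly flat series has no outliers
    return [i for i, r in enumerate(readings) if abs(r - mu) / sigma > threshold]

# Vibration amplitudes from a hypothetical bearing sensor; one spike.
vibration = [0.51, 0.49, 0.50, 0.52, 0.48, 2.90, 0.50, 0.51]
print(flag_anomalies(vibration))  # the spike at index 5 is flagged: [5]
```

In practice a single global z-score is far too blunt for streaming industrial data (rolling windows, seasonality handling, and learned models replace it), but the structure — baseline, deviation measure, threshold — is the same.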
Two management and technology experts show that AI is not a job destroyer, exploring worker-AI collaboration in real-world work settings. This book breaks through both the hype and the doom-and-gloom surrounding automation and the deployment of artificial intelligence-enabled—“smart”—systems at work. Management and technology experts Thomas Davenport and Steven Miller show that, contrary to widespread predictions, prescriptions, and denunciations, AI is not primarily a job destroyer. Rather, AI changes the way we work—by taking over some tasks but not entire jobs, freeing people to do other, more important and more challenging work. By offering detailed, real-world case studies of AI-augmented jobs in settings that range from finance to the factory floor, Davenport and Miller also show that AI in the workplace is not the stuff of futuristic speculation. It is happening now to many companies and workers. These cases include a digital system for life insurance underwriting that analyzes applications and third-party data in real time, allowing human underwriters to focus on more complex cases; an intelligent telemedicine platform with a chat-based interface; a machine-learning system that identifies impending train maintenance issues by analyzing diesel fuel samples; and Flippy, a robotic assistant for fast food preparation. For each one, Davenport and Miller describe in detail the work context for the system, interviewing job incumbents, managers, and technology vendors. Short “insight” chapters draw out common themes and consider the implications of human collaboration with smart systems.
Stephen D. King With the title clearly a take on the Lionel Shriver book, “We Need to Talk About Kevin”, one is immediately warned to approach Stephen King’s book with trepidation. Like Kevin, inflation in this book is bad, particularly if out of control. And like Kevin’s parents, those responsible for keeping inflation in check should read and understand the warning signs early enough and act decisively to tame it. Otherwise, it all ends in tragedy. Unless of course we learn the lessons – 14 of them according to King, and not just from the recent crises but going back 2,000 years. So we can live happily ever after and in one piece, unlike the characters in Lionel Shriver’s tale. Well, that at least is the conclusion one reaches when reading King’s book. There are some lighter moments, like the recurring theme of the love/hate relationship between Richard Burton and Elizabeth Taylor to characterise what is going on between monetary authorities and the fiscal side of the equation. Periods of harmony are great. Pulling in opposite directions proves disastrous. And King offers lots of historical examples as to when things went well on the inflation front and when they didn’t. Some interesting conclusions though. In King’s view the argument used by central banks against tightening, that it was all just ‘transitory’, was fuelled by central banks having presided over a low-inflation environment for nearly two decades, falling victim to their own anti-inflation propaganda. As a result, serious mistakes were made when circumstances suddenly changed. We saw the impact of the supply and demand imbalances when Covid restrictions ended, and then the extra pressures on energy and food prices with the war in Ukraine. But in King’s view the reappearance of inflation in the last couple of […]
Just like the world at large, the world of work shifts and changes over time. The future of work refers to an informed perspective on what businesses and other organisations need to know about how work could shift (given digitisation and other trends), plus how workforces and workplaces can prepare for those changes, big and small. When you think of the future of work, what do you picture? Offices that look more or less like today’s? Factories full of robots? Or something else entirely? While no one can predict the future with absolute certainty, the world of work is changing, just as the world itself is. Looking ahead at how work will shift, along with trends affecting the workforce and workplaces, can help you prepare for what is next: One in sixteen workers may have to switch occupations by 2030. That is more than 100 million workers across the eight economies studied—and the pandemic accelerated expected workforce transitions. Job growth will be more concentrated in high-skill jobs (for example, in healthcare or science, technology, engineering, and math [STEM] fields), while middle- and low-skill jobs (such as food service, production work, or office support roles) will decline. A few job categories could see more growth than others. The rise of e-commerce created demand for warehouse workers; investments in the green economy could increase the need for wind turbine technicians; aging populations in many advanced economies will increase demand for nurses, home health aides, and hearing-aid technicians; and teachers and training instructors will also continue to find work over the coming decade. But other types of jobs may be at risk: for example, as grocery stores increasingly install self-checkout counters, there may be a need for fewer clerks, and robotics used to process routine paperwork may lessen demand for some office workers. The future of work was shifting […]
By Chris Miller An epic account of the decades-long battle to control what has emerged as the world’s most critical resource—microchip technology—with the United States and China increasingly in conflict. You may be surprised to learn that microchips are the new oil—the scarce resource on which the modern world depends. Today, military, economic, and geopolitical power are built on a foundation of computer chips. Virtually everything—from missiles to microwaves, smartphones to the stock market—runs on chips. Until recently, America designed and built the fastest chips and maintained its lead as the #1 superpower. Now, America’s edge is slipping, undermined by competitors in Taiwan, Korea, Europe, and, above all, China. Today, as Chip War reveals, China, which spends more money each year importing chips than it spends importing oil, is pouring billions into a chip-building initiative to catch up to the US. At stake is America’s military superiority and economic prosperity. Economic historian Chris Miller explains how the semiconductor came to play a critical role in modern life, how the U.S. became dominant in chip design and manufacturing, and how America applied this technology to military systems. America’s victory in the Cold War and its global military dominance stem from its ability to harness computing power more effectively than any other power. But here, too, China is catching up, with its chip-building ambitions and military modernization going hand in hand. America has let key components of the chip-building process slip out of its grasp, contributing not only to a worldwide chip shortage but also a new Cold War with a superpower adversary that is desperate to bridge the gap. Illuminating, timely, and fascinating, Chip War shows that, to make sense of the current state of politics, economics, and technology, we must first understand the vital role played by chips.
-Adam Sharmer, senior partner, Dsifer As the pace of digital innovation accelerates, and organisations look to exploit the financial and strategic advantages of digital technology, automation and AI, data is increasingly becoming the lifeblood of organisational decision ecosystems. Whilst the collection and analysis of data to inform operational decision-making is not a new concept, advances in connectivity of technology, AI and machine learning, as well as ERP and MES solutions, have driven exponential gains in the availability, quality, and integration of data in manufacturing organisations. Spend a few moments on LinkedIn and it’s easy to become overwhelmed by the level of technology available and the visions of possible futures available through its adoption. However, whilst there is no doubt that Industry 4.0 technologies represent a potential step-change in manufacturing performance, the reality is that the CAPEX required for investments in, for example, lights-out warehousing or production automation, is beyond the reach of a lot of organisations. As a result, the journey from current state to the Industry 4.0 utopia can seem like an insurmountable one. The good news is that, whilst strategic adoption of technology should be on the business plan for all organisations if they are to remain competitive, there is a lot of value that can be unlocked within the current context or with low levels of investment. In our experience, most organisations have increased their maturity in the collection and storage of data in all or most organisational processes. However, the true value of this data is not yet being realised. A deliberate focus on process, system, analytics and cultural dimensions to become a data-driven organisation can unlock the latent value in current systems and data and build the foundation from which to optimise I4 technology. From our experience, data-driven organisations share the following characteristics: They collect […]
By Matt Hale, International Sales & Marketing Director, HRS Heat Exchangers In 2021 the global market for zero liquid discharge (ZLD) technology was estimated at US$1 billion, and it is forecast to grow at almost 12% over the next ten years. The rise is being driven in particular by an increase in adoption of the technology by the food and drink and textile industries, as a growing world population puts greater pressure on fresh water supplies.
What is ZLD?
Zero liquid discharge (ZLD) is a liquid waste stream treatment which involves transforming liquid waste streams into clean water (which can be reused) and a minimum volume of solid residues. One of the advantages of ZLD over other treatment techniques is its theoretical ability to separate unwanted materials from water, whether they are benign, hazardous or toxic. The resulting solid residue is often more stable, making it suitable for recycling or landfill. However, poor management or handling of the remaining residue can result in unintended environmental consequences. For example, storage ponds may leak or affect local wildlife, while there is the potential for toxic chemicals to leach into groundwater from landfill. It is therefore important that when implementing a ZLD system, full consideration is given to the entire process, including the ultimate fate of liquid and (semi-)solid waste streams. A well-designed ZLD system should minimise or even eliminate liquid waste streams, resulting in clean water for reuse or environmentally-friendly discharge, and a solid residue suitable for further processing (often to recover valuable components for use elsewhere) or for safe disposal.
The factors driving ZLD uptake
According to Transparency Market Research, ZLD is being implemented across a wide range of industries, including chemical and petrochemical production, food and drink production, textiles, energy and power, and pharmaceutical manufacturing. These industries are being driven to […]
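The split a ZLD system performs — liquid waste in, recovered water and concentrated residue out — can be sketched as a toy mass balance. The figures and function below are illustrative assumptions for this digest, not taken from the article or from any HRS specification.

```python
# Hypothetical single-stage mass balance for a ZLD process: a waste
# stream is separated into recovered clean water and a residual
# concentrate that carries the solids. All figures are illustrative.
def zld_balance(feed_m3, solids_fraction, water_recovery):
    """Split a liquid waste stream (m3) given its solids fraction
    and the stage's water recovery rate; returns (recovered, residue)."""
    water_in = feed_m3 * (1 - solids_fraction)   # water portion of the feed
    recovered = water_in * water_recovery        # clean water sent for reuse
    residue = feed_m3 - recovered                # concentrate plus solids
    return recovered, residue

# 100 m3 of effluent at 5% solids, with 95% of the water recovered.
recovered, residue = zld_balance(100.0, 0.05, 0.95)
print(round(recovered, 2), round(residue, 2))  # 90.25 9.75
```

Real ZLD plants chain several such stages (pre-treatment, evaporation, crystallisation), each with its own recovery rate, but the bookkeeping at every stage is this same conservation-of-mass accounting.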
Electronics manufacturers adopt generative AI and Omniverse to digitalise state-of-the-art factories
Electronics manufacturers worldwide are advancing their industrial digitalisation efforts using a new, comprehensive reference workflow that combines Nvidia technologies for generative AI, 3D collaboration, simulation and autonomous machines. Supported by an expansive partner network, the workflow helps manufacturers plan, build, operate and optimize their factories with an array of technologies. These include: Omniverse, which connects top computer-aided design apps, as well as APIs and cutting-edge frameworks for generative AI; the Isaac Sim application for simulating and testing robots; and the Metropolis vision AI framework, now enabled for automated optical inspection. At Computex, Nvidia founder and CEO, Jensen Huang, showcased a demo of an entirely digitalised smart factory — an industry first for electronics makers. “The world’s largest industries make physical things. Building them digitally first can save enormous costs,” said Huang. “We make it easy for electronics makers to build and operate virtual factories, digitalise their manufacturing and inspection workflows, and greatly improve quality and safety while reducing costly last-minute surprises and delays.” Foxconn Industrial Internet, a service arm of the world’s largest technology manufacturer, is working with Nvidia Metropolis ecosystem partners to automate significant portions of its circuit-board quality-assurance inspection points. Innodisk is deploying Nvidia Metropolis to automate optical inspection processes on its production lines, saving cost and improving production efficiency. Pegatron, a leading electronics manufacturer and service provider, is using the reference workflow to digitalise its circuit-board factories with simulation, robotics and automated production inspection. Quanta, a major manufacturer of laptops and other electronic hardware, is using AI robots from its subsidiary Techman Robot to inspect the quality of manufactured products. 
Techman is leveraging Isaac Sim to simulate, test and optimise its state-of-the-art collaborative robots while using NVIDIA AI and GPUs for inference on the robots themselves. Wistron, one of the world’s largest suppliers of information and communications products, is tapping Omniverse to build digital twins of its automated […]