Trends Identified

Plasmonic Materials - Light-controlled nanomaterials are revolutionizing sensor technology
Writing in Scientific American in 2007, Harry A. Atwater of the California Institute of Technology predicted that a technology he called “plasmonics” could eventually lead to an array of applications, from highly sensitive biological detectors to invisibility cloaks. A decade later various plasmonic technologies are already a commercial reality, and others are transitioning from the laboratory to the market. These technologies all rely on controlling the interaction between an electromagnetic field and the free electrons in a metal (typically gold or silver) that account for the metal’s conductivity and optical properties. Free electrons on a metal’s surface oscillate collectively when hit by light, forming what is known as a surface plasmon. When a piece of metal is large, the free electrons reflect the light that hits them, giving the material its shine. But when a metal measures just a few nanometers, its free electrons are confined in a very small space, limiting the frequency at which they can vibrate. The specific frequency of the oscillation depends on the size of the metal nanoparticle. In a phenomenon called resonance, the plasmon absorbs only the fraction of incoming light that oscillates at the same frequency as the plasmon itself does (reflecting the rest of the light). This surface plasmon resonance can be exploited to create nanoantennas, efficient solar cells and other useful devices.
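As a rough quantitative companion to the resonance picture above, the relations below give the standard quasi-static (textbook) description of a localized surface plasmon on a metal nanosphere much smaller than the wavelength of light. They are an illustrative sketch of the simplified Drude model, not taken from Atwater's article, and the symbols (electron density n, surrounding-medium permittivity ε_m, and so on) are assumptions of that model.

```latex
% Quasi-static (dipole) sketch of localized surface plasmon resonance
% for a metal nanosphere much smaller than the wavelength of light.
\[
  \omega_p = \sqrt{\frac{n e^2}{\varepsilon_0 m_e}}
  \qquad \text{(Drude plasma frequency of the free-electron gas)}
\]
\[
  \varepsilon(\omega_{\mathrm{res}}) = -2\,\varepsilon_m
  \quad\Longrightarrow\quad
  \omega_{\mathrm{res}} \approx \frac{\omega_p}{\sqrt{1 + 2\,\varepsilon_m}}
\]
% n: free-electron density; e: electron charge; m_e: effective electron mass;
% \varepsilon_m: permittivity of the surrounding medium. Size and shape
% corrections shift \omega_{\mathrm{res}} further, which is why nanoparticle
% geometry tunes the color of light a plasmonic particle absorbs.
```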
2018
Top 10 Emerging Technologies of 2018
Scientific American
Algorithms for Quantum Computers - Developers are perfecting programs meant to run on quantum computers
Quantum computers exploit quantum mechanics to perform calculations. Their basic unit of computation, the qubit, is analogous to the standard bit (zero or one), but it can exist in a quantum superposition of two computational states: it can be a zero and a one at the same time. That property, along with another uniquely quantum feature known as entanglement, can enable quantum computers to resolve certain classes of problems more efficiently than any conventional computer can. This technology, while exciting, is notoriously finicky. A process called decoherence, for example, can disrupt its function. Investigators have determined that stringently controlled quantum computers that have a few thousand qubits could be made to withstand decoherence through a technique known as quantum error correction. But the largest quantum computers that laboratories have demonstrated so far—the most notable examples are from IBM, Google, Rigetti Computing and IonQ—contain just tens of quantum bits. These versions, which John Preskill of the California Institute of Technology named noisy intermediate-scale quantum (NISQ) computers, cannot perform error correction yet. Nevertheless, a burst of research on algorithms written specifically for NISQs might enable these devices to perform certain calculations more efficiently than classical computers.
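To make the superposition and entanglement described above concrete, here is a minimal state-vector simulation in plain NumPy that prepares a two-qubit Bell state. The gates and state labels are standard textbook constructs; the code is an illustrative sketch, not drawn from the software of IBM, Google, Rigetti Computing or IonQ.

```python
# Minimal sketch: prepare the Bell state (|00> + |11>)/sqrt(2) with NumPy,
# illustrating superposition (Hadamard) and entanglement (CNOT).
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # controlled-NOT gate

state = np.zeros(4)
state[0] = 1.0                                 # start in |00>
state = np.kron(H, I) @ state                  # put qubit 0 in superposition
state = CNOT @ state                           # entangle the two qubits

probs = (np.abs(state) ** 2).round(3).tolist()
print(dict(zip(["00", "01", "10", "11"], probs)))
# {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}: measuring one qubit
# immediately fixes the outcome of the other.
```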
2018
Top 10 Emerging Technologies of 2018
Scientific American
Augmented Reality Everywhere - Coming soon: the world overlaid with data
Virtual reality (VR) immerses you in a fictional, isolated universe. Augmented reality (AR), in contrast, overlays computer-generated information on the real world in real time. As you look at or wear a device equipped with AR software and a camera—be it a smartphone, a tablet, a headset or smart glasses—the program analyzes the incoming video stream, downloads extensive information about the scene and superimposes relevant data, images or animations on it, often in 3-D.
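The paragraph above describes a three-step pipeline: analyze the camera feed, fetch contextual data and composite it onto the live image. The sketch below mirrors that structure in Python; every function and data field here is a hypothetical stand-in for demonstration, not the API of any real AR framework.

```python
# Hypothetical sketch of the AR pipeline described above: analyze each
# camera frame, look up information about what is seen, and superimpose
# it on the live image. All names are illustrative stand-ins.
from dataclasses import dataclass

@dataclass
class Annotation:
    label: str
    x: int   # screen position where the overlay is drawn
    y: int

def analyze_frame(frame):
    """Stand-in for pose tracking / object recognition on the video stream."""
    return [{"object": "landmark", "x": 120, "y": 80}]

def fetch_info(detection):
    """Stand-in for downloading contextual data about a recognized object."""
    return Annotation(label=f"{detection['object']}: opening hours 9-17",
                      x=detection["x"], y=detection["y"])

def render_overlay(frame, annotations):
    """Stand-in for compositing text or 3-D graphics onto the camera image."""
    return [(a.x, a.y, a.label) for a in annotations]

frame = "camera frame"                              # placeholder for pixel data
detections = analyze_frame(frame)                   # 1. understand the scene
annotations = [fetch_info(d) for d in detections]   # 2. pull relevant data
print(render_overlay(frame, annotations))           # 3. draw it over the view
```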
2018
Top 10 Emerging Technologies of 2018
Scientific American
Advanced Diagnostics for Personalized Medicine - A new generation of tools could help end one-size-fits-all therapeutics.
For most of the 20th century all women with breast cancer received similar treatment. Therapy has since become more individualized: breast cancers are now divided into subtypes and treated accordingly. Many women whose tumors produce estrogen receptors, for instance, may receive drugs that specifically target those receptors, along with standard postsurgery chemotherapy. This year researchers took a step closer to even more personalized treatment. They identified a significant fraction of patients whose tumors possess characteristics that indicate they can safely forgo chemo—and avoid its often serious side effects.
2018
Top 10 Emerging Technologies of 2018
Scientific American
Connectivity-driven business models
For years, companies shared similar business models and tried to outperform one another. Today, connectivity is enabling new business models. For example, more than half of the respondents expect to see pay-per-use models within their own industries, with data monetization the next most commonly expected model. Software is becoming much more important than hardware, and customer interactions are increasingly digitized, in many cases bypassing intermediaries altogether. Consequently, connectivity-driven fields such as shared mobility are expected to grow significantly in the coming years.
2018
Disruptive forces in the industrial sectors - Global executive survey
McKinsey
AI and autonomous systems
Learning from data and developing smart algorithms has become a competitive advantage. Executives from all sectors believe that AI and autonomous systems will affect the entire industry. Investment in AI is at unprecedented levels from both tech firms and traditional manufacturers. Driverless vehicles are AI’s poster child, but industrial companies are also investing in machine learning and robotics to develop specific technologies related to their core businesses.
2018
Disruptive forces in the industrial sectors - Global executive survey
McKinsey
Internet of Things (IoT)
This much-hyped term refers to the sensor-enabled devices that can communicate with one another via the Internet. The possible uses are still being unearthed, but the McKinsey Global Institute predicts that the annual economic impact of IoT applications could be as much as USD 11.1 trillion by 2025. MGI suggests that factories are likely to see the greatest potential impact from IoT use – as much as USD 3.7 trillion per year – with substantial productivity improvements, including 10 to 20 percent energy savings and a 10 to 25 percent improvement in labor efficiency.
2018
Disruptive forces in the industrial sectors - Global executive survey
McKinsey
Electrification
Replacing traditional energy sources with electric energy – most notably in vehicles – is being driven by regulatory and technological changes and by growing consumer demand. Electric vehicle sales are expected to grow by 25 to 30 percent a year through 2025 (see Exhibit 4). A senior executive at a European OEM believes electrification will affect at least half of the sector’s revenues, in both vehicles and infrastructure. Stricter emission regulations and lower battery costs are both contributing to the flurry of activity in this area.
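For a sense of what that growth rate implies cumulatively, the arithmetic below compounds the quoted 25 to 30 percent annual rate over an assumed seven-year span from a 2018 baseline to 2025; the baseline year and period length are assumptions for illustration, not figures from the survey.

```latex
% Assumed illustration: compounding 25--30 percent annual growth
% from a 2018 baseline over seven years to 2025.
\[
  \text{sales}_{2025} = \text{sales}_{2018}\,(1+g)^{7},
  \qquad
  (1.25)^{7} \approx 4.8,
  \quad
  (1.30)^{7} \approx 6.3
\]
% Sustained growth in that range would therefore mean roughly a five- to
% six-fold increase in annual EV sales between 2018 and 2025.
```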
2018
Disruptive forces in the industrial sectors - Global executive survey
McKinsey
Cybersecurity
The increase in connectivity between companies and consumers as well as within organizations, production facilities, transportation systems, defense systems and the like means that cybersecurity is critically important. Once-closed systems are now open, increasing vulnerability and placing ever higher-value assets and processes at risk, driving annual growth of 5 to 10 percent in the cybersecurity market through 2025 (see Exhibit 4). Our survey revealed widespread and growing concern on this topic, and many companies are starting to bring in the skills they need to tackle cybersecurity concerns. Some even see cybersecurity as a battleground for competitive advantage and differentiation.
2018
Disruptive forces in the industrial sectors - Global executive survey
McKinsey
Artificial intelligence and machine learning
Progress in AI has accelerated rapidly since around 2010, driven by the confluence of the growing availability of large data sets from commerce, social media, science and other sources; continued improvements in computational power; and the development of better machine learning algorithms and techniques (such as “deep learning”). Systems are now capable of learning how to accomplish a task without having been provided with explicit steps for doing so. Once designed and deployed, the neural network that underpins modern AI can formulate its own rules for interpreting new data and designing solutions, with minimal or even no human participation.
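As a small illustration of learning a task from examples rather than from explicit steps, the sketch below trains a tiny neural network with scikit-learn on a toy two-cluster data set. The data, network size and training settings are arbitrary choices for demonstration, not anything described in the survey.

```python
# Illustrative sketch: a small neural network learns to separate two clusters
# purely from labelled examples, with no classification rule programmed in.
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Toy data set: 200 points drawn from two well-separated clusters.
X, y = make_blobs(n_samples=200, centers=2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# The network sees only examples (X_train) and answers (y_train);
# it formulates its own decision rule during training.
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
model.fit(X_train, y_train)

# The learned rule then generalizes to data it has never seen
# (typically close to 1.0 accuracy on this easy toy problem).
print("held-out accuracy:", model.score(X_test, y_test))
```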
2018
World Economic And Social Survey 2018: Frontier Technologies For Sustainable Development
United Nations