Trends Identified

Navigating complexity to exceed expectations
Technological progress, shifting demographics, urban expansion, the rise of emerging markets and a changing planet are moving the world beyond globalisation to a multi-polar reality. As this happens, CEOs are learning that much of their success depends on sensing and addressing the rapidly changing values and expectations of their many stakeholders.
2016
19th Annual Global CEO Survey
PwC
Negative/disruptive impact of intelligent automation/digital labor
17% of the respondents view this as a negative trend.
2017
Adoption of intelligent automation does not equal success: 4Q 2017 KPMG Global Insights Pulse Survey Report
KPMG
Negative/disruptive impact of intelligent automation/digital labor
14% of the respondents view this as a negative trend.
2019
4Q 2018 KPMG Global Insights Pulse Survey Report
KPMG
Net neutrality
The Federal Communications Commission voted in December 2017 to repeal the net neutrality regulations put in place during the Obama administration. A Pew Research Center analysis of comments submitted online to the FCC found that, during the four-month period (April 27 to Aug. 30, 2017) in which the FCC accepted comments on net neutrality, an average of 172,246 posts was submitted per day (a brief arithmetic check of this figure follows this entry).
2017
Key trends shaping technology in 2017
Pew Research Center
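The per-day average quoted in this entry can be sanity-checked with simple date arithmetic. The sketch below uses only the comment window and daily average stated above to derive the implied total comment volume; it is an illustrative check, not part of the Pew analysis.

```python
# Rough arithmetic check of the figure quoted above: the comment window and the
# per-day average together imply the total volume of comments analysed.
from datetime import date

window_days = (date(2017, 8, 30) - date(2017, 4, 27)).days + 1   # 126 days, inclusive
avg_per_day = 172_246                                            # daily average quoted in the entry
implied_total = avg_per_day * window_days
print(window_days, implied_total)                                # 126 days -> roughly 21.7 million comments
```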
Network Growth
Technological advances, and a greater understanding of social, physical and virtual network behaviour, will converge to drive new types of network architecture and applications. These will be increasingly accessed by remote and distributed means. Technology applications such as those supporting social networking will continue to reconfigure and enable new social models and means of interacting. This will raise fundamental issues about privacy, security, legal frameworks and the mechanisms for influence. The rate of growth of hardware development is unlikely to slow before 2020, and software technology may fail to keep pace with these advances, contributing to an increasing proportion of major project failures. The growth of many networks is unlikely to be governed by top-down planning; such growth is likely to occur in a decentralised manner, often analogous to nature. In order to improve effectiveness and reduce vulnerability, an increased understanding of network topology and nodal behaviour, including that of people, will be required (a minimal topology-analysis sketch follows this entry). There will be changes in network technology driven by: the need to improve end-to-end security; the requirement to support large numbers of Internet-enabled devices; and the ability to convert directly from optical to wireless connectivity. The evolution of ICT devices will be driven by their increasingly wide range of applications and rising demand from society. Increased Internet penetration across the globe, particularly in heavily populated areas, will influence Internet content and ownership.
2010
Global strategic trends - out to 2040
UK, Ministry of Defence
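As a concrete illustration of the "network topology and nodal behaviour" point in the entry above, the sketch below analyses a hypothetical toy network with the networkx library; it is not from the MoD report, and the edge list and node names are invented for illustration.

```python
# Minimal sketch, assuming a hypothetical toy topology: identify influential nodes
# and single points of failure, the kind of analysis the entry above calls for.
import networkx as nx

# Nodes could be routers, hubs, or people in a social network (illustrative only).
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("B", "E"),
         ("E", "F"), ("F", "G"), ("E", "G"), ("D", "H")]
G = nx.Graph(edges)

# Betweenness centrality: nodes sitting on many shortest paths carry the most traffic/influence.
centrality = nx.betweenness_centrality(G)
print(sorted(centrality.items(), key=lambda kv: -kv[1])[:3])

# Articulation points: nodes whose removal disconnects the network, i.e. structural vulnerabilities.
print(sorted(nx.articulation_points(G)))
```

The same two measures apply whether the nodes are pieces of infrastructure or people, which is why the entry treats topology and nodal behaviour together.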
Neural stem cell therapy
A technology that collects adult stem cells from the patient's body (skin), cultivates them into neural stem cells, and implants them into the damaged brain. There is currently no treatment for degenerative brain diseases such as Alzheimer's and Parkinson's disease. We expect this could provide a fundamental treatment method, replacing dead brain cells with neural stem cells.
2013
KISTEP 10 Emerging Technologies 2013
South Korea, Korea Institute of S&T Evaluation and Planning (KISTEP)
Neuro-technology
Smart technologies will be crucial technologies until 2030 and beyond. They will help societies to monitor and detect changes in their environment, as well as respond or adapt to them. Smart technologies are already, and will increasingly become, a part of our daily lives. For example, smart electricity metering has addressed the problem of electricity losses due to theft. Emerging technologies in the area of artificial intelligence, in which computer systems carry out tasks normally done by humans, such as speech recognition and decision making, have received much attention. Another example is robotics, understood as machines or mechanical systems that automatically handle tasks. Mesoscience-powered virtual reality gives us the possibility to realize the logical and structural consistency between problems, physical models, numerical methods and hardware, which, together with the dramatic development of computing technology, is opening a new era for virtual reality. Digital automation characterizes the increasing ability of computers to take over cognitive, and not just physical, tasks, enabling recent innovations such as driverless cars, IBM Watson, e-discovery platforms for legal practice, and personalization algorithms for Web search, e-commerce and social networks. The potential consequences of automation and artificial intelligence for employment are emerging areas in need of examination; the expansion of computing and machine intelligence is likely to affect healthcare, education, privacy and cybersecurity, and energy and environmental management. Recent studies point to the possibility that a significant number of jobs, or job tasks, are amenable to automation, leading to a job polarization in which demand for middle-income jobs is reduced while non-routine cognitive jobs (e.g., financial analysis or computer programming) and non-routine manual jobs (e.g., hairdressing) are less affected. At this point, more study is warranted to understand the implications for employment and socio-economic development in specific national contexts. Autonomous vehicles, or self-driving cars, hold the promise of increasing traffic efficiency and productivity, reducing traffic congestion and pollution, and saving driving time. In 2016, the Dubai Autonomous Transportation Strategy was launched, which foresees 25 per cent of all trips in Dubai being driverless by 2030. The Autonomous Transportation Challenge was launched as a request for proposals to global R&D centres to apply this technology in Dubai. It will make Dubai the world's largest R&D lab for driverless transportation.
2016
Global sustainable development report 2016
United Nations
Neuromorphic Hardware – Using Nature’s Designs
Neuromorphic hardware is based on conventional processors that are conceptually inspired by neurobiological architectures. Neuromorphic systems are at the very early prototype stage between basic and applied research, but the topic is gaining traction within the industry. Companies such as IBM, Intel, Samsung, HP and Google are using the neuromorphic concept to build energy-efficient networks inspired by biology. Neuromorphic hardware promises new designs for different ways of computing and extreme performance while using little energy. It is suitable for use cases based on machine learning, in particular pattern recognition, event-driven vision processing and robotics. It is in competition with quantum computing, and in both cases the complexities involved are potential threats. For now, classical GPUs are more accessible and more easily programmable than neuromorphic silicon, and programming neuromorphic hardware requires new methodologies that still have to be developed (a minimal spiking-neuron sketch follows this entry). Based on our learning from the neuromorphic hardware research project within the "Human Brain Project" at Heidelberg University, we believe that the neuromorphic approach will lead to new concepts in combination with machine learning and powerful GPUs.
2018
Trend Report 2018 - Emerging Technology Trends
SAP
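To make the "new programming methodologies" point in the entry above concrete, here is a minimal sketch of a leaky integrate-and-fire neuron: the event-driven, spiking unit that neuromorphic chips implement natively and that conventional hardware must simulate step by step. This is not SAP's code, nor how the Heidelberg or IBM systems are actually programmed; the parameters and input are illustrative.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) neuron, simulated on
# conventional hardware; parameters and input are illustrative, not tied to any chip.
import numpy as np

rng = np.random.default_rng(0)

tau = 20.0                                   # membrane time constant (ms)
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0    # resting, firing-threshold and reset potentials (arbitrary units)
dt, steps = 1.0, 200                         # 1 ms time step, 200 ms of simulated time

v = v_rest
spike_times = []
for t in range(steps):
    input_current = rng.uniform(0.0, 0.12)   # hypothetical noisy input drive
    # Leaky integration: the membrane potential decays toward rest and accumulates input.
    v += dt * (-(v - v_rest) / tau + input_current)
    if v >= v_thresh:                        # threshold crossing -> emit a spike (an event)
        spike_times.append(t)
        v = v_reset                          # reset after firing
print(f"{len(spike_times)} spikes at times (ms): {spike_times}")
```

On neuromorphic silicon the integration and spiking happen in the hardware itself, with state held alongside the processing elements rather than shuttled to and from separate memory, which is the source of the energy efficiency described above.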
Neuromorphic technology
Even today’s best supercomputers cannot rival the sophistication of the human brain. Computers are linear, moving data back and forth between memory chips and a central processor over a high-speed backbone. The brain, on the other hand, is fully interconnected, with logic and memory intimately cross-linked at billions of times the density and diversity of that found in a modern computer. Neuromorphic chips aim to process information in a fundamentally different way from traditional hardware, mimicking the brain’s architecture to deliver a huge increase in a computer’s thinking and responding power. Miniaturization has delivered massive increases in conventional computing power over the years, but the bottleneck of shifting data constantly between stored memory and central processors uses large amounts of energy and creates unwanted heat, limiting further improvements. In contrast, neuromorphic chips can be more energy efficient and powerful, combining data-storage and data-processing components into the same interconnected modules. In this sense, the system copies the networked neurons that, in their billions, make up the human brain. Neuromorphic technology will be the next stage in powerful computing, enabling vastly more rapid processing of data and a better capacity for machine learning. IBM’s million-neuron TrueNorth chip, revealed in prototype in August 2014, has a power efficiency for certain tasks that is hundreds of times superior to a conventional CPU (Central Processing Unit), and more comparable for the first time to the human cortex. With vastly more compute power available for far less energy and volume, neuromorphic chips should allow more intelligent small-scale machines to drive the next stage in miniaturization and artificial intelligence. Potential applications include: drones better able to process and respond to visual cues, much more powerful and intelligent cameras and smartphones, and data-crunching on a scale that may help unlock the secrets of financial markets or climate forecasting. Computers will be able to anticipate and learn, rather than merely respond in pre-programmed ways.
2015
Top 10 emerging technologies of 2015
World Economic Forum (WEF)
Neuroprosthetics
2017
Top 50 Emerging Technologies 2017
Frost & Sullivan