Never let a serious crisis go to waste.

~ Rahm Emanuel

The crisis of 2020 is serving as a catalyst, ushering in the fourth industrial revolution, or Industry 4.0. Over the next 5-10 years, our society will be transformed into a new world. This is the beginning of a new era, one with many opportunities, but also with challenges and dangers.

Historical Context

The first profound shift in our way of living, the transition from foraging to farming, happened around 10,000 years ago and was made possible by the domestication of animals. The agrarian revolution combined the efforts of animals with those of humans for the purposes of production, transportation and communication. Little by little, food production improved, spurring population growth and enabling larger human settlements. This eventually led to urbanization and the rise of cities.

The agrarian revolution was followed by a series of industrial revolutions that began in the second half of the 18th century. These marked the transition from muscle power to mechanical power, evolving to where today, with the fourth industrial revolution, enhanced cognitive power is augmenting human production.

The first industrial revolution spanned from about 1760 to around 1840. Triggered by the construction of railroads and the invention of the steam engine, it ushered in mechanical production.

The second industrial revolution, which ran from the late 19th century into the early 20th, made mass production possible, fostered by the advent of electricity and the assembly line.

The third industrial revolution began in the 1960s. It is usually called the computer or digital revolution because it was catalyzed by the development of semiconductors, mainframe computing (1960s), personal computing (1970s and 80s) and the internet (1990s).

The fourth industrial revolution began at the turn of this century and builds on the digital revolution. It is characterized by a much more ubiquitous and mobile internet, by smaller and more powerful sensors that have become cheaper, and by artificial intelligence and machine learning. Digital technologies are becoming more sophisticated and integrated and are, as a result, transforming societies and the global economy.

The fourth industrial revolution, however, is not only about smart and connected machines and systems. Its scope is much wider. Occurring simultaneously are waves of further breakthroughs in areas ranging from gene sequencing to nanotechnology, from renewables to quantum computing, from advanced robotics to neurotechnology. It is the fusion of these technologies and their interaction across the physical, digital and biological domains that make the fourth industrial revolution fundamentally different from previous revolutions.

The question for all, without exception, is no longer "Am I going to be disrupted?" but "When is disruption coming, what form will it take, and how will it affect me?"

The reality of disruption and the inevitability of the impact it will have on us does not mean that we are powerless in the face of it. It is our responsibility to prepare and make choices.

Key Drivers

All new developments and technologies have one key feature in common: they leverage the pervasive power of digitization and information technology. Gene sequencing, for example, could not happen without progress in computing power and data analytics. Similarly, advanced robots would not exist without artificial intelligence, which itself largely depends on computing power. To identify the megatrends and convey the broad landscape of technological drivers of the fourth industrial revolution, we can look at three clusters: physical, digital and biological. All three are deeply interrelated, and the various technologies benefit from one another based on the discoveries and progress each makes.

Physical

There are four main physical manifestations of the technological megatrends, which are the easiest to see because of their tangible nature:

  • autonomous vehicles
  • 3D printing
  • advanced robotics
  • new materials

Autonomous vehicles

The driverless car dominates the news, but there are now many other autonomous vehicles, including trucks, drones, aircraft and boats. As technologies such as sensors and artificial intelligence progress, the capabilities of all these autonomous machines improve at a rapid pace. It is only a matter of a few years before low-cost, commercially available drones and submersibles are used in a wide range of applications.

As drones become capable of sensing and responding to their environment (altering their flight path to avoid collisions), they will be able to do tasks such as checking electric power lines or delivering medical supplies in war zones. In agriculture, the use of drones, combined with data analytics, will enable more precise and efficient use of fertilizer and water, for example.
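
To make the sense-and-respond loop concrete, here is a minimal sketch in Python of the kind of logic involved: a controller checks the latest obstacle reading and veers away when something comes too close. The Obstacle type, the 10 m safety threshold and the fixed 30-degree turn are all invented for illustration; a real flight stack is far more sophisticated.

    # Minimal sense-and-respond loop for an autonomous drone (illustrative only).
    # The Obstacle reading stands in for a real obstacle-detection stack.

    from dataclasses import dataclass

    @dataclass
    class Obstacle:
        distance_m: float   # distance to the obstacle along the current heading
        bearing_deg: float  # direction of the obstacle relative to the drone

    SAFE_DISTANCE_M = 10.0  # start avoiding anything closer than this

    def next_heading(current_heading_deg: float, obstacle: Obstacle | None) -> float:
        """Return an adjusted heading that steers around a detected obstacle."""
        if obstacle is None or obstacle.distance_m > SAFE_DISTANCE_M:
            return current_heading_deg  # path is clear, keep flying straight
        # Turn away from the obstacle: veer left if it lies to the right, and vice versa.
        avoidance_turn = -30.0 if obstacle.bearing_deg >= 0 else 30.0
        return (current_heading_deg + avoidance_turn) % 360

    # Example: an obstacle 6 m ahead and slightly to the right triggers a left turn.
    print(next_heading(90.0, Obstacle(distance_m=6.0, bearing_deg=5.0)))  # -> 60.0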

3D printing

Also called additive manufacturing, 3D printing consists of creating a physical object by printing layer upon layer from a digital 3D drawing or model. This is the opposite of subtractive manufacturing, which is how things have been made until now, with layers being removed from a piece of material until the desired shape is obtained. By contrast, 3D printing starts with loose material and then builds an object into a three-dimensional shape using a digital template.
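
The layer-upon-layer idea can be made concrete with a toy example. The sketch below is a deliberate simplification (real slicer software also computes a 2D toolpath for every layer); it just works out the z-heights at which material would be deposited for a part of a given height, using a made-up 0.2 mm layer thickness:

    # Toy illustration of the additive idea behind 3D printing: an object is
    # built up as a stack of thin layers at increasing z-heights.

    def layer_heights(object_height_mm: float, layer_mm: float = 0.2) -> list[float]:
        """Return the z-height of each deposited layer, bottom to top."""
        count = int(object_height_mm / layer_mm)
        return [round((i + 1) * layer_mm, 3) for i in range(count)]

    # A 1 mm tall part printed at 0.2 mm per layer needs five passes of material.
    print(layer_heights(1.0))  # -> [0.2, 0.4, 0.6, 0.8, 1.0]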

The technology is being used in a broad range of applications, from large (wind turbines) to small (medical implants). For the moment, it is primarily limited to applications in the automotive, aerospace and medical industries. Unlike mass-produced manufactured goods, 3D-printed products can be easily customized. As current size, cost and speed constraints are progressively overcome, 3D printing will become more pervasive, extending to integrated electronic components such as circuit boards and even to human cells and organs. Researchers are already working on 4D printing, a process that would create a new generation of self-altering products capable of responding to environmental changes such as heat and humidity. This technology could be used in clothing or footwear, as well as in health-related products such as implants designed to adapt to the human body.

Advanced robotics

Until recently, the use of robots was confined to tightly controlled tasks in specific industries such as automotive. Today, however, robots are increasingly used across all sectors and for a wide range of tasks, from precision agriculture to nursing. Rapid progress in robotics will soon make collaboration between humans and machines an everyday reality. Moreover, because of other technological advances, robots are becoming more adaptive and flexible, with their structural and functional design inspired by complex biological structures (an extension of a process called biomimicry, whereby nature's patterns and strategies are imitated).

Advances in sensors are enabling robots to understand and respond better to their environment and to engage in a broader variety of tasks, such as household chores. Unlike in the past, when they had to be programmed through an autonomous unit, robots can now access information remotely via the cloud and thus connect with a network of other robots. When the next generation of robots emerges, it will likely reflect an increasing emphasis on human-machine collaboration, a relationship that raises ethical and psychological questions.

New materials

With attributes that seemed unimaginable a few years ago, new materials are coming to market. On the whole, they are lighter, stronger, recyclable and adaptive. There are now applications for smart materials that are self-healing or self-cleaning, metals with memory that revert to their original shapes, ceramics and crystals that turn pressure into energy, and so on.

As with many innovations of the fourth industrial revolution, it is hard to know where developments in new materials will lead. Take advanced nanomaterials such as graphene, which is about 200 times stronger than steel, a million times thinner than a human hair, and an efficient conductor of heat and electricity. When graphene becomes price-competitive (gram for gram, it is one of the most expensive materials on earth, with a micrometer-sized flake costing more than $1,000), it could significantly disrupt the manufacturing and infrastructure industries. It could also profoundly affect countries that are heavily reliant on a particular commodity.

Other new materials could play a major role in mitigating the global risks we face. Innovations in thermoset plastics, for example, could make it possible to recycle materials that have long been considered nearly impossible to reprocess, yet are used in everything from mobile phones and circuit boards to aerospace parts. The recent discovery of new classes of recyclable thermosetting polymers, called polyhexahydrotriazines, is a major step toward the circular economy, which is regenerative by design and works by decoupling growth from resource needs.

Digital

One of the main bridges between the physical and digital applications enabled by the fourth industrial revolution is the internet of things (IoT), sometimes called the internet of all things. In its simplest form, it can be described as a relationship between things (products, services, places, etc.) and people that is made possible by connected technologies and various platforms.

Sensors and numerous other means of connecting things in the physical world to virtual networks are proliferating at an astounding pace. Smaller, cheaper and smarter sensors are being installed in homes, clothes and accessories, cities, transport and energy networks, as well as manufacturing processes. Today, there are billions of devices around the world, such as smartphones, tablets and computers, that are connected to the internet. Their number is expected to increase dramatically over the next few years, with estimates ranging from several billion to more than a trillion. This will radically alter the way in which we manage supply chains by enabling us to monitor and optimize assets and activities at a very granular level. In the process, it will have a transformative impact across all industries, from manufacturing to infrastructure to healthcare.

Consider remote monitoring, a widespread application of the IoT. Any package, pallet or container can now be equipped with a sensor, transmitter or radio frequency identification (RFID) tag that allows a company to track where it is as it moves through the supply chain, how it is performing, how it is being used, and so on. Similarly, customers can continuously track (practically in real time) the progress of the package or document they are expecting. For companies that are in the business of operating long and complex supply chains, this is transformative. In the near future, similar monitoring systems will also be applied to the movement and tracking of people.
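
As a rough illustration of how such tracking works, the sketch below models a package whose RFID tag is scanned at each point in the chain; every scan appends a timestamped event, and the latest event is the status a company or customer would see. The tag ID, scan locations and record format are hypothetical.

    # Illustrative model of RFID-based package tracking: each scan appends a
    # timestamped event, and the latest event gives the package's current status.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class ScanEvent:
        location: str
        status: str
        time: datetime

    @dataclass
    class Package:
        tag_id: str                        # ID carried by the RFID tag
        events: list[ScanEvent] = field(default_factory=list)

        def record_scan(self, location: str, status: str) -> None:
            self.events.append(ScanEvent(location, status, datetime.now(timezone.utc)))

        def current_status(self) -> str:
            if not self.events:
                return "no scans yet"
            last = self.events[-1]
            return f"{last.status} at {last.location} ({last.time:%Y-%m-%d %H:%M} UTC)"

    # A hypothetical journey through the chain, visible to company and customer alike.
    pkg = Package(tag_id="RFID-0042")
    pkg.record_scan("Rotterdam warehouse", "departed")
    pkg.record_scan("Frankfurt hub", "in transit")
    print(pkg.current_status())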

The digital revolution is creating radically new approaches that revolutionize the way in which individuals and institutions engage and collaborate. For example, the blockchain, often described as a distributed ledger, is a secure protocol where a network of computers collectively verifies a transaction before it can be recorded and approved. The technology that underpins the blockchain creates trust by enabling people who do not know each other (and thus have no underlying basis for trust) to collaborate without having to go through a neutral central authority, a custodian or central ledger. In essence, the blockchain is a shared, programmable, cryptographically secure and therefore trusted ledger which no single user controls and which can be inspected by everyone.
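
The core data structure can be sketched in a few lines. The toy ledger below hash-links each block to its predecessor, so tampering with any recorded transaction invalidates every later block; it deliberately omits the distributed verification and consensus that a real blockchain network performs on top of this structure.

    # Toy hash-linked ledger: each block's hash covers its data and the previous
    # block's hash, so any tampering breaks the chain.

    import hashlib
    import json

    def block_hash(index: int, prev_hash: str, data: str) -> str:
        payload = json.dumps({"index": index, "prev": prev_hash, "data": data})
        return hashlib.sha256(payload.encode()).hexdigest()

    def append_block(chain: list[dict], data: str) -> None:
        prev = chain[-1]["hash"] if chain else "0" * 64  # genesis has no predecessor
        index = len(chain)
        chain.append({"index": index, "prev": prev, "data": data,
                      "hash": block_hash(index, prev, data)})

    def is_valid(chain: list[dict]) -> bool:
        """Recompute every hash and check each block points at its predecessor."""
        for i, block in enumerate(chain):
            expected_prev = chain[i - 1]["hash"] if i else "0" * 64
            if block["prev"] != expected_prev:
                return False
            if block["hash"] != block_hash(block["index"], block["prev"], block["data"]):
                return False
        return True

    ledger: list[dict] = []
    append_block(ledger, "Alice pays Bob 5 coins")
    append_block(ledger, "Bob pays Carol 2 coins")
    print(is_valid(ledger))                        # -> True
    ledger[0]["data"] = "Alice pays Bob 500 coins"  # tamper with history
    print(is_valid(ledger))                        # -> False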

Bitcoin is so far the best-known blockchain application, but the technology will soon give rise to countless others. While blockchain technology currently records financial transactions made with digital currencies such as Bitcoin, it will in the future serve as a registrar for things as different as birth and death certificates, titles of ownership, marriage licenses, educational degrees, insurance claims, medical procedures and votes; essentially any kind of transaction that can be expressed in code. Some countries and institutions are already investigating the blockchain's potential. The government of Honduras, for example, is using the technology to handle land titles, while the Isle of Man is testing its use in company registration.

On a broader scale, technology-enabled platforms make possible what is now called the on-demand economy (referred to by some as the sharing economy). These platforms, which are easy to use on a smartphone, convene people, assets and data, creating entirely new ways of consuming goods and services. They lower barriers for businesses and individuals to create wealth, altering personal and professional environments.

The Uber model epitomizes the disruptive power of these technology platforms. These platform businesses are rapidly multiplying to offer new services ranging from laundry to shopping, from chores to parking, from home-stays to sharing long-distance rides. They have one thing in common: by matching supply and demand in a very accessible (low-cost) way, by providing consumers with diverse goods, and by allowing both parties to interact and give feedback, these platforms seed trust. This enables the effective use of underutilized assets, namely those belonging to people who had previously never thought of themselves as suppliers: a seat in their car, a spare bedroom in their home, a commercial link between a retailer and a manufacturer, or the time and skill to provide a service like delivery, home repair or administrative tasks.

The on-demand economy raises the fundamental question: What is worth owning, the platform or the underlying asset? Uber, the world's largest taxi company, owns no vehicles. Facebook, the world's most popular media owner, creates no content. Alibaba, the most valuable retailer, has no inventory. And Airbnb, the world's largest accommodation provider, owns no real estate.

Digital platforms have dramatically reduced the transaction and friction costs incurred when individuals or organizations share the use of an asset or provide a service. Each transaction can now be divided into very fine increments, with economic gains for all parties involved. In addition, when using digital platforms, the marginal cost of producing each additional product, good or service tends toward zero. This has dramatic implications for business and society.

Biological

Innovations in the biological realm, and genetics in particular, are nothing less than breathtaking. In recent years, considerable progress has been achieved in reducing the cost and increasing the ease of genetic sequencing and, lately, in activating or editing genes. It took more than 10 years, at a cost of $2.7 billion, to complete the Human Genome Project. Today, a genome can be sequenced in a few hours and for less than a thousand dollars. With advances in computing power, scientists no longer go by trial and error; rather, they test the way in which specific genetic variations generate particular traits and diseases.

Synthetic biology is the next step. It will provide us with the ability to customize organisms by writing DNA. Setting aside the profound ethical issues this raises, these advances will have a profound and immediate impact not only on medicine but also on agriculture and the production of biofuels.

Many of our intractable health challenges, from heart disease to cancer, have a genetic component. Because of this, the ability to determine our individual genetic make-up in an efficient and cost-effective manner (through sequencing machines used in routine diagnostics) will revolutionize personalized and effective healthcare. Informed by a tumor's genetic make-up, doctors will be able to make decisions about a patient's cancer treatment.

While our understanding of the links between genetic markers and disease is still poor, increasing amounts of data will make precision medicine possible, enabling the development of highly targeted therapies to improve treatment outcomes. Already, IBM's Watson supercomputer system can help recommend, in just a few minutes, personalized treatments for cancer patients by comparing the histories of disease and treatment, scans and genetic data against the (almost) complete universe of up-to-date medical knowledge.

The ability to edit biology can be applied to practically any cell type, enabling the creation of genetically modified plants or animals, as well as modifying the cells of adult organisms including humans. This differs from genetic engineering practiced in the 1980s in that it is much more precise, efficient and easier to use than previous methods. In fact, the science is progressing so fast that the limitations are now less technical than they are legal, regulatory and ethical. The list of potential applications is virtually endless, ranging from the ability to modify animals so that they can be raised on a diet that is more economical or better suited to local conditions, to creating food crops that are capable of withstanding extreme temperatures or drought.

As research into genetic engineering progresses (for example, the development of the CRISPR/Cas9 method in gene editing and therapy), the constraints of effective delivery and specificity will be overcome, leaving us with one immediate and most challenging question, particularly from an ethical viewpoint: How will genetic editing revolutionize medical research and medical treatment? In principle, both plants and animals could be engineered to produce pharmaceuticals and other forms of treatment. The day when cows are engineered to produce, in their milk, a blood-clotting element that hemophiliacs lack is not far off. Researchers have already started to engineer the genomes of pigs with the goal of growing organs suitable for human transplantation (a process called xenotransplantation, which could not be envisaged until now because of the risk of immune rejection by the human body and of disease transmission from animals to humans).

3D manufacturing will be combined with gene editing to produce living tissues for the purpose of tissue repair and regeneration, a process called bioprinting. This has already been used to generate skin, bone, heart and vascular tissue. Eventually, printed liver-cell layers will be used to create transplant organs.

We are developing new ways to embed and employ devices that monitor our activity levels and blood chemistry, and to understand how all of this links to well-being, mental health and productivity at home and at work. We are also learning far more about how the human brain functions, and we are seeing exciting developments in the field of neurotechnology. This is underscored by the fact that, over the past few years, two of the best-funded research programs in the world have been in brain sciences.

It is in the biological domain that the greatest challenges lie for the development of both social norms and appropriate regulation. We are confronted with new questions around what it means to be human, what data and information about our bodies and health can or should be shared with others, and what rights and responsibilities we have when it comes to changing the very genetic code of future generations.

To return to the issue of genetic editing: the fact that it is now far easier to manipulate the human genome with precision within viable embryos means that we are likely to see the advent of designer babies, children who possess particular traits or who are resistant to a specific disease. Discussions about the opportunities and challenges of these capabilities are under way. Notably, in December 2015, the US National Academy of Sciences and National Academy of Medicine, the Chinese Academy of Sciences and the UK's Royal Society convened an International Summit on Human Gene Editing. Despite such deliberations, we are not yet prepared to confront the realities and consequences of the latest genetic techniques, even though they are coming. The social, medical, ethical and psychological challenges that they pose are considerable and need to be resolved or, at the very least, properly addressed.

Impact

The scale and breadth of the unfolding technological revolution will usher in economic, social and cultural changes of such phenomenal proportions that they are almost impossible to envisage. The fourth industrial revolution will impact the economy, business, governments and countries, society, and individuals.

Most importantly, these changes must not be forced on individuals or made mandatory (like compulsory vaccination or chip implants), and they must not limit freedom and participation in the new society. Otherwise, this would not be an industrial revolution but tyranny, and it would be time to drop out of society and head for the jungle.

What we can do to prepare:

  • Educate ourselves about upcoming changes and technologies, and adjust our skills and strategies accordingly
  • Protect our privacy
  • Start growing food, move to a homestead or off-grid living, and try to be as self-sufficient as possible

Conclusion

It's going to be a wild ride, so make sure to prepare properly! It's a double-edged sword, and I wonder how far we are from the world of the Black Mirror TV show. These short reports can shed some light on the concerns: