
Gen2 RPA – A Potential Solution to Staffing Problems?

The COVID pandemic and the emerging recession are bringing significant staffing problems for sectors such as education, healthcare, retail, transportation, and construction, all of which face difficulties in finding skilled labor. While technological advancements cannot fully replace the roles and functions performed by humans, second-generation Robotic Process Automation (Gen2 RPA) currently represents the best viable option for addressing labor shortages and relieving overworked employees.

According to Aaron Bultman, product director at digital transformation platform Nintex, “RPA is a form of business process automation that allows anyone to define a set of instructions for a robot or ‘bot’ to perform. RPA bots are capable of mimicking most human-computer interactions to carry out a ton of error-free tasks at high volume and speed.” However, the first generation of RPA, though it has been on the rise for several years, has already shown its limitations. Proprietary toolsets, bots that crash with every software update or change in the environment, and difficulties in scaling up are just some of the problems first-generation RPA currently faces.

A better, quicker, and more cost-effective form of robotic process automation, Gen2 RPA provides developers with more control and flexibility than proprietary versions. This technology is based on open-source architecture and scalable cloud technologies. It can automate practically any repeatable business activity, supporting administrators in finance, human resources, information technology, and compliance in dealing with operational concerns.
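To illustrate the basic idea behind RPA, a bot can be thought of as a declarative list of scripted steps that a runner replays against an application. The Python sketch below is a simplified, hypothetical illustration; the `run_bot` helper and the step names are invented for this example and do not correspond to any specific RPA product.

```python
# Minimal illustration of the RPA idea: a "bot" is a list of
# (action, payload) steps that a runner executes in order,
# passing a shared context between them. All names are invented.

def run_bot(steps, context):
    """Execute each scripted step, threading a shared context through."""
    for action, payload in steps:
        context = action(context, payload)
    return context

# Example steps mimicking simple human-computer interactions
def open_record(ctx, record_id):
    ctx["record"] = {"id": record_id, "status": "open"}
    return ctx

def copy_field(ctx, field):
    ctx["clipboard"] = ctx["record"].get(field)
    return ctx

def close_record(ctx, _):
    ctx["record"]["status"] = "closed"
    return ctx

# A repetitive back-office task expressed as a bot definition
invoice_bot = [
    (open_record, "INV-0042"),
    (copy_field, "id"),
    (close_record, None),
]

result = run_bot(invoice_bot, {})
```

Real RPA platforms add recorders, schedulers, and UI-level selectors on top of this pattern, but the core remains a machine-executable description of steps a human would otherwise perform by hand.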

A study conducted by SmartSheet showed that workers could save at least six hours a week when automation is implemented, leading to more standardized and streamlined processes and an improved work-life balance for employees. At the same time, by automating repetitive tasks, companies leave more room for creativity, innovation, and problem-solving, allowing their employees to focus more on the customer experience.

If you enjoyed this article, we invite you to discover our upcoming Supertrends Platform, conceived to give you access to trends and innovation impacting your areas of interest directly from one place.


Critical Infrastructure Protection: As Cyberwar Scenarios Fail to Materialize, Are We Focusing on the Wrong Threats?

A number of recent high-profile attacks on critical infrastructures in several countries have raised concerns about protecting vital public assets. While many experts have long predicted a “Digital Pearl Harbor” involving high-tech cyberattacks, these operations have been carried out as low-tech attacks using angle grinders and explosives. Are we preparing for the wrong threats, mesmerized by the prospect of extremely sophisticated low-likelihood, high-impact incidents?

For decades, some security experts and media alike have been touting the specter of “cyberwar” – the disruption or destruction of critical assets essential to the functioning of society by operatives who exploit vulnerabilities in digital networks. Such scenarios often envisage shadowy actors, directly or indirectly controlled by hostile state agencies, burrowing into another nation’s vital systems over time, only to suddenly shut them down in an instant without warning when their governments order them to do so during a crisis or military confrontation, bringing the targeted country to its knees through the failure of its infrastructure backbone.

At this point, the narrative goes, the hackers will open dams and floodgates, delete crucial data, overload energy transmission networks, or hijack command and control systems for airports, hospitals, and similar facilities, while their governments avoid accountability due to the difficulty of attributing such operations to state actors. For several decades, these fears have been summarized in the notion of a “Digital Pearl Harbor” – a sudden, violent, devastating attack in the virtual battlespace, carried out at low cost, low risk, and with little effort thanks to the pervasiveness and vulnerability of computer networks.

Four fallacies about cyberattacks

Myriam Dunn Cavelty, Senior Lecturer for Security Studies and Deputy for Research and Teaching at the Center for Security Studies (CSS) at ETH Zurich in Switzerland, thinks that such scenarios are too simplistic. “The assumption that all cyberattacks are cheap and easy – and therefore the logical weapon of choice – is wrong,” she told Supertrends.

In a recent paper titled “Goodbye Cyberwar: Ukraine as Reality Check”, which she co-authored with her colleague Lennart Maschmeyer, she criticizes the prevalent “cyberwar” narrative and lists four fallacies about the “Digital Pearl Harbor” scenario.

“The assumption that all cyberattacks are cheap and easy – and therefore the logical weapon of choice – is wrong.” Myriam Dunn Cavelty

First of all, the idea that every vulnerability will be exploited is wrong. In reality, the authors argue, the existence of a vulnerability reveals nothing about why, how, and when it would make sense for an adversary to exploit it.

Second, contrary to popular belief, a network intrusion in and of itself is not proof of success. Rather, the success of any operation depends on the political or strategic effects that it achieves.

Third, while it may appear that digital cyberwar tools are cheap and easy to use, the fact is that realizing strategic goals with controlled, targeted attacks is “hard, complicated, and risky,” as Dunn Cavelty and Maschmeyer argue. The final fallacy is that cyberwar operations can be deployed at short notice, like conventional weapons. In reality, they take months, if not years, to prepare and deliver, and must be integrated into chains of command. The perpetrator cannot simply “pull the trigger”.

‘Largely unnoticed’ incidents

These fallacies may explain why threat scenarios of widespread cyberattacks have mostly failed to materialize since the beginning of Russia’s invasion of Ukraine. Russia had been regarded as having some of the most formidable capabilities in this field, but these now appear to be largely overblown. While the threat to critical infrastructures is real, and should certainly not be discounted, it has not materialized as predicted, either on the expected scale or in terms of sophistication.

Cyberattacks have occurred, for example in Estonia, where the banking sector was targeted in August 2022 in a campaign for which Russian actors were blamed, but which came nowhere close to crippling the economy or bringing society to its knees. Those cyberattacks were described by Luukas Ilves, undersecretary for digital transformation at Estonia’s Ministry of Economic Affairs and Communications, as “the most extensive cyber attacks […] since 2007”. However, as Ilves clarified: “With some brief and minor exceptions, websites remained fully available throughout the day. The attack has gone largely unnoticed in Estonia.”

Other events that were initially reported as serious attacks by Russian hacker groups against critical infrastructure proved, upon closer examination, to be exaggerated, such as a reported cyberattack against several US airports in October 2022. As later transpired, all that happened was that websites providing flight information had been subjected to denial-of-service attacks, creating a minor inconvenience for travelers while the airports’ operations remained unaffected.

‘Brute-force’ attacks

This is not to say, however, that there have been no attacks on critical infrastructure at all. For example, since Russia invaded its neighbor in February 2022, it has inflicted countless strikes, including with artillery and drones, on the country’s civilian energy infrastructure. Meanwhile, Germany – a supplier of arms and equipment to Ukraine – has experienced two of the most serious attacks on its national infrastructure in recent memory.

On 26 September 2022, the Nord Stream 2 underwater pipeline, which had been built at a cost of €9.5 billion to convey Siberian gas from Russia to Germany via the Baltic Sea but was never commissioned due to Moscow’s war of aggression, was hit by a series of explosions attributed to sabotage by an unknown actor. The resulting damage to the pipeline rendered it unusable, and most likely also irreparable.


Less than two weeks later, on 8 October, Germany’s national railway operator Deutsche Bahn experienced a large-scale failure of its GSM-R communications network, a key element of the European Train Control System (ETCS). This caused a complete breakdown of rail traffic across northern Germany and adjoining European networks. As soon became clear, this was not an accident: In quick succession, the unknown perpetrators had targeted a digital transmission hub as well as its backup facility, in two different locations over 500 kilometers apart. In both cases, they gained access to cable ducts covered by heavy concrete slabs and sliced through the cable bundles with an angle grinder.

Distracted by cyber-doom

It’s notable that these attacks against critical infrastructures involved low-tech “kinetic” weapons rather than highly sophisticated penetrations or manipulations of digital networks. Instead of relying on high-tech tools to exploit hidden digital vulnerabilities, the perpetrators relied on “brute-force attacks” in the literal sense. Have we therefore been preparing for the wrong threats? Instead of focusing on low-likelihood cyberwar scenarios with potentially devastating impacts, should we spend more effort on hardening our facilities and infrastructures against bad actors wielding bombs, hammers, backhoes, and angle grinders?

“Mounting evidence shows that cyber-attacks are relatively slow, ineffective, and unreliable.” Myriam Dunn Cavelty

Myriam Dunn Cavelty certainly thinks so. When it comes to attacks in the digital sphere, “targeted and destructive effects, delivered at a specific time, are very hard to pull off, and the likelihood that something goes wrong during the operation is very high. Using old-fashioned means like bombs or explosives is much more efficient,” she told Supertrends. “Mounting evidence shows that cyber-attacks are relatively slow, ineffective, and unreliable.”

The rise of digital technology was accompanied from the start by “cyber-doom” scenarios. Societies increasingly dependent on networked elements, such as supervisory control and data acquisition (SCADA) control systems, in critical infrastructures like energy generation and transmission facilities, potable and wastewater systems, hospitals, etc. seemed suddenly vulnerable to new threats and risks.

Prompted by fears of future cyberwarfare operations that could disable or destroy key elements of daily life, governments built up both defensive and offensive capabilities.

However, experts may have consistently underestimated the practical difficulties of carrying out cyberattacks on a massive scale, while at the same time overestimating the value of such attacks in terms of achieving strategic aims. As Dunn Cavelty and Maschmeyer note, cyber operations can be useful for intelligence-gathering and influence operations to amplify divisions in society. However, the threats to critical infrastructures are more likely to come from elsewhere.

“The hyperbolic term ‘cyberwar’ has distorted the debate for almost 30 years. It is high time to stop waiting for a cyberwar that will not come,” they conclude.



Blockchain, Crypto, and Decentralized Finance – Towards Mainstream Adoption

In an interview with Supertrends, Karina Rothoff Brix, country manager for Denmark at Firi, the Nordic region’s largest cryptocurrency exchange, shares some insights on the role of crypto in the decentralization of the financial system.

As a digital ledger of transactions that is duplicated and distributed across the entire network of computer systems, blockchain technology allows users to record information in a way that makes it difficult or impossible to tamper with. The technology has applications in multiple fields such as healthcare (e.g., to preserve and exchange patient data, track medical goods and confirm their authenticity, etc.), food and agriculture (e.g., increase supply chain transparency), automotive (e.g., increase efficiency by tracking the ownership, location, and movement of parts and goods), and government (audit trail for regulatory compliance, contract and identity management), to name just a few.
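The tamper-resistance described above comes from hash-chaining: each block records a cryptographic hash of its predecessor, so changing any past entry breaks every subsequent link. The following toy Python sketch illustrates the principle only; it omits consensus, signatures, and networking, which real blockchains require.

```python
# Toy hash-chained ledger illustrating why tampering is detectable:
# each block stores the SHA-256 hash of its predecessor, so altering
# any old record invalidates every later link in the chain.
import hashlib
import json

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def is_valid(chain):
    # Recompute each predecessor's hash and compare with the stored link
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

ledger = []
add_block(ledger, {"from": "alice", "to": "bob", "amount": 5})
add_block(ledger, {"from": "bob", "to": "carol", "amount": 2})
assert is_valid(ledger)

ledger[0]["data"]["amount"] = 500   # tamper with an old record
assert not is_valid(ledger)         # the broken link is detected
```

In a real network, many independent validators each hold a copy of the chain, so a forger would have to rewrite not one ledger but most of them simultaneously.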

The rise of Decentralized Finance (DeFi)

However, the field where blockchain is expected to have the greatest impact, and to reach the mainstream fastest, is the financial industry. The technology is currently driving the shift from centralized finance (where banks or other third parties store, manage, and transfer money between transaction partners) to decentralized finance, a system that eliminates intermediaries and enables peer-to-peer financial networks.


“Because the information on a blockchain is duplicated to a lot of computers all around the world, the system is more secure than saving all data in a few spots. Changing the data is impossible, as many so-called ‘validators’ are involved without knowing each other. Therefore, decentralized financial systems do not require the involvement of centralized parties such as banks, but at the same time are capable of enabling payments, money lending, and interest-bearing accounts,” says Brix, who has over 15 years of experience in driving innovation and implementing digitalization projects.


The many faces of crypto

Crypto is usually defined as a class of digital assets that are generated using cryptographic techniques and can be traded, exchanged, or used as a store of value. As Brix explains, the term “cryptocurrency” can be misleading: “Crypto is much more than a currency.

Crypto is much more than a currency. Simplified, crypto can represent coins, tokens, or NFTs. Different crypto can be coded to encompass different possibilities and values. This is why regulators are struggling with how to define crypto.

Simplified, crypto can represent coins, tokens, or NFTs [non-fungible tokens]. They are all digital assets, but different in the sense that a coin is a crypto that functions on its native blockchain, whereas a token is crypto on a non-native blockchain. A unique token is called an NFT. Different cryptos can be coded to encompass different possibilities and values. This is why regulators are struggling with how to define crypto.”
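Brix’s coin/token/NFT distinction can be captured in a small data model. The Python sketch below is purely illustrative; the class and field names are invented for this example and the real taxonomy is, as she notes, far messier.

```python
# Illustrative data model for the coin / token / NFT distinction:
# a coin is the native asset of its own blockchain, a token lives
# on a chain that is not its own, and a unique (non-fungible)
# token is an NFT. All names here are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class CryptoAsset:
    name: str
    blockchain: str     # chain the asset is recorded on
    is_native: bool     # is it that chain's own base asset?
    fungible: bool = True

    def kind(self) -> str:
        if not self.fungible:
            return "NFT"
        return "coin" if self.is_native else "token"

# ETH is Ethereum's native asset; USDT is issued on top of Ethereum
assert CryptoAsset("ETH", "Ethereum", is_native=True).kind() == "coin"
assert CryptoAsset("USDT", "Ethereum", is_native=False).kind() == "token"
assert CryptoAsset("Art #1", "Ethereum", is_native=False,
                   fungible=False).kind() == "NFT"
```

The difficulty regulators face is that a single asset can combine these roles, which a one-label classification like this cannot express.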

These developments have led to the rise of tokenomics, an industry branch that covers a token’s creation principles, content, and distribution. This information is usually stored in a so-called “white paper.” From this perspective, not all crypto assets are created equal, and they do not all have the same value propositions and tokenomics. Brix also notes that crypto can be a security token, a utility token, a commodity token, a governance token, or a combination of these. Moreover, the range of possibilities and projects keeps expanding, since innovations in this space are advancing very rapidly.

Currently, there are several types of blockchain networks that differ in terms of technical infrastructure, speed, verification, authorization procedures, energy consumption, access, risk, etc. Various projects can be built on top of these networks, with a broad range of applications. Brix points out that it is too early to say which ones will still be functional in the next five years or which blockchain is a better fit for a certain industry.

Regulating the unregulated crypto market

Even though crypto assets have been around for more than a decade, it is only now that regulatory efforts are beginning to pick up and become a priority on the political agenda. Most countries are already exploring ways of adapting existing regulations to crypto, and of enabling legal transactions with digital assets.

In the EU, the Council Presidency and the European Parliament reached a provisional agreement (the MiCA regulation) regarding the regulation of crypto assets in June 2022.

The provisions are expected to come into effect in 2024 and to harmonize crypto services across all member states. In the US, President Joe Biden has signed a US$1.2 trillion infrastructure bill that, among other things, advances the regulation of the cryptocurrency industry through a series of amendments to be enforced starting in 2024. In the Asia-Pacific region, efforts to regulate crypto range from a complete ban in China to more progressive approaches in Indonesia and the Philippines.

The major change that is likely to occur when regulation is in place is that institutional investors will start to participate in the process. A lot of big companies and pension funds are reluctant as the regulation is not clear, but the interest from the big players is definitely there and will materialize in the coming years.

Emerging markets lead the way in crypto adoption

Despite an adoption rate that varies significantly from country to country, current trends indicate a steady increase in the global adoption of cryptocurrencies compared to 2019 levels. Emerging markets lead the Global Crypto Adoption Index, with countries such as Vietnam, the Philippines, Ukraine, India, Pakistan, Brazil, Thailand, Nigeria, Turkey, and Morocco in the top positions.

Brix explains why low- and middle-income countries dominate the index: “In the emerging markets, crypto is a way to have access to banking, make trades, earn an income, and travel with finances. A huge part of the population in emerging markets doesn’t have access to a bank account, and as a result, is excluded from the economy. Decentralized Finance changes this.” Moreover, in emerging economies, crypto trading is also a viable hedge against inflation, allowing people to preserve their savings in times of fiat currency volatility.

What does the future of crypto hold?

In line with the common expert opinion in the industry, Brix also expects that crypto will see mainstream adoption in the next decade, despite the current volatility and global financial turbulence.

Notable in this sense are initiatives such as those in El Salvador and the Central African Republic, where the governments have declared bitcoin to be legal tender and corporate and private entities alike hold crypto wallets.

“A completely new and different economy is currently developing based on cryptos, blockchain, and Web3. The way people are trading is changing dramatically, and the young generation is the biggest driver.”

Crypto is also starting to play an increasingly important role in financial transactions – with jurisdictions such as Colorado (US) and Zug (Switzerland) allowing citizens to pay their taxes in Bitcoin, while cities such as Miami and New York have already developed their own tokens.

Karina Rothoff Brix is Country Manager in Denmark for Firi, the largest crypto exchange in the Nordics. The former head of the Copenhagen School of Entrepreneurship at Copenhagen Business School (CBS) and head of the Center for Lifelong Learning at the Danish Technical University (DTU), Karina is also a recognized expert at the Danish Innovation Fund. For the last 15 years, Karina has worked with tech-scaleups and emerging businesses and has spoken on numerous occasions about the impact of emerging technologies on the future of businesses. In 2022, she published “Kryptovaluta og Blockchain” (Cryptocurrencies and Blockchain), an inspired and clear introduction to the future of cryptocurrencies, blockchain, and Web3.

If you want to stay up to date with the main milestones and developments related to sustainable banking and future-related innovations, consider signing up for the Supertrends platform!


Enabling Communication Between Quantum and Traditional Devices

Quantum computers and superconducting microprocessors usually operate optimally at temperatures near absolute zero (-459.67° Fahrenheit). However, they still have to exchange information and interact with traditional devices running at room temperature. Researchers from the University of California, Santa Barbara, have developed a device that mediates communication between these two types of devices, hoping to enable seamless integration between cutting-edge and traditional technologies in the future.

Quantum computers, devices that operate based on the laws of quantum physics, are expected to revolutionize all industries due to their capacity to solve problems that are out of reach for traditional computational devices. Even though some prototypes have been proven to work at room temperature, most quantum computers need to be cooled to temperatures close to absolute zero to minimize errors and preserve fragile quantum states. At the same time, quantum devices haven’t yet reached their full potential; thus, current operational solutions rely on a hybrid approach, in which computations are performed partly on a quantum device and partly on a traditional one.

Currently, the connection between cryogenic systems and room-temperature electronics is established via standard metal wires. However, these wires transfer heat into the circuits and allow only small amounts of data to be transmitted. The solution proposed by Paolo Pintus, the lead researcher within UC Santa Barbara’s Optoelectronics Research Group, is to convert data from electric current to light pulses using magnetic fields. Then, the light can be transferred via fiber-optic cables, which have a larger data capacity and minimize the heat that leaks into the cryogenic system.

The prototype has already been tested in projects developed together with the Tokyo Institute of Technology and the Quantum Computing and Engineering group of BBN Raytheon. According to Pintus, “[t]he promising results demonstrated in this work could pave the way for a new class of energy-efficient cryogenic devices, leading the research toward high-performing (unexplored) magneto-optic materials that can operate at low temperatures.”

Did you enjoy this article? Discover our Supertrends Platform and learn how it can inform your strategic decision-making, and unlock the full potential of your organization by accelerating your innovation efforts.


Tailor-Made Software Lets BioNTech Provide Customized mRNA Treatments

In just three years, German pharma company BioNTech, which delivered the first vaccine against COVID-19, has become a major player in the biotech field, not only delivering hundreds of millions of vaccine doses, but also continuing its original focus on mRNA vaccines and other drugs to treat cancer. The company’s rapid growth has brought not only fame and revenue, but also new collaborations, production facilities, and supply chains that need to be carefully managed on a country-by-country basis, especially because its core products are personalized mRNA vaccines tailored to individual patients.

In order to handle the massive increase in logistics while continuing to coordinate its research and distribution, BioNTech has entered into a partnership with the Fraunhofer Institute for Industrial Mathematics ITWM, a renowned German applied research institute that develops and implements technologies spanning theoretical and applied mathematics in collaboration with industry partners. Working together, Fraunhofer ITWM and BioNTech developed two software platforms whose algorithms support the planning, management, and automation of the pharma company’s global research and distribution work, including cancer treatment and vaccination applications, and can adapt to new requirements.

Fraunhofer ITWM researcher Heiner Ackermann, who works at the High Performance Center Simulation and Software Based Innovation in Kaiserslautern, Germany, said the software tools are able to handle the complexity of BioNTech’s work flows in a way that off-the-shelf solutions cannot match. As such, they provide a “solution that uses flexible mathematical methods and models – a tailor-made solution that is not only specifically designed for the processes at BioNTech, but can also optimize them,” Ackermann explained.

The challenges of managing the complex operations of a global biotech corporation include applications for regulatory approval, setting up and carrying out pharmaceutical trials, or dealing with industry-specific problems such as fluctuating process times and higher reject rates caused by defective tissue samples, to name just a few. But for BioNTech, these challenges are compounded by the fact that its individualized cancer drugs are designed differently for each patient in small batches. They are then distributed to many countries, each of which has its own regulatory requirements governing everything from initial approval to rules about shelf life. 

Now, the company has received its own customized solutions to deal with this high level of complexity. With the two new software platforms, BioNTech will be able to establish durable and stable production processes for vaccine production and individualized mRNA-based cancer treatments. “Thanks to our successful collaboration with the Fraunhofer ITWM team, BioNTech has acquired tailor-made solutions that provide vital support in high-stakes situations. We will continue to use the software-optimized processes in other areas in the future,” said Oliver Henning, Senior Vice President Operations at BioNTech.

Did you enjoy this article?
To learn more about the technology, challenges, and opportunities in the health and life sciences sector, read our industry overview.



Digital Money: The Rise of a New Era

As an alternative to conventional forms of payment, digital money promises to speed up financial transactions and improve transparency. However, this emerging trend also brings a series of challenges, such as volatility and the risk of security breaches. So what should we think of this type of digital asset? What are the benefits, opportunities, and implications, and how far has its adoption advanced? In the following, we list five of the most important things to know regarding the rise of digital money.

Digital currencies are alternative methods of payment that exist only in electronic form and have no physical representation. Managed and transferred via online platforms, these currencies can also be exchanged via trading services and, in some cases, converted into their traditional physical cash equivalent via ATMs.

First conceptualized in 1983 by US cryptographer David Chaum, digital value transfer and digital cash only started to gain traction after Bitcoin was developed and became popular. Over a span of more than 20 years, the trend evolved slowly, with several failed attempts to introduce digital money into the market (e.g., DigiCash, the company that developed the first digital currency, eCash, went bankrupt in 1998).

Fast-forward another 20 years, and the trend is in full swing, with adopters and supporters from all levels of society, from governments to businesses and individual users. However, there are still many issues that need to be understood and addressed before the trend enters the mainstream. Here are some of the main points to be aware of in relation to digital money.

Digital money comes in different shapes and forms

Even though terms such as “digital money”, “digital currencies”, and “cryptocurrencies” are sometimes used interchangeably, there are three main types of digital money, each with different characteristics in terms of centralization, encryption, and transparency:  

  • Cryptocurrencies are a form of digital money that is created using cryptography. Being supported by blockchain and unable to operate outside this platform, its purchasing power depends on its user community. On the one hand, this allows for complete decentralization, with no authorities or governments being involved from a regulatory perspective. On the other hand, this leads to high volatility and a limited legal framework. The strongest cryptocurrencies in terms of market capitalization and trading volume are Bitcoin and Ethereum.
  • Central Bank Digital Currency (CBDC) is a digital version of a country’s currency, backed by the central bank. It can be developed on different platforms (i.e., digital ledgers) and is therefore not limited to the blockchain (distributed ledger technology). The central bank retains control over the currency, issuing it and governing transactions. According to the Atlantic Council, 105 countries, representing over 95 percent of global GDP, are exploring a CBDC.
  • Stablecoins are digital money whose market value is tied to an external reference (i.e., another currency, the price of a commodity, etc.) and are developed in an attempt to counterbalance the high volatility of cryptocurrencies. There is a general trend towards tighter regulation of this type of currency. Tether (USDT) and U.S. Dollar Coin (USDC) are the strongest stablecoins in terms of market capitalization.

The market is moving fast

Currently, there are over 20,000 cryptocurrencies in circulation, the equivalent of US$1.07 trillion in market value. Based on a research study conducted by the Atlantic Council, ten countries have already fully launched a digital currency, 14 countries are currently conducting pilot projects to assess the feasibility and implications of digital money, 26 are in the process of developing the necessary systems and procedures, and 47 are still in the research phase.

Figure 1: Example of projects that explore the use of digital currencies (CBDCs) for retail, wholesale, and cross-border payments

On a global level, the Digital Currency Global Initiative, a joint project between ITU and Stanford University, aims to further explore the technical implications and challenges of digital money, develop metrics and means of standardization, and promote best practices and learnings from pilot implementations.

The number of technology and service providers for digital currencies is increasing

A significant number of fintech companies that develop platforms or solutions for digital currencies have been founded in the last decade. Whether built on distributed ledger technology or blockchain-agnostic, these companies aim to provide central banks, financial institutions, governments, and participants in financial transactions with the tools needed to implement and use digital currencies.

Figure 2: Number of companies founded between 1983 and 2021 that develop technologies or services for digital money (source: CrunchBase)

Companies such as FTX US (formerly LedgerX) and BiKi.com provide platforms for digital currency trading services. Bitt, TradeBlock, and Fluency develop solutions that help wholesale or retail customers to develop, customize, and integrate digital financial instruments, while companies such as Coinfirm focus on facilitating regulatory and compliance processes. The market of solution providers is constantly expanding, covering all aspects from technological enablers to exchange platforms and regulatory systems.

A society governed by digital money comes with additional challenges

Despite the promise of digital money to increase access to payments, efficiency, and resilience, the road to mainstream digital currencies is not a smooth one. First, a consistent and unitary legal framework still needs to be developed, and all parties involved need to reach a mutual agreement. The amount of human and financial resources required to scale up the pilot projects is also considerable (e.g., a lack of resources was the reason why Uruguay has not launched its second digital currency project).

As with most emergent technologies, digital money is not yet entirely accepted by consumers due to concerns related to privacy and safety. To a certain degree, these fears are justified, with digital assets being more vulnerable to cyberattacks than traditional money. On top of that, due to uncertainty over the technology and questions as to whether the technology is fully scalable and can meet the demands of a large population, the path to full adoption could still be a long one.  

Moreover, some critics see CBDCs as a “slippery slope” leading to economic influence and social control by the state. They believe the centralized version of digital money would give governments the opportunity to control spending, permit only certain purchases, and issue money with expiry dates, forcing private citizens to spend their money instead of saving it.

Digital money will significantly disrupt the financial sector

There are multiple scenarios for the future of financial services in relation to the digitalization and decentralization trends. Many experts are inclined to predict a future coexistence of physical and digital money, while others point towards complementarity – with each type of currency having its unique role in financial transactions. However, there are also supporters of the takeover scenario, where physical money is entirely replaced by its digital counterparts.

Either way, the financial sector has to develop contingency plans and adjust to changes related to business models (e.g., changes in the product and services demand and offering), regulation and compliance, infrastructure adjustments, identity management, cybersecurity, audits, and financial reporting, as well as employee capacity building.

Although the popularity of digital currencies is constantly growing, they are not yet widely perceived and accepted as a reliable and secure alternative to physical money. However, the trend is gaining ground and is expected to accelerate in the near to medium term.

Stay up to date with the main milestones and the developments related to digital money by signing up to the Supertrends platform!

6G – The New Frontier in Information and Communications Technology

While the efforts to implement and increase the adoption rate of 5G technology are still at the starting line, researchers and players in the communication industry are already setting their sights on the next frontier: 6G. Even though this technology is still in the research phase, it has the potential to propel the IT sector to a new level, allowing for very high processing speeds, low latency, and increased bandwidth. Moreover, adopting this technology will support the IT sector in aligning with societal goals (e.g., high-speed services available anywhere, anytime), satisfy increasing market expectations, and improve the efficiency of the sector’s operations.

Over a span of 30 years, mobile communications technology has gone from barely maintaining the connection during phone calls to secured conversations, networks that support rapid and clear transmission of data, SMS, roaming conference calls, multimedia services, VoIP apps, video streaming, and video conferencing services.

The evolution of communication technology

5G – one step away from market adoption

The current fifth-generation technology is faster than any other previous generation, promising reduced battery consumption, improved coverage, and seamless device-to-device communication. Commercial 5G networks are operational; however, the adoption rates are low, and the roll-out is proceeding at different speeds across the world. According to Ericsson’s Mobility Report, the number of 5G subscriptions is expected to reach one billion in 2022, while the GSMA, an industry organization representing the interests of mobile network operators, expects 5G to account for 21 percent of all mobile connections in 2025.   

Even though the next logical step after 5G is 6G, there is also a significant intermediate evolutionary phase, 5G Advanced, which is now taking shape and starting to be implemented across industries.

6G – the next frontier

While 5G is still under roll-out, efforts to shape the 6G infrastructure have already begun. This technology will take IT applications for smart cities, smart farming, industrial automation, and robotics to the next level. Moreover, because 6G will build upon the previous generation in terms of technological infrastructure and use cases, it will allow the IT sector to scale up in an optimized and cost-effective way.

The vast majority of IT applications will benefit from this surge in terms of efficiency, capabilities, and speed: Digital twins, virtual models of physical objects or processes, will be operational at a larger scale, new types of man-machine interfaces will be enabled, and the potential of artificial intelligence and machine learning will be unleashed. In terms of localization and geospatial imagery, 6G will substantially improve positioning accuracy and potentially extend coverage into space while at the same time meeting extreme connectivity requirements, including sub-millisecond latency.

IT players that are already developing AI-based applications will benefit from a synergy between 6G and AI that can unlock new opportunities in an unprecedented way: On the one hand, AI will help improve 6G performance; on the other hand, 6G will provide the infrastructure to propel the use of AI across all sectors and in multiple use cases.

Innovators and early adopters

Important players in the ICT field have already kicked off 6G-related research projects or experiments. In 2020, China launched the first 6G experimental satellite to test data transmission using the terahertz spectrum. The country also holds the most 6G patents, closely followed by the US.

In April 2021, AT&T, one of the world’s largest telecommunications companies, applied for experimental licenses with the US Federal Communications Commission. These would allow the company to showcase the functionality and capabilities of 5G Advanced and 6G wireless systems.

Samsung, a multinational electronics and information technology company, plans to host its first 6G forum, where scientists and industry experts will explore next-generation communication technologies. Nokia Bell Labs, the research arm of the Finnish multinational Nokia, has also begun research work in the 6G area, planning to make this technology commercially available by 2030.

Public or private organizations in Japan, Germany, South Korea, and Russia are also establishing research facilities or starting pilot projects. In the US, the Next G Alliance, launched in 2020, aims to advance North American leadership in 6G. On the same note, the EU initiated the 6G flagship project, aiming to advance the research in this area.

An eye on the future

The vision of the future of 6G tends to converge across business players, industry organizations, and research centers. The most optimistic prediction places 6G commercial roll-out as early as 2028, while more conservative approaches predict that it will become available in 2035. Most of the roadmaps of the important players in the telecommunication field envision 6G commercial availability in 2030.    

However, important challenges still need to be addressed before full deployment of the new generation of communication technology becomes possible: new technological advancements to differentiate it from the previous generation, global, unified standards to prevent market confusion and fragmentation, and diverse, secure supply chains.

Find out more about other technologies that will have a tremendous impact on the IT sector in the future.


Smart City Copenhagen: Europe’s Largest Living Lab

From urban design to smart city technologies, Copenhagen is a pioneer and a test bed for solutions that can improve sustainability and quality of life in a rapidly urbanizing world. Supertrends spoke to Anders Sloth, Head of Smart City Technologies and Solutions at Copenhagen Capacity, about why Denmark’s capital is attracting providers of smart city solutions, which technology fields are expected to have the biggest impact on city life, and why smart solutions need not necessarily be high-tech solutions.

Interview: Chris Findlay

Supertrends:    Anders Sloth, thank you for taking the time to talk to us. To begin, could you briefly describe your work at Copenhagen Capacity, and what it entails?

Anders Sloth:  In a nutshell, Copenhagen Capacity works to promote investment and attract talent to the greater Copenhagen area. As a Senior Investment Manager and Head of Smart City Technologies and Solutions, I work with smart city technologies and with companies interested in investing in Copenhagen. These can be venture capital firms or foreign investors with an interest in Danish smart city companies, but also foreign smart city companies that want to establish a business here because of market or project opportunities that align with their product. We help them network with local stakeholders and assist with taxation and administrative matters related to setting up a company here.

ST:                  Can you give some examples of the kinds of technologies we are talking about?

AS:                  These are digital applications that generate and communicate data to create the interconnectivity that cities need to become more intelligent. In the field of mobility, for example, we have sensors that monitor traffic in cities and control the traffic lights to optimize the flow of pedestrians, cars, and cyclists. Other solutions monitor and assess air and water quality and pollution to inform data-based solutions for creating healthy and livable cities. Ultimately, it’s about generating data to make the city a happier and more livable place.

ST:                  So Copenhagen itself is a smart city and a testbed for smart city technologies?

AS:                  Yes, there’s a lot of momentum here from political and municipal actors to create strong smart city incentives. Copenhagen has set itself the very ambitious target of becoming the first carbon neutral capital by 2025, which also includes technical components. Our city is Europe’s largest living lab for testing outdoor lighting and smart city technologies.


Copenhagen has set itself the very ambitious target of becoming the first carbon neutral capital by 2025.

Anders Sloth


ST:                  But there’s also a lot of original research being undertaken in and around Copenhagen.

AS:                  That’s right. For example, we collaborate very closely with the Technical University of Denmark (DTU) and promote their R&D projects, and we also see many companies from overseas coming here to share knowledge and learn about innovations in areas like water management. After all, DTU was recently ranked the world’s number two university when it comes to research, just after MIT in Boston. And this is also an attractive factor for companies looking to relocate – knowing that they will be able to recruit highly qualified talent from our universities.

ST:                  In which technical fields do you expect to see particularly interesting developments related to the smart city concept?

AS:                  One of the main areas is mobility and transportation. Although Copenhagen has great infrastructure for cyclists and we have a lot of people cycling, road traffic is still the vertical that is emitting the most CO2. So we need to optimize road traffic in cities, whether through car-free zones, banning diesel cars in the city, creating incentives for EVs, or offering more accessible parking to prevent cars emitting CO2 for 20 minutes as they drive around looking for parking spaces.

ST:                  It’s interesting that smart city solutions don’t always have to be high-tech, IoT-connected digital gadgets.

AS:                  You’re so right. Ultimately, it’s about communicating the concept of smart city in such a way that more people understand and become familiar with it. It’s not about technical buzzwords – we need to present simple solutions in order to secure political support and help the public understand what it’s about, because smart cities are the future.

ST:                  One of the challenges in raising awareness is that many smart infrastructures are not visible to the public, but in the background – underground or behind walls. Which smart city solutions will be the most visible ones in the coming years?

AS:                  Probably the most noticeable ones will be the technologies that the public can interact with through smartphones. New types of software and applications for the general public will include an evaluation element, where the users can give feedback about how this technology has affected their overall happiness in terms of living in the city. Dubai has made an effort to measure the smartness of a city not in terms of technology, but in terms of happiness. In other words, they do not implement new technologies in the city unless they can actually monitor whether they create a sense of happiness for the people.

ST:                  Cities are where a lot of the world’s emissions are caused. How can smart city solutions help achieve that zero emissions goal that Copenhagen has also set itself?

AS:                  Research shows that 70 percent of the UN Sustainable Development Goals can be achieved through smart city technology, which is quite high. To bring about a green transition in the context of increasing urbanization, smart city concepts and digital applications are inevitable. If we don’t find a way of dealing with urban growth efficiently from a CO2 emissions and air quality perspective, it won’t be much fun living in the cities of the future. Whether in Copenhagen or even more crowded cities like Beijing, we need data on emissions, air quality, public transport capacity etc. to ensure we are comfortable and have a really good feeling about the city we live in.


Seventy percent of the UN Sustainable Development Goals can be achieved through smart city technology.

Anders Sloth


ST:                  Speaking of urbanization and people moving to the cities, what makes a city smart in terms of how it uses its available space and land?

AS:                  This relates to the density and spatial footprint of buildings, but also of the cities overall. It’s a complex topic, especially given the shift to remote work in the current pandemic, which may determine livability conditions in the future. Here in Copenhagen, one of our planning priorities is to connect the suburbs even more closely to the city and make them attractive. Looking at buildings in particular, we need to build vertically as opposed to horizontally, but also be really smart about using the available area when you design buildings. Suppose you could somehow create apartments that are 55 m2 in size, but feel like 70 m2 due to advanced design and architecture – then you can fit more people into buildings without causing too much conflict.

ST:                  It seems that problems are inevitable because it’s almost taken as a fact of life that cities are just going to get bigger and bigger and bigger as more and more people move into the metropolises worldwide. Yet we’re not incentivizing people to move out into the country, due to the imperative to build more efficiently, more densely, and not have infrastructures spread out over hundreds of square kilometers if you can have the same amount of people living in 20 km2, for example. It almost seems like there are two different goals there that are in conflict with each other.

AS:                  That’s true, and it will be challenging to create the right incentives for dealing with these complex issues. If you live outside the city and have the choice between commuting to work by car or by public transport, it may be hard to relate to the idea that your individual choices help to cut down the national CO2 emissions by a minuscule fraction. The reward for taking public transport, or for recycling waste, may just not be large enough. On the other hand, punishing such behavior may conflict with our democratic and Nordic values. So the question is, which incentives could get people to do the right thing? Recently, the CEO of one of the biggest banks in Denmark suggested that homeowners who invest in energy efficiency or rooftop solar panels could get cheaper bank loans or mortgages. That’s an incentive individuals can relate to in a totally different way, because if optimizing home energy use means €500 savings on a monthly bank loan, that can have a very strong positive impact on the average family.

ST:                  Is Copenhagen aiming for any measurable future milestones in terms of smart city technology – for example, a point when all new office buildings will have energy management systems installed, or when all commutes will be done by autonomous vehicles?

AS:                  You have to remember that smart solutions, whether they be autonomous vehicles or intelligent buildings, also require appropriate infrastructure. If we were to say that all road traffic around Copenhagen should be autonomous vehicles by 2030, we would also have to build an infrastructure that fits with the way autonomous vehicles maneuver around. In terms of investment in infrastructure, it’s challenging to set those kinds of milestones. My take is that if none of these things happen at a scale where it makes a big difference before 2030, I don’t think we’ll have a planet for much longer. So, in that context, 2030 is probably the most important milestone on a global scale, regardless of any technology that supports climate action. Our contribution at Copenhagen Capacity is to connect with the people who can find those solutions and optimize them, and to attract smart city companies who have an interest in investing or establishing themselves in the greater Copenhagen area. We can help them find the right local partners to develop project opportunities and support them in building tools and solutions that can make this city, and all cities, sustainable in the long run.


Anders Sloth Nielsen has a decade of experience in the cleantech sector and specializes in urban development and how smart city technologies can play a crucial role in developing sustainable cities of the future. Currently, Anders is Senior Investment Manager, Head of Smart City at Copenhagen Capacity (CopCap), where his work focuses on advising foreign companies and investors on establishment and investment opportunities in Greater Copenhagen. Prior to CopCap, Anders was employed at CLEAN, the Danish national environmental cluster organization, leading various cleantech projects globally. Anders also has a background in entrepreneurship as the founder of a SaaS networking start-up, which, among several milestones, successfully secured funding from the Danish Innovation Fund.

The Avatar’s New Clothes: Digital Design for Greener Fashion

When we think about fashion, we tend to think of material objects such as colorful textiles, cutting tables, scissors, sewing needles, or pins. However, the future of this industry will be shaped just as much by high-resolution screens, blockchain, 3D modeling, virtual and augmented reality, holograms, and artificial intelligence. While it is a natural concept for those now in their 20s, digital fashion might require a certain mindset shift for those who grew up in an analog world. However, in a future where our main priority will be to preserve our natural resources, virtual fashion is certain to become a sustainable alternative.

Put on your headset, position yourself comfortably in front of your screen and immerse yourself in the metaverse. Depending on your current mood and inspiration, choose your avatar and outfit and decide whether you want to hang out with other people, attend an event, visit a particular place, or shop around. All of this can be done from the comfort of your chair and using nothing but digital artifacts…

This is the metaverse, a universe of virtual spaces that will form the immersive version of the internet. Currently under construction by multiple players in the gaming, blockchain, and crypto worlds, web 3.0, as it’s also known, will become the main distribution channel for virtual fashion, according to Michaela Larosse, Head of Content & Strategy at The Fabricant.

“From a consumer point of view, we are all living digital lives, expressing ourselves in multi-media and virtual realities. When self-expression and the exploration of identity through the medium of fashion exists beyond the physical realm, it allows us to transcend the boundaries and limitations of reality; in the digital environment, we can express our multiple selves and explore new possibilities of who we might be,” says Larosse.

Founded in 2018 as the world’s first digital-only fashion house, Amsterdam-based The Fabricant operates at the intersection of fashion and technology. Currently, the company focuses on two core businesses. One is their own digital couture house and label, which creates garments that will never exist in the physical world. The other relies on developing digital versions of physical garments for established brands that want to expand their presence in the 3D digital space.


“From The Fabricant’s perspective, we are building our business for a future where physical fashion becomes utilitarian in response to our planetary circumstances and the need to preserve natural resources, but the digital environment is where we will let our fashion imaginations run wild.”

Michaela Larosse


The digital transformation of fashion

Even though one might expect this industry to be the last one to go digital, there are numerous technologies that have tremendous potential in connection with fashion. Augmented reality, an immersive 3D technology that combines digital information with a person’s physical environment, already allows for digital clothing try-ons. In addition, digital supermodels are taking over the fashion industry, with data suggesting that in some cases, virtual personas outperform human influencers.

Artificial intelligence and machine learning make sense of large amounts of data, from customer size and measurements to consumer behavior and sustainability-related metrics. Besides allowing for traceability and transparency regarding the provenance of materials and the supply chain, blockchain also facilitates a new trend in fashion: NFTs (non-fungible tokens).

These are digital assets that provide proof of ownership in the virtual space. As the first company to ever release a fashion NFT, The Fabricant sold their digital Iridescence dress for US$9,500 at an auction in May 2019. RTFKT, a market leader in digital artifacts, sold US$3.1 mn worth of sneaker NFTs in seven minutes in February 2021.

Consumer readiness

For those anchored in the physical world, who experienced floppy disks and portable cassette players firsthand, the metaverse and digital images of themselves might seem pointless, if not absurd. However, Larosse points out that for young millennials and Gen Z, digital natives whose virtual lives have equal validity to their physical lives, digital fashion is an obvious concept that makes complete sense and doesn’t require any explanation.

Being able to dress avatars (virtual representations of oneself) any way they want gives this generation a new avenue of self-expression that allows them to explore and experiment with their identity.


“Of course, we’re all used to wearing fabric against the skin, but fashion is an emotional experience. And you don’t lose that emotional experience when you create something digitally.”

Michaela Larosse

On the other hand, more and more brands are entering the digital world to keep up brand awareness, reach customers via a multichannel approach, and offer them immersive experiences. Companies such as Levi’s and Ralph Lauren have already released lines of virtual clothing; Gucci recently sold one of its digital bags at a higher price than its physical counterpart; and Balenciaga dropped an entire digital fashion collection in the global multiplayer game Fortnite.

How does digital fashion contribute to a more sustainable fashion future?

The credo of The Fabricant underscores the company’s commitment to sustainability: “We waste nothing but data and exploit nothing but our imagination.” Larosse points out that, unlike the traditional fashion paradigm, which can be wasteful and exploitative in so many ways, digital fashion does not cause overflowing landfills, excessive water usage and pollution, child labor, or animal cruelty.


“The historical process of sampling is so wasteful. It hasn’t changed in 200 years. So many of the principles according to which the current fashion industry operates are created for societies that don’t exist anymore. But now, we have the tech to disrupt that narrative and allow for more sustainable initiatives. We have begun to be much more cognizant of our planetary circumstances, and that requires quite radical interventions in the way that we do things.”

Michaela Larosse


Another important aspect is the way virtual fashion and 3D modeling can help make the sampling process more sustainable. By creating 3D samples, companies can bypass the extremely resource-intensive process of moving bolts of fabric across companies’ branches or ateliers, sometimes flying them across the world, transforming them into samples of different sizes and colors, and then shipping them back to the original point. If additional alterations are requested, the entire process is repeated, together with further carbon emissions and environmental burdens.

Big brands such as Peak Performance, Nike, and Under Armour have already started to use 3D sampling by creating virtual versions of their garments. These digital assets can then be rebuilt as needed.

Today, a vast array of technological applications are poised to shape the future of the fashion industry. Keeping track of how these advances affect the environment and society in general, from creating a digital narrative to more informed and data-based decision-making, will be one of the main challenges going forward. Sustainability is becoming a key criterion in assessing the viability of a technological trend or business model.

All pictures in this article featuring digital fashion items were kindly provided by The Fabricant.

Want to learn more about the future of fashion?

In the latest issue of our innovating sustainability report series in partnership with Valuer.ai, we shone a spotlight on the innovative companies working to make the fashion industry more sustainable. In order to support the industry’s transition to a cleaner future, we decided to release this report at no cost. Download our Sustainability Trends in Fashion report for free.

Artificial Intelligence Becomes an Essential Part of Future Healthcare

In many developed countries, healthcare has become an unsustainable business in recent years, partly due to the aging population and prevalence of chronic diseases. In the US, healthcare spending grew 4.6 percent in 2019 alone, amounting to US$3.8 trillion, or 18 percent of the nation’s Gross Domestic Product (GDP). However, this increase in spending did not translate into better patient care, nor has it reduced resource scarcity and imbalance in the healthcare industry. How can we transform the healthcare sector to make it more efficient and sustainable? Artificial intelligence (AI) will play a critical role in the future of healthcare.  

The Seven Revolutions in Healthcare That Will Impact Your Life – Part 7

(Missed the previous one? You can read Part 6 here – Robots Become Reliable Assistants for Healthcare Professionals and Patients)

From fee-for-service to fee-for-value

In the current healthcare system, doctors and other healthcare providers are paid for the number of patients seen or procedures performed. This fee-for-service model means that healthcare providers are rewarded for volume rather than for value.

What is the biggest barrier to practicing medicine today?


“‘Production pressure’ – the requirement to see more patients in less time because of the misconception that the value of a physician is determined by the number of patients he/she sees” – Lucian Leape, physician and professor at Harvard School of Public Health and a leader of the patient safety movement.1


Value-based healthcare rewards healthcare providers based on the quality of care they provide. The implementation of AI can greatly improve the value healthcare providers deliver by making sense of medical data, automating routine procedures, and improving efficiency and effectiveness.

AI will reshape radiology

Machine learning is great at recognizing patterns, which has translated into fast progress in analyzing medical images.

While AI played no role at all in radiology as recently as 2015, 30 percent of radiologists had adopted the technology by 2020, according to a study by the American College of Radiology.2

One of the diseases where machine learning has proven its value in early diagnosis and prognosis is dementia, the leading cause of disability and dependency among the elderly. Diagnosing dementia in an early phase is a challenge due to the lack of symptoms and visible changes in brain images at the preclinical stage. By studying patterns in thousands of brain scans from dementia patients, scientists in the UK have developed an algorithm that can detect early signs of dementia in brain scans that are not visible even to radiologists. The algorithm has reduced the diagnostic procedure from several scans and tests across several weeks to a single scan.
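To give a sense of how this kind of pattern recognition works in principle, here is a minimal, self-contained Python sketch: a nearest-centroid classifier trained on synthetic stand-ins for scan-derived feature vectors. Everything here (the data, the 16 feature dimensions, the method) is an assumption made for illustration; it is not the UK team's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for feature vectors extracted from brain scans:
# "healthy" scans cluster around one average pattern, "early dementia"
# scans around a slightly shifted one (hypothetical data, not real scans).
healthy = rng.normal(loc=0.0, scale=1.0, size=(200, 16))
early_dementia = rng.normal(loc=0.8, scale=1.0, size=(200, 16))

X = np.vstack([healthy, early_dementia])
y = np.array([0] * 200 + [1] * 200)

# "Training" a nearest-centroid classifier: learn the average pattern
# of each class from the labeled examples.
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(scans):
    # Assign each scan to the class whose learned pattern it most resembles.
    dists = np.linalg.norm(scans[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

accuracy = (predict(X) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In practice, a model like this would be trained on features extracted from real, labeled scans and validated on held-out patients before any clinical claim could be made.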

Data experts believe that AI, rather than replacing radiologists altogether, will automate redundant tasks, prevent mistakes, and optimize how radiologists practice, which will ultimately lead to better patient outcomes.3

AI and big data are advancing precision medicine

All humans are different from one another due to genetic, environmental, and lifestyle factors. But in conventional medicine, patients with the same disease are typically given the same standard treatment. This is often the reason for unreliable outcomes. In precision medicine, medical decisions are tailored to a subgroup of patients. Multidimensional datasets are used to train algorithms that identify subgroups of patients with similar biological and other characteristics. Precision medicine offers clinicians the opportunity to prepare tailor-made preventative or therapeutic interventions. It has already led to promising results in AI-powered prognosis for cancer and cardiovascular disease.4
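As a toy illustration of how an algorithm can surface such subgroups from multidimensional data, the following Python sketch runs a plain k-means clustering over synthetic "patient" feature vectors. The features, the two latent groups, and the cluster count are all assumptions made for the example; this is not the pipeline of any study mentioned in this article.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical multidimensional patient dataset: each row mixes normalized
# genetic, environmental, and lifestyle features for one patient. Two latent
# subgroups are baked in so the clustering has something to find.
group_a = rng.normal(0.0, 0.5, size=(150, 8))
group_b = rng.normal(2.0, 0.5, size=(150, 8))
patients = np.vstack([group_a, group_b])

def kmeans(data, k, iters=50, seed=0):
    """Plain k-means: alternately assign points to the nearest centroid
    and move each centroid to the mean of its assigned points."""
    r = np.random.default_rng(seed)
    centroids = data[r.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Keep a centroid in place if its cluster happens to be empty.
        centroids = np.stack([
            data[labels == c].mean(axis=0) if (labels == c).any() else centroids[c]
            for c in range(k)
        ])
    return labels, centroids

labels, centroids = kmeans(patients, k=2)
# Patients sharing a cluster have similar profiles and could, in a real
# precision-medicine setting, be evaluated for tailored interventions.
```

Real pipelines work on far messier data and typically combine many such models, but the core idea of grouping patients by similarity is the same.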

In 2018, Chiba University set up the first AI center in a medical school in Japan. The center uses AI to analyze genomic and clinical data such as gene expression, metabolism, gut microbiome, environmental exposures, and lifestyle factors. By doing so, researchers are able to predict the efficacy of treatments and future outcomes. In one of their studies, researchers used machine learning to identify, in advance, a group of early-stage ovarian cancer patients who would respond poorly to a particular treatment. This finding gave the clinicians the opportunity to design a new treatment approach for this subgroup of patients.


“Predictive algorithms can help identify disease groups that haven’t been recognized by clinicians, as well as guide the selection of personalized treatment options for these patients.” – Eiryo Kawakami, professor of artificial intelligence medicine at Chiba University.5


Translating AI from labs to real-life patient care

To laypeople, the notion of an AI healthcare solution may sound like a complex one. But Professor Sebastien Ourselin, Head of the School of Biomedical Engineering & Imaging Sciences at King’s College London, says the new approach will make his work easier. Ourselin and a team of data scientists and clinicians at the AI Centre for Value-Based Healthcare are working together with the National Health Service (NHS) and other partners to deploy AI solutions in real hospitals in the UK.

“AI is just a way to make sense of all of those data by training models which will hopefully be able to save us time in making the diagnosis, prognosis and be able as well to increase the effectiveness of the treatment.” – Professor Sebastien Ourselin, Head of School of Biomedical Engineering & Imaging Sciences at King’s College London[6]

Translating AI solutions from the lab into real patient care is no easy task. Medical data in the real world is often unstructured, heterogeneous, and filled with inconsistent terms, abbreviations, and misspellings. The strategy of the AI Centre for Value-Based Healthcare is to first convert static snapshots of clinical data into real-time, actionable analytics, then build an infrastructure to link the data together and train the algorithms. Eventually, with the help of AI, actionable models are formulated that can be deployed in real hospitals. The NHS plans to deploy the first prototype in ten hospitals later this year. The full deployment of AI solutions will be carried out over the next two years.

Do you think we will see AI-powered healthcare solutions in real hospitals soon? Search “Future of healthcare” on the Supertrends Pro app and tell us your thoughts on AI healthcare solutions.

This blog concludes our series on the future of healthcare. Thank you for following our ideas on what will happen in the future of healthcare and what it may mean to your life. Take advantage of the Supertrends Pro app’s free trial to make your voice heard on the “future of healthcare”. The final timeline will be revealed in November. Scroll down to the bottom of this page and sign up for our newsletter so you won’t miss it!


[1] Pittman D., 10 Questions: Lucian Leape, MD. MedPage Today. 12 January 2014. https://www.medpagetoday.com/PublicHealthPolicy/GeneralProfessionalIssues/43757

[2] Siwicki B., Mass General Brigham and the future of AI in radiology. Healthcare IT News. 10 May 2021. https://www.healthcareitnews.com/news/mass-general-brigham-and-future-ai-radiology

[3] Siwicki B., Mass General Brigham and the future of AI in radiology. Healthcare IT News. 2021.

[4] Uddin, M., Wang, Y. and Woodbury-Smith, M. Artificial intelligence for precision medicine in neurodevelopmental disorders. npj Digit. Med. 2, 112 (2019). https://doi.org/10.1038/s41746-019-0191-0

[5] Nature research custom media, Chiba University. Advancing precision medicine using AI and big data. Nature portfolio. Accessed on 27 August 2021. https://www.nature.com/articles/d42473-020-00349-9

[6] Ourselin S., The future of healthcare with artificial intelligence. 26 June 2021, Future of Healthcare (Webinar). NewScientistLive. https://app.konf.co/event/sJ0Vy6Kn/session/4499

Fashion, apparel, textile

Algorithms for Fashion. How Machine Learning Makes the Apparel Industry More Sustainable

In the fashion industry, return rates – sometimes referred to as “the plague of eCommerce” – are skyrocketing. Besides placing a burden on the companies’ budgets and logistics, this behavior is also detrimental to the environment. However, innovative software programs based on Artificial Intelligence and Machine Learning have the potential to reduce return rates, leading to a more sustainable industry.

The fashion industry is frequently accused of unsustainable practices, with its high carbon emissions, consumerism, high water consumption, toxic effluents and materials, child labor, and breach of human rights being just a few examples. While predictions regarding the industry’s capacity to meet decarbonization goals look quite somber, the industry has launched numerous initiatives to promote a sustainable agenda. From new business models and innovative materials to new production processes, green distribution, and packaging alternatives, these efforts are showing great promise, even if they are still in their infancy.

Isabelle Ohnemus is the founder and CEO of EyeFitU, a Swiss startup at the confluence between fashion and technology. Highly concerned by the increasing return rates in retail and e-commerce, she and her team developed a Size-as-a-Service AI software that aims to address retailers’ trillion-dollar problem while simultaneously helping them reduce their carbon footprint.

Returns – a major threat to the sustainability of the fashion industry

E-commerce is expected to increase its market penetration from 46.6 percent in 2020 to 60.32 percent in 2024. While between eight and ten percent of products purchased in stores are returned, the numbers rise to a whopping 40 percent for online purchases.

Ohnemus points out that consumers behave very differently in the e-commerce environment than in brick-and-mortar stores. Clients buy clothes to wear on special occasions and then return them, purchase items only to try them on, without any intention of keeping them, or order multiple sizes and colors only to pick the one that fits best.


“Most of the time, companies are not even able to tell you what happens with the items you just returned. In most cases, they end up in a landfill even if they are in perfect condition because the processing costs are too high for the company. And that’s after they have been shipped over multiple countries or even world regions.” – Isabelle Ohnemus, founder and CEO, EyeFitU


The consequences are noticeable on all fronts: Retailers have to deal with increased shipping and fulfillment costs, disturbed supply chains, and logistical complications, and require more storage space as well as human resources to handle the returns.


Example of a retailer’s returns operations, adapted after Cullinane, Browne, Wang, and Karlsson (2019)

Shipping parcels between postal offices, hubs, consolidation centers, sorting departments, and repackaging locations takes a toll on the environment. In addition, large quantities of items that aren’t included again in the retail process add up to the millions of tonnes of textiles already in landfills.

Machine Learning algorithms could help the apparel industry

Since one of the main reasons for returning clothes is that they are the wrong size or fit, Ohnemus partnered up with AI and machine-learning specialists to develop software capable of helping customers find the best suitable garment for their body type and preferences. The app can be used both in brick-and-mortar stores and in online environments and offers in-store contactless sizing, as well as QR code and smart mirror integration.

With one of the largest databases of body measurements per region, the software uses machine learning, user-generated data, and multi-parameter algorithms to determine the correct clothing size for a specific person, taking into consideration various body shapes and sizes. After its implementation, clients reported a decrease in returns of up to 55 percent, with significant increases in conversion rates and average order value.
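The core of such a size recommendation can be sketched as a nearest-neighbour lookup over a reference table of body measurements. The table, measurements, and size labels below are invented for illustration and are not EyeFitU's actual model, which combines far more parameters and user-generated data.

```python
# Hypothetical reference table: (height cm, chest cm, waist cm) -> size label.
REFERENCE = [
    ((160, 88, 72), "S"),
    ((170, 96, 80), "M"),
    ((180, 104, 88), "L"),
    ((190, 112, 96), "XL"),
]

def recommend_size(measurements):
    """Return the size whose reference measurements are closest (Euclidean)."""
    def dist(ref):
        return sum((a - b) ** 2 for a, b in zip(measurements, ref))
    return min(REFERENCE, key=lambda row: dist(row[0]))[1]

print(recommend_size((176, 100, 84)))  # prints "L"
```

By steering shoppers toward the size that actually fits, this kind of lookup removes the incentive to order multiple sizes of the same garment.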

Final thoughts on sustainability

However, like many other industries, fashion is also prone to greenwashing. Customers and stakeholders are overwhelmed with misleading messages regarding a company’s environmental performance, being led to believe that the firm really cares for the environment. Given the vast proliferation of such deceptive self-promotion practices in the last decades, it is becoming increasingly difficult for customers to differentiate between true claims and mere marketing messages.


“I don’t think that any company today buys software for the environmental impact. They might love the sustainability story and the PR impact, but actually what sells the product are its KPIs – decrease in returns, increase in basket size and conversion rates.”


Even though the reality behind corporate claims of environmentalism may sometimes be disappointing, the scale and pace of current innovation and technological developments can bring about a significant shift towards new, more sustainable business models that are geared not only toward profit, but also to the needs of the planet and its inhabitants. Access to the right data, gathered through automated processes and processed by powerful analytical tools, might be the first step towards truly sustainable brands.

However, as Ohnemus points out, behavioral change has to come from both sides – from companies and consumers. Technology can support this change by providing consumers with tailor-made solutions to ensure their online fashion purchases satisfy their needs. Reducing returns is a three-way win-win-win situation: Good for the industry, good for consumers, good for the climate and environment.

Find out more on the topic in our free reports dedicated to sustainability.

Quantum Computing in Banking and Finance – Threat or Opportunity?

What do companies such as J.P. Morgan, Wells Fargo, Barclays, Mitsubishi Financial Group, Citigroup, Goldman Sachs, or Caixa Bank have in common (besides being banking and financial giants)? They have all started to invest in and experiment with quantum computing applications.

Even though it is an emerging technology that still needs to mature in many ways to fulfill its wide range of promises, quantum computing has already started to make its way into various industries. The business world now faces steady pressure to familiarize itself with the technology, assess its potential, find specific use cases, and decide upon a potential long-term strategy.

Quantum computers are an entirely new type of hardware operating on the principles of quantum physics. While traditional computers use bits and a binary representation of information (either zero or one), quantum devices store information in qubits, which can exist in a superposition (both zero and one at the same time). This allows them to process vast amounts of information significantly faster than classical devices. However, quantum hardware technology still needs to mature; for now, most of the advantages that quantum computers offer over conventional computers remain almost entirely theoretical.
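The superposition idea can be sketched in a few lines of plain Python: a qubit is just a pair of complex amplitudes, and the standard Hadamard gate turns the definite state "zero" into an equal mix of zero and one. This is a classical simulation for intuition only, not quantum hardware.

```python
import math

# A qubit's state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; |alpha|^2 is the probability of measuring 0.
def hadamard(state):
    """Apply the Hadamard gate, which puts |0> into an equal superposition."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def probabilities(state):
    alpha, beta = state
    return (abs(alpha) ** 2, abs(beta) ** 2)

zero = (1 + 0j, 0 + 0j)        # the classical bit 0
plus = hadamard(zero)          # superposition: 0 and 1 at once
p0, p1 = probabilities(plus)   # each outcome has probability 0.5
```

Simulating n qubits classically requires tracking 2^n amplitudes, which is precisely why large quantum systems quickly outgrow conventional hardware.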

Companies in the banking and financial sector are already experimenting with this technology to either harness its potential or take precautions with regard to its implications.

Quantum computing as a threat

Banks, hedge funds, asset managers, and all types of financial institutions deal with very sensitive customer data as well as information regarding transactions and contracts. Moreover, regulators require this data to be stored for periods ranging from several years to several decades. Therefore, it is paramount that it should remain secure and private. Some of the encryption algorithms used today rely on complex mathematical problems that classical computers cannot solve.

In a keynote presentation at the Inside Quantum Technology 2021 conference, Dan Garrison, who guided the creation of Accenture’s Quantum Computing Program, mentioned that if all the world’s classical computers worked together to break an encryption key (e.g., the one protecting a bank account), it would take approximately 14 billion years. However, it has been theoretically proven that a quantum computer would be able to break some types of encryption in a matter of minutes or seconds, and several algorithms capable of doing so have already been developed.

Quantum hardware hasn’t yet reached the necessary level of development to run such algorithms. Nevertheless, as soon as large-scale, fault-tolerant universal quantum computers become available, there is a risk that all the data and private information concerning people, businesses, and transactions may be exposed. Some scientists expect this to happen in the next decade. Based on the principle “harvest now, decrypt later,” it is believed that nefarious actors are now hoarding encrypted data, with a view to accessing it as soon as more powerful quantum devices become available.

Therefore, by adopting quantum-resistant algorithms already at this stage, data owners can ensure that their information remains protected in the future as well.


“In the Finance sector, which deals with sensitive and private information, our greatest concern is what we call post-quantum cryptography (PQC). This refers to the landscape of privacy, cryptography, and encryption after the day when quantum computers become capable of breaking many of today’s encryptions. Post Quantum Cryptography should be something that is on everybody’s mind.” Peter Bordow, Principal Systems Architect for Advanced Technologies at Wells Fargo.


Quantum computing as an opportunity

Optimal arbitrage, credit scoring, derivative pricing – all these financial procedures involve many mathematical calculations and become even more complicated and resource-intensive as the number of variables increases. At some point, people have to settle for less-than-optimal solutions, because the complexity of the problem surpasses the capabilities of current technology and methods.

These so-called intractable problems (that can’t be solved by a traditional computer in a reasonable amount of time) represent the best use-cases for quantum technology.

One of the most acclaimed applications of quantum computing in the financial sector is the accurate simulation of markets and the ability to predict how a change in a commodity price will influence the cost of other assets.

According to experts in the field, quantum computers would be able to perform so-called Monte Carlo simulations to forecast future markets, predict the price of options, or assess risk and uncertainty in financial models.
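For a sense of what such a simulation involves, here is a purely classical Monte Carlo estimate of a European call option price under geometric Brownian motion, the kind of workload quantum algorithms are expected to accelerate. All parameter values are hypothetical.

```python
import math
import random

def monte_carlo_call(s0, strike, rate, vol, t, n_paths, seed=42):
    """Estimate a European call price by simulating terminal prices
    under geometric Brownian motion and discounting the mean payoff."""
    rng = random.Random(seed)
    drift = (rate - 0.5 * vol ** 2) * t
    diffusion = vol * math.sqrt(t)
    total = 0.0
    for _ in range(n_paths):
        s_t = s0 * math.exp(drift + diffusion * rng.gauss(0, 1))
        total += max(s_t - strike, 0.0)
    return math.exp(-rate * t) * total / n_paths

price = monte_carlo_call(s0=100, strike=100, rate=0.05, vol=0.2,
                         t=1.0, n_paths=50_000)
```

The estimate's error shrinks only with the square root of the number of simulated paths, which is why classical Monte Carlo is so expensive and why a quadratic quantum speedup would matter.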

By optimizing machine learning and employing algorithms capable of recognizing patterns in large amounts of data, quantum computers could perform these highly complex forecasts and predictions.

Trading and portfolio optimization are other areas where quantum computing could significantly help. Having to consider the market volatility, customer preferences, regulations, and other constraints, traders are currently limited by computational limitations and transaction costs in simulating a large number of scenarios and improving portfolio diversification. Scientists have already proved that quantum technology can deal with the complexity of these problems.
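A toy illustration of why portfolio optimization strains classical machines: even a brute-force mean-variance search over a coarse weight grid grows combinatorially with the number of assets. The three assets, their returns and variances, and the zero-covariance simplification below are all invented for illustration.

```python
from itertools import product

# Hypothetical assets: expected annual return and return variance.
returns = [0.06, 0.10, 0.14]
variances = [0.01, 0.04, 0.09]

def best_weights(step=0.05, risk_aversion=3.0):
    """Brute-force search over weight combinations summing to 1,
    maximizing mean-variance utility (covariances assumed zero)."""
    best, best_u = None, float("-inf")
    grid = [i * step for i in range(int(round(1 / step)) + 1)]
    for w in product(grid, repeat=len(returns)):
        if abs(sum(w) - 1.0) > 1e-9:
            continue  # keep only fully invested portfolios
        mu = sum(wi * ri for wi, ri in zip(w, returns))
        var = sum(wi ** 2 * vi for wi, vi in zip(w, variances))
        u = mu - risk_aversion * var
        if u > best_u:
            best, best_u = w, u
    return best, best_u

weights, utility = best_weights()
```

With three assets the grid has a few thousand points; with the hundreds of assets and extra constraints real traders face, exhaustive search becomes intractable, which is the opening for quantum approaches.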


Currently, Dharma Capital and Toshiba have joined forces in exploring the potential of quantum computers in assessing the effectiveness of high-frequency trading strategies for listed stocks in Japanese markets.


In a panel discussion during the Inside Quantum Technology 2021 conference, Steve Flinter, Vice President within Mastercard’s Artificial Intelligence & Machine Learning Department, said that Mastercard had begun exploring use cases for quantum computers two years earlier. Even though retail banking and payments are not typical use cases for these devices, Flinter believes that besides optimization problems, quantum computers could be successfully employed to make sense of petabytes of data.

Marcin Detyniecky, Group Chief Data Scientist and Head of AI Research and Thought Leadership at Axa Insurance, also points out that in the financial industry, quantum computers could have a positive impact in areas such as foreign exchange optimization, asset allocation, large-scale portfolio optimization, disaster simulations, and risk modeling.

Commercial quantum applications for the financial industry

Of the dozens of quantum software start-ups around the globe, Multiverse Computing and Chicago Quantum have already developed specific quantum solutions for the financial sector and announced encouraging results in the area of portfolio optimization.



Multiverse Computing’s most mature product, an investment optimization tool, is capable of improving asset allocation and management, generating twice the ROI on average while the risk and volatility remain constant. Besides that, the company develops quantum-inspired solutions to predict financial crashes, determine anomalies in big unlabeled datasets, and identify tax fraud.

Chicago Quantum’s proprietary algorithm identifies efficient stock portfolios and, according to the company, “is currently beating the S&P 500 and the NASDAQ Composite 100 indices”.

In terms of quantum security for financial institutions, there are already several companies on the market offering quantum encryption devices and solutions. QuintessenceLabs offers data-protection solutions and encryption keys based on quantum technology, designed to withstand any malicious attacks both from classical and quantum computers. ID Quantique is also commercializing a quantum random number generator, along with quantum-safe network encryption and quantum key distribution solutions. Similar services are provided by Cambridge QC, evolutionQ, IBM, Infineon, ISARA, and Microsoft, to name but a few.

“Wait and watch” or “go ahead”

The future development of quantum solutions within the financial and banking industry is not without challenges. Finding out which problems are suitable for quantum computers and which are not, improving the accessibility of interfaces and the availability of software, and extending interest in this technology beyond an elite group of mathematicians and physicists – these are only a few of the challenges that the field will have to deal with in the future.

Moreover, experts have warned that adopting quantum-based solutions is a long and complex process that depends not only on a company’s capacity to define problems, migrate data, and adjust its infrastructure, but also on its ability to include suppliers and clients in the process.


“This is a long game. It is not a light switch that you flip, and suddenly you’re all done in a few months, and you’ve mitigated all your risk exposure.” Peter Bordow, Principal Systems Architect for Advanced Technologies at Wells Fargo


At present, quantum computing technology is not fully developed, and most of its applications and promised benefits are still conceptual. Therefore, companies in the financial and banking industry are faced with two alternatives: to wait and watch, or to go ahead. The first option implies ignoring emerging trends and reacting only once the threats or opportunities have materialized. The second relies on a more proactive approach, in which companies start familiarizing themselves with quantum technology, identify use cases, and begin testing the integration of quantum security solutions. This option might prove more valuable in the long run and help them mitigate future risks.


Find out more about the expected breakthroughs in quantum computing. Read our report, Supertrends in Quantum Computing, for a complete overview of quantum technology, as well as key players and investors in this field.

© 2021 Supertrends

References

Egger, Daniel, Claudio Gambella, Jakub Marecek, Scott McFaddin, Martin Mevissen, Rudy Raymond, Andrea Simonetto, Stefan Woerner, and Elena Yndurain. 2020. “Quantum Computing for Finance: State of the Art and Future Prospects.” IEEE Transactions on Quantum Engineering, vol. 1: 1-24.

Wells Fargo. 2020. Post-Quantum Cryptography (PQC) and the Quantum Threat. Position Report, San Francisco: Wells Fargo.

A Stepping Stone Towards a Sustainable Company: Gathering the Right Data

As a Senior Management Consultant at IBM, Anja Juhl Jensen has been working with digital innovation and disruptive technologies for more than 16 years. Currently, she supports IBM customers in the use of technologies such as artificial intelligence and blockchain to set a sustainable course for their organization. In an interview with Supertrends, she calls attention to an essential step in achieving a sustainable organization: proper data collection and analysis.

Anja Juhl Jensen has extensive experience working with disruptive technologies that drive sustainability. Her other areas of expertise include innovation, cognitive solutions, SAP software, compliance, and security. Currently, Ms. Jensen is involved in projects focused on gathering data and developing real-time sustainability reports, as well as on product modernization using artificial intelligence, image recognition, and machine learning.

With increased support from governments and socio-environmental activists, sustainability is now part of the mindsets of many businesses and consumers. Defined as an integrated effort to balance economic profit, the environment, and society’s wellbeing, sustainability has become a gold standard in the last decade.

Driven by increasingly stronger regulations, by the desire to boost their public image, or simply because of genuine concern for the future of humankind, companies have started to look into strategies to improve their environmental and social impact.

The current state of affairs

Despite growing interest in sustainable business goals, companies’ attitudes and interests towards this issue vary significantly. Jensen highlights three main categories:

The “business-as-usual” firms are on one side of the spectrum. They do the bare minimum and meet only the most stringent regulations. In their case, sustainability is a PR exercise: Sustainability-related terms show up on their website or in glossy reports, without any real effort to make them a reality. This type of company has no genuine interest in changing its business model to become more sustainable.

“Optimizers” represent a different approach on the spectrum. Setting sustainability-related goals is seen as a means to optimize and improve processes and operations within the organization. These companies have already developed a sustainability strategy and started gathering data. Often, however, this is done manually and unsystematically. These companies have the right intentions but lack the necessary knowledge base to address sustainability issues.

Finally, the “re-inventors” are determined to substantially change their business model and develop new, sustainable products or services. Various surveys show that these companies have already started getting value back from their investment in sustainability and the re-invention of the company.

Companies’ attitudes towards the sustainability challenge

However, optimizing a business or re-designing corporate strategy requires sound data and cannot be done without a thorough analysis of the company’s current environmental, social, and economic impact.

Moreover, an increasing number of European regulations (e.g., Product Environmental Footprint (PEF) or EU Taxonomy) stipulate clear conditions that companies must meet to be qualified as environmentally sustainable. To prove that they meet these criteria, companies need access to proprietary data that spans a large spectrum.

The common issue – lack of consistent, comprehensive data

According to the Greenhouse Gas (GHG) Protocol, which sets global standardized frameworks for measuring greenhouse gas emissions, corporations are required to monitor and report three major categories: direct emissions from fuel burned for the company’s own activities (Scope 1); indirect emissions resulting from the consumption of purchased electricity, steam, heat, and cooling (Scope 2); and emissions that occur in the value chain of the reporting company (Scope 3).
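As a sketch, the three scopes amount to a simple aggregation over activity records; the records, figures, and field names below are invented for illustration and do not follow any specific reporting system's schema.

```python
# Hypothetical activity records: (description, scope, tonnes CO2-equivalent).
records = [
    ("fleet diesel",          1, 120.0),
    ("on-site gas boilers",   1, 45.0),
    ("purchased electricity", 2, 310.0),
    ("supplier logistics",    3, 980.0),
    ("employee commuting",    3, 210.0),
]

def emissions_by_scope(rows):
    """Aggregate reported emissions into GHG Protocol Scopes 1-3."""
    totals = {1: 0.0, 2: 0.0, 3: 0.0}
    for _, scope, tonnes in rows:
        totals[scope] += tonnes
    return totals

totals = emissions_by_scope(records)  # {1: 165.0, 2: 310.0, 3: 1190.0}
```

The arithmetic is trivial; as the article goes on to explain, the hard part is collecting consistent, trustworthy records in the first place, especially the Scope 3 figures that come from suppliers.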

As Jensen notes, today most of this information is gathered manually, usually once a year, when companies release a sustainability report. Besides being time-consuming, the process also lacks transparency in terms of data provenance or calculation. Moreover, transferring and converting information between various systems is slow and difficult, making data mapping across specific sites, brands, and geographies a cumbersome effort.

The problem becomes even more complex when it comes to gathering data from suppliers. For example, a dairy producer who obtains milk from over 12,000 farmers needs to know the footprint of each liter of milk acquired. Getting this information from the suppliers can become very difficult if they don’t monitor sustainability parameters or use a different reporting system. Currently, companies work with industry averages, but in the future, the need for exact data will significantly increase.

Despite being perceived as an onerous process, gathering this type of data can also be beneficial for the company, as Jensen points out:


“When companies start collecting data dynamically and in an automated way, they can start analyzing it across the whole company. This way, they can immediately see where resources are wasted, where there are gaps in the process, what could be optimized, and where their emissions are coming from.”


However, based on the old principle “You get what you measure”, companies need relevant inputs, actionable metrics, and robust analytics to get an overview of all sustainability areas and make informed decisions.

Technology – a powerful sustainability enabler

New technologies have the potential to enable, facilitate, and speed up the data gathering, analysis, and reporting processes, at the same time making them more reliable and transparent. Digital platforms can integrate data related to a wide range of sustainability initiatives (e.g., water and energy consumption, sources of raw materials, etc.) and provide the necessary insight for business decisions and operational plans.


“Some might see all the new regulations coming as a constraint for companies. This is the case to a certain extent. However, limitations also spark innovation. I strongly believe that we will see an acceleration of innovations based on the new insight companies get from their data when they are forced to start reporting on it.”


Through its smart, connected sensors, the Internet of Things enables access to an ever-increasing amount of data, allowing companies to monitor their emissions in real time. Due to its distributed ledgers and shared record of transactions, blockchain technology is becoming an essential tool for monitoring sustainability results throughout the supply chain. Finally, artificial intelligence and machine learning can analyze the data and identify potential areas for improvement.

Meeting sustainability requirements brings value for organizations

By using an integrated, automated system for gathering and analyzing sustainability-related data, companies can increase their sustainability ratings, lower their operating costs, and improve their efficiency. Jensen is convinced that such platforms can also lead to new sources of revenue, easier talent acquisition and staff retention, and improved brand perception and customer loyalty.

Common standards and automated reporting methods would also increase transparency in the supply chain, leading to improved process efficiency and reduced costs.

Businesses that neglect or decline to invest resources in meeting sustainability expectations might significantly lower their chances of benefiting from the myriad opportunities in the current global economy.


Are you interested in sustainability topics? Visit the Supertrends App and search for ‘sustainability’ to get an overview of our expert- and crowd-predicted future timeline of sustainability. Not an app user yet? Visit the Supertrends Pro page to learn about the benefits and request a free trial!



The Future of Accounting is Digital

The financial services industry is evolving rapidly, propelled by the rise of new technologies and software. Accounting is also stepping up its pace, striving to implement and extend the usage of electronic invoice systems. Once rolled out on a larger scale, electronic billing has the potential to save costs, speed up transactions, and ease the environmental burden.

Serial entrepreneur, author of ten books, and fintech enthusiast Werner Valeur strongly believes that the future of accounting is digital. From automated data capture, direct data transfer between transaction participants, and real-time payments to the replacement of Excel spreadsheets and innovative ways to automate financial processes – Valeur is optimistic about the potential of this industry and the vast opportunities that digitalization will facilitate.


“Digitalized and automated processes allow for more data points. This way, the company has access to more information and can better optimize its strategy.”


Electronic invoicing is one of the major goals and an essential step in the future development of accounting systems. Electronic bill payment is a system that allows the transaction parties to generate, send, pay, and trace their bills electronically via the internet, instead of the obsolete exchange of paper invoices via post or email (scans). This process is sped up and facilitated by the latest technological developments, access to cloud services, widespread internet access, and the increase in digital literacy among the general population.
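At its simplest, an electronic invoice is just structured data that software can generate, total, and track automatically. The sketch below is purely illustrative; the field names are invented and are not drawn from any actual e-invoicing standard such as the one mandated by EU rules.

```python
from dataclasses import dataclass, field

@dataclass
class InvoiceLine:
    description: str
    quantity: int
    unit_price: float

@dataclass
class EInvoice:
    seller: str
    buyer: str
    lines: list = field(default_factory=list)
    paid: bool = False

    def total(self):
        """Sum all line items; in a real system, tax would be added here."""
        return sum(l.quantity * l.unit_price for l in self.lines)

invoice = EInvoice("Acme GmbH", "Beta AG")
invoice.lines.append(InvoiceLine("consulting hours", 10, 120.0))
invoice.lines.append(InvoiceLine("software licence", 2, 450.0))
# total: 10 * 120.0 + 2 * 450.0 = 2100.0
```

Because every amount and status change lives in machine-readable form rather than on paper, both transaction parties (and their accounting software) can pay, reconcile, and trace the bill without manual re-keying.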

As of 2017, 90 percent of documents were still managed as paper records, but experts and business analysts indicate that this will change significantly in the future[1]. Valeur expects that electronic invoices will become a generally accepted transaction model in northern Europe by 2025, and in other regions of Europe a few years later.

Werner Valeur in his office; Photo courtesy of Werner Valeur

Currently, Latin America and Scandinavia are the world leaders in implementing e-invoices on all three primary levels: business-to-consumer, business-to-business, and government-to-business[2].

Multiple national initiatives have been launched across different countries, but the most important challenges are posed by cross-border harmonization, collaborative approaches, and the spread of adoption.

Already in 2014, the European Parliament and Council adopted Directive 2014/55/EU, which aims to develop a common European standard on electronic invoicing. This will lead to increased interoperability and better synchronization across countries.

Researchers have already identified a high adoption rate of e-invoices among young consumers, which will speed up implementation even more, given the rise of a new generation of “digital natives”. 


“In accounting, there are a lot of highly educated people, and the potential for advancement is huge. However, most of them have been educated according to outdated principles which don’t fit anymore in the current digital society. Therefore, the digitalization of accounting should already begin within the education system.”


Regarding the impact at company level, e-invoicing is expected to generate up to 80 percent in cost savings compared to paper-based processes, reduce payment-associated risks, and enable centralized management of financial documents.

Are you an expert in your field?

Supertrends is a technology and media company dedicated to uncovering the future. At Supertrends, we believe in expertise. In order to maintain the quality and value of our articles, reports and timeline, we go straight to the sources of knowledge about the future – experts. Supertrends experts are highly qualified individuals who are knowledgeable about, and actively working to create, the innovations that will shape our future. If this sounds like you, you can read about the application process as well as the benefits of being a Supertrends expert here.


References

[1]Edicom Group, “E-Invoicing Status Worldwide,” June 5, 2017, https://globaleinvoicing.com/en/news/e-invoicing-status-worldwide.

[2]Bruno Koch, “The E-Invoicing Journey 2019-2025” (Billentis, September 2019).



Supertrends AG, Erlenstrasse 16, 6300 Zug