
New Computing Governance Models

Beginning with IBM’s Deep Blue - the chess-playing computer that made an impressive showing against world champion Garry Kasparov in 1996 before a revamp helped it win their rematch the following year - algorithms have started to outperform humans in ways that were unimaginable in the not-too-distant past. The past few decades have seen an exponential increase in computational capability, and as computing power grows, so does the need to bolster sustainability and protect privacy.

 

While Deep Blue’s success was built on raw computing power - searching through enormous numbers of possible chess positions by brute force - modern systems such as DeepMind’s AlphaGo (the original version of which defeated a human professional at the board game Go in 2016) and AlphaStar (which reached grandmaster level in the real-time strategy game StarCraft II in 2019) excel at games commonly believed to require a significant amount of human intuition. Meanwhile, artificial intelligence and machine learning are now regularly used to influence our interest in products, predict our voting habits, and estimate our life expectancy based on our interests and habits.

Technological development is happening at an increasing pace, leaving less and less time for governance models to be updated in ways that provide effective, up-to-date regulation for the many facets of the computing world. As technology rapidly evolves, so must our ability to manage its impact.

 

Sticking with outdated regulation may not only fail to protect the general population from privacy breaches (both intentional and otherwise) and other harms, but also hinder innovation in a highly competitive environment. Developing new governance models that enable healthy innovation and swift development while protecting the public interest in terms of security, privacy, and sustainability remains a daunting challenge.

 

However, there have been some signs of potential progress. For example, the European Union adopted power supply unit efficiency requirements for many types of servers and data storage products in 2019, and the EU’s General Data Protection Regulation (GDPR), implemented in 2018, is an effort to give users more power over the gathering and distribution of their personal data.

Environmental Impact of Computing


The manufacture, use, and disposal of computing technology can have grave consequences. The data centres necessary to power cloud computing and provide constant access to the internet account for about 2% of global greenhouse gas emissions - a carbon footprint equivalent to that of the airline industry.

 

In China, electricity consumption related to data centres accounted for 2.35% of the country’s total in 2018, and is on track to increase by 66% over five years starting in 2019, according to a report published by Greenpeace and North China Electric Power University (the same report noted that China’s data centres generated carbon dioxide emissions equivalent to more than half a million railcars’ worth of burned coal in 2018).

Artificial intelligence, with its energy-intensive need for data that must be shuttled around the world in growing amounts, is a primary driver of the growth in demand for data centres and transmission services. However, energy efficiency is improving, according to a report published by the International Energy Agency in 2019: while data centre traffic was anticipated to grow by 80% over the following three years, global data centre energy demand was expected to decrease slightly by 2021.
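
As a rough, back-of-the-envelope reading of those IEA figures (a sketch under stated assumptions, not numbers from the report): if traffic grows by 80% while total energy demand stays roughly flat or dips slightly, the energy used per unit of traffic must fall by nearly half.

```python
# Back-of-the-envelope reading of the IEA figures above. The -1% change in
# total energy demand is an illustrative stand-in for "decrease slightly",
# not a number from the report.
traffic_growth = 0.80        # traffic anticipated to grow by 80%
energy_change = -0.01        # assumed slight decrease in total energy demand

energy_per_traffic = (1 + energy_change) / (1 + traffic_growth)
print(f"Energy per unit of traffic falls to {energy_per_traffic:.2f}x "
      f"of its starting value (roughly a {1 - energy_per_traffic:.0%} efficiency gain).")
```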

The production of electronics has a global environmental impact, due in particular to a heavy reliance on the mining and processing of rare earth elements. These elements - such as europium, neodymium, and terbium - are required for the batteries, displays, hard disks and microphones that make computing possible. Despite their name, these elements are not especially rare (even the rarest are more than 100 times more common than gold), though extracting them can be a difficult and dirty process.

 

More than 80% of current global production of rare earth elements takes place in China. Beyond China, suitable mining sites are primarily located in Australia, Japan, Brazil, India, and Russia - and increased mining activity in these places has stirred geopolitical interest. The extraction and purification of rare earth elements is highly resource-intensive, which aggravates their environmental impact; however, efforts are underway in places like Japan and France to develop profitable methods of recycling rare earth elements recovered from electronic waste.


Future of Computing

When you think about your carbon footprint, what comes to mind? Driving and flying, probably. Perhaps home energy consumption or those daily Amazon deliveries. But what about watching Netflix or having Zoom meetings? Ever thought about the carbon footprint of the silicon chips inside your phone, smartwatch or the countless other devices inside your home?

 

In the not-too-distant past, computing was limited to a few large machines installed at a handful of universities. Now, fully-fledged information and entertainment systems are sitting in just about everyone’s pocket. New approaches like quantum computing and artificial neurons promise to push performance and efficiency even further. However, technological progress has raised significant concerns related to our ability to protect both sensitive data and the environment - which can only be addressed with regulation designed to keep pace with innovation. 

 

Every aspect of modern computing, from the smallest chip to the largest data centre, comes with a carbon price tag. For the better part of a century, the tech industry and the field of computation as a whole have focused on building smaller, faster, more powerful devices - but few have considered their overall environmental impact.


Widespread Applications


Computer applications have moved beyond simple calculations. Developments like 3D printing and the Internet of Things are pushing computing needs into surprising new places. Devices can now make both broad predictions about weather patterns and financial markets, and narrowly-targeted recommendations about everything from when to buy a house to what to watch on Netflix.


Basically, it is now harder to name a single area where computers are not indispensable than it is to list their potential applications. As more types of devices are connected to the internet, the Internet of Things is a natural progression - ushering in a period when your refrigerator knows you are running out of milk, your smartwatch monitors your heartbeat and can call an ambulance if you have a heart attack, and your washing machine can order more detergent at the push of a button. Algorithms may become able to predict which drugs will be most effective with the fewest side effects, in the same way that they now offer up targeted advertisements and news stories on social media services to match the interests of users.

If we think beyond current capabilities and try to glimpse the future of computing - and the wealth of possibilities it could offer - one example is optimized 3D printing: the printing of both micro- and macro-structures (from single atoms to entire buildings), and of prosthetics and biomaterials ranging from foliage to meat.

 

Mobile phones, for example, have become mobile computers with far more capabilities than a simple call function, while cars are evolving to become primarily large computers with a mobility function.

 

Much like access to water, which evolved from a single well in the village centre to vast systems of canals and pipes with water pouring out of every tap essentially for free (in most Western societies, anyway), computing has evolved from a few select computers installed at just a handful of universities several decades ago to full-scale computing and entertainment systems sitting on just about anyone’s desk. Now, it only seems like a matter of time until computing penetrates the human body.

A Broader Computing Reach


Computers have evolved from calculating machines to essential equipment for work, entertainment, and daily communication. For computing to continue progressing at a swift pace, it needs to become more essential in more fields. Laptops and tablets are already being used as training aids in primary schools, and the borders between computers, video game consoles, and TV screens are nearly non-existent as more movies, shows and games are being stored in the cloud and made available on demand.

 

The next logical step is entertainment that takes place not on pre-determined screens, but in augmented reality wherever a viewer might be. For computing to continue at its current rate of growth, its integration will have to spread further. One significant area that may be increasingly penetrated is the biotechnology and health sector.

 

Biological and medical research at the molecular level - in particular DNA research - provides a means of predicting allergies and disease, including cancers. This research can be computationally intensive, as it takes into account factors like a person’s DNA information and details about their full family tree. However, the challenges this presents must be viewed not just in relation to the limits of technology and the laws of physics, but also relative to social norms.

The further spread of computing’s impact on biology and medicine will rely on a tighter integration of computer science and biomedical research. For example, next-generation computers could support drug design in a way that suppresses side effects or strengthens desired ones.

This sort of computational capacity could also penetrate autonomous logistics and production in a way that improves mobility.

 

Better self-driving cars, for example, could lower the risk of accidents. And the same underlying math used to develop self-driving cars can also be applied to optimizing production lines, and to improving power distribution (especially in the face of an impending increase in power demand when electric vehicles become more popular). Similar optimization can be applied to finance, in the form of portfolio management or high-speed trading, for example.

 

It is important to note that the growth of computing is tied not just to better and more powerful hardware with greater amounts of memory; progress has also come in the form of entirely new architectures such as quantum and neuromorphic computing, along with quantum-inspired algorithms that provide new and potentially faster ways to perform calculations.

Overcoming Technological Limitations


The power consumption of CPUs can be reduced by shrinking the semiconductor structures they are built from. But those structures cannot be shrunk arbitrarily - even with highly advanced manufacturing processes, such as multiple patterning or extreme ultraviolet (EUV) lithography, which only a small number of companies have the resources to pursue.
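
To make the link between feature size and power concrete, the sketch below uses the standard dynamic-power relation for CMOS logic, P ≈ C·V²·f. The relation is well established, but the capacitance, voltage, and frequency values are illustrative assumptions rather than measurements of any real chip.

```python
# Sketch of why smaller transistors can cut power: dynamic power in CMOS
# logic scales roughly as P = C * V^2 * f (switched capacitance, supply
# voltage squared, clock frequency). Values below are illustrative only.

def dynamic_power(capacitance_farads: float, voltage_volts: float, frequency_hz: float) -> float:
    return capacitance_farads * voltage_volts ** 2 * frequency_hz

older_node = dynamic_power(1.0e-9, 1.2, 3e9)    # hypothetical larger process node
smaller_node = dynamic_power(0.7e-9, 1.0, 3e9)  # shrinking lowers capacitance and voltage

print(f"Power drops to {smaller_node / older_node:.0%} of the baseline "
      f"({older_node:.1f} W -> {smaller_node:.1f} W) at the same clock speed.")
```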

 

Computing limitations are perhaps clearest in terms of CPU clock speed; these have been temporarily addressed by joining multiple cores on one chip, though that has yielded only marginal improvement in recent years. Beyond clock speed, other hurdles include the amount of data that can be stored on hard disks, which is approaching physical limits. General-purpose CPUs can be partially replaced by specialized processors for specific tasks, but those can be prohibitively expensive. The limitations of classical computing are being addressed in terms of both hardware and software.
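
Amdahl’s law, which is not mentioned in the text but is the textbook way to describe this effect, shows why adding cores yields diminishing returns: the fraction of a program that must run serially caps the overall speedup. The 95% parallel fraction in the sketch below is purely an illustrative assumption.

```python
# Amdahl's law: with a fraction p of a program parallelisable, the best
# possible speedup on n cores is 1 / ((1 - p) + p / n). The 95% parallel
# fraction below is an illustrative assumption.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 8, 64, 1024):
    print(f"{cores:5d} cores -> {amdahl_speedup(0.95, cores):5.1f}x speedup")
# Even 1024 cores cannot push the speedup past 20x here - which is why
# simply adding cores has yielded only marginal improvement.
```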

On the hardware side, researchers and engineers have looked into basic computing units that are fundamentally different from transistors (the tiny, revolutionary switches underpinning CPUs that emerged in the middle of the 20th century). Examples include “neuromorphic” computing systems that mimic the human nervous system, and quantum computing - a fundamentally different model that harnesses quantum mechanics to potentially outperform classical computers. In addition, companies are becoming more open to moving beyond traditional materials such as silicon; single atoms and photons (particles of light) are being investigated as new computational media, for example.

In terms of software, decades of quantum-computing research have spawned new ways of thinking about computing with classical devices - resulting in novel classical algorithms (or “quantum-inspired algorithms”) that can solve specific problems orders of magnitude faster than was thought possible just a handful of years ago.

 

Constantly increasing computing capacity is the backbone of modern society, and fresh approaches like quantum computing could help break through capacity limits. Advances in computational capability come at a cost, however: significant increases in required electrical and cooling power have environmental repercussions, and place natural limits on the performance of the central processing units (CPUs) needed to execute instructions.

Cybersecurity Evolution


New computing technologies create a need for more sophisticated cybersecurity. Secure digital communication relies on exchanging keys to encrypt information in a way that makes it prohibitively costly to even attempt to crack. The widely used RSA cryptosystem, for example, is “asymmetric” - that is, based on a process that involves the open exchange of a public key used to encrypt information, which can only be decrypted by the owner of a corresponding private key.

 

Consider the timelines involved: assuming that it will take 10 years to agree on and widely establish a new encryption standard, and assuming a safe-storage requirement of 10 years, security providers must already consider the potential firepower of computing devices that will be in use 20 years from now (and some applications will require an even longer planning horizon).
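
The arithmetic behind this example is sometimes framed as “Mosca’s inequality” in post-quantum cryptography planning; a minimal sketch follows, where the 15-year estimate for when a code-breaking machine might arrive is an illustrative assumption, not a figure from the text.

```python
# Sketch of the planning arithmetic above: data is at risk if the time to
# migrate to a new encryption standard plus the time the data must remain
# secret exceeds the time until an attacker has a machine able to break
# today's encryption. The 15-year attacker estimate is illustrative only.

migration_years = 10        # time to agree on and widely establish a new standard
secrecy_years = 10          # safe-storage requirement for the encrypted data
attacker_ready_years = 15   # assumed arrival of a code-breaking machine

horizon = migration_years + secrecy_years   # the 20-year horizon from the example
if horizon > attacker_ready_years:
    print(f"At risk: data encrypted today must stay secure for {horizon} years, "
          f"but a capable attacker may arrive in {attacker_ready_years}.")
```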

 

The underlying security of RSA rests on a mathematical problem: while it is relatively easy to calculate the product of two (known) numbers, it is very hard to recover the factors of a single, large number. To date, much encryption has been based on such asymmetric protocols. However, new technologies - in particular quantum computers, quantum simulators, and wave-based analog computers - may be able to overcome the mathematical challenge underpinning asymmetric cryptosystems. While these technologies are still in their infancy (quantum computing, which harnesses quantum mechanics to boost computing power, was considered years away from maturity as of 2020), it is only a matter of time until they mature.
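
A toy illustration of that asymmetry is sketched below with deliberately tiny numbers; real RSA keys use primes hundreds of digits long, so this demonstrates the principle rather than providing a usable implementation.

```python
# Toy RSA sketch (Python 3.8+ for pow(e, -1, phi)): multiplying two primes
# is easy, but recovering them from the product is hard at real key sizes.
# The tiny primes below are purely for illustration.

p, q = 61, 53                    # private: two prime factors
n = p * q                        # public modulus (3233) - trivial to compute
e = 17                           # public exponent
phi = (p - 1) * (q - 1)          # requires knowing p and q
d = pow(e, -1, phi)              # private exponent: modular inverse of e

message = 42
ciphertext = pow(message, e, n)          # anyone can encrypt with the public key (n, e)
assert pow(ciphertext, d, n) == message  # only the holder of d can decrypt

# The "hard" direction: factoring n. Trivial here, infeasible on classical
# hardware for a 2048-bit modulus.
factor = next(k for k in range(2, n) if n % k == 0)
print(f"{n} = {factor} x {n // factor}")
```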

The time needed for this maturation must be weighed against the time needed to bolster encryption protocols so they can withstand attacks from next-generation computers. In the US, the National Institute of Standards and Technology (NIST) is moving towards the first set of encryption standards designed to withstand attacks from next-generation devices like quantum computers. However, it is unclear both how long it will take to reach consensus on these new standards, and how they will perform under intense scientific scrutiny.

 

In addition, the use of improved encryption techniques will require additional computational power - which will in turn potentially increase electrical power consumption needs, and impact internet speeds.

Unlocking Big Data

A steadily growing mountain of data is being generated by an increasing number of entities, and becoming available to a growing number of organizations; in fact, the amount of data recorded by humanity doubles about every two years. Advances in computing will help glean insights from these progressively bigger datasets. Service providers and users are now graduating from simply searching and processing raw data to finding patterns and relationships between available information and metadata, in order to bolster precision.
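
To put that doubling rate in perspective, the short sketch below projects it forward; the starting volume is an illustrative assumption rather than a measured figure.

```python
# What "doubles about every two years" implies over a decade. The starting
# volume of 50 zettabytes is an illustrative assumption.
start_zettabytes = 50
doubling_period_years = 2

for year in range(0, 11, 2):
    projected = start_zettabytes * 2 ** (year / doubling_period_years)
    print(f"year {year:2d}: ~{projected:,.0f} ZB")
# After 10 years the total is 32 times the starting volume.
```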

 

Amazon, Netflix and others have based much of their businesses on recommendation systems that push additional products to customers based on their online activity. Grouping and categorizing people like this can be of interest not just to businesses but also to governments - though as more data (including faulty data) become available, this is becoming a bigger challenge.
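
As a rough illustration of how such recommendation systems work, here is a simplified user-based collaborative-filtering sketch with made-up data - not how Amazon or Netflix actually build their systems.

```python
# Minimal user-based collaborative filtering: recommend items that users
# with similar activity also interacted with. The activity data is made up.

activity = {
    "alice": {"camera", "tripod", "lens"},
    "bob":   {"camera", "lens", "memory card"},
    "carol": {"novel", "bookmark"},
}

def jaccard(a: set, b: set) -> float:
    """Similarity between two users' item sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user: str) -> list:
    seen = activity[user]
    scores: dict = {}
    for other, items in activity.items():
        if other == user:
            continue
        similarity = jaccard(seen, items)
        for item in items - seen:            # only suggest unseen items
            scores[item] = scores.get(item, 0.0) + similarity
    return [item for item, score in sorted(scores.items(), key=lambda kv: -kv[1]) if score > 0]

print(recommend("alice"))   # ['memory card'] - bob's habits resemble alice's
```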

 

While big data can be used both for good and for bad, that should not obscure the fact that data is useless without the algorithms necessary to evaluate and correlate it. The potential unlocking of any available information also raises privacy and human rights concerns; for example, someone’s behavior, health history, and genealogy could be used to affect their insurance costs, while their movements could be tracked in a bid to identify hobbies and friends (or potential criminal activity), and their interests could be exploited to affect their political preferences and voting behavior.

 

Trained systems such as so-called neural networks are only as good as the data they are trained on; human bias infecting that data can result in a biased system. For example, a human resources department comparing the profiles of applicants may try to make predictions about their future reliability and dedication, and to find a match between their goals and those of the company; but if the algorithm being used to do this has been trained on biased data, it will reflect biased decisions.

 

However, careful evaluation of these programs can test for bias in the data, and in turn for bias in the hiring decisions they inform. It is therefore necessary to apply close scrutiny to any algorithm or dataset used to engineer important aspects of life, work, and technological progress.
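
One simple check such an evaluation might include is comparing the rate at which a model selects candidates from different groups. The sketch below uses made-up decisions and the common “four-fifths” rule of thumb as a threshold; both are illustrative assumptions, not requirements from the text.

```python
# Compare selection rates across two groups of applicants (a "disparate
# impact" style check). The decisions and the 0.8 threshold are illustrative.

decisions = [
    # (group, selected_by_model)
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", False), ("group_b", True), ("group_b", False), ("group_b", False),
]

def selection_rate(group: str) -> float:
    outcomes = [selected for g, selected in decisions if g == group]
    return sum(outcomes) / len(outcomes)

rate_a, rate_b = selection_rate("group_a"), selection_rate("group_b")
ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
print(f"Selection rates: {rate_a:.2f} vs {rate_b:.2f} (ratio {ratio:.2f})")
if ratio < 0.8:   # "four-fifths" rule of thumb
    print("Potential bias: one group is selected far less often - inspect the training data.")
```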
