As technology becomes integrated into more and more aspects of our lives, the profile of risks associated with it is also expanding. New advances in many kinds of technologies pose potentially significant ethical challenges (e.g., ‘Artificial intelligence’, ‘Biotechnology’, or decarbonization technologies). At the same time, our growing use of these technologies creates potential risks at a macro level (e.g., the cybersecurity of a nation’s critical infrastructure) and at a micro level (e.g., the security of personal data and individuals’ vulnerability to online manipulation). Such risks are certainly to be expected with the advent of disruptive technologies; they are the price we pay for the great benefits these technologies offer. The question is how well we recognize and mitigate these risks, so that new technologies can be used for the benefit of all.

Ethics of technology

Many governments around the world are turning their attention to the ethics of technology and the implications of fast-developing technology for future societies.

Ethics related to the use of ‘Artificial intelligence’ (AI) in automated vehicles, automated decision making and consumer interactions is a frequently raised topic,[1] and governments will increasingly be expected to address concerns around digital harm, disinformation, antitrust and foreign interference.[2] According to the UK Ministry of Defence, the AI-enabled technologies of the future must benefit from effective ‘technical, legal, and ethical frameworks’. Ethical questions are perhaps most critical in the area of militarized AI and the use of technology in conflict. While machines could behave without regard for human suffering, they may also calculate the costs of conflict more accurately. Complexities can be expected to arise if countries develop conflicting ethical and legal frameworks for AI, both in military contexts and more broadly.[3] Other key ethical issues related to AI systems include unwanted bias, eavesdropping and safety, and industry is already working to address them. The ISO/IEC committee working on AI (ISO/IEC JTC 1/SC 42, Artificial intelligence) has collected 132 use cases for AI, including the ethical considerations and societal concerns for each (for more details, see ISO/IEC TR 24030:2021, Information technology — Artificial intelligence (AI) — Use cases).

When considering the ethics of using AI, however, it is equally important to consider the ethics of not using it. The risks of using AI are frequently discussed, but one question is not addressed often enough: when does it become unethical for us not to use AI? For example, if AI technology could predict the next pandemic or speed up vaccine development, one could argue that it would be unethical not to use it. There are plenty of examples like this. A commonly posed question is: if an AI-enabled autonomous vehicle had to hit someone, who should it hit? But is this the right question if the proper use of AI-enabled autonomous driving can save lives by reducing accidents overall?

Of course, AI is not the only emerging technology that could pose significant ethical challenges in the future. Advancements in biotechnology could – alone, or in combination with AI – lead to the creation of synthetic life forms or augmented human beings, with enhanced physical or cognitive abilities. How to regulate technologies that can fundamentally alter human capabilities or change the human gene pool “could prompt strident domestic and international battles” in coming decades (see ‘Gene editing’). Even technological advances to treat diseases could engender political debates about the ethics of access (since treatments are likely to be available only to those who can afford them).[4] Not to mention continued ethical debates about genetically engineered crops and foods and their potential ecological or health-related consequences.[5]

As the climate crisis becomes more urgent, we may also soon face ethical issues related to the use of new technologies for decarbonization. While geoengineering technologies (carbon dioxide removal and solar radiation management) were for many years considered morally unacceptable, they are now gaining more attention as potential solutions of last resort.[6] Ethical concerns here range from distributive justice for future generations and vulnerable populations (the negative effects of geoengineering could fall disproportionately on some countries or populations, e.g., by increasing drought in Africa and Asia) to questions of procedural justice (who should decide to use these technologies, and how?).




Data privacy

“Trust and accountability are the new litmus tests for businesses in a world where digital is everywhere.”[7]

In the future, will data privacy be a thing of the past? Many sources agree that there is a clear trend towards a progressive loss of privacy accompanying new developments in technology. According to the UK Ministry of Defence, “In the coming decades, every facet of one’s life is likely to be recorded by the ubiquitous presence of wearable devices, smart sensors and the ‘Internet of things’ (IoT)”.[3] At the same time, there is also a trend towards emphasizing privacy, for example through ‘privacy by design’ approaches to development. Once privacy-respecting technology is available, the market has a choice, and the global success of the General Data Protection Regulation (GDPR) principles is an indicator of this trend.[8]
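One concrete way to picture privacy by design is pseudonymization, a data-minimization technique in which direct identifiers are replaced by keyed hashes before data is stored or analysed. The sketch below is illustrative only; the field names and the way the secret salt is handled are assumptions for the example, not a prescribed implementation:

```python
import hmac
import hashlib

def pseudonymize(identifier: str, secret_salt: bytes) -> str:
    """Replace a direct identifier (e.g. an email address) with a keyed hash.

    Using HMAC with a secret salt, rather than a plain hash, means that
    someone without the salt cannot reverse or re-create the pseudonym.
    """
    return hmac.new(secret_salt, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical salt; in practice this would live in a managed key store.
salt = b"keep-this-secret-and-rotate-it"

# Illustrative record: the system stores the pseudonym, never the raw identifier.
record = {
    "user": pseudonymize("alice@example.com", salt),
    "event": "page_view",
}
```

The same input always yields the same pseudonym under a given salt, so records about one user can still be linked for analysis, while rotating or destroying the salt severs the link to the real identity.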

The use of biometric data, such as fingerprints and facial mapping, is increasing in both private (e.g., social media and personal technology products) and public (law enforcement and population surveillance) contexts.[9,10] Consumer trust will become an increasingly important issue as technology becomes more pervasive in everyday activities. Already, a majority of consumers are wary of connected devices and fearful of misuse of their personal data.[7,11] Some even suggest there may be a ‘digital bubble’, whose bursting will be due in part to privacy concerns – “Concerns about data privacy have called into question whether digital technologies will continue to grow at this rate”.[11] At the same time, companies are adjusting to market conditions and, if the market demands privacy, industry will develop appropriate products.[7] Industry needs to realize that privacy-respecting products are not much more expensive (if well done) and can instead provide a competitive advantage, since trust is a key decision factor for consumers faced with multiple options. Initiatives allowing the creation of ‘digital trust’, such as Yelp and Foursquare, are thus likely to grow in popularity.[12] Once society acknowledges that data has a value, and that the data owner therefore needs to be paid, a ‘new balance’ will be established. The question is whether and when such an acknowledgement may come.

In the meantime, to reassure consumers, both government regulation and business leadership are necessary to establish privacy and data management standards that keep pace with emerging needs.[10] Indeed, this will be a growing consumer expectation.[7] Ultimately, it seems inevitable that technology will permeate almost everything we do and bring enormous improvements in quality of life across society… but these benefits will need to be carefully balanced against the accompanying risks to privacy and security.[12]




Cyber vulnerability

Increasing reliance on technology and the proliferation of digital devices in daily life will create increasing risks related to data privacy, cyber-attacks, and consequences of system failure.[3,13] The key factor for prevention is risk awareness and proactive risk mitigation.

New digital technologies present serious challenges for governments and organizations, and cybersecurity will remain a priority as critical infrastructure is increasingly connected to online systems and technological dependence on the internet continues to rise (see ‘Spread of the internet’). Internationally, states will have to respond to evolving cyber threats and prepare for cyber-attacks as an instrument of war, counterintelligence and political interference.[9,13,14] A single data breach can impact multiple nations sharing online systems.[15] National leaders who are aware of these risks can take appropriate, proactive steps to protect large-scale systems such as electrical, communications, financial, logistical and food-production grids.[9] The Common Criteria for Information Technology Security Evaluation and the EU Cybersecurity Act are two examples of such proactive ventures.

Questions around ‘cyber borders’ may form part of the discussion around ensuring protection from attacks, and states and organizations alike must prepare for developments in cyber-crime.[3] As more and more citizens become connected to, and reliant on, online networks, the potential for terrorist attacks will grow if those networks are not sufficiently resilient and protected.[9] For developing countries in particular, preparedness for cyber-threats will need to accompany digitalization programs and the development of connected systems.[16]

Finally, cyber-vulnerability does not exist only at the level of countries and organizations; the vulnerability of individuals is also set to increase with their online exposure. For example, more people will get their information online, leaving them potentially more exposed to misinformation (‘fake news’), which could be used to manipulate individuals or, on a larger scale, to influence public opinion.[13]

To effectively mitigate these risks related to cyber-vulnerability, people cannot rely on government action alone – society needs to be the driving force. Society needs to demand that organizations maintain highly sophisticated information security systems to foster consumer trust and remain competitive.[2]





  1. Digital megatrends. A perspective on the coming decade of digital disruption (Commonwealth Scientific and Industrial Research Organisation, 2019)
  2. The global risks report 2021 (World Economic Forum, 2021)
  3. Global strategic trends. The future starts today (UK Ministry of Defence, 2018)
  4. Global trends. Paradox of Progress (US National Intelligence Council, 2017)
  5. Global trends 2040. A more contested world (US National Intelligence Council, 2021)
  6. Ethics of geoengineering (Viterbi Conversations in Ethics, 2021)
  7. Technology vision 2020. We, the post-digital people (Accenture, 2020)
  8. Two years of GDPR. Questions and answers (European Commission, 2020)
  9. Global trends and the future of Latin America. Why and how Latin America should think about the future (Inter-American Dialogue, 2016)
  10. 20 New technology trends we will see in the 2020s (BBC Science Focus Magazine, 2020)
  11. Beyond the noise. The megatrends of tomorrow's world (Deloitte, 2017)
  12. Future outlook. 100 Global trends for 2050 (UAE Ministry of Cabinet Affairs and the Future, 2017)
  13. Global trends to 2030. Challenges and choices for Europe (European Strategy and Policy Analysis System, 2019)
  14. Global risks 2035 update. Decline or new renaissance? (Atlantic Council, 2019)
  15. Asia pacific megatrends 2040 (Commonwealth Scientific and Industrial Research Organisation, 2019)
  16. Foresight Africa. Top priorities for the continent 2020-2030 (Brookings Institution, 2020)