Our daily habits have become currency for others. And this is a rapidly changing field that is not bound to physical geography: an AI application can spread around the world in seconds.
We remain bound to many of the major technology providers even amid growing distrust. Technology was once the most trusted sector globally in the Edelman Trust Barometer, but that trust has eroded with every year the study has been conducted. What might organisations do to rebuild trust with their users?
Machine learning, artificial intelligence and social systems are only good if they make a real difference for the many. In the second session of Imperial Tech Foresight 2040: Intentional Creations, three researchers from Imperial College London give us a future perspective on how we can balance ethics and technology.
Privacy and personal data have been under a lot of scrutiny recently, often warranted when it is unclear what the intentions of a business might be (and see our overview of the Meta-Motivation session for why clear and transparent intentions are important).
Meanwhile, our individual quest for a smarter, more efficient and less taxing lifestyle, and for improved wellbeing, has favoured seamless, frictionless digital interfaces. Anticipatory technologies that assess and respond to our needs without us even asking have become increasingly ingrained in our dependence on technology.
Future of Databox
Devices that respond to our needs and let us make decisions quickly are a delight, but do we really know what data these devices are using, or how it is being shared?
Dr Anna-Maria Mandalari, from Imperial’s Dyson School of Design Engineering, explains how new systems, interventions and experiences will set the tone for devices that enable trust across society, showing a future where we take back control of our device data using powerful new ideas.
Digital justice
Professor Jeremy Pitt, from the Department of Electrical and Electronic Engineering at Imperial, wants to take things back to basics by using social theory to create AI for good. How might we create networks that distil fairness and justice?
He applies this thinking to the sustainable development goals, creating systems that help us share energy and resources and encouraging a truly egalitarian sharing economy.
Developing human-centred codes of conduct
Looking back at how previous technologies were implemented and deployed during past waves of global industrial advance should also help us make better, more informed choices going forward, and reduce the potential negative impacts.
Our impact on the environment has been well-documented, and the current climate emergency follows years of neglect as a result of the unintended consequences of economic development.
This has compromised ecosystem services and disrupted the natural balance, and a similar failure to take into account the effect of technology on society could be catastrophic.
Professor Rafael Calvo, from the Dyson School of Design Engineering at Imperial, will lead us on a journey of discovery about why we cannot risk underestimating the societal impact of technologies on humanity and human wellbeing, especially when it comes to AI.
Instead, we should use new forms of ethics to guide design principles that put end-users’ health and autonomy at the centre of development, making a future that promotes positive human experiences.

This fascinating trio’s insights will be of interest to people working in all sectors and industries, especially those currently developing technologies with AI underpinning their success. There has never been more scrutiny of the increasingly personalised user experience of technology in the home, at work and in our day-to-day lives, and that scrutiny will only increase if companies cannot consistently prove that they are acting legally, ethically and with societal benefit ahead of profit alone.
Register for the event
Imperial Tech Foresight 2040: Intentional Creations is a three-part virtual conference taking place in June 2020, using our understanding of the past, present and emerging signals of change to make better decisions about tomorrow.
Moral Machines, the second session in Tech Foresight 2040: Intentional Creations, takes place on Thursday 18 June, 15:00 BST.
Interact online by using the hashtag #TF2040, and follow @ICTechForesight on Twitter for updates and information about the event.
Read about Meta-Motivation, the topic for the first session of this year’s conference on 11 June 2020.
Find out about Malleable Matter, the subject of this year’s third session of Tech Foresight 2040 on 25 June 2020.