TF 2043 Humans in the Futures Loop

31st January 2024
Younger generations coming into power today have a problematic inheritance: the legacy of the problems caused by preceding generations, and technologies that do not reflect more modern thinking. In this blog we set out to explore how different human values are ‘baked into’ technologies. When humans are ‘in the loop’ of technology-driven decision making, what are their interests, and who is being left out?

The challenging legacy for new generations

As the automation and autonomy of technology increase, we are becoming more aware of the diversity of society: not just diversity by scientific classification, but by softer cultural and temporal patterns. Generations of people of proximate age, also known as age cohorts, are defined by shared experiences and values.

The younger generations coming into power today have a problematic inheritance: the legacy of the problems caused by preceding generations, for example climate change, and technologies that do not reflect more modern thinking.

We set out to explore how different, and perhaps outmoded, human values are ‘baked into’ technologies used in the present and the future. When humans are ‘in the loop’ – programmatically, through dataset priming or actively – of technology-driven decision making, what are their interests, and who is being left out?

Addressing the problem

In response to feedback from our Imperial Business Partners Members in 2023, we have set out to explore intergenerational differences in relation to technology design in our latest 20+ Futures activity.

Generations: humans in the futures loop

The concept of ‘generations’ that we will be exploring throughout this work comes from the generational theories of William Strauss and Neil Howe, as well as Pew Research. Their generational archetypes are widely known and used, and many of you will fall into these categories, whether you like the labels or not: for example Baby Boomers, Generation X, Millennials etc.

While multiple generations are active concurrently at any one time, there are shifts, or times of change, in which generations dominate positions of power.

Nowadays, for example, Baby Boomers (b.1946-1964) and Generation X (b.1965-1980) are declining in numbers and in positions of power, having shaped much of society, politics and economics throughout the late 20th and early 21st centuries. New values are coming through from the rising wave of the Millennial generation (b.1981-1996) and, following them and keeping the wave going, Gen Z / Homeland (b.1997-present).

“I’m not a fan of nostalgia. Wouldn’t we just love things to be the way we imagine they used to be, but actually weren’t?” – Mr. B, The Gentleman Rhymer

The descending generations, Baby Boomers and Generation X, have presided over much planetary exploitation and consumption. As they shuffle sheepishly and gloomily offstage, they leave the ascending generations a legacy of problems, as catalogued by the World Economic Forum (WEF), for example:

  • extreme weather events
  • AI generated misinformation and disinformation
  • societal and political polarization
  • a cost-of-living crisis
  • rising cyberattacks

These are ‘used futures’: people are being told what the future is going to be like for them because it has already been prescribed. Even if those futures don’t really work!

The ascending generations not only have to cope with the legacy of the above problems but, due to economic stagnation, are increasingly reliant on inherited wealth. They also inherit a business ecosystem that exists on a life support of ‘wizard money’, such as Quantitative Easing, and of cost-cutting. Their future was described as a ‘legacy of broken promises’ at the 2024 WEF in Davos.

Software, AI technology and their relationship with humans today

We are particularly interested in software because it is commonly used, has shorter development cycles and underpins much of the affluent world. It is also a channel through which generational bias and assumptions are unconsciously embedded, from concept through design to realisation, unless conscious checks are performed. AI can carry bias, at times even racist bias, from the data used to train or prime it.
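One conscious check of the kind mentioned above is an audit of training data before it shapes a system. The sketch below is purely illustrative, not a real pipeline: the `cohort` field, the example records and the 5% threshold are all hypothetical, chosen to show how skewed representation of an age cohort in a dataset could be surfaced programmatically.

```python
from collections import Counter

def representation_audit(records, field, threshold=0.05):
    """Flag groups that make up less than `threshold` of a dataset.

    `records` is a list of dicts; `field` is the demographic attribute
    (here, a hypothetical age-cohort label) to audit. Returns each
    group's share of the data and the set of under-represented groups.
    """
    counts = Counter(r[field] for r in records)
    total = sum(counts.values())
    shares = {group: n / total for group, n in counts.items()}
    underrepresented = {g for g, s in shares.items() if s < threshold}
    return shares, underrepresented

# Hypothetical training records labelled by age cohort:
data = (
    [{"cohort": "Boomer"}] * 40
    + [{"cohort": "Gen X"}] * 35
    + [{"cohort": "Millennial"}] * 23
    + [{"cohort": "Gen Z"}] * 2
)
shares, flagged = representation_audit(data, "cohort")
print(flagged)  # Gen Z falls below the 5% threshold
```

A check like this only measures who is present in the data; it says nothing about how each group is portrayed, which is a harder and equally important audit.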

As AI is released to the crowd through user-friendly platforms like ChatGPT and Midjourney, it no longer requires experts to apply it for the end user. As with microcomputing, the worldwide web, social media, mobile phones and drones, there will be an explosion of experiments and meaning-making. At the individual level, for example, personal assistants, teachers, specialist coaches, life coaches, psychological advisors and even virtual companions may be deployed using this technology: a means for individuals to take more control over their time, and to manage the tedious choice-making and complexity of a society facing peak consumption.

A great shift in technology is taking place: through the deployment of new satellite networks, it is becoming more ubiquitous, interconnected and available.

Technology will become more autonomous not only in smaller systems but also in large, system-wide ones. We see a glimmer of this in the increasing automation of stock market systems.

The Loop: where do we find humans in software today?

As noted above, there are generational biases in systems and in ecosystems. In the military, increasingly automatic and autonomous systems are being used in overt and covert operations. In them, a human is often brought ‘in(to) the loop’ of important decisions, such as those that may cause death: the human makes the final decision. Humans have individual values, for example a particular religious upbringing or political leaning, that colour their outlook on life.

As we automate more and more of our systems, especially civilian ones, where are the humans in the loop? Who are these humans? What generational values are embedded in these systems and proxies for people? We might also think about AI proxies for non-humans: the design company Superflux proposed an AI to represent the planet in a piece of work for DEFRA about water management solutions.

The technocultural pollution challenge: is there really a problem?

A term we use for situations where human values are implicitly rather than explicitly transmitted is ‘technocultural pollution’.

Over the decades of digital technology, we have seen human values shape systems and even carry them into different domains. For example, an Enterprise Resource Planning (ERP) system defined and built in Germany for manufacturing is deployed worldwide. The industrial-age management mindset, and its language, deploys with it when applied to an Information Age organisation developing software: the language of industrial production is applied to human resource efficiency, while software companies think instead in terms of creativity, skills culture and effectiveness.

A more modern example is the App Store. In more liberal, democratic countries we used to have one-to-one relationships with software providers. Now, driven mostly by mobile devices, we have moved to a mediated and federated App Store model, which gathers many providers under one umbrella while business relationships and policing are still handled one-to-one. The consumer has freedom of choice in this environment but also, paradoxically, a higher cognitive load in navigating it. There is also an increased risk of bad-actor vendors, or of vendors with a frail long-term business case built on investment rather than generated profits.

China has taken a different approach: the ‘Super App’, a government-owned and gated multi-provider system. The apps within it are government-sanctioned, and access requires a clear government citizen identifier. This approach may reflect a less liberal, state-run economy; however, the stronger validation of suppliers and citizen identity may solve some of the consumer fraud cases experienced in an App Store environment.

Ideology may therefore be carried by technologies. The mobile phone and similar devices are globally available and desired, and their business model, the App Store model they use, is not just practical but ideological. The desire for the technology in less liberal economies can create a less visible flow of an ideology of consumption and choice.

We ask ourselves though, is this a conscious design and implementation of ideals, or is it the unconscious emergence from technologies that become available?

This is just a short discussion of the cultural pollution that may take place without our conscious awareness of, or consideration for, the values underneath. In short, software platforms have more than functional value: they also reflect human values.

Problems to solve and opportunities to open the future

In the previous sections, we have talked about generations, legacy problems to solve and software design. There are indeed legacy problems to solve, and there is therefore great investment in ‘Pain Tech’: technologies, like carbon capture to deal with the emissions of internal combustion engine (ICE) vehicles, that exist to solve them. ‘Pain Tech’ finds it relatively easy to attract funding and attention: the problems often have strong evidence behind them and offer opportunities to measure impact. However, by focussing on tackling the immediate problems in front of us, we lose the space and opportunity to think beyond them into new futures, such as a world without personal commuting at all.

But what about getting ahead of future problems and opening future opportunities? What are the new futures that can be envisioned? What ‘Progress Tech’, technology that disintermediates old technologies and legacy business models, can be planned, designed and built for? Rather than trying to depopulate the future of problems, how do we populate it with opportunities? Do we consciously design for progress, or do we wait in hope that products, services and experiences will simply emerge?

The development of AI, the next wave of technology to be opened to the masses through platforms such as ChatGPT, will be hugely important. It is less programmed than trained, using masses of data, and here the data is the weak point: the bias we discussed earlier, and the choice and depth of the datasets used. We may still need AI to have conscious constraints when we know it will have incomplete, biased or simply incorrect data.
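One simple form such a conscious constraint could take is an abstention rule: a system that declines to answer when an input looks too unlike the data it was trained on. The sketch below is a toy illustration under assumed conditions, not a production guardrail; the word-overlap test, the example texts and the 50% threshold are all hypothetical stand-ins for the far richer checks a real system would need.

```python
def build_guardrail(training_texts, max_unknown_ratio=0.5):
    """Return a checker that abstains when a query looks too unlike
    the data the system was trained on -- a toy stand-in for the
    'conscious constraints' discussed above."""
    # Vocabulary actually seen during (hypothetical) training:
    vocab = {w for text in training_texts for w in text.lower().split()}

    def should_abstain(query):
        words = query.lower().split()
        if not words:
            return True  # nothing to go on: abstain
        unknown = sum(1 for w in words if w not in vocab)
        return unknown / len(words) > max_unknown_ratio

    return should_abstain

abstain = build_guardrail(["the cat sat on the mat", "dogs chase cats"])
print(abstain("the cat chased dogs"))           # -> False: mostly familiar words
print(abstain("quantum flux capacitor drift"))  # -> True: unfamiliar, so abstain
```

The design choice worth noting is that the constraint sits outside the model: it does not make the underlying system less biased, it only makes the system aware of the edge of its competence, which is exactly the humility the paragraph above argues for.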

AI is a technology that can be shaped with the future in mind, rather than limiting ourselves to improving a past model of consumption that has harmed, and continues to harm, the planet for human life. We introduced and explored this a couple of years ago in our 2041 Scenarios, ‘computation, energy and the planet’, anticipating the huge energy demands that increased technology use will bring.

Some of the great shapers of today’s world and tomorrow’s are clashing in this space. For example, Larry Page at Google wants to create intelligences greater than humans are capable of, while Elon Musk argues AI should be designed to augment human intelligence. It is a contested technology and ideology.

The bigger challenge of keeping humans in the loop

How do we stand on the shoulders of the past rather than topple, or be toppled, off them? At Imperial, we do this through the applied nature of our research, our strong start-up programme and our relationships through Imperial Enterprise and groups such as Imperial Business Partners. We exist to make our ideas land.

Our exploration

We will explore this intriguing problem by the following means:

  • Running an eLab Idea Challenge in June 2024 for Imperial students to ponder the possibilities of ‘Pain Tech’ and ‘Progress Tech’
  • Videos from Imperial’s experts on this idea proposition. We want our academic experts to challenge our own foresight-driven assumptions
  • A Tech Foresight Day in July 2024, for an audience of Imperial’s industry partners, where elements will touch upon the challenge of generational differences.

If you’d like to know more about anything discussed in this article, please contact Emilie Didier.