Ageing and inequalities

As the world evolves, we need to reconsider which inequalities we will have to deal with while ageing.


The launch of the Centre for Ageing and Inequalities at Newcastle University marks the achievement, or – if you prefer – the laying of an important milestone in research on ageing. The subject, inequalities linked to the different phases of life, is not a new one, and a good corpus of literature is available on it[1]. Inequalities in quality of life, health, and resources tend to increase with age, reflecting processes of cumulative dis/advantage and the socioeconomic gradient. The research approach has thus far been oriented towards exploring the dimensions of health, the digital divide, social inclusion, and financial stability through rather classical parameters of a mainly statistical and socio-demographic nature: race, gender, geography, and social class. All these variables, certainly divisive (as we well know), are now widely explored, scientifically measured, and based on quite solid data, and they have become basic assumptions in our reasoning and in the policies affecting inequalities in old age.

However, the social, economic, political, and technological contexts in which we are evolving call for a multidisciplinary exploration of these same factors: a journey both wide and deep, one that highlights background correlations of influence between domains that traditional approaches – whether through culture, methodology, or siloed visions – tend not to seek out. This is what the Centre, co-directed by the brilliant Professor Tom Scharf and Professor Shirley Jordan, intends to do.

The launch of the Centre is an opportunity to stimulate a debate on how the context of inequalities affecting older adults is evolving relative to what we have traditionally defined up to now, and on what the challenges and opportunities are that are not so much in front of us as already upon us. These are challenges present today, ones that we have made our own at NICA. So, let me outline six hotspots on which we shall focus our energy in our contribution to the newborn Centre.

First, it’s time to go beyond “multidisciplinarity”, as we used to call it

No single application, device, pill, piece of legislation, or other magic solution can address the diverse opportunities and challenges of an ageing population. Working with citizens and communities to grow understanding and insights requires a holistic and empathetic approach, one that considers the challenge by reading between the lines of life.

Consequently, there is no discourse on this subject that does not directly involve older adults, both as a source and as an endpoint of the context in which inequalities are explicitly perpetrated. On this methodological basis we aim to go beyond the concept of “multidisciplinary” as usually intended – “involving researchers from different disciplines” – and to include those uncoded contexts and disciplines derived from unexplored data-driven applications, increased lifespan experiences, cross-domain professions (for example, the growing community of AI Ethics Developers), and knowledge derived from real-life applications. Understanding and coding inequalities, and therefore designing novel strategies to practically help older adults deal with them, will require brand-new skills, tools, business models, breakthrough collaboration, and an openness to new ways of thinking. What is happening “out there”, in a society deeply influenced by digital interactions that clearly seem to drive the behaviours of contemporary society itself? If we want to really tackle the upcoming inequalities, we must look there, at those dynamics, at a model fuelled by Bauman’s sharpest analysis: “we are what we buy”. Ageing creates so many intertwined challenges and opportunities, and so there must be an equally comprehensive and innovative range of responses.

Let’s remind ourselves of the core: it is a matter of power

Aware of its growing necessity, and driven by today’s prevailing techno-centrism, we have probably focused much of our most recent attention on the implications of machine ethics and algorithmic biases. To be clear, we have only scratched the surface of crucial topics for an Ethical AI, such as accountability, value alignment, explainability, fairness, and user data rights, and we are only at the beginning of a journey that is expected to be as interesting as it is complicated. Yet it already seems time to take a sidestep and agree on a fundamental yet apparently forgotten fact, a decisive one when it comes to inequalities. The discourse on technology – and artificial intelligence in particular – and its applications in the actions of our daily lives is a discourse that has to do with power. We are looking at a profound concentration of power in extraordinarily few hands, a power that spans from the economy to surveillance. And we already have evidence that the trend is to further exacerbate this concentration, to the detriment of people’s interests.

As said, we should probably step aside and look at where power sits now, at how its dynamics play out in our daily digital life, and at how those dynamics are a source of increasing inequalities towards all of us, and towards an ageing population. The risk zones[2] of data control & monetization; implicit trust & user understanding; the surveillance state; truth, disinformation, and propaganda; and hateful & criminal actors are the fertile ground where Inequalities 2.0 will flourish and show all their deadly effects.

The black boxes: are we ready for a world in which…

How do those risk zones affect our everyday lives in practice, and how could they become the “next inequalities”? Where do the risks hide in which inequalities widen until they become not so much a social phenomenon to be observed as a very specific and precise attack on our autonomy as individuals, on our free will, on our personal dignity? Because at the point where we are now, inequality, the unconditional exercise of power, domination over minorities and over those who are vulnerable, less informed, or less educated to perceive the above risks, control, and the attack on individual freedom are factors insidiously grafted onto what appear to be little, innocent actions that hundreds of millions of people, including older adults in increasing numbers, perform every day. Given their intangible and interstitial nature, these actions have not yet been adequately mapped out and made the subject of in-depth and widespread study.

In other words, are we ready for a world in which…?[3]

  • Video-faking algorithms are so advanced that faked videos are impossible to distinguish from real footage, influencing more and more people’s (and their peers’) opinions, choices, and decisions.
  • Conversation bots have been trained to imitate specific people, using data sets collected from public social media posts.
  • Automation could eliminate millions of jobs, putting already marginalized communities even further at risk.
  • In the workplace, algorithms can identify individuals likely suffering from various mental illnesses, from depression to sociopathy. These services also predict who may develop symptoms of mental illness soon, based on trends in the individuals’ social postings. This data is used to offer support and resources to current employees, recommend reassignment when necessary, and suggest hiring/firing decisions.
  • “Predictive justice” tools become the preferred method for determining prison sentences.
  • Mortgage rates, loan approvals, and credit access are decided based on deep data collected through social platforms. These decisions take into consideration the credit histories of close friends and family, locations visited (including frequency of visits to places like bars and legal marijuana dispensaries), and “semantic analysis” of messages and photos to indicate whether individuals are generally happy, angry, anxious, or depressed.
  • Facial recognition technology is a mainstream tool available to any individual or organization. Subscribers can tap into a database with hundreds of millions of faces indexed and clearly recognizable, discriminable by wrinkles, for example.
  • Free health insurance is offered to anyone who agrees to install a smart toilet in their home and submit its data to the company. Smart toilets can detect stress hormones, infectious diseases, alcohol and drug use, and blood sugar levels, among many other things.
  • Twenty-five percent of online orders are delivered by drone. Many of these drones are fitted with cameras and other sensors to collect data as they fly over neighbourhoods, providing an additional revenue stream for shippers and merchants. Individuals who opt for free, unlimited drone delivery consent to the collection of data from their homes and yards. Entire neighbourhoods where drone delivery is legally permitted are subject to the same data collection activities—even though not all their individual residents or households have explicitly consented.
  • Self-driving vehicles become vulnerable to a new type of real-time ransomware. Hackers access the car remotely, turn off the engine, and refuse to start the car again until the driver pays a ransom.

Someone may argue that the above state of the nation affects everyone, not only older adults. Which is my point: are we sure? Which demographic is growing fastest in social media usage, in buying (legal) marijuana, in being challenged in the workplace, in buying electric autonomous vehicles, at higher risk of cognitive disease, slowed by a natural decline in information processing, empowered to “age in place” thanks to IoT- and AI-based solutions, served by remote doctors, and cared for by automated systems?

The other side of the coin

It is easy to blame the evolution of technology today, as it is easy to list the threats older adults are exposed to. What about actually fighting them?

One thing I think we can improve on is to avoid focusing only on identifying inequalities and thinking that it is up to others to find solutions to combat them. The cycle of proliferation of inequalities linked to the evolution of technology is too fast for traditional research. It is therefore appropriate for research not only to ‘map’, but also to explore techniques for suggesting how to respond to the risks of new forms of inequality. How? By using the same weapons with which the risks of inequalities are generated, and thus by using, for example, artificial intelligence to identify the very risks to which we are exposed.

Two examples speak louder than a thousand words:

  • About fake news and the inequality generated by truth, disinformation, and propaganda
    • We assume that computer-generated text fools humans by sticking to the most likely words at each position. In contrast, natural writing more frequently selects unpredictable words that nonetheless make sense in the domain. That means we can detect whether a text looks too predictable to have come from a human writer. The Giant Language model Test Room, or GLTR, turns the same models that are used to generate fake text into a tool for detection.
  • About the discrimination towards older adults in recruitment online
    • Employment platforms (e.g., Indeed and LinkedIn) are now among the primary mechanisms for job posting, job search, and initial negotiations. The exposure of a job advert has obvious benefits for the employer, but it also has the power to alienate and exclude large portions of society. In particular, the word choice of a single job advert can, perhaps unintentionally, exclude thousands of people by their individual traits, like gender and race. Age is a trait that deserves particular attention, as ageism is often cited in the literature as being overlooked, misunderstood, and generally escaping social awareness. The Exclusion Spotter gives feedback to recruiters and employers on which words in their advert are ageist and possibly excluding people by age.
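The detection idea behind GLTR can be illustrated with a minimal sketch. GLTR itself uses a large language model such as GPT-2; here, purely as a toy stand-in, a bigram model trained on a tiny corpus plays that role. Every word is scored by its rank among the model’s predictions for the preceding word, and text whose words are consistently the model’s top picks is flagged as likely machine-generated. All names and the threshold below are illustrative assumptions, not part of the real tool.

```python
from collections import Counter, defaultdict

def train_bigram(corpus_tokens):
    """Count next-word frequencies for each preceding word."""
    model = defaultdict(Counter)
    for prev, nxt in zip(corpus_tokens, corpus_tokens[1:]):
        model[prev][nxt] += 1
    return model

def mean_token_rank(model, tokens):
    """Average rank of each token among the model's predictions for
    its predecessor (rank 1 = most likely). Unseen pairs get a large
    penalty rank."""
    PENALTY = 1000
    ranks = []
    for prev, nxt in zip(tokens, tokens[1:]):
        candidates = [w for w, _ in model[prev].most_common()]
        ranks.append(candidates.index(nxt) + 1 if nxt in candidates else PENALTY)
    return sum(ranks) / len(ranks)

def looks_generated(model, tokens, threshold=2.0):
    """Flag text whose words are consistently the model's top picks."""
    return mean_token_rank(model, tokens) <= threshold
```

A text built only from the model’s most likely continuations scores a low mean rank and is flagged, while a text that takes less predictable turns scores higher and passes as human: the same asymmetry GLTR visualises at the scale of a full language model.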
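The idea behind a tool like the Exclusion Spotter can likewise be sketched in a few lines. The word list below is purely hypothetical (this text does not specify the real tool’s lexicon, which would need to be curated and evidence-based); the sketch simply scans an advert for age-coded terms and reports where each occurs, so a recruiter can reconsider them.

```python
import re

# Hypothetical list of age-coded terms; a real tool would rely on a
# curated, evidence-based lexicon.
AGE_CODED_TERMS = {
    "young", "youthful", "energetic", "digital native",
    "recent graduate", "fresh",
}

def spot_age_coded_terms(advert, lexicon=AGE_CODED_TERMS):
    """Return the age-coded terms found in a job advert, with the
    character offset of each match, ordered by position."""
    findings = []
    text = advert.lower()
    for term in sorted(lexicon):
        for match in re.finditer(r"\b" + re.escape(term) + r"\b", text):
            findings.append((term, match.start()))
    return sorted(findings, key=lambda f: f[1])
```

For example, an advert seeking a “young, energetic team” of “digital natives” would be flagged on all three terms, giving the employer a chance to rephrase before thousands of older candidates silently self-deselect.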

The above are just two projects among the many already deployed and the many others we should suggest, promote, nurture, sustain, and fund. I am profoundly convinced that merely standing on one side of the fence is no longer an option in the fight against inequality: the time has come to act, and to do so within a process that unites research and business in the same effort.

It’s time to raise the issue of ageing and human rights

Too often older people are seen as a homogenous group, without an appreciation of individual aspirations, capabilities, vulnerabilities, and contributions within diverse communities. This acts as a barrier to understanding and connecting with a significant and growing part of our population, and it is reinforced by ageist stereotypes, assumptions, and narrow methods of engagement. Over the last decade there have been important national and international contributions that have explored the human rights of older people and identified possible gaps and how to address them (for example, the UN Open-Ended Working Group on Ageing and the International Older Persons’ Human Rights Index – IOPHRI). We argue that the actual relationship between the formal human rights of older people, ethics – as values and standards that prescribe what we ought to do – and the real world is complex and not yet fully explored. There is a real opportunity to develop better insights and understanding of the opportunities and challenges of an ageing population by exploring how ethics are developed and practised across different policy and societal domains for this diverse population. Perspectives that recognise ethics as situated and relational across social, cultural, and political contexts can offer deep insights into human experience and everyday lives.

So, join us. This is not a NICA or Newcastle University initiative, it is a global call to all interested in the future of our society.

References

[1] Google reports more than 6,650,000 results for the “ageing+inequalities” query. The GSA journals alone count more than 2,000 papers and articles on the topic.

[2] Abstract from “The Ethical OS”.

[3] The following is an abstract from ibidem.