In a galaxy far, far away (well, California, actually), AI systems are being built that could reshape everything from how we work and bank to how we manage our health, and even how we age.
Engineers are training machines that solve complex problems, mimic human reasoning and bring lifelike robots one step closer to reality.
Still, while the AI emerging from Silicon Valley may resemble the futuristic technology of Star Wars, it lacks a familiar element of that narrative arc: the presence of an older, wiser guide. Luke had Obi-Wan. Oprah Winfrey had Maya Angelou. Bill Gates turned to Warren Buffett.
In the past, young tech visionaries often sought out experienced mentors to turn raw innovation into lasting impact. Steve Jobs, for example, mentored a young Mark Zuckerberg. Today, the future is being written largely by the young.
That generational imbalance is striking, especially considering who stands to be most affected.
How AI is missing the wisdom of older adults
According to AARP, 59% of Americans over 50 say today's technology is not designed with them in mind. That's a huge blind spot, considering that more than 120 million Americans are now over 50. By 2030, at least 21% of the population will be over 65, and the first wave of millennials will begin turning 50.
Steve Jobs once said, "Design is not just what it looks like and feels like. Design is how it works."
That's where many older adults feel left behind. "I think the biggest disconnect lies in what's prioritized in the technology itself," says Dr. Brittne Kakulla, senior research advisor at AARP. "Older adults prioritize functionality over flash, which can run counter to the tech industry's obsession with speed and novelty."
This raises a central question: How well can AI serve everyone when it is programmed mainly by the young, yet integrated into everything from health care to personal finance?
Researchers and advocates say the stakes are high. Excluding older people from AI design, testing and governance doesn't just overlook their needs; it can produce systems that fall short for everyone.
The ghost in the machine: How AI is trained, and why bias gets built in
Artificial intelligence systems are designed to recognize patterns and make decisions by analyzing vast amounts of data. Through machine learning, these systems "learn" from whatever they are given, whether that's language, lab results or consumer behavior, and generate output based on that information.
Simply put, what comes out depends on what goes in. If the input reflects bias, or if the team building the model isn't diverse, those flaws get built into the technology itself.
A review published in Nature identified age-related biases across the machine learning pipeline, from how data is selected to how models are evaluated.
Real-world examples are already raising red flags. A 2024 study found that AI chatbots powered by large language models (LLMs), including Copilot and Perplexity, frequently returned responses laced with age-related stereotypes. That's evidence the bias isn't theoretical; it's embedded in how the tools actually behave.
The malleability of these systems compounds the problem. Most large language models rely on massive datasets, yet their character and tone can be changed quickly. Elon Musk's Grok, for example, sparked a backlash after parroting a false narrative about "white genocide" in South Africa, inserting it even when users asked about unrelated topics like baseball.
How to give feedback
If you use LLMs or chatbots, keep an eye out for age-related content. If you spot something biased or inaccurate, you can usually flag it by clicking the thumbs-up or thumbs-down icon, or a "Report" button.
As the philosopher Matteo Pasquinelli told The New York Times, "AI needs us: living beings, producing constantly, feeding the machine. It needs the originality of our ideas and our lives."
But what happens if that perspective is incomplete?
What happens when older adults are left out?
Take health care, where AI is already playing a growing role in diagnosing illness, managing treatment plans and predicting medical outcomes.
A policy brief published by the World Health Organization warns that if these systems rely on data skewed toward younger populations, they can "discriminate systematically on a much larger scale than biased individuals." The authors note that AI tools may miss symptoms, delay diagnoses and reinforce disparities in care for older adults.
The consequences reach far beyond health care. At work, AI is rapidly reshaping hiring and skill expectations. According to job-listings data, one in four U.S. tech jobs now seeks candidates with artificial intelligence skills. Yet older workers are being left behind, not because of ability, but because of perception.
A survey by the nonprofit Generation found that only 32% of U.S. employers would consider candidates over the age of 60 for roles involving AI tools, compared with 90% who would consider applicants under 35.
When bias shapes how AI is built and trained, the result can be technology that is less accurate, less inclusive and less effective for everyone.
The case for inclusion: Why AI needs older adults
Of course, the stereotype that older people avoid technology doesn't hold up. In fact, AARP reports that generative AI use among Americans over 50 doubled, from 9% to 18%, in 2024, with another 30% excited about its potential.
Workplace adoption is rising, too. According to Generation, 15% of mid-career and older workers already use AI tools regularly, most of them self-taught "power users" who turn to the tools several times a week.
That growth in AI adoption, Dr. Kakulla says, suggests the technology is becoming ever more relevant to older adults.
That's why advocates argue that including older people in AI development could lead to tools that are more functional, more ethical and more widely usable.
That means building age-diverse teams in which mid- and late-career experts help identify blind spots; ensuring that training data reflects aging populations, not just digital natives; and creating AI-oversight roles that draw on the experience and judgment only decades in the workforce can provide.
Dr. Kakulla also stresses that tech companies need to account for diversity across all life stages and design with the entire lifespan in mind. She points to translation tools popular among older adults as a good example: The same AI feature can help someone in their 50s while traveling, support speech-to-text needs in their 70s and ease communication with caregivers in their 80s.
Ultimately, what AI becomes depends on who trains and guides it. Leaving older adults out of that process would repeat one of humanity's most enduring mistakes: discounting wisdom until it's too late.
As Benjamin Franklin once said, "Life's tragedy is that we get old too soon and wise too late."
It would be wise not to program that tragedy into our machines.