


It started with a high school typing course.
Wanda Woods enrolled because her father advised that typing proficiency would lead to jobs. Sure enough, the federal Environmental Protection Agency hired her as an after-school worker while she was still a junior.
Her supervisor “sat me down and put me on a machine called a word processor,” Ms. Woods, now 67, recalled. “It was big and bulky and used magnetic cards to store information. I thought, ‘I kinda like this.’”
Decades later, she was still liking it. In 2012 — the first year that more than half of Americans over 65 used the internet — she started a computer training business.
Now she is an instructor with Senior Planet in Denver, an AARP-supported effort to help older people learn and stay abreast of technology. Ms. Woods has no plans to retire. Staying involved with tech “keeps me in the know, too,” she said.
Some neuroscientists researching the effects of technology on older adults are inclined to agree. The first cohort of seniors to have contended — not always enthusiastically — with a digital society has reached the age when cognitive impairment becomes more common.
Given decades of alarms about technology’s threats to our brains and well-being — sometimes called “digital dementia” — one might expect to start seeing negative effects.
The opposite appears true. “Among the digital pioneer generation, use of everyday digital technology has been associated with reduced risk of cognitive impairment and dementia,” said Michael Scullin, a cognitive neuroscientist at Baylor University.
It’s akin to hearing from a nutritionist that bacon is good for you.
“It flips the script that technology is always bad,” said Dr. Murali Doraiswamy, director of the Neurocognitive Disorders Program at Duke University, who was not involved with the study. “It’s refreshing and provocative and poses a hypothesis that deserves further research.”
Dr. Scullin and Jared Benge, a neuropsychologist at the University of Texas at Austin, were co-authors of a recent analysis investigating the effects of technology use on people over 50 (average age: 69).
They found that those who used computers, smartphones, the internet or a mix did better on cognitive tests, with lower rates of cognitive impairment or dementia diagnoses, than those who avoided technology or used it less often.
“Normally, you see a lot of variability across studies,” Dr. Scullin said. But in this analysis of 57 studies involving more than 411,000 seniors, published in Nature Human Behaviour, almost 90 percent of the studies found that technology had a protective cognitive effect.
Much of the apprehension about technology and cognition arose from research on children and adolescents, whose brains are still developing.
“There’s pretty compelling data that difficulties can emerge with attention or mental health or behavioral problems” when young people are overexposed to screens and digital devices, Dr. Scullin said.
Older adults’ brains are also malleable, but less so. And those who began grappling with technology in midlife had already learned “foundational abilities and skills,” Dr. Scullin said.
Then, to participate in a swiftly evolving society, they had to learn a whole lot more.
Years of online brain-training experiments, each lasting a few weeks or months, have produced varying results. Often, such training improves the ability to perform the task in question without enhancing other skills.
“I tend to be pretty skeptical” of their benefit, said Walter Boot, a psychologist at the Center on Aging and Behavioral Research at Weill Cornell Medicine. “Cognition is really hard to change.”
The new analysis, however, reflects “technology use in the wild,” he said, with adults “having to adapt to a rapidly changing technological environment” over several decades. He found the study’s conclusions “plausible.”
Analyses like this can’t determine causality. Does technology improve older people’s cognition, or do people with low cognitive ability avoid technology? Is tech adoption just a proxy for enough wealth to buy a laptop?
“We still don’t know if it’s chicken or egg,” Dr. Doraiswamy said.
Yet when Dr. Scullin and Dr. Benge accounted for health, education, socioeconomic status and other demographic variables, they still found significantly higher cognitive ability among older digital technology users.
What might explain the apparent connection?
“These devices represent complex new challenges,” Dr. Scullin said. “If you don’t give up on them, if you push through the frustration, you’re engaging in the same challenges that studies have shown to be cognitively beneficial.”
Even handling the constant updates, the troubleshooting and the sometimes maddening new operating systems might prove advantageous. “Having to relearn something is another positive mental challenge,” he said.
Digital technology may also protect brain health by fostering social connections, which are known to help stave off cognitive decline. Or its reminders and prompts could partially compensate for memory loss, as Dr. Scullin and Dr. Benge found in a smartphone study, while its apps help preserve functional abilities like shopping and banking.
Numerous studies have shown that while the number of people with dementia is increasing as the population ages, the proportion of older adults who develop dementia has been falling in the United States and in several European countries.
Researchers have attributed the decline to a variety of factors, including reduced smoking, higher education levels and better blood pressure treatments. Possibly, Dr. Doraiswamy said, engaging with technology has been part of the pattern.
Of course, digital technologies present risks, too. Online fraud and scams target older adults, and while they are less apt to report fraud losses than younger people, the amounts they lose are much higher, according to the Federal Trade Commission. Disinformation poses its own hazards.
And as with users of any age, more is not necessarily better.
“If you’re bingeing Netflix 10 hours a day, you may lose social connections,” Dr. Doraiswamy pointed out. Technology, he noted, cannot “substitute for other brain-healthy activities” like exercising and eating sensibly.
An unanswered question: Will this supposed benefit extend to subsequent generations, digital natives more comfortable with the technology their grandparents often labored over? “The technology is not static — it still changes,” Dr. Boot said. “So maybe it’s not a one-time effect.”
But the change tech has wrought “follows a pattern,” he added. “A new technology gets introduced, and there’s a kind of panic.”
From television and video games to the latest and perhaps scariest development, artificial intelligence, “a lot of it is an overblown initial reaction,” he said. “Then, over time, we see it’s not so bad and may actually have benefits.”
Like most people her age, Ms. Woods grew up in an analog world of paper checks and paper maps. But as she moved from one employer to another through the ’80s and ’90s, she progressed to IBM desktops and mastered Lotus 1-2-3 and Windows 3.1.
Along the way, her personal life turned digital, too: a home desktop when her sons needed one for school, a cellphone after she and her husband couldn’t summon help for a roadside flat, a smartwatch to track her steps.
These days Ms. Woods pays bills and shops online, uses a digital calendar and group-texts her relatives. And she seems unafraid of A.I., the most earthshaking new tech.
Last year, Ms. Woods turned to A.I. chatbots like Gemini and ChatGPT to plan an R.V. excursion to South Carolina. Now, she’s using them to arrange a family cruise celebrating her fiftieth wedding anniversary.
The New Old Age is produced through a partnership with KFF Health News.