Watching too much TV can shrink your BRAIN and lead to a decline in memory, study warns

  • TV viewing is a type of sedentary behaviour that doesn’t require much thought  
  • High amounts of viewing in midlife are linked to declines in cognitive function
  • This can also result in a reduction in the amount of grey matter – or brain size 
  • Researchers said moderating TV viewing in midlife could improve brain health later

Watching a lot of television in middle age can cause your brain to shrink and lead to a decline in your ability to think properly, according to a new study.

Researchers from the University of Alabama at Birmingham and Columbia University in New York examined self-reported television viewing habits and related them to the brain size and health of groups of volunteers in their 50s and 70s.

Exercise and physical activity have been shown to benefit brain health, but the researchers wanted to find out the impact of sedentary behaviour on the brain.

They found that spending a lot of time glued to the television during midlife was linked to greater cognitive decline and a drop in brain size later in life.

Those who viewed a lot of television in midlife had a 6.9 per cent decline in cognitive function by the time they were in their 70s and a 0.5 per cent reduction in grey matter compared to those who watched little television. 

Engaging in healthy behaviours between 45 and 64, including limiting TV time, ‘may be important factors to support a healthy brain later in life,’ the team said. 

WHAT IS DEMENTIA? THE KILLER DISEASE THAT ROBS SUFFERERS OF THEIR MEMORIES 

A GLOBAL CONCERN 

Dementia is an umbrella term used to describe a range of progressive neurological disorders (those affecting the brain) which impact memory, thinking and behaviour. 

There are many different types of dementia, of which Alzheimer’s disease is the most common.

Some people may have a combination of types of dementia.

Regardless of which type is diagnosed, each person will experience their dementia in their own unique way.

Dementia is a global concern but it is most often seen in wealthier countries, where people are likely to live into very old age.

HOW MANY PEOPLE ARE AFFECTED?

The Alzheimer’s Society reports there are more than 850,000 people living with dementia in the UK today, of whom more than 500,000 have Alzheimer’s.

It is estimated that the number of people living with dementia in the UK by 2025 will rise to over 1 million.

In the US, it’s estimated there are 5.5 million Alzheimer’s sufferers. A similar percentage rise is expected in the coming years.

As a person’s age increases, so does the risk of them developing dementia.

Rates of diagnosis are improving but many people with dementia are thought to still be undiagnosed.

IS THERE A CURE?

Currently there is no cure for dementia.

But new drugs can slow down its progression and the earlier it is spotted the more effective treatments are.

Source: Alzheimer’s Society 

Cognition includes the abilities to remember, think, reason, communicate and solve problems, the researchers said. As life expectancy increases, so does the prevalence of cognitive impairment and dementia.

An ageing population with multiple factors that do not support a healthy brain may lead to an increased number of people with dementia, the team said.

Worldwide, more than seven million new dementia cases are diagnosed annually, and by 2050 the prevalence of the disease is projected to increase by up to 264 per cent.

While there is no cure for dementia, a recent report showed that nearly 40 per cent of worldwide diagnoses may be prevented or delayed.

This can be done by modifying twelve risk factors, such as physical inactivity, according to one of the study authors, Priya Palta of Columbia University.

To better understand the effects of sedentary behaviour during midlife on brain health, the teams looked at television viewing information collected during midlife – 45 to 64 – from participants in a wider US neurocognitive study. 

Participants were asked how much they watched television in leisure time, with self-reported responses not based on specific hours but a general timeframe.

They were asked whether they never or seldom watched TV (low), sometimes watched TV (medium/moderate) or often/very often watched TV (high). 

Palta’s study focused on cognitive decline and risk of dementia, while a linked study by Kelley Pettee Gabriel of the University of Alabama focused on structural brain markers from brain imaging scans.

In the study by Palta there were 10,700 adult participants, with an average age of 59 and mostly female, who provided self-reported assessments of their TV habits.

These were completed during three visits between 1987 and 1995, with 6,463 participants reporting that their viewing habits hadn’t changed over the eight years.

They received additional cognitive tests of working memory, language and executive function/processing speed in 1998 and again in 2013.

In this study, the researchers found that compared to people with low viewing habits, those with moderate or high viewing had a 6.9 per cent decline in cognitive function over the 15 years from first visit to final tests. 

High amounts of television viewing were not notably associated with a higher dementia risk, just a decline in general cognitive function.

Participants’ reported physical activity and exercise habits did not appear to alter the relationship between time spent watching television during midlife and changes in cognitive function and risk of dementia, the authors discovered.

Gabriel’s study included 1,601 adults with an average age of 76 from the same group of participants but who also underwent an MRI scan of their brain in 2013. 

WHEN DO HUMAN BRAINS BECOME ‘OLD’?

The human brain becomes ‘old’ at just 25, research suggested in February 2017.

Cerebrospinal fluid (CSF), which is found in the brain and spinal cord, changes its speed of movement in people older than their mid-20s, a Lancaster University study found.

These movements are linked to breathing and heart rates, with CSF changes previously being associated with conditions such as multiple sclerosis and high blood pressure.

It is unclear if these CSF changes are associated with brain disorders that typically affect the elderly, such as dementia.

Previous research suggests the volume and weight of the brain begins to decline by around five per cent per decade when a person reaches 40 years old.

On the back of these findings, study author Professor Aneta Stefanovska added further research ‘may open up new frontiers in the understanding and diagnosis of various neurodegenerative and ageing-related diseases to improve diagnostic procedures and patient prognosis.’

The discovery came to light during the development of a new method of investigating brain function, which has revealed the stage in life when the brain starts to deteriorate. 

Previous research carried out by Imperial College London suggests the brain’s grey matter, which enables the organ to function, shrinks during middle age, a change related to cell death.

White matter, which enables communication between nerve clusters, also appears to decline at around 40.

This is also when deterioration of the myelin sheath occurs. The myelin sheath is a fatty substance that surrounds nerve cells and ensures proper function of the nervous system.

These changes are thought to occur due to a reduction in the hormones dopamine and serotonin. 

Of this group, 971 people reported persistent levels of television viewing over the eight years the surveys were conducted. 

Using the brain MRI scans, researchers looked at several structural brain markers, including deep grey matter volume in the brain of each participant. 

Grey matter is the darker tissue of the brain and spinal cord and it is involved in muscle control, seeing and hearing, decision-making and other brain functions. 

The higher a person’s volume of brain grey matter, the better cognitive skills they typically have, according to the study authors.

In this study, researchers found that, compared to participants who said they rarely watched TV, those who watched a lot had lower volumes of grey matter.

This indicates greater brain atrophy, or deterioration, according to Gabriel.

The association between the level of TV viewing and grey matter volume was stronger among those who watched television persistently throughout midlife, she said.

Specifically, compared to people who said they never or seldom watched TV, people who reported they sometimes or often/very often watched TV had lower volumes of deep grey matter in late life.

The participants’ self-reported physical activity and exercise habits did not change the associations between the level of television viewing during midlife and brain structure measures of grey matter.

‘Our findings suggest that the amount of television viewing, a type of sedentary behaviour, may be related to cognitive decline and imaging markers of brain health,’ explained Palta, talking of the combined studies. 

‘Therefore, reducing sedentary behaviours, such as television viewing, may be an important lifestyle modification target to support optimal brain health,’ Palta said.

‘In the context of cognitive and brain health, not all sedentary behaviours are equal,’ added another study author Ryan Dougherty of Johns Hopkins University.

‘Non-stimulating sedentary activities such as television viewing are linked to greater risk of developing cognitive impairment, whereas cognitively stimulating sedentary activities [such as reading, computer and board games] are associated with maintained cognition and reduced likelihood of dementia,’ he explained.

‘Considering the contextual differences in varying sedentary behaviours is critical when investigating cognitive and brain health.’

IS THERE A PILL FOR ALZHEIMER’S DISEASE?

A breakthrough Alzheimer’s drug edges scientists one step closer to a cure, new research suggested in November 2017.

Taken twice a day, a tablet, known as LMTX, significantly improves dementia sufferers’ brain injuries to the extent their MRI scans resemble those of healthy people after just nine months, a study found.

Lead author Professor Gordon Wilcock from the University of Oxford told MailOnline: ‘I haven’t seen such brain injury recovery before after a drug treatment.’

LMTX, which is under investigation, also significantly improves patients’ abilities to carry out everyday tasks such as bathing and dressing themselves, while also boosting their capabilities to correctly name objects and remember the date, the research adds.

The drug contains a chemical that dissolves protein ‘tangles’ in the brain that clump together to form plaques in the region associated with memory, according to its manufacturer TauRx Pharmaceuticals.

Dissolving these tangles and preventing the formation of new plaques may slow or even halt memory loss in dementia sufferers, the pharma company adds. 

The researchers, from the universities of Oxford and Aberdeen, analysed 800 Alzheimer’s patients across 12 countries.

The study’s participants received either 100mg or 4mg LMTX tablets twice a day for 18 months.

They were tested on their ability to name objects, follow commands such as ‘make a fist’, recall items from a list of 10 and identify their name, the time and date.

Their ability to eat without help, use a telephone, wash and dress themselves, and control their bowel and bladder was also assessed.

MRI scans monitored the participants’ brain injury. 

Dougherty’s study, which explored the risk of heart attack in young adults, also found a link between TV viewing and the volume of grey matter in the brain.

The data came from the same dataset as the previous two studies but focused on 599 participants, with an average age of 30 when they were quizzed about their viewing habits and 50 when they underwent follow-up examinations.

During the 20-year period, from 1990 to 2011, volunteers had follow-up visits every five years, at which they reported how many hours of TV they watched per day in the previous year.

Twenty years into the study, MRI scans were taken to assess structural measures of grey matter in the brain, and the researchers found that greater television viewing in mid-adulthood was also associated with lower grey matter volume by midlife.

Considering the effect estimates, a one-hour increase in mean television time was associated with approximately a 0.5 per cent reduction in grey matter volume.

As with Gabriel’s study, the participants’ physical activity and exercise habits did not affect the association between the level of television viewing during midlife and brain structure measures of grey matter.

‘In our findings, television viewing remained associated with cognitive function and grey matter volume after accounting for physical activity, suggesting that this sedentary behaviour may impart a unique risk with respect to brain and cognitive health,’ Dougherty said. 

‘This is an important finding, since it is now well accepted that the neurobiology of dementia, including brain atrophy, begins during midlife.

‘That’s a period where modifiable behaviours such as excessive television viewing can be targeted and reduced to promote healthy brain ageing.’

The three researchers agreed there is a need to identify modifiable behaviours, such as excessive television viewing, that may be targeted prior to the development of cognitive impairment to offset the risk of dementia. 

Promoting healthy brain ageing is important, they said, particularly given current trends in television viewing and binge-watching behaviours.

A limitation of these studies is that television viewing was self-reported by participants, which may not be accurate.

Television viewing is only one type of sedentary behaviour and provides an incomplete picture of total sedentary time.

‘This research is very timely and important in the midst of the current COVID-19 pandemic because we know people are spending more time engaging in sedentary behaviours,’ said American Heart Association president Mitchell SV Elkind.

‘These are interesting correlations among television viewing, cognitive decline and brain structure. Television viewing is just one type of sedentary behaviour yet it’s easy to modify and could make a big difference in maintaining brain health.’ 

The three studies are being presented at the American Heart Association Epidemiology, Prevention, Lifestyle & Cardiometabolic Health Conference. 

Relief for parents? Study of over 400,000 teens reveals social media is ‘no more harmful’ to youngsters’ mental health than TV was in the 1990s 

Using social media is ‘no more harmful’ to young people’s mental health than watching TV was to youngsters in the 1990s, a new study claims. 

Researchers from Oxford University used data from three large surveys to look into the lives of more than 400,000 young people in the UK and US.

It is popularly believed that new technology, particularly social media, is responsible for declining mental health among young people and a range of other social ills. 

The team explored the associations between technology use and mental health problems in teenagers, declaring the link between the two is ‘thin at best’.  

They found some limited link between emotional problems and social media, but no ‘smoking gun’ pointing to any wider mental health problems linked to its use. 

Lead author Dr Matt Vuorre says concerns of this type are not new, nor are they well justified by current data.

He compared the ‘fear of social media’ to earlier warnings that children would get ‘square eyes’ if they watched too much television, or that radio would turn teens to a life of crime.

Then, as now, says Dr Vuorre, the popular idea does not appear to be supported by hard evidence, nor does it appear that technology use has become more harmful over time.

‘Any understanding of 21st-century adolescence would be incomplete without an appreciation of social-media platforms and other digital technologies, which have become an integral element of young people’s everyday lives over the past few decades,’ the team wrote. 

The research involved three large surveys of young people who reported on their personal use of technology and various mental health-related issues. 

Using this large data set, the team set about investigating links between technology use and mental health problems, and whether they have increased over time.

They studied this question by modelling four different mental health outcomes against three forms of technology use across three large nationally representative data sets.

From these eight models, they found one clinically relevant self-reported mental health outcome, depression, for which the links to technology use had become consistently less negative over time.

However, this decline was found for both television and social media. 

According to Dr Vuorre, these survey responses do not establish a smoking gun link between the use of technology and mental health issues.
