Neurology

Dr Linda Calabresi

Adolescent boys who struggle to understand how basic machines work, and young girls who have difficulty remembering words, are at increased risk of developing dementia when they’re older, new research has found.

According to the longitudinal study published in JAMA Network Open, lower mechanical reasoning in adolescence among boys was associated with a 17% higher risk of having dementia when they were 70. Among girls, it was lower word memory in adolescence that increased the odds of developing the degenerative disease.

It has been known for some time that the smarter you are throughout life, even as a child, the less likely you are to develop dementia. This is not a guarantee of protection – just a general trend. It has to do with cognitive reserve, the US researchers explain. “Based on the cognitive reserve hypothesis, high levels of cognitive functioning and reserve accumulated throughout the life course may protect against brain pathology and clinical manifestations of dementia,” they wrote.

This theory has been supported by a number of studies, such as the Scottish Mental Survey, which showed that lower mental ability at age 11 increased the risk of dementia down the track. What had been less well defined was whether any particular aspects of intelligence in young people were better predictors (or protectors) of dementia than others. This study goes some way to addressing that question. Researchers were able to link sociobehavioural data collected from high school students back in 1960 with Medicare claims data from over 50 years later that identified those people who had been diagnosed with Alzheimer's disease and related disorders.

Interestingly, poor adolescent performance in other areas of intelligence, such as mathematics and visualisation, was also associated with dementia, but not nearly to the extent of mechanical reasoning and word memory.

So why is this so? The study authors say there are a few possible explanations. Maybe the poor performance in adolescence reflected poor brain development earlier in life, itself a risk factor for dementia. Maybe these adolescents are more susceptible to neuropathology as they get older. Or maybe they are the adolescents who adopt poor health behaviours, such as smoking and little exercise. “Regardless of mechanism, our findings emphasise that early-life risk stretches across the life course,” they said.

And what can be done about it? That’s the million-dollar question. The researchers say the hope is that if we know who is in the at-risk group, we can get aggressive with preventive management early. “Efforts to promote cognitive reserve-building experiences and positive health behaviours throughout the life course may prevent or delay clinical symptoms of Alzheimer's disease and related disorders.”

An accompanying editorial takes this concept a little further. Dr Tom Russ, a Scottish psychiatrist, says interventional research has identified a number of factors that can potentially influence cognitive reserve. These include modifiable health factors, education, social support, positive affect, stimulating activities and/or novel experiences, and cognitive training. As Dr Russ notes, you can’t necessarily change all of these risk factors, and even the ones you can change may become less modifiable later in life. But as this study demonstrates, you may be able to work on a person’s cognitive reserve at different stages throughout their life to ultimately lower their risk of dementia.

Reference
1. Huang AR, Strombotne KL, Horner EM, Lapham SJ. Adolescent Cognitive Aptitudes and Later-in-Life Alzheimer Disease and Related Disorders. JAMA Network Open [Internet]. 2018 Sep; 1(5): e181726. Available from: https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2701735 doi:10.1001/jamanetworkopen.2018.1726
2. Russ TC. Intelligence, Cognitive Reserve, and Dementia: Time for Intervention? JAMA Network Open [Internet]. 2018 Sep; 1(5): e181724. doi:10.1001/jamanetworkopen.2018.1724

Healthed

New research from South Australian scientists has shown that vitamin D (commonly known as the sunshine vitamin) is unlikely to protect individuals from multiple sclerosis, Parkinson's disease, Alzheimer's disease or other brain-related disorders. The findings, published in the journal Nutritional Neuroscience, reported that researchers had failed to find solid clinical evidence for vitamin D as a protective neurological agent.

"Our work counters an emerging belief held in some quarters suggesting that higher levels of vitamin D can impact positively on brain health," says lead author Krystal Iacopetta, PhD candidate at the University of Adelaide.

Based on a systematic review of over 70 pre-clinical and clinical studies, Ms Iacopetta investigated the role of vitamin D across a wide range of neurodegenerative diseases. "Past studies had found that patients with a neurodegenerative disease tended to have lower levels of vitamin D compared to healthy members of the population," she says. "This led to the hypothesis that increasing vitamin D levels, either through more UV and sun exposure or by taking vitamin D supplements, could potentially have a positive impact. A widely held community belief is that these supplements could reduce the risk of developing brain-related disorders or limit their progression."

"The results of our in-depth review and analysis of all the scientific literature, however, indicate that this is not the case and that there is no convincing evidence supporting vitamin D as a protective agent for the brain," she says.

Source: News Medical Net

Dr Linda Calabresi

In what will be seen as a blow to cryptic crossword compilers the world over, it appears wealth is a better determinant than education of whether you keep your marbles.

In a UK prospective study of over 6,000 adults aged over 65 years, researchers found that people in the lowest quintile of socioeconomic status were almost 70% more likely to develop dementia over a 12-year follow-up period than those in the top fifth. Depressingly, this finding held true regardless of education level. “This longitudinal cohort study found that wealth in late life, but not education, was associated with increased risk of dementia, suggesting people with fewer financial resources were at higher risk,” the study authors said. On further analysis, researchers found the association between wealth, or the lack thereof, and dementia was even more pronounced among the younger participants in the cohort.

So what did the researchers think was the reason behind the link between poverty and dementia? One explanation was that having money allows access to more mentally stimulating environments, including cultural resources (reading, theatre and the like) and wider social networks, which might help preserve cognitive function. On the flip side, poverty (or ‘persistent socioeconomic disadvantage’, as the authors describe it) affects physiological functioning, increasing the risk of depression, vascular disease and stroke – all known risk factors for dementia. Other factors such as poor diet and lack of exercise also appear to be more common among poorer people in the community.

All this seems fairly logical, but what of the lack of a protective effect of education? The researchers think this might be a particularly British phenomenon in this age group. “This might be a specific cohort effect in the English population born and educated in the period surrounding World War II,” they suggested. A number of other studies have produced different results, with some, including the well-respected Canadian Study of Health and Aging, showing the complete opposite: that education protects against dementia. Consequently, the authors of this study, published in JAMA Psychiatry, hypothesise that this cohort may have been unable to access higher education because of military service or financial constraints, but was able to access intellectually challenging jobs after the war.

All in all, this is an observational study, and it is possible that a number of confounding factors, from smoking to availability of medical care, play a role in why poorer people are at greater risk of dementia. And while the researchers are not advocating that older people give up their bridge game and just buy lottery tickets, it would seem money is useful, if not for happiness, then at least for preserving brain power.

Ref: JAMA Psychiatry doi:10.1001/jamapsychiatry.2018.1012

Dr Wei Luan

To understand how the healthy brain works and what occurs in brain disease, neuroscientists use many microscopy techniques, ranging from whole-brain human MRIs to imaging within a single neuron (brain cell), creating stunning images in the process.

Dr Perminder Sachdev

When people think of lithium, it’s usually to do with batteries, but lithium also has a long history in medicine. Lithium carbonate, or lithium salt, is mainly used to treat and prevent bipolar disorder. This is a condition in which a person experiences significant mood swings from highs that can tip into mania to lows that can plunge into depression. More recently, though, lithium has been explored as a potential preventive therapy for dementia. A recent paper even led some to question whether we should start putting lithium in drinking water to lower population dementia rates.
But despite early studies linking lithium to better cognitive function, there is currently not enough evidence to start using it as a preventive dementia strategy.

Lithium’s medical history

Lithium is a soft, light-silver metal present in many water systems, which means humans have always been exposed to it. Its concentrations in water range from undetectable to very high, especially in geothermal waters and oil-gas field brines.

The high concentration of lithium in some natural springs led to its association with healing. In the 19th century, lithium water was used to treat gout and rheumatism, though with little objective evidence of any benefit. Early attempts to treat diseases such as kidney stones with higher doses of lithium often led to lithium toxicity – potentially irreversible damage to the kidneys and brain.

The landmark event in the medical history of lithium was a 1949 paper by Australian psychiatrist John Cade in the Medical Journal of Australia, which demonstrated its benefit in bipolar disorder, then known as manic-depressive illness. The psychiatric community took some time to absorb this finding – the US regulator, the Food and Drug Administration, only approved lithium for use in 1970. After that, lithium as a drug transformed psychiatric practice, especially in the treatment and prevention of bipolar disorder. This led to extensive research into the mechanisms of lithium in the brain.

How lithium affects the brain

We don’t know exactly how lithium works, but we know it supports the way connections between brain cells remodel themselves, usually referred to as synaptic plasticity. It also protects neurons by regulating cellular pathways, such as those involved in oxidative stress (where the brain struggles to control toxins) and inflammation. Animal studies have shown that long-term treatment with lithium leads to improvements in memory and learning.

These observations led to studies of lithium’s protective effects on brain neurons in bipolar patients who had been taking it for a long time. One of these was a review of more than 20 studies, seven of which examined dementia rates in patients with mood disorders (such as bipolar) being treated with standard therapeutic doses of lithium. Five of these studies showed lithium treatment was associated with low dementia rates.

The review also looked at four randomised controlled trials (comparing one group of patients on lithium with a group taking a placebo), which examined lithium’s effects on cognitive impairment (such as memory loss) or dementia over six to 15 months. One trial did not show a statistically significant benefit on cognition but did show a biologically positive effect on the levels of a protein that promotes nerve cell growth. The other three showed statistically significant, albeit modest, beneficial effects of lithium on cognitive decline.

Lithium in water

A number of epidemiological studies – which track patterns and causes of diseases in populations – have linked lithium concentrations in drinking water with rates of psychiatric disease. In the above-mentioned review, nine of 11 studies found an association between trace-dose lithium (doses low enough in drinking water that they are not detectable in the blood of people consuming it) and low rates of suicide and, less commonly, homicide, mortality and crime.

More recently, researchers in Denmark conducted a nationwide study linking dementia rates, based on hospital records for people aged 50-90, with their likely exposure to lithium, estimated from the lithium levels in the waterworks predominantly supplying the region where each person lived. People diagnosed with dementia had been exposed to lower mean levels of lithium in their drinking water than those without the condition: 11.5 micrograms (µg) per litre compared with 12.2µg per litre.

The Danish population is geographically stable, and the health record linkage is excellent for such studies. The reliability and validity of dementia diagnoses in Danish health registers is also high. But the study had a number of limitations. The lithium intake estimates were based on sampling of waterworks that supply water to only 42% of the population. The sampling was done over only four years (2009-2013) and extrapolated to a lifetime. Many potential additional variables were not considered. For instance, a major source of lithium is diet, and some bottled water contains lithium; the study did not take this into account.

An intriguing aspect of the results, for which no explanation was given, was that the relationship wasn’t linear. That is, lower doses (5.1-10µg per litre) increased the risk of dementia by about 20%, whereas exposure to levels over 15µg per litre reduced the risk by about the same amount.

We’re not there yet

Observational studies – which draw conclusions by observing a sample of the population, without intervening – have considerable merit in the epidemiology of dementia, but have sometimes led to blind alleys. Aluminium is a useful example, its role in dementia still unclear after several decades of observations. The concern is that lithium may take the same path.
Lithium was once widely used as an elixir and even as a salt substitute, but was discredited because of its lack of effectiveness and marked toxicity, which caused early deaths. We must wait for more observational studies, conducted with the rigour such studies warrant, before we start clinical tests of lithium's effects in drinking water. We must also study the potential harmful effects of lithium on the thyroid and the kidney, as these organs bear the brunt of lithium's long-term harms. For now, there is insufficient evidence to add lithium to the drinking water.

Perminder Sachdev, Scientia Professor of Neuropsychiatry, Centre for Healthy Brain Ageing (CHeBA), School of Psychiatry, UNSW

This article was originally published on The Conversation. Read the original article.