ibiofeedback
Marilyn Allen PhD
275 posts
Let your health be your wealth!
ibiofeedback · 6 months ago
It's my 12 year anniversary on Tumblr 🥳
ibiofeedback · 5 years ago
https://neurosciencestuff.tumblr.com/post/636890794928799744/to-protect-your-brain-dont-be-too-kind
ibiofeedback · 6 years ago
Eyes reveal early Alzheimer’s disease
A reduced density of blood capillaries in the back of the eye may be a new, noninvasive way to diagnose early cognitive impairment, the forgetful precursor to Alzheimer’s disease, reports a newly published Northwestern Medicine study.
Scientists detected these vascular changes in the human eye non-invasively, with an infrared camera and without the need for dyes or expensive MRI scanners. The back of the eye is optically accessible to a new type of technology (OCT angiography) that can quantify capillary changes in great detail and with unparalleled resolution, making the eye an ideal mirror for what is going on in the brain.
“Once our results are validated, this approach could potentially provide an additional type of biomarker to identify individuals at high risk of progressing to Alzheimer’s,” said Dr. Amani Fawzi, a professor of ophthalmology at Northwestern University Feinberg School of Medicine and a Northwestern Medicine physician. “These individuals can then be followed more closely and could be prime candidates for new therapies aimed at slowing down the progression of the disease or preventing the onset of the dementia associated with Alzheimer’s.”
Therapies for Alzheimer’s are more effective if they are started before extensive brain damage and cognitive decline have occurred, added Fawzi, the Cyrus Tang and Lee Jampol Professor of Ophthalmology.
The study was published in PLOS ONE.
It was known that patients with Alzheimer’s have decreased retinal blood flow and vessel density, but not whether these changes are also present in individuals with early Alzheimer’s or forgetful mild cognitive impairment, who have a higher risk of progressing to dementia.
Multicenter trials could be implemented using this simple technology in Alzheimer’s clinics. Larger datasets will be important to validate the marker as well as find the best algorithm and combination of tests that will detect high-risk subjects, said Sandra Weintraub, a co-author and professor of neurology and of psychiatry and behavioral sciences at Feinberg.  
Weintraub and her team at the Northwestern Mesulam Center for Cognitive Neurology and Alzheimer’s Disease recruited 32 participants who had cognitive testing consistent with the forgetful type of cognitive impairment, and age-, gender- and race- matched them to subjects who tested as cognitively normal for their age. All individuals underwent the eye imaging with OCT angiography. The data were analyzed to identify whether the vascular capillaries in the back of the eye were different between the two groups of individuals.
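For readers curious how such a group comparison might look in practice, here is a minimal sketch assuming a single hypothetical capillary-density value per participant; the numbers, group sizes and variable names are illustrative placeholders, not the study’s data.

```python
# Sketch of a two-group comparison of OCT-angiography capillary density.
# All values, group sizes, and variable names are illustrative placeholders,
# not data from the Northwestern study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical capillary-density value per participant (arbitrary units).
mci_group = rng.normal(loc=0.44, scale=0.03, size=32)      # forgetful MCI group
control_group = rng.normal(loc=0.47, scale=0.03, size=32)  # matched controls

t_stat, p_value = stats.ttest_ind(mci_group, control_group)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```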
Now the team hopes to correlate these findings with other more standard (but also more invasive) types of Alzheimer’s biomarkers as well as explore the longitudinal changes in the eye parameters in these subjects.
“Ideally the retinal findings would correlate well with other brain biomarkers,” Fawzi said. “Long-term studies are also important to see if the retinal capillaries will change more dramatically in those who progressively decline and develop Alzheimer’s dementia.”
ibiofeedback · 6 years ago
What you eat could impact your brain and memory
You may be familiar with the saying, “You are what you eat,” but did you know the food you eat could impact your memory?
Assistant Professor Auriel Willette and his team of researchers in the Department of Food Science and Human Nutrition analyzed data and found that higher levels of a satiety hormone could decrease a person’s likelihood of developing Alzheimer’s disease. A paper outlining the results of their study was recently accepted for publication in Neurobiology of Aging.
Using data from the Alzheimer’s Disease Neuroimaging Initiative (ADNI), the researchers looked at the satiety hormone, Cholecystokinin (CCK), in 287 subjects. CCK is found in both the small intestines and the brain. In the small intestines, CCK allows for the absorption of fats and proteins. In the brain, CCK is located in the hippocampus, which is the memory-forming region of the brain.
Alexandra Plagman, a graduate student in nutritional science, said they chose to focus on CCK because it is highly expressed in the memory-forming regions of the brain. The researchers wanted to see whether there was any association between CCK levels and both memory and gray matter volume in the hippocampus and other important areas.
“It will hopefully help to shed further light on how satiety hormones in the blood and brain affect brain function,” Willette said.
The researchers found that individuals with higher CCK levels had a 65 percent lower chance of having mild cognitive impairment, a precursor state to Alzheimer’s disease, or Alzheimer’s disease itself.
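A 65 percent lower chance corresponds to an odds ratio of roughly 0.35. The following sketch, which uses simulated placeholder data rather than ADNI data and ignores the study’s covariates, shows how a logistic-regression coefficient maps onto that kind of statement.

```python
# Sketch of how a logistic-regression coefficient maps onto "65 percent
# lower odds." The data are simulated placeholders, not ADNI data, and the
# study's covariates are not reproduced.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 287
cck = rng.normal(size=n)                    # standardized CCK level (hypothetical)
p_impaired = 1 / (1 + np.exp(1.05 * cck))   # assume higher CCK -> lower risk
impaired = rng.binomial(1, p_impaired)      # 1 = MCI or Alzheimer's, 0 = normal

model = sm.Logit(impaired, sm.add_constant(cck)).fit(disp=0)
odds_ratio = np.exp(model.params[1])
print(f"odds ratio per SD of CCK: {odds_ratio:.2f} "
      f"(about {(1 - odds_ratio) * 100:.0f}% lower odds)")
```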
They also looked at p-tau and tau proteins, which are thought to be toxic to the brain, to see how these might impact CCK and memory. They found that as tau levels increased, higher CCK was no longer related to less memory decline.
The researchers hope this study will encourage others to look into the nutritional quality of diets rather than just caloric intake. Plagman is already looking at how diet impacts an individual’s CCK levels by researching fasting glucose and ketone bodies.
“By looking at the nutritional aspect, we can tell if a certain diet could prevent Alzheimer’s disease or prevent progression of the disease,” Plagman said.
“The regulation of when and how much we eat can have some association with how good our memory is,” Willette added. “Bottom line: what we eat and what our body does with it affects our brain.”
ibiofeedback · 6 years ago
Women’s brains appear three years younger than men’s
Time wears differently on women’s and men’s brains. While the brain tends to shrink with age, men’s diminish faster than women’s. The brain’s metabolism slows as people grow older, and this, too, may differ between men and women.
A new study from Washington University School of Medicine in St. Louis finds that women’s brains appear to be about three years younger than men’s of the same chronological age, metabolically speaking. The findings, available online the week of Feb. 4 in Proceedings of the National Academy of Sciences, could be one clue to why women tend to stay mentally sharp longer than men.
“We’re just starting to understand how various sex-related factors might affect the trajectory of brain aging and how that might influence the vulnerability of the brain to neurodegenerative diseases,” said senior author Manu Goyal, MD, an assistant professor of radiology at the university’s Mallinckrodt Institute of Radiology. “Brain metabolism might help us understand some of the differences we see between men and women as they age.”
The brain runs on sugar, but how the brain uses sugar changes as people grow and age. Babies and children use some of their brain fuel in a process called aerobic glycolysis that sustains brain development and maturation. The rest of the sugar is burned to power the day-to-day tasks of thinking and doing. In adolescents and young adults, a considerable portion of brain sugar also is devoted to aerobic glycolysis, but the fraction drops steadily with age, leveling off at very low amounts by the time people are in their 60s.
But researchers have understood little about how brain metabolism differs between men and women. So Goyal and colleagues, including Marcus Raichle, MD, the Alan A. and Edith L. Wolff Distinguished Professor of Medicine and a professor of radiology, and Andrei Vlassenko, MD, PhD, an associate professor of radiology, studied 205 people to figure out how their brains use sugar.
The study participants – 121 women and 84 men, ranging in age from 20 to 82 years – underwent PET scans to measure the flow of oxygen and glucose in their brains. For each person, the researchers determined the fraction of sugar committed to aerobic glycolysis in various regions of the brain. They trained a machine-learning algorithm to find a relationship between age and brain metabolism by feeding it the men’s ages and brain metabolism data. Then, the researchers entered women’s brain metabolism data into the algorithm and directed the program to calculate each woman’s brain age from its metabolism. The algorithm yielded brain ages an average of 3.8 years younger than the women’s chronological ages.
The researchers also performed the analysis in reverse: They trained the algorithm on women’s data and applied it to men’s. This time, the algorithm reported that men’s brains were 2.4 years older than their true ages.
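Conceptually, the approach is to fit a model that maps regional brain-metabolism features to chronological age in one group and then apply it to the other, inspecting the gap between predicted and actual age. The sketch below uses random placeholder data and a generic regressor; the study’s actual features and algorithm are not reproduced here.

```python
# Sketch of the cross-sex brain-age approach: fit a model mapping regional
# metabolism features to chronological age in one group, then apply it to
# the other and inspect the gap. Features and data are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
n_regions = 20  # hypothetical number of brain regions with metabolism measures

# Placeholder regional aerobic-glycolysis fractions and ages per participant.
men_X, men_age = rng.random((84, n_regions)), rng.uniform(20, 82, 84)
women_X, women_age = rng.random((121, n_regions)), rng.uniform(20, 82, 121)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(men_X, men_age)                     # train on men's data
predicted_brain_age = model.predict(women_X)  # apply to women
gap = predicted_brain_age - women_age
print(f"mean predicted brain-age gap for women: {gap.mean():+.1f} years")
```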
“The average difference in calculated brain age between men and women is significant and reproducible, but it is only a fraction of the difference between any two individuals,” Goyal said. “It is stronger than many sex differences that have been reported, but it’s nowhere near as big a difference as some sex differences, such as height.”
The relative youthfulness of women’s brains was detectable even among the youngest participants, who were in their 20s.
“It’s not that men’s brains age faster – they start adulthood about three years older than women, and that persists throughout life,” said Goyal, who is also an assistant professor of neurology and of neuroscience. “What we don’t know is what it means. I think this could mean that the reason women don’t experience as much cognitive decline in later years is because their brains are effectively younger, and we’re currently working on a study to confirm that.”
Older women tend to score better than men of the same age on tests of reason, memory and problem solving. Goyal, Raichle, Vlassenko and colleagues are now following a cohort of adults over time to see whether people with younger-looking brains are less likely to develop cognitive problems.
ibiofeedback · 6 years ago
Never-before-seen DNA recombination in the brain linked to Alzheimer’s disease
Scientists from Sanford Burnham Prebys Medical Discovery Institute (SBP) have identified gene recombination in neurons that produces thousands of new gene variants within Alzheimer’s disease brains. The study, published in Nature, reveals for the first time how the Alzheimer’s-linked gene, APP, is recombined by using the same type of enzyme found in HIV.
Using new analytical methods focused on single-cell and multiple-cell samples, the researchers found that the APP gene, which produces the toxic beta amyloid proteins defining Alzheimer’s disease, gives rise to novel gene variants in neurons—creating a genomic mosaic. The process required reverse transcription and reinsertion of the variants back into the original genome, producing permanent DNA sequence changes within the cell’s DNA blueprint.
“We used new approaches to study the APP gene, which gives rise to amyloid plaques, a pathological hallmark of the disease,” says Jerold Chun, M.D., Ph.D., senior author of the paper and professor and senior vice president of Neuroscience Drug Discovery at SBP. “Gene recombination was discovered as both a normal process for the brain and one that goes wrong in Alzheimer’s disease.”
One hundred percent of the Alzheimer’s disease brain samples contained an over-abundance of distinct APP gene variants, compared to samples from normal brains. Among these Alzheimer’s-enriched variations, the scientists identified 11 single-nucleotide changes identical to known mutations in familial Alzheimer’s disease—a very rare inherited form of the disorder. Although found in a mosaic pattern, the identical APP variants were observed in the most common form of Alzheimer’s disease, further linking gene recombination in neurons to disease.
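The comparison itself is conceptually simple: tally variant abundance per brain sample in the disease and control groups, and intersect the detected variants with a list of known familial Alzheimer’s mutations. The sketch below uses made-up variant labels purely for illustration, not real APP mutation names or study data.

```python
# Sketch of the two comparisons described above, using made-up variant labels
# (not real APP mutation names): variant abundance per brain sample in the
# Alzheimer's vs. control groups, and overlap with known familial mutations.
ad_samples = {"ad1": {"v1", "v2", "v3", "v4"}, "ad2": {"v1", "v3", "v5", "v6"}}
ctrl_samples = {"c1": {"v1"}, "c2": {"v2", "v3"}}
known_familial = {"v2", "v5", "v9"}  # hypothetical familial-AD variant list

ad_mean = sum(len(v) for v in ad_samples.values()) / len(ad_samples)
ctrl_mean = sum(len(v) for v in ctrl_samples.values()) / len(ctrl_samples)
detected = set().union(*ad_samples.values())

print(f"mean variants per sample: AD {ad_mean:.1f} vs. control {ctrl_mean:.1f}")
print(f"detected variants matching familial mutations: {sorted(detected & known_familial)}")
```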
“These findings may fundamentally change how we understand the brain and Alzheimer’s disease,” says Chun. “If we imagine DNA as a language that each cell uses to ‘speak,’ we found that in neurons, just a single word may produce many thousands of new, previously unrecognized words. This is a bit like a secret code embedded within our normal language that is decoded by gene recombination. The secret code is being used in healthy brains but also appears to be disrupted in Alzheimer’s disease.”
Potential near-term Alzheimer’s treatment uncovered  
The scientists found that the gene recombination process required an enzyme called reverse transcriptase, the same type of enzyme HIV uses to infect cells. Although there is no medical evidence that HIV or AIDS causes Alzheimer’s disease, existing FDA-approved antiretroviral therapies for HIV that block reverse transcriptase might also be able to halt the recombination process and could be explored as a new treatment for Alzheimer’s disease. The scientists noted the relative absence of proven Alzheimer’s disease in aging HIV patients on antiretroviral medication, supporting this possibility.  
“Our findings provide a scientific rationale for immediate clinical evaluation of HIV antiretroviral therapies in people with Alzheimer’s disease,” says Chun. “Such studies may also be valuable for high-risk populations, such as people with rare genetic forms of Alzheimer’s disease.”
Adds first author Ming-Hsiang Lee, Ph.D., a research associate in the Chun laboratory, “Reverse transcriptase is an error-prone enzyme—meaning it makes lots of mistakes. This helps explain why copies of the APP gene are not accurate in Alzheimer’s disease and how the diversity of DNA in the neurons is created.”
An explanation for recent clinical trial setbacks
The amyloid hypothesis, or the theory that accumulation of a protein called beta-amyloid in the brain causes Alzheimer’s disease, has driven Alzheimer’s research to date. However, treatments that target beta-amyloid have notoriously failed in clinical trials. Today’s findings offer a potential answer to this mystery.
“The thousands of APP gene variations in Alzheimer’s disease provide a possible explanation for the failures of more than 400 clinical trials targeting single forms of beta-amyloid or involved enzymes,” says Chun. “APP gene recombination in Alzheimer’s disease may be producing many other genotoxic changes as well as disease-related proteins that were therapeutically missed in prior clinical trials. The functions of APP and beta-amyloid that are central to the amyloid hypothesis can now be re-evaluated in light of our gene recombination discovery.”  
Close of one chapter opens another
“Today’s discovery is a step forward—but there is so much that we still don’t know,” says Chun. “We hope to evaluate gene recombination in more brains, in different parts of the brain and involving other recombined genes—in Alzheimer’s disease as well as other neurodegenerative and neurological diseases—and use this knowledge to design effective therapies targeting gene recombination.”
He adds, “It is important to note that none of this work would have been possible without the altruistic generosity of brain donors and their loving families, to whom we are most grateful. Their generosity is yielding fundamental insights into the brain and is leading us toward developing new and effective ways of treating Alzheimer’s disease and possibly other brain disorders—potentially helping millions of people. There is much more important work to be done.”
ibiofeedback · 6 years ago
Brain Signature of Depressed Mood Unveiled in New Study
Most of us have had moments when we’re feeling down – maybe we can’t stop thinking about our worst mistakes, or our most embarrassing memories – but for some, these poor mood states can be relentless and even debilitating. Now, new research from UC San Francisco has identified a common pattern of brain activity that may be behind those feelings of low mood, particularly in people who have a tendency towards anxiety. The newly discovered network is a significant advance in research on the neurobiology of mood, and could serve as a biomarker for scientists developing new therapies for mood disorders such as depression.
Most human brain research on mood disorders has relied on studies in which participants lie in an fMRI scanner and look at upsetting images or listen to sad stories. These studies have helped scientists identify brain areas associated with emotion in healthy and depressed individuals, but they don’t reveal much about the natural mood fluctuations that people experience over the course of a day or provide insight into the actual mechanisms of brain activity underlying mood.
Newly published research by UCSF Health neurosurgeon and neuroscientist Edward Chang, MD, and psychiatrist and neuroscientist Vikaas Sohal, MD, PhD – both members of the UCSF Weill Institute for Neurosciences and the recently launched UCSF Dolby Family Center for Mood Disorders – has begun to fill these gaps in our understanding of the neuroscience of mood by continuously recording brain activity for a week or more in human volunteers and linking their day-to-day mood swings to specific patterns of brain activity.
Matching Brain Activity to Reported Mood
The new study – which appears online in Cell on Nov. 8, 2018 – was funded by the Systems-Based Neurotechnology for Emerging Therapies (SUBNETS) program of the Defense Advanced Research Projects Agency (DARPA). Launched in 2014 under the auspices of the White House BRAIN Initiative, this multi-institutional collaborative project seeks to enhance understanding of the brain circuitry underlying neuropsychiatric conditions such as depression and anxiety, and to develop novel technology to treat these disabling brain disorders.
“It is remarkable that we are able to see the actual neural substrates of human mood directly from the brain,” Chang said. “The findings have scientific implications for our understanding of how specific brain regions contribute to mood disorders, but also practical implications for identifying biomarkers that could be used for new technology designed to treat these disorders, which is a major priority of our SUBNETS effort.”
The researchers recruited 21 patient volunteers with epilepsy who had had 40 to 70 electrodes implanted on the brain’s surface and in deeper structures of the brain as part of standard preparation for surgery to remove seizure-causing brain tissue. The researchers recorded a wide range of brain activity in these patients over the course of seven to 10 days, particularly focusing on certain deep brain structures that have been previously implicated in mood regulation. Meanwhile, the patients regularly logged their mood throughout the day with tablet-based software.
The researchers then used computational algorithms to match patterns of brain activity to changes in the patients’ reported mood. These new algorithms were developed by the lead author, Lowry Kirkby, PhD, a postdoctoral researcher in the Sohal lab, and Francisco Luongo, PhD, a recent alumnus of UCSF’s Neuroscience Graduate Program.
‘A Powerfully Informative Biomarker’
To avoid biasing their analysis at the outset, the team did not look at the mood surveys right away. Instead, they first analyzed the long-term recordings of brain activity in each participant to identify so-called intrinsic coherence networks (ICNs). ICNs are groups of brain regions whose activity patterns regularly fluctuate together at a common frequency (like members of a college band, marching in lockstep). This synchronization was used as a clue about brain regions that communicate with one another in potentially important ways.
Then, to compare results across the unique brains and distinct electrode placements of all 21 research participants, the researchers mapped each subject’s ICNs onto neural connectivity diagrams. Comparing these standardized records of network activity across subjects revealed several “cliques” – groups of brain regions that repeatedly became synchronized at specific frequencies, and were therefore likely to represent functional brain networks.
One such clique was highly active and coordinated in 13 research participants, all of whom had also scored high on a psychological assessment of baseline anxiety conducted prior to the start of the study. In these same individuals, changes in the activity of this brain network were also highly correlated with day-to-day bouts of low or depressed mood. This mood-related network was characterized by so-called beta waves (synchronized oscillations between 13 and 30 cycles per second) in the hippocampus and amygdala, two deep brain regions that have long been linked, respectively, to memory and to negative emotion.
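One common way to quantify this kind of frequency-specific synchronization is spectral coherence between two recorded channels, restricted to the 13–30 Hz beta band. The sketch below uses simulated signals and is not the study’s actual analysis pipeline; the sampling rate and channel names are assumptions.

```python
# Sketch of quantifying beta-band (13-30 Hz) coherence between two recorded
# channels, e.g. a hippocampal and an amygdala contact. Signals are simulated;
# this is not the study's actual algorithm.
import numpy as np
from scipy.signal import coherence

fs = 512                                  # sampling rate in Hz (hypothetical)
t = np.arange(0, 60, 1 / fs)              # 60 seconds of data
rng = np.random.default_rng(3)

shared_beta = np.sin(2 * np.pi * 20 * t)  # common 20 Hz rhythm
hippocampus = shared_beta + rng.normal(scale=1.0, size=t.size)
amygdala = shared_beta + rng.normal(scale=1.0, size=t.size)

freqs, coh = coherence(hippocampus, amygdala, fs=fs, nperseg=2 * fs)
beta_band = (freqs >= 13) & (freqs <= 30)
print(f"mean beta-band coherence: {coh[beta_band].mean():.2f}")
```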
Sohal said the research team was at first taken aback by the clarity of the finding. “We were quite surprised to identify a single signal that almost completely accounted for bouts of depressed mood in such a large set of people,” said Sohal. “Finding such a powerfully informative biomarker was more than what we’d expected at this stage of the project.”
Surprisingly, this powerful link between mood and beta waves in the amygdala and hippocampus was entirely absent in eight other research participants, all of whom had comparatively low preexisting anxiety, raising new questions about how the brains of people prone to anxiety may differ from others in how they process emotional situations.
“Based on what we know about these brain structures, this suggests that interactions between the amygdala and hippocampus might be linked to recalling emotional memories, and that this pathway is particularly strong in people with high levels of anxiety, whose mood might then be heavily influenced by recalling emotion-laden memories,” Sohal said. “We will need to investigate that hypothesis further, but as a psychiatrist it’s deeply satisfying to begin to be able to provide a conceptual framework to patients to help them understand what they are going through when they feel down.”
Both Chang and Sohal emphasized that support from both the Weill Institute and Dolby Center is critical to the future success of this project and to ongoing work needed to validate the findings and develop potential therapeutic applications.
“The Dolby and Weill families have both recognized a special opportunity to break down walls between psychiatry, neurology, and neurosurgery in order to fill gaps in our knowledge about how mood is processed in the brain and to move towards new biologically driven therapies for mood disorders,” Chang said.
ibiofeedback · 6 years ago
Brain activity pattern may be early sign of schizophrenia
Schizophrenia, a brain disorder that produces hallucinations, delusions, and cognitive impairments, usually strikes during adolescence or young adulthood. While some signs can suggest that a person is at high risk for developing the disorder, there is no way to definitively diagnose it until the first psychotic episode occurs.
MIT neuroscientists working with researchers at Beth Israel Deaconess Medical Center, Brigham and Women’s Hospital, and the Shanghai Mental Health Center have now identified a pattern of brain activity correlated with development of schizophrenia, which they say could be used as a marker to diagnose the disease earlier.
“You can consider this pattern to be a risk factor. If we use these types of brain measurements, then maybe we can predict a little bit better who will end up developing psychosis, and that may also help tailor interventions,” says Guusje Collin, a visiting scientist at MIT’s McGovern Institute for Brain Research and the lead author of the paper.
The study, which appears in the journal Molecular Psychiatry on Nov. 8, was performed at the Shanghai Mental Health Center. Susan Whitfield-Gabrieli, a visiting scientist at the McGovern Institute and a professor of psychology at Northeastern University, is one of the principal investigators for the study, along with Jijun Wang of the Shanghai Mental Health Center, William Stone of Beth Israel Deaconess Medical Center, the late Larry Seidman of Beth Israel Deaconess Medical Center, and Martha Shenton of Brigham and Women’s Hospital.
Abnormal connections
Before they experience a psychotic episode, characterized by sudden changes in behavior and a loss of touch with reality, patients can experience milder symptoms such as disordered thinking. This kind of thinking can lead to behaviors such as jumping from topic to topic at random, or giving answers unrelated to the original question. Previous studies have shown that about 25 percent of people who experience these early symptoms go on to develop schizophrenia.
The research team performed the study at the Shanghai Mental Health Center because the huge volume of patients who visit the hospital annually gave them a large enough sample of people at high risk of developing schizophrenia.
The researchers followed 158 people between the ages of 13 and 34 who were identified as high-risk because they had experienced early symptoms. The team also included 93 control subjects, who did not have any risk factors. At the beginning of the study, the researchers used functional magnetic resonance imaging (fMRI) to measure a type of brain activity involving “resting state networks.” Resting state networks consist of brain regions that preferentially connect with and communicate with each other when the brain is not performing any particular cognitive task.
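In practice, resting-state connectivity is often summarized by correlating the fMRI time series of many brain regions with one another. The sketch below uses random placeholder time series and arbitrary region indices rather than the study’s data.

```python
# Sketch of a resting-state functional connectivity matrix: correlate the
# fMRI time series of brain regions with one another. Time series here are
# random placeholders, not study data.
import numpy as np

rng = np.random.default_rng(4)
n_timepoints, n_regions = 200, 90  # hypothetical scan length and atlas size

roi_timeseries = rng.normal(size=(n_timepoints, n_regions))
connectivity = np.corrcoef(roi_timeseries, rowvar=False)  # regions x regions

# Connectivity between two arbitrary region indices, purely for illustration
# (e.g. a superior temporal gyrus node and a limbic node).
print(f"connectivity between regions 10 and 40: {connectivity[10, 40]:.2f}")
```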
“We were interested in looking at the intrinsic functional architecture of the brain to see if we could detect early aberrant brain connectivity or networks in individuals who are in the clinically high-risk phase of the disorder,” Whitfield-Gabrieli says.
One year after the initial scans, 23 of the high-risk patients had experienced a psychotic episode and were diagnosed with schizophrenia. In those patients’ scans, taken before their diagnosis, the researchers found a distinctive pattern of activity that was different from the healthy control subjects and the at-risk subjects who had not developed psychosis.
For example, in most people, a part of the brain known as the superior temporal gyrus, which is involved in auditory processing, is highly connected to brain regions involved in sensory perception and motor control. However, in patients who developed psychosis, the superior temporal gyrus became more connected to limbic regions, which are involved in processing emotions. This could help explain why patients with schizophrenia usually experience auditory hallucinations, the researchers say.
Meanwhile, the high-risk subjects who did not develop psychosis showed network connectivity nearly identical to that of the healthy subjects.
Early intervention
This type of distinctive brain activity could be useful as an early indicator of schizophrenia, especially since it is possible that it could be seen in even younger patients. The researchers are now performing similar studies with younger at-risk populations, including children with a family history of schizophrenia.
“That really gets at the heart of how we can translate this clinically, because we can get in earlier and earlier to identify aberrant networks in the hopes that we can do earlier interventions, and possibly even prevent psychiatric disorders,” Whitfield-Gabrieli says.
She and her colleagues are now testing early interventions that could help to combat the symptoms of schizophrenia, including cognitive behavioral therapy and neural feedback. The neural feedback approach involves training patients to use mindfulness meditation to reduce activity in the superior temporal gyrus, which tends to increase before and during auditory hallucinations.
The researchers also plan to continue following the patients in the current study, and they are now analyzing some additional data on the white matter connections in the brains of these patients, to see if those connections might yield additional differences that could also serve as early indicators of disease.
ibiofeedback · 6 years ago
Study pinpoints cell types affected in brains of multiple sclerosis patients
Projection neurons have been implicated in the progression of multiple sclerosis. A new study reports that these neurons are damaged by immune cells, and that this damage could contribute to both the brain atrophy and the cognitive changes associated with the disease.