
ScnTO's Very Own Last Chance Thread

Discussion in 'News and Current Events' started by DeathHamster, May 15, 2011.

Thread Status:
Not open for further replies.
  1. Natter Bored Member

    What a wonderful cognition. You need to do a Scientology Communication Course. Bring funds. Lots and lots of funds, preferably in cash, but we can also religiously work with credit cards, money orders, and legal trust funds. Would you like to join staff and get Scientology courses for free? Sign here please.

    You may also qualify for the Sea Org. Have you taken LSD? No? Lovely, you are qualified for our special discounted billion-year contract, with no pay and no vacations. Sea Org staff are tough, stupid sons-of-bitches.
  2. Anonymous Member

    Scientists trick the brain into Barbie-doll size

    May 25, 2011
    Henrik Ehrsson and Björn van der Hoort conducting an experiment. Photo: Staffan Larsson
    (Medical Xpress) -- Imagine shrinking to the size of a doll in your sleep. When you wake up, will you perceive yourself as tiny or the world as being populated by giants? Researchers at Karolinska Institutet in Sweden may have found the answer.
    According to the textbooks, our perception of size and distance is a product of how the brain interprets different visual cues, such as the size of an object on the retina and its movement across the visual field. Some researchers have claimed that our bodies also influence our perception of the world, so that the taller you are, the shorter distances appear to be. However, there has been no way of testing this hypothesis experimentally – until now.
    Henrik Ehrsson and his colleagues at Karolinska Institutet have already managed to create the illusion of body-swapping with other people or mannequins. Now they have used the same techniques to create the illusion of having a very small doll-sized body or a very large, 13-foot-tall body. Their results, published in the online open access journal PLoS ONE, show for the first time that the size of our bodies has a profound effect on how we perceive the space around us.
    "Tiny bodies perceive the world as huge, and vice versa," says study leader Henrik Ehrsson.
    The altered perception of space was assessed by having subjects estimate the size of different blocks and then walk over to the blocks with their eyes shut. The illusion of having a small body caused an overestimation of size and distance, an effect that was reversed for large bodies.
    One strategy that the brain uses to judge size is through comparison – if a person stands beside a tree it computes the size of both. However, the sensed own body seems to serve as a fundamental reference that affects this and other visual mechanisms.
    "Even though we know just how large people are, the illusion makes us perceive other people as giants; it's a very weird experience," says Dr Ehrsson, who also tried the experiment on himself.
    The study also shows that it is perfectly possible to create an illusion of body-swapping with extremely small or large artificial bodies, an effect that Dr Ehrsson believes has considerable practical potential.
    "It's possible, in theory, to produce an illusion of being a microscopic robot that can carry out operations in the human body, or a giant robot repairing a nuclear power plant after an accident," he says.
    More information: van der Hoort B, Guterstam A, Ehrsson HH (2011) Being Barbie: The Size of One's Own Body Determines the Perceived Size of the World. PLoS ONE 6(5): e20195. doi:10.1371/journal.pone.0020195
    Provided by Public Library of Science
  3. Anonymous Member

    How to leave your body

    February 20, 2011

    Leave your body and shake hands with yourself, gain an extra limb or change into a robot for a while. Swedish neuroscientist Henrik Ehrsson has demonstrated that the brain's image of the body is negotiable. Applications stretch from touch-sensitive prostheses to robotics and virtual worlds.
    Ask a child if their hands belong to them and they will answer, "Of course!" But how does the brain actually identify its own body? And why do we experience our centre of awareness as located inside a physical body?
    In a series of studies, neuroscientist Henrik Ehrsson of the Swedish medical university Karolinska Institutet has shown that the brain's perception of its own body can alter remarkably. Through the coordinated manipulation of the different senses, subjects can be made to feel that their body suddenly includes artificial objects or that they have departed their body entirely to enter another. His experiments have been published in Science and other leading scientific journals, and have garnered considerable international attention.
    "By clarifying how the normal brain produces a sense of ownership of the body, we can learn to project ownership onto artificial bodies and simulated virtual ones, and even make two people have the experience of swapping bodies with one another," says Dr Ehrsson.
    The research addresses fundamental questions about the relationship between mind and body, which have been the topic of theological, philosophical and psychological discussion for centuries but which have only recently been accessible to experimental investigation. The key to solving the problem is to identify the multisensory mechanisms through which the central nervous system distinguishes between sensory signals from the body and from its environment.
    The research may have important implications in a wide range of areas, such as developing hand prostheses that feel more like real hands and the next generation of virtual reality applications, where the sense of self is projected onto computer-generated 'virtual bodies'.
    Researchers are currently looking into what kind of bodies the brain can perceive as its own. The self can, for example, be transferred into a body of another sex, age and size, but not into objects such as blocks or chairs. One ongoing project with potential applications in robotics is examining if the perceived body can be shrunk to the size of a Barbie doll; another is studying if the brain can accept a body with three arms.
    "This could give paralysed people a third prosthetic arm, which they would perceive as real," says Dr Ehrsson.
    Provided by Karolinska Institutet
  4. Anonymous Member

    Perceiving touch and your self outside of your body

    August 5, 2009

    When you feel you are being touched, usually someone or something is physically touching you and you perceive that your "self" is located in the same place as your body. In new research published in the open-access, peer-reviewed journal PLoS ONE, neuroscientists at the Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland, investigated the relationship between bodily self-consciousness and the way touch stimuli are spatially represented in humans. They found that sensations of touch can be felt and mislocalised towards where a "virtual" body is seen. These findings will provide new avenues for the animation of virtual worlds and machines.
    In their previous research, Professor Olaf Blanke's lab at the EPFL found that the consciousness of one's own body (the sense of self-identification and self-location) can be altered in healthy people under certain experimental conditions, yielding similar sensations to those felt in out-of-body experiences. In this new study, Aspell and colleagues in Blanke's lab used a crossmodal congruency task to determine whether there is a change in touch perception during this illusion.
    A number of earlier studies showed that if a rubber hand is positioned such that it extends from a person's arm while her actual hand is hidden from view, and both her real hand and the rubber hand are stroked at the same time, she seems to feel the touch in the location where she sees the rubber hand being touched. This effect and the experienced 'ownership' of the rubber hand is the "rubber hand illusion."
    Aspell, a postdoctoral researcher, along with graduate student Bigna Lenggenhager and Professor Olaf Blanke sought to expand on this research to see whether there are changes in touch perception when humans experience ownership of a whole virtual body. They designed a novel behavioural task in which the experimental participants had to try to detect where on their body vibrations were occurring. At the same time, they viewed their own body via a head-mounted display connected to a camera filming the participant's back from two metres away. The participants had to ignore light flashes that appeared on their body near the vibrators. To induce the feeling that they were located in the position where they viewed their body (i.e. two metres in front of them), participants were stroked on their backs with a stick. This induced a "full body illusion" in which a person perceives herself as being located outside the confines of her own body.
    By measuring how strongly the light flashes interfered with the perception of the vibrations, the researchers were able to show that the mapping of touch sensations was altered during the full body illusion. The mapping of touch in space was shifted towards the virtual body when subjects felt themselves to be located where the virtual body was seen.
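The interference measure described above is typically reported as a crossmodal congruency effect: responses to the vibrations are slower when the light flash appears at a non-matching location. A minimal sketch of that comparison, using invented numbers rather than the study's actual data or analysis code:

```python
# Hypothetical illustration of a crossmodal congruency effect (CCE):
# the mean reaction-time difference between incongruent trials (flash at
# a different location than the vibration) and congruent trials (same
# location). All values below are invented for illustration.

def congruency_effect(trials):
    """trials: list of (congruent: bool, reaction_time_ms: float).
    Returns the CCE in milliseconds (incongruent minus congruent)."""
    congruent = [rt for c, rt in trials if c]
    incongruent = [rt for c, rt in trials if not c]
    mean = lambda xs: sum(xs) / len(xs)
    return mean(incongruent) - mean(congruent)

trials = [(True, 420.0), (True, 440.0), (False, 500.0), (False, 520.0)]
print(congruency_effect(trials))  # 80.0 ms: flashes interfere more when incongruent
```

A larger effect for flashes near the virtual body than near the physical one is the kind of shift the authors used as evidence that touch mapping had moved toward the seen body.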
    This study demonstrates that changes in self-consciousness ('where am I located?' and 'what is my body?') are accompanied by changes in where touch sensations are experienced in space. Importantly, these data reveal that brain mechanisms of multisensory processing are crucial for the "I" of conscious experience and can be scientifically manipulated in order to animate and incarnate virtual humans, robots, and machines.
    More information: Aspell JE, Lenggenhager B, Blanke O (2009) Keeping in Touch with One's Self: Multisensory Mechanisms of Self-Consciousness. PLoS ONE 4(8): e6488. doi: 10.1371/journal.pone.0006488
    Source: Public Library of Science
  5. Anonymous Member

    Scientists propose explanation for out-of-body experiences

    August 23, 2007

    Using virtual reality goggles to mix up the sensory signals reaching the brain, scientists have induced out-of-body-like experiences in healthy people, suggesting a scientific explanation for a phenomenon often thought to be a figment of the imagination.
    The sight of their bodies located somewhere else -- thanks to the goggles -- plus the feel of their real bodies being touched simultaneously made volunteers sense that they had moved outside of their physical bodies, according to a pair of studies in the 24 August 2007 issue of the journal Science.
    A disconnect between the brain circuits that process both these types of sensory information may thus be responsible for some out-of-body experiences, the researchers say.
    Out-of-body experiences, which generally involve the feeling of disembodiment and seeing one’s own body from a location outside the body, can be brought on by drug use, epileptic seizures and other types of brain disturbances.
    By projecting a person’s awareness into a virtual body, the techniques used in these studies may be useful for training people to do delicate “teleoperating” tasks, such as performing surgeries remotely. The findings may also remove some of the stigma that patients with neurological disorders may feel about having these experiences, which are frequently attributed to an active imagination or some sort of paranormal phenomenon.
    The studies also help solve the age-old question of how we perceive our own bodies.
    “I’m interested in why we feel that our selves are inside our bodies -- why we have an ‘in-body experience,’ if you like. This has been discussed for centuries in philosophy, but it’s hard to tackle experimentally,” said Science Brevia author Henrik Ehrsson of University College London and the Karolinska Institute in Stockholm.
    Both Ehrsson and another research team, led by Olaf Blanke of the Ecole Polytechnique Fédérale de Lausanne (EPFL) and the University Hospital in Geneva, Switzerland, used video cameras and virtual reality goggles to show volunteers images of their own bodies from the perspective of someone behind them. The researchers also touched the volunteers’ bodies, both physically and virtually.
    The volunteers in Ehrsson’s study viewed images recorded by the cameras through their headsets. In Blanke and colleagues’ study, the video was converted into holograph-like computer simulations.
    Ehrsson had the volunteers watch a plastic rod moving toward a location just below the cameras while their real chests were simultaneously touched in the corresponding spot. Questionnaire responses afterwards indicated that the volunteers felt they were located back where the cameras were placed, watching a dummy or a body that belonged to someone else.
    “This experiment suggests that the first-person visual perspective is critically important for the in-body experience. In other words, we feel that our self is located where the eyes are,” Ehrsson said.
    Ehrsson also had the volunteers watch a hammer swing down to a point below the camera, as though it were going to “hurt” an unseen portion of the virtual body. Measurements of skin conductance, which reflects emotional responses such as fear, indicated that the volunteers sensed their “selves” had left their physical bodies and moved to the virtual bodies.
    Blanke’s team used a similar setup to create out-of-body-like experiences (which they cautioned lacked some aspects of full-blown out-of-body experiences).
    After the virtual reality exercise, a researcher would blindfold the volunteers and guide them backward. When the volunteers were asked to return to their original position, they tended to drift toward where they had seen their virtual bodies standing.
    Both studies conclude that “multisensory conflict” is a key mechanism underlying out-of-body experiences.
    “Brain dysfunctions that interfere with interpreting sensory signals may be responsible for some clinical cases of out-of-body experiences,” Ehrsson said. “Though, whether all out-of-body experiences arise from the same causes is still an open question.”
    Bodily self-consciousness may also involve a cognitive dimension – the ability to distinguish between one’s own body and other objects – in addition to sensory signals, Blanke and his coauthors propose.
    Supporting this idea, Blanke’s team reports that when the volunteers viewed a human-sized block instead of an image of a human body, they successfully returned to their original standing place, indicating that no out-of-body-like illusion had occurred.
    “Full-body consciousness seems to require not just the ‘bottom up’ process of correlating sensory information but also the ‘top down’ knowledge about human bodies,” Blanke said.
    Some of the out-of-body experiences that have previously eluded scientific explanation may be related to distorted “full-body perception,” according to Blanke. Virtual reality systems may provide further answers.
    “We have decades of intense research on visual perception, but not very much yet on body perception. But that may change, now that virtual reality offers a way to manipulate full body perception more systematically and probe out-of-body experiences and bodily self-consciousness in a new way,” Blanke said.
    Source: American Association for the Advancement of Science
  6. Anonymous Member

    Sensory deprivation can produce hallucinations in only 15 minutes

    October 23, 2009 by Lin Edwards
    Robert Fludd's depiction of perception (1619). Image: Wikimedia Commons
    (PhysOrg.com) -- A new study has found that even a short period of sensory deprivation is enough to produce hallucinations, including in people who are not normally prone to them.
    The 19 volunteers in the study were chosen from over 200 applicants who all completed a Revised Hallucinations Scale questionnaire, which is designed to determine if people are predisposed to hallucinations. The researchers selected nine subjects from applicants who scored in the upper 20th percentile and 10 from the lower 20th.
    The researchers, from University College London, placed the volunteers one at a time into an anechoic chamber. The chamber had thick outer walls, inner walls of metallic acoustic panels, and a layer of fiberglass sandwiched between them; it dampened sound to below the threshold of hearing and blocked out all light.
    The subject sat in a padded chair in the sensory deprivation room for 15 minutes, during which time many of the subjects reported hallucinations, a depressed mood or paranoia. The volunteers could have used the panic button to be immediately released from the chamber, but none did. After the experiment they completed a Psychotomimetic States Inventory test to determine if they had experienced hallucinations or other experiences resembling psychoses. The test was developed originally to study the experiences of users of recreational drugs.
    Of the nine volunteers who had high scores on the first questionnaire, almost all reported experiencing something "very special or important" while inside the chamber. Six saw objects that were not there, five had hallucinations of faces, four reported a heightened sense of smell, and two felt there was an evil presence in the chamber with them.
    The 10 volunteers who had lower scores on the questionnaire, indicating they were less prone to hallucinations, still reported experiencing hallucinations and delusions, but to a lesser degree than the other group.
    One of the researchers, psychologist Oliver Mason, said the results of the experiment support the idea that hallucinations are produced through what the scientists call faulty source monitoring: the brain misidentifies the source of its own thoughts as arising from outside the body. Mason was not surprised by the rather dramatic results after such a short time, saying the psychosis-inducing effect of sensory deprivation is analogous to the effect of drugs such as cannabis and ketamine, especially in those prone to psychoses. The findings may be important because they suggest that mental illness and normality occur on a continuum.
    Future research planned includes studying the effects of sensory deprivation on recreational drug users and people with schizophrenia.
    The results of the study are published in the October edition of the Journal of Nervous and Mental Disease.
    More information: Mason OJ, Brady F. The psychotomimetic effects of short-term sensory deprivation. Journal of Nervous and Mental Disease 197(10):783-5
    via Wired
  7. Anonymous Member

    Drug may help overwrite bad memories

    May 26, 2011

    Recalling painful memories while under the influence of the drug metyrapone reduces the brain's ability to re-record the negative emotions associated with them, according to University of Montreal researchers at the Centre for Studies on Human Stress of Louis-H. Lafontaine Hospital.
    The team's study challenges the theory that memories cannot be modified once they are stored in the brain. "Metyrapone is a drug that significantly decreases the levels of cortisol, a stress hormone that is involved in memory recall," explained lead author Marie-France Marin. Manipulating cortisol close to the time of forming new memories can decrease the negative emotions that may be associated with them. "The results show that when we decrease stress hormone levels at the time of recall of a negative event, we can impair the memory for this negative event with a long-lasting effect," said Dr. Sonia Lupien, who directed the research.
    Thirty-three men participated in the study, which involved learning a story composed of neutral and negative events. Three days later, they were divided into three groups – participants in the first group received a single dose of metyrapone, the second a double dose, while the third were given a placebo. They were then asked to remember the story. Their memory performance was then evaluated again four days later, once the drug had cleared out. "We found that the men in the group who received two doses of metyrapone were impaired when retrieving the negative events of the story, while they showed no impairment recalling the neutral parts of the story," Marin explained. "We were surprised that the decreased memory of negative information was still present once cortisol levels had returned to normal."
    The research offers hope to people suffering from syndromes such as post-traumatic stress disorder. "Our findings may help people deal with traumatic events by offering them the opportunity to 'write-over' the emotional part of their memories during therapy," Marin said. One major hurdle, however, is the fact that metyrapone is no longer commercially produced. Nevertheless, the findings are very promising in terms of future clinical treatments. "Other drugs also decrease cortisol levels, and further studies with these compounds will enable us to gain a better understanding of the brain mechanisms involved in the modulation of negative memories."
    More information: The study received funding from the Canadian Institutes for Health Research and was published in the Journal of Clinical Endocrinology & Metabolism.
    Provided by University of Montreal
  8. Anonymous Member

    Brain cell networks recreated with new view of activity behind memory formation

    May 25, 2011
    A fluorescent image of the neural network model developed at Pitt reveals the interconnection (red) between individual brain cells (blue). Adhesive proteins (green) allow the network to be constructed on silicon discs for experimentation. Credit: U. Pittsburgh
    University of Pittsburgh researchers have reproduced the brain's complex electrical impulses onto models made of living brain cells that provide an unprecedented view of the neuron activity behind memory formation.
    The team fashioned ring-shaped networks of brain cells that were not only capable of transmitting an electrical impulse, but also remained in a state of persistent activity associated with memory formation, said lead researcher Henry Zeringue [zuh-rang], a bioengineering professor in Pitt's Swanson School of Engineering. Magnetic resonance images have suggested that working memories are formed when the cortex, or outer layer of the brain, launches into extended electrical activity after the initial stimulus, Zeringue explained. But the brain's complex structure and the diminutive scale of neural networks mean that observing this activity in real time can be nearly impossible, he added.
    The Pitt team, however, was able to generate and prolong this excited state in groups of 40 to 60 brain cells harvested from the hippocampus of rats—the part of the brain associated with memory formation. In addition, the researchers produced the networks on glass slides that allowed them to observe the cells' interplay. The work was conducted in Zeringue's lab by Pitt bioengineering doctoral student Ashwin Vishwanathan, who most recently reported the work in the Royal Society of Chemistry (UK) journal, Lab on a Chip. Vishwanathan coauthored the paper with Zeringue and Guo-Qiang Bi, a neurobiology professor in Pitt's School of Medicine. The work was conducted through the Center for the Neural Basis of Cognition, which is jointly operated by Pitt and Carnegie Mellon University.
    To produce the models, the Pitt team stamped adhesive proteins onto silicon discs. Once the proteins were cultured and dried, cultured hippocampus cells from embryonic rats were fused to the proteins and then given time to grow and connect to form a natural network. The researchers disabled the cells' inhibitory response and then excited the neurons with an electrical pulse.
    Zeringue and his colleagues were able to sustain the resulting burst of network activity for up to 12 seconds, a long time in neuronal terms. Compared to the natural duration of 0.25 seconds at most, the model's 12 seconds permitted an extensive observation of how the neurons transmitted and held the electrical charge, Zeringue said.
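The idea that a ring of excitatory cells can hold a stimulus as persistent activity, and that inhibition cuts such reverberation short, can be caricatured in a few lines. This is a hypothetical toy model, not the Pitt team's actual network or code; all parameters are invented:

```python
# Toy sketch: a ring of excitatory neurons passes a spike from neighbor to
# neighbor, so a single initial pulse keeps circulating (persistent activity).
# With inhibition enabled, the excitatory drive falls below threshold and
# activity dies out almost immediately.

def simulate_ring(n_cells=50, steps=200, inhibition=0.0):
    """Return how many steps the network stayed active.
    inhibition: fraction of excitatory drive cancelled each step."""
    active = [False] * n_cells
    active[0] = True  # single initial electrical pulse
    for step in range(steps):
        # each cell is driven by its ring neighbor's activity
        drive = [1.0 if active[i - 1] else 0.0 for i in range(n_cells)]
        active = [d - inhibition >= 1.0 for d in drive]
        if not any(active):
            return step + 1  # activity died out
    return steps  # still reverberating when the simulation ended

print(simulate_ring(inhibition=0.0))  # disinhibited ring: pulse circulates the full run
print(simulate_ring(inhibition=0.5))  # inhibition intact: activity collapses at once
```

The contrast mirrors, in cartoon form, why the researchers disabled the cells' inhibitory response before exciting the network with an electrical pulse.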
    Unraveling the mechanics of this network communication is key to understanding the cellular and molecular basis of memory creation, Zeringue said. The format developed at Pitt makes neural networks more accessible for experimentation. For instance, the team found that when activity in one neuron is suppressed, the others respond with greater excitement.
    "We can look at neurons as individuals, but that doesn't reveal a lot," Zeringue said. "Neurons are more connected and interdependent than any other cell in the body. Just because we know how one neuron reacts to something, a whole network can react not only differently, but sometimes in the complete opposite manner predicted."
    Zeringue will next work to understand the underlying factors that govern network communication and stimulation, such as the various electrical pathways between cells and the genetic makeup of individual cells.
    Provided by University of Pittsburgh
  9. Anonymous Member

    At the forefront of optogenetics

    May 24, 2011
    (Medical Xpress) -- In the last couple of years scientists from the Friedrich Miescher Institute for Biomedical Research have developed new strategies to stimulate individual brain cells with light. Optogenetic technologies were named "Method of the Year" by the leading scientific journal Nature Methods in 2010. FMI scientists not only apply these tools to meet their biomedical needs but refine them as well. A recent publication in PNAS is further testimony to this distinctive expertise at the FMI.
    Untangling the networks of neurons in the brain that control sensory impressions, learning and memory is a daunting task. For many years research efforts addressing the biological basis of mental diseases such as anxiety disorders or dementia led to disappointing results, because existing technologies were simply too imprecise. It is therefore no surprise that a new technology called optogenetics has captured the imagination of many in this field, and in particular neuroscientists at the Friedrich Miescher Institute for Biomedical Research of the Novartis Research Foundation.
    Using optogenetic tools, FMI scientists are now able to activate defined neurons selectively, rapidly and reversibly by a short pulse of light. A protein borrowed from an alga - Channelrhodopsin-2 (ChR2) - is at the core of this new technology. This protein is an ion channel that opens upon a light stimulus and rapidly discharges the membrane potential of excitable cells. In neurons, this equals activation. By introducing ChR2 into subsets of neurons, scientists are now able to study their function with high precision.
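The mechanism just described -- light opens the channel, the membrane depolarizes, the neuron fires -- can be sketched as a leaky integrate-and-fire toy model. This is a hypothetical illustration with invented parameters, not FMI code or a biophysically calibrated ChR2 model:

```python
# Toy model: a leaky integrate-and-fire neuron expressing a ChR2-like
# light-gated channel. While the light is on, the channel injects current;
# the membrane potential integrates it, leaks, and spikes at threshold.

def run_neuron(light, threshold=1.0, leak=0.9, chr2_current=0.3):
    """light: sequence of 0/1 light intensities, one per timestep.
    Returns the timesteps at which the neuron spiked."""
    v, spikes = 0.0, []
    for t, on in enumerate(light):
        v = v * leak + chr2_current * on  # channel conducts only while lit
        if v >= threshold:
            spikes.append(t)
            v = 0.0  # reset membrane potential after each spike
    return spikes

print(run_neuron([1] * 10))  # sustained light pulse drives repeated spiking
print(run_neuron([0] * 10))  # no light, no activation: returns []
```

Because the channel only conducts while illuminated, activation is rapid and reversible, which is the property that makes the real protein useful for probing circuits.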
    In action for biomedical research
    Andreas Lüthi has pioneered the application of this method together with Thomas Oertner, Rainer Friedrich and Botond Roska at the FMI. Lüthi is interested in the cellular mechanisms controlling fear and its extinction. "The inability to control or inhibit inappropriate fear responses is a hallmark of anxiety disorders. Optogenetics has allowed us in the past two years to identify neuronal circuits responsible for fear processing and to improve our understanding of the neurobiological basis of anxiety disorders," comments Lüthi. The same is true for Rainer Friedrich's studies of neuronal circuits in the olfactory system and forebrain. "Optogenetics opens new avenues for us to understand how neuronal circuits process odour information," comments Friedrich, "by taking optogenetics to the single-cell level, we can analyze the functions of specific cells during information processing in the intact brain. This is leading to a novel understanding of fundamental neuronal computations and their contributions to behavior."
    Botond Roska takes optogenetics even one step further. He is interested in how neurons interact in local neuronal networks, namely the retina. He and his team use optogenetics to genetically identify cell types and stimulate them specifically. Moreover, he has devised an optogenetic method to restore vision in mice suffering from retinal degeneration. "Introducing a light-sensitive archaebacterial protein into the remaining but non-functional cone cells of the retina not only reactivates the cone cells' ability to interact with the rest of the visual system, it also prompts sophisticated visually guided behavior," comments Roska.
    Improving optogenetics
    While these innovative findings already illustrate the power of this new method, some of the natural traits of ChR2 are not optimal for applications in neuroscience. Thomas Oertner and his team at the FMI worked together with biophysicists at the Humboldt University in Berlin to create a novel ChR2 variant with improved properties, called ET/TC. "We wanted to identify a ChR2 mutant that reacts fast and strongly to small amounts of light. In the past, mutants either exhibited one feature or the other," comments Oertner. The ET/TC variant is the first to combine both: It activates neurons fast with high fidelity and across a wide range of light intensities. Their results have just been published in the Proceedings of the National Academy of Sciences (May 2011).
    Optogenetics, however, is not restricted to electrical activation of cells: Earlier this year, Oertner's group demonstrated the power of a light-sensitive bacterial enzyme to control cAMP levels in neurons. As a consequence, scientists can not only interfere with the communication between neurons in the brain, but also directly stimulate intracellular signalling cascades that regulate the strength of synaptic connections. As is customary in the basic research community, these new tools are freely shared with laboratories around the world.
    More information:
    High-efficiency channelrhodopsins for fast neuronal stimulation at low light levels. PNAS 2011
    Light Modulation of Cellular cAMP by bPAC, JBC 2011
    Provided by Friedrich Miescher Institute for Biomedical Research
  10. Anonymous Member

    What doesn't kill the brain makes it stronger: Possible new strategy for treating neurologic disorders

    May 23, 2011

    Johns Hopkins scientists say that a newly discovered "survival protein" protects the brain against the effects of stroke in rodent brain tissue by interfering with a particular kind of cell death that's also implicated in complications from diabetes and heart attack.
    Reporting in the May 22 advance online edition of Nature Medicine, the Johns Hopkins team says it exploited the fact that when brain tissue is subjected to a stressful but not lethal insult, a defense response occurs that protects cells from subsequent insult. The scientists dissected this preconditioning pathway to identify the most critical molecular players, of which a newly identified protein protector -- called Iduna -- is one.
    Named for a Norse mythological goddess who guards a tree full of golden apples used to restore health to sick and injured gods, the Iduna protein increased three- to four-fold in preconditioned mouse brain tissue, according to the scientists.
    "Apparently, what doesn't kill you makes you stronger," says Valina Dawson, Ph.D., professor of neurology and neuroscience in the Johns Hopkins Institute of Cell Engineering. "This protective response was broad in its defense of neurons and glia and blood vessels – the entire brain. It's not just a delay of death, but real protection that lasts for about 72 hours."
    The team noted that Iduna works by interrupting a cascade of molecular events that result in a common and widespread type of brain cell death called parthanatos, often found in cases of stroke, Parkinson's disease, diabetes and heart attack. By binding with a molecule known as PAR polymer, Iduna prevents the movement of apoptosis-inducing factor (AIF) into a cell's nucleus.
    In some of the experiments, Dawson and her team exposed mouse brain cells to short bursts of a toxic chemical, and then screened these "preconditioned" cells for genes that turned on as a result of the insult. Focusing on Iduna, the researchers turned up the gene's activity in the cells during exposure to the toxic chemical that induced preconditioning. Cells deficient in Iduna did not survive, but those with more Iduna did.
    In another series of experiments in live mice, the team injected a toxic chemical into the brains of a control group of normal mice and also into a group that had been genetically engineered to produce three to four times the normal amount of Iduna – as if they had been preconditioned. The engineered mice with more Iduna were much less susceptible to brain cell death: They had more functional tissue and markedly reduced stroke damage in their brains. In addition, the Iduna mice were less impaired in their ability to move around in their cages.
    "Identifying protective molecules such as Iduna might someday lead to drugs that trigger the brain survival mechanism when people have a stroke or Parkinson's disease," says Ted Dawson, M.D., Ph.D., Leonard and Madlyn Abramson Professor in Neurodegenerative Diseases and scientific director of the Johns Hopkins Institute for Cell Engineering.
    In research published April 5 in Science Signaling, the Dawsons' laboratories previously revealed the mechanism that underpins AIF's pivotal role in parthanatos.
    By studying the 3-D structure of AIF, the team first identified the molecular pocket that looked like a potential PAR binding site. They then swapped that region out for a different one to see if it indeed took up PAR. Using HeLa cells in addition to mouse nerve and skin cells, the scientists noted that the AIF with the swapped region did not bind PAR and was not able to move into the nucleus.
    The team genetically manipulated neurons so that they didn't make any AIF, then in some cells added wild-type AIF, and in others added an AIF that did not bind PAR. When those cells were stressed using the "stroke in a dish" technique, the cells with normal AIF died while those with the AIF that could not bind PAR did not, revealing that PAR binding to AIF is required for parthanatos.
    "These findings suggest that identifying chemicals that block PAR binding to AIF could be very protective," says Ted Dawson. "On the other hand, identifying chemicals that mimic the effect of PAR polymer could be novel therapeutic agents that would kill cancers by causing cell death."
    Provided by Johns Hopkins Medical Institutions
  11. Anonymous Member

    Scientists make strides in vision research

    May 20, 2011
    This is a whole mouse retina labeled to reveal the terminals of cone photoreceptors (in blue), and one of their post-synaptic partners, the horizontal cells (in red). The dendritic branches of the horizontal cells make fine contacts with each cone terminal. Credit: Patrick Keele
    New research at UC Santa Barbara is contributing to the basic biological understanding of how retinas develop. The study is part of the campus's expanding vision research.
    The new studies are published in recent online versions of The Proceedings of the National Academy of Sciences (PNAS), and Investigative Ophthalmology and Visual Science (IOVS).
    The scientists document how they used mice as a research model organism to show that the sizes of different populations of retinal neurons display wide-ranging variability among individuals. In the PNAS article, they demonstrate a nearly two-fold variation in the number of interneurons called horizontal cells. In the IOVS article, they report a conspicuous variation in the number of cone photoreceptors.
    "These studies individually demonstrate the genetic determinants of nerve cell number," said Benjamin E. Reese, senior author and professor with the Neuroscience Research Institute and the Department of Psychological and Brain Sciences. "Together, they show that different nerve cell types are modulated independent of one another."
    Using recombinant inbred mice, Irene Whitney, graduate student and first author of both articles, and Mary Raven, staff scientist and co-author, have been able to identify genomic loci where polymorphic genes must contribute to such natural variation. In the IOVS article, they describe this natural variation for the population of cone photoreceptors, and identify two potential causal genes on chromosome 10 that may modulate cone photoreceptor production.
    In the PNAS article, the scientists –– working with colleagues from four other U.S. institutions –– identify a promising candidate gene at a locus on chromosome 13, a transcription factor gene called Islet-1. This gene was confirmed to be critical for regulating horizontal cell number in genetically modified mice, in which the Islet-1 gene was rendered nonfunctional. The scientists verified that expression of this gene differs between these strains of mice during the developmental period when horizontal cells are produced. They also showed that the source of this variable expression must lie in a genetic variant within a regulatory region of the gene itself. Finally, they identified such a single nucleotide polymorphism creating an E-box, a DNA sequence bound by a family of transcription factors that have recently been shown to play a role in retinal development.
    The team explained that such natural variation in the ratio of nerve cells requires a degree of plasticity in the process of forming neural connectivity, to ensure that the entire visual field is served by neural circuits that mediate our visual abilities. A series of other published and submitted studies from the Reese lab document this very plasticity in different strains of mice and in genetically modified mice.
    Efforts to use genetic engineering and stem cell biology to repair diseased retinas depend upon a fuller appreciation of the developmental biology of the retina, explained Reese.
    "These particular studies are just one contribution in an enormously complex process," said Reese. "Our fundamental interest is in the development of the retina –– how you 'build' this neural tissue that, when fully mature, will mediate our visual abilities."
    Vision research at UCSB has been steadily expanding in recent decades. "Since I arrived here in 1971, UCSB's vision research has grown to include dozens of scientists, in a number of labs, contributing to an explosion of research in the field," said Steven Fisher, professor emeritus in the Department of Molecular, Cellular, and Developmental Biology, and professor in the Neuroscience Research Institute.
    Provided by University of California - Santa Barbara
  12. Anonymous Member

    Researchers connect electrical brain disturbances to worse outcomes following neurotrauma

    May 19, 2011

    Electrical disturbances that spread through an injured brain like tsunamis have a direct link to poor recovery and can last far longer than previously realized, researchers at the University of Cincinnati Neuroscience Institute (UCNI) have found.
    The disturbances, known as cortical spreading depolarizations, are short-circuits (electrical failures) that occur in a localized, or specific, area of injury and result in dampened brain waves. Because of their localization, the depolarizations are invisible in routine electroencephalography (EEG) exams. But they represent an extreme change in voltage—up to 10 times greater than any brain pattern that is normally present.
    "Over the last several years we've learned how to measure and record spreading depolarization in the human brain, and we have known that these depolarizations occur in many patients who have suffered neurotrauma," says Jed Hartings, PhD, research assistant professor in UC's department of neurosurgery and director of clinical monitoring for the Mayfield Clinic. "But we didn't know what they meant or whether they were relevant. For the first time we now know that they relate to worse outcomes for patients who have suffered trauma to the brain."
    That finding, Hartings adds, could eventually lead to new therapies. "If we can find a way to stabilize the brain's electrical activity and block spreading depolarizations, perhaps we can improve patients' outcomes."
    An Advance Access version of the research was published online April 7 by the neurology journal Brain; the complete, paginated form appears today.
    The observational, multi-site study of 53 patients represents the initial phase of a four-year, $1.96 million grant awarded by the Department of Defense (DOD). The topic of spreading depolarizations is of keen interest to the U.S. military because head injuries have emerged as the signature wounds of the wars in Iraq and Afghanistan.
    Of the study's 53 participants, 10 had experienced the most severe form of depolarizations. All died or had severe disabilities six months after their injury.
    "Spreading depolarizations, which occur in up to 60 percent of patients who have experienced serious neurotrauma, are electrical failures of the brain's local networks," explains Hartings, the study's principal investigator. "When these networks fail, brain waves can no longer be generated, and they become dampened, or depressed, in amplitude."
    Hartings likens each brain cell, or neuron, to a battery. When spreading depolarization occurs, the cell discharges its electricity completely. "The neuron, once alive with electrical activity, stops working and has to be resuscitated with glucose and oxygen," Hartings says. "You could also liken it to a battery in your car. If it drains, then the car doesn't work."
    Because networks of the brain's cortex are connected in a continuum, a depolarization triggered by an injury will spread across the cortex like a tsunami on the ocean. The wave of short-circuiting cells travels almost imperceptibly, at a speed of 1 to 5 millimeters per minute.
    Previous studies had suggested that depolarizations would last no longer than two or three minutes. But Hartings and his team have shown that, after trauma, they can be very long-lasting.
    "We found that 25 percent of the cortical spreading depolarizations lasted longer than three minutes, with durations that ranged up to 16 minutes," Hartings says. "These are the types of depolarizations that are typically observed with a developing brain infarction, or stroke. It was a surprise to see them in trauma."
    To measure depolarizations, researchers placed a linear strip of electrodes on the surface of the brain, near the injured area, during neurosurgery at UC Health University Hospital. Only patients who required neurosurgery to treat their injuries were enrolled in the study. The electrode strip records computerized brainwaves similar to those of an EEG. But whereas EEG electrodes are placed on the scalp, spreading depolarization electrodes must be placed inside the skull, on the surface of the brain. The electrodes are removed three to seven days after implantation, without additional surgery.
    New technology enabled the researchers not only to determine whether or not spreading depolarizations were occurring, but also—for the first time—to measure their duration. "A new signal-processing technique that we developed allowed us to measure exactly how long the cerebral cortex remains short-circuited, or depolarized," Hartings says. "This measurement—called the direct-current shift duration—is a direct index of how harmful the depolarization is, and of the brain's degree of injury."
    The signal-processing technique is a computer program that Hartings and his colleagues published in 2009 in the Journal of Neurophysiology.
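    The direct-current shift duration described above can be illustrated with a toy sketch. This is not the published algorithm from the Journal of Neurophysiology; the threshold value and the simulated trace below are invented purely for illustration:

    ```python
    import numpy as np

    def dc_shift_duration(trace_mv, fs_hz, threshold_mv=-5.0):
        # Toy illustration: total time (in seconds) a DC-coupled electrode
        # trace spends below a negative-voltage threshold, used here as a
        # stand-in for "how long the cortex remains depolarized".
        below = np.asarray(trace_mv) < threshold_mv
        return below.sum() / fs_hz

    # Simulated 10-second recording at 100 Hz with a 3-second negative DC shift
    fs = 100
    trace = np.zeros(10 * fs)
    trace[300:600] = -12.0                     # 3-second depolarization episode
    duration = dc_shift_duration(trace, fs)    # → 3.0 seconds
    ```

    The real method must contend with drift, noise, and artifact rejection; the point here is only that a duration measure falls out of thresholding a DC-coupled signal and counting samples.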
    The spreading depolarization grant is being issued through the DOD's Psychological Health and Traumatic Brain Injury (PH/TBI) Research Program (formerly known as the Post Traumatic Stress Disorder/TBI Research Program). Hartings' co-investigator at UC is Lori Shutter, MD, director of the Neurocritical Care Program at UCNI and associate professor of neurosurgery and neurology at UC. Also participating are researchers at the University of Miami, University of Pittsburgh, Virginia Commonwealth University and King's College Hospital in London.
    UC began enrolling patients in early 2009. Patients in the study are not U.S. soldiers, but rather individuals who have suffered brain injury through falls, vehicular accidents, or other misfortunes.
    More information: http://brain.oxfordjournals.org/
    Provided by University of Cincinnati Academic Health Center
  13. Anonymous Member

    Queen's scientists teaming up to cure premature baby blindness

    May 17, 2011

    Scientists from the School of Medicine, Dentistry and Biomedical Sciences at Queen's University Belfast are teaming up to develop a cure for a condition that can lead to blindness in premature babies, thanks to funding from children's charity Action Medical Research.
    Two teams from the Centre for Vision and Vascular Science at Queen's are taking different approaches to a condition called Retinopathy of Prematurity (ROP). The condition can lead to blindness in premature babies, with the youngest, sickest and smallest babies most at risk, including the more than 3,000 babies born over 12 weeks early each year in the UK.
    ROP is caused by blood vessels in the eye growing abnormally and causing damage to the retina – the light-sensitive inner lining of the eye. Evidence suggests it develops in two stages:
    • Stage 1. Premature babies have poorly developed lungs and need extra oxygen to help them breathe. Unfortunately the blood vessels that supply the eye's light-sensitive retina are damaged by this additional oxygen and stop growing properly, meaning the retina does not get enough nutrients.
    • Stage 2. Eventually, in response to this damage, new vessels grow, in an attempt to rescue the retina, but they are abnormal and actually damage the eye, causing vision loss.
    The first team, led by Dr Denise McDonald, has the aim of tackling the disease at a very early stage, which will minimise the damaging effects of ROP.
    The second team, led by Dr Derek Brazil, is investigating whether stem cells from babies' own umbilical cords might have the power to repair their damaged eyes and save their sight.
    About one in ten babies with ROP develops severe disease, which threatens his or her sight. If this is detected early enough, laser treatment can save the most important part of a baby's vision – the sharp, central vision we need to look straight ahead. However, this causes permanent loss of a baby's peripheral vision and may induce short-sightedness. What's more, it doesn't always work, meaning some babies still go blind.
    Dr Brazil believes it may be possible to protect babies from ROP, and save their sight, by treating them with a special type of stem cell taken from their own umbilical cords. Dr Brazil and his colleagues Dr Michelle Hookham, Dr Reinhold Medina and the Centre Director Professor Alan Stitt, were awarded a two-year grant by Action Medical Research, to undertake this important work.
    He said: "We hope our laboratory work will reveal whether vascular stem cells have the potential to repair damage to babies' eyes and save their sight. If so, it is possible that in the future vascular stem cells could be taken from a baby's own umbilical cord just after birth and then grown in the laboratory in case treatment is needed."
    Taking a different approach, Dr McDonald and her team are exploring a key step in the early stages of the disease process. While laser treatment tackles stage 2 of the disease by stopping abnormal blood vessels from growing, by that stage the disease can already be quite severe.
    Dr McDonald and her team are looking for possible new treatments which will protect the retinal blood vessels from the effect of high oxygen which occurs in stage one.
    Evidence suggests that certain cofactors protect and encourage normal growth of the delicate blood vessels that supply the retina, as long as they are present in sufficient quantities. In contrast, low levels of these cofactors seem to be linked to the destruction of blood vessels. The researchers are investigating the role of specific cofactors and ways to enhance their function as a possible treatment for ROP.
    Dr Denise McDonald and her colleague, Dr Tom Gardiner, were awarded a two-year research grant from Action Medical Research for the project.
    Dr Alexandra Dedman, Senior Research Evaluation Manager from Action Medical Research, said: "We are delighted to be funding these two expert research teams in Belfast who both have longstanding track records, recognised internationally. Their work in this area has the potential to change the lives of babies around the world suffering from this condition."
    Both Dr Brazil's and Dr McDonald's teams are based at the Centre for Vision and Vascular Science at Queen's University Belfast, which contains state-of-the art facilities and equipment. The centre has a long history of successful research into many of the leading causes of vision loss. Both projects involve collaboration with Dr Eibhlin McLoone, consultant paediatric ophthalmologist at the Royal Victoria Hospital.
    Provided by Queen's University Belfast
  14. Anonymous Member

    New understanding of brain chemistry could prevent brain damage after injury

    May 15, 2011

    A protective molecule has been identified in the brain which, if used artificially, may prevent brain damage from the likes of stroke, head injury and Alzheimer's.
    By looking at what happens in the brain after an injury, new research has finally ended speculation over whether a key molecule, 'KCC2', causes brain cell death after an injury or prevents it. The finding, published today (16th May 2011) in The Journal of Physiology, now opens the door to the development of artificial forms of the compound which could provide 'neuroprotection' to those who have suffered a brain injury – to prevent further damage.
    Lead author of the research Dr Igor Medina from the Université de la Méditerranée said: "Neuron damage can result from acute events such as stroke, epilepsy or head injury or by chronic degeneration found in Alzheimer's and Parkinson's.
    "When brain tissue is damaged, cells often continue to die after the initial stimulus has stopped. So it is important to find a way of stopping this cascade of cell death."
    KCC2 is known as a 'neuronal membrane transporter' and plays a valuable role in regulating brain cell growth and their connections with other neurons. Previous research has shown that the level of KCC2 drops drastically after the brain has been injured, but it was unknown whether this drop was causing the damage to the cells, or was decreasing because of it.
    "The destiny of neurons in a damaged brain depends on a tiny equilibrium between pro-survival and pro-death signals. We wanted to know what KCC2 was signalling for – was it killing neurons or protecting them after an injury? Our study has found that KCC2 actually rescues the damaged cells."
    The team studied damaged neurons from the hippocampus region of the brain, an area responsible for attention, spatial memory and navigation. They removed KCC2 altogether from damaged cells and found they died. But when they artificially increased the levels of KCC2 (by stimulating its expression using gene therapy), they found the damaged cells were protected from further damage, and death.
    Dr Medina continued: "The death of neurons in the brain can be triggered by an imbalance of oxygen – known as oxidative damage, or where cells are incorrectly instructed to die by a neurotransmitter – a process known as excitotoxicity. KCC2 protects against both. It's really encouraging that we have identified a means of potentially protecting the brain from these common conditions."
    Now the protective function of KCC2 is known, scientists can look at ways to maintain its levels in the brains of injured patients and prevent the cascade of damage that occurs. This could be achieved pharmaceutically, to naturally increase the levels of KCC2, or with gene therapy to introduce artificial KCC2.
    "Neuroprotective agents that may stem from this research would benefit the victims of car crashes, stroke and those suffering with epilepsy, Parkinson's and Alzheimer's – it's a major focus for further studies," concluded Dr Medina.
    More information: Knocking down of the KCC2 in rat hippocampal neurons increases intracellular chloride concentration and compromises neuronal survival. Christophe Pellegrino, Olena Gubkina, Michael Schaefer, Hélène Becq, Anastasia Ludwig, Marat Mukhtarov, Ilona Chudotvorova, Severine Corby, Yuriy Salyha, Sergey Salozhin, Piotr Bregestovski and Igor Medina. J Physiol 589, 2475-2496.
    Provided by Wiley
  15. Anonymous Member

    Strong magnetic fields for new insights into the brain

    May 13, 2011
    Siemens will install three powerful, high-field magnetic resonance tomography (MRT) systems at the University of Maastricht, the Netherlands, providing entirely new insights into the human brain. The MRTs are to be dedicated to the renowned research project Brains Unlimited, whose objective is to further investigate how the human brain functions. Siemens is delivering one of the world's most powerful MRT systems, with a magnetic field strength of 9.4 Tesla, as well as two systems of three and seven Tesla, respectively.
    Currently, Siemens is the only company capable of supplying a 9.4 Tesla MRT system for human research. Its magnetic field is nearly 200,000 times stronger than that of the earth, and significantly more powerful than the 1.5 Tesla of standard MRT devices used in clinical routine. The ultra-high-field system helps research scientists detect far more detail inside the human body and identify brain structures and functions that exist on a microscopic scale.
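    The field-strength comparison can be sanity-checked with quick arithmetic, assuming a typical Earth surface field of about 50 microtesla (the actual value varies roughly between 25 and 65 µT depending on location):

    ```python
    B_scanner = 9.4         # tesla, the ultra-high-field research system
    B_clinical = 1.5        # tesla, a standard clinical MRT
    B_earth = 50e-6         # tesla, assumed typical Earth surface field

    ratio_earth = B_scanner / B_earth        # 188,000 -- "nearly 200,000 times"
    ratio_clinical = B_scanner / B_clinical  # ~6.3x a routine clinical scanner
    ```

    With the assumed 50 µT baseline the ratio comes out to 188,000, consistent with the article's "nearly 200,000 times" figure.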
    Through these research studies, scientists hope to obtain greater insight into the causes of serious illnesses such as multiple sclerosis, Alzheimer’s disease, Parkinson’s disease and epilepsy, and into the growth of tumors. In addition, the causes of behavioral changes and of disorders such as difficulties in reading and writing (dyslexia) and attention deficit hyperactivity disorder (ADHD/ADD) will be investigated. Among further research projects, the university plans to use the systems to investigate how the structure of musicians’ ears differs from that of other people.
    The order was one of the largest of this type in the history of Siemens Healthcare in the Netherlands and includes the construction of a special building to accommodate the three MRT systems. The Brains Unlimited project is an initiative from the M-BIC (Maastricht Brain Imaging Center). The M-BIC is part of the Faculty of Psychology and Neuroscience and works closely together with brain scientists at Maastricht University Medical Center. Brains Unlimited is funded by the European Union, the Province of Limburg, and the Municipality of Maastricht. CAUTION – Investigational Device. Limited by Federal law to investigational use.
    Source: Siemens
  16. Natter Bored Member

    ScnTO to the white courtesy phone please.

    Kindly out-create these SPs by copying mindless LRH drivel, which will obsessively cause these pea-brained meat-brained idiots to worship the 1 true god, L. Ron Hubbard, he who causatively dropped his body without the assistance of psych meds, and is waiting patiently for us on Target Two.

    Dammit, my meds are running low, brb, gotta get more legal mind-enhancing pills.
  17. Anonymous Member

    Potential target for treating schizophrenia found

    May 11, 2011

    (Medical Xpress) -- Scientists at the University of Glasgow have identified a potential target for the treatment of schizophrenia.
    Schizophrenia is a mental condition in which individuals experience a range of symptoms, including auditory hallucinations, paranoid delusions and muddled thought or speech.
    It is one of the most common mental health conditions, affecting 2-4 people per 1,000 in the UK.
    It is widely believed that a special protein called DISC1, which plays a key role in the development of the brain cortex, may be a susceptibility factor for schizophrenia, as well as mood disorders and autism.
    The cortex is a part of the brain that plays a key role in memory, attention, awareness, thought, language and consciousness. While it is well-known that defects in this region are associated with schizophrenia, it is not understood how these defects develop.
    DISC1 is a so-called ‘signalling scaffold protein’ because it acts as a control centre by recruiting other types of proteins, attracting them to its surface where they generate and interpret signals able to control brain development and function.
    Professor Miles Houslay, of the Institute of Neuroscience & Psychology at the University of Glasgow, said: “While it is now well-recognised that DISC1 is a major susceptibility factor for these brain diseases, we still don’t understand enough about the range of processes it controls and how they go wrong in mental illness.”
    However, as reported in the latest edition of the journal Nature, the Glasgow team, working with colleagues from Johns Hopkins University, Duke University and Keio University, Tokyo, have shown that DISC1 acts as a molecular switch that controls two key stages in the development of the cortex.
    One stage involves how cells in the cortex multiply in development and the other stage relates to how brain cells migrate within the cortex to specific locations that allow for correct functioning.
    Prof Houslay added: “These processes are critical for normal brain function. However, as these new results show that DISC1 is a protein whose function can be dynamically regulated, it opens up the possibility of pharmaceutical and biotech companies designing new medicines able to correct defects in DISC1 that lead to the debilitating disease of schizophrenia.
    “Schizophrenia, mood disorders and autism cause great emotional and financial hardships for individuals, their families and for society as a whole. Because of this we desperately need to know what goes wrong in the brain that leads to these debilitating conditions.”
    More information: DISC1-dependent switch from progenitor proliferation to migration in the developing cortex, Nature, Volume: 473, Pages: 92–96, Date published: 05 May 2011. DOI: 10.1038/nature09859 http://www.nature. … re09859.html
    Abstract
    Regulatory mechanisms governing the sequence from progenitor cell proliferation to neuronal migration during corticogenesis are poorly understood. Here we report that phosphorylation of DISC1, a major susceptibility factor for several mental disorders, acts as a molecular switch from maintaining proliferation of mitotic progenitor cells to activating migration of postmitotic neurons in mice. Unphosphorylated DISC1 regulates canonical Wnt signalling via an interaction with GSK3β, whereas specific phosphorylation at serine 710 (S710) triggers the recruitment of Bardet–Biedl syndrome (BBS) proteins to the centrosome. In support of this model, loss of BBS1 leads to defects in migration, but not proliferation, whereas DISC1 knockdown leads to deficits in both. A phospho-dead mutant can only rescue proliferation, whereas a phospho-mimic mutant rescues exclusively migration defects. These data highlight a dual role for DISC1 in corticogenesis and indicate that phosphorylation of this protein at S710 activates a key developmental switch.
    Provided by University of Glasgow
  18. Anonymous Member

    The brain performs visual search near optimally

    May 8, 2011

    In the wild, mammals survive because they can see and evade predators lurking in the shadowy bushes.
    That ability translates to the human world. Transportation Security Administration screeners can pick out dangerous objects in an image of our messy and stuffed suitcases. We get out of the house every morning because we find our car keys on that cluttered shelf next to the door.
    This ability to recognize target objects surrounded by distracters is one of the remarkable functions of our nervous system.
    "Visual search is an important task for the brain. Surprisingly, even in a complex task like detecting an object in a scene with distracters, we find that people's performance is near optimal. That means that the brain manages to do the best possible job given the available information," said Dr. Wei Ji Ma, assistant professor of neuroscience at Baylor College of Medicine. A report on research by him and colleagues from other institutions appears online in the journal Nature Neuroscience.
    Recognizing the target is more than figuring out each individual object.
    "Target detection involves integrating information from multiple locations," said Ma. "Many objects might look like the target for which you are searching. It is a cognitive judgment as well as a visual one."
    One factor that must be taken into account is reliability of the information.
    "We study that in particular," said Ma. "If you are a detective, you weight different pieces of information based on the reliability of the source. Similarly, the brain has to weight different pieces of visual information."
    In his study, he and his colleagues used computer screens to show subjects sets of lines that might or might not contain a line oriented in a particular way. To manipulate reliability, they randomly varied the contrast of each line, making the target easier or more difficult to detect. Each screen was shown for only a fraction of a second, making the search task very difficult.
    "We found that even in this complex task, people came close to being optimal in detecting the target," he said. "That means that humans can in a split second integrate information across space while taking into account the reliability of that information. That is important in our daily lives."
    The task was deliberately made very hard so that people made mistakes, he said, but their answers were as good as they could be given the noise that is inherent to visual observations.
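    The reliability weighting Ma describes corresponds to a standard ideal-observer model in which each cue is weighted by its inverse variance (its precision). A minimal sketch with invented numbers, assuming independent Gaussian noise on each measurement:

    ```python
    import numpy as np

    def combine_cues(measurements, variances):
        # Inverse-variance weighting: low-variance (high-contrast, reliable)
        # cues contribute more to the combined estimate, and the combined
        # estimate is more precise than any single cue on its own.
        precisions = 1.0 / np.asarray(variances, dtype=float)
        weights = precisions / precisions.sum()
        estimate = float(np.dot(weights, measurements))
        combined_variance = float(1.0 / precisions.sum())
        return estimate, combined_variance

    # Three noisy orientation readings (degrees) with differing reliability
    est, var = combine_cues([10.0, 14.0, 30.0], [1.0, 4.0, 25.0])
    # The estimate stays close to the most reliable reading (10 degrees),
    # and the combined variance is below the best single-cue variance.
    ```

    This is the same logic as the detective analogy earlier in the article: each source of information is weighted by how trustworthy it is before the pieces are combined.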
    In the second part of their study, they determined that this ability might rely on groups (populations) of neurons that respond differently to different line orientations. Using such populations, they were able to construct a neural network that could weight information by the appropriate reliability.
    They simulated this task on the computer and reproduced the behavior of human subjects, giving credence to their argument that the task requires populations of neurons.
    "The visual system is automatically and subconsciously doing complex tasks," said Ma. "People see objects and how they relate to one another. We don't just see with our eyes. We see with our brains. Our eyes are the camera, but the process of interpreting the image in our brains is seeing."
    The next question is when does a visual task become so complex that the human brain fails to be optimal?
    More information: Nature Neuroscience: http://www.nature.com/neuro/index.html
    Provided by Baylor College of Medicine
  19. Anonymous Member

    Buenos 'notch-es': Universal signaling pathway found to regulate sleep

    May 5, 2011

    Sleeping worms have much to teach people, a notion famously applied by the children's show "Sesame Street," in which Oscar the Grouch often reads bedtime stories to his pet worm Slimy. Based on research with their own worms, a team of neurobiologists at Brown University and several other institutions has now found that "Notch," a fundamental signaling pathway found in all animals, is directly involved in sleep in the nematode C. elegans.
    "This pathway is a major player in development across all animal species," said Anne Hart, associate professor of neuroscience at Brown. "The fact that this highly conserved pathway regulates how much these little animals sleep strongly suggests that it's going to play a critical role in other animals, including humans. The genes in this pathway are expressed in the human brain."
    The work, to be published May 24 in the journal Current Biology, offers new insights into what controls sleep. The lead authors are Komudi Singh, a postdoctoral fellow in the Department of Neuroscience at Brown University, and Michael Chao, a former member of the Hart laboratory who is now an associate professor at California State University–San Bernardino.
    "We understand sleep as little as we understand consciousness," said Hart, the paper's senior author. "We're not clear why sleep is required, how animals enter into a sleep state, how sleep is maintained, or how animals wake up. We're still trying to figure out what is critical at the cellular level and the molecular level."
    Ultimately, Hart added, researchers could use that knowledge to develop more precise and safer sleep aids.
    "We only have some really blunt tools that we can use to change sleep patterns," she said. "But there are definite side effects to manipulating sleep the way we do now."
    Mysterious napping
    Hart first realized that Notch pathway genes might be important for sleep when her group was investigating an entirely different behavior. She was studying the effect of this pathway on the nematodes' aversion to a foul-smelling substance called octanol. What she found, and also reports in the Current Biology paper, is that adult nematodes lacking Notch pathway genes (like osm-11) have their Notch receptors turned off and, therefore, do not avoid octanol as normal worms do.
    But she was shocked to find that the adult nematodes in which the osm-11 gene was overexpressed were doing something quite bizarre. "Normally, adult nematodes spend all of their time moving," she said. "But these animals suddenly start taking spontaneous 'naps.' It was the oddest thing I'd seen in my career."
    Nematode sleep is not exactly the same as sleep in larger animals, but these worms do go into a quiescent sleep-like state when molting. The worms with too much osm-11 were dozing when they were not supposed to.
    Other experiments showed that worms lacking osm-11 and the related osm-7 genes were hyperactive, exhibiting twice as many body bends each minute as normal nematodes.
    The story became clear: the more Notch signaling was turned on, the sleepier the worms were. When it was suppressed, they went into overdrive and became too active.
    In humans, the gene that is most similar to osm-11 is called Deltalike1 (abbreviated DLK1). It is expressed in regions of the brain associated with the sleep-wake cycle.
    Beyond Notch
    That result alone is not enough to lead directly to the development of a new sleep drug, even for worms. Notch signaling is implicated in a lot of different activities in the body, Hart said, some of which should not be encouraged.
    "Too much Notch signaling can cause cancer, so we would have to be very targeted in how we manipulate it," she said. "One of the next steps we're going to take is to look at the specific steps in Notch signaling that are pertinent to arousal and quiescence."
    Focusing on those steps could minimize side effects, Hart said.
    Provided by Brown University (news : web)
  20. Anonymous Member

    Controlling brain circuits with light

    May 3, 2011

    F1000 Biology Reports, the open-access, peer-reviewed journal from Faculty of 1000, today published a historical account of the beginnings of the optogenetic revolution by Edward Boyden.
    Commenting on Edward Boyden's article, Ben Barres, Head of the Neuronal & Glial Cell Biology Section of Faculty of 1000 and Professor at Stanford University School of Medicine said: "There will probably be a Nobel prize for optogenetics someday as it has revolutionized our attempts to understand how the brain works. This article provides a fascinating insight into the birth of optogenetics and the roles of the major players."
    The invention of optogenetics literally sheds light on how our brains work. Published in the May 2011 issue of F1000 Biology Reports, Edward Boyden's revealing article gives a unique perspective on the birth of optogenetics tools, new resources for analyzing and engineering brain circuits. These 'tools' take the form of genetically encoded molecules that, when targeted to specific neurons in the brain, enable their activity to be driven or silenced by light, thus revealing how entire neural circuits operate.
    By driving or quieting the activity of defined neurons embedded within an intact neural network, Boyden and his colleagues are able to determine what behaviors, neural computations, or pathologies those neurons are sufficient to cause, and what brain functions, or pathologies, these neurons are necessary for.
    These tools are also being explored as components of neural control prosthetics capable of correcting neural circuit computations that have gone awry in brain disorders. Part of a systematic approach to neuroscience that is empowering new therapeutic strategies for neurological and psychiatric disorders, optogenetic tools are widely accepted as one of the technical advances of the decade, and could one day be used to treat neurological disorders such as Parkinson's disease.
    Using primary sources and his own experiences at Stanford, Boyden reconstructs a compelling case study of the development of optogenetic tools, providing an insight into the hard work and serendipity involved.
    Provided by Faculty of 1000: Biology and Medicine
  21. Anonymous Member

    A flash of insight

    April 29, 2011 By Anne Miller, Binghamton University
    Imagine never having seen a car before and trying to determine what makes the vehicle run. That’s how Christof Grewer begins to explain his research on tiny proteins in the brain.
    “We would be interested in seeing what happens when the car is moving, and we’d take pictures of that,” he says. “We’d see the pistons moving, and that would be the beginning of understanding.”
    Grewer, a biophysical chemist at Binghamton University, studies glutamate transport proteins, minuscule components of our brains that move glutamate among cells. Glutamate, an important molecule in cellular metabolism, is also a neurotransmitter.
    Scientists know the transport proteins are important, and they know they move glutamate in and out of cells through a sort of door in the cell membrane, known as a glutamate transporter. But exactly how the proteins trigger those doors in the cell membrane, and what makes them move glutamate to the inside or outside of a cell, is unknown.
    Learning how those triggers function could have major implications for human health. For example, during a stroke, when blood and oxygen to the brain are restricted, brain cells release glutamate into the space surrounding them. That starts a toxic chain that can kill brain cells and harm certain brain functions.
    Knowing how the glutamate molecules are transported through the cell membrane could one day lead to drugs that help or halt the transport.
    Grewer — one of perhaps two dozen researchers in the world who work on this problem — switches analogies as he continues describing the way these proteins move. Now he’s talking about a tall building.
    “People are transported in an elevator,” he says. “So in order for that to work, the door of the elevator has to open, and then the person has to step into the elevator. And then the elevator brings you to a higher floor, and then the door has to open, and the person has to walk out.”
    In this case, glutamate molecules are the people. The elevator cars are the glutamate transporters. And the electricity and wires that move elevator doors are — well, that’s what he’s trying to figure out.
    Grewer’s brainstorm was to create a method that uses lasers to trigger the transporters’ action. By controlling when the movement happens, he can document it.
    It all goes back to his analogy of photographing a car’s pistons. Taking snapshots may illuminate how the transporters and glutamate molecules work together.
    Scientific serendipity
    Grewer stumbled onto the glutamate transporters.
    When he was a graduate student in physical chemistry at Johann Wolfgang Goethe-University in Frankfurt, Germany, his research focused on chemistry and light. His introduction to biochemistry — and to glutamate receptors — came during a post-doctoral fellowship at Cornell University.
    “We were trying to activate these receptors on a very fast time scale,” he says. “It’s not that easy to do.”
    His background in chemistry and physics brought fresh insight to the lab. What if, he thought, a flash of light could help trigger the transport process? By timing the reactions, the researchers could better capture what happens during the glutamate transfer.
    “They were so interesting to me that I just had to stay with them,” Grewer says of glutamate transporters. “I thought, that is just the most amazing thing to study.”
    Most biochemical research on the brain focuses on possible cures, says Peter Larsson of the University of Miami. Many researchers experiment with known drugs to judge their effect on brain function.
    “In most proteins, and in biology these days, we know the genetic code, and we know what the DNA looks like, and we know how many proteins you have in your body,” Larsson says. “But we don’t really know how these proteins work, how they function.”
    What sets Grewer apart in this small community of researchers? “He’s pioneering using lasers,” Larsson says. “It had been used on other types of proteins, but nobody has used it in this type of study.”
    Blending research, teaching
    Grewer took his studies back to Germany for a few years before accepting a post at the University of Miami School of Medicine.
    “In the medical school community, there is more interest in the neuroscience,” Grewer says of his time in Miami. But he didn’t teach much, and he missed working with undergraduates.
    At Binghamton, Grewer teaches every semester.
    Donald Nieman, dean of the Harpur College of Arts and Sciences at Binghamton, says Grewer’s arrival in 2008 also created opportunities for interdisciplinary collaborations in biology and chemistry. “While the research Christof does is very specific and doesn’t replicate what others are doing,” Nieman says, “the basic science and techniques he is using mesh nicely with the work of several faculty members.”
    Grewer’s research, which is supported by the National Institutes of Health, is painstaking and full of dead ends. Results are years, and possibly decades, in the making. Frustration comes easily.
    But teaching tempers that frustration, Grewer says.
    “With the teaching, you see the outcome much more quickly,” he says. “When you give a lecture and have a student later come to you with a question and say, ‘This is the first time I’ve ever really understood that’ — that’s a very gratifying feeling that you don’t often have in the research.
    “Teaching gives you the strength to keep going with the research.”
    Provided by Binghamton University
  22. Anonymous Member

    Fuck off with logic already, I want to hear about the discovery of fire by LRH in ye olden days.
    • Like Like x 2
  23. Anonymous Member

    Lost in translation: Scientist studies the neural origins of speech disorders

    April 20, 2011 By Heather Wuebker
    “The act of speech involves coordination between auditory and motor functions in the brain,” says Greg Hickok, director of UCI’s Center for Cognitive Neuroscience. “Depending on exactly how the process misfires, the result can be speech errors, stuttering or auditory hallucinations.” Credit: Steve Zylius / University Communications
    It can be heart-wrenching to watch a loved one try to verbally express him- or herself after suffering stroke-induced brain damage known as conduction aphasia.
    The disorder stems from lesions that interfere with the neurological process of translating thought into speech, says UC Irvine cognitive neuroscientist Greg Hickok, and the interference is believed to occur in the Sylvian fissure dividing the brain’s parietal and temporal lobes.
    The same region, he says, could help explain why some people stutter and how schizophrenics can misinterpret their internal thoughts as external voices.
    In December, Hickok received a five-year, $3.2 million renewal grant from the National Institutes of Health to support his continued research on how neural abnormalities affect speech and language in stroke victims. The award supplements the $6.1 million he had already been awarded to advance understanding of the brain’s role in speech and how abnormalities can inhibit this process.
    “The act of speech involves coordination between auditory and motor functions in the brain,” Hickok says. “This is obvious in visuomotor tasks like reaching for a cup, where we use visual information about its shape and location to guide our reach. It’s less obvious in language, but studies have shown that in the same way, a word’s sound guides our speech.”
    The director of UCI’s Center for Cognitive Neuroscience, Hickok first began seeing this in action at a neural level 10 years ago when utilizing fMRI to study brain processes related to speech production. He noticed that, in addition to the expected motor regions, auditory areas of the brain “lit up,” or activated, when people named pictures – even if they thought about but didn’t actually vocalize the words for them.
    “Stroke-based research found that these activations reflected the critical involvement of auditory areas in speaking. When these regions are damaged, patients tend to struggle to come up with words, and when they do speak, they make a lot of errors,” says Hickok, professor of cognitive sciences.
    He has since been using fMRI and stroke-based methods to zero in on the Sylvian parietal-temporal region of the brain, in which he believes the regulation of auditory and motor processes occurs.
    “In people with schizophrenia or aphasia and those who stutter, the coordination between perception and production is dysfunctional, and it appears to be happening in the SPT region,” Hickok says. “Depending on exactly how the process misfires, the result can be speech errors, stuttering or auditory hallucinations.”
    With his renewed funding, he’ll further study both SPT mechanics and speech perception as a whole. While it’s generally accepted in the cognitive neuroscience community that auditory and motor functions work together, Hickok explains, the details are not well understood.
    Continued NIH support has been helping him close this knowledge gap. In a paper published Feb. 10 in Neuron, Hickok presented a new model of how these two processes operate, illustrating the SPT zone’s role in translating auditory perception into motor output.
    He has also created a multi-university consortium for aphasia research. Hickok’s hope is that by conducting studies and sharing findings, he and others will contribute to better therapies for people with brain damage, lesions or neural abnormalities.
    Provided by University of California, Irvine (news : web)
  24. Anonymous Member

    Tinnitus caused by too little inhibition of brain auditory circuits, study says

    April 18, 2011

    Tinnitus, a relentless and often life-changing ringing in the ears known to disable soldiers exposed to blasts, unwary listeners of too-loud music and millions of others, is the result of under-inhibition of key neural pathways in the brain's auditory center, according to scientists at the University of Pittsburgh School of Medicine in this week's early online edition of the Proceedings of the National Academy of Sciences. The discovery, which used a new technique to image auditory circuits using slices of brain tissue in the lab, points the way to drug development and effective treatment for a condition that currently has no cure.
    Prior research has shown that auditory circuits in the brain are more excitable in tinnitus sufferers, but until now it has not been clear whether that is due to hyperactivity of excitatory neural pathways, reduced activity of inhibitory ones, or a bit of both, explained senior investigator Thanos Tzounopoulos, Ph.D., assistant professor of otolaryngology and neurobiology, Pitt School of Medicine.
    "This auditory imbalance leaves the patient hearing a constant ringing, buzzing or other irritating noise even when there is no actual sound," he said. "Tinnitus drowns out music, television, co-workers, friends and family, and it profoundly changes how the patient perceives and interacts with the world."
    According to the American Tinnitus Association, tinnitus is the most common service-connected disability among veterans of the Iraq and Afghanistan conflicts. Of the 50 million who have experienced it, 16 million have symptoms severe enough to seek medical attention and 2 million tinnitus sufferers are unable to carry out day-to-day activities.
    To identify what goes wrong in the brain's auditory circuits, Dr. Tzounopoulos' team created tinnitus in a mouse model. While the rodent was sedated, one ear was exposed to 45 minutes of 116-decibel (dB) sound, equivalent to an ambulance siren. Intense noise exposure is thought to damage the cochlea, an inner-ear structure critical to the neural transmission of sound waves, and to cause clinically undetectable hearing loss.
    Several weeks later, the scientists confirmed the exposed mice had tinnitus by conducting startle experiments in which a continuous, 70dB tone was played for a period, then stopped briefly and then resumed before being interrupted with a much louder pulse.
    Mice with normal hearing could perceive the gap and, because they were aware something had changed, were less startled than mice with tinnitus, whose ear ringing masked the moment of silence in between the background tones.
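    The gap-startle logic can be scored with a simple suppression ratio. A minimal sketch, with hypothetical startle amplitudes and an illustrative classification threshold (neither taken from the paper):

```python
# Toy scoring of the gap-startle paradigm described above. The startle
# amplitudes below are hypothetical numbers, not data from the study.

def gap_suppression_ratio(startle_with_gap, startle_no_gap):
    """Ratio well below 1 means the silent gap pre-warned the animal and
    damped its startle; a ratio near 1 suggests the gap was masked."""
    return startle_with_gap / startle_no_gap

# Hypothetical mean startle amplitudes (arbitrary units):
normal = gap_suppression_ratio(startle_with_gap=40.0, startle_no_gap=100.0)
tinnitus = gap_suppression_ratio(startle_with_gap=95.0, startle_no_gap=100.0)

def likely_tinnitus(ratio, threshold=0.8):
    # The 0.8 cutoff is illustrative, not from the paper.
    return ratio >= threshold

print(likely_tinnitus(normal))    # False: gap detected, startle damped
print(likely_tinnitus(tinnitus))  # True: ringing masked the gap
```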
    The scientists then sought to determine what had gone wrong in the balance of excitation and inhibition of the auditory circuits in the affected mice. They established that an imaging technique called flavoprotein autofluorescence (FA) could be used to reveal tinnitus-related hyperactivity in slices of the brain. Experiments were performed in the dorsal cochlear nucleus (DCN), a specialized auditory brain center that is crucial in the triggering of tinnitus. FA imaging showed that the tinnitus group had, as expected, a greater response than the control group to electrical stimulation. Most importantly, despite local stimulation, DCN responses spread farther in the affected mice.
    Dr. Tzounopoulos' new experimental approach has resolved why tinnitus-affected auditory centers show increased responsiveness. After administering a variety of agents that block specific excitatory and inhibitory receptors and seeing how the brain center responded, his team determined that blocking an inhibitory pathway that produces GABA, an inhibitory neurotransmitter, enhanced the response in the region surrounding the DCN in the control brain slices more so than it did in the tinnitus slices.
    "That means that in tinnitus the DCN circuits are already 'disinhibited': their inhibition is already reduced," Dr. Tzounopoulos explained. "We couldn't block inhibition anymore to elevate the evoked response, like we could in the normal brain. And, when we blocked another inhibitory circuit mediated by the neurotransmitter glycine, or when we blocked excitatory pathways, there was no difference in the responses between the groups."
    This means that agents that increase GABA-mediated inhibition might be effective treatments for tinnitus, he added. Dr. Tzounopoulos' team is now trying to identify such drugs.
    Provided by University of Pittsburgh
  25. Anonymous Member

    SHUT UP SHUT UP SHUT UP

    Scientology can help you with something. Not sure what, though.
  26. Anonymous Member

    Anti-depressants boost brain cells after injury in early studies

    April 18, 2011

    Anti-depressants may help spur the creation and survival of new brain cells after brain injury, according to a study by neurosurgeons at the University of Rochester Medical Center.
    Jason Huang, M.D., and colleagues undertook the study after noticing that patients with brain injuries who had been prescribed anti-depressants were doing better in unexpected ways than their counterparts who were not taking such medications. Not only did their depression ease; their memory also seemed improved compared to patients not on the medication.
    "We saw these patients improving in multiple ways – their depression was improved, but so were their memory and cognitive functioning. We wanted to look at the issue more, so we went back to the laboratory to investigate it further," said Huang, associate professor of Neurosurgery and chief of Neurosurgery at Highland Hospital, an affiliate of the University of Rochester Medical Center.
    The team's findings were published online recently in the Journal of Neurotrauma.
    Huang said many patients who have a traumatic brain injury also experience depression – by some estimates, half of such patients are depressed. Doctors aren't sure whether the depression is a byproduct of the sudden, unfortunate change in circumstances that patients find themselves in, or whether the depression is a direct consequence of brain damage.
    Previous research by other groups indicated that anti-depressants help generate new brain cells and keep them healthy in healthy animals. That, together with the experience of his patients, led Huang to study the effects of the anti-depressant imipramine (also known as Tofranil) on mice that had injuries to their brains.
    Scientists found that imipramine boosted the number of neurons in the hippocampus, the part of the brain primarily responsible for memory. By one measure, mice treated with imipramine had approximately 70 percent more neurons after four weeks than mice that did not receive the medication.
    That change was borne out on behavioral tests as well. The team tested mice by using what scientists call a novel object recognition test. Like human infants, mice tend to spend more time sizing up objects that they haven't encountered before – or don't remember encountering – than they do objects that they've seen before. This gives scientists a way to measure a mouse's memory.
    The team found that mice that had been treated with imipramine had a better memory. They were more likely to remember objects they had seen previously and so spent more time exploring truly novel objects, compared to mice that did not receive the compound.
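    A standard way to quantify that preference is a discrimination index over exploration times. A minimal sketch, using hypothetical per-mouse times rather than data from the study:

```python
# Common scoring of the novel-object recognition test described above:
# discrimination index = (novel - familiar) / (novel + familiar).
# All exploration times below are hypothetical, not from the study.

def discrimination_index(t_novel, t_familiar):
    """+1 means only the novel object was explored; 0 means the mouse
    treated both objects as equally new (no memory of the familiar one).
    Times are seconds spent exploring each object."""
    return (t_novel - t_familiar) / (t_novel + t_familiar)

# Hypothetical per-mouse exploration times (seconds):
treated_mouse = discrimination_index(t_novel=30.0, t_familiar=10.0)
control_mouse = discrimination_index(t_novel=22.0, t_familiar=18.0)

print(round(treated_mouse, 2))  # 0.5: strong preference for the novel object
print(round(control_mouse, 2))  # 0.1: weaker memory of the familiar object
```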
    The benefits did not extend to the motor skills of the mice – a finding that parallels what neurosurgeons like Huang have seen in their patients on anti-depressants, who don't show improved mobility after use of the medications.
    Scientists aren't sure whether the drug helps spur the creation of more new neurons, or whether it helps newly created neurons survive – or both. Some of the team's evidence indicates that the drug helps immature stem cells develop into useful cells such as neurons and astrocytes, and travel to the exact areas of the brain where they're needed.
    In addition to sorting out those questions, investigators will try to identify the molecular pathway that prompts the brain to create more neurons in response to anti-depressants. The team suspects that a molecule known as BDNF or brain-derived neurotrophic factor may play a role.
    Huang notes that one of his mentors, co-author Douglas H. Smith, M.D., of the University of Pennsylvania, has found that a brain injury itself also seems to prompt the brain to create more brain cells, perhaps as a way to compensate for injury.
    "The brain has an intrinsic mechanism to repair itself to a certain extent," said Huang. "Our goal is to learn more about that mechanism and improve it, to help patients recover even more brain function than they can now, even with extensive work and rehabilitation."
    Some of Huang's work is based on his experiences treating soldiers and civilians while working for four months as a neurosurgeon with the U.S. Army Reserve in Iraq, as well as more than a decade of experience treating patients affected by incidents like motor vehicle accidents.
    He said that traumatic brain injury – an injury experienced by approximately 1.4 million Americans each year – must be treated aggressively. Often this involves surgery to relieve pressure on the brain, other procedures to protect the brain against immediate further injury, and then rehabilitation for months or years.
    "It's exciting that the study involves a drug that is already safe and approved by FDA and is used clinically. If we could add a medication to the treatment regimen – even a slight improvement would be a big gain for these patients. It's our hope that the work will ultimately make a difference in patient care," added Huang, who is also a scientist in the Center for Neural Development and Disease.
    Provided by University of Rochester Medical Center (news : web)
  27. Anonymous Member

    Researchers link alcohol-dependence impulsivity to brain anomalies

    April 15, 2011

    Researchers already know that alcohol dependence (AD) is strongly associated with impaired impulse control or, more precisely, the inability to choose large, delayed rewards rather than smaller but more immediate rewards. Findings from a study using functional magnetic resonance imaging (fMRI) to investigate the neural basis of impulsive choice among individuals with alcohol use disorders (AUDs) suggest that impulsive choice in AD may be the result of functional anomalies in widely distributed but interconnected brain regions that are involved in cognitive and emotional control.
    Results will be published in the July 2011 issue of Alcoholism: Clinical & Experimental Research and are currently available at Early View.
    "Individuals with AD score higher on questionnaires that measure impulsivity – for example, 'I act without thinking' – are less able to delay gratification, and are less able to inhibit responses," said Eric D. Claus, a research scientist with The Mind Research Network and first author of the study.
    Given that impulsive choice in AUDs has been associated with impairment of frontal cortical systems involved in behavioral control, Claus explained, this study was designed to examine the neural correlates of one specific aspect of impulsivity, the ability to delay immediate gratification and instead choose rewards in the future.
    "We investigated this choice process in individuals with alcohol use problems ranging from alcohol abuse to severe AD that required treatment," said Claus. "This is the largest study to date that has investigated the neural correlates of impulsive choice in AD, which enabled us to examine the full range of AUDs instead of only examining extreme group differences."
    Claus and his colleagues examined 150 individuals (103 males, 47 females) with various degrees of alcohol use. All of the participants completed a delay discounting task – during which two options were presented, a small monetary (e.g., $10) reward available immediately or a larger monetary reward (e.g., $30) available in time (e.g., two weeks) – while undergoing fMRI. Impulsive choice was defined as the selection of the more immediate option.
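    Choices in this kind of task are commonly modeled with a hyperbolic discount function, V = A / (1 + kD), where A is the reward amount, D the delay, and k the individual's discount rate. A minimal sketch using the task's $10-now-vs-$30-later framing; the k values are hypothetical, not estimates from this study:

```python
# Hyperbolic delay discounting: a steeper discount rate k means delayed
# rewards lose subjective value faster. The k values are illustrative.

def subjective_value(amount, delay_days, k):
    """Hyperbolic discount: V = A / (1 + k * D)."""
    return amount / (1.0 + k * delay_days)

def chooses_immediate(immediate, delayed, delay_days, k):
    """True if the immediate reward beats the discounted delayed reward,
    i.e., the 'impulsive choice' as defined in the study."""
    return immediate > subjective_value(delayed, delay_days, k)

# $10 now vs. $30 in two weeks, as in the task described above:
patient_k, impulsive_k = 0.05, 0.5  # hypothetical discount rates

print(chooses_immediate(10, 30, 14, patient_k))    # False: waits for $30
print(chooses_immediate(10, 30, 14, impulsive_k))  # True: takes $10 now
```

    Fitting k per participant from many such choices is what lets researchers relate discounting steepness to AUD severity.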
    "We showed two things," said Claus. "We replicated previous research by showing that AUD severity was associated with a greater tendency to discount future rewards. In addition, we showed that when individuals with more severe AUDs did delay gratification, they engaged the insula and supplementary motor area – regions involved in emotional processing and response conflict – to a greater degree than individuals with less severe AUDs. In summary, these findings suggest that the dysfunction in these regions is graded and increases as a function of AUD severity, rather than operating as an all-or-none function."
    "This work showed that the brains of alcoholics don't behave all that differently from the brains of non-alcoholics during delay discounting but that the alcoholic brain had to work harder when they chose the delayed reward," said Daniel W. Hommer, chief of the Section of Brain Electrophysiology & Imaging at the National Institute on Alcohol Abuse and Alcoholism. "Many different studies have shown similar results, that is, alcoholics have a greater increase in brain blood flow to perform the same task as non-alcoholics."
    "The current study suggests that the neural dysfunction underlying impulsive choice seems to increase with AD severity," added Claus. "Now that we know that this neural dysfunction is associated with impulsivity, the next steps are to determine whether this impulsivity predates the onset of AD and whether neural measures of impulsivity can predict who will respond best to particular types of treatment. Further, the particular neural dysfunction that we observed indicates that individuals with more severe AD may be more impulsive because their brain finds it aversive to delay gratification, not because it is rewarding to be impulsive. Clinicians might need to deal directly with the aversion of choosing future benefits over immediate ones."
    "The most important thing about this paper is that it leads you to question what people mean by impulsive behavior and how should it be measured," said Hommer. "The field has defined increased discounting of time – failure to delay gratification – as a good measure of impulsiveness, but the results reported in this paper say 'Wait a minute, delay discounting does not correspond to what is usually meant by impulsiveness.' Rather, brain activity during a delay discounting task looks more like how the brain responds during conflicted decision-making than it does during rapid, unconflicted choice of a highly valued goal." Hommer added that this sort of debate is important to researchers, forcing them to think more carefully about what they mean by impulsive choice.
    Provided by Alcoholism: Clinical & Experimental Research
  28. Anonymous Member

    Scientists discover 'thunder' protein that regulates memory formation

    April 14, 2011

    Researchers at Johns Hopkins have discovered in mice a molecular wrecking ball that powers the demolition phase of a cycle that occurs at synapses — those specialized connections between nerve cells in the brain — and whose activity appears critical for both limiting and enhancing learning and memory.
    The newly revealed protein, which the researchers named thorase after Thor, the Norse god of thunder, belongs to a large family of enzymes that energize not only neurological construction jobs but also deconstruction projects. The discovery is described in the April 15 issue of Cell.
    "Thorase is vital for keeping in balance the molecular construction-deconstruction cycle we believe is required for memory formation," explains Valina Dawson, professor of neurology and neuroscience in the Johns Hopkins Institute of Cell Engineering. "It's a highly druggable target, which, depending on whether you enhance or inactivate it, may potentially result in new treatments for autism, PTSD, and memory dysfunction."
    The enzyme is one of many AAA+ ATPases that drive the assembly of proteins needed to form specialized receptors at the surfaces of synapses. These receptors are stimulated by neighboring neurons, setting up the signaling and answering connections vital to brain function. The Hopkins team showed how thorase regulates the all-important complementary process of receptor disassembly at synapses, which ultimately tamps down signaling.
    Prolonged excitation or inhibition of these receptors — due to injury, disease, genetic malfunction or drugs — has been implicated in a wide array of learning and memory disorders.
    "Change in the strength of the connections between two nerve cells forms the basis of our ability to learn and remember," Dawson says. This phenomenon, called synaptic plasticity, depends upon a balanced alternation of excitation and inhibition of receptors, she adds.
    Using a powerful microscope to look at labeled neurons from the brains of mice, the scientists saw that thorase was concentrated in the synaptic regions of cells, leading them to focus studies on the protein interactions that happen there.
    First, they cut a protein aptly called GRIP1 — it acts as scaffolding to hold GluR2 receptors to the surface — into various chunks and combined it with thorase. Encouraged by the fact that thorase and the GRIP1 scaffold did indeed bind tightly, they teased out the physiology of that interaction in the presence of lots of thorase and then no thorase.
    They discovered that the more thorase, the quicker the scaffolding deconstructed and the faster the surface receptors decreased. Thorase causes GluR2 receptors and GRIP1 to release their hold on each other, and therefore the receptor's grip at the surface of the synapse, they concluded.
    To see if the deconstruction of the protein complex had any effect on nerve-signaling processes, they again used cells to record receptor activity by measuring electric currents as they fluxed through cells with and without thorase. In the presence of extra thorase, surface receptor expression was decreased, resulting in reduced signaling.
    Next, the team measured the rates of receptor recycling by tagging the protein complex with a fluorescent marker. It could then be tracked as it was subsequently reinserted back into the surface membrane of a cell. In cells in which thorase was knocked out, there was very little deconstruction/turnover compared to normal cells. The scientists reversed the process by adding back thorase.
    Finally, the team conducted a series of memory tasks in order to compare the behaviors of normal mice with those genetically modified to lack thorase. When the animals lacking thorase were put into a simple maze, their behaviors revealed they had severe deficits in learning and memory.
    "Mice lacking thorase appear to stay in a constant state of stimulation, which prevents memory formation," Dawson explains. "Their receptors get up to the membrane where they are stimulated, but they aren't being recycled if thorase isn't present. If thorase doesn't stop the excitation by recycling the receptor, it continues on and has deleterious effects."
    More information: Cell: http://www.cell.com/current
    Provided by Johns Hopkins Medical Institutions
  29. Anonymous Member

    Rising star of brain found to regulate circadian rhythms

    April 14, 2011

    The circadian system that controls normal sleep patterns is regulated by a group of glial brain cells called astrocytes, according to a study published online on April 14th in Current Biology, a Cell Press publication. Neuroscientists from Tufts University School of Medicine found that disruption of astrocyte function in fruit flies (Drosophila) led to altered daily rhythms, an indication that these star-shaped glial cells contribute to the control of circadian behavior. These results provide, for the first time, a tractable genetic model to study the role of astrocytes in circadian rhythms and sleep disorders.
    According to the National Institute of Neurological Disorders and Stroke, more than 40 million Americans suffer from sleep disorders. Some sleep disorders arise from changes to the internal clock that is modulated by environmental signals, including light. Biologically, the internal clock is known to be composed of a network of neurons that controls rhythmic behaviors. Rob Jackson and his team previously had found that normal circadian rhythms require a glial-specific protein. In the new study, the team demonstrates that glia, and particularly astrocytes, are active cellular elements of the neural circuit that controls circadian rhythms in the adult brain.
    "This is significant because glia have been traditionally viewed as support cells rather than independent elements that can regulate neurons and behavior. Neurons have had center stage for some time but current research is establishing the role of glial cells in brain function," said Rob Jackson, PhD, professor of neuroscience at Tufts University School of Medicine (TUSM) and member of the genetics and neuroscience program faculties at the Sackler School of Graduate Biomedical Sciences at Tufts. Jackson is also the director of the Center for Neuroscience Research (CNR) at TUSM.
"We used cellular and molecular genetic techniques to manipulate glial cells in the adult brain of fruit flies and found that such cells regulate neurons of the circadian network and behavior," said first author Fanny Ng, PhD, a postdoctoral associate in the Jackson lab. Ng added, "This is the first study to show that glia can modulate the release of a neuronal factor that is essential for normal circadian behavior."
    Jackson's team observed altered rhythms in locomotor activity with glial manipulations, an indication the circadian clock had been disrupted, which in humans can contribute to jet lag or serious sleep disorders.
    "In order to develop treatments for these disorders, we need to understand their cellular and molecular bases. Our work suggests that Drosophila can serve as a model system for genetic and molecular approaches to understand astrocyte function and astrocyte-neuron interactions. This undoubtedly will contribute to a better understanding of sleep and other neurological disorders that result from circadian dysfunction," said Jackson.
    More information: Ng FS, Tangredi MM, and Jackson FR. Current Biology. "Glial cells physiologically modulate clock neurons and circadian behavior in a calcium-dependent manner." DOI 10.1016/j.cub.2011.03.027
    Provided by Tufts University
  30. Anonymous Member

    Blood vessel simulation probes secrets of brain

    April 14, 2011 By Louise Lerner
    bloodvessels.jpg
    Enlarge
Newer, faster supercomputers have allowed scientists to create detailed models of blood flow that help doctors understand what happens at the molecular level. Credit: Flickr
    (PhysOrg.com) -- Zoom down to one artery in your body, and the commotion is constant: blood cells hurtle down the passage with hundreds of their kin, bumping against other cells and the walls as they go. The many variables -- and the sheer immensity of the human circulatory system—have kept scientists from closely documenting the rough-and-tumble life inside blood vessels.
    This is an area of science called "biophysics", for the forces that govern red blood cells' movements at this level are best described by the laws of physics and can be mapped with mathematics. That's exactly what a team of scientists from Brown University led by G. E. Karniadakis and the U.S. Department of Energy's (DOE) Argonne National Laboratory are doing on the lab's supercomputer, hoping that a better map will lead to better diagnoses and treatments for patients with blood flow complications.
    Though we've come a long way from the ancient Greeks, who believed blood came from the liver, there's a surprising amount that we don't know about blood. Newer, faster supercomputers have allowed scientists to create detailed models of blood flow that help doctors understand what happens at the molecular level and, consequently, how heart and blood diseases can be treated.

    Argonne's Blue Gene/P supercomputer, housed at the Argonne Leadership Computing Facility (ALCF), allows scientists to tackle these immense problems with the power of 500 trillion calculations per second.
    One part of the study is mapping exactly how red blood cells move through the brain. For example, last year the team used similar modeling to discover that the malaria parasite makes its victims' red blood cells 50 times stiffer than normal.
    Healthy red blood cells are smooth and elastic; they need to squeeze and bend through tiny capillaries to deliver blood to all areas of the brain. But malaria-infected cells stiffen and stick to the walls, creating blockages in arteries and vessels. Malaria victims die because their brain tissues are deprived of oxygen. A more complete picture of how blood moves through the brain would allow doctors to understand the progression of diseases that affect blood flow, like malaria, diabetes and HIV.
    "Previous computer models haven't been able to accurately account for, say, the motion of the blood cells bending or buckling as they ricochet off the walls," said Joe Insley, a principal software developer at Argonne who is working with the team. "This simulation is powerful enough to incorporate that extra level of detail."
    Another part of the study seeks to understand the relationship between cerebrospinal fluid and blood flow in the brain. "Blood vessels expand if blood pressure is high; and since they are located between brain tissues, this can put dangerous pressure on the brain," said Leopold Grinberg, a Brown University scientist on the team. In healthy people, spinal fluid can drain to relieve pressure on brain tissues, but occasionally the system breaks down—leaving the brain vulnerable to damage.
    "Understanding how the system interacts will allow us to more accurately treat the problem," Grinberg said.
    But before the simulations are even run, there's a hurdle that researchers must face.
    It is a peculiarity of large computers that code for one computer doesn't always work well on another. A code written for a computer with two cores—what's probably in your home computer—doesn't translate well into a computer that has 160,000 cores, as Argonne's Blue Gene/P does.
    "I liken it to driving the family car on the Daytona 500 racetrack," explained Michael Papka, deputy associate laboratory director for Computing, Environment and Life Sciences at Argonne. "What may be suitable for driving around town isn't designed for high-speed racing. The ALCF staff helps the researchers rework their code for optimal performance on the big machines."
    For example, each core, performing its own small slice of the work, has to transmit its data to other cores once it finishes a particular task. If the work isn't equally distributed, some cores might finish earlier than others and sit idle as they wait for the others to catch up. Or if the network connecting them isn't well managed, the transmission of data might slow down the whole process.
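The load-balancing problem described above can be sketched in a few lines. This is a generic illustration of why uneven work assignment hurts, not the team's actual code; the task sizes and the greedy heuristic are assumptions for the example:

```python
import heapq

def makespan(assignments):
    # Wall-clock time equals the busiest core's total work; other cores
    # finish early and sit idle waiting for it.
    return max(sum(core) for core in assignments)

def block_split(tasks, n_cores):
    # Naive split: hand each core a contiguous block, ignoring task cost.
    size = (len(tasks) + n_cores - 1) // n_cores
    return [tasks[i:i + size] for i in range(0, len(tasks), size)]

def balanced_split(tasks, n_cores):
    # Classic greedy heuristic: give the next-largest task to whichever
    # core currently has the least work.
    heap = [(0, i) for i in range(n_cores)]
    heapq.heapify(heap)
    cores = [[] for _ in range(n_cores)]
    for t in sorted(tasks, reverse=True):
        load, i = heapq.heappop(heap)
        cores[i].append(t)
        heapq.heappush(heap, (load + t, i))
    return cores

tasks = [9, 9, 1, 1, 1, 1, 1, 1]            # two big tasks, six small ones
print(makespan(block_split(tasks, 2)))       # 20: one core gets both big tasks
print(makespan(balanced_split(tasks, 2)))    # 12: big tasks split across cores
```

With 160,000 cores instead of 2, the same imbalance multiplies into enormous wasted machine time, which is why code must be retuned for each machine.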
    Because each supercomputer is individually designed, the Blue Gene/P's architecture is different from other supercomputers.
    "For example, one of the Blue Gene/P's strengths is good interconnects," said Vitali Morozov, a computational scientist at the ALCF. "The cores are beautifully arranged, and if you know how to use them it's very efficient—but it's tricky." Thus, to get the best performance out of the machine, the code has to be tuned to fit the computer.
    The team was allotted 50 million processor-hours on the Blue Gene/P through DOE's Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. INCITE is a DOE program supported by the Office of Science's Office of Advanced Scientific Computing Research that provides access to computing power and resources to support computationally intensive, large-scale research projects to researchers from industry, academia and government research facilities.
    Provided by Argonne National Laboratory (news : web)
  31. Anonymous Member

    Neurons play role in controlling innate immunity in presence of pathogens

    April 13, 2011 by Mary Jane Gore

A preclinical study published in Science on April 7 provides the first definitive proof of which sensory neurons control innate (inborn and immediate) immunity in a pathogen’s presence.
“Our studies have implications for the understanding of the effect of anger or acute stress on immunity, because acute stress suppresses immunity,” said senior author Alejandro Aballay, Ph.D., Associate Professor in Molecular Genetics and Microbiology. “Our discovery -- that inactivation of a receptor for a noradrenaline-like molecule that is involved in the arousal response results in enhanced immunity -- suggests that there is a strong selective advantage to suppressing the immune response during stress.”
    The OCTR-1 receptor in the two neurons in question (ASH and ASI) is a catecholamine receptor (G-protein-coupled catecholamine receptor) similar to vertebrate adrenergic receptors, Aballay said. “OCTR-1 functions in sensory neurons that have the potential to sense pathogens or molecules related to inflammation, suggesting that host catecholamines regulate immune responses in response to changes in the surrounding environment.”
    In addition, the scientists found that the nervous system controls a family of genes that are part of a surprising unfolded protein response (UPR) pathway that is controlled by a different receptor, an apoptotic receptor that has the potential to sense the damage caused by infecting microorganisms. Aballay said that this work dovetails with the concept of endurance – how a body learns to co-exist with a pathogen, by igniting not only immune pathways that keep the pathogen in check but also pathways that control the damage induced by the infection.
    In this study, exposure to the dangerous bacterium Pseudomonas aeruginosa up-regulates genes to levels comparable to those induced by a well-known stressor of the endoplasmic reticulum (ER), a drug called tunicamycin. This study is the first direct demonstration that a bacterial infection can activate a non-canonical UPR pathway to alleviate the ER stress that occurs in the cells during innate immune response against bacterial infections, Aballay said. “We have been able to take advantage of the simple and well-studied nervous and immune systems of C. elegans to demonstrate that two neurons, which have the potential to sense disease, regulate conserved innate immunity pathways and control the organismal response to bacterial infections,” Aballay said.
    Recent mouse studies show that the nervous system, through the animal organ that can detect pheromones, for example, has the potential to “smell” molecules related to disease or inflammation in the outside world. Given the complexity of the mammalian nervous and immune systems, the role that such a “smelling” mechanism may have on a host’s response to infections remains unknown, but will be a future area for study, Aballay said.
    Provided by Duke University (news : web)
  32. Anonymous Member

    Magnetic fields prevent editor from talking (w/ video)

April 12, 2011 by Lisa Zyga
    tms.jpg
    Enlarge
    In this image from the video below, New Scientist editor Roger Highfield demonstrates the impact of TMS. Image credit: New Scientist.
(PhysOrg.com) -- By holding an electromagnet close to a person’s skull, researchers can alter the neuron activity in the person’s brain. This technique, called transcranial magnetic stimulation (TMS), can be used for a variety of reasons, such as improving visual memory, impairing the brain’s ability to make moral judgments, and treating ADHD and severe depression. To demonstrate the kind of immediate and powerful impact that TMS can have, New Scientist editor Roger Highfield tried to recite the nursery rhyme "Humpty Dumpty," but found that his speech was interrupted by a magnetic field.
    In the video below, Vincent Walsh from the Institute of Cognitive Neuroscience at University College London uses magnets to turn off the speech center in Highfield’s brain for a fraction of a second. Walsh also demonstrates the method on himself.

    TMS inhibits the speech center in New Scientist editor Roger Highfield's brain. The loud clicking sounds are caused by rapid deformation of the TMS coil. Video credit: New Scientist.
    As this demonstration implies, TMS is generally considered to be safe. Although there have been a few cases of fainting and seizures, the risk is considered very low.
    When TMS is applied to most areas of the brain, participants do not consciously experience any effect, although their behavior changes. One exception is that, when TMS is applied to the visual cortex, participants may see flashes of light.
    Walsh and his colleagues are investigating how TMS can be used to treat migraines and strokes. As he explains in the video, sometimes migraines are caused by too much activity in the visual brain area, and sometimes by too little activity. TMS could potentially balance this activity out. If a person feels a migraine coming on, they could put electrodes on their head that provide very small currents to the brain to reduce pain for up to 90 minutes at a time.
More information: via New Scientist and PopSci
  33. Anonymous Member

  34. Anonymous Member

    Mapping the brain: New technique poised to untangle the complexity of the brain

    April 10, 2011
    attwellbrain.jpg
    The gold colour shows information superhighways in the brain: the gold is a protein making up myelin, which speeds the conduction of electrical signals along nerve cells, allowing us to think more quickly. Courtesy of Professor David Attwell (UCL Neuroscience, Physiology & Pharmacology)
    (PhysOrg.com) -- Scientists have moved a step closer to being able to develop a computer model of the brain after developing a technique to map both the connections and functions of nerve cells in the brain together for the first time.
A new area of research known as 'connectomics' is emerging in neuroscience. With parallels to genomics, which maps our genetic make-up, connectomics aims to map the brain's connections (known as 'synapses'). By mapping these connections – and hence how information flows through the circuits of the brain – scientists hope to understand how perceptions, sensations and thoughts are generated in the brain and how these functions go wrong in diseases such as Alzheimer's disease, schizophrenia and stroke.
    Mapping the brain's connections is no trivial task, however: there are estimated to be one hundred billion nerve cells ('neurons') in the brain, each connected to thousands of other nerve cells – making an estimated 150 trillion synapses. Dr Tom Mrsic-Flogel, a Wellcome Trust Research Career Development Fellow at UCL (University College London), has been leading a team of researchers trying to make sense of this complexity.
    "How do we figure out how the brain's neural circuitry works?" he asks. "We first need to understand the function of each neuron and find out to which other brain cells it connects. If we can find a way of mapping the connections between nerve cells of certain functions, we will then be in a position to begin developing a computer model to explain how the complex dynamics of neural networks generate thoughts, sensations and movements."
Nerve cells in different areas of the brain perform different functions. Dr Mrsic-Flogel and colleagues focus on the visual cortex, which processes information from the eye. For example, some neurons in this part of the brain specialise in detecting the edges in images; some will activate upon detection of a horizontal edge, others by a vertical edge. Higher up in the visual hierarchy, some neurons respond to more complex visual features such as faces: lesions to this area of the brain can prevent people from being able to recognise faces, even though they can recognise individual features such as the eyes and nose, as was famously described in the book The Man Who Mistook His Wife for a Hat by Oliver Sacks.
    In a study published online today in the journal Nature, the team at UCL describe a technique developed in mice which enables them to combine information about the function of neurons together with details of their synaptic connections.
    The researchers looked into the visual cortex of the mouse brain, which contains thousands of neurons and millions of different connections. Using high resolution imaging, they were able to detect which of these neurons responded to a particular stimulus, for example a horizontal edge.
Taking a slice of the same tissue, the researchers then applied small currents to a subset of neurons in turn to see which other neurons responded – and hence which of these were synaptically connected. By repeating this technique many times, the researchers were able to trace the function and connectivity of hundreds of nerve cells in the visual cortex.
    The study has resolved the debate about whether local connections between neurons are random – in other words, whether nerve cells connect sporadically, independent of function – or whether they are ordered, for example constrained by the properties of the neuron in terms of how it responds to particular stimuli. The researchers showed that neurons which responded very similarly to visual stimuli, such as those which respond to edges of the same orientation, tend to connect to each other much more than those that prefer different orientations.
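The finding, that like-tuned neurons connect to each other more often than differently tuned ones, can be caricatured with a toy probability model. The shape of the function and every number below are hypothetical illustrations, not the paper's measured connection rates:

```python
import math

def connect_prob(d_theta_deg, baseline=0.05, peak=0.25, sigma=20.0):
    # Toy model: the probability that two neurons are synaptically
    # connected decays as the difference in their preferred edge
    # orientation grows (all parameters are hypothetical).
    return baseline + (peak - baseline) * math.exp(-(d_theta_deg / sigma) ** 2)

same = connect_prob(0)    # like-tuned pair: 0.25
ortho = connect_prob(90)  # orthogonally tuned pair: ~0.05 (baseline)
print(same > ortho)       # True: similar tuning, more connections
```

A purely random wiring rule would correspond to a flat function, i.e. the same connection probability regardless of orientation difference; the study ruled that out.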
    Using this technique, the researchers hope to begin generating a wiring diagram of a brain area with a particular behavioural function, such as the visual cortex. This knowledge is important for understanding the repertoire of computations carried out by neurons embedded in these highly complex circuits. The technique should also help reveal the functional circuit wiring of regions that underpin touch, hearing and movement.
    "We are beginning to untangle the complexity of the brain," says Dr Mrsic-Flogel. "Once we understand the function and connectivity of nerve cells spanning different layers of the brain, we can begin to develop a computer simulation of how this remarkable organ works. But it will take many years of concerted efforts amongst scientists and massive computer processing power before it can be realised."
    The research was supported by the Wellcome Trust, the European Research Council, the European Molecular Biology Organisation, the Medical Research Council, the Overseas Research Students Award Scheme and UCL.
    "The brain is an immensely complex organ and understanding its inner workings is one of science's ultimate goals," says Dr John Williams, Head of Neuroscience and Mental Health at the Wellcome Trust. "This important study presents neuroscientists with one of the key tools that will help them begin to navigate and survey the landscape of the brain."
    Provided by Wellcome Trust (news : web)
  35. Anonymous Member

    Human taste cells regenerate in a dish

    April 6, 2011

    Following years of futile attempts, new research from the Monell Center demonstrates that living human taste cells can be maintained in culture for at least seven months. The findings provide scientists with a valuable tool to learn about the human sense of taste and how it functions in health and disease.
    This advance ultimately will assist efforts to prevent and treat taste loss or impairment due to infection, radiation, chemotherapy and chemical exposures.
    "People who undergo chemotherapy or radiation therapy for oral cancer often lose their sense of taste, leading to decreased interest in food, weight loss, and malnutrition," said lead author M. Hakan Ozdener, M.D., Ph.D., M.P.H., a cellular biologist at Monell. "The success of this technique should provide hope for these people, as it finally provides us with a way to test drugs to promote recovery."
    Taste cells are found in papillae, the little bumps on our tongues. These cells contain the receptors that interact with chemicals in foods to allow us to sense sweet, salty, sour, bitter, and umami. They also are among the few cells in the body with the special capacity to regenerate, with new taste cells maturing from progenitor 'stem' cells every 10-14 days.
    For many years it was believed that taste cells needed to be attached to nerves in order to both function properly and regenerate. For this reason, scientists thought that it was not possible to isolate and grow these cells in culture, which limited the scope of studies to understand how human taste cells function.
"It had become engrained in the collective consciousness that it wouldn't work," said Monell cellular biologist Nancy E. Rawson, Ph.D.
    To dispel the long-held belief, the Monell scientists first demonstrated in 2006 that taste cells from rats could successfully be maintained in culture. In the current study, published online in the journal Chemical Senses, they then applied that methodology to a more clinically relevant population – humans.
    Taking tiny samples of tongue tissue from human volunteers, the researchers first adapted existing techniques to demonstrate that the human taste cells indeed can regenerate in culture.
    They went on to show that the new taste cells were functional, maintaining key molecular and physiological properties characteristic of the parent cells. For example, the new cells also were activated by sweet and bitter taste molecules.
    "By producing new taste cells outside the body, our results demonstrate that direct stimulation from nerves is not necessary to generate functional taste cells from precursors," said Ozdener.
    The establishment of a feasible long-term taste cell culture model opens a range of opportunities to increase understanding of the sense of taste.
    "Results from these cells are more likely to translate to the clinic than those obtained from other species or from systems not derived from taste tissue," said Rawson.
    The cells also can be used to screen and identify molecules that activate the taste receptors; one such example might be a salt replacer or enhancer.
    "The model will help scientists identify new approaches to design and establish cell culture models for other human cells that previously had resisted viable culture conditions," said Ozdener.
    Provided by Monell Chemical Senses Center (news : web)
  36. Anonymous Member

    Taste perception of bitter foods depends on genetics

    April 4, 2011
    tastepercept.jpg
    Grapefruit juice was one of the bitter foods used in taste research done by John Hayes, assistant professor of food science.
    (PhysOrg.com) -- How we perceive the taste of bitter foods -- and whether we like or dislike them, at least initially -- depends on which versions of taste-receptor genes a person has, according to a researcher in Penn State's College of Agricultural Sciences.
    Those genes affect dietary choices, such as whether we eat enough vegetables, drink alcoholic beverages or enjoy citrus fruits. "Just like some people are color blind, some people are taste blind and simply can't taste bitter things that others can," said John Hayes, assistant professor of food science.
    In a collaborative study that began when he was still a graduate student, Hayes and his colleagues at the University of Connecticut, the University of Florida and Brown University showed that how people perceive bitter tastes predicts their food choices. Work by the team and others suggests there is an unusually high level of variation in bitter-taste perception across people.
Published in the March issue of the journal Chemical Senses, the research was funded by grants from the National Institutes of Health and the U.S. Department of Agriculture.
    "In the early 1990s, researchers used bitter probes to identify individuals who experience all tastes and oral sensations more intensely, and thus the concept of supertasters was born," Hayes explained. "More recently, we have learned humans have 25 different bitter-taste genes, and it seems each one is tuned to pick up a different group of chemicals."
    "This study moves us beyond the one-size-fits-all approach," he said. "It turns out that different bitter foods act through different receptors, and people can be high or low responders for one but not another. Thus, you may despise grapefruit but have no problem with black coffee."
    Hayes and his colleagues tested approximately 100 healthy adults, primarily of European ancestry, in a laboratory setting. Each subject participated in two or three sessions that were each two hours long. They tasted and carefully rated the bitterness of grapefruit juice, alcohol (Scotch whiskey) and espresso coffee and provided detailed diet histories and DNA samples.
    Hayes pointed out that there are good reasons why human bitter-taste receptors are so refined -- because many things that are bitter also are toxic. "Through time we wanted to avoid them," he said. "There have been thousands of years of evolutionary pressure to avoid bitter compounds, since most are dangerous for us.
    "Being able to avoid bitter plant toxins gave our ancestors an evolutionary advantage."
    The research showed that people can be really sensitive to the bitterness of grapefruit juice, but not at all sensitive to alcohol, and vice-versa, Hayes noted.
    "Those bitter tastes are sensed through different pathways," he explained. "And this doesn't affect just bitterness. Since bitter and sweet are in opposition in the brain, if you experience more bitterness from a food, you also perceive less sweetness. This means not all foods taste the same to all people."
    Previous studies have shown that variations in sensing bitter taste influence people's diet choices, and subsequently their health. For example, people who are more sensitive to bitterness eat 25 percent fewer vegetables, Hayes noted. Because they eat fewer vegetables, they are at greater risk for colon cancer.
    Some of these genes also relate to alcohol abuse. "If you find alcohol to be really bitter initially, it is less likely you will become alcohol dependent," he said.
    While his study did not measure finicky eating, Hayes contended it still may provide new insight into pickiness. "Some people may not be acting whiny when they say they don't like certain foods -- they actually experience those foods differently," he said.
    Hayes is hoping to use bitter-taste research as a springboard to a better understanding of other aspects of food perception. "Bitterness is only one example of genetic differences that may alter sensations from food and influence liking," he said. "Our team also is interested in differences in sweetness perceptions and oral astringency -- that drying, puckering you get from strong tea and red wine.
    "We also are focusing on the burning sensations you get from spicy foods."
    Provided by Pennsylvania State University (news : web)
  37. Anonymous Member

    'Bionic eye' implant offers hope to the blind

    April 3, 2011 by Kerry Sheridan
    Elias Konstantopoulos runs through an optics test with his "bionic" eyeglasses during a session at the Lions Vision Research and Rehabilitation Center at Johns Hopkins University in Baltimore, Maryland. Konstantopoulos is blind, but working with Johns Hopkins University he has been implanted with a microchip and given a set of glasses that enable him to distinguish between light and dark.
    For a man whose view of the world has slowly faded to black over 30 years, a device that allows him to see flashes of light has enkindled his hope of one day gazing upon his grandson's face.
    A career electrician who grew up in Greece and came to the United States as a young man, Elias Konstantopoulos first noticed his vision getting poorer when at age 43 he absentmindedly tried on a relative's eyeglasses and found he could see more clearly with them than without.
    Soon after, he visited a doctor who tested his sight and discovered he was no longer able to see his outstretched arms from the corners of his eyes. His peripheral vision was deteriorating.
    He was diagnosed with an incurable condition known as retinitis pigmentosa, which affects about 100,000 people, or one in 3,000, in the United States.
    A leading form of hereditary blindness, the disease gradually eats away at the retina's rods and cones, which are photoreceptors that help people see light and identify color and detail.
    About 10 years later, he could no longer see well enough to keep working.
    "You lose your sight, you pretty much lose everything," said Konstantopoulos, who is now 72 and lost his final bit of vision about five years ago.
    When his doctor asked in 2009 if he would like to join a three-year trial of a futuristic technology involving an electrode array in his eye and a wireless camera mounted on a pair of glasses, Konstantopoulos was eager to take part.
    Now, every morning he puts on the glasses, straps a wireless device to his waist and stands by the window or out in the yard waiting to hear the sound of a car approaching. When it passes, he says he can see a block of light go by.
    He can also distinguish light-colored objects against dark backgrounds, and he can orient himself in a room by being able to see where there is an open window or door letting the sun in from outside.
    The device, known as the Argus II, is made by a California company called Second Sight. It was recently approved for use in Europe, and in the United States it has given a handful of test patients like Konstantopoulos cause for optimism.
    "Without the system, I can't see anything. With the system, it's some kind of hope. Something is there," he said.
    "Later on, who knows with technology what it can do? Everything comes little by little."
    The device is similar to the cochlear implants that have allowed hundreds of thousands of deaf people to hear again, and is part of a growing field known as neuromodulation, or the science that helps people regain lost abilities such as sight, hearing and movement by stimulating the brain, spinal cord or nerves.
    Ear implants work by picking up sound through a tiny microphone, then converting those signals into electrical impulses and sending them to an electrode array implanted in the patient. The electrodes gather the impulses and pass them to the auditory nerve, which relays them to the brain as sound.
    The retinal prosthesis follows a similar process. A tiny video camera on the glasses captures images and converts them into electrical signals that are fed to an electrode array that is surgically implanted in the patient's eye.
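    As an illustration of that camera-to-electrode conversion, here is a minimal, hypothetical sketch. The 60-electrode count comes from the article's description of the Argus II; the 6x10 grid layout and the linear brightness-to-current mapping are illustrative assumptions, not Second Sight's actual processing.

    ```python
    # Hypothetical sketch of a camera-to-electrode pipeline like the one
    # described above. The 60-electrode count is from the article; the 6x10
    # grid and the linear brightness-to-current mapping are assumptions.

    def frame_to_stimulation(frame, grid_rows=6, grid_cols=10, max_current=1.0):
        """Downsample a grayscale frame (2D list of 0-255 values) to one
        stimulation level per electrode by block-averaging."""
        h, w = len(frame), len(frame[0])
        bh, bw = h // grid_rows, w // grid_cols
        levels = []
        for r in range(grid_rows):
            row = []
            for c in range(grid_cols):
                block = [frame[y][x]
                         for y in range(r * bh, (r + 1) * bh)
                         for x in range(c * bw, (c + 1) * bw)]
                mean = sum(block) / len(block)
                row.append(max_current * mean / 255.0)  # brightness -> current
            levels.append(row)
        return levels

    # A bright vertical bar on a dark background: 36x60 grayscale frame.
    frame = [[255 if 20 <= x < 40 else 0 for x in range(60)] for y in range(36)]
    levels = frame_to_stimulation(frame)
    ```

    Feeding in a frame with a bright bar produces high stimulation only in the electrode columns the bar covers, roughly the "block of light" the patient describes as a car passes.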
    Elias Konstantopoulos puts on his bionic eyeglasses at his home in Glen Burnie, Maryland. Konstantopoulos suffers from retinitis pigmentosa, a genetic eye condition that leads to incurable blindness, yet working with Johns Hopkins University he has been implanted with a microchip and the glasses that enable him to distinguish light and dark.
    The visual signals are sent to the optic nerve and then to the brain, and the patient sees them as flashes of light and blurry shapes.
    "It is still a very crude level of vision but it is the beginning of an improvement," said Gislin Dagnelie, an ophthalmologist who is working with Konstantopoulos and other blind patients at Johns Hopkins University in Baltimore. "We have to learn how to talk to the retina, basically."
    The implant is unnoticeable. The surgery took about three hours and caused hardly any pain, said Konstantopoulos.
    According to Second Sight vice president of business development Brian Mech, the latest generation of the technology has 60 electrodes, compared to an earlier version that had 16.
    "Surgery is much shorter and requires only one specialist (Argus I required 3)," Mech said.
    In all, 14 devices are being used in the United States and 16 in Europe. The Argus II costs about 100,000 dollars.
    The company plans to apply soon for a humanitarian device exemption with the Food and Drug Administration, and hopes for approval in 2012.
    In the meantime, Konstantopoulos practices with the device one day a week in the lab with Dagnelie. At each session, Konstantopoulos traces objects he sees on a computer screen. Sometimes they walk arm in arm around the medical complex trying to spot certain objects.
    He is gradually improving in his ability to interpret the light flashes and identify them as lines and shapes, the doctor said.
    But among other patients, the response "varies quite a bit."
    "People who have been blind for a long time probably don't have as much benefit," Dagnelie said.
    As time goes on, doctors hope that the device could extend to people who suffer from macular degeneration, the primary cause of vision loss among people over 60.
    "We hope that 10-15 years from now we'll have something that is quite useful, clinically," said the Dutch-born doctor.
    Konstantopoulos still manages to do plenty of work around the house. He recently retiled the bathroom floor and showed visitors how he can still operate his table saw in the garage, pausing a few times to ask if his 18-month-old grandson, Anthony, was underfoot.
    "He does everything. He is such a proud man," said his wife, Dina.
    Back in the living room, Konstantopoulos sat in his recliner and scooped up the chubby-cheeked little boy who calls him "Papou."
    "That has been my biggest complaint. I have never seen his face," he said, cradling the boy on his lap.
    "I cannot see his face. Yet."
  38. Anonymous Member

    What the brain saw

    March 31, 2011
    Spike distributions for neurons responding to two features can have shapes that are difficult to understand. Credit: Courtesy of Dr. Tatyana Sharpee, Salk Institute for Biological Studies
    The moment we open our eyes, we perceive the world with apparent ease. But the question of how neurons in the retina encode what we "see" has been a tricky one. A key obstacle to understanding how our brain functions is that its components—neurons—respond in highly nonlinear ways to complex stimuli, making stimulus-response relationships extremely difficult to discern.
    Now a team of physicists at the Salk Institute for Biological Studies has developed a general mathematical framework that makes optimal use of limited measurements, bringing them a step closer to deciphering the "language of the brain." The approach, described in the current issue of the open-access journal PLoS Computational Biology, reveals for the first time that only information about pairs of temporal stimulus patterns is relayed to the brain.
    "We were surprised to find that higher-order stimulus combinations were not encoded, because they are so prevalent in our natural environment," says the study's leader Tatyana Sharpee, Ph.D., an assistant professor in the Computational Neurobiology Laboratory and holder of the Helen McLorraine Developmental Chair in Neurobiology. "Humans are quite sensitive to changes in higher-order combinations of spatial patterns. We found it not to be the case for temporal patterns. This highlights a fundamental difference in the spatial and temporal aspects of visual encoding."

    Video: An example of the flickering light stimulus presented during the experiment. Movie: Courtesy of Dr. Tatyana Sharpee, Salk Institute for Biological Studies
    The human face is a perfect example of a higher-order combination of spatial patterns. All components—eyes, nose, mouth—have very specific spatial relationships with each other, and not even Picasso, in his Cubist period, could throw the rules completely overboard.
    Our eyes take in the visual environment and transmit information about individual components, such as color, position, shape, motion and brightness to the brain. Individual neurons in the retina get excited by certain features and respond with an electrical signal, or spike, that is passed on to visual centers in the brain, where information sent by neurons with different preferences is assembled and processed.
    For simple sensory events—like turning on a light, for example—the brightness correlates well with the spike probability in a luminance-sensitive cell in the retina. "However, over the last decade or so, it has become apparent that neurons actually encode information about several features at the same time," says graduate student and first author Jeffrey D. Fitzgerald.
    "Up to this point, most of the work has been focused on identifying the features the cell responds to," he says. "The question of what kind of information about these features the cell is encoding had been ignored. The direct measurements of stimulus-response relationships often yielded weird shapes [see Figure 1, for example], and people didn't have a mathematical framework for analyzing it."
    To overcome those limitations, Fitzgerald and colleagues developed a so-called minimal model of the nonlinear relationships in information processing systems by maximizing a quantity referred to as noise entropy, which describes the uncertainty in a neuron's probability of spiking in response to a stimulus.
    When Fitzgerald applied this approach to recordings of visual neurons probed with flickering movies, made by co-authors Lawrence Sincich and Jonathan Horton at the University of California, San Francisco, he discovered that, on average, first-order correlations accounted for 78 percent of the encoded information, while first- and second-order correlations together accounted for more than 92 percent. Thus, the brain received very little information about correlations higher than second order.
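    A second-order model of this kind can be sketched as a logistic (maximum-noise-entropy) model in which the spike probability depends on the stimulus only through first-order and pairwise terms. This is a minimal illustration with invented parameter values, not the fitted model from the paper.

    ```python
    import math

    # Minimal sketch of a second-order maximum-noise-entropy spiking model:
    # P(spike | s) = logistic(a + sum_i h_i*s_i + sum_{i<j} J_ij*s_i*s_j).
    # All parameter values below are made up for illustration.

    def spike_probability(s, a, h, J):
        """s: binary stimulus vector; a: bias; h: first-order weights;
        J: dict mapping (i, j) index pairs to pairwise couplings."""
        energy = a + sum(h[i] * s[i] for i in range(len(s)))
        energy += sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
        return 1.0 / (1.0 + math.exp(-energy))

    s = [1, 0, 1]          # a binary temporal stimulus pattern
    a = -1.0
    h = [0.5, 0.2, 0.8]
    J = {(0, 2): 1.5}      # a single pairwise coupling between bins 0 and 2
    p = spike_probability(s, a, h, J)   # logistic(-1.0 + 1.3 + 1.5)
    ```

    Higher-order terms (triplets and beyond) are simply absent from the model, mirroring the finding that such combinations carry little of the encoded information.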
    "Biological systems across all scales, from molecules to ecosystems, can all be considered information processors that detect important events in their environment and transform them into actionable information," says Sharpee. "We therefore hope that this way of 'focusing' the data by identifying maximally informative, critical stimulus-response relationships will be useful in other areas of systems biology."
    Provided by Salk Institute
  39. Anonymous Member

    Understanding schizophrenia: Researchers uncover new underlying mechanism

    March 30, 2011 By Matet Nabres

    (PhysOrg.com) -- A new way of thinking about the fundamental pathobiology of schizophrenia could one day lead to improved therapeutic approaches to treating this disorder. Researchers at the University of Toronto, the Hospital for Sick Children (SickKids) and Tufts University School of Medicine have linked proteins and genes that are implicated in schizophrenia in a novel way. The study is published in the March 27 advance online edition of Nature Medicine.
    Schizophrenia is a disorder that affects one per cent of Canadians and 24 million people worldwide. A team of researchers led by Michael Salter, a professor of physiology and a senior scientist at SickKids, identified a biochemical pathway in the brain that may contribute to the neurobiological basis of schizophrenia.
    “This is a paradigm shift in the way that we view the neural mechanisms of schizophrenia,” said Salter. “With our discovery we have brought together in a new way pieces of the schizophrenia puzzle. We hope that the understanding we have put together will lead to new forms of treatment that are more effective than the ones that are currently available.”
    The scientists studied in mice two partner proteins, NRG1 and ErbB4, and the effect they have on a key brain receptor known as the N-methyl D-aspartate glutamate receptor (NMDAR). While NRG1 and ErbB4 have been genetically implicated in schizophrenia, the new study finds an unexpected link to NMDARs.
    The NMDAR is a major component of synapses - the highly specialized sites of communication between the brain’s billions of individual nerve cells - that is critical for many brain functions including learning and memory. Suppressed functioning of NMDARs was suspected in schizophrenia because drugs that block NMDARs cause the hallucinations and disordered thought that occur in schizophrenia.
    It had been suspected that NRG1 and ErbB4 might generally suppress NMDAR function, but the present study found this was not the case. Rather, the researchers discovered that NRG1 and ErbB4 work together by inhibiting another protein, Src. The link to NMDARs is that Src normally increases NMDAR function under circumstances when this is needed, such as in learning and memory. The researchers found that by blocking Src, NRG1 and ErbB4 selectively prevented that critical boost in NMDAR function.
    The researchers also studied the responses of nerve cells during brain activity that mimicked normal brain oscillations known as theta rhythm. Theta rhythm activity, which is critical for learning and memory, is impaired in individuals with schizophrenia. The researchers determined that by acting through Src, NRG1 and ErbB4 greatly reduced the nerve cell responses to theta rhythm activity.
    The findings suggest new approaches to schizophrenia treatment by reversing the effects of NRG1 and ErbB4 through enhancing the Src boost of NMDARs. “The tricky part is that all of these proteins are involved in other functions of the body; we can’t randomly enhance or inhibit them as this would lead to side effects,” Salter said. “The key will be to develop clever ways to target the proteins in the context of the synapse.”
    Provided by University of Toronto
  40. Anonymous Member

    Replaying our days learning in our sleep (w/ video)

    March 28, 2011 by Deborah Braconnier
    Mean reaction times (RT) showing a reduction during the training session and a further decrease between pre- and post-sleep testing. Image: doi:10.1371/journal.pone.0018056
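    The reaction-time comparison in the figure can be sketched numerically; the millisecond values below are invented for illustration, not data from the paper.

    ```python
    # Hypothetical sketch of the reaction-time pattern the study reports:
    # mean RT drops across training, then drops again between pre- and
    # post-sleep testing (an "offline" gain). Values are invented.

    def mean_rt(trials_ms):
        """Mean reaction time of a list of trials, in milliseconds."""
        return sum(trials_ms) / len(trials_ms)

    training = [820, 790, 760, 730, 700]   # RT shrinks within the session
    pre_sleep = [690, 700, 680]            # tested before sleep
    post_sleep = [640, 650, 630]           # further improvement after sleep

    offline_gain = mean_rt(pre_sleep) - mean_rt(post_sleep)
    ```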
    (PhysOrg.com) -- According to a recent study, our sleep may not be as empty of brain function as was originally thought. The study, published in PLoS ONE, was led by sleep researcher Delphine Oudiette from the Université Pierre et Marie Curie in Paris.
    Original theories held that, while sleeping, our minds were essentially empty slates with little neurological activity. However, this recent study provides evidence that during sleep, our body replays the cognitive and motor skills learned throughout the preceding day. Providing evidence for this 'replay' hypothesis was the goal of the study.
    The research consisted of a simple test administered to 19 sleepwalkers, 20 patients with REM sleep behavior disorder, and 18 healthy sleep controls. Participants were taught a motor task involving hitting an assortment of color-coded buttons in sequence. They were then asked to repeat this task while in bed but still awake. Researchers then videotaped the participants while they slept.

    Video: Execution of the structured sequence in the training setting by a wake control (Part 1); execution of the sequence from memory by a wake control lying in a bed (Part 2); overt replay of a part of the structured sequence during slow-wave sleep in a sleepwalker (Part 3). Video credit: doi:10.1371/journal.pone.0018056
    What they discovered was that many of the participants physically repeated and performed the tasks they had previously been taught. As if performing a choreographed sleep dance, these participants were 'practicing' what they had learned, suggesting that cognitive and motor processing were functioning during sleep.
    This study provides evidence that, while sleeping, our brain function remains similar to that of when we are awake and learning. Essentially, for our brains, it appears as if sleep is not as much of a time of rest as it is a practice session for learning.
    While such behavior had previously been seen in animal studies, this is the first study to show evidence of 'replay' sleep behavior in humans. Researchers are hoping that continued study will help provide information about cognitive functions occurring during sleep.
    More information: Oudiette D, Constantinescu I, Leclair-Visonneau L, Vidailhet M, Schwartz S, et al. (2011) Evidence for the Re-Enactment of a Recently Learned Behavior during Sleepwalking. PLoS ONE 6(3): e18056. doi:10.1371/journal.pone.0018056
    © 2010 PhysOrg.com