Planet Neuroscientists

An aggregation of RSS feeds from various neuroscience blogs.

last updated by Pluto on 2022-07-02 08:16:33 UTC on behalf of the NeuroFedora SIG.

  • Feathers may have helped dinosaurs survive the Triassic mass extinction

    Widespread volcanic eruptions around 202 million years ago had a profound effect on Earth’s climate, triggering a mass extinction event that killed off three-fourths of the planet’s species, including many large reptiles. Yet dinosaurs, somehow, survived and went on to thrive.

    Dinosaurs are often thought of as heat-loving, well suited to the steamy greenhouse environment of the Triassic Period. But the secret to their survival may have been how well adapted they were to the cold, unlike other reptiles of the time. The dinosaurs’ warm coats of feathers could have helped the creatures weather relatively brief but intense bouts of volcanic winter associated with the massive eruptions, researchers report July 1 in Science Advances.

    “We’ve known for a while that there were probably volcanic winters” associated with the massive eruptions, says paleontologist Paul Olsen of the Lamont-Doherty Earth Observatory at Columbia University. Along with carbon dioxide, volcanoes spew sulfur particles into the atmosphere that can darken skies for years and lower global temperatures — as the Philippines’ Mount Pinatubo did after its powerful 1991 eruption (SN: 8/8/18). “But how [such winters] fit into the picture of the end-Triassic mass extinction has been very unclear.”

    In the new study, Olsen and his colleagues present the first physical evidence that not only did such winters occur at the end of the Triassic, but also that dinosaurs were there to weather them. At a site called the Junggar Basin, which at the close of the Triassic lay high within the Arctic Circle, the team identified rock fragments that could only have been deposited by ancient ice, alongside the footprints of dinosaurs.

    “There is a stereotype that dinosaurs always lived in lush tropical jungles,” says Stephen Brusatte, a paleontologist at the University of Edinburgh who was not involved in the new study. “But this new research shows convincingly that the higher latitudes would have been freezing and even covered in ice during parts of the year” at the beginning of the rise of the dinosaurs, he says.

    The Triassic Period ended with a bang beginning around 202 million years ago, as the supercontinent Pangea began to break apart. Massive volcanic eruptions burst forth as the crust split, opening up a basin that became the Atlantic Ocean. The hardened lava from those eruptions now spans 7 million square kilometers across Africa, Europe and North and South America, forming a rock sequence collectively known as the Central Atlantic Magmatic Province, or CAMP.

    Carbon dioxide levels were extremely high during the late Triassic and early Jurassic, much of it now thought to have been pumped into the atmosphere by those eruptions. Earth has been assumed to have been in a steamy greenhouse state as a result. Supporting this hypothesis is the fact that there’s no evidence of any polar ice sheets at the time; instead, thick forests extended all the way to the poles.

    The Junggar Basin, in what’s now northwestern China, was one such region, covered with forests of conifers and deciduous trees growing alongside a massive ancient lake. Dinosaurs certainly lived there: No bones have yet been discovered at the site, but many footprints of the creatures are preserved in the shallow-water siltstones and sandstones that formed at the bottom of the lake.

    Chilling out

    About 202 million years ago, most of the world’s landmasses were stitched together into the supercontinent Pangea (illustrated). Dinosaur fossils, including footprints, have been found at sites (dots) across the supercontinent. At the Junggar Basin (red oval) in what’s now northwestern China, scientists discovered evidence of frigid winters at the end of the Triassic Period alongside dinosaur tracks.

    [Map: locations of dinosaur fossils from the Triassic Period] P. Olsen et al/Science Advances 2022

    The new data suggest that — despite the extremely high CO2 levels — this region also experienced harsh, frigid winters, with the lake at least partially freezing over. The evidence comes from the same rocks that bear the footprints. Analyzing the distribution of grain sizes in the rocks, the researchers determined that a large portion of the grains weren’t part of the original lake mud, but had been carried there from elsewhere.

    The most likely explanation, Olsen says, is that these grains are “ice-rafted debris” — a well-known phenomenon in which bits of rock freeze to the base of ice along a shoreline, and then hitch a ride with the ice as it eventually drifts into open water. As the floating ice melts, the bits of rock sink, deposited in new territory.

    Volcanic winters might last for tens or even hundreds of years, Olsen says, depending on how long volcanoes continue to erupt. In this case, the huge sheets of lava linked to the CAMP eruptions point to at least tens of thousands of years of eruption pulses, maybe even a million years. That could have kept the winters going for a good long time — long enough to drive many less-well-insulated reptiles off the face of the Earth, he adds. Episodes of those freezing conditions may have even extended all the way to the tropics, the team says.

    Evidence of feathers has been found in the fossils of many types of dinosaurs, from carnivorous theropods to herbivorous ornithischians. Recent reports that flying reptiles called pterosaurs had feathers too now suggest that the insulating fuzz has been around for even longer than once thought — possibly appearing as early as 250 million years ago, in a common ancestor of dinosaurs and pterosaurs (SN: 4/29/22).

    Thanks to those insulating feathers, dinosaurs were able to survive the lengthy winters that ensued during the end-Triassic mass extinction, Olsen and colleagues say. Dinosaurs might then have been able to spread rapidly during the Jurassic, occupying niches left vacant by less hardy reptiles.

    This study “shows the complexity of disentangling not only the success of certain groups, but also the causes and effects of mass extinction events,” says paleontologist Randall Irmis of the University of Utah in Salt Lake City, who was not connected with the study. “There’s a pretty good consensus that [the CAMP eruptions are] the cause of the mass extinction — but there are a lot of subtleties we haven’t appreciated.”

    That dinosaurs living in the far north at the time were able to survive due to their feathery insulation makes sense, Irmis says. But whether a volcanic winter caused by dimming could have extended far enough south to freeze the tropics too — giving dinosaurs a similar advantage there — isn’t yet clear. “Dimming is a global effect, but how that plays out is a lot more severe at the poles compared to low latitudes.” 

    Feathers are probably just one of many reasons why dinosaurs diversified and spread rapidly across the globe at the start of the Jurassic, Irmis says. “There’s a lot that plays into why they became such a successful group.”

    in Science News on 2022-07-01 18:00:00 UTC.

  • A new look at the ‘mineral kingdom’ may transform how we search for life

    If every mineral tells a story, then geologists now have their equivalent of The Arabian Nights.

    For the first time, scientists have cataloged every different way that every known mineral can form and put all of that information in one place. This collection of mineral origin stories hints that Earth could have harbored life earlier than previously thought, quantifies the importance of water as the most transformative ingredient in geology, and may change how researchers look for signs of life and water on other planets. 

    “This is just going to be an explosion,” says Robert Hazen, a mineralogist and astrobiologist at the Carnegie Institution for Science in Washington, D.C. “You can ask a thousand questions now that we couldn’t have answered before.”

    For over 100 years, scientists have defined minerals in terms of “what,” focusing on their structure and chemical makeup. But that can make for an incomplete picture. For example, though all diamonds are a kind of crystalline carbon, three different diamonds might tell three different stories, Hazen says. One could have formed 5 billion years ago in a distant star, another may have been born in a meteorite impact, and a third could have been baked deep below the Earth’s crust.

    Diamonds have the same carbon structure, but they can form in different ways. This particular gem originated deep within the Earth. Rob Lavinsky/ARKENSTONE

    So Hazen and his colleagues set out to define a different approach to mineral classification. This new angle focuses on the “how” by thinking about minerals as things that evolve out of the history of life, Earth and the solar system, he and his team report July 1 in a pair of studies in American Mineralogist. The researchers defined 57 main ways that the “mineral kingdom” forms, with options ranging from condensation out of the space between stars to formation in the excrement of bats. 

    The information in the catalog isn’t new, but it was previously scattered throughout thousands of scientific papers. The “audacity” of their work, Hazen says, was to go through and compile it all together for the more than 5,600 known types of minerals. That makes the catalog a one-stop shop for those who want to use minerals to understand the past.

    The compilation also allowed the team to take a step back and think about mineral evolution from a broader perspective. Patterns immediately popped out. One of the new studies shows that over half of all known mineral kinds form in ways that ought to have been possible on the newborn Earth. The implication: Of all the geologic environments that scientists have considered as potential crucibles for the beginning of life on Earth, most could have existed as early as 4.3 billion years ago (SN: 9/24/20). Life, therefore, may have formed almost as soon as Earth did, or at the very least, had more time to arise than scientists have thought. Rocks with traces of life date to only 3.4 billion years ago (SN: 7/26/21). 

    “That would be a very, very profound implication — that the potential for life is baked in at the very beginning of a planet,” says Zachary Adam, a paleobiologist at the University of Wisconsin–Madison who was not involved in the new studies.

    The exact timing of when conditions ripe for life arose is based on “iffy” models, though, says Frances Westall, a geobiologist at the Center for Molecular Biophysics in Orléans, France, who was also not part of Hazen’s team. She thinks that scientists need more data before they can be sure. But, she says, “the principle is fantastic.”

    The new results also show how essential water has been to making most of the minerals on Earth. Roughly 80 percent of known mineral types need H2O to form, the team reports.

    “Water is just incredibly important,” Hazen says, adding that the estimate is conservative. “It may be closer to 90 percent.”

    Some minerals would not form in certain ways without the influence of life. Photosynthesizing bacteria helped bring about the oxygen-rich conditions needed for this azurite (left), while the opalized ammonite (right) was created by the mineral opal filling the space where an ammonite shell used to be. Rob Lavinsky/ARKENSTONE

    Taken one way, this means that if researchers see water on a planet like Mars, they can guess that it has a rich mineral ecosystem (SN: 3/16/21). But flipping this idea may be more useful: Scientists could identify what minerals are on the Red Planet and then use the new catalog to work backward and figure out what its environment was like in the past. A group of minerals, for example, might be explainable only if there had been water, or even life.

    Right now, scientists do this sort of detective work on just a few minerals at a time (SN: 5/11/20). But if researchers want to make the most of the samples collected on other planets, something more comprehensive is needed, Adam says, like the new study’s framework.

    And that’s just the beginning. “The value of this [catalog] is that it’s ongoing and potentially multigenerational,” Adam says. “We can go back to it again and again and again for different kinds of questions.” 

    “I think we have a lot more we can do,” agrees Shaunna Morrison, a mineralogist at the Carnegie Institution and coauthor of the new studies. “We’re just scratching the surface.”

    in Science News on 2022-07-01 14:00:00 UTC.

  • 50 years ago, a new theory of Earth’s core began solidifying

    How the Earth got its core – Science News, July 1, 1972

    [Image: the cover of the July 2, 1972 issue of Science News]

    In the beginning, scientists believe there was an interstellar gas cloud of all the elements comprising the Earth. A billion or so years later, the Earth was a globe of concentric spheres with a solid iron inner core, a liquid iron outer core and a liquid silicate mantle…. The current theory is that the primeval cloud’s materials accreted … and that sometime after accretion, the iron, melted by radioactive heating, sank toward the center of the globe…. Now another concept is gaining ground: that the Earth may have accreted … with core formation and accretion occurring simultaneously.

    Update

    Most scientists now agree that the core formed as materials that make up Earth collided and glommed together and that the process was driven by heat from the smashups. The planet’s heart is primarily made of iron, nickel and some oxygen, but what other elements may dwell there and in what forms remains an open question. Recently, scientists proposed the inner core could be superionic, with liquid hydrogen flowing through an iron and silicon lattice (SN: 3/12/22, p. 12).

    in Science News on 2022-07-01 11:00:00 UTC.

  • February: ‘we don’t agree there is an issue here.’ June: Retracted.

    A Springer Nature journal has retracted a paper on hepatitis C infection it had previously corrected for problematic data – but in between the editors declared the case closed. The paper, “The interaction between microRNA-152 and DNA methyltransferase-1 as an epigenetic prognostic biomarker in HCV-induced liver cirrhosis and HCC patients,” was published in July 2019 … Continue reading February: ‘we don’t agree there is an issue here.’ June: Retracted.

    in Retraction watch on 2022-07-01 10:00:00 UTC.

  • People with no mind’s eye have less vivid and detailed memories

    By Matthew Warren

    When we’re asked to imagine a scene or object, most of us are able to conjure up an image in our mind’s eye. But about 2-5% of the population can’t do this: they have a condition called aphantasia, and are unable to produce mental imagery at all.

    Now a study published in Cognition has found that aphantasia can affect memory abilities too. The researchers report that aphantasics have less detailed and rich memories for events in their lives: a finding that not only reveals more about the condition, but also highlights the key role of mental imagery in memory generally.

    Past work had shown that people with aphantasia report almost no mental imagery when recalling past events from their lives or when thinking about potential future events. But these findings were based on participants rating their own abilities, note Alexei Dawes and colleagues from the University of New South Wales. So the team decided to examine whether aphantasic participants also show memory deficits in more objective measures.

    The researchers recruited 30 participants with aphantasia and 30 control participants who didn’t have any problems with mental imagery. All participants filled in questionnaires asking them to report the vividness of various kinds of mental imagery, including when recalling scenes from their lives. The researchers found that — as in past studies — the aphantasic participants reported having less vivid mental imagery (although, again consistent with past work, their spatial imagery was just as good as that of control participants).

    To get a more “objective” measure of memory deficits, the team then looked at the kind of information people provided when describing their memories. Participants were asked to remember six events in their lives, as well as think about six hypothetical future events, and write a description of each. For their analysis, the researchers looked at different categories of details contained in these descriptions (for instance, details relating to participants’ sensations, thoughts, or emotions). After writing each description, participants also reported their own subjective experiences of that particular event, such as how vivid it was and how much emotion they felt.

    The team found that aphantasic participants gave fewer details than control participants about both kinds of event. This effect seemed to be driven specifically by the fact that they wrote down fewer visual details; the groups didn’t differ in the amount of detail they gave that involved other senses like smell or hearing, or that concerned other aspects of the event like thoughts or emotions they experienced.

    The two groups also differed in how they reported experiencing the memories and imagining the future events. Compared to controls, aphantasics indicated that the events were less vivid and contained fewer sensory or spatial details, for instance, and they also reported experiencing less emotion when thinking about the event.

    Overall, the results show that people with aphantasia have less vivid and detailed memories — particularly when it comes to visual details — and this effect is clear whether they are asked about their experience or tested in more “objective” ways. The authors write that their findings represent the “first robust behavioural evidence that visual imagery absence is associated with a significantly reduced capacity to simulate the past and construct the future”.

    The very fact that aphantasics show these deficits also implies that mental imagery is generally an important part of recalling events or imagining future ones. This probably isn’t a huge surprise — but as the authors point out, is something that has been hard to test empirically until now. However, the study also shows that mental imagery isn’t everything: people with aphantasia were still able to retrieve memories, after all. Instead, it seems that mental imagery is specifically involved in that aspect of memory that involves “re-living” events in our minds.

    – Memories with a blind mind: Remembering the past and imagining the future with aphantasia

    Matthew Warren (@MattBWarren) is Editor of BPS Research Digest

    in The British Psychological Society - Research Digest on 2022-07-01 09:32:36 UTC.

  • Schneider Shorts 1.07.2022 – Old men, new scams

    Schneider Shorts 1.07.2022 - with a Welsh Nobelist's new scam, fraudster's German husband and his fountain of youth, a surprise retraction, an unsurprising correction, Cheshire and his fraudsters, and a dirty old man succeeding a dirty old man in Marseille.

    in For Better Science on 2022-07-01 05:16:38 UTC.

  • Children exposed to more traffic noise in schools may experience lower working memory and slower attention spans

    This month, a PLOS Medicine study has found that children exposed to road traffic noise at school exhibit signs of relatively slower attention spans and working memory development compared to those educated in a quieter environment. The study, conducted in Europe, could spark the creation of environmental noise policies to protect school environments. Learn more about the study design in our monthly Research Highlights summary below or check out the full article in PLOS Medicine.

    Background

    Road traffic noise is the most widespread environmental and transportation noise source in Europe and the second most detrimental environmental factor for ill health in Europe after air pollution. An increasing number of epidemiological studies in adults detail the health impacts of transportation noise, but little is known about the effects in children.

    Study design and findings

    Maria Foraster and colleagues at the Barcelona Institute for Global Health (ISGlobal) studied a sample of 2,680 children aged 7-10 years at 38 schools in Barcelona, Spain between January 2012 and March 2013. The children completed computerized cognitive tests four times during one year to assess the development of working memory, complex working memory, and inattentiveness. The researchers measured markers of annual exposure to intensity levels and fluctuation in road traffic noise both inside and outside the schools at the beginning of the year and estimated exposure to intensity levels outdoors at home using Barcelona’s road traffic noise map for 2012. Their analyses also controlled for levels of traffic-related air pollution and sociodemographics, among other factors.

    In the children, exposure to higher intensity and fluctuation in road traffic noise measured outside the school was associated with a slower development of working memory, complex working memory, and attention over 12 months. Inside the classroom, associations with all cognitive development measures were more evident for exposure to noise fluctuation than for intensity levels. No associations were found for exposure to road traffic noise at home.

    Conclusion

    The team did not measure past noise exposure in their population, which could affect the test scores. However, the authors reported that 98% of the children had attended the same school for at least one year and that noise levels are generally steady over time. The findings are of public health relevance given the number of children around the world who are exposed to road traffic noise in schools, but similar testing should be done elsewhere before the findings are generalized to other populations.

    Foraster adds, “Children exposed to road traffic noise at school and in the classroom showed slower cognitive development in terms of working memory and attention compared to children attending quieter schools.”

    in The Official PLOS Blog on 2022-06-30 21:41:30 UTC.

  • This soft, electronic ‘nerve cooler’ could be a new way to relieve pain

    A flexible electronic implant could one day make pain management a lot more chill.

    Created from materials that dissolve in the body, the device encircles nerves with an evaporative cooler. Implanted in rats, the cooler blocked pain signals from zipping up to the brain, bioengineer John Rogers and colleagues report in the July 1 Science.

    Though far from ready for human use, a future version could potentially let “patients dial up or down the pain relief they need at any given moment,” says Rogers, of Northwestern University in Evanston, Ill. 

    Scientists already knew that low temperatures can numb nerves in the body. Think of frozen fingers in the winter, Rogers says. But mimicking this phenomenon with an electronic implant isn’t easy. Nerves are fragile, so scientists need something that gently hugs the tissues. And an ideal implant would be absorbed by the body, so doctors wouldn’t have to remove it. 

    Made from water-soluble materials, the team’s device features a soft cuff that wraps around a nerve like toilet paper on a roll. Tiny channels snake down its rubbery length. When liquid coolant that’s pumped through the channels evaporates, the process draws heat from the underlying nerve. A temperature sensor helps scientists hit the sweet spot — cold enough to block pain but not too cold to damage the nerve.
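
    A rough sense of the physics: the heat the cooler pulls from the nerve is simply the coolant’s mass flow rate multiplied by its latent heat of vaporization. The Python sketch below illustrates only that relationship; the flow rate, density and latent heat are invented placeholder values, not parameters reported for the actual implant.

        # Back-of-the-envelope evaporative cooling power: Q = m_dot * L_v.
        # All numbers are illustrative assumptions, not values from the study.

        flow_rate_ul_per_min = 50.0     # assumed coolant flow, microliters per minute
        density_kg_per_m3 = 1600.0      # assumed coolant density
        latent_heat_j_per_kg = 1.0e5    # assumed latent heat of vaporization

        # Convert microliters per minute to kilograms per second, then multiply by latent heat.
        m_dot = flow_rate_ul_per_min * 1e-9 * density_kg_per_m3 / 60.0
        cooling_power_w = m_dot * latent_heat_j_per_kg

        print(f"cooling power ≈ {cooling_power_w * 1000:.0f} mW")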

    The researchers wrapped the implant around a nerve in rats and tested how they responded to having a paw poked. With the nerve cooler switched on, scientists could apply about seven times as much pressure as usual before the animals pulled their paws away. That’s a sign that the rats’ senses had grown sluggish, Rogers says.

    He envisions the device being used to treat pain after surgery, rather than chronic pain. The cooler connects to an outside power source and would be tethered to patients like an IV line. They could control the level of pain relief by adjusting the coolant’s flow rate. Such a system might offer targeted relief without the downsides of addictive pain medications like opioids, Rogers suggests (SN: 8/27/19).

    Now the researchers want to explore how long they can apply the cooling effect without damaging tissues, Rogers says. In experiments, the longest that they cooled rats’ nerves was for about 15 minutes. 

    “If treating pain, cooling would have to go on for a much longer period of time,” says Seward Rutkove, a nerve physiologist at Harvard Medical School who wasn’t involved in the study. Still, he adds, the device is “an interesting proof of concept and should definitely be pursued.”

    in Science News on 2022-06-30 18:00:00 UTC.

  • Six months in space leads to a decade’s worth of long-term bone loss

    You might want to bring your dumbbells on that next spaceflight.

    During space missions lasting six months or longer, astronauts can experience bone loss equivalent to two decades of aging. A year of recovery in Earth’s gravity rebuilds about half of that lost bone strength, researchers report June 30 in Scientific Reports.

    Bones “are a living organ,” says Leigh Gabel, an exercise scientist at the University of Calgary in Canada. “They’re alive and active, and they’re constantly remodeling.” But without gravity, bones lose strength.

    Gabel and her colleagues tracked 17 astronauts, 14 men and three women with an average age of 47, who spent from four to seven months in space. The team used high-resolution peripheral quantitative computed tomography, or HR-pQCT, which can measure 3-D bone microarchitecture on scales of 61 microns, finer than the thickness of a human hair, to image the bone structure of the tibia in the lower leg and the radius in the lower arm. The team took these images at four points in time — before spaceflight, when the astronauts returned from space, and then six months and one year later — and used them to calculate bone strength and density.

    Astronauts in space for less than six months were able to regain their preflight bone strength after a year back in Earth’s gravity. But those in space longer had permanent bone loss in their shinbones, or tibias, equivalent to a decade of aging. Their lower-arm bones, or radii, showed almost no loss, likely because these aren’t weight-bearing bones, says Gabel.

    Increasing weight lifting exercises in space could help alleviate bone loss, says Steven Boyd, also a Calgary exercise scientist. “A whole bunch of struts and beams all held together give your bone its overall strength,” says Boyd. “Those struts or beams are what we lose in spaceflight.” Once these microscopic tissues called trabeculae are gone, you can’t rebuild them, but you can strengthen the remaining ones, he says. The researchers found the remaining bone thickened upon return to Earth’s gravity.

    Using high-resolution computed tomography imaging allowed researchers to study 3-D bone microarchitecture in astronauts’ bones (example of a shinbone shown here). That minute level of detail can reveal changes in bone density and strength. S. Boyd and L. Gabel/University of Calgary

    “With longer spaceflight, we can expect bigger bone loss and probably a bigger problem with recovery,” says physiologist Laurence Vico of the University of Saint-Étienne in France, who was not part of the study. That’s especially concerning given that a crewed future mission to, say, Mars would last at least two years (SN: 7/15/20). She adds that space agencies should also consider other bone health measures, such as nutrition, to reduce bone absorption and increase bone formation (SN: 3/8/05). “It’s probably a cocktail of countermeasure that we will have to find,” Vico says.

    Gabel, Boyd and their colleagues hope to gain insight on how spending more than seven months in space affects bones. They are part of a planned NASA project to study the effects of a year in space on more than a dozen body systems. “We really hope that people hit a plateau, that they stop losing bone after a while,” says Boyd.

    in Science News on 2022-06-30 15:18:15 UTC.

  • ICYMI

    PLOS has published a lot of great blog content in the past three months, and we don’t expect our readers to be up to date on everything. So, we are starting a new blog series called, ‘In case you missed it (ICYMI)’, a collection of some blogs and key announcements from the previous quarter. Happy reading!

    From the Official PLOS Blog:

    Uphold the Code

    Advancing Open Science in Africa

    Partnering with CRUK to improve cancer research

    The potential butterfly effect of preregistered peer-reviewed research

    From Speaking of Medicine and Health:

    A formula for disaster

    Vaccine apartheid is racist and wrong

    Monkeypox is not a gay disease

    There is no global in global health security

    Walking the path to gender equality, together

    Activist guide to a healthier world

    What do colonialism, racism and global health education have in common?

    Call for Papers: PLOS Medicine Special Issue ‘Bacterial Antimicrobial Resistance: Surveillance and Prevention’

    From EveryONE:

    Interviews with the lab protocols community

    Publication timeframes at PLOS ONE—and our plans to improve them

    An interview with Ben Brown, Guest Editor of the PLOS ONE-COS Cognitive Psychology Collection

    It takes two to tango

    From Latitude:

    PLOS Sustainability and Transformation celebrates World Biodiversity Day

    Cities and Climate Change: PLOS Climate’s first Call for Papers

    Don’t forget the socks!

    Meet Debora Walker, the new PLOS Water Executive Editor

    Celebrating Earth Day 2022

    The IPCC AR6 WGIII Report: A Youth Perspective

    Mitigation of Climate Change: The IPCC AR6 WGIII report

    From Biologue:

    Behind the Paper

    What does an editor do?

    in The Official PLOS Blog on 2022-06-30 15:16:50 UTC.

  • Daily skin-to-skin contact in weeks after birth linked to less crying and better sleep

    By Emma Young

    Few things are as stressful as listening to your baby crying — and excessive crying is clearly not good for the baby, either. Skin-to-skin contact is widely used in the first hours after a birth, with benefits for infants and parents. But, according to a new paper in Developmental Psychology, a daily hour of skin-to-skin contact for weeks afterwards is beneficial, too: it reduces crying and improves sleep.

    Kelly Cooijmans at Radboud University, in the Netherlands, and her colleagues recruited healthy Dutch first-time mothers with full-term infants for their randomized controlled trial. On signing up while pregnant, the women knew they were taking part in a trial aimed at improving infant crying and sleep, but not what the intervention would be.

    Prolonged mother-infant skin-to-skin contact is not part of Dutch culture, the team notes. So when one group was instructed to perform “care as usual”, extended skin-to-skin contact took place only immediately after birth. The other group was asked to ensure that their baby had an hour a day of this contact for five weeks. The mothers were also asked to report on a range of measures, including how long their infant cried and slept, for the first 12 weeks of their baby’s life. (As women in the Netherlands get 12 weeks of paid maternity leave, the team felt that this would all be practically feasible.)

    Relatively few women — 16 of an initial 64 in the daily contact group — actually followed the protocol in full. Their babies cried less overall and had shorter individual crying bouts. In their first few days of life, they slept for longer, too. The data also suggested a dose-response relationship between skin contact and crying: more minutes of contact were associated with less crying and also, in the first days at least, more sleep. “Our findings suggest that extended [skin to skin contact] adds to the already beneficial effects during the first postnatal hours/days, at least regarding infant crying and sleeping,” the team writes.

    There are various potential mechanisms. Warm, soft, human touch signals that a caregiver is there, close at hand. It is known to be calming. Skin-to-skin contact also specifically triggers the release of oxytocin in both the mother and infant, the team adds — and this has known stress-reducing effects. Advantages of the intervention are, the team writes, that it is “uncomplicated and low cost”.

    However, while it might carry no immediate financial cost for women on paid maternity leave, the high drop-out rate certainly suggests that the mothers did perceive it to have other kinds of costs. In fact, 10% reported having trouble fitting it into their daily routine and 14% said that their own mental or physical recovery problems made it difficult. 

    It’s true that when these women were recruited, there was no evidence that time invested in regular daily skin-to-skin contact could have a pay-off in terms of reduced crying, the team notes. Perhaps the new, preliminary data might encourage some new mothers to try the protocol. And there were some significant benefits. In week 2, for example, the team reports a daily crying duration of 106 minutes for infants whose mothers fully followed the protocol vs 129 minutes for controls. That’s an average of 23 fewer minutes spent crying per day.

    Still, it’s worth noting that the mothers in the intervention group paid for this with an hour a day of their time. Keeping a baby close in a sling can be tricky enough; ensuring you’re in a suitable place each day for an hour of bare skin-to-skin contact is clearly even more demanding. Whatever a mother’s circumstances, let’s not pretend that there are no or “low” costs attached to the potential benefits, because that places very little value on her invested time. 

    – Daily skin-to-skin contact and crying and sleeping in healthy full-term infants: A randomized controlled trial.

    Emma Young (@EmmaELYoung) is a staff writer at BPS Research Digest

    in The British Psychological Society - Research Digest on 2022-06-30 13:46:52 UTC.

  • New COVID-19 boosters could contain bits of the omicron variant

    For all the coronavirus variants that have thrown pandemic curve balls — including alpha, beta, gamma and delta — COVID-19 vaccines have stayed the same. That could change this fall.

    On June 28, an advisory committee to the U.S. Food and Drug Administration met to discuss whether vaccine developers should update their jabs to include a portion of the omicron variant — the version of the coronavirus that currently dominates the globe. The verdict: The omicron variant is different enough that it’s time to change the vaccines. Those shots should be a dual mix that includes both a piece of the nearly identical omicron subvariants BA.4/BA.5 and the virus from the original vaccines, the FDA announced June 30. 

    “This doesn’t mean that we are saying that there will be boosters recommended for everyone in the fall,” Amanda Cohn, chief medical officer for vaccine policy at the U.S. Centers for Disease Control and Prevention, said at the meeting. “But my belief is that this gives us the right vaccine for preparation for boosters in the fall.”

    The decision to update COVID-19 vaccines didn’t come out of nowhere. In the two-plus years that the coronavirus has been spreading around the world, it has had a few “updates” of its own — mutating some of its proteins that allow the virus to more effectively infect our cells or hide from our immune systems. 

    Vaccine developers had previously crafted vaccines to tackle the beta variant that was first identified in South Africa in late 2020. Those were scrapped after studies showed that current vaccines remained effective. 

    The current vaccines gave our immune systems the tools to recognize variants such as beta and alpha, which each had a handful of changes from the original SARS-CoV-2 virus that sparked the pandemic. But the omicron variant is a slipperier foe. Lots more viral mutations combined with our own waning immunity mean that once omicron can gain a foothold in the body, vaccine protection isn’t as good as it once was at fending off COVID-19 symptoms (SN: 6/27/22). 

    The shots still largely protect people from developing severe symptoms, but there has been an uptick in hospitalizations, especially among older people, Heather Scobie, deputy team lead of the CDC’s Surveillance and Analytics Epidemiology Task Force, said at the meeting. Deaths among older age groups are also beginning to increase. And while it’s impossible to predict the future, we could be in for another tough fall and winter, epidemiologist Justin Lessler of the University of North Carolina at Chapel Hill said at the meeting. From March 2022 to March 2023, simulations project that deaths from COVID-19 in the United States might number in the tens to hundreds of thousands.

    A switch to omicron-containing jabs may give people an extra layer of protection for the upcoming winter. Pfizer-BioNTech presented data at the meeting showing that updated versions of its mRNA shot gave clinical trial participants a boost of antibodies that recognize omicron. One version included omicron alone, while the other is a twofer, or bivalent, jab that mixes the original formulation with omicron. Moderna’s bivalent shot boosted antibodies too. Novavax, which developed a protein-based vaccine that the FDA is still mulling whether to authorize for emergency use, doesn’t have an omicron-based vaccine yet, though the company said its original shot gives people broad protection, generating antibodies that probably will recognize omicron. 

    Pfizer and Moderna both updated their vaccines using a version of omicron called BA.1, which was the dominant variant in the United States in December and January. But BA.1 has siblings and has already been outcompeted by some of them. 

    Since omicron first appeared late last year, “we’ve seen a relatively troubling, rapid evolution of SARS-CoV-2,” Peter Marks, director of the FDA’s Center for Biologics Evaluation and Research, said at the advisory meeting.  

    Now, omicron subvariants BA.2, BA.2.12.1, BA.4 and BA.5 are the dominant versions in the United States and other countries. The CDC estimates that roughly half of new U.S. infections the week ending June 25 were caused by either BA.4 or BA.5. By the time the fall rolls around, yet another new version of omicron — or a different variant entirely — may join their ranks. The big question is which of these subvariants to include in the vaccines to give people the best protection possible. 

    BA.1, the version already in the updated vaccines, may be the right choice, virologist Kanta Subbarao said at the FDA advisory meeting. An advisory committee to the World Health Organization, which Subbarao chairs, recommended on June 17 that vaccines may need to be tweaked to include omicron, likely BA.1. “We’re not trying to match [what variants] may circulate,” Subbarao said. Instead, the goal is to make sure that the immune system is as prepared as possible to recognize a wide variety of variants, not just specific ones. The hope is that the broader the immune response, the better our bodies will be at fighting the virus off even as it evolves. 

    The variant that is farthest removed from the original virus is probably the best candidate to accomplish that goal, said Subbarao, who is director of the WHO’s Collaborating Center for Reference and Research on Influenza at the Doherty Institute in Melbourne, Australia. Computational analyses of how antibodies recognize different versions of the coronavirus suggest that BA.1 is probably the original coronavirus variant’s most distant sibling, she said. 

    Some members of the FDA advisory committee disagreed with choosing BA.1, instead saying that they’d prefer vaccines that include a portion of BA.4 or BA.5. With BA.1 largely gone, it may be better to follow the proverbial hockey puck where it’s going rather than where it’s been, said Bruce Gellin, chief of Global Public Health Strategy with the Rockefeller Foundation in Washington, D.C. Plus, BA.4 and BA.5 are also vastly different from the original variant. Both have identical spike proteins, which the virus uses to break into cells and the vaccines use to teach our bodies to recognize an infection. So when it comes to making vaccines, the two are somewhat interchangeable.

    There are some real-world data suggesting that current vaccines offer the least amount of protection from BA.4 and BA.5 compared with other omicron subvariants, Marks said. Pfizer also presented data showing results from a test in mice of a bivalent jab with the original coronavirus strain plus BA.4/BA.5. The shot sparked a broad immune response that boosted antibodies against four omicron subvariants. It’s unclear what that means for people. 

    Not everyone on the FDA advisory committee agreed that an update now is necessary — two members voted against it. Pediatrician Henry Bernstein of Zucker School of Medicine at Hofstra/Northwell in Uniondale, N.Y., noted that the current vaccines are still effective against severe disease and that there aren’t enough data to show that any changes would boost vaccine effectiveness. Pediatric infectious disease specialist Paul Offit of Children’s Hospital of Philadelphia said that he agrees that vaccines should help people broaden their immune responses, but he’s not yet convinced omicron is the right variant for it. 

    Plenty of other open questions remain too. The FDA could have authorized either a vaccine that contains omicron alone or a bivalent shot. Some data presented at the meeting hinted that a bivalent dose might spark immunity that could be more durable, but that’s still unknown. Pfizer and Moderna tested their updated shots in adults. It’s unclear what the results mean for kids. Also unknown is whether people who have never been vaccinated against COVID-19 could eventually start with such an omicron-based vaccine instead of the original two doses.

    Maybe researchers will get some answers before boosters start in the fall. But health agencies needed to make decisions now, so vaccine developers have a chance to make the shots in the first place. Unfortunately, we’re always lagging behind the virus, said pediatrician Hayley Gans of Stanford University. “We can’t always wait for the data to catch up.” 

    in Science News on 2022-06-30 10:00:00 UTC.

  • Seven months after an author request, journal retracts

    Two weeks after we reported on the unsuccessful efforts of a researcher at The Ohio State University to have one of his papers retracted for data manipulation, the journal that had been delaying the move has acted.  As we wrote earlier this month based on a request for public records, Philip Tsichlis had been urging … Continue reading Seven months after an author request, journal retracts

    in Retraction watch on 2022-06-29 17:49:43 UTC.

  • Megatooth sharks may have been higher on the food chain than any ocean animal ever

    Whenever paleontologist Dana Ehret gives talks about the 15-meter-long prehistoric sharks known as megalodons, he likes to make a joke: “What did megalodon eat?” asks Ehret, Assistant Curator of Natural History at the New Jersey State Museum in Trenton. “Well,” he says, “whatever it wanted.”

    Now, there might be evidence that’s literally true. Some megalodons (Otodus megalodon) may have been “hyper apex predators,” higher up the food chain than any ocean animal ever known, researchers report in the June 22 Science Advances. Using chemical measurements of fossilized teeth, scientists compared the diets of marine animals — from polar bears to ancient great white sharks — and found that megalodons and their direct ancestors were often predators on a level never seen before.

    The finding contradicts another recent study, which found megalodons were at a similar level in the food chain as great white sharks (SN: 5/31/22). If true, the new results might change how researchers think about what drove megalodons to extinction around 3.5 million years ago.

    In the latest study, researchers examined dozens of fossilized teeth for varieties of nitrogen, called isotopes, that have different numbers of neutrons. In animals, one specific nitrogen isotope tends to be more common than another. A predator absorbs both when it eats prey, so the imbalance between the isotopes grows further up the food chain. 
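
    Ecologists usually turn that reasoning into a simple linear formula: the heavy-isotope excess (expressed as δ15N) climbs by roughly 3–4 per mil with each step up the food chain, so a consumer’s trophic position can be estimated from how far its δ15N sits above a baseline organism. Below is a minimal Python sketch of that standard calculation; the enrichment factor and the δ15N values are illustrative assumptions, not measurements from the megalodon study.

        # Estimate trophic position from nitrogen isotope values (delta-15N, in per mil).
        # TP = baseline_level + (d15N_consumer - d15N_baseline) / enrichment_per_level
        ENRICHMENT_PER_LEVEL = 3.4  # commonly assumed average enrichment per trophic step

        def trophic_position(d15n_consumer, d15n_baseline, baseline_level=2.0):
            """Trophic position relative to a baseline consumer (e.g. a herbivore at level 2)."""
            return baseline_level + (d15n_consumer - d15n_baseline) / ENRICHMENT_PER_LEVEL

        # Hypothetical values: baseline herbivore at 6 per mil, fossil shark tooth at 20 per mil.
        print(trophic_position(d15n_consumer=20.0, d15n_baseline=6.0))  # ~6.1, far up the chain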

    For years, scientists have used this trend to learn about modern creatures’ diets. But researchers were almost never able to apply it to fossils millions of years old because the nitrogen levels were too low. In the new study, scientists get around this by feeding their samples to bacteria that digest the nitrogen into a chemical the team can more easily measure.

    The result: Megalodon and its direct ancestors, known collectively as megatooth sharks, showed nitrogen isotope excesses sometimes greater than any known marine animal. They were on average probably two levels higher on the food chain than today’s great white sharks, which is like saying that some megalodons would have eaten a beast that ate great whites.

    “I definitely thought that I’d just messed up in the lab,” says Emma Kast, a biogeochemist at the University of Cambridge. Yet on closer inspection, the data held up.

    The result is “eyebrow-raising,” says Robert Boessenecker, a paleontologist at the College of Charleston in South Carolina who was not involved in the study. “Even if megalodon was eating nothing but killer whales, it would still need to be getting some of this excess nitrogen from something else,” he says, “and there’s just nothing else in the ocean today that has nitrogen isotopes that are that concentrated.”

    “I don’t know how to explain it,” he says.

    There are possibilities. Megalodons may have eaten predatory sperm whales, though those went extinct before the megatooth sharks. Or megalodons could have been cannibals (SN: 10/5/20).  

    Another complication comes from the earlier, contradictory study. Those researchers examined the same food chain —  in some cases, even the same shark teeth — using a zinc isotope instead of nitrogen. They drew the opposite conclusion, finding megalodons were on a similar level as other apex predators.

    The zinc method is not as established as the nitrogen method, though nitrogen isotopes have also rarely been used this way before. “It could be that we don’t have a total understanding and grasp of this technique,” says Sora Kim, a paleoecologist at the University of California, Merced who was involved in both studies. “But if [the newer study] is right, that’s crazy.”

    Confirming the results would be a step toward understanding why megalodons died off. If great whites had a similar diet, it could mean that they outcompeted megalodons for food, says Ehret, who was not involved in the study. The new findings suggest that’s unlikely, but leave room for the possibility that great whites competed with — or simply ate — juvenile megalodons (SN: 1/12/21). 

    Measuring more shark teeth with both techniques could solve the mystery and reconcile the studies. At the same time, Kast says, there’s plenty to explore with their method for measuring nitrogen isotopes in fossils. “There’s so many animals and so many different ecosystems and time periods,” she says. 

    Boessenecker agrees. When it comes to the ancient oceans, he says, “I guarantee we’re going to find out some really weird stuff.”

    in Science News on 2022-06-29 13:00:00 UTC.

  • How physicists are probing the Higgs boson 10 years after its discovery

    Javier Duarte kicked off his scientific career by witnessing the biggest particle physics event in decades. On July 4, 2012, scientists at the laboratory CERN near Geneva announced the discovery of the Higgs boson, the long-sought subatomic particle that reveals the origins of mass. Duarte was an eager graduate student who’d just arrived at CERN.

    “I was physically there maybe a week before the announcement,” Duarte says. As buzzing throngs of physicists crowded together to watch the announcement at CERN, Duarte didn’t make it to the main auditorium. That space was for VIPs — and those determined enough to wait in line all night to snag a seat. Instead, he says, he found himself in the basement, in an overflow room of an overflow room.

    But the enthusiasm was still palpable. “It was a very exciting time to be getting immersed into that world,” he says. Since then, he and thousands of other physicists from around the world working on CERN experiments have gone all out exploring the particle’s properties.

    Scientists predicted the existence of the Higgs boson back in 1964, as a hallmark of the process that gives elementary particles mass. But finding the particle had to wait for CERN’s Large Hadron Collider, or LHC. In 2010, the LHC began smashing protons together at extremely high energies, while two large experiments, ATLAS and CMS, used massive detectors to look through the debris.

    The particle’s discovery filled in the missing keystone of the standard model of particle physics. That theory explains the known elementary particles and their interactions. Those particles and interactions are behind just about everything we know. The particles serve as building blocks of atoms and transmit crucial forces of nature, such as electromagnetism. And the mass of those particles is key to their behavior. If electrons were massless, for example, atoms wouldn’t form. Without the Higgs boson, then, one of scientists’ most successful theories would collapse.

    The Higgs boson discovery dominated headlines around the globe. About half a million people tuned in to watch the livestreamed announcement, and footage from the event appeared on more than 5,000 news programs. Even oddball minutiae made it into the press, with a few articles analyzing the physicists’ use of the often-scorned font Comic Sans in their presentation. Little more than a year later, the discovery garnered a Nobel Prize for two of the scientists who developed the theory behind the Higgs boson, François Englert and Peter Higgs — for whom the particle is named.

    On July 4, 2012, at the European particle physics lab CERN, scientists announced the discovery of the Higgs boson. Physicist Lyn Evans (standing second from left), who led construction of the Large Hadron Collider, celebrates alongside former CERN directors. Denis Balibouse/AFP/Getty Images

    Now, as the discovery turns 10 years old, that initial excitement persists for Duarte and many other particle physicists. A professor at the University of California, San Diego and a member of the CMS experiment, Duarte still centers his research on the all-important particle. Progress in understanding the Higgs has been “stunning,” he says. “We’ve come so much farther than we expected to.”

    Physicists have been working through a checklist of things they want to know about the Higgs boson. They spent the last decade cataloging its properties, including how it interacts with several other particles. Though measurements have so far been in line with the predictions made by the standard model, if a discrepancy turns up in the future, it may mean there are unknown particles yet to be discovered.

    And there’s still more on the agenda. An especially important item is the Higgs boson’s interaction with itself. To help pin down this and other Higgs properties, scientists are looking forward to collecting more data. Scientists turned on an upgraded LHC for a new round of work in April. At the time of the Higgs discovery, collisions at the LHC reached an energy of 8 trillion electron volts. Collisions are expected to roll in at a record 13.6 trillion electron volts starting July 5, and data-taking will continue until 2026. These higher energies offer opportunities to spot heavier particles. And the High-Luminosity LHC, a more powerful iteration of the LHC, is expected to start up in 2029.

    “Finding a particle, it sounds like the end of something, but it’s really only the beginning,” says experimental particle physicist María Cepeda of CIEMAT in Madrid, a member of the CMS collaboration.

    The standard model

    The standard model of particle physics explains the known elementary particles and their interactions. It consists of 17 particles, many of which have antiparticle partners. The fermions, or matter particles, include six types of quarks (blue) and six leptons (pink). The bosons, or force-carrying particles (orange), transmit the fundamental forces. The Higgs boson has special status: It explains the origin of particles’ masses.

    Coupling up

    Studying the Higgs boson is like geocaching, says theoretical particle physicist Gudrun Heinrich of the Karlsruhe Institute of Technology in Germany. Much like hobbyists use a GPS device to uncover a hidden stash of fun trinkets, physicists are using their wits to uncover the treasure trove of the Higgs boson. In 2012, scientists merely located the cache; the next 10 years were devoted to revealing its contents. And that investigation continues. “The hope is that the contents will contain something like a map that is guiding us towards an even bigger treasure,” Heinrich says.

    Detailed study of the Higgs boson could help scientists solve mysteries that the standard model fails to explain. “We know that the theory has limitations,” says theoretical particle physicist Laura Reina of Florida State University in Tallahassee. For instance, the standard model has no explanation for dark matter, a shadowy substance that throws its weight around the cosmos, exerting a gravitational pull necessary to explain a variety of astronomical observations. And the theory can’t explain other quandaries, like why the universe is composed mostly of matter rather than its alter ego, antimatter. Many proposed solutions to the standard model’s shortcomings require new particles that would alter how the Higgs interacts with known particles.

    The Higgs boson itself isn’t responsible for mass. Instead, that’s the job of the Higgs field. According to quantum physics, all particles are actually blips in invisible fields, like ripples atop a pond. Higgs bosons are swells in the Higgs field, which pervades the entire cosmos. When elementary particles interact with the Higgs field, they gain mass. The more massive the particle, the more strongly it interacts with the Higgs field, and with the Higgs boson. Massless particles, like photons, don’t directly interact with the Higgs field at all.
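
    In the standard model that proportionality is explicit: a fermion’s coupling to the Higgs field (its Yukawa coupling) is fixed by its mass and by the Higgs field’s vacuum value of roughly 246 GeV, via the tree-level relation y = √2·m/v. The Python sketch below simply evaluates that textbook formula with approximate particle masses; it is an illustration, not a calculation from the article.

        import math

        # Tree-level standard-model relation: y_f = sqrt(2) * m_f / v,
        # where v ≈ 246 GeV is the Higgs field's vacuum expectation value.
        HIGGS_VEV_GEV = 246.0

        def yukawa_coupling(mass_gev):
            return math.sqrt(2) * mass_gev / HIGGS_VEV_GEV

        # Approximate masses in GeV; heavier particles couple more strongly.
        for name, mass in [("top quark", 172.7), ("bottom quark", 4.18),
                           ("tau lepton", 1.78), ("muon", 0.106), ("electron", 0.000511)]:
            print(f"{name:12s}  m = {mass:10.6g} GeV  ->  y = {yukawa_coupling(mass):.2e}")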

    One of the best ways to hunt for Higgs-related treasure is to measure those interactions, known as “couplings.” The Higgs couplings describe what particles the Higgs boson decays into, what particles can fuse to produce Higgs bosons and how often those processes occur. Scientists gauge these couplings by sifting through and analyzing the showers of particles produced when Higgs bosons pop up in the debris of proton smashups.

    Even if unknown particles are too heavy to show up at the LHC, the Higgs couplings could reveal their existence. “Any of these couplings not being what you expect them to be is a very clear sign of incredibly interesting new physics behind it,” says particle physicist Marumi Kado of Sapienza University of Rome and CERN, who is the deputy spokesperson for the ATLAS collaboration.

    ATLAS experiment diagram
    The ATLAS experiment was one of two detectors to see definitive signs of the Higgs boson. In this event, recorded on June 10, 2012, and shown here in three different views, a candidate Higgs particle decays into four muons (red tracks). ATLAS Experiment © 2012 CERN (CC BY-SA 4.0)

    Physicists have already checked the couplings to several elementary particles. These include both major classes of particles in physics: bosons (particles that carry forces) and fermions (particles that make up matter, such as electrons). Scientists have measured the Higgs’ interactions with a heavy relative of the electron called a tau lepton (a fermion) and with the W and Z bosons, particles that transmit the weak force, which is responsible for some types of radioactive decay. Researchers also pegged the Higgs’ couplings to the top quark and bottom quark. Those are two of the six types of quarks, which glom together into larger particles such as protons and neutrons. (The Higgs is responsible for the mass of elementary particles, but the mass of composite particles, including protons and neutrons, instead comes mostly from the energy of the particles jangling around within.)

    The couplings measured so far involve the standard model’s heavier elementary particles. The top quark, for example, is about as heavy as an entire gold atom. Since the Higgs couples more strongly to heavy particles, those interactions tend to be easier to measure. Next up, scientists want to observe the lighter particles’ couplings. ATLAS and CMS have used their giant detectors to see hints of the Higgs decaying to muons, the middleweight sibling in the electron family, lighter than the tau but heavier than the electron. The teams have also begun checking the coupling to charm quarks, which are less massive than top and bottom quarks.

    So far, the Higgs has conformed to the standard model. “The big thing we discovered is it looks pretty much like we expected it to. There have been no big surprises,” says theoretical particle physicist Sally Dawson of Brookhaven National Laboratory in Upton, N.Y.

    But there might be discrepancies that just haven’t been detected yet. The standard model predictions agree with measured couplings within error bars of around 10 percent or more. But no one knows if they agree to within 5 percent, or 1 percent. The more precisely scientists can measure these couplings, the better they can test for any funny business.

    An interaction checklist 

    Studying how the Higgs boson interacts with other particles is one way to test whether it fits with predictions of the standard model. Scientists have measured the Higgs’ interactions, or “couplings,” with five standard model particles (dark pink in the graphic), and have early evidence of coupling with a sixth. Heavier particles have been the first targets since they interact more strongly with the Higgs boson (as seen in the graph), so are easier to measure. So far, all measured couplings agree with predictions.

    standard model diagram and graph of particle mass vs. Higgs coupling for various particles
    E. Otwell; CMS collaboration/CERN

    One of a kind

    Before the LHC turned on, scientists had a clear favorite for a physics theory that could solve some of the standard model’s woes: supersymmetry, a class of theories in which every known particle has an undiscovered partner particle. Physicists had hoped such particles would turn up at the LHC. But none have been found yet. Though supersymmetry isn’t fully ruled out, the possibilities for the theory are far more limited.

    With no consensus candidate among many other theories for what could be beyond the standard model, a lot of focus rests on the Higgs. Physicists hope studies of the Higgs will reveal something that might point in the right direction to untangle some of the standard model’s snarls. “Measuring [the Higgs boson’s] properties is going to tell us much more about what is beyond the standard model … than anything before,” Reina says.

    One question that scientists are investigating in LHC smashups is whether the Higgs is truly unique. All the other known elementary particles have a quantum form of angular momentum, known as spin. But the Higgs has a spin of zero, what’s known as a “scalar.” Other types of particles tend to come in families, so it’s not outlandish to imagine that the Higgs boson could have scalar relatives. “It could be there’s a huge scalar sector somewhere hiding and we just saw the first particle of it,” Heinrich says. Supersymmetry predicts multiple Higgs bosons, but there are plenty of other ideas that envision Higgs accomplices.

    It’s also possible that the Higgs is not actually elementary. Combinations of particles, such as quarks, are known to make up larger particles with spins of zero. Perhaps the Higgs, like those other scalars, is made up of yet unknown smaller stuff.

    While hunting for these answers, physicists will be watching closely for any connection between the Higgs’ behavior and other recent puzzling results. In 2021, the Muon g−2 experiment at Fermilab in Batavia, Ill., reported hints that muons have magnetic properties that don’t agree with predictions of the standard model. And in April, scientists with the CDF experiment — which studied particle collisions at Fermilab until 2011 — found that the W boson’s mass is heavier than the standard model predicts.

    The Higgs boson’s relative newness makes it ripe for discoveries that could help sort out these quandaries. “The Higgs boson is the least explored elementary particle, and it could be a door to the other mysteries we still have to uncover or to shed light on,” Heinrich says.

    photo of the CMS detector
    The CMS detector, one of the detectors that discovered the Higgs boson, got upgraded in advance of a new run of particle collisions that started at the Large Hadron Collider this year. Samuel Joseph Hertzog/CERN

    Self-talk

    To work out thorny puzzles, physicists sometimes talk to themselves. Fittingly, another puzzle atop scientists’ Higgs to-do list is whether the particle, likewise, talks to itself.

    This “self-coupling,” how Higgs bosons interact with one another, has never been measured before. But “it turns out to be really just an incredible barometer of new physics,” says theoretical particle physicist Nathaniel Craig of the University of California, Santa Barbara. For example, measuring the Higgs self-coupling could suss out hidden particles that interact only with the Higgs, oblivious to any of the other standard model particles.

    The Higgs self-coupling is closely related to the Higgs potential, an undulating, sombrero-shaped surface that describes the energy of the universe-pervading Higgs field. In the early universe, that potential determined how the fundamental particles gained mass, when the Higgs field first turned on.

    How, exactly, that transition from massless to massive happened has some big implications for the cosmos. It could help explain how matter gained the upper hand over antimatter in the early universe. If the Higgs field did play that role in the universe’s beginnings, Craig says, “it’s going to leave some fingerprints on the Higgs potential that we measure today.”

    Depending on the full shape of the Higgs potential’s sombrero, at some point in the exceedingly distant future, the Higgs field could shift again, as it did in the early universe. Such a jump would change the masses of fundamental particles, creating a universe in which familiar features, including life, are probably obliterated.

    To better understand the Higgs potential, scientists will attempt to measure the self-coupling. They’ll do it by looking for Higgs bosons produced in pairs, a sign of the Higgs interacting with itself. That’s thought to happen at less than a thousandth the rate that individual Higgs bosons are produced in the LHC, making it extremely difficult to measure.

    Even with the planned High-Luminosity LHC, which will eventually collect about 10 times as much data as the LHC, scientists predict that the self-coupling will be measured with large error bars of about 50 percent, assuming the standard model is correct. That’s not enough to settle the matter.
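
    A rough counting argument shows why even ten times more data only goes part of the way: for a rare process like double-Higgs production, the statistical error bar shrinks roughly as the square root of the amount of data collected, so a tenfold increase in collisions improves the precision by only about a factor of three. The snippet below is a generic illustration of that scaling, not an LHC projection.

```python
# Generic illustration: for a counting measurement, the fractional statistical
# uncertainty scales as 1 / sqrt(N), so multiplying the data set by a factor k
# only shrinks the error bars by sqrt(k). Not an LHC projection.
import math

def error_bar_shrinkage(data_multiplier: float) -> float:
    """Factor by which statistical error bars shrink when the data set grows."""
    return math.sqrt(data_multiplier)

for k in (2, 10, 100):
    print(f"{k:>4}x data -> error bars shrink by ~{error_bar_shrinkage(k):.1f}x")
```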

    If scientists just do what they’re on track to do, “we’re going to fall short,” Duarte says. But new techniques could allow physicists to better identify double-Higgs events. Duarte is studying collisions in which two particularly high-energy Higgs bosons each decay into a bottom quark and a bottom antiquark. Using a specialized machine learning technique, Duarte and colleagues put together one of the most sensitive analyses yet of this type of decay.

    By improving this technique, and combining results with those from other researchers looking at different types of decays, “we have a good hope that we’ll be able to observe [the self-coupling] definitively,” Duarte says.

    illustration of the Higgs potential as a sombrero
    The Higgs potential can be represented as a sombrero-shaped surface that describes the energy of the Higgs field. At some point in the early universe, the energy of the field dropped from a higher value atop the sombrero to a lower energy in the sombrero’s well (illustrated). That’s when particles acquired mass. John Ellis/arXiv.org 2013, CERN, adapted by E. Otwell

    Waiting game

    Despite all his passion for the Higgs, Duarte notes that there have been disappointments. After that first rush of the Higgs announcement, “I was hoping for a Higgs-level discovery every year.” That didn’t happen. But he hasn’t lost his optimism. “We expect there to be another twist and turn coming up,” he says. “We’re still hoping it’s around the corner.”

    The wait for new physics is no shock to veterans of earlier particle hunts. Meenakshi Narain, a particle physicist at Brown University in Providence, R.I., and a member of the CMS experiment, was an undergraduate student around the time the bottom quark was discovered in the 1970s. After that discovery, Narain joined the search for the top quark. Even though physicists were convinced of the particle’s existence, that hunt still took nearly 20 years, she says. And it took nearly 50 years to uncover the Higgs boson after it was postulated.

    The standard model’s flaws make physicists confident that there must be more treasures to unearth. Because of her past experiences with the long-haul process of discovery, Narain says, “I have a lot of faith.”

    in Science News on 2022-06-29 11:00:00 UTC.

  •

    Author demands a refund after his paper is retracted for plagiarism

    The author of a 2021 paper in a computer science journal has lost the article because he purportedly stole the text from the thesis of a student in Pakistan – a charge he denies. According to the editors of Computational Intelligence and Neuroscience, a Hindawi title, Marwan Ali Albahar, of Umm Al Qura University College …

    in Retraction watch on 2022-06-29 10:00:00 UTC.

  •

    Drinking coffee before shopping can lead to impulse buying

    By Emily Reynolds

    Those wanting to eat more healthily and save money are often advised not to go food shopping while hungry, the theory being that we make less prudent purchases when we’re more concerned with satisfying our immediate needs than thinking about long term goals. But how do other states of mind affect our purchases?

    We’d probably not think anything of having a cup of coffee or a can of Coke before going shopping. But a new study, published in the Journal of Marketing, finds that caffeine may have a bigger impact than we think, with participants spending more and buying more things after a caffeinated drink.

    In the first study, the team set up an espresso station over four days at a large chain store selling household goods in France. Each day, some shoppers were given espresso with caffeine while others were given decaf. Those who took the free coffee were asked to share the receipt for their purchases on leaving the shop, and answered questions on how excited, alert, and sleepy they felt while doing their shopping.

    The results showed that those who had consumed caffeinated, rather than decaffeinated, espresso felt more excited and alert after drinking their coffee. They also bought a higher number of items and spent more overall. This was replicated in a second study, which took place in a city in Spain.

    The third study, again in France, followed the same procedure, this time also tracking different product categories as well as cost. Again, drinking a caffeinated coffee before shopping led to greater spending and a greater number of purchases than drinking a non-caffeinated beverage. The types of items bought by caffeinated participants also differed — they were more likely than non-caffeinated participants to buy "high hedonic" items such as buttery or rich foods, which are considered more pleasurable than utilitarian things.

    In a final study, students drank caffeinated or decaffeinated drinks before being asked whether they wanted to buy items from a "relaxation" category, which contained products of high hedonic value, or from a category containing more useful objects like notebooks and diaries. Participants bought fewer useful and more "exciting" items after drinking coffee, and again felt more excited and alert afterwards, which the researchers see as the likely mechanism behind the extra buying and spending: we are more impulsive when we feel excited, and thus more likely to take a risk on products we might not go for when we are feeling calm.

    This knowledge could obviously be of use to retailers — putting coffee stands or shops near the entrance of stores could make shoppers more likely to spend their money, and free coffee could potentially produce a significant return on investment. For this reason, the team argues that regulators should inform consumers of the impact of caffeine.

    On an individual basis, the results are clearly useful too. If you are struggling with your spending at the supermarket, external factors like the cost of living crisis are probably more to blame than your own individual choices. However, being aware of the impact of caffeine could help people make small savings or at least be more aware of the potential for unplanned spending.

    – EXPRESS: Caffeine’s Effects on Consumer Spending

    Emily Reynolds is a staff writer at BPS Research Digest

    in The British Psychological Society - Research Digest on 2022-06-29 09:23:18 UTC.

  •

    The Apprentice of the One-Man Papermill

    "'More than 80 articles and H-index over 25 (Scopus) have been achieved'." - M.K. Ahmed

    in For Better Science on 2022-06-29 06:02:45 UTC.

  •

    ‘Elusive’ profiles the physicist who predicted the Higgs boson

    Elusive
    Frank Close
    Basic Books, $30

    There’s a lot more to the story of the Higgs boson than just one man named Higgs.

    Despite the appeal of the “lone genius” narrative, it’s rare that a discovery can be attributed solely to the work of one scientist. At first, Elusive, a biography of Peter Higgs written by physicist and author Frank Close, seems to play into that misleading narrative: The book is subtitled “How Peter Higgs solved the mystery of mass.”

    But the book quickly — and rightfully — veers from that path as it delves into the theoretical twists and turns that kicked off a decades-long quest for the particle known as the Higgs boson, culminating with its discovery in 2012 (SN: 7/28/12, p. 5). That detection verified the mechanism by which particles gain mass. Higgs, of the University of Edinburgh, played a crucial role in establishing mass’s origins, but he was one of many contributors.

    The habitually modest and attention-averse Higgs makes the case against himself as the one whiz behind the discovery, the book notes: According to Higgs, “my actual contribution was only a key insight right at the end of the story.”

    The Higgs boson itself doesn’t bestow fundamental particles with mass. Instead, its discovery confirmed the correctness of a theory cooked up by Higgs and others. According to that theory, elementary particles gain mass by interacting with a field, now known as the Higgs field, that pervades all of space.

    A paper from Higgs in 1964 was not the first to propose this process. Physicists Robert Brout and François Englert just barely beat him to it. And another team of researchers published the same idea just after Higgs (SN: 11/2/13, p. 4). Crucial groundwork had already been laid by yet other scientists, and still others followed up on Higgs’ work. Higgs, however, was the one to make the pivotal point that the mass mechanism implied the existence of a new, massive particle, which could confirm the theory.

    Despite this complicated history, scientists slapped his name on not just the particle, the Higgs boson, but also the process behind it, traditionally called the Higgs mechanism, but more recently and accurately termed the Brout-Englert-Higgs mechanism. (Higgs has reportedly proposed calling it the “ABEGHHK’tH mechanism,” using the first letter of the last names of the parade of physicists who contributed to it, Anderson, Brout, Englert, Guralnik, Hagen, Higgs, Kibble and ’t Hooft.) The postmortem of how Higgs’ name attained outsize importance is one of the most interesting sections of Elusive, revealing much about the scientific sausage-making process and how it sometimes goes awry. Equally fascinating is the account of how the media embraced Higgs as a titan of physics based on his association with the boson, lofting him to a level of fame that, for Higgs, felt unwelcome and unwarranted.

    The book admirably tackles the complexities of the Brout-Englert-Higgs mechanism and how particles gain mass, covering details that are usually glossed over in most popular explanations. Close doesn’t shy away from nitty-gritty physics terms like “perturbation theory,” “renormalization” and “gauge invariance.” The thorniest bits are most appropriate for amateur physics aficionados who desire a deeper understanding, and those bits may require a reread before sinking in.

    Higgs is famously not a fan of the limelight — he disappeared for several hours on the day he won a Nobel Prize for his work on mass. The physicist sometimes seems to fade into the background of this biography as well, with multiple pages passing with no appearance or contribution from Higgs. Once the scientific community got wind of the possibility of a new particle, the idea took on a life of its own, with experimental physicists leading the charge. Higgs didn’t make many contributions to the subject beyond his initial insight, which he calls “the only really original idea I’ve ever had.”

    Thus, the book sometimes feels like a biography of a particle named Higgs, with the person playing a backup role. Higgs is so reserved and so private that you get the sense that Close still hasn’t quite cracked him. While interesting details of Higgs’ life and passions are revealed — for example, his fervent objection to nuclear weapons — deeper insights are missing. In the end, Higgs is, just like the particle named after him, elusive.


    Buy Elusive from Bookshop.org. Science News is a Bookshop.org affiliate and will earn a commission on purchases made from links in this article.

    in Science News on 2022-06-28 14:00:00 UTC.

  •

    Long-Term Depression and Recognition of Parallel Fibre Patterns in a Multi-Compartmental Model of a Cerebellar Purkinje Cell

    In this week's Journal Club session, Volker Steuber will talk about temporal coding and, in particular, rank order coding. Please see the papers below for more details.


    It has been suggested that long-term depression (LTD) of parallel fibre (PF) synapses enables a cerebellar Purkinje cell (PC) to learn to recognise PF activity patterns. We investigate the recognition of PF patterns that have been stored by LTD of AMPA receptors in a multi-compartmental PC model with a passive soma. We find that a corresponding artificial neural network outperforms a PC model with active dendrites by an order of magnitude. Removal of the dendritic ion channels leads to a further decrease in performance. Another effect of the active dendrites is an afterhyperpolarization response to novel PF patterns. Thus, the LTD-based storage of PF patterns can lead to a potentiated late PC response.
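
    As a purely illustrative toy version of the idea in the abstract (parallel fibre patterns are stored by depressing the synapses they activate, and stored patterns are then recognisable by the weaker total input they evoke), here is a hypothetical sketch. It is not the multi-compartmental model from the paper, which includes dendritic ion channels and realistic synaptic conductances; the pattern sizes, depression factor and simple summed-input readout are all assumptions chosen for clarity.

```python
# Toy sketch of pattern storage by LTD: synapses active in "learned" parallel fibre
# (PF) patterns are depressed, so learned patterns later evoke a weaker summed input
# than novel ones. All parameters are illustrative assumptions, not values from the
# Steuber & De Schutter model.
import random

N_PF = 1000          # PF synapses onto the model cell
PATTERN_SIZE = 100   # PF inputs active in each pattern
LTD_FACTOR = 0.5     # synaptic weight after depression (baseline weight = 1.0)

random.seed(1)

def make_pattern():
    return set(random.sample(range(N_PF), PATTERN_SIZE))

learned = [make_pattern() for _ in range(10)]
novel = [make_pattern() for _ in range(10)]

# Storage: depress every synapse that participated in a learned pattern.
weights = [1.0] * N_PF
for pattern in learned:
    for synapse in pattern:
        weights[synapse] = LTD_FACTOR

def summed_input(pattern):
    return sum(weights[synapse] for synapse in pattern)

mean_learned = sum(map(summed_input, learned)) / len(learned)
mean_novel = sum(map(summed_input, novel)) / len(novel)
print(f"mean summed input, learned patterns: {mean_learned:.1f}")
print(f"mean summed input, novel patterns:   {mean_novel:.1f}")
# In this toy readout the learned patterns give the smaller input; in the biophysical
# model the difference shows up in the cell's spike output rather than a simple sum.
```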


    Papers:

    • B. Sabatini, W. Regehr, "Timing of Neurotransmission at Fast Synapses in the Mammalian Brain", 1996, Nature, 384, 170--172
    • V. Steuber, E. De Schutter, "Long-Term Depression and Recognition of Parallel Fibre Patterns in a Multi-Compartmental Model of a Cerebellar Purkinje Cell", 2001, Neurocomputing, 38--40, 383--388
    • S. Thorpe, D. Fize, C. Marlot, "Speed of Processing in the Human Visual System", 1996, Nature, 381, 520--522
    • S. Thorpe, A. Delorme, R. VanRullen, "Spike-Based Strategies for Rapid Processing", 2001, Neural Networks, 14, 715--725

    Date: 2022/07/01
    Time: 14:00
    Location: online

    in UH Biocomputation group on 2022-06-28 13:18:39 UTC.

  •

    Working memory training won’t make you more intelligent

    By Emma Young

    What can you do to make yourself smarter? All kinds of interventions have been designed and tried, mostly with little success. However, some studies have suggested that training working memory is effective. This has led to it becoming the most popular form of intelligence-training intervention, write the authors of a new paper in the Journal of Experimental Psychology: Learning, Memory and Cognition.

    There have been mixed results in this area though, and, the team argues, potential problems with the methodology of some previous studies, making it hard to draw firm conclusions. (For example, some of the trials that failed to find an effect perhaps involved too little training.) So they set out to run as definitive a trial as possible. The results of their two-year longitudinal study now suggest that while working memory can indeed be improved in typically developing children, this has no impact whatsoever on intelligence.

    Working memory is the type that you use to consciously hold and manipulate information in your mind. No end of studies have linked better working memory scores to greater “fluid intelligence” — the sort involved in reasoning and learning. (Fluid intelligence is widely viewed as the “key ingredient” in human cognitive abilities, the team notes.) In fact, working memory has been viewed as the underpinning of fluid intelligence for decades. So it’s certainly reasonable to think that improving someone’s working memory might improve their intelligence, too.

    The team studied 225 healthy German children, who were aged about 14 when the study began. All had their working memory and intelligence assessed at the start and end of the two-year period. To measure working memory, the team used three types of tasks, which involved letters, numbers and spatial positions (alpha span, memory updating and N-back). All required the participants to hold and manipulate chunks of information in their mind to do well.
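
    For readers unfamiliar with these tasks: in an N-back, the participant sees a stream of items and must indicate whenever the current item matches the one shown N steps earlier. The snippet below is a minimal, hypothetical scoring sketch for a 2-back with letters; it is not the software used in the study, and the letter stream and responses are invented for illustration.

```python
# Minimal sketch of scoring a 2-back task: a trial is a target when the current
# letter matches the letter shown two positions earlier. The stream and the
# participant's responses below are invented for illustration.

def nback_targets(stream, n=2):
    """Indices at which the current item matches the item n steps back."""
    return [i for i in range(n, len(stream)) if stream[i] == stream[i - n]]

stream = list("BKBGKGQKQT")
responses = {2, 4, 8}   # positions where a hypothetical participant pressed "match"

targets = set(nback_targets(stream, n=2))
hits = len(targets & responses)
false_alarms = len(responses - targets)
print(f"targets at {sorted(targets)}; hits = {hits}, false alarms = {false_alarms}")
```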

    About half of the children were in the intervention group. Every two weeks, they spent an hour engaged in versions of these three working memory tasks. In total, they received 40 hours of training over the study period.

    The results showed that this extensive practice did indeed boost their performance on these tasks. Two years on, these children had got better at all three — and they did substantially better than the control group. Also, the team’s analysis suggested that the trained children experienced a more general improvement in working memory that wasn’t only restricted to the individual tasks. However, they write, “Despite the striking improvements in WM [working memory], we did not observe transfer to intelligence.”

    Given all the data clearly linking working memory and intelligence, this prompts the question: why not?

    There is no clear answer to that. Despite the apparent usefulness of a good working memory for reasoning and learning, perhaps something else underpins both working memory capacity and fluid intelligence, and it’s that underlying X factor that explains the link between the two.

    There are some recent meta-analyses that also cast doubt on the idea that working memory training improves intelligence, the team notes. And in fact, of all the brain-training interventions that have been explored, only one has been found to consistently and robustly improve intelligence. That, as they write, is education — a “dosage” of several hours a day of broad brain training for most weeks of the year, over many years.

    Perhaps future work will find that a briefer intervention does reliably improve intelligence. And there’s no doubt that work in this field will continue. “Because the consequence of successful interventions would be far-reaching, research into (alternative) cognitive interventions will persist,” the team concludes. For now, though, it seems that if your goal is to become smarter, working memory training is a waste of time. 

    – Training working memory for two years—No evidence of transfer to intelligence.

    Emma Young (@EmmaELYoung) is a staff writer at BPS Research Digest

    in The British Psychological Society - Research Digest on 2022-06-28 11:47:08 UTC.

  •

    A neck patch for athletes could help detect concussions early

    A flexible sensor applied to the back of the neck could help researchers detect whiplash-induced concussions in athletes.

    The sensor, described June 23 in Scientific Reports, is about the size of a bandage and is sleeker and more accurate than some instruments currently in use, says electrical engineer Nelson Sepúlveda of Michigan State University in East Lansing. “My hope is that it will lead to earlier diagnosis of concussions.”

    Bulky accelerometers in helmets are sometimes used to monitor for concussion in football players. But since the devices are not attached directly to athletes’ bodies, the sensors are prone to false readings from sliding helmets.

    Sepúlveda and colleagues’ patch adheres to the nape. It is made of two electrodes on an almost paper-thin piece of piezoelectric film, which generates an electric charge when stretched or compressed. When the head and neck move, the patch transmits electrical pulses to a computer. Researchers can analyze those signals to assess sudden movements that can cause concussion.
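
    As a purely hypothetical illustration of what analyzing those signals could involve at its simplest, the sketch below flags moments when a voltage trace changes faster than some threshold. The sampling rate, threshold and fake trace are all invented; the actual analysis described in Scientific Reports is more involved.

```python
# Hypothetical sketch: flag sudden movements in a strain-sensor voltage trace by
# looking for samples whose rate of change exceeds a threshold. The sampling rate,
# threshold and trace are invented for illustration only.
SAMPLE_RATE_HZ = 1000
THRESHOLD_V_PER_S = 50.0   # invented cutoff for a "sudden" change

def sudden_movement_times(voltage_trace):
    dt = 1.0 / SAMPLE_RATE_HZ
    flagged = []
    for i in range(1, len(voltage_trace)):
        rate = abs(voltage_trace[i] - voltage_trace[i - 1]) / dt
        if rate > THRESHOLD_V_PER_S:
            flagged.append(i * dt)
    return flagged

# A fake trace: quiet baseline, then a sharp deflection a few milliseconds in.
trace = [0.0] * 5 + [0.4, 0.9, 0.3] + [0.0] * 5
print("flagged times (s):", sudden_movement_times(trace))
```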

    The team tried out the patch on the neck of a human test dummy, dropping the figure from a height of about 60 centimeters. Researchers also packed the dummy’s head with different sensors to provide a baseline level of neck strain. Data from the patch aligned with data gathered by the internal sensors more than 90 percent of the time, Sepúlveda and colleagues found.

    The researchers are now working on incorporating a wireless transmitter into the patch for an even more streamlined design.

    in Science News on 2022-06-28 11:00:00 UTC.

  •

    Introducing the BMC Series SDG Editorial Board Members: Jean-Michel Heraud

    Welcome to our SDG Editorial Board Members blog collection. We are hearing from the Editorial Board Members of the BMC Series journals whose work aligns with achieving the Sustainable Development Goals. Here you can find other posts in this collection, grouped with the tag ‘SDG editorial board members‘.


    During my almost 20 years working in Africa and low- and middle-income countries, I have led several public health and research programs covering a broad variety of topics (influenza and other respiratory viruses, arboviruses, hepatitis, zoonotic pathogens, etc.). The unique geographical characteristics of Africa make this continent a model of choice for studies on the evolution of viral populations and the mechanisms of introduction and maintenance of viruses in regions with several bio-climates and important ecological diversity. This environment, where the proportion of endemism is among the highest in the world, also offers the opportunity to elucidate new epidemiological systems and organisms, and to test innovative models of disease surveillance.

    My “philosophy” during my years spent in Madagascar and working in Africa was to develop research programs of interest both at the local and international level, and to train/mentor young African scientists in order for them to develop and manage their own research projects.

    Children attending school in a remote village of Madagascar [CC0 Public Domain. Free for personal and commercial use. No attribution required]
    Nowadays, I’m developing new programs in the field of rabies and viral encephalitis. The aim of the rabies project is to implement an integrated approach for rabies surveillance in Senegal making use of an innovative application (REACT), developed by Mission Rabies and the US Centers for Disease Control and Prevention. This integrated bite case management is coordinated with the Senegalese Ministry of Health and Ministry of Livestock.

    Field/outbreak investigation difficulties during the rainy season in North of Madagascar [courtesy of Julia Guillebaud and Jean-Michel Heraud]
    In Senegal, very little data are available on the main etiological causes of infectious encephalitis. This lack of knowledge impacts the management of patients. The research program that we have developed since 2020 is aimed at increasing our knowledge of the viral etiologies of encephalitis in Dakar, the capital city of Senegal. To explore this, we have implemented a multi-pathogen diagnostic platform that enables clinicians to receive results within 24h, thus accelerating appropriate management of patients. The innovation of our approach for the country is to test not only cerebrospinal fluids but also other biological specimens like nasopharyngeal swabs and blood samples. We published our first observational study in April 2022, in which we demonstrated the high mortality rate (41%) among viral encephalitis patients. We also noted that SARS-CoV-2 seems to play a significant role in patients presenting with encephalitis. Despite the challenges posed by the limited funds available for neglected diseases, we hope that our results will allow us to better characterize the clinical spectrum of viral encephalitis, identify potential risk factors and improve healthcare by giving patients access to better treatment. Our final aim is to develop a network of hospital-based encephalitis surveillance not only in Senegal, but across the West African region.

    To conclude, I could say that my background allowed me to perform some basic science; nevertheless, I was always interested in developing operational research programs aimed at identifying the burden of some viral diseases and reducing the associated morbidity and mortality. Although most of my research relates to Sustainable Development Goal 3.3 (By 2030, end the epidemics of AIDS, tuberculosis, malaria and neglected tropical diseases and combat hepatitis, water-borne diseases and other communicable diseases), I have a personal interest in developing new and more inclusive approaches to tackling diseases, in particular in regions with poor access to healthcare. For that reason, I believe that embracing several SDGs, in particular SDGs 3 (Good Health and Well-being), 4 (Quality Education), 6 (Clean Water and Sanitation) and 7 (Affordable and Clean Energy), through synergies between different programs could help reach some of the 2030 SDG targets.

    As Louis Pasteur once said: “Science knows no country, because knowledge belongs to humanity, and is the torch which illuminates the world.” As a scientist who has worked for decades on infectious diseases and public health, it would be a great personal achievement if I could play a role in reducing the morbidity and mortality due to viruses, making use of global and sustainable approaches.

     

    The post Introducing the BMC Series SDG Editorial Board Members: Jean-Michel Heraud appeared first on BMC Series blog.

    in BMC Series blog on 2022-06-28 09:22:51 UTC.

  •

    The Lancet more than doubles its impact factor, eclipsing NEJM for the first time ever

    The Lancet has overtaken the New England Journal of Medicine as the medical journal with the highest impact factor, according to Clarivate’s 2022 update to its Journal Citation Reports. And the jump wasn’t subtle: The Lancet’s impact factor – a controversial measure of how often a journal’s papers are cited on average – more than …

    in Retraction watch on 2022-06-28 09:00:00 UTC.

  •

    Here’s what we know right now about getting COVID-19 again

    Not long before the end of the school year, my husband and I received an e-mail from our fifth-grader’s principal that may now be all-too-familiar to many parents. The subject line included the words, “MULTIPLE COVID CASES.” 

    Several students in my daughter’s class had tested positive for COVID-19. Her school acted fast. It reinstated a mask mandate for 10 days and required students not up-to-date on their COVID-19 vaccinations to quarantine.

    These precautions may have helped — my daughter didn’t end up bringing the virus home. But for kids who do, COVID-19 can hopscotch through households, knocking down relatives one by one. And it’s not clear how long one infection protects you from a second round with the virus.

    Recent high-profile cases have put reinfections in the spotlight. Health and Human Services Secretary Xavier Becerra has had two bouts of COVID-19 in less than a month. So has The Late Show host Stephen Colbert. Back at his desk in May, he joked, “You know what they say. ‘Give me COVID once, shame on you. Give me COVID twice, please stop giving me COVID.’”

    Just a few months ago, scientists thought reinfections were relatively rare, occurring most often in unvaccinated people (SN: 2/24/22). But there are signs the number may be ticking up. 

    An ABC News investigation that contacted health departments in every state reported June 8 that more people seem to be getting the virus again. And omicron, the variant that sparked last winter’s surge, is still spawning sneaky subvariants. Some can evade antibodies produced after infection with the original omicron strain, scientists report June 17 in Nature. That means a prior COVID-19 infection might not be as helpful against future infections as it once was (SN:8/19/21). What’s more, reinfection could even add to a person’s risk of hospitalization or other adverse outcomes, a preliminary study suggests.

    Scientists are still working to pin down the rate of reinfection. Like most questions involving COVID-19 case numbers, the answer is more than a little murky. “You really need to have a cohort of people who are well followed and tested every time they have symptoms,” says Caroline Quach-Thanh, an infectious diseases specialist at CHU Sainte-Justine, a pediatric hospital at the University of Montreal. 

    A recent look at hundreds of thousands of COVID-19 cases among people in the province of Quebec found that roughly 4 percent were reinfections, scientists report in a preliminary study posted May 3 at medRxiv.org (SN:5/27/22). Quach-Thanh has seen an even smaller rate in her own study of health care workers first infected between March and September of 2020. Those data are still unpublished, but she points out that most of the people in her study were vaccinated. “A natural infection with three doses of vaccines protects better than just a natural infection,” she says. 

    As many families, mine included, gear up for summer camps and vacations, I wanted to learn more about our current COVID-19 risks. I chatted with Quach-Thanh and Anna Durbin, an infectious diseases physician at Johns Hopkins Bloomberg School of Public Health who has studied COVID-19 vaccines. Our conversations have been edited for length and clarity.

    What’s the latest on reinfections? Is the picture changing?

    Durbin: We have to remember that the virus strain that’s circulating now is very different from the earlier strains. Whether you’ve been infected with COVID-19 or vaccinated, your body makes an immune response to fight future infections. It recognizes [the strain] your body originally saw. But as the virus changes, as it did with omicron, it becomes sort of a fuzzier picture for the immune system. It’s not recognizing the virus as well, and that’s why we’re seeing reinfections. 

    I’ll also say that reinfections — particularly with respiratory viruses — are very common.

    How can scientists distinguish a true reinfection from a relapse of an original infection?

    Quach-Thanh: There are multiple ways of looking at this. The first is looking at the time elapsed between the first infection and a new positive PCR test. If it has been more than three months, it is unlikely to be just a remnant of a previous infection. We can also look at viral load. A really high viral load usually means it’s a new infection. But the best way to tell is to sequence the virus [to determine its genetic makeup] to see if it is actually a new strain. 
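
    Quach-Thanh's rules of thumb can be summarised as a short decision list. The sketch below just restates them in code; the PCR cycle-threshold cutoff standing in for "a really high viral load" is an invented placeholder, since the interview gives no number, and real calls rest on clinical judgment and sequencing.

```python
# Restatement of the rules of thumb above for distinguishing a reinfection from a
# lingering first infection. The Ct cutoff standing in for "really high viral load"
# is an invented placeholder; sequencing remains the most definitive check.

def likely_reinfection(months_since_first, pcr_ct_value=None, new_strain_by_sequencing=None):
    if new_strain_by_sequencing:            # a different strain settles it
        return True
    if months_since_first > 3:              # a long gap makes a leftover positive unlikely
        return True
    if pcr_ct_value is not None and pcr_ct_value < 25:   # low Ct ~ high viral load (assumed cutoff)
        return True
    return False

print(likely_reinfection(months_since_first=5))                    # True: long gap
print(likely_reinfection(months_since_first=1, pcr_ct_value=32))   # False: short gap, modest load
```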

    What do we know about the health risks of reinfection?

    Quach-Thanh: The good thing is that most of the people who got reinfected [in the Quebec study] got a mild disease, and the risk of hospitalization and death was much lower. 

    When you get reinfected, you might [have symptoms] like a cold, or even sometimes a cough, and a little bit of a fever, but you usually don’t progress to complications as much as you would with your first infection — if you’re vaccinated.

    Does reinfection increase your chance of developing long COVID?

    Durbin: I think that’s unknown, but it’s being studied. 

    As we look back at the omicron wave in the U.S. that happened in January and February, now is about the time we would start to see symptoms of long COVID. So far it looks promising. We seem to be seeing a lower incidence of long COVID [after reinfection with omicron] than we did with primary infection, but those data are going to continue to be collected over the next few months.

    At this point in the pandemic, how cautious do we need to be?

    Quach-Thanh: It depends on your baseline risk of complications. If you’re healthy, if you’re doing most activities outdoors, if you’re vaccinated, life can proceed. But if you’re immune suppressed or elderly, the situation might be different.


    If you have symptoms, it would be advisable to not mingle in indoor settings without a mask so that you don’t contaminate other people. There are immunocompromised people who might be at risk of serious infection. We still need to keep them in mind. I think we have to be responsible, and if we’re sick, we should get tested.

    Durbin: This is what I tell my friends, family and patients: This virus is here to stay. Any time you’re in a crowded place with poor ventilation and lots of people, there’s a chance there’s going to be transmission. The risk is never going to be zero. It’s a message people don’t want to hear. But as long as there are people to infect, this virus is not going away. 

    We have to move to acceptance, and we have to be better members of society. If we can, we should stay home when we’re sick. If we can’t stay home, we should wear a mask. We should wash our hands regularly. These are things that work to reduce transmission.

    They reduce your risk of getting not just COVID-19, but also a cold or the flu.

    Can we anticipate another surge in cases?

    Quach-Thanh: I think the next wave will come in the fall. The problem with this virus is that it mutates. And as long as it’s transmitted, it will continue to mutate.

    What can we look forward to? 

    Durbin: I think young kids getting vaccinated is going to reduce the ability of the virus to spread (SN: 6/17/22). That’s good news.


    in Science News on 2022-06-27 13:00:00 UTC.

  •

    Britons’ tools from 560,000 years ago have emerged from gravel pits

    In the 1920s, laborers and amateur archaeologists at gravel quarry pits in southeastern England uncovered more than 300 ancient, sharp-edged oval tools. Researchers have long suspected that these hand axes were made 500,000 to 700,000 years ago. A new study confirms that suspicion in the first systematic excavation of the site, known as Fordwich.

    Dating those tools and more recent finds suggests that humanlike folk inhabited the area between about 560,000 and 620,000 years ago, researchers report in the June Royal Society Open Science. Relatively warm conditions at that time drew hominids to what’s now northern Europe before the evolutionary rise of Neandertals and Homo sapiens.

    The results confirm that Fordwich is one of the oldest hominid sites in England. Previous discoveries place hominids in what’s now southeastern England at least 840,000 years ago (SN: 7/7/10) and perhaps as far back as nearly 1 million years ago (SN: 2/11/14). No hominid fossils have been found at Fordwich. It’s unclear which species of the human genus made the tools.

    In 2020, archaeologist Alastair Key of the University of Cambridge and colleagues unearthed 238 stone artifacts at Fordwich that display grooves created by striking the surface with another stone. Other finds include three stones with resharpened edges, presumably used to scrape objects like animal hides.

    A method for determining when sediment layers were last exposed to sunlight indicated that the newly discovered artifacts date to roughly 542,000 years ago. The previously unearthed hand axes probably came from the same sediment.

    Hominids must have fashioned tools at Fordwich a bit earlier than 542,000 years ago because ancient climate data suggest that an ice age at that time made it hard to survive in northern Europe, the team concludes. Warmer conditions between 560,000 and 620,000 years ago would have enabled the hominid toolmakers to live so far north.

    in Science News on 2022-06-27 11:00:00 UTC.

  •

    Wiley: Committed to integrity? Get out!

    "We have initiated post-acceptance peer review with independent reviewers... " - Wiley.

    in For Better Science on 2022-06-27 09:06:42 UTC.

  •

    National narcissists are more willing to conspire against their fellow citizens

    By Emma Young

    Narcissists feel that they are exceptional, and don’t get the recognition they deserve. But narcissistic beliefs can apply to a group, too. Feeling that your nation, religion, organisation, or political party is superior but under-appreciated is known as "collective narcissism". And now a team led by Mikey Biddlestone at the University of Cambridge reports that collective narcissists are more willing to conspire against other members of their own group.

    The team’s paper, in the British Journal of Psychology, reports studies on people from Poland, the UK, and the US. In the first study, 361 Polish participants completed various questionnaires, including one that measured national narcissism using items such as “If Poles had a major say in the world the world would be a much better place”. An additional measure of “National identification” tapped into a happier, more secure sense of national feeling — “I feel strong ties to other Polish people”, for example. The participants also completed a brief scale that assessed whether, if they held a government position, they would be willing to wiretap fellow citizens, “spread false information if the situation required it”, and perform Internet surveillance without the consent of the citizens being observed.

    Once any overlap between the participants’ national narcissism and national identification was accounted for in the analysis, those who’d scored highly for national narcissism were more likely to say they’d conspire against fellow citizens. In contrast, high national identification scores were linked to a lower willingness to conspire.
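
    "Accounted for in the analysis" here means both predictors were entered into the same model, so each one's association with willingness to conspire is estimated while the other is held constant. The sketch below shows that generic pattern with randomly generated placeholder data, not the study's data; the coefficients used to simulate the outcome simply mirror the direction of the reported result.

```python
# Generic sketch of "controlling for" one predictor while testing another: regress
# the outcome on both at once, so each coefficient reflects its association with the
# outcome holding the other constant. The data are random placeholders, not the
# study's data; the simulated effect directions simply mirror the reported pattern.
import numpy as np

rng = np.random.default_rng(0)
n = 500
identification = rng.normal(size=n)                      # secure national identification
narcissism = 0.5 * identification + rng.normal(size=n)   # correlated with identification
conspire = 0.4 * narcissism - 0.3 * identification + rng.normal(size=n)

X = np.column_stack([np.ones(n), narcissism, identification])
coefs, *_ = np.linalg.lstsq(X, conspire, rcond=None)
print(f"narcissism coefficient (identification held constant):   {coefs[1]:+.2f}")
print(f"identification coefficient (narcissism held constant):   {coefs[2]:+.2f}")
```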

    A subsequent study on 471 US-based participants incorporated even more extreme potential actions — such as a willingness to aid in the concealment of efforts that could lead to the spread of viruses, or even, “if necessary”, to work with the government to carry out domestic acts of terrorism. People who scored higher in national narcissism were more willing to engage in these conspiracies — even though doing so could clearly harm fellow citizens.

    In a later study, again on Polish participants, the team found that national narcissism predicted willingness to conspire against the in-group over and above other political and personality factors, such as scores for Machiavellianism or psychopathy. There was also another interesting finding from this study: those with high national narcissism scores were only willing to conspire against those they perceived to be typical, genuine members of their nation. “Collective narcissists might… view typical, yet disloyal in-group members as the most threatening to the in-group image, making them likely targets of conspiracies,” the team writes.

    The UK study looked not at national but workplace narcissism. These participants were asked about their willingness to collude in spreading false information about other members of their team at work, or conspire against them in various other ways, in a bid to gain an advantage over them. This time, however, there was no clear link between collective narcissism scores and a willingness to conspire against the in-group.

    For the studies looking at national narcissism, though, the results were consistent. “While traditional accounts of in-group identity might suggest that people are willing to act against outgroups rather than in-groups, we show that this might not be true for certain forms of in-group identity,” the researchers observe.

    What might explain this? Collective narcissism is thought to compensate for frustrated individual needs, such as feelings of powerlessness. For these people, the group serves those needs — giving them an enhanced sense of power, for example. Acting against fellow members might further boost feelings of power.

    It’s also possible that collective narcissists are more willing to conspire against fellow group members because they project their own willingness to conspire on to them. Indeed, some of the data suggests that this is the case.

    However, it’s worth stressing that the links reported by the team are all correlational; there could be alternative or additional explanations. As the researchers point out, if collective narcissists are more likely to believe in conspiracy theories in general, this might fuel a culture of intra-group suspicion and paranoia, making in-group conspiracy narratives more believable. Also, only in one of the studies — on workplace teams — were the participants asked to think about people they actually knew personally. And of course this study did not find a clear link between collective narcissism and a willingness to conspire against the in-group. Perhaps this link manifests only when the group is so big that the individual has no personal connection with the people they would be targeting.

    Overall, though, the work suggests that we should be wary of fellow group members who show signs of collective narcissism. “Even though collective narcissists seem to always be on the lookout for others threatening their group, eventually they might end up being their own group’s worst enemies,” the team concludes.  

    – Their own worst enemy? Collective narcissists are willing to conspire against their in-group

    Emma Young (@EmmaELYoung) is a staff writer at BPS Research Digest

    in The British Psychological Society - Research Digest on 2022-06-27 07:56:58 UTC.

  •

    The Higgs boson discovery was just the beginning

    Emily Conover spent one of the most consequential moments in recent physics history in a cavern near a nuclear power plant in France.

    At the time, Conover was a Ph.D. student in particle physics (she’s now physics senior writer for Science News). She was part of a team building a detector in the cavern to observe elusive particles called neutrinos. It was the Fourth of July 2012. A few hundred kilometers away, scientists were announcing the discovery of another elusive subatomic particle, the Higgs boson, which physicists had been hunting for decades. As hundreds of researchers cheered in the main auditorium at the CERN particle physics lab near Geneva, Conover and the small group of physicists in the chilly French cavern cheered too, as did scientists worldwide. The Higgs boson filled in a missing piece in the standard model of particle physics, which explains just about everything known about the particles that make up atoms and transmit the forces of nature. No Higgs boson, no life as we know it.

    In this issue’s cover story, “The Higgs boson at 10,” Conover looks back at the excitement around the discovery of the Higgs boson and looks ahead to the many things that researchers hope to find out with its help. She also reviews a new biography of Peter Higgs, a modest man who made clear that he was just one of many scientists who contributed to the breakthrough.

    The discovery is part of Science News history too. Journalists around the world were eagerly awaiting the big announcement, which was being kept under wraps. But when Kate Travis, a Science News editor at the time, uncovered an announcement video accidentally posted early on CERN’s website, we published the big news the day before the official announcement.

    “Even though its discovery is 10 years old now, that’s still new in the grand scheme of particle physics, so we’re still learning lots about it,” Conover told me. “It’s very cool that I get the opportunity to write about this particle that is still so new to science.” And it’s very cool that we get to explore it with her.

    in Science News on 2022-06-26 11:15:00 UTC.

  •

    Readers react to a holey Triceratops skull, the W boson’s mass and more

    Meaty data

    Food production contributes substantially to global greenhouse gas emissions. Altering your diet can reduce those emissions, Betsy Ladyzhets reported in “Food choices” (SN: 5/7/22 & 5/21/22, p. 22).

    The story displayed a map showing the impact of an average person’s diet, by country, on greenhouse gas emissions (see “Putting emissions on the map,” below). Reader Steve Woodbury wondered whether the map depicts emissions attributable to countries’ consumption or exportation of food products.

    On this map, the greenhouse gas emissions are counted only in the country where the food use is happening, Ladyzhets says. The scientists who worked on this analysis chose to focus on modeling consumption habits because “the majority of food produced in most countries is consumed domestically,” she says.

    Putting emissions on the map

    At least one reader wanted to know more about a map that shows how countries’ average diets contribute to global greenhouse gas emissions. The map captures emissions from food production, but not from processing, transportation, retail or waste. Data are not available for countries in gray.

    Greenhouse gas emissions caused by the average person’s diet, by country
    B. Ladyzhets and T. Tibbitts
    SOURCE: B.F. KIM ET AL/GLOBAL ENVIRONMENTAL CHANGE 2020

    All about the fight

    A hole in the skull of a Triceratops dubbed “Big John” may have been a battle scar sustained during a fight with a peer, Anna Gibbs reported in “Triceratops hole may be a combat injury” (SN: 5/7/22 & 5/21/22, p. 20).

    Reader Dale S. Smith asked how scientists know that Big John was male, and whether the injury could have resulted from mating with a female.

    We know that Big John is most likely male because of the morphology and measurements of the skeleton, particularly the skull and pelvis, says Ruggero D’Anastasio, a paleopathologist at the “G. D’Annunzio” University of Chieti-Pescara in Italy.

    It’s unlikely that the injury happened during mating, D’Anastasio says. The location and shape of the wound indicates that it was inflicted from behind, which is probably not where a female Triceratops would be positioned during mating, he says. The shape of the hole also suggests that a large horn penetrated perpendicular to the skull’s bony frill. If the injury was sustained during mating, the lesion would likely have a different shape, caused by a big horn that pierced the skull’s fan at an acute rather than perpendicular angle, he says.

    But it’s possible that mating rivalry had a part in the brawl. “In many animal species, males struggle with each other to acquire the right to mate with females,” D’Anastasio says. “It is all about fighting.”

    Mass mismatch

    A new measurement suggests the W boson may have a higher mass than expected, revealing a potential flaw in the standard model of particle physics, Emily Conover reported in “Subatomic particle may be extra hefty” (SN: 5/7/22 & 5/21/22, p. 12).

    The W boson’s newly measured mass is 80,433.5 million electron volts, Conover reported. That exceeds the predicted mass of 80,357 MeV by roughly 0.1 percent. Reader Jerry Boehm asked if such a tiny discrepancy is significant enough for scientists to discuss the prospect of new particles.

    In short, yes, Conover says. “Even though that sounds like a very small mismatch, physicists have determined the measured and predicted W boson masses so precisely that the discrepancy is much bigger than expected,” she says. “The W boson mass was measured to a precision of 0.01 percent, similar to that of the predicted mass. So we’d expect those values to be much closer together, unless we’re missing something. That ‘something’ could be an exciting find, like new particles, or it could mean that there’s something else that’s not accounted for properly in either the measurement or the prediction.”
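
    To put that mismatch in the usual particle-physics terms, it can be expressed in units of the combined uncertainty. The sketch below uses only the figures quoted above (80,433.5 and 80,357 MeV, each known to roughly 0.01 percent); treating the two uncertainties as independent and exactly 0.01 percent is a simplifying assumption, so the result is approximate.

```python
# Back-of-the-envelope significance of the W boson mass discrepancy, using the figures
# quoted above. Treating the two uncertainties as independent and each roughly
# 0.01 percent is a simplifying assumption.
measured_mev = 80_433.5
predicted_mev = 80_357.0
rel_uncertainty = 1e-4          # ~0.01 percent, per the text

sigma_measured = measured_mev * rel_uncertainty
sigma_predicted = predicted_mev * rel_uncertainty
combined_sigma = (sigma_measured**2 + sigma_predicted**2) ** 0.5

discrepancy = measured_mev - predicted_mev
print(f"discrepancy: {discrepancy:.1f} MeV (~{discrepancy / predicted_mev:.2%} of the mass)")
print(f"combined uncertainty: ~{combined_sigma:.1f} MeV")
print(f"significance: ~{discrepancy / combined_sigma:.1f} standard deviations")
```

    A gap of several standard deviations, rather than the fraction of a percent it represents in absolute terms, is why such a small-looking mismatch gets physicists' attention.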

    in Science News on 2022-06-26 11:00:00 UTC.

  •

    Monkeypox is not a global health emergency for now, WHO says

    Monkeypox is not yet a global public health emergency, the World Health Organization said June 25.

    The decision comes as the outbreak of the disease related to smallpox continues to spread, affecting at least 4,100 people in 46 countries as of June 24. That includes at least 201 cases in the United States. Those cases have been found in 25 states and the District of Columbia, according to the U.S. Centers for Disease Control and Prevention.

    “Controlling the further spread of the outbreak requires intense response efforts,” and the situation should be reevaluated in a few weeks, the WHO committee evaluating the outbreak said in an announcement.

    The declaration of a public health emergency would have potentially made it easier to get treatments and vaccines to people infected with or exposed to the virus. Some medications and vaccines that could help fend off monkeypox are approved for use against smallpox, and can be used against monkeypox only with special authorization.

    The virus that causes monkeypox, named for its discovery in monkeys in 1958 though it is probably a virus that mainly infects rodents, is not a new threat. Countries in central Africa, where monkeypox is endemic, have had sporadic outbreaks since researchers found the first human case in 1970. Places in western Africa had few cases until 2017. But most cases outside the continent were travel-related, with limited spread to others (SN: 5/26/22). 

    “Monkeypox has been circulating in a number of African countries for decades and has been neglected in terms of research, attention and funding,” WHO director-general Tedros Adhanom Ghebreyesus said in a statement announcing the decision. “This must change not just for monkeypox but for other neglected diseases in low-income countries as the world is reminded yet again that health is an interconnected proposition.”

    Monkeypox typically kills fewer than 10 percent of people who contract it. At least one person has died in the global outbreak.  

    As case numbers climb, researchers are working to decipher the genetic blueprint of the virus, in hopes of uncovering whether some viral mutations might explain why the virus has quickly gained a foothold in new places. 

    Tracing the mutations

    The closest known relative of the versions of the virus behind the global outbreak comes from Nigeria, hinting that the outbreak may have gotten its start there.

    In the newest surge in cases, scientists have uncovered more viral changes than anticipated — a sign that the virus may have been circulating undetected among people for a while, perhaps since Nigeria’s 2017–2018 monkeypox outbreak, new research suggests. What’s more, a group of enzymes known for their virus-fighting abilities in the body may be to blame for many of those mutations. 

    A genetic analysis of monkeypox viruses involved in the global outbreak from 15 people across seven countries shows that these viruses have an average of 50 more genetic tweaks than versions circulating in 2018 and 2019, researchers report June 24 in Nature Medicine. That’s roughly six to 12 times as many mutations as scientists would have expected the virus to develop over that time. Unlike some other types of viruses, poxviruses, which include smallpox and monkeypox viruses, typically mutate fairly slowly.  

    The changes have a pattern that is a hallmark of an enzyme family called APOBEC3, the researchers say. These enzymes edit DNA’s building blocks — represented by the letters G, C, A and T — in a specific way: Gs change to As and Cs to Ts. The analysis found that particular pattern in the viral sequences, suggesting that APOBEC3s are responsible for the mutations. 
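
    To make that signature concrete, here is a minimal sketch in Python. The two short sequences are invented for the example and have nothing to do with the study’s data or analysis pipeline; the point is only to show what “Gs change to As and Cs to Ts” looks like when you tally substitutions between two aligned sequences.

        # Toy example: count substitutions between two aligned sequences and see
        # how many match the APOBEC3-like pattern (G->A or C->T).
        ANCESTRAL = "ATGGCGTACCGTAGCAGGT"   # invented reference sequence
        OUTBREAK  = "ATGACGTATCGTAACAGAT"   # invented outbreak sequence

        APOBEC3_LIKE = {("G", "A"), ("C", "T")}

        substitutions = [(a, b) for a, b in zip(ANCESTRAL, OUTBREAK) if a != b]
        hits = sum(1 for sub in substitutions if sub in APOBEC3_LIKE)

        print(f"total substitutions: {len(substitutions)}")
        print(f"G->A or C->T: {hits} ({hits / len(substitutions):.0%} of them)")

    A real analysis works with whole genomes and far more sites, but an excess of exactly this kind of change is the pattern the researchers describe.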

    Ideally, so many DNA building blocks get swapped that a virus is effectively destroyed and can’t infect more cells. But, sometimes, APOBEC3 enzymes don’t make enough changes to knock out the virus. Such mutated, though still functional, viruses can go on to infect additional cells, and possibly another person.

    A big question, though, is whether the genetic tweaks seen in the monkeypox virus are helpful, harmful or have no effect at all on the virus. 

    While it’s still unknown whether the enzymes are directly responsible for the changes in the monkeypox virus, similar mutations are still popping up, the team found. So, APOBEC3s may still be helping the virus change as it continues to spread. One member of the enzyme family is found in skin cells, where people with monkeypox can develop infectious pox lesions.         

    During the current outbreak, lesions on the skin that can spread monkeypox have been smaller than those seen in earlier outbreaks. Some examples are shown in images from the UK Health Security Agency.

    Different symptoms

    Symptoms reported in the global outbreak have been generally milder than those reported in previous outbreaks, perhaps allowing the disease to spread before a person knows they’re infected. 

    It is not clear whether those differences in symptoms are related to changes in the virus, Inger Damon, director of the CDC’s Division of High-Consequence Pathogens and Pathology, said June 21 in a news briefing hosted by SciLine, a service for journalists and scientists sponsored by the American Association for the Advancement of Science. 

    Typically, in previous outbreaks, people would develop flu-like symptoms, including fever, headaches, muscle aches and exhaustion, about a week or two after exposure to the virus. Then, one to three days after those symptoms start, a rash with large pus-filled lesions pops up, generally starting on the face and limbs, particularly the hands, and spreads over the body. Those symptoms are similar to those of smallpox, though generally milder, but people with monkeypox also tend to develop swollen lymph nodes.

    All patients in the U.S. outbreak have gotten rashes, Damon said, “but the lesions have been scattered or localized to a specific body site, rather than diffuse, and have not generally involved the face or the … palms of the hand or the soles of the feet.” Instead, rashes may start in the genital or anal area where they can be mistaken for sexually transmitted diseases, such as syphilis or herpes, she said. 

    In many cases, the rashes have not spread to other parts of the body. And the classical early symptoms such as fever have been “mild and sometimes nonexistent before a rash appears,” Damon said. 

    Monkeypox is transmitted from person to person through close skin-to-skin contact or by contact with contaminated towels, clothes or bedding. It may also be spread by droplets of saliva exchanged during kissing or other intimate contact. The CDC is investigating whether the virus might be spread by semen as well as skin-to-skin contact during sex, Agam Rao, a captain in the U.S. Public Health Service, said June 23 at a meeting of the CDC’s Advisory Committee on Immunization Practices.

    “We don’t have any reason to suspect it is spread any other way,” such as through the air, Rao said.

    In Nigeria, more monkeypox cases have been recorded among women, while the global outbreak has affected mainly men, particularly men who have sex with men. Experts warn that anyone can be infected with monkeypox, and some people face an increased risk of severe disease. Those at increased risk include children, people who are immunocompromised, pregnant people and people with eczema. 

    The risk of catching monkeypox through casual contact is still low in the United States, Rao said. But data she presented show that while people in the country have contracted monkeypox while traveling abroad, cases have also spread locally.

    in Science News on 2022-06-26 01:15:35 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    Weekend reads: Publication hijacking; questions about Sputnik vaccine; no more second round of review?

    Would you consider a donation to support Weekend Reads, and our daily work? Thanks in advance. The week at Retraction Watch featured: NYU postdoc with federal research misconduct settlement awarded NIH grant An Elsevier journal said it would retract 10 papers two years ago. It still hasn’t. UPenn prof with four retractions “may no longer be affiliated” … Continue reading Weekend reads: Publication hijacking; questions about Sputnik vaccine; no more second round of review?

    in Retraction watch on 2022-06-25 12:33:54 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    5 misunderstandings of pregnancy biology that cloud the abortion debate

    On June 24, the U.S. Supreme Court overturned Roe v. Wade. By undoing the landmark 1973 decision that protected a person’s right to an abortion, the highest court in the country has shifted decisions about this medical care to individual state and local governments.

    Some states have already passed laws that curtail abortion access. Now, without the federal protections Roe v. Wade provided, other states will likely follow suit.

    Many of those legislative efforts invoke medical and scientific language in an effort to define when life begins. Heart development, fetal pain and viability have all been invoked to justify abortion restrictions. But many of these rationales don’t line up with the biology of early development. Texas’ 2021 “heartbeat law,” for instance, bans abortion after about six weeks, when heart cells purportedly begin thumping. At that early stage of pregnancy, there isn’t yet a fully formed heart to beat.

    Like most aspects of biology, early human development involves many complex processes. Despite the rhetoric around these issues, clear lines — between having a heart and not having one, or between being able to survive outside the uterus and not — are scarce or nonexistent.

    “There aren’t these set black-and-white points for much of this,” says obstetrician-gynecologist Nisha Verma, a fellow with the American College of Obstetricians and Gynecologists in Washington, D.C.

    Here’s what’s known about five key aspects of pregnancy biology that often come up in abortion debates.

    1. The early timeline of a pregnancy is easy to misunderstand.

    That’s because how dates are determined is supremely confusing. The standard pregnancy clock actually starts ticking before a sperm cell encounters an egg, two weeks before, on average. An ovary releases an egg around day 14 of an average 28-day menstrual cycle (SN: 6/19/21). (Day 1 is the first day of menstruation; day 1 is also when a pregnancy officially begins in the month an egg is fertilized.) That means that when a sperm fertilizes an egg, a person is already officially two weeks pregnant. As nonsensical as that sounds, it’s the simplest way medical professionals can date a pregnancy.

    That timeline means that abortion bans at six weeks, enacted in Texas, Oklahoma and Idaho, take effect earlier in pregnancy than many people think, Verma says. In 2020, she surveyed people in Georgia, where she was practicing medicine at the time, about their understanding of the timing. “Some people will say the six weeks is after your first missed period,” she says. “Some people think it’s from the date of conception.” Neither is correct.

    The ban would start four weeks after fertilization. Counting back, that’s two weeks after a missed period, which is often a person’s first indication that they might be pregnant. Such bans leave a person very little time — two weeks after a missed period — to access an abortion.
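
    Because that counting trips so many people up, here is a small worked example in Python. The calendar dates are hypothetical, and the day-14 ovulation and 28-day cycle figures are the averages described above.

        from datetime import date, timedelta

        # Hypothetical example using the averages described above: day 1 of the last
        # menstrual period starts the pregnancy clock, fertilization falls around
        # day 14, and the next period would be expected around day 28.
        last_period = date(2022, 5, 1)
        fertilization = last_period + timedelta(days=14)
        missed_period = last_period + timedelta(days=28)
        ban_cutoff = last_period + timedelta(weeks=6)   # a "six-week" ban

        def weeks_pregnant(day: date) -> float:
            """Gestational age in weeks, counted from the last menstrual period."""
            return (day - last_period).days / 7

        print(f"at fertilization: {weeks_pregnant(fertilization):.0f} weeks pregnant")
        print(f"at the missed period: {weeks_pregnant(missed_period):.0f} weeks pregnant")
        print(f"six-week cutoff: {(ban_cutoff - fertilization).days // 7} weeks after "
              f"fertilization, {(ban_cutoff - missed_period).days // 7} weeks after "
              f"the missed period")

    Running it reproduces the counting above: fertilization already corresponds to two weeks pregnant, and a six-week cutoff falls four weeks after fertilization and just two weeks after the missed period.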

    What’s more, these dates are based on averages. Many women have irregular menstrual cycles. Birth control isn’t 100 percent effective, and certain types can eliminate menstruation altogether, throwing even more uncertainty into the early timeline of pregnancy.

    By the numbers

    Most abortions in the United States happen very early in pregnancy, data collected in 2019 by the U.S. Centers for Disease Control and Prevention show. Fewer than 5 percent of abortions are done at or after week 16 of pregnancy.  

    U.S. abortions in 2019 by week of pregnancy (bar chart; most occur before 10 weeks). E. Otwell; Source: CDC

    2. Pregnancy takes more than sperm meeting egg.

    That meeting, which usually takes place in one of the two fallopian tubes near the ovaries, is fertilization, a process in which two cells fuse and mingle their genetic contents, creating what’s known as a zygote (SN: 1/10/15). But a fertilized egg does not automatically lead to a pregnancy, says obstetrician and gynecologist Jonas Swartz of Duke University School of Medicine. “Equating them doesn’t make sense from a medical standpoint.” Up to 50 percent of fertilized eggs do not implant in the uterus, researchers have estimated.

    The genetic material needs to combine in the right way. The growing ball of cells needs to travel to the uterus and implant itself in the right spot. And the right balance of hormones needs to be churned out to support the pregnancy. “There are so many things other than the sperm meeting the egg that actually matter for this to become a pregnancy that has a chance to develop further,” says Selina Sandoval, an obstetrician and gynecologist who specializes in complex family planning at the University of California, San Diego.

    Lawmakers in some states are considering abortion rules that apply to a fertilized egg; Oklahoma has already passed such a law. Those rules would cover fertilized eggs that lodge in the wrong spot, such as the fallopian tube. Called an ectopic pregnancy, this can lead to life-threatening medical emergencies when the growing tissue ruptures the tube and internal bleeding ensues. “These are pregnancies that under no circumstance can become a healthy pregnancy,” Sandoval says. “In fact, if they aren’t treated and continue to grow, they will kill the patient.” Laws that apply to a fertilized egg could “limit our ability to treat patients for ectopic pregnancies,” she says.

    3. “Heartbeat laws” are not what they seem.

    A Texas law bans abortions “after detection of an unborn child’s heartbeat.” But the rhythmic sounds heard on an ultrasound early in pregnancy aren’t caused by the opening and closing of heart valves as they move blood through the heart’s chambers, the motion that produces a typical lub dub sound. That’s because those chambers haven’t yet developed. On early ultrasounds, the heartbeat-like sounds are created by the ultrasound machine itself.

    “What we’re seeing is actually the primitive heart tube and the cells in that heart tube having electrical activity that causes fluttering,” Verma says. “The ultrasound is actually manufacturing that sound based on the electrical activity and fluttering motion.”

    Using the term “heartbeat” to describe the fluttering makes sense in some situations, like in conversations with excited parents-to-be, Verma says. “I’ve taken care of countless people who have seen that first ‘heartbeat’ on ultrasound for a desired pregnancy, and it’s this huge, exciting moment,” she says. “I don’t want to be dismissive of that.” She says two things can be true at the same time: “It can be exciting for a patient. It also isn’t a scientific thing.”

    4. Fetal pain is difficult to define.

    A bit of biology that’s often used to restrict abortions is the claim that fetuses (which form at week 11 of pregnancy) feel pain.

    “Pain is very complex,” Swartz says. “It requires not just a physical response, but the ability to suffer as a result.”

    Knowing what a fetus experiences is impossible, but brain development studies provide some clues. The experience of pain starts with the senses detecting something noxious. Those signals then have to travel to the cortex, the outer layer of the brain that helps interpret that sensation. In human fetuses, those brain connections don’t exist until about week 24 or 25 of pregnancy. In guidelines written by members of the Society for Maternal-Fetal Medicine, researchers write that these connections are necessary for the experience of pain, but are not sufficient on their own to conclude that pain is possible.

    In human fetuses, these connections aren’t actually operational until about week 28 or 29 of pregnancy, other studies suggest. “We can say with really, really good confidence that no sooner than 28 weeks is [pain] even possible,” Sandoval says. 

    The vast majority of abortions — over 90 percent — happen in the first trimester, before week 13 of pregnancy. The number of abortions after 24 or 25 weeks is “vanishingly small,” Swartz says.

    5. When a fetus could survive on its own is a complex medical calculation.

    The word “viability” is often used as a sharp cut-off point to mark the age at which a fetus could survive outside of the uterus. The problem is that one clear cut-off does not exist.

    “That has been a moving line as science has advanced and our ability to support very small babies has advanced,” Swartz says. “But it’s also not a fixed line for babies born now.”  

    On average, babies born around 22 to 24 weeks’ gestation either don’t survive or survive with major health problems. Whether a fetus will survive if delivered depends on a whole suite of other factors, Swartz says. They include fetal sex, weight, developmental issues and the mother’s health, not to mention individual health care facilities’ capabilities and training.

    The American College of Obstetricians and Gynecologists recently removed mentions of “viability” from its guidance on abortion care. “It’s such a complicated concept that we can’t make blanket statements about it,” Verma says. “It’s something that needs to be left to the clinician looking at the patient.”

    Inaccurate descriptions of biology can influence restrictions around reproductive health, and as a result, the health care people are able to receive, Swartz says. A colleague of his, for instance, wasn’t able to get appropriate medical care when she experienced signs of a pregnancy loss. Because of state abortion restrictions, her physician decided to delay treatment, an emotionally distressing experience she wrote about last year in Obstetrics and Gynecology. Abortion regulation based on flawed medical and scientific premises, Swartz says, “places priority on a potential life over the actual life of the person sitting in front of me.”

    in Science News on 2022-06-24 17:58:48 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    50 years ago, eels’ navigation skills electrified scientists

    June 24, 1972 cover of Science News

    Does the eel use electric fields to navigate? — Science News, June 24, 1972

    Many species of ocean fish [such as American eels] migrate over large distances. Some of them do so with such extreme accuracy that they can come thousands of miles to return to the stream or area where they were born. Naturalists naturally wonder how they do it. One of the suggestions is that they use electricity.

    Update

    It’s still a mystery how the American eel (Anguilla rostrata) navigates to its breeding grounds. But a growing body of evidence has shifted focus from electricity to magnetic fields. Experiments suggest that the American eel’s European cousin, A. anguilla, seems to follow a magnetic map to the North Atlantic’s Sargasso Sea, guided by an internal compass (SN Online: 4/13/17). In March, scientists proposed that freshly spawned American and European eels follow paths of increasing magnetic intensity from the Sargasso Sea to their freshwater homes. As adults, the eels may sense decreasing intensity to retrace the path to their birthplace.

    in Science News on 2022-06-24 13:00:00 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    Earth’s oldest known wildfires raged 430 million years ago

    Bits of charcoal entombed in ancient rocks unearthed in Wales and Poland push back the earliest evidence for wildfires to around 430 million years ago. Besides breaking the previous record by about 10 million years, the finds help pin down how much oxygen was in Earth’s atmosphere at the time.

    The ancient atmosphere must have contained at least 16 percent oxygen, researchers report June 13 in Geology. That conclusion is based on modern-day lab tests that show how much oxygen it takes for a wildfire to take hold and spread.

    While oxygen makes up 21 percent of our air today, over the last 600 million years or so, oxygen levels in Earth’s atmosphere have fluctuated between 13 percent and 30 percent (SN: 12/13/05). Long-term models simulating past oxygen concentrations are based on processes such as the burial of coal swamps, mountain building, erosion and the chemical changes associated with them. But those models, some of which predict oxygen levels as low as 10 percent for this time period, provide broad-brush strokes of trends and may not capture brief spikes and dips, say Ian Glasspool and Robert Gastaldo, both paleobotanists at Colby College in Waterville, Maine.

    Charcoal, a remnant of wildfire, is physical evidence that provides, at the least, a minimum threshold for oxygen concentrations. That’s because oxygen is one of three ingredients needed to create a wildfire. The second, ignition, came from lightning in the ancient world, says Glasspool. The third, fuel, came from burgeoning plants and fungus 430 million years ago, during the Silurian Period. The predominant greenery was low-growing plants just a couple of centimeters tall. Scattered among this diminutive ground cover were occasional knee-high to waist-high plants and Prototaxites fungi that towered up to nine meters tall. Before this time, most plants were single-celled and lived in the seas.

    Once plants left the ocean and began to thrive, wildfire followed. “Almost as soon as we have evidence of plants on land, we have evidence of wildfire,” says Glasspool.

    That evidence includes tiny chunks of partially charred plants — including charcoal as identified by its microstructure — as well as conglomerations of charcoal and associated minerals embedded within fossilized hunks of Prototaxites fungi. Those samples came from rocks of known ages that formed from sediments dumped just offshore of ancient landmasses. This wildfire debris was carried offshore in streams or rivers before it settled, accumulated and was preserved, the researchers suggest.

    The microstructure of this fossilized and partially charred bit of plant, unearthed in Poland from sediments that are almost 425 million years old, reveals that it was burnt by some of Earth’s earliest known wildfires. Ian Glasspool/Colby College

    The discovery adds to previous evidence, including analyses of pockets of fluid trapped in halite minerals formed during the Silurian, that suggests that atmospheric oxygen during that time approached or even exceeded the 21 percent concentration seen today, the pair note.

    “The team has good evidence for charring,” says Lee Kump, a biogeochemist at Penn State who wasn’t involved in the new study. Although its evidence points to higher oxygen levels than some models suggest for that time, it’s possible that oxygen was a substantial component of the atmosphere even earlier than the Silurian, he says.

    “We can’t rule out that oxygen levels weren’t higher even further back,” says Kump. “It could be that plants from that era weren’t amenable to leaving a charcoal record.”

    in Science News on 2022-06-24 11:00:00 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    Fans of horror movies are just as kind and compassionate as everyone else

    By Matthew Warren

    What kind of person wants to watch a movie where a boatload of people gets gruesomely cut in half by a wire, or where a man saws off his own foot to escape the sadistic games of a serial killer? You’d have to be pretty coldhearted and cruel to enjoy that kind of thing, right?

    That’s certainly how horror fans have historically been portrayed, at least by some commentators. But a new study finds no evidence for this stereotype. Fans of horror films are just as kind and compassionate as everyone else, according to the preprint published on PsyArXiv — and in some respects may be more so.

    First, Coltan Scrivner from Aarhus University examined whether people really do believe that horror fans lack empathy or compassion. He asked 201 participants to view a series of profiles which each presented information about a person, including their age, name, and favourite genre of movie: action, comedy, drama, or horror. Participants had to judge how kind, empathetic, and compassionate each person was.

    Participants did indeed see horror fans in a more negative light: they rated these people as significantly less kind than comedy, drama or action fans, and less empathetic and compassionate than comedy or drama fans.

    Scrivner then set out to see whether there was any truth to this stereotype. A new group of 244 participants rated the extent to which they enjoyed five sub-genres of horror film: gore/splatter, monster, paranormal, psychological, and slasher. They also completed measures of cognitive empathy (which is about understanding what another person is feeling) and affective empathy (which concerns the ability to share and experience their emotions), as well as a measure of “coldheartedness”, or disregard for others’ wellbeing.

    Participants who reported greater enjoyment of the various kinds of horror didn’t score any lower on empathy or higher on coldheartedness. In most cases, enjoyment of horror films wasn’t significantly related to scores on these measures at all, but there were a few instances where horror fans actually seemed more pro-social. For instance, people who enjoyed gore/splatter films had significantly greater cognitive empathy, while those who liked paranormal films scored higher on both kinds of empathy, and lower on coldheartedness. And overall enjoyment of horror across genres was related to lower coldheartedness and higher cognitive empathy.

    These results suggest that the caricature of the anti-social, depraved horror fan is false. But, Scrivner reasoned, perhaps horror fans act less compassionately or empathetically, even if they don’t score any differently on scales measuring these traits. So, one to two weeks later, the same participants were each told that there were leftover funds from the previous study, and that they had been selected to receive an extra $0.50, alongside half of the other participants. They could choose to donate any amount of this money to another participant who had not received the bonus.

    Just over half of participants opted to donate some of the money — but the amount donated was not related to how much they enjoyed horror, or any of the sub-genres of horror. This suggests that people who like horror films act just as kindly and compassionately as others, Scrivner concludes.

    The results are hopefully not that surprising — most of us have moved on from the moral panic over “video nasties”, and recognise that we’re unlikely to become corrupted by the media we consume. But it’s nice to see that demonstrated, empirically. 

    And this isn’t the only study to do so. Just as horror enthusiasts remain as kind and compassionate as everyone else, players of violent video games don’t become more aggressive, and fans of heavy metal are just as well-adjusted as pop and rock aficionados. If there’s a broader message to all this work, it’s that we should let people enjoy the movies, games, and music that they like, without judging them or blaming them for society’s problems.

    – Bleeding-heart horror fans: Enjoyment of horror media is not related to reduced empathy or compassion [this paper is a preprint meaning that it has not yet been subjected to peer review and the final published version may differ from the version this report was based on]

    Matthew Warren (@MattBWarren) is Editor of BPS Research Digest

    in The British Psychological Society - Research Digest on 2022-06-24 10:23:34 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    NYU postdoc with federal research misconduct settlement awarded NIH grant

    A postdoc at New York University’s Grossman School of Medicine who the U.S. Office of Research Integrity found engaged in research misconduct while a postdoc at another institution has been awarded an NIH grant just months after being sanctioned.  The postdoc, Shuo Chen, didn’t admit or deny the ORI’s findings, but agreed to one year … Continue reading NYU postdoc with federal research misconduct settlement awarded NIH grant

    in Retraction watch on 2022-06-24 10:00:00 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    Schneider Shorts 24.06.2022 – Professor XYZ

    Schneider Shorts 24.06.2022 - bad choices in Dresden end with research misconduct findings, where the money for heart stem cell research went, nicotine and Photoshop fraud fail in clinical trials, with the most authoritative papermill guidelines, a coronavirus zapper from Italy, and a plant science professor who wasn't so great after all.

    in For Better Science on 2022-06-24 05:20:47 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    This giant bacterium is the largest one found yet

    There’s a new record holder for biggest bacterium — and you don’t need a microscope to see it.

    The newfound species, Thiomargarita magnifica, is roughly a centimeter long, and its cells are surprisingly complex, researchers report in the June 24 Science.

    The bacterial behemoth is roughly the size and shape of a human eyelash, marine biologist Jean-Marie Volland of the Laboratory for Research in Complex Systems in Menlo Park, Calif., said June 21 at a news conference. Maxing out at approximately 2 centimeters, T. magnifica is about 50 times the size of other giant bacteria and about 5,000 times the size of most other average-size bacterial species.

    What’s more, while the genetic material of most bacteria floats freely inside the cell, T. magnifica packs its DNA inside a sac surrounded by a membrane (SN: 6/22/17). Such a compartment is a hallmark of the larger, more complex cells of eukaryotes, a group of organisms that includes plants and animals.

    Study coauthor Olivier Gros, a marine biologist at the Université des Antilles Pointe-à-Pitre in Guadeloupe, France, first discovered T. magnifica while collecting water samples in tropical marine mangrove forests in the Caribbean’s Lesser Antilles. At first, he mistook the long, white filaments for some sort of eukaryote, Gros said at the news conference. But a few years later, genetic analyses showed that the organisms were actually bacteria. A closer look under the microscope revealed the cells’ DNA-containing sacs.

    Previous studies had predicted that bacterial cells’ overall lack of complexity meant there was a limit to how large bacteria could grow. But the new discovery is “breaking our way of thinking about bacteria,” says Ferran Garcia-Pichel, a microbiologist at Arizona State University in Tempe who was not involved with the study. When it comes to bacteria, people typically think small and simple. But that mindset may make researchers miss lots of other bacterial species, Garcia-Pichel says. It’s a bit like thinking the largest animal that exists is a small frog, and then discovering elephants.

    It’s still unclear what role T. magnifica plays among the mangroves. Also unknown is why it evolved to be so large. One possibility, Volland said, is that being centimeters long helps cells access both oxygen and sulfide, which the bacteria need to survive.

    in Science News on 2022-06-23 18:00:00 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    Vampire squid are gentle blobs. But this ancestor was a fierce hunter

    Despite the scary name, modern vampire squid are docile denizens of the deep sea — but their Jurassic ancestors may have been a lot fiercer.

    Analyses of fossilized soft tissue from three 164-million-year-old specimens of Vampyronassa rhodanica suggest that the ancient cephalopods had some powerful weapons, researchers say June 23 in Scientific Reports. Unlike its blobby modern relative, V. rhodanica had a more streamlined muscular body, with two of its eight arms twice as long as the other six. Strong suckers on all eight arms could have helped it snatch and hold onto prey.

    Modern vampire squid (Vampyroteuthis infernalis) are not actually squid; they’re the only surviving members of an ancient, diverse order of cephalopods, the Vampyromorpha. And V. infernalis are pretty passive about finding food (SN: 6/25/12). Alongside their arms, they have two long, retractable, sticky filaments that they use like flypaper to collect “marine snow,” tiny bits of dead plankton or sinking fecal pellets that happen to drift past (SN: 5/19/15).

    Fossilized tissue suggests V. rhodanica had a very different lifestyle, report paleontologist Alison Rowe of Sorbonne University in Paris and her colleagues.

    The soft tissues of this 164-million-year-old fossil of Vampyronassa rhodanica were remarkably well preserved in 3-D, allowing researchers to use high-resolution X-ray micro-computed tomography to reexamine and reconstruct its anatomy. It’s one of three specimens originally collected from a fossil-rich site in Ardèche, France. P. Loubry/CR2P

    Both the ancient and the modern creatures have eight arms bearing suckers flanked by hairlike cirri. But the ancient cephalopod’s suckers are attached to the arms by stalks embedded in round layers of muscle. That muscular arrangement, the team says, would have greatly increased the pressure differential inside the sucker — making its suction more powerful. The creature’s numerous, closely packed cirri may have helped it sense prey, similar to a strategy used by some modern octopuses.

    V. infernalis’ suckers, lacking that muscle arrangement, don’t have such a strong grip. Instead, its suckers secrete mucus that coats whatever the creatures have captured with their filaments. The cirri then slide that slippery food along the arms and into its mouth.

    in Science News on 2022-06-23 15:00:00 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    Guns and the precarity of manhood

    This is an excerpt from chapter 6, “Enraged, Rattled, and Wronged” of Enraged, Rattled, and Wronged: Entitlement’s Response to Social Progress by Kristin J. Anderson.

    Manhood is precarious. Unlike womanhood, manhood is hard won and easily lost, and therefore men go to great effort to perform it—for the most part for other boys and men—sometimes to their own and others’ detriment.[63] Men will go out of their way to not appear unmanly or feminine—they adhere to an anti-femininity mandate. For example, men are reluctant to take jobs that women do. Men hold out for diminishing coal-mining jobs when they should be applying for home health aide jobs. Women have been flexible and have pushed themselves into men’s jobs; men have not pushed themselves into women’s jobs.[64]

    There are consequences of the investment in maintaining manhood for individual men, as well as the rest of us. Men learn to be fixated on performing masculinity, which often entails aggression. Men tend to believe that aggression is more typical than it actually is—as we learned earlier. They falsely believe that women are attracted to aggressive men, when, in fact, women tend to view aggression as weak and impulsive, a loss of self-control, and not sexy or charming.[65]

    Entitlement tells White men that they shouldn’t have to bow down to those they perceive to be below them. In 2018, Markeis McGlockton, an African American man, pushed a White man to the ground because the man was yelling at his partner outside a convenience store. The man pulled out a gun and shot and killed McGlockton and was not prosecuted because of Florida’s stand-your-ground law.[66] Stand-your-ground laws and gun ownership are manifestations of feeling entitled to never back down. Gun owners often justify owning or carrying a gun with fears of violent crime; however, over the same period in which gun purchases have risen, violent crime has dropped. In truth, support for gun rights for White men is linked to perceived threats to their racial privilege.[67] Gun popularity among men is linked to threats to their gender status as well. How do we know? First, men with higher sexism scores believe it should be easier to buy guns; men with lower sexism scores say it should be more difficult to buy guns.[68] Second, firearm background checks increase in communities where married men, but not married women, have lost their jobs.[69] Presumably, recently unemployed men become interested in guns at a time they feel vulnerable. In addition, when wives outearn their husbands, gun sales increase.[70] These men seem to see guns as one way to shore up masculinity. As we saw above, Jonathan Metzl’s work finds that guns are used by White men as a means of preserving racial privilege, even as Whites wind up being disproportionately harmed by the presence of guns.[71]

    It turns out in laboratory studies it’s pretty easy to threaten men’s masculinity into panic that then motivates them toward compensation. It’s worth taking a second look at a study described in Chapter 4, where Julia Dahl[72] and her colleagues asked young and mostly White men to complete a “gender knowledge test.” During this test participants were asked questions such as, “What is a dime in football?” and “Do you wear Manolo Blahniks on your head or feet?” The respondents were then randomly put into either a threat condition—being told they scored similar to the average woman—or a no-threat condition—being told they scored like the typical man. Their grouping had nothing to do with their actual answers on the test. Men threatened by being associated with women were more likely to feel embarrassed by their responses, to feel angry, and to want to display dominant behaviors. Anger predicted greater endorsement of ideologies that implicitly promote men’s power over women. Specifically, men’s social dominance orientations and benevolent sexism (and both are correlated with entitlement) increased if they were exposed to the masculine threat. How did the threatened men in this study appease that threat to their masculinity? By endorsing the legitimacy of men’s societal power over other groups, particularly women.


    in OUPblog - Psychology and Neuroscience on 2022-06-23 12:30:00 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    We don’t trust extraverts more than introverts

    By Emily Reynolds

    When you think of an extravert, what personality traits come to mind? Sociability? Fun? While we often make positive judgments about extraversion, the picture is more complex, with negative traits also projected onto extraverts. Some research suggests that extraverts are seen as poorer listeners, for example.

    A new study, published in Personality and Social Psychology Bulletin, looks specifically at how much people trust those who are extraverted. The team finds that agreeableness, not extraversion, is the key to gaining trust in social situations.

    In the first study, participants were told they would be matched with another participant (who in reality did not exist) to play a trust game, in which decisions they made would affect both their payoffs and the payoffs of the other participant. All participants were given the role of the “trustor”, while their apparent partner played the “trustee”. They started with £1. If they chose to keep the money, this did not increase, but if they transferred it to their partner it was tripled. The partner could then choose how much of the £3 to transfer back to the participant.
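
    For readers unfamiliar with the setup, this is a minimal sketch of the trust game’s payoff structure as described above, written in Python. The function name and the example back-transfer are illustrative; they are not taken from the study materials.

        def trust_game(transfer: bool, amount_returned: float = 0.0) -> tuple[float, float]:
            """Return (trustor_payoff, trustee_payoff) in pounds for one round.

            The trustor starts with 1 pound. Keeping it ends the round; transferring
            it triples it to 3 pounds in the trustee's hands, and the trustee then
            chooses how much of that to send back.
            """
            if not transfer:
                return 1.0, 0.0
            pot = 1.0 * 3  # the transferred pound is tripled
            assert 0.0 <= amount_returned <= pot
            return amount_returned, pot - amount_returned

        print(trust_game(transfer=False))                      # (1.0, 0.0)
        print(trust_game(transfer=True, amount_returned=1.5))  # (1.5, 1.5), a hypothetical even split

    Transferring only pays off for the trustor if the trustee sends back more than the original pound, which is why the decision to transfer is read as a measure of trust.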

    After being given the £1, participants were told that their partner was either agreeable or disagreeable, and extraverted or introverted. They were then asked to indicate whether they would transfer the money or not.

    The results showed that participants were more likely to trust agreeable than disagreeable partners: participants were four times more likely to transfer the £1 to agreeable trustees than disagreeable ones. Extraversion, however, was not significantly related to trust. This finding was replicated in a second study.

    A third study found that even when we don’t explicitly know a person’s personality, it still seems to affect our judgement of their trustworthiness. Participants first rated their own personalities. Then, 1 to 2 weeks later, they completed an in-person lab session in groups of three to six people, most of whom did not know each other. Participants were told they would solve tasks as a group: three riddles, and two “unusual uses” tasks, which involve thinking of as many uses as possible for an object. They then took part in a trust game similar to that in the first studies.

    Results again showed that people who had rated themselves as higher in agreeableness were more likely to be trusted by the other members of their group. No other Big Five trait, including extraversion, was related to trust. So it seemed that participants made judgements of others’ personalities based on their behaviour, and trusted those who they judged as agreeable.         

    In the final study, the team found that we seem to instinctively know that agreeable people are more likely to be trusted. Participants were told they would interact with another participant and were given details of the trust game. However, this time, they were all assigned as the trustee, not the trustor. They then rated themselves on extraversion and agreeableness. In one condition, participants were told their answers would be shown to their partner; in another, they were told the answers would remain private and they should answer as accurately as possible.

    Participants in the public condition reported being more agreeable than those in the private condition: in other words, they presented themselves as more agreeable when they thought someone was judging them on how trustworthy they were. This was not the case with extraversion, however: participants did not report being more extraverted in the public condition compared to the private one.

    Overall, the results from the four studies suggest that we don’t place any more trust in people with higher levels of extraversion. Agreeableness, however, is a trait that not only seems to facilitate trust but that we also want to emphasise when we ourselves want to be trusted.

    Future research could explore exactly what it is about agreeableness that makes us trust people who embody it: non-verbal behaviour such as smiles or body language, types of verbal expression, or particular traits such as honesty or altruism.

    – The Effects of Partner Extraversion and Agreeableness on Trust

    Emily Reynolds is a staff writer at BPS Research Digest

    in The British Psychological Society - Research Digest on 2022-06-23 12:03:56 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    Cats chewing on catnip boosts the plant’s insect-repelling powers

    For many cats, a mere whiff of catnip can send them into a licking, rolling, plant-shredding frenzy.

    That destruction amplifies catnip’s natural defenses against insects and its appeal to cats, a new chemical analysis finds. Compared with intact leaves, crushed-up leaves emit more volatile compounds called iridoids, which act as an insect repellant, researchers report June 14 in iScience. The higher emissions also seem to encourage cats to continue rolling around in the remains of the plant, effectively coating themselves in a natural bug spray (SN: 3/4/21).

    Masao Miyazaki, a biologist at Iwate University in Morioka, Japan, and his colleagues analyzed the chemistry of both catnip (Nepeta cataria) and silver vine (Actinidia polygama), a plant common in Asia that has a similar euphoric effect on cats. Both plants naturally produce iridoids, which discourage insects from snacking on leaves.

    As cats toyed with silver vine, the damaged leaves released about 10 times more iridoids than intact leaves did, and also changed the proportion of the chemicals released. The researchers also found that catnip, when crushed up, released over 20 times more of its insect repellant, mostly a type of iridoid called nepetalactone.

    With both plants, lab-made iridoid cocktails mimicking those of damaged catnip and silver vine chased off more mosquitoes than chemical solutions that mirrored those of intact leaves, the study found.

    The team also presented cats with two dishes — one with intact silver vine and one with damaged leaves. Without fail, the cats went to the dish with the damaged leaves, licking and playing with it and rolling against it.

    in Science News on 2022-06-23 10:00:00 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    An Elsevier journal said it would retract 10 papers two years ago. It still hasn’t.

    An Elsevier journal has sat for two years on its decision to retract 10 papers by researchers with known misconduct issues, according to emails seen by Retraction Watch.  The Journal of the Neurological Sciences had decided by June 2020 to retract the articles by Yoshihiro Sato and Jun Iwamoto, who are currently in positions four … Continue reading An Elsevier journal said it would retract 10 papers two years ago. It still hasn’t.

    in Retraction watch on 2022-06-23 10:00:00 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    DNM Young Talent Award

    I am proud and honoured to have received the Young Talent Award from the DNM Dutch Neuroscience society. This was the first time I explicitly talked about my climate activism in combination with my neuroscientific pursuits, which I hope contributes to more conversations about the climate crisis within the Dutch neuroscience community.

    in CoCoSys lab on 2022-06-23 08:46:38 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    An otherwise quiet galaxy in the early universe is spewing star stuff

    PASADENA, Calif. — A lucky celestial alignment has given astronomers a rare look at a galaxy in the early universe that is seeding its surroundings with the elements needed to forge subsequent generations of stars and galaxies.

    Seen as it was just 700 million years after the Big Bang, the distant galaxy has gas flowing over its edges. It is the earliest-known run-of-the-mill galaxy, one that could have grown into something like the Milky Way, to show such complex behavior, astronomer Hollis Akins said June 14 during a news conference at the American Astronomical Society meeting.

    “These results also tell us that this outflow activity seems to be able to shape galaxy evolution, even in this very early part of the universe,” said Akins, an incoming graduate student at the University of Texas at Austin. He and colleagues also submitted their findings June 14 to arXiv.org.

    The galaxy, called A1689-zD1, shows up in light magnified by Abell 1689, a large galaxy cluster that can bend and intensify, or gravitationally lens, light from the universe’s earliest galaxies (SN: 2/13/08; SN: 10/6/15). Compared with other observed galaxies in the early universe, A1689-zD1 doesn’t make a lot of stars — only about 30 suns each year — meaning the galaxy isn’t very bright to our telescopes. But the intervening cluster magnified A1689-zD1’s light by nearly 10 times.

    Akins and colleagues studied the lensed light with the Atacama Large Millimeter/submillimeter Array, or ALMA, a large network of radio telescopes in Chile. The team mapped the intensities of a specific spectral line of oxygen, a tracer for hot ionized gas, and a spectral line of carbon, a tracer for cold neutral gas. Hot gas shows up where the bright stars are, but the cold gas extends four times as far, which the team did not expect.

    “There has to be some mechanism [to get] carbon out into the circumgalactic medium,” the space outside of the galaxy, Akins says.

    Only a few scenarios could explain that outflowing gas. Perhaps small galaxies are merging with A1689-zD1 and flinging gas farther out where it cools, Akins said. Or maybe the heat from star formation is pushing the gas out. The latter would be a surprise considering the relatively low rate of star formation in this galaxy. While astronomers have seen outflowing gas in other early-universe galaxies, those galaxies are bustling with activity, including converting thousands of solar masses of gas into stars per year.

    Galaxy A1689-zD1 (pictured, in radio waves) exists in the universe’s first 700 million years. ALMA/ESO, NAOJ and NRAO; H. Akins/Grinnell College; B. Saxton/NRAO/AUI/NSF

    The researchers again used the ALMA data to measure the motions of both the cold neutral and hot ionized gas. The hot gas showed a larger overall movement than the cold gas, which implies it’s being pushed from A1689-zD1’s center to its outer regions, Akins said at the news conference.

    Despite the galaxy’s relatively low rate of star formation, Akins and his colleagues still think the 30 solar masses of stars formed each year heat the gas enough to push it out from the center of the galaxy. The observations suggest a more orderly bulk flow of gas, which implies outflows; however, the researchers are analyzing the movement of the gas in more detail and cannot yet rule out alternate scenarios.

    They think when the hot gas flows out, it expands and eventually cools, Akins said, which is why they see the colder gas flowing over the galaxy’s edge. That heavy-element-rich gas enriches the circumgalactic medium and will eventually be incorporated into later generations of stars (SN: 6/17/15). Due to gravity’s pull, cool gas, often with fewer heavy elements, around the galaxy also falls toward its center so A1689-zD1 can continue making stars.

    These observations of A1689-zD1 show this flow of gas happens not only in the superbright, extreme galaxies, but even in normal ones in the early universe. “Knowing how this cycle is working helps us to understand how these galaxies are forming stars, and how they grow,” says Caltech astrophysicist Andreas Faisst, who was not involved in the study.

    Astronomers aren’t done learning about A1689-zD1, either. “It’s a great target for follow-up observations,” Faisst says. Several of Akins’s colleagues plan to do just that with the James Webb Space Telescope (SN: 10/6/21).

    in Science News on 2022-06-22 16:00:00 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    Physicists may have finally spotted elusive clusters of four neutrons

    Physicists have found the strongest sign yet of a fabled four of a kind.

    For six decades, researchers have hunted for clusters of four neutrons called tetraneutrons. But evidence for their existence has been shaky. Now, scientists say they have observed neutron clusters that appear to be tetraneutrons. The result strengthens the case that the fab four is more than a figment of physicists’ imaginations. But some scientists doubt that the claimed tetraneutrons are really what they seem.

    Unlike an atomic nucleus, in which protons and neutrons are solidly bound together, the purported tetraneutrons seem to be quasi-bound, or resonant, states. That means that the clumps last only for fleeting instants — in this case, less than a billionth of a trillionth of a second, the researchers report in the June 23 Nature.

    Tetraneutrons fascinate physicists because, if confirmed, the clusters would help scientists isolate and probe mysterious neutron-neutron forces and the inner workings of atomic nuclei. All atomic nuclei contain one or more protons, so scientists don’t have a complete understanding of the forces at play within groups composed only of neutrons.

    Conclusively spotting the four-neutron assemblage would be a first. “Up to now, there was no real observation of … such a system that is composed only from neutrons,” says nuclear physicist Meytal Duer of the Technical University of Darmstadt in Germany.

    To create the neutron quartets, Duer and colleagues started with a beam of a radioactive, neutron-rich type of helium called helium-8, created at RIKEN in Wako, Japan. The team then slammed that beam into a target containing protons. When a helium-8 nucleus and proton collided, the proton knocked out a group of two protons and two neutrons, also known as an alpha particle. Because each initial helium-8 nucleus had two protons and six neutrons, that left four neutrons alone.

    By measuring the momenta of the alpha particle and the ricocheting proton, the researchers determined the energy of the four neutrons. The measurement revealed a bump on a plot of the neutrons’ energy across multiple collisions — the signature of a resonance.
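
    That “measure the charged particles, infer the neutrons” step is essentially a missing-mass calculation. The sketch below, in Python with natural units (c = 1) and a one-dimensional toy geometry, shows the idea; the beam energy and the synthetic event are placeholder numbers invented for illustration, not the experiment’s values, and this is not the collaboration’s actual analysis.

        import math

        # Approximate masses in MeV (c = 1)
        M_HE8 = 7482.5    # helium-8 nucleus
        M_P = 938.27      # proton
        M_N = 939.57      # neutron

        def four_neutron_relative_energy(beam_ke, charged_E, charged_pz):
            """Missing-mass estimate of how far the 4n system sits above 4 neutron masses.

            beam_ke    : kinetic energy of the helium-8 beam (MeV)
            charged_E  : summed total energy of the detected alpha + proton (MeV)
            charged_pz : their summed momentum along the beam axis (MeV); real
                         analyses use full 3-momenta, this toy keeps everything 1-D
            """
            beam_E = M_HE8 + beam_ke
            beam_pz = math.sqrt(beam_E**2 - M_HE8**2)
            miss_E = beam_E + M_P - charged_E      # the target proton is at rest
            miss_pz = beam_pz - charged_pz
            return math.sqrt(miss_E**2 - miss_pz**2) - 4 * M_N

        # One self-consistent synthetic event: assume a 4n cluster 2.4 MeV above
        # threshold travelling at the beam velocity, and give the rest of the energy
        # and momentum to the alpha + proton pair. All numbers are placeholders.
        beam_ke = 1200.0                           # placeholder beam kinetic energy
        beam_E = M_HE8 + beam_ke
        beam_pz = math.sqrt(beam_E**2 - M_HE8**2)
        m4n = 4 * M_N + 2.4
        p4n = (beam_pz / M_HE8) * m4n              # spectator moving with the beam
        E4n = math.sqrt(m4n**2 + p4n**2)

        charged_E = beam_E + M_P - E4n
        charged_pz = beam_pz - p4n

        print(f"{four_neutron_relative_energy(beam_ke, charged_E, charged_pz):.1f} MeV")

    By construction the sketch recovers the 2.4 MeV it was fed; in the experiment, the same energy-and-momentum bookkeeping applied to many real events produced the bump in the neutrons’ energy that signals a resonance.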

    Particle smashup

    Physicists collided a helium-8 nucleus with a target proton and measured the momenta of the ricocheting proton and an escaping alpha particle — a clump of two neutrons and two protons. Those measurements revealed signs that the four neutrons released in the smashup formed a long-sought cluster called a tetraneutron.

    Diagram: a helium-8 nucleus collides with a target proton, producing a ricocheting proton and an escaping alpha particle and releasing four neutrons. M. Duer et al/Nature 2022

    In the past, “there were indications, but it was never very clear” whether tetraneutrons existed, says nuclear physicist Marlène Assié of Laboratoire de Physique des 2 Infinis Irène Joliot-Curie in Orsay, France. In 2016, Assié and colleagues reported hints of only a few tetraneutrons (SN: 2/8/16). In the new study, the researchers report observing around 30 clusters. The bump on the new plot is much clearer, she says. “I have no doubts on this measurement.”

    But theoretical calculations of what happens when four neutrons collide have raised skepticism as to whether a tetraneutron resonance can exist. If the forces between neutrons were strong enough to create a tetraneutron resonance, certain types of atomic nuclei should exist that are known not to, says theoretical nuclear physicist Natalia Timofeyuk of the University of Surrey in Guildford, England.

    Because of that contradiction, she thinks that the researchers have not observed a true resonance, but another effect that is not yet understood. For example, she says, the bump could result from a “memory” that the neutrons retain of how they were arranged inside the helium-8 nucleus.

    Other types of theoretical calculations are a closer match with the new results. “Indeed, theoretical results are very controversial, as they either predict a tetraneutron resonance in good agreement with the results presented in this paper, or they don’t predict any resonance at all,” says theoretical nuclear physicist Stefano Gandolfi of Los Alamos National Laboratory in New Mexico. Further calculations will be needed to understand the results of the experiment.

    New experiments could help too. Because detecting neutrons, which have no electric charge, is more difficult than detecting charged particles, the researchers didn’t directly observe the four neutrons. In future experiments, Duer and colleagues hope to spot the neutrons and better pin down the tetraneutrons’ properties.

    Future work may reveal once and for all whether tetraneutrons are the real deal.  

    in Science News on 2022-06-22 15:00:00 UTC.

  • - Wallabag.it! - Save to Instapaper - Save to Pocket -

    Gravitational wave ‘radar’ could help map the invisible universe

    It sounds like the setup for a joke: If radio waves give you radar and sound gives you sonar, what do gravitational waves get you?

    The answer might be “GRADAR” — gravitational wave “radar” — a potential future technology that could use reflections of gravitational waves to map the unseen universe, say researchers in a paper accepted to Physical Review Letters. By looking for these signals, scientists may be able to find dark matter or dim, exotic stars and learn about their deep insides.

    Astronomers routinely use gravitational waves — traveling ripples in the fabric of space and time itself, first detected in 2015 — to watch cataclysmic events that are hard to study with light alone, such as the merging of two black holes (SN: 2/11/2016).

    But physicists have also known about a seemingly useless property of gravitational waves: They can change course. Einstein’s theory of gravity says that spacetime gets warped by matter, and any wave passing through these distortions will change course. The upshot is that when something emits gravitational waves, part of the signal comes straight at Earth, but some might arrive later — like an echo — after taking longer paths that bend around a star or anything else heavy.

    Scientists have always thought these later signals, called “gravitational glints,” should be too weak to detect. But physicists Craig Copi and Glenn Starkman of Case Western Reserve University in Cleveland, Ohio, took a leap: Working off Einstein’s theory, they calculated how strong the signal would be when waves scatter through the gravitational field inside a star itself.

    “The shocking thing is that you seem to get a much larger result than you would have expected,” Copi says. “It’s something we’re still trying to understand, where that comes from — whether it’s believable, even, because it just seems too good to be true.”

    If gravitational glints can be so strong, astronomers could possibly use them to trace the insides of stars, the team says. Researchers could even look for massive bodies in space that would otherwise be impossible to detect, like globs of dark matter or lone neutron stars on the other side of the observable universe.

    “That would be a very exciting probe,” says Maya Fishbach, an astrophysicist at Northwestern University in Evanston, Ill., who was not involved in the study.

    There are still reasons to be cautious, though. If this phenomenon stands up to more detailed scrutiny, Fishbach says, scientists would have to understand it better before they could use it — and that will probably be difficult.

    “It’s a very hard calculation,” Copi says.

    But similar challenges have been overcome before. “The whole story of gravitational wave detection has been like that,” Fishbach says. It was a struggle to do all the math needed to understand their measurements, she says, but now the field is taking off (SN: 1/21/21). “This is the time to really be creative with gravitational waves.”

    in Science News on 2022-06-22 11:00:00 UTC.

  •

    UPenn prof with four retractions “may no longer be affiliated” with school

    A pharmacology researcher with four retractions appears to have left the University of Pennsylvania, where he had worked for at least 30 years and won more than $7 million in NIH grants. The school’s faculty page for William Armstead, who held a research professorship in Anesthesiology and Critical Care, now bears only the statement that …

    in Retraction watch on 2022-06-22 10:00:00 UTC.

  •

    Merely expecting to feel stressed has a negative effect on our mood

    By Emily Reynolds

    Elections can be stressful. Research has looked into the distress Americans felt when Trump was elected, and elections have also been linked to increased anxiety and stress and to poorer sleep quality. Most of this research, however, has looked at what happens after an election result, not before.

    A new study takes a different look, asking how the run-up to an election result can affect people’s mental health. Writing in the International Journal of Psychology, a team from North Carolina State University finds that simply anticipating election stress has a negative effect on our mood.

    The study took place in 2018, starting just before a midterm election in the US; election day was November 6th, and data collection ran from October 15th to November 13th. Participants first rated their own political ideology and indicated whether they identified as a Republican, a Democrat, or something else. They then completed a daily survey for the next 29 days.

    Each day, participants indicated how negative they were feeling on a scale from 1 to 5, with negative mood separated into five different feelings: upset, hostile, ashamed, nervous and afraid. They then rated how likely it was that they would experience stress related to the midterms in the next 24 hours, again on a scale from 1 to 5. Finally, participants indicated whether or not they had been exposed to information about the midterms that day via TV, newspapers or other sources.
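
    The write-up does not specify the statistical model, but repeated daily ratings nested within participants are commonly analysed with a multilevel (mixed-effects) model. The sketch below illustrates that general approach in Python with statsmodels, using hypothetical column names (participant, negative_mood, anticipated_stress); it is not the authors’ actual specification.

        # Minimal daily-diary analysis sketch: days nested within participants.
        # Column names, file name and the model itself are illustrative
        # assumptions, not the specification used in the published study.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("diary_data.csv")  # hypothetical file: one row per participant-day

        # Random-intercept model: does anticipated election stress (1-5) predict
        # same-day negative mood (1-5), allowing each participant their own baseline?
        model = smf.mixedlm("negative_mood ~ anticipated_stress",
                            data=df, groups=df["participant"])
        result = model.fit()
        print(result.summary())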

    As expected, there was a significant association between anticipated election-related stress and negative mood: on days when people indicated that they were highly likely to experience stress around the midterms, their level of negative feeling was higher than on days when they did not expect to experience much stress. People were more likely to anticipate stress before the election and on election day than after election day. Conservatives also anticipated experiencing higher levels of election-related stress than liberals, which the team believes has something to do with the “Blue wave” that was taking place in the US at the time.

    The team writes that this is the first evidence linking an anticipatory form of stress around elections to negative mood, and that it adds to evidence that simply anticipating future stressors can be as impactful as experiencing those stressors directly. Anticipatory stress is likely to be present in other areas of life too, whether political or personal: future research could explore how it manifests in different domains.

    If this is the case for all elections, then resources could be put in place to help people deal with their anxiety and stress. Overall, the study suggests that such feelings could have a serious impact on people’s wellbeing — and that anticipatory stress, around elections or other big events, should be taken seriously.

    – Anticipatory stress during an election: A daily diary study

    Emily Reynolds is a staff writer at BPS Research Digest

    in The British Psychological Society - Research Digest on 2022-06-22 09:09:24 UTC.

Feed list

  • Brain Science with Ginger Campbell, MD: Neuroscience for Everyone
  • Ankur Sinha
  • Marco Craveiro
  • UH Biocomputation group
  • BMC Series blog
  • The Official PLOS Blog
  • PLOS Neuroscience Community
  • The Neurocritic
  • Discovery magazine - Neuroskeptic
  • Neurorexia
  • Neuroscience - TED Blog
  • xcorr.net
  • Erin C. McKiernan
  • The British Psychological Society - Research Digest
  • Nature - Action potential
  • The Guardian - Neurophilosophy by Mo Costandi
  • Science News
  • OIST Japan - CNU - Eric De Schutter
  • Brain Byte - The HBP blog
  • The Silver Lab
  • Scientific American
  • Scientific American Mind
  • Nature News & Comment
  • Nature Biological Sciences Research
  • Romain Brette
  • Retraction watch
  • Elsevier Connect
  • Neural Ensemble News
  • Marianne Bezaire
  • Forging Connections
  • Yourbrainhealth
  • Neuroscientists talk shop
  • Brain matters the Podcast
  • NeuroPod
  • Brain box
  • Over the brainbow
  • The Spike
  • OUPblog - Psychology and Neuroscience
  • For Better Science
  • Open and Shut?
  • Open Access Tracking Project: news
  • Computational Neuroscience
  • Pillow Lab
  • NeuroFedora blog
  • Anna Dumitriu: Bioart and Bacteria
  • arXiv.org blog
  • Neurdiness: thinking about brains
  • Bits of DNA
  • Neuroscientifically Challenged
  • Peter Rupprecht
  • Malin Sandström's blog
  • INCF/OCNS Software Working Group
  • Gender Issues in Neuroscience (at Stanford University)
  • Vanessa Christopher's blog
  • CoCoSys lab

All content on this page is owned by the respective owners. The source code used to generate this page can be found here.