This robot automatically tucks its limbs to squeeze through spaces

Inspired by how ants move through narrow spaces by shortening their legs, scientists have built a robot that draws in its limbs to navigate constricted passages.

The robot was able to hunch down and walk quickly through passages that were narrower and shorter than itself, researchers report January 20 in Advanced Intelligent Systems. It could also climb over steps and move on grass, loose rock, mulch and crushed granite.

Such generality and adaptability are the main challenges of legged robot locomotion, says robotics engineer Feifei Qian, who was not involved in the study. Some robots have specialized limbs to move over a particular terrain, but they cannot squeeze into small spaces (SN: 1/16/19).
“A design that can adapt to a variety of environments with varying scales or stiffness is a lot more challenging, as trade-offs between the different environments need to be considered,” says Qian, of the University of Southern California in Los Angeles.

For inspiration, researchers in the new study turned to ants. “Insects are really a neat inspiration for designing robot systems that have minimal actuation but can perform a multitude of locomotion behaviors,” says Nick Gravish, a roboticist at the University of California, San Diego (SN: 8/16/18). Ants adapt their posture to crawl through tiny spaces. And they aren’t perturbed by uneven terrain or small obstacles. For example, their legs collapse a bit when they hit an object, Gravish says, and the ants continue to move forward quickly.

Gravish and colleagues built a short, stocky robot — about 30 centimeters wide and 20 centimeters long — with four wavy, telescoping limbs. Each limb consists of six nested concentric tubes that can draw into each other. What’s more, the limbs do not need to be actively powered or adjusted to change their overall length. Instead, springs that connect the leg segments automatically allow the legs to contract when the robot navigates a narrow space and stretch back out in an open space. The goal was to build mechanically intelligent structures rather than algorithmically intelligent robots.
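
The passive contraction can be pictured as a simple spring system. The sketch below is a minimal illustration of that idea, not the team's actual design: it treats one telescoping leg as a linear spring obeying Hooke's law, so a ceiling pressing down settles the leg at whatever length balances the load, with no sensing or motor commands involved. The spring constant, rest length and minimum length are made-up values.

```python
# Minimal sketch of a passively contracting, spring-loaded telescoping leg.
# The spring constant, rest length and minimum length are illustrative
# assumptions, not values from the study.

def passive_leg_length(rest_length_cm, min_length_cm, spring_k_n_per_cm, load_n):
    """Equilibrium leg length under a downward load, via Hooke's law.

    The leg compresses by load / k, but can telescope no shorter than
    its fully nested length.
    """
    compression = load_n / spring_k_n_per_cm
    return max(min_length_cm, rest_length_cm - compression)

# Example: a 10 cm leg with a soft spring (2 N/cm) that can nest down to 4 cm.
for load in (0, 5, 10, 20):  # newtons pressing down from a low ceiling
    print(load, "N ->", passive_leg_length(10, 4, 2, load), "cm")
```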

“It’s likely faster than active control, [which] requires the robot to first sense the contact with the environment, compute the suitable action and then send the command to its motors,” Qian says of these legs. Removing sensing and computing components can also make robots smaller, cheaper and less power-hungry.

The robot could modify its body width and height to achieve a larger range of body sizes than similar robots. The leg segments contracted into themselves to let the robot wiggle through small tunnels and sprawled outward under low ceilings. This adaptability let the robot squeeze into spaces as small as 72 percent of its full width and 68 percent of its full height.
Next, the researchers plan to actively control the stiffness of the springs that connect the leg segments to tune the motion to terrain type without consuming too much power. “That way, you can keep your leg long when you are moving on open ground or over tall objects, but then collapse down to the smallest possible shape in confined spaces,” Gravish says.
Such small-scale, minimal robots are easy to produce and can be quickly tweaked to explore complex environments. However, despite being able to walk across different terrains, these robots are, for now, too fragile for search-and-rescue, exploration or biological monitoring, Gravish says.

The new robot takes a step closer to those goals, but getting there will take more than just robotics, Qian says. “To actually achieve these applications would require an integration of design, control, sensing, planning and hardware advancement.”

But that’s not Gravish’s interest. Instead, he wants to connect these experiments back to what was observed in the ants originally and use the robots to ask more questions about the rules of locomotion in nature (SN: 1/16/20).

“I really would like to understand how small insects are able to move so rapidly across certain unpredictable terrain,” he says. “What is special about their limbs that enables them to move so quickly?”

The Kuiper Belt’s dwarf planet Quaoar hosts an impossible ring

The dwarf planet Quaoar has a ring that is too big for its metaphorical fingers. While all other rings in the solar system lie within or near a mathematically determined distance of their parent bodies, Quaoar’s ring is much farther out.

“For Quaoar, for the ring to be outside this limit is very, very strange,” says astronomer Bruno Morgado of the Federal University of Rio de Janeiro. The finding may force a rethink of the rules governing planetary rings, Morgado and colleagues say in a study published February 8 in Nature.
Quaoar is an icy body about half the size of Pluto that’s located in the Kuiper Belt at the solar system’s edge (SN: 8/23/22). At such a great distance from Earth, it’s hard to get a clear picture of the world.

So Morgado and colleagues watched Quaoar block the light from a distant star, a phenomenon called a stellar occultation. The timing of the star winking in and out of view can reveal details about Quaoar, like its size and whether it has an atmosphere.

The researchers took data from occultations from 2018 to 2020, observed from all over the world, including Namibia, Australia and Grenada, as well as space. There was no sign that Quaoar had an atmosphere. But surprisingly, there was a ring. The finding makes Quaoar just the third dwarf planet or asteroid in the solar system known to have a ring, after the asteroid Chariklo and the dwarf planet Haumea (SN: 3/26/14; SN: 10/11/17).

Even more surprisingly, “the ring is not where we expect,” Morgado says.
Known rings around other objects lie within or near what’s called the Roche limit, an invisible boundary set by the main body’s gravity. Inside the limit, tidal forces from the main body can rip a moon to shreds, turning it into a ring. Outside, the mutual gravity of smaller particles wins out over those tidal forces, and ring material should coalesce into one or several moons.
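
For reference, a commonly quoted textbook form of the Roche limit for a strengthless, fluid satellite (standard physics, not a figure from the new paper) gives the critical distance in terms of the parent body's radius R and the densities of the parent and the orbiting material:

```latex
d_{\mathrm{Roche}} \approx 2.44\,R\left(\frac{\rho_{M}}{\rho_{m}}\right)^{1/3}
```

Quaoar's ring sits well beyond this kind of distance, which is what makes it so puzzling.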

“We always think of [the Roche limit] as straightforward,” Morgado says. “One side is a moon forming, the other side is a ring stable. And now this limit is not a limit.”

For Quaoar’s far-out ring, there are a few possible explanations, Morgado says. Maybe the observers caught the ring at just the right moment, right before it turns into a moon. But that lucky timing seems unlikely, he notes.

Maybe Quaoar’s known moon, Weywot, or some other unseen moon contributes gravity that holds the ring stable somehow. Or maybe the ring’s particles are colliding in such a way that they avoid sticking together and clumping into moons.

The particles would have to be particularly bouncy for that to work, “like a ring of those bouncy balls from toy stores,” says planetary scientist David Jewitt of UCLA, who was not involved in the new work.

The observation is solid, says Jewitt, who helped discover the first objects in the Kuiper Belt in the 1990s. But there’s no way to know yet which of the explanations is correct, if any, in part because there are no theoretical predictions for such far-out rings to compare with Quaoar’s situation.

That’s par for the course when it comes to the Kuiper Belt. “Everything in the Kuiper Belt, basically, has been discovered, not predicted,” Jewitt says. “It’s the opposite of the classical model of science where people predict things and then confirm or reject them. People discover stuff by surprise, and everyone scrambles to explain it.”

More observations of Quaoar, or more discoveries of seemingly misplaced rings elsewhere in the solar system, could help reveal what’s going on.

“I have no doubt that in the near future a lot of people will start working with Quaoar to try to get this answer,” Morgado says.

How fingerprints form was a mystery — until now

Scientists have finally figured out how those arches, loops and whorls formed on your fingertips.

While in the womb, fingerprint-defining ridges expand outward in waves starting from three different points on each fingertip. The raised skin arises in a striped pattern thanks to interactions between three molecules that follow what’s known as a Turing pattern, researchers report February 9 in Cell. How those ridges spread from their starting sites — and merge — determines the overarching fingerprint shape.
Fingerprints are unique and last for a lifetime. They’ve been used to identify individuals since the 1800s. Several theories have been put forth to explain how fingerprints form, including spontaneous skin folding, molecular signaling and the idea that ridge pattern may follow blood vessel arrangements.

Scientists knew that the ridges that characterize fingerprints begin to form as downward growths into the skin, like trenches. Over the few weeks that follow, the quickly multiplying cells in the trenches start growing upward, resulting in thickened bands of skin.

Since budding fingerprint ridges and developing hair follicles have similar downward structures, researchers in the new study compared cells from the two locations. The team found that both sites share some types of signaling molecules — messengers that transfer information between cells — including three known as WNT, EDAR and BMP. Further experiments revealed that WNT tells cells to multiply, forming ridges in the skin, and to produce EDAR, which in turn further boosts WNT activity. BMP thwarts these actions.

To examine how these signaling molecules might interact to form patterns, the team adjusted the molecules’ levels in mice. Mice don’t have fingerprints, but their toes have striped ridges in the skin comparable to human prints. “We turn a dial — or molecule — up and down, and we see the way the pattern changes,” says developmental biologist Denis Headon of the University of Edinburgh.

Increasing EDAR resulted in thicker, more spaced-out ridges, while decreasing it led to spots rather than stripes. The opposite occurred with BMP, since it hinders EDAR production.

That switch between stripes and spots is a signature change seen in systems governed by Turing reaction-diffusion, Headon says. This mathematical theory, proposed in the 1950s by British mathematician Alan Turing, describes how chemicals interact and spread to create patterns seen in nature (SN: 7/2/10). When put to the test, though, it has explained only some of those patterns (SN: 1/21/14).
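
As a rough illustration of how a Turing-style system can flip between stripes and spots when one parameter is dialed up or down, here is a minimal two-chemical reaction-diffusion simulation. It uses the classic Gray-Scott model as a generic stand-in, not the WNT/EDAR/BMP system the researchers modeled, and the feed and kill rates are standard illustrative values.

```python
import numpy as np

def laplacian(z):
    """Discrete Laplacian with periodic boundaries."""
    return (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
            np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4 * z)

def gray_scott(feed, kill, steps=5000, n=128, du=0.16, dv=0.08, dt=1.0):
    """Evolve the Gray-Scott activator-inhibitor model from a noisy seed."""
    u = np.ones((n, n))
    v = np.zeros((n, n))
    # Seed a perturbed square in the middle so patterns can nucleate.
    u[n//2-10:n//2+10, n//2-10:n//2+10] = 0.50
    v[n//2-10:n//2+10, n//2-10:n//2+10] = 0.25
    v += 0.01 * np.random.rand(n, n)
    for _ in range(steps):
        uvv = u * v * v
        u += dt * (du * laplacian(u) - uvv + feed * (1 - u))
        v += dt * (dv * laplacian(v) + uvv - (feed + kill) * v)
    return v

# Dialing the parameters shifts the pattern, loosely analogous to
# turning a signaling molecule up or down in the mice.
spots = gray_scott(feed=0.035, kill=0.065)    # tends toward spots
stripes = gray_scott(feed=0.055, kill=0.062)  # tends toward stripe-like labyrinths
```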

Mouse digits, however, are too tiny to give rise to the elaborate shapes seen in human fingerprints. So, the researchers used computer models to simulate a Turing pattern spreading from the three previously known ridge initiation sites on the fingertip: the center of the finger pad, under the nail and at the joint’s crease nearest the fingertip.
By altering the relative timing, location and angle of these starting points, the team could create each of the three most common fingerprint patterns — arches, loops and whorls — and even rarer ones. Arches, for instance, can form when finger pad ridges get a slow start, allowing ridges originating from the crease and under the nail to occupy more space.

“It’s a very well-done study,” says developmental and stem cell biologist Sarah Millar, director of the Black Family Stem Cell Institute at the Icahn School of Medicine at Mount Sinai in New York City.

Controlled competition between molecules also determines hair follicle distribution, says Millar, who was not involved in the work. The new study, she says, “shows that the formation of fingerprints follows along some basic themes that have already been worked out for other types of patterns that we see in the skin.”

Millar notes that people with gene mutations that affect WNT and EDAR have skin abnormalities. “The idea that those molecules might be involved in fingerprint formation was floating around,” she says.

Overall, Headon says, the team aims to aid formation of skin structures, like sweat glands, when they’re not developing properly in the womb, and maybe even after birth.

“What we want to do, in broader terms, is understand how the skin matures.”

The deadly VEXAS syndrome is more common than doctors thought

A mysterious new disease may be to blame for severe, unexplained inflammation in older men. Now, researchers have their first good look at who the disease strikes, and how often.

VEXAS syndrome, an illness discovered just two years ago, affects nearly 1 in 4,000 men over 50 years old, scientists estimate January 24 in JAMA. The disease also occurs in older women, though less frequently. Altogether, more than 15,000 people in the United States may be suffering from the syndrome, says study coauthor David Beck, a clinical geneticist at NYU Langone Health in New York City. Those numbers indicate that physicians should be on the lookout for VEXAS, Beck says. “It’s underrecognized and underdiagnosed. A lot of physicians aren’t yet aware of it.”
Beck’s team reported discovering VEXAS syndrome in 2020, linking mutations in a gene called UBA1 to a suite of symptoms including fever, low blood cell count and inflammation. His team’s new study is the first to estimate how often VEXAS occurs in the general population — and the results are surprising. “It’s more prevalent than we suspected,” says Emma Groarke, a hematologist at the National Institutes of Health in Bethesda, Md., who was not involved with the study.
VEXAS tends to show up later in life — after people somehow acquire UBA1 mutations in their blood cells. Patients may feel overwhelming fatigue, lethargy and have skin rashes, Beck says. “The disease is progressive, and it’s severe.” VEXAS can also be deadly. Once a person’s symptoms begin, the median survival time is about 10 years, his team has found.

Until late 2020, no one knew that there was a genetic thread connecting VEXAS syndrome’s otherwise unexplained symptoms. In fact, individuals may be diagnosed with other conditions, including polyarteritis nodosa, an inflammatory blood disease, and relapsing polychondritis, a connective tissue disorder, before being diagnosed with VEXAS.

To ballpark the number of VEXAS-affected individuals, Beck’s team combed through electronic health records of more than 160,000 people in Pennsylvania, in a collaboration with the NIH and Geisinger Health. In people over 50, the disease-causing UBA1 mutations showed up in roughly 1 in 4,000 men. Among women in that age bracket, about 1 in 26,000 had the mutations.
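
Those rates roughly square with the national estimate. The back-of-the-envelope check below uses round U.S. population figures assumed for illustration (on the order of 55 million to 60 million men and women each over age 50), not numbers from the study.

```python
# Rough sanity check of the ">15,000 people in the U.S." figure.
# Population counts are approximate assumptions, not data from the JAMA study.

us_men_over_50 = 55_000_000
us_women_over_50 = 60_000_000

men_with_vexas = us_men_over_50 / 4_000        # ~1 in 4,000 men
women_with_vexas = us_women_over_50 / 26_000   # ~1 in 26,000 women

print(round(men_with_vexas))                      # ~13,750
print(round(women_with_vexas))                    # ~2,300
print(round(men_with_vexas + women_with_vexas))   # roughly 16,000 total
```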

A genetic test of the blood can help doctors diagnose VEXAS, and treatments like steroids and other immunosuppressive drugs, which tamp down inflammation, can ease symptoms. Groarke and her NIH colleagues have also started a small phase II clinical trial testing bone marrow transplants as a way to swap patients’ diseased blood cells for healthy ones.

Beck says he hopes to raise awareness about the disease, though he recognizes that there’s much more work to do. In his team’s study, for instance, the vast majority of participants were white Pennsylvanians, so scientists don’t know how the disease affects other populations. Researchers also don’t know what spurs the blood cell mutations, nor how they spark an inflammatory frenzy in the body.

“The more patients that are diagnosed, the more we’ll learn about the disease,” Beck says. “This is just one step in the process of finding more effective therapies.”

Muon scanning hints at mysteries within an ancient Chinese wall

For nearly 650 years, the fortress walls in the Chinese city of Xi’an have served as a formidable barrier around the central city. At 12 meters high and up to 18 meters thick, they are impervious to almost everything — except subatomic particles called muons.

Now, thanks to their penetrating abilities, muons may be key to ensuring that the walls that once protected the treasures of the first Ming Dynasty — and are now a national architectural treasure in their own right — stand for centuries more.

A refined detection method has provided the highest-resolution muon scans yet produced of any archaeological structure, researchers report in the Jan. 7 Journal of Applied Physics. The scans revealed interior density fluctuations as small as a meter across inside one section of the Xi’an ramparts. The fluctuations could be signs of dangerous flaws or “hidden structures archaeologically interesting for discovery and investigation,” says nuclear physicist Zhiyi Liu of Lanzhou University in China.
Muons are like electrons, only heavier. They rain down all over the planet, produced when charged particles called cosmic rays hit the atmosphere. Although muons can travel deep into earth and stone, they are scattered or absorbed depending on the material they encounter. Counting the ones that pass through makes them useful for studying volcano interiors, scanning pyramids for hidden chambers and even searching for contraband stashed in containers impervious to X-rays (SN: 4/22/22).
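
The counting principle can be sketched with a deliberately simplified toy model: assume the fraction of muons that survives a path falls off with the amount of mass traversed (density times path length). Real muography relies on the measured muon energy spectrum and detailed energy-loss physics, so the exponential form and the attenuation scale below are illustrative assumptions only.

```python
import math

# Toy model: surviving flux ~ exp(-opacity / X0), where opacity = density *
# path length (g/cm^2) and X0 is a made-up attenuation scale. Not the real
# physics used in muography.

X0 = 3.0e3  # g/cm^2, illustrative attenuation scale

def surviving_fraction(density_g_cm3, path_cm):
    opacity = density_g_cm3 * path_cm
    return math.exp(-opacity / X0)

# Two sight lines through an 18-m-thick rampart: solid earthen fill vs. a
# hypothetical slightly less dense region, such as a void or looser fill.
solid = surviving_fraction(2.0, 1800)
less_dense = surviving_fraction(1.8, 1800)

# A relative excess of counts along one sight line hints at lower density there.
print(f"relative excess of muon counts: {less_dense / solid - 1:.1%}")
```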

Though muons stream down continuously, their numbers are small enough that the researchers had to deploy six detectors for a week at a time to collect enough data for 3-D scans of the rampart.

It’s now up to conservationists to determine how to address any density fluctuations that might indicate dangerous flaws, or historical surprises, inside the Xi’an walls.

Too much of this bacteria in the nose may worsen allergy symptoms

A type of bacteria that’s overabundant in the nasal passages of people with hay fever may worsen symptoms. Targeting that bacteria may provide a way to rein in ever-running noses.

Hay fever occurs when allergens, such as pollen or mold, trigger an inflammatory reaction in the nasal passages, leading to itchiness, sneezing and overflowing mucus. Researchers analyzed the composition of the microbial population in the noses of 55 people who have hay fever and those of 105 people who don’t. There was less diversity in the nasal microbiome of people who have hay fever and a whole lot more of a bacterial species called Streptococcus salivarius, the team reports online January 12 in Nature Microbiology.
S. salivarius was 17 times more abundant in the noses of allergy sufferers than the noses of those without allergies, says Michael Otto, a molecular microbiologist at the National Institute of Allergy and Infectious Diseases in Bethesda, Md. That imbalance appears to play a part in further provoking allergy symptoms. In laboratory experiments with allergen-exposed cells that line the airways, S. salivarius boosted the cells’ production of proteins that promote inflammation.

And it turns out that S. salivarius really likes runny noses. One prominent, unpleasant symptom of hay fever is the overproduction of nasal discharge. The researchers found that S. salivarius binds very well to airway-lining cells exposed to an allergen and slathered in mucus — better than a comparison bacteria that also resides in the nose.

The close contact appears to be what makes the difference. It means that substances on S. salivarius’ surface that can drive inflammation — common among many bacteria — are close enough to exert their effect on cells, Otto says.

Hay fever, which disrupts daily activities and disturbs sleep, is estimated to affect as many as 30 percent of adults in the United States. The new research opens the door “to future studies targeting this bacteria” as a potential treatment for hay fever, says Mahboobeh Mahdavinia, a physician scientist who studies immunology and allergies at Rush University Medical Center in Chicago.

But any treatment would need to avoid harming the “good” bacteria that live in the nose, says Mahdavinia, who was not involved in the research.

The proteins on S. salivarius’ surface that are important to its ability to attach to mucus-covered cells might provide a target, says Otto. The bacteria bind to proteins called mucins found in the slimy, runny mucus. By learning more about S. salivarius’ surface proteins, Otto says, it may be possible to come up with “specific methods to block that adhesion.”

Chicken DNA is replacing the genetics of their ancestral jungle fowl

Today’s red jungle fowl — the wild forebears of the domesticated chicken — are becoming more chickenlike. New research suggests that a large proportion of the wild fowl’s DNA has been inherited from chickens, and relatively recently.

Ongoing interbreeding between the two birds may threaten wild jungle fowl populations’ future, and even hobble humans’ ability to breed better chickens, researchers report January 19 in PLOS Genetics.

Red jungle fowl (Gallus gallus) are forest birds native to Southeast Asia and parts of South Asia. Thousands of years ago, humans domesticated the fowl, possibly in the region’s rice fields (SN: 6/6/22).
“Chickens are arguably the most important domestic animal on Earth,” says Frank Rheindt, an evolutionary biologist at the National University of Singapore. He points to their global ubiquity and abundance. Chicken is also one of the cheapest sources of animal protein that humans have.

Domesticated chickens (G. gallus domesticus) were known to be interbreeding with jungle fowl near human settlements in Southeast Asia. Given the unknown impacts on jungle fowl and the importance of chickens to humankind, Rheindt and his team wanted to gather more details. Wild jungle fowl contain a store of genetic diversity that could serve as a crucial resource for breeding chickens resistant to diseases or other threats.

The researchers analyzed and compared the genomes — the full complement of an organism’s DNA — of 63 jungle fowl and 51 chickens from across Southeast Asia. Some of the jungle fowl samples came from museum specimens collected from 1874 through 1939, letting the team see how the genetic makeup of jungle fowl has changed over time.

Over the last century or so, wild jungle fowl’s genomes have become increasingly similar to chickens’. Between about 20 and 50 percent of the genomes of modern jungle fowl originated in chickens, the team found. In contrast, many of the roughly 100-year-old jungle fowl had a chicken-ancestry share in the range of a few percent.

The rapid change probably comes from human communities expanding into the region’s wilderness, Rheindt says. Most modern jungle fowl live in close vicinity to humans’ free-ranging chickens, with which they frequently interbreed.

Such interbreeding has become “almost the norm now” for any globally domesticated species, Rheindt says, such as dogs hybridizing with wolves and house cats crossing with wildcats. Pigs, meanwhile, are mixing with wild boars and ferrets with polecats.
Wild populations that interbreed with their domesticated counterparts could pick up physical or behavioral traits that change how the hybrids function in their ecosystem, says Claudio Quilodrán, a conservation geneticist at the University of Geneva not involved with this research.

The effect is likely to be negative, Quilodrán says, since some of the traits coming into the wild population have been honed for human uses, not for survival in the local environment.

Wild jungle fowl have also lost genetic diversity as they’ve interbred with chickens. The birds’ heterozygosity — a measure of a population’s genetic diversity — is now just a tenth of what it was a century ago.

“This result is initially counterintuitive,” Rheindt says. “If you mix one population with another, you would generally expect a higher genetic diversity.”

But domesticated chickens have such low genetic diversity that certain versions of jungle fowl genes are being swept out of the population by a tsunami of genetic homogeneity. The whittling down of these animals’ genetic toolkit may leave them vulnerable to conservation threats.
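
That pull toward homogeneity can be illustrated with expected heterozygosity, a standard diversity measure defined at a gene as one minus the sum of the squared allele frequencies. The allele frequencies below are invented for illustration, not from the study; mixing a diverse wild population with a nearly uniform domestic one drags the admixed population's diversity down toward the domestic value.

```python
# Toy illustration of expected heterozygosity, H = 1 - sum(p_i^2).
# Allele frequencies are hypothetical, not from the study.

def heterozygosity(freqs):
    """Expected heterozygosity at one locus given allele frequencies."""
    return 1 - sum(p * p for p in freqs)

wild_1900s = [0.5, 0.3, 0.2]      # diverse historical jungle fowl (hypothetical)
domestic = [0.98, 0.01, 0.01]     # nearly uniform chickens (hypothetical)

# A modern wild bird with ~50% chicken ancestry has intermediate frequencies.
mixed = [0.5 * w + 0.5 * d for w, d in zip(wild_1900s, domestic)]

print(f"historical wild H = {heterozygosity(wild_1900s):.2f}")  # ~0.62
print(f"domestic H        = {heterozygosity(domestic):.2f}")    # ~0.04
print(f"admixed wild H    = {heterozygosity(mixed):.2f}")       # ~0.42, pulled down
```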

“Having lots of genetic diversity within a species increases the chance that certain individuals contain the genetic background to adapt to a varied range of different environmental changes and diseases,” says Graham Etherington, a computational biologist at the Earlham Institute in Norwich, England, who was not involved with this research.

A shallower jungle fowl gene pool could also mean diminished resources for breeding better chickens. The genetics of wild relatives are sometimes used to bolster the disease or pest resistance of domesticated crop plants. Jungle fowl genomes could be similarly valuable for this reason.

“If this trend continues unabated, future human generations may only be able to access the entirety of ancestral genetic diversity of chickens in the form of museum specimens,” Rheindt says, which could hamper chicken breeding efforts using the wild fowl genes.

Some countries such as Singapore, Rheindt says, have started managing jungle fowl populations to reduce interbreeding with chickens.

Lots of Tatooine-like planets around binary stars may be habitable

SEATTLE — Luke Skywalker’s home planet in Star Wars is the stuff of science fiction. But Tatooine-like planets in orbit around pairs of stars might be our best bet in the search for habitable planets beyond our solar system.

Many stars in the universe come in pairs. And lots of those should have planets orbiting them (SN: 10/25/21). That means there could be many more planets orbiting around binaries than around solitary stars like ours. But until now, no one had a clear idea about whether those planets’ environments could be conducive to life. New computer simulations suggest that, in many cases, life could imitate art.
Earthlike planets orbiting some configurations of binary stars can stay in stable orbits for at least a billion years, researchers reported January 11 at the American Astronomical Society meeting. That sort of stability, the researchers propose, would be enough to potentially allow life to develop, provided the planets aren’t too hot or cold.

Of the planets that stuck around, about 15 percent stayed in their habitable zone — a temperate region around their stars where water could stay liquid — most or even all of the time.
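
As a rough guide to where such a temperate zone sits, the distance at which a planet receives Earth-like sunlight scales with the square root of the total stellar luminosity. The sketch below estimates inner and outer edges around a close binary by adding the two stars' luminosities and using commonly cited flux limits of roughly 1.1 and 0.35 times Earth's insolation; those limits, the example luminosities and the point-source treatment of the pair are assumptions for illustration, not values from the presented study.

```python
import math

# Rough habitable-zone estimate around a close binary, treating the pair as a
# single light source. Flux limits (~1.1 and ~0.35 times Earth's insolation)
# are commonly cited approximations; the luminosities are example values.

def habitable_zone_au(l_total_solar, s_inner=1.1, s_outer=0.35):
    """Return (inner, outer) habitable-zone edges in AU for a given luminosity."""
    inner = math.sqrt(l_total_solar / s_inner)
    outer = math.sqrt(l_total_solar / s_outer)
    return inner, outer

# Example: two sunlike stars of 1.0 and 0.6 solar luminosities.
inner, outer = habitable_zone_au(1.0 + 0.6)
print(f"habitable zone roughly {inner:.2f} to {outer:.2f} AU from the pair")
```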

The researchers ran simulations of 4,000 configurations of binary stars, each with an Earthlike planet in orbit around them. The team varied things like the relative masses of the stars, the sizes and shapes of the stars’ orbits around each other, and the size of the planet’s orbit around the binary pair.

The scientists then tracked the motion of the planets for up to a billion years of simulated time to see if the planets would stay in orbit over the sorts of timescales that might allow life to emerge.

A planet orbiting binary stars can get kicked out of the star system due to complicated interactions between the planet and stars. In the new study, the researchers found that, for planets with large orbits around star pairs, only about 1 out of 8 were kicked out of the system. The rest were stable enough to continue to orbit for the full billion years. About 1 in 10 settled in their habitable zones and stayed there.

Of the 4,000 planets that the team simulated, roughly 500 maintained stable orbits that kept them in their habitable zones at least 80 percent of the time.

“The habitable zone … as I’ve characterized it so far, spans from freezing to boiling,” said Michael Pedowitz, an undergraduate student at the College of New Jersey in Ewing who presented the research. Their definition is overly strict, he said, because they chose to model Earthlike planets without atmospheres or oceans. That’s simpler to simulate, but it also allows temperatures to fluctuate wildly on a planet as it orbits.
“An atmosphere and oceans would smooth over temperature variations fairly well,” says study coauthor Mariah MacDonald, an astrobiologist also at the College of New Jersey. An abundance of air and water would potentially allow a planet to maintain habitable conditions, even if it spent more of its time outside of the nominal habitable zone around a binary star system.

The number of potentially habitable planets “will increase once we add atmospheres,” MacDonald says, “but I can’t yet say by how much.”

She and Pedowitz hope to build more sophisticated models in the coming months, as well as extend their simulations beyond a billion years and include changes in the stars that can affect conditions in a solar system as it ages.

The possibility of stable and habitable planets in binary star systems is a timely issue, says Penn State astrophysicist Jason Wright, who was not involved in the study.

“At the time Star Wars came out,” he says, “we didn’t know of any planets outside the solar system, and wouldn’t for 15 years. Now we know that there are many and that they orbit these binary stars.”

These simulations of planets orbiting binaries could serve as a guide for future experiments, Wright says. “This is an under-explored population of planets. There’s no reason we can’t go after them, and studies like this are presumably showing us that it’s worthwhile to try.”

Procrastination may harm your health. Here’s what you can do

The worst procrastinators probably won’t be able to read this story. It’ll remind them of what they’re trying to avoid, psychologist Piers Steel says.

Maybe they’re dragging their feet going to the gym. Maybe they haven’t gotten around to their New Year’s resolutions. Maybe they’re waiting just one more day to study for that test.

Procrastination is “putting off to later what you know you should be doing now,” even if you’ll be worse off, says Steel, of the University of Calgary in Canada. But all those tasks pushed to tomorrow seem to wedge themselves into the mind — and it may be harming people’s health.
In a study of thousands of university students, scientists linked procrastination to a panoply of poor outcomes, including depression, anxiety and even disabling arm pain. “I was surprised when I saw that one,” says Fred Johansson, a clinical psychologist at Sophiahemmet University in Stockholm. His team reported the results January 4 in JAMA Network Open.

The study is one of the largest yet to tackle procrastination’s ties to health. Its results echo findings from earlier studies that have gone largely ignored, says Fuschia Sirois, a behavioral scientist at Durham University in England, who was not involved with the new research.

For years, scientists didn’t seem to view procrastination as something serious, she says. The new study could change that. “It’s that kind of big splash that’s … going to get attention,” Sirois says. “I’m hoping that it will raise awareness of the physical health consequences of procrastination.”

Procrastinating may be bad for the mind and body
Whether procrastination harms health can seem like a chicken-and-egg situation.

It can be hard to tell if certain health problems make people more likely to procrastinate — or the other way around, Johansson says. (It may be a bit of both.) And controlled experiments on procrastination aren’t easy to do: You can’t just tell a study participant to become a procrastinator and wait and see if their health changes, he says.
Many previous studies have relied on self-reported surveys taken at a single time point. But a snapshot of someone makes it tricky to untangle cause and effect. Instead, in the new study, about 3,500 students were followed over nine months, so researchers could track whether procrastinating students later developed health issues.

On average, these students tended to fare worse over time than their prompter peers. They were slightly more stressed, anxious, depressed and sleep-deprived, among other issues, Johansson and colleagues found. “People who score higher on procrastination to begin with … are at greater risk of developing both physical and psychological problems later on,” says study coauthor Alexander Rozental, a clinical psychologist at Uppsala University in Sweden. “There is a relationship between procrastination at one time point and having these negative outcomes at the later point.”

The study was observational, so the team can’t say for sure that procrastination causes poor health. But results from other researchers also seem to point in this direction. A 2021 study tied procrastinating at bedtime to depression. And a 2015 study from Sirois’ lab linked procrastinating to poor heart health.

Stress may be to blame for procrastination’s ill effects, data from Sirois’ lab and other studies suggest. She thinks that the effects of chronic procrastinating could build up over time. And though procrastination alone may not cause disease, Sirois says, it could be “one extra factor that can tip the scales.”

No, procrastinators are not lazy
Some 20 percent of adults are estimated to be chronic procrastinators. Everyone might put off a task or two, but chronic procrastinators make it their lifestyle, says Joseph Ferrari, a psychologist at DePaul University in Chicago, who has been studying procrastination for decades. “They do it at home, at school, at work and in their relationships.” These are the people, he says, who “you know are going to RSVP late.”

Though procrastinators may think they perform better under pressure, Ferrari has reported the opposite. They actually worked more slowly and made more errors than non-procrastinators, his experiments have shown. And when deadlines are slippery, procrastinators tend to let their work slide, Steel’s team reported last year in Frontiers in Psychology.

For years, researchers have focused on the personalities of people who procrastinate. Findings vary, but some scientists suggest procrastinators may be impulsive, prone to worry and apt to have trouble regulating their emotions. One thing procrastinators are not, Ferrari emphasizes, is lazy. They’re actually “very busy doing other things than what they’re supposed to be doing,” he says.

In fact, Rozental adds, most research today suggests procrastination is a behavioral pattern.

And if procrastination is a behavior, he says, that means it’s something you can change, regardless of whether you’re impulsive.

Why procrastinators should be kind to themselves
When people put off a tough task, they feel good — in the moment.
Procrastinating is a way to sidestep the negative emotions linked to the task, Sirois says. “We’re sort of hardwired to avoid anything painful or difficult,” she says. “When you procrastinate, you get immediate relief.” A backdrop of stressful circumstances — say, a worldwide pandemic — can strain people’s ability to cope, making procrastinating even easier. But the relief it provides is only temporary, and many seek out ways to stop dawdling.

Researchers have experimented with procrastination treatments that run the gamut from the logistical to the psychological. What works best is still under investigation. Some scientists have reported success with time-management interventions. But the evidence for that “is all over the map,” Sirois says. That’s because “poor time management is a symptom not a cause of procrastination,” she adds.

For some procrastinators, seemingly obvious tips can work. In his clinical practice, Rozental advises students to simply put down their smartphones. Silencing notifications or studying in the library rather than at home can quash distractions and keep people on task. But that won’t be enough for many people, he says.

Hard-core procrastinators may benefit from cognitive behavioral therapy. In a 2018 review of procrastination treatments, Rozental found that this type of therapy, which involves managing thoughts and emotions and trying to change behavior, seemed to be the most helpful. Still, not many studies have examined treatments, and there’s room for improvement, he says.

Sirois also favors an emotion-centered approach. Procrastinators can fall into a shame spiral where they feel uneasy about a task, put the task off, feel ashamed for putting it off and then feel even worse than when they started. People need to short-circuit that loop, she says. Self-forgiveness may help, scientists suggested in one 2020 study. So could mindfulness training.

In a small trial of university students, eight weekly mindfulness sessions reduced procrastination, Sirois and colleagues reported in the January Learning and Individual Differences. Students practiced focusing on the body, meditated during unpleasant activities and discussed the best ways to take care of themselves. A little self-compassion may snap people out of their spiral, Sirois says.

“You made a mistake and procrastinated. It’s not the end of the world,” she says. “What can you do to move forward?”

Supercooled water has been caught morphing between two forms

Supercooled water is two of a kind, a new study shows.

Scientists have long suspected that water at subfreezing temperatures comes in two distinct varieties: a high-density liquid that appears at very high pressures and a low-density liquid at lower pressures. Now, ultrafast measurements have caught water morphing from one type of liquid to the other, confirming that hunch. The discovery, reported in the Nov. 20 Science, could help explain some of water’s quirks.

The experiment “adds more and more evidence to the idea that water really is two components … and that that is the reason that underlies why water is so weird,” says physicist Greg Kimmel of Pacific Northwest National Laboratory in Richland, Wash., who was not involved in the study.

When free from impurities, water can remain liquid below its typical freezing point of zero degrees Celsius, forming what’s called a supercooled liquid. But the dual nature of supercooled water was expected to appear in a temperature realm so difficult to study that it’s been dubbed “no-man’s-land.” Below around –40° C, water remains liquid for mere instants before it crystallizes into ice. Making the task even more daunting, the high-density phase appears only at very high pressures. Still, “people have dreamt about how to do an experiment,” says Anders Nilsson of Stockholm University.
Thanks to speedy experimental maneuvers, Nilsson and colleagues have infiltrated that no-man’s-land by monitoring water’s properties on a scale of nanoseconds. “This is one of the major accomplishments of this paper,” says computational chemist Gül Zerze of Princeton University. “I’m impressed with their work.”

The scientists started by creating a type of high-density ice. Then, a pulse from an infrared laser heated the ice, forming liquid water under high pressure. That water then expanded, and the pressure rapidly dropped. Meanwhile, the researchers used an X-ray laser to investigate how the structure of the water changed, based on how the X-rays scattered. As the pressure decreased, the water transitioned from a high-density to low-density fluid before crystallizing into ice.

Previous studies have used ultrafast techniques to find hints of water’s two-faced demeanor, but those have been done mainly at atmospheric pressure (SN: 9/28/20). In the new work, the water was observed at about 3,000 times atmospheric pressure and –68° C. “It’s the first time we have real experimental data at these pressures and temperatures,” says physicist Loni Kringle of Pacific Northwest National Laboratory, who was not involved with the experiment.

The result could indicate that supercooled water has a “critical point” — a certain pressure and temperature at which two distinct phases merge into one. In the future, Nilsson hopes to pinpoint that spot.

Such a critical point could explain why water is an oddball liquid. Most liquids get denser and harder to compress as they cool. Water gets denser as it is cooled to 4° C, but becomes less dense as it is cooled further. And unlike most liquids, its compressibility increases as it’s cooled.
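
Compressibility here means the isothermal compressibility, the fractional volume change per unit of added pressure (the standard thermodynamic definition; the roughly 46° C location of water's compressibility minimum is a commonly cited value, not a number from this study):

```latex
\kappa_T = -\frac{1}{V}\left(\frac{\partial V}{\partial P}\right)_T
```

For typical liquids this quantity keeps falling as the temperature drops; for water it reaches a minimum near 46° C and then climbs as the liquid is cooled further.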

If supercooled water has a critical point, that could indicate that the water experienced in daily life is strange because, under typical pressures and temperatures, it is a supercritical liquid — a weird state that occurs beyond a critical point. Such a liquid would not be the high-density or low-density form, but would consist of some regions with a high-density arrangement of water molecules and other pockets of low density. The relative amounts of those two structures, which result from different arrangements of hydrogen bonds between the molecules, would change as the temperature changes, explaining why water behaves strangely as it is cooled.

So despite the fact that the experiment involved extreme pressures and temperatures, Nilsson says, “it influences water in our ordinary life.”