AI headphones let you listen to only a single person in a crowd

A University of Washington team has developed an AI system that lets a user wearing headphones look at a person speaking for three to five seconds and then listen only to that person (“enroll” them).

Their “Target Speech Hearing” app then cancels all other sounds in the environment and plays just the enrolled speaker’s voice in real time, even if the listener moves around in noisy places and no longer faces the speaker.

How it works

To use the system, a person wearing off-the-shelf headphones fitted with microphones taps a button while directing their head at someone talking. The sound waves from that speaker’s voice should then reach the microphones on both sides of the headset simultaneously.

The headphones send that signal to an on-board embedded computer, where the team’s machine learning software learns the desired speaker’s vocal patterns. The system latches onto that speaker’s voice and continues to play it back to the listener, even as the pair moves around. The system’s ability to focus on the enrolled voice improves as the speaker keeps talking, giving it more training data.
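The paper’s enrollment pipeline is a neural network, but the binaural cue it exploits can be sketched in a few lines: when the listener faces the speaker, the voice reaches both microphones at nearly the same time, so the cross-correlation between the two channels peaks at a lag near zero. A minimal sketch (the function names and tolerance are illustrative, not from the paper):

```python
import numpy as np

def inter_mic_delay(left, right, sample_rate):
    """Estimate the time difference of arrival (in seconds) between
    the two headset microphones via cross-correlation."""
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)  # offset in samples
    return lag / sample_rate

def facing_speaker(left, right, sample_rate, tolerance_s=1e-4):
    """A speaker directly ahead yields a near-zero inter-mic delay."""
    return abs(inter_mic_delay(left, right, sample_rate)) < tolerance_s
```

A delay close to zero suggests the voice is straight ahead; the real system then feeds the aligned binaural signal to on-device machine learning that learns that speaker’s vocal characteristics.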

This work builds on the team’s previous “semantic hearing” research, which allowed users to select specific sound classes—such as birds or voices—that they wanted to hear, and automatically cancel other sounds in the environment.

The team plans to use the Target Speech Hearing app with earbuds and hearing aids in the future. The code for the proof-of-concept device is available for others to build on, but not commercially available.

Citation: Bandhav Veluri, Malek Itani, Tuochao Chen, Takuya Yoshioka, Shyamnath Gollakota. Look Once to Hear: Target Speech Hearing with Noisy Examples. Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’24), May 2024, Article No. 37, pages 1–16. https://doi.org/10.1145/3613904.3642057 (open source)

Let us know your thoughts! Sign up for a Mindplex account now, join our Telegram, or follow us on Twitter

Euclid telescope reveals amazing images from the Universe’s distant past

Scientists have just released the first set of scientific data captured with the Euclid telescope, showing an exciting glimpse of the Universe’s distant past.

The mission seeks to unlock mysteries of dark matter and dark energy and reveal how and why the Universe looks as it does today.

Five never-before-seen images of the Universe

Early observations, described in a series of 10 scientific papers published today (May 23, 2024), include five never-before-seen images of the Universe: free-floating newborn planets, newly identified extragalactic star clusters, new low-mass dwarf galaxies in a nearby galaxy cluster, the distribution of dark matter and intracluster light in galaxy clusters, and very distant bright galaxies from the first billion years of the Universe.

Most precise map of our Universe over time

The images obtained by Euclid are at least four times sharper than those that can be taken from ground-based telescopes. They cover large patches of sky at unrivalled depth, looking far into the distant Universe using both visible and infrared light.

The Euclid telescope is designed to provide the most precise map of our Universe over time and demonstrates Euclid’s ability to unravel the secrets of the cosmos.

Machine-learning algorithm tracks dementia-related protein clumping in real time

More than 55 million people around the world live with dementia-related neurodegenerative disorders such as Alzheimer’s and Parkinson’s.

Now chemists at the University of Copenhagen have developed a machine-learning algorithm that they say could lead to developing new drugs and therapies to combat these diseases, which are caused by proteins clumping together and destroying vital functions.

The algorithm can track clumping under a microscope in real-time, automatically mapping and tracking the important characteristics of the clumped-up proteins that cause neurodegenerative disorders.

The research has just been published in the journal Nature Communications.

Detecting and tracking microscopic proteins in real time

The algorithm can spot protein clumps down to a billionth of a meter in microscopy images in real time. Their exact shape can vary depending on the disorder they trigger.
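The authors’ SEMORE pipeline is published and open source; as a simplified illustration of the underlying detect-and-track idea (not their code), the sketch below segments bright connected regions in each microscopy frame, summarizes each clump by centroid and area, and links clumps between consecutive frames by nearest centroid. Thresholds and distances are illustrative:

```python
import numpy as np

def detect_clumps(frame, threshold):
    """Segment bright 4-connected regions; return (centroid, area) pairs."""
    mask = frame > threshold
    seen = np.zeros_like(mask, dtype=bool)
    clumps = []
    for r, c in zip(*np.nonzero(mask)):
        if seen[r, c]:
            continue
        seen[r, c] = True
        stack, pixels = [(r, c)], []
        while stack:  # flood-fill one connected component
            y, x = stack.pop()
            pixels.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        ys, xs = zip(*pixels)
        clumps.append(((sum(ys) / len(ys), sum(xs) / len(xs)), len(pixels)))
    return clumps

def link_clumps(prev, curr, max_dist=5.0):
    """Match clumps across consecutive frames by nearest centroid."""
    links = []
    for i, ((y1, x1), _) in enumerate(prev):
        dists = [np.hypot(y1 - y2, x1 - x2) for (y2, x2), _ in curr]
        if dists and min(dists) <= max_dist:
            links.append((i, int(np.argmin(dists))))
    return links
```

Tracking each clump’s area and shape over linked frames is what lets a tool like this report how aggregates grow or change when a compound is added.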

“Our new tool can let us see how these clumps are affected by whatever compounds we add. In this way, the model can help us work towards understanding how to potentially stop or transform them into less dangerous or more stable clumps,” explains Jacob Kæstel-Hansen from the Department of Chemistry, who co-led the research team behind the algorithm, in a press release.

“In the future, the algorithm will make it much easier to learn more about why clumps form so that we can develop new drugs and therapies to combat these disorders.”

New drugs

The researchers also see potential in using the tool to develop new drugs once the microscopic building blocks have been clearly identified. They hope that their work will kickstart the gathering of more comprehensive knowledge about the shapes and functions of proteins and molecules.

The team is also using the tool to conduct experiments with insulin molecules. As insulin molecules clump, their ability to regulate blood sugar weakens.

“As other researchers around the world begin to deploy the tool, it will help create a large library of molecule and protein structures related to various disorders and biology in general. This will allow us to better understand diseases and try to stop them,” concludes Nikos Hatzakis from the Department of Chemistry.

Written in Python, the algorithm is freely available on GitHub.

Citation: Bender, S.W.B., Dreisler, M.W., Zhang, M. et al. SEMORE: SEgmentation and MORphological fingErprinting by machine learning automates super-resolution data analysis. Nat Commun 15, 1763 (2024). https://doi.org/10.1038/s41467-024-46106-0 (open access)

Study links key nutrients with slower brain aging

A new study published in the journal npj Aging has found specific nutrients that may play a pivotal role in the healthy aging of the brain. Combining state-of-the-art innovations in neuroscience and nutritional science, scientists identified a specific nutrient profile in study participants who performed better cognitively.

The researchers, at the University of Nebraska–Lincoln’s Center for Brain, Biology and Behavior and the University of Illinois at Urbana-Champaign, enrolled 100 cognitively healthy participants, aged 65–75.

The participants completed a questionnaire with demographic information, body measurements and physical activity. Blood plasma was collected following a fasting period to analyze the nutrient biomarkers. Participants also underwent cognitive assessments and MRI scans.

The beneficial nutrient blood biomarkers in the study were a combination of fatty acids, antioxidants and carotenoids. This profile is correlated with nutrients found in the Mediterranean diet, previously associated with healthy brain aging.

Citation: Zwilling, C.E., Wu, J. & Barbey, A.K. Investigating nutrient biomarkers of healthy brain aging: a multimodal brain imaging study. npj Aging 10, 27 (2024). https://www.nature.com/articles/s41514-024-00150-8 (open access)

Algorithms to track space (and sea) objects

Cislunar space, which stretches from the Earth to just beyond the Moon’s orbit, is about to become heavily trafficked over the next 10 years. This traffic includes NASA’s planned Artemis missions and other countries joining in the cislunar space race.

So there’s a need for observing, tracking and predicting the orbit of objects like asteroids and satellites so they don’t collide with spacecraft (and each other). Space domain awareness (SDA), the process of detecting and observing space objects, faces challenges.

The current SDA infrastructure, which is mostly Earth-based, is not equipped to provide the needed coverage in cislunar space, according to Tarek Elgohary, an associate professor of aerospace engineering at the University of Central Florida and director of the university’s Astrodynamics and Space Robotics Laboratory.

Tracking space objects

Elgohary’s team plans to create a computational framework to rapidly and accurately track space objects. Using Oracle, a satellite developed by the U.S. Air Force Research Laboratory, the researchers will conduct experiments on space-object detection and tracking in cislunar space.

The algorithms will also allow Oracle and other spacecraft to operate autonomously without requiring intervention from Earth, according to the team.

The team will also develop a similar computational framework using algorithms to allow sea vessels to detect objects in real time and predict their future locations.
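The team’s frameworks aren’t described in technical detail, but both revolve around the same loop: track an object from noisy observations and predict its future location. As a generic textbook illustration (not Elgohary’s algorithms), here is a one-dimensional constant-velocity Kalman filter that fuses noisy position measurements and predicts the next position:

```python
import numpy as np

# Textbook 1-D constant-velocity Kalman filter: state = [position, velocity].
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state-transition model
H = np.array([[1.0, 0.0]])              # we observe position only
Q = 1e-4 * np.eye(2)                    # process-noise covariance
R = np.array([[0.25]])                  # measurement-noise covariance

def track_and_predict(measurements):
    """Filter noisy positions, then predict the position one step ahead."""
    x = np.array([[measurements[0]], [0.0]])  # initial state estimate
    P = np.eye(2)                             # initial uncertainty
    for z in measurements[1:]:
        x = F @ x                             # predict step
        P = F @ P @ F.T + Q
        y = np.array([[z]]) - H @ x           # measurement residual
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        x = x + K @ y                         # update step
        P = (np.eye(2) - K @ H) @ P
    return float((F @ x)[0, 0])               # one-step-ahead position

rng = np.random.default_rng(1)
truth = [2.0 * t for t in range(20)]          # object moving at 2 units/step
noisy = [p + rng.normal(scale=0.5) for p in truth]
predicted = track_and_predict(noisy)          # close to the true next position, 40
```

Real cislunar tracking involves nonlinear orbital dynamics and far richer filters, but the predict/update structure is the same.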

The work is supported by a $350,000 grant from the U.S. Air Force Office of Scientific Research Dynamic Data and Information Processing Program and a $150,000 grant from Lockheed Martin.

Do you have robot-phobia?

Some workers in the hospitality industry (such as hotels) apparently have “robot-phobia”—the fear that robots and technology will take human jobs.

Using more robots to close labor gaps in the hospitality industry may backfire and cause more human workers to quit, according to a Washington State University study.

Job insecurity and stress

The study, which involved more than 620 lodging and food service employees, found that human-like robot servers and automated robotic arms as well as self-service kiosks and tabletop devices increased workers’ job insecurity and stress.

That led to greater intentions to leave their jobs. The impact was more pronounced with employees who had real experience working with robotic technology.

Effects on hospitality workers

Published in the International Journal of Contemporary Hospitality Management, the study focuses on how the technology impacted hospitality workers. The researchers surveyed 321 lodging and 308 food service employees from across the U.S., asking a range of questions about their jobs and attitudes toward robots.

Having a higher robot-phobia was connected to greater feelings of job insecurity and stress, which were then correlated with “turnover intention” or workers’ plans to leave their jobs, they found.

The employees who viewed robots as being more capable and efficient also ranked higher in turnover intention.

Citation: Chen, C.-C. (B.) and Cai, R. (30 April 2024). “Are robots stealing our jobs? Examining robot-phobia as a job stressor in the hospitality workplace.” International Journal of Contemporary Hospitality Management. https://doi.org/10.1108/IJCHM-09-2023-1454 (open access)

Generating heat over 1,000 degrees Celsius with solar power to tackle climate change

Instead of burning fossil fuels to smelt steel and cook cement, what if we trapped solar energy directly from the Sun?

That’s what researchers at ETH Zurich in Switzerland are exploring. Their proof-of-concept study, published May 15 in the journal Device, uses synthetic quartz to trap solar energy at temperatures higher than 1,000°C (1,832°F). The research could lead to using clean energy for carbon-intensive industries, which currently account for about 25% of global energy consumption.

“To tackle climate change, we need to decarbonize energy in general,” says corresponding author Emiliano Casati of ETH Zurich, Switzerland, in a press release.

Researchers have previously explored a clean-energy alternative using solar receivers, which concentrate and build heat with thousands of sun-tracking mirrors. But that technology has difficulties transferring solar energy efficiently above 1,000°C.

Light from 136 suns

To boost the efficiency of solar receivers, Casati turned to semitransparent materials such as quartz, which can trap sunlight—a phenomenon called the “thermal-trap effect.”

The team crafted a thermal-trapping device by attaching a synthetic quartz rod to an opaque silicon disk as an energy absorber. When they exposed the device to an energy flux equivalent to the light coming from 136 suns, the absorber plate reached 1,050°C (1,922°F), while the other end of the quartz rod remained at 600°C (1,112°F). 

“Previous research has only managed to demonstrate the thermal-trap effect up to 170°C (338°F),” says Casati. “Our research showed that solar thermal trapping works not just at low temperatures, but well above 1,000°C.”
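A back-of-envelope check shows why the trap matters. Assuming roughly 1 kW/m² per “sun” (an assumption, not a figure from the paper), an ideal blackbody absorber under 136 suns re-radiates from its hot face and reaches equilibrium when absorbed and radiated flux balance, which works out to just under 1,000°C:

```python
# Equilibrium temperature of an ideal blackbody absorber:
# absorbed flux = radiated flux  =>  flux = sigma * T^4.
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
flux = 136 * 1000.0       # 136 "suns" at ~1 kW/m^2 each (assumed)

T_kelvin = (flux / SIGMA) ** 0.25
T_celsius = T_kelvin - 273.15
print(f"{T_celsius:.0f} °C")   # roughly 970 °C
```

That the absorber plate reached 1,050°C, above this naive radiating limit, reflects the quartz rod admitting sunlight while suppressing re-radiation.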

Casati and his colleagues are now optimizing the thermal-trapping effect and investigating new applications for the method. By exploring other materials, such as different fluids and gases, they were able to reach even higher temperatures, noting that these semitransparent materials’ ability to absorb light or radiation is not limited to solar radiation.

Citation: Casati et al. Solar thermal trapping at 1,000°C and above. Device. https://cell.com/device/fulltext/S2666-9986(24)00235-7 (open access)

How the brain turns waves of light into experiences of color

Columbia University neuroscientists have identified, for the first time, brain-cell circuitry in fruit flies that converts raw sensory signals into color perceptions (which can guide behavior), in a paper published in the journal Nature Neuroscience today.

(“Colors” are perceptions your brain constructs as it makes sense of the longer and shorter wavelengths of light detected by your eyes, the researchers explain.)

Networks of neurons in fruit flies

The research team reports the discovery of specific networks of neurons in fruit flies that respond selectively to various hues (perceived colors associated with specific combinations of wavelengths of light). These hue-selective neurons lie within the brain area responsible for vision.

(Some people could perceive a given wavelength as violet, while others perceive it as ultraviolet, which is not detectable by most humans. Detecting UV hues is important for the survival of some creatures, such as bees and perhaps fruit flies. Many plants, for example, possess ultraviolet patterns that can help guide insects to pollen.)

Fly-brain connectome

The recent availability of a fly-brain connectome has proven helpful here, say the researchers. This intricate map details how some 130,000 neurons and 50 million synapses in a fruit fly’s poppy-seed-sized brain are interconnected.

With the connectome serving as a reference, the researchers used their observations of brain cells to develop a diagram they suspected represents the neuronal circuitry behind hue selectivity. The scientists then portrayed these circuits as mathematical models to simulate and probe the circuits’ activities and capabilities. 

Brain circuitry involved in color perception identified

The modeling revealed that these circuits can host activity required for hue selectivity. It also pointed to a type of cell-to-cell interconnectivity, known as recurrence, without which hue-selectivity cannot happen. In a neural circuitry with recurrence, outputs of the circuit circle back in to become inputs.
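The paper’s circuit models aren’t reproduced here, but the role of recurrence can be illustrated with a textbook ring model (all numbers illustrative): units tuned to hues around a color circle excite near neighbors and inhibit distant ones, which sharpens a broad feedforward drive. Zeroing the recurrent weights, loosely analogous to the disruption experiment, leaves only the broad input:

```python
import numpy as np

n = 16
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)  # preferred hues
dtheta = theta[:, None] - theta[None, :]

# Center-surround recurrence: nearby hues excite, distant hues inhibit
W = (-0.5 + 1.5 * np.cos(dtheta)) / n

f = np.exp(np.cos(theta))  # broad feedforward drive, peaked at hue index 0

def settle(W, f, steps=300):
    """Iterate rectified-linear dynamics r <- max(0, f + W r) to a fixed point."""
    r = np.zeros_like(f)
    for _ in range(steps):
        r = np.maximum(0.0, f + W @ r)
    return r

r_recurrent = settle(W, f)
r_no_recurrence = settle(np.zeros_like(W), f)  # recurrence disrupted

def sharpness(r):
    return float(r.max() / r.mean())
```

With recurrence intact the peak is amplified and the flanks suppressed (sharper hue selectivity); with recurrence removed the response is just the broad input, echoing the loss of hue selectivity the researchers observed.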

And that suggested yet another experiment: “When we used a genetic technique to disrupt part of this recurrent connectivity in the brains of fruit flies, the neurons that previously showed hue-selective activity lost that property. This reinforced our confidence that we really had discovered brain circuitry involved in color perception.”

Citation: Christenson, M.P., Sanz Diez, A., Heath, S.L. et al. Hue selectivity from recurrent circuitry in Drosophila. Nat Neurosci (2024). https://doi.org/10.1038/s41593-024-01640-4 (open access)

How to get a complete understanding of the brain

It starts with zooming into the tiniest visible subcellular level of the brain: a cubic millimeter (about the size of a grain of rice) of human temporal cortex (located on the right and left sides of your brain, near your temples).

The Harvard and Google researchers could see 57,000 cells, 230 millimeters of blood vessels, and 150 million synapses—1,400 terabytes of data in vivid detail for the first time, they report in the journal Science.

A complete map of the mouse brain

Their ultimate goal, supported by the National Institutes of Health BRAIN Initiative: create a high-resolution map of a whole mouse brain’s neural wiring—about 1,000 times the amount of data.
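The scale of that goal is easy to state in code: the 1 mm³ human sample produced about 1,400 terabytes, and the whole-mouse-brain map would be roughly 1,000 times that:

```python
sample_tb = 1_400           # data from the 1 mm^3 human cortex sample
mouse_brain_factor = 1_000  # whole mouse brain is ~1,000x, per the researchers

total_tb = sample_tb * mouse_brain_factor
print(f"{total_tb:,} TB = {total_tb / 1_000_000:.1f} exabytes")
```

That is on the order of 1.4 exabytes for a single mouse brain at synaptic resolution.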

A Harvard team led by Jeff Lichtman, the Jeremy R. Knowles Professor of Molecular and Cellular Biology and newly appointed dean of science, has co-created with Google researchers the largest synaptic-resolution, 3D reconstruction of a piece of human brain to date, showing each cell.

Lichtman’s field is “connectomics,” which seeks to create comprehensive catalogues of brain structure, down to individual cells and wiring. Such completed maps would light the way toward new insights into brain function and disease, about which scientists still know very little.

AI-enhanced

Google’s state-of-the-art AI algorithms take it a step forward, allowing for reconstruction and mapping of brain tissue in three dimensions. The team has also developed a suite of publicly available tools that researchers can use to examine and annotate the connectome.

Next: the team will tackle the mouse hippocampal formation, which is important to neuroscience for its role in memory and neurological disease.

Citation: Alexander Shapson-Coe et al. (21 authors). A petavoxel fragment of human cerebral cortex reconstructed at nanoscale resolution. Science, Vol. 384, Issue 6696, 10 May 2024. https://www.science.org/doi/10.1126/science.adk4858

Would you trust a robot to look after your cat?

New research finds that it takes more than a carefully designed robot to care for your cat: the cats’ environment is also vital, as is human involvement.

“Cat Royale” is a collaboration between computer scientists from the University of Nottingham and artists at Blast Theory, who worked together to create a multispecies world centered around an enclosure in which three cats and a robot arm coexist for six hours a day, as part of an artist-led project.

Designing the “world,” not just the tech

The open-access research paper “Designing Multispecies Worlds for Robots, Cats, and Humans” suggests that designing the technology and its interactions is not sufficient. It’s equally important to consider the design of the “world” in which the technology operates and human involvement.

To do that, the researchers used a robot arm offering activities to make the cats happier, like dragging a “mouse” toy along the floor, raising a feather “bird” into the air, and even offering them treats to eat. The team then trained an AI to learn which games the cats liked best so that it could personalize their experiences.
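The paper doesn’t publish the learning system’s internals, but “learn which games each cat likes and serve those more often” is the classic multi-armed bandit setting. A minimal epsilon-greedy sketch (toy names, engagement scores, and probabilities are all hypothetical, not from Cat Royale):

```python
import random

class ToyBandit:
    """Epsilon-greedy bandit: offer toys, observe an engagement score in
    [0, 1], and gradually favor the cat's preferred game."""
    def __init__(self, toys, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = {t: 0 for t in toys}
        self.values = {t: 0.0 for t in toys}   # running mean engagement

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.counts))   # explore
        return max(self.values, key=self.values.get)  # exploit

    def update(self, toy, engagement):
        self.counts[toy] += 1
        # incremental update of the mean observed engagement
        self.values[toy] += (engagement - self.values[toy]) / self.counts[toy]

# Hypothetical per-toy engagement probabilities for one cat
random.seed(42)
prefs = {"mouse": 0.8, "feather": 0.5, "treat": 0.6}
bandit = ToyBandit(list(prefs))
for _ in range(2000):
    toy = bandit.choose()
    bandit.update(toy, 1.0 if random.random() < prefs[toy] else 0.0)

best = max(bandit.values, key=bandit.values.get)  # the learned favorite
```

A per-cat bandit like this converges on each individual’s favorite game while still occasionally trying the others, one simple way an AI could “personalize their experiences.”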

A designed world

The researchers found they had to design the robot to pick up toys and deploy them in ways that excited the cats while it learned which games each cat liked. They also designed the entire world in which the cats and the robot lived, providing safe spaces from which the cats could observe the robot (and sneak up on it), and decorating it so that the robot had the best chance of spotting the approaching cats.

The implication: designing robots involves interior design as well as engineering and AI. It includes the enclosure, the robot and its underlying systems, the various roles of the humans-in-the-loop, and, of course, the selection of the cats. 

Citation: Eike Schneiders et al. Designing Multispecies Worlds for Robots, Cats, and Humans. CHI ’24: Proceedings of the CHI Conference on Human Factors in Computing Systems, 11 May 2024. https://doi.org/10.1145/3613904.3642115 (open access)
