During the late 2021 hurricane season, beneath the swell and surge of the Atlantic, there roamed a fleet of unmanned, autonomous, gliding sea robots taking samples from the water. Their mission? Understand the impact hurricanes have on the fragile ocean ecosystem as climate change increases the intensity of such extreme weather events. The sophisticated submersibles, developed by Rutgers University’s Center for Ocean Observing Leadership with funding from the U.S. National Oceanic and Atmospheric Administration, are designed to “accelerate improvements in hurricane intensity forecasting.” How? They suck in a little bit of seawater and measure qualities like pH, salinity and temperature, then send that data directly to researchers. Next up for the team: adding sensors to the drones to gauge the turbulence and “ocean mixing” that occurs during storms.
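The measure-then-transmit loop the gliders perform can be pictured as a simple telemetry record. This is a hypothetical sketch; the field names, units and JSON format are assumptions for illustration, not the Rutgers team's actual data schema.

```python
from dataclasses import dataclass, asdict
import json

# Illustrative telemetry record for one seawater sample.
# Field names and units are assumed, not the actual glider schema.
@dataclass
class WaterSample:
    ph: float            # acidity/alkalinity of the sample
    salinity_psu: float  # salinity in practical salinity units
    temp_c: float        # water temperature, degrees Celsius
    depth_m: float       # sampling depth, meters

def to_payload(sample: WaterSample) -> str:
    """Serialize a sample for transmission to shore-side researchers."""
    return json.dumps(asdict(sample), sort_keys=True)

payload = to_payload(WaterSample(ph=8.1, salinity_psu=35.2, temp_c=28.4, depth_m=50.0))
print(payload)
```

Each record would be queued aboard the glider and relayed when it surfaces, which is why a compact, self-describing payload like this is a natural fit.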
The magic behind a typical 3D printer works like this: It emits material from a single point, creating a fine stream that must be intricately layered until it forms an object. Researchers at Queensland University of Technology’s Centre for Materials Science had a different idea: generate whole layers of an object all at once by harnessing the power of intersecting beams of light to cause chemical reactions in molecules that coalesce into a solid material. One beam of light activates one molecule, while the other activates a separate one—meaning that only when the beams are joined can the printing begin, giving researchers more precise control over the resulting object’s creation. The potential benefits of the technology could affect 3D printing on a large scale, likely contributing to the industry’s projected growth into a US$94 billion market by 2030.
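The key idea is a logical AND: material solidifies only where both beams overlap, because each beam activates a different molecule. Here is a toy model of that intersection rule on a 2D grid; the grid, beam paths and set representation are invented for illustration, not the researchers' actual chemistry or optics.

```python
# Toy model of the dual-beam printing idea: a point in the resin solidifies
# only where BOTH beams are present, since each beam activates a different
# molecule. Grid coordinates and beam shapes are illustrative assumptions.
def solidified(beam_a: set, beam_b: set) -> set:
    """Return the points lit by both beams -- the only places printing occurs."""
    return beam_a & beam_b

# A horizontal sweep crossing a vertical sweep prints only at the crossing point.
sheet_a = {(x, 2) for x in range(5)}   # beam A illuminates row y=2
sheet_b = {(3, y) for y in range(5)}   # beam B illuminates column x=3
print(solidified(sheet_a, sheet_b))
```

Because either beam alone does nothing, sweeping the two beams independently lets the system address precise regions of a whole layer at once rather than tracing a single point.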
After years of top-secret development, Amazon finally unveiled its first home robot, Astro, in September 2021. Dubbed an “Alexa on wheels,” the diminutive robot can blast tunes to start a dance party, deliver the news or play a podcast just like the super-popular voice assistant. But Astro’s ability to move autonomously through a home means it can also perform tasks like checking on sleeping kids or delivering a drink. Skeptics abound: Do you really need to offload such tasks to a bot that can’t even climb stairs? But the U.S. tech giant has a larger vision. With its built-in periscoping camera, for instance, Astro can be the ultimate watchdog when people are away from home.
The classic vision of a robot—that is, metal exterior, mechanical voice, and lots of beeps and boops—is far too narrow for today’s needs. And it’s not just about the aesthetics. In July, a team at the University of Glasgow’s Bendable Electronics and Sensing Technologies group announced the development of an electronically enhanced “skin” that could transform what robotic devices look like—and what they can do. The e-skin is equipped with sensors that allow robots to detect light not visible to the human eye, including infrared and ultraviolet. Made of a bendy plastic, the material can accommodate dozens of semiconductor devices, which are in turn made of gallium arsenide, a semiconductor compound often used in electronic devices. One possible use case: If the e-skin were applied to a mechanical arm on a factory floor, its newfound ability to sense shifts in light—say, due to a malfunctioning laser—would allow it to identify a problem and head off manufacturing delays or safety risks.
It was a project with high stakes and unusual parameters: Figure out a way to destroy an assemblage of 70,000 decommissioned midcentury chemical weapons—and do so both safely and efficiently. The munitions in question, 6-foot-long (1.8-meter-long) M55 rockets loaded with nerve agents stored at the Blue Grass Army Depot in Kentucky, USA, had shown signs of leakage and warping during prior attempts at destruction. So a consortium of problem solvers assembled, including the U.S. Pentagon’s Program Executive Office, Assembled Chemical Weapons Alternatives; engineering giant Bechtel; robotics firm CRG Automation; explosives containment manufacturing firm Dynasafe; and Crown Packaging Corp. With a plan in place, CRG Automation designed several elements of the weapon-destroying process, including a robotic vertical cutting system that separated rockets from warheads, and various robots tasked with transporting weapons across the manufacturing floor or crimping them shut once they’d been neutralized. The destruction project is expected to be completed in 2023.
Ever watch a maple seed pod fall from a tree? Spinning like a helicopter propeller, it stays aloft for far longer than if it fell directly to the ground. The kind of elegant engineering found in nature was the inspiration behind a City University of Hong Kong project to design a more efficient and versatile two-blade drone—one that can fly for almost double the time of a traditional four-rotor version, whose quartet of power-supplying motors sucks up battery power and thus limits flight duration. The reimagined drone, which took its first flight in May, weighs in at about 35 grams (1.2 ounces) and spins at around 200 revolutions per minute. With its diminutive battery humming, it can hover for an impressive 24 minutes. The drone’s design also makes it capable of recording video from every direction simultaneously—and its developers envision it as a powerful tool for environmental research and urban planning.
When a child undergoes a medical or dental procedure, their emotions can run high. To reduce the trauma and anxiety, Armenian startup Expper Technologies developed Robin, a comforting, child psychology-powered robot. But it does more than put children at ease. The team equipped Robin with emotion-reading software that helps it deliver just the right customized response to each child—all with wide eyes and a gentle voice. The AI allows the robot to store memories about those conversations and even explain medical procedures. And healthcare professionals can take command of Robin’s interactions when necessary, engaging with children through a friendly intermediary. Robin has been adopted at a handful of U.S. hospitals, and a January US$2 million funding round could fuel widespread rollout.
Robots have been lending a surgical hand with abdominal hysterectomies since 2005, yet limited range of motion means patient recovery is still about three to four weeks, and the procedure leaves several scars. The next-gen remedy? The Anovo Surgical System, a robot with arms capable of articulating 360 degrees at its shoulder, elbow and wrist joints, with dexterity like that of a human. Developed by Israeli surgical tech company Momentis Surgical, Anovo makes transvaginal hysterectomy significantly more precise; the minimally invasive procedures it performs require roughly half the recovery time of traditional hysterectomies, leave no scar and result in less blood loss. In June, the company announced that two U.S. hospitals had successfully performed the first procedures using the Anovo.
Pesticides account for nearly one-third of the production costs for Florida citrus crops, one study found. Looking to lower those numbers, the University of Florida’s Institute of Food and Agricultural Sciences came up with a way to apply pesticides in a more efficient and cost-effective way: an AI-powered smart-spraying robot that can instantly detect the leaf density, height and fruit yield of a given tree. Armed with that knowledge, it can then mete out the appropriate amount of pesticide. Using machine vision, the robot identifies “target foliage” and applies pesticides with far greater precision than traditional sprayers. Unveiled in January, the tech is estimated to reduce a farmer’s chemical spray use by roughly 30 percent. And it could scale soon, with at least one industry partner seeking to bring the smart spray to the commercial market. Beyond that, the team is working to develop similar AI tech for spreading fertilizer.
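The sense-then-dose logic described above can be sketched as a simple rule: skip trees where machine vision finds no target foliage, and scale the dose with canopy size otherwise. The function below is illustrative only; the base dose, the canopy thresholds and the scaling rule are assumptions, not the University of Florida team's actual model.

```python
# Illustrative per-tree dosing rule for a smart sprayer.
# Base dose, thresholds and scaling are assumed values, not the real model.
def spray_dose_ml(leaf_density: float, height_m: float,
                  base_dose_ml: float = 100.0) -> float:
    """Return a per-tree pesticide dose scaled by sensed canopy density (0-1)
    and tree height, capped at the full-canopy reference of 3 meters."""
    if leaf_density <= 0:  # machine vision found no target foliage: skip the tree
        return 0.0
    canopy_factor = min(leaf_density, 1.0) * min(height_m / 3.0, 1.0)
    return round(base_dose_ml * canopy_factor, 1)

print(spray_dose_ml(0.0, 2.5))  # bare spot in the row: no spray at all
print(spray_dose_ml(0.8, 3.0))  # dense, full-height tree: near-full dose
```

The savings come from the zero-dose and reduced-dose cases: a conventional sprayer would apply the full amount everywhere, while a rule like this only sprays what the sensed canopy warrants.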
The process of finding someone’s signature scent just got a lot easier—provided the customer is willing to let a perfumer delve into their brainwaves. In March, the in-house technology incubator at French beauty giant L’Oréal Groupe announced a partnership with U.S. neurotechnology company Emotiv to develop a headset equipped with sophisticated electroencephalogram sensors that can measure a customer’s neurological responses to scent. As the readings are taken from the sensors, machine learning interprets the data—gauging everything from stress spikes to stimulation levels. The tech made its debut at the Dubai Mall late last year. But starting late this year, fashionistas will be able to visit select Yves Saint Laurent Beauty counters around the world, don the headset in private 25-minute consultations and let their brains’ responses to particular scents guide the recommendation of three YSL products. What’s that intoxicating aroma? Smells like the future to us.
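The final matching step, ranking scents by a wearer's neurological response and surfacing the top three, can be pictured with a minimal scoring sketch. The feature names, the "stimulation minus stress" rule and the sample readings are invented for illustration; Emotiv's actual models and L'Oréal's recommendation logic are proprietary.

```python
# Hypothetical sketch of the scent-matching step: score each scent by the
# wearer's EEG-derived response and recommend the top three. Feature names
# and the scoring rule are assumptions, not the actual proprietary model.
def recommend(responses: dict, top_n: int = 3) -> list:
    """Rank scents by stimulation minus stress, highest first."""
    def score(name: str) -> float:
        r = responses[name]
        return r["stimulation"] - r["stress"]
    return sorted(responses, key=score, reverse=True)[:top_n]

readings = {
    "amber":   {"stimulation": 0.9, "stress": 0.2},
    "citrus":  {"stimulation": 0.6, "stress": 0.1},
    "oud":     {"stimulation": 0.4, "stress": 0.5},
    "vetiver": {"stimulation": 0.7, "stress": 0.6},
}
print(recommend(readings))
```

In the real consultation the inputs would be machine-learned features extracted from the EEG sensors rather than hand-set numbers, but the shape of the output is the same: a short ranked list of products to try.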