New Species of Bird-Like Dinosaur Identified in Canada

Albertavenator curriei, as the paleontologists call the new dinosaur species, belongs to Troodontidae, a family of bird-like theropod dinosaurs.

It lived about 71 million years ago (Cretaceous period) in what is now Alberta, Canada.

Its specific name, curriei, honors the renowned Canadian paleontologist Dr. Philip J. Currie.

The bones of Albertavenator curriei were found in the badlands surrounding the Royal Tyrrell Museum, which Dr. Currie played a key role in establishing in the early 1980s.

Scientists initially thought that the dinosaur’s bones belonged to its close relative, Troodon inequalis, which lived around 5 million years earlier.

Both bird-like creatures walked on two legs, were covered in feathers, and were about the size of a person.

New comparisons of bones forming the top of the head reveal that Albertavenator curriei had a distinctively shorter and more robust skull than Troodon.

“The delicate bones of these feathered dinosaurs are very rare,” said Dr. David Evans, Temerty Chair and Senior Curator of Vertebrate Paleontology at the Royal Ontario Museum and lead author of a new paper in the Canadian Journal of Earth Sciences describing the discovery.

“We were lucky to have a critical piece of the skull that allowed us to distinguish Albertavenator curriei as a new species.”

“We hope to find a more complete skeleton of Albertavenator curriei in the future, as this would tell us so much more about this fascinating animal.”

“It was only through our detailed anatomical and statistical comparisons of the skull bones that we were able to distinguish between Albertavenator curriei and Troodon,” added co-author Thomas Cullen, a Ph.D. student at the University of Toronto.

“This discovery really highlights the importance of finding and examining skeletal material from these rare dinosaurs,” said co-author Dr. Derek Larson, Assistant Curator of the Philip J. Currie Dinosaur Museum.

These baby fish exercise to change the shape of their faces

Humans aren’t the only animals who exercise. Baby Lake Malawi cichlids—a group of 10-centimeter-long striped fish native to East Africa—open and close their mouths up to 260 times per minute to develop a short jaw and a long retroarticular process, a critical bone for jaw opening, researchers report today in the Proceedings of the Royal Society B. Both of those features are an advantage for scraping algae from rocks. Some species of young cichlids “exercise” less, gaping only about 180 times per minute. They develop a long jaw and a short retroarticular process, which are also advantageous for feeding by sucking prey into their mouths. When researchers manipulated the baby fishes’ gaping behavior, they produced changes in bone shape that were similar to those driven by genes, suggesting that the fishes’ environment can influence development as much as their DNA does.

Engineers invent the first bio-compatible, ion current battery

In our bodies, flowing ions (sodium, potassium and other electrolytes) are the electrical signals that power the brain and control the rhythm of the heart, the movement of muscles, and much more.

In traditional batteries, the electrical energy, or current, flows in the form of moving electrons. This current of electrons out of the battery is generated by moving positive ions from one end (electrode) of the battery to the other inside it. The new UMD battery does the opposite: it moves electrons around inside the device and delivers its energy as a flow of ions. This is the first time an ionic current-generating battery has been invented.

“My intention is for ionic systems to interface with human systems,” said Liangbing Hu, the head of the group that developed the battery. Hu is a professor of materials science at the University of Maryland, College Park. He is also a member of the University of Maryland Energy Research Center and a principal investigator of the Nanostructures for Electrical Energy Storage Energy Frontier Research Center, sponsored by the Department of Energy, which funded the study.

“So I came up with the reverse design of a battery,” Hu said. “In a typical battery, electrons flow through wires to interface electronics, and ions flow through the battery separator. In our reverse design, a traditional battery is electronically shorted (that means electrons are flowing through the metal wires). Then ions have to flow through the outside ionic cables. In this case, the ions in the ionic cable — here, grass fibers — can interface with living systems.”

The work of Hu and his colleagues was published in the July 24 issue of Nature Communications.

“Potential applications might include the development of the next generation of devices to micro-manipulate neuronal activities and interactions that can prevent and/or treat such medical problems as Alzheimer’s disease and depression,” said group member Jianhua Zhang, PhD, a staff scientist at the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), part of the National Institutes of Health in Bethesda, Md.

“The battery could be used to develop medical devices for the disabled, or for more efficient drug and gene delivery tools in both research and clinical settings, as a way to more precisely treat cancers and other medical diseases,” said Zhang, who performed biological experiments to test that the new battery successfully transmitted current to living cells.

“Looking far ahead on the scientific horizon, one hopes also that this invention may help to establish the possibility of direct machine and human communication,” he said.

Bio-compatible, bio-material batteries

Because living cells work on ionic current and existing batteries provide an electronic current, scientists have previously tried to figure out how to create biocompatibility between these two by patching an electronic current into an ionic current. The problem with this approach is that electronic current needs to reach a certain voltage to jump the gap between electronic systems and ionic systems. However, in living systems ionic currents flow at a very low voltage. Thus, with an electronic-to-ionic patch the induced current would be too high to run, say, a brain or a muscle. This problem could be eliminated by using ionic current batteries, which could be run at any voltage.

The new UMD battery also has another unusual feature — it uses grass to store its energy. To make the battery, the team soaked blades of Kentucky bluegrass in lithium salt solution. The channels that once moved nutrients up and down the grass blade were ideal conduits to hold the solution.

The demonstration battery the research team created looks like two glass tubes, each with a blade of grass inside, connected by a thin metal wire at the top. The wire is where electrons flow as they move from one end of the battery to the other while the stored energy slowly discharges. At the other end of each glass tube is a metal tip through which the ionic current flows.

The researchers proved that the ionic current is flowing by touching the ends of the battery to either end of a lithium-soaked cotton string, with a dot of blue-dyed copper ions in the middle. Caught up in the ionic current, the copper moved along the string toward the negatively charged pole, just as the researchers predicted.

“The microchannels in the grass can hold the salt solution, making them a stable ionic conductor,” said Chengwei Wang, first author of the paper and a graduate student in the Materials Science and Engineering department at the University of Maryland in College Park.

However, the team plans to diversify the types of ionic-current batteries they can produce. “We are developing multiple ionic conductors with cellulose, hydrogels and polymers,” said Wang.

This is not the first time UMD scientists have tested natural materials in new uses. Hu and his team previously have been studying cellulose and plant materials for electronic batteries, creating a battery and a supercapacitor out of wood and a battery from a leaf. They also have created transparent wood as a potentially more energy-efficient replacement for glass windows.

Creative Work

Ping Liu, an associate professor in nanoengineering at the University of California, San Diego, who was not involved with the study, said: “The work is very creative and its main value is in delivering ionic flow to bio systems without posing other dangers to them. Eventually, the impact of the work really resides in whether smaller and more biocompatible junction materials can be found that then interface with cells and organisms more directly and efficiently.”

Moon has a water-rich interior

Scientists had assumed for years that the interior of the Moon had been largely depleted of water and other volatile compounds. That began to change in 2008, when a research team including Brown University geologist Alberto Saal detected trace amounts of water in some of the volcanic glass beads brought back to Earth from the Apollo 15 and 17 missions to the Moon. In 2011, further study of tiny crystalline formations within those beads revealed that they actually contain amounts of water similar to some basalts on Earth. That suggests that the Moon’s mantle — parts of it, at least — contains as much water as Earth’s.

“The key question is whether those Apollo samples represent the bulk conditions of the lunar interior or instead represent unusual or perhaps anomalous water-rich regions within an otherwise ‘dry’ mantle,” said Ralph Milliken, lead author of the new research and an associate professor in Brown’s Department of Earth, Environmental and Planetary Sciences. “By looking at the orbital data, we can examine the large pyroclastic deposits on the Moon that were never sampled by the Apollo or Luna missions. The fact that nearly all of them exhibit signatures of water suggests that the Apollo samples are not anomalous, so it may be that the bulk interior of the Moon is wet.”

The research, which Milliken co-authored with Shuai Li, a postdoctoral researcher at the University of Hawaii and a recent Brown Ph.D. graduate, is published in Nature Geoscience.

Detecting the water content of lunar volcanic deposits using orbital instruments is no easy task. Scientists use orbital spectrometers to measure the light that bounces off a planetary surface. By looking at which wavelengths of light are absorbed or reflected by the surface, scientists can get an idea of which minerals and other compounds are present.

The problem is that the lunar surface heats up over the course of a day, especially at the latitudes where these pyroclastic deposits are located. That means that in addition to the light reflected from the surface, the spectrometer also ends up measuring heat.

“That thermally emitted radiation happens at the same wavelengths that we need to use to look for water,” Milliken said. “So in order to say with any confidence that water is present, we first need to account for and remove the thermally emitted component.”

To do that, Li and Milliken used laboratory-based measurements of samples returned from the Apollo missions, combined with a detailed temperature profile of the areas of interest on the Moon’s surface. Using the new thermal correction, the researchers looked at data from the Moon Mineralogy Mapper, an imaging spectrometer that flew aboard India’s Chandrayaan-1 lunar orbiter.
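
As a rough illustration of that thermal-removal step (not the authors’ actual processing pipeline), the sketch below subtracts an estimated blackbody emission term from a measured radiance before treating the remainder as reflected light. The temperature, emissivity, wavelength and radiance values are placeholders, not numbers from the study.

```python
import numpy as np

# Toy version of removing a thermally emitted component from a measured radiance.
# All numerical inputs below are illustrative assumptions.

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = np.expm1(H * C / (wavelength_m * KB * temp_k))
    return a / b

wavelength = 2.9e-6        # ~2.9 micrometers, near a water absorption band
surface_temp = 350.0       # assumed daytime surface temperature (K)
emissivity = 0.95          # assumed emissivity

measured_radiance = 1.2e6  # hypothetical measured radiance (W / m^2 / sr / m)
thermal_component = emissivity * planck_radiance(wavelength, surface_temp)

# What remains after subtracting the thermal term is treated as reflected light.
reflected_radiance = measured_radiance - thermal_component
print(f"Thermal component:   {thermal_component:.3e}")
print(f"Reflected component: {reflected_radiance:.3e}")
```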

The researchers found evidence of water in nearly all of the large pyroclastic deposits that had been previously mapped across the Moon’s surface, including deposits near the Apollo 15 and 17 landing sites where the water-bearing glass bead samples were collected.

“The distribution of these water-rich deposits is the key thing,” Milliken said. “They’re spread across the surface, which tells us that the water found in the Apollo samples isn’t a one-off. Lunar pyroclastics seem to be universally water-rich, which suggests the same may be true of the mantle.”

The idea that the interior of the Moon is water-rich raises interesting questions about the Moon’s formation. Scientists think the Moon formed from debris left behind after an object about the size of Mars slammed into the Earth very early in solar system history. One of the reasons scientists had assumed the Moon’s interior should be dry is that it seems unlikely that any of the hydrogen needed to form water could have survived the heat of that impact.

“The growing evidence for water inside the Moon suggests that water did somehow survive, or that it was brought in shortly after the impact by asteroids or comets before the Moon had completely solidified,” Li said. “The exact origin of water in the lunar interior is still a big question.”

In addition to shedding light on the water story in the early solar system, the research could also have implications for future lunar exploration. The volcanic beads don’t contain a lot of water — about 0.05 percent by weight, the researchers say — but the deposits are large, and the water could potentially be extracted.
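
As a back-of-envelope illustration of what 0.05 percent by weight would mean for extraction, the short calculation below estimates how much deposit would have to be processed per kilogram of water; the bulk density of the pyroclastic material is an assumption, not a figure from the study.

```python
# Illustrative arithmetic based on the ~0.05 percent by weight figure quoted above.
# The deposit density is an assumed value, not one reported by the researchers.

water_fraction = 0.0005        # 0.05 percent by weight
deposit_density = 1800.0       # assumed bulk density of pyroclastic deposit, kg/m^3

mass_per_kg_water = 1.0 / water_fraction            # deposit mass per kg of water
volume_per_kg_water = mass_per_kg_water / deposit_density

print(f"Deposit processed per kg of water: {mass_per_kg_water:,.0f} kg")
print(f"Roughly {volume_per_kg_water:.1f} cubic meters of material")
```

Under those assumptions, each kilogram (roughly one liter) of water would require processing on the order of two tonnes of deposit, which is why the size and accessibility of the deposits matter as much as their water content.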

“Other studies have suggested the presence of water ice in shadowed regions at the lunar poles, but the pyroclastic deposits are at locations that may be easier to access,” Li said. “Anything that helps save future lunar explorers from having to bring lots of water from home is a big step forward, and our results suggest a new alternative.”

The research was funded by the NASA Lunar Advanced Science and Exploration Research Program (NNX12AO63G).

Water bears will survive the end of the world as we know it

These tough little buggers, also known as tardigrades, could keep calm and carry on until the sun boils Earth’s oceans away billions of years from now, according to a new study that examined water bears’ resistance to various astronomical disasters. This finding, published July 14 in Scientific Reports, suggests that complex life can be extremely difficult to destroy, which bodes well for anyone hoping Earthlings have cosmic company.

Most previous studies of apocalyptic astronomical events — like asteroid impacts, neighboring stars going supernova or insanely energetic explosions called gamma-ray bursts — focused on their threat to humankind. But researchers wanted to know what it would take to annihilate one of the world’s most resilient creatures, so they turned to tardigrades.

The tardigrade is basically the poster child for extremophiles. These hardy, microscopic critters are up for anything. Decades without food or water? No problem. Temperatures plummeting to –272° Celsius or skyrocketing to 150°? Bring it on. Even the crushing pressure of deep seas, the vacuum of outer space and exposure to extreme radiation don’t bother water bears.

Water bears are so sturdy that they probably won’t succumb to nuclear war, global warming or any astronomical events that wreak havoc on Earth’s atmosphere — all of which could doom humans, says Harvard University astrophysicist Avi Loeb. To exterminate tardigrades, something would have to boil the oceans away (no more water means no more water bears). So Loeb and colleagues calculated just how big an asteroid, how strong a supernova, or how powerful a gamma-ray burst would have to be to inject that much energy into Earth’s oceans.
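
A rough, back-of-envelope version of that kind of calculation is sketched below: how much energy it would take to heat Earth’s oceans to boiling, and how large an asteroid would have to be to deliver it on impact. The ocean mass, impact speed and asteroid density are standard textbook or assumed values, not numbers from the paper.

```python
import math

# Illustrative estimate only; inputs are textbook values or simple assumptions.
ocean_mass = 1.4e21        # kg, approximate mass of Earth's oceans
heat_capacity = 4186.0     # J/(kg K), liquid water
delta_t = 85.0             # K, warming from ~15 C to boiling
latent_heat = 2.26e6       # J/kg, vaporization

heating_energy = ocean_mass * heat_capacity * delta_t          # ~5e26 J
boiloff_energy = heating_energy + ocean_mass * latent_heat     # fully vaporized
print(f"Heat oceans to boiling: ~{heating_energy:.1e} J")
print(f"Boil them off entirely: ~{boiloff_energy:.1e} J")

# How big a rocky impactor would carry the heating energy as kinetic energy?
impact_speed = 2.0e4       # m/s, a typical impact speed (assumption)
asteroid_density = 2500.0  # kg/m^3, rocky asteroid (assumption)

asteroid_mass = 2.0 * heating_energy / impact_speed**2
volume = asteroid_mass / asteroid_density
diameter_km = 2.0 * (3.0 * volume / (4.0 * math.pi)) ** (1.0 / 3.0) / 1000.0
print(f"Required impactor mass: ~{asteroid_mass:.1e} kg")
print(f"Equivalent diameter:    ~{diameter_km:.0f} km")
```

Even this crude estimate lands on an impactor more than a hundred kilometers across, which helps explain why only a handful of very large asteroids qualify.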

“They actually ran the numbers on everyone’s favorite natural doomsday weapons,” marvels Seth Shostak, an astronomer at the SETI Institute in Mountain View, Calif.

Loeb’s team found that there are only 19 asteroids in the solar system sufficiently massive to eradicate water bears, and none are on a collision course with Earth. A supernova — the explosion of a massive star after it burns through its fuel — would have to happen within 0.13 light-years of Earth, and the closest star big enough to go supernova is nearly 147 light-years away. And gamma-ray bursts — thought to result from especially powerful supernovas or stellar collisions — are so rare that the researchers calculated that, over a billion years, there’s only about a 1 in 3 billion chance of one killing off tardigrades.

“Makes me wish I were an extremophile like a tardigrade,” says Edward Guinan, an astrophysicist at Villanova University in Pennsylvania who was not involved in the work.

But even tardigrades can’t cheat death forever. In the next seven billion years, the sun will swell into a red giant star, potentially engulfing Earth and surely sizzling away its water. But the fact that tardigrades are so resistant to other potential apocalypses in the interim implies that “life is tough, once it gets going,” Shostak says.

Bones make hormones that communicate with the brain and other organs

Long typecast as the strong silent type, bones are speaking up.

In addition to providing structural support, the skeleton is a versatile conversationalist. Bones make hormones that chat with other organs and tissues, including the brain, kidneys and pancreas, experiments in mice have shown.

“The bone, which was considered a dead organ, has really become a gland almost,” says Beate Lanske, a bone and mineral researcher at Harvard School of Dental Medicine. “There’s so much going on between bone and brain and all the other organs, it has become one of the most prominent tissues being studied at the moment.”

At least four bone hormones moonlight as couriers, recent studies show, and there could be more. Scientists have only just begun to decipher what this messaging means for health. But cataloging and investigating the hormones should offer a more nuanced understanding of how the body regulates sugar, energy and fat, among other things.

Of the hormones on the list of bones’ messengers — osteocalcin, sclerostin, fibroblast growth factor 23 and lipocalin 2 — the last is the latest to attract attention. Lipocalin 2, which bones unleash to stem bacterial infections, also works in the brain to control appetite, physiologist Stavroula Kousteni of Columbia University Medical Center and colleagues reported in the March 16 Nature.

Bone-brain connection

After mice eat, their bone-forming cells absorb nutrients and release a hormone called lipocalin 2 (LCN2) into the blood. LCN2 travels to the brain, where it gloms on to appetite-regulating nerve cells, which tell the brain to stop eating, a recent study suggests.

Researchers previously thought that fat cells were mostly responsible for making lipocalin 2, or LCN2. But in mice, bones produce up to 10 times as much of the hormone as fat cells do, Kousteni and colleagues showed. And after a meal, mice’s bones pumped out enough LCN2 to boost blood levels three times as high as premeal levels. “It’s a new role for bone as an endocrine organ,” Kousteni says.

Clifford Rosen, a bone endocrinologist at the Center for Molecular Medicine in Scarborough, Maine, is excited by this new bone-brain connection. “It makes sense physiologically that there are bidirectional interactions” between bone and other tissues, Rosen says. “You have to have things to regulate the fuel sources that are necessary for bone formation.”

Bones constantly reinvent themselves through energy-intensive remodeling. Cells known as osteoblasts make new bone; other cells, osteoclasts, destroy old bone. With such turnover, “the skeleton must have some fine-tuning mechanism that allows the whole body to be in sync with what’s happening at the skeletal level,” Rosen says. Osteoblasts and osteoclasts send hormones to do their bidding.

Scientists began homing in on bones’ molecular messengers a decade ago (SN: 8/11/07, p. 83). Geneticist Gerard Karsenty of Columbia University Medical Center found that osteocalcin — made by osteoblasts — helps regulate blood sugar. Osteocalcin circulates through the blood, collecting calcium and other minerals that bones need. When the hormone reaches the pancreas, it signals insulin-making cells to ramp up production, mouse experiments showed. Osteocalcin also signals fat cells to release a hormone that increases the body’s sensitivity to insulin, the body’s blood sugar moderator, Karsenty and colleagues reported in Cell in 2007. If it works the same way in people, Karsenty says, osteocalcin could be developed as a potential diabetes or obesity treatment.

Wi-Fi could protect you from getting lost in virtual reality

You’re at home playing a virtual reality (VR) game on the Oculus Rift, dodging zombies like a pro. But then you step too far back or look behind you, and suddenly you’re frozen in space, as the system’s infrared cameras can no longer see the lights on your goggles and it loses track of you. Instant brain food. Now, researchers have come up with a way to spare you such a frustrating end by using standard Wi-Fi technology to enhance VR’s tracking abilities. In addition to improving VR, the technology could also help track robots or drones and streamline motion capture for movies.

VR enables a user to move through a virtual 3D world projected through the video screens in the system’s headset. To track the user’s movement, the Rift uses one or more infrared cameras in a room, often on tripods. The headset has accelerometers to measure tilt, and it has infrared lights that the cameras use to track movement forward, back, or sideways. Another VR system, the HTC Vive, tracks movement by projecting infrared light from devices in the corners of the room that are detected by sensors on the headset. A related technology, called augmented reality (AR), maps virtual features onto the wearer’s view of the real world. So a user’s living room may be inhabited by virtual monsters. Microsoft’s HoloLens AR system uses several outward-facing cameras on the headset to track the user’s movement in relation to the environment.

Such systems have their limitations, however. In order for VR games to work without glitches, users often need to stay within a few square meters, and the infrared sightlines can’t be blocked by furniture or other people or by turning away. Microsoft’s AR system doesn’t work in all lighting conditions, it can be confused by blank walls or windows, and it can’t track your hands if they move out of view.

A team of researchers from Stanford University in Palo Alto, California, wanted a simpler, cheaper, more robust system. So they turned to the common radio technology Wi-Fi. Wi-Fi has been used to localize people and objects in space before, but only with an accuracy of tens of centimeters, says Manikanta Kotaru, a computer scientist at Stanford, and he and his colleagues thought they could do better.

Their solution, which they call WiCapture, requires two parts: a standard Wi-Fi chip, such as the one you might find in your phone, and at least two Wi-Fi “access points,” which are transmitters such as the ones found in home routers. Communication between the chip and a transmitter comes in high-frequency radio waves. In order to track a Wi-Fi signal source with millimeter-level accuracy, one needs to measure the time it takes a signal to travel from the chip to the transmitter with picosecond-level accuracy. However, the chip and transmitter have different clocks, and no two clocks in Wi-Fi devices are perfectly synchronized.
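
A one-line check shows why the timing requirement is so punishing; the only input is the speed of light.

```python
# Millimeter-level ranging implies picosecond-level timing.
speed_of_light = 3.0e8                  # m/s
time_per_mm = 1.0e-3 / speed_of_light   # seconds of travel time per millimeter
print(f"1 mm of path ~= {time_per_mm * 1e12:.1f} picoseconds")
```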

To get around this problem, the researchers took advantage of the fact that signals reach the transmitter through many paths. Some radio waves travel directly to the receiver to create the main signal, whereas others bounce off walls to create echoes. Kotaru wrote an algorithm that looks at signals from two different paths, identified by triangulating among the transmitter’s multiple antennas. Those signals will be equally affected by clock asynchrony, so the algorithm can just compare their relative change as the chip moves and ignore the drift of the clocks’ timing. Still, this method measures distance to only one transmitter; using two or more transmitters in combination allows the algorithm to use triangulation to track motion in two dimensions. (The researchers will eventually expand WiCapture to track motion in three dimensions.)
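
The toy sketch below is not the published WiCapture algorithm, but it illustrates the final triangulation idea: once the change in path length to each transmitter is known, a small 2D displacement can be recovered from two fixed transmitters by solving a tiny linear system. The transmitter positions and the simulated motion are made-up values.

```python
import numpy as np

# Toy illustration: recovering a small 2D step of a Wi-Fi chip from changes in
# path length to two fixed transmitters. All positions and motions are made up.

tx = np.array([[0.0, 0.0],     # transmitter 1 position (m)
               [5.0, 0.0]])    # transmitter 2 position (m)
chip = np.array([2.0, 3.0])    # current chip position (m)
true_step = np.array([0.004, -0.002])   # true motion: 4 mm right, 2 mm down

# Unit vectors from each transmitter toward the chip. For a small step dp,
# the change in distance to transmitter i is approximately u_i . dp.
u = (chip - tx) / np.linalg.norm(chip - tx, axis=1, keepdims=True)
measured_dd = u @ true_step     # simulated path-length changes (m)

# Solve the 2x2 linear system u @ dp = dd for the displacement dp.
estimated_step, *_ = np.linalg.lstsq(u, measured_dd, rcond=None)
print("estimated step (mm):", estimated_step * 1000)
```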

To test the idea, scientists placed the Wi-Fi chip on a mechanical device that could move it with high accuracy in an office 5 meters by 6 meters with four Wi-Fi transmitters in the corners. As they moved the chip around in various patterns, WiCapture tracked its position to within a centimeter. Next, the researchers tried an office in which all the Wi-Fi transmitters were occluded by furniture or walls. As long as two were in the same room as the chip, WiCapture’s median error was still only 1.5 centimeters. Outside, the median error was again less than a centimeter, the team will report this month at the Conference on Computer Vision and Pattern Recognition in Honolulu.

“It was really nice to bridge work in the wireless community with work in the virtual reality community,” says Dina Katabi, a computer scientist at the Massachusetts Institute of Technology in Cambridge who was not involved in the experiment. Yuval Boger, a physicist and the CEO of Sensics, a VR hardware and software company in Columbia, Maryland, says, “the need is real” for a robust high-resolution position tracker. He notes that 1 centimeter is not a high enough accuracy for head tracking, but would work for hand tracking. In a fighting game, “I’m not sure I’m going to do any small delicate movements with a sword.”

The authors acknowledge that WiCapture still has a slower reaction time and lower accuracy than infrared cameras, but they think they can improve both by combining it with an accelerometer to add another source of data and fill in the gaps. In any case, Kotaru says, the technology is basically ready to use.

Astronomers Find Giant Planet That’s Hotter Than Most Stars

Astronomers have discovered the hottest planet ever known, with a dayside temperature of more than 4,300 degrees Celsius. In fact, this planet, called KELT-9b, is hotter than most stars, according to a study published in the journal Nature.

“This is the hottest gas giant planet that has ever been discovered,” said Scott Gaudi, a professor at the Ohio State University in Columbus, who led the study.

KELT-9b is 2.8 times more massive than Jupiter, but only half as dense.
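
A quick back-of-envelope check shows what those two numbers imply for the planet’s size, assuming a simple spherical planet; this is an estimate from the quoted ratios, not a figure reported in the study.

```python
# What "2.8x Jupiter's mass but half its density" implies for size,
# assuming a spherical planet.

mass_ratio = 2.8       # KELT-9b mass relative to Jupiter
density_ratio = 0.5    # KELT-9b density relative to Jupiter

volume_ratio = mass_ratio / density_ratio     # V = m / rho
radius_ratio = volume_ratio ** (1.0 / 3.0)    # R scales as V^(1/3)
print(f"Volume: ~{volume_ratio:.1f}x Jupiter, radius: ~{radius_ratio:.2f}x Jupiter")
```

Under that simple assumption the planet comes out noticeably puffed up, with a radius well over one and a half times Jupiter’s.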

It is nowhere close to habitable, but Gaudi said there is a good reason to study worlds that are unlivable in the extreme.

“As has been highlighted by the recent discoveries from the MEarth collaboration, the planet around Proxima Centauri, and the astonishing system discovered around TRAPPIST-1, the astronomical community is clearly focused on finding Earthlike planets around small, cooler stars like our sun,” Gaudi said.

“They are easy targets and there’s a lot that can be learned about potentially habitable planets orbiting very low-mass stars in general. On the other hand, because KELT-9b’s host star is bigger and hotter than the Sun, it complements those efforts and provides a kind of touchstone for understanding how planetary systems form around hot, massive stars,” he explained.

Because the planet is tidally locked to its star – as the moon is to Earth – one side of the planet is always facing toward the star, and one side is in perpetual darkness.

Molecules such as water, carbon dioxide and methane cannot form on the dayside because it is bombarded by too much ultraviolet radiation.

The properties of the nightside are still mysterious – molecules may be able to form there, but probably only temporarily.

“It’s a planet by any of the typical definitions of mass, but its atmosphere is almost certainly unlike any other planet we’ve ever seen just because of the temperature of its dayside,” said Gaudi, who worked on this study while on sabbatical at NASA’s Jet Propulsion Laboratory, Pasadena, California.

Its star, called KELT-9, is even hotter – in fact, it is probably unravelling the planet through evaporation. It is only 300 million years old, which is young in star time.

It is more than twice as large, and nearly twice as hot, as our sun.

Given that the planet’s atmosphere is constantly blasted with high levels of ultraviolet radiation, the planet may even be shedding a tail of evaporated planetary material like a comet.

“KELT-9 radiates so much ultraviolet radiation that it may completely evaporate the planet,” said Keivan Stassun, Professor at Vanderbilt University, Nashville, Tennessee.

The KELT-9b planet was found using the Kilodegree Extremely Little Telescope, or KELT.

MXene Could Help Make Batteries That Charge as Fast as Supercapacitors: Study

A new battery electrode design based on a highly conductive, two-dimensional material called MXene could pave the way for fully charging your smartphone in just a few seconds, a new study says.

The design, described in the journal Nature Energy, could make energy storage devices like batteries, viewed as the plodding tanker truck of energy storage technology, just as fast as the speedy supercapacitors that are used to provide energy in a pinch – often as a battery back-up or to provide quick bursts of energy for things like camera flashes.

“This paper refutes the widely accepted dogma that chemical charge storage, used in batteries and pseudocapacitors, is always much slower than physical storage used in electrical double-layer capacitors, also known as supercapacitors,” said lead researcher Yury Gogotsi, Professor at Drexel University in Philadelphia, Pennsylvania, US.

“We demonstrate charging of thin MXene electrodes in tens of milliseconds. This is enabled by very high electronic conductivity of MXene. This paves the way to development of ultrafast energy storage devices that can be charged and discharged within seconds, but store much more energy than conventional supercapacitors,” Gogotsi added.

The key to faster charging energy storage devices is in the electrode design.

Electrodes are essential components of batteries, through which energy is stored during charging and from which it is released to power our devices.

So the ideal design for these components would be one that allows them to be quickly charged and store more energy.

The overarching benefit of using MXene as the material for the electrode design is its conductivity.

“If we start using low-dimensional and electronically conducting materials as battery electrodes, we can make batteries working much, much faster than today,” Gogotsi said.

“Eventually, appreciation of this fact will lead us to car, laptop and cell-phone batteries capable of charging at much higher rates – seconds or minutes rather than hours,” Gogotsi added.
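
A highly simplified way to see the link Gogotsi describes between electrode conductivity and charging speed is a plain RC model: for a fixed capacitance, the charging time constant shrinks as the cell’s internal resistance drops. This is an illustrative toy picture, not the electrochemical analysis from the study, and the resistance and capacitance values are assumptions.

```python
# Toy RC model: lower internal resistance (higher conductivity) means faster charging.
# Resistance and capacitance values are illustrative assumptions only.

capacitance = 1.0                      # farads, assumed effective capacitance

for resistance in (10.0, 1.0, 0.1):    # ohms, assumed internal resistance
    tau = resistance * capacitance     # charging time constant, seconds
    t_95 = 3.0 * tau                   # ~95% charged after ~3 time constants
    print(f"R = {resistance:>4} ohm -> ~95% charged in {t_95:.1f} s")
```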

NASA developing first asteroid deflection mission

NASA is developing the first-ever mission that will deflect a near-Earth asteroid, and help test the systems that will allow mankind to protect the planet from potential cosmic body impacts in the future.

The Double Asteroid Redirection Test (DART) — which is being designed and would be built and managed by the Johns Hopkins Applied Physics Laboratory — is moving from concept development to preliminary design phase, the US space agency said.

“DART would be NASA’s first mission to demonstrate what’s known as the kinetic impactor technique — striking the asteroid to shift its orbit — to defend against a potential future asteroid impact,” said Lindley Johnson, planetary defense officer at NASA Headquarters in Washington.

“This approval step advances the project towards a historic test with a nonthreatening small asteroid,” said Johnson.

“DART is a critical step in demonstrating we can protect our planet from a future asteroid impact,” said Andy Cheng, who serves as the DART investigation co-lead.

“Since we don’t know that much about their internal structure or composition, we need to perform this experiment on a real asteroid,” Andy said.

Protecting our planet

“With DART, we can show how to protect Earth from an asteroid strike with a kinetic impactor by knocking the hazardous object into a different flight path that would not threaten the planet,” he said.

The target for DART is an asteroid that will have a distant approach to Earth in October 2022, and then again in 2024.

The asteroid is called Didymos — Greek for “twin” — because it is an asteroid binary system that consists of two bodies: Didymos A, about 780 metres in size, and a smaller asteroid orbiting it called Didymos B, about 160 metres in size.

DART would impact only the smaller of the two bodies, Didymos B.

The Didymos system has been closely studied since 2003.

The primary body is a rocky S-type object, with composition similar to that of many asteroids. The composition of its small companion, Didymos B, is unknown, but the size is typical of asteroids that could potentially create regional effects should they impact Earth.

After launch, DART would fly to Didymos and use an APL-developed onboard autonomous targeting system to aim itself at Didymos B.

Then the refrigerator-sized spacecraft would strike the smaller body at a speed about nine times faster than a bullet, about six kilometres per second.

Kinetic impact

Earth-based observatories would be able to see the impact and the resulting change in the orbit of Didymos B around Didymos A, allowing scientists to better determine the capabilities of kinetic impact as an asteroid mitigation strategy.

The kinetic impact technique works by changing the speed of a threatening asteroid by only a small fraction of its total velocity, well before the predicted impact, so that this small nudge adds up over time to a large shift of the asteroid’s path away from Earth.
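
Back-of-envelope arithmetic makes the scale of that nudge concrete. The spacecraft mass, asteroid density and lead time below are illustrative assumptions rather than actual mission parameters, and the simple linear drift ignores the orbital mechanics that, in practice, amplify the shift over time.

```python
import math

# Illustrative kinetic-impactor arithmetic; inputs are assumptions, not mission values.
spacecraft_mass = 500.0      # kg, assumed impactor mass
impact_speed = 6000.0        # m/s, ~6 km/s as stated above

target_diameter = 160.0      # m, Didymos B as stated above
asteroid_density = 2000.0    # kg/m^3, assumed rubble-pile density
target_mass = asteroid_density * (4.0 / 3.0) * math.pi * (target_diameter / 2.0) ** 3

# Momentum conservation, perfectly inelastic, ignoring any ejecta enhancement.
delta_v = spacecraft_mass * impact_speed / target_mass
print(f"Target mass:     ~{target_mass:.1e} kg")
print(f"Velocity change: ~{delta_v * 1000:.2f} mm/s")

# Even a sub-millimeter-per-second nudge accumulates over years of lead time.
lead_time = 10.0 * 3.15e7    # seconds in roughly ten years
drift_km = delta_v * lead_time / 1000.0
print(f"Straight-line drift after ~10 years: ~{drift_km:.0f} km")
```

Under these assumptions the impact changes the target’s speed by well under a millimeter per second, yet the accumulated displacement after a decade is on the order of a hundred kilometers, which is the essence of the “small nudge, long lead time” strategy.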