Monday 31 August 2015

MIT: "Largest of the Five Mass Extinctions Caused By Microbes"


Evidence left at the crime scene is abundant and global: Fossil remains show that sometime around 252 million years ago, about 90 percent of all species on Earth were suddenly wiped out — by far the largest of this planet’s five known mass extinctions. But pinpointing the culprit has been difficult, and controversial.


This past March, a team of MIT researchers may have found enough evidence to convict the guilty parties — but you’ll need a microscope to see the killers.

The researchers’ case builds upon three independent sets of evidence. First, geochemical evidence shows an exponential (or even faster) increase of carbon dioxide in the oceans at the time of the so-called end-Permian extinction. Second, genetic evidence shows a change in Methanosarcina at that time, allowing it to become a major producer of methane from an accumulation of organic carbon in the water. Finally, sediments show a sudden increase in the amount of nickel deposited at exactly this time.

The perpetrators, this new work suggests, were not asteroids, volcanoes, or raging coal fires, all of which have been implicated previously. Rather, they were a form of microbes — specifically, methane-producing archaea called Methanosarcina — that suddenly bloomed explosively in the oceans, spewing prodigious amounts of methane into the atmosphere and dramatically changing the climate and the chemistry of the oceans.

Volcanoes are not entirely off the hook, according to this new scenario; they have simply been demoted to accessories to the crime. The reason for the sudden, explosive growth of the microbes, new evidence shows, may have been their novel ability to use a rich source of organic carbon, aided by a sudden influx of a nutrient required for their growth: the element nickel, emitted by massive volcanism at just that time.

The solution to this mystery was published in the Proceedings of the National Academy of Sciences by MIT professor of geophysics Daniel Rothman, postdoc Gregory Fournier, and five other researchers at MIT and in China.

The carbon deposits show that something caused a significant uptick in the amount of carbon-containing gases — carbon dioxide or methane — produced at the time of the mass extinction. Some researchers have suggested that these gases might have been spewed out by the volcanic eruptions that produced the Siberian Traps, a vast formation of volcanic rock produced by the most extensive eruptions in Earth’s geological record. But calculations by the MIT team showed that these eruptions were not nearly sufficient to account for the carbon seen in the sediments. Even more significantly, the observed changes in the amount of carbon over time don’t fit the volcanic model.

“A rapid initial injection of carbon dioxide from a volcano would be followed by a gradual decrease,” Fournier says. “Instead, we see the opposite: a rapid, continuing increase. That suggests a microbial expansion,” he adds: The growth of microbial populations is among the few phenomena capable of increasing carbon production exponentially, or even faster.
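
A toy model makes the distinction concrete. The sketch below (illustrative only; all rates and timescales are made-up parameters, not values from the study) compares the two carbon-release profiles: a volcanic pulse that starts large and decays, versus a microbial bloom whose emission rate grows exponentially.

```python
import numpy as np

# Illustrative comparison of the two carbon-injection scenarios.
# All rates and timescales are assumed values, not figures from the paper.
t = np.linspace(0, 100, 1001)          # time, arbitrary units

# Volcanic scenario: rapid initial CO2 injection, then gradual decay.
volcanic = 10.0 * np.exp(-t / 20.0)

# Microbial scenario: a growing population emits carbon at an
# exponentially increasing rate, so the signal keeps accelerating.
microbial = 0.01 * np.exp(0.1 * t)

# The volcanic rate peaks at t = 0 and falls; the microbial rate starts
# tiny and eventually overtakes it -- the signature seen in the sediments.
crossover = t[np.argmax(microbial > volcanic)]
```

In this toy setup the microbial curve overtakes the decaying volcanic curve partway through the run, reproducing the qualitative pattern Fournier describes: a rapid, continuing increase rather than a spike followed by decline.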

But if living organisms belched out all that methane, what organisms were they, and why did they choose to do so at that time?

That’s where genomic analysis can help: It turns out that Methanosarcina had acquired a particularly fast means of making methane, through gene transfer from another microbe — and the team’s detailed mapping of the organism’s history now shows that this transfer happened at about the time of the end-Permian extinction. (Previous studies had only placed this event sometime in the last 400 million years.) Given the right conditions, this genetic acquisition set the stage for the microbe to undergo a dramatic growth spurt, rapidly consuming a vast reserve of organic carbon in the ocean sediments.

But there is one final piece to the puzzle: Those organisms wouldn’t have been able to proliferate so prodigiously if they didn’t have enough of the right mineral nutrients to support them. For this particular microbe, the limiting nutrient is nickel — which, new analysis of sediments in China showed, increased dramatically following the Siberian eruptions (which were already known to have produced some of the world’s largest deposits of nickel). That provided the fuel for Methanosarcina’s explosive growth.

The burst of methane would have increased carbon dioxide levels in the oceans, resulting in ocean acidification — similar to the acidification predicted from human-induced climate change. Independent evidence suggests that marine organisms with heavily calcified shells were preferentially wiped out during the end-Permian extinction, which is consistent with acidification.

“A lot of this rests on the carbon isotope analysis,” Rothman says, which is exceptionally strong and clear in this part of the geological record. “If it wasn’t such an unusual signal, it would be harder to eliminate other possibilities.”

John Hayes, a researcher at Woods Hole Oceanographic Institution who was not involved in the research, says this work is “a remarkable combination of physics, biochemistry, and geochemistry. It grows out of years of outstanding and patient work that has provided a highly refined time scale for the events that accompanied Earth’s most severe cluster of extinctions.”

Hayes adds that the team’s identification of one organism that may have been responsible for many of the changes is “the first time that the explosive onset of a single process has been recognized in this way, and it adds very important detail to our understanding of the extinction.”

While no single line of evidence can prove exactly what happened in this ancient die-off, says Rothman, who is also co-director of MIT’s Lorenz Center, “the cumulative impact of all these things is much more powerful than any one individually.” While it doesn’t conclusively prove that the microbes did it, it does rule out some alternative theories, and makes a strong and consistent case, he says.

The research was supported by NASA, the National Science Foundation, the Natural Science Foundation of China, and the National Basic Research Program of China.


Saturday 29 August 2015

"A Billion Miles Beyond Pluto" -- NASA New Horizons to Probe Unexplored Kuiper Belt Objects


NASA has selected the potential next destination for the New Horizons mission to visit after its historic July 14 flyby of the Pluto system. The destination is a small Kuiper Belt object (KBO) known as 2014 MU69 that orbits nearly a billion miles beyond Pluto.

This remote KBO was one of two identified as potential destinations and the one recommended to NASA by the New Horizons team. Although NASA has selected 2014 MU69 as the target, as part of its normal review process the agency will conduct a detailed assessment before officially approving the mission extension to conduct additional science.

“Even as the New Horizons spacecraft speeds away from Pluto out into the Kuiper Belt, and the data from the exciting encounter with this new world is being streamed back to Earth, we are looking outward to the next destination for this intrepid explorer,” said John Grunsfeld, astronaut and chief of the NASA Science Mission Directorate at the agency's headquarters in Washington. “While discussions about whether to approve this extended mission will take place in the larger context of the planetary science portfolio, we expect it to be much less expensive than the prime mission while still providing new and exciting science.”

Like all NASA missions that have completed their primary objectives but seek to do more exploration, the New Horizons team must submit a proposal to the agency to fund a KBO mission. That proposal – due in 2016 – will be evaluated by an independent team of experts before NASA can decide whether to go ahead.

Path of NASA's New Horizons spacecraft toward its next potential target, the Kuiper Belt object 2014 MU69, nicknamed "PT1" (for "Potential Target 1") by the New Horizons team. Although NASA has selected 2014 MU69 as the target, as part of its normal review process the agency will conduct a detailed assessment before officially approving the mission extension to conduct additional science. (Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute/Alex Parker)


Early target selection was important; the team needs to direct New Horizons toward the object this year in order to perform any extended mission with healthy fuel margins. New Horizons will perform a series of four maneuvers in late October and early November to set its course toward 2014 MU69 – nicknamed “PT1” (for “Potential Target 1”) – which it expects to reach on January 1, 2019. Any delays from those dates would cost precious fuel and add mission risk.

“2014 MU69 is a great choice because it is just the kind of ancient KBO, formed where it orbits now, that the Decadal Survey desired us to fly by,” said New Horizons Principal Investigator Alan Stern, of the Southwest Research Institute (SwRI) in Boulder, Colorado. “Moreover, this KBO costs less fuel to reach [than other candidate targets], leaving more fuel for the flyby, for ancillary science, and greater fuel reserves to protect against the unforeseen.”

New Horizons was originally designed to fly beyond the Pluto system and explore additional Kuiper Belt objects. The spacecraft carries extra hydrazine fuel for a KBO flyby; its communications system is designed to work from far beyond Pluto; its power system is designed to operate for many more years; and its scientific instruments were designed to operate in light levels much lower than it will experience during the 2014 MU69 flyby.

The 2003 National Academy of Sciences’ Planetary Decadal Survey (“New Frontiers in the Solar System”) strongly recommended that the first mission to the Kuiper Belt include flybys of Pluto and small KBOs, in order to sample the diversity of objects in that previously unexplored region of the solar system. The identification of PT1, which is in a completely different class of KBO than Pluto, potentially allows New Horizons to satisfy those goals.

But finding a suitable KBO flyby target was no easy task. Starting a search in 2011 using some of the largest ground-based telescopes on Earth, the New Horizons team found several dozen KBOs, but none were reachable within the fuel supply aboard the spacecraft.

The powerful Hubble Space Telescope came to the rescue in summer 2014, discovering five objects, since narrowed to two, within New Horizons’ flight path. Scientists estimate that PT1 is just under 30 miles (about 45 kilometers) across; that’s more than 10 times larger and 1,000 times more massive than typical comets, like the one the Rosetta mission is now orbiting, but only about 0.5 to 1 percent of the size (and about 1/10,000th the mass) of Pluto. As such, PT1 is thought to be like the building blocks of Kuiper Belt planets such as Pluto.
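
The "10 times larger, 1,000 times more massive" comparison is what simple scaling predicts: for bodies of roughly similar density, mass grows with the cube of linear size. A one-line check (the similar-density assumption is ours, not the mission team's):

```python
# If PT1 and a typical comet have roughly similar densities, mass scales
# as the cube of linear size: a body 10x larger is ~1000x more massive.
size_ratio = 10.0
mass_ratio = size_ratio ** 3  # 1000
```

(The Pluto comparison does not follow the same simple cube, since the quoted size and mass ratios are independent observational estimates.)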

Unlike asteroids, KBOs have been heated only slightly by the Sun, and are thought to represent a well preserved, deep-freeze sample of what the outer solar system was like following its birth 4.6 billion years ago.

“There’s so much that we can learn from close-up spacecraft observations that we’ll never learn from Earth, as the Pluto flyby demonstrated so spectacularly,” said New Horizons science team member John Spencer, also of SwRI. “The detailed images and other data that New Horizons could obtain from a KBO flyby will revolutionize our understanding of the Kuiper Belt and KBOs.”

The New Horizons spacecraft – currently 3 billion miles (4.9 billion kilometers) from Earth – is just starting to transmit the bulk of the images and other data, stored on its digital recorders, from its historic July encounter with the Pluto system. The spacecraft is healthy and operating normally.

New Horizons is part of NASA’s New Frontiers Program, managed by the agency’s Marshall Space Flight Center in Huntsville, Ala. The Johns Hopkins University Applied Physics Laboratory in Laurel, Md., designed, built, and operates the New Horizons spacecraft and manages the mission for NASA’s Science Mission Directorate. SwRI leads the science mission, payload operations, and encounter science planning.


Friday 28 August 2015

Quasar Nearest to Earth Harbors Two Supermassive Black Holes


Astrophysicists have found two supermassive black holes in Markarian 231, the nearest quasar to Earth, using observations from NASA's Hubble Space Telescope. The discovery of the pair, one larger and one smaller, is evidence of a binary black hole and suggests that supermassive black holes assemble their masses through violent mergers.

Xinyu Dai of the University of Oklahoma collaborated on this project with Youjun Lu of the National Astronomical Observatories of China, Chinese Academy of Sciences. Dai and Lu looked at ultraviolet radiation emitted from the center of Mrk 231 in Hubble observations, then applied a model developed by Lu to the spectrum of the galaxy. As a result, they were able to predict the existence of the binary black holes in Mrk 231.

"We are extremely excited about this finding because it not only shows the existence of a close binary black hole in Mrk 231, but also paves a new way to systematically search for binary black holes via the nature of their ultraviolet light emission," said Lu.

"The structure of our universe, such as those giant galaxies and clusters of galaxies, grows by merging smaller systems into larger ones, and binary black holes are natural consequences of these mergers of galaxies," said Dai.

Over time, the two black holes discovered by Dai and Lu in Mrk 231 will collide and merge to form a quasar with a single supermassive black hole. A quasar is an active galaxy with an illuminated center, which is short-lived compared to the age of the universe.

The results of this project were published in the August 14, 2015, edition of The Astrophysical Journal.


Cosmic Collision Triggers Rebirth of a "Radio Phoenix"


Astronomers have found evidence for a faded electron cloud "coming back to life," much like the mythical phoenix, after two galaxy clusters collided. This "radio phoenix," so-called because the high-energy electrons radiate primarily at radio frequencies, is found in Abell 1033. The system is located about 1.6 billion light years from Earth.

By combining data from NASA's Chandra X-ray Observatory, the Westerbork Synthesis Radio Telescope in the Netherlands, NSF's Karl Jansky Very Large Array (VLA), and the Sloan Digital Sky Survey (SDSS), astronomers were able to recreate the scientific narrative behind this intriguing cosmic story of the radio phoenix.

Galaxy clusters are the largest structures in the Universe held together by gravity. They consist of hundreds or even thousands of individual galaxies, unseen dark matter, and huge reservoirs of hot gas that glow in X-ray light. Understanding how clusters grow is critical to tracking how the Universe itself evolves over time.

Astronomers think that the supermassive black hole close to the center of Abell 1033 erupted in the past. Streams of high-energy electrons filled a region hundreds of thousands of light years across and produced a cloud of bright radio emission. This cloud faded over a period of millions of years as the electrons lost energy and the cloud expanded.

The radio phoenix emerged when another cluster of galaxies slammed into the original cluster, sending shock waves through the system. These shock waves, similar to sonic booms produced by supersonic jets, passed through the dormant cloud of electrons. The shock waves compressed the cloud and re-energized the electrons, which caused the cloud to once again shine at radio frequencies.

A new portrait of this radio phoenix is captured in this multiwavelength image of Abell 1033. X-rays from Chandra are in pink and radio data from the VLA are colored green. The background image shows optical observations from the SDSS. A map of the density of galaxies, made from the analysis of optical data, is seen in blue.

The Chandra data show hot gas in the clusters, which seems to have been disturbed during the same collision that caused the re-ignition of radio emission in the system. The peak of the X-ray emission is seen to the south (bottom) of the cluster, perhaps because the dense core of gas in the south is being stripped away by surrounding gas as it moves. The cluster in the north may not have entered the collision with a dense core, or perhaps its core was significantly disrupted during the merger. On the left side of the image, a so-called wide-angle tail radio galaxy shines in the radio. The lobes of plasma ejected by the supermassive black hole in its center are bent by the interaction with the cluster gas as the galaxy moves through it.

Astronomers think they are seeing the radio phoenix soon after it was reborn, since these sources fade very quickly when located close to the center of the cluster, as this one is in Abell 1033. Because of the intense density, pressure, and magnetic fields near the center of Abell 1033, a radio phoenix is only expected to last a few tens of millions of years.


Thursday 27 August 2015

Cosmic Oases --"Did Life Cross the Vast Gulf of Interstellar Space Long Ago?"


We only have one example of a planet with life: Earth. But within the next generation, it should become possible to detect signs of life on planets orbiting distant stars. If we find alien life, new questions will arise. For example, did that life arise spontaneously? Or could it have spread from elsewhere? If life crossed the vast gulf of interstellar space long ago, how would we tell?

New research by Harvard astrophysicists shows that if life can travel between the stars (a process called panspermia), it would spread in a characteristic pattern that we could potentially identify.

"In our theory clusters of life form, grow, and overlap like bubbles in a pot of boiling water," says lead author Henry Lin of the Harvard-Smithsonian Center for Astrophysics (CfA).

There are two basic ways for life to spread beyond its host star. The first would be via natural processes such as gravitational slingshotting of asteroids or comets. The second would be for intelligent life to deliberately travel outward. The paper does not deal with how panspermia occurs. It simply asks: if it does occur, could we detect it? In principle, the answer is yes.

The model assumes that seeds from one living planet spread outward in all directions. If a seed reaches a habitable planet orbiting a neighboring star, it can take root. Over time, the result of this process would be a series of life-bearing oases dotting the galactic landscape.

"Life could spread from host star to host star in a pattern similar to the outbreak of an epidemic. In a sense, the Milky Way galaxy would become infected with pockets of life," explains CfA co-author Avi Loeb.

In the theoretical artist's conception of the Milky Way galaxy at the top of the page, translucent green "bubbles" mark areas where life has spread beyond its home system to create cosmic oases, a process called panspermia. New research suggests that we could detect the pattern of panspermia, if it occurs.

If we detect signs of life in the atmospheres of alien worlds, the next step will be to look for a pattern. For example, in an ideal case where the Earth is on the edge of a "bubble" of life, all the nearby life-hosting worlds we find will be in one half of the sky, while the other half will be barren.

Lin and Loeb caution that a pattern will only be discernible if life spreads somewhat rapidly. Since stars in the Milky Way drift relative to each other, stars that are neighbors now won't be neighbors in a few million years. In other words, stellar drift would smear out the bubbles.


"Metamorphosis" --The Evolution of Galaxies: The Basic Building Block of the Observable Universe


A team of international scientists has shown for the first time that galaxies can change their structure over the course of their lifetime. By observing the sky as it is today, and peering back in time using the Hubble and Herschel telescopes, the team have shown that a large proportion of galaxies have undergone a major 'metamorphosis' since they were initially formed after the Big Bang.

By providing the first direct evidence of the extent of this transformation, the team hope to shed light on the processes that caused these dramatic changes, and therefore gain a greater understanding of the appearance and properties of the Universe as we know it today.

In their study, led by astronomers from Cardiff University's School of Physics and Astronomy and published in the Monthly Notices of the Royal Astronomical Society, the researchers observed around 10,000 galaxies currently present in the Universe using a survey of the sky created by the Herschel ATLAS and GAMA projects.

The researchers then classified the galaxies into the two main types: flat, rotating, disc-shaped galaxies (much like our own galaxy, the Milky Way); and large, oval-shaped galaxies with a swarm of disordered stars.

Using the Hubble and Herschel telescopes, the researchers then looked further out into the Universe, and thus further back in time, to observe the galaxies that formed shortly after the Big Bang.

The researchers showed that 83 per cent of all the stars formed since the Big Bang were initially located in a disc-shaped galaxy. However, only 49 per cent of stars that exist in the Universe today are located in these disc-shaped galaxies--the remainder are located in oval-shaped galaxies.

The results suggest a massive transformation in which disc-shaped galaxies became oval-shaped galaxies.
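
The two percentages imply a rough bookkeeping estimate of how many stars changed address. Ignoring complications such as stellar deaths and stars formed directly in oval-shaped galaxies (a simplification of ours, not the study's accounting), the arithmetic looks like this:

```python
# Fractions quoted in the study.
formed_in_discs = 0.83   # stars formed since the Big Bang that began in discs
in_discs_today = 0.49    # stars located in disc-shaped galaxies today

# Under the crude assumption above, at least this fraction of all stars
# moved from disc-shaped to oval-shaped galaxies...
migrated = formed_in_discs - in_discs_today          # ~0.34

# ...which is a substantial share of everything ever formed in a disc.
share_of_disc_stars = migrated / formed_in_discs     # ~0.41
```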

A popular theory is that this transformation was caused by many cosmic catastrophes, in which two disk-dominated galaxies, straying too close to each other, were forced by gravity to merge into a single galaxy, with the merger destroying the disks and producing a huge pileup of stars. An opposing theory is that the transformation was a more gentle process, with stars formed in a disk gradually moving to the centre of a disk and producing a central pile-up of stars.

Lead author of the study Professor Steve Eales, from Cardiff University's School of Physics and Astronomy, said: "Many people have claimed before that this metamorphosis has occurred, but by combining Herschel and Hubble, we have for the first time been able to accurately measure the extent of this transformation.

"Galaxies are the basic building blocks of the Universe, so this metamorphosis really does represent one of the most significant changes in its appearance and properties in the last 8 billion years."

"This study is important as it establishes statistics showing that almost all stars formed in spiral galaxies in the past, but a large fraction of these now appear as large, dead, elliptical galaxies today," said Asantha Cooray, a co-author of the study from the University of California. "This study will require us to refine the models and computer simulations that attempt to explain how galaxies formed and behaved over the last 13 billion years."

Dr David Clements, a co-author of the study from Imperial College London, said: "Up to now we've seen individual cases in the local universe where galaxy collisions convert spirals into ellipticals. This study shows that this kind of transformation is not exceptional, but is part of the normal history of galaxy evolution."

Matthew Allen, a Ph.D. student at Cardiff University and a member of the team, said: "This is a huge step in understanding how the galactic population has evolved over billions of years. Using some of the most cutting edge data and techniques, we are finally beginning to understand the processes that have shaped our Universe."

The image at the top of the page is the Hubble Deep Field image of the early universe.


Wednesday 26 August 2015

Image of the Day --The Spectacular Core of the Twin Jet Nebula


The shimmering colors visible in this NASA/ESA Hubble Space Telescope image show off the remarkable complexity of the Twin Jet Nebula. The new image highlights the nebula's shells and its knots of expanding gas in striking detail. Two iridescent lobes of material stretch outwards from a central star system. Within these lobes two huge jets of gas are streaming from the star system at speeds in excess of one million kilometers per hour.

The cosmic butterfly pictured in this NASA/ESA Hubble Space Telescope image goes by many names. It is called the Twin Jet Nebula, and also answers to the slightly less poetic name of PN M2-9.

The M in this name refers to Rudolph Minkowski, a German-American astronomer who discovered the nebula in 1947. The PN, meanwhile, refers to the fact that M2-9 is a planetary nebula. The glowing and expanding shells of gas clearly visible in this image represent the final stages of life for an old star of low to intermediate mass. The star has not only ejected its outer layers, but the exposed remnant core is now illuminating these layers -- resulting in a spectacular light show like the one seen here. However, the Twin Jet Nebula is not just any planetary nebula, it is a bipolar nebula.

Ordinary planetary nebulae have one star at their center; bipolar nebulae have two, in a binary star system. Astronomers have found that the two stars in this pair each have around the same mass as the Sun, ranging from 0.6 to 1.0 solar masses for the smaller star, and from 1.0 to 1.4 solar masses for its larger companion. The larger star is approaching the end of its days and has already ejected its outer layers of gas into space, whereas its partner is further evolved, and is a small white dwarf.

The characteristic shape of the wings of the Twin Jet Nebula is most likely caused by the motion of the two central stars around each other. It is believed that a white dwarf orbits its partner star and thus the ejected gas from the dying star is pulled into two lobes rather than expanding as a uniform sphere. However, astronomers are still debating whether all bipolar nebulae are created by binary stars. Meanwhile the nebula's wings are still growing and, by measuring their expansion, astronomers have calculated that the nebula was created only 1200 years ago.
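
Kinematic ages like the 1,200-year figure come from dividing a lobe's measured extent by its expansion speed. The sketch below uses assumed round numbers chosen only to show the arithmetic; they are not the published measurements for M2-9:

```python
# Kinematic age: time = distance / speed.
# Both inputs below are assumptions for illustration, not measured values.
SECONDS_PER_YEAR = 3.156e7
lobe_extent_km = 5.7e12     # assumed lobe length, roughly 0.6 light-years
expansion_km_s = 150.0      # assumed expansion speed

age_years = lobe_extent_km / expansion_km_s / SECONDS_PER_YEAR  # ~1200
```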

Within the wings, starting from the star system and extending horizontally outwards like veins, are two faint blue patches. Although these may seem subtle in comparison to the nebula's rainbow colors, they are actually violent twin jets streaming out into space at speeds in excess of one million kilometers per hour. This is another consequence of the binary system at the heart of the nebula. These jets slowly change their orientation, precessing across the lobes as they are pulled by the wayward gravity of the binary system.

The two stars at the heart of the nebula circle one another roughly every 100 years. This rotation not only creates the wings of the butterfly and the two jets, it also allows the white dwarf to strip gas from its larger companion, which then forms a large disc of material around the stars, extending out as far as 15 times the orbit of Pluto! Even though this disc is of incredible size, it is much too small to be seen on the image taken by Hubble.

An earlier image of the Twin Jet Nebula using data gathered by Hubble's Wide Field Planetary Camera 2 was released in 1997. This newer version incorporates more recent observations from the telescope's Space Telescope Imaging Spectrograph (STIS).


"Staggering" --Exploring the Limits of What Life Might Be Like in the Universe


Bizarre creatures that go years without water. Others that can survive the vacuum of open space. Some of the most unusual organisms found on Earth provide insights for Washington State University planetary scientist Dirk Schulze-Makuch to predict what life could be like elsewhere in the universe.

"If you don't explore the various options of what life may be like in the universe, you won't know what to look for when you go out to find it," said Schulze-Makuch. "We do not propose that these organisms exist but like to point out that their existence would be consistent with physical and chemical laws, as well as biology."

NASA's discovery last month of 500 new planets near the constellations Lyra and Cygnus, in the Milky Way Galaxy, touched off a storm of speculation about alien life. In a recent article in the journal Life, Schulze-Makuch draws upon what is known about Earth's most extreme lifeforms and the environments of Mars and Titan, Saturn's moon, to paint a clearer picture of what life on other planets could be like. His work was supported by the European Research Council.

For example, on Earth, the bombardier beetle ejects an explosive mix of hydrogen peroxide and other chemicals to ward off predators.

"On other planets, under gravity conditions similar to those present on Mars, a bombardier beetle-like alien could excrete a similar reaction to propel itself as much as 300 meters into the air," Schulze-Makuch said.

While explorers to Mars might find creatures similar to those on Earth, life on a Titan-like planet would require a completely novel biochemistry. Such a discovery would be a landmark scientific achievement with profound implications.

Earth life, with its unique biochemical toolset, could feasibly survive on a Mars-like planet with a few novel adaptations.

First, organisms would need a way to get water in an environment that is akin to a drier and much colder version of Chile's Atacama Desert. A possible adaptation would be to use a water-hydrogen peroxide mixture rather than water as an intracellular liquid, Schulze-Makuch said.

Hydrogen peroxide is a natural antifreeze that would help microorganisms survive frigid Martian winters. It is also hygroscopic, meaning it naturally attracts water molecules from the atmosphere.

During the daytime, plant-like microorganisms on a Martian-like surface could photosynthesize hydrogen peroxide. At night, when the atmosphere is relatively humid, they could use their stored hydrogen peroxide to scavenge water from the atmosphere, similar to how microbial communities in the Atacama use the moisture that salt brine extracts from the air to stay alive.

Schulze-Makuch speculates that a larger, more complex alien creature, maybe resembling Earth's bombardier beetle, could use these microorganisms as a source of food and water. To move from one isolated patch of life-sustaining microorganisms to another, it could use rocket propulsion.

Due to its greater distance from the Sun, Titan is much colder than Earth. Its surface temperature is on average -290 degrees Fahrenheit (about -179 degrees Celsius). Additionally, there is no liquid water on the surface nor carbon dioxide in the atmosphere, two ingredients considered essential for life as we know it.

If life does exist on Titan or a Titan-like planet elsewhere in the universe, it uses something other than water as an intracellular liquid. One possibility is a liquid hydrocarbon like methane or ethane. Non-water based lifeforms could feasibly live in the liquid methane and ethane lakes and seas that make up a large portion of Titan's surface, just as organisms on Earth live in water, Schulze-Makuch said.

Such hypothetical creatures would take in hydrogen in place of oxygen and react it with high energy acetylene in the atmosphere to produce methane instead of carbon dioxide.
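
That proposal corresponds to acetylene hydrogenation, C2H2 + 3 H2 -> 2 CH4, an energy-releasing reaction that could play the role oxygen respiration plays on Earth. A quick element-balance check (the bookkeeping below is ours; the reaction itself is the one described in the text):

```python
# Element balance for the proposed Titan metabolism: C2H2 + 3 H2 -> 2 CH4.
reactants = {"C": 2, "H": 2 + 3 * 2}   # one C2H2 plus three H2
products = {"C": 2 * 1, "H": 2 * 4}    # two CH4
# The dictionaries match, so the equation is balanced.
```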

Due to their frigid environment, these organisms would have huge (by Earth standards) and very slowly metabolizing cells. The slow rate of metabolism would mean evolution and aging would occur much slower than on Earth, possibly raising the life span of individual organisms significantly.

"On Earth, we have only scratched the surface of the physiological options various organisms have. But what we do know is astounding," Schulze-Makuch said. "The possibilities of life elsewhere in the universe are even more staggering.

"Only the discovery of extraterrestrial life and a second biosphere will allow us to test these hypotheses," he said, "which would be one of the grandest achievements of our species."

Read More

Tuesday 25 August 2015

"Robot Scientist" to Search for Laws of Nature That Underlie the Universe

Quarksoup-HR

Everything that is changing around and within us - from the relatively simple motion of celestial bodies, to weather and complex biological processes - is a dynamical system. A large part of science is guessing the laws of nature that underlie such systems, summarizing them in mathematical equations that can be used to make predictions, and then testing those equations and predictions through experiments.

Biophysicists have taken another small step forward in the quest for an automated method to infer models describing a system's dynamics - a so-called robot scientist. Nature Communications published the finding - a practical algorithm for inferring laws of nature from time-series data of dynamical systems.

"Our algorithm is a small step," says Ilya Nemenman, lead author of the study and a professor of physics and biology at Emory University. "It could be described as a toy version of a robot scientist, but even so it may have practical applications. For the first time, we've taught a computer how to efficiently search for the laws that underlie arbitrary, natural dynamical systems, including complex, non-linear biological systems.

"The long-term dream is to harness large-scale computing to make the guesses for us and speed up the process of discovery," Nemenman says. Nemenman's co-author on the paper is Bryan Daniels, a biophysicist at the University of Wisconsin.

While the quest for a true robot scientist, or computerized general intelligence, remains elusive, this latest algorithm represents a new approach to the problem.

"We think we have beaten any automated-inference algorithm that currently exists because we focus on getting an approximate solution to a problem, which we can get with much less data," Nemenman says.

In previous research, John Wikswo, a biophysicist at Vanderbilt University, along with colleagues at Cornell University, applied a software system to automate the scientific process for biological systems.

"We came up with a way to derive a model of cell behavior, but the approach is complicated and slow, and it is limited in the number of variables that it can track - it can't be scaled to more complicated systems," Wikswo says. "This new algorithm increases the speed of the necessary calculation by a factor of 100 or more. It provides an elegant method to generate compact and effective models that should allow prediction and control of complex systems."

Nemenman and Daniels dubbed their new algorithm "Sir Isaac."

The real Sir Isaac Newton serves as a classic example of how the scientific method involves forming hypotheses, then testing them by looking at data and experiments. Newton guessed that the same rules of gravity applied to a falling apple and to the moon in orbit. He used data to test and refine his guess and generated the law of universal gravitation.

To test their algorithm, Nemenman and Daniels created an artificial, model solar system by generating numerical trajectories of planets and comets that move around a sun. In this simplified solar system, only the sun attracted the planets and comets.

"We trained our algorithm how to search through a group of laws which were limited enough to be practical, but also flexible enough to explain many different dynamics," Nemenman explains. "We then gave the algorithm some simulated planetary trajectories, and asked it what makes these planets move. It gave us the universal gravitational force. Not perfectly, but with very good accuracy. The error was just a few percent."

The algorithm also figured out that force changes velocity, not the position directly. "It gets Newton's First Law," Nemenman says, "the fact that in order to predict the possible trajectory of a planet, whether it stays near the sun or flies off into infinity, just knowing its initial position is not enough. The algorithm understands that you also need to know the velocity."
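A toy version of this kind of inference can be sketched in a few lines: simulate an orbit under inverse-square gravity, then recover the force-law exponent from the trajectory alone by regressing acceleration against distance. This is an illustrative reconstruction, not the Sir Isaac algorithm itself, which searches a far more general class of dynamical laws.

```python
import numpy as np

# Simulate a body orbiting a central mass under inverse-square gravity,
# then infer the force-law exponent n in |a| = k / r**n from the
# trajectory alone -- a toy analogue of what Sir Isaac does.
GM = 1.0
dt = 1e-3

def simulate(r0, v0, steps):
    """Integrate a 2-D orbit with a kick-drift-kick (leapfrog) scheme."""
    pos, vel = np.array(r0, float), np.array(v0, float)
    traj = []
    for _ in range(steps):
        r = np.linalg.norm(pos)
        vel += 0.5 * dt * (-GM * pos / r**3)   # half kick
        pos += dt * vel                         # drift
        r = np.linalg.norm(pos)
        vel += 0.5 * dt * (-GM * pos / r**3)   # half kick
        traj.append(pos.copy())
    return np.array(traj)

traj = simulate([1.0, 0.0], [0.0, 1.1], 20000)  # a bound, eccentric orbit

# Estimate accelerations by central finite differences, then fit
# log|a| = log k - n * log r  by least squares.
acc = (traj[2:] - 2 * traj[1:-1] + traj[:-2]) / dt**2
r = np.linalg.norm(traj[1:-1], axis=1)
n, logk = np.polyfit(np.log(r), np.log(np.linalg.norm(acc, axis=1)), 1)
print(round(-n, 2))   # close to 2.0: the inverse-square law, recovered from data
```

The regression recovers the exponent with small error, mirroring the paper's "not perfectly, but with very good accuracy" result.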

While most modern-day high school students know Newton's First Law, it took humanity 2,000 years beyond the time of Aristotle to discover it.

One limitation of the algorithm is inexactness. Getting an approximate model, however, is beneficial as long as the approximation is close enough to make good predictions, Nemenman says.

"Newton's laws are also approximate, but they have been remarkably beneficial for 350 years," he says. "We're still using them to control everything from electron microscopes to rockets."

Getting an exact description of any complex dynamical system requires large amounts of data, he adds. "In contrast, with our algorithm, we can get an approximate description by using just a few measurements of a system. That makes our method practical."

The researchers demonstrated, for example, that the algorithm can infer the dynamics of a caricature of an immune receptor in a leukocyte. This type of model could lead to a better understanding of the time-course for the response to an infection or a drug.

In another experiment, the researchers fed the algorithm data on concentrations of just three of the chemical species involved in glycolysis in yeast. The algorithm generated a model that makes accurate predictions for the full seven-species system of this basic glucose-consuming metabolic process.

"If you applied other methods of automatic inference to this system it would typically take tens of thousands of examples to reliably generate the laws that drive these chemical transformations," Nemenman says. "With our algorithm, we were able to do it with fewer than 100 examples."

With their experimental collaborators, the researchers are now exploring whether the algorithm can model more complex biological processes, such as the dynamics of insulin secretion in the pancreas and its relationship to the onset of a disease like diabetes. "The biology of insulin secreting cells is extremely complex. Understanding their dynamics on multiple scales is going to be difficult, and may not be possible for years with traditional methods," Nemenman says. "But we want to see if we can get a good enough approximation with our method to deliver a practical result."

The intuition of a genius mind like that of Isaac Newton is one quality that distinguishes human intelligence from even the highest-powered computer and algorithmic program.

"You can't give a machine intuition - at least for now," Nemenman says. "What we're hoping we can do is get our computer algorithm to spit out models of phenomena so that we, as scientists, can use them and our intuition to make useful generalizations. It's easier to generalize from models of specific systems than it is to generalize from various data sets directly."

The image at the top of the page shows quark soup, liquid quark-gluon plasma, from the Relativistic Heavy Ion Collider (RHIC), an “atom smasher” at Brookhaven Lab that recreates conditions of the very early universe to explore the fundamental properties and interactions of matter.

Read More

Monday 24 August 2015

New Insights Into "Snowball Earth" --"The Most Extreme Climatic Conditions Ever Known"

Snowball_earth_1


The second ice age during the Cryogenian period was not followed by the sudden and chaotic melting-back of the ice as previously thought, but ended with regular advances and retreats of the ice, according to research published by scientists from the University of Birmingham. The researchers also found that the constant advance and retreat of ice during this period was caused by the Earth wobbling on its axis.

These ice ages are explained by the Snowball Earth theory, which holds that they represent the most extreme climatic conditions the world has ever known, and yet they ended quite abruptly 635 million years ago. Little was known about how they ended -- until now.

For the study, the scientists analysed sedimentary rocks from Svalbard, Norway that were laid down in that ice age. The deposits preserved a chemical record which showed high levels of CO2 were present in the atmosphere. Carbon dioxide was low when the ice age started, and built up slowly over millions of years when the whole Earth was very cold -- this period is represented only by frost-shattered rubble under the sediments.

Eventually the greenhouse warmth in the atmosphere from carbon dioxide caused enough melting for glaciers to erode, transport and deposit sediment. The sedimentary layers showed ice retreat and advance as well as cold arid conditions. They reveal a time when glacial advances alternated with even more arid, chilly periods and when the glaciers retreated, rivers flowed, lakes formed, and yet simple life survived.

As theory predicts, this icy Earth with a hot atmosphere rich in carbon dioxide had reached a 'Goldilocks' zone -- too warm to stay completely frozen, too cold to lose its ice, but just right to record more subtle underlying causes of ancient climate change.

The geological researchers invited a French group of physicists who produce sophisticated climate models to test their theory that the advances and retreats of ice during this period were caused by the Earth wobbling on its axis in 20,000 year periods. The rocks and the models agreed: slight wobbles of the Earth on its spin axis caused differences in the heat received at different places on the Earth's surface. These changes were small, but enough over thousands of years to cause a change in the places where snow accumulated or melted, leading the glaciers to advance and retreat. During this time the whole Earth would have looked like the Dry Valley regions of Antarctica -- a very dry landscape, with lots of bare ground, but also containing glaciers up to 3 km thick.

"We now have a much richer story about what happened at the end of the Snowball Earth period," said Ian Fairchild, lead investigator from the University of Birmingham's School of Geography, Earth and Environmental Sciences. "The sediment analysis has given us a unique window on what happened so many millions of years ago. We know that the Earth's climate is controlled by its orbit, and we can now see the effect of that in this ancient ice age too."

Read More

Apollo Mission "Volcanic Fire Eruption" Mystery --Solved?

  14fede02a212fbf707ef7670edf425fc_large

Tiny beads of volcanic glass found on the lunar surface during the Apollo missions are a sign that fire fountain eruptions took place on the Moon's surface. Now, scientists from Brown University and the Carnegie Institution for Science have identified the volatile gas that drove those eruptions.

Fire fountains, a type of eruption that occurs frequently in Hawaii, require the presence of volatiles mixed in with the erupting lava. Volatile compounds turn into gas as the lavas rise from the depths. The expansion of that gas causes lava to blast into the air once it reaches the surface, a bit like taking the lid off a shaken bottle of soda.

"The question for many years was what gas produced these sorts of eruptions on the Moon," said Alberto Saal, associate professor of earth, environmental, and planetary sciences at Brown and corresponding author of the new research. "The gas is gone, so it hasn't been easy to figure out."

The image below shows melt inclusions, tiny dots of magma frozen within olivine crystals. The crystals lock in volatile elements that might otherwise have escaped from the magma. Researchers have shown that melt inclusions within volcanic glasses from the Moon contain carbon. They conclude that gas-phase carbon likely drove the "fire fountain" eruptions that produced the glass. (Saal Lab / Brown University).

              Researchmays

The research, published in Nature Geoscience, suggests that lava associated with lunar fire fountains contained significant amounts of carbon. As it rose from the lunar depths, that carbon combined with oxygen to make substantial amounts of carbon monoxide (CO) gas. That CO gas was responsible for the fire fountains that sprayed volcanic glass over parts of the lunar surface.

For many years, the Moon was thought to be devoid of volatiles like hydrogen and carbon. It wasn't until the last decade or so that volatiles were definitively detected in lunar samples. In 2008, Saal and colleagues detected water in lunar volcanic beads. They followed that discovery with detections of sulfur, chlorine and fluorine. While it became apparent that the Moon was not completely depleted of volatiles as was once thought, none of the volatiles that had been detected were consistent with fire fountain eruptions. For example, if water had been the driving force, there should be mineralogical signatures in recovered samples. There are none.

For this research, Saal and his colleagues carefully analyzed glass beads brought back to Earth from the Apollo 15 and 17 missions. In particular, they looked at samples that contained melt inclusions, tiny dots of molten magma that became trapped within crystals of olivine. The crystals trap gases present in the magma before they can escape.

Although other volatiles were previously detected in the lunar volcanic glasses and melt inclusions, measuring carbon remained elusive due to the high detection limits of the available analytical techniques. Erik Hauri from the Carnegie Institution for Science developed a state-of-the-art ion probe technique that reduces the detection limit for carbon by two orders of magnitude, allowing measurements as low as 0.1 part per million.

"This breakthrough depended on the ability of Carnegie's NanoSIMS ion probe to measure incredibly low levels of carbon, on objects that are the diameter of a human hair," said Hauri. "It is really a remarkable achievement both scientifically and technically."

The researchers probed the melt inclusions using secondary ion mass spectrometry. They calculated that the samples initially contained 44 to 64 parts per million of carbon. Having detected carbon, the researchers devised a theoretical model of how gases would escape from lunar magma at various depths and pressures, calibrated from the results of high-pressure lab experiments. The model had long been used for Earth; Saal and colleagues changed several parameters to match the composition and conditions affecting lunar magma.

The model showed that carbon, as it combines with oxygen to form CO gas, would have degassed before other volatiles.

"Most of the carbon would have degassed deep under the surface," Saal said. "Other volatiles like hydrogen degassed later, when the magma was much closer to the surface and after the lava began breaking up into small globules. That suggests carbon was driving the process in its early stages."
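The ordering Saal describes can be illustrated with a toy Henry's-law calculation: each volatile stays dissolved until the pressure drops below its saturation point, and a far less soluble gas like CO saturates at much higher pressure, meaning greater depth. The constants below are illustrative placeholders, not the calibrated values from the study's model.

```python
# A toy Henry's-law sketch of why CO degasses before water as magma rises:
# the amount of gas a melt can hold in solution scales with pressure, and
# CO is far less soluble than H2O. All numbers here are hypothetical.
henry = {"CO": 0.05, "H2O": 5.0}         # ppm dissolved per bar (illustrative)
dissolved = {"CO": 50.0, "H2O": 500.0}   # ppm initially in the magma (illustrative)

def exsolution_pressure(species):
    """Pressure (bar) below which the melt saturates and gas bubbles form."""
    return dissolved[species] / henry[species]

for s in ("CO", "H2O"):
    print(s, round(exsolution_pressure(s)))
# CO saturates at a much higher pressure than H2O, so it comes out of
# solution far deeper below the surface -- driving the early stages.
```

The physical point is only the ordering: whatever the true constants, a less soluble gas reaches saturation deeper in the magma column.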

In addition to providing a potential answer to longstanding questions surrounding lunar fire fountains, the findings also serve as more evidence that some volatile reservoirs in the Moon's interior share a common origin with reservoirs in the Earth, the researchers say.

The amount of carbon detected in the melt inclusions was found to be very similar to the amount of carbon found in basalts erupted at Earth's mid-ocean ridges. Saal and his colleagues have shown previously that Earth and the Moon have similar concentrations of water and other volatiles. They have also shown that hydrogen isotope ratios from lunar samples are similar to those of Earth.

If volatile reservoirs on the Earth and Moon do indeed share a common source, it has implications for understanding the Moon's origin. Scientists believe the Moon formed when Earth was hit by a Mars-size object very early in its history. Debris from that impact accreted to form the Moon.

"The volatile evidence suggests that either some of Earth's volatiles survived that impact and were included in the accretion of the Moon or that volatiles were delivered to both the Earth and Moon at the same time from a common source -- perhaps a bombardment of primitive meteorites," Saal said.

Read More

Glacial Retreat During Last Ice Age --"Caused by Greenhouse Gases"

Cluster-of-glaciers


A recalculation of the dates at which boulders were uncovered by melting glaciers at the end of the last Ice Age has conclusively shown that the glacial retreat was due to rising levels of carbon dioxide and other greenhouse gases, as opposed to other types of forces. Carbon dioxide levels are now significantly higher than they were at that time, as a result of the Industrial Revolution and other human activities since then. Because of that, the study confirms predictions of future glacial retreat, and that most of the world's glaciers may disappear in the next few centuries.

The findings were published today in Nature Communications by researchers from Oregon State University, Boston College and other institutions. They erase some of the uncertainties about glacial melting that had been due to a misinterpretation of data from some of these boulders, which were exposed to the atmosphere more than 11,500 years ago.

"This shows that at the end of the last Ice Age, it was only the increase in carbon dioxide and other greenhouse gases that could have caused the loss of glaciers around the world at the same time," said Peter Clark, a professor in the OSU College of Earth, Ocean and Atmospheric Sciences, and co-author on the study.

"This study validates predictions that future glacial loss will occur due to the ongoing increase in greenhouse gas levels from human activities," Clark said. "We could lose 80-90 percent of the world's glaciers in the next several centuries if greenhouse gases continue to rise at the current rate."

Glacial loss in the future will contribute to rising sea levels and, in some cases, have impacts on local water supplies.

As the last Ice Age ended during a period of about 7,000 years, starting around 19,000 years ago, the levels of carbon dioxide in the atmosphere increased from 180 parts per million to 280 parts per million. But just in the past 150 years, they have surged from 280 to about 400 parts per million, far higher than what was required to put an end to the last Ice Age.
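The comparison is easy to make explicit from the figures quoted above:

```python
# Comparing the rate of CO2 rise at the end of the last Ice Age with the
# modern rise, using the ppm figures and time spans given in the article.
deglacial_rate = (280 - 180) / 7000   # ppm per year over ~7,000 years
modern_rate = (400 - 280) / 150       # ppm per year over ~150 years
print(round(modern_rate / deglacial_rate))  # the modern rise is ~56x faster
```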

The new findings, Clark said, were based on a recalculation of the ages at which more than 1,100 glacial boulders from 159 glacial moraines around the world were exposed to the atmosphere after being buried for thousands of years under ice.

The exposure of the boulders to cosmic rays produced cosmogenic nuclides, which had been previously measured and used to date the event. But advances have been made in how to calibrate ages based on that data. Based on the new calculations, the rise in carbon dioxide levels - determined from ancient ice cores - matches up nicely with the time at which glacial retreat took place.
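The underlying dating method can be sketched schematically: once exposed, a boulder accumulates a cosmogenic nuclide such as beryllium-10 at some production rate while the nuclide also decays, and inverting the buildup equation gives the exposure age. The production rate and concentration below are illustrative placeholders; the recalibration this study performed concerns exactly how such production rates are scaled for altitude, latitude, and shielding.

```python
import math

# Schematic cosmogenic-nuclide exposure dating. A freshly exposed surface
# builds up nuclide concentration N(t) = (P/lam) * (1 - exp(-lam * t)),
# where P is the production rate and lam the decay constant. Solving for t
# gives the exposure age. Numbers are illustrative, not from the study.
def exposure_age(N, P, lam):
    """Years of exposure implied by a measured nuclide concentration N."""
    return -math.log(1 - N * lam / P) / lam

P = 4.0                        # atoms/gram/year (hypothetical 10Be production rate)
lam = math.log(2) / 1.387e6    # 10Be decay constant (half-life ~1.387 Myr)
N = 46000.0                    # measured atoms/gram (hypothetical)
print(round(exposure_age(N, P, lam)))  # roughly 11,500 years of exposure
```

Because a systematic error in P shifts every inferred age together, recalibrating the production rate can move the whole deglaciation chronology, which is how the new ages came to line up with the CO2 record.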

"There had been a long-standing mystery about why these boulders were uncovered at the time they were, because it didn't properly match the increase in greenhouse gases," said Jeremy Shakun, a professor at Boston College and lead author on the study. "We found that the previous ages assigned to this event were inaccurate. The data now show that as soon as the greenhouse gas levels began to rise, the glaciers began to melt and retreat."

There are other forces that can also cause glacial melting on a local or regional scale, the researchers noted, such as changes in the Earth's orbit around the sun, or shifts in ocean heat distribution. These factors probably did have localized effects. But the scientists determined that only the change in greenhouse gas levels could have explained the broader global retreat of glaciers all at the same time.

In the study of climate change, glaciers have always been of considerable interest, because their long-term behavior is a more reliable barometer that helps sort out the ups-and-downs caused by year-to-year weather variability, including short-term shifts in temperature and precipitation.

Other collaborators on this research were from the University of Wisconsin, Purdue University, and the National Center for Atmospheric Research. The work was supported by the National Oceanic and Atmospheric Administration and the National Science Foundation.

Read More

ExoMolecules on Other Worlds --"Could Fulfill Roles of DNA and RNA" (Weekend Feature)

Here on warm, watery Earth, the molecules DNA and RNA serve as the blueprints of life, containing creatures’ genetic instruction manuals. An immense family of proteins carries out these instructions. Yet in a hydrocarbon medium on Saturn's Titan, these molecules could never perform their profound chemical duties. Other molecules must therefore step up to the plate if non-water-based, alien life is to operate and evolve in a Darwinian sense, with genetic changes leading to diversity and complexity.

A new study proposes that molecules called ethers, not used in any genetic molecules on Earth, could fulfill the role of DNA and RNA on worlds with hydrocarbon oceans. These worlds must be a good deal toastier than Titan, though, the study found, for plausibly life-like chemistry to take place. The new paper appeared in the March issue of the journal Astrobiology and was funded in part by the Exobiology & Evolutionary Biology element of the NASA Astrobiology Program.

“The genetic molecules we have proposed could perform on ‘warm Titans’,” said paper lead author Steven Benner, a distinguished fellow at the Foundation for Applied Molecular Evolution, a private scientific research organization based in Alachua, Florida.

Bigger molecular cousins to Titan’s methane, such as the octane that helps fuel our vehicles, would also make for far more suitable solvents. Although no “warm Titans” close-in to their host stars have turned up so far in exoplanet exploration, Benner is hopeful there are worlds aplenty that fit the bill.

“Within our own solar system, we do not have a planet big enough, close enough to the Sun, and with the right temperature to support warm hydrocarbon oceans on its surface,” said Benner. “But each week, astronomers are discovering new solar systems other than our own.”

On a fundamental level, the development of life on Earth has been a push-and-pull between molecules changing and staying the same. For an organism to reproduce and make copies of itself, the vast majority of its genetic information must be conserved if the offspring are to survive and still carry life forward. But if life does not change and adapt to inconstant environmental conditions, it will die out. The environmental curve balls to life include temperature swings and varying water and nutrient availability.

DNA and RNA allow for a biological version of the axiom “the more things change, the more they stay the same.” Individual “letters,” or nucleobases, in the four-letter code of DNA and RNA can mutate without destroying the molecule’s overall form and function.

These nucleobase changes can produce novel proteins. These proteins in turn let life chemically interact with its environment in new ways to promote survival. Brand new species arise in this manner, as fresh traits take hold in contrasting conditions and locations. (In the mid-1800s, Charles Darwin famously intuited this overarching concept of the origin of species, though the biomolecular nitty-gritty was not fathomed until many decades hence.)

The general structure, and therefore the general behavior, of DNA and RNA remains the same because of repeating elements in the chemical’s backbone, or main scaffolding. The molecules possess outwardly negative charges that repeat along their backbones, which allows DNA and RNA to dissolve and float freely in water. In this fluid medium, DNA and RNA can interact with other biomolecules, leading to complexity in biological systems.

“This is the central point of the ‘polyelectrolyte theory of the gene,’ which holds that any genetic biopolymer able to support Darwinian evolution operating in water must have an ever-repeating backbone charge,” explained Benner. “The repeating charges so dominate the physical behavior of the genetic molecule that any changes in the nucleobases that influence genetic information have essentially no significant impact on the molecule’s overall physical properties.”

All of which is well and good for us water-based organisms. The trouble is, for waterless worlds like Titan where hydrocarbons reign, molecules like DNA and RNA would never cut it. These biomolecules cannot dissolve, as required, in hydrocarbons to allow for life’s microscopic bump-and-grind.

“None of these molecules have any chance of dissolving in a hydrocarbon ocean like on Titan or on a warm Titan,” said Benner.

More bothersome still, molecules with any sort of outward charge goop up in hydrocarbons. The blueprints of life on Earth as contained in DNA and RNA cannot translate to hydrocarbon-logged worlds.

Is life, at least as we can conceive of it, impossible amidst hydrocarbons? Benner and colleagues think not. Compounds called ethers, strung together into complex “polyethers,” can likely perform in a manner that stays faithful to the polyelectrolyte theory of the gene.

Ethers, like DNA and RNA, have simple, repeating backbones, in their case made of carbon and oxygen. Unlike DNA and RNA, however, ethers do not carry an outward charge. But ethers do possess internal charge repulsions that open up useful “spaces” within the molecules, into which small elemental chunks can go that work like DNA’s and RNA’s nucleobases.

Following from this insight, Benner and colleagues tested out how well polyethers would dissolve in various hydrocarbons. The researchers further ran experiments at temperatures expected of Titan-esque worlds at different distances from host stars.

Hydrocarbons, like water, can be solids, liquids, or gases, depending on temperature and pressure. As with the astrobiological hunts for water-based life, the liquid phase of hydrocarbons is the one of interest, because in solids (like ice), biomolecules cannot interact, and in gases (like water vapor), the medium is too thin to support enough interaction.

As a rule, the temperature range at which a hydrocarbon is a liquid goes up as the hydrocarbon becomes longer. Methane, the simplest, shortest hydrocarbon with a single carbon atom linked to four hydrogen atoms, has a very narrow liquid range—between about -300 and -280 degrees Fahrenheit. Inconveniently, the solubility of ethers plummets when getting down into these Titanian chills.

According to Benner’s study, and to the disappointment of many scientists, Titan looks like a very unlikely abode for aliens. “We have shown that the methane oceans at Titan are likely to be too cold to hold any genetic biopolymer,” said Benner. (Puzzling readings of less hydrogen and acetylene than expected at Titan’s surface have, however, hinted previously at a form of microbial life.)

A better bet for life than methane-ocean worlds are those instead covered by propane. This hydrocarbon has three carbon atoms to methane’s one, and is another household name here on Earth as a gaseous fuel. It can stay liquid over a much broader and more suitable-for-chemistry range of -300 to -40 degrees Fahrenheit. Still better than propane is octane. This eight-carbon molecule does not freeze until about -70 degrees Fahrenheit, nor does it turn into a gas until reaching a quite-hot 257 degrees Fahrenheit.
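Collecting the liquid ranges quoted above makes the "habitable zone" logic concrete. The helper below is a toy check that ignores pressure effects; the ranges are the approximate figures from the article.

```python
# Approximate liquid temperature ranges (degrees F) for the candidate
# solvents discussed above, with a helper that asks which could stay
# liquid at a given surface temperature -- a toy hydrocarbon
# "habitable zone" check that ignores pressure.
liquid_range_f = {
    "methane": (-300, -280),
    "propane": (-300, -40),
    "octane":  (-70, 257),
    "water":   (32, 212),
}

def liquid_solvents(temp_f):
    """Solvents that would be liquid at temp_f under this simple model."""
    return [s for s, (lo, hi) in liquid_range_f.items() if lo < temp_f < hi]

print(liquid_solvents(-290))  # Titan's surface: only methane and propane qualify
print(liquid_solvents(80))    # an oily "warm Titan": octane (and water) stay liquid
```

Octane's roughly 300-degree liquid window is what lets a "warm Titan" sit almost anywhere in a wide band of orbital distances and still keep a chemistry-friendly ocean.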

That broad a range with sufficient ether solubility suggests that warm Titans could harbor a truly alien biochemistry capable of evolving complexity in a Darwinian manner. These worlds could be found in a fairly wide hydrocarbon “habitable zone” around other stars. The hydrocarbon habitable zone is akin to the familiar water-based zone, wherein a planet is neither too close nor too far from its star to have its water completely boil or freeze away.

Hydrocarbon worlds of interest need not be Titan-like, after all, in that they do not have to be moons of gas giants. Warm Titans could actually be more like oily Earths or super-Earths, drenched in octane.

As research continues, new and exotic solvents other than water and hydrocarbons could yet emerge as plausible milieus for life’s dealings. “Virtually every star has a habitable zone for every solvent,” said Benner.

The Cassini Mission image at the top of the page shows the hazy atmosphere of Saturn's moon, Titan.

Read More

Beyond the Quantum --"Will the Discovery of Dark Matter Revolutionize Our World?"

Image

Physicists suggest a new way to look for dark matter: they believe that dark matter particles annihilate into so-called dark radiation when they collide. If true, then we should be able to detect the signals from this radiation. The majority of the mass in the Universe remains unknown. Despite knowing very little about this dark matter, its overall abundance is precisely measured. In other words: Physicists know it is out there, but they have not yet detected it.

It is definitely worth looking for, argues Ian Shoemaker, former postdoctoral researcher at Centre for Cosmology and Particle Physics Phenomenology (CP3), Department of Physics, Chemistry and Pharmacy, University of Southern Denmark, now at Penn State. “There is no way of predicting what we can do with dark matter, if we detect it. But it might revolutionize our world. When scientists discovered quantum mechanics, it was considered a curiosity. Today quantum mechanics plays an important role in computers”, he says.

Ever since dark matter was first theorized there have been many attempts to look for it, and now Ian Shoemaker and fellow scientists, Associate Professor Mads Toudal Frandsen, CP3, and John F. Cherry, postdoctoral researcher from Los Alamos National Laboratory, USA, suggest a new approach. They present their work in the journal Physical Review Letters.

On Earth several detectors are placed in underground cavities, where disturbing noise is minimized. The hope is that one of these detectors will one day catch a dark matter particle passing through Earth. According to Ian Shoemaker, it is possible that this might happen, but given how little we know about dark matter we should keep an open mind and explore all paths that could lead to its detection.

One reason for this is that dark matter is not very dense in our part of the universe.

“If we add another way of looking for dark matter – the way we suggest – then we will increase our chances of detecting dark matter in our underground cavities”, says Shoemaker, who now suggests looking for the signs of dark matter activity rather than the dark matter particles themselves.

The photo below shows the Large Underground Xenon (LUX) experiment, housed in a former mine almost 1,500 m underground in South Dakota, USA. Credit: Matt Kapust, Sanford Underground Research Facility.

1-newtheoryifw

The researchers believe that when two dark matter particles meet, they will behave just like ordinary particles; that they will annihilate and create radiation in the process. In this case the radiation is called dark radiation, and it may be detected by the existing underground detectors.

“Underground detection experiments may be able to detect the signals created by dark radiation”, Shoemaker says.

The researchers have found that the Large Underground Xenon (LUX) experiment is in fact already sensitive to this signal and can, with future data, confirm or exclude their hypothesis for dark matter’s origin.

Don't forget to look in the Milky Way, too. The attempt to catch signals from dark radiation is not a new idea – it is currently being performed several places in space with satellite-based experiments. These places include the center of our galaxy, the Milky Way, and the Sun may also be such an area.

Physicists have three ways to try and detect dark matter: Make it Slam matter together and produce dark matter. This has been tried at high-energy particle colliders, the most famous of which is CERN’s Large Hadron Collider (LHC) in Geneva, Switzerland. So far no success.

Break it: This is the “annihilation” process in which two dark matter particles meet and produce some sort of radiation. This can happen whenever dark matter is dense enough so that the probability of two dark matter particles colliding is sufficiently high. So far no success.

“It makes sense to look for dark radiation in certain places in space, where we expect it to be very dense – a lot denser than on Earth”, explains Shoemaker, adding: “If there is an abundance of dark matter in these areas, then we would expect it to annihilate and create radiation.”

None of the satellite-based experiments however have yet detected dark radiation.According to Shoemaker, Frandsen and Cherry, this could be because the experiments look for the wrong signals.

“The traditional satellite-based experiments search for photons, because they expect dark matter to annihilate into photons. But if dark matter annihilates into dark radiation then these satellite-based experiments are hopeless."

In the early days of the universe, when all matter was still extremely dense, dark matter may have collided and annihilated into radiation all the time. This happened to ordinary matter as well, so it is not unlikely that dark matter behaves the same way, the researchers argue.

Read More

New Research Asks: "Is Dark Energy a Hidden 5th Force?"

6a00d8341bf7f753ef01a3fce06a48970b

Dark energy is hiding in our midst in the form of hypothetical particles called “chameleons,” Holger Müller and his team at UC Berkeley plan to flush them out. The results of an experiment reported in this week’s issue of Science narrows the search for chameleons a thousand times compared to previous tests, and Müller, an assistant professor of physics, hopes that his next experiment will either expose chameleons or similar ultralight particles as the real dark energy, or prove they were a will-o’-the-wisp after all.

Dark energy was first discovered in 1998 when scientists observed that the universe was expanding at an ever increasing rate, apparently pushed apart by an unseen pressure permeating all of space and making up about 68 percent of the energy in the cosmos. Several UC Berkeley scientists were members of the two teams that made that Nobel Prize-winning discovery, and physicist Saul Perlmutter shared the prize.

Since then, theorists have proposed numerous theories to explain the still mysterious energy. It could be simply woven into the fabric of the universe, a cosmological constant that Albert Einstein proposed in the equations of general relativity and then disavowed. Or it could be quintessence, represented by any number of hypothetical particles, including offspring of the Higgs boson.

In 2004, theorist and co-author Justin Khoury of the University of Pennsylvania proposed one possible reason why dark energy particles haven’t been detected: they’re hiding from us.

The vacuum chamber of the atom interferometer contains a one-inch diameter aluminum sphere is shown below. If chameleons exist, cesium atoms would fall toward the sphere with a slightly greater acceleration than their gravitational attraction would predict. (Holger Muller photo) If chameleons exist, they would have a very small effect on the gravitational attraction between cesium atoms and an aluminum sphere.

Chambersphere750

Specifically, Khoury proposed that dark energy particles, which he dubbed chameleons, vary in mass depending on the density of surrounding matter.

In the emptiness of space, chameleons would have a small mass and exert force over long distances, able to push space apart. In a laboratory, however, with matter all around, they would have a large mass and extremely small reach. In physics, a low mass implies a long-range force, while a high mass implies a short-range force.

This would be one way to explain why the energy that dominates the universe is hard to detect in a lab.

“The chameleon field is light in empty space but as soon as it enters an object it becomes very heavy and so couples only to the outermost layer of a big object, and not to the internal parts,” said Müller, who is also a faculty scientist at Lawrence Berkeley National Laboratory. “It would pull only on the outermost nanometer.”

When UC Berkeley post-doctoral fellow Paul Hamilton read an article by theorist Clare Burrage last August outlining a way to detect such a particle, he suspected that the atom interferometer he and Müller had built at UC Berkeley would be able to detect chameleons if they existed. Müller and his team have built some of the most sensitive detectors of forces anywhere, using them to search for slight gravitational anomalies that would indicate a problem with Einstein’s General Theory of Relativity. While the most sensitive of these are physically too large to sense the short-range chameleon force, the team immediately realized that one of their less sensitive atom interferometers would be ideal.

Burrage suggested measuring the attraction caused by the chameleon field between an atom and a larger mass, instead of the attraction between two large masses, which would suppress the chameleon field to the point of being undetectable.

That’s what Hamilton, Müller and his team did. They dropped cesium atoms above an inch-diameter aluminum sphere and used sensitive lasers to measure the forces on the atoms as they were in free fall for about 10 to 20 milliseconds. They detected no force other than Earth’s gravity, which rules out chameleon-induced forces a million times weaker than gravity. This eliminates a large range of possible energies for the particle.

Experiments at CERN in Geneva and the Fermi National Accelerator Laboratory in Illinois, as well as other tests using neutron interferometers, also are searching for evidence of chameleons, so far without luck. Müller and his team are currently improving their experiment to rule out all other possible particle energies or, in the best-case scenario, discover evidence that chameleons really do exist.

                Atominterferometer410 (1)

New particles associated with dark energy typically imply a fifth force beyond the known strong, weak, electromagnetic and gravitational forces in the universe. In order not to conflict with known bounds on such fifth forces, a hypothetical new force would have to be camouflaged or “screened” by the matter around it – hence the name chameleon field.

“Holger has ruled out chameleons that interact with normal matter more strongly than gravity, but he is now pushing his experiment into areas where chameleons interact on the same scale as gravity, where they are more likely to exist,” Khoury said.

Their experiments may also help narrow the search for other hypothetical screened dark energy fields, such as symmetrons and forms of modified gravity, such as so-called f(R) gravity.

“In the worst case, we will learn more of what dark energy is not. Hopefully, that gives us a better idea of what it might be,” Müller said. “One day, someone will be lucky and find it.”

The work was funded by the David and Lucile Packard Foundation, the National Science Foundation and the National Aeronautics and Space Administration. Co-authors with Müller, Hamilton and Khoury are UC Berkeley physics graduate students Matt Jaffe and Quinn Simmons and post-doctoral fellow Philipp Haslinger.

Read More

Confirmed! Discovery of Cosmic Neutrinos from Beyond Our Galaxy



Researchers using the IceCube Neutrino Observatory have sorted through the billions of subatomic particles that zip through its frozen cubic-kilometer-sized detector each year to gather powerful new evidence in support of 2013 observations confirming the existence of cosmic neutrinos. The evidence is important because it heralds a new form of astronomy using neutrinos, the nearly massless high-energy particles generated in nature's accelerators: black holes, massive exploding stars and the energetic cores of galaxies.

In a new study, the detection of 21 ultra high-energy muons -- secondary particles created on the very rare occasions when neutrinos interact with other particles -- provides independent confirmation of astrophysical neutrinos from our galaxy as well as cosmic neutrinos from sources outside the Milky Way.

The observations were reported today (Aug. 20, 2015) in a paper published in the journal Physical Review Letters by the IceCube Collaboration, which called the data an "unequivocal signal" for astrophysical neutrinos, ultra high-energy particles that have traversed space unimpeded by stars, planets, galaxies, magnetic fields or clouds of interstellar dust -- phenomena that, at very high energies, significantly attenuate more mundane particles like photons.

Because they have almost no mass and no electric charge, neutrinos can be very hard to detect and are only observed indirectly when they collide with other particles to create muons, telltale secondary particles. What's more, there are different kinds of neutrinos produced in different astrophysical processes. The IceCube Collaboration, a large international consortium headquartered at the University of Wisconsin-Madison, has taken on the huge challenge of sifting through a mass of observations to identify perhaps a few dozen of the highest-energy neutrinos that have traveled from sources in the Milky Way and beyond our galaxy.

Those high-energy neutrinos, scientists believe, are created deep inside some of the universe's most violent phenomena. The particles created in these events, including neutrinos and cosmic rays, are accelerated to energy levels that exceed the record-setting earthbound accelerators such as the Large Hadron Collider (LHC) by a factor of more than a million. They are prized by astrophysicists because the information they hold is pristine, unchanged as the particles travel millions of light years between their sources and Earth. The ability to study the highest-energy neutrinos promises insight into a host of problems in physics, including how nature builds powerful and efficient particle accelerators in the universe.
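The "factor of more than a million" is easy to place on an energy scale. Taking the LHC's design collision energy of roughly 13 TeV (its circa-2015 value) as the benchmark, a million-fold boost lands at about 10^19 eV, the regime of ultra-high-energy cosmic rays. A trivial sketch:

```python
# Order-of-magnitude check: cosmic accelerators vs. the LHC.
lhc_collision_ev = 13.0e12   # ~13 TeV, the LHC's design collision energy (circa 2015)
boost = 1.0e6                # "a factor of more than a million"

cosmic_ev = lhc_collision_ev * boost
print(f"{cosmic_ev:.1e} eV")  # ~1.3e+19 eV, the ultra-high-energy cosmic-ray regime
```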

The latest observations were made by pointing the IceCube Observatory -- composed of thousands of optical sensors sunk deep beneath the Antarctic ice at the South Pole -- through the Earth to observe the Northern Hemisphere sky. The Earth serves as a filter to help weed out a confusing background of muons created when cosmic rays crash into the Earth's atmosphere.

"Looking for muon neutrinos reaching the detector through the Earth is the way IceCube was supposed to do neutrino astronomy and it has delivered," explains Francis Halzen, a UW-Madison professor of physics and the principal investigator of IceCube. "This is as close to independent confirmation as one can get with a unique instrument."

Between May 2010 and May 2012, IceCube recorded more than 35,000 neutrinos. However, only about 20 of those neutrino events were clocked at energy levels indicative of astrophysical or cosmic sources.

The results are meaningful because, using a different technique, they reaffirm the IceCube Observatory's ability to sample the ghostlike neutrinos. By instrumenting a cubic kilometer of deep Antarctic ice, scientists were able to make a detector big enough to capture the signature of rare neutrino collisions. When such a rare smashup occurs, it creates a muon, which, in turn, leaves a trail of Cherenkov light that faithfully mirrors the trajectory of the neutrino. The "optical sonic booms" created when neutrinos smash into another particle are sensed by the optical sensors that make up the IceCube detector array and, in theory, can be used to point back to a source.
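The pointing geometry rests on the standard Cherenkov relation cos θ = 1/(nβ): a charged particle moving faster than the phase velocity of light in a medium radiates on a cone at a fixed angle to its track. A minimal sketch, assuming the commonly quoted refractive index n ≈ 1.31 for deep Antarctic ice:

```python
import math

def cherenkov_angle_deg(n, beta=1.0):
    """Cherenkov emission angle (degrees) for refractive index n and speed beta = v/c."""
    if n * beta <= 1.0:
        raise ValueError("no Cherenkov light: particle is below the phase velocity of light")
    return math.degrees(math.acos(1.0 / (n * beta)))

# Deep Antarctic ice (n ~ 1.31, an assumed textbook value); ultra-relativistic muon (beta ~ 1):
print(round(cherenkov_angle_deg(1.31), 1))  # ~40 degrees from the muon track
```

Because the cone angle is fixed, the arrival times of the light across many sensors determine the muon's track direction, and hence the neutrino's.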

"This is an excellent confirmation of IceCube's recent discoveries, opening the doors to a new era in particle physics," says Vladimir Papitashvili, astrophysics and geospace sciences program director in the National Science Foundation's (NSF) Division of Polar Programs. "And it became possible only because of extraordinary qualities of Antarctic ice and NSF's ability to successfully tackle enormous scientific and logistical problems in the most inhospitable places on Earth."

But while the new observations confirm the existence of astrophysical neutrinos and the means to detect them using the IceCube Observatory, actual point sources of high-energy neutrinos remain to be identified.

Albrecht Karle, a UW-Madison professor of physics and a senior author of the Physical Review Letters report, notes that while the neutrino-induced tracks recorded by the IceCube detector have a good pointing resolution, within less than a degree, the IceCube team has not observed a significant number of neutrinos emanating from any single source.

The neutrinos observed in the latest search, however, have energy levels identical to those seen when the observatory sampled the sky of the Southern Hemisphere. That, says Karle, suggests that many of the potential sources of the highest-energy neutrinos are generated beyond the Milky Way. If there were a significant number of sources in our own galaxy, he notes, the IceCube detector would light up when observing the plane of our galaxy -- the region where most neutrino-generating sources would likely be found.

"The plane of the galaxy is where the stars are. It is where cosmic rays are accelerated, so you would expect to see more sources there. But the highest-energy neutrinos we've observed come from random directions," says Karle, whose former graduate student, Chris Weaver, is the corresponding author of the new study. "It is sound confirmation that the discovery of cosmic neutrinos from beyond our galaxy is real."

IceCube is based at the Wisconsin IceCube Particle Astrophysics Center (WIPAC) at UW-Madison. The observatory was built with major support from the National Science Foundation as well as support from partner funding agencies worldwide. More than 300 physicists and engineers from the United States, Germany, Sweden, Belgium, Switzerland, Japan, Canada, New Zealand, Australia, the United Kingdom, Korea and Denmark are involved in the project.


Saturday 22 August 2015

Beyond the Quantum --"Will the Discovery of Dark Matter Revolutionize Our World?"


Physicists suggest a new way to look for dark matter: they believe that dark matter particles annihilate into so-called dark radiation when they collide. If true, then we should be able to detect the signals from this radiation. The majority of the mass in the Universe remains unknown. Despite knowing very little about this dark matter, its overall abundance is precisely measured. In other words: Physicists know it is out there, but they have not yet detected it.

It is definitely worth looking for, argues Ian Shoemaker, former postdoctoral researcher at Centre for Cosmology and Particle Physics Phenomenology (CP3), Department of Physics, Chemistry and Pharmacy, University of Southern Denmark, now at Penn State. “There is no way of predicting what we can do with dark matter, if we detect it. But it might revolutionize our world. When scientists discovered quantum mechanics, it was considered a curiosity. Today quantum mechanics plays an important role in computers”, he says.

Ever since dark matter was first theorized there have been many attempts to look for it, and now Ian Shoemaker and fellow scientists, Associate Professor Mads Toudal Frandsen, CP3, and John F. Cherry, postdoctoral researcher from Los Alamos National Laboratory, USA, suggest a new approach. They present their work in the journal Physical Review Letters.

On Earth, several detectors are placed in underground cavities, where background noise is minimized. The hope is that one of these detectors will one day catch a dark matter particle passing through Earth. According to Ian Shoemaker, it is possible that this might happen, but given how little we know about dark matter we should keep an open mind and explore all paths that could lead to its detection.

One reason for this is that dark matter is not very dense in our part of the universe.

“If we add another way of looking for dark matter – the way we suggest – then we will increase our chances of detecting dark matter in our underground cavities”, says Shoemaker, who now suggests looking for the signs of dark matter activity rather than the dark matter particles themselves.

The accompanying photo shows the Large Underground Xenon (LUX) experiment, located in a former mine almost 1,500 m underground in South Dakota, USA. Credit: Matt Kapust, Sanford Underground Research Facility.

The researchers believe that when two dark matter particles meet, they will behave just like ordinary particles: they will annihilate and create radiation in the process. In this case the radiation is called dark radiation, and it may be detected by the existing underground detectors.

“Underground detection experiments may be able to detect the signals created by dark radiation”, Shoemaker says.

The researchers have found that the Large Underground Xenon (LUX) experiment is in fact already sensitive to this signal and, with future data, can confirm or exclude their hypothesis for dark matter’s origin.

Don't forget to look in the Milky Way, too. The attempt to catch signals from dark radiation is not a new idea – it is already being pursued at several places in space with satellite-based experiments. These places include the center of our galaxy, the Milky Way; the Sun may also be such a region.

Physicists have three ways to try to detect dark matter:

Make it: Slam ordinary matter together and produce dark matter. This has been tried at high-energy particle colliders, the most famous of which is CERN’s Large Hadron Collider (LHC) in Geneva, Switzerland. So far no success.

Break it: This is the “annihilation” process in which two dark matter particles meet and produce some sort of radiation. This can happen wherever dark matter is dense enough that the probability of two dark matter particles colliding is sufficiently high. So far no success.

Catch it: Wait for a dark matter particle passing through Earth to scatter in an underground detector – the direct-detection approach described above. So far no success.

“It makes sense to look for dark radiation in certain places in space, where we expect it to be very dense – a lot denser than on Earth”, explains Shoemaker, adding: “If there is an abundance of dark matter in these areas, then we would expect it to annihilate and create radiation.”
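The logic in the quote can be made quantitative: for identical particles the annihilation rate per unit volume goes as R = ½⟨σv⟩n², so doubling the density quadruples the rate. A minimal sketch with purely illustrative numbers (the cross-section and densities below are placeholders, not measured values):

```python
def annihilation_rate(n, sigma_v=1e-26):
    """Annihilations per unit volume per second for number density n.

    The factor 1/2 avoids double-counting pairs of identical particles.
    sigma_v is an illustrative velocity-averaged cross-section, not a measurement.
    """
    return 0.5 * sigma_v * n ** 2

sparse = annihilation_rate(1.0)   # a low-density region, e.g. near Earth
dense = annihilation_rate(10.0)   # a region ten times denser, e.g. a galactic center

print(dense / sparse)  # -> 100.0: ten times the density, a hundred times the rate
```

The quadratic scaling is why searches target the densest regions of space rather than our own neighborhood.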

None of the satellite-based experiments, however, have yet detected dark radiation. According to Shoemaker, Frandsen and Cherry, this could be because the experiments look for the wrong signals.

“The traditional satellite-based experiments search for photons, because they expect dark matter to annihilate into photons. But if dark matter annihilates into dark radiation then these satellite-based experiments are hopeless."

In the early days of the universe, when all matter was still extremely dense, dark matter may have collided and annihilated into radiation all the time. This happened to ordinary matter as well, so it is not unlikely that dark matter behaves the same way, the researchers argue.


Friday 21 August 2015

New Research Asks: "Is Dark Energy a Hidden 5th Force?"


If dark energy is hiding in our midst in the form of hypothetical particles called “chameleons,” Holger Müller and his team at UC Berkeley plan to flush them out. The results of an experiment reported in this week’s issue of Science narrow the search for chameleons a thousandfold compared to previous tests, and Müller, an assistant professor of physics, hopes that his next experiment will either expose chameleons or similar ultralight particles as the real dark energy, or prove they were a will-o’-the-wisp after all.

Dark energy was first discovered in 1998 when scientists observed that the universe was expanding at an ever-increasing rate, apparently pushed apart by an unseen pressure permeating all of space and making up about 68 percent of the energy in the cosmos. Several UC Berkeley scientists were members of the two teams that made that Nobel Prize-winning discovery, and physicist Saul Perlmutter shared the prize.

Since then, theorists have proposed numerous explanations for the still-mysterious energy. It could be simply woven into the fabric of the universe, a cosmological constant that Albert Einstein proposed in the equations of general relativity and then disavowed. Or it could be quintessence, represented by any number of hypothetical particles, including offspring of the Higgs boson.

In 2004, theorist and co-author Justin Khoury of the University of Pennsylvania proposed one possible reason why dark energy particles haven’t been detected: they’re hiding from us.

The vacuum chamber of the atom interferometer, shown in the accompanying photo, contains a one-inch-diameter aluminum sphere. If chameleons exist, they would have a very small effect on the gravitational attraction between cesium atoms and the sphere: the atoms would fall toward it with a slightly greater acceleration than gravity alone would predict. (Holger Müller photo)

Specifically, Khoury proposed that dark energy particles, which he dubbed chameleons, vary in mass depending on the density of surrounding matter.

In the emptiness of space, chameleons would have a small mass and exert force over long distances, able to push space apart. In a laboratory, however, with matter all around, they would have a large mass and extremely small reach. In physics, a low mass implies a long-range force, while a high mass implies a short-range force.
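The mass–range trade-off described here is the standard Yukawa result: a force carried by a particle of mass m has characteristic range λ = ħ/(mc), the mediator's reduced Compton wavelength. A quick sketch (the two masses below are illustrative, not proposed chameleon values):

```python
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 2.99792458e8         # speed of light, m/s
EV = 1.602176634e-19     # joules per electron-volt

def force_range_m(mass_ev):
    """Characteristic range (meters) of a force mediated by a particle of mass mass_ev (eV/c^2)."""
    mass_kg = mass_ev * EV / C ** 2
    return HBAR / (mass_kg * C)

# Illustrative masses: "heavy" in the lab vs. ultralight in empty space.
print(f"{force_range_m(1e-3):.1e} m")   # ~2e-04 m: sub-millimeter reach
print(f"{force_range_m(1e-30):.1e} m")  # ~2e+23 m: cosmological reach
```

The 27-order-of-magnitude swing in range for a 27-order-of-magnitude swing in mass is what lets one and the same field push space apart in vacuum yet hide in a laboratory.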

This would be one way to explain why the energy that dominates the universe is hard to detect in a lab.

“The chameleon field is light in empty space but as soon as it enters an object it becomes very heavy and so couples only to the outermost layer of a big object, and not to the internal parts,” said Müller, who is also a faculty scientist at Lawrence Berkeley National Laboratory. “It would pull only on the outermost nanometer.”

When UC Berkeley post-doctoral fellow Paul Hamilton read an article by theorist Clare Burrage last August outlining a way to detect such a particle, he suspected that the atom interferometer he and Müller had built at UC Berkeley would be able to detect chameleons if they existed. Müller and his team have built some of the most sensitive detectors of forces anywhere, using them to search for slight gravitational anomalies that would indicate a problem with Einstein’s General Theory of Relativity. While the most sensitive of these are physically too large to sense the short-range chameleon force, the team immediately realized that one of their less sensitive atom interferometers would be ideal.

Burrage suggested measuring the attraction caused by the chameleon field between an atom and a larger mass, instead of the attraction between two large masses, which would suppress the chameleon field to the point of being undetectable.

That’s what Hamilton, Müller and his team did. They dropped cesium atoms above an inch-diameter aluminum sphere and used sensitive lasers to measure the forces on the atoms as they were in free fall for about 10 to 20 milliseconds. They detected no force other than Earth’s gravity, which rules out chameleon-induced forces a million times weaker than gravity. This eliminates a large range of possible energies for the particle.
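The quoted 10-to-20-millisecond free-fall window can be checked with elementary kinematics, d = ½gt²: in that time a cesium atom falls only a millimeter or two, so it stays within reach of any short-range force near the sphere. A back-of-the-envelope sketch:

```python
G = 9.81  # m/s^2, standard gravitational acceleration

def fall_distance_mm(t_seconds):
    """Distance (mm) fallen from rest under gravity alone in t_seconds."""
    return 0.5 * G * t_seconds ** 2 * 1000.0

for t_ms in (10, 20):
    print(f"{t_ms} ms -> {fall_distance_mm(t_ms / 1000.0):.2f} mm of free fall")
# 10 ms -> ~0.5 mm; 20 ms -> ~2 mm
```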

Experiments at CERN in Geneva and the Fermi National Accelerator Laboratory in Illinois, as well as other tests using neutron interferometers, also are searching for evidence of chameleons, so far without luck. Müller and his team are currently improving their experiment to rule out all other possible particle energies or, in the best-case scenario, discover evidence that chameleons really do exist.


New particles associated with dark energy typically imply a fifth force beyond the known strong, weak, electromagnetic and gravitational forces in the universe. In order not to conflict with known bounds on such fifth forces, a hypothetical new force would have to be camouflaged or “screened” by the matter around it – hence the name chameleon field.

“Holger has ruled out chameleons that interact with normal matter more strongly than gravity, but he is now pushing his experiment into areas where chameleons interact on the same scale as gravity, where they are more likely to exist,” Khoury said.

Their experiments may also help narrow the search for other hypothetical screened dark energy fields, such as symmetrons, and for forms of modified gravity such as so-called f(R) gravity.

“In the worst case, we will learn more of what dark energy is not. Hopefully, that gives us a better idea of what it might be,” Müller said. “One day, someone will be lucky and find it.”

The work was funded by the David and Lucile Packard Foundation, the National Science Foundation and the National Aeronautics and Space Administration. Co-authors with Müller, Hamilton and Khoury are UC Berkeley physics graduate students Matt Jaffe and Quinn Simmons and post-doctoral fellow Philipp Haslinger.

