Articles on this Page
- 04/07/15--07:26: Large Hadron Collider restarts after two years
- 05/04/15--16:01: Astronomers find fi...
- 05/13/15--10:00: Cause of galactic death: strangulation
- 06/19/15--05:44: Masters of the universe
- 07/02/15--05:38: Three Cambridge pro...
- 07/02/15--11:00: To conduct, or to insulate? That is the question
- 07/20/15--06:44: Cambridge scientists receive Royal Society awards
- 07/22/15--02:55: Astronomers witness...
- 08/12/15--01:43: K is for Kingfisher
- 08/31/15--21:04: Scientists "squeeze" light one particle at a time
- 09/04/15--01:14: Post-16 education m...
- 09/17/15--20:09: Winton Symposium on green computing
- 09/18/15--02:34: Isaac Physics project makes awards shortlist
- 06/22/15--17:26: Kamerlingh Onnes prize
- 10/26/15--09:15: Entanglement at hea...
- 10/30/15--04:30: Mirage maker
- 11/02/15--08:55: First evidence of ‘ghost particles’
- 11/03/15--03:38: Opinion: Girls can ...
- 11/06/15--03:18: Cambridge researchers awarded Philip Leverhulme Prizes for 2015
- 11/06/15--07:50: Graphene means busi...
Early on Easter Sunday, the Large Hadron Collider’s second run got underway, as proton beams circulated in the 27-kilometre ring for the first time in two years. Over the coming weeks, the beams will be accelerated to speeds close to the speed of light, and the machine will run at an unprecedented energy of 13 teraelectronvolts (TeV), well above the 8 TeV of the last run, during which the long-sought Higgs boson was discovered in 2012.
The new run will subject the Standard Model of particle physics to its toughest tests yet, and may help identify some of the fundamental forces of nature that the Standard Model does not include. With 13 TeV proton-proton collisions expected before summer, the LHC experiments will soon be exploring uncharted territory in particle physics.
Cambridge researchers at CERN are playing a major part in preparing the ATLAS detector – the largest of the LHC’s seven particle detectors – for action, with newly upgraded systems ready to go to work as soon as the beams start to collide. All the preparations are in place to begin analysing the data, and if all goes well early results could be expected before the end of the year.
“The current Standard Model explains the known particles and forces, and the discovery of the Higgs completed that picture,” said Professor Andy Parker, Head of the Cavendish Laboratory at the University of Cambridge, and one of the founders of ATLAS. “But the Standard Model does not explain dark matter, which is believed to make up most of the Universe, nor dark energy, a mysterious force driving the galaxies ever further apart.”
The answers to these problems in cosmology might lie in the realm of sub-atomic physics studied at CERN. For example, the LHC might be able to produce dark matter particles, which would be glimpsed in the debris of collisions detected by the ATLAS and CMS experiments.
“Even more exciting is the possibility that the Universe could have more than three space dimensions, and that other spaces are hidden all around us,” said Parker. “This could also be revealed at CERN by the production and decay of microscopic quantum black holes, a particular interest of the Cambridge researchers at CERN. Detailed studies of the Higgs boson are also going to test our understanding of the Standard Model, with any unexpected effects leading us towards new physics. The upgrade of the LHC will allow scientists to search for new discoveries which have so far been out of reach.”
The upgrade was a Herculean task. Some 10,000 electrical interconnections between the LHC’s superconducting dipole magnets were consolidated. Magnet protection systems were added, while the cryogenic, vacuum and electronic systems were improved and strengthened. Additionally, the beams will be set up to produce more collisions by bunching protons closer together, with the time separating bunches reduced from 50 nanoseconds to 25 nanoseconds.
After the discovery of the Higgs boson in 2012 by the ATLAS and CMS experiments, physicists will be putting the Standard Model of particle physics to its most stringent test yet, searching for new physics beyond this well-established theory describing particles and their interactions.
With superconducting magnets cooled to the extreme temperature of -271°C, the LHC is capable of simultaneously circulating particles in opposite directions, in tubes under ultrahigh vacuum, at a speed close to that of light. Gigantic particle detectors, located at four interaction points along the ring, record collisions generated when the beams collide.
In routine operation, protons cover some 11,245 laps of the LHC per second, producing up to 1 billion collisions per second. The CERN computing centre stores over 30 petabytes of data from the LHC experiments every year, the equivalent of 1.2 million Blu-ray discs.
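The figures quoted above are easy to sanity-check. A quick sketch, in which the ring circumference and single-layer Blu-ray capacity are assumed reference values rather than figures from the article:

```python
# Sanity-checking the LHC figures quoted above. The ring circumference and
# Blu-ray capacity are assumed reference values, not from the article.

RING_CIRCUMFERENCE_M = 26_659   # LHC ring circumference, metres (assumed)
LAPS_PER_SECOND = 11_245        # laps a proton completes each second
SPEED_OF_LIGHT = 2.998e8        # m/s

proton_speed = RING_CIRCUMFERENCE_M * LAPS_PER_SECOND
print(f"proton speed: {proton_speed:.4e} m/s, "
      f"{proton_speed / SPEED_OF_LIGHT:.5f} of the speed of light")

PETABYTE = 1e15                 # bytes, decimal convention
BLU_RAY_BYTES = 25e9            # single-layer disc capacity, bytes (assumed)
discs = 30 * PETABYTE / BLU_RAY_BYTES
print(f"30 PB per year is about {discs / 1e6:.1f} million Blu-ray discs")
```

Both numbers hold together: 11,245 laps of a 27-kilometre ring per second is essentially the speed of light, and 30 petabytes at 25 gigabytes per disc is the quoted 1.2 million discs.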
After two years of intense maintenance and consolidation, and several months of preparation for restart, the Large Hadron Collider, the most powerful particle accelerator in the world, is back in operation after a major upgrade.
For the first time, researchers led by the University of Cambridge have detected atmospheric variability on a rocky planet outside the solar system, and observed a nearly threefold change in temperature over a two-year period. Although the researchers are quick to point out that the cause of the variability is still under investigation, they believe the readings could be due to massive amounts of volcanic activity on the surface. The ability to peek into the atmospheres of rocky ‘super Earths’ and observe conditions on their surfaces marks an important milestone towards identifying habitable planets outside the solar system.
Using NASA’s Spitzer Space Telescope, the researchers observed thermal emissions coming from the planet, called 55 Cancri e – orbiting a sun-like star located 40 light years away in the Cancer constellation – and for the first time found rapidly changing conditions, with temperatures on the hot ‘day’ side of the planet swinging between 1000 and 2700 degrees Celsius.
“This is the first time we’ve seen such drastic changes in light emitted from an exoplanet, which is particularly remarkable for a super Earth,” said Dr Nikku Madhusudhan of Cambridge’s Institute of Astronomy, a co-author on the new study. “No signature of thermal emissions or surface activity has ever been detected for any other super Earth to date.”
Although the interpretations of the new data are still preliminary, the researchers believe the variability in temperature could be due to huge plumes of gas and dust which occasionally blanket the surface, which may be partially molten. The plumes could be caused by exceptionally high rates of volcanic activity, higher than what has been observed on Io, one of Jupiter’s moons and the most geologically active body in the solar system.
“We saw a 300 percent change in the signal coming from this planet, which is the first time we’ve seen such a huge level of variability in an exoplanet,” said Dr Brice-Olivier Demory of the University’s Cavendish Laboratory, lead author of the new study. “While we can’t be entirely sure, we think a likely explanation for this variability is large-scale surface activity, possibly volcanism, which is spewing out massive volumes of gas and dust that sometimes blanket the thermal emission from the planet so it is not seen from Earth.”
55 Cancri e is a ‘super Earth’: a rocky exoplanet about twice the size and eight times the mass of Earth. It is one of five planets orbiting a sun-like star in the Cancer constellation, and resides so close to its parent star that a year lasts just 18 hours. The planet is also tidally locked, meaning that it doesn’t rotate like the Earth does – instead there is a permanent ‘day’ side and a ‘night’ side. Since it is the nearest super Earth whose atmosphere can be studied, 55 Cancri e is among the best candidates for detailed observations of surface and atmospheric conditions on rocky exoplanets.
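The 18-hour year pins down how close the planet sits to its star. A rough estimate via Kepler's third law, where the stellar mass of 0.9 solar masses is an assumed value for the host star, not a figure from the article:

```python
import math

# Rough orbital distance for 55 Cancri e from its 18-hour year, via
# Kepler's third law: a^3 = G * M * T^2 / (4 * pi^2).
# The stellar mass below is an assumed value, not taken from the article.

G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30        # solar mass, kg
AU = 1.496e11           # astronomical unit, m

M_star = 0.9 * M_SUN    # assumed mass of the host star
T = 18 * 3600           # orbital period: 18 hours in seconds

a = (G * M_star * T**2 / (4 * math.pi**2)) ** (1 / 3)
print(f"orbital radius of roughly {a / AU:.4f} AU")
```

The answer comes out at a few hundredths of an astronomical unit, dozens of times closer to its star than Mercury is to the Sun, which is why the day side is hot enough to melt rock.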
Most of the early research on exoplanets has been on gas giants similar to Jupiter and Saturn, since their enormous size makes them easier to find. In recent years, astronomers have been able to map the conditions on many of these gas giants, but it is much more difficult to do so for super Earths: exoplanets with masses between one and ten times the mass of Earth.
Earlier observations of 55 Cancri e pointed to an abundance of carbon, suggesting that the planet was composed of diamond. However, these new results have considerably complicated that interpretation and opened up new questions.
“When we first identified this planet, the measurements supported a carbon-rich model,” said Madhusudhan, who along with Demory is a member of the Cambridge Exoplanet Research Centre. “But now we’re finding that those measurements are changing in time. The planet could still be carbon rich, but now we’re not so sure – earlier studies of this planet have even suggested that it could be a water world. The present variability is something we’ve never seen anywhere else, so there’s no robust conventional explanation. But that’s the fun in science – clues can come from unexpected quarters. The present observations open a new chapter in our ability to study the conditions on rocky exoplanets using current and upcoming large telescopes.”
The results have been published online today.
The study was also co-authored by Professor Didier Queloz of the Cavendish Laboratory and Dr Michaël Gillon of the Université of Liège.
Astronomers have detected wildly changing temperatures on a super Earth – the first time any atmospheric variability has been observed on a rocky planet outside the solar system – and believe it could be due to huge amounts of volcanic activity, further adding to the mystery of what had been nicknamed the ‘diamond planet’.
As murder mysteries go, it’s a big one: how do galaxies die and what kills them? A new study, published today in the journal Nature, has found that the primary cause of galactic death is strangulation, which occurs after galaxies are cut off from the raw materials needed to make new stars.
Researchers from the University of Cambridge and the Royal Observatory Edinburgh have found that levels of metals contained in dead galaxies provide key ‘fingerprints’, making it possible to determine the cause of death.
There are two types of galaxies in the Universe: roughly half are ‘alive’ galaxies which produce stars, and the other half are ‘dead’ ones which don’t. Alive galaxies such as our own Milky Way are rich in the cold gas – mostly hydrogen – needed to produce new stars, while dead galaxies have very low supplies. What had been unknown is what’s responsible for killing the dead ones.
Astronomers have come up with two main hypotheses for galactic death: either the cold gas needed to produce new stars is suddenly ‘sucked’ out of the galaxies by internal or external forces, or the supply of incoming cold gas is somehow stopped, slowly strangling the galaxy to death over a prolonged period of time.
In order to get to the bottom of this mystery, the team used data from the Sloan Digital Sky Survey to analyse metal levels in more than 26,000 average-sized galaxies located in our corner of the universe.
“Metals are a powerful tracer of the history of star formation: the more stars that are formed by a galaxy, the more metal content you’ll see,” said Dr Yingjie Peng of Cambridge’s Cavendish Laboratory and Kavli Institute of Cosmology, and the paper’s lead author. “So looking at levels of metals in dead galaxies should be able to tell us how they died.”
If galaxies are killed by outflows suddenly pulling the cold gas out of the galaxies, then the metal content of a dead galaxy should be the same as just before it died, as star formation would abruptly stop.
In the case of death by strangulation however, the metal content of the galaxy would keep rising and eventually stop, as star formation could continue until the existing cold gas gets completely used up.
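The two metallicity signatures described above can be sketched with a toy closed-box model. This is an illustration, not the paper's calculation: the yield, star-formation efficiency and timestep values are arbitrary assumptions chosen only to show the qualitative difference.

```python
# Toy model contrasting the two quenching scenarios. All numbers below are
# illustrative assumptions, not measurements from the study.

YIELD = 0.02   # metal mass returned per unit mass of stars formed (assumed)
EFF = 0.1      # fraction of the cold gas turned into stars per step (assumed)

def final_stellar_metallicity(mode, quench_step, steps=150,
                              gas=1.0, inflow=0.05):
    """Evolve a galaxy and return the mean metallicity of its stars."""
    metals = 0.0                      # metal mass held in the cold gas
    star_mass = star_metals = 0.0
    for step in range(steps):
        if mode == "outflow" and step == quench_step:
            gas = metals = 0.0        # sudden removal: cold gas stripped away
        z_gas = metals / gas if gas > 0 else 0.0
        formed = EFF * gas            # stars formed from the cold gas
        star_mass += formed
        star_metals += z_gas * formed
        metals += (YIELD - z_gas) * formed  # fresh metals returned to the gas
        gas -= formed
        if step < quench_step:        # strangulation: inflow simply stops here
            gas += inflow             # pristine inflow dilutes the metallicity
    return star_metals / star_mass

strangled = final_stellar_metallicity("strangulation", quench_step=100)
stripped = final_stellar_metallicity("outflow", quench_step=100)
print(f"strangled: {strangled:.4f}  stripped: {stripped:.4f}")
```

The strangled galaxy keeps forming stars from ever more enriched gas after its supply is cut, so it ends up with the higher stellar metallicity, which is the fingerprint the researchers looked for.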
While it is not possible to analyse individual galaxies due to the massive timescales involved, by statistically investigating the difference of metal content of alive and dead galaxies, the researchers were able to determine the cause of death for most galaxies of average size.
“We found that for a given stellar mass, the metal content of a dead galaxy is significantly higher than a star-forming galaxy of similar mass,” said Professor Roberto Maiolino, co-author of the new study. “This isn’t what we’d expect to see in the case of sudden gas removal, but it is consistent with the strangulation scenario.”
The researchers were then able to independently test their results by looking at the stellar age difference between star-forming and dead galaxies, independent of metal levels, and found an average age difference of four billion years – this is in agreement with the time it would take for a star-forming galaxy to be strangled to death, as inferred from the metallicity analysis.
“This is the first conclusive evidence that galaxies are being strangled to death,” said Peng. “What’s next though, is figuring out what’s causing it. In essence, we know the cause of death, but we don’t yet know who the murderer is, although there are a few suspects.”
Astronomers have partially solved an epic whodunit: what kills galaxies so that they can no longer produce new stars?
Imagine having to design a completely automated system that could take all of the live video from all of the hundreds of thousands of cameras monitoring London, and automatically dispatch an ambulance any time any person falls and hurts themselves, anywhere in the city, without any human intervention whatsoever. That is the scale of the problem facing the team designing the software and computing behind the world’s largest radio telescope.
When it becomes operational in 2023, the Square Kilometre Array (SKA) will probe the origins, evolution and expansion of our universe; test one of the world’s most famous scientific theories; and perhaps even answer the greatest mystery of all — are we alone?
Construction on the massive international project, which involves and is funded by 11 different countries and 100 organisations, will start in 2018. When complete, it will be able to map the sky in unprecedented detail — 10,000 times faster and 50 times more sensitively than any existing radio telescope — and detect extremely weak extraterrestrial signals, greatly expanding our ability to search for planets capable of supporting life.
The SKA will be co-located in South Africa and Australia, where radio interference is least and views of our galaxy are best. The instrument itself will be made up of thousands of dishes that can operate as one gigantic telescope or as multiple smaller telescopes – a technique known as astronomical interferometry, which was developed in Cambridge by Sir Martin Ryle almost 70 years ago.
“The SKA is one of the major big data challenges in science,” explains Professor Paul Alexander, who leads the Science Data Processor (SDP) consortium, which is responsible for designing all of the software and computing for the telescope. In 2013, the University’s High Performance Computing Service unveiled ‘Wilkes’ — one of the world’s greenest supercomputers with the computing power of 4,000 desktop machines running at once, and a key test-bed for the development of the SKA computing platform.
During its projected 50-year lifespan, the SKA will carry out several experiments to study the nature of the universe. Cambridge researchers will focus on two of these, the first of which will follow hydrogen through billions of years of cosmic time.
“Hydrogen is the raw material from which everything in the universe developed,” says Alexander. “Everything we can see in the universe and everything that we’re made from started out in the form of hydrogen and a small amount of helium. What we want to do is to figure out how that happened.”
The second of the two experiments will look at pulsars — spinning neutron stars that emit short, quick pulses of radiation. Since the radiation is emitted at regular intervals, pulsars also turn out to be extremely accurate natural clocks, and can be used to test our understanding of space, time and gravity, as proposed by Einstein in his general theory of relativity.
By tracking a pulsar as it orbits a black hole, the telescope will be able to examine general relativity to its absolute limits. As the pulsar moves around the black hole, the SKA will follow how the clock behaves in the very strong gravitational field.
“General relativity tells us that massive objects like black holes warp the space–time around them, and what we call gravity is the effect of that warp,” says Alexander. “This experiment will enable us to test our theory of gravity with much greater precision than ever before, and perhaps even show that our current theories need to be changed.”
Although the SKA experiments will tell us much more than we currently know about the nature of the universe, they also present a massive computing challenge. At any one time, the amount of data gathered from the telescope will be equivalent to five times the global internet traffic, and the SKA’s software must process that vast stream of data quickly enough to keep up with what the telescope is doing.
Moreover, the software also needs to grow and adapt along with the project. The first phase of the SKA will comprise just 10% of the telescope’s total area. Each time the number of dishes on the ground doubles, the computing load more than quadruples, meaning that the computing power required for the completed telescope will be more than 100 times what is required for phase one.
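The steep scaling comes from interferometry itself: the correlator must combine every pair of dishes, so the work grows roughly with the square of the dish count. A minimal illustration, with arbitrary dish counts rather than actual SKA configurations:

```python
# An interferometer correlates every pair of dishes, so the correlator
# workload grows roughly as the square of the dish count.
# (Dish counts below are arbitrary illustrations, not SKA configurations.)

def baselines(n_dishes: int) -> int:
    """Number of dish pairs (baselines) the correlator must handle."""
    return n_dishes * (n_dishes - 1) // 2

for n in (200, 400, 800):
    print(f"{n:4d} dishes -> {baselines(n):7d} baselines")

ratio = baselines(400) / baselines(200)
print(f"doubling 200 -> 400 dishes multiplies the correlations by {ratio:.2f}")
```

Doubling the dish count slightly more than quadruples the pairwise correlations, and the downstream imaging and calibration costs grow on top of that.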
“You can always solve a problem by throwing more and more money and computing power at it,” says Alexander. “We have to make it work sensibly as a single system that is completely automated and capable of learning over time what the best way of getting rid of bad data is. At the moment, scientists tend to look at data but we can’t do that with the SKA, because the volumes are just too large.”
The challenges faced by the SKA team echo those faced in many different fields, and so Alexander’s group is working closely with industrial partners such as Intel and NVIDIA, as well as with academic and funding partners including the Universities of Manchester and Oxford, and the Science and Technology Facilities Council. The big data solutions developed by the SKA partners to solve the challenges faced by a massive radio telescope can then be applied across a range of industries.
One of these challenges is how to process data efficiently and affordably, and convert it into images of the sky. The target for the first phase of the project is a 300 ‘petaflop’ computer that uses no more than eight megawatts of power: more than 10 times the performance of the world’s current fastest supercomputer, for the same amount of energy. ‘Flops’ (floating point operations per second) are a standard measure of computing performance, and one petaflop is equivalent to a million billion calculations per second.
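Those two targets together imply a demanding energy efficiency. A quick calculation; the 2015 supercomputer figures used for comparison are assumptions drawn from public TOP500 data, not from the article:

```python
# The energy-efficiency target implied by "300 petaflops in eight megawatts".
PETAFLOP = 1e15                  # floating-point operations per second

target_flops = 300 * PETAFLOP    # SDP phase-one compute target
power_budget_w = 8e6             # eight megawatts

gflops_per_watt = target_flops / power_budget_w / 1e9
print(f"required efficiency: {gflops_per_watt:.1f} GFLOPS per watt")

# For scale: Tianhe-2, the fastest machine on the 2015 TOP500 list, delivered
# about 33.9 petaflops at about 17.8 MW (public TOP500 figures, not from
# the article).
tianhe2_gflops_per_watt = 33.9 * PETAFLOP / 17.8e6 / 1e9
print(f"2015 state of the art: about {tianhe2_gflops_per_watt:.1f} GFLOPS per watt")
```

The target works out to 37.5 gigaflops per watt, roughly twenty times the energy efficiency of the fastest supercomputer of the day.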
“The investment in the software behind the SKA is as much as €50 million,” adds Alexander. “And if our system isn’t able to grow and adapt, we’d be throwing that investment away, which is the same problem as anyone in this area faces. We want the solutions we’re developing for understanding the most massive objects in the universe to be applied to any number of the big data challenges that society will face in the years to come.”
Inset images: Artist's impression of the SKA, which will be made up of thousands of dishes that operate as one gigantic telescope (SKA Organisation); Professor Paul Alexander (University of Cambridge).
The ‘world’s largest IT project’ — a system with the power of one hundred million home computers — may help to unravel many of the mysteries of our universe: how it began, how it developed and whether humanity is alone in the cosmos.
Professor John Barrow, Professor Judith Driscoll and Professor Henning Sirringhaus have been awarded medals in the 2015 Institute of Physics awards.
Three Cambridge professors have been awarded medals in the 2015 Institute of Physics awards.
Professor John Barrow (Department of Applied Mathematics and Theoretical Physics) has been awarded the Dirac Medal for his combination of mathematical and physical reasoning to increase our understanding of the evolution of the universe, and his use of cosmology to increase our understanding of fundamental physics.
Professor Barrow is a highly original scientist whose work is concerned with fundamental questions about the origin and nature of the universe. He has been at the forefront of theoretical cosmology for more than 35 years. His research in cosmology is extraordinarily far ranging and he has made important contributions across many areas of gravitation, astrophysics and cosmology. It spans work of a mathematical nature, particle physics, mathematical statistics and observation. Barrow is also a distinguished writer and lecturer for non-specialist audiences. His work in this area has made a huge contribution to public engagement with science.
Professor Judith Driscoll (Department of Materials Science and Metallurgy) has been awarded the Joule Medal for her pioneering contributions to the understanding and enhancement of critical physical properties of strongly-correlated oxides, encompassing oxide superconductors, ferroelectrics, multiferroics and semiconductors.
For more than 25 years, Professor Driscoll’s unique approach to research and technological development has significantly advanced our understanding of how to attain enhanced physical properties in a number of strongly-correlated oxides. Notably, Driscoll’s “nanotechnology in a thin film” approach has provided a new route to high-performance electronic materials with outstanding property enhancements.
Driscoll’s international leadership in nanostructured materials has been of essential importance.
Professor Henning Sirringhaus (Department of Physics) has been awarded the Faraday Medal for transforming our knowledge of charge transport phenomena in organic semiconductors as well as our ability to exploit them.
Professor Sirringhaus is a highly creative, versatile and productive physicist who is not afraid to tackle challenging problems. His research crosses the interface between condensed matter physics, materials science and electrical engineering. In several areas of soft matter electronics and opto-electronics research he has carried out landmark investigations which have given birth to stunning new technologies and industries.
A new study has discovered mysterious behaviour of a material that acts like an insulator in certain measurements, but simultaneously acts like a conductor in others. In an insulator, electrons are largely stuck in one place, while in a conductor, the electrons flow freely. The results, published today (2 July) in the journal Science, challenge current understanding of how materials behave.
Conductors, such as metals, conduct electricity, while insulators, such as rubber or glass, prevent or block the flow of electricity. But by tracing the path that electrons follow as they move through a material, researchers led by the University of Cambridge found that it is possible for a single material to display dual metal-insulator properties at once – although at the very lowest temperatures, it completely disobeys the rules that govern conventional metals. While it’s not known exactly what’s causing this mysterious behaviour, one possibility is the existence of a potential third phase which is neither insulator nor conductor.
The dual metal-insulator properties were observed throughout the interior of the material, called samarium hexaboride (SmB6). There are other recently discovered materials which behave both as a conductor and an insulator, but they are structured like a sandwich, so the surface behaves differently from the bulk. The new study found that in SmB6, by contrast, the bulk itself can be both conductor and insulator simultaneously.
“The discovery of dual metal-insulator behaviour in a single material has the potential to overturn decades of conventional wisdom regarding the fundamental dichotomy between metals and insulators,” said Dr Suchitra Sebastian of the University’s Cavendish Laboratory, who led the research.
In order to learn more about SmB6 and various other materials, Sebastian and her colleagues traced the path that the electrons take as they move through the material: the geometrical surface traced by the orbits of the electrons leads to a construction which is known as a Fermi surface. In order to find the Fermi surface, the researchers used a technique based on measurements of quantum oscillations, which measure various properties of a material in the presence of a high magnetic field to get an accurate ‘fingerprint’ of the material. For quantum oscillations to be observed, the materials must be as close to pure as possible, so that there are minimal defects for the electrons to bump into. Key experiments for the research were conducted at the National High Magnetic Field Laboratory in Tallahassee, Florida.
SmB6 belongs to the class of materials called Kondo insulators, which are close to the border between insulating and conducting behaviour. Kondo insulators are part of a larger group of materials called heavy fermion materials, in which complex physics arises from an interplay of two types of electrons: highly localised ‘f’ electrons, and ‘d’ electrons, which have larger orbits. In the case of SmB6, correlations between these two types of electrons result in insulating behaviour.
“It’s a dichotomy,” said Sebastian. “The high electrical resistance of SmB6 reveals its insulating behaviour, but the Fermi surface we observed was that of a good metal.”
But the mystery didn’t end there. At the very lowest temperatures, approaching absolute zero (-273°C), it became clear that the quantum oscillations in SmB6 are not characteristic of a conventional metal. In metals, the amplitude of quantum oscillations grows and then levels off as the temperature is lowered. Strangely, in the case of SmB6, the amplitude continues to grow dramatically as the temperature falls, violating the rules that govern conventional metals.
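The conventional behaviour described here is captured by the standard Lifshitz-Kosevich damping factor R_T = X/sinh(X), with X proportional to T times the effective mass over the field, which saturates at 1 as the temperature approaches zero. A sketch, in which the magnetic field and effective-mass values are illustrative assumptions rather than the experiment's parameters:

```python
import math

# Lifshitz-Kosevich thermal damping of quantum-oscillation amplitudes in a
# conventional metal: R_T = X / sinh(X), X = 2*pi^2 * k_B * T * m* / (hbar * e * B).
# The field and effective-mass values below are illustrative assumptions.

K_B = 1.381e-23       # Boltzmann constant, J/K
HBAR = 1.055e-34      # reduced Planck constant, J s
E_CHARGE = 1.602e-19  # elementary charge, C
M_E = 9.109e-31       # electron mass, kg

def lk_amplitude(T, B, m_eff_ratio):
    """Damping factor R_T; tends to 1 (saturates) as T approaches zero."""
    x = 2 * math.pi**2 * K_B * T * (m_eff_ratio * M_E) / (HBAR * E_CHARGE * B)
    return x / math.sinh(x) if x > 0 else 1.0

# A conventional metal's oscillation amplitude levels off on cooling:
for T in (4.0, 1.0, 0.1, 0.01):
    print(f"T = {T:5.2f} K: R_T = {lk_amplitude(T, B=35.0, m_eff_ratio=1.0):.4f}")
```

In the SmB6 measurements the amplitude kept growing in exactly the regime where this factor predicts saturation, which is the anomaly the researchers set out to explain.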
The researchers considered several reasons for this peculiar behaviour: it could be a novel phase, neither insulator nor conductor; it could be fluctuating back and forth between the two; or because SmB6 has a very small ‘gap’ between insulating and conducting behaviour, perhaps the electrons are capable of jumping that gap.
“The crossover region between two different phases – magnetic and non-magnetic, for example – is where the really interesting physics happens,” said Sebastian. “Because this material is close to the crossover region between insulator and conductor, we found it displays some really strange properties – we’re exploring the possibility that it’s a new quantum phase.”
Tim Murphy, the head of the National High Magnetic Field Laboratory’s DC Field Facility, where most of the research was conducted, said: “This work on SmB6 provides a vivid and exciting illustration of emergent physics resulting from MagLab researchers refining the quality of the materials they study and pushing the sample environment to the extremes of high magnetic fields and low temperatures.”
The Cambridge researchers were funded by the Royal Society, the Winton Programme for the Physics of Sustainability, the European Research Council and the Engineering and Physical Sciences Research Council (UK).
Researchers have identified a material that behaves as a conductor and an insulator at the same time, challenging current understanding of how materials behave, and pointing to a new type of insulating state.
The Royal Society, the UK’s independent academy for science, has announced the recipients of its 2015 Awards, Medals and Prize Lectures. The scientists receive the awards in recognition of their achievements in a wide variety of fields of research. The recipients from the University of Cambridge are:
Professor George Efstathiou FRS (Institute of Astronomy) receives the Hughes Medal for many outstanding contributions to our understanding of the early Universe, in particular his pioneering computer simulations, observations of galaxy clustering and studies of the fluctuations in the cosmic microwave background.
Professor Benjamin Simons (Wellcome Trust/Cancer Research UK Gurdon Institute, Cavendish Laboratory) receives the Gabor Medal for his work analysing stem cell lineages in development, tissue homeostasis and cancer, revolutionising our understanding of stem cell behaviour in vivo.
Professor Russell Cowburn FRS (Department of Physics) receives the Clifford Paterson Medal and Lecture for his remarkable academic, technical and commercial achievements in nano-magnetics.
Dr Madan Babu Mohan (MRC Laboratory of Molecular Biology) receives the Francis Crick Medal and Lecture for his major and widespread contributions to computational biology.
Four Cambridge scientists have been recognised by the Royal Society for their achievements in research.
When the first galaxies started to form a few hundred million years after the Big Bang, the Universe was full of a fog of hydrogen gas. But as more and more brilliant sources — both stars and quasars powered by huge black holes — started to shine they cleared away the mist and made the Universe transparent to ultraviolet light. Astronomers call this the epoch of reionisation, but little is known about these first galaxies, and up to now they have just been seen as very faint blobs. But now new observations using the Atacama Large Millimetre/submillimetre Array (ALMA) are starting to change this.
A team of astronomers led by Roberto Maiolino from the University’s Cavendish Laboratory and Kavli Institute for Cosmology trained ALMA on galaxies that were known to be seen only about 800 million years after the Big Bang. The astronomers were not looking for the light from stars, but instead for the faint glow of ionised carbon coming from the clouds of gas from which the stars were forming. They wanted to study the interaction between a young generation of stars and the cold clumps that were assembling into these first galaxies.
They were also not looking for the extremely brilliant rare objects — such as quasars and galaxies with very high rates of star formation — that had been seen up to now. Instead they concentrated on rather less dramatic, but much more common, galaxies that reionised the Universe and went on to turn into the bulk of the galaxies that we see around us now.
From one of the galaxies — given the label BDF 3299 — ALMA could pick up a faint but clear signal from the glowing carbon. However, this glow wasn’t coming from the centre of the galaxy, but rather from one side.
“These observations enable an unprecedented understanding of the assembly process of the first galaxies formed in the Universe – for the first time we can observe and disentangle the different components contributing to the earliest phases of galaxy formation,” said Maiolino. “These observations have enabled us to test with unprecedented detail theories of galaxy formation in the early Universe.”
The astronomers think that the off-centre location of the glow is because the central clouds are being disrupted by the harsh environment created by the newly formed stars — both their intense radiation and the effects of supernova explosions — while the carbon glow is tracing fresh cold gas that is being accreted from the intergalactic medium.
By combining the new ALMA observations with computer simulations, it has been possible to understand in detail key processes occurring within the first galaxies. The effects of the radiation from stars, the survival of molecular clouds, the escape of ionising radiation and the complex structure of the interstellar medium can now be calculated and compared with observation. BDF 3299 is likely to be a typical example of the galaxies responsible for reionisation.
“We have been trying to understand the interstellar medium and the formation of the reionisation sources for many years. Finally to be able to test predictions and hypotheses on real data from ALMA is an exciting moment and opens up a new set of questions. This type of observation will clarify many of the thorny problems we have with the formation of the first stars and galaxies in the Universe,” said co-author Andrea Ferrara, from the Scuola Normale Superiore in Pisa, Italy.
“This study would have simply been impossible without ALMA, as no other instrument could reach the sensitivity and spatial resolution required,” said Maiolino. “Although this is one of the deepest ALMA observations so far it is still far from achieving its ultimate capabilities. In future ALMA will image the fine structure of primordial galaxies and trace in detail the build-up of the very first galaxies.”
The results are reported in the journal Monthly Notices of the Royal Astronomical Society.
R. Maiolino et al., “The assembly of ‘normal’ galaxies at z∼7 probed by ALMA,” Monthly Notices of the Royal Astronomical Society (2015).
Adapted from an ESO press release.
An international team of astronomers led by the University of Cambridge has detected the most distant clouds of star-forming gas yet found in normal galaxies in the early Universe – less than one billion years after the Big Bang. The new observations will allow astronomers to start to see how the first galaxies were built up and how they cleared the cosmic fog during the era of reionisation. This is the first time that such galaxies have been seen as more than just faint blobs.
Kingfishers are notoriously shy. But one of the best places to spot them in Cambridge is the Botanic Garden where they perch in the swamp cypresses to fish in the lake.
The brilliantly bright plumage of the kingfisher looks almost exotic in comparison to the more modest hues of many birds native to Britain. In motion, the kingfisher’s contrasting colours – orange, cyan and blue – produce a startling flash of colour.
Colour in nature is a fascinating topic. Understanding why and how plants and animals produce and employ colour requires researchers to collaborate and share their expertise across different disciplines. Dr Silvia Vignolini (Department of Chemistry) has been working with Professors Jeremy Baumberg (Department of Physics) and Beverley Glover (Department of Plant Sciences) to look at the extraordinarily clever ways in which nature makes spectacular colour effects.
Blue is a favourite colour of people around the world. But the production of intense blue presents challenges to nature. Most vertebrates are unable to produce blue pigment. The orange of kingfisher plumage is the product of tiny pigment granules, but its cyan and blue feathers contain no pigments. These colours are ‘structural’. They are created by the intricate structural arrangement of a transparent material which, depending on its precise make-up and thickness compared to the tiny wavelength of light, produces a range of colours from incident light – in other words, light shining on the sample.
Structural colours feature in plants too – particularly in fruit and flowers. In research published in 2012, Vignolini and others revealed that an African plant called Pollia condensata produces a strikingly shiny blue fruit. The researchers discovered that the Pollia fruit reflects back 30% of the light cast on it. Furthermore, its reflective properties stand the test of time in a remarkable way: a Pollia fruit, locked in a seed drawer at Kew Gardens for 100 years, had lost none of its blueness.
The way in which plants like the Pollia achieve extraordinarily bright and long-lasting colours offers huge scope for material science. Vignolini says: “Cellulose, which is the main material used by this plant to produce colour, can also be manipulated in vitro to obtain a similar optical effect. By controlling the self-assembly process of cellulose, it is therefore possible to produce bio-mimetic colouration without using any toxic pigment.”
Because structural colours can be so intense, their origin in only transparent materials is hard to imagine. Vignolini uses the example of a soap bubble. “If you start from a perfectly transparent water-soap suspension and you blow a soap bubble, you can observe all the colours of the rainbow. These colours cannot be the result of pigmentation, because the liquid is transparent. Instead, the colours result from creating a very thin layer, just a few hundreds of nanometres thick, of the suspension that interacts with light,” she says.
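The soap-film effect Vignolini describes can be sketched with a minimal two-beam interference calculation. This is an illustrative model only – the refractive index and film thickness below are hypothetical example values (roughly those of a soap film), not figures from the research:

```python
import numpy as np

def film_reflectance(wavelength_nm, n=1.33, d_nm=300.0):
    """Two-beam reflectance of a transparent thin film in air at normal incidence."""
    r1 = (1.0 - n) / (1.0 + n)          # air -> film Fresnel amplitude
    r2 = -r1                            # film -> air (symmetric film)
    delta = 4.0 * np.pi * n * d_nm / wavelength_nm  # round-trip optical phase
    # Sum of the front-surface reflection and the once-transmitted,
    # back-surface reflection; their interference selects the colour.
    amplitude = r1 + (1.0 - r1**2) * r2 * np.exp(1j * delta)
    return np.abs(amplitude) ** 2

wavelengths = np.linspace(380.0, 700.0, 321)   # visible range, 1 nm steps
R = film_reflectance(wavelengths)
peak_nm = wavelengths[np.argmax(R)]  # wavelength reflected most strongly
```

For these example values the reflectance peaks near 532 nm, i.e. the transparent film looks green in reflection; changing the thickness shifts the peak, which is why a draining soap bubble runs through the rainbow.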
“A plant or animal cell does something similar. Using simple sugars, it creates a multi-layered nano-structure that optimises the reflection of the blue colour. Understanding these incredibly precise processes is the key to being able to copy and mimic these materials.”
In a paper published in 2011, Dr Bodo Wilts (formerly Cambridge, now University of Fribourg) and colleagues focused on the striking plumage of the kingfisher. They found that the cyan and blue barbs of its feathers contain spongy nanostructures with varying dimensions, causing the light to reflect differently and thus produce the observed set of colours. The subtle differences within colours are produced by tiny variations in the structure of the barbs.
Kingfisher feathers reflect light in a way that scientists describe as semi-iridescent. The feathers of peacocks and birds of paradise are truly iridescent. Iridescence is produced by the ways in which layers of material are perfectly aligned and repeated periodically to achieve a shimmer effect. Semi-iridescence is produced when the layers are not quite perfectly aligned but slightly disrupted, thus causing a smaller span of iridescent colour.
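The role of periodic layering in true iridescence can be summarised by the textbook condition for constructive reflection from a periodic stack (a standard optics relation, not a formula taken from the kingfisher study): light of wavelength $\lambda$ is strongly reflected when

```latex
2 n d \cos\theta = m\lambda, \qquad m = 1, 2, 3, \dots
```

where $n$ is the refractive index of the layers, $d$ their repeat spacing and $\theta$ the angle inside the material. Perfectly periodic layers satisfy this condition sharply, giving the shifting colours of true iridescence; the slightly disordered spacing in kingfisher barbs smears it out, producing the semi-iridescent effect described above.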
There’s much more to be discovered about colour in nature. “Researchers are beginning to learn more about birds’ vision. This work will help us to grasp how they see colour and how they respond to it. To unlock the secret of how cellulose or keratin make fabulously bright colour will involve continuing collaborations between biologists, physicists and materials scientists,” says Vignolini.
The kingfisher is just one of 100 bird species to be seen in the Botanic Garden, which provides an important habitat for birds and other wildlife in the heart of Cambridge. Among them is the increasingly rare song thrush. Mistle thrushes, too, can often be seen in winter atop trees full of mistletoe. The Garden also has thriving populations of great and blue tits, while flocks of long-tailed tits are often heard as they fly from tree to tree to search for food. Summer visitors regularly include a pair of sparrowhawks and flocks of swifts.
Next in the Cambridge Animal Alphabet: L is for a creature that has helped archaeologists learn more about the life of people inhabiting the remote and windswept Isle of Oronsay 6,000 years ago.
Have you missed the series so far? Catch up on Medium here.
Inset images: Kingfisher (Gilberto Pereira); The common kingfisher, Alcedo atthis, and its three main feather types: orange feathers at the breast, cyan feathers at the back and blue feathers at the tail (Doekele Stavenga, Jan Tinbergen, Hein Leertouwer, Bodo Wilts); Scanning electron micrographs (SEMs) of sectioned barbs of breast and tail feathers (Doekele Stavenga, Jan Tinbergen, Hein Leertouwer, Bodo Wilts); Close up of a cut vacuole and the surrounding spongy structures (Doekele Stavenga, Jan Tinbergen, Hein Leertouwer, Bodo Wilts).
The Cambridge Animal Alphabet series celebrates Cambridge's connections with animals through literature, art, science and society. Here, K is for Kingfisher. Look out for them among the swamp cypresses at the Botanic Garden, where the secrets behind their cyan and blue feathers are being studied by an extraordinary collaboration of scientists.
A team of scientists has successfully measured particles of light being “squeezed”, in an experiment that had been written off in physics textbooks as impossible to observe.
Squeezing is a strange phenomenon of quantum physics. It creates a very specific form of light which is “low-noise” and is potentially useful in technology designed to pick up faint signals, such as the detection of gravitational waves.
The standard approach to squeezing light involves firing an intense laser beam at a material, usually a non-linear crystal, which produces the desired effect.
For more than 30 years, however, a theory has existed about another possible technique. This involves exciting a single atom with just a tiny amount of light. The theory states that the light scattered by this atom should, similarly, be squeezed.
Unfortunately, although the mathematical basis for this method – known as squeezing of resonance fluorescence – was drawn up in 1981, the experiment to observe it was so difficult that one established quantum physics textbook despairingly concludes: “It seems hopeless to measure it”.
So it has proven – until now. In the journal Nature, a team of physicists report that they have successfully demonstrated the squeezing of individual light particles, or photons, using an artificially constructed atom, known as a semiconductor quantum dot. Thanks to the enhanced optical properties of this system and the technique used to make the measurements, they were able to observe the light as it was scattered, and proved that it had indeed been squeezed.
Professor Mete Atature, from the Cavendish Laboratory, Department of Physics, and a Fellow of St John’s College at the University of Cambridge, led the research. He said: “It’s one of those cases of a fundamental question that theorists came up with, but which, after years of trying, people basically concluded it is impossible to see for real – if it’s there at all.”
“We managed to do it because we now have artificial atoms with optical properties that are superior to natural atoms. That meant we were able to reach the necessary conditions to observe this fundamental property of photons and prove that this odd phenomenon of squeezing really exists at the level of a single photon. It’s a very bizarre effect that goes completely against our senses and expectations about what photons should do.”
Like a lot of quantum physics, the principles behind squeezing light involve some mind-boggling concepts.
It begins with the fact that wherever there are light particles, there are also associated electromagnetic fluctuations. This is a sort of static which scientists refer to as “noise”. Typically, the more intense light gets, the higher the noise. Dim the light, and the noise goes down.
But strangely, at a very fine quantum level, the picture changes. Even in a situation where there is no light, electromagnetic noise still exists. These are called vacuum fluctuations. While classical physics tells us that in the absence of a light source we will be in perfect darkness, quantum mechanics tells us that there is always some of this ambient fluctuation.
"If you look at a flat surface, it seems smooth and flat, but we know that if you really zoom in to a super-fine level, it probably isn't perfectly smooth at all," Atature said. "The same thing is happening with vacuum fluctuations. Once you get into the quantum world, you start to get this fine print. It looks like there are zero photons present, but actually there is just a tiny bit more than nothing."
Importantly, these vacuum fluctuations are always present and provide a base limit to the noise of a light field. Even lasers, the most perfect light source known, carry this level of fluctuating noise.
This is when things get stranger still, however, because, in the right quantum conditions, that base limit of noise can be lowered even further. This lower-than-nothing, or lower-than-vacuum, state is what physicists call squeezing.
In the Cambridge experiment, the researchers achieved this by shining a faint laser beam on to their artificial atom, the quantum dot. This excited the quantum dot and led to the emission of a stream of individual photons. Although normally the noise associated with this photonic activity is greater than that of a vacuum state, when the dot was only excited weakly the noise associated with the light field actually dropped, becoming less than the supposed baseline of vacuum fluctuations.
Explaining why this happens involves some highly complex quantum physics. At its core, however, is a rule known as Heisenberg’s uncertainty principle. This states that in any situation in which a particle has two linked properties, only one can be measured and the other must be uncertain.
In the normal world of classical physics, this rule does not apply. If an object is moving, we can measure both its position and momentum, for example, to understand where it is going and how long it is likely to take getting there. The pair of properties – position and momentum – are linked.
In the strange world of quantum physics, however, the situation changes. Heisenberg’s principle states that only one part of such a pair can ever be precisely measured, while the other must remain uncertain.
In the Cambridge experiment, the researchers used that rule to their advantage, creating a tradeoff between what could be measured, and what could not. By scattering faint laser light from the quantum dot, the noise of part of the electromagnetic field was reduced to an extremely precise and low level, below the standard baseline of vacuum fluctuations. This was done at the expense of making other parts of the electromagnetic field less measurable, meaning that it became possible to create a level of noise that was lower-than-nothing, in keeping with Heisenberg’s uncertainty principle, and hence the laws of quantum physics.
Plotting the uncertainty with which fluctuations in the electromagnetic field could be measured on a graph creates a shape where the uncertainty of one part has been reduced, while the other has been extended. This creates a squashed-looking, or “squeezed” shape, hence the term, “squeezing” light.
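In the standard quantum-optics notation (a textbook formulation, not notation quoted from the paper itself), the two linked parts of the light field are the quadratures $\hat{X}_1$ and $\hat{X}_2$:

```latex
\hat{X}_1 = \tfrac{1}{2}\bigl(\hat{a} + \hat{a}^\dagger\bigr), \qquad
\hat{X}_2 = \tfrac{1}{2i}\bigl(\hat{a} - \hat{a}^\dagger\bigr), \qquad
[\hat{X}_1, \hat{X}_2] = \tfrac{i}{2},
```

so Heisenberg’s principle requires $\Delta X_1\,\Delta X_2 \ge \tfrac{1}{4}$. Vacuum fluctuations correspond to $\Delta X_1 = \Delta X_2 = \tfrac{1}{2}$, which saturates this bound symmetrically; squeezing pushes one quadrature below it, $\Delta X_1 < \tfrac{1}{2}$, at the cost of $\Delta X_2 > \tfrac{1}{2}$ – exactly the squashed uncertainty shape described above.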
Atature added that the main point of the study was simply to attempt to see this property of single photons, because it had never been seen before. “It’s just the same as wanting to look at Pluto in more detail or establishing that pentaquarks are out there,” he said. “Neither of those things has an obvious application right now, but the point is knowing more than we did before. We do this because we are curious and want to discover new things. That’s the essence of what science is all about.”
Additional image: The left diagram represents electromagnetic activity associated with light at what is technically its lowest possible level. On the right, part of the same field has been reduced to lower than is technically possible, at the expense of making another part of the field less measurable. This effect is called “squeezing” because of the shape it produces.
Schulte, CHH, et al. Quadrature squeezed photons from a two-level system. Nature (2015). DOI: 10.1038/nature14868.
A team of scientists have measured a bizarre effect in quantum physics, in which individual particles of light are said to have been “squeezed” – an achievement which at least one textbook had written off as hopeless.
Education sits at the heart of our society – and politicians know it. When Tony Blair famously said “education, education, education” it was essentially an election slogan. We are constantly told by our politicians that English A levels are the “gold standard” in education. I say, maybe it’s time for a rethink.
At the heart of the problem is the early specialisation in post-16 education. As a practising scientist I like to think that I can at least have some understanding of any science story presented in the news. But for a large proportion of the population that isn’t the case; our society almost seems to believe that the situation is a virtue. If a politician says “I never could do maths” no one thinks “Philistine”, whereas if they admitted to never having read any Shakespeare or Dickens the reaction would be very different. Why does our society think this is OK?
Science underpins so many decisions, political and personal. In our daily life and jobs, we increasingly need to use quantitative skills: the ability to interpret graphs, utilise spreadsheets and manipulate data. Our national academies recognise this, with a recent report from the British Academy to go with last year’s Vision report from the Royal Society, both calling for all students to continue with some form of maths post-16.
This issue cuts both ways of course. Scientists need to be able to write and communicate better. Whether or not they can quote chunks of poetry, ancient or modern, is not the point. Scientists need to be able to write lucidly and put their work in context. Just about every branch of science is going to touch on the human condition and they need to be able to understand what their research means for the public. Some grasp of history, literature and social science could help them communicate this.
So in my upcoming Presidential Address to the British Science Association, I will be urging politicians to reconsider the structure of our post-16 education. England and Wales are unlike almost all other developed countries in our early specialisation. This leads to a damaging divide between arts and science.
Implicitly, at the point of choosing GCSE topics, a 14 year old will see themselves heading off in one direction or the other. Schools sometimes appear to encourage this, perhaps for the simple reason of easing the timetable. A broader post-16 education would mean moving from the typically narrow choices of A levels to something akin to the European Baccalaureate system (or perhaps the Scottish Higher system), where more subjects are studied for longer.
The teaching shortage
Of course, all this would require an adequate supply of qualified teachers. Currently, however, we neither have enough teachers entering the profession nor staying on for long subsequently. This is a massive problem in many subjects.
In primary school teaching, many schools have no one qualified in science or with a maths degree (the Vision report says only 3% and 5% of primary school teachers have maths and science degrees, or specialist teaching qualifications in those subjects, respectively). In turn this creates a confidence problem: teachers who haven’t looked at a maths problem since they were 16 are expected to teach numeracy skills they may feel unsure about themselves.
This problem is particularly acute when there is no one else with more relevant experience in the school to whom they can turn for specific advice. This is no criticism of the teachers themselves, but, when teachers have to teach beyond their own areas of confidence and competence, it is harder for them to stimulate the children and to answer their questions.
In the sciences a related problem occurs at secondary school. Teachers may be science teachers, but if their qualification is in biology it is tough for them to teach GCSE physics. Again, this is not meant to apportion blame to the teachers. The Institute of Physics has suggested we need 1000 more physics graduates a year entering the teaching profession if we are to reach a situation where a third of science teachers are qualified in physics – and it would still take 15 years.
To do this would need around a quarter of all physics graduates training as teachers each year. It is hard to imagine that happening, particularly given the level of salaries graduates can otherwise command.
England has this strange habit of splitting our children up into arts and sciences at an age when hormones are surging and peer pressure is liable to be at its most powerful. We should be pressing the government to modify our system so that all children keep studying a broad range of subjects post-16 – and providing adequate funding to do so. In time this would translate into primary school teachers with more confidence to enthuse the next generation in maths and science.
Furthermore, this change would empower everyone to be able to make better-informed decisions about the things that affect them in their everyday life and to make sure that day by day people are able to cope with the numeracy requirements of their jobs with confidence.
Why is it acceptable to say “I never could do maths” but not “I’ve never read any Shakespeare”? It’s symptomatic of the art-science divide that can only be addressed by reforming our education system, writes Professor Athene Donald from the Department of Physics and Master of Churchill College.
The fourth annual Winton Symposium will be held on 28 September at the University’s Cavendish Laboratory on the theme of ‘Green Computing’. The one-day symposium will cover topics ranging from new materials and architectures for low power consumption computing, to computer-based applications which can benefit our environment.
The proliferation of devices with increasing computing power poses opportunities and threats to how we manage our natural resources. Speakers at the symposium will explore emerging technologies that may alter how we perform computation in the future in an efficient manner, as well as how computing can enable us to do things more efficiently.
The opening speaker for the symposium will be Dr Mike Lynch, founder of Invoke Capital and Autonomy. Autonomy – now part of HP – is a global leader in software that processes human information, or unstructured data. His talk will be ‘The green light for new compute: What will we need all that compute for?’
Professor Andy Hopper, Head of the University’s Computer Laboratory, will discuss how to harness the power of computing technology to generate a better understanding of the Earth and its environment. His talk will cover the consumption of energy by computing and balance this with the numerous benefits that can be achieved.
Dr Krisztián Flautner, former VP of R&D at ARM who now leads ARM's Internet of Things Business Unit, will focus in his talk on the challenges, opportunities and current state of play in various segments of the Internet of Things. ARM designs scalable, energy-efficient processors and was voted in 2014 by Forbes as the third most innovative company in the world. The company is also one of Cambridge's most successful and has shipped over 60 billion ARM-based chips, with ARM technology in use in 95% of smartphones.
Other speakers at the event include Professor Luca Cardelli of Microsoft Research and the University of Oxford, Professor Linda Nazar of the University of Waterloo, and Professor Hideo Ohno of Tohoku University.
“As computers become ubiquitous, their power consumption is becoming a significant portion of total global energy demand,” said Dr Nalin Patel, Winton Programme Manager. “This can be mitigated by developing new materials, architectures and applications that can not only reduce power consumption but enable us to do things more efficiently.”
The symposium is organised by Dr Patel and Professor Sir Richard Friend, Cavendish Professor of Physics and Director of the Winton Programme for the Physics of Sustainability.
There is no registration fee for the symposium, and a complimentary lunch and drinks reception will be provided; however, due to the large demand for places, participants are required to register online for the event. The event is open for all to attend.
On 28 September, the fourth annual Winton Symposium will be held at the Cavendish Laboratory on the theme of ‘Green Computing’.
An innovative free online physics resource aimed at students and teachers has been shortlisted in the Times Higher Education (THE) Awards 2015.
The Isaac Physics project was developed in Cambridge in response to concerns that A Level physics courses were not preparing students adequately for degree level study, with one key area of concern being the development of pupils’ problem solving skills.
The project’s online platform, isaacphysics.org, is aimed at students transitioning from GCSE to A Level and on into Higher Education. The site provides a range of problems for students and teachers to use, along with hints offering extra support, and has been recognised for its work by making the THE Awards shortlist in the Outstanding Digital Innovation in Teaching or Research category.
Isaac Physics is a Department of Education project at the University of Cambridge. Since its inception in 2014, it has developed and expanded thanks to work with the in-house software development team from the Digital Technology Group in the University’s Computer Laboratory. It now contains a suite of tools and content to support the development of problem-solving skills, raise students’ aspirations, widen participation and build confidence.
Dr Lisa Jardine-Wright, co-director of the Isaac Physics Project, said: “We really wanted to encourage students to continue into higher education with physics. Our main aim was to help students to work with or without a teacher. Some students feel that physics is essentially solved and no longer needs deep thought or application – they feel that there are no more problems to understand or investigate and that if they want fresh challenges then studying engineering or mathematics would be better for them. We want to provide the resources for students to encourage them to do physics and to see it is a subject where there are unsolved problems which can have a real-life application that require keen mathematical skills.
“It has been so pleasing to see how many students are using the site independently of teachers. One aim was to widen participation for students who do not have fellow students to talk to or specialist teachers to help them - and we can see they are using this on their own.”
Jardine-Wright explained that physics students transitioning through schools into university often found the change from ‘scaffolded’ questions – ones where the student is heavily assisted in finding the answer – to problem-solving questions a tough challenge.
“Think of the difference in these terms. You have a puncture on your bike – this is our problem. The answer is to mend the puncture. The scaffolding to this problem is the step-by-step instructions contained in the puncture repair kit. A real problem-solving question would give you the repair kit, but not tell you what to do with it. It isn’t about making it harder, but it is about enabling students to develop the logical approach to problems and see what physics at university-level will be like.”
Ensuring that the site could adapt and improve was built into the software. It was designed so that researchers could analyse a wealth of usage data, enabling the project’s leaders to learn how to better support the teaching of problem-solving skills.
The site still has plenty of room to grow, adds Jardine-Wright. “We initially aim to reach 3,000 physics and maths teachers in England and a considerable fraction of their 100,000 students.”
The latest statistics show that users have entered more than 500,000 answers to questions, more than 900 schools are using the site and that there are almost 10,000 users registered. With the start of the new school year, schools are doing problems at a rate of 75,000 per week – up 10-fold from June. Those using the site include students from around the world.
David Anderson of Queen Elizabeth’s Grammar School in Kent said: “The opportunities presented to students to access high quality subject content, accompanied by relevant, challenging, extended questions, is really exceptional. Our experience is that students have found the work stimulating and enjoyable as it has posed a good degree of stretch and challenge.”
The web platform can extend beyond physics problems and it has recently been shared for development in chemical physics and biophysics with the Biological Sciences department at the University of York.
The website has been nominated in the Outstanding Digital Innovation in Teaching or Research Category. The winners will be announced in a ceremony on November 26 in London.
Times Higher Education Awards 2015 shortlist includes University of Cambridge project.
The awards pages highlight awards and honours given to University employees and projects. From major grants to prizes for books the pages offer an insight into the exceptional work being carried out within the university.
To view the pages go to http://www.cam.ac.uk/for-staff/awards.
If you would like to submit a story for inclusion in our awards section please fill out the request form.
Professor Gilbert Lonzarich of the Physics department has been selected for the 2015 Kamerlingh Onnes prize, in recognition of his 'visionary experiments concerning the emergence of superconductivity for strongly renormalized quasiparticles at the edge of magnetic order'.
The work of Lonzarich - his scientific discoveries, his innovations in material quality and experimental technique - has transformed our thinking about strongly correlated electron systems. Among his many profound contributions, the most important is perhaps his discovery that superconductivity occurs ubiquitously on the border of what was considered one of the harshest environments for superconductivity: magnetism. Indeed the theme of a superconducting dome induced by the suppression of density wave order is now widespread and familiar in contemporary condensed matter physics, and it is fair to say that it is the work of Lonzarich and collaborators that established this fundamental idea. The tour de force experimental program of Lonzarich simultaneously shattered experimental limitations on sample quality, signal detection sensitivity, high magnetic fields, high applied pressures, and low temperatures, and led to the discovery of unconventional superconductivity under applied pressure in a series of magnetic materials.
Lonzarich is widely recognised in the scientific world for his pioneering work. Other awards he has received include the Europhysics (Hewlett-Packard) Prize for Experimental Physics 1989 (shared with H. Ott and F. Steglich), the Max Born Prize and Medal 1991, the Guthrie Medal 2007, and the Rumford Medal 2010. He has been honoured as a Fellow of the Royal Society of London, and as Fellow of the Institute of Physics.
The Kamerlingh Onnes Prize was established in 2000 by the organizers of the International Conference on the Materials and Mechanisms of Superconductivity (M2S) in honor of Prof. Heike Kamerlingh Onnes who discovered superconductivity in 1911. It is awarded every three years at the M2S Conference, for outstanding experiments which illuminate the nature of superconductivity other than materials. It will be presented to Lonzarich on August 24th, 2015 during the 11th International Conference on Materials and Mechanisms of Superconductivity in Geneva, Switzerland. The award is sponsored by Elsevier, Publisher of Physica C - Superconductivity and its Applications. More information is available at http://www.m2s-2015.ch/kamerlingh-onnes-prize.php.
An international team of scientists have observed how a mysterious quantum phenomenon in organic molecules takes place in real time, which could aid in the development of highly efficient solar cells.
The researchers, led by the University of Cambridge, used ultrafast laser pulses to observe how a single particle of light, or photon, can be converted into two energetically excited particles, known as spin-triplet excitons, through a process called singlet fission. If singlet fission can be controlled, it could enable solar cells to double the amount of electrical current that can be extracted.
In conventional semiconductors such as silicon, when one photon is absorbed it leads to the formation of one free electron that can be harvested as electrical current. However certain materials undergo singlet fission instead, where the absorption of a photon leads to the formation of two spin-triplet excitons.
Working with researchers from the Netherlands, Germany and Sweden, the Cambridge team confirmed that this ‘two-for-one’ transformation involves an elusive intermediate state in which the two triplet excitons are ‘entangled’, a feature of quantum theory that causes the properties of each exciton to be intrinsically linked to that of its partner.
By shining ultrafast laser pulses – lasting just a few quadrillionths of a second – on a sample of pentacene, an organic material which undergoes singlet fission, the researchers were able to directly observe this entangled state for the first time, and showed how molecular vibrations both make it detectable and drive its creation through quantum dynamics. The results are reported today (26 October) in the journal Nature Chemistry.
“Harnessing the process of singlet fission into new solar cell technologies could allow tremendous increases in energy conversion efficiencies in solar cells,” said Dr Alex Chin from the University’s Cavendish Laboratory, one of the study’s co-authors. “But before we can do that, we need to understand how exciton fission happens at the microscopic level. This is the basic requirement for controlling this fascinating process.”
The key challenge for observing real-time singlet fission is that the entangled spin-triplet excitons are essentially ‘dark’ to almost all optical probes, meaning they cannot be directly created or destroyed by light. In materials like pentacene, the first stage – which can be seen – is the absorption of light that creates a single, high-energy exciton, known as a spin singlet exciton. The subsequent fission of the singlet exciton into two less energetic triplet excitons gives the process its name, but the ability to see what is going on vanishes as the process takes place.
To get around this, the team employed a powerful technique known as two-dimensional spectroscopy, which involves hitting the material with a co-ordinated sequence of ultrashort laser pulses and then measuring the light emitted by the excited sample. By varying the time between the pulses in the sequence, it is possible to follow in real time how energy absorbed by previous pulses is transferred and transformed into different states.
Using this approach, the team found that when the pentacene molecules were set vibrating by the laser pulses, certain changes in molecular shape caused the triplet pair to become briefly light-absorbing, and therefore detectable by later pulses. By carefully filtering out all but these frequencies, the researchers isolated a weak but unmistakable signal from the triplet-pair state.
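The frequency-filtering idea can be sketched numerically (an illustrative toy, not the authors’ analysis code; all frequencies and amplitudes here are invented): a weak oscillation riding on a decaying background is hard to see in the raw trace, but stands out when the delay scan is Fourier-analysed at the vibrational frequency.

```python
import cmath
import math

# Toy delay scan: a weak beat at a molecular vibration frequency,
# buried under a decaying background signal.
n = 512
dt = 1.0e-15                 # 1 fs step between pulse delays (illustrative)
f_vib = 4.0e13               # ~40 THz, an invented vibrational frequency

background = [math.exp(-i * dt / 2.0e-13) for i in range(n)]     # decaying signal
full = [b + 0.01 * math.cos(2 * math.pi * f_vib * i * dt)        # + weak beat
        for i, b in enumerate(background)]

def dft_amp(sig, f):
    """Amplitude of the discrete Fourier transform of sig at frequency f."""
    return abs(sum(s * cmath.exp(-2j * math.pi * f * i * dt)
                   for i, s in enumerate(sig))) / len(sig)

# The weak beat raises the spectral amplitude at f_vib but not elsewhere.
print(dft_amp(full, f_vib) > dft_amp(background, f_vib))         # True
print(dft_amp(full, f_vib) > dft_amp(full, 1.5 * f_vib))         # True
```

In the real experiment the “filter” is set by which vibrational frequencies make the triplet pair briefly light-absorbing; the toy only shows why scanning the inter-pulse delay lets a faint, frequency-specific signal be pulled out of a much larger background.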
The authors then developed a model which showed that when the molecules are vibrating, they possess new quantum states that simultaneously have the properties of both the light-absorbing singlet exciton and the dark triplet pairs. These quantum ‘superpositions’, which are the basis of Schrödinger’s famous thought experiment in which a cat is – according to quantum theory – both alive and dead at the same time, not only make the triplet pairs visible, but also allow fission to occur directly from the moment light is absorbed.
“This work shows that optimised fission in real materials requires us to consider more than just the static arrangements and energies of molecules; their motion and quantum dynamics are just as important,” said Dr Akshay Rao, from the University’s Cavendish Laboratory. “It is a crucial step towards opening up new routes to highly efficient solar cells.”
The research was supported by the European LaserLab Consortium, Royal Society, and the Netherlands Organization for Scientific Research. The work at Cambridge forms part of a broader initiative to harness high tech knowledge in the physical sciences to tackle global challenges such as climate change and renewable energy. This initiative is backed by the UK Engineering and Physical Sciences Research Council (EPSRC) and the Winton Programme for the Physics of Sustainability.
Bakulin, Artem et al. ‘Real-time observation of multiexcitonic states in ultrafast singlet fission using coherent 2D electronic spectroscopy.’ Nature Chemistry (2015). DOI: 10.1038/nchem.2371
The mechanism behind a process known as singlet fission, which could drive the development of highly efficient solar cells, has been directly observed by researchers for the first time.
This is a photothermal deflection spectrometer (PDS) and the mirage – only the width of a human hair in distance from the glass – is helping researchers to measure the quality of materials that turn light energy into electricity.
“We can see one defect in a million molecules,” explains Sadhanala, who built the machine while working on his PhD in the lab of Professor Sir Richard Friend. “The PDS technique measures the amount of light absorbed by a material with up to five orders of magnitude more sensitivity than conventional techniques, making it one of the most sensitive absorption spectrometers in the world.”
A mirage is formed as light is bent when it passes through a medium with varying refractive index – a puddle of water seems to appear on the road ahead, for instance, when light meets hotter air radiating from the ground on a sunny day.
Sadhanala’s machine creates a mirage effect when light absorbed by the solar material is released as heat, which passes to a liquid that surrounds the sample. When a laser beam is directed to pass parallel to it, the mirage deflects the beam; the amount of deflection corresponds to the amount of heat absorbed, which in turn corresponds to the amount of light absorbed.
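As a rough illustration of the principle (all numbers invented, not Sadhanala’s data): because the deflection tracks the heat released, and hence the light absorbed, an absorption spectrum can be read straight off the deflection signal, with the technique’s sensitivity showing up as meaningful readings many orders of magnitude below the peak.

```python
# Hypothetical sketch of the PDS principle: beam deflection is proportional
# to heat released, which is proportional to light absorbed, so the measured
# deflections map directly onto an absorption spectrum.
wavelengths_nm = [500, 550, 600, 650, 700]
deflections_urad = [8.0, 6.5, 4.0, 1.2, 0.003]  # tiny tail = trace defect states

# Absorbance is proportional to deflection; normalise to the strongest signal.
peak = max(deflections_urad)
relative_absorbance = [d / peak for d in deflections_urad]

for wl, a in zip(wavelengths_nm, relative_absorbance):
    print(f"{wl} nm: relative absorbance {a:.6f}")
```

The sub-gap tail at 700 nm in this made-up data is the kind of signal – nearly four orders of magnitude below the peak – that conventional transmission measurements would miss but PDS can resolve.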
“Before, we would have had to make a whole solar device and spend months and months testing it to find out how efficient the material is,” explains Sadhanala. “Now you can measure a new material in half a day, and you don’t even need to make the whole device – you just need to be able to coat the material onto glass and make a mirage.”
A few weeks ago, Tushita Mukhopadhyay – a chemist at the Indian Institute of Science, Bangalore – carefully packaged up five new materials and flew to the UK to test them on the ‘mirage machine’ in Cambridge, and analyse other properties with researchers at Imperial College London. She had spent months making and characterising the materials – all of them belonging to a group of organic solar cells (OSCs) that can be printed as thin-film sheets.
What connects the two researchers, and indeed many other chemists, physicists and engineers, is the APEX project – an ambitious Anglo-Indian initiative to turn fundamental advances in solar materials into commercial reality.
APEX involves research institutes in Bangalore, Delhi, Hyderabad, Kanpur and Pune in India, and Brunel (which leads the partnership), Cambridge, Edinburgh, Imperial, Swansea and Oxford in the UK, plus solar industries in both countries. It has received almost £6 million in funding since 2010 from the Indian Department of Science and Technology and Research Councils UK.
“Solar has always been the eventual solution to our energy problems but it’s always been the day after tomorrow,” explains Friend, who leads the Cambridge component. “Each of the partners in this project has an extensive research programme aimed at developing highly efficient photovoltaic devices, but there is a disconnect between what you can do in the lab and what can be rolled out at huge scale. This project is aimed at moving from established science to a viable technology.”
First though, there is the matter of achieving a major cost reduction and efficiency increase in solar power. The APEX team started by focusing on developing a new class of ‘excitonic’ solar cell (which produces electricity from the sun’s energy through the creation of an ‘exciton’ – a bound electron-hole pair that must be split apart to yield free charges). Instead of using the conventional solar material, silicon, the researchers used solar materials made from organic dyes – dye-sensitised solar cells (DSSCs) – which are easy to make, easy to process and cost less.
However, one of the main issues surrounding the search for alternative solar materials to silicon has been their power conversion efficiencies (PCEs) – the amount of the sun’s energy that can be trapped and turned into electricity.
The PCE for silicon is around 25%, whereas the current state-of-the-art PCE figures for DSSCs and OSCs are a little over 10%. To achieve incremental boosts in these figures, researchers like those in Friend’s group have been analysing what happens at the nanoscale when light hits the material. For instance, they now know that manipulating the ‘spin’ of electrons in solar cells can dramatically improve their performance.
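For concreteness (a back-of-envelope sketch, not from the article): PCE is simply the ratio of electrical power out to incident solar power in, so the quoted percentages translate directly into watts per square metre under standard test conditions.

```python
# Illustrative sketch: power conversion efficiency (PCE) is the ratio of
# electrical power delivered to incident light power, expressed as a percentage.

def pce(p_out_watts: float, p_in_watts: float) -> float:
    """Return power conversion efficiency as a percentage."""
    return 100.0 * p_out_watts / p_in_watts

# Standard test conditions assume roughly 1000 W of sunlight per square metre.
# A 1 m^2 silicon panel delivering 250 W:
print(pce(250.0, 1000.0))   # 25.0 (%), the silicon figure quoted above
# A 1 m^2 organic cell delivering 105 W:
print(pce(105.0, 1000.0))   # 10.5 (%), roughly the DSSC/OSC state of the art
```

The gap between those two numbers is why incremental boosts – such as singlet fission generating two charge carriers from one photon – matter so much for the alternative materials.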
Mukhopadhyay, who travelled to the UK thanks to funding through the UK–India Education and Research Initiative, explains: “My materials have a fast charge transport rate – as we’ve now proved at Cambridge and Imperial – but they have a low PCE. We think that a process called singlet fission, in which one exciton splits into two, is happening. This makes them interesting to look at because if more than one charge carrier is generated then this can increase the PCE.”
Another family of materials the team has high hopes for is a set of perovskite-structure lead halides. Work at the University of Oxford has already achieved a PCE of above 17% for such materials, and Sadhanala has begun using his machine to see the effect of different processing methods on their properties.
As well as developing cheap, high-performing solar materials, the team will scale up towards prototypes that replicate the performance achieved in the research phase.
Although the aim is to provide a technology that can reduce the carbon footprint of electricity generation anywhere in the world, solar energy could fulfil a massive demand for energy in India. India is the fifth largest producer and consumer of electricity, around 70% of which is based on coal, yet around 200 million people are without access to electricity. With a rapidly growing economy and more than 1 billion people, India faces a huge energy challenge to meet the current government’s mission of ‘Power for all, 24x7, by 2019’.
In fact, India could be an ideal place to adopt new solar technologies on a large scale, says Friend: “India is already currently running the largest renewable capacity expansion programme in the world, and there is a sense that the next technology revolutions may well happen in an emerging country like India that hasn’t already built its future renewables-heavy electricity system.”
“India may well leapfrog the UK in taking up radical new approaches to power generation,” he adds. “We want APEX to contribute to the search for such approaches now and in the future. This is a journey, not a day’s outing.”
Aditya Sadhanala wanders over to the wall, turns a pulley, and a wooden box about a metre squared swings up and away. Below it gleams an array of carefully positioned lasers, deflectors and sensors surrounding a piece of glass no bigger than a contact lens. He flips a switch and creates a ‘mirage’.
An international team of scientists at the MicroBooNE physics experiment in the US, including researchers from the University of Cambridge, has detected its first candidate neutrinos – the so-called 'ghost particles'. The observation represents a milestone for the project, the product of years of hard work and a 40-foot-long particle detector filled with 170 tons of liquid argon.
Neutrinos are subatomic, almost weightless particles that interact only via gravity and the weak nuclear force. Because they don’t interact with light, they can’t be seen. Neutrinos carry no electric charge and travel through the universe almost entirely unaffected by natural forces. They are considered a fundamental building block of matter. The 2015 Nobel Prize in Physics was awarded for the discovery of neutrino oscillations, a phenomenon of great importance to the field of elementary particle physics.
“It’s nine years since we proposed, designed, built, assembled and commissioned this experiment,” said Bonnie Fleming, MicroBooNE co-spokesperson and a professor of physics at Yale University. “That kind of investment makes seeing first neutrinos incredible.”
Following a 13-week shutdown for maintenance, Fermilab’s accelerator complex near Chicago delivered a proton beam – used to make the neutrinos – to the laboratory’s experiments on Thursday. After the beam was turned on, scientists analysed the data recorded by MicroBooNE’s particle detector to find evidence of its first neutrino interactions.
Scientists at the University of Cambridge have been working on advanced image reconstruction techniques that helped identify the rare neutrino interactions in the MicroBooNE data.
The MicroBooNE experiment aims to study how neutrinos interact and change within a distance of 500 meters. The detector will help scientists reconstruct the results of neutrino collisions as finely detailed, three-dimensional images. MicroBooNE findings also will be relevant for the forthcoming Deep Underground Neutrino Experiment (DUNE), which will examine neutrino transitions over longer distances.
“Future neutrino experiments will use this technology,” said Sam Zeller, Fermilab physicist and MicroBooNE co-spokesperson. “We’re learning a lot from this detector. It’s important not just for us, but for the whole physics community.”
“This is an important step towards the much larger Deep Underground Neutrino Experiment (DUNE)”, said Professor Mark Thomson of Cambridge’s Cavendish Laboratory, co-spokesperson of the DUNE collaboration and member of MicroBooNE. “It is the first time that fully automated pattern recognition software has been used to identify neutrino interactions from the complex images in a detector such as MicroBooNE and the proposed DUNE detector.”
Adapted from a Fermilab press release.
A major international collaboration has seen its first neutrinos – so-called ‘ghost particles’ – in the experiment’s newly built detector.
Few things make us as competitive as getting our children into the right school. That is why families are willing to spend so much money either moving house to get into a good state school’s catchment area or sending their children to a fee-paying school.
But the vast majority are stuck with the local school, good or bad. So how can we create a level playing field for students? Unfortunately, it seems we are still a long way away as too many teachers continue to exhibit a tendency towards gender stereotyping by making assumptions about what girls or boys are suited to, such as boys being “better” at science. But, as outlined in a recent report, there are actually simple ways to avoid this.
The report by the Institute of Physics highlights what can be done to ensure that boys and girls are offered the same opportunities and encouragement to pursue each and every subject. The IOP’s initial motivation for the work was the paucity of girls proceeding to physics A-level: girls make up a mere 20-25% of the A-level cohort.
The factors at work in schools that affect the progression of girls to physics post-16 were detailed in a 2012 report. Building on this first report was another, which demonstrated that gender stereotyping is as damaging for boys, putting them off subjects such as Psychology and English. This third and most recent report aims to identify actions that every school could and should take to eradicate this unnecessary stereotyping, in order to ensure that all children can follow their dreams and fulfil their potential in whatever direction it lies.
Common examples of stereotyping include telling a girl “you do maths like a boy” (I’m not even sure I know what that means) or, perhaps even worse, “girls can’t do maths”. Too many parents have asked me how they could influence teachers to stop giving such negative messages to their daughters.
The actions seem so obvious. They include identifying a senior champion and providing training to counter stereotyping. Also, it should not need to be spelled out – yet it clearly does – that there should be a strict policy that all subjects are presented equally to students in terms of their relative difficulty, and that teachers refrain from making any remarks about how difficult they find particular subjects. Similarly obvious is the recommendation that sexist language should be treated as just as unacceptable as racist and homophobic language, and that all teachers should receive training on unconscious bias and equality and diversity awareness.
For all in or interacting with the teaching profession, whatever your subject speciality or at whatever level, I would recommend you read the full list of proposals and, if you have time, the full report.
A recent newspaper article illustrates the problem well. The head of Francis Holland School in London, one of those fee-paying schools wealthier families aspire to get their girls into (it is a single-sex school), was quoted as saying on motherhood and career: “I believe there is a glass ceiling – if we tell them there isn’t one, we are telling them a lie.” She added that: “Young girls have massive options these days and some of them will make a decision that they don’t want to combine everything and that is as valid as making the decision that you do want to combine everything.”
This doesn’t go quite as far as the headline, which read “Girls must choose career or motherhood, says top head”, implied, but it does suggest that those who do try both won’t get very far. It’s a deeply damaging message and dispiriting to see it run in a national paper.
Surely this is not the advice we should be giving to young girls making crucial decisions about their futures. Why aren’t teachers acting according to the IOP guidelines and treating boys and girls in the same way? By and large, babies have two parents who, once the pregnancy and birth are over, should be working out how, as a pair, they can bring up the child. A head teacher who implies it is the mother’s sole responsibility has neither caught up with the law about parental leave nor our changing society’s expectations.
A recent report claimed that the mother was the main earner in a third of families (the bulk of these being low-income families). Head teachers have a responsibility to encourage aspirations and not to deter dreams. They should make sure that their pupils are aware of reality but not smothered by anachronistic views.
Positive role models
That girls are still discouraged from subjects such as maths and physics by teachers, as well as peers, parents and the media, is deeply disappointing. Forty years ago, this would perhaps have seemed less surprising. Indeed, back then, it was probably the norm.
Shortly before the report was published, I engaged in a public conversation with Dame Carol Robinson, a prize-winning chemist who has the unique distinction of being the first woman to hold a chair in chemistry at both Cambridge and Oxford (where she is now based). I was trying to tease out what motivated her, how she had set out on her career and how it had unfolded.
Even a brief conversation with her highlights her most unusual career path, starting with the fact that she left school at 16. She left in part because of the lack of encouragement she received from both school and family to stay in education of any sort. She simply wasn’t expected to make a career for herself, so education presumably seemed irrelevant. In fact, while working at Pfizer in Kent she was able to get further qualifications.
Ultimately, she moved back into full-time education to complete a PhD in Cambridge – without ever getting a first degree. After that she took eight years out to bring up her three children before going back to work. Yet now she is an acclaimed professor, and a fellow of the Royal Society with many awards to her name. (You can listen to the whole conversation here.)
Surely she is proof of the fact that not only can women be successful in the physical sciences, but that you can get to the top of the game and still be a mother, indeed still have a period as a stay-at-home mother. You might think that would not need saying, but apparently it does. Even today.
In a generation, perhaps aspirations – for boys and girls, regardless of subject, class, ethnicity or any other irrelevant category – really will mean we have reached equity. I have to live in hope, but we are clearly a long way off that happy state as yet.
The opinions expressed in this article are those of the individual author(s) and do not represent the views of the University of Cambridge.
Professor Dame Athene Donald (Cavendish Laboratory) discusses actions that schools can take to eradicate unnecessary gender stereotyping.
Dr John Rudge (Department of Earth Sciences), Dr Suchitra Sebastian (Department of Physics), and Dr Renaud Gagné (Faculty of Classics) have been awarded Philip Leverhulme Prizes in recognition of their outstanding research work.
Philip Leverhulme Prizes recognise the achievement of outstanding researchers whose work has already attracted international recognition and whose future career is exceptionally promising.
Explaining the work for which he was awarded the prize, Dr Rudge said: “Contained within rocks and meteorites is a record of the Earth's 4.5 billion year history. This record is encrypted in a chemical code and the overarching goal of my research is to decrypt the record, and produce new insights into the workings of our planet's interior.
“I do this by building mathematical models of the fundamental physical processes that control the chemistry of rocks. I have used these models to place new constraints on how long it took the Earth to form, how long the first crust lasted, and how fast material is recycled by plate tectonics.”
Rudge also plans to use part of the prize funds to support summer research projects for undergraduates, nurturing the next generation of Earth scientists.
Dr Sebastian’s work has received international recognition for the discovery of new physical phenomena by combining materials synthesis, low temperatures, high magnetic fields, and large applied pressures to access previously unexplored regions of phase space.
She said: “We are working toward the design of new superconductors by applying high pressures to selected non-superconductors, following our discovery of pressure-induced superconductivity in iron-based magnets.
“I also propose to use high magnetic fields to explore quantum materials positioned in-between metallic and insulating regimes, where we have uncovered preliminary signatures of a fascinating new phase of matter that is neither metal nor insulator, but has elements of both. The award will facilitate travel to perform experiments at international high magnetic field facilities.”
In his work, Dr Gagné is concerned with the literary representation of ancient Greek religion and its reception. Most notably, in 2013 he authored Ancestral Fault in Ancient Greece, which provides an innovative and comprehensive analysis of a core concept of Greek religious culture – the notion that individuals can be punished for the actions of their forebears.
“The book redefines a fundamental question at the crossroads of Greek literature and religion, and provides a model for other work along the same lines,” says Gagné. “It covers the evolution of the idea of ancestral fault through many genres and centuries of Greek literature, from Homer to Proclus, and follows the long reception of that ancient material from Late Antiquity to contemporary scholarship.”
The award will help Gagné to finish two books, Hyperborea: Excursions to the Overnorth and Chorus and Symposium: Metaphors of Performance in Ancient Greek Culture.
More than 40 companies, mostly from the UK, are in Cambridge this week to demonstrate some of the new products being developed from graphene and other two-dimensional materials.
Graphene is a two-dimensional material made up of sheets of carbon atoms. With its combination of exceptional electrical, mechanical and thermal properties, graphene has the potential to revolutionise industries ranging from healthcare to electronics.
On Thursday, the Cambridge Graphene Technology Day – an exhibition of graphene-based technologies organised by the Cambridge Graphene Centre, together with its partner companies – took place, showcasing new products based on graphene and related two-dimensional materials.
Some of the examples of the products and prototypes on display included flexible displays, printed electronics, and graphene-based heaters, all of which have potential for consumer applications. Other examples included concrete and road surfacing incorporating graphene, which would mean lighter and stronger infrastructure, and roads that have to be resurfaced far less often, greatly lowering the costs to local governments.
“At the Cambridge Graphene Technology Day we saw several real examples of graphene making its way from the lab to the factory floor – creating jobs and growth for Cambridge and the UK,” said Professor Andrea Ferrari, Director of the Cambridge Graphene Centre and of the EPSRC Centre for Doctoral Training in Graphene Technology. “Cambridge is very well-placed in the network of UK, European and global initiatives targeting the development of new products and devices based on graphene and related materials.”
Cambridge has a long history of research into carbon-based materials and their applications, from the identification of the graphite structure in 1924 through diamond, diamond-like carbon, conducting polymers and carbon nanotubes, with a proven track record of taking carbon research from the lab to the factory floor.
Cambridge is also one of the leading centres in graphene technology. Dr Krzysztof Koziol from the Department of Materials Science & Metallurgy sits on the management board of the EPSRC Centre for Doctoral Training in Graphene Technology. He is developing hybrid electrical wires made from copper and graphene in order to improve the amount of electric current they can carry, functional graphene heaters, anti-corrosion coatings, and graphene inks which can be used to draw printed circuit boards directly onto paper and other surfaces.
Koziol has established a spin-out company, Cambridge Nanosystems, which produces high volumes of graphene for industrial applications. The company, co-founded by recent Cambridge graduate Catharina Paukner, has recently established a partnership with a major auto manufacturer to start developing graphene-based applications for cars.
Other researchers affiliated with the Cambridge Graphene Centre include Professor Clare Grey of the Department of Chemistry, who is part of the Cambridge Graphene Centre Management Board. She is incorporating graphene and related materials into next-generation batteries and has recently demonstrated a breakthrough in lithium-air batteries by exploiting graphene. Professor Mete Atature from the Department of Physics is one of the supervisors of the Centre for Doctoral Training in Graphene Technology. He uses two-dimensional materials for research in quantum optics, including the possibility of a computer network based on quantum mechanics, which would be far more secure and powerful than its classical counterpart.
“The Cambridge Graphene Centre is a great addition to the Cambridge technology and academic cluster,” said Chuck Milligan, CEO of FlexEnable, which is developing technology for flexible displays and other electronic components. "We are proud to be a partner of the Centre and support its activities. Graphene and other two-dimensional materials are very relevant to flexible electronics for displays and sensors, and we are passionate about taking technology from labs to the factory floor. Our unique manufacturing processes for flexible electronics, together with the exponential growth expected in the flexible display and Internet of Things sensor markets, provide enormous opportunity for this exciting class of materials. It is for this reason that today we placed in the Cambridge Graphene Centre laboratories a semi-automatic, large-area EVG spray coater. This valuable tool, donated to the University, will be a good match between the area of research of solution-processable graphene and FlexEnable's long-term technological vision."
FlexEnable is supporting efforts to scale the graphene technology for use in tomorrow's factories. The company has donated a large area deposition machine to the University, which is used for depositing large amounts of graphene onto various substrates.
“The University is at the heart of the largest, most vibrant technology cluster in Europe,” said Professor Sir Leszek Borysiewicz, the University’s Vice-Chancellor. “Our many partnerships with industry support the continued economic success of the region and the UK more broadly, and the Cambridge Graphene Centre is an important part of that – working with industry to bring these promising materials to market.”
Professor David Cardwell, Head of the Cambridge Engineering Department, pointed to the planned development in Cambridge of a scale-up centre, where research will be nurtured towards higher technology readiness levels in collaboration with UK industry. “The Cambridge Graphene Centre is a direct and obvious link to this scale-up initiative, which will offer even more exciting opportunities for industry-university collaborations,” he said.
Among the many local companies with an interest in graphene technologies are FlexEnable, the R&D arm of global telecommunications firm Nokia, printed electronics pioneer Novalia, Cambridge Nanosystems, Cambridge Graphene, and Aixtron, which specialises in the large-scale production of graphene powders, inks and films for a variety of applications.
Underpinning this commercial R&D effort in Cambridge and the East of England is public and private investment in the Cambridge Graphene Centre via the Graphene Flagship, part funded by the European Union. The flagship is a pan-European consortium, with a fast-growing number of industrial partners and associate members.
A major showcase of companies developing new technologies from graphene and other two-dimensional materials took place this week at the Cambridge Graphene Centre.