Tagged: MIT

  • richardmitnick 2:20 pm on May 19, 2019
    Tags: "Manipulating atoms one at a time with an electron beam", , Developing a method that can reposition atoms with a highly focused electron beam and control their exact location and bonding orientation., MIT, Scanning transmission electron microscopes, Ultimately the goal is to move multiple atoms in complex ways.   

    From MIT News: “Manipulating atoms one at a time with an electron beam” 


    May 17, 2019
    David L. Chandler

    This diagram illustrates the controlled switching of positions of a phosphorus atom within a layer of graphite by using an electron beam, as was demonstrated by the research team. Courtesy of the researchers.

    Microscope images are paired with diagrams illustrating the controlled movement of atoms within a graphite lattice, using an electron beam to manipulate the positions of atoms one at a time. Courtesy of the researchers.

    New method could be useful for building quantum sensors and computers.

    The ultimate degree of control for engineering would be the ability to create and manipulate materials at the most basic level, fabricating devices atom by atom with precise control.

    Now, scientists at MIT, the University of Vienna, and several other institutions have taken a step in that direction, developing a method that can reposition atoms with a highly focused electron beam and control their exact location and bonding orientation. The finding could ultimately lead to new ways of making quantum computing devices or sensors, and usher in a new age of “atomic engineering,” they say.

    The advance is described today in the journal Science Advances, in a paper by MIT professor of nuclear science and engineering Ju Li, graduate student Cong Su, Professor Toma Susi of the University of Vienna, and 13 others at MIT, the University of Vienna, Oak Ridge National Laboratory, and in China, Ecuador, and Denmark.

    “We’re using a lot of the tools of nanotechnology,” explains Li, who holds a joint appointment in materials science and engineering. But in the new research, those tools are being used to control processes that are yet an order of magnitude smaller. “The goal is to control one to a few hundred atoms, to control their positions, control their charge state, and control their electronic and nuclear spin states,” he says.

    While others have previously manipulated the positions of individual atoms, even creating a neat circle of atoms on a surface, that process involved picking up individual atoms on the needle-like tip of a scanning tunneling microscope and then dropping them in position, a relatively slow mechanical process. The new process manipulates atoms using a relativistic electron beam in a scanning transmission electron microscope (STEM), so it can be fully electronically controlled by magnetic lenses and requires no mechanical moving parts. That makes the process potentially much faster, and thus could lead to practical applications.

    Custom-designed scanning transmission electron microscope at Cornell University by David Muller/Cornell University

    MIT scanning transmission electron microscope

    Using electronic controls and artificial intelligence, “we think we can eventually manipulate atoms at microsecond timescales,” Li says. “That’s many orders of magnitude faster than we can manipulate them now with mechanical probes. Also, it should be possible to have many electron beams working simultaneously on the same piece of material.”

    “This is an exciting new paradigm for atom manipulation,” Susi says.

    Computer chips are typically made by “doping” a silicon crystal with other atoms needed to confer specific electrical properties, thus creating “defects” in the material — regions that do not preserve the perfectly orderly crystalline structure of the silicon. But that process is scattershot, Li explains, so there’s no way of controlling with atomic precision where those dopant atoms go. The new system allows for exact positioning, he says.

    The same electron beam can be used for knocking an atom both out of one position and into another, and then “reading” the new position to verify that the atom ended up where it was meant to, Li says. While the positioning is essentially determined by probabilities and is not 100 percent accurate, the ability to determine the actual position makes it possible to select out only those that ended up in the right configuration.

    Atomic soccer

    The very narrowly focused electron beam, about as wide as an atom, knocks an atom out of its position, and by selecting the exact angle of the beam, the researchers can determine where it is most likely to end up. “We want to use the beam to knock out atoms and essentially to play atomic soccer,” dribbling the atoms across the graphene field to their intended “goal” position, he says.

    “Like soccer, it’s not deterministic, but you can control the probabilities,” he says. “Like soccer, you’re always trying to move toward the goal.”
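
    To make the “controlled probabilities” idea concrete, here is a minimal toy simulation of that atomic dribbling. All probabilities below are invented for illustration; in the experiment they depend on the beam angle and energy:

        import random

        def dribble(steps_to_goal, p_forward=0.6, p_stray=0.3, max_kicks=200):
            """Toy 'atomic soccer': each beam kick moves the dopant one site
            toward the goal, one site away, or ejects it from the lattice.
            All probabilities here are made up for illustration."""
            position = steps_to_goal
            for _ in range(max_kicks):
                r = random.random()
                if r < p_forward:
                    position -= 1    # kick lands the atom closer to the goal
                elif r < p_forward + p_stray:
                    position += 1    # atom strays to a neighboring site
                else:
                    return False     # atom ejected: "find another ball"
                if position == 0:
                    return True      # imaging verifies the atom at the goal
            return False

        trials = 10_000
        wins = sum(dribble(5) for _ in range(trials))
        print(f"reached the goal in {wins / trials:.1%} of trials")

    Because each new position can be imaged afterward, runs that end badly can simply be discarded, which is how a probabilistic kick still yields a reliably placed atom.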

    In the team’s experiments, they primarily used phosphorus atoms, a commonly used dopant, in a sheet of graphene, a two-dimensional sheet of carbon atoms arranged in a honeycomb pattern. The phosphorus atoms end up substituting for carbon atoms in parts of that pattern, thus altering the material’s electronic, optical, and other properties in ways that can be predicted if the positions of those atoms are known.

    Ultimately, the goal is to move multiple atoms in complex ways. “We hope to use the electron beam to basically move these dopants, so we could make a pyramid, or some defect complex, where we can state precisely where each atom sits,” Li says.

    This is the first time electronically distinct dopant atoms have been manipulated in graphene. “Although we’ve worked with silicon impurities before, phosphorus is both potentially more interesting for its electrical and magnetic properties and, as we’ve now discovered, also behaves in surprisingly different ways. Each element may hold new surprises and possibilities,” Susi adds.

    The system requires precise control of the beam angle and energy. “Sometimes we have unwanted outcomes if we’re not careful,” he says. For example, sometimes a carbon atom that was intended to stay in position “just leaves,” and sometimes the phosphorus atom gets locked into position in the lattice, and “then no matter how we change the beam angle, we cannot affect its position. We have to find another ball.”

    Theoretical framework

    In addition to detailed experimental testing and observation of the effects of different angles and positions of the beams and graphene, the team also devised a theoretical basis to predict the effects, called primary knock-on space formalism, that tracks the momentum of the “soccer ball.” “We did these experiments and also gave a theoretical framework on how to control this process,” Li says.

    The cascade of effects that results from the initial beam takes place over multiple time scales, Li says, which made the observations and analysis tricky to carry out. The actual initial collision of the relativistic electron (moving at about 45 percent of the speed of light) with an atom takes place on a scale of zeptoseconds — trillionths of a billionth of a second — but the resulting movement and collisions of atoms in the lattice unfolds over time scales of picoseconds or longer — billions of times longer.
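
    Those figures are easy to sanity-check with textbook relativity. A back-of-the-envelope sketch (the experiment’s exact beam voltage is not given here, so the 45 percent figure is taken at face value):

        import math

        ELECTRON_REST_ENERGY_KEV = 511.0          # m_e c^2

        beta = 0.45                               # v / c, as quoted above
        gamma = 1.0 / math.sqrt(1.0 - beta ** 2)  # Lorentz factor, ~1.12
        kinetic_kev = (gamma - 1.0) * ELECTRON_REST_ENERGY_KEV

        print(f"kinetic energy ~ {kinetic_kev:.0f} keV")  # ~61 keV, a typical STEM beam energy

        # The timescale gap quoted above: picoseconds vs. zeptoseconds.
        print(f"timescale ratio ~ {1e-12 / 1e-21:.0e}")   # 1e+09, i.e. billions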

    Dopant atoms such as phosphorus have a nonzero nuclear spin, which is a key property needed for quantum-based devices because that spin state is easily affected by elements of its environment such as magnetic fields. So the ability to place these atoms precisely, in terms of both position and bonding, could be a key step toward developing quantum information processing or sensing devices, Li says.

    “This is an important advance in the field,” says Alex Zettl, a professor of physics at the University of California at Berkeley, who was not involved in this research. “Impurity atoms and defects in a crystal lattice are at the heart of the electronics industry. As solid-state devices get smaller, down to the nanometer size scale, it becomes increasingly important to know precisely where a single impurity atom or defect is located, and what are its atomic surroundings. An extremely challenging goal is having a scalable method to controllably manipulate or place individual atoms in desired locations, as well as predicting accurately what effect that placement will have on device performance.”

    Zettl says that these researchers “have made a significant advance toward this goal. They use a moderate energy focused electron beam to coax a desirable rearrangement of atoms, and observe in real-time, at the atomic scale, what they are doing. An elegant theoretical treatise, with impressive predictive power, complements the experiments.”

    Besides the leading MIT team, the international collaboration included researchers from the University of Vienna, the University of Chinese Academy of Sciences, Aarhus University in Denmark, the National Polytechnic School in Ecuador, Oak Ridge National Laboratory, and Sichuan University in China. The work was supported by the National Science Foundation, the U.S. Army Research Office through MIT’s Institute for Soldier Nanotechnologies, the Austrian Science Fund, the European Research Council, the Danish Council for Independent Research, the Chinese Academy of Sciences, and the U.S. Department of Energy.

    See the full article here.


    Please help promote STEM in your local schools.


    STEM Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

  • richardmitnick 1:03 pm on May 19, 2019
    Tags: MIT, Reversing traditional plasma shaping provides greater stability for fusion reactions.

    From MIT News: “Steering fusion’s ‘D-turn'” 


    May 17, 2019
    Paul Rivenberg | Plasma Science and Fusion Center

    Cross sections of pressure profiles are shown in two different tokamak plasma configurations (the center of the tokamak doughnut is to the left of these). The discharges have high pressure in the core (yellow) that decreases to low pressure (blue) at the edge. Researchers achieved substantial high-pressure operation of reverse-D plasmas at the DIII-D National Fusion Facility.

    Image: Alessandro Marinoni/MIT PSFC

    Research scientist Alessandro Marinoni shows that reversing traditional plasma shaping provides greater stability for fusion reactions.

    Trying to duplicate the power of the sun for energy production on earth has challenged fusion researchers for decades. One path to endless carbon-free energy has focused on heating and confining plasma fuel in tokamaks, which use magnetic fields to keep the turbulent plasma circulating within a doughnut-shaped vacuum chamber and away from the walls. Fusion researchers have favored contouring these tokamak plasmas into a triangular or D shape, with the curvature of the D stretching away from the center of the doughnut, which allows plasma to withstand the intense pressures inside the device better than a circular shape.
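
    The geometry is easy to picture with the standard analytic boundary shape used in plasma physics, sketched below with round, roughly DIII-D-sized placeholder numbers rather than the machine’s actual equilibrium parameters. Positive triangularity (delta) pulls the top of the plasma inward toward the center column, giving the conventional D; flipping the sign of delta produces the reverse D discussed next.

        import math

        def boundary(R0, a, kappa, delta, n=400):
            """Miller-type analytic tokamak boundary (an illustration, not a
            DIII-D equilibrium): R = R0 + a*cos(t + delta*sin t), Z = kappa*a*sin t.
            delta > 0 gives the conventional D; delta < 0 the reverse D."""
            ts = [2 * math.pi * i / n for i in range(n + 1)]
            return [(R0 + a * math.cos(t + delta * math.sin(t)),
                     kappa * a * math.sin(t)) for t in ts]

        # The top of the plasma (max Z) sits inboard for a D, outboard for a reverse D.
        for name, delta in (("conventional D", +0.4), ("reverse D", -0.4)):
            pts = boundary(R0=1.7, a=0.6, kappa=1.8, delta=delta)
            R_top, Z_top = max(pts, key=lambda p: p[1])
            print(f"{name}: top of plasma at R = {R_top:.2f} m, Z = {Z_top:.2f} m")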

    Led by research scientists Alessandro Marinoni of MIT’s Plasma Science and Fusion Center (PSFC) and Max Austin, of the University of Texas at Austin, researchers at the DIII-D National Fusion Facility have discovered promising evidence that reversing the conventional shape of the plasma in the tokamak chamber can create a more stable environment for fusion to occur, even under high pressure. The results were recently published in Physical Review Letters and Physics of Plasmas.

    DIII-D National Fusion Facility. General Atomics

    Marinoni first experimented with the “reverse-D” shape, also known as “negative triangularity,” while pursuing his PhD on the TCV tokamak at Ecole Polytechnique Fédérale de Lausanne, Switzerland.

    The Tokamak à configuration variable (TCV, literally “variable configuration tokamak”) is a Swiss research fusion reactor of the École polytechnique fédérale de Lausanne. Its distinguishing feature over other tokamaks is that its torus section is three times higher than wide. This allows studying several shapes of plasmas, which is particularly relevant since the shape of the plasma has links to the performance of the reactor. The TCV was set up in November 1992.

    The TCV team was able to show that negative triangularity helps to reduce plasma turbulence, thus increasing confinement, a key to sustaining fusion reactions.

    “Unfortunately, at that time, TCV was not equipped to operate at high plasma pressures with the ion temperature being close to that of electrons,” notes Marinoni, “so we couldn’t investigate regimes that are directly relevant to fusion plasma conditions.”

    Growing up outside Milan, Marinoni developed an interest in fusion through an early passion for astrophysical phenomena, hooked in preschool by the compelling mysteries of black holes.

    “It was fascinating because black holes can trap light. At that time I was just a little kid. As such, I couldn’t figure out why the light could be trapped by the gravitational force exerted by black holes, given that on Earth nothing like that ever happens.”

    As he matured he joined a local amateur astronomy club, but eventually decided black holes would be a hobby, not his vocation.

    “My job would be to try producing energy through nuclear fission or fusion; that’s the reason why I enrolled in the nuclear engineering program in the Polytechnic University of Milan.”

    After studies in Italy and Switzerland, Marinoni seized the opportunity to join the PSFC’s collaboration with the DIII-D tokamak in San Diego, under the direction of MIT professor of physics Miklos Porkolab. As a postdoc, he used MIT’s phase contrast imaging diagnostic to measure plasma density fluctuations in DIII-D, later continuing work there as a PSFC research scientist.

    Max Austin, after reading the negative triangularity results from TCV, decided to explore the possibility of running similar experiments on the DIII-D tokamak to confirm the stabilizing effect of negative triangularity. For the experimental proposal, Austin teamed up with Marinoni and together they designed and carried out the experiments.

    “The DIII-D research team was working against decades-old assumptions,” says Marinoni. “It was generally believed that plasmas at negative triangularity could not hold high enough plasma pressures to be relevant for energy production, because of macroscopic scale Magneto-Hydro-Dynamics (MHD) instabilities that would arise and destroy the plasma. MHD is a theory that governs the macro-stability of electrically conducting fluids such as plasmas. We wanted to show that under the right conditions the reverse-D shape could sustain MHD stable plasmas at high enough pressures to be suitable for a fusion power plant, in some respects even better than a D-shape.”

    While D-shaped plasmas are the standard configuration, they have their own challenges. They are affected by high levels of turbulence, which hinders them from achieving the high pressure levels necessary for economic fusion. Researchers have solved this problem by creating a narrow layer near the plasma boundary where turbulence is suppressed by large flow shear, thus allowing inner regions to attain higher pressure. In the process, however, a steep pressure gradient develops in the outer plasma layers, making the plasma susceptible to instabilities called edge localized modes that, if sufficiently powerful, would expel a substantial fraction of the built-up plasma energy, thus damaging the tokamak chamber walls.

    DIII-D was designed for the challenges of creating D-shaped plasmas. Marinoni praises the DIII-D control group for “working hard to figure out a way to run this unusual reverse-D shape plasma.”

    The effort paid off. DIII-D researchers were able to show that even at higher pressures, the reverse-D shape is as effective at reducing turbulence in the plasma core as it was in the low-pressure TCV environment. Despite previous assumptions, DIII-D demonstrated that plasmas at reversed triangularity can sustain pressure levels suitable for a tokamak-based fusion power plant; additionally, they can do so without the need to create a steep pressure gradient near the edge that would lead to machine-damaging edge localized modes.

    Marinoni and colleagues are planning future experiments to further demonstrate the potential of this approach in an even more fusion-power relevant magnetic topology, based on a “diverted” tokamak concept. He has tried to interest other international tokamaks in experimenting with the reverse configuration.

    “Because of hardware issues, only a few tokamaks can create negative triangularity plasmas; tokamaks like DIII-D, that are not designed to produce plasmas at negative triangularity, need a significant effort to produce this plasma shape. Nonetheless, it is important to engage the fusion community worldwide to more fully establish the database on the benefits of this shape.”

    Marinoni looks forward to where the research will take the DIII-D team. He looks back to his introduction to the tokamak, which has become the focus of his research.

    “When I first learned about tokamaks I thought, ‘Oh, cool! It’s important to develop a new source of energy that is carbon free!’ That is how I ended up in fusion.”

    This research is sponsored by the U.S. Department of Energy Office of Science’s Fusion Energy Sciences, using their DIII-D National Fusion Facility.

    See the full article here.


    Please help promote STEM in your local schools.


    STEM Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

  • richardmitnick 8:41 am on May 16, 2019
    Tags: A slippery surface for liquids with very low surface tension promotes droplet formation facilitating heat transfer., MIT, Specialized thin coatings

    From MIT News: “New surface treatment could improve refrigeration efficiency” 


    May 15, 2019
    David L. Chandler

    Specialized thin coatings developed by the MIT team cause even low-surface-tension fluids to readily form droplets on the surface of a pipe, as seen here, which improves the efficiency of heat transfer. Image courtesy of the researchers

    Artistic rendition of a dropwise condensing shell and tube heat exchanger, where vapor molecules condense onto heat exchange tubes and form drops that shed from the surface. The different colors and shapes represent different vapor materials. Illustration by Demin Liu

    A slippery surface for liquids with very low surface tension promotes droplet formation, facilitating heat transfer.

    Unlike water, liquid refrigerants and other fluids that have a low surface tension tend to spread quickly into a sheet when they come into contact with a surface. But for many industrial processes it would be better if the fluids formed droplets, which could roll or fall off the surface and carry heat away with them.

    Now, researchers at MIT have made significant progress in promoting droplet formation and shedding in such fluids. This approach could lead to efficiency improvements in many large-scale industrial processes including refrigeration, thus saving energy and reducing greenhouse gas emissions.

    The new findings are described in the journal Joule, in a paper by graduate student Karim Khalil, professor of mechanical engineering Kripa Varanasi, professor of chemical engineering and Associate Provost Karen Gleason, and four others.

    Over the years, Varanasi and his collaborators have made great progress in improving the efficiency of condensation systems that use water, such as the cooling systems used for fossil-fuel or nuclear power generation. But other kinds of fluids — such as those used in refrigeration systems, liquefaction, waste heat recovery, and distillation plants, or materials such as methane in oil and gas liquefaction plants — often have very low surface tension compared to water, meaning that it is very hard to get them to form droplets on a surface. Instead, they tend to spread out in a sheet, a property known as wetting.

    But when these sheets of liquid coat a surface, they provide an insulating layer that inhibits heat transfer, and easy heat transfer is crucial to making these processes work efficiently. “If it forms a film, it becomes a barrier to heat transfer,” Varanasi says. But that heat transfer is enhanced when the liquid quickly forms droplets, which then coalesce and grow and fall away under the force of gravity. Getting low-surface-tension liquids to form droplets and shed them easily has been a serious challenge.

    In condensing systems that use water, the overall efficiency of the process can be around 40 percent, but with low-surface-tension fluids, the efficiency can be limited to about 20 percent. Because these processes are so widespread in industry, even a tiny improvement in that efficiency could lead to dramatic savings in fuel, and therefore in greenhouse gas emissions, Varanasi says.

    By promoting droplet formation, he says, it’s possible to achieve a four- to eightfold improvement in heat transfer. Because the condensation is just one part of a complex cycle, that translates into an overall efficiency improvement of about 2 percent. That may not sound like much, but in these huge industrial processes even a fraction of a percent improvement is considered a major achievement with great potential impact. “In this field, you’re fighting for tenths of a percent,” Khalil says.

    Unlike the surface treatments Varanasi and his team have developed for other kinds of fluids, which rely on a liquid material held in place by a surface texture, in this case they were able to accomplish the fluid-repelling effect using a very thin solid coating — less than a micron thick (one millionth of a meter). That thinness is important, to ensure that the coating itself doesn’t contribute to blocking heat transfer, Khalil explains.
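
    A rough conduction estimate shows why sub-micron thickness matters: the thermal resistance of a planar layer is its thickness divided by its thermal conductivity. The property values below are generic, assumed figures for illustration, not measurements from the paper:

        # R'' = t / k, in m^2*K/W
        def conduction_resistance(thickness_m, k_w_per_mk):
            return thickness_m / k_w_per_mk

        r_coating = conduction_resistance(1e-6, 0.2)   # ~1 um polymer, k ~ 0.2 W/(m*K)
        r_film = conduction_resistance(100e-6, 0.11)   # ~100 um pentane film, k ~ 0.11 W/(m*K)

        print(f"coating: {r_coating:.1e} m^2*K/W")
        print(f"liquid film: {r_film:.1e} m^2*K/W")
        print(f"film has ~{r_film / r_coating:.0f}x the coating's resistance")

    On these assumed numbers, the micron-scale coating adds less than a percent of the resistance of the liquid film it helps remove, which is the point of keeping it thin.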

    The coating, made of a specially formulated polymer, is deposited on the surface using a process called initiated chemical vapor deposition (iCVD), in which the coating material is vaporized and grafts onto the surface to be treated, such as a metal pipe, to form a thin coating. This process was developed at MIT by Gleason and is now widely used.

    The authors optimized the iCVD process by tuning the grafting of coating molecules onto the surface, in order to minimize the pinning of condensing droplets and facilitate their easy shedding. The process could be carried out on location in industrial-scale equipment, and could be retrofitted into existing installations to provide a boost in efficiency. The process is “materials agnostic,” Khalil says, and can be applied on either flat surfaces or tubing made of stainless steel, titanium, or other metals commonly used in condensation heat-transfer processes that involve these low-surface-tension fluids. “Whatever materials are used in your facility’s heat exchanger, it tends to be scalable with this process,” he adds.

    The net result is that on these surfaces, condensing fluids like the hydrocarbons pentane or liquid methane, or alcohols like ethanol, will readily form small droplets that quickly fall off the surface, making room for more to form, and in the process shedding heat from the metal to the droplets that fall away.

    One area where such coatings could play a useful role, Varanasi says, is in organic Rankine cycle systems, which are widely used for generating power from waste heat in a variety of industrial processes. “These are inherently inefficient systems,” he says, “but this could make them more efficient.”

    The new coating is shown promoting condensation on a titanium surface, a material widely used in industrial heat exchangers.

    “This new approach to condensation is significant because it promotes drop formation (rather than film formation) even for low-surface-tension fluids, which significantly improves the heat transfer efficiency,” says Jonathan Boreyko, an assistant professor of mechanical engineering at Virginia Tech, who was not connected to this research. While the iCVD process itself is not new, he says, “showing here that it can be used even for the condensation of low-surface-tension fluids is of significant practical importance, as many real-life phase-change systems do not use water.”

    Saying the work is “of very high quality,” Boreyko adds that “simply showing for the first time that a thin, durable, and dry coating can promote the dropwise condensation of low-surface-tension fluids is very important for a wide variety of practical condenser systems.”

    The research was supported by the Shell-MIT Energy Initiative partnership. The team included former MIT graduate students Taylor Farnham and Adam Paxson, and former postdocs Dan Soto and Asli Ugur Katmis.

    See the full article here.


    Please help promote STEM in your local schools.


    STEM Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

  • richardmitnick 1:40 pm on May 13, 2019
    Tags: Lincoln Laboratory, MIT

    From MIT News: “Artificial intelligence shines light on the dark web” 


    May 13, 2019
    Kylie Foy | Lincoln Laboratory

    New tools can find patterns in vast online data to track and identify users on illicit forums.

    Rob Spectre and Jennifer Dolle (center left and right, respectively) of the Human Trafficking Response Unit join Joseph Campbell (left) and Charlie Dagli (right) at Lincoln Laboratory to present how data analytical tools are aiding investigations. Photo: Glen Cooper

    Charlie Dagli, Lincoln Laboratory technical principal investigator of the DARPA Memex program, presents his group’s work on artificial intelligence tools to analyze surface- and dark-web data. Photo: Glen Cooper

    To match users from different forums who are likely the same person, an algorithm calculates similarities in profiles, such as their usernames; in content, such as similar phrasings; and in their network, such as the community with which they interact.

    Beneath the surface web, the public form of the internet you use daily to check email or read news articles, exists a concealed “dark web.” Host to anonymous, password-protected sites, the dark web is where criminal marketplaces thrive in the advertising and selling of weapons, drugs, and trafficked persons. Law enforcement agencies work continuously to stop these activities, but the challenges they face in investigating and prosecuting the real-world people behind the users who post on these sites are tremendous.

    “The pop-up nature of dark-web marketplaces makes tracking their participants and their activities extremely difficult,” says Charlie Dagli, a researcher in MIT Lincoln Laboratory’s Artificial Intelligence Technology and Systems Group.

    Dagli is referring to the fast rate at which dark-web markets close down (because they are hacked, raided, abandoned, or set up as an “exit scam” in which the site shuts down intentionally after customers pay for unfulfilled orders) and new ones appear. These markets’ short lifetimes, from a few months to a couple years, impede efforts to identify their users.

    To overcome this challenge, Lincoln Laboratory is developing new software tools to analyze surface- and dark-web data.

    These tools are leveraging the one benefit this whack-a-mole-like problem presents — the connections sellers and buyers maintain across multiple layers of the web, from surface to dark, and across dark-web forums. “This constant switching between sites is now an established part of how dark-web marketplaces operate,” Dagli says.

    Users are making new profiles constantly. Although they may not be employing the same usernames from site to site, they are keeping their connections alive by signaling to each other through their content. These signals can be used to link personas belonging to the same user across dark-web forums and, more revealingly, to link personas on the dark web to the surface web to uncover a user’s true identity.

    Linking users on the dark web is what law enforcement already tries to do. The problem is that the amount of data that they need to manually sift through — 500,000 phone numbers and 2 million sex ads posted a month — is too large and unstructured for them to find connections quickly. Thus, only a low percentage of cases can be pursued.

    To automate the persona-linking process, Lincoln Laboratory is training machine learning algorithms to compute the similarity between users on different forums. The computations are based on three aspects of users’ communications online: “How they identify to others, what they write about, and with whom they write to,” Dagli explains.

    The algorithm is first fed data from users on a given Forum A and creates an authorship model for each user. Then, data from users on Forum B are run against all user models from Forum A. To find matches for profile information, the algorithm looks for straightforward clues, such as changes in username spelling like “sergeygork” on Forum A to “sergey gorkin” on Forum B, or more subtle similarities like “joe knight” to “joe nightmare.”

    The next feature the system looks at is content similarity. The system picks up on unique phrases — for example, “fun in the sun” — that are used in multiple ads. “There’s a lot of copy-and-paste going on, so similar phrasings will pop up that are likely from the same user,” Dagli says. The system then looks for similarities in a user’s network, which is the circle of people that the user interacts with, and the topics that the user’s network discusses.

    The profile, content, and network features are then fused to provide a single output: a probability score that two personas from two forums represent the same real-life person.
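
    In outline, that fusion step can be sketched in a few lines. The sketch below is a simplified stand-in for the idea, not Lincoln Laboratory’s actual learned model; the usernames echo the examples above and the weights are arbitrary:

        from difflib import SequenceMatcher

        def profile_similarity(name_a, name_b):
            # Fuzzy username match, e.g. "sergeygork" vs. "sergey gorkin".
            return SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()

        def jaccard(a, b):
            # Overlap between two sets (phrases used, or contacts interacted with).
            return len(a & b) / len(a | b) if a | b else 0.0

        def link_score(user_a, user_b, weights=(0.3, 0.4, 0.3)):
            # Fuse profile, content, and network similarity into one score in [0, 1].
            wp, wc, wn = weights
            return (wp * profile_similarity(user_a["name"], user_b["name"])
                    + wc * jaccard(user_a["phrases"], user_b["phrases"])
                    + wn * jaccard(user_a["contacts"], user_b["contacts"]))

        forum_a = {"name": "sergeygork", "phrases": {"fun in the sun"},
                   "contacts": {"joe knight", "anna_k"}}
        forum_b = {"name": "sergey gorkin", "phrases": {"fun in the sun"},
                   "contacts": {"joe nightmare", "anna_k"}}

        print(f"same-person score: {link_score(forum_a, forum_b):.2f}")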

    The researchers have been testing these persona-linking algorithms with both open-source Twitter and Instagram data and hand-labeled ground-truth data from dark-web forums. All of the data used in this work are obtained through authorized means. The results are promising. “Every time we report a match, we are correct 95 percent of the time. The system is one of the best linking systems that we can find in the literature,” Dagli says.

    This work is the most recent development in ongoing research. From 2014 to 2017, Lincoln Laboratory contributed to the Defense Advanced Research Projects Agency (DARPA) Memex program. Memex resulted in a suite of surface- and dark-web data analysis software developed collaboratively with dozens of universities, national laboratories, and companies. Ten laboratory technologies spanning text, speech, and visual analytics that were created for Memex were released as open-source software via the DARPA Open Catalog.

    Today, more than 30 agencies worldwide are using Memex software to conduct investigations. One of the biggest users, and a stakeholder in Memex’s development, is the Human Trafficking Response Unit (HTRU) in the Manhattan District Attorney’s Office.

    Manhattan District Attorney Cyrus Vance Jr. stated in a written testimony to the U.S. House of Representatives that his office used Memex tools to screen more than 6,000 arrests for signs of human trafficking in 2017 alone. “We also used Memex in 271 human trafficking investigations and in six new sex trafficking indictments that were brought in 2017,” he stated. With the introduction of Memex, prostitution arrests screened by HTRU for human trafficking indicators increased from 5 to 62 percent, and investigations of New York Police Department prostitution-related arrests increased from 15 to 300 per year.

    Jennifer Dolle, the deputy chief of HTRU, visited the laboratory to present how the unit has benefited from these technologies. “We use these tools every single day. They really have changed how we do business in our office,” Dolle says, explaining that prior to Memex, a human trafficking investigation could take considerably longer.

    Now, Memex tools are enabling HTRU to quickly enhance emerging cases and build sex trafficking investigations from leads that have little information. For example, these tools — including one called TellFinder (built by Memex contributor Uncharted Software) for indexing, summarizing, and searching sex ad data — have been used to identify additional, underage victims from data in a single online prostitution advertisement. “These additional investigative leads allow HTRU to prosecute traffickers on violent felony charges and hold these defendants responsible for the true nature of the crimes they commit against vulnerable victims,” says Dolle.

    Researchers are continuing to learn how emerging technologies can be tailored to what agencies need and to how the dark web operates. “Data-driven machine learning has become a demonstrably important tool for law enforcement to combat illicit online marketplaces on the dark web,” says Lin Li, a principal investigator of this continuous work in the laboratory’s Human Dynamic Dark Networks program, which is funded through the laboratory’s Technology Office. “But, some of the ongoing challenges and areas of research include expanding our understanding of the demand economy, disrupting the supply economy, and gaining a better overall situational awareness.”

    A better understanding of how the supply-and-demand chains of the dark-web economy work will help the team develop technologies to disrupt these chains. Part of the goal is to raise the risks of participating in this illicit economy; linking personas on the dark web to those on the surface web is one potentially powerful tactic.

    “This fast-growing illicit economy was shown by DARPA to fund terrorist activities and shown by HTRU as a driver of modern-day slavery. Defeating terrorism and eliminating slavery are national and humanitarian needs,” says Joseph Campbell, leader of the Artificial Intelligence Technology and Systems Group. “Our group has extraordinary expertise in AI, machine learning, and the analysis of human networks based on information extracted from multilanguage speech, text, and video combined with network communications and activities. The state-of-the-art technologies that we create, develop, and advance are transferred to our sponsors, who use them daily with tremendous impact for these national and humanitarian needs.”

    See the full article here.


    Please help promote STEM in your local schools.


    STEM Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

  • richardmitnick 11:55 am on May 13, 2019
    Tags: Buildings of the future may be lit by collections of glowing plants and designed around an infrastructure of sunlight harvesting water transport and soil collecting and composting systems., Collaboration between MIT architect and chemical engineer could be at the center of new sustainable infrastructure for buildings., MIT, Nanobionic plant technology

    From MIT News: “Ambient plant illumination could light the way for greener buildings” 


    May 9, 2019
    Becky Ham

    Collaboration between MIT architect and chemical engineer could be at the center of new sustainable infrastructure for buildings.

    Glowing nanobionic watercress plants illuminate the Plant Properties Reading Room. Image: KVA Matx and Strano Research Group

    Glowing nanobionic watercress illuminates the book “Paradise Lost.” Image: Strano Research Group

    Pollinator Port – A Plant Properties room featuring an access port for light and pollinators to reach interior plants. Image: KVA Matx and Strano Research Group

    Buildings of the future may be lit by collections of glowing plants and designed around an infrastructure of sunlight harvesting, water transport, and soil collecting and composting systems. That’s the vision behind an interdisciplinary collaboration between an MIT architecture professor and a professor of chemical engineering.

    The light-emitting plants, which debuted in 2017, are not genetically modified to produce light. Instead, they are infused with nanoparticles that turn the plant’s stored energy into light, similar to how fireflies glow. “The transformation makes virtually any plant a sustainable, potentially revolutionary technology,” says Michael Strano, the Carbon P. Dubbs Professor of Chemical Engineering at MIT. “It promises lighting independent of an electrical grid, with ‘batteries’ you never need to charge, and power lines that you never need to lay.”

    But Strano and his colleagues soon realized that they needed partners who could expand the concept and understand its challenges and potential as part of a future of sustainable energy. He reached out to Sheila Kennedy, professor of architecture at MIT and principal at Kennedy and Violich Architecture, who is known for her work in clean energy infrastructure.

    “The science was so new and emergent that it seemed like an interesting design challenge,” says Kennedy. “The work of this design needed to move to a different register, which went beyond the problem of how the plant nanobionics could be demonstrated in architecture. As a design team, we considered some fundamental questions, such as how to understand and express the idea of plant lighting as a living, biological technology and how to invite the public to imagine this new future with plants.”

    “If we treat the development of the plant as we would just another light bulb, that’s the wrong way to go,” Strano adds.

    In 2017, Kennedy and Strano received a Professor Amar G. Bose Research Grant to build on their collaboration. The MIT faculty grants support unconventional, ahead-of-the-curve, and often interdisciplinary research endeavors that are unlikely to be funded through traditional avenues, yet have the potential to lead to big breakthroughs.

    Their first year of the Bose grant yielded several generations of the light-emitting watercress plants, which shine longer and brighter than the first experimental versions. The team is evaluating a new component of the nanobionic plants that they call light capacitor particles. The capacitor, in the form of infused nanoparticles in the plant, stores spikes in light generation and “bleeds them out over time,” Strano explains. “Normally the light created in the biochemical reaction can be bright but fades quickly over time. Capacitive particles extend the duration of the generated plant light from hours to potentially days and weeks.”
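
    One way to picture the capacitor behavior is as a first-order release: a burst of stored chemiluminescence bleeds out exponentially. This is only an analogy with an invented time constant, not the team’s measured kinetics:

        import math

        tau_hours = 72.0   # assumed release time constant (illustrative only)
        for t_hours in (0, 12, 24, 72, 168, 336):
            remaining = math.exp(-t_hours / tau_hours)   # L(t) = L0 * exp(-t/tau)
            print(f"t = {t_hours:3d} h: {remaining:.1%} of stored light remaining")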

    The researchers have added to their original patent on the light-emitting plant concept, filing a new patent on the capacitor and other components as well, Strano says.

    Designing for display

    As the nanobionic plant technology has advanced, the team is also envisioning how people might interact with the plants as part of everyday life. The architectural possibilities of their light-emitting plant will be on display within a new installation, “Plant Properties, a Future Urban Development,” at the Cooper Hewitt, Smithsonian Design Museum in New York opening May 10.

    Visitors to the installation, part of the 2019 “Nature—Cooper Hewitt Design Triennial” exhibition, can peek into a scaled architectural model of a New York City tenement building — which also serves as a plant incubator — to see the plants at work. The installation also demonstrates a roadmap for how an existing residential building could be adapted and transformed by design to support the natural growth of plants in a future when available energy could be very limited.

    “In Plant Properties, the nanobionic plant-based infrastructure is designed to use nature’s own resources,” says Kennedy. “The building harvests and transports sunlight, collects and recycles water, and enriches soil with compost.”

    The invitation to contribute to the Cooper Hewitt exhibition offered an unexpected way to demonstrate the plants’ possibilities, but designing an exhibit brought about a whole new set of challenges, Kennedy explains. “In the world of design museums, you’re usually asked to show something that’s already been exhibited, but this is new work and a new milestone in this project.”

    “We learned a lot about the care of plants,” Strano adds. “It’s one thing to make a laboratory demonstration, but it’s another entirely to make 33 continuous weeks of a public demonstration.”

    The researchers had to come up with a way to showcase the plants in a low-light museum environment where dirt and insects attracted by living plants are usually banished. “But rather than seeing this as a sort of insurmountable obstacle,” says Kennedy, “we realized that this kind of situation — how do you enable living plants to thrive in the enclosed setting of a museum — exactly paralleled the architectural problem of how to support significant quantities of plants growing inside buildings.”

    In the installation, multiple peepholes into the building model offer glimpses into the ways people in the building are living with the plants. Museum visitors are encouraged to join the experiment and crowdsource information on plant growth and brightness, by uploading their own photos of the plants to Instagram and tagging the MIT Plant Nanobionics lab, using @plantproperties.

    The team is also collecting data on how the plants respond to the nanoparticles and other potential stresses. “The plants are actually under more stress from being in the museum environment than from the modifications that we introduce, but these effects need to be studied and mitigated if we are to use plants for indoor lighting,” Strano notes.

    Bright and nurturing futures

    Kennedy and Strano say the plants could be at the center of a new — but also “pre-electric” — idea in architecture.

    For most of human history, Kennedy explains, natural processes from sunlight to waste composting were part of the essential infrastructure of buildings. But these processes have been excluded in modern thinking or hidden away, preventing people from coming face to face with the environmental costs of energy infrastructure made from toxic materials and powered by fossil fuels.

    “People don’t question the impacts of our own mainstream electrical grid today. It’s very vulnerable, it’s very brittle, it’s so very wasteful and it’s also full of toxic material,” she says. “We don’t question this, but we need to.”

    “Lighting right now consumes a vast portion of our energy demand, approaching close to 20 percent of our global energy consumption, generating two gigatons of carbon dioxide per year,” Strano adds. “Consider that the plants replace more than just the lamp on your desk. There’s an enormous energy footprint that could potentially be replaced by the light-emitting plant.”

    The team is continuing to work on new ways to infuse the nanoparticles in the plants, so that they work over the lifetime of the plant, as well as experimenting on larger plants such as trees. But for the plants to thrive, architects will have to develop building infrastructure that integrates the plants into a new internal ecosystem of sunlight, water and waste disposal, Kennedy says.

    “If plants are to provide people with light, we need to keep plants healthy to benefit from everything they provide for us,” she says. “We think this is going to trigger a much more caring or nurturing relationship of people and their plants, or plants and the people that they illuminate.”

    See the full article here.


    Please help promote STEM in your local schools.


    STEM Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

  • richardmitnick 11:29 am on May 13, 2019
    Tags: Chemical Engineering/Energy/Biological Engineering Projects Laboratory, MIT

    From MIT News: “Making it real” 


    May 13, 2019
    Emily Makowski | School of Engineering

    Students in a cross-disciplinary projects course are working on real-world engineering problems posed by companies and MIT research labs.

    Left to right: Sebastian Esquivel, Jenna Ahn, and Crystal Chen break down whey, normally a byproduct, into components for use in animal feed. Image: Lillie Paquette / School of Engineering.

    Professor Gregory Rutledge advises two students in 10.26/27/29 (Chemical Engineering/Energy/Biological Engineering Projects Laboratory). Image: Lillie Paquette / School of Engineering.

    Left to right: David Silverstein, Gianna Garza, and Connor Chung are researching how the enzyme PETase could be used to break down plastic. Image: Lillie Paquette / School of Engineering.

    Cloudy beige liquid swirls inside a large bioreactor resembling a French press as Jenna Ahn examines small flasks nearby. The lab where Ahn is working, in the subbasement of Building 66, has the feel of a beehive. She’s part of one of nine teams of undergraduates huddling in groups at their benches. Every now and then, someone darts off to use a larger piece of equipment among the shakers, spectrometers, flasks, scales, incubators, and bioreactors lining the walls.

    These students aren’t practicing routine distillations or titrations set up by an instructor. Each team of three or four is trying to solve a problem that doesn’t yet have an answer. In 10.26/27/29 (Chemical Engineering/Energy/Biological Engineering Projects Laboratory), students are focused on data-driven, applied research projects. They work on engineering problems posed by companies and by research labs from across the Institute, with the goal of finding solutions that can be applied to the real world.

    Ahn, a junior majoring in chemical and biological engineering, and her teammates are studying acid whey, a byproduct of cheese and yogurt production. Although whey has nutritional value, it is often treated as a waste product, and its disposal can remove oxygen from waterways and kill aquatic life. While it can be purified and treated like wastewater, the process is expensive.

    Ahn’s team is using genetically engineered yeast to break down whey into nutritious components like sugars and omega-3 fatty acids, which could then be introduced back into the food chain. After combining the yeast with the whey, the team regularly checks dissolved oxygen and pH levels and monitors whether the yeast is breaking down the whey into its components. “This could be turned into a component of animal feed for cows and other animals,” says Ahn, gesturing to the swirling mixture in her flask.

    Fundamentals in action

    Gregory Rutledge, the Lammot du Pont Professor of Chemical Engineering, has been the instructor in charge of 10.26/27/29 (Chemical Engineering/Energy/Biological Engineering Projects Laboratory) for about five years. The excitement among the course’s students stems from the knowledge that they are directly contributing to advancing technology, he says. “It’s a great motivator. They may have gotten fundamentals in their classes, but they may not have seen them in action.”

    The course has existed in its current form for about 30 years, Rutledge estimates. Its chemical engineering, biological engineering, and energy-related projects appeal to a wide variety of interests. Students are given project descriptions at the beginning of the semester and have flexibility in their choices.

    In the current format, students give presentations on their research progress throughout the semester and are evaluated by the 10.26/27/29 professors and their peers. At the end of the term, final presentations are judged by faculty from the entire Department of Chemical Engineering during a project showcase.

    The competitive element, Rutledge says, is just one part of how the course has changed over time. “It has evolved toward this organically, as we figure out what students need to know and how to best get that to them.”

    Each year, the focus of the students’ projects changes. Two of this year’s teams are working in collaboration with Somerville, Massachusetts, startup C16 Biosciences, trying to use yeast to produce a sustainable alternative to palm oil. The production of palm oil, which is primarily used for culinary and cosmetic purposes, is a leading cause of deforestation.

    “We’re trying to increase production of saturated fat sustainably,” explains Kaitlyn Hennacy, a junior majoring in chemical engineering. “This doesn’t require cutting down rainforests and could be a substitute in many applications.” Hennacy is examining a cuvette of yellow liquid in which there is a collection of bright orange blobs. The blobs owe their color to a carotenoid pigment produced as a byproduct during the process. Her team is using seven different solvents, such as hexane and pentane, to extract a palm oil alternative from the yeast.

    “It’s the intersection of an energy-related project and a consumer project,” says Carlos Sendao, one of Hennacy’s teammates and a fellow chemical engineering major. “This is a challenge I knew to take.” Sendao is going to continue research on this project over the summer through the Undergraduate Research Opportunities Program (UROP) and the MIT Energy Initiative.

    Another team is looking into recycling plastics with an enzyme called PETase, which breaks down polyethylene terephthalate (PET), the type of plastic found in single-use water bottles. “One of the biggest constraints is time,” says Connor Chung, a junior majoring in chemical engineering. “We only have three to four months to learn as much as we can about this enzyme.”

    Life lessons

    Every year Rutledge is impressed with how much students learn and grow over the course of the semester. The problems they’re tackling aren’t easy, and working in teams presents challenges as students navigate the dynamics of group work.

    “They’re also learning a lot about life. They’re probably going to run into something in the future — whether it’s a boss, a team member, or a piece of lab equipment — that doesn’t work in the way they expect,” he says. “We try to give the students the tools if or when they come across this. And when they give those final presentations, you can see they really have evolved as engineers,” he adds.

    The approach seems to be effective, says Rutledge. “People will come back one, two, three years later when they’re working,” he says. “They say, ‘I learned so much. This is what I actually do.’”

    See the full article here.


    Please help promote STEM in your local schools.


    STEM Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

  • richardmitnick 12:01 pm on May 10, 2019
    Tags: "Painting a fuller picture of how antibiotics act", An additional mechanism that helps some antibiotics kill bacteria., , “We wanted to fundamentally understand which previously undescribed metabolic pathways might be important for us to understand how antibiotics kill.”, “White-box” machine-learning, Exploiting this mechanism could help researchers to discover new drugs that could be used along with antibiotics to enhance their killing ability the researchers say., , MIT, Some of the metabolic byproducts of antibiotics are toxic and help contribute to killing the cells., The findings suggest that it may be possible to enhance the effects of some antibiotics by delivering them along with other drugs that stimulate metabolic activity.   

    From MIT News: “Painting a fuller picture of how antibiotics act” 


    May 9, 2019
    Anne Trafton

    MIT biological engineers used a novel machine-learning approach to discover a mechanism that helps certain antibiotics kill bacteria. Image: Chelsea Turner, MIT

    Most antibiotics work by interfering with critical functions such as DNA replication or construction of the bacterial cell wall. However, these mechanisms represent only part of the full picture of how antibiotics act.

    In a new study of antibiotic action, MIT researchers developed a new machine-learning approach to discover an additional mechanism that helps some antibiotics kill bacteria. This secondary mechanism involves activating the bacterial metabolism of nucleotides that the cells need to replicate their DNA.

    “There are dramatic energy demands placed on the cell as a result of the drug stress. These energy demands require a metabolic response, and some of the metabolic byproducts are toxic and help contribute to killing the cells,” says James Collins, the Termeer Professor of Medical Engineering and Science in MIT’s Institute for Medical Engineering and Science (IMES) and Department of Biological Engineering, and the senior author of the study. Collins is also the faculty co-lead of the Abdul Latif Jameel Clinic for Machine Learning in Health.

    Exploiting this mechanism could help researchers to discover new drugs that could be used along with antibiotics to enhance their killing ability, the researchers say.

    Jason Yang, an IMES research scientist, is the lead author of the paper, which appears in the May 9 issue of Cell. Other authors include Sarah Wright, a recent MIT MEng recipient; Meagan Hamblin, a former Broad Institute research technician; Miguel Alcantar, an MIT graduate student; Allison Lopatkin, an IMES postdoc; Douglas McCloskey and Lars Schrubbers of the Novo Nordisk Foundation Center for Biosustainability; Sangeeta Satish and Amir Nili, both recent graduates of Boston University; Bernhard Palsson, a professor of bioengineering at the University of California at San Diego; and Graham Walker, an MIT professor of biology.

    “White-box” machine-learning

    Collins and Walker have studied the mechanisms of antibiotic action for many years, and their work has shown that antibiotic treatment tends to create a great deal of cellular stress that makes huge energy demands on bacterial cells. In the new study, Collins and Yang decided to take a machine-learning approach to investigate how this happens and what the consequences are.

    Before they began their computer modeling, the researchers performed hundreds of experiments in E. coli. They treated the bacteria with one of three antibiotics (ampicillin, ciprofloxacin, or gentamicin), and in each experiment they also added one of about 200 different metabolites, including an array of amino acids, carbohydrates, and nucleotides (the building blocks of DNA). For each combination of antibiotic and metabolite, they measured the effects on cell survival.

    “We used a diverse set of metabolic perturbations so that we could see the effects of perturbing nucleotide metabolism, amino acid metabolism, and other kinds of metabolic subnetworks,” Yang says. “We wanted to fundamentally understand which previously undescribed metabolic pathways might be important for us to understand how antibiotics kill.”

    Many other researchers have used machine-learning models to analyze data from biological experiments, by training an algorithm to generate predictions based on experimental data. However, these models are typically “black-box,” meaning that they don’t reveal the mechanisms that underlie their predictions.

    To get around that problem, the MIT team took a novel approach that they call “white-box” machine-learning. Instead of feeding their data directly into a machine-learning algorithm, they first ran it through a genome-scale computer model of E. coli metabolism that had been characterized by Palsson’s lab. This allowed them to generate an array of “metabolic states” described by the data. Then, they fed these states into a machine-learning algorithm, which was able to identify links between the different states and the outcomes of antibiotic treatment.

    Because the researchers already knew the experimental conditions that produced each state, they were able to determine which metabolic pathways were responsible for higher levels of cell death.

    “What we demonstrate here is that by having the network simulations first interpret the data and then having the machine-learning algorithm build a predictive model for our antibiotic lethality phenotypes, the items that get selected by that predictive model themselves directly map onto pathways that we’ve been able to experimentally validate, which is very exciting,” Yang says.
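
    To make the two-stage idea concrete, here is a minimal Python sketch of a white-box pipeline of this kind. It is illustrative only: the synthetic flux matrix stands in for the metabolic states produced by the genome-scale model, and the choice of a sparse linear learner (scikit-learn's Lasso) is an assumption about how an interpretable model might be wired in, not the authors' actual code.

        import numpy as np
        from sklearn.linear_model import Lasso

        # Stage 1 stand-in: the genome-scale metabolic model turns each
        # experimental condition (antibiotic + metabolite supplement) into a
        # "metabolic state", i.e. a vector of reaction fluxes. Synthetic
        # numbers are used here so the sketch runs on its own.
        rng = np.random.default_rng(0)
        n_conditions, n_reactions = 600, 2000   # roughly 3 antibiotics x 200 metabolites
        flux_states = rng.normal(size=(n_conditions, n_reactions))
        lethality = 0.8 * flux_states[:, 42] + rng.normal(scale=0.1, size=n_conditions)

        # Stage 2: a sparse, interpretable model links states to lethality.
        model = Lasso(alpha=0.05, max_iter=5000).fit(flux_states, lethality)

        # Because each input dimension is a named metabolic reaction, the
        # nonzero coefficients map directly onto candidate pathways that can
        # then be validated experimentally.
        top = np.argsort(np.abs(model.coef_))[::-1][:10]
        print("reactions most predictive of lethality:", top)

    The white-box arrangement pays off in the last step: because the learner sees biologically meaningful features rather than raw measurements, the features it selects can be read off as testable hypotheses.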

    Markus Covert, an associate professor of bioengineering at Stanford University, says the study is an important step toward showing that machine learning can be used to uncover the biological mechanisms that link inputs and outputs.

    “Biology, especially for medical applications, is all about mechanism,” says Covert, who was not involved in the research. “You want to find something that is druggable. For the typical biologist, it hasn’t been meaningful to find these kinds of links without knowing why the inputs and outputs are linked.”

    Metabolic stress

    This model yielded the novel discovery that nucleotide metabolism, especially metabolism of purines such as adenine, plays a key role in antibiotics’ ability to kill bacterial cells. Antibiotic treatment leads to cellular stress, which causes cells to run low on purine nucleotides. The cells’ efforts to ramp up production of these nucleotides, which are necessary for copying DNA, boost the cells’ overall metabolism and lead to a buildup of harmful metabolic byproducts that can kill the cells.

    “We now believe what’s going on is that in response to this very severe purine depletion, cells turn on purine metabolism to try to deal with that, but purine metabolism itself is very energetically expensive and so this amplifies the energetic imbalance that the cells are already facing,” Yang says.

    The findings suggest that it may be possible to enhance the effects of some antibiotics by delivering them along with other drugs that stimulate metabolic activity. “If we can move the cells to a more energetically stressful state, and induce the cell to turn on more metabolic activity, this might be a way to potentiate antibiotics,” Yang says.

    The “white-box” modeling approach used in this study could also be useful for studying how different types of drugs affect conditions such as cancer, diabetes, or neurodegenerative diseases, the researchers say. They are now using a similar approach to study how tuberculosis survives antibiotic treatment and becomes drug-resistant.

    The research was funded by the Defense Threat Reduction Agency, the National Institutes of Health, the Novo Nordisk Foundation, the Paul G. Allen Frontiers Group, the Broad Institute of MIT and Harvard, and the Wyss Institute for Biologically Inspired Engineering.

    See the full article here.



     
  • richardmitnick 11:42 am on May 8, 2019 Permalink | Reply
    Tags: "Explosions of universe’s first stars spewed powerful jets", , , , , MIT   

    From MIT News: “Explosions of universe’s first stars spewed powerful jets” 

    MIT News

    From MIT News

    May 8, 2019
    Jennifer Chu

    Instead of ballooning into spheres, as once thought, early supernovae ejected jets that may have seeded new stars.

    1
    Rana Ezzeddine and Anna Frebel of MIT have observed evidence that the first stars in the universe exploded as asymmetric supernovae, strong enough to scatter heavy elements such as zinc across the early universe. Image: Melanie Gonick

    2
    A simulation shows what the first supernovae could have looked like: Instead of being spherical, as many scientists have assumed, these brilliant explosions may have produced asymmetric jets that shot heavy elements such as zinc (green dots) out into the early universe. This simulation shows the shape of the supernova 50 seconds after the initial explosion. Image: Melanie Gonick

    Several hundred million years after the Big Bang, the very first stars flared into the universe as massively bright accumulations of hydrogen and helium gas. Within the cores of these first stars, extreme thermonuclear reactions forged the first heavier elements, including carbon, iron, and zinc.

    These first stars were likely immense, short-lived fireballs, and scientists have assumed that they exploded as similarly spherical supernovae.

    But now astronomers at MIT and elsewhere have found that these first stars may have blown apart in a more powerful, asymmetric fashion, spewing forth jets that were violent enough to eject heavy elements into neighboring galaxies. These elements ultimately served as seeds for the second generation of stars, some of which can still be observed today.

    In a paper published today in The Astrophysical Journal, the researchers report a strong abundance of zinc in HE 1327-2326, an ancient, surviving star that is among the universe’s second generation of stars. They believe the star could only have acquired such a large amount of zinc after an asymmetric explosion of one of the very first stars had enriched its birth gas cloud.

    “When a star explodes, some proportion of that star gets sucked into a black hole like a vacuum cleaner,” says Anna Frebel, an associate professor of physics at MIT and a member of MIT’s Kavli Institute for Astrophysics and Space Research. “Only when you have some kind of mechanism, like a jet that can yank out material, can you observe that material later in a next-generation star. And we believe that’s exactly what could have happened here.”

    “This is the first observational evidence that such an asymmetric supernova took place in the early universe,” adds MIT postdoc Rana Ezzeddine, the study’s lead author. “This changes our understanding of how the first stars exploded.”

    “A sprinkle of elements”

    HE 1327-2326 was discovered by Frebel in 2005. At the time, the star was the most metal-poor star ever observed, meaning that it had extremely low concentrations of elements heavier than hydrogen and helium — an indication that it formed as part of the second generation of stars, at a time when most of the universe’s heavy element content had yet to be forged.

    “The first stars were so massive that they had to explode almost immediately,” Frebel says. “The smaller stars that formed as the second generation are still available today, and they preserve the early material left behind by these first stars. Our star has just a sprinkle of elements heavier than hydrogen and helium, so we know it must have formed as part of the second generation of stars.”

    In May of 2016, the team was able to observe the star, which lies relatively close to Earth, just 5,000 light years away. The researchers won two weeks of time on NASA’s Hubble Space Telescope and recorded the starlight over multiple orbits. They used an instrument aboard the telescope, the Cosmic Origins Spectrograph, to measure the minute abundances of various elements within the star.

    NASA/ESA Hubble Telescope


    NASA Hubble Cosmic Origins Spectrograph

    The spectrograph is designed with high precision to pick up faint ultraviolet light. Some of those wavelengths are absorbed by certain elements, such as zinc. The researchers made a list of heavy elements that they suspected might be present in such an ancient star and that they planned to look for in the UV data, including silicon, iron, phosphorus, and zinc.

    “I remember getting the data, and seeing this zinc line pop out, and we couldn’t believe it, so we redid the analysis again and again,” Ezzeddine recalls. “We found that, no matter how we measured it, we got this really strong abundance of zinc.”

    A star channel opens

    Frebel and Ezzeddine then contacted their collaborators in Japan, who specialize in developing simulations of supernovae and the secondary stars that form in their aftermath. The researchers ran over 10,000 simulations of supernovae, each with different explosion energies, configurations, and other parameters. They found that while many of the spherical supernova simulations could produce a secondary star matching most of the elemental composition observed in HE 1327-2326, none of them reproduced the zinc signal.

    As it turns out, the only simulation that could explain the star’s makeup, including its high abundance of zinc, was one of an aspherical, jet-ejecting supernova of a first star. Such a supernova would have been extremely explosive, with a power equivalent to about a nonillion (a 1 followed by 30 zeros) times that of a hydrogen bomb.
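
    Structurally, the search described here is a sweep over explosion parameters followed by a goodness-of-fit test against the observed abundances. The Python sketch below shows only that sweep-and-score pattern; the yield function and every number in it are invented cartoons of the qualitative finding (a jet lets zinc escape rather than fall into the black hole), not the collaborators' simulation physics.

        import itertools

        # Cartoon yield model (purely illustrative): spherical explosions
        # eject little zinc because it falls back into the black hole, while
        # a jet ("high asymmetry") lets it escape. Units are arbitrary.
        def zinc_yield(energy, asymmetry):
            escape_fraction = 0.05 + 0.9 * asymmetry
            return energy * escape_fraction

        OBSERVED_ZN = 4.5                          # illustrative target value
        energies = [0.5, 1.0, 2.0, 5.0, 10.0]      # relative explosion energies
        asymmetries = [0.0, 0.25, 0.5, 0.75, 1.0]  # 0 = spherical, 1 = strong jet

        # Score every configuration by its mismatch with the observed zinc
        # abundance, mirroring the sweep over ~10,000 simulated parameter sets.
        best = min(itertools.product(energies, asymmetries),
                   key=lambda p: abs(zinc_yield(*p) - OBSERVED_ZN))
        print("best-fitting (energy, asymmetry):", best)

    In this toy version, as in the real scan, the best fit lands at an energetic, strongly asymmetric configuration.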

    “We found this first supernova was much more energetic than people have thought before, about five to 10 times more,” Ezzeddine says. “In fact, the previous idea of the existence of a dimmer supernova to explain the second-generation stars may soon need to be retired.”

    The team’s results may shift scientists’ understanding of reionization, a pivotal period during which the gas in the universe morphed from completely neutral to ionized — a state that made it possible for galaxies to take shape.

    “People thought from early observations that the first stars were not so bright or energetic, and so when they exploded, they wouldn’t participate much in reionizing the universe,” Frebel says. “We’re in some sense rectifying this picture and showing, maybe the first stars had enough oomph when they exploded, and maybe now they are strong contenders for contributing to reionization, and for wreaking havoc in their own little dwarf galaxies.”

    These first supernovae could have also been powerful enough to shoot heavy elements into neighboring “virgin galaxies” that had yet to form any stars of their own.

    “Once you have some heavy elements in a hydrogen and helium gas, you have a much easier time forming stars, especially little ones,” Frebel says. “The working hypothesis is, maybe second generation stars of this kind formed in these polluted virgin systems, and not in the same system as the supernova explosion itself, which is always what we had assumed, without thinking in any other way. So this is opening up a new channel for early star formation.”

    This research was funded, in part, by the National Science Foundation.

    See the full article here.



     
  • richardmitnick 8:32 am on May 8, 2019 Permalink | Reply
    Tags: "North Atlantic Ocean productivity has dropped 10 percent during Industrial era", , DMS-dimethylsulfide, , MIT, MSA-methanesulfonic acid, , Phytoplankton, The decline coincides with steadily rising surface temperatures over the same period of time.,   

    From MIT News: “North Atlantic Ocean productivity has dropped 10 percent during Industrial era” 

    MIT News

    From MIT News

    May 6, 2019
    Jennifer Chu

    Phytoplankton decline coincides with warming temperatures over the last 150 years.

    1
    Matt Osman, a graduate student in MIT’s Department of Earth, Atmospheric, and Planetary Sciences, overlooking a frozen Baffin Bay to the west, Nuussuaq Peninsula Ice Cap, west Greenland. Image: Luke Trusel (Rowan University).

    2
    Ice core field camp on a clear spring evening, Disko Island Ice Cap, west Greenland. Image: Luke Trusel (Rowan University).

    3
    Iceberg in Disko Bay, west Greenland. Image: Luke Trusel (Rowan University)

    4
    Retrieving an ice core section from the drill barrel during a west Greenland snowstorm, west Greenland Ice Sheet. Image: Sarah Das (WHOI).

    Virtually all marine life depends on the productivity of phytoplankton — microscopic organisms that work tirelessly at the ocean’s surface to absorb the carbon dioxide that gets dissolved into the upper ocean from the atmosphere.

    Through photosynthesis, these microbes use carbon dioxide to produce oxygen, some of which ultimately gets released back to the atmosphere, and organic carbon, which they store until they themselves are consumed. This plankton-derived carbon fuels the rest of the marine food web, from the tiniest shrimp to giant sea turtles and humpback whales.

    Now, scientists at MIT, Woods Hole Oceanographic Institution (WHOI), and elsewhere have found evidence that phytoplankton’s productivity is declining steadily in the North Atlantic, one of the world’s most productive marine basins.

    In a paper appearing today in Nature, the researchers report that phytoplankton’s productivity in this important region has gone down around 10 percent since the mid-19th century and the start of the Industrial era. This decline coincides with steadily rising surface temperatures over the same period of time.

    Matthew Osman, the paper’s lead author and a graduate student in MIT’s Department of Earth, Atmospheric, and Planetary Sciences and the MIT/WHOI Joint Program in Oceanography, says there are indications that phytoplankton’s productivity may decline further as temperatures continue to rise as a result of human-induced climate change.

    “It’s a significant enough decline that we should be concerned,” Osman says. “The amount of productivity in the oceans roughly scales with how much phytoplankton you have. So this translates to 10 percent of the marine food base in this region that’s been lost over the industrial era. If we have a growing population but a decreasing food base, at some point we’re likely going to feel the effects of that decline.”

    Drilling through “pancakes” of ice

    Osman and his colleagues looked for trends in phytoplankton’s productivity using the molecular compound methanesulfonic acid, or MSA. When phytoplankton expand into large blooms, certain microbes emit dimethylsulfide, or DMS, an aerosol that is lofted into the atmosphere and eventually breaks down as either sulfate aerosol, or MSA, which is then deposited on sea or land surfaces by winds.

    “Unlike sulfate, which can have many sources in the atmosphere, it was recognized about 30 years ago that MSA had a very unique aspect to it, which is that it’s only derived from DMS, which in turn is only derived from these phytoplankton blooms,” Osman says. “So any MSA you measure, you can be confident has only one unique source — phytoplankton.”

    In the North Atlantic, phytoplankton likely produced MSA that was deposited to the north, including across Greenland. The researchers measured MSA in Greenland ice cores — in this case using 100- to 200-meter-long columns of snow and ice that represent layers of past snowfall events preserved over hundreds of years.

    “They’re basically sedimentary layers of ice that have been stacked on top of each other over centuries, like pancakes,” Osman says.

    The team analyzed 12 ice cores in all, each collected from a different location on the Greenland ice sheet by various groups from the 1980s to the present. Osman and his advisor Sarah Das, an associate scientist at WHOI and co-author on the paper, collected one of the cores during an expedition in April 2015.

    “The conditions can be really harsh,” Osman says. “It’s minus 30 degrees Celsius, windy, and there are often whiteout conditions in a snowstorm, where it’s difficult to differentiate the sky from the ice sheet itself.”

    The team was nevertheless able to extract, meter by meter, a 100-meter-long core, using a giant drill that was delivered to the team’s location via a small ski-equipped airplane. They immediately archived each ice core segment in a heavily insulated cold storage box, then flew the boxes on “cold deck flights” — aircraft with ambient conditions of around minus 20 degrees Celsius. Once the planes touched down, freezer trucks transported the ice cores to the scientists’ ice core laboratories.

    “The whole process of how one safely transports a 100-meter section of ice from Greenland, kept at minus-20-degree conditions, back to the United States is a massive undertaking,” Osman says.

    Cascading effects

    The team incorporated the expertise of researchers at various labs around the world in analyzing each of the 12 ice cores for MSA. Across all 12 records, they observed a conspicuous decline in MSA concentrations, beginning in the mid-19th century, around the start of the Industrial era, when wide-scale production of greenhouse gases began. This decline in MSA is directly related to a decline in phytoplankton productivity in the North Atlantic.

    “This is the first time we’ve collectively used these ice core MSA records from all across Greenland, and they show this coherent signal. We see a long-term decline that originates around the same time as when we started perturbing the climate system with industrial-scale greenhouse-gas emissions,” Osman says. “The North Atlantic is such a productive area, and there’s a huge multinational fisheries economy related to this productivity. Any changes at the base of this food chain will have cascading effects that we’ll ultimately feel at our dinner tables.”
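
    The stack-and-trend step behind that coherent signal can be sketched in a few lines of Python. The series below are synthetic stand-ins for the 12 MSA records, and the z-score normalization and simple least-squares trend are illustrative analysis choices rather than the paper's exact method.

        import numpy as np

        rng = np.random.default_rng(1)
        years = np.arange(1800, 2014)

        # Synthetic stand-ins for 12 ice core MSA series: a shared decline
        # after ~1850 plus core-specific noise (the real records come from
        # laboratory measurements of the cores).
        decline = np.where(years > 1850, -0.004 * (years - 1850), 0.0)
        cores = [decline + rng.normal(scale=0.3, size=years.size) for _ in range(12)]

        # Normalize each core (z-score) so records with different mean MSA
        # levels can be averaged, then stack them into one composite signal.
        stack = np.mean([(c - c.mean()) / c.std() for c in cores], axis=0)

        # Fit a linear trend over the industrial era; a coherent negative
        # slope across the stacked cores is the decline described above.
        mask = years >= 1850
        slope, _ = np.polyfit(years[mask], stack[mask], 1)
        print(f"industrial-era trend: {slope:.5f} per year (negative = decline)")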

    The multicentury decline in phytoplankton productivity appears to coincide not only with long-term warming temperatures; it also shows synchronous variations on decadal timescales with the large-scale ocean circulation pattern known as the Atlantic Meridional Overturning Circulation, or AMOC. This circulation pattern typically acts to mix layers of the deep ocean with the surface, allowing the exchange of much-needed nutrients on which phytoplankton feed.

    In recent years, scientists have found evidence that the AMOC is weakening, a process that is still not well understood but may be due in part to warming temperatures increasing the melting of Greenland’s ice. This ice melt has added an influx of less-dense freshwater to the North Atlantic, which acts to stratify the surface ocean, or separate its layers, much like oil and water, preventing nutrients in the deep from upwelling to the surface. This warming-induced weakening of the ocean circulation could be what is driving phytoplankton’s decline. As the atmosphere warms the upper ocean in general, this could further the ocean’s stratification as well, worsening phytoplankton’s productivity.

    “It’s a one-two punch,” Osman says. “It’s not good news, but the upshot to this is that we can no longer claim ignorance. We have evidence that this is happening, and that’s the first step you inherently have to take toward fixing the problem, however we do that.”

    This research was supported in part by the National Science Foundation (NSF), the National Aeronautics and Space Administration (NASA), as well as graduate fellowship support from the US Department of Defense Office of Naval Research.

    See the full article here.



     
  • richardmitnick 1:26 pm on May 4, 2019 Permalink | Reply
    Tags: "Six suborbital research payloads from MIT fly to space and back", Blue Origin, Living Distance: Xin Liu’s performance sculpture, Maiden Flight: A queen bee and some 40 attendants were on board the research flight, MIT, Self-contained nanolab sent for a microgravity experiment, Telepresent drawings in space, TESSERAE three-tile subassembly   

    From MIT News: “Six suborbital research payloads from MIT fly to space and back” 

    MIT News

    From MIT News

    May 3, 2019
    Stephanie Strom | MIT Media Lab

    1
    MIT Media Lab researchers (l-r) Xin Liu, Felix Kraemer, Ariel Ekblaw, Pete Dilworth, Rachel Smith, and Harpreet Sareen stand in front of the Blue Origin capsule holding their six payloads.

    2
    Media Lab research payloads lift off with the Blue Origin New Shepard spacecraft on May 2. Image: Blue Origin

    3
    Researchers Harpreet Sareen, Rachel Smith, and Felix Kraemer prepare their projects for launch. Image: Ariel Ekblaw

    4
    TESSERAE three-tile subassembly, with v2 hardware for Blue Origin suborbital flight. Image: Ariel Ekblaw

    5
    Maiden Flight: A queen bee and some 40 attendants were on board the research flight. Image: Mediated Matter group

    6
    Living Distance: Xin Liu’s performance sculpture, one of six Media Lab projects aboard the New Shepard rocket. Image: Tim Saputo

    7
    Self-contained nanolab sent for a microgravity experiment in the rocket. Image: Harpreet Sareen, Anna Garbier, Jiefu Zheng

    8
    Telepresent drawings in space. Image: Ani Liu

    Space Exploration Initiative research aboard Blue Origin’s New Shepard experiment capsule crossed the Karman line for three minutes of sustained microgravity.

    Blast off! MIT made its latest foray into research in space on May 2 via six payloads from the Media Lab Space Exploration Initiative, tucked into Blue Origin’s New Shepard reusable space vehicle that took off from a launchpad in West Texas.

    It was also the first time in the history of the Media Lab that in-house research projects were launched into space, for several minutes of sustained microgravity. The results of that research may have big implications for semiconductor manufacturing, art and telepresence, architecture and farming, among other things.

    “The projects we’re testing operate fundamentally differently in Earth’s gravity compared to how they would operate in microgravity,” explained Ariel Ekblaw, the founder and lead of the Media Lab’s Space Exploration Initiative.

    Previously, the Media Lab sent projects into microgravity aboard the plane used by NASA to train astronauts, lovingly nicknamed “the vomit comet.” These parabolic flights provide repeated 15- to 30-second intervals of near weightlessness. The New Shepard experiment capsule will coast in microgravity for significantly longer and cross the Karman line (the formal boundary of “space”) in the process. While that may not seem like much time, it’s enough to get a lot accomplished.

    “The capsule where the research takes place arcs through space for three minutes, which gives us precious moments of sustained, high quality microgravity,” Ekblaw said. “This provides an opportunity to expand our experiments from prior parabolic flight protocols, and test entirely new research as well.”

    Depending on the results of the experiments done during New Shepard’s flight, some of the projects will undergo further, long-term research aboard the International Space Station, Ekblaw said.

    On this trip, she sent Tessellated Electromagnetic Space Structures for the Exploration of Reconfigurable, Adaptive Environments, otherwise known as TESSERAE, into space. The ultimate goal for these sensor-augmented hexagonal and pentagonal “tiles” is to autonomously self-assemble into space structures. These flexible, reconfigurable modules can then be used for habitat construction, in-space assembly of satellites, or even as infrastructure for parabolic mirrors. Ekblaw hopes TESSERAE will one day support in-orbit staging bases for human exploration of the surface of the moon or Mars, or enable low Earth orbit space tourism.

    An earlier prototype, flown on a parabolic flight in November 2017, validated the research concept’s mechanical structure, the polarity arrangement of the bonding magnets, and the physical self-assembly protocol. On the Blue Origin flight, Ekblaw is testing a new embedded sensor network in the tiles, as well as their communication architecture and the guidance-control aspects of their self-assembly capabilities. “We’re testing whether they’ll autonomously circulate, find correct neighbors, and bond together magnetically in microgravity for robust self-assembly,” Ekblaw said.
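
    As a cartoon of the neighbor-selection rule being tested, each tile face can be modeled as carrying a fixed magnet polarity, with a bond allowed only where two faces present opposite polarities. The tile layouts and polarity patterns in the Python sketch below are invented for illustration and do not describe the flight hardware.

        # Each tile face carries a fixed magnet polarity: +1 (north facing
        # out) or -1 (south facing out). Two free-floating tiles can latch
        # on a face pair only if the polarities are opposite, which is how
        # "correct neighbors" select each other during self-assembly.
        def can_bond(face_a: int, face_b: int) -> bool:
            return face_a == -face_b

        hex_tile = [+1, -1, +1, -1, +1, -1]   # invented polarity patterns
        pent_tile = [-1, +1, -1, +1, -1]

        # Count which face pairings would latch if the tiles drift together.
        matches = [(i, j) for i, a in enumerate(hex_tile)
                          for j, b in enumerate(pent_tile) if can_bond(a, b)]
        print(f"{len(matches)} viable face pairings")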

    Another experiment aboard New Shepard combined art with the test of a tool for future space exploration — traversing microgravity with augmented mobility. Living Distance, an artwork conceived by the Space Exploration Initiative’s art curator, Xin Liu, explores freedom of movement via a wisdom tooth — yes, you read that correctly!

    The tooth traveled to space carried by a robotic device named EBIFA and encased in a crystalline container. Once New Shepard entered space, the container burst open and EBIFA swung into action, shooting cords out with magnetic tips to latch onto a metal surface. The tooth then floated through space with minimal interference in the virtually zero-gravity environment.

    “In this journey, the tooth became a newborn entity in space, its crystalline, sculptural body and life supported by an electromechanical system,” Xin Liu wrote. “Each of its weightless movements was carefully calculated on paper and modeled in simulation software, as there can never be a true test like this on Earth.”

    The piece builds on a performance art work called Orbit Weaver that Liu performed last year during a parabolic flight, where she was physically tethered to a nylon cord that floated freely and attached to nearby surfaces. Orbit Weaver and Living Distance may offer insights to future human space explorers about how best to navigate weightlessness.

    A piece of charcoal also made the trip to space inside a chamber lined with drawing paper, part of a project designed by Ani Liu, a Media Lab alumna. In microgravity, the charcoal will chart its own course inside the chamber, marking the paper as it floats through an arc far above the Earth.

    When the chamber returns to the Media Lab, the charcoal will join forces with a KUKA robot that will mimic the charcoal’s trajectory during the three-ish minutes of coasting in microgravity. Together, the charcoal and the robot will become a museum exhibit that provides a demonstration of motion in microgravity to a broad audience and illustrates the Space Exploration Initiative’s aim to democratize access to space and invite the public to engage in space exploration.

    Harpreet Sareen, another Media Lab alum, tested how crystals form in microgravity, research that may eventually lead to manufacturing semiconductors in space.

    Semiconductors used in today’s technology require crystals with extremely high levels of purity and perfect shapes, but gravity interferes with crystal growth on Earth, resulting in faults, contact stresses, and other flaws. Sareen and his collaborator, Anna Garbier, created a nano-sized lab in a box a little smaller than a half-gallon milk carton. The electric current that kicked off growth of the crystals during the three minutes the New Shepard capsule was suborbital was triggered by onboard rocket commands from Blue Origin.

    The crystals will be evaluated for potential industrial applications, and they also have a future as an art installation: Floral Cosmonauts.

    And then there are the 40 or so bees (one might say “apionauts”) that made the trip into space on behalf of the Mediated Matter group at the Media Lab, which is interested in seeing the impact space travel has on a queen bee and her retinue. Two queen bees that were inseminated at a U.S. Department of Agriculture facility in Louisiana went to space, each with roughly 20 attendant bees whose job it was to feed her and help control her body temperature.

    The bees traveled via two small containers — metabolic support capsules — into which they previously built honeycomb structures. This unique design gives them a familiar environment for their trip. A modified GoPro camera, pointed into the specially designed container housing the bees, was fitted into the top of the case to film the insects and create a record of their behavior during flight.

    Everything inside the case was designed to make the journey as comfortable as possible for the bees, right down to a tiny golden heating pad that was to kick into action if the temperature dropped too low for a queen bee’s comfort.

    Researchers in the Mediated Matter group will study the behavior of the bees when they return to Earth and are reintroduced to a colony at the Media Lab. Will the queens lay their eggs? Will those eggs hatch? And can bees who’ve been to space continue making pollen and honey once they’ve returned to Earth? Those are among the many questions the team will be asking.

    “We currently have no robotic alternative to bees for pollination of many crops,” Ekblaw said. “If we want to grow crops on Mars, we may need to bring bees with us. Knowing if they can survive a mission, reintegrate into the hive, and thrive afterwards is critical.”

    As these projects show, the Space Exploration Initiative unites engineers, scientists, artists, and designers across a multifaceted research portfolio. The team looks forward to a regular launch cadence and progressing through microgravity research milestones — from parabolic flights, to further launch opportunities with Blue Origin, to the International Space Station and even lunar landings.

    See the full article here.



     