Tagged: Applied Research & Technology

• richardmitnick 9:09 am on August 28, 2015
Tags: Applied Research & Technology, Hep C

    From Yale: “One in four hepatitis C patients denied initial approval for drug treatment” 

Yale University

    August 27, 2015
    Ziba Kashef

(Image via Shutterstock)

    Nearly one in four patients with chronic hepatitis C (HCV) are denied initial approval for a drug therapy that treats the most common strain of the infection, according to a Yale School of Medicine study.

    The finding, published Aug. 27 in PLOS ONE, identifies a new barrier to caring for patients with this severe condition.

    Prior to the FDA approval of novel antiviral therapies for HCV in 2014, treatment options for patients were limited, requiring weekly injections of interferon-based therapy that caused severe side effects. The new regimens revolutionized treatment and offered patients an oral therapy with cure rates exceeding 90%. However, the high cost of care led insurers to impose new restrictions on drug authorization.

    In light of the new restrictions, the study authors hypothesized that while most patients would be able to access antiviral therapy, some would experience delays in approval and others would be denied. Led by Dr. Joseph K. Lim, associate professor of medicine and director of the Yale Viral Hepatitis Program, the investigators reviewed records of 129 patients who were prescribed a combination of two drugs (sofosbuvir and ledipasvir, or SOF/LED) between October and December 2014.

    “The first key finding is that upon initial request for treatment, approximately one in four patients are denied,” said Dr. Albert Do, internal medicine resident and co-first author with Yash Mittal, M.D. “That proportion is surprising.”
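
For a feel for the statistical precision behind that "one in four" figure from 129 charts, here is a minimal sketch of a 95% confidence interval for the denial proportion. The count of 32 denials is hypothetical, chosen only to match "approximately one in four" of 129; the article reports just the rough proportion.

```python
import math

def wilson_ci(count, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = count / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

denied, total = 32, 129   # hypothetical count, not taken from the paper
low, high = wilson_ci(denied, total)
print(f"denial rate {denied/total:.1%}, 95% CI [{low:.1%}, {high:.1%}]")
```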

    The researchers also found that certain subsets of patients were more likely to receive initial approval, including those with advanced liver disease such as cirrhosis and those on public insurance, either Medicare or Medicaid. “It is significant that factors beyond disease state and medical necessity now affect one’s likelihood of accessing HCV treatment,” said Mittal.

    While most patients in the study eventually received approval for treatment through the insurance appeals process, the delays are concerning, said Lim, as time is critical for patients on the verge of developing cirrhosis or liver failure. “It could make the difference for those who can be treated and remain stable long-term, versus those who have gone past the point of no return and will require liver transplantation or succumb to their illness,” he noted.

This study adds to a growing body of literature on the hepatitis C “cascade of care,” in which attrition occurs at every step, from diagnosis and confirmation to linkage to care and treatment, Lim explained. He hopes the study triggers further research and discussion about this new barrier to HCV care.

    “Delay in access may further challenge our ability to cure hepatitis C in this country,” Lim said. “Some patients are told they must wait until they have advanced liver disease before they can undergo potentially curative treatment. We hope these data may help inform national policy discussions on promoting more rational, patient-centered approaches to HCV treatment access.”

    Other Yale authors include Annmarie Liapakis, Elizabeth Cohen, Hong Chau, Claudia Bertuccio, Dana Sapir, Jessica Wright, Carol Eggers, Kristine Drozd, Maria Ciarleglio, and Yanhong Deng.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Yale University Campus

    Yale University comprises three major academic components: Yale College (the undergraduate program), the Graduate School of Arts and Sciences, and the professional schools. In addition, Yale encompasses a wide array of centers and programs, libraries, museums, and administrative support offices. Approximately 11,250 students attend Yale.

     
• richardmitnick 8:55 am on August 28, 2015
Tags: Applied Research & Technology

    From phys.org: “New technique could enable design of hybrid glasses and revolutionize gas storage” 

phys.org

    August 28, 2015
    No Writer Credit

    A new method of manufacturing glass could lead to the production of ‘designer glasses’ with applications in advanced photonics, whilst also facilitating industrial scale carbon capture and storage. An international team of researchers, writing today in the journal Nature Communications, report how they have managed to use a relatively new family of sponge-like porous materials to develop new hybrid glasses.

The work revolves around a family of compounds called metal-organic frameworks (MOFs), which are cage-like structures consisting of metal ions linked by organic bonds. Their porous properties have led to proposed applications in carbon capture, hydrogen storage and toxic gas separations, due to their ability to selectively adsorb and store pre-selected target molecules, much like a sieve that discriminates not only on size, but also on chemical identity.

    However, since their discovery a quarter of a century ago, their potential for large-scale industrial use has been limited due to difficulties in producing linings, thin films, fibrous or other ‘shaped’ structures from the powders produced by chemical synthesis. Such limitations arise from the relatively poor thermal and mechanical properties of MOFs compared to materials such as ceramics or metals, and have in the past resulted in structural collapse during post-processing techniques such as sintering or melt-casting.

    Now, a team of researchers from Europe, China and Japan has discovered that careful MOF selection and heating under argon appears to raise their decomposition temperature just enough to allow melting, rather than the powders breaking down. The liquids formed have the potential to be shaped, cast and recrystallised, to enable solid structures with uses in gas separation and storage.

    Dr Thomas Bennett from the Department of Materials Science and Metallurgy at the University of Cambridge says: “Traditional methods used in melt-casting of metals or sintering of ceramics cause the structural collapse of MOFs due to the structures thermally degrading at low temperatures. Through exploring the interface between melting, recrystallisation and thermal decomposition, we now should be able to manufacture a variety of shapes and structures that were previously impossible, making applications for MOFs more industrially relevant”.

    Equally importantly, say the researchers, the glasses that can be produced by cooling the liquids quickly are themselves a new category of materials. Further tailoring of the chemical functionalities may be possible by utilising the ease with which different elements can be incorporated into MOFs before melting and cooling.

Professor Yuanzheng Yue from Aalborg University adds: “A second facet to the work is in the glasses themselves, which appear distinct from existing categories. The formation of glasses that contain highly interchangeable metal and organic components is highly unusual, as they are normally either purely organic, for example in solar cell conducting polymers, or entirely inorganic, such as oxide or metallic glasses. Understanding the mechanism of hybrid glass formation will also greatly contribute to our knowledge of glass formers in general.”

Using the advanced capabilities at the UK’s synchrotron, Diamond Light Source, the team were able to scrutinise the metal-organic frameworks in atomic detail.

Diamond Light Source, U.K.

    Professor Trevor Rayment, Physical Science Director at Diamond, comments: “This work is an exciting example of how work with synchrotron radiation which deepens our fundamental understanding of the properties of glasses also produces tantalising prospects of practical applications of new materials. This work could have a lasting impact on both frontiers of knowledge.”

    The researchers believe the new technique could open up the possibility of the production of ‘chemically designed’ glasses whereby different metals or organics are swapped into, or out of, the MOFs before melting.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Phys.org in 100 Words

Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

     
• richardmitnick 8:19 pm on August 27, 2015
Tags: Applied Research & Technology

From Salk: “New drug squashes cancer’s last-ditch efforts to survive”

Salk Institute for Biological Studies

    The Salk Institute and Sanford Burnham Prebys Medical Discovery Institute created a compound that stops a cellular recycling process

    Cell recycling (shown in green) is elevated in lung cancer cells treated with an established cancer drug. Recycling is suppressed upon co-treatment with a newly discovered enzyme inhibitor.
    Image: Courtesy of the Salk Institute for Biological Studies

Salk Institute and Sanford Burnham Prebys Medical Discovery Institute (SBP) scientists have developed a drug that prevents this recycling process, known as autophagy, from starting in cancer cells. Published June 25, 2015 in Molecular Cell, the new study identifies a small molecule drug that specifically blocked the first step of autophagy, effectively cutting off the recycled nutrients that cancer cells need to live.

    “The finding opens the door to a new way to attack cancer,” says Reuben Shaw, a senior author of the paper, professor in the Molecular and Cell Biology Laboratory at the Salk Institute and a Howard Hughes Medical Institute Early Career Scientist. “The inhibitor will probably find the greatest utility in combination with targeted therapies.”

    Besides cancer, defects in autophagy have been linked with infectious diseases, neurodegeneration and heart problems. In a 2011 study in the journal Science, Shaw and his team discovered how cells starved of nutrients activate the key molecule that kicks off autophagy, an enzyme called ULK1.

    Reasoning that inhibiting ULK1 might snuff out some types of cancer by stifling a main energy supply that comes from the recycling process, Shaw’s group and others wanted to find a drug that would inhibit the enzyme. Only a fraction of such inhibitors that show promise in a test tube end up working well in living cells. Shaw’s group spent more than a year studying how ULK1 works and developing new strategies for screening its function in cells.

    A key breakthrough came when Shaw met the paper’s other senior author, Nicholas Cosford, a professor in the NCI-Designated Cancer Center at SBP. Cosford had been investigating ULK1 using medicinal chemistry and chemical biology, and had identified some promising lead compounds using rational design. The two labs combined efforts to screen hundreds of potential molecules for ULK1 inhibition, narrowing the list down to a few dozen, and eventually one.

    “The key to success for this project came when we combined Reuben’s deep understanding of the fundamental biology of autophagy with our chemical expertise,” says Cosford. “This allowed us to find a drug that targeted ULK1 not just in a test tube but also in tumor cells. Another challenge was finding molecules that selectively targeted the ULK1 enzyme without affecting healthy cells. Our work provides the basis for a novel drug that will treat resistant cancer by cutting off a main tumor cell survival process.”

    The result was a highly selective drug they named SBI-0206965, which successfully killed a number of cancer cell types, including human and mouse lung cancer cells and human brain cancer cells, some of which were previously shown to be particularly reliant on cellular recycling.

    Interestingly, some cancer drugs (such as mTOR inhibitors) further activate cell recycling by shutting off the ability of those cells to take up nutrients, making them more reliant on recycling to provide all the building blocks cells need to stay alive. Rapamycin, for example, works by shutting down cell growth and division. In response, the cells launch into recycling mode by turning on ULK1, which may be one reason why, rather than dying, some cancer cells seem to go into a dormant state and return–often more drug resistant–after treatment stops.

    Matthew Chun, Nicholas Cosford and Reuben Shaw. Image: Courtesy of the Salk Institute for Biological Studies

    “Inhibiting ULK1 would eliminate this last-ditch survival mechanism in the cancer cells and could make existing anti-cancer treatments much more effective,” says Matthew Chun, one of the study’s lead authors and a postdoctoral fellow in the Shaw lab at Salk.

    Indeed, combining SBI-0206965 with mTOR inhibitors made it more effective, killing two to three times as many lung cancer cells as SBI-0206965 alone or the mTOR inhibitors alone.

    Drugging the autophagy pathway to combat cancer has been tried before, but the only drugs that currently block cell recycling work by targeting the cell organelle known as the lysosome, which functions at the final stage of autophagy. Although these lysosomal therapies are being tested in early-stage clinical trials, they inhibit other lysosomal functions beyond autophagy, and therefore may have additional side effects.

Comparing equivalent concentrations of the lysosomal drug chloroquine with SBI-0206965, in combination with mTOR inhibitors, the scientists found that SBI-0206965 was better than chloroquine at killing cancer cells.

    The group is now testing the drug in mouse models of cancer. “An important next step will be testing this drug in other types of cancer and with other therapeutic combinations,” says Shaw, who is deputy director of Salk’s NCI-Designated Cancer Center. “In the meantime, this discovery gives researchers an exciting new toolbox for the inhibition and measurement of cell recycling.”

    Other authors on the study include co-lead author Daniel Egan of Salk’s Molecular and Cell Biology Laboratory; Mitchell Vamos, Haixia Zou, Juan Rong, Dhanya Raveendra-Panickar, Douglas Sheffler, and Peter Teriete of the Cell Death and Survival Networks Research Program in the NCI-Designated Cancer Center at SBP; Chad Miller, Hua Jane Lou, and Benjamin Turk of the Department of Pharmacology in Yale University School of Medicine; John Asara of the Division of Signal Transduction in Beth Israel Deaconess Medical Center and the Department of Medicine in Harvard Medical School; and Chih-Cheng Yang of SBP’s Functional Genomics Core.

The research was supported by the National Institutes of Health, the Department of Defense, and the Leona M. and Harry B. Helmsley Charitable Trust.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Salk Institute Campus

    Every cure has a starting point. Like Dr. Jonas Salk when he conquered polio, Salk scientists are dedicated to innovative biological research. Exploring the molecular basis of diseases makes curing them more likely. In an outstanding and unique environment we gather the foremost scientific minds in the world and give them the freedom to work collaboratively and think creatively. For over 50 years this wide-ranging scientific inquiry has yielded life-changing discoveries impacting human health. We are home to Nobel Laureates and members of the National Academy of Sciences who train and mentor the next generation of international scientists. We lead biological research. We prize discovery. Salk is where cures begin.

     
• richardmitnick 8:06 pm on August 27, 2015
Tags: Applied Research & Technology

    From Yale: “Research in the news: Study reveals new way to ‘rewire’ immune cells to slow tumor growth” 

Yale University

    August 27, 2015
    Ziba Kashef

    (Image via Shutterstock)

    Inside a tumor, immune cells and cancer cells battle for survival. The advantage may go to the cells that metabolize the most glucose, say Yale researchers who have identified a new way to boost immune response by metabolically “rewiring” immune cells.

    Their research, published Aug. 27 online in Cell, may provide a novel approach to cancer immunotherapy.

    Researchers have long known that specific immune cells known as T cells infiltrate tumors. But tumor-infiltrating T cells fail to destroy cancer cells, in part, because inside the tumor they are deprived of glucose, a nutrient essential to T cell function. The research team, led by professor of immunobiology Susan Kaech and postdoctoral fellow Ping-Chih Ho, theorized that metabolic reprogramming of T cells could enhance their anti-tumor response.

    When cells eat glucose, they convert it into a metabolite called phosphoenolpyruvate (PEP). Using biochemical analyses, the researchers identified a new role for PEP in fine-tuning the anti-tumor response of T cells. They genetically reprogrammed the T cells to increase PEP production, restoring cell function and slowing tumor growth.

    The research reveals a potential new form of cancer immunotherapy. “Knowing how the metabolic state of T cells is affected in tumors, we may find new ways of altering their metabolism to make them more efficiently kill tumor cells,” says Kaech. These types of approaches could be directly applied to clinical trials using adoptive T cell therapy, she notes.

    The study results may also apply to conditions other than cancer. “Understanding how immune cell metabolism affects their function could lead to novel approaches to adjust immune responses in a variety of diseases,” says Ho.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Yale University Campus

    Yale University comprises three major academic components: Yale College (the undergraduate program), the Graduate School of Arts and Sciences, and the professional schools. In addition, Yale encompasses a wide array of centers and programs, libraries, museums, and administrative support offices. Approximately 11,250 students attend Yale.

     
• richardmitnick 7:15 pm on August 27, 2015
Tags: Applied Research & Technology

    From Caltech: “Caltech Chemists Solve Major Piece of Cellular Mystery” 

Caltech

    08/27/2015
    Kimm Fesenmaier

    Team determines the architecture of a second subcomplex of the nuclear pore complex [NPC]

Credit: Lance Hayashida/Caltech and the Hoelz Laboratory/Caltech

Credit: Hoelz Laboratory/Caltech

    Not just anything is allowed to enter the nucleus, the heart of eukaryotic cells where, among other things, genetic information is stored. A double membrane, called the nuclear envelope, serves as a wall, protecting the contents of the nucleus.

Human cell nucleus

    Any molecules trying to enter or exit the nucleus must do so via a cellular gatekeeper known as the nuclear pore complex (NPC), or pore, that exists within the envelope.

    How can the NPC be such an effective gatekeeper—preventing much from entering the nucleus while helping to shuttle certain molecules across the nuclear envelope? Scientists have been trying to figure that out for decades, at least in part because the NPC is targeted by a number of diseases, including some aggressive forms of leukemia and nervous system disorders such as a hereditary form of Lou Gehrig’s disease. Now a team led by André Hoelz, assistant professor of biochemistry at Caltech, has solved a crucial piece of the puzzle.

In February of this year, Hoelz and his colleagues published a paper describing the atomic structure of the NPC’s coat nucleoporin complex, a subcomplex that forms what they now call the outer rings (see illustration). Building on that work, the team has now solved the architecture of the pore’s inner ring, a subcomplex that is central to the NPC’s ability to serve as a barrier and transport facilitator. In order to determine that architecture, which specifies how the ring’s proteins interact with each other, the biochemists built up the complex in a test tube and then systematically dissected it to understand the individual interactions between components. Then they validated that this is actually how it works in vivo, in a species of fungus.

    For more than a decade, other researchers have suggested that the inner ring is highly flexible and expands to allow large macromolecules to pass through. “People have proposed some complicated models to explain how this might happen,” says Hoelz. But now he and his colleagues have shown that these models are incorrect and that these dilations simply do not occur.

    “Using an interdisciplinary approach, we solved the architecture of this subcomplex and showed that it cannot change shape significantly,” says Hoelz. “It is a relatively rigid scaffold that is incorporated into the pore and basically just sits as a decoration, like pom-poms on a bicycle. It cannot dilate.”

    The new paper appears online ahead of print on August 27 in Science Express. The four co-lead authors on the paper are Caltech postdoctoral scholars Tobias Stuwe, Christopher J. Bley, and Karsten Thierbach, and graduate student Stefan Petrovic.

    Together, the inner and outer rings make up the symmetric core of the NPC, a structure that includes 21 different proteins. The symmetric core is so named because of its radial symmetry (the two remaining subcomplexes of the NPC are specific to either the side that faces the cell’s cytoplasm or the side that faces the nucleus and are therefore not symmetric). Having previously solved the structure of the coat nucleoporin complex and located it in the outer rings, the researchers knew that the remaining components that are not membrane anchored must make up the inner ring.

    They started solving the architecture by focusing on the channel nucleoporin complex, or channel, which lines the central transport channel and is made up of three proteins, accounting for about half of the inner ring. This complex produces filamentous structures that serve as docking sites for specific proteins that ferry molecules across the nuclear envelope.

    The biochemists employed bacteria to make the proteins associated with the inner ring in a test tube and mixed various combinations until they built the entire subcomplex. Once they had reconstituted the inner ring subcomplex, they were able to modify it to investigate how it is held together and which of its components are critical, and to determine how the channel is attached to the rest of the pore.

    Hoelz and his team found that the channel is attached at only one site. This means that it cannot stretch significantly because such shape changes require multiple attachment points. Hoelz notes that a new electron microscopy study of the NPC published in 2013 by Martin Beck’s group at the European Molecular Biology Laboratory (EMBL) in Heidelberg, Germany, indicated that the central channel is bigger than previously thought and wide enough to accommodate even the largest cargoes known to pass through the pore.

    When the researchers introduced mutations that effectively eliminated the channel’s single attachment, the complex could no longer be incorporated into the inner ring. After proving this in the test tube, they also showed this to be true in living cells.

    “This whole complex is a very complicated machine to assemble. The cool thing here is that nature has found an elegant way to wait until the very end of the assembly of the nuclear pore to incorporate the channel,” says Hoelz. “By incorporating the channel, you establish two things at once: you immediately form a barrier and you generate the ability for regulated transport to occur through the pore.” Prior to the channel’s incorporation, there is simply a hole through which macromolecules can freely pass.

    Next, Hoelz and his colleagues used X-ray crystallography to determine the structure of the channel nucleoporin subcomplex bound to the adaptor nucleoporin Nic96, which is its only nuclear pore attachment site. X-ray crystallography involves shining X-rays on a crystallized sample and analyzing the pattern of rays reflected off the atoms in the crystal. Because the NPC is a large and complex molecular machine that also has many moving parts, they used an engineered antibody to essentially “superglue” many copies of the complex into place to form a nicely ordered crystalline sample. Then they analyzed hundreds of samples using Caltech’s Molecular Observatory—a facility developed with support from the Gordon and Betty Moore Foundation that includes an automated X-ray beam line at the Stanford Synchrotron Radiation Laboratory that can be controlled remotely from Caltech—and the GM/CA beam line at the Advanced Photon Source at the Argonne National Laboratory. Eventually, they were able to determine the size, shape, and position of all the atoms of the channel nucleoporin subcomplex and its location within the full NPC.
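
As a concrete illustration of the geometry underlying the technique, Bragg's law (n·λ = 2d·sin θ) relates the X-ray wavelength and scattering angle to the spacing of planes in the crystal. A quick sketch with illustrative numbers, not values from this study:

```python
import math

def bragg_spacing(wavelength_angstroms, two_theta_degrees, order=1):
    """Solve n*lambda = 2*d*sin(theta) for the lattice plane spacing d."""
    theta = math.radians(two_theta_degrees / 2)
    return order * wavelength_angstroms / (2 * math.sin(theta))

# Illustrative: a ~1.0 A synchrotron beam and a reflection at 2-theta = 20 degrees.
print(f"d = {bragg_spacing(1.0, 20.0):.2f} A")   # ~2.88 A
```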

    “The crystal structure nailed it,” Hoelz says. “There is no way that the channel is changing shape. All of that other work that, for more than 10 years, suggested it was dilating was wrong.”

    The researchers also solved a number of crystal structures from other parts of the NPC and determined how they interact with components of the inner ring. In doing so they demonstrated that one such interaction is critical for positioning the channel in the center of the inner ring. They found that exact positioning is needed for the proper export from the nucleus of mRNA and components of ribosomes, the cell’s protein-making complexes, rendering it critical in the flow of genetic information from DNA to mRNA to protein.

    Hoelz adds that now that the architectures of the inner and outer rings of the NPC are known, getting an atomic structure of the entire symmetric core is “a sprint to the summit.”

    “When I started at Caltech, I thought it might take another 10, 20 years to do this,” he says. “In the end, we have really only been working on this for four and a half years, and the thing is basically tackled. I want to emphasize that this kind of work is not doable everywhere. The people who worked on this are truly special, talented, and smart; and they worked day and night on this for years.”

    Ultimately, Hoelz says he would like to understand how the NPC works in great detail so that he might be able to generate therapies for diseases associated with the dysfunction of the complex. He also dreams of building up an entire pore in the test tube so that he can fully study it and understand what happens as it is modified in various ways. “Just as they did previously when I said that I wanted to solve the atomic structure of the nuclear pore, people will say that I’m crazy for trying to do this,” he says. “But if we don’t do it, it is likely that nobody else will.”

    The paper, “Architecture of the fungal nuclear pore inner ring complex,” had a number of additional Caltech authors: Sandra Schilbach (now of the Max Planck Institute of Biophysical Chemistry), Daniel J. Mayo, Thibaud Perriches, Emily J. Rundlet, Young E. Jeon, Leslie N. Collins, Ferdinand M. Huber, and Daniel H. Lin. Additional coauthors include Marcin Paduch, Akiko Koide, Vincent Lu, Shohei Koide, and Anthony A. Kossiakoff of the University of Chicago; and Jessica Fischer and Ed Hurt of Heidelberg University.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”
    Caltech buildings

     
• richardmitnick 2:51 pm on August 27, 2015
Tags: Applied Research & Technology

    From NERSC: “NERSC, Cray Move Forward With Next-Generation Scientific Computing” 

NERSC

    April 22, 2015
    Jon Bashor, jbashor@lbl.gov, 510-486-5849

    The Cori Phase 1 system will be the first supercomputer installed in the new Computational Research and Theory Facility now in the final stages of construction at Lawrence Berkeley National Laboratory.

    The U.S. Department of Energy’s (DOE) National Energy Research Scientific Computing (NERSC) Center and Cray Inc. announced today that they have finalized a new contract for a Cray XC40 supercomputer that will be the first NERSC system installed in the newly built Computational Research and Theory facility at Lawrence Berkeley National Laboratory.

This supercomputer will be used as Phase 1 of NERSC’s next-generation system named “Cori” in honor of biochemist and Nobel Laureate Gerty Cori. Expected to be delivered this summer, the Cray XC40 supercomputer will feature the Intel Haswell processor. The second phase, the previously announced Cori system, will be delivered in mid-2016 and will feature the next-generation Intel Xeon Phi™ processor “Knights Landing,” a self-hosted, manycore processor with on-package high-bandwidth memory that offers more than 3 teraflop/s of double-precision peak performance per single-socket node.

    NERSC serves as the primary high performance computing facility for the Department of Energy’s Office of Science, supporting some 6,000 scientists annually on more than 700 projects. This latest contract represents the Office of Science’s ongoing commitment to supporting computing to address challenges such as developing new energy sources, improving energy efficiency, understanding climate change and analyzing massive data sets from observations and experimental facilities around the world.

    “This is an exciting year for NERSC and for NERSC users,” said Sudip Dosanjh, director of NERSC. “We are unveiling a brand new, state-of-the-art computing center and our next-generation supercomputer, designed to help our users begin the transition to exascale computing. Cori will allow our users to take their science to a level beyond what our current systems can do.”

    “NERSC and Cray share a common vision around the convergence of supercomputing and big data, and Cori will embody that overarching technical direction with a number of unique, new technologies,” said Peter Ungaro, president and CEO of Cray. “We are honored that the first supercomputer in NERSC’s new center will be our flagship Cray XC40 system, and we are also proud to be continuing and expanding our longstanding partnership with NERSC and the U.S. Department of Energy as we chart our course to exascale computing.”
    Support for Data-Intensive Science

A key goal of the Cori Phase 1 system is to support the increasingly data-intensive computing needs of NERSC users. Toward this end, Phase 1 of Cori will feature more than 1,400 Intel Haswell compute nodes, each with 128 gigabytes of memory. The system will provide about the same sustained application performance as NERSC’s Hopper system, which will be retired later this year. The Cori interconnect will have a dragonfly topology based on the Aries interconnect, identical to NERSC’s Edison system.

However, Cori Phase 1 will have twice as much memory per node as NERSC’s current Edison supercomputer (a Cray XC30 system) and will include a number of advanced features designed to accelerate data-intensive applications:

- A large number of login/interactive nodes to support applications with advanced workflows
- Immediate-access queues for jobs requiring real-time data ingestion or analysis
- High-throughput and serial queues to handle large numbers of jobs for screening, uncertainty quantification, genomic data processing, image processing and similar parallel analysis
- Network connectivity that allows compute nodes to interact with external databases and workflow controllers
- The first half of an approximately 1.5 terabytes/sec NVRAM-based Burst Buffer for high-bandwidth, low-latency I/O
- A Cray Lustre-based file system with over 28 petabytes of capacity and 700 gigabytes/second of I/O bandwidth
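
For a sense of scale, here is a quick back-of-envelope pass over the figures quoted above; it is a sketch using only the numbers in this article, and the node count is approximate ("more than 1,400"):

```python
# Aggregate figures implied by the Cori Phase 1 specifications above.
nodes = 1400                  # "more than 1,400" Haswell compute nodes
mem_per_node_gb = 128         # memory per node
lustre_capacity_pb = 28       # Lustre file system capacity
lustre_bw_gb_s = 700          # Lustre I/O bandwidth
burst_buffer_gb_s = 750       # first half of the ~1.5 TB/s Burst Buffer

print(f"aggregate memory : ~{nodes * mem_per_node_gb / 1024:.0f} TB")
print(f"I/O headroom     : {burst_buffer_gb_s} GB/s buffer vs "
      f"{lustre_bw_gb_s} GB/s disk ({burst_buffer_gb_s / lustre_bw_gb_s:.1f}x)")
print(f"file system      : {lustre_capacity_pb} PB")
```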

In addition, NERSC is collaborating with Cray on two ongoing R&D efforts to maximize Cori’s data potential by enabling higher bandwidth transfers in and out of the compute node, high-transaction-rate database access, and Linux container virtualization functionality on Cray compute nodes to allow custom software stack deployment.

    “The goal is to give users as familiar a system as possible, while also allowing them the flexibility to explore new workflows and paths to computation,” said Jay Srinivasan, the Computational Systems Group lead. “The Phase 1 system is designed to enable users to start running their workload on Cori immediately, while giving data-intensive workloads from other NERSC systems the ability to run on a Cray platform.”
    Burst Buffer Enhances I/O

    A key element of Cori Phase 1 is Cray’s new DataWarp technology, which accelerates application I/O and addresses the growing performance gap between compute resources and disk-based storage. This capability, often referred to as a “Burst Buffer,” is a layer of NVRAM designed to move data more quickly between processor and disk and allow users to make the most efficient use of the system. Cori Phase 1 will feature approximately 750 terabytes of capacity and approximately 750 gigabytes/second of I/O bandwidth. NERSC, Sandia and Los Alamos national laboratories and Cray are collaborating to define use cases and test early software that will provide the following capabilities:

- Improve application reliability (checkpoint-restart)
- Accelerate application I/O performance for small-blocksize I/O and analysis files
- Enhance quality of service by providing dedicated I/O acceleration resources
- Provide fast temporary storage for out-of-core applications
- Serve as a staging area for jobs requiring large input files or persistent fast storage between coupled simulations
- Support post-processing analysis of large simulation data as well as in situ and in transit visualization and analysis using the Burst Buffer nodes
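
To make the checkpoint-restart use case above concrete, here is a minimal sketch of the pattern a burst buffer accelerates: periodic state dumps to fast NVRAM-backed scratch instead of the parallel file system. The mount path and environment variable name are assumptions for illustration, not NERSC-documented interfaces:

```python
import os
import pickle
import time

# Hypothetical burst-buffer mount; real systems expose a per-job path through
# the batch scheduler (the variable name here is an assumption).
BB_PATH = os.environ.get("BURST_BUFFER_PATH", "/tmp/burst_buffer")
os.makedirs(BB_PATH, exist_ok=True)
CKPT = os.path.join(BB_PATH, "state.ckpt")

def save_checkpoint(state):
    # Fast, low-latency write; the buffer drains to disk asynchronously.
    with open(CKPT, "wb") as f:
        pickle.dump(state, f)

def load_checkpoint():
    if os.path.exists(CKPT):
        with open(CKPT, "rb") as f:
            return pickle.load(f)
    return {"step": 0}   # fresh start if no checkpoint exists

state = load_checkpoint()
for step in range(state["step"], 10):
    time.sleep(0.1)              # stand-in for one simulation timestep
    state["step"] = step + 1
    save_checkpoint(state)       # cheap enough to do every step
```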

    Combining Extreme Scale Data Analysis and HPC on the Road to Exascale

    As previously announced, Phase 2 of Cori will be delivered in mid-2016 and will be combined with Phase 1 on the same high speed network, providing a unique resource. When fully deployed, Cori will contain more than 9,300 Knights Landing compute nodes and more than 1,900 Haswell nodes, along with the file system and a 2X increase in the applications I/O acceleration.
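
Those node counts imply a rough lower bound on the combined system's compute peak, using only the per-node figure quoted earlier for Knights Landing; the Haswell contribution is omitted because the article gives no per-node number for it:

```python
# Lower bound on Cori's peak from the stated Knights Landing partition alone.
knl_nodes = 9300        # "more than 9,300" Knights Landing nodes
tf_per_knl_node = 3.0   # "more than 3 teraflop/s" double precision per node
print(f"KNL partition peak: > {knl_nodes * tf_per_knl_node / 1000:.0f} PF/s")
```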

    “In the scientific computing community, the line between large scale data analysis and simulation and modeling is really very blurred,” said Katie Antypas, head of NERSC’s Scientific Computing and Data Services Department. “The combined Cori system is the first system to be specifically designed to handle the full spectrum of computational needs of DOE researchers, as well as emerging needs in which data- and compute-intensive work are part of a single workflow. For example, a scientist will be able to run a simulation on the highly parallel Knights Landing nodes while simultaneously performing data analysis using the Burst Buffer on the Haswell nodes. This is a model that we expect to be important on exascale-era machines.”

    NERSC is funded by the Office of Advanced Scientific Computing Research in the DOE’s Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The National Energy Research Scientific Computing Center (NERSC) is the primary scientific computing facility for the Office of Science in the U.S. Department of Energy. As one of the largest facilities in the world devoted to providing computational resources and expertise for basic scientific research, NERSC is a world leader in accelerating scientific discovery through computation. NERSC is a division of the Lawrence Berkeley National Laboratory, located in Berkeley, California. NERSC itself is located at the UC Oakland Scientific Facility in Oakland, California.

    More than 5,000 scientists use NERSC to perform basic scientific research across a wide range of disciplines, including climate modeling, research into new materials, simulations of the early universe, analysis of data from high energy physics experiments, investigations of protein structure, and a host of other scientific endeavors.

The NERSC Hopper system, a Cray XE6 with a peak theoretical performance of 1.29 Petaflop/s. To highlight its mission, powering scientific discovery, NERSC names its systems for distinguished scientists. Grace Hopper was a pioneer in the field of software development and programming languages and the creator of the first compiler. Throughout her career she was a champion for increasing the usability of computers, understanding that their power and reach would be limited unless they were made to be more user friendly.

(Historical photo of Grace Hopper courtesy of the Hagley Museum & Library, PC20100423_201. Design: Caitlin Youngquist/LBNL Photo: Roy Kaltschmidt/LBNL)

    NERSC is known as one of the best-run scientific computing facilities in the world. It provides some of the largest computing and storage systems available anywhere, but what distinguishes the center is its success in creating an environment that makes these resources effective for scientific research. NERSC systems are reliable and secure, and provide a state-of-the-art scientific development environment with the tools needed by the diverse community of NERSC users. NERSC offers scientists intellectual services that empower them to be more effective researchers. For example, many of our consultants are themselves domain scientists in areas such as material sciences, physics, chemistry and astronomy, well-equipped to help researchers apply computational resources to specialized science problems.

     
• richardmitnick 2:28 pm on August 27, 2015
Tags: Applied Research & Technology

    From Caltech: “Artificial Leaf Harnesses Sunlight for Efficient Fuel Production” 

    Caltech Logo
    Caltech

    08/27/2015
    Jessica Stoller-Conrad

    (From left to right): Chengxiang Xiang and Erik Verlage assemble a monolithically integrated III-V device, protected by a TiO2 stabilization layer, which performs unassisted solar water splitting with collection of hydrogen fuel and oxygen.
    Credit: Lance Hayashida/Caltech

    A highly efficient photoelectrochemical (PEC) device uses the power of the sun to split water into hydrogen and oxygen. The stand-alone prototype includes two chambers separated by a semi-permeable membrane that allows collection of both gas products.
    Credit: Lance Hayashida/Caltech
Illustration of an efficient, robust and integrated solar-driven prototype featuring protected photoelectrochemical assembly coupled with oxygen and hydrogen evolution reaction catalysts.
    Credit: Image provided courtesy of Joint Center for Artificial Photosynthesis; artwork by Darius Siwek

    Generating and storing renewable energy, such as solar or wind power, is a key barrier to a clean-energy economy. When the Joint Center for Artificial Photosynthesis (JCAP) was established at Caltech and its partnering institutions in 2010, the U.S. Department of Energy (DOE) Energy Innovation Hub had one main goal: a cost-effective method of producing fuels using only sunlight, water, and carbon dioxide, mimicking the natural process of photosynthesis in plants and storing energy in the form of chemical fuels for use on demand. Over the past five years, researchers at JCAP have made major advances toward this goal, and they now report the development of the first complete, efficient, safe, integrated solar-driven system for splitting water to create hydrogen fuels.

    “This result was a stretch project milestone for the entire five years of JCAP as a whole, and not only have we achieved this goal, we also achieved it on time and on budget,” says Caltech’s Nate Lewis, George L. Argyros Professor and professor of chemistry, and the JCAP scientific director.

    The new solar fuel generation system, or artificial leaf, is described in the August 27 online issue of the journal Energy and Environmental Science. The work was done by researchers in the laboratories of Lewis and Harry Atwater, director of JCAP and Howard Hughes Professor of Applied Physics and Materials Science.

    “This accomplishment drew on the knowledge, insights and capabilities of JCAP, which illustrates what can be achieved in a Hub-scale effort by an integrated team,” Atwater says. “The device reported here grew out of a multi-year, large-scale effort to define the design and materials components needed for an integrated solar fuels generator.”

    The new system consists of three main components: two electrodes—one photoanode and one photocathode—and a membrane. The photoanode uses sunlight to oxidize water molecules, generating protons and electrons as well as oxygen gas. The photocathode recombines the protons and electrons to form hydrogen gas. A key part of the JCAP design is the plastic membrane, which keeps the oxygen and hydrogen gases separate. If the two gases are allowed to mix and are accidentally ignited, an explosion can occur; the membrane lets the hydrogen fuel be separately collected under pressure and safely pushed into a pipeline.

    Semiconductors such as silicon or gallium arsenide absorb light efficiently and are therefore used in solar panels. However, these materials also oxidize (or rust) on the surface when exposed to water, so cannot be used to directly generate fuel. A major advance that allowed the integrated system to be developed was previous work in Lewis’s laboratory, which showed that adding a nanometers-thick layer of titanium dioxide (TiO2)—a material found in white paint and many toothpastes and sunscreens—onto the electrodes could prevent them from corroding while still allowing light and electrons to pass through. The new complete solar fuel generation system developed by Lewis and colleagues uses such a 62.5-nanometer-thick TiO2 layer to effectively prevent corrosion and improve the stability of a gallium arsenide–based photoelectrode.

    Another key advance is the use of active, inexpensive catalysts for fuel production. The photoanode requires a catalyst to drive the essential water-splitting reaction. Rare and expensive metals such as platinum can serve as effective catalysts, but in its work the team discovered that it could create a much cheaper, active catalyst by adding a 2-nanometer-thick layer of nickel to the surface of the TiO2. This catalyst is among the most active known catalysts for splitting water molecules into oxygen, protons, and electrons and is a key to the high efficiency displayed by the device.

    The photoanode was grown onto a photocathode, which also contains a highly active, inexpensive, nickel-molybdenum catalyst, to create a fully integrated single material that serves as a complete solar-driven water-splitting system.

    A critical component that contributes to the efficiency and safety of the new system is the special plastic membrane that separates the gases and prevents the possibility of an explosion, while still allowing the ions to flow seamlessly to complete the electrical circuit in the cell. All of the components are stable under the same conditions and work together to produce a high-performance, fully integrated system. The demonstration system is approximately one square centimeter in area, converts 10 percent of the energy in sunlight into stored energy in the chemical fuel, and can operate for more than 40 hours continuously.
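
A rough estimate of what those figures mean in fuel terms, assuming standard 1-sun illumination (100 mW/cm², the AM1.5G benchmark) and the textbook Gibbs free energy of 237 kJ per mole of H2; neither value is stated in the article:

```python
# Hydrogen implied by 10% solar-to-fuel efficiency over 1 cm^2 for 40 hours.
area_cm2 = 1.0
efficiency = 0.10            # 10% of sunlight stored as chemical fuel
insolation_w_per_cm2 = 0.1   # standard 1-sun illumination (assumed)
hours = 40.0

stored_joules = area_cm2 * insolation_w_per_cm2 * efficiency * hours * 3600
moles_h2 = stored_joules / 237_000   # Gibbs free energy of H2, J/mol (assumed)
print(f"~{stored_joules:.0f} J stored, ~{moles_h2 * 1000:.1f} mmol H2, "
      f"~{moles_h2 * 22.4:.2f} L at STP")
```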

“This new system shatters all of the combined safety, performance, and stability records for artificial leaf technology by factors of 5 to 10 or more,” Lewis says.

    “Our work shows that it is indeed possible to produce fuels from sunlight safely and efficiently in an integrated system with inexpensive components,” Lewis adds, “Of course, we still have work to do to extend the lifetime of the system and to develop methods for cost-effectively manufacturing full systems, both of which are in progress.”

    Because the work assembled various components that were developed by multiple teams within JCAP, coauthor Chengxiang Xiang, who is co-leader of the JCAP prototyping and scale-up project, says that the successful end result was a collaborative effort. “JCAP’s research and development in device design, simulation, and materials discovery and integration all funneled into the demonstration of this new device,” Xiang says.

These results are published in a paper titled “A monolithically integrated, intrinsically safe, 10% efficient, solar-driven water-splitting system based on active, stable earth-abundant electrocatalysts in conjunction with tandem III-V light absorbers protected by amorphous TiO2 films.” In addition to Lewis, Atwater, and Xiang, other Caltech coauthors include graduate student Erik Verlage, postdoctoral scholars Shu Hu and Ke Sun, material processing and integration research engineer Rui Liu, and JCAP mechanical engineer Ryan Jones. Funding was provided by the Office of Science at the U.S. Department of Energy, and the Gordon and Betty Moore Foundation.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”
    Caltech buildings

     
• richardmitnick 1:46 pm on August 27, 2015
Tags: Applied Research & Technology

    From MIT: “Young brains can take on new functions” 


    MIT News

    August 27, 2015
    Anne Trafton

    Illustration: Jose-Luis Olivares/MIT

    In 2011, MIT neuroscientist Rebecca Saxe and colleagues reported that in blind adults, brain regions normally dedicated to vision processing instead participate in language tasks such as speech and comprehension. Now, in a study of blind children, Saxe’s lab has found that this transformation occurs very early in life, before the age of 4.

    The study, appearing in the Journal of Neuroscience, suggests that the brains of young children are highly plastic, meaning that regions usually specialized for one task can adapt to new and very different roles. The findings also help to define the extent to which this type of remodeling is possible.

    “In some circumstances, patches of cortex appear to take on other roles than the ones that they most typically have,” says Saxe, a professor of cognitive neuroscience and an associate member of MIT’s McGovern Institute for Brain Research. “One question that arises from that is, ‘What is the range of possible differences between what a cortical region typically does and what it could possibly do?’”

    The paper’s lead author is Marina Bedny, a former MIT postdoc who is now an assistant professor at Johns Hopkins University. MIT graduate student Hilary Richardson is also an author of the paper.

    Brain reorganization

    The brain’s cortex, which carries out high-level functions such as thought, sensory processing, and initiation of movement, is made of sheets of neurons, each dedicated to a certain role. Within the visual system, located primarily in the occipital lobe, most neurons are tuned to respond only to a very specific aspect of visual input, such as brightness, orientation, or location in the field of view.

    “There’s this big fundamental question, which is, ‘How did that organization get there, and to what degree can it be changed?’” Saxe says.

    One possibility is that neurons in each patch of cortex have evolved to carry out specific roles, and can do nothing else. At the other extreme is the possibility that any patch of cortex can be recruited to perform any kind of computational task.

    “The reality is somewhere in between those two,” Saxe says.

    To study the extent to which cortex can change its function, scientists have focused on the visual cortex because they can learn a great deal about it by studying people who were born blind.

    A landmark 1996 study of blind people found that their visual regions could participate in a nonvisual task — reading Braille. Some scientists theorized that perhaps the visual cortex is recruited for reading Braille because like vision, it requires discriminating very fine-grained patterns.

    However, in their 2011 study, Saxe and Bedny found that the visual cortex of blind adults also responds to spoken language. “That was weird, because processing auditory language doesn’t require the kind of fine-grained spatial discrimination that Braille does,” Saxe says.

    She and Bedny hypothesized that auditory language processing may develop in the occipital cortex by piggybacking onto the Braille-reading function. To test that idea, they began studying congenitally blind children, including some who had not learned Braille yet. They reasoned that if their hypothesis were correct, the occipital lobe would be gradually recruited for language processing as the children learned Braille.

    However, they found that this was not the case. Instead, children as young as 4 already have language-related activity in the occipital lobe.

    “The response of occipital cortex to language is not affected by Braille acquisition,” Saxe says. “It happens before Braille and it doesn’t increase with Braille.”

    Language-related occipital activity was similar among all of the 19 blind children, who ranged in age from 4 to 17, suggesting that the entire process of occipital recruitment for language processing takes place before the age of 4, Saxe says. Bedny and Saxe have previously shown that this transition occurs only in people blind from birth, suggesting that there is an early critical period after which the cortex loses much of its plasticity.

    The new study represents a huge step forward in understanding how the occipital cortex can take on new functions, says Ione Fine, an associate professor of psychology at the University of Washington.

    “One thing that has been missing is an understanding of the developmental timeline,” says Fine, who was not involved in the research. “The insight here is that you get plasticity for language separate from plasticity for Braille and separate from plasticity for auditory processing.”

    Language skills

    The findings raise the question of how the extra language-processing centers in the occipital lobe affect language skills.

    “This is a question we’ve always wondered about,” Saxe says. “Does it mean you’re better at those functions because you have more of your cortex doing it? Does it mean you’re more resilient in those functions because now you have more redundancy in your mechanism for doing it? You could even imagine the opposite: Maybe you’re less good at those functions because they’re distributed in an inefficient or atypical way.”

    There are hints that the occipital lobe’s contribution to language-related functions “takes the pressure off the frontal cortex,” where language processing normally occurs, Saxe says. Other researchers have shown that suppressing left frontal cortex activity with transcranial magnetic stimulation interferes with language function in sighted people, but not in the congenitally blind.

    This leads to the intriguing prediction that a congenitally blind person who suffers a stroke in the left frontal cortex may retain much more language ability than a sighted person would, Saxe says, although that hypothesis has not been tested.

    Saxe’s lab is now studying children under 4 to try to learn more about how cortical functions develop early in life, while Bedny is investigating whether the occipital lobe participates in functions other than language in congenitally blind people.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
• richardmitnick 11:04 am on August 27, 2015
Tags: Applied Research & Technology, Digital seafloor maps

    From isgtw: “World’s first digital ocean floor map” 


    international science grid this week

    August 26, 2015


Researchers from the University of Sydney’s School of Geosciences in Australia have created the world’s first digital map of the seafloor. Understanding how the ocean — the Earth’s largest storehouse of carbon — interacts with the seabed is critical to knowing how climate change will affect the ocean environment.

    “In order to understand environmental change in the oceans we need to better understand the seabed,” says lead researcher Dr. Adriana Dutkiewicz. “Our research opens the door to a better understanding of the workings and history of the marine carbon cycle. We urgently need to understand how the ocean responds to climate change.”

The last seabed map was hand drawn more than 40 years ago. Using an artificial intelligence method called the support vector machine, experts at National ICT Australia (NICTA) turned an assemblage of descriptions and sediment samples collected since the 1950s into a single contiguous digital map.
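
As a toy illustration of the kind of classifier involved (a sketch only: the features, labels, and model settings here are invented, and the real NICTA model worked from far richer predictors than position):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Fake point samples: (longitude, latitude) -> sediment class (0=clay, 1=ooze).
sites = rng.uniform([-180.0, -60.0], [180.0, 60.0], size=(500, 2))
classes = (sites[:, 1] > 0).astype(int)   # invented labeling rule

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(sites, classes)

# Predict on a regular grid to get a contiguous digital map.
lon, lat = np.meshgrid(np.linspace(-180, 180, 36), np.linspace(-60, 60, 12))
grid = np.column_stack([lon.ravel(), lat.ravel()])
seafloor_map = clf.predict(grid).reshape(lat.shape)
print(seafloor_map.shape)   # (12, 36) grid of predicted sediment classes
```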

    “The difference between the new and old map is a little like comparing a barren tundra landscape with an exotic tropical paradise full of diversity,” says Dutkiewicz. “The ocean floor used to be portrayed as a monotonous seascape whereas the new map echoes the colorful patchworks of dreamtime art.”

    The map data can be downloaded for free [I got the download, but could not find a program to open it], and you can see the dreamy interactive 3D globe here.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 5:40 pm on August 26, 2015 Permalink | Reply
Tags: Applied Research & Technology

    From NAOJ Subaru: “Discovering dust-obscured active galaxies as they grow” 

NAOJ

A group of researchers from Ehime University, Princeton University, and the National Astronomical Observatory of Japan (NAOJ), among others, has performed an extensive search for Dust Obscured Galaxies (DOGs) using data obtained from the Subaru Strategic Program with Hyper Suprime-Cam (HSC).

    NAOJ Subaru Hyper Suprime Camera
    HSC

    HSC is a new wide-field camera mounted at the prime focus of the Subaru Telescope and is an ideal instrument for searching for this rare and important class of galaxy. The research group discovered 48 DOGs, and has measured how common they are. Since DOGs are thought to harbor a rapidly growing black hole in their centers, these results give us clues for understanding the evolution of galaxies and supermassive black holes.

Figure 1: Images of three DOGs. The left, middle, and right panels show the optical image from HSC, the near-infrared image from VIKING, and the mid-infrared image from WISE, respectively. The image size is 20 square arcseconds (1 arcsecond is 1/3600 of a degree). It is clear that DOGs are faint in the optical but extremely bright in the infrared. (Credit: Ehime University/NAOJ/NASA/ESO)

    Co-evolution of galaxies and supermassive black holes

How did galaxies form and evolve over the 13.8-billion-year history of the universe? This question has been the subject of intense observational and theoretical investigation. Recent studies have revealed that almost all massive galaxies harbor a supermassive black hole, with masses ranging from a hundred thousand to as much as a billion times the mass of the sun, and that these masses are tightly correlated with those of their host galaxies. This correlation suggests that supermassive black holes and their host galaxies have evolved together, closely interacting as they grow.
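In quantitative terms, the correlation is often summarized as a roughly constant ratio between black hole mass and host bulge mass. The normalization below is a commonly quoted order-of-magnitude figure from the wider literature, not a value reported in this article:

```latex
% Ballpark form of the black hole / host bulge mass correlation.
% The 10^{-3} normalization is an order-of-magnitude literature value,
% not a result from this study.
\[
  M_{\mathrm{BH}} \sim 10^{-3}\, M_{\mathrm{bulge}}
\]
```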

    Dust Obscured Galaxies

The group of researchers, led by Dr. Yoshiki Toba (Ehime University), focused on Dust Obscured Galaxies (DOGs) as a key population for tackling the mystery of the co-evolution of galaxies and black holes. DOGs are very faint in visible light because of the large quantity of obscuring dust, but they are bright in the infrared. The brightest infrared DOGs in particular are expected to harbor the most actively growing black holes. In addition, most DOGs are seen in the epoch when the star formation activity of galaxies reached its peak, 8-10 billion years ago. Thus both the DOGs and their black holes are growing rapidly, at an early phase of their co-evolution. However, since DOGs are rare and hidden behind significant amounts of dust, previous visible-light surveys have found very few of them.

    A search for Dust Obscured Galaxies with HSC

Hyper Suprime-Cam (HSC) is a new instrument installed on the 8.2-meter Subaru Telescope in 2012. It is a wide-field camera with a field of view nine times the size of the full moon. An ambitious legacy survey with HSC began in March 2014 as a “Subaru Strategic Program (Note 1)”; a total of 300 nights has been allocated over a five-year period. The program has already started to deliver large quantities of excellent imaging data.

The research team selected DOGs from early data of the HSC Subaru Strategic Program (SSP). DOGs are a thousand times brighter in the infrared than in the optical, so the team selected their targets by combining HSC data with data from NASA’s Wide-field Infrared Survey Explorer (WISE: Note 2).

    NASA Wise Telescope
    NASA WISE

They also utilized data from the [ESO] VISTA Kilo-degree Infrared Galaxy survey (VIKING: Note 3). The all-sky survey data from WISE are crucial for discovering spatially rare DOGs, while the VIKING data are useful for identifying the DOGs more precisely.
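As a rough illustration of this kind of selection, the sketch below applies the flux-ratio definition quoted above (infrared roughly a thousand times brighter than optical) to a small hypothetical matched catalog. The magnitudes and the exact cut are illustrative assumptions, not the paper’s published selection criteria.

```python
# Illustrative DOG-style color selection (hypothetical values, not the
# paper's published criteria).
import numpy as np

def ab_mag_to_flux_ujy(mag_ab):
    """Convert an AB magnitude to flux density in microjanskys."""
    return 10 ** ((23.9 - mag_ab) / 2.5)

# Hypothetical matched catalog: HSC i-band and WISE 22-micron AB magnitudes.
i_mag = np.array([24.1, 22.0, 23.8, 25.0])
w4_mag = np.array([16.2, 18.5, 16.0, 17.3])

f_opt = ab_mag_to_flux_ujy(i_mag)   # optical flux density
f_ir = ab_mag_to_flux_ujy(w4_mag)   # mid-infrared flux density

# A flux ratio of 1000 corresponds to a color difference of
# 2.5 * log10(1000) = 7.5 magnitudes.
is_dog = (f_ir / f_opt) >= 1000     # equivalently: (i_mag - w4_mag) >= 7.5
print(np.where(is_dog)[0])          # indices of DOG candidates
```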

ESO Vista Telescope
ESO/VISTA

Consequently, 48 DOGs were discovered. Each of these is at least 10 trillion times as luminous in the infrared as the sun. The number density of these luminous DOGs is about 300 per cubic gigaparsec. Since theory predicts that such DOGs harbor an actively growing supermassive black hole, this result provides researchers with new insights into the mystery of the co-evolution of galaxies and supermassive black holes from a unique observational perspective.

Figure 2: The number density of the DOGs newly selected in this study, as a function of infrared luminosity. The red star shows the HSC result. The research team found that (i) the infrared luminosity of these DOGs exceeds 10 trillion suns, and (ii) their number density is about 300 per cubic gigaparsec (1 gigaparsec is about 3×10^25 meters). (Credit: Ehime University/NAOJ/NASA/ESO)
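For readers curious how a raw count of 48 objects becomes a density of roughly 300 per cubic gigaparsec, the sketch below shows the standard bookkeeping: divide the count by the comoving volume enclosed by the survey footprint and redshift range. The survey area and redshift limits used here are illustrative placeholders, not the paper’s values.

```python
# Back-of-the-envelope number density from a source count (illustrative).
from astropy.cosmology import FlatLambdaCDM
import astropy.units as u
import numpy as np

cosmo = FlatLambdaCDM(H0=70, Om0=0.3)

n_dogs = 48                      # DOGs found (from the article)
area = 9.0 * u.deg**2            # hypothetical survey footprint
z_min, z_max = 1.0, 2.0          # hypothetical redshift range (~8-10 Gyr ago)

# Comoving volume of the redshift shell, scaled to the survey footprint.
sky_fraction = (area / (4 * np.pi * u.sr)).decompose()
shell = cosmo.comoving_volume(z_max) - cosmo.comoving_volume(z_min)
volume = sky_fraction * shell

density = n_dogs / volume.to(u.Gpc**3)
print(f"{density.value:.0f} DOGs per cubic gigaparsec")
```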

    Summary and future prospects

In this research, the team discovered 48 Dust Obscured Galaxies and revealed, for the first time, the statistical properties of infrared-luminous DOGs in particular.

The first author of the paper, Dr. Yoshiki Toba, said, “There are no instruments on large telescopes with the sensitivity and field of view of HSC, and hence HSC is unique in its ability to search for DOGs. When it is complete, the HSC survey will cover more than 100 times as much area of the sky as was used for this study, allowing the identification of thousands of DOGs in the near future. We are planning to investigate the detailed properties of DOGs and their central black holes using observations from many telescopes.”

Professor Tohru Nagao, the second author of the paper, added, “The Subaru Strategic Program with HSC has just begun. In the near future, exciting results will be released not only from studies of galaxy evolution, but also in fields such as the solar system, stars, nearby galaxies, and cosmology.”

This research will be published on October 25, 2015, in the Publications of the Astronomical Society of Japan (PASJ) Subaru special issue (Toba et al. 2015, “Hyper-luminous Dust Obscured Galaxies discovered by the Hyper Suprime-Cam on Subaru and WISE”, PASJ, Vol. 67, Issue 5). The online version was posted on July 12, 2015. This work was supported by the Japan Society for the Promotion of Science (grant No. 25707010) and by the Yamada Science Foundation.

    Notes:

1. In the Subaru Strategic Program, a total of 300 nights has been allocated over five years. The superb resolution and sensitivity of the resulting images make it possible to find these rare, faint galaxies and to understand the growth and evolution of galaxies and black holes.
    2. The Wide-field Infrared Survey Explorer (WISE) was launched by the National Aeronautics and Space Administration (NASA) in 2009. WISE performed an all-sky survey with high sensitivity in four bands (3.4, 4.6, 12, and 22 microns). WISE has observed over seven hundred million infrared sources so far.
    3. The VISTA Kilo-degree Infrared Galaxy survey (VIKING) is performing a wide area near-infrared imaging survey with five broadband filters using the VISTA InfraRed Camera (VIRCAM) on the VISTA telescope operated by the European Southern Observatory (ESO).

    Authors:

    Yoshiki Toba: Research Center for Space and Cosmic Evolution, Ehime University, Research Fellow
    Tohru Nagao: Research Center for Space and Cosmic Evolution, Ehime University, Professor
    Michael A. Strauss: Department of Astrophysical Sciences, Princeton University, Professor
    Kentaro Aoki: Subaru Telescope, National Astronomical Observatory of Japan, Subaru Support Astronomer
Tomotsugu Goto: Institute of Astronomy and Department of Physics, National Tsing Hua University, Assistant Professor
Masatoshi Imanishi: Subaru Telescope, National Astronomical Observatory of Japan and SOKENDAI (Graduate University for Advanced Studies, Japan), Assistant Professor
    Toshihiro Kawaguchi: Division of Physics, Sapporo Medical University, Lecturer
    Yuichi Terashima: Department of Physics, Ehime University, Professor
    Yoshihiro Ueda: Department of Astronomy, Graduate School of Science, Kyoto University, Associate Professor
    Satoshi Miyazaki: National Astronomical Observatory of Japan and SOKENDAI (Graduate University for Advanced Studies, Japan), Associate Professor

    and 24 other authors

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    The National Astronomical Observatory of Japan (NAOJ) is an astronomical research organisation comprising several facilities in Japan, as well as an observatory in Hawaii. It was established in 1988 as an amalgamation of three existing research organizations – the Tokyo Astronomical Observatory of the University of Tokyo, International Latitude Observatory of Mizusawa, and a part of Research Institute of Atmospherics of Nagoya University.

    In the 2004 reform of national research organizations, NAOJ became a division of the National Institutes of Natural Sciences.

    NAOJ Subaru Telescope

    NAOJ Subaru Telescope interior
    Subaru

    ALMA Array
    ALMA

Solar Flare Telescope

Nobeyama Radio Telescope
    Nobeyama Radio Observatory

    Nobeyama Solar Radio Telescope Array
    Nobeyama Radio Observatory: Solar

Mizusawa Station Japan
    Mizusawa VERA Observatory

    NAOJ Okayama Astrophysical Observatory Telescope
    Okayama Astrophysical Observatory


     