From National Science Foundation: “NSF statement: New development in quantum computing”

From National Science Foundation

In this rendering, a trefoil knot, an iconic topological object, is shown coming out of a tunnel with an image of superconducting qubit chips reflected on its surface. Credit: P. Roushan/Martinis lab/UC Santa Barbara

October 23, 2019
Public Affairs, NSF
(703) 292-7090
media@nsf.gov

In “Quantum supremacy using a programmable superconducting processor,” published in the Oct. 24 issue of the journal Nature, a team of researchers led by Google presents evidence that their quantum computer has accomplished a task that existing computers built from silicon chips cannot. When verified, the result will add credence to the broader promise of quantum computing. In addition to funding a broad portfolio of quantum research, including other quantum computing systems and approaches, NSF has provided research support to four of the Nature paper’s co-authors: John Martinis of the University of California, Santa Barbara; Fernando Brandao of Caltech; Edward Farhi of the Massachusetts Institute of Technology; and Dave Bacon of the University of Washington.

Today, Google announced that a quantum computer has accomplished a task not yet possible on a classical device. When verified, this may prove to be a milestone moment, one that builds on more than three decades of continuous NSF investment in the fundamental physics, computer science, materials science, and engineering that underlie many of today’s quantum computing developments — and the researchers behind them — including four of the co-authors who helped create Google’s system. As quantum research continues bridging theory to practice across a range of experimental platforms, it is equally important that NSF, other agencies, and industry invest in the workforce developing quantum technologies and the countless applications that will benefit all of society. Together, we will ensure continuing U.S. leadership in quantum computing.

See the full article here.



Please help promote STEM in your local schools.

Stem Education Coalition
The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” We are the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.

We fulfill our mission chiefly by issuing limited-term grants — currently about 12,000 new awards per year, with an average duration of three years — to fund specific research proposals that have been judged the most promising by a rigorous and objective merit-review system. Most of these awards go to individuals or small groups of investigators. Others provide funding for research centers, instruments and facilities that allow scientists, engineers and students to work at the outermost frontiers of knowledge.

NSF’s goals — discovery, learning, research infrastructure and stewardship — provide an integrated strategy to advance the frontiers of knowledge, cultivate a world-class, broadly inclusive science and engineering workforce and expand the scientific literacy of all citizens, build the nation’s research capability through investments in advanced instrumentation and facilities, and support excellence in science and engineering research and education through a capable and responsive organization. We like to say that NSF is “where discoveries begin.”

Many of the discoveries and technological advances have been truly revolutionary. In the past few decades, NSF-funded researchers have won some 236 Nobel Prizes as well as other honors too numerous to list. These pioneers have included the scientists or teams that discovered many of the fundamental particles of matter, analyzed the cosmic microwaves left over from the earliest epoch of the universe, developed carbon-14 dating of ancient artifacts, decoded the genetics of viruses, and created an entirely new state of matter called a Bose-Einstein condensate.

NSF also funds equipment that is needed by scientists and engineers but is often too expensive for any one group or researcher to afford. Examples of such major research equipment include giant optical and radio telescopes, Antarctic research sites, high-end computer facilities and ultra-high-speed connections, ships for ocean research, sensitive detectors of very subtle physical phenomena and gravitational wave observatories.

Another essential element in NSF’s mission is support for science and engineering education, from pre-K through graduate school and beyond. The research we fund is thoroughly integrated with education to help ensure that there will always be plenty of skilled people available to work in new and emerging scientific, engineering and technological fields, and plenty of capable teachers to educate the next generation.

No single factor is more important to the intellectual and economic progress of society, and to the enhanced well-being of its citizens, than the continuous acquisition of new knowledge. NSF is proud to be a major part of that process.

Specifically, the Foundation’s organic legislation authorizes us to engage in the following activities:

Initiate and support, through grants and contracts, scientific and engineering research and programs to strengthen scientific and engineering research potential, and education programs at all levels, and appraise the impact of research upon industrial development and the general welfare.
Award graduate fellowships in the sciences and in engineering.
Foster the interchange of scientific information among scientists and engineers in the United States and foreign countries.
Foster and support the development and use of computers and other scientific methods and technologies, primarily for research and education in the sciences.
Evaluate the status and needs of the various sciences and engineering and take into consideration the results of this evaluation in correlating our research and educational programs with other federal and non-federal programs.
Provide a central clearinghouse for the collection, interpretation and analysis of data on scientific and technical resources in the United States, and provide a source of information for policy formulation by other federal agencies.
Determine the total amount of federal money received by universities and appropriate organizations for the conduct of scientific and engineering research, including both basic and applied, and construction of facilities where such research is conducted, but excluding development, and report annually thereon to the President and the Congress.
Initiate and support specific scientific and engineering activities in connection with matters relating to international cooperation, national security and the effects of scientific and technological applications upon society.
Initiate and support scientific and engineering research, including applied research, at academic and other nonprofit institutions and, at the direction of the President, support applied research at other organizations.
Recommend and encourage the pursuit of national policies for the promotion of basic research and education in the sciences and engineering.
Strengthen research and education innovation in the sciences and engineering, including independent research by individuals, throughout the United States.
Support activities designed to increase the participation of women and minorities and others underrepresented in science and technology.

At present, NSF has a total workforce of about 2,100 at its Alexandria, VA, headquarters, including approximately 1,400 career employees, 200 scientists from research institutions on temporary duty, 450 contract workers and the staff of the National Science Board (NSB) office and the Office of the Inspector General.

NSF is divided into the following seven directorates that support science and engineering research and education: Biological Sciences; Computer and Information Science and Engineering; Engineering; Geosciences; Mathematical and Physical Sciences; Social, Behavioral and Economic Sciences; and Education and Human Resources. Each is headed by an assistant director and each is further subdivided into divisions like materials research, ocean sciences and behavioral and cognitive sciences.

Within NSF’s Office of the Director, the Office of Integrative Activities also supports research and researchers. Other sections of NSF are devoted to financial management, award processing and monitoring, legal affairs, outreach and other functions. The Office of the Inspector General examines the foundation’s work and reports to the NSB and Congress.

Each year, NSF supports an average of about 200,000 scientists, engineers, educators and students at universities, laboratories and field sites all over the United States and throughout the world, from Alaska to Alabama to Africa to Antarctica. You could say that NSF support goes “to the ends of the earth” to learn more about the planet and its inhabitants, and to produce fundamental discoveries that further the progress of research and lead to products and services that boost the economy and improve general health and well-being.

As described in our strategic plan, NSF is the only federal agency whose mission includes support for all fields of fundamental science and engineering, except for medical sciences. NSF is tasked with keeping the United States at the leading edge of discovery in a wide range of scientific areas, from astronomy to geology to zoology. So, in addition to funding research in the traditional academic areas, the agency also supports “high-risk, high-payoff” ideas, novel collaborations and numerous projects that may seem like science fiction today, but which the public will take for granted tomorrow. And in every case, we ensure that research is fully integrated with education so that today’s revolutionary work will also be training tomorrow’s top scientists and engineers.

Unlike many other federal agencies, NSF does not hire researchers or directly operate our own laboratories or similar facilities. Instead, we support scientists, engineers and educators directly through their own home institutions (typically universities and colleges). Similarly, we fund facilities and equipment such as telescopes, through cooperative agreements with research consortia that have competed successfully for limited-term management contracts.

NSF’s job is to determine where the frontiers are, identify the leading U.S. pioneers in these fields and provide money and equipment to help them continue. The results can be transformative. For example, years before most people had heard of “nanotechnology,” NSF was supporting scientists and engineers who were learning how to detect, record and manipulate activity at the scale of individual atoms — the nanoscale. Today, scientists are adept at moving atoms around to create devices and materials with properties that are often more useful than those found in nature.

Dozens of companies are gearing up to produce nanoscale products. NSF is funding the research projects, state-of-the-art facilities and educational opportunities that will teach new skills to the science and engineering students who will make up the nanotechnology workforce of tomorrow.

At the same time, we are looking for the next frontier.

NSF’s task of identifying and funding work at the frontiers of science and engineering is not a “top-down” process. NSF operates from the “bottom up,” keeping close track of research around the United States and the world, maintaining constant contact with the research community to identify ever-moving horizons of inquiry, monitoring which areas are most likely to result in spectacular progress and choosing the most promising people to conduct the research.

NSF funds research and education in most fields of science and engineering. We do this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the U.S. The Foundation considers proposals submitted by organizations on behalf of individuals or groups for support in most fields of research. Interdisciplinary proposals also are eligible for consideration. Awardees are chosen from those who send us proposals asking for a specific amount of support for a specific project.

Proposals may be submitted in response to the various funding opportunities that are announced on the NSF website. These funding opportunities fall into three categories — program descriptions, program announcements and program solicitations — and are the mechanisms NSF uses to generate funding requests. At any time, scientists and engineers are also welcome to send in unsolicited proposals for research and education projects, in any existing or emerging field. The Proposal and Award Policies and Procedures Guide (PAPPG) provides guidance on proposal preparation and submission and award management. At present, NSF receives more than 42,000 proposals per year.

To ensure that proposals are evaluated in a fair, competitive, transparent and in-depth manner, we use a rigorous system of merit review. Nearly every proposal is evaluated by a minimum of three independent reviewers consisting of scientists, engineers and educators who do not work at NSF or for the institution that employs the proposing researchers. NSF selects the reviewers from among the national pool of experts in each field and their evaluations are confidential. On average, approximately 40,000 experts, knowledgeable about the current state of their field, give their time to serve as reviewers each year.

The reviewer’s job is to decide which projects are of the very highest caliber. NSF’s merit review process, considered by some to be the “gold standard” of scientific review, ensures that many voices are heard and that only the best projects make it to the funding stage. An enormous amount of research, deliberation, thought and discussion goes into award decisions.

The NSF program officer reviews the proposal and analyzes the input received from the external reviewers. After scientific, technical and programmatic review and consideration of appropriate factors, the program officer makes an “award” or “decline” recommendation to the division director. Final programmatic approval for a proposal is generally completed at NSF’s division level. A principal investigator (PI) whose proposal for NSF support has been declined will receive information and an explanation of the reason(s) for declination, along with copies of the reviews considered in making the decision. If that explanation does not satisfy the PI, he/she may request additional information from the cognizant NSF program officer or division director.

If the program officer makes an award recommendation and the division director concurs, the recommendation is submitted to NSF’s Division of Grants and Agreements (DGA) for award processing. A DGA officer reviews the recommendation from the program division/office for business, financial and policy implications, and the processing and issuance of a grant or cooperative agreement. DGA generally makes awards to academic institutions within 30 days after the program division/office makes its recommendation.


From MIT Technology Review: “Here’s what quantum supremacy does—and doesn’t—mean for computing”

MIT Technology Review
From MIT Technology Review

Sep 24, 2019
Martin Giles


Google has reportedly demonstrated for the first time that a quantum computer is capable of performing a task beyond the reach of even the most powerful conventional supercomputer in any practical time frame—a milestone known in the world of computing as “quantum supremacy.”

The ominous-sounding term, which was coined by theoretical physicist John Preskill in 2012, evokes an image of Darth Vader–like machines lording it over other computers. And the news has already produced some outlandish headlines, such as one on the Infowars website that screamed, “Google’s ‘Quantum Supremacy’ to Render All Cryptography and Military Secrets Breakable.” Political figures have been caught up in the hysteria, too: Andrew Yang, a presidential candidate, tweeted that “Google achieving quantum computing is a huge deal. It means, among many other things, that no code is uncrackable.”

Nonsense. It doesn’t mean that at all. Google’s achievement is significant, but quantum computers haven’t suddenly turned into computing colossi that will leave conventional machines trailing in the dust. Nor will they be laying waste to conventional cryptography in the near future—though in the longer term, they could pose a threat we need to start preparing for now.

Here’s a guide to what Google appears to have achieved—and an antidote to the hype surrounding quantum supremacy.

What do we know about Google’s experiment?

We still haven’t had confirmation from Google about what it’s done. The information about the experiment comes from a paper titled “Quantum Supremacy Using a Programmable Superconducting Processor,” which was briefly posted on a NASA website before being taken down. Its existence was revealed in a report in the Financial Times—and a copy of the paper can be found here.

The experiment is a pretty arcane one, but it required a great deal of computational effort. Google’s team used a quantum processor code-named Sycamore to prove that the figures pumped out by a random number generator were indeed truly random. They then worked out how long it would take Summit, the world’s most powerful supercomputer, to do the same task.

ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

The difference was stunning: while the quantum machine polished it off in 200 seconds, the researchers estimated that the classical computer would need 10,000 years.

When the paper is formally published, other researchers may start poking holes in the methodology, but for now it appears that Google has scored a computing first by showing that a quantum machine can indeed outstrip even the most powerful of today’s supercomputers. “There’s less doubt now that quantum computers can be the future of high-performance computing,” says Nick Farina, the CEO of quantum hardware startup EeroQ.

Why are quantum computers so much faster than classical ones?

In a classical computer, bits that carry information represent either a 1 or a 0; but quantum bits, or qubits—which take the form of subatomic particles such as photons and electrons—can be in a kind of combination of 1 and 0 at the same time, a state known as “superposition.” Unlike bits, qubits can also influence one another through a phenomenon known as “entanglement,” which baffled even Einstein, who called it “spooky action at a distance.”
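
To make that picture concrete, here is a minimal numpy sketch (not Google’s code, and using no quantum-computing library) that puts one qubit into superposition with a Hadamard gate and then entangles a pair into a Bell state, whose two measurement outcomes always agree:

```python
import numpy as np

# Single-qubit |0> state and the Hadamard gate.
zero = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT flips the second qubit when the first is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Superposition: H puts one qubit into an equal mix of 0 and 1.
plus = H @ zero
print("one-qubit amplitudes:", plus)        # ~[0.707, 0.707]

# Entanglement: H then CNOT yields a Bell state; only the
# correlated outcomes 00 and 11 carry any probability.
bell = CNOT @ np.kron(plus, zero)
print("two-qubit amplitudes:", bell)        # ~[0.707, 0, 0, 0.707]

# Sample measurements: '01' and '10' never appear.
probs = np.abs(bell) ** 2
print(np.random.choice(["00", "01", "10", "11"], size=8, p=probs))
```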

Thanks to these properties, which are described in more detail in our quantum computing explainer, adding just a few extra qubits to a system increases its processing power exponentially. Crucially, quantum machines can crunch through large amounts of data in parallel, which helps them outpace classical machines that process data sequentially. That’s the theory. In practice, researchers have been laboring for years to prove conclusively that a quantum computer can do something even the most capable conventional one can’t. Google’s effort has been led by John Martinis, who has done pioneering work in the use of superconducting circuits to generate qubits.

Doesn’t this speedup mean quantum machines can overtake other computers now?

No. Google picked a very narrow task. Quantum computers still have a long way to go before they can best classical ones at most things—and they may never get there. But researchers I’ve spoken to since the paper appeared online say Google’s experiment is still significant because for a long time there have been doubts that quantum machines would ever be able to outstrip classical computers at anything.

Until now, research groups have been able to reproduce the results of quantum machines with around 40 qubits on classical systems. Google’s Sycamore processor, which harnessed 53 qubits for the experiment, suggests that such emulation has reached its limits. “We’re entering an era where exploring what a quantum computer can do will now require a physical quantum computer … You won’t be able to credibly reproduce results anymore on a conventional emulator,” explains Simon Benjamin, a quantum researcher at the University of Oxford.
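
Benjamin’s emulation ceiling is, at bottom, a memory limit: a brute-force classical emulator stores one complex amplitude for every basis state, or 2^n amplitudes for n qubits. The quick sketch below, assuming 16 bytes per amplitude (two 64-bit floats), shows why 40 qubits is already hard and 53 is out of reach for a full state vector:

```python
def human(nbytes: float) -> str:
    """Format a byte count in decimal units."""
    for unit in ("B", "KB", "MB", "GB", "TB", "PB"):
        if nbytes < 1000:
            return f"{nbytes:.1f} {unit}"
        nbytes /= 1000
    return f"{nbytes:.1f} EB"

# 16 bytes per complex amplitude, 2**n amplitudes for n qubits.
for n in (30, 40, 53):
    print(f"{n} qubits -> {human((2 ** n) * 16)} of amplitudes")
# 30 qubits -> 17.2 GB, 40 qubits -> 17.6 TB, 53 qubits -> 144.1 PB
```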

Isn’t Andrew Yang right that our cryptographic defenses can now be blown apart?

Again, no. That’s a wild exaggeration. The Google paper makes clear that while its team has been able to show quantum supremacy in a narrow sampling task, we’re still a long way from developing a quantum computer capable of implementing Shor’s algorithm, which was developed in the 1990s to help quantum machines factor massive numbers. Today’s most popular encryption methods can be broken only by factoring such numbers—a task that would take conventional machines many thousands of years.
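
For a sense of where Shor’s algorithm gets its power, the sketch below runs its number-theoretic core classically on a toy number. The quantum speedup lies entirely in finding the period r of a^x mod N; once r is known, the factors fall out with ordinary arithmetic. For a real 2048-bit RSA modulus this brute-force period search is hopeless, which is exactly the gap a large quantum computer would close:

```python
from math import gcd

def period(a: int, N: int) -> int:
    """Smallest r > 0 with a**r % N == 1 (brute force; toy sizes only)."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7
r = period(a, N)                   # r = 4
p = gcd(a ** (r // 2) - 1, N)      # gcd(48, 15) = 3
q = gcd(a ** (r // 2) + 1, N)      # gcd(50, 15) = 5
print(f"period {r}: {N} = {p} * {q}")
```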

But this quantum gap shouldn’t be cause for complacency, because things like financial and health records that are going to be kept for decades could eventually become vulnerable to hackers with a machine capable of running a code-busting algorithm like Shor’s. Researchers are already hard at work on novel encryption methods that will be able to withstand such attacks (see our explainer on post-quantum cryptography for more details).

Why aren’t quantum computers as supreme as “quantum supremacy” makes them sound?

The main reason is that they still make far more errors than classical ones. Qubits’ delicate quantum state lasts for mere fractions of a second and can easily be disrupted by even the slightest vibration or tiny change in temperature—phenomena known as “noise” in quantum-speak. This causes mistakes to creep into calculations. Qubits also have a Tinder-like tendency to want to couple with plenty of others. Such “crosstalk” between them can also produce errors.
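
A rough way to see how such noise limits a calculation: if each qubit’s state survives a time window with probability exp(-t/T1), the odds that a whole register stays clean collapse quickly as circuits get deeper. The coherence time and gate time below are assumed round numbers for illustration, not measured values from any device:

```python
import math

T1 = 90e-6          # assumed coherence time, seconds
gate_time = 50e-9   # assumed time per operation, seconds
n_qubits = 53

for depth in (10, 100, 1000):
    t = depth * gate_time
    p_one = math.exp(-t / T1)      # one qubit survives the run
    p_all = p_one ** n_qubits      # every qubit survives
    print(f"depth {depth:>4}: P(register still coherent) = {p_all:.3g}")
# depth 10 -> ~0.75, depth 100 -> ~0.053, depth 1000 -> ~2e-13
```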

Google’s paper suggests it has found a novel way to cut down on crosstalk, which could help pave the way for more reliable machines. But today’s quantum computers still resemble early supercomputers in the amount of hardware and complexity needed to make them work, and they can tackle only very esoteric tasks. We’re not yet even at a stage equivalent to the ENIAC, the first general-purpose electronic computer, which was put to work in 1945.

So what’s the next quantum milestone to aim for?

Besting conventional computers at solving a real-world problem—a feat that some researchers refer to as “quantum advantage.” The hope is that quantum computers’ immense processing power will help uncover new pharmaceuticals and materials, enhance artificial-intelligence applications, and lead to advances in other fields such as financial services, where they could be applied to things like risk management.

If researchers can’t demonstrate a quantum advantage in at least one of these kinds of applications soon, the bubble of inflated expectations that’s blowing up around quantum computing could quickly burst.

When I asked Google’s Martinis about this in an interview for a story last year, he was clearly aware of the risk. “As soon as we get to quantum supremacy,” he told me, “we’re going to want to show that a quantum machine can do something really useful.” Now it’s time for his team and other researchers to step up to that pressing challenge.

See the full article here.



Please help promote STEM in your local schools.

Stem Education Coalition

The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.


From insideHPC: “Jülich Supercomputing Centre Announces Quantum Computing Research Partnership with Google”

From insideHPC

July 8, 2019

Today the Jülich Supercomputing Centre announced it is partnering with Google in the field of quantum computing research. The partnership will include joint research and expert training in the fields of quantum technologies and quantum algorithms and the mutual use of quantum hardware.

JUWELS: Jülich Wizard for European Leadership Science
JURECA: Jülich Research on Exascale Cluster Architectures
DEEP-EST: Dynamical Exascale Entry Platform, prototype system Dell R640
Pilot systems of the Human Brain Project: JULIA (from Cray) and JURON (from a consortium of IBM and NVIDIA)

“Quantum computers have the potential to solve certain types of calculations much more efficiently than today’s technologies can,” said Peter Altmaier, Federal Minister for Economic Affairs and Energy. “Quantum computers and quantum algorithms are therefore very important technologies which will shape the future and are being followed closely around the world. At present, quantum computers are still very much in their infancy, and it is difficult to predict what will become possible – and what perhaps will not. Researchers still have a lot of basic research to do in this area. It was the same situation when we were developing today’s computers. I am therefore delighted that Google and Forschungszentrum Jülich have decided to cooperate in the important forward-looking field of quantum computers.”

Google has been working on the development of quantum processors and quantum algorithms for years. Exploring new technologies for quantum computers is also a key research focus at Forschungszentrum Jülich. The German research center will operate and make publicly accessible a European quantum computer with 50 to 100 superconducting qubits, to be developed within the EU’s Quantum Flagship Program, a large-scale initiative in the field of quantum technologies funded at the €1 billion level over a 10-year timescale.

Google and Forschungszentrum Jülich will support each other especially in training junior researchers and experts. “A shortage of specialists, like in the field of artificial intelligence, is also foreseeable in the field of quantum computing. For this reason, we invest in training and promoting top academic talent,” says Dr. Markus Hoffmann, Head of Quantum Partnerships at Google.

The partnership includes regular research exchange. “Hands-on workshops and spring schools will be organised at Forschungszentrum Jülich. The Jülich UNified Infrastructure for Quantum computing (JUNIQ), a European quantum computer user facility planned for the Jülich Supercomputing Centre (JSC), will be available for training industry professionals, and will be accessible in the cloud to European users,” says Prof. Kristel Michielsen from the JSC, head of the research group Quantum Information Processing.

Forschungszentrum Jülich and Google have already begun working together. Prof. Kristel Michielsen and Prof. Tommaso Calarco from Forschungszentrum Jülich received Google Faculty Research Awards in 2018. Prof. Frank Wilhelm-Mauch, recipient of a Google Faculty Research Award in 2015, is a collaborator of Forschungszentrum Jülich within the subproject OpenSuperQ of the European “Quantum Flagship” project.

See the full article here.


Please help promote STEM in your local schools.

Stem Education Coalition

Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

insideHPC
2825 NW Upshur
Suite G
Portland, OR 97239

Phone: (503) 877-5048


From WIRED: “How Google Is Cramming More Data Into Its New Atlantic Cable”


From WIRED

04.05.19
Klint Finley

Fiber-optic cable being loaded onto a ship owned by SubCom, which is working with Google to build the world’s fastest undersea data connection. Bill Gallery/SubCom.

Google says the fiber-optic cable it’s building across the Atlantic Ocean will be the fastest of its kind. When the cable goes live next year, the company estimates it will transmit around 250 terabits per second, fast enough to zap all the contents of the Library of Congress from Virginia to France three times every second. That’s about 56 percent faster than Facebook and Microsoft’s Marea cable, which can transmit about 160 terabits per second between Virginia and Spain.

Fiber-optic networks work by sending light over thin strands of glass. Fiber-optic cables, which are about the diameter of a garden hose, enclose multiple pairs of these fibers. Google’s new cable is so fast because it carries more fiber pairs. Today, most long-distance undersea cables contain six or eight fiber-optic pairs. Google said Friday that its new cable, dubbed Dunant, is expected to be the first to include 12 pairs, thanks to new technology developed by Google and SubCom, which designs, manufactures, and deploys undersea cables.
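
The headline numbers in the two paragraphs above are easy to check with simple arithmetic; the cable-level figures (250 and 160 terabits per second, 12 fiber pairs) come from the article, and the rest is derived:

```python
dunant_tbps, marea_tbps, pairs = 250, 160, 12

# 250 vs 160 Tb/s -> ~56% faster, matching the article's figure.
print(f"Dunant vs Marea: {(dunant_tbps / marea_tbps - 1) * 100:.0f}% faster")

# Spreading the total over 12 fiber pairs gives the per-pair load.
print(f"per fiber pair: {dunant_tbps / pairs:.1f} Tb/s")   # ~20.8 Tb/s
```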

Dunant might not be the fastest for long: Japanese tech giant NEC says it has technology that will enable long-distance undersea cables with 16 fiber-optic pairs. And Vijay Vusirikala, head of network architecture and optical engineering at Google, says the company is already contemplating 24-pair cables.

The surge in intercontinental cables, and their increasing capacity, reflect continual growth in internet traffic. They enable activists to livestream protests to distant countries, help companies buy and sell products around the world, and facilitate international romances. “Many people still believe international telecommunications are conducted by satellite,” says NEC executive Atsushi Kuwahara. “That was true in 1980, but nowadays, 99 percent of international telecommunications is submarine.”

So much capacity is being added that, for the moment, it’s outstripping demand. Animations featured in a recent New York Times article illustrated the exploding number of undersea cables since 1989. That growth is continuing. Alan Mauldin of the research firm Telegeography says only about 30 percent of the potential capacity of major undersea cable routes is currently in use—and more than 60 new cables are planned to enter service by 2021. That summons memories of the 1990s Dotcom Bubble, when telecoms buried far more fiber in both the ground and the ocean than they would need for years to come.

A selection of fiber-optic cable products made by SubCom. Brian Smith/SubCom.

But the current growth in new cables is driven less by telcos and more by companies like Google, Facebook, and Microsoft that crave ever more bandwidth for the streaming video, photos, and other data scuttling between their global data centers. And experts say that as undersea cable technologies improve, it’s not crazy for companies to build newer, faster routes between continents, even with so much fiber already lying idle in the ocean.

Controlling Their Own Destiny

Mauldin says that although there’s still lots of capacity available, companies like Google and Facebook prefer to have dedicated capacity for their own use. That’s part of why big tech companies have either invested in new cables through consortia or, in some cases, built their own cables.

“When we do our network planning, it’s important to know if we’ll have the capacity in the network,” says Google’s Vusirikala. “One way to know is by building our own cables, controlling our own destiny.”

Another factor is diversification. Having more cables means there are alternate routes for data if a cable breaks or malfunctions. At the same time, more people outside Europe and North America are tapping the internet, often through smartphones. That’s prompted companies to think about new routes, like between North and South America, or between Europe and Africa, says Mike Hollands, an executive at European data center company Interxion. The Marea cable ticks both of those boxes, giving Facebook and Microsoft faster routes to North Africa and the Middle East, while also creating an alternate path to Europe in case one or more of the traditional routes were disrupted by something like an earthquake.

Cost Per Bit

There are financial incentives for the tech companies as well. By owning the cables instead of leasing them from telcos, Google and other tech giants can potentially save money in the long term, Mauldin says.

The cost to build and deploy a new undersea cable isn’t dropping. But as companies find ways to pump more data through these cables more quickly, their value increases.

There are a few ways to increase the performance of a fiber-optic communications system. One is to increase the energy used to push the data from one end to the other. The catch is that to keep the data signal from degrading, undersea cables need repeaters roughly every 100 kilometers, Vusirikala explains. Those repeaters amplify not just the signal, but any noise introduced along the way, diminishing the value of boosting the energy.
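
At that spacing, a transatlantic run needs a lot of repeaters, each of which re-amplifies noise along with signal. The route length below is an assumption for the sketch (transatlantic routes typically run 6,000 to 7,000 km; the article does not give Dunant’s exact length):

```python
route_km = 6_400      # assumed route length, Virginia to France
spacing_km = 100      # repeater spacing quoted above
print(f"~{route_km // spacing_km} repeaters end to end")   # ~64
```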

A rendering of one of SubCom’s specialized Reliance-class cable ships. SubCom.

You can also increase the amount of data that each fiber pair within a fiber-optic cable can carry. A technique called “dense wavelength division multiplexing” now enables more than 100 wavelengths to be sent along a single fiber pair.
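
Those wavelength counts are roughly consistent with the cable’s headline capacity. The per-wavelength data rate below is an assumed round number; the article gives only the wavelength and pair counts:

```python
pairs = 12
wavelengths_per_pair = 100       # "more than 100" per the article
gbps_per_wavelength = 200        # assumption for the sketch

total_tbps = pairs * wavelengths_per_pair * gbps_per_wavelength / 1000
print(f"{total_tbps:.0f} Tb/s")  # 240 Tb/s, near the quoted ~250
```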

Or you can pack more fiber pairs into a cable. Traditionally, each pair in a fiber-optic cable required two repeater components called “pumps.” The pumps take up space inside the repeater casing, so adding more pumps would require changes to the way undersea cable systems are built, deployed, and maintained, says SubCom CTO Georg Mohs.

To get around that problem, SubCom and others are using a technique called space-division multiplexing (SDM) to allow four repeater pumps to power four fiber pairs. That will reduce the capacity of each pair, but cutting the required number of pumps in half lets them add additional pairs that more than make up for it, Mohs says.
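
The trade Mohs describes can be put in arithmetic form. Every number below is illustrative; the point is only that halving pumps per pair can buy enough extra pairs to raise total capacity even if each pair individually carries less:

```python
pump_budget = 24                 # assumed fixed pump count per repeater

# Traditional design: 2 pumps per fiber pair.
trad_pairs = pump_budget // 2    # 12 pairs
trad_total = trad_pairs * 1.0    # capacity, arbitrary units per pair

# SDM: pumps shared so each pair needs only 1, with an assumed
# 30% per-pair capacity penalty from the shared pumping.
sdm_pairs = pump_budget // 1     # 24 pairs
sdm_total = sdm_pairs * 0.7

print(f"traditional: {trad_total:.1f}, SDM: {sdm_total:.1f}")
# 12.0 vs 16.8: ~40% more total capacity from the same pumps
```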

“This had been in our toolkit before,” Mohs says, but like other companies, SubCom has been more focused on adding more wavelengths per fiber pair.

The result: Cables that can move more data than ever before. That means the total cost per bit of data sent across the cable is lower.

See the full article here.


Please help promote STEM in your local schools.

Stem Education Coalition


From Science and Technology Facilities Council: “UK dataset expertise informs Google’s new dataset search”


From Science and Technology Facilities Council

6 September 2018

False colour image of Europe captured by Sentinel 3. (Credit: contains modified Copernicus Sentinel data (2018))

ESA Sentinel 3

Experts from UK Research and Innovation have contributed to a search tool newly launched by Google that aims to help scientists, policy makers and other user groups more easily find the data required for their work and their stories, or simply to satisfy their intellectual curiosity.

In today’s world, scientists in many disciplines and a growing number of journalists live and breathe data. There are many thousands of data repositories on the web, providing access to millions of datasets; and local and national governments around the world publish their data as well. As part of the UK Research and Innovation commitment to easy access to data, their experts worked with Google to help develop the Dataset Search, launched today.

Similar to how Google Scholar works, Dataset Search lets users find datasets wherever they’re hosted, whether it’s a publisher’s site, a digital library, or an author’s personal web page.

Google approached UK Research and Innovation’s Natural Environment Research Council (NERC) and Science and Technology Facilities Council (STFC) to help ensure their world-leading environmental datasets were included. These organisations have a long heritage of managing huge, complex datasets on the atmosphere, oceans, climate change, and even the solar system. That expertise, embodied by Dr Sarah Callaghan, the Data and Programme Manager at UKRI’s national space laboratory STFC RAL Space, led to their work with Google on the project.

Dr Sarah Callaghan said: “In RAL Space we manage, archive and distribute thousands of terabytes of data to make it available to scientific researchers and other interested parties. My experience making datasets findable, usable and interoperable enabled me to advise Google on their Dataset Search and how to best display their search results.”

“I was able to draw on my work with NERC and STFC datasets, not only in just archiving and managing data for the long term and the scientific record, but also helping users to understand if a dataset is the right one for their purposes.”

Temperature of Europe during the April 2018 heatwave. (Credit: contains modified Copernicus Sentinel data (2018))

To create Dataset Search, Google developed guidelines for dataset providers to describe their data in a way that search engines can better understand the content of their pages. These guidelines include salient information about datasets: who created the dataset, when it was published, how the data was collected, what the terms are for using the data, etc. This enables search engines to collect and link this information, analyse where different versions of the same dataset might be, and find publications that may be describing or discussing the dataset. The approach is based on an open standard for describing this information (schema.org). Many STFC and NERC datasets for environmental data are already described in this way and are particularly good examples of findable, user-friendly datasets.
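
For readers curious what such markup looks like, here is a sketch of a schema.org Dataset record serialized as JSON-LD, the form the guidelines describe for embedding in a dataset landing page. The field values are invented placeholders; the property names (name, description, creator, datePublished, license) are standard schema.org/Dataset vocabulary:

```python
import json

dataset = {
    "@context": "https://schema.org/",
    "@type": "Dataset",
    "name": "Example rainfall observations",              # placeholder
    "description": "Daily rainfall totals for an example site.",
    "creator": {"@type": "Organization", "name": "Example Data Centre"},
    "datePublished": "2018-01-01",
    "license": "https://example.org/terms",               # terms of use
}

# A landing page would embed this inside
# <script type="application/ld+json"> ... </script>
print(json.dumps(dataset, indent=2))
```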

“Standardised ways of describing data allow us to help researchers by building tools and services to make it easier to find and use data,” said Dr Callaghan. “If people don’t know what datasets exist, they won’t know how to look for what they need to solve their environmental problems. For example, an ecologist might not know where to go to find, or how to access, the rainfall data needed to understand a changing habitat. Making data easier to find will help introduce researchers from a variety of disciplines to the vast amount of data I and my colleagues manage for NERC and STFC.”

The new Google Dataset Search offers references to most datasets in environmental and social sciences, as well as data from other disciplines including government data and data provided by news organisations.

Professor Tim Wheeler, Director of Research and Innovation at NERC, said: “NERC is constantly working to raise awareness of the wealth of environmental information held within its Data Centres, and to improve access to it. This new tool will make it easier than ever for the public, business and science professionals to find and access the data that they’re looking for. We want to get as many people as possible interested in and able to benefit from data collected by the environmental science that we fund.”

NERC JASMIN supercomputer based at STFC’s Rutherford Appleton Laboratory (Credit: STFC)

See the full article here.


Please help promote STEM in your local schools.

Stem Education Coalition


Helping build a globally competitive, knowledge-based UK economy

We are a world-leading multi-disciplinary science organisation, and our goal is to deliver economic, societal, scientific and international benefits to the UK and its people – and more broadly to the world. Our strength comes from our distinct but interrelated functions:

Universities: we support university-based research, innovation and skills development in astronomy, particle physics, nuclear physics, and space science
Scientific Facilities: we provide access to world-leading, large-scale facilities across a range of physical and life sciences, enabling research, innovation and skills training in these areas
National Campuses: we work with partners to build National Science and Innovation Campuses based around our National Laboratories to promote academic and industrial collaboration and translation of our research to market through direct interaction with industry
Inspiring and Involving: we help ensure a future pipeline of skilled and enthusiastic young people by using the excitement of our sciences to encourage wider take-up of STEM subjects in school and future life (science, technology, engineering and mathematics)

We support an academic community of around 1,700 in particle physics, nuclear physics, and astronomy including space science, who work at more than 50 universities and research institutes in the UK, Europe, Japan and the United States, including a rolling cohort of more than 900 PhD students.

STFC-funded universities produce physics postgraduates with outstanding high-end scientific, analytic and technical skills who on graduation enjoy almost full employment. Roughly half of our PhD students continue in research, sustaining national capability and creating the bedrock of the UK’s scientific excellence. The remainder – much valued for their numerical, problem solving and project management skills – choose equally important industrial, commercial or government careers.

Our large-scale scientific facilities in the UK and Europe are used by more than 3,500 users each year, carrying out more than 2,000 experiments and generating around 900 publications. The facilities provide a range of research techniques using neutrons, muons, lasers and x-rays, and high performance computing and complex analysis of large data sets.

They are used by scientists across a huge variety of science disciplines ranging from the physical and heritage sciences to medicine, biosciences, the environment, energy, and more. These facilities provide a massive productivity boost for UK science, as well as unique capabilities for UK industry.

Our two Campuses are based around our Rutherford Appleton Laboratory at Harwell in Oxfordshire, and our Daresbury Laboratory in Cheshire – each of which offers a different cluster of technological expertise that underpins and ties together diverse research fields.

The combination of access to world-class research facilities and scientists, office and laboratory space, business support, and an environment which encourages innovation has proven a compelling combination, attracting start-ups, SMEs and large blue chips such as IBM and Unilever.

We think our science is awesome – and we know students, teachers and parents think so too. That’s why we run an extensive Public Engagement and science communication programme, ranging from loans of Moon rocks to schools and funding support for academics to inspire more young people, to embedding public engagement in our funded grant programme and running a series of lectures, travelling exhibitions and visits to our sites across the year.

Ninety per cent of physics undergraduates say that they were attracted to the course by our sciences, and applications for physics courses are up – despite an overall decline in university enrolment.


From Duke University via The News&Observer: “Look out, IBM. A Duke-led group is also a player in quantum computing”


From Duke University

via

The News&Observer

August 13, 2018
Ray Gronberg

1
Duke University professors Iman Marvian, Jungsang Kim and Kenneth Brown, gathered here in Kim’s lab in the Chesterfield Building in downtown Durham, are working together to develop a quantum computer that relies on “trapped ion” technology. The National Science Foundation and the federal Intelligence Advanced Research Projects Activity are helping fund the project. Les Todd LKT Photography, Inc.

There’s a group based at Duke University that thinks it can outdo IBM in the quantum-computing game, and it just got another $15 million in funding from the U.S. government.


The National Science Foundation grant is helping underwrite a consortium led by professors Jungsang Kim and Ken Brown that’s previously received backing from the federal Intelligence Advanced Research Projects Activity.

Kim said the group is developing a quantum computer that has “up to a couple dozen qubits” of computational power and reckons it’s a year or so from being operational. The word qubit is the quantum-computing world’s equivalent of normal computing’s “bit” when it comes to gauging processing ability, and each additional qubit represents a doubling of that power.

“One of the goals of this [grant] is to establish the hardware so we can allow researchers to work on the software and systems optimization,” Kim said of the National Science Foundation grant the agency awarded on Aug. 6.

Two or three dozen qubits might not sound like a lot when IBM says it has built and tested a 50-qubit machine. But the Duke-led research group is approaching the problem from an entirely different angle.

The “trapped-ion” design it’s using could hold qubits steady in its internal memory for much longer than the superconducting designs IBM is working on, Brown said.

Superconducting designs — which operate at extremely cold temperatures — “are a bit faster” than trapped-ion ones and are the focus of “a much larger industrial effort,” Brown said.

That speed-versus-resilience tradeoff could matter because IBM says its machines can hold a qubit steady in memory for only up to about 90 microseconds. That means processing runs have to be short, on the order of no more than a couple of seconds total.
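
That 90-microsecond window translates directly into a budget of operations. The gate time below is an assumed round number for the sketch; the article gives only the coherence figure:

```python
coherence_s = 90e-6     # ~90 microseconds, per IBM's figure above
gate_time_s = 50e-9     # assumed per-operation time for the sketch

ops = int(coherence_s / gate_time_s)
print(f"~{ops:,} sequential operations per coherence window")  # ~1,800
```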

“One thing that’s becoming clear in the community is, the thing we need to scale is not just the number of qubits but also the quality of operations,” said Brown, who in January traded a faculty post at Georgia Tech for a new one at Duke. “If you have a huge number of qubits but the operations are not very good, you effectively have a bad classical computer.”
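
Brown’s point about quality of operations can be made numeric: with a fixed per-operation error rate, the chance that a computation finishes without a single error decays exponentially with its length, so adding qubits without better gates just produces noise sooner. The error rates below are assumed for illustration:

```python
for error_rate in (0.01, 0.001):       # assumed per-operation errors
    for n_ops in (100, 1000):
        p_clean = (1 - error_rate) ** n_ops
        print(f"error {error_rate:.1%}/op, {n_ops:>4} ops: "
              f"P(no errors) = {p_clean:.2f}")
# At 1% error, 1,000 operations leave essentially nothing usable.
```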

Kim added that designers working on quantum computers have to look for the same kind of breakthrough in thinking about the technology that the Wright brothers brought to the development of flight.

Just as the Wrights and other people working in the field in the late 19th and early 20th centuries figured out that mimicking birds was a developmental dead end, the builders of quantum computers “have to start with something that’s fundamentally quantum and build the right technology to scale it,” Kim said. “You don’t build quantum computers by mimicking classical computers.”

But for now, the government agencies that are subsidizing the field are backing different approaches and waiting to see what pans out.

The Aug. 6 grant is the third big one Kim’s lab has secured, building on awards from IARPA in 2010 and 2016 that together brought it about $54.5 million in funding. But in both those rounds of funding, teams from IBM were also among those getting awards from the federal agency, which funds what it calls “high-risk/high-payoff” research for the intelligence community.

The stakes are so high because quantum computing could become a breakthrough technology. It exploits the physics of subatomic particles in hopes of developing a machine that can process data that exists in multiple states at once, rather than the binary 1 or 0 of traditional computing.

IBM and the government aren’t the only heavy hitters involved. Google has a quantum-computing project of its own that’s grown with help from IARPA funding.

Google’s Quantum Dream Machine

Kim and other people involved in the Duke-led group have also formed a company called IonQ that’s received investment from Google and Amazon.

The Duke-led group also includes teams from the University of Maryland, the University of Chicago and Tufts University that are working on hardware, software and applications development, respectively, Duke officials say. Researchers from the University of New Mexico, MIT, the National Institute of Standards and Technology and the University of California-Berkeley are also involved.

Duke doesn’t have quantum computing all to itself in the Triangle, as in the spring IBM made N.C. State University part of its Q Network, a group of businesses, universities and government agencies that can use IBM’s quantum machines via the cloud.

But the big difference between the N.C. State and Duke efforts is that with State, the focus is on developing both the future workforce and beginning to push software development, while at Duke it’s more fundamentally about trying to develop the technology.

Not that software is a side issue, mind.

“If I had a quantum computer with 60 qubits, I know there are algorithms I can run on it that I can’t simulate with my regular computers,” Brown said, explaining that the technology requires new thinking there, too. “That’s a weird place to be.”

The quantum project is important enough that Duke has backed it with faculty hires. Brown had been collaborating with Kim’s group for a while, but elected to move to Duke from Georgia Tech after Duke officials decided to conduct what Kim termed “a cluster hire” of quantum specialists.

Brown joined Kim in the Pratt School of Engineering’s electrical and computer engineering department. A search for someone to fill an endowed chair in physics continues.

Another professor involved, Iman Marvian, also joined the Duke faculty at the start of 2018 thanks to the university’s previously announced “quantitative initiative.” A quantum information theorist, he got a joint appointment in physics and engineering. He came to Duke from MIT after a post-doc stint at the Boston school.

See the full article here.


Please help promote STEM in your local schools.

Stem Education Coalition

Duke Campus

Younger than most other prestigious U.S. research universities, Duke University consistently ranks among the very best. Duke’s graduate and professional schools — in business, divinity, engineering, the environment, law, medicine, nursing and public policy — are among the leaders in their fields. Duke’s home campus is situated on nearly 9,000 acres in Durham, N.C., a city of more than 200,000 people. Duke also is active internationally through the Duke-NUS Graduate Medical School in Singapore, Duke Kunshan University in China and numerous research and education programs across the globe. More than 75 percent of Duke students pursue service-learning opportunities in Durham and around the world through DukeEngage and other programs that advance the university’s mission of “knowledge in service to society.”


From NYT: “Yale Professors Race Google and IBM to the First Quantum Computer”

From The New York Times

NOV. 13, 2017
CADE METZ

Prof. Robert Schoelkopf inside a lab at Yale University. Quantum Circuits, the start-up he has created with two of his fellow professors, is located just down the road. Credit Roger Kisby for The New York Times

Robert Schoelkopf is at the forefront of a worldwide effort to build the world’s first quantum computer. Such a machine, if it can be built, would use the seemingly magical principles of quantum mechanics to solve problems today’s computers never could.

Three giants of the tech world — Google, IBM, and Intel — are using a method pioneered by Mr. Schoelkopf, a Yale University professor, and a handful of other physicists as they race to build a machine that could significantly accelerate everything from drug discovery to artificial intelligence. So is a Silicon Valley start-up called Rigetti Computing. And though it has remained under the radar until now, those four quantum projects have another notable competitor: Robert Schoelkopf.

After their research helped fuel the work of so many others, Mr. Schoelkopf and two other Yale professors have started their own quantum computing company, Quantum Circuits.

Based just down the road from Yale in New Haven, Conn., and backed by $18 million in funding from the venture capital firm Sequoia Capital and others, the start-up is another sign that quantum computing — for decades a distant dream of the world’s computer scientists — is edging closer to reality.

“In the last few years, it has become apparent to us and others around the world that we know enough about this that we can build a working system,” Mr. Schoelkopf said. “This is a technology that we can begin to commercialize.”

Quantum computing systems are difficult to understand because they do not behave like the everyday world we live in. But this counterintuitive behavior is what allows them to perform calculations at a rate that would not be possible on a typical computer.

Today’s computers store information as “bits,” with each transistor holding either a 1 or a 0. But thanks to something called the superposition principle — behavior exhibited by subatomic particles like electrons and photons, the fundamental particles of light — a quantum bit, or “qubit,” can store a 1 and a 0 at the same time. This means two qubits can hold four values at once. As you expand the number of qubits, the machine becomes exponentially more powerful.

Todd Holmdahl, who oversees the quantum project at Microsoft, said he envisioned a quantum computer as something that could instantly find its way through a maze. “A typical computer will try one path and get blocked and then try another and another and another,” he said. “A quantum computer can try all paths at the same time.”

The trouble is that storing information in a quantum system for more than a short amount of time is very difficult, and this short “coherence time” leads to errors in calculations. But over the past two decades, Mr. Schoelkopf and other physicists have worked to solve this problem using what are called superconducting circuits. They have built qubits from materials that exhibit quantum properties when cooled to extremely low temperatures.

With this technique, they have shown that, every three years or so, they can improve coherence times by a factor of 10. This is known as Schoelkopf’s Law, a playful ode to Moore’s Law, the rule that says the number of transistors on computer chips will double every two years.
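
Taken as growth rates, the two laws compound very differently: a factor of 10 every three years outruns a doubling every two. A quick projection (starting points are arbitrary; only the ratios matter):

```python
for years in (3, 6, 9, 15):
    schoelkopf = 10 ** (years / 3)     # coherence-time improvement
    moore = 2 ** (years / 2)           # transistor-count improvement
    print(f"{years:>2} yrs: coherence x{schoelkopf:,.0f}, "
          f"transistors x{moore:,.1f}")
# After 15 years: coherence x100,000 vs transistors x~181
```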

Professor Schoelkopf, left, and Prof. Michel Devoret working on a device that can reach extremely low temperatures to allow a quantum computing device to function. Credit Roger Kisby for The New York Times

“Schoelkopf’s Law started as a joke, but now we use it in many of our research papers,” said Isaac Chuang, a professor at the Massachusetts Institute of Technology. “No one expected this would be possible, but the improvement has been exponential.”

These superconducting circuits have become the primary area of quantum computing research across the industry. One of Mr. Schoelkopf’s former students now leads the quantum computing program at IBM. The founder of Rigetti Computing studied with Michel Devoret, one of the other Yale professors behind Quantum Circuits.

In recent months, after grabbing a team of top researchers from the University of California, Santa Barbara, Google indicated it is on the verge of using this method to build a machine that can achieve “quantum supremacy” — when a quantum machine performs a task that would be impossible on your laptop or any other machine that obeys the laws of classical physics.

There are other areas of research that show promise. Microsoft, for example, is betting on particles known as anyons. But superconducting circuits appear likely to be the first systems that will bear real fruit.

The belief is that quantum machines will eventually analyze the interactions between physical molecules with a precision that is not possible today, something that could radically accelerate the development of new medications. Google and others also believe that these systems can significantly accelerate machine learning, the field of teaching computers to learn tasks on their own by analyzing data or experiments with certain behavior.

A quantum computer could also be able to break the encryption algorithms that guard the world’s most sensitive corporate and government data. With so much at stake, it is no surprise that so many companies are betting on this technology, including start-ups like Quantum Circuits.

The deck is stacked against the smaller players, because the big-name companies have so much more money to throw at the problem. But start-ups have their own advantages, even in such a complex and expensive area of research.

“Small teams of exceptional people can do exceptional things,” said Bill Coughran, who helped oversee the creation of Google’s vast internet infrastructure and is now investing in Mr. Schoelkopf’s company as a partner at Sequoia. “I have yet to see large teams inside big companies doing anything tremendously innovative.”

Though Quantum Circuits is using the same quantum method as its bigger competitors, Mr. Schoelkopf argued that his company has an edge because it is tackling the problem differently. Rather than building one large quantum machine, it is constructing a series of tiny machines that can be networked together. He said this will make it easier to correct errors in quantum calculations — one of the main difficulties in building one of these complex machines.

But each of the big companies insists that it holds an advantage — and each is loudly trumpeting its progress, even if a working machine is still years away.

Mr. Coughran said that he and Sequoia envision Quantum Circuits evolving into a company that can deliver quantum computing to any business or researcher that needs it. Another investor, Canaan’s Brendan Dickinson, said that if a company like this develops a viable quantum machine, it will become a prime acquisition target.

“The promise of a large quantum computer is incredibly powerful,” Mr. Dickinson said. “It will solve problems we can’t even imagine right now.”

See the full article here.

Please help promote STEM in your local schools.


Stem Education Coalition
