Tagged: IBM

  • richardmitnick 3:22 pm on March 13, 2019 Permalink | Reply
    Tags: "Quantum computing should supercharge this machine-learning technique", , Certain machine-learning tasks could be revolutionized by more powerful quantum computers., IBM, ,   

    From MIT Technology Review: “Quantum computing should supercharge this machine-learning technique”

    From MIT Technology Review

    March 13, 2019
    Will Knight

    The machine-learning experiment was performed using this IBM Q quantum computer.

    Certain machine-learning tasks could be revolutionized by more powerful quantum computers.

    Quantum computing and artificial intelligence are both hyped ridiculously. But it seems the two may indeed combine to open up new possibilities.

    In a research paper published today in the journal Nature, researchers from IBM and MIT show how an IBM quantum computer can accelerate a specific type of machine-learning task called feature matching. The team says that future quantum computers should allow machine learning to hit new levels of complexity.

    As first imagined decades ago, quantum computers were seen as a different way to compute information. In principle, by exploiting the strange, probabilistic nature of physics at the quantum, or atomic, scale, these machines should be able to perform certain kinds of calculations at speeds far beyond those possible with any conventional computer (see “What is a quantum computer?”). There is a huge amount of excitement about their potential at the moment, as they are finally on the cusp of reaching a point where they will be practical.

    At the same time, because we don’t yet have large quantum computers, it isn’t entirely clear how they will outperform ordinary supercomputers—or, in other words, what they will actually do (see “Quantum computers are finally here. What will we do with them?”).

    Feature matching is a technique that converts data into a mathematical representation that lends itself to machine-learning analysis. The resulting machine learning depends on the efficiency and quality of this process. Using a quantum computer, it should be possible to perform this conversion on a scale that was hitherto impossible.
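
    The article does not spell out what such a representation looks like, so here is a minimal, purely illustrative Python sketch (not the construction from the Nature paper): each two-feature data point is encoded into a small simulated quantum state, and the overlap between two encoded states serves as the similarity measure a classifier could use. The data values and function names are invented for the example.

    ```python
    import numpy as np

    def feature_map(x):
        """Map a 2-feature data point to a 2-qubit state (4 amplitudes) via angle encoding."""
        def qubit(theta):
            return np.array([np.cos(theta / 2), np.sin(theta / 2)])
        # Tensor product of two single-qubit states gives a 4-dimensional state vector
        return np.kron(qubit(x[0]), qubit(x[1]))

    def kernel(x, y):
        """Similarity of two data points: the squared overlap of their mapped states."""
        return np.abs(np.vdot(feature_map(x), feature_map(y))) ** 2

    a, b = np.array([0.3, 1.2]), np.array([0.5, 0.9])
    print(kernel(a, a))  # 1.0: a point is maximally similar to itself
    print(kernel(a, b))  # < 1.0: similarity between two distinct points
    ```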

    The MIT-IBM researchers performed their simple calculation using a two-qubit quantum computer. Because the machine is so small, it doesn’t prove that bigger quantum computers will have a fundamental advantage over conventional ones, but it suggests that this would be the case. The largest quantum computers available today have around 50 qubits, although not all of them can be used for computation because of the need to correct for errors that creep in as a result of the fragile nature of these quantum bits.

    “We are still far off from achieving quantum advantage for machine learning,” the IBM researchers, led by Jay Gambetta, write in a blog post. “Yet the feature-mapping methods we’re advancing could soon be able to classify far more complex data sets than anything a classical computer could handle. What we’ve shown is a promising path forward.”

    “We’re at a stage where we don’t have applications next month or next year, but we are in a very good position to explore the possibilities,” says Xiaodi Wu, an assistant professor at the University of Maryland’s Joint Center for Quantum Information and Computer Science. Wu says he expects practical applications to be discovered within a year or two.

    Quantum computing and AI are hot right now. Just a few weeks ago, Xanadu, a quantum computing startup based in Toronto, came up with an almost identical approach to that of the MIT-IBM researchers, which the company posted online. Maria Schuld, a machine-learning researcher at Xanadu, says the recent work may be the start of a flurry of research papers that combine the buzzwords “quantum” and “AI.”

    “There is a huge potential,” she says.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 11:01 am on September 5, 2018 Permalink | Reply
    Tags: IBM

    From Duke University via The News&Observer: “Look out, IBM. A Duke-led group is also a player in quantum computing” 


    From Duke University

    via

    The News&Observer

    August 13, 2018
    Ray Gronberg

    Duke University professors Iman Marvian, Jungsang Kim and Kenneth Brown, gathered here in Kim’s lab in the Chesterfield Building in downtown Durham, are working together to develop a quantum computer that relies on “trapped ion” technology. The National Science Foundation and the federal Intelligence Advanced Research Projects Activity are helping fund the project. Les Todd LKT Photography, Inc.

    There’s a group based at Duke University that thinks it can out-do IBM in the quantum-computing game, and it just got another $15 million in funding from the U.S. government.

    Quantum computing – IBM

    The National Science Foundation grant is helping underwrite a consortium led by professors Jungsang Kim and Ken Brown that’s previously received backing from the federal Intelligence Advanced Research Projects Activity.

    Kim said the group is developing a quantum computer that has “up to a couple dozen qubits” of computational power and reckons it’s a year or so from being operational. The word qubit is the quantum-computing world’s equivalent of normal computing’s “bit” when it comes to gauging processing ability, and each additional qubit represents a doubling of that power.

    “One of the goals of this [grant] is to establish the hardware so we can allow researchers to work on the software and systems optimization,” Kim said of the National Science Foundation grant the agency awarded on Aug. 6.

    Two or three dozen qubits might not sound like a lot when IBM says it has built and tested a 50-qubit machine. But the Duke-led research group is approaching the problem from an entirely different angle.

    The “trapped-ion” design it’s using could hold qubits steady in its internal memory for much longer than superconducting designs like those IBM is working on can manage, Brown said.

    Superconducting designs — which operate at extremely cold temperatures — “are a bit faster” than trapped-ion ones and are the focus of “a much larger industrial effort,” Brown said.

    That speed-versus-resilience tradeoff could matter because IBM says its machines can hold a qubit steady in memory for only up to about 90 microseconds. That means processing runs have to be short, on the order of no more than a couple of seconds total.

    “One thing that’s becoming clear in the community is, the thing we need to scale is not just the number of qubits but also the quality of operations,” said Brown, who in January traded a faculty post at Georgia Tech for a new one at Duke. “If you have a huge number of qubits but the operations are not very good, you effectively have a bad classical computer.”

    Kim added that designers working on quantum computers have to look for the same kind of breakthrough in thinking about the technology that the Wright brothers brought to the development of flight.

    Just as the Wrights and other people working in the field in the late 19th and early 20th centuries figured out that mimicking birds was a developmental dead end, the builders of quantum computers “have to start with something that’s fundamentally quantum and build the right technology to scale it,” Kim said. “You don’t build quantum computers by mimicking classical computers.”

    But for now, the government agencies that are subsidizing the field are backing different approaches and waiting to see what pans out.

    The Aug. 6 grant is the third big one Kim’s lab has secured, building on awards from IARPA in 2010 and 2016 that together brought it about $54.5 million in funding. But in both those rounds of funding, teams from IBM were also among those getting awards from the federal agency, which funds what it calls “high-risk/high-payoff” research for the intelligence community.

    The stakes are so high because quantum computing could become a breakthrough technology. It exploits the physics of subatomic particles in hopes of developing a machine that can process data that exists in multiple states at once, rather than the binary 1 or 0 of traditional computing.

    IBM and the government aren’t the only heavy hitters involved. Google has a quantum-computing project of its own that’s grown with help from IARPA funding.

    Google’s Quantum Dream Machine

    Kim and other people involved in the Duke-led group have also formed a company called IonQ that’s received investment from Google and Amazon.

    The Duke-led group also includes teams from the University of Maryland, the University of Chicago and Tufts University that are working on hardware, software and applications development, respectively, Duke officials say. Researchers from the University of New Mexico, MIT, the National Institute of Standards and Technology and the University of California-Berkeley are also involved.

    Duke doesn’t have quantum computing all to itself in the Triangle, as in the spring IBM made N.C. State University part of its Q Network, a group of businesses, universities and government agencies that can use IBM’s quantum machines via the cloud.

    But the big difference between the N.C. State and Duke efforts is that with State, the focus is both on developing the future workforce and on beginning to push software development, while at Duke it’s more fundamentally about trying to develop the technology.

    Not that software is a side issue, mind.

    “If I had a quantum computer with 60 qubits, I know there are algorithms I can run on it that I can’t simulate with my regular computers,” Brown said, explaining that the technology requires new thinking there, too. “That’s a weird place to be.”

    The quantum project is important enough that Duke has backed it with faculty hires. Brown had been collaborating with Kim’s group for a while, but elected to move to Duke from Georgia Tech after Duke officials decided to conduct what Kim termed “a cluster hire” of quantum specialists.

    Brown joined Kim in the Pratt School of Engineering’s electrical and computer engineering department. A search for someone to fill an endowed chair in physics continues.

    Another professor involved, Iman Marvian, also joined the Duke faculty at the start of 2018 thanks to the university’s previously announced “quantitative initiative.” A quantum information theorist, he got a joint appointment in physics and engineering. He came to Duke from MIT after a post-doc stint at the Boston school.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Duke Campus

    Younger than most other prestigious U.S. research universities, Duke University consistently ranks among the very best. Duke’s graduate and professional schools — in business, divinity, engineering, the environment, law, medicine, nursing and public policy — are among the leaders in their fields. Duke’s home campus is situated on nearly 9,000 acres in Durham, N.C., a city of more than 200,000 people. Duke also is active internationally through the Duke-NUS Graduate Medical School in Singapore, Duke Kunshan University in China and numerous research and education programs across the globe. More than 75 percent of Duke students pursue service-learning opportunities in Durham and around the world through DukeEngage and other programs that advance the university’s mission of “knowledge in service to society.”

     
  • richardmitnick 10:30 am on July 30, 2018 Permalink | Reply
    Tags: Hello quantum world, IBM

    From COSMOS Magazine: “Hello quantum world” 


    From COSMOS Magazine

    30 July 2018
    Will Knight

    Quantum computing – IBM

    Inside a small laboratory in lush countryside about 80 kilometres north of New York City, an elaborate tangle of tubes and electronics dangles from the ceiling. This mess of equipment is a computer. Not just any computer, but one on the verge of passing what may, perhaps, go down as one of the most important milestones in the history of the field.

    Quantum computers promise to run calculations far beyond the reach of any conventional supercomputer. They might revolutionise the discovery of new materials by making it possible to simulate the behaviour of matter down to the atomic level. Or they could upend cryptography and security by cracking otherwise invincible codes. There is even hope they will supercharge artificial intelligence by crunching through data more efficiently.

    Yet only now, after decades of gradual progress, are researchers finally close to building quantum computers powerful enough to do things that conventional computers cannot. It’s a landmark somewhat theatrically dubbed ‘quantum supremacy’. Google has been leading the charge toward this milestone, while Intel and Microsoft also have significant quantum efforts. And then there are well-funded startups including Rigetti Computing, IonQ and Quantum Circuits.

    No other contender can match IBM’s pedigree in this area, though. Starting 50 years ago, the company produced advances in materials science that laid the foundations for the computer revolution. Which is why, last October, I found myself at IBM’s Thomas J. Watson Research Center to try to answer these questions: What, if anything, will a quantum computer be good for? And can a practical, reliable one even be built?

    Credit: Graham Carlow

    Why we think we need a quantum computer

    The research centre, located in Yorktown Heights, looks a bit like a flying saucer as imagined in 1961. It was designed by the neo-futurist architect Eero Saarinen and built during IBM’s heyday as a maker of large mainframe business machines. IBM was the world’s largest computer company, and within a decade of the research centre’s construction it had become the world’s fifth-largest company of any kind, just behind Ford and General Electric.

    While the hallways of the building look out onto the countryside, the design is such that none of the offices inside have any windows. It was in one of these cloistered rooms that I met Charles Bennett. Now in his 70s, he has large white sideburns, wears black socks with sandals and even sports a pocket protector with pens in it.

    Charles Bennett was one of the pioneers who realised quantum computers could solve some problems exponentially faster than conventional computers. Credit: Bartek Sadowski

    Surrounded by old computer monitors, chemistry models and, curiously, a small disco ball, he recalled the birth of quantum computing as if it were yesterday.

    When Bennett joined IBM in 1972, quantum physics was already half a century old, but computing still relied on classical physics and the mathematical theory of information that Claude Shannon had developed at MIT in the 1950s. It was Shannon who defined the quantity of information in terms of the number of ‘bits’ (a term he popularised but did not coin) required to store it. Those bits, the 0s and 1s of binary code, are the basis of all conventional computing.

    A year after arriving at Yorktown Heights, Bennett helped lay the foundation for a quantum information theory that would challenge all that. It relies on exploiting the peculiar behaviour of objects at the atomic scale. At that size, a particle can exist ‘superposed’ in many states (e.g., many different positions) at once. Two particles can also exhibit ‘entanglement’, so that changing the state of one may instantaneously affect the other.

    Bennett and others realised that some kinds of computations that are exponentially time consuming, or even impossible, could be efficiently performed with the help of quantum phenomena. A quantum computer would store information in quantum bits, or qubits. Qubits can exist in superpositions of 1 and 0, and entanglement and a trick called interference can be used to find the solution to a computation over an exponentially large number of states. It’s annoyingly hard to compare quantum and classical computers, but roughly speaking, a quantum computer with just a few hundred qubits would be able to perform more calculations simultaneously than there are atoms in the known universe.

    In the summer of 1981, IBM and MIT organised a landmark event called the First Conference on the Physics of Computation. It took place at Endicott House, a French-style mansion not far from the MIT campus.

    In a photo that Bennett took during the conference, several of the most influential figures from the history of computing and quantum physics can be seen on the lawn, including Konrad Zuse, who developed the first programmable computer, and Richard Feynman, an important contributor to quantum theory. Feynman gave the conference’s keynote speech, in which he raised the idea of computing using quantum effects. “The biggest boost quantum information theory got was from Feynman,” Bennett told me. “He said, ‘Nature is quantum, goddamn it! So if we want to simulate it, we need a quantum computer.’”

    IBM’s quantum computer – one of the most promising in existence – is located just down the hall from Bennett’s office. The machine is designed to create and manipulate the essential element in a quantum computer: the qubits that store information.

    The gap between the dream and the reality

    The IBM machine exploits quantum phenomena that occur in superconducting materials. For instance, sometimes current will flow clockwise and counterclockwise at the same time. IBM’s computer uses superconducting circuits in which two distinct electromagnetic energy states make up a qubit.

    The superconducting approach has key advantages. The hardware can be made using well-established manufacturing methods, and a conventional computer can be used to control the system. The qubits in a superconducting circuit are also easier to manipulate and less delicate than individual photons or ions.

    Inside IBM’s quantum lab, engineers are working on a version of the computer with 50 qubits. You can run a simulation of a simple quantum computer on a normal computer, but at around 50 qubits it becomes nearly impossible.
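
    A quick back-of-the-envelope calculation shows why: a brute-force simulation stores one complex amplitude for every possible configuration of the qubits, so memory grows as 2^n. The sketch below assumes a dense statevector with 16 bytes per amplitude and ignores cleverer simulation techniques.

    ```python
    # Memory needed to hold a dense n-qubit statevector (16 bytes per complex amplitude)
    for n in (20, 30, 40, 50):
        gib = (2 ** n) * 16 / 2 ** 30
        print(f"{n} qubits: {gib:,.3f} GiB")
    # 20 qubits fit in roughly 16 MiB; 50 qubits would need about 16 million GiB (16 PiB)
    ```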

    That means IBM is theoretically approaching the point where a quantum computer can solve problems a classical computer cannot: in other words, quantum supremacy.

    But as IBM’s researchers will tell you, quantum supremacy is an elusive concept. You would need all 50 qubits to work perfectly, when in reality quantum computers are beset by errors that need to be corrected. It is also devilishly difficult to maintain qubits for any length of time; they tend to ‘decohere’, or lose their delicate quantum nature, much as a smoke ring breaks up at the slightest air current. And the more qubits, the harder both challenges become.

    The cutting-edge science of quantum computing requires nanoscale precision mixed with the tinkering spirit of home electronics. Researcher Jerry Chow is shown here fitting a circuit board in the IBM quantum research lab. Credit: Jon Simon

    “If you had 50 or 100 qubits and they really worked well enough, and were fully error-corrected – you could do unfathomable calculations that can’t be replicated on any classical machine, now or ever,” says Robert Schoelkopf, a Yale professor and founder of a company called Quantum Circuits. “The flip side to quantum computing is that there are exponential ways for it to go wrong.”

    Another reason for caution is that it isn’t obvious how useful even a perfectly functioning quantum computer would be. It doesn’t simply speed up any task you throw at it; in fact, for many calculations, it would actually be slower than classical machines. Only a handful of algorithms have so far been devised where a quantum computer would clearly have an edge. And even for those, that edge might be short-lived. The most famous quantum algorithm, developed by Peter Shor at MIT, is for finding the prime factors of an integer. Many common cryptographic schemes rely on the fact that this is hard for a conventional computer to do. But cryptography could adapt, creating new kinds of codes that don’t rely on factorisation.
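
    To get a feel for why factoring is believed to be hard for conventional machines, here is a minimal sketch of the most naive classical approach, trial division, whose cost grows roughly with the square root of the number being factored, which is exponential in the number of digits. (Shor's algorithm and real-world cryptanalysis work very differently; this is only an illustration of the classical baseline.)

    ```python
    def trial_division(n):
        """Naive classical factoring: try every divisor up to sqrt(n)."""
        factors, d = [], 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    print(trial_division(91))          # [7, 13]
    print(trial_division(2147483647))  # [2147483647]; 2**31 - 1 is a known Mersenne prime
    # Each extra digit multiplies the worst-case work by about sqrt(10); at the hundreds of
    # digits used in RSA keys, even far better classical methods remain impractical.
    ```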

    This is why, even as they near the 50-qubit milestone, IBM’s own researchers are keen to dispel the hype around it. At a table in the hallway that looks out onto the lush lawn outside, I encountered Jay Gambetta, a tall, easygoing Australian who researches quantum algorithms and potential applications for IBM’s hardware. “We’re at this unique stage,” he said, choosing his words with care. “We have this device that is more complicated than you can simulate on a classical computer, but it’s not yet controllable to the precision that you could do the algorithms you know how to do.”

    What gives the IBMers hope is that even an imperfect quantum computer might still be a useful one.

    Gambetta and other researchers have zeroed in on an application that Feynman envisioned back in 1981. Chemical reactions and the properties of materials are determined by the interactions between atoms and molecules. Those interactions are governed by quantum phenomena. A quantum computer can – at least in theory – model those in a way a conventional one cannot.

    Last year, Gambetta and colleagues at IBM used a seven-qubit machine to simulate the precise structure of beryllium hydride. At just three atoms, it is the most complex molecule ever modelled with a quantum system. Ultimately, researchers might use quantum computers to design more efficient solar cells, more effective drugs or catalysts that turn sunlight into clean fuels.

    Those goals are a long way off. But, Gambetta says, it may be possible to get valuable results from an error-prone quantum machine paired with a classical computer.

    Credit: Cosmos Magazine

    Physicist’s dream to engineer’s nightmare

    “The thing driving the hype is the realisation that quantum computing is actually real,” says Isaac Chuang, a lean, soft-spoken MIT professor. “It is no longer a physicist’s dream – it is an engineer’s nightmare.”

    Chuang led the development of some of the earliest quantum computers, working at IBM in Almaden, California, during the late 1990s and early 2000s. Though he is no longer working on them, he thinks we are at the beginning of something very big – that quantum computing will eventually even play a role in artificial intelligence.

    But he also suspects that the revolution will not really begin until a new generation of students and hackers get to play with practical machines. Quantum computers require not just different programming languages but a fundamentally different way of thinking about what programming is. As Gambetta puts it: “We don’t really know what the equivalent of ‘Hello, world’ is on a quantum computer.”

    We are beginning to find out. In 2016 IBM connected a small quantum computer to the cloud. Using a programming tool kit called QISKit, you can run simple programs on it; thousands of people, from academic researchers to schoolkids, have built QISKit programs that run basic quantum algorithms. Now Google and other companies are also putting their nascent quantum computers online. You can’t do much with them, but at least they give people outside the leading labs a taste of what may be coming.
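
    A candidate “Hello, world” might look something like the sketch below, which builds a two-qubit entangled state and samples it. It assumes the open-source qiskit and qiskit-aer Python packages, and exact APIs have shifted across Qiskit releases, so treat it as illustrative rather than definitive.

    ```python
    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)
    qc.h(0)      # put qubit 0 into a superposition of 0 and 1
    qc.cx(0, 1)  # entangle qubit 1 with qubit 0 (these two gates build a Bell state)
    qc.measure([0, 1], [0, 1])

    counts = AerSimulator().run(qc, shots=1024).result().get_counts()
    print(counts)  # roughly half '00' and half '11', and essentially never '01' or '10'
    ```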

    The startup community is also getting excited. A short while after seeing IBM’s quantum computer, I went to the University of Toronto’s business school to sit in on a pitch competition for quantum startups. Teams of entrepreneurs nervously got up and presented their ideas to a group of professors and investors. One company hoped to use quantum computers to model the financial markets. Another planned to have them design new proteins. Yet another wanted to build more advanced AI systems. What went unacknowledged in the room was that each team was proposing a business built on a technology so revolutionary that it barely exists. Few seemed daunted by that fact.

    This enthusiasm could sour if the first quantum computers are slow to find a practical use. The best guess from those who truly know the difficulties – people like Bennett and Chuang – is that the first useful machines are still several years away. And that’s assuming the problem of managing and manipulating a large collection of qubits won’t ultimately prove intractable.

    Still, the experts hold out hope. When I asked him what the world might be like when my two-year-old son grows up, Chuang, who learned to use computers by playing with microchips, responded with a grin. “Maybe your kid will have a kit for building a quantum computer,” he said.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 9:12 am on June 18, 2018 Permalink | Reply
    Tags: Deep Neural Network Training with Analog Memory Devices, IBM

    From HPC Wire: “IBM Demonstrates Deep Neural Network Training with Analog Memory Devices” 

    From HPC Wire

    June 18, 2018
    Oliver Peckham

    Crossbar arrays of non-volatile memories can accelerate the training of fully connected neural networks by performing computation at the location of the data. (Source: IBM)

    From smarter, more personalized apps to seemingly-ubiquitous Google Assistant and Alexa devices, AI adoption is showing no signs of slowing down – and yet, the hardware used for AI is far from perfect. Currently, GPUs and other digital accelerators are used to speed the processing of deep neural network (DNN) tasks – but all of those systems are effectively wasting time and energy shuttling that data back and forth between memory and processing. As the scale of AI applications continues to increase, those cumulative losses are becoming massive.

    In a paper published this month in Nature, IBM researchers Stefano Ambrogio, Pritish Narayanan, Hsinyu Tsai, Robert M. Shelby, Irem Boybat, Carmelo di Nolfo, Severin Sidler, Massimo Giordano, Martina Bodini, Nathan C. P. Farinha, Benjamin Killeen, Christina Cheng, Yassine Jaoudi, and Geoffrey W. Burr demonstrate DNN training on analog memory devices that they report achieves accuracy equivalent to that of a GPU-accelerated system. IBM’s solution performs DNN calculations right where the data are located, storing and adjusting weights in memory, with the effect of conserving energy and improving speed.

    Analog computing, which uses variable signals rather than binary signals, is rarely employed in modern computing due to inherent limits on precision. IBM’s researchers, building on a growing understanding that DNN models operate effectively at lower precision, decided to attempt an accurate approach to analog DNNs.

    The research team says it was able to accelerate key training algorithms, notably the backpropagation algorithm, using analog non-volatile memories (NVM). Writing for the IBM blog, lead author Stefano Ambrogio explains:

    “These memories allow the “multiply-accumulate” operations used throughout these algorithms to be parallelized in the analog domain, at the location of weight data, using underlying physics. Instead of large circuits to multiply and add digital numbers together, we simply pass a small current through a resistor into a wire, and then connect many such wires together to let the currents build up. This lets us perform many calculations at the same time, rather than one after the other. And instead of shipping digital data on long journeys between digital memory chips and processing chips, we can perform all the computation inside the analog memory chip.”
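
    Conceptually, the crossbar computes a matrix-vector product by physics: each output wire’s current is the sum of conductance-times-voltage contributions along it. The numpy sketch below is only an illustration of that idea, not a model of IBM’s hardware, with a little added noise standing in for device variability.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    G = rng.uniform(0.0, 1.0, size=(4, 8))   # weights stored as programmable conductances
    v = rng.uniform(-1.0, 1.0, size=8)       # input activations applied as voltages

    i_ideal = G @ v                                      # the digital multiply-accumulate result
    i_analog = i_ideal + rng.normal(0.0, 0.01, size=4)   # real devices add noise and variability

    print(i_ideal)
    print(i_analog)  # close but imperfect, hence the paper's focus on tolerating device errors
    ```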

    The authors note that their mixed hardware-software approach is able to achieve classification accuracies equivalent to pure software-based training using TensorFlow, despite the imperfections of existing analog memory devices. Writes Ambrogio:

    “By combining long-term storage in phase-change memory (PCM) devices, near-linear update of conventional Complementary Metal-Oxide Semiconductor (CMOS) capacitors and novel techniques for cancelling out device-to-device variability, we finessed these imperfections and achieved software-equivalent DNN accuracies on a variety of different networks. These experiments used a mixed hardware-software approach, combining software simulations of system elements that are easy to model accurately (such as CMOS devices) together with full hardware implementation of the PCM devices. It was essential to use real analog memory devices for every weight in our neural networks, because modeling approaches for such novel devices frequently fail to capture the full range of device-to-device variability they can exhibit.”

    Ambrogio and his team believe that their early design efforts indicate that a full implementation of the analog approach “should indeed offer equivalent accuracy, and thus do the same job as a digital accelerator – but faster and at lower power.” The team is exploring the design of prototype NVM-based accelerator chips, as part of an IBM Research Frontiers Institute project.

    The team estimates that it will be able to deliver chips with a computational energy efficiency of 28,065 GOP/sec/W and a throughput-per-area of 3.6 TOP/sec/mm2. This would be an improvement of two orders of magnitude over today’s GPUs, according to the researchers.
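
    As a rough sanity check on the claimed gap, assuming an illustrative GPU baseline of about 300 GOP/sec/W (an assumption made for this sketch, not a figure from the article):

    ```python
    projected = 28_065   # GOP/sec/W projected for the NVM-based accelerator
    gpu_baseline = 300   # GOP/sec/W, an assumed figure for a contemporary GPU
    print(f"~{projected / gpu_baseline:.0f}x more operations per watt, roughly two orders of magnitude")
    ```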

    The researchers will now turn their attention to demonstrating their approach on larger networks that call for large, fully-connected layers, such as recurrently-connected Long Short Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks with emerging utility for machine translation, captioning and text analytics. As new and better forms of analog memory are developed, they expect continued improvements in areal density and energy efficiency.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    HPCwire is the #1 news and information resource covering the fastest computers in the world and the people who run them. With a legacy dating back to 1987, HPC has enjoyed a legacy of world-class editorial and topnotch journalism, making it the portal of choice selected by science, technology and business professionals interested in high performance and data-intensive computing. For topics ranging from late-breaking news and emerging technologies in HPC, to new trends, expert analysis, and exclusive features, HPCwire delivers it all and remains the HPC communities’ most reliable and trusted resource. Don’t miss a thing – subscribe now to HPCwire’s weekly newsletter recapping the previous week’s HPC news, analysis and information at: http://www.hpcwire.com.

     
  • richardmitnick 3:46 pm on May 14, 2018 Permalink | Reply
    Tags: Harvard University and MIT, IBM, Institute of Science and Technology Austria, University of Geneva

    From University of Leeds via phys.org: “Deeper understanding of quantum chaos may be the key to quantum computers” 


    From University of Leeds

    phys.org

    May 14, 2018

    Quantum systems can exist in many possible states, here illustrated by groups of spins, each pointing along a certain direction. Thermalization occurs when a system evenly explores all allowed configurations. Instead, when a “quantum scar” forms (as shown in the figure), some configurations emerge as special. This feature allows scarred systems to sustain memory of the initial state despite thermalization. Credit: Zlatko Papic, University of Leeds

    New research gives insight into a recent experiment that was able to manipulate an unprecedented number of atoms through a quantum simulator. This new theory could provide another step on the path to creating the elusive quantum computers.

    Quantum computing – IBM

    An international team of researchers, led by the University of Leeds and in cooperation with the Institute of Science and Technology Austria and the University of Geneva, has provided a theoretical explanation for the particular behaviour of individual atoms that were trapped and manipulated in a recent experiment by Harvard University and MIT [Nature Physics]. The experiment used a system of finely tuned lasers to act as “optical tweezers” to assemble a remarkably long chain of 51 atoms.

    When the quantum dynamics of the atom chain were measured, there were surprising oscillations that persisted for much longer than expected and which couldn’t be explained.

    Study co-author, Dr. Zlatko Papic, Lecturer in Theoretical Physics at Leeds, said: “The previous Harvard-MIT experiment created surprisingly robust oscillations that kept the atoms in a quantum state for an extended time. We found these oscillations to be rather puzzling because they suggested that atoms were somehow able to “remember” their initial configuration while still moving chaotically.

    “Our goal was to understand more generally where such oscillations could come from, since oscillations signify some kind of coherence in a chaotic environment—and this is precisely what we want from a robust quantum computer. Our work suggests that these oscillations are due to a new physical phenomenon that we called ‘quantum many-body scar’.”

    In everyday life, particles will bounce off one another until they explore the entire space, settling eventually into a state of equilibrium. This process is called thermalisation. A quantum scar is when a special configuration or pathway leaves an imprint on the particles’ state that keeps them from filling the entire space. This prevents the systems from reaching thermalisation and allows them to maintain some quantum effects.

    Dr. Papic said: “We are learning that quantum dynamics can be much more complex and intricate than simply thermalisation. The practical benefit is that extended periods of oscillations are exactly what is needed if quantum computers are to become a reality. The information processed and stored on these computers will be dependent on keeping the atoms in more than one state at any time; it is a constant battle to keep the particles from settling into an equilibrium.”

    Study lead author, Christopher Turner, doctoral researcher at the School of Physics and Astronomy at Leeds, said: “Previous theories involving quantum scars have been formulated for a single particle. Our work has extended these ideas to systems which contain not one but many particles, which are all entangled with each other in complicated ways. Quantum many-body scars might represent a new avenue to realise coherent quantum dynamics.”

    The quantum many-body scars theory sheds light on the quantum states that underpin the strange dynamics of atoms in the Harvard-MIT experiment. Understanding this phenomenon could also pave the way for protecting or extending the lifetime of quantum states in other classes of quantum many-body systems.

    Read more at: https://phys.org/news/2018-05-deeper-quantum-chaos-key.html#jCp

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    U Leeds Campus

    The University, established in 1904, is one of the largest higher education institutions in the UK. We are a world top 100 university and are renowned globally for the quality of our teaching and research. The strength of our academic expertise combined with the breadth of disciplines we cover, provides a wealth of opportunities and has real impact on the world in cultural, economic and societal ways. The University strives to achieve academic excellence within an ethical framework informed by our values of integrity, equality and inclusion, community and professionalism.

     
  • richardmitnick 2:05 pm on April 8, 2018 Permalink | Reply
    Tags: Eight startups selected by IBM to be part of the Q Network, IBM, Q Network partner ecosystem, QISKit an open-source software developer kit

    From HPC Wire: “IBM Expands Quantum Computing Network”

    April 5, 2018
    Tiffany Trader


    IBM is positioning itself as a first mover in establishing the era of commercial quantum computing. The company believes that in order for quantum to work, taming qubits isn’t enough; there needs to be an engaged ecosystem of partners. As part of its strategy to transition from quantum science to what IBM calls quantum-readiness, Big Blue held the first IBM Q Summit in Palo Alto, California, today (April 5), welcoming a group of startups into its quantum network.

    “Membership in the network will enable these startups to run experiments and algorithms on IBM quantum computers via cloud-based access,” explained Jeff Welser, director, IBM Research – Almaden, in a blog post. “Additionally, these startup members will have the opportunity to collaborate with IBM researchers and technical SMEs on potential applications, as well as other IBM Q Network organizations.”

    The Q Network was launched in December in partnership with both industry and academic and government clients, including JP Morgan Chase, Daimler, Samsung, JSR, Barclays, Keio University, Honda, Oak Ridge National Lab, University of Oxford, University of Melbourne, Hitachi Metals and Nagase. Now IBM has brought in these eight industry-leading startups: Cambridge Quantum Computing (CQC), 1QBit, QC Ware, Q-CTRL, Zapata Computing, Strangeworks, QxBranch, and Quantum Benchmark. (Additional info at end of article.)

    Quantum was a major topic of the inaugural IBM Think conference held in Las Vegas last month, where a number of featured speakers shared an optimistic timeline for establishing production usable applications.

    Arvind Krishna, senior vice president, Hybrid Cloud, and director of IBM Research, said he believes IBM will show a practical quantum advantage within five years and it will have built capable machines for that purpose in three-to-five years.

    Krishna hailed a coming era of practical quantum computing. “Quantum computers will help us solve problems that classical computers never could, in areas such as materials, medicines, transportation logistics, and financial risk,” he said during a keynote address.

    IBM has been focused on making the engineering more stable and robust to enable a broader set of users, outside the physics laboratory. “To exploit and win at quantum, you actually have to have a real quantum computer,” said Krishna.

    The community ecosystem is where IBM is distinguishing itself in the tight landscape of quantum competitors, which includes Google, Intel, Microsoft, quantum annealing pioneer D-Wave, and Berkeley-based startup Rigetti.

    IBM has a set of three prototype quantum computers, real quantum devices not simulators, made available through its cloud network, which in just two years has seen 80,000 users run more than 3 million remote executions. There are 5-qubit and 16-qubit quantum systems available to anyone with an internet connection via IBM’s Q Experience platform, and a larger 20-qubit machine for select Q Network partners. IBM has also successfully built an operational prototype 50-qubit processor that will be made available in the next generation IBM Q systems.

    As IBM grows its Q Network partner ecosystem, participating organizations will have various levels of cloud-based access to quantum expertise and resources. This means that not all members will get time on the biggest Q System, but startups in the quantum computing space will get “deeper access to APIs and advanced quantum software tools, libraries and applications, as well as consultation on emerging quantum technologies and applications from IBM scientists, engineers and consultants,” according to Welser.

    The goal of the Q Network is to advance practical applications for business and science and ultimately usher in the commercial quantum era. “We will emerge from this transitional era and enter the era of quantum advantage when we run the first commercial application. It’s not about arbitrary tests or quantum supremacy, it’s very practical,” said Anthony Annunziata, associate director, IBM Q, at last month’s event. “When we can do practical things, we will have achieved the practical era.”

    By making the machines available to a broader community, IBM is seeding the development of a software and user ecosystem. Annunziata stressed the importance of educating and preparing users across organizations for the coming of quantum computing. “It doesn’t matter how much we can abstract away,” he said, “quantum computing is just different. It takes a different mindset and skill set to program a quantum computer, especially to take advantage of it.”

    There are two different ways of programming the IBM Q network machines: a graphical interface with drag-and-drop operations and an open-source software developer kit called QISKit. QISKit, as IBM’s Talia Gershon enthusiastically explained in her keynote talk, makes it possible to entangle two qubits with two lines of code.

    Talia Gershon presenting at IBM Think 2018

    Gershon, senior manager, AI Challenges and Quantum Experiences at IBM, holds that having fundamentally new ways of doing computation will open up a new paradigm in how we approach problems, but first we have to stop “thinking too classically.”

    “Thinking too classically, as my colleague Jay Gambetta says, means you’re trying to apply linear classical logical thinking to understand something quantum and it doesn’t work,” said Gershon. “Thinking too classically is a real problem that hinders progress, so how do we get people to change the way they think? Well, we start in the classroom. When Einstein first discovered relativity I’m sure nobody intuitively got it and understood why it was important, and today it’s in every modern physics classroom in the world.

    “Within five years the same thing will happen with quantum computing. Not only will physics departments offer quantum information classes but computer science departments will offer a quantum track. Electrical engineering departments will teach students about quantum circuits and microwave signal processing and chemistry classes will teach students not only how to simulate molecules on a classical machine but also on a quantum computer.”

    ____________________________________________________________________

    Descriptions of the eight startups selected by IBM to be part of the Q Network:

    • Zapata Computing – Based in Cambridge, Mass., Zapata Computing is a quantum software, applications and services company developing algorithms for chemistry, machine learning, security, and error correction.

    • Strangeworks – Based in Austin, Texas, and founded by William Hurley, Strangeworks is a quantum computing software company designing and delivering tools for software developers and systems management for IT Administrators and CIOs.

    • QxBranch – Headquartered in Washington, D.C., QxBranch delivers advanced data analytics for finance, insurance, energy, and security customers worldwide. QxBranch is developing tools and applications enabled by quantum computing with a focus on machine learning and risk analytics.

    • Quantum Benchmark – Quantum Benchmark is a venture-backed software company led by a team of the top research scientists and engineers in quantum computing, with headquarters in Kitchener-Waterloo, Canada. Quantum Benchmark provides solutions that enable error characterization, error mitigation, error correction and performance validation for quantum computing hardware.

    • QC Ware – Based in Palo Alto, Calif., QC Ware develops hardware-agnostic enterprise software solutions running on quantum computers. QC Ware’s investors include Airbus Ventures, DE Shaw Ventures and Alchemist, and it has relationships with NASA and other government agencies. QC Ware won an NSF grant, and its customers include Fortune 500 industrial and technology companies.

    • Q-CTRL – This Sydney, Australia-based startup’s hardware agnostic platform – Black Opal – gives users the ability to design and deploy the most effective controls to suppress errors in their quantum hardware before they accumulate. Q-CTRL is backed by Main Sequence Ventures and Horizons Ventures.

    • Cambridge Quantum Computing (CQC) – Established in 2014 in the UK, CQC combines expertise in quantum information processing, quantum technologies, artificial intelligence, quantum chemistry, optimization and pattern recognition. CQC designs solutions such as a proprietary platform agnostic compiler that will allow developers and users to benefit from quantum computing even in its earliest forms. CQC also has a growing focus in quantum technologies that relate to encryption and security.

    • 1QBit – Headquartered in Vancouver, Canada, and founded in 2012, 1QBit develops general purpose algorithms for quantum computing hardware. The company’s hardware-agnostic platforms and services are designed to enable the development of applications which scale alongside the advances in both classical and quantum computers. 1QBit is backed by Fujitsu Limited, CME Ventures, Accenture, Allianz and The Royal Bank of Scotland.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    HPCwire is the #1 news and information resource covering the fastest computers in the world and the people who run them. With a legacy dating back to 1987, HPC has enjoyed a legacy of world-class editorial and topnotch journalism, making it the portal of choice selected by science, technology and business professionals interested in high performance and data-intensive computing. For topics ranging from late-breaking news and emerging technologies in HPC, to new trends, expert analysis, and exclusive features, HPCwire delivers it all and remains the HPC communities’ most reliable and trusted resource. Don’t miss a thing – subscribe now to HPCwire’s weekly newsletter recapping the previous week’s HPC news, analysis and information at: http://www.hpcwire.com.

     
  • richardmitnick 12:07 pm on April 6, 2018 Permalink | Reply
    Tags: Australia looks set to become a significant global player in the emerging quantum computing market, IBM, Q-CTRL, Sydney teams with Silicon Valley to boost quantum computing R&D,   

    From the University of Sydney via COSMOS: “Sydney teams with Silicon Valley to boost quantum computing R&D”


    University of Sydney

    COSMOS

    06 April 2018
    Andrew Masterson

    A collaboration between IBM and Sydney quantum researchers could pay rich dividends for Australia’s tech sector. NIGEL TREBLIN / Getty Images.

    Australia looks set to become a significant global player in the emerging quantum computing market after a Sydney-based firm was picked to collaborate with US tech giant IBM to develop the industry.

    Q-CTRL, a company established by physicists working at the University of Sydney, is one of eight start-ups chosen by IBM from a wide field of candidates.

    Researchers at the university recently reported a significant advance in tackling one of the major problems bedevilling quantum computing development – system noise.

    The fundamental basis of the field revolves around entangling quantum bits, or qubits, such that they occupy a superposition comprising two possible information states. The “bits” that define classical computing technology can only ever express one such state (0 or 1, for example), and qubits’ ability to express both simultaneously is the key advantage of a quantum system.

    However, entangled qubits are by nature extremely fragile, and entanglement can be lost (and the information encoded by them destroyed) very easily. In practical terms, a major cause of this decoherence – as the jargon has it – is the noise generated by the very system that produces it in the first place.

    In a paper published in the journal Physical Review Letters earlier this year researchers from the university’s Centre for Engineered Quantum Systems revealed a “hack” – a simple modification to coding – that resulted in an enormous improvement in robustness before decoherence kicks in.

    The scientists reported a stunning 400% increase in the amount of interference the system could stand before breaking down.

    This and related research contributed to Q-CTRL’s appeal for IBM.

    Company founder Michael Biercuk says working with the US tech giant is a natural alliance for it.

    “Working with IBM is a logical step for Q-CTRL to develop real solutions to one of the hardest problems in quantum computing – dealing with hardware error,” he says.

    “As IBM continues to scale-up its quantum computers, we will gain direct access to the company’s most advanced devices and have an opportunity to help solve some of quantum computing’s most vexing challenges.

    “Our techniques are already validated through our ion-trapping laboratory. Working with IBM gives us a new opportunity to test these concepts on a totally different kind of quantum computing hardware.”

    The successful development of working, sustainable quantum computers is widely predicted to transform industry, research, cryptography and, indeed, in time, everyday life.

    IBM is recognised as a leading contender in the field, but it is by no means the only horse in the race. Other tech titans, including Google and Microsoft, are prominent in the field, as are startups such as IonQ and Rigetti.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    University of Sydney campus

    Our founding principle as Australia’s first university was that we would be a modern and progressive institution. It’s an ideal we still hold dear today.

    When William Charles Wentworth proposed the idea of Australia’s first university in 1850, he imagined “the opportunity for the child of every class to become great and useful in the destinies of this country”.

    We’ve stayed true to that original value and purpose by promoting inclusion and diversity for the past 160 years.

    It’s the reason that, as early as 1881, we admitted women on an equal footing to male students. Oxford University didn’t follow suit until 30 years later, and Jesus College at Cambridge University did not begin admitting female students until 1974.

    It’s also why, from the very start, talented students of all backgrounds were given the chance to access further education through bursaries and scholarships.

    Today we offer hundreds of scholarships to support and encourage talented students, and a range of grants and bursaries to those who need a financial helping hand.

     
  • richardmitnick 4:01 pm on December 14, 2017 Permalink | Reply
    Tags: IBM, IBM has announced an initiative to build commercially available “IBM Q” universal quantum computing systems,   

    From HPC Wire: “IBM Launches Commercial Quantum Network with Samsung, ORNL”

    HPC Wire

    December 14, 2017
    Tiffany Trader

    IBM has announced an initiative to build commercially available “IBM Q” universal quantum computing systems.

    IBM’s Q lab at its T.J. Watson research facility. (Connie Zhou/IBM)

    In the race to commercialize quantum computing, IBM is one of several companies leading the pack. Today, IBM announced it had signed JPMorgan Chase, Daimler AG, Samsung and a number of other corporations to its IBM Q Network, which provides online access to IBM’s experimental quantum computing systems. IBM is also establishing regional research hubs at IBM Research in New York, Oak Ridge National Lab in Tennessee, Keio University in Japan, Oxford University in the United Kingdom, and the University of Melbourne in Australia.

    IBM Q system control panel (photo: IBM)

    Twelve organizations in total will be using the IBM prototype quantum computer via the company’s cloud service to accelerate quantum development as they explore a broad set of industrial and scientific applications. Other partners include JSR Corporation, Barclays, Hitachi Metals, Honda, and Nagase.

    Partners currently have access to the 20 qubit IBM Q system, which IBM announced last month, but Big Blue is also building an operational prototype 50 qubit processor, which will be made available in next generation IBM Q systems. The partners will specifically be looking to identify applications that will elicit a quantum advantage, such that they perform better or faster on a quantum machine than a classical one.

    IBM leadership believes we are at the dawn of the commercial quantum era. “The IBM Q Network will serve as a vehicle to make quantum computing more accessible to businesses and organizations through access to the most advanced IBM Q systems and quantum ecosystem,” said Dario Gil, vice president of AI and IBM Q, IBM Research in a statement. “Working closely with our clients, together we can begin to explore the ways big and small quantum computing can address previously unsolvable problems applicable to industries such as financial services, automotive or chemistry. There will be a shared focus on discovering areas of quantum advantage that may lead to commercial, intellectual and societal benefit in the future.”

    Experts from the newly formed IBM Q Consulting will be able to provide support and offer customized roadmaps to help clients become quantum-ready, says IBM.

    With IBM Q, IBM seeks to be the first tech company to deliver commercial universal quantum computing systems for and in tandem with industry and research users. Although today marks the start of its commercial network, IBM has been providing scientists, researchers, and developers with free access to IBM Q processors since May 2016 via the IBM Q Experience. According to the company, 60,000 registered users have collectively run more than 1.7 million experiments and generated over 35 third-party research publications.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    HPCwire is the #1 news and information resource covering the fastest computers in the world and the people who run them. With a legacy dating back to 1987, HPC has enjoyed a legacy of world-class editorial and topnotch journalism, making it the portal of choice selected by science, technology and business professionals interested in high performance and data-intensive computing. For topics ranging from late-breaking news and emerging technologies in HPC, to new trends, expert analysis, and exclusive features, HPCwire delivers it all and remains the HPC communities’ most reliable and trusted resource. Don’t miss a thing – subscribe now to HPCwire’s weekly newsletter recapping the previous week’s HPC news, analysis and information at: http://www.hpcwire.com.

     
  • richardmitnick 4:58 pm on November 14, 2017 Permalink | Reply
    Tags: IBM, Quantum Circuits Company, Robert Schoelkopf is at the forefront of a worldwide effort to build the world’s first quantum computer

    From NYT: “Yale Professors Race Google and IBM to the First Quantum Computer” 


    The New York Times

    NOV. 13, 2017
    CADE METZ

    Prof. Robert Schoelkopf inside a lab at Yale University. Quantum Circuits, the start-up he has created with two of his fellow professors, is located just down the road. Credit Roger Kisby for The New York Times

    Robert Schoelkopf is at the forefront of a worldwide effort to build the world’s first quantum computer. Such a machine, if it can be built, would use the seemingly magical principles of quantum mechanics to solve problems today’s computers never could.

    Three giants of the tech world — Google, IBM, and Intel — are using a method pioneered by Mr. Schoelkopf, a Yale University professor, and a handful of other physicists as they race to build a machine that could significantly accelerate everything from drug discovery to artificial intelligence. So is a Silicon Valley start-up called Rigetti Computing. And though it has remained under the radar until now, those four quantum projects have another notable competitor: Robert Schoelkopf.

    After their research helped fuel the work of so many others, Mr. Schoelkopf and two other Yale professors have started their own quantum computing company, Quantum Circuits.

    Based just down the road from Yale in New Haven, Conn., and backed by $18 million in funding from the venture capital firm Sequoia Capital and others, the start-up is another sign that quantum computing — for decades a distant dream of the world’s computer scientists — is edging closer to reality.

    “In the last few years, it has become apparent to us and others around the world that we know enough about this that we can build a working system,” Mr. Schoelkopf said. “This is a technology that we can begin to commercialize.”

    Quantum computing systems are difficult to understand because they do not behave like the everyday world we live in. But this counterintuitive behavior is what allows them to perform calculations at a rate that would not be possible on a typical computer.

    Today’s computers store information as “bits,” with each transistor holding either a 1 or a 0. But thanks to something called the superposition principle — behavior exhibited by subatomic particles like electrons and photons, the fundamental particles of light — a quantum bit, or “qubit,” can store a 1 and a 0 at the same time. This means two qubits can hold four values at once. As you expand the number of qubits, the machine becomes exponentially more powerful.
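    To make that exponential scaling concrete, here is a minimal Python sketch (illustrative only, not from the article) that builds the set of amplitudes describing n qubits in an equal superposition: a classical n-bit register holds one of 2^n values, while the quantum state carries an amplitude for all 2^n basis states at once.

        import itertools

        def equal_superposition(n_qubits):
            """Amplitudes for n qubits in an equal superposition.

            A classical n-bit register holds exactly one of 2**n values;
            a quantum register is described by an amplitude for every
            one of the 2**n basis states simultaneously.
            """
            dim = 2 ** n_qubits
            amplitude = (1.0 / dim) ** 0.5   # equal weights, probabilities sum to 1
            return {"".join(bits): amplitude
                    for bits in itertools.product("01", repeat=n_qubits)}

        print(equal_superposition(2))        # four basis states: 00, 01, 10, 11
        print(len(equal_superposition(10)))  # 1024 amplitudes -- doubling with every added qubit

    Each added qubit doubles the number of amplitudes a classical simulator has to track, which is the sense in which the machine "becomes exponentially more powerful."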

    Todd Holmdahl, who oversees the quantum project at Microsoft, said he envisioned a quantum computer as something that could instantly find its way through a maze. “A typical computer will try one path and get blocked and then try another and another and another,” he said. “A quantum computer can try all paths at the same time.”

    The trouble is that storing information in a quantum system for more than a short amount of time is very difficult, and this short “coherence time” leads to errors in calculations. But over the past two decades, Mr. Schoelkopf and other physicists have worked to solve this problem using what are called superconducting circuits. They have built qubits from materials that exhibit quantum properties when cooled to extremely low temperatures.

    With this technique, they have shown that, every three years or so, they can improve coherence times by a factor of 10. This is known as Schoelkopf’s Law, a playful ode to Moore’s Law, the rule that says the number of transistors on computer chips will double every two years.
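    As a rough worked example of that rule of thumb (a sketch only; the one-microsecond starting value below is hypothetical, not a figure from the article), a factor-of-10 improvement every three years compounds like this:

        def projected_coherence(t0_us, years):
            """Project coherence time under "Schoelkopf's Law":
            roughly a tenfold improvement every three years."""
            return t0_us * 10 ** (years / 3)

        # Hypothetical starting point: 1 microsecond of coherence today.
        for years in (0, 3, 6, 9):
            print(f"{years} years: {projected_coherence(1.0, years):g} microseconds")
        # 0 years: 1, 3 years: 10, 6 years: 100, 9 years: 1000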

    2
    Professor Schoelkopf, left, and Prof. Michel Devoret working on a device that can reach extremely low temperatures to allow a quantum computing device to function. Credit Roger Kisby for The New York Times

    “Schoelkopf’s Law started as a joke, but now we use it in many of our research papers,” said Isaac Chuang, a professor at the Massachusetts Institute of Technology. “No one expected this would be possible, but the improvement has been exponential.”

    These superconducting circuits have become the primary area of quantum computing research across the industry. One of Mr. Schoelkopf’s former students now leads the quantum computing program at IBM. The founder of Rigetti Computing studied with Michel Devoret, one of the other Yale professors behind Quantum Circuits.

    In recent months, after grabbing a team of top researchers from the University of California, Santa Barbara, Google indicated it is on the verge of using this method to build a machine that can achieve “quantum supremacy” — when a quantum machine performs a task that would be impossible on your laptop or any other machine that obeys the laws of classical physics.

    There are other areas of research that show promise. Microsoft, for example, is betting on particles known as anyons. But superconducting circuits appear likely to be the first systems that will bear real fruit.

    The belief is that quantum machines will eventually analyze the interactions between physical molecules with a precision that is not possible today, something that could radically accelerate the development of new medications. Google and others also believe that these systems can significantly accelerate machine learning, the field of teaching computers to learn tasks on their own by analyzing data or experimenting with certain behaviors.

    A quantum computer could also be able to break the encryption algorithms that guard the world’s most sensitive corporate and government data. With so much at stake, it is no surprise that so many companies are betting on this technology, including start-ups like Quantum Circuits.

    The deck is stacked against the smaller players, because the big-name companies have so much more money to throw at the problem. But start-ups have their own advantages, even in such a complex and expensive area of research.

    “Small teams of exceptional people can do exceptional things,” said Bill Coughran, who helped oversee the creation of Google’s vast internet infrastructure and is now investing in Mr. Schoelkopf’s company as a partner at Sequoia. “I have yet to see large teams inside big companies doing anything tremendously innovative.”

    Though Quantum Circuits is using the same quantum method as its bigger competitors, Mr. Schoelkopf argued that his company has an edge because it is tackling the problem differently. Rather than building one large quantum machine, it is constructing a series of tiny machines that can be networked together. He said this will make it easier to correct errors in quantum calculations — one of the main difficulties in building one of these complex machines.

    But each of the big companies insists that it holds an advantage — and each is loudly trumpeting its progress, even if a working machine is still years away.

    Mr. Coughran said that he and Sequoia envision Quantum Circuits evolving into a company that can deliver quantum computing to any business or researcher that needs it. Another investor, Canaan’s Brendan Dickinson, said that if a company like this develops a viable quantum machine, it will become a prime acquisition target.

    “The promise of a large quantum computer is incredibly powerful,” Mr. Dickinson said. “It will solve problems we can’t even imagine right now.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 1:54 pm on November 14, 2017 Permalink | Reply
    Tags: At a mere 5 nanometers in width the scale of these transistors enables a semiconductor chip the size of a fingernail to hold up to 30 billion of them and this can lead to faster more powerful performa, GLOBALFOUNDRIES, IBM, Samsung, , The more miniscule the transistors the more of them that can be piled into a computer chip   

    From SUNY Polytech: “Researchers create smallest-ever 5-nanometer transistor” 

    suny-poly-bloc

    SUNY Polytechnic Institute

    11.14.17
    No writer credit

    1
    SUNYPoly-IBM Partnership

    Bigger isn’t always better; in the semiconductor business, tiny is tops. That is, the more minuscule the transistors, the more of them that can be packed into a computer chip, leading to faster processing speeds and more sophisticated computational capabilities.

    Researchers from IBM, as well as partners from GLOBALFOUNDRIES and Samsung, have created a transistor at SUNY Polytechnic Institute with components smaller than those of any device currently available in the world. At a mere 5 nanometers in width, the scale of these transistors enables a semiconductor chip the size of a fingernail to hold up to 30 billion of them, which can lead to faster, more powerful performance.
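    As a back-of-the-envelope check on that density figure (a sketch that assumes a fingernail-sized chip is roughly one square centimeter; the area is an assumption, not a number from the article), 30 billion transistors on such a chip works out to a footprint of a few thousand square nanometers per device:

        # Rough density check: 30 billion transistors on an assumed ~1 cm^2 chip.
        chip_area_nm2 = (1e7) ** 2             # 1 cm = 1e7 nm, so 1 cm^2 = 1e14 nm^2
        transistors = 30e9

        area_per_transistor = chip_area_nm2 / transistors   # ~3,300 nm^2 per device
        pitch = area_per_transistor ** 0.5                   # ~58 nm on a side

        print(f"{area_per_transistor:.0f} nm^2 per transistor, about {pitch:.0f} nm on a side")

    The per-device footprint comes out far larger than 5 nm by 5 nm, which reflects that the 5-nanometer figure describes the smallest features of the transistor rather than the full area each device occupies.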

    “To give you an idea of the size of this transistor’s components, a single nanometer is the length of about three or four atoms of gold, so with this 5-nanometer transistor, IBM, GLOBALFOUNDRIES, and Samsung are approaching atomic-scale devices,” said Dr. Bahgat Sammakia, Interim President of SUNY Poly. “It’s a remarkable achievement.”

    To create the 5-nanometer transistor, the researchers had to take a different approach from what is typically used to manufacture semiconductors. Rather than lining up the transistors’ “fins” in rows and columns, the team stacked silicon nanosheets one on top of the other. The result is a transistor that, when scaled up into a chip, will better meet the future demands of artificial intelligence and mobile devices, among other technologies.

    In fact, according to Sammakia, a silicon nanosheet 5-nanometer chip will greatly improve cognitive machines’ abilities to learn and make decisions. Think self-driving cars, for example. In addition, it could help make appliances, devices, or even buildings that are part of the “Internet of Things” more useful and efficient through an expanded ability to rapidly evaluate and make use of data. Further, because of the potential energy savings, the semiconductor chip could help mobile devices maintain their charge two to three times longer.

    Mukesh Khare, vice president of semiconductor technology research at IBM Research, added that IBM’s partnership with SUNY Poly over the years has been essential to future semiconductor technology success. For example, in July 2015, IBM and its partners fabricated the world’s first 7-nanometer test chip in partnership with SUNY Poly. Less than two years later, IBM Research built the industry’s first 5-nanometer device structure at SUNY Poly.

    “Our advanced research and development work is aided by the ongoing support and collaboration of SUNY Poly,” said Khare. “IBM views the existing programs of cooperation between SUNY Poly and IBM as important to our ongoing success.”

    GLOBALFOUNDRIES, too, regards its relationship with SUNY Poly as key to the success of the new transistor.

    “Through our long history of collaboration with IBM and SUNY Poly, we have built a highly successful, globally recognized partnership at the Albany NanoTech Complex and accelerated development of next-generation technology,” said George Gomba, vice president of technology research at GLOBALFOUNDRIES. “Our decades of research has enabled GF to maintain our focus on technology leadership for our customers and partners by helping to address the development challenges central to producing a smaller, faster, more cost efficient generation of semiconductors.”

    In turn, SUNY Poly reaps numerous benefits from its association with these companies.

    “It is beneficial for our faculty members and students to have the opportunity to work with industry leaders,” said Dr. Sammakia. “In fact, many of our students end up working with IBM and GLOBALFOUNDRIES as interns or as employees.”

    According to Sammakia, the 5-nanometer transistor is currently in the small-scale manufacturing phase, and within a few quarters, should transfer to large-scale manufacturing at GLOBALFOUNDRIES.

    “A 5-nanometer chip is a significant milestone for the entire semiconductor industry,” he said, “and we eagerly anticipate its move into the marketplace.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    suny-poly-campus

    The State University of New York Polytechnic Institute, commonly referred to as SUNY Polytechnic Institute or SUNY Poly, is a public research university with campuses in the town of Marcy in the Utica-Rome metropolitan area and Albany, New York. Founded in 1966 using classrooms at a primary school, SUNY Poly is New York’s public polytechnic college. The Marcy campus, formerly the SUNY Institute of Technology, has a Utica, New York mailing address and was established in 1987. The Albany campus was formerly a component of the University at Albany, established in January 2003.

    SUNY Poly is accredited by the Middle States Association of Colleges and Schools. The university offers over 30 bachelor’s degrees, 15 master’s degrees, and three doctoral degrees within five different colleges. SUNY Poly students come from across the state of New York, throughout the United States, and more than twenty other nations. More than 25,000 alumni enjoy successful careers in a wide range of fields.

     