Tagged: MIT Technology Review (US)

  • richardmitnick 12:31 pm on November 1, 2021 Permalink | Reply
    Tags: "High-performance low-cost machine learning infrastructure is accelerating innovation in the cloud", , AWS is building high-performance and low-cost machine learning chips., Cloud machine learning technologies like AWS Inferentia are accelerating organizations’ AI transformation., MIT Technology Review (US)   

    From MIT Technology Review (US) : “High-performance low-cost machine learning infrastructure is accelerating innovation in the cloud” 

    From MIT Technology Review (US)

    November 1, 2021
    By Amazon Web Services

    Cloud machine learning technologies like AWS Inferentia are accelerating organizations’ AI transformation.


    Artificial intelligence and machine learning (AI and ML) are key technologies that help organizations develop new ways to increase sales, reduce costs, streamline business processes, and understand their customers better. AWS helps customers accelerate their AI/ML adoption by delivering powerful compute, high-speed networking, and scalable high-performance storage options on demand for any machine learning project. This lowers the barrier to entry for organizations looking to adopt the cloud to scale their ML applications.

    Developers and data scientists are pushing the boundaries of technology and increasingly adopting deep learning, a type of machine learning based on neural network algorithms. These deep learning models are larger and more sophisticated, and the cost of the underlying infrastructure needed to train and deploy them is rising accordingly.


    To enable customers to accelerate their AI/ML transformation, AWS is building high-performance and low-cost machine learning chips. AWS Inferentia is the first machine learning chip built from the ground up by AWS for the lowest-cost machine learning inference in the cloud. In fact, Amazon EC2 Inf1 instances, powered by Inferentia, deliver 2.3x higher performance and up to 70% lower cost for machine learning inference than current-generation GPU-based EC2 instances. AWS Trainium is AWS’s second machine learning chip; it is purpose-built for training deep learning models and will be available in late 2021.

    Customers across industries have deployed their ML applications in production on Inferentia and seen significant performance improvements and cost savings. For example, Airbnb’s customer support platform enables intelligent, scalable, and exceptional service experiences for its community of millions of hosts and guests across the globe. Airbnb used Inferentia-based EC2 Inf1 instances to deploy the natural language processing (NLP) models behind its chatbots, and saw a 2x improvement in performance out of the box over GPU-based instances.

    With these innovations in silicon, AWS is enabling customers to train and execute their deep learning models in production easily with high performance and throughput at significantly lower costs.

    Machine learning challenges speed shift to cloud-based infrastructure

    Machine learning is an iterative process that requires teams to build, train, and deploy applications quickly, as well as train, retrain, and experiment frequently to increase the prediction accuracy of the models. When deploying trained models into their business applications, organizations need to also scale their applications to serve new users across the globe. They need to be able to serve multiple requests coming in at the same time with near real-time latency to ensure a superior user experience.

    Emerging use cases such as object detection, natural language processing (NLP), image classification, conversational AI, and time series data rely on deep learning technology. Deep learning models are exponentially increasing in size and complexity, going from having millions of parameters to billions in a matter of a couple of years.
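    To make that scale concrete, here is a rough back-of-envelope sketch in Python of the memory needed just to hold a model’s weights. The parameter counts are illustrative and not tied to any particular model.

```python
# Back-of-envelope: memory needed just to hold model weights, ignoring
# activations, optimizer state, and batching. Parameter counts are
# illustrative, not tied to any particular model.
def weight_memory_gb(params: float, bytes_per_param: int = 4) -> float:
    """Gigabytes needed to store `params` weights at float32 precision."""
    return params * bytes_per_param / 1e9

for name, params in [("1M-parameter model", 1e6),
                     ("100M-parameter model", 1e8),
                     ("10B-parameter model", 1e10)]:
    print(f"{name}: ~{weight_memory_gb(params):.2f} GB at float32")

# A 10B-parameter model needs ~40 GB for weights alone, more than a single
# commodity GPU holds, which is where serving costs start to snowball.
```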

    Training and deploying these complex and sophisticated models translates to significant infrastructure costs. Costs can quickly snowball to become prohibitively large as organizations scale their applications to deliver near real-time experiences to their users and customers.

    This is where cloud-based machine learning infrastructure services can help. The cloud provides on-demand access to compute, high-performance networking, and large data storage, seamlessly combined with ML operations and higher level AI services, to enable organizations to get started immediately and scale their AI/ML initiatives.

    How AWS is helping customers accelerate their AI/ML transformation

    AWS Inferentia and AWS Trainium aim to democratize machine learning and make it accessible to developers irrespective of experience and organization size. Inferentia’s design is optimized for high performance, throughput, and low latency, which makes it ideal for deploying ML inference at scale.

    Each AWS Inferentia chip contains four NeuronCores that implement a high-performance systolic array matrix multiply engine, which massively speeds up typical deep learning operations such as convolution and transformers. NeuronCores are also equipped with a large on-chip cache, which helps cut down on external memory accesses, reducing latency and increasing throughput.
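    Much of the speedup from engines like this comes from data reuse. As a loose illustration of the principle only (this sketch does not model the actual NeuronCore design), a blocked matrix multiply loads each tile of the inputs once into fast memory and reuses it for many multiply-accumulates, cutting trips to external memory:

```python
# Illustrative only: a blocked (tiled) matrix multiply in NumPy. Hardware
# matmul engines gain speed from this kind of data reuse; this toy does
# not model the actual NeuronCore design.
import numpy as np

def tiled_matmul(a: np.ndarray, b: np.ndarray, tile: int = 64) -> np.ndarray:
    n, k = a.shape
    k2, m = b.shape
    assert k == k2, "inner dimensions must match"
    c = np.zeros((n, m), dtype=a.dtype)
    for i in range(0, n, tile):
        for j in range(0, m, tile):
            for p in range(0, k, tile):
                # One tile of work: small enough to sit in fast on-chip
                # memory, so the inner multiply-accumulates avoid DRAM.
                c[i:i+tile, j:j+tile] += a[i:i+tile, p:p+tile] @ b[p:p+tile, j:j+tile]
    return c

a = np.random.rand(256, 256).astype(np.float32)
b = np.random.rand(256, 256).astype(np.float32)
assert np.allclose(tiled_matmul(a, b), a @ b, atol=1e-3)
```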

    AWS Neuron, the software development kit for Inferentia, natively supports leading ML frameworks like TensorFlow and PyTorch. Developers can continue using the same frameworks and lifecycle development tools they know and love. Many trained models can be compiled and deployed on Inferentia by changing just a single line of code, with no additional application code changes.
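    For PyTorch users, that single-line change is the compilation call provided by the torch-neuron package. The sketch below follows the pattern in AWS’s Neuron documentation; the choice of ResNet-50 and the output file name are illustrative assumptions, not a prescribed workflow.

```python
# Minimal sketch of compiling a trained PyTorch model for Inferentia with
# AWS Neuron (torch-neuron). Model choice and file name are illustrative.
import torch
import torch_neuron  # registers the torch.neuron namespace
from torchvision import models

model = models.resnet50(pretrained=True)
model.eval()

# An example input with the shape the model expects, used for tracing.
image = torch.zeros([1, 3, 224, 224], dtype=torch.float32)

# The "single line" change: trace/compile the model for NeuronCores.
model_neuron = torch.neuron.trace(model, example_inputs=[image])

# Save the compiled artifact; it loads with torch.jit.load on an Inf1 instance.
model_neuron.save("resnet50_neuron.pt")
```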

    The result is a high-performance inference deployment that can easily scale while keeping costs under control.

    Sprinklr, a software-as-a-service company, has an AI-driven unified customer experience management platform that enables companies to gather and translate real-time customer feedback across multiple channels into actionable insights. This results in proactive issue resolution, enhanced product development, improved content marketing, and better customer service. Sprinklr used Inferentia to deploy its NLP and some of its computer vision models and saw significant performance improvements.

    Several Amazon services also deploy their machine learning models on Inferentia.

    Amazon Prime Video uses computer vision ML models to analyze the video quality of live events to ensure an optimal viewer experience for Prime Video members. It deployed its image classification models on EC2 Inf1 instances and saw a 4x improvement in performance and up to 40% savings in cost compared with GPU-based instances.

    Another example is Amazon Alexa’s AI- and ML-based intelligence, powered by Amazon Web Services, which is available on more than 100 million devices today. Alexa’s promise to customers is that it is always becoming smarter, more conversational, more proactive, and even more delightful. Delivering on that promise requires continuous improvements in response times and machine learning infrastructure costs. By deploying Alexa’s text-to-speech ML models on Inf1 instances, the team lowered inference latency by 25% and cost-per-inference by 30%, enhancing the service experience for the tens of millions of customers who use Alexa each month.

    Unleashing new machine learning capabilities in the cloud

    As companies race to future-proof their businesses with the best digital products and services, no organization can afford to fall behind on deploying sophisticated machine learning models to innovate on customer experience. Over the past few years, there has been an enormous increase in the applicability of machine learning to a variety of use cases, from personalization and churn prediction to fraud detection and supply chain forecasting.

    Luckily, machine learning infrastructure in the cloud is unleashing new capabilities that were previously not possible, making it far more accessible to non-expert practitioners. That’s why AWS customers are already using Inferentia-powered Amazon EC2 Inf1 instances to provide the intelligence behind their recommendation engines and chatbots and to get actionable insights from customer feedback.

    With AWS cloud-based machine learning infrastructure options suitable for various skill levels, it’s clear that any organization can accelerate innovation and embrace the entire machine learning lifecycle at scale. As machine learning continues to become more pervasive, organizations are now able to fundamentally transform the customer experience—and the way they do business—with cost-effective, high-performance cloud-based machine learning infrastructure.

    Learn more about how AWS’s machine learning platform can help your company innovate here.

    See the full article here.



    Please help promote STEM in your local schools.

    STEM Education Coalition

    The mission of MIT Technology Review (US) is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 10:26 am on October 22, 2021 Permalink | Reply
    Tags: "How AI is reinventing what computers are", Chipmakers like Intel and Arm and Nvidia which supplied many of the first GPUs are pivoting to make hardware tailored specifically for AI., Google’s latest offering-the Pixel 6-is the first phone to have a separate chip dedicated to AI that sits alongside its standard processor., MIT Technology Review (US), The chip that runs the iPhone has for the last couple of years contained what Apple calls a “neural engine” also dedicated to AI., The core of computing is changing from number-crunching to decision-­making.   

    From MIT Technology Review (US) : “How AI is reinventing what computers are” 

    From MIT Technology Review (US)

    October 22, 2021
    Will Douglas Heaven

    Credit: Andrea Daquino.

    Fall 2021: the season of pumpkins, pecan pies, and peachy new phones. Every year, right on cue, Apple, Samsung, Google, and others drop their latest releases. These fixtures in the consumer tech calendar no longer inspire the surprise and wonder of those heady early days. But behind all the marketing glitz, there’s something remarkable going on.

    Google’s latest offering, the Pixel 6, is the first phone to have a separate chip dedicated to AI that sits alongside its standard processor. And the chip that runs the iPhone has for the last couple of years contained what Apple calls a “neural engine,” also dedicated to AI. Both chips are better suited to the types of computations involved in training and running machine-learning models on our devices, such as the AI that powers your camera. Almost without our noticing, AI has become part of our day-to-day lives. And it’s changing how we think about computing.

    What does that mean? Well, computers haven’t changed much in 40 or 50 years. They’re smaller and faster, but they’re still boxes with processors that run instructions from humans. AI changes that on at least three fronts: how computers are made, how they’re programmed, and how they’re used. Ultimately, it will change what they are for.

    “The core of computing is changing from number-crunching to decision-making,” says Pradeep Dubey, director of the parallel computing lab at Intel. Or, as MIT CSAIL director Daniela Rus puts it, AI is freeing computers from their boxes.

    More haste, less speed

    The first change concerns how computers—and the chips that control them—are made. Traditional computing gains came as machines got faster at carrying out one calculation after another. For decades the world benefited from chip speed-ups that came with metronomic regularity as chipmakers kept up with Moore’s Law.

    But the deep-learning models that make current AI applications work require a different approach: they need vast numbers of less precise calculations to be carried out all at the same time. That means a new type of chip is required: one that can move data around as quickly as possible, making sure it’s available when and where it’s needed. When deep learning exploded onto the scene a decade or so ago, there were already specialty computer chips available that were pretty good at this: graphics processing units, or GPUs, which were designed to display an entire screenful of pixels dozens of times a second.


    Now chipmakers like Intel, Arm, and Nvidia, which supplied many of the first GPUs, are pivoting to make hardware tailored specifically for AI. Google and Facebook are also forcing their way into this industry for the first time, in a race to find an AI edge through hardware.

    For example, the chip inside the Pixel 6 is a new mobile version of Google’s tensor processing unit, or TPU. Unlike traditional chips, which are geared toward ultrafast, precise calculations, TPUs are designed for the high-volume but low-precision calculations required by neural networks. Google has used these chips in-house since 2015: they process people’s photos and natural-language search queries. Google’s sister company DeepMind uses them to train its AIs.
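    The trade-off is easy to demonstrate. In the toy NumPy sketch below, float16 stands in for the bfloat16 format TPUs actually use (NumPy has no bfloat16), and the matrices are random; halving the precision halves the bytes moved per value while introducing small errors that neural networks typically tolerate:

```python
# Toy demonstration of the precision-for-volume trade-off. float16 stands
# in for bfloat16, which NumPy lacks; purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((512, 512)).astype(np.float32)
b = rng.standard_normal((512, 512)).astype(np.float32)

exact = a @ b
approx = (a.astype(np.float16) @ b.astype(np.float16)).astype(np.float32)

print("bytes per value: float32 =", np.float32().nbytes,
      "| float16 =", np.float16().nbytes)
print("max relative error at half precision:",
      float(np.max(np.abs(approx - exact) / (np.abs(exact) + 1e-6))))
```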

    In the last couple of years, Google has made TPUs available to other companies, and these chips—as well as similar ones being developed by others—are becoming the default inside the world’s data centers.

    AI is even helping to design its own computing infrastructure. In 2020, Google used a reinforcement-learning algorithm—a type of AI that learns how to solve a task through trial and error—to design the layout of a new TPU. The AI eventually came up with strange new designs that no human would think of—but they worked. This kind of AI could one day develop better, more efficient chips.

    Show, don’t tell

    The second change concerns how computers are told what to do. For the past 40 years we have been programming computers; for the next 40 we will be training them, says Chris Bishop, head of Microsoft Research in the UK.

    Traditionally, to get a computer to do something like recognize speech or identify objects in an image, programmers first had to come up with rules for the computer.

    With machine learning, programmers no longer write rules. Instead, they create a neural network that learns those rules for itself. It’s a fundamentally different way of thinking.
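    A toy example makes the contrast concrete. Below, a hand-written rule and a tiny perceptron solve the same task: deciding whether a point lies above the line y = x. The task, data, and training loop are all an illustrative sketch, not anyone’s production system.

```python
# Two ways to "program" the same behavior: an explicit rule vs. a model
# that learns the rule from labeled examples. Entirely illustrative.
import numpy as np

def rule_based(point):
    # Traditional programming: a human writes the rule explicitly.
    return point[1] > point[0]

# Machine learning: show examples and let a tiny perceptron find the rule.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
labels = (X[:, 1] > X[:, 0]).astype(float)

w, b = np.zeros(2), 0.0
for _ in range(50):                       # a few passes of perceptron updates
    for x, y in zip(X, labels):
        pred = float(w @ x + b > 0)
        w += (y - pred) * x               # nudge the weights toward the answer
        b += (y - pred)

test = np.array([0.2, 0.7])
print("rule says:", rule_based(test),
      "| learned model says:", bool(w @ test + b > 0))
```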

    Examples of this are already commonplace: speech recognition and image identification are now standard features on smartphones. Other examples made headlines, as when AlphaZero taught itself to play Go better than humans. Similarly, AlphaFold cracked open a biology problem—working out how proteins fold—that people had struggled with for decades.

    For Bishop, the next big breakthroughs are going to come in molecular simulation: training computers to manipulate the properties of matter, potentially making world-changing leaps in energy usage, food production, manufacturing, and medicine.

    Breathless promises like this are made often. It is also true that deep learning has a track record of surprising us. Two of the biggest leaps of this kind so far—getting computers to behave as if they understand language and to recognize what is in an image—are already changing how we use them.

    Computer knows best

    For decades, getting a computer to do something meant typing in a command, or at least clicking a button.

    Machines no longer need a keyboard or screen for humans to interact with. Anything can become a computer. Indeed, most household objects, from toothbrushes to light switches to doorbells, already come in a smart version. But as they proliferate, we are going to want to spend less time telling them what to do. They should be able to work out what we need without being told.

    This is the shift from number-crunching to decision-making that Dubey sees as defining the new era of computing.

    Rus wants us to embrace the cognitive and physical support on offer. She imagines computers that tell us things we need to know when we need to know them and intervene when we need a hand. “When I was a kid, one of my favorite movie [scenes] in the whole world was ‘The Sorcerer’s Apprentice,’” says Rus. “You know how Mickey summons the broom to help him tidy up? We won’t need magic to make that happen.”

    We know how that scene ends. Mickey loses control of the broom and makes a big mess. Now that machines are interacting with people and integrating into the chaos of the wider world, everything becomes more uncertain. The computers are out of their boxes.

    See the full article here.



    Please help promote STEM in your local schools.

    STEM Education Coalition

    The mission of MIT Technology Review (US) is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 9:05 am on September 4, 2021 Permalink | Reply
    Tags: "Building a better chemical factory—out of microbes", , , , Bioprocess engineering, , , , , , Glucaric acid, Metabolic engineering, Metabolite valve, , MIT Technology Review (US), ,   

    From MIT Technology Review (US) : “Building a better chemical factory—out of microbes” 

    From MIT Technology Review (US)

    August 24, 2021
    Leigh Buchanan

    Credit: Sasha Israel.

    Professor Kristala Jones Prather ’94 has made it practical to turn microbes into efficient producers of desired chemicals. She’s also working to reduce our dependence on petroleum.

    Metabolic engineers have a problem: cells are selfish. The scientists want to use microbes to produce chemical compounds for industrial applications. The microbes prefer to concentrate on their own growth.

    Kristala L. Jones Prather ’94 has devised a tool that satisfies both conflicting objectives. Her metabolite valve acts like a train switch: it senses when a cell culture has reproduced enough to sustain itself and then redirects metabolic flux—the movement of molecules in a pathway—down the track that synthesizes the desired compound. The results: greater yield of the product and sufficient cell growth to keep the culture healthy and productive.

    William E. Bentley, a professor of bioengineering at The University of Maryland (US), has been following Prather’s work for more than two decades. He calls the valves “a new principle in engineering” that he anticipates will be highly valued in the research community. Their ability to eliminate bottlenecks can prove so essential to those attempting to synthesize a particular molecule in useful quantities that “in many cases it might decide whether it is a successful endeavor or not,” says Bentley.

    Prather, The Massachusetts Institute of Technology (US)’s Arthur D. Little Professor of Chemical Engineering, labors in the intersecting fields of synthetic biology and metabolic engineering: a place where science, rather than art, imitates life. The valves play a major role in her larger goal of programming microbes—chiefly E. coli—to produce chemicals that can be used in a wide range of fields, including energy and medicine. She does that by observing what nature can do. Then she hypothesizes what it should be able to do with an assist from strategically inserted DNA.

    “We are increasing the synthetic capacity of biological systems,” says Prather, who made MIT Technology Review’s TR35 list in 2007. “We need to push beyond what biology can naturally do and start getting it to make compounds that it doesn’t normally make.”

    Prather describes her work as creating a new kind of chemical factory inside microbial cells—one that makes ultra-pure compounds efficiently at scale. Coaxing microbes into producing desired compounds is safer and more environmentally friendly than relying on traditional chemical synthesis, which typically involves high temperatures, high pressures, and complicated instrumentation—and, often, toxic by-products. She didn’t originate the idea of turning microbes into chemical factories, but her lab is known for developing tools and fine-tuning processes that make it efficient and practical.

    That’s the approach she has taken with glucaric acid, which has multiple commercial applications, some of them green. Water treatment plants, for example, have long relied on phosphates to prevent corrosion in pipes and to bind with metals like lead and copper so they don’t leach into the water supply. But phosphates also feed algae blooms in lakes and oceans. Glucaric acid does the same work as phosphates without feeding those toxic blooms.

    Producing glucaric acid the usual way—through chemical oxidation of glucose—is expensive, often yields product that isn’t very pure, and creates a lot of hazardous waste. Prather’s microbial factories produce it with high levels of purity and without the toxic by-products, at a reasonable cost. She cofounded the startup Kalion in 2011 to put her microbial-factory approach into practice. (Prather is Kalion’s chief science officer. Her husband, Darcy Prather ’91, is its president.)

    The company, which is lining up large-scale production in Slovakia, has several prospective customers. Although the largest of these are in oil services, “it also turns out, in the wonderful, wacky way chemistry works, that the same compound is used in pharmaceutical manufacturing,” Prather says. It’s required, for example, in production of the ADHD drug Adderall. And it can be used to make textiles stronger, which could lead to more effective recycling of cotton and other natural materials.

    Kalion’s first target is phosphates, because of their immediate commercial applications. But in her wider research, Prather has also drawn a great big bull’s-eye on petroleum. Eager to produce greener alternatives to gasoline and plastics, she and her research group at MIT are using bacteria to synthesize molecules that would normally be derived from petroleum. “Big picture, if we are successful,” Prather says, “what we are doing is moving things one by one off the shelf to say, ‘That no longer is made from petroleum. That now is made from biomass.’”

    From East Texas to MIT

    Born in Cincinnati, Prather grew up in Longview, Texas, against a backdrop of oilfield pumps and derricks. Her father died before she turned two. Her mother worked at Wiley College, a small, historically Black school—and earned a bachelor’s degree there herself in 2004, Prather is quick to add.

    Her high school’s first valedictorian of color, Prather had only vague ideas about academic and professional opportunities outside her state. With college brochures flooding the family’s mailbox in her junior year, she sought advice from a history teacher. “Math was my favorite subject in high school, and I was enjoying chemistry,” says Prather. The teacher told her that math plus chemistry equaled chemical engineering, and that if she wanted to be an engineer she should go to The Massachusetts Institute of Technology (US). “What’s MIT?” asked Prather.

    Others in the community were no better informed. What was then the DeVry Institute of Technology, a for-profit school with a less-than-stellar academic reputation and campuses around the country, was advertising heavily on television. When she told people she was going to MIT, they assumed it was a DeVry branch in Massachusetts. “They were disappointed, because they thought I was going to do great things,” says Prather. “But here I was going to this trade school to be a plumber’s assistant.”

    In June 1990 Prather arrived on campus to participate in Interphase, a program offered through MIT’s Office of Minority Education. Designed to ease the transition for incoming students, Interphase “was a game-changer,” says Prather. The program introduced her to an enduring group of friends and familiarized her with the campus. Most important, it instilled confidence. Coming from a school without AP classes, Prather had worried about starting off behind the curve. When she found she knew the material in her Interphase math class, it came as a relief. “When I was bored, I thought, ‘I belong here,’” she says.

    As an undergraduate Prather was exposed to bioprocess engineering, which uses living cells to induce desired chemical or physical changes in a material. At that time scientists treated the cells from which the process starts as something fixed. Prather became intrigued by the idea that you could engineer not only the process but also the biology of the cell itself. “The way you could copy and cut and paste DNA appealed to the part of me that liked math,” she says.

    After graduating in 1994, Prather got her PhD at The University of California-Berkeley (US), where her advisor was Jay Keasling, a professor of chemical and biomolecular engineering who was at the forefront of the new field of synthetic biology. At Berkeley, Prather sought ways to move DNA in and out of cells to optimize the creation of desirable proteins.

    The practice at that time was to bulk up cells with lots of DNA, which would in turn produce lots of protein, which would generate lots of the desired chemical compound. But there was a problem, which Prather—who lives near a scenic state park—explains with a local analogy. “I can go for a light hike in the Blue Hills Reservation,” she says, “but not if you put a 50-pound pack on my back.” Similarly, an overloaded cell “can sometimes respond by saying, ‘I am too tired.’” Prather’s doctoral thesis explored systems that efficiently produce a lot of a desired chemical using less DNA.

    In her fourth year at Berkeley, Prather received a fellowship from DuPont and traveled to Delaware for her first full-length presentation. Following standard conference practice, she laid out for her audience the three motivations underlying her research. Afterward, one of the company’s scientists politely explained to her why all three were misguided. “He said, ‘What you are doing is interesting and important, but you are motivated by what you think is important in industry,’” says Prather. “‘And we just don’t care about any of that stuff.’”

    Humbled, Prather decided a sojourn in the corporate world would reduce the risk that her academic career would be consigned to real-world irrelevance. She spent the next four years at Merck, in a group developing processes to make things like therapeutic proteins and vaccines. There she learned about the kinds of projects and problems that matter most to practitioners like her DuPont critic.

    Merck employed hordes of chemists to produce large quantities of chemical compounds for use in new drugs. When part of that process seemed better suited to biology than to chemistry, they would hand it off to the department Prather worked in, which used enzymes to perform the next step. “They were typically not very complicated reactions,” says Prather. “A single step converting A to B.”

    Prather was intrigued by the possibility of performing not just individual steps but the entire chemical synthesis within cells, using chains of reactions called metabolic pathways. That work inspired what would become some of her most acclaimed research at MIT, where she joined the faculty in 2004.

    Finding the production switch

    It wasn’t long after returning to MIT that this young woman from the Texas oil patch took aim at fossil fuels and their by-­products. Many of her lab’s projects focus on replacing petroleum as a feedstock. In one—a collaboration with MIT colleagues Brad Olsen ’03, a chemical engineer, and Desiree Plata, PhD ’09, a civil and environmental engineer—Prather is using biomass to create renewable polymers that could lead to a greener kind of plastic. Her lab is figuring out how to induce microbes to convert sugar from plants into monomers that can then be chemically converted into polymers to create plastic. At the end of the plastic’s usable life, it biodegrades and turns back into nutrients. Those nutrients “will give you more plants from which you can extract more sugar that you can turn into new chemicals to go into new plastics,” says Prather. “It’s the circle of life there.”

    These days she is drawing the most attention for her research in optimizing metabolic pathways—research that she and other scientists can then use to maximize the yield of a desired product.

    The challenge is that cells prioritize the use of nutrients, such as glucose, to grow rather than to manufacture these desirable compounds. More growth for the cell means less product for the scientist. “So you run into a competition problem,” says Prather.

    Take glucaric acid, the chemical produced by Prather’s company—and one that Keasling says is extremely important to industry. (“These molecules are not trivial to produce, particularly at the levels that are needed industrially,” he says.) Prather and her lab had been adding three genes—drawn from mice, yeast, and a bacterium—to E. coli, enabling the bacteria to transform a type of simple sugar into glucaric acid. But the bacteria also needed that sugar for a metabolic pathway that breaks down glucose to feed its own growth and reproduction.

    Prather’s team wanted to shut down the pathway nourishing growth and divert the sugar into a pathway producing glucaric acid—but only after the bacterial culture had grown enough to sustain itself as a productive chemical factory. To do so they used quorum sensing, a kind of communication through which bacteria share information about changes in the number of cells in their colony, which allows them to coordinate colony-wide functions such as gene regulation. The team engineered each cell to produce a protein that then creates a molecule called AHL. When quorum sensing detects a certain amount of AHL—the amount produced in the time it takes for the culture to reach a sustainable size—it activates a switch that turns off production of an enzyme that is part of the glucose breakdown process. The glucose shifts to the chemical-synthesis pathway, greatly increasing the amount of glucaric acid produced.
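    A toy simulation can make that valve logic concrete. In the hypothetical sketch below, every rate constant and threshold is invented for illustration (this is not Prather’s model): the culture grows logistically, AHL accumulates with cell density, and once AHL crosses the quorum threshold the glucose flux flips from growth to glucaric acid synthesis.

```python
# Hypothetical toy model of a quorum-sensing metabolite valve. All
# parameters are invented for illustration; this is not Prather's model.
dt, steps = 0.1, 500
cells, ahl, product = 0.01, 0.0, 0.0
AHL_THRESHOLD, CAPACITY = 0.5, 1.0

history = []
for _ in range(steps):
    valve_open = ahl >= AHL_THRESHOLD           # quorum reached: flip the switch
    growth_share = 0.05 if valve_open else 1.0  # most glucose now goes to product
    cells += dt * 0.8 * growth_share * cells * (1 - cells / CAPACITY)
    ahl += dt * cells                           # each cell steadily emits AHL
    if valve_open:
        product += dt * 0.6 * cells
    history.append((cells, ahl, product))

switch_step = next(i for i, (_, a, _) in enumerate(history) if a >= AHL_THRESHOLD)
print(f"valve opened at t={switch_step * dt:.1f}; final culture={cells:.2f}, "
      f"glucaric acid={product:.2f} (arbitrary units)")
```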

    Prather’s switches, called metabolite valves, are now used in processes that harness microbes to produce a wide range of desired chemicals. The valves open or close in response to changes in the density of particular molecules in a pathway. These switches can be fine-tuned to optimize production without compromising the health of the bacteria, dramatically increasing output. The researchers’ flagship paper, published in Nature Biotechnology in 2017, has been cited almost 200 times. The goal at this point is to step up the scale.

    Like many of the mechanisms Prather uses in her research, such switches already exist in biology. Cells whose resources are threatened by neighboring foreign cells will switch from growth mode to producing antibiotics to kill off their competitors, for example. “Cells that make things like antibiotics have a natural way of first making more of themselves, then putting their resources into making product,” she says. “We developed a synthetic way of mimicking nature.”

    Prather’s Berkeley advisor, Keasling, has been using a derivative of the switch inspired by her research. “The tool for channeling metabolic flux—the flow of material through a metabolic pathway—is super-important work that I think will be widely used in the future by metabolic engineers,” he says. “When Kristala publishes something, you know it is going to work.”

    Mentoring young scientists

    Prather receives at least as much recognition for teaching and mentoring as for her research. “She cares deeply about education and is invested in her students in a way that really stands out,” says Keasling. Students describe her optimism and supportiveness, saying that she motivates without commanding. “She created an environment where I was free to make my own mistakes and learn and grow,” says Kevin V. Solomon, SM ’08, PhD ’12, who studied with Prather between 2007 and 2012 and is now an assistant professor of chemical and biomedical engineering at The University of Delaware (US). In some other labs, he notes, “you have hard deadlines, and you perform or you freak out.”

    It was at Merck that Prather realized how much she loves working with young scientists—and it was also where she assembled the management arsenal she uses to run her lab. So, for example, she makes sure to get to know each student’s preferences about communication style, because “treating everyone fairly is not the same as treating everyone the same,” she says. One-on-one meetings commence with a few minutes of chat about general topics, so Prather can suss out students’ states of mind and make sure they are okay. She sets clear standards, intent on avoiding the uncertainty about expectations that is common in academic labs. And when students do raise concerns, “it is important to document and confirm that they have been heard,” she says.

    The most effective leaders model the behaviors they want to see in others. Prather, who received MIT’s Martin Luther King Leadership Award in 2017, expects commitment and high performance from her grad students and postdocs, but not at the cost of their physical or mental health. She discourages working on weekends—to the extent that is possible in biology—and insists that lab members take vacations. And from the beginning she has demonstrated that it is possible to simultaneously do first-class science and have a personal life.

    Prather’s two daughters were both campus kids. She was 31, with a two-month-old baby, when she joined the faculty, and she would nurse her daughter in her office before leaving her at the Institute’s new infant-care facility. Later, she set up a small table and chairs near her desk as a play area. The children have accompanied her on work trips—Prather and her husband took turns watching them when they were small—and frequently attend their mother’s evening and weekend events. Prather recalls turning up for a presentation in 32-123 with both children in tow and setting them up with snacks in the front row. “My daughter promptly dropped the marinara sauce to go with her mozzarella sticks on the floor,” she says. “I was on my hands and knees wiping up red sauce 15 minutes before giving a talk.”

    Prather does set boundaries. She turns down almost every invitation for Friday nights, which is family time. Trips are limited to two a month, and she won’t travel on any family member’s birthday or on her anniversary. But she also welcomes students into her home, where she hosts barbecues and Thanksgiving dinners for anyone without a place to go. “I bring them into my home and into my life,” she says.

    When Solomon was Prather’s student, she even hosted his parents. That hospitality continued after he graduated, when he and his mother ran into his former professor at a conference in Germany. “She graciously kept my mom occupied because she knew I was networking to further my career,” says Solomon.

    It was an act in keeping with Prather’s priorities. Beyond the innovations, beyond the discoveries, her overarching objective is to create independently successful scientists. “The most important thing we do as scientists is to train students and postdocs,” she says. “If your students are well trained and ready to advance knowledge—even if the thing we are working on goes nowhere—to me that is a win.”

    On being Black at MIT: Bearing witness to racism

    As a student at MIT, Kristala Jones Prather ’94 was never the target of racist behavior. But she says other Black students weren’t so lucky. Even though no one challenged her directly, “there was a general atmosphere on campus that questioned the validity of my existence,” she says. Articles in The Tech claimed that affirmative action was diluting the quality of the student pool.

    During her junior year, a group standing on the roof of a frat hurled racial slurs at Black students walking back to their dorm. In response, Prather and another student collaborated with Clarence G. Williams, HM ’09, special assistant to the president, to produce a documentary called It’s Intuitively Obvious about the experience of Black students at MIT.

    “I was involved in a lot of activism to create a climate where students didn’t have to be subjected to the notion that MIT was doing charity,” says Prather. Rather, “it was providing an opportunity for students who had demonstrated their capacity to represent the institution proudly.”

    Prather’s decision to return to MIT as a faculty member was difficult, in part because her Black former classmates, many of whom had experienced overt racism, were discouraging their own children from attending. She worried, too, that she wouldn’t be able to avoid personal attacks this time around. “I didn’t want all the positive feelings I had about MIT to be ruined,” she says.

    Those fears turned out to be unfounded. Prather says she has received tremendous support from her department head and colleagues, as well as abundant leadership opportunities. But she recognizes that not all her peers can say the same. She is guardedly optimistic about the Institute’s current diversity initiative. “We are making progress,” she says. “I am waiting to see if there’s a real commitment to creating an environment where students of color can feel like the Institute is a home for them.”

    See the full article here.



    Please help promote STEM in your local schools.

    STEM Education Coalition

    The mission of MIT Technology Review (US) is to equip its audiences with the intelligence to understand a world shaped by technology.

     