Tagged: Computer Science

  • richardmitnick 9:05 pm on February 17, 2019
    Tags: "The Secret History of Women in Coding", Computer Science

    From The New York Times: Women In STEM – “The Secret History of Women in Coding”

    From The New York Times

    Feb. 13, 2019
    Clive Thompson

    Computer programming once had much better gender balance than it does today. What went wrong?

    Mary Allen Wilkes with a LINC at M.I.T., where she was a programmer. Credit Joseph C. Towler, Jr.

    As a teenager in Maryland in the 1950s, Mary Allen Wilkes had no plans to become a software pioneer — she dreamed of being a litigator. One day in junior high in 1950, though, her geography teacher surprised her with a comment: “Mary Allen, when you grow up, you should be a computer programmer!” Wilkes had no idea what a programmer was; she wasn’t even sure what a computer was. Relatively few Americans were. The first digital computers had been built barely a decade earlier at universities and in government labs.

    By the time she was graduating from Wellesley College in 1959, she knew her legal ambitions were out of reach. Her mentors all told her the same thing: Don’t even bother applying to law school. “They said: ‘Don’t do it. You may not get in. Or if you get in, you may not get out. And if you get out, you won’t get a job,’ ” she recalls. If she lucked out and got hired, it wouldn’t be to argue cases in front of a judge. More likely, she would be a law librarian, a legal secretary, someone processing trusts and estates.

    But Wilkes remembered her junior high school teacher’s suggestion. In college, she heard that computers were supposed to be the key to the future. She knew that the Massachusetts Institute of Technology had a few of them.


    So on the day of her graduation, she had her parents drive her over to M.I.T. and marched into the school’s employment office. “Do you have any jobs for computer programmers?” she asked. They did, and they hired her.

    It might seem strange now that they were happy to take on a random applicant with absolutely no experience in computer programming. But in those days, almost nobody had any experience writing code. The discipline did not yet really exist; there were vanishingly few college courses in it, and no majors. (Stanford, for example, didn’t create a computer-science department until 1965.) So instead, institutions that needed programmers just used aptitude tests to evaluate applicants’ ability to think logically. Wilkes happened to have some intellectual preparation: As a philosophy major, she had studied symbolic logic, which can involve creating arguments and inferences by stringing together and/or statements in a way that resembles coding.

    Wilkes quickly became a programming whiz. She first worked on the IBM 704, which required her to write in an abstruse “assembly language.”

    An IBM 704 computer, with IBM 727 tape drives and IBM 780 CRT display. (Image courtesy of LLNL.)

    (A typical command might be something like “LXA A, K,” telling the computer to take the number in Location A of its memory and load it into the “Index Register” K.) Even getting the program into the IBM 704 was a laborious affair. There were no keyboards or screens; Wilkes had to write a program on paper and give it to a typist, who translated each command into holes on a punch card. She would carry boxes of commands to an “operator,” who then fed a stack of such cards into a reader. The computer executed the program and produced results, typed out on a printer.
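    As a rough illustration of the style of instruction described above, here is a tiny Python model of an assembly-style “load index register” operation (a hypothetical toy that follows the article’s description, not the IBM 704’s actual instruction semantics):

        # Toy model: "LXA A, K" copies the number at memory location A into index register K.
        memory = {"A": 42, "B": 7}        # hypothetical named memory locations
        index_registers = {"K": 0}        # hypothetical index registers

        def lxa(location, register):
            """Load the value stored at `location` into the named index register."""
            index_registers[register] = memory[location]

        lxa("A", "K")
        print(index_registers["K"])       # -> 42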

    Often enough, Wilkes’s code didn’t produce the result she wanted. So she had to pore over her lines of code, trying to deduce her mistake, stepping through each line in her head and envisioning how the machine would execute it — turning her mind, as it were, into the computer. Then she would rewrite the program. The capacity of most computers at the time was quite limited; the IBM 704 could handle only about 4,000 “words” of code in its memory. A good programmer was concise and elegant and never wasted a word. They were poets of bits. “It was like working logic puzzles — big, complicated logic puzzles,” Wilkes says. “I still have a very picky, precise mind, to a fault. I notice pictures that are crooked on the wall.”

    What sort of person possesses that kind of mentality? Back then, it was assumed to be women. They had already played a foundational role in the prehistory of computing: During World War II, women operated some of the first computational machines used for code-breaking at Bletchley Park in Britain.

    A Colossus Mark 2 computer being operated by Wrens. The slanted control panel on the left was used to set the “pin” (or “cam”) patterns of the Lorenz. The “bedstead” paper tape transport is on the right.

    Developer: Tommy Flowers, assisted by Sidney Broadhurst, William Chandler and, for the Mark 2 machines, Allen Coombs
    Manufacturer: Post Office Research Station
    Type: Special-purpose electronic digital programmable computer
    Generation: First-generation computer
    Release date: Mk 1: December 1943; Mk 2: 1 June 1944
    Discontinued: 1960

    The Lorenz SZ machines had 12 wheels, each with a different number of cams (or “pins”).

    Wheel number:           1   2   3   4   5   6    7    8   9   10  11  12
    BP wheel name:          ψ1  ψ2  ψ3  ψ4  ψ5  μ37  μ61  χ1  χ2  χ3  χ4  χ5
    Number of cams (pins):  43  47  51  53  59  37   61   41  31  29  26  23

    Colossus was a set of computers developed by British codebreakers in the years 1943–1945 to help in the cryptanalysis of the Lorenz cipher. Colossus used thermionic valves (vacuum tubes) to perform Boolean and counting operations. Colossus is thus regarded as the world’s first programmable, electronic, digital computer, although it was programmed by switches and plugs and not by a stored program.

    Colossus was designed by research telephone engineer Tommy Flowers to solve a problem posed by mathematician Max Newman at the Government Code and Cypher School (GC&CS) at Bletchley Park. Alan Turing’s use of probability in cryptanalysis (see Banburismus) contributed to its design. It has sometimes been erroneously stated that Turing designed Colossus to aid the cryptanalysis of the Enigma. Turing’s machine that helped decode Enigma was the electromechanical Bombe, not Colossus.

    In the United States, by 1960, according to government statistics, more than one in four programmers were women. At M.I.T.’s Lincoln Labs in the 1960s, where Wilkes worked, she recalls that most of those the government categorized as “career programmers” were female. It wasn’t high-status work — yet.

    In 1961, Wilkes was assigned to a prominent new project, the creation of the LINC.

    LINC from MIT Lincoln Lab


    Wesley Clark in 1962 at a demonstration of the first Laboratory Instrument Computer, or LINC. Credit MIT Lincoln Laboratory

    As one of the world’s first interactive personal computers, it would be a breakthrough device that could fit in a single office or lab. It would even have its own keyboard and screen, so it could be programmed more quickly, without awkward punch cards or printouts. The designers, who knew they could make the hardware, needed Wilkes to help write the software that would let a user control the computer in real time.

    For two and a half years, she and a team toiled away at flow charts, pondering how the circuitry functioned, how to let people communicate with it. “We worked all these crazy hours; we ate all kinds of terrible food,” she says. There was sexism, yes, especially in the disparity between how men and women were paid and promoted, but Wilkes enjoyed the relative comity that existed among the men and women at Lincoln Labs, the sense of being among intellectual peers. “We were a bunch of nerds,” Wilkes says dryly. “We were a bunch of geeks. We dressed like geeks. I was completely accepted by the men in my group.” When they got an early prototype of the LINC working, it solved a fiendish data-processing problem for a biologist, who was so excited that he danced a happy jig around the machine.

    In late 1964, after Wilkes returned from traveling around the world for a year, she was asked to finish writing the LINC’s operating system. But the lab had been relocated to St. Louis, and she had no desire to move there. Instead, a LINC was shipped to her parents’ house in Baltimore. Looming in the front hall near the foot of the stairs, a tall cabinet of whirring magnetic tapes across from a refrigerator-size box full of circuitry, it was an early glimpse of a sci-fi future: Wilkes was one of the first people on the planet to have a personal computer in her home. (Her father, an Episcopal clergyman, was thrilled. “He bragged about it,” she says. “He would tell anybody who would listen, ‘I bet you don’t have a computer in your living room.’ ”) Before long, LINC users around the world were using her code to program medical analyses and even create a chatbot that interviewed patients about their symptoms.

    But even as Wilkes established herself as a programmer, she still craved a life as a lawyer. “I also really finally got to the point where I said, ‘I don’t think I want to do this for the rest of my life,’ ” she says. Computers were intellectually stimulating but socially isolating. In 1972, she applied and got into Harvard Law School, and after graduating, she spent the next four decades as a lawyer. “I absolutely loved it,” she says.

    Today Wilkes is retired and lives in Cambridge, Mass. White-haired at 81, she still has the precise mannerisms and the ready, beaming smile that can be seen in photos from the ’60s, when she posed, grinning, beside the LINC. She told me that she occasionally gives talks to young students studying computer science. But the industry they’re heading into is, astonishingly, less populated with women — and by many accounts less welcoming to them — than it was in Wilkes’s day. In 1960, when she started working at M.I.T., the proportion of women in computing and mathematical professions (which are grouped together in federal government data) was 27 percent. It reached 35 percent in 1990. But, in the government’s published figures, that was the peak. The numbers fell after that, and by 2013, women were down to 26 percent — below their share in 1960.

    When Wilkes talks to today’s young coders, they are often shocked to learn that women were among the field’s earliest, towering innovators and once a common sight in corporate America. “Their mouths are agape,” Wilkes says. “They have absolutely no idea.”

    Almost 200 years ago, the first person to be what we would now call a coder was, in fact, a woman: Lady Ada Lovelace.

    Ada Lovelace (Augusta Ada Byron) in a rare daguerreotype by Antoine Claudet, 1843 or 1850, taken in his studio probably near Regent’s Park in London.
    Source: https://blogs.bodleian.ox.ac.uk/adalovelace/2015/10/14/only-known-photographs-of-ada-lovelace-in-bodleian-display/ Reproduction courtesy of Geoffrey Bond.
    Augusta Ada King, Countess of Lovelace (née Byron; 10 December 1815 – 27 November 1852) was an English mathematician and writer, chiefly known for her work on Charles Babbage’s proposed mechanical general-purpose computer, the Analytical Engine [below]. She was the first to recognise that the machine had applications beyond pure calculation, and published the first algorithm intended to be carried out by such a machine. As a result, she is sometimes regarded as the first to recognise the full potential of a “computing machine” and the first computer programmer.

    The Analytical Engine was a proposed mechanical general-purpose computer designed by English mathematician and computer pioneer Charles Babbage. It was first described in 1837 as the successor to Babbage’s Difference Engine.

    As a young mathematician in England in 1833, she met Charles Babbage, an inventor who was struggling to design what he called the Analytical Engine, which would be made of metal gears and able to execute if/then commands and store information in memory. Enthralled, Lovelace grasped the enormous potential of a device like this. A computer that could modify its own instructions and memory could be far more than a rote calculator, she realized. To prove it, Lovelace wrote what is often regarded as the first computer program in history, an algorithm with which the Analytical Engine would calculate the Bernoulli sequence of numbers. (She wasn’t shy about her accomplishments: “That brain of mine is something more than merely mortal; as time will show,” she once wrote.) But Babbage never managed to build his computer, and Lovelace, who died of cancer at 36, never saw her code executed.
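    Her published notes computed Bernoulli numbers. As a loose modern illustration of the same mathematical idea (a short Python sketch using the standard recurrence, not a transcription of her program for the Analytical Engine):

        from fractions import Fraction
        from math import comb

        def bernoulli(n):
            """Return B_0 .. B_n as exact fractions, using the recurrence
            sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1."""
            B = [Fraction(1)]
            for m in range(1, n + 1):
                s = sum(comb(m + 1, j) * B[j] for j in range(m))
                B.append(-s / (m + 1))    # solve the recurrence for B_m
            return B

        print(bernoulli(8))  # B_1 = -1/2, B_2 = 1/6, B_4 = -1/30, B_6 = 1/42, B_8 = -1/30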

    When digital computers finally became a practical reality in the 1940s, women were again pioneers in writing software for the machines. At the time, men in the computing industry regarded writing code as a secondary, less interesting task. The real glory lay in making the hardware. Software? “That term hadn’t yet been invented,” says Jennifer S. Light, a professor at M.I.T. who studies the history of science and technology.

    This dynamic was at work in the development of the first programmable digital computer in the United States, the Electronic Numerical Integrator and Computer, or Eniac, during the 1940s.

    Computer operators with an Eniac — the world’s first programmable general-purpose computer. Credit Corbis/Getty Images

    ENIAC programming. Columbia University

    Funded by the military, the thing was a behemoth, weighing more than 30 tons and including 17,468 vacuum tubes. Merely getting it to work was seen as the heroic, manly engineering feat. In contrast, programming it seemed menial, even secretarial. Women had long been employed in the scut work of doing calculations. In the years leading up to the Eniac, many companies bought huge electronic tabulating machines — quite useful for tallying up payroll, say — from companies like IBM; women frequently worked as the punch-card operators for these overgrown calculators. When the time came to hire technicians to write instructions for the Eniac, it made sense, to the men in charge, to pick an all-female team: Kathleen McNulty, Jean Jennings, Betty Snyder, Marlyn Wescoff, Frances Bilas and Ruth Lichterman. The men would figure out what they wanted Eniac to do; the women “programmed” it to execute the instructions.

    “We could diagnose troubles almost down to the individual vacuum tube,” Jennings later told an interviewer for the IEEE Annals of the History of Computing. Jennings, who grew up as the tomboy daughter of low-income parents near a Missouri community of 104 people, studied math at college. “Since we knew both the application and the machine, we learned to diagnose troubles as well as, if not better than, the engineer.”

    The Eniac women were among the first coders to discover that software never works right the first time — and that a programmer’s main work, really, is to find and fix the bugs. Their innovations included some of software’s core concepts. Betty Snyder realized that if you wanted to debug a program that wasn’t running correctly, it would help to have a “break point,” a moment when you could stop a program midway through its run. To this day, break points are a key part of the debugging process.
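    Break points remain a staple of debugging. As a purely present-day illustration (Python’s built-in pdb debugger, not anything the Eniac team had), a programmer can pause a toy trajectory loop right where a bug might hide and inspect its state:

        import pdb

        def landing_distance(speed, steps=2000, dt=0.01):
            """Toy trajectory loop: integrate until the projectile lands."""
            x, y, vy = 0.0, 0.0, speed
            for _ in range(steps):
                x, y, vy = x + speed * dt, y + vy * dt, vy - 9.8 * dt
                if y < 0:               # landed
                    pdb.set_trace()     # a "break point": execution pauses; inspect x, y, vy
                    return x
            return x

        # landing_distance(50.0)  # uncomment to drop into the debugger at touchdown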

    In 1946, Eniac’s creators wanted to show off the computer to a group of leaders in science, technology and the military. They asked Jennings and Snyder to write a program that calculated missile trajectories. After weeks of intense effort, they and their team had a working program, except for one glitch: It was supposed to stop when the missile landed, but for some reason it kept running. The night before the demo, Snyder suddenly intuited the problem. She went to work early the next day, flipped a single switch inside the Eniac and eliminated the bug. “Betty could do more logical reasoning while she was asleep than most people can do awake,” Jennings later said. Nonetheless, the women got little credit for their work. At that first official demonstration to show off Eniac, the male project managers didn’t mention, much less introduce, the women.

    After the war, as coding jobs spread from the military into the private sector, women remained in the coding vanguard, doing some of the highest-profile work.

    Rear Admiral Grace M. Hopper, 1984

    Grace Brewster Murray Hopper (née Murray; December 9, 1906 – January 1, 1992) was an American computer scientist and United States Navy rear admiral. One of the first programmers of the Harvard Mark I computer, she was a pioneer of computer programming who invented one of the first compiler related tools. She popularized the idea of machine-independent programming languages, which led to the development of COBOL, an early high-level programming language still in use today.

    The pioneering programmer Grace Hopper is frequently credited with creating the first “compiler,” a program that lets users create programming languages that more closely resemble regular written words: A coder could thus write the English-like code, and the compiler would do the hard work of turning it into ones and zeros for the computer. Hopper also developed the “Flowmatic” language for nontechnical businesspeople. Later, she advised the team that created the Cobol language, which became widely used by corporations. Another programmer from the team, Jean E. Sammet, continued to be influential in the language’s development for decades. Fran Allen was so expert in optimizing Fortran, a popular language for performing scientific calculations, that she became the first female IBM fellow.
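    To make the idea concrete, here is a deliberately tiny, invented example of what a compiler does, translating English-like statements into lower-level instructions (a toy written for this post, not Hopper’s A-0 or Flow-Matic):

        # Toy "compiler": turn English-like statements into lists of low-level instructions.
        def compile_statement(statement):
            words = statement.split()
            if words[0] == "ADD":                 # e.g. "ADD 5 TO TOTAL"
                return [("LOAD", words[3]), ("ADDI", int(words[1])), ("STORE", words[3])]
            if words[0] == "PRINT":               # e.g. "PRINT TOTAL"
                return [("LOAD", words[1]), ("OUT",)]
            raise ValueError("unknown statement: " + statement)

        for line in ["ADD 5 TO TOTAL", "PRINT TOTAL"]:
            print(line, "->", compile_statement(line))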

    NERSC Hopper Cray XE6 supercomputer

    When the number of coding jobs exploded in the ’50s and ’60s as companies began relying on software to process payrolls and crunch data, men had no special advantage in being hired. As Wilkes had discovered, employers simply looked for candidates who were logical, good at math and meticulous. And in this respect, gender stereotypes worked in women’s favor: Some executives argued that women’s traditional expertise at painstaking activities like knitting and weaving manifested precisely this mind-set. (The 1968 book Your Career in Computers stated that people who like “cooking from a cookbook” make good programmers.)

    The field rewarded aptitude: Applicants were often given a test (typically one involving pattern recognition), hired if they passed it and trained on the job, a process that made the field especially receptive to neophytes. “Know Nothing About Computers? Then We’ll Teach You (and Pay You While Doing So),” one British ad promised in 1965. In a 1957 recruiting pitch in the United States, IBM’s brochure titled My Fair Ladies specifically encouraged women to apply for coding jobs.

    Such was the hunger for programming talent that a young black woman named Arlene Gwendolyn Lee [no photo available] could become one of the early female programmers in Canada, despite the open discrimination of the time. Lee was half of a biracial couple to whom no one would rent, so she needed money to buy a house. According to her son, who has described his mother’s experience in a blog post, Lee showed up at a firm after seeing its ad for data processing and systems analytics jobs in a Toronto newspaper sometime in the early 1960s. Lee persuaded the employers, who were all white, to let her take the coding aptitude test. When she placed in the 99th percentile, the supervisors grilled her with questions before hiring her. “I had it easy,” she later told her son. “The computer didn’t care that I was a woman or that I was black. Most women had it much harder.”

    Elsie Shutt learned to code during her college summers while working for the military at the Aberdeen Proving Ground, an Army facility in Maryland.

    Elsie Shutt founded one of the first software businesses in the U.S. in 1958

    In 1953, while taking time off from graduate school, she was hired to code for Raytheon, where the programmer work force “was about 50 percent men and 50 percent women,” she told Janet Abbate, a Virginia Tech historian and author of the 2012 book Recoding Gender. “And it really amazed me that these men were programmers, because I thought it was women’s work!”

    When Shutt had a child in 1957, state law required her to leave her job; the ’50s and ’60s may have been welcoming to full-time female coders, but firms were unwilling to offer part-time work, even to superb coders. So Shutt founded Computations Inc., a consultancy that produced code for corporations. She hired stay-at-home mothers as part-time employees; if they didn’t already know how to code, she trained them. They cared for their kids during the day, then coded at night, renting time on local computers. “What it turned into was a feeling of mission,” Shutt told Abbate, “in providing work for women who were talented and did good work and couldn’t get part-time jobs.” Business Week called the Computations work force the “pregnant programmers” in a 1963 article illustrated with a picture of a baby in a bassinet in a home hallway, with the mother in the background, hard at work writing software. (The article’s title: Mixing Math and Motherhood.)

    By 1967, there were so many female programmers that Cosmopolitan magazine published an article about The Computer Girls, accompanied by pictures of beehived women at work on computers that evoked the control deck of the U.S.S. Enterprise. The story noted that women could make $20,000 a year doing this work (or more than $150,000 in today’s money). It was the rare white-collar occupation in which women could thrive. Nearly every other highly trained professional field admitted few women; even women with math degrees had limited options: teaching high school math or doing rote calculations at insurance firms.

    “Women back then would basically go, ‘Well, if I don’t do programming, what else will I do?’ ” Janet Abbate says. “The situation was very grim for women’s opportunities.”

    If we want to pinpoint a moment when women began to be forced out of programming, we can look at one year: 1984. A decade earlier, a study revealed that the numbers of men and women who expressed an interest in coding as a career were equal. Men were more likely to enroll in computer-science programs, but women’s participation rose steadily and rapidly through the late ’70s until, by the 1983-84 academic year, 37.1 percent of all students graduating with degrees in computer and information sciences were women. In only one decade, their participation rate more than doubled.

    But then things went into reverse. From 1984 onward, the percentage dropped; by the time 2010 rolled around, it had been cut in half. Only 17.6 percent of the students graduating from computer-science and information-science programs were women.

    One reason for this vertiginous decline has to do with a change in how and when kids learned to program. The advent of personal computers in the late ’70s and early ’80s remade the pool of students who pursued computer-science degrees. Before then, pretty much every student who showed up at college had never touched a computer or even been in the room with one. Computers were rare and expensive devices, available for the most part only in research labs or corporate settings. Nearly all students were on equal footing, in other words, and new to programming.

    Once the first generation of personal computers, like the Commodore 64 or the TRS-80, found their way into homes, teenagers were able to play around with them, slowly learning the major concepts of programming in their spare time.

    Commodore 64

    Radio Shack Tandy TRS-80

    By the mid-’80s, some college freshmen were showing up for their first class already proficient as programmers. They were remarkably well prepared for and perhaps even a little jaded about what Computer Science 101 might bring. As it turned out, these students were mostly men, as two academics discovered when they looked into the reasons women’s enrollment was so low.

    Keypunch operators at IBM in Stockholm in the 1930s. Credit IBM

    One researcher was Allan Fisher, then the associate dean of the computer-science school at Carnegie Mellon University. The school established an undergraduate program in computer science in 1988, and after a few years of operation, Fisher noticed that the proportion of women in the major was consistently below 10 percent. In 1994, he hired Jane Margolis, a social scientist who is now a senior researcher in the U.C.L.A. School of Education and Information Studies, to figure out why. Over four years, from 1995 to 1999, she and her colleagues interviewed and tracked roughly 100 undergraduates, male and female, in Carnegie Mellon’s computer-science department; she and Fisher later published the findings in their 2002 book “Unlocking the Clubhouse: Women in Computing.”

    What Margolis discovered was that the first-year students arriving at Carnegie Mellon with substantial experience were almost all male. They had received much more exposure to computers than girls had; for example, boys were more than twice as likely to have been given one as a gift by their parents. And if parents bought a computer for the family, they most often put it in a son’s room, not a daughter’s. Sons also tended to have what amounted to an “internship” relationship with fathers, working through Basic-language manuals with them, receiving encouragement from them; the same wasn’t true for daughters. “That was a very important part of our findings,” Margolis says. Nearly every female student in computer science at Carnegie Mellon told Margolis that her father had worked with her brother — “and they had to fight their way through to get some attention.”

    Their mothers were typically less engaged with computers in the home, they told her. Girls, even the nerdy ones, picked up these cues and seemed to dial back their enthusiasm accordingly. These were pretty familiar roles for boys and girls, historically: Boys were cheered on for playing with construction sets and electronics kits, while girls were steered toward dolls and toy kitchens. It wasn’t terribly surprising to Margolis that a new technology would follow the same pattern as it became widely accepted.

    At school, girls got much the same message: Computers were for boys. Geeky boys who formed computer clubs, at least in part to escape the torments of jock culture, often wound up, whether intentionally or not, reproducing the same exclusionary behavior. (These groups snubbed not only girls but also black and Latino boys.) Such male cliques created “a kind of peer support network,” in Fisher’s words.

    This helped explain why Carnegie Mellon’s first-year classes were starkly divided between the sizable number of men who were already confident in basic programming concepts and the women who were frequently complete neophytes. A cultural schism had emerged. The women started doubting their ability. How would they ever catch up?

    What Margolis heard from students — and from faculty members, too — was that there was a sense in the classroom that if you hadn’t already been coding obsessively for years, you didn’t belong. The “real programmer” was the one who “had a computer-screen tan from being in front of the monitor all the time,” as Margolis puts it. “The idea was, you just have to love being with a computer all the time, and if you don’t do it 24/7, you’re not a ‘real’ programmer.” The truth is, many of the men themselves didn’t fit this monomaniacal stereotype. But there was a double standard: While it was O.K. for the men to want to engage in various other pursuits, women who expressed the same wish felt judged for not being “hard core” enough. By the second year, many of these women, besieged by doubts, began dropping out of the program. (The same was true for the few black and Latino students who also arrived on campus without teenage programming experience.)

    A similar pattern took hold at many other campuses. Patricia Ordóñez, a first-year student at Johns Hopkins University in 1985, enrolled in an Introduction to Minicomputers course. She had been a math whiz in high school but had little experience in coding; when she raised her hand in class at college to ask a question, many of the other students who had spent their teenage years programming — and the professor — made her feel singled out. “I remember one day he looked at me and said, ‘You should already know this by now,’ ” she told me. “I thought, I’m never going to succeed.” She switched majors as a result.

    Yet a student’s decision to stick with or quit the subject did not seem to be correlated with coding talent. Many of the women who dropped out were getting perfectly good grades, Margolis learned. Indeed, some who left had been top students. And the women who did persist and made it to the third year of their program had by then generally caught up to the teenage obsessives. The degree’s coursework was, in other words, a leveling force. Learning Basic as a teenage hobby might lead to lots of fun and useful skills, but the pace of learning at college was so much more intense that by the end of the degree, everyone eventually wound up graduating at roughly the same levels of programming mastery.

    An E.R.A./Univac 1103 computer in the 1950s. Credit Hum Images/Alamy

    “It turned out that having prior experience is not a great predictor, even of academic success,” Fisher says. Ordóñez’s later experience illustrates exactly this: After changing majors at Johns Hopkins, she later took night classes in coding and eventually got a Ph.D. in computer science in her 30s; today, she’s a professor at the University of Puerto Rico Río Piedras, specializing in data science.

    By the ’80s, the early pioneering work done by female programmers had mostly been forgotten. In contrast, Hollywood was putting out precisely the opposite image: Computers were a male domain. In hit movies like Revenge of the Nerds, Weird Science, Tron, WarGames and others, the computer nerds were nearly always young white men. Video games, a significant gateway activity that led to an interest in computers, were pitched far more often at boys, as research in 1985 by Sara Kiesler [Psychology of Women Quarterly], a professor at Carnegie Mellon, found. “In the culture, it became something that guys do and are good at,” says Kiesler, who is also a program manager at the National Science Foundation. “There were all kinds of things signaling that if you don’t have the right genes, you’re not welcome.”

    A 1983 study involving M.I.T. students produced equally bleak accounts. Women who raised their hands in class were often ignored by professors and talked over by other students. They would be told they weren’t aggressive enough; if they challenged other students or contradicted them, they heard comments like “You sure are bitchy today — must be your period.” Behavior in some research groups “sometimes approximates that of the locker room,” the report concluded, with men openly rating how “cute” their female students were. (“Gee, I don’t think it’s fair that the only two girls in the group are in the same office,” one said. “We should share.”) Male students mused about women’s mediocrity: “I really don’t think the woman students around here are as good as the men,” one said.

    By then, as programming enjoyed its first burst of cultural attention, so many students were racing to enroll in computer science that universities ran into a supply problem: They didn’t have enough professors to teach everyone. Some added hurdles, courses that students had to pass before they could be accepted into the computer-science major. Punishing workloads and classes that covered the material at a lightning pace weeded out those who didn’t get it immediately. All this fostered an environment in which the students most likely to get through were those who had already been exposed to coding — young men, mostly. “Every time the field has instituted these filters on the front end, that’s had the effect of reducing the participation of women in particular,” says Eric S. Roberts, a longtime professor of computer science, now at Reed College, who first studied this problem and called it the “capacity crisis.”

    When computer-science programs began to expand again in the mid-’90s, coding’s culture was set. Most of the incoming students were men. The interest among women never recovered to the levels reached in the late ’70s and early ’80s. And the women who did show up were often isolated. In a room of 20 students, perhaps five or even fewer might be women.

    In 1991, Ellen Spertus, now a computer scientist at Mills College, published a report on women’s experiences in programming classes. She cataloged a landscape populated by men who snickered about the presumed inferiority of women and by professors who told female students that they were “far too pretty” to be studying electrical engineering; when some men at Carnegie Mellon were asked to stop using pictures of naked women as desktop wallpaper on their computers, they angrily complained that it was censorship of the sort practiced by “the Nazis or the Ayatollah Khomeini.”

    As programming was shutting its doors to women in academia, a similar transformation was taking place in corporate America. The emergence of what would be called “culture fit” was changing the who, and the why, of the hiring process. Managers began picking coders less on the basis of aptitude and more on how well they fit a personality type: the acerbic, aloof male nerd.

    The shift actually began far earlier, back in the late ’60s, when managers recognized that male coders shared a growing tendency to be antisocial isolates, lording their arcane technical expertise over that of their bosses. Programmers were “often egocentric, slightly neurotic,” as Richard Brandon, a well-known computer-industry analyst, put it in an address at a 1968 conference, adding that “the incidence of beards, sandals and other symptoms of rugged individualism or nonconformity are notably greater among this demographic.”

    In addition to testing for logical thinking, as in Mary Allen Wilkes’s day, companies began using personality tests to select specifically for these sorts of caustic loner qualities. “These became very powerful narratives,” says Nathan Ensmenger, a professor of informatics at Indiana University, who has studied [Gender and Computing] this transition. The hunt for that personality type cut women out. Managers might shrug and accept a man who was unkempt, unshaven and surly, but they wouldn’t tolerate a woman who behaved the same way. Coding increasingly required late nights, but managers claimed that it was too unsafe to have women working into the wee hours, so they forbade them to stay late with the men.

    At the same time, the old hierarchy of hardware and software became inverted. Software was becoming a critical, and lucrative, sector of corporate America. Employers increasingly hired programmers whom they could envision one day ascending to key managerial roles in programming. And few companies were willing to put a woman in charge of men. “They wanted people who were more aligned with management,” says Marie Hicks, a historian at the Illinois Institute of Technology. “One of the big takeaways is that technical skill does not equate to success.”

    By the 1990s and 2000s, the pursuit of “culture fit” was in full force, particularly at start-ups, which involve a relatively small number of people typically confined to tight quarters for long hours. Founders looked to hire people who were socially and culturally similar to them.

    “It’s all this loosey-goosey ‘culture’ thing,” says Sue Gardner, former head of the Wikimedia Foundation, the nonprofit that hosts Wikipedia and other sites. After her stint there, Gardner decided to study why so few women were employed as coders. In 2014, she surveyed more than 1,400 women in the field and conducted sit-down interviews with scores more. It became clear to her that the occupation’s takeover by men in the ’90s had turned into a self-perpetuating cycle. Because almost everyone in charge was a white or Asian man, that was the model for whom to hire; managers recognized talent only when it walked and talked as they did. For example, many companies have relied on whiteboard challenges when hiring a coder — a prospective employee is asked to write code, often a sorting algorithm, on a whiteboard while the employers watch. This sort of thing bears almost no resemblance to the work coders actually do in their jobs. But whiteboard questions resemble classroom work at Ivy League institutions. It feels familiar to the men doing the hiring, many of whom are only a few years out of college. “What I came to realize,” Gardner says, “is that it’s not that women are excluded. It’s that practically everyone is excluded if you’re not a young white or Asian man who’s single.”
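    The whiteboard prompts Gardner describes are usually textbook exercises. A classic example of the genre (offered here only as a generic illustration, not any particular company’s question) is writing a sorting routine from scratch:

        def merge_sort(values):
            """Classic whiteboard exercise: sort by recursively splitting and merging."""
            if len(values) <= 1:
                return values
            mid = len(values) // 2
            left, right = merge_sort(values[:mid]), merge_sort(values[mid:])
            merged, i, j = [], 0, 0
            while i < len(left) and j < len(right):
                if left[i] <= right[j]:
                    merged.append(left[i])
                    i += 1
                else:
                    merged.append(right[j])
                    j += 1
            return merged + left[i:] + right[j:]

        print(merge_sort([5, 2, 9, 1, 5, 6]))  # -> [1, 2, 5, 5, 6, 9]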

    One coder, Stephanie Hurlburt, was a stereotypical math nerd who had deep experience working on graphics software. “I love C++, the low-level stuff,” she told me, referring to a complex language known for allowing programmers to write very fast-running code, useful in graphics. Hurlburt worked for a series of firms this decade, including Unity (which makes popular software for designing games), and then for Facebook on its Oculus Rift VR headset, grinding away for long hours in the run-up to the release of its first demo. Hurlburt became accustomed to shrugging off negative attention and crude sexism. She heard, including from many authority figures she admired, that women weren’t wired for math. While working as a coder, if she expressed ignorance of any concept, no matter how trivial, male colleagues would disparage her. “I thought you were at a higher math level,” one sniffed.

    In 2016, Hurlburt and a friend, Rich Geldreich, founded a start-up called Binomial, where they created software that helps compress the size of “textures” in graphics-heavy software. Being self-employed, she figured, would mean not having to deal with belittling bosses. But when she and Geldreich went to sell their product, some customers assumed that she was just the marketing person. “I don’t know how you got this product off the ground when you only have one programmer!” she recalls one client telling Geldreich.

    In 2014, an informal analysis by a tech entrepreneur and former academic named Kieran Snyder of 248 corporate performance reviews for tech engineers determined that women were considerably more likely than men to receive reviews with negative feedback; men were far more likely to get reviews that had only constructive feedback, with no negative material. In a 2016 experiment conducted by the tech recruiting firm Speak With a Geek, 5,000 résumés with identical information were submitted to firms. When identifying details were removed from the résumés, 54 percent of the women received interview offers; when gendered names and other biographical information were given, only 5 percent of them did.

    Lurking beneath some of this sexist atmosphere is the phantasm of sociobiology. As this line of thinking goes, women are less suited to coding than men because biology better endows men with the qualities necessary to excel at programming. Many women who work in software face this line of reasoning all the time. Cate Huston, a software engineer at Google from 2011 to 2014, heard it from colleagues there when they pondered why such a low percentage of the company’s programmers were women. Peers would argue that Google hired only the best — that if women weren’t being hired, it was because they didn’t have enough innate logic or grit, she recalls.

    In the summer of 2017, a Google employee named James Damore suggested in an internal email that several qualities more commonly found in women — including higher rates of anxiety — explained why they weren’t thriving in a competitive world of coding; he cited the cognitive neuroscientist Simon Baron-Cohen, who theorizes that the male brain is more likely to be “systemizing,” compared with women’s “empathizing” brains. Google fired Damore, saying it could not employ someone who would argue that his female colleagues were inherently unsuited to the job. But on Google’s internal boards, other male employees backed up Damore, agreeing with his analysis. The assumption that the makeup of the coding work force reflects a pure meritocracy runs deep among many Silicon Valley men; for them, sociobiology offers a way to explain things, particularly for the type who prefers to believe that sexism in the workplace is not a big deal, or even doubts it really exists.

    But if biology were the reason so few women are in coding, it would be impossible to explain why women were so prominent in the early years of American programming, when the work could be, if anything, far harder than today’s programming. It was an uncharted new field, in which you had to do math in binary and hexadecimal formats, and there were no helpful internet forums, no Google to query, for assistance with your bug. It was just your brain in a jar, solving hellish problems.

    If biology limited women’s ability to code, then the ratio of women to men in programming ought to be similar in other countries. It isn’t. In India, roughly 40 percent of the students studying computer science and related fields are women. This is despite even greater barriers to becoming a female coder there; India has such rigid gender roles that female college students often have an 8 p.m. curfew, meaning they can’t work late in the computer lab, as the social scientist Roli Varma learned when she studied them in 2015. The Indian women had one big cultural advantage over their American peers, though: They were far more likely to be encouraged by their parents to go into the field, Varma says. What’s more, the women regarded coding as a safer job because it kept them indoors, lessening their exposure to street-level sexual harassment. It was, in other words, considered normal in India that women would code. The picture has been similar in Malaysia, where in 2001 — precisely when the share of American women in computer science had slid into a trough — women represented 52 percent of the undergraduate computer-science majors and 39 percent of the Ph.D. candidates at the University of Malaya in Kuala Lumpur.

    Today, when midcareer women decide that Silicon Valley’s culture is unlikely to change, many simply leave the industry. When Sue Gardner surveyed those 1,400 women in 2014, they told her the same story: In the early years, as junior coders, they looked past the ambient sexism they encountered. They loved programming and were ambitious and excited by their jobs. But over time, Gardner says, “they get ground down.” As they rose in the ranks, they found few, if any, mentors. Nearly two-thirds either experienced or witnessed harassment, she read in “The Athena Factor” (a 2008 study of women in tech); in Gardner’s survey, one-third reported that their managers were more friendly toward and gave more support to their male co-workers. It’s often assumed that having children is the moment when women are sidelined in tech careers, as in many others, but Gardner discovered that wasn’t often the breaking point for these women. They grew discouraged seeing men with no better or even lesser qualifications get superior opportunities and treatment.

    “What surprised me was that they felt, ‘I did all that work!’ They were angry,” Gardner says. “It wasn’t like they needed a helping hand or needed a little extra coaching. They were mad. They were not leaving because they couldn’t hack it. They were leaving because they were skilled professionals who had skills that were broadly in demand in the marketplace, and they had other options. So they’re like, ‘[expletive] it — I’ll go somewhere where I’m seen as valuable.’ ”

    The result is an industry that is drastically more male than it was decades ago, and far more so than the workplace at large. In 2018, according to data from the Bureau of Labor Statistics, about 26 percent of the workers in “computer and mathematical occupations” were women. The percentages for people of color are similarly low: Black employees were 8.4 percent, Latinos 7.5 percent. (The Census Bureau’s American Community Survey put black coders at only 4.7 percent in 2016.) In the more rarefied world of the top Silicon Valley tech firms, the numbers are even more austere: A 2017 analysis by Recode, a news site that covers the technology industry, revealed that 20 percent of Google’s technical employees were women, while only 1 percent were black and 3 percent were Hispanic. Facebook was nearly identical; the numbers at Twitter were 15 percent, 2 percent and 4 percent, respectively.

    The reversal has been profound. In the early days of coding, women flocked to programming because it offered more opportunity and reward for merit, more than fields like law. Now software has the closed door.

    In the late 1990s, Allan Fisher decided that Carnegie Mellon would try to address the male-female imbalance in its computer-science program. Prompted by Jane Margolis’s findings, Fisher and his colleagues instituted several changes. One was the creation of classes that grouped students by experience: The kids who had been coding since youth would start on one track; the newcomers to coding would have a slightly different curriculum, allowing them more time to catch up. Carnegie Mellon also offered extra tutoring to all students, which was particularly useful for the novice coders. If Fisher could get them to stay through the first and second years, he knew, they would catch up to their peers.

    Components from four of the earliest electronic computers, held by Patsy Boyce Simmers, Gail Taylor, Millie Beck and Norma Stec, employees at the United States Army’s Ballistics Research Laboratory. Credit Science Source

    They also modified the courses in order to show how code has impacts in the real world, so a new student’s view of programming wouldn’t just be an endless vista of algorithms disconnected from any practical use. Fisher wanted students to glimpse, earlier on, what it was like to make software that works its way into people’s lives. Back in the ’90s, before social media and even before the internet had gone mainstream, the influence that code could have on daily life wasn’t so easy to see.

    Faculty members, too, adopted a different perspective. For years some had tacitly endorsed the idea that the students who came in already knowing code were born to it. Carnegie Mellon “rewarded the obsessive hacker,” Fisher told me. But the faculty now knew that their assumptions weren’t true; they had been confusing previous experience with raw aptitude. They still wanted to encourage those obsessive teenage coders, but they had come to understand that the neophytes were just as likely to bloom rapidly into remarkable talents and deserved as much support. “We had to broaden how faculty sees what a successful student looks like,” he says. The admissions process was adjusted, too; it no longer gave as much preference to students who had been teenage coders.

    No single policy changed things. “There’s really a virtuous cycle,” Fisher says. “If you make the program accommodate people with less experience, then people with less experience come in.” Faculty members became more used to seeing how green coders evolve into accomplished ones, and they learned how to teach that type.

    Carnegie Mellon’s efforts were remarkably successful. Only a few years after these changes, the percentage of women entering its computer-science program boomed, rising to 42 percent from 7 percent; graduation rates for women rose to nearly match those of the men. The school vaulted over the national average. Other schools concerned about the low number of female students began using approaches similar to Fisher’s. In 2006, Harvey Mudd College tinkered with its Introduction to Computer Science course, creating a track specifically for novices, and rebranded it as Creative Problem Solving in Science and Engineering Using Computational Approaches — which, the institution’s president, Maria Klawe, told me, “is actually a better description of what you’re actually doing when you’re coding.” By 2018, 54 percent of Harvey Mudd’s graduates who majored in computer science were women.

    A broader cultural shift has accompanied the schools’ efforts. In the last few years, women’s interest in coding has begun rapidly rising throughout the United States. In 2012, the percentage of female undergraduates who plan to major in computer science began to rise at rates not seen for 35 years [Computing Research News], since the decline in the mid-’80s, according to research by Linda Sax, an education professor at U.C.L.A. There has also been a boomlet of groups and organizations training and encouraging underrepresented cohorts to enter the field, like Black Girls Code and Code Newbie. Coding has come to be seen, in purely economic terms, as a bastion of well-paying and engaging work.

    In an age when Instagram and Snapchat and iPhones are part of the warp and weft of life’s daily fabric, potential coders worry less that the job will be isolated, antisocial and distant from reality. “Women who see themselves as creative or artistic are more likely to pursue computer science today than in the past,” says Sax, who has pored over decades of demographic data about the students in STEM fields. They’re still less likely to go into coding than other fields, but programming is increasingly on their horizon. This shift is abetted by the fact that it’s much easier to learn programming without getting a full degree, through free online coding schools, relatively cheaper “boot camps” or even meetup groups for newcomers — opportunities that have emerged only in the last decade.

    Changing the culture at schools is one thing. Most female veterans of code I’ve spoken to say that what is harder is shifting the culture of the industry at large, particularly the reflexive sexism and racism still deeply ingrained in Silicon Valley. Some, like Sue Gardner, sometimes wonder if it’s even ethical to encourage young women to go into tech. Gardner fears they’ll pour out of computer-science programs in increasing numbers, arrive at their first coding job excited and thrive early on, but then gradually get beaten down by the industry. “The truth is, we can attract more and different people into the field, but they’re just going to hit that wall in midcareer, unless we change how things happen higher up,” she says.

    On a spring weekend in 2017, more than 700 coders and designers were given 24 hours to dream up and create a new product at a hackathon in New York hosted by TechCrunch, a news site devoted to technology and Silicon Valley. At lunchtime on Sunday, the teams presented their creations to a panel of industry judges, in a blizzard of frantic elevator pitches. There was Instagrammie, a robot system that would automatically recognize the mood of an elderly relative or a person with limited mobility; there was Waste Not, an app to reduce food waste. Most of the contestants were coders who worked at local high-tech firms or computer-science students at nearby universities.

    Despite women’s historical role in the vanguard of computer programming, some female veterans of code wonder if it’s even ethical to encourage young women to go into tech because of the reflexive sexism in the current culture of Silicon Valley. Credit Apic/Getty Images

    The winning team, though, was a trio of high school girls from New Jersey: Sowmya Patapati, Akshaya Dinesh and Amulya Balakrishnan. In only 24 hours, they created reVIVE, a virtual-reality app that tests children for signs of A.D.H.D. After the students were handed their winnings onstage — a trophy-size check for $5,000 — they flopped into chairs in a nearby room to recuperate. They had been coding almost nonstop since noon the day before and were bleary with exhaustion.

    “Lots of caffeine,” Balakrishnan, 17, said, laughing. She wore a blue T-shirt that read WHO HACK THE WORLD? GIRLS. The girls told me that they had impressed even themselves by how much they accomplished in 24 hours. “Our app really does streamline the process of detecting A.D.H.D.,” said Dinesh, who was also 17. “It usually takes six to nine months to diagnose, and thousands of dollars! We could do it digitally in a much faster way!”

    They all became interested in coding in high school, each of them with strong encouragement from immigrant parents. Balakrishnan’s parents worked in software and medicine; Dinesh’s parents came to the United States from India in 2000 and worked in information technology. Patapati immigrated from India as an infant with her young mother, who never went to college, and her father, an information-tech worker who was the first in his rural family to go to college.

    Drawn to coding in high school, the young hackers got used to being the lone girl nerds at school, as Dinesh told me.

    “I tried so hard to get other girls interested in computer science, and it was like, the interest levels were just so low,” she says. “When I walked into my first hackathon, it was the most intimidating thing ever. I looked at a room of 80 kids: Five were girls, and I was probably the youngest person there.” But she kept on competing in 25 more hackathons, and her confidence grew. To break the isolation and meet more girls in coding, she attended events by organizations like #BuiltByGirls, which is where, a few days previously, she had met Patapati and Balakrishnan and where they decided to team up. To attend TechCrunch, Patapati, who was 16, and Balakrishnan skipped a junior prom and a friend’s birthday party. “Who needs a party when you can go to a hackathon?” Patapati said.

    Winning TechCrunch as a group of young women of color brought extra attention, not all of it positive. “I’ve gotten a lot of comments like: ‘Oh, you won the hackathon because you’re a girl! You’re a diversity pick,’ ” Balakrishnan said. After the prize was announced online, she recalled later, “there were quite a few engineers who commented, ‘Oh, it was a girl pick; obviously that’s why they won.’ ”

    Nearly two years later, Balakrishnan was taking a gap year to create a heart-monitoring product she invented, and she was in the running for $100,000 to develop it. She was applying to college to study computer science and, in her spare time, competing in a beauty pageant, inspired by Miss USA 2017, Kara McCullough, who was a nuclear scientist. “I realized that I could use pageantry as a platform to show more girls that they could embrace their femininity and be involved in a very technical, male-dominated field,” she says. Dinesh, in her final year at high school, had started an all-female hackathon that now takes place annually in New York. (“The vibe was definitely very different,” she says, more focused on training newcomers.)

    Patapati and Dinesh enrolled at Stanford last fall to study computer science; both are interested deeply in A.I. They’ve noticed the subtle tensions for women in the coding classes. Patapati, who founded a Women in A.I. group with an Apple tech lead, has watched as male colleagues ignore her raised hand in group discussions or repeat something she just said as if it were their idea. “I think sometimes it’s just a bias that people don’t even recognize that they have,” she says. “That’s been really upsetting.”

    Dinesh says “there’s absolutely a difference in confidence levels” between the male and female newcomers. The Stanford curriculum is so intense that even the relative veterans like her are scrambling: When we spoke recently, she had just spent “three all-nighters in a row” on a single project, for which students had to engineer a “print” command from scratch. At 18, she has few illusions about the road ahead. When she went to a blockchain conference, it was a sea of “middle-aged white and Asian men,” she says. “I’m never going to one again,” she adds with a laugh.

    “My dream is to work on autonomous driving at Tesla or Waymo or some company like that. Or if I see that there’s something missing, maybe I’ll start my own company.” She has begun moving in that direction already, having met one venture capitalist via #BuiltByGirls. “So now I know I can start reaching out to her, and I can start reaching out to other people that she might know,” she says.

    Will she look around, 20 years from now, to see that software has returned to its roots, with women everywhere? “I’m not really sure what will happen,” she admits. “But I do think it is absolutely on the upward climb.”

    Correction: Feb. 14, 2019
    An earlier version of this article misidentified the institution Ellen Spertus was affiliated with when she published a 1991 report on women’s experiences in programming classes. Spertus was at M.I.T. when she published the report, not Mills College, where she is currently a professor.

    Correction: Feb. 14, 2019
    An earlier version of this article misstated Akshaya Dinesh’s current age. She is 18, not 19.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 11:58 am on January 30, 2019 Permalink | Reply
    Tags: Algorithm could help autonomous underwater vehicles explore risky but scientifically-rewarding environments, Computer Science, Engineers program marine robots to take calculated risks

    From MIT News: “Engineers program marine robots to take calculated risks” 

    MIT News

    From MIT News

    January 30, 2019
    Jennifer Chu

    MIT engineers have now developed an algorithm that lets autonomous underwater vehicles weigh the risks and potential rewards of exploring an unknown region.
    Image: stock image.

    Algorithm could help autonomous underwater vehicles explore risky but scientifically-rewarding environments.

    We know far less about the Earth’s oceans than we do about the surface of the moon or Mars. The sea floor is carved with expansive canyons, towering seamounts, deep trenches, and sheer cliffs, most of which are considered too dangerous or inaccessible for autonomous underwater vehicles (AUVs) to navigate.

    But what if the reward for traversing such places was worth the risk?

    MIT engineers have now developed an algorithm that lets AUVs weigh the risks and potential rewards of exploring an unknown region. For instance, if a vehicle tasked with identifying underwater oil seeps approached a steep, rocky trench, the algorithm could assess the reward level (the probability that an oil seep exists near this trench), and the risk level (the probability of colliding with an obstacle), if it were to take a path through the trench.

    “If we were very conservative with our expensive vehicle, saying its survivability was paramount above all, then we wouldn’t find anything of interest,” says Benjamin Ayton, a graduate student in MIT’s Department of Aeronautics and Astronautics. “But if we understand there’s a tradeoff between the reward of what you gather, and the risk or threat of going toward these dangerous geographies, we can take certain risks when it’s worthwhile.”

    Ayton says the new algorithm can compute tradeoffs of risk versus reward in real time, as a vehicle decides where to explore next. He and his colleagues in the lab of Brian Williams, professor of aeronautics and astronautics, are implementing this algorithm and others on AUVs, with the vision of deploying fleets of bold, intelligent robotic explorers for a number of missions, including looking for offshore oil deposits, investigating the impact of climate change on coral reefs, and exploring extreme environments analogous to Europa, an ice-covered moon of Jupiter that the team hopes vehicles will one day traverse.

    “If we went to Europa and had a very strong reason to believe that there might be a billion-dollar observation in a cave or crevasse, which would justify sending a spacecraft to Europa, then we would absolutely want to risk going in that cave,” Ayton says. “But algorithms that don’t consider risk are never going to find that potentially history-changing observation.”

    Ayton and Williams, along with Richard Camilli of the Woods Hole Oceanographic Institution, will present their new algorithm at the Association for the Advancement of Artificial Intelligence conference this week in Honolulu.

    A bold path

    The team’s new algorithm is the first to enable “risk-bounded adaptive sampling.” An adaptive sampling mission is designed, for instance, to automatically adapt an AUV’s path, based on new measurements that the vehicle takes as it explores a given region. Most adaptive sampling missions that consider risk typically do so by finding paths with a concrete, acceptable level of risk. For instance, AUVs may be programmed to only chart paths with a chance of collision that doesn’t exceed 5 percent.

    But the researchers found that accounting for risk alone could severely limit a mission’s potential rewards.

    “Before we go into a mission, we want to specify the risk we’re willing to take for a certain level of reward,” Ayton says. “For instance, if a path were to take us to more hydrothermal vents, we would be willing to take this amount of risk, but if we’re not going to see anything, we would be willing to take less risk.”

    The team’s algorithm takes in bathymetric data, or information about the ocean topography, including any surrounding obstacles, along with the vehicle’s dynamics and inertial measurements, to compute the level of risk for a certain proposed path. The algorithm also takes in all previous measurements that the AUV has taken, to compute the probability that such high-reward measurements may exist along the proposed path.

    If the risk-to-reward ratio meets a certain value, determined by scientists beforehand, then the AUV goes ahead with the proposed path, taking more measurements that feed back into the algorithm to help it evaluate the risk and reward of other paths as the vehicle moves forward.
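
    The decision rule in that last step can be sketched in a few lines of Python. This is only a minimal illustration of the idea described above, not the MIT team’s planner: the reward model, the risk model, the candidate paths and the risk budget below are all placeholders.

# Illustrative sketch of risk-bounded path selection (not the actual MIT planner).
# reward_of(path) stands in for the estimated probability of a valuable measurement
# (e.g., an oil seep or hydrothermal vent); risk_of(path) stands in for the estimated
# probability of collision given bathymetry and vehicle dynamics.

def choose_next_path(candidates, reward_of, risk_of, allowed_risk):
    """Pick the most rewarding candidate path whose estimated risk stays within
    the risk the scientists said they would accept for that level of reward."""
    feasible = [p for p in candidates if risk_of(p) <= allowed_risk(reward_of(p))]
    return max(feasible, key=reward_of, default=None)

# Toy usage: accept more risk only when the expected reward is high.
paths = ["skirt_the_trench", "cross_the_trench"]
reward = {"skirt_the_trench": 0.1, "cross_the_trench": 0.6}.get
risk = {"skirt_the_trench": 0.01, "cross_the_trench": 0.2}.get
print(choose_next_path(paths, reward, risk, lambda r: 0.05 if r < 0.5 else 0.3))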

    The researchers tested their algorithm in a simulation of an AUV mission east of Boston Harbor. They used bathymetric data collected from the region during a previous NOAA survey, and simulated an AUV exploring at a depth of 15 meters through regions at relatively high temperatures. They looked at how the algorithm planned out the vehicle’s route under three different scenarios of acceptable risk.

    In the scenario with the lowest acceptable risk, meaning that the vehicle should avoid any regions that would have a very high chance of collision, the algorithm mapped out a conservative path, keeping the vehicle in a safe region that also did not have any high rewards — in this case, high temperatures. For scenarios of higher acceptable risk, the algorithm charted bolder paths that took a vehicle through a narrow chasm, and ultimately to a high-reward region.

    The team also ran the algorithm through 10,000 numerical simulations, generating random environments in each simulation through which to plan a path, and found that the algorithm “trades off risk against reward intuitively, taking dangerous actions only when justified by the reward.”

    A risky slope

    Last December, Ayton, Williams, and others spent two weeks on a cruise off the coast of Costa Rica, deploying underwater gliders, on which they tested several algorithms, including this newest one. For the most part, the algorithm’s path planning agreed with those proposed by several onboard geologists who were looking for the best routes to find oil seeps.

    Ayton says there was a particular moment when the risk-bounded algorithm proved especially handy. An AUV was making its way up a precarious slump, or landslide, where the vehicle couldn’t take too many risks.

    “The algorithm found a method to get us up the slump quickly, while being the most worthwhile,” Ayton says. “It took us up a path that, while it didn’t help us discover oil seeps, it did help us refine our understanding of the environment.”

    “What was really interesting was to watch how the machine algorithms began to ‘learn’ after the findings of several dives, and began to choose sites that we geologists might not have chosen initially,” says Lori Summa, a geologist and guest investigator at the Woods Hole Oceanographic Institution, who took part in the cruise. “This part of the process is still evolving, but it was exciting to watch the algorithms begin to identify the new patterns from large amounts of data, and couple that information to an efficient, ‘safe’ search strategy.”

    In their long-term vision, the researchers hope to use such algorithms to help autonomous vehicles explore environments beyond Earth.

    “If we went to Europa and weren’t willing to take any risks in order to preserve a probe, then the probability of finding life would be very, very low,” Ayton says. “You have to risk a little to get more reward, which is generally true in life as well.”

    This research was supported, in part, by Exxon Mobil, as part of the MIT Energy Initiative, and by NASA.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 12:20 pm on January 28, 2019 Permalink | Reply
    Tags: Computer Science, Converting Wi-Fi signals to electricity with new 2-D materials

    From MIT News: “Converting Wi-Fi signals to electricity with new 2-D materials” 

    MIT News

    From MIT News

    January 28, 2019
    Rob Matheson

    Researchers from MIT and elsewhere have designed the first fully flexible, battery-free “rectenna” — a device that converts energy from Wi-Fi signals into electricity — that could be used to power flexible and wearable electronics, medical devices, and sensors for the “internet of things.” Image: Christine Daniloff

    Device made from flexible, inexpensive materials could power large-area electronics, wearables, medical devices, and more.

    Imagine a world where smartphones, laptops, wearables, and other electronics are powered without batteries. Researchers from MIT and elsewhere have taken a step in that direction, with the first fully flexible device that can convert energy from Wi-Fi signals into electricity that could power electronics.

    Devices that convert AC electromagnetic waves into DC electricity are known as “rectennas.” The researchers demonstrate a new kind of rectenna, described in a study appearing in Nature today, that uses a flexible radio-frequency (RF) antenna that captures electromagnetic waves — including those carrying Wi-Fi — as AC waveforms.

    The antenna is then connected to a novel device made out of a two-dimensional semiconductor just a few atoms thick. The AC signal travels into the semiconductor, which converts it into a DC voltage that could be used to power electronic circuits or recharge batteries.

    In this way, the battery-free device passively captures and transforms ubiquitous Wi-Fi signals into useful DC power. Moreover, the device is flexible and can be fabricated in a roll-to-roll process to cover very large areas.

    “What if we could develop electronic systems that we wrap around a bridge or cover an entire highway, or the walls of our office and bring electronic intelligence to everything around us? How do you provide energy for those electronics?” says paper co-author Tomás Palacios, a professor in the Department of Electrical Engineering and Computer Science and director of the MIT/MTL Center for Graphene Devices and 2D Systems in the Microsystems Technology Laboratories. “We have come up with a new way to power the electronics systems of the future — by harvesting Wi-Fi energy in a way that’s easily integrated in large areas — to bring intelligence to every object around us.”

    Promising early applications for the proposed rectenna include powering flexible and wearable electronics, medical devices, and sensors for the “internet of things.” Flexible smartphones, for instance, are a hot new market for major tech firms. In experiments, the researchers’ device can produce about 40 microwatts of power when exposed to the typical power levels of Wi-Fi signals (around 150 microwatts). That’s more than enough power to light up an LED or drive silicon chips.

    Another possible application is powering the data communications of implantable medical devices, says co-author Jesús Grajal, a researcher at the Technical University of Madrid. For example, researchers are beginning to develop pills that can be swallowed by patients and stream health data back to a computer for diagnostics.

    “Ideally you don’t want to use batteries to power these systems, because if they leak lithium, the patient could die,” Grajal says. “It is much better to harvest energy from the environment to power up these small labs inside the body and communicate data to external computers.”

    All rectennas rely on a component known as a “rectifier,” which converts the AC input signal into DC power. Traditional rectennas use either silicon or gallium arsenide for the rectifier. These materials can cover the Wi-Fi band, but they are rigid. And, although using these materials to fabricate small devices is relatively inexpensive, using them to cover vast areas, such as the surfaces of buildings and walls, would be cost-prohibitive. Researchers have been trying to fix these problems for a long time. But the few flexible rectennas reported so far operate at low frequencies and can’t capture and convert signals in gigahertz frequencies, where most of the relevant cell phone and Wi-Fi signals are.

    To build their rectifier, the researchers used a novel 2-D material called molybdenum disulfide (MoS2), which at three atoms thick is one of the thinnest semiconductors in the world. In doing so, the team leveraged a singular behavior of MoS2: When exposed to certain chemicals, the material’s atoms rearrange in a way that acts like a switch, forcing a phase transition from a semiconductor to a metallic material. The resulting structure is known as a Schottky diode, which is the junction of a semiconductor with a metal.

    “By engineering MoS2 into a 2-D semiconducting-metallic phase junction, we built an atomically thin, ultrafast Schottky diode that simultaneously minimizes the series resistance and parasitic capacitance,” says first author and EECS postdoc Xu Zhang, who will soon join Carnegie Mellon University as an assistant professor.

    Parasitic capacitance is an unavoidable effect in electronics: materials in a circuit store a small amount of electrical charge, which slows the circuit down. Lower capacitance therefore means faster rectification and higher operating frequencies. The parasitic capacitance of the researchers’ Schottky diode is an order of magnitude smaller than that of today’s state-of-the-art flexible rectifiers, so the device converts signals much faster and can capture and convert wireless signals at frequencies up to 10 gigahertz.
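
    The capacitance argument can be made concrete with the usual first-order relation for a diode rectifier’s cutoff frequency, f_c = 1 / (2π · R_s · C_p), where R_s is the series resistance and C_p the parasitic capacitance. The sketch below uses made-up component values purely to show the scaling; they are not numbers from the Nature paper.

import math

def cutoff_frequency_hz(series_resistance_ohm, parasitic_capacitance_farad):
    # First-order diode cutoff estimate: f_c = 1 / (2 * pi * R_s * C_p)
    return 1.0 / (2 * math.pi * series_resistance_ohm * parasitic_capacitance_farad)

r_s = 300.0                      # illustrative series resistance, ohms (assumption)
c_conventional = 1e-12           # illustrative 1 pF parasitic capacitance (assumption)
c_reduced = c_conventional / 10  # an order of magnitude smaller, as reported

print(cutoff_frequency_hz(r_s, c_conventional) / 1e9)  # ~0.53 GHz
print(cutoff_frequency_hz(r_s, c_reduced) / 1e9)       # ~5.3 GHz, into the Wi-Fi bands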

    “Such a design has allowed a fully flexible device that is fast enough to cover most of the radio-frequency bands used by our daily electronics, including Wi-Fi, Bluetooth, cellular LTE, and many others,” Zhang says.

    The reported work provides blueprints for other flexible Wi-Fi-to-electricity devices with substantial output and efficiency. The maximum output efficiency for the current device stands at 40 percent, depending on the input power of the Wi-Fi signal. At the typical Wi-Fi power level, the power efficiency of the MoS2 rectifier is about 30 percent. For reference, today’s rectennas made from rigid, more expensive silicon or gallium arsenide achieve around 50 to 60 percent.
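
    Those figures are easy to cross-check: the roughly 40 microwatts of DC output quoted earlier, divided by the roughly 150 microwatts of incident Wi-Fi power, is consistent with the ~30 percent efficiency stated here.

wifi_in_uw = 150.0   # typical ambient Wi-Fi power level cited above, in microwatts
dc_out_uw = 40.0     # reported DC output at that input level, in microwatts
print(dc_out_uw / wifi_in_uw)  # ~0.27, i.e. roughly 30 percent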

    There are 15 other paper co-authors from MIT, Technical University of Madrid, the Army Research Laboratory, Charles III University of Madrid, Boston University, and the University of Southern California.

    The team is now planning to build more complex systems and improve efficiency. The work was made possible, in part, by a collaboration with the Technical University of Madrid through the MIT International Science and Technology Initiatives (MISTI). It was also partially supported by the Institute for Soldier Nanotechnologies, the Army Research Laboratory, the National Science Foundation’s Center for Integrated Quantum Materials, and the Air Force Office of Scientific Research.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 7:24 am on March 4, 2018 Permalink | Reply
    Tags: Barbara Engelhardt, Computer Science, GTEx-Genotype-Tissue Expression Consortium

    From Quanta Magazine: “A Statistical Search for Genomic Truths” 

    Quanta Magazine

    February 27, 2018
    Jordana Cepelewicz

    Barbara Engelhardt, a Princeton University computer scientist, wants to strengthen the foundation of biological knowledge in machine-learning approaches to genomic analysis. Sarah Blesener for Quanta Magazine.

    “We don’t have much ground truth in biology.” According to Barbara Engelhardt, a computer scientist at Princeton University, that’s just one of the many challenges that researchers face when trying to prime traditional machine-learning methods to analyze genomic data. Techniques in artificial intelligence and machine learning are dramatically altering the landscape of biological research, but Engelhardt doesn’t think those “black box” approaches are enough to provide the insights necessary for understanding, diagnosing and treating disease. Instead, she’s been developing new statistical tools that search for expected biological patterns to map out the genome’s real but elusive “ground truth.”

    Engelhardt likens the effort to detective work, as it involves combing through constellations of genetic variation, and even discarded data, for hidden gems. In research published last October [Nature], for example, she used one of her models to determine how mutations relate to the regulation of genes on other chromosomes (referred to as distal genes) in 44 human tissues. Among other findings, the results pointed to a potential genetic target for thyroid cancer therapies. Her work has similarly linked mutations and gene expression to specific features found in pathology images.

    The applications of Engelhardt’s research extend beyond genomic studies. She built a different kind of machine-learning model, for instance, that makes recommendations to doctors about when to remove their patients from a ventilator and allow them to breathe on their own.

    She hopes her statistical approaches will help clinicians catch certain conditions early, unpack their underlying mechanisms, and treat their causes rather than their symptoms. “We’re talking about solving diseases,” she said.

    To this end, she works as a principal investigator with the Genotype-Tissue Expression (GTEx) Consortium, an international research collaboration studying how gene regulation, expression and variation contribute to both healthy phenotypes and disease.


    Right now, she’s particularly interested in working on neuropsychiatric and neurodegenerative diseases, which are difficult to diagnose and treat.

    Quanta Magazine recently spoke with Engelhardt about the shortcomings of black-box machine learning when applied to biological data, the methods she’s developed to address those shortcomings, and the need to sift through “noise” in the data to uncover interesting information. The interview has been condensed and edited for clarity.

    What motivated you to focus your machine-learning work on questions in biology?

    I’ve always been excited about statistics and machine learning. In graduate school, my adviser, Michael Jordan [at the University of California, Berkeley], said something to the effect of: “You can’t just develop these methods in a vacuum. You need to think about some motivating applications.” I very quickly turned to biology, and ever since, most of the questions that drive my research are not statistical, but rather biological: understanding the genetics and underlying mechanisms of disease, hopefully leading to better diagnostics and therapeutics. But when I think about the field I am in — what papers I read, conferences I attend, classes I teach and students I mentor — my academic focus is on machine learning and applied statistics.

    We’ve been finding many associations between genomic markers and disease risk, but except in a few cases, those associations are not predictive and have not allowed us to understand how to diagnose, target and treat diseases. A genetic marker associated with disease risk is often not the true causal marker of the disease — one disease can have many possible genetic causes, and a complex disease might be caused by many, many genetic markers possibly interacting with the environment. These are all challenges that someone with a background in statistical genetics and machine learning, working together with wet-lab scientists and medical doctors, can begin to address and solve. Which would mean we could actually treat genetic diseases — their causes, not just their symptoms.

    You’ve spoken before about how traditional statistical approaches won’t suffice for applications in genomics and health care. Why not?

    First, because of a lack of interpretability. In machine learning, we often use “black-box” methods — [classification algorithms called] random forests, or deeper learning approaches. But those don’t really allow us to “open” the box, to understand which genes are differentially regulated in particular cell types or which mutations lead to a higher risk of a disease. I’m interested in understanding what’s going on biologically. I can’t just have something that gives an answer without explaining why.

    The goal of these methods is often prediction, but given a person’s genotype, it is not particularly useful to estimate the probability that they’ll get Type 2 diabetes. I want to know how they’re going to get Type 2 diabetes: which mutation causes the dysregulation of which gene to lead to the development of the condition. Prediction is not sufficient for the questions I’m asking.

    A second reason has to do with sample size. Most of the driving applications of statistics assume that you’re working with a large and growing number of data samples — say, the number of Netflix users or emails coming into your inbox — with a limited number of features or observations that have interesting structure. But when it comes to biomedical data, we don’t have that at all. Instead, we have a limited number of patients in the hospital, a limited number of genotypes we can sequence — but a gigantic set of features or observations for any one person, including all the mutations in their genome. Consequently, many theoretical and applied approaches from statistics can’t be used for genomic data.

    What makes the genomic data so challenging to analyze?

    The most important signals in biomedical data are often incredibly small and completely swamped by technical noise. It’s not just about how you model the real, biological signal — the questions you’re trying to ask about the data — but also how you model that in the presence of this incredibly heavy-handed noise that’s driven by things you don’t care about, like which population the individuals came from or which technician ran the samples in the lab. You have to get rid of that noise carefully. And we often have a lot of questions that we would like to answer using the data, and we need to run an incredibly large number of statistical tests — literally trillions — to figure out the answers. For example, to identify an association between a mutation in a genome and some trait of interest, where that trait might be the expression levels of a specific gene in a tissue. So how can we develop rigorous, robust testing mechanisms where the signals are really, really small and sometimes very hard to distinguish from noise? How do we correct for all this structure and noise that we know is going to exist?
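
    To see why the bar is so high, a Bonferroni-style correction — the simplest possible stand-in for the more sophisticated permutation- and FDR-based procedures actually used in this kind of analysis — already pushes the per-test significance threshold to a vanishingly small number when trillions of tests are run:

n_tests = 3_000_000_000_000   # on the order of the trillions of tests described above
family_wise_alpha = 0.05      # conventional overall false-positive budget
per_test_threshold = family_wise_alpha / n_tests  # Bonferroni-style correction (stand-in only)
print(per_test_threshold)     # ~1.7e-14: only extremely strong signals can clear it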

    So what approach do we need to take instead?

    My group relies heavily on what we call sparse latent factor models, which can sound quite mathematically complicated. The fundamental idea is that these models partition all the variation we observed in the samples, with respect to only a very small number of features. One of these partitions might include 10 genes, for example, or 20 mutations. And then as a scientist, I can look at those 10 genes and figure out what they have in common, determine what this given partition represents in terms of a biological signal that affects sample variance.

    So I think of it as a two-step process: First, build a model that separates all the sources of variation as carefully as possible. Then go in as a scientist to understand what all those partitions represent in terms of a biological signal. After this, we can validate those conclusions in other data sets and think about what else we know about these samples (for instance, whether everyone of the same age is included in one of these partitions).
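
    As a rough sketch of that two-step workflow, the snippet below uses scikit-learn’s SparsePCA as a generic stand-in for the structured sparse factor models Engelhardt describes (her group’s models encode far more biological structure), applied to a random placeholder expression matrix:

import numpy as np
from sklearn.decomposition import SparsePCA

# Step 1: partition the observed variation into a small number of sparse factors.
rng = np.random.default_rng(0)
expression = rng.normal(size=(100, 500))    # placeholder: samples x genes

model = SparsePCA(n_components=10, random_state=0)
scores = model.fit_transform(expression)    # per-sample factor activations

# Step 2: inspect which genes load on each factor and interpret them biologically
# (e.g., do the top-loading genes share a pathway, a cell type, or a batch effect?).
for k, loadings in enumerate(model.components_):
    top_genes = np.argsort(np.abs(loadings))[::-1][:10]
    print(f"factor {k}: top-loading gene indices {top_genes}")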

    When you say “go in as a scientist,” what do you mean?

    I’m trying to find particular biological patterns, so I build these models with a lot of structure and include a lot about what kinds of signals I’m expecting. I establish a scaffold, a set of parameters that will tell me what the data say, and what patterns may or may not be there. The model itself has only a certain amount of expressivity, so I’ll only be able to find certain types of patterns. From what I’ve seen, existing general models don’t do a great job of finding signals we can interpret biologically: They often just determine the biggest influencers of variance in the data, as opposed to the most biologically impactful sources of variance. The scaffold I build instead represents a very structured, very complex family of possible patterns to describe the data. The data then fill in that scaffold to tell me which parts of that structure are represented and which are not.

    So instead of using general models, my group and I carefully look at the data, try to understand what’s going on from the biological perspective, and tailor our models based on what types of patterns we see.

    How does the latent factor model work in practice?

    We applied one of these latent factor models to pathology images [pictures of tissue slices under a microscope], which are often used to diagnose cancer. For every image, we also had data about the set of genes expressed in those tissues. We wanted to see how the images and the corresponding gene expression levels were coordinated.

    We developed a set of features describing each of the images, using a deep-learning method to identify not just pixel-level values but also patterns in the image. We pulled out over a thousand features from each image, give or take, and then applied a latent factor model and found some pretty exciting things.

    For example, we found sets of genes and features in one of these partitions that described the presence of immune cells in the brain. You don’t necessarily see these cells on the pathology images, but when we looked at our model, we saw a component there that represented only genes and features associated with immune cells, not brain cells. As far as I know, no one’s seen this kind of signal before. But it becomes incredibly clear when we look at these latent factor components.


    Video: Barbara Engelhardt, a computer scientist at Princeton University, explains why traditional machine-learning techniques have often fallen short for genomic analysis, and how researchers are overcoming that challenge. Sarah Blesener for Quanta Magazine

    You’ve worked with dozens of human tissue types to unpack how specific genetic variations help shape complex traits. What insights have your methods provided?

    We had 44 tissues, donated from 449 human cadavers, and their genotypes (sequences of their whole genomes). We wanted to understand more about the differences in how those genotypes expressed their genes in all those tissues, so we did more than 3 trillion tests, one by one, comparing every mutation in the genome with every gene expressed in each tissue. (Running that many tests on the computing clusters we’re using now takes about two weeks; when we move this iteration of GTEx to the cloud as planned, we expect it to take around two hours.) We were trying to figure out whether the [mutant] genotype was driving distal gene expression. In other words, we were looking for mutations that weren’t located on the same chromosome as the genes they were regulating. We didn’t find very much: a little over 600 of these distal associations. Their signals were very low.
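
    Each of those trillions of tests is, at its core, a single association test between one variant’s genotype and one gene’s expression in one tissue. Here is a minimal sketch with fabricated toy data — not GTEx data, and far simpler than the consortium’s actual pipeline:

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
genotype = rng.integers(0, 3, size=449)             # 0/1/2 copies of a variant allele, one per donor
expression = 0.1 * genotype + rng.normal(size=449)  # toy expression of one gene in one tissue

# One test out of the trillions described above: regress expression on genotype.
result = stats.linregress(genotype, expression)
print(result.slope, result.pvalue)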

    But one of the signals was strong: an exciting thyroid association, in which a mutation appeared to distally regulate two different genes. We asked ourselves: How is this mutation affecting expression levels in a completely different part of the genome? In collaboration with Alexis Battle’s lab at Johns Hopkins University, we looked near the mutation on the genome and found a gene called FOXE1, for a transcription factor that regulates the transcription of genes all over the genome. The FOXE1 gene is only expressed in thyroid tissues, which was interesting. But we saw no association between the mutant genotype and the expression levels of FOXE1. So we had to look at the components of the original signal we’d removed before — everything that had appeared to be a technical artifact — to see if we could detect the effects of the FOXE1 protein broadly on the genome.

    We found a huge impact of FOXE1 in the technical artifacts we’d removed. FOXE1, it seems, regulates a large number of genes only in the thyroid. Its variation is driven by the mutant genotype we found. And that genotype is also associated with thyroid cancer risk. We went back to the thyroid cancer samples — we had about 500 from the Cancer Genome Atlas — and replicated the distal association signal. These things tell a compelling story, but we wouldn’t have learned it unless we had tried to understand the signal that we’d removed.

    What are the implications of such an association?

    Now we have a particular mechanism for the development of thyroid cancer and the dysregulation of thyroid cells. If FOXE1 is a druggable target — if we can go back and think about designing drugs to enhance or suppress the expression of FOXE1 — then we can hope to prevent people at high thyroid cancer risk from getting it, or to treat people with thyroid cancer more effectively.

    The signal from broad-effect transcription factors like FOXE1 actually looks a lot like the effects we typically remove as part of the noise: population structure, or the batches the samples were run in, or the effects of age or sex. A lot of those technical influences are going to affect approximately similar numbers of genes — around 10 percent — in a similar way. That’s why we usually remove signals that have that pattern. In this case, though, we had to understand the domain we were working in. As scientists, we looked through all the signals we’d gotten rid of, and this allowed us to find the effects of FOXE1 showing up so strongly in there. It involved manual labor and insights from a biological background, but we’re thinking about how to develop methods to do it in a more automated way.

    So with traditional modeling techniques, we’re missing a lot of real biological effects because they look too similar to noise?

    Yes. There are a ton of cases in which the interesting pattern and the noise look similar. Take these distal effects: Pretty much all of them, if they are broad effects, are going to look like the noise signal we systematically get rid of. It’s methodologically challenging. We have to think carefully about how to characterize when a signal is biologically relevant or just noise, and how to distinguish the two. My group is working fairly aggressively on figuring that out.

    Why are those relationships so difficult to map, and why look for them?

    There are so many tests we have to do; the threshold for the statistical significance of a discovery has to be really, really high. That creates problems for finding these signals, which are often incredibly small; if our threshold is that high, we’re going to miss a lot of them. And biologically, it’s not clear that there are many of these really broad-effect distal signals. You can imagine that natural selection would eliminate the kinds of mutations that affect 10 percent of genes — that we wouldn’t want that kind of variability in the population for so many genes.

    But I think there’s no doubt that these distal associations play an enormous role in disease, and that they may be considered as druggable targets. Understanding their role broadly is incredibly important for human health.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 12:39 pm on February 14, 2018 Permalink | Reply
    Tags: Carnegie Mellon University - College of Engineering, Computer Science

    From Carnegie Mellon University: “Moore’s Law is ending. What’s next?” 

    Carnegie Mellon University – College of Engineering

    2.14.18

    Adam Dove
    amdove@andrew.cmu.edu.

    The speed of our technology doubles every year, right? Not anymore.

    ExtremeTech

    We’ve come to take for granted that as the years go on, computing technology gets faster, cheaper and more energy-efficient.

    In their recent paper “Science and research policy at the end of Moore’s law,” published in Nature Electronics, however, Carnegie Mellon University researchers Hassan Khan, David Hounshell, and Erica Fuchs argue that future advances in microprocessors face new and unprecedented challenges.

    In 1965, Gordon Moore, then R&D director at Fairchild (and later an Intel co-founder), predicted continued systemic declines in cost and increases in performance of integrated circuits in his paper “Cramming more components onto integrated circuits.” This trend, later dubbed Moore’s Law, held for over 40 years, making possible a “dazzling array” of new products, from intercontinental ballistic missiles to global environmental monitoring systems and from smartphones to medical implants.

    “Half of economic growth in the U.S. and world-wide has also been attributed to this trend and the innovations it enabled throughout the economy,” says Engineering and Public Policy Professor Erica Fuchs.

    In the seven decades following the invention of the transistor at Bell Labs, warnings about impending limits to miniaturization and the corresponding slowdown of Moore’s Law have come regularly from industry observers and academic researchers. Despite these warnings, semiconductor technology continually progressed along the Moore’s Law trajectory. Khan, Hounshell, and Fuchs’ archival work and oral histories, however, make clear that times are changing.

    “The current technological and structural challenges facing the industry are unprecedented and undermine the incentives for continued collective action in research and development,” the authors state in the paper, “which has underpinned the last 50 years of transformational worldwide economic growth and social advance.”

    As the authors explain in their paper, progress in semiconductor technology is undergoing a seismic shift driven by changes in the underlying technology and product-end markets. Achieving continued performance improvements through transistor miniaturization has grown increasingly expensive and the emergence of new end-markets has driven innovation in more specialized domains. As such, in recent years there has been a splintering of technology trajectories, such that the entire industry moving in lock-step to Moore’s Law is no longer of economic benefit to all firms. Examples in the paper include search companies (such as Microsoft Bing) using field-programmable gate-arrays in data centers as accelerators in conjunction with CPUs, and Google’s announcement of proprietary ‘tensor-processing unit’ chips developed in-house for its deep-learning activities.

    “While these innovations will drive many domain-specific advances, to continue advancing general purpose computing capabilities at reduced cost with economy-wide benefits will likely require entirely new semiconductor process and device technology,” explains Engineering and Public Policy graduate Hassan Khan. “The underlying science for this technology is as of yet unknown, and will require significant research funds – an order of magnitude more than is being invested today.”

    The authors conclude by arguing that the lack of private incentives creates a case for greatly increased public funding and for leadership beyond traditional stakeholders. They suggest that funding of roughly $600 million per year is needed, with 90% of it coming from public research dollars and the rest most likely from defense agencies.

    In terms of allocating those funds, they argue for pursuing two avenues in parallel: a research institute, with academics and government program managers as key players, that also engages industry across the computing technology stack; and a semi-coordinated funding effort focused on the next generation of transistor technology across all government agencies, similar to the National Nanotechnology Initiative, in which key program managers met weekly to discuss initiatives and share insights.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    The College of Engineering is well-known for working on problems of both scientific and practical importance. Our acclaimed faculty focus on transformative results that will drive the intellectual and economic vitality of our community, nation and world. Our “maker” culture is ingrained in all that we do, leading to novel approaches and unprecedented results.

    Carnegie Mellon Campus

    Carnegie Mellon University (CMU) is a global research university with more than 12,000 students, 95,000 alumni, and 5,000 faculty and staff.
    CMU has been a birthplace of innovation since its founding in 1900.
    Today, we are a global leader bringing groundbreaking ideas to market and creating successful startup businesses.
    Our award-winning faculty members are renowned for working closely with students to solve major scientific, technological and societal challenges. We put a strong emphasis on creating things—from art to robots. Our students are recruited by some of the world’s most innovative companies.
    We have campuses in Pittsburgh, Qatar and Silicon Valley, and degree-granting programs around the world, including Africa, Asia, Australia, Europe and Latin America.

     
  • richardmitnick 8:07 am on January 18, 2018 Permalink | Reply
    Tags: Computer Science, Raspberry Pi

    From JHU HUB: “Raspberry Pi DIY workshop teaches students how to use customizable, versatile minicomputers” 

    Johns Hopkins
    JHU HUB

    1.17.18
    Saralyn Cruickshank

    Second-year student Julia Costacurta wants to use Raspberry Pi, a customizable minicomputer, to program a plant-watering device. Image credit: Saralyn Cruickshank.

    Raspberry Pi, a single-board computer the size of a credit card, has revolutionized the tech industry since its debut in 2012.

    What would you build if you could design any kind of tech device for $35? Would you bypass the costs of major cellphone providers and build your own smartphone? Would you build a bartending robot to mix your favorite cocktails perfectly each time? Would you relive your youth and create a retro video game console that cuts out the need for those pesky, dust-prone game cartridges?

    This month, 15 Johns Hopkins students are learning the ins and outs of a versatile device capable of these and infinitely more functions: the Raspberry Pi, a small, green single-board computer capable of fitting inside an Altoids tin.

    Raspberry Pi is a revolution in computer hardware, enabling users to build their own computers and smart devices. Developed in the United Kingdom, Raspberry Pi costs as little as $35 and has become a worldwide phenomenon, with global sales surpassing 15 million units last year. These bare bones, accessory-compatible minicomputers have become especially popular in fields relating to home automation, data visualization, robotics, and Internet-capable devices.

    The Raspberry Pi DIY Intersession course, offered by the Department of Electrical and Computer Engineering in the Whiting School of Engineering, provides students with not only the Pis but also a fleet of peripheral devices such as LED light panels, Google-compatible voice kits, and a series of sensors and circuits enabling them to build their own devices. Students are guided through setup and programming before being unleashed to pursue their own Pi-powered projects.

    Instructor Bryan Bosworth, a postdoctoral fellow in Electrical and Computer Engineering, says these minicomputers and their capabilities represent autonomy and freedom.

    These single-board computers are compatible with a variety of computer accessories including keyboards and monitors. Image credit: Saralyn Cruickshank.

    “You don’t have to take what The Man gives you,” he says half-seriously to the class in their second meeting. “When you master this kind of computer engineering, you won’t be beholden to anyone—you can get these devices to do what you want.”

    He speaks from experience—he’s owned more than a dozen Raspberry Pis since the tool’s release in 2012. On display in the class computer lab is a Pi-powered LED panel he built himself: A cross between a Doppler radar weather map and a Lite Brite, the device displays a minute-by-minute heat map of crime in Baltimore City, with pinpoints of light blinking when a new crime is reported, then slowly fading away.

    He also uses the minicomputer to automate some of his creature comforts at home. A Raspberry Pi with voice command serves as his enhanced and personalized virtual assistant to stream videos and music, and he recently paired Pi technology with LED light panels arranged in a cube to look like a Halloween pumpkin that intermittently blinks and yawns. He has aspirations to reprogram it to appear like a Rubik’s cube and wants to add a motion sensor to the rig so that users can command each twist of the Rubik’s cube with a wave of their hand.

    “I view computing devices and the software that runs on them as an essential freedom,” Bosworth says. “You have to be able to know what’s going on under the hood, especially when more and more of modern-day life relies on these things. You have to be able to poke at it and make changes to suit you.”

    For many of the students in the class, this is their first time working with computer hardware.

    “I’ve known how to code for a while, but I’ve never done hardware,” says first-year student Vivek Gopalakrishnan, a biomedical engineering and electrical engineering double-major. “I thought learning how to use a Raspberry Pi would be a good idea for learning how to build my own devices.”

    Others want to build on skills they already have. Second-year biomedical engineering major Julia Costacurta says she used a similar open source computer hardware device called Arduino for a prosthetics project last summer.

    “I figured this would be a really good way to get more hands-on experience with computer hardware,” she says. “I have a lot of plants in my apartment, so I want to build a Raspberry Pi that’s able to sense when they need to be watered and remind me.”
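
    A project like the plant-watering monitor Costacurta describes can start from a few lines of Python using the RPi.GPIO library. The pin number and the sensor’s dry-equals-high behavior below are assumptions for illustration, not details of her build:

import time
import RPi.GPIO as GPIO

MOISTURE_PIN = 17  # assumed BCM pin wired to a digital soil-moisture sensor

GPIO.setmode(GPIO.BCM)
GPIO.setup(MOISTURE_PIN, GPIO.IN)

try:
    while True:
        # Many inexpensive sensors drive the pin high when the soil is dry.
        if GPIO.input(MOISTURE_PIN):
            print("Soil looks dry - time to water the plant.")
        time.sleep(600)  # check every ten minutes
finally:
    GPIO.cleanup()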

    First-year computer engineering major Anderson Adon has worked with Raspberry Pi before but says the workshop setting will give him a more formal introduction to the minicomputer’s capabilities. He too already has a plan for his final project: He wants to pair his Raspberry Pi with a digital thermometer in his residence hall and program it to automatically tweet at the university whenever the temperature in his room dips below a certain point.

    By the end of the first class meeting, he’s set up his Raspberry Pi—in less time than it took him to set up his Twitter account.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    About the Hub

    We’ve been doing some thinking — quite a bit, actually — about all the things that go on at Johns Hopkins. Discovering the glue that holds the universe together, for example. Or unraveling the mysteries of Alzheimer’s disease. Or studying butterflies in flight to fine-tune the construction of aerial surveillance robots. Heady stuff, and a lot of it.

    In fact, Johns Hopkins does so much, in so many places, that it’s hard to wrap your brain around it all. It’s too big, too disparate, too far-flung.

    We created the Hub to be the news center for all this diverse, decentralized activity, a place where you can see what’s new, what’s important, what Johns Hopkins is up to that’s worth sharing. It’s where smart people (like you) can learn about all the smart stuff going on here.

    At the Hub, you might read about cutting-edge cancer research or deep-trench diving vehicles or bionic arms. About the psychology of hoarders or the delicate work of restoring ancient manuscripts or the mad motor-skills brilliance of a guy who can solve a Rubik’s Cube in under eight seconds.

    There’s no telling what you’ll find here because there’s no way of knowing what Johns Hopkins will do next. But when it happens, this is where you’ll find it.

    Johns Hopkins Campus

    The Johns Hopkins University opened in 1876, with the inauguration of its first president, Daniel Coit Gilman. “What are we aiming at?” Gilman asked in his installation address. “The encouragement of research … and the advancement of individual scholars, who by their excellence will advance the sciences they pursue, and the society where they dwell.”

    The mission laid out by Gilman remains the university’s mission today, summed up in a simple but powerful restatement of Gilman’s own words: “Knowledge for the world.”

    What Gilman created was a research university, dedicated to advancing both students’ knowledge and the state of human knowledge through research and scholarship. Gilman believed that teaching and research are interdependent, that success in one depends on success in the other. A modern university, he believed, must do both well. The realization of Gilman’s philosophy at Johns Hopkins, and at other institutions that later attracted Johns Hopkins-trained scholars, revolutionized higher education in America, leading to the research university system as it exists today.

     
  • richardmitnick 10:04 am on January 11, 2018 Permalink | Reply
    Tags: Blue Brain Nexus, Computer Science

    From EPFL: “Blue Brain Nexus: an open-source tool for data-driven science” 


    École Polytechnique Fédérale de Lausanne

    11.01.18
    BBP communications

    © iStockphotos

    Knowledge sharing is an important driving force behind scientific progress. In an open-science approach, EPFL’s Blue Brain Project has created and open-sourced Blue Brain Nexus, a tool for building data integration platforms. Blue Brain Nexus enables data-driven science by searching, integrating and tracking large-scale data and models.

    EPFL’s Blue Brain Project today announces the release of its open source software project ‘Blue Brain Nexus’, designed to enable the FAIR (Findable, Accessible, Interoperable, and Reusable) data management principles for the Neuroscience and broader scientific community. It is part of EPFL’s open-science initiative, which seeks to maximize the reach and impact of research conducted at the school.

    The aim of the Blue Brain Project is to build accurate, biologically detailed, digital reconstructions and simulations of the rodent brain and, ultimately, the human brain. Blue Brain Nexus is instrumental in supporting all stages of Blue Brain’s data-driven modelling cycle including, but not limited to, experimental data, single cell models, circuits, simulations and validations. The brain is a complex multi-level system and is one of the biggest ‘Big Data’ problems we have today. Therefore, Blue Brain Nexus has been built to organize, store and process exceptionally large volumes of data and support usage by a broad number of users.

    At the heart of Blue Brain Nexus is the Knowledge Graph, which acts as a data repository and metadata catalogue. It also remains agnostic of the domain to be represented by allowing users to design arbitrary domains, which enables other scientific initiatives (e.g. astronomy, medical research and agriculture) to reuse Blue Brain Nexus as the core of their data platforms. Blue Brain Nexus services are already being evaluated for integration into the Human Brain Project’s Neuroinformatics Platform.

    2
    Specific to enabling scientific progress, Blue Brain Nexus’s Knowledge Graph treats provenance as a first-class citizen, thus facilitating the tracking of the origin of data as well as how it is being used. This allows users to assess the quality of data and, consequently, to build trust in it. Another key feature of Blue Brain Nexus is its semantic search capability, whereby search is integrated over data and its provenance to enable scientists to easily discover and access new relevant data.
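
    To make the provenance idea concrete, the record below shows the kind of metadata a knowledge graph can attach to a dataset, loosely modeled on W3C PROV terms. The field names and values are purely hypothetical and are not the actual Blue Brain Nexus schema:

import json

# Hypothetical provenance record (illustrative only; not the Blue Brain Nexus schema).
record = {
    "name": "layer5-pyramidal-cell-reconstruction-0042",
    "type": "NeuronMorphology",
    "wasDerivedFrom": "patch-clamp-experiment-0007",   # origin of the data
    "wasGeneratedBy": "reconstruction-pipeline-v2",    # activity that produced it
    "wasAttributedTo": "example-lab-member",           # responsible agent
    "license": "CC-BY-4.0",
}
print(json.dumps(record, indent=2))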

    EPFL Professor Sean Hill commented: “We see that nearly all sciences are becoming data-driven. Blue Brain Nexus represents the culmination of many years of research into building a state-of-the-art semantic data management platform. We can’t wait to see what the community will do with Blue Brain Nexus.”

    Blue Brain Nexus is available under the Apache 2 license, at https://github.com/BlueBrain/nexus

    For more information, please contact:

    EPFL Communications, emmanuel.barraud@epfl.ch, +41 21 693 21 90

    Blue Brain Project communications, kate.mullins@epfl.ch, +41 21 695 51 41

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    EPFL is Europe’s most cosmopolitan technical university. It receives students, professors and staff from over 120 nationalities. With both a Swiss and international calling, it is therefore guided by a constant wish to open up; its missions of teaching, research and partnership impact various circles: universities and engineering schools, developing and emerging countries, secondary schools and gymnasiums, industry and economy, political circles and the general public.

     
  • richardmitnick 5:22 pm on January 3, 2018 Permalink | Reply
    Tags: Computer Science, MORPHEUS, Unhackable computer

    From U Michigan: “Unhackable computer under development with $3.6M DARPA grant” 


    University of Michigan

    December 20, 2017
    Nicole Casal Moore
    ncmoore@umich.edu
    (734) 647-7087

    The researchers say they’re making an unsolvable puzzle: ‘It’s like if you’re solving a Rubik’s Cube and every time you blink, I rearrange it.’

    The MORPHEUS approach outlines a new way to design hardware so that information is rapidly and randomly moved and destroyed. The technology works to deny attackers the critical information they need to construct a successful attack. Photo: Getty Images

    By turning computer circuits into unsolvable puzzles, a University of Michigan team aims to create an unhackable computer with a new $3.6 million grant from the Defense Advanced Research Projects Agency.

    Todd Austin, U-M professor of computer science and engineering, leads the project, called MORPHEUS. Its cybersecurity approach is dramatically different from today’s, which relies on software—specifically software patches to vulnerabilities that have already been identified. It’s been called the “patch and pray” model, and it’s not ideal.

    This spring, DARPA announced a $50 million program in search of cybersecurity solutions that would be baked into hardware.

    “Instead of relying on software Band-Aids to hardware-based security issues, we are aiming to remove those hardware vulnerabilities in ways that will disarm a large proportion of today’s software attacks,” said Linton Salmon, manager of DARPA’s System Security Integrated Through Hardware and Firmware program.

    The U-M grant is one of nine that DARPA has recently funded through SSITH.

    MORPHEUS outlines a new way to design hardware so that information is rapidly and randomly moved and destroyed. The technology works to deny attackers the critical information they need to construct a successful attack. It could protect both hardware and software.

    “We are making the computer an unsolvable puzzle,” Austin said. “It’s like if you’re solving a Rubik’s Cube and every time you blink, I rearrange it.”

    In this way, MORPHEUS could protect against future threats that have yet to be identified, a dreaded vulnerability that the security industry called a “zero day exploit.”

    “What’s incredibly exciting about the project is that it will fix tomorrow’s vulnerabilities,” Austin said. “I’ve never known any security system that could be future proof.”

    Austin said his approach could have protected against the Heartbleed bug discovered in 2014. Heartbleed allowed attackers to read the passwords and other critical information on machines.

    “Typically, the location of this data never changes, so once attackers solve the puzzle of where the bug is and where to find the data, it’s ‘game over,’” Austin said.

    Under MORPHEUS, the location of the bug would constantly change and the location of the passwords would change, he said. And even if an attacker were quick enough to locate the data, secondary defenses in the form of encryption and domain enforcement would throw up additional roadblocks. The bug would still be there, but it wouldn’t matter. The attacker won’t have the time or the resources to exploit it.
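
    A toy software analogy of that “moving target” idea: if sensitive values are periodically relocated to fresh random addresses, any address an attacker managed to learn goes stale. This is purely conceptual Python and bears no resemblance to the actual MORPHEUS hardware design:

import secrets

class MovingTargetStore:
    """Toy moving-target store: each value's 'address' is reshuffled on every
    churn, so a previously leaked address becomes useless to an attacker."""

    def __init__(self):
        self._slots = {}   # random address -> value
        self._index = {}   # name -> current random address

    def put(self, name, value):
        address = secrets.token_hex(8)
        self._index[name] = address
        self._slots[address] = value

    def get(self, name):
        return self._slots[self._index[name]]

    def churn(self):
        # Move every value to a new random address and discard the old layout.
        current = {name: self.get(name) for name in self._index}
        self._slots.clear()
        self._index.clear()
        for name, value in current.items():
            self.put(name, value)

store = MovingTargetStore()
store.put("password", "hunter2")
leaked_address = store._index["password"]   # what an attacker might have learned
store.churn()
print(leaked_address in store._slots)        # False: the leaked address is now stale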

    “These protections don’t exist today because they are too expensive to implement in software, but with DARPA’s support we can take the offensive against attackers with new defenses in hardware and implement them with virtually no impact to software,” Austin said.

    More than 40 percent of the “software doors” that hackers have available to them today would be closed if researchers could eliminate seven classes of hardware weaknesses, according to DARPA. The hardware weakness classes have been identified by a crowd-source listing of security vulnerabilities called the Common Weakness Enumeration. The classes are: permissions and privileges, buffer errors, resource management, information leakage, numeric errors, crypto errors, and code injection.

    DARPA is aiming to render these attacks impossible within five years. If developed, MORPHEUS could do it now, Austin said.

    While the complexity required might sound expensive, Austin said he’s confident his team can make it possible at low cost.

    Also on the project team are: Valeria Bertacco, an Arthur F. Thurnau Professor and professor of computer science and engineering at U-M; Mohit Tiwari, an assistant professor of electrical and computer engineering at the University of Texas; and Sharad Malik, the George Van Ness Lothrop Professor of Engineering and a professor of electrical engineering at Princeton University.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    U Michigan Campus

    The University of Michigan (U-M, UM, UMich, or U of M), frequently referred to simply as Michigan, is a public research university located in Ann Arbor, Michigan, United States. Originally founded in 1817 in Detroit as the Catholepistemiad, or University of Michigania, 20 years before the Michigan Territory officially became a state, the University of Michigan is the state’s oldest university. The university moved to Ann Arbor in 1837 onto 40 acres (16 ha) of what is now known as Central Campus. Since its establishment in Ann Arbor, the university campus has expanded to include more than 584 major buildings with a combined area of more than 34 million gross square feet (781 acres or 3.16 km²), and has two satellite campuses located in Flint and Dearborn. The University was one of the founding members of the Association of American Universities.

    Considered one of the foremost research universities in the United States, the university has very high research activity and its comprehensive graduate program offers doctoral degrees in the humanities, social sciences, and STEM fields (Science, Technology, Engineering and Mathematics) as well as professional degrees in business, medicine, law, pharmacy, nursing, social work and dentistry. Michigan’s body of living alumni (as of 2012) comprises more than 500,000. Besides academic life, Michigan’s athletic teams compete in Division I of the NCAA and are collectively known as the Wolverines. They are members of the Big Ten Conference.

     
  • richardmitnick 11:42 am on December 18, 2017 Permalink | Reply
    Tags: , , Computer Science, Ming C. Lin, ,   

    From UMD: Women in STEM: “Ming Lin Named Chair of UMD Department of Computer Science” 

    U Maryland bloc

    University of Maryland

    Media Relations Contact:
    Abby Robinson
    301-405-5845
    abbyr@umd.edu

    Writer: Tom Ventsias

    1
    Ming C. Lin. Photo: John T. Consoli

    Ming C. Lin will lead the University of Maryland’s Department of Computer Science, effective January 1, 2018.

    A noted educator and expert in virtual reality, computer graphics and robotics, Lin will assume the role of Elizabeth Stevinson Iribe Chair of Computer Science with a joint appointment in the University of Maryland Institute for Advanced Computer Studies (UMIACS).

    As chair, she will oversee a department that has experienced significant increases in student enrollment; expanded its research in virtual and augmented reality, robotics, machine learning, cybersecurity and quantum information science; and grown its outreach efforts to fuel more corporate and philanthropic support. The department currently ranks 15th in the nation according to U.S. News & World Report.

    Lin comes to Maryland from the University of North Carolina at Chapel Hill, where she was the John R. and Louise S. Parker Distinguished Professor of Computer Science and a faculty member for 20 years.

    She arrives at an opportune time—the department’s faculty, staff and students will move in late 2018 to the Brendan Iribe Center for Computer Science and Innovation, a 215,000-square-foot facility that will offer unprecedented opportunities to explore and imagine bold new directions in computer science. The new building became a reality thanks to a $31 million gift from Brendan Iribe, a UMD alumnus and co-founder of the virtual reality company Oculus.

    “We are thrilled that Ming Lin will lead our efforts in advancing computer science at the University of Maryland,” said Gerald Wilkinson, interim dean of the UMD College of Computer, Mathematical, and Natural Sciences. “She brings a wealth of experience as a skilled educator and as a phenomenal researcher that will help her guide the department to further success.”

    Lin spent the past several months meeting with UMD students, faculty, staff, alumni and other stakeholders. She also attended outreach events focused on highlighting opportunities in computer science to prospective students, many of whom are women or members of other groups underrepresented in the field. The department’s continuing efforts to enhance diversity have resulted in the number of female undergraduates in the department tripling over the last five years, with more than 600 women currently pursuing undergraduate degrees.

    “Computing is no longer a specialty, it is part of everyone’s daily life,” Lin said. “So, for me to see such a considerable amount of academic talent and enthusiasm from everyone I’ve met at Maryland has been amazing. I am truly honored and privileged to be given the opportunity to work with this community.”

    The department’s undergraduate enrollment has increased rapidly. In fall 2013, the department had 1,386 undergraduate majors; that number swelled to 3,106 in fall 2017, making it one of the largest computer science programs in the country and the most popular major on campus.

    “One of my primary goals is to ensure that our students will be successful in their careers when they graduate,” Lin said. “They are going to be the leaders in a society where practically every aspect of daily life is enabled and impacted by computing. Giving them the knowledge and skills to excel in a technology-empowered world is a mission I take very seriously.”

    The department includes more than 50 tenured or tenure-track faculty members and 11 full-time professional track instructional faculty members. They are pursuing new methods of instruction that encourage innovation and entrepreneurship, with these activities expected to significantly ramp up once the Iribe Center opens.

    “I am encouraged to see that so many faculty members here are active in blended learning and advocate for more makerspaces and other student-initiated activities like the Bitcamp and Technica hackathons where students turn a spark of creativity into new technology,” Lin said.

    Along with the 16 labs and centers in UMIACS, the department brings in approximately $25 million in external funding each year. The more than 250 computer science graduate students work closely with faculty members, postdocs and others on cutting-edge research that often crosses academic disciplines. They explore topics that include cryptocurrency exchanges, deep learning for autonomous robotics, computational linguistics, microbial genome sequencing and more.

    Lin plans to bring some of her UNC research group to Maryland, continuing her research in virtual reality, computer graphics and robotics that focuses on multimodal interaction, physically based animations and simulations, as well as algorithmic robotics and their use in physical and virtual environments. Her research has extensive applications in medical simulations, cancer screening, urban computing, as well as supporting city-scale planning, human-centric computing, intelligent transportation and traffic management.

    “We’ve constantly been working on scientific problems where the solutions will have considerable social impact,” Lin said. “That’s important for me—I am hoping through research, teaching and advising that I can make some difference.”

    Lin earned her B.S., M.S. and Ph.D. in electrical engineering and computer sciences from the University of California, Berkeley. She received a National Science Foundation Faculty Early Career Development (CAREER) Award in 1995, and she is a fellow of the Association for Computing Machinery, IEEE and the Eurographics Association. She also serves on the board of directors of the Computing Research Association’s Committee on the Status of Women in Computing Research.

    She has authored or co-authored more than 250 refereed publications and has authored or co-edited four books. She is a former editor-in-chief of IEEE Transactions on Visualization and Computer Graphics (2011–2014) and has served on numerous steering committees and advisory boards of international conferences, as well as government and industrial technical advisory committees. Lin also co-founded the 3-D audio startup Impulsonic, which was recently acquired by Valve Software.

    Lin succeeds Larry Davis, who became interim chair of the department on July 1, 2017. Prior to Davis, Samir Khuller completed a five-year term as chair, serving as the inaugural Elizabeth Stevinson Iribe Chair of Computer Science. Elizabeth Iribe established the endowed chair in 2015 with a $1.5 million donation. Her donation received $1.1 million in matching funds from the state’s Maryland E-Nnovation Initiative Fund, which aims to spur private donations to universities for applied research in scientific and technical fields by matching such donations.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    U Maryland Campus

    Driven by the pursuit of excellence, the University of Maryland has enjoyed a remarkable rise in accomplishment and reputation over the past two decades. By any measure, Maryland is now one of the nation’s preeminent public research universities and on a path to become one of the world’s best. To fulfill this promise, we must capitalize on our momentum, fully exploit our competitive advantages, and pursue ambitious goals with great discipline and entrepreneurial spirit. This promise is within reach. This strategic plan is our working agenda.

    The plan is comprehensive, bold, and action oriented. It sets forth a vision of the University as an institution unmatched in its capacity to attract talent, address the most important issues of our time, and produce the leaders of tomorrow. The plan will guide the investment of our human and material resources as we strengthen our undergraduate and graduate programs and expand research, outreach and partnerships, become a truly international center, and enhance our surrounding community.

    Our success will benefit Maryland in the near and long term, strengthen the State’s competitive capacity in a challenging and changing environment and enrich the economic, social and cultural life of the region. We will be a catalyst for progress, the State’s most valuable asset, and an indispensable contributor to the nation’s well-being. Achieving the goals of Transforming Maryland requires broad-based and sustained support from our extended community. We ask our stakeholders to join with us to make the University an institution of world-class quality with world-wide reach and unparalleled impact as it serves the people and the state of Maryland.

     
  • richardmitnick 9:11 am on March 31, 2017 Permalink | Reply
    Tags: , , Computer Science, Wei Xu,   

    From BNL: Women in STEM “Visualizing Scientific Big Data in Informative and Interactive Ways” Wei Xu 

    Brookhaven Lab

    March 31, 2017
    Ariana Tantillo
    atantillo@bnl.gov

    Brookhaven Lab computer scientist Wei Xu develops visualization tools for analyzing large and varied datasets.

    1
    Wei Xu, a computer scientist who is part of Brookhaven Lab’s Computational Science Initiative, helps scientists analyze large and varied datasets by developing visualization tools, such as the color-mapping tool seen projected from her laptop onto the large screen.

    Humans are visual creatures: our brain processes images 60,000 times faster than text, and 90 percent of information sent to the brain is visual. Visualization is becoming increasingly useful in the era of big data, in which we are generating so much data at such high rates that we cannot keep up with making sense of it all. In particular, visual analytics—a research discipline that combines automated data analysis with interactive visualizations—has emerged as a promising approach to dealing with this information overload.

    “Visual analytics provides a bridge between advanced computational capabilities and human knowledge and judgment,” said Wei Xu, a computer scientist in the Computational Science Initiative (CSI) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory and a research assistant professor in the Department of Computer Science at Stony Brook University. “The interactive visual representations and interfaces enable users to efficiently explore and gain insights from massive datasets.”

    At Brookhaven, Xu has been leading the development of several visual analytics tools to facilitate the scientific decision-making and discovery process. She works closely with Brookhaven scientists, particularly those at the National Synchrotron Light Source II (NSLS-II) and the Center for Functional Nanomaterials (CFN)—both DOE Office of Science User Facilities.


    NSLS-II

    By talking to researchers early on, Xu learns about their data analysis challenges and requirements. She continues the conversation throughout the development process, demoing initial prototypes and making refinements based on their feedback. She also does her own research and proposes innovative visual analytics methods to the scientists.

    Recently, Xu has been collaborating with the Visual Analytics and Imaging (VAI) Lab at Stony Brook University—her alma mater, where she completed doctoral work in computed tomography with graphics processing unit (GPU)-accelerated computing.

    Though Xu continued work in these and related fields when she first joined Brookhaven Lab in 2013, she switched her focus to visualization by the end of 2015.

    “I realized how important visualization is to the big data era,” Xu said. “The visualization domain, especially information visualization, is flourishing, and I knew there would be lots of research directions to pursue because we are dealing with an unsolved problem: how can we most efficiently and effectively understand the data? That is a quite interesting problem not only in the scientific world but also in general.”

    It was at this time that Xu was awarded a grant for a visualization project proposal she submitted to DOE’s Laboratory Directed Research and Development program, which funds innovative and creative research in areas of importance to the nation’s energy security. At the same time, Klaus Mueller—Xu’s PhD advisor at Stony Brook and director of the VAI Lab—was seeking to extend his research to a broader domain. Xu thought it would be a great opportunity to collaborate: she would present the visualization problem that originated from scientific experiments and potential approaches to solve it, and, in turn, doctoral students in Mueller’s lab would work with her and their professor to come up with cutting-edge solutions.

    This Brookhaven-Stony Brook collaboration first led to the development of an automated method for mapping data involving multiple variables to color. Variables with a similar distribution of data points have similar colors. Users can manipulate the color maps, for example, enhancing the contrast to view the data in more detail. According to Xu, these maps would be helpful for any image dataset involving multiple variables.

    3
    The color-mapping tool was used to visualize a multivariable fluorescence dataset from the Hard X-ray Nanoprobe (HXN) beamline at Brookhaven’s National Synchrotron Light Source II. The color map (a) shows how the different variables—the chemical elements cerium (Ce), cobalt (Co), iron (Fe), and gadolinium (Gd)—are distributed in a sample of an electrolyte material used in solid oxide fuel cells. The fluorescence spectrum of the selected data point (the circle indicated by the overlaid white arrows) is shown by the colored bars, with their height representing the relative elemental ratios. The fluorescence image (b), pseudo-colored based on the color map in (a), represents a joint colorization of the individual images in (d), whose colors are based on the four points at the circle boundary (a) for each of the four elements. The arrow indicates where new chemical phases can exist—something hard to detect when observing the individual plots (d). Enhancing the color contrast—for example, of the rectangular region in (b)—enables a more detailed view, in this case providing better contrast between Fe (red) and Co (green) in image (c).

    “Different imaging modalities—such as fluorescence, differential phase contrasts, x-ray scattering, and tomography—would benefit from this technique, especially when integrating the results of these modalities,” she said. “Even subtle differences that are hard to identify in separate image displays, such as differences in elemental ratios, can be picked up with our tool—a capability essential for new scientific discovery.” Currently, Xu is trying to install the color mapping at NSLS-II beamlines, and advanced features will be added gradually.
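
    As a rough illustration of the joint colorization idea described above (not the actual NSLS-II tool), the sketch below builds synthetic per-pixel element maps, gives channels with similar spatial distributions neighboring hues by ordering them along the leading eigenvector of their correlation matrix, and blends the hues into one pseudo-color image. The element names and data are made up.

    # Rough sketch of one way to fuse several per-pixel element maps into a single
    # pseudo-color image, loosely in the spirit of the tool described above; the
    # real NSLS-II color-mapping method is more sophisticated. Data is synthetic.
    import colorsys
    import numpy as np

    rng = np.random.default_rng(0)
    h, w = 64, 64
    # Hypothetical element maps (e.g. Ce, Co, Fe, Gd fluorescence counts).
    channels = {name: rng.random((h, w)) for name in ["Ce", "Co", "Fe", "Gd"]}

    # 1. Give channels with similar spatial distributions similar hues: order them
    #    by their loading on the leading eigenvector of the channel correlation
    #    matrix, then spread hues along that order.
    names = list(channels)
    flat = np.stack([channels[n].ravel() for n in names])    # (n_channels, h*w)
    corr = np.corrcoef(flat)
    _, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvecs[:, -1])                       # leading component
    hues = {names[i]: k / len(names) for k, i in enumerate(order)}

    # 2. Blend: each pixel's color is the intensity-weighted mix of channel hues.
    total = sum(channels.values()) + 1e-9
    rgb = np.zeros((h, w, 3))
    for name in names:
        weight = channels[name] / total                      # relative abundance
        base = np.array(colorsys.hsv_to_rgb(hues[name], 1.0, 1.0))
        rgb += weight[..., None] * base

    print(rgb.shape, float(rgb.min()), float(rgb.max()))     # joint image in [0, 1]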

    In conjunction with CFN scientists, the team is also developing a multilevel display for exploring large image sets. When scientists scan a sample, they generate one scattering image at each point within the sample, known as the raw image level. They can zoom in on this image to check the individual pixel values (the pixel level). For each raw image, scientific analysis tools are used to generate a series of attributes that represent the analyzed properties of the sample (the attribute level), with a scatterplot showing a pseudo-color map of any user-chosen attribute from the series—for example, the sample’s temperature or density. In the past, scientists had to hop between multiple plots to view these different levels. The interactive display under development will enable scientists to see all of these levels in a single view, making it easier to identify how the raw data are related and to analyze data across the entire scanned sample. Users will be able to zoom in and out on different levels of interest, similar to how Google Maps works.

    4
    The multilevel display tool enables scientists conducting scattering experiments to explore the resulting image sets at the scatterplot level (0), attribute pseudo-color level (1), zoom-in attribute level (2), raw image level (3), zoom-in raw image level (4), and pixel level (5), all in a single display.
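
    The heart of such a display is deciding which level of detail to draw for the current zoom. A toy sketch of that switching logic, with invented level names and thresholds, might look like the following.

    # Toy sketch of level-of-detail switching for a zoomable multilevel display;
    # the level names and thresholds below are invented for illustration.
    LEVELS = [
        (1.0,   "scatterplot of all scan points"),
        (0.25,  "attribute pseudo-color map (e.g. temperature)"),
        (0.05,  "raw scattering image at the selected scan point"),
        (0.005, "individual pixel values"),
    ]

    def level_for_view(view_fraction):
        """Pick the most detailed level the current zoom has reached.

        view_fraction = visible window width / full scan width, so 1.0 means
        fully zoomed out and small values mean zoomed far in.
        """
        chosen = LEVELS[0][1]
        for threshold, description in LEVELS:
            if view_fraction <= threshold:
                chosen = description
        return chosen

    for frac in [1.0, 0.3, 0.1, 0.01, 0.001]:
        print(f"viewing {frac:6.3f} of the scan -> draw the {level_for_view(frac)}")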

    The ability to visually reconstruct a complete joint dataset from several partial marginal datasets is at the core of another visual analytics tool that Xu’s Stony Brook collaborators developed. This web-based tool enables users to reconstruct all possible solutions to a given problem and locate the subset of preferred solutions through interactive filtering.

    “Scientists commonly describe a single object with datasets from different sources—each covering only a portion of the complete properties of that object—for example, the same sample scanned in different beamlines,” explained Xu. “With this tool, scientists can recover a property with missing fields by refining its potential ranges and interactively acquiring feedback about whether the result makes sense.”
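
    A bare-bones sketch of that reconstruct-then-filter idea is shown below; the sample records, field names and candidate ranges are hypothetical, and the published tool is far more general.

    # Simplified sketch of reconstructing a joint dataset from partial "marginal"
    # views and filtering the candidate solutions; all records are hypothetical.
    from itertools import product

    beamline_a = [  # knows (sample, thickness_nm) but not density
        {"sample": "S1", "thickness_nm": 40},
        {"sample": "S2", "thickness_nm": 55},
    ]
    beamline_b = [  # knows (sample, density) but not thickness
        {"sample": "S1", "density": 2.1},
        {"sample": "S2", "density": 3.4},
    ]
    # A property never measured directly: candidate values to explore.
    porosity_candidates = [0.05, 0.10, 0.15, 0.20]

    def joint_solutions(accept=lambda rec: True):
        """Enumerate every consistent joint record that the user's filter accepts."""
        for a, b, porosity in product(beamline_a, beamline_b, porosity_candidates):
            if a["sample"] != b["sample"]:
                continue              # both marginals must describe the same object
            rec = {**a, **b, "porosity": porosity}
            if accept(rec):
                yield rec

    # "Interactive filtering": the user narrows the plausible porosity range.
    for rec in joint_solutions(lambda r: 0.08 <= r["porosity"] <= 0.16):
        print(rec)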

    Their research led to a paper that was published in the Institute of Electrical and Electronics Engineers (IEEE) journal Transactions on Visualization and Computer Graphics and awarded the Visual Analytics Science and Technology (VAST) Best Paper Honorable Mention at the 2016 IEEE VIS conference.

    At this same conference, another group of VAI Lab students whom Xu worked with were awarded the Scientific Visualization (SciVis) Best Poster Honorable Mention for their poster, “Extending Scatterplots to Scalar Fields.” Their plotting technique helps users link correlations between attributes and data points in a single view, with contour lines that show how the numerical values of the attributes change. For their case study, the students demonstrated how the technique could help college applicants select the right university by plotting the desired attributes (e.g., low tuition, high safety, small campus size) with different universities (e.g., University of Virginia, Stanford University, MIT). The closer a particular university is to an attribute, the higher that attribute’s value is for that university.

    5
    The scatter plots above are based on a dataset containing 46 universities with 14 attributes of interest for prospective students: academics, athletics, housing, location, nightlife, safety, transportation, weather, score, tuition, dining, PhD/faculty, population, and income. The large red nodes represent the attributes and the small blue points represent the universities; the contour lines (middle plot) show how the numerical values of the attributes change. This prospective student wants to attend a university with good academics (>9/10). Universities that meet this criterion are within the contour lines whose value exceeds 9. To determine which universities meet multiple criteria, students would see where the universities and attributes overlap (right plot).
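
    One simple way to realize the kind of layout the poster describes, sketched below with made-up universities, attributes and scores, is to place each attribute at a fixed anchor, pull each data point toward the anchors in proportion to its normalized scores, and interpolate a scalar field for one chosen attribute to draw contour lines; the poster’s actual method may differ in its details.

    # Hand-rolled sketch of an anchored scatterplot with a contoured scalar field,
    # in the spirit of "Extending Scatterplots to Scalar Fields"; the universities,
    # attributes and scores below are made up for illustration.
    import numpy as np
    import matplotlib.pyplot as plt

    attributes = ["academics", "affordability", "safety"]
    universities = {                       # scores in [0, 10], purely illustrative
        "Univ A": [9.5, 3.0, 7.0],
        "Univ B": [7.0, 8.5, 6.0],
        "Univ C": [8.0, 6.0, 9.0],
    }

    # Place the attribute anchors on a circle.
    angles = np.linspace(0, 2 * np.pi, len(attributes), endpoint=False)
    anchors = np.column_stack([np.cos(angles), np.sin(angles)])

    def embed(scores):
        w = np.asarray(scores, dtype=float)
        return (w / w.sum()) @ anchors     # pulled toward anchors by relative score

    points = {name: embed(s) for name, s in universities.items()}

    # Scalar field for one attribute via inverse-distance weighting of the points.
    attr_idx = attributes.index("academics")
    gx, gy = np.meshgrid(np.linspace(-1.2, 1.2, 200), np.linspace(-1.2, 1.2, 200))
    field = np.zeros_like(gx)
    weights = np.zeros_like(gx)
    for name, (px, py) in points.items():
        inv_d2 = 1.0 / ((gx - px) ** 2 + (gy - py) ** 2 + 1e-6)
        field += universities[name][attr_idx] * inv_d2
        weights += inv_d2
    field /= weights

    fig, ax = plt.subplots()
    cs = ax.contour(gx, gy, field, levels=8)   # contour lines of the attribute value
    ax.clabel(cs, inline=True, fontsize=7)
    ax.scatter(anchors[:, 0], anchors[:, 1], c="red", s=80)
    for (x, y), label in zip(anchors, attributes):
        ax.annotate(label, (x, y))
    for name, (px, py) in points.items():
        ax.scatter(px, py, c="blue", s=20)
        ax.annotate(name, (px, py))
    plt.show()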

    According to Xu, this kind of technique also could be applied to visualize artificial neural networks—the deep learning (a type of machine learning) frameworks that are used to address problems such as image classification and speech recognition.

    “Because neural network models have a complex structure, it is hard to understand how their intrinsic learning process works and how they arrive at intermediate results, and thus quite challenging to debug them,” explained Xu. “Neural networks are still largely regarded as black boxes. Visualization tools like this one could help researchers get a better idea of their model’s performance.”

    Besides her Stony Brook collaborations, Xu is currently involved in the Co-Design Center for Online Data Analysis and Reduction at the Exascale (CODAR), which Brookhaven is partnering on with other national laboratories and universities through DOE’s Exascale Computing Project. Her role is to visualize data evaluating the performance of the computing clusters, applications, and workflows that the CODAR team is developing to analyze and reduce data online, before the data are written to disk for possible further offline analysis. Exascale computer systems are projected to provide unprecedented increases in computational speed, but the input/output (I/O) rates of transferring the computed results to storage disks are not expected to keep pace, so it will be infeasible for scientists to save all of their scientific results for offline analysis. Xu’s visualization will help the team “diagnose” any performance issues with the computation processes, including individual application execution, computation job management in the clusters, I/O performance in the runtime system, and data reduction and reconstruction efficiency.

    Xu is also part of a CSI effort to build a virtual reality (VR) lab for an interactive data visualization experience. “It would be a more natural way to observe and interact with data. VR techniques replicate a realistic and immersive 3D environment,” she said.

    Xu’s passion for visualization most likely stemmed from an early interest in drawing.

    “As a child, I liked to draw,” she said. “In growing up, I took my drawings from paper to the computer.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     