Tagged: Motherboard Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 10:27 am on March 2, 2019 Permalink | Reply
    Tags: Design and Tech.Co, “Great Filter”, Motherboard, Spreading Life Beyond Earth

    From Design and Tech.Co via Motherboard: “Spreading Life Beyond Earth” 


    Design and Tech.Co

    motherboard

    Motherboard

    Feb 25, 2019
    Jerry M Lawson

    One day, presuming we can overcome our principal flaws and the current limitations of physics, we will travel to distant stars. Serious exploration of our region of the galaxy will become the ultimate frontier. Dim as that prospect seems today, I believe it will come. In my youth, much of what is today’s reality was fanciful science fiction and wishful thinking. We learned life wasn’t possible anywhere else in our solar system, and there were no planets around distant stars. We thought ourselves special and unique. We believed we were the center of the Universe.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Inflationary Universe. NASA/WMAP

    Much has changed in the past 50 years. Our exploration of our solar system has greatly expanded our knowledge of the local environment and about the possibilities for finding other forms of life. Although nothing has been found as yet, we know the possibilities are much greater than what I learned in school in the 1950s and 60s. Overcoming our technological limitations may seem impossible or insurmountable, but we are a highly adaptive, ingenious, and clever species. So long as we don’t self-destruct, one day we will embark upon the greatest adventure imaginable: the exploration of the Universe beyond our solar system.

    Milky Way NASA/JPL-Caltech /ESO R. Hurt

    However, long before that day arrives we must become thoroughly familiar with everything in our own backyard and decide, based on the wisdom we acquire in those endeavors, the steps we need to take to become an interstellar species. It will be a long, painful process: learning what we must and must not do, what we can and can’t do, and then imposing necessary limits on ourselves so that when we finally do encounter alien life in any form we will have a plan. What will our rules of engagement be? Do we have the right to interfere with, alter, or otherwise harm any life we find? What if the world we find is inhabited, but coveted by us for our own purposes?

    In the popular science fiction series Star Trek, the Prime Directive is the guiding principle of the United Federation of Planets. This directive states Starfleet is prohibited from interfering with the internal development of any aliens they encounter. Its intent is to prevent interference with less developed civilizations to avoid the inevitable disaster such interference would cause. In this, they merely recognize our own experience on planet Earth when more highly developed cultures encountered primitive human societies throughout our long blemished and checkered history. Such contacts inevitably resulted in the destruction of the lesser-developed culture coupled with environmental degradation by the more developed one, regardless of their intentions.

    However, before we leave this world we have important lessons to learn. There is a test of sorts I suspect all civilizations reaching our level of development must pass. Call it a filter. The underlying question was famously posed by Enrico Fermi, creator of the first nuclear reactor, who asked why we see no evidence of alien civilizations; the “Great Filter” idea, later formalized by Robin Hanson, suggests there may be hurdles an advanced civilization must pass to become spacefaring.

    Enrico Fermi nuclear physicist and childish practical joker from The Spectator

    These filters could take many forms, and conceivably they are both environmental and developmental in nature. A short list might include: acquiring nuclear weapons and their capability of obliterating life, overpopulation, environmental degradation causing climate change, and the need to overcome our tribal nature and redefine the meaning of our tribe to include all life. I’m sure there are others.

    A basic prerequisite for becoming a spacefaring civilization logically would be learning to live within the sustainable limits of our home world, the Earth. We will, in this century, pass this test and make it through this filter, or perish. It may sound harsh and extreme, but it is our reality. All we need do is look at what we are doing to our planet at this minute. We first have to create a demonstrably sustainable human civilization on Earth by overcoming all the above problems. The knowledge we acquire and the lessons we learn in accomplishing these tasks will open the doors necessary for us to succeed in the next phase. Think of it as the transformation of the caterpillar into the butterfly: its struggle in exiting the chrysalis is vital and necessary to the success and survival of the emerging butterfly.

    The second phase will be taking the body of knowledge, skills, and abilities we acquire and applying them to the building of viable mesocosms in space. What is a mesocosm? Simply put, a mesocosm recreates the earth’s biological system in miniature. We may start with a station parked somewhere high above the Earth or near the moon. We can build on our success by building new bases on the moon, taking advantage of its extensive system of caves created in the early phase of the moon’s development, when there was extensive volcanism. From there we can move on to similar bases on Mars as we begin terraforming it, and, perhaps most importantly, build cloud cities on Venus as visualized by NASA. Once we learn how to live, work, and thrive in these three diverse environments, the doors to the future are wide open.

    True color image of Mars taken by the OSIRIS instrument on the European Space Agency (ESA) Rosetta spacecraft during its February 2007 flyby of the planet.

    ESA Rosetta annotated

    Hot as hell, but long ago Venus might have hosted abundant life. Cosmos Magazine

    At this moment, in 2019, man has been able to ascertain and identify the existence of thousands of exoplanets orbiting distant stars. These planets come in all sizes and are rewriting our understanding of star and planet formation. We are looking for a second earth and have found several possibilities, and more continue to be found. The discovery of the earth’s definitive twin remains elusive for the moment.

    Chances are, once we learn to sharpen our skills with new and improved tools, we will see many more that have been hiding in the shadows. These new tools are being developed. Recently, a new way to identify planetary magnetic fields was discovered that expands the number of known planets possibly capable of harboring life.

    While we imagine finding a second earth rich with life and covered with water, caution is in order. We have to remember that any planet we go to in another star system will present us with the utmost challenges. We must be able to determine whether or not there is life there before we go. The answer to that question tells us what the constraints are for us in visiting that world.

    It is conceivable we could find alien life in some form on Mars or one or more of the moons in the outer solar system.

    NASA’s Solar System Exploration. Color image of icy Enceladus, the sixth-largest moon of Saturn

    Saturn’s Moon Titan – Universe Today

    We have ample proof several moons around Jupiter and Saturn have liquid oceans in their interiors. Pluto, the dwarf planet, was recently found to have an ocean hiding under its frozen surface.

    Pluto NASA New Horizon via Johns Hopkins University Applied Physics Laboratory and Southwest Research Institute-Alex Parker

    How would discovering life of some kind hiding in the dark on one or more of these worlds alter our approach? What should our rules of engagement be? Do we have the right to interfere, alter, or otherwise harm ANY life we find, even if that world is desired by us for our own uses?

    Presuming we can find ways of overcoming the cosmic speed limit, what do we do if we find a promising planet with alien life within a dozen light-years or so? How do we behave? What are the ethical constraints and limits we must observe? Or are we morally and ethically free to do as we wish?

    Discovery of an exoplanet in our immediate neighborhood comes with a double edge. If we find a planet we are relatively certain harbors life, wouldn’t we be faced with a number of complications and contradictions? We focus on the thrill of finding other Earth-like worlds, but we rarely acknowledge that such a world may present a bigger problem than one with possibilities but barren of life, or at least of higher forms of life. Contradictory as it sounds, discovery only complicates what happens in the future. Why? First, we have to recognize the most basic and essential reality: we don’t just live on earth; we are the earth. We are related to, and part of, every single living system on this planet. Wherever we go we must take earth with us. In practical terms, that means if we find a planet with promising features there are constraints on our actions, and if we find a world with more advanced forms of alien life, as we most likely will eventually, the questions return: What do we do? How do we behave? What are the ethical constraints and limits? We must begin answering these questions now.

    With those issues in mind, I asked University of Arizona Astronomer and professor Chris Impey a number of questions concerning our leaving earth, encountering life, and our response to that possibility. Impey is the author of several books touching on these issues including Beyond: Our Future in Space and Encountering Life in the Universe. He not only authored books about these issues but also has been deeply involved with groups who meet to discuss and study them.

    Impey acknowledged the idea of the “Great Filter” was, in consideration of the issues we face, a distinct and serious possibility. Concerning respect for all life, he noted microbial life under the Martian surface and on several moons in the outer solar system was possible, but we probably would feel no moral obligation toward microbes. This response is important in understanding where we might set the limits for our intervention on other worlds. The possibility of encountering microbial life in our own solar system would serve as a crucial learning experience for our species on how to deal with such matters going forward. What we learn in our own solar system would prove invaluable when we eventually visit worlds around other suns.

    Impey replied that, based on surveys completed by the Kepler telescope thus far, the odds of finding a habitable terrestrial planet within 20 light years of the earth are good.
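A rough back-of-envelope estimate shows why those odds look good. The numbers below are illustrative assumptions, not figures from the article or from Impey: roughly 100 to 150 stellar systems lie within 20 light years of the Sun, and Kepler-based estimates of the fraction of stars hosting a habitable-zone terrestrial planet vary widely, spanning roughly 0.1 to 0.5.

```python
# Back-of-envelope estimate of nearby habitable-zone terrestrial planets.
# Both inputs are illustrative assumptions, not figures from the article:
#   n_systems - stellar systems within 20 light years (roughly 100-150)
#   eta_earth - fraction of stars with a habitable-zone terrestrial planet
#               (Kepler-based estimates vary widely, roughly 0.1-0.5)

def expected_habitable_planets(n_systems: int, eta_earth: float) -> float:
    """Expected count of habitable-zone terrestrial planets among n_systems."""
    return n_systems * eta_earth

if __name__ == "__main__":
    for eta in (0.1, 0.2, 0.5):
        count = expected_habitable_planets(130, eta)
        print(f"eta_earth={eta}: expect roughly {count:.0f} nearby candidates")
```

Even at the pessimistic end of the assumed range, the expected count stays well above zero, which is why a detection within 20 light years is plausible.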

    NASA/Kepler telescope and K2 mission, March 7, 2009 until November 15, 2018

    NASA/MIT TESS replaced Kepler in search for exoplanets

    He indicated NASA has already initiated a policy of not contaminating or interfering with any life forms it might find on other worlds. At least the U.S. is operating within a moral framework of non-intervention; we hope our example will serve as the basis for a policy others will follow. Finally, in regard to finding life elsewhere, he said, “Yes, if life elsewhere has a different biological basis, it might be toxic or dangerous to our form of biology, and difficult to anticipate what exact form it might take. All the planning I’ve seen suggests a very cautious approach.” At this point, this is probably the best we can hope for. Impey ended by acknowledging these questions are to the point and that the astrobiology community is taking them seriously.

    His responses suggest that the current thinking among people like Impey, who are looking into our future, discussing, raising questions, and thinking about issues related to our species becoming spacefaring, is on the right path.

    On the other hand, reflecting on man’s history, it isn’t hard to see that we might treat any kind of life standing in the way of our plans or desires as an obstacle to be removed. History suggests the only life we are willing to consider important or worthy of serious thought is our own. It often seems everything else is expendable, so the work of Impey and others in the astrobiology community and elsewhere is vital to reining in our baser instincts.

    Evolution hardwired into our DNA traits that once served as advantages and made our dominance of earth possible, but that now work against us. We have the knowledge and wisdom to overcome those things, but it won’t be easy. The work being done to address these issues today could make the difference in whether we succeed or fail in getting through the “Great Filter”.

    It is unrealistic to think we can find a planet with alien life and simply move in. Probabilities are that everything on such a world will be toxic to us; the biology will be quite different. There will undoubtedly be a great temptation to change it, and to accomplish that we would endeavor to kill and destroy all the life on that world and replace it with our own. Think of chemotherapy or a bone marrow transplant. Does this concept have a familiar ring? How many science fiction stories and movies have used that very premise to portray an alien menace trying either to alter earth for their purposes (War of the Worlds) or simply to strip the planet of all its useful materials and resources for their own needs (Independence Day, Oblivion, Avatar)?

    Morally and ethically we should find this kind of behavior unacceptable. Is not life sacred, deserving of its own existence and of the opportunity to develop and evolve as it will? If we wish to spread our kinds of life across the Universe, aren’t we really looking for potentially habitable worlds currently missing a few key qualities? Many such worlds may be inhabited by simple life forms. In this instance, we would bring all our acquired knowledge, skills, and abilities, tempered by the wisdom we’ve gained learning how to live sustainably on earth, to transform or terraform the new world using local resources and whatever power was brought from home.

    Terraforming will be a slow process that will take centuries, so there has to be a plan for what we will do and how we will survive in the interim as we re-engineer the atmosphere and introduce earth’s entire ecosystem, until the new world becomes like earth and compatible with our existence and survival.

    Even though we are probably centuries from reaching this threshold for entering deep space and spreading to other worlds, we need to begin thinking about and setting the rules that monitor, limit and control our behavior NOW.

    Unless these steps are taken, humans cannot successfully travel to and inhabit other star systems. The preparation itself is a multi-century project, and one that relies crucially on its first step succeeding: the creation of a sustainable long-term civilization on Earth. This is the vital test of any species seeking to become a spacefaring civilization. Its lessons are basic and vital to being able to live on other worlds and overcome hostile environments. Learning to live within sustainable limits, respecting the biosphere that makes our life possible, and altering our behavior to celebrate and enhance its growth and health are like learning to walk and talk all over again. This achievement is a necessary, although not sufficient, precondition for any success in interstellar voyaging. If we don’t create sustainability on our own world the consequences are clear and catastrophic: there is no Planet B.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The future is wonderful, the future is terrifying. We should know, we live there. Whether on the ground or on the web, Motherboard travels the world to uncover the tech and science stories that define what’s coming next for this quickly-evolving planet of ours.

    Motherboard is a multi-platform, multimedia publication, relying on longform reporting, in-depth blogging, and video and film production to ensure every story is presented in its most gripping and relatable format. Beyond that, we are dedicated to bringing our audience honest portraits of the futures we face, so you can be better informed in your decision-making today.

     
  • richardmitnick 11:10 am on February 25, 2019 Permalink | Reply
    Tags: "The Joystick Artisan", , Benj Edwards, From the arcades to the living room- how the controller has evolved—and why one tech historian Benj Edwards started building his own, Joysticks, Motherboard   

    From Tedium via Motherboard: “The Joystick Artisan” 

    motherboard

    From Motherboard

    Feb 25 2019
    Ernie Smith

    From the arcades to the living room, how the controller has evolved—and why one tech historian, Benj Edwards, started building his own.


    Of the year’s many Super Bowl ads, perhaps the most compelling and interesting—and according to a post-game study, effective—was an ad Microsoft created to highlight a game controller.

    The Xbox Adaptive Controller, with its massive action buttons and dozens of accessories, is the kind of device that perhaps won’t sell a lot of units—but the ones it does sell will be greatly appreciated by those with disabilities who want to game, too. It represents a statement piece for Microsoft—a statement that the company wants to ensure everyone who wants to play its games can do so, and that it was willing to take steps to encourage such accessibility.

    It highlights how far we’ve come from the origin of the joystick, which actually predates the video game by more than 40 years, with the first patent being filed in 1926.

    Its use case, in case you were wondering, was in aviation, as a part of a remote control aviation system. To test the device, engineer Carlos B. Mirick effectively invented a primitive drone-like device–a radio-controlled device on wheels he called the “electric dog.”

    Gamers, of course, are very particular about their controllers—as they should be, because they spend hours with them, whether at an arcade, on the go, or at home in front of a 4K TV set or old CRT—and are willing to look far and wide for one that best suits their needs. This growing interest has led to the rise of players like 8BitDo, which has made a name for itself building retro-style controllers with modern niceties, and Hori, which sells a whole line of Smash Bros.-ready Gamecube controllers for the Nintendo Switch.

    And gamers are willing to pay a premium for those controllers, too—which means that there’s room for a more bespoke approach, leading some creators to make their own joysticks by hand.

    It’s the kind of thing that requires you to know your history, which means that Benj Edwards, a digital historian and tech journalist whose writing often leads him into the realms of gaming and early computing, might perhaps be the perfect guy to get a controller business off the ground in 2019.

    The N64 controller, perhaps one of the most controversial controllers of all time, helped to popularize the thumbstick. Image: Evan Amos/Wikimedia Commons

    Five evolutionary control types that contributed to the history of the game controller

    The paddle. An early joystick variant that first found its use in Nolan Bushnell’s calling card Pong, it’s essentially a giant knob that relies on circular motion, ensuring that the range of movement was fairly limited. Despite looking nothing like a paddle, the name is effectively a reference to its use in tennis-style games. “In fact, this type of controller probably wouldn’t even be called a paddle if it weren’t for Bushnell’s pioneering game,” PC Magazine noted in 1984, alongside a review of two paddles it said were “shaped like miniature coffins.”
    The D-Pad. As pointed out in a 2017 episode of The Gaming Historian, Nintendo icon Gunpei Yokoi invented the basic design behind the directional pad, with Yokoi’s key innovation involving a ball pivot in the center of the directional pad, which allowed for better freedom of motion. Nintendo patented this basic design, which meant that, for decades, every other game maker had to use an alternative design for their own directional sticks. (Notably, the Joy-Cons on Nintendo’s latest console, the Switch, do not use a D-Pad, though popular third-party manufacturers like Hori sell them.)
    The thumbstick. A tiny version of the joystick suited to 3D games, it was popularized by the Nintendo 64, which included a stick directly in the middle of the controller. As writer Mark Christian noted on his blog, this center placement ended up proving unique, as later designs eschewed it in favor of making the stick and directional pad equally accessible at all times. Of note: one early variant of the thumbstick was the NES Max, which used a slider-type device called a cycloid that was functionally similar to a thumbstick, but was harder to use. This has led enthusiasts to mod the controller to add an actual thumbstick.
    The trackball. While lacking an actual stick, of course, the trackball, which we covered in 2017, served a functionally similar purpose in many early video games, most famously Arkanoid and Centipede. The Apple Pippin, perhaps Cupertino’s most obscure product, included a trackball in the middle of its boomerang-style controller.
    The touchpad. More recently, modern controllers have taken to implementing miniature touchpads directly within the controller—particularly the PlayStation 4, whose DualShock touchpad is a notable feature. Recent variants of the Apple TV, which have remotes that double as game controllers in some cases, also include touchpads.

    How a computing historian found himself running a joystick business

    Technology, so often, is a game of mass manufacturing—of products that are produced at a certain scale, that don’t really shine or reach an ideal price/performance ratio unless they’re made in the tens of thousands. Certainly joysticks, like many video game items, have traditionally fit into this mold.

    But retro gaming, with its willing body of enthusiasts who are looking for ways to perfect their nostalgia experience—and willing to pay premium prices along the way—is perhaps the most prominent exception to this rule. And that creates opportunities to take a more, shall we say, DIY approach.

    Benj Edwards, who has run his Vintage Computing and Gaming website for nearly 15 years, has built an array of joysticks for platforms common and obscure—mostly by hand. He notes that BX Foundry, his joystick venture, comes at a time when retro gaming is full of people looking for an upgrade.

    “There is a rather large enthusiast community of retro gamers out there right now, and they are used to paying (relatively) high amounts for rare games or special limited-run accessories,” Edwards told me in an email interview. “So boutique handcrafted joysticks fit into this quite well—especially when the joysticks turn out to be amazingly good and deliver an experience for older systems that you can’t get out of any mass-produced retail product.”

    Edwards, having written stories about the invention of the first video game cartridge, why software piracy is a necessity for preservation, and the challenge of saving the history of the early online service Prodigy, knows his tech history, and he brings that knowledge to his work in joysticks.

    Sample of some of Edwards’ organic, minimalist controllers. Image: Benj Edwards

    He describes his approach to his joysticks as “organic and minimalist”—and moving against the prevailing trend among arcade controllers for home consoles of being akin to literal replicas of what’s usually found in arcade cases. His strategy is, effectively, to keep it simple—he develops the cases from off-the-shelf plastic boxes, uses genuine arcade sticks and controllers, generally uses a drill press to cut the holes, and relies on the placement of his own hands to figure out the general design for a given controller.

    “Some people who make home arcade sticks get obsessed with plotting out precise positions for all the buttons, even drawing them up in a computer CAD program first,” he explains. “I just use a pencil and a ruler if I need to plan anything out.”

    This strategy has proven effective for his lineup of joysticks, which includes systems largely of a particular vintage, including the Atari 800 and the NES. (Lots of room for offbeat stuff, however; my personal favorite controller he makes is the BX–10 Cyclops, which is literally described on the website as “a single button to make a dramatic statement.”)

    The SNES-focused BX–110, which had a higher-capacity production cycle. Image: Benj Edwards

    Of course, keeping it simple has its limits. It means that scaling up production can be a challenge, especially in the case of the BX–110, a joystick for the Super NES that he’s currently ramping up production for. Things that you might not think about as a consumer, such as the color of the buttons or the number of holes on a box, suddenly become significant issues.

    “I had a lot of interest, but if I tried to build 100 of them myself, it would take forever and not be practical,” he explained. (Full disclosure: I’m currently waiting on a BX–110, and I’m patiently counting down the days.)

    In the past, I’ve always been super-curious about the logistics of manufacturing something from scratch—not just the success stories, but the challenges and complications. The way Edwards talks about it really highlights just how hard this stuff can be for newbies, how it sets parameters (“the up-front investment costs of custom injection-molded cases is astronomical, and I can’t afford that right now”) and highlights, unfortunately, the way the world works these days.

    “I’ll tell you the challenge that is most depressing,” he notes, by way of example. “American factories and companies are glacially slow and inefficient, while Chinese manufacturers are lightning fast, highly skilled, and fall all over themselves trying to help you in any way you could possibly need.”

    (He does have some help in the form of the parts he’s using for the devices—many of which come from Sanwa Denshi, a Japanese manufacturer of arcade joystick parts that has been active in the joystick business since the early 1980s. The Japanese company manufactures a variety of key elements for arcade-style controls, and has become a de facto standard for many arcade cabinets in the years since.)

    He says that the Super NES joysticks, high in demand, are being held up by one thing, though he’s working it out. Even with the frustrations, though, he sees the bright side here.

    “I’ve learned a ton about manufacturing along the way, though, so it has been a fascinating and gratifying experience,” he says.

    The thing I find fascinating about Benj’s move to build a joystick business is that his knowledge of computer history is wide—and this manifested itself in interesting ways, such as a large collection of vintage computer equipment that literally became a news story when he tried to sell it last year. On the journalism front, we had a frequent joke going on between one another for a while where one of us would be looking for links while researching a story idea and would run into the other person’s name, having already written a definitive story about the given topic.

    Benj could have parlayed his knowledge into any number of areas—heck, even writing a book. But he chose game controllers, an area that is perhaps the most tangible possible area he could have put his talents into. Even if his business remains defiantly boutique or fails to scale beyond his home, it still adds value to an important niche, one that gets what he’s offering to the world of retro gaming.

    This is what a Virtual Boy fighting stick looks like. Image: Benj Edwards

    Recently, Edwards had a breakout hit of sorts in an unexpected place: He found unusually strong demand for controllers for the Virtual Boy, which is not a platform known for having a lot of third-party controllers, due to its short lifespan and extremely limited popularity at launch. But like every system created that isn’t the Gizmondo, it has a fanbase, and in the case of the Virtual Boy, that fanbase has a homebrew port of Street Fighter II: Hyper Fighting at their fingertips.

    Which means that Edwards found himself, not long after announcing a dual-stick Virtual Boy controller, announcing a Virtual Boy fighting stick that created an unexpected buzz last month. Both sticks need to use custom PCBs and connectors from abandoned Virtual Boy controllers. After initially only expecting to make one for himself and fellow gaming enthusiast and historian Jeremy Parish, he’s made roughly a dozen Virtual Boy controllers so far. The only thing limiting the demand for the controllers, essentially, is Edwards’ time.

    “Many others said that if the price were lower, they would have bought one too—but each stick takes me all day to make, so I end up getting paid something like $5/hour to make them when the math is all done,” he says. “It’s been impractical but wildly fun nonetheless.”
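The arithmetic behind that $5-an-hour figure is easy to sketch. The sale price, parts cost, and build time below are hypothetical round numbers chosen for illustration; they are not figures from the interview:

```python
# Effective hourly wage on a handmade joystick, once parts are paid for.
# All three inputs are hypothetical round numbers for illustration only.

def effective_hourly_wage(sale_price: float, parts_cost: float,
                          build_hours: float) -> float:
    """Margin per stick divided by the hours spent building it."""
    return (sale_price - parts_cost) / build_hours

# A $100 stick with $60 in parts and a full 8-hour build day:
rate = effective_hourly_wage(100.0, 60.0, 8.0)
print(f"${rate:.2f}/hour")  # -> $5.00/hour
```

The point of the sketch is that labor, not parts, dominates the cost of a hand-built stick, which is why the only real constraint on supply is Edwards’ time.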

    Edwards, who frequently tweets his creations, has numerous other sticks for less-heralded platforms in the works as well, including some for the Neo Geo and Sega Saturn. (The more-heralded PlayStation 1 might get one, too.)

    I know that, as a fellow writer who has occasionally dabbled in the work of publishing zines, making the shift from writing about things to actually physically making them, and dealing with the logistics of shipping, design, and putting up with storefronts is quite the leap—but like everything else that’s happened with Edwards’ work, he appears to have let history lead the way.

    “I never suspected it once. But I also never planned on becoming a professional journalist, either,” he says of his newfound status as a maker. “When your doors are open for new experiences, life falls into place in ways that you least expect it.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The future is wonderful, the future is terrifying. We should know, we live there. Whether on the ground or on the web, Motherboard travels the world to uncover the tech and science stories that define what’s coming next for this quickly-evolving planet of ours.

    Motherboard is a multi-platform, multimedia publication, relying on longform reporting, in-depth blogging, and video and film production to ensure every story is presented in its most gripping and relatable format. Beyond that, we are dedicated to bringing our audience honest portraits of the futures we face, so you can be better informed in your decision-making today.

     
  • richardmitnick 3:46 pm on February 15, 2019 Permalink | Reply
    Tags: As the second biggest earthquake on record—an 8.2 on the Richter scale—the 1994 Bolivian event fit the bill, “We need big earthquakes to allow seismic waves to travel through the mantle and core bounce off the 660-kilometer discontinuity and travel all the way back through the Earth to be detected at the t, Motherboard, Mountains Bigger Than Everest May Lie Deep Inside Earth, One of the only ways to peer inside of Earth is with seismic waves   

    From Motherboard: “Mountains Bigger Than Everest May Lie Deep Inside Earth” 

    motherboard

    From Motherboard

    Feb 15 2019
    Becky Ferreira

    Scientists used the second largest earthquake on record to glimpse the terrain 410 miles under our planet’s surface.

    Model of Earth’s interior. Image: NASA/JPL/Université Paris Diderot and the Institut de Physique du Globe de Paris.

    Hundreds of miles below our feet, there is a subterranean mountain range with peaks that may rival the Himalayas, says a new study.

    Scientists were able to catch a glimpse of the hulking structures in seismic wave data captured during the 1994 Bolivia earthquake, according to the study, published Thursday in Science.

    Earth’s mantle is a dense band of silicate rock that extends from the crust to the core, accounting for 84 percent of our planet’s volume. At 410 miles from the surface, a boundary known as the 660-kilometer discontinuity divides the mantle into its upper and lower levels.

    Scientists can tell that rock becomes significantly rougher and denser at this spot, but it’s difficult to get a read on the topography. Detailed information about the boundary could help resolve many mysteries about the mantle, such as how much the upper and lower layers mix together, so scientists wanted to examine it more closely.

    One of the only ways to peer inside of Earth is with seismic waves, which are ripples of energy that travel through the planet during major disruptions such as earthquakes or asteroid impacts. When the waves meet different textures, minerals, and structures, they bounce off them in a similar way to light waves reflecting off objects. This provides a rough seismic snapshot of Earth’s interior.
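    The basic geometry behind this technique can be sketched in a few lines. This is an illustrative toy only, not the study’s method: it assumes a single constant average wave speed, whereas real mantle velocities vary strongly with depth, and the 10 km/s speed and 132-second travel time below are made-up round numbers.

```python
# Toy illustration: estimating the depth of a reflecting boundary from the
# two-way travel time of a seismic wave, assuming a constant average speed.
# (Hypothetical numbers; real seismology uses depth-dependent velocity models.)

def reflector_depth_km(two_way_time_s: float, avg_speed_km_s: float) -> float:
    """The wave travels down to the reflector and back, so halve the time."""
    return avg_speed_km_s * two_way_time_s / 2.0

# With an assumed ~10 km/s average speed, a 132-second two-way time would
# place a reflector near the 660-kilometer discontinuity.
print(reflector_depth_km(132.0, 10.0))  # 660.0
```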

    “We need big earthquakes to allow seismic waves to travel through the mantle and core, bounce off the 660-kilometer discontinuity, and travel all the way back through the Earth to be detected at the top of the crust,” Jessica Irving, a geophysicist at Princeton University and an author of the study, told Motherboard in an email.

    As the second biggest earthquake on record—an 8.2 on the Richter scale—the 1994 Bolivian event fit the bill.

    The team enlisted Princeton’s Tiger supercomputer cluster to analyze measurements from the quake, so that they could reconstruct the structures at the boundary.

    Tiger Dell Linux supercomputer at Princeton University

    While the statistical model of the study could not pinpoint exact heights, there is “stronger topography than the Rocky Mountains or the Appalachians” at the boundary, according to lead author Wenbo Wu.

    “I can’t give you an estimated number,” Irving said, regarding the range’s altitude. “But the mountains on the 660-kilometer boundary could be bigger than Mount Everest.”

    The ruggedness of the range may be partly caused by an accumulation of old chunks of seafloor that get sucked into the mantle and then drift down to the boundary. There might be ancient relics of Earth’s earliest days piled up like cairns there.

    As seismology and supercomputing techniques continue to develop, scientists hope to learn more about the mantle mountains.

    “I think future research will be able to teach us more about these topographic mountains and how they are distributed around the planet—we see already that some parts of the 660-kilometer boundary are much smoother than others,” Irving said.

    The research not only informs ongoing debates about Earth’s evolution, but also sheds light on the processes and structures that may shape other planets.

    See the full article here .


     
  • richardmitnick 3:27 pm on January 31, 2019 Permalink | Reply
    Tags: All matter is subject to phase transitions that can be induced by varying the conditions (pressure temperature etc.) in a system, Half-quantum vortices, Motherboard, Physicists Created Quantum Structures That May Have Birthed Dark Matter, The extremely high temperatures and pressures that dominated the first microseconds after the Big Bang would have prevented the creation of permanent particles, Walls bounded by strings   

    From Motherboard: “Physicists Created Quantum Structures That May Have Birthed Dark Matter” 


    From Motherboard

    Jan 31 2019
    Daniel Oberhaus

    Some cosmologists have predicted the existence of “walls bounded by strings” in the aftermath of the Big Bang, and now a team of physicists have created these quantum structures on Earth for the first time.

    Physicists in Finland have experimentally created quantum structures that some cosmologists believe were formed seconds after the Big Bang, and may have given birth to dark matter.

    As detailed in a paper recently published in Nature Communications, the researchers at the Low Temperature Laboratory at Aalto University were able to create quantum objects known as half-quantum vortices and walls bounded by strings in superfluid helium.

    A superfluid is a liquid that has no viscosity and is thus able to flow without losing its energy. Although half-quantum vortices were created in superfluid helium for the first time a few years ago, this is the first time that researchers have demonstrated they are able to survive phase transitions into different types of superfluidity. The physicists also demonstrated that after a certain superfluid phase transition these half-quantum vortices form a new quantum object known as walls bounded by strings, which were first predicted by cosmologists decades ago, but never realized in a lab until now.

    The breakthrough may have applications for testing theories about the early universe, especially certain theories about the origin of dark matter. But before we dive into the significance of the research it will help to have a little background.

    HOW DO PHASE TRANSITIONS BREAK SYMMETRY?

    All matter is subject to phase transitions that can be induced by varying the conditions (pressure, temperature, etc.) in a system. An example of a phase transition familiar to all of us is when liquid water transitions to a solid (ice) at 32 degrees Fahrenheit at sea level. Each time a phase transition occurs in a material, it alters the material’s symmetry.

    Symmetry is the most fundamental concept in physics, and ultimately constrains how particles can interact with one another. In physics, symmetry can be thought of as the properties of a system that stay the same when some change is applied to that system. This is pretty abstract, so consider an example given by University of Oregon professor James Schombert in which he likens pure symmetry to spinning a coin.

    “The coin has two states [heads or tails], but while it is spinning neither state is determined, and yet both states exist,” Schombert wrote. “The coin is in a state of both/or. When the coin hits the floor, the symmetry is broken (it’s either heads or tails) and energy is released in the process [as noise].”

    HOW TO CREATE TOPOLOGICAL DEFECTS

    Breaking symmetry during phase transitions can produce topological defects, artefacts from the original ground state that remain after the system has undergone a phase transition.

    Consider again the case of liquid water turning into ice. As the temperature drops, the water begins to turn into ice crystals at many different locations, and those crystals grow until they begin to intersect with the other ice crystals. Each of those ice crystals independently have an ordered crystalline structure, but this pattern is broken at the boundaries where they intersect with other ice crystals. The jagged boundaries of the ice crystals are an example of topological defects.
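    The ice-crystal picture can be simulated in miniature. The following sketch (my own illustration, not from the paper) nucleates a handful of “crystals” at random seed points and lets every grid cell join the nearest seed; cells whose neighbor belongs to a different crystal sit on a domain boundary, the analog of the topological defects described above. Grid size and seed count are arbitrary.

```python
import random

# Toy domain-formation sketch: independently ordered regions grow until they
# meet, and defects (boundaries) necessarily form where they intersect.
random.seed(0)
N, n_seeds = 40, 6
seeds = [(random.randrange(N), random.randrange(N), s) for s in range(n_seeds)]

def domain(x, y):
    # Each cell joins the crystal whose seed is nearest (squared distance).
    return min(seeds, key=lambda s: (s[0] - x) ** 2 + (s[1] - y) ** 2)[2]

grid = [[domain(x, y) for y in range(N)] for x in range(N)]

# Count cells that touch a differently labeled neighbor: the defect boundaries.
boundary_cells = sum(
    1
    for x in range(N - 1)
    for y in range(N - 1)
    if grid[x][y] != grid[x + 1][y] or grid[x][y] != grid[x][y + 1]
)
print(boundary_cells > 0)  # True: boundaries always appear where domains meet
```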

    At extremely low temperatures, topological defects can take the form of quantum objects such as half-quantum vortices and domain walls.

    To see if these quantum objects can survive helium’s phase transition between different types of superfluids, the researchers cooled helium-3 down to less than half a millikelvin above absolute zero (theoretically the lowest possible temperature). Depending on the ambient pressure, helium-3 transitions into a superfluid at temperatures between one and three millikelvin.

    An artist’s depiction of “walls bounded by strings.” Image: Ella Maru Studios.

    Half-quantum vortices can be thought of as a perpetual whirlpool of helium. According to Jere Mäkinen, a doctoral student and the lead author of the new research, these half-quantum vortices can only be created during helium’s transition into a superfluid in the polar phase. “Polar” here means that the pairs of tightly-bonded helium-3 atoms that are formed during the phase change have an angular momentum, or rotation, that is aligned either “up” or “down.” By orienting itself one way instead of the other during a phase transition, the superfluid helium-3 breaks a fundamental symmetry, which results in the formation of the vortices.

    While the formation of half-quantum vortices in helium-3 had been demonstrated in previous experiments, what Mäkinen and his colleagues wanted to know was whether the vortices would survive a phase transition into two “deeper” phases of superfluidity that are characterized by polar distortion, known as polar distortion-A (pdA) and polar distortion-B (pdB).

    As Mäkinen told me in an email, not only did the half-quantum vortices survive helium’s transition into both of the polar distorted superfluid phases, which had never been seen before, but the fact that the vortices survived the transition to pdB meant that “walls bounded by strings” must have been created in the process.

    Unlike a normal wall we encounter in day-to-day life, these quantum walls do not block the flow of the helium vortices but rather alter the magnetic properties of the helium in the vortex. This was the first time that walls bounded by strings, also known as domain walls, were created in a laboratory setting.

    CREATING THE EARLY UNIVERSE IN A LAB

    The dynamics of symmetry-breaking and the topological defects that are produced during phase transitions are fundamental to how some cosmologists explain how the universe formed directly after the Big Bang.

    The further back in time we go toward the Big Bang, the hotter and more symmetric the matter in the universe becomes. This process can be extrapolated back to the earliest instant, a theoretical point known as the “grand unification.” This might be thought of as the original phase, which rapidly underwent a series of transitions in the first few seconds after the Big Bang until the universe gradually cooled and formed the matter that we’re all familiar with today.

    The problem for cosmologists is that the extremely high temperatures and pressures that dominated the first microseconds after the Big Bang would have prevented the creation of permanent particles. These could only come about later, after the universe had sufficiently cooled.

    What cosmologists want to understand, then, is the dynamics of the phase transitions in the early universe that allowed for the emergence of the fundamental forces (weak, strong, electromagnetic) and finally the particles that make up the ordinary matter we’re all familiar with.

    According to Tanmay Vachaspati, a theoretical cosmologist at Arizona State University, there is a grand unified model that incorporates vortices and walls as the precursors of axions, a leading particle candidate for dark matter. Although the vortices and domain walls would be destroyed in the process of producing the axions, the resulting dark matter would be the scaffolding upon which the large scale structures of the universe, such as galaxies, are built.

    Although this theory about vortices in the early universe and their role in the macrostructure of the universe has been around for decades, it lacked any clear path to an experimental test.

    The theory about the role of half-quantum vortices in the formation of the macrostructure of the universe is by no means widely accepted. Mäkinen said that many leading cosmologists have abandoned this idea in favor of quantum fluctuations and inflation as the explanation for the large-scale structure of the universe.

    Nevertheless, Vachaspati said that since the physics demonstrated in the lab would carry over to the universe at large, the results of the experiments in the Aalto lab are of interest to cosmologists.

    In this respect, Mäkinen and his colleagues have created a way for cosmologists to experimentally recreate properties of the early universe predicted in some cosmological models. Going forward, these experimental tests of cosmological theories could greatly advance our understanding of why the universe formed the way it did—or at least help rule out some alternative theories.

    See the full article here .


     
  • richardmitnick 11:22 am on January 31, 2019 Permalink | Reply
    Tags: Blandford-Znajek mechanism, Motherboard, New Supercomputer Simulations Show How Plasma Jets Escape Black Holes, Penrose process   

    From Motherboard: “New Supercomputer Simulations Show How Plasma Jets Escape Black Holes” 


    From Motherboard

    Jan 30 2019
    Daniel Oberhaus

    Black holes swallow everything that comes in contact with them, so how do plasma jets manage to escape their intense gravity?

    Visualization of a general-relativistic collisionless plasma simulation. Image: Parfrey/LBNL

    Researchers used one of the world’s most powerful supercomputers to better understand how jets of high-energy plasma escape the intense gravity of a black hole, which swallows everything else in its path—including light.

    Before stars and other matter cross a black hole’s point of no return—a boundary known as the “event horizon”—and get consumed by the black hole, they get swept up in the black hole’s rotation. A question that has vexed physicists for decades was how some energy managed to escape the process and get channeled into streams of plasma that travel through space near the speed of light.

    As detailed in a paper published last week in Physical Review Letters, researchers affiliated with the Department of Energy and the University of California Berkeley used a supercomputer at the DoE’s Lawrence Berkeley National Laboratory to simulate the jets of plasma, an electrically charged gas-like substance.

    NERSC PDSF


    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    The simulations ultimately reconciled two decades-old theories that attempt to explain how energy can be extracted from a rotating black hole.

    The first theory describes how electric currents around a black hole twist its magnetic field to create a jet, which is known as the Blandford-Znajek mechanism. This theory posits that material caught in the gravity of a rotating black hole will become increasingly magnetized the closer it gets to the event horizon. The black hole acts like a massive conductor spinning in a huge magnetic field, which will cause an energy difference (voltage) between the poles of the black hole and its equator. This energy difference is then diffused as jets at the poles of the black hole.

    “There is a region around a rotating black hole, called the ergosphere, inside of which all particles are forced to rotate in the same direction as the black hole,” Kyle Parfrey, the lead author of the paper and a theoretical astrophysicist at NASA, told me in an email. “In this region it’s possible for a particle to effectively have negative energy in some sense, if it tries to orbit against the hole’s rotation.”

    This is the heart of the Penrose process, first described by Roger Penrose decades ago: a particle entering the ergosphere can split in two, with one half falling into the black hole on a negative-energy trajectory while the other escapes. If the infalling half is launched against the spin of the black hole, it will reduce the black hole’s angular momentum, or rotation. But that rotational energy has to go somewhere. In this case, it’s converted into energy that propels the other half of the particle away from the black hole, carrying more energy than the original particle had.
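    The energy bookkeeping behind this extraction is the classic Penrose relation (a standard textbook identity, not a result from the new paper):

```latex
% A particle with energy E_1 enters the ergosphere and splits in two.
% The fragment on the counter-rotating orbit carries negative energy E_2 < 0,
% so by conservation the escaping fragment leaves with more energy than the
% original particle; the black hole pays the difference from its spin.
E_1 = E_2 + E_3, \qquad E_2 < 0 \;\Longrightarrow\; E_3 = E_1 - E_2 > E_1
```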

    According to Parfrey, the Penrose process observed in their simulations was a bit different from the classical situation of a particle splitting that was described above, however. Rather than particles splitting, charged particles in the plasma are acted on by electromagnetic forces, some of which are propelled against the rotation of the black hole on a negative energy trajectory. It is in this sense, Parfrey told me, that they are still considered a type of Penrose process.

    The surprising part of the simulation, Parfrey told me, was that it appeared to establish a link between the Penrose process and the Blandford-Znajek mechanism, which had never been seen before.

    Creating the twisting magnetic fields that extract energy from the black hole in the Blandford-Znajek mechanism requires an electric current carried by particles inside the plasma, and in the simulations a substantial number of these current-carrying particles had the negative-energy property characteristic of the Penrose process.

    “So it appears that, at least in some cases, the two mechanisms are linked,” Parfrey said.

    Parfrey and his colleagues hope that their models will provide much-needed context for photos from the Event Horizon Telescope, an array of telescopes that aims to directly image the event horizon where these plasma jets form. Until that first image is produced, however, Parfrey said he and his colleagues want to refine these simulations so that they conform even better to existing observations.

    See the full article here .


     
  • richardmitnick 1:15 pm on December 21, 2018 Permalink | Reply
    Tags: , AI is also one of the ten key industries outlined in Made in China 2025, Artificial intelligence requires large amounts of data to learn and discern patterns whether those are pictures audio or text as they interpret media differently from humans, China Is Achieving AI Dominance by Relying on Young Blue-Collar Workers, Last year venture capitalists poured $5 billion into AI startups in China, Motherboard   

    From Motherboard: “China Is Achieving AI Dominance by Relying on Young Blue-Collar Workers” 


    From Motherboard

    Employees at Jun Peng Technology, the only data labeling shop in Minquan, a town of 318,000 in Henan.

    Dec 21 2018
    Huizhong Wu

    To remain the world leader in artificial intelligence, China relies on young “data labelers” who work eight hours a day processing massive amounts of data to make computers smart.

    Zhou Junkai’s office sits on the edge of Dongsha river, a staid body of water that divides the old and new sections of Minquan, a town of 318,000 in central China’s Henan province. It is here that Zhou, 19, founded his small shop of data labelers along with his 26-year old cousin this summer.

    Jun Peng Technology’s office is a rented traditional courtyard home of the kind found in China’s rural areas. These homes are large, two or three stories tall, unlike the ubiquitous apartment towers seen across China. Behind the house, a man is raking dead leaves on a plot of land that Zhou said is still used for crops.

    Inside, the only warm room is the office, where a dozen young people sit in front of wide, glowing screens. The screens and the fluorescent lights do little to brighten the room on a November day where the pollution levels have blocked the sun with a dense smoggy haze.

    The young people are “data labelers,” people who sit in front of computers for eight hours a day and click on dozens of photos, outlining backgrounds, foregrounds, and specific objects, all according to the specifications of a client who is working on artificial intelligence. Some may label medical scans; others, photos of landscapes and trees; and still others, pictures of the road for a driverless vehicle. This is the data given to artificial intelligence algorithms to learn to “see.” The artificial intelligence industry relies on this cheap, human labor as algorithms and “machine learning” are in many cases trained by real people.

    Artificial intelligence requires large amounts of data to learn and discern patterns, whether those are pictures, audio, or text, because such systems interpret media differently from humans. To teach an algorithm to accurately recognize that an apple is an apple, it needs thousands to millions of pictures of apples. Further, such a system is easily fooled. In one experiment, security researchers found that after distorting a picture of a school bus in a way invisible to the human eye, the artificial intelligence system could no longer recognize that it was a school bus.
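    The dependence on human labels can be seen even in a toy model. The sketch below (illustrative only; the "features" and labels are invented) classifies a new point by the nearest centroid of hand-labeled examples: the model’s entire notion of what an “apple” is comes from the labeled data it was given.

```python
# Toy nearest-centroid classifier "trained" on hand-labeled feature vectors.
# Features are made up, e.g. (redness, elongation); real systems learn from
# millions of human-labeled images rather than six points.

labeled = {
    "apple":  [(0.9, 0.1), (0.8, 0.2), (0.95, 0.15)],
    "banana": [(0.2, 0.9), (0.1, 0.8), (0.15, 0.95)],
}

def centroid(points):
    n = len(points)
    return tuple(sum(coord) / n for coord in zip(*points))

centroids = {label: centroid(pts) for label, pts in labeled.items()}

def classify(x):
    # Predict the label whose centroid is closest (squared distance).
    return min(
        centroids,
        key=lambda lbl: sum((a - b) ** 2 for a, b in zip(centroids[lbl], x)),
    )

print(classify((0.85, 0.1)))  # apple
print(classify((0.1, 0.85)))  # banana
```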

    Money is flowing into China’s artificial intelligence industry, and few places illustrate that better than Henan. In a province that just a few years ago was known for its Foxconn plant (which makes Apple products) and electronics factories, its towns now boast offices of workers who are doing the laborious input work that makes computers smart.

    Last year, venture capitalists poured $5 billion into AI startups in China, which raised more money in the sector than the United States for the first time, according to consultancy AIB research. The Chinese government has made the field a priority, announcing an ambitious policy the same summer to construct an industry worth $150 billion by 2030.

    AI is also one of the ten key industries outlined in Made in China 2025, an economic master plan that the government is pushing to take the country from a mass manufacturing, low-value economy to a high technology, high-value one. China is now home to Sensetime, the world’s most valuable AI company, which focuses on facial and image recognition and works with local governments across the country on surveillance. It’s worth an estimated $4.5 billion, according to research firm CB Insights.

    But in an echo of the manufacturing factories that pushed China’s economic development in the 2000s, the country has also found itself home to a growing side industry of labor-intensive data labeling companies, which supply and process the massive amounts of data for the algorithms to consume. Aside from a few established large firms in China’s biggest cities, these companies are mainly growing in smaller cities, towns, and rural areas.

    Zhou had the idea of setting up shop after seeing a number of similar outfits in the town of Pingding Shan, a few hours west. Together, the cousins pooled together their family’s years of savings ($45,000) to buy a few dozen computers and rent an office space. They are, as far as they know, the only ones in Minquan.

    “If you don’t know what you’ll do in the future, you can either go to a big city, and be a white-collar worker and then everyday you’re squeezed onto public transport,” he said. “As for other [fields], if you want to be No. 1, you need a lot of knowledge, experience, and education. These are things we don’t have.”

    It was difficult to find a job as a car mechanic, he said. He worked in a factory briefly and then quit. Those shifts were grueling—14 hour days.

    “I thought I couldn’t stand it anymore,” he said. But “this industry felt like it had potential.”

    Many are flocking to the data labeling industry now, said Han Jinhao, who started his data labeling company a little more than a year ago in Zhengzhou, the capital city of Henan province. His company, Dianwokeji, employs more than 100 data labelers.

    “Even though labeling is rather low-level work, the barrier to entry is relatively low, and it is still the AI industry,” he said. “So we thought if we can start from here, we can slowly, step by step, move towards something more high-value.”

    Han counts more than 6,000 data labeling outfits that have registered on a Craigslist-like platform he built, where smaller outfits can find outsourcing gigs and hire new employees.

    Zhao Mengyao, 18, is new to the job. She started working at Zhou’s company in October. When I visit the office she is tracing over the white lines of a parking space in a parking lot. The picture is distorted, with the lines bent as if the camera had a fish-eye lens, but she mouses over them with ease. After 20 minutes, Zhao moves on to the next photo in her set. It’s another photo of a parking lot, from a different angle.

    A young woman studies how to label photos of cars. Data labelers are given specific instructions on how to label pictures for each client. Image: Author

    Next to her, a young man draws around the fluffy edges of a singer’s orange dress, pixel by pixel. After that, he starts tracing the outline of a man playing golf.

    Zhao used to be a makeup artist at a wedding portrait studio, but quit because she found the work exhausting. There were days where she had to wake up at 4 AM to prepare for the client’s shoot and would get home by 7 PM.

    Now, she says, she starts at 8 AM and leaves at 6 PM, with an hour and a half off in between. During the lunch break, Zhao and her co-workers trade snappy comments as they play games at the same consoles where they labeled photos.

    “I think this is pretty good. There’s a lot of freedom working here,” she said.

    Zhao says the salary is okay. She gets paid per set of 20 pictures, at about 20 RMB (roughly $3). She can finish anywhere between four and eight sets, or 80 to 160 pictures, a day. When I asked her where she thought the pictures would go, she said she didn’t know.

    The seven data labelers I talked to received monthly salaries from around 2,000 RMB ($290) to 4,000 RMB ($580). This is on par with a Chinese worker’s average disposable income or their take home income after taxes, which in 2017 was 2,164 RMB (or about $330). “There are many jobs available at this kind of salary in Zhengzhou,” said Wang Yushuang, a 25-year-old Dianwokeji employee.
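    Zhao’s per-piece rate is roughly consistent with these monthly figures. A quick back-of-envelope check (the 22 working days per month is my assumption, not from the article):

```python
# Rough check of per-piece pay against the reported monthly salaries.
# Rate and sets/day come from the article; working days/month is assumed.
rmb_per_set = 20        # one set = 20 pictures, about $3
sets_per_day = (4, 8)   # Zhao finishes four to eight sets a day
working_days = 22       # assumption: a standard working month

monthly = [s * rmb_per_set * working_days for s in sets_per_day]
print(monthly)  # [1760, 3520] RMB, in the same ballpark as the reported
                # 2,000-4,000 RMB monthly salaries
```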

    Most of the employees at Dianwokeji, a data labeling company in Zhengzhou, are in their early 20s. Image: Author

    The standard for teaching AI photo recognition is to use images from ImageNet, a database of more than 14 million images created by Stanford University professor Li Fei-Fei and her team. The database relies on Amazon’s Mechanical Turk, which outsources labor-intensive tasks such as labeling photos for a few cents to internet users.

    But as businesses around the world are racing to find applications for artificial intelligence across industries ranging from driverless vehicles to medical diagnostics, ImageNet and Mechanical Turk are proving to be insufficient.

    A healthcare business that helps provide more accurate diagnoses needs very detailed annotations to help the artificial intelligence learn the difference between, say, a tumor and an eyeball in a CT scan, because it wouldn’t be able to distinguish them on its own at first, Peter Yang, the founder of data labeling company Awakening Vector, told me by phone. It needs data that points out what a tumor looks like in a picture, across many different pictures, which requires a human to click and label the photo.

    But most AI startups only have a few full-time employees, usually data scientists, Yang said.

    “It’s something that requires a lot of physical labour,” Yang said. “You can’t expect people who have such high salaries to do this labor-intensive work, so you have to outsource this.”

    Further, there are issues of privacy and quality control. Medical images need to be kept private, for example. Mechanical Turk tasks are performed by any registered user who wants to earn money, not employees with a dedicated salary who work Monday through Friday.

    Outsourcing has meant that these businesses are now popping up all over China. Yang’s business is based in China’s Xinjiang Uyghur Autonomous Region and clients include Baidu, China’s main search engine, and Novartis, the multinational pharmaceutical company. Han’s company, which serves Chinese firms, such as a few driverless vehicle startups, has branches in smaller cities across Henan and neighboring Shandong province.

    Conventional wisdom holds that as technology advances, those in low-skilled jobs stand to lose the most. Academic research has mostly backed that up. But that doesn’t mean technology will necessarily replace all jobs.

    Historical research shows that automation has led to a job boom, James Bessen, the executive director at Boston University’s Technology and Policy Research Initiative, told me. He pointed to the textile industry as an example.

    In the early 19th century, most people owned only one set of clothing because cloth was so expensive, Bessen said. But as technology improved and certain tasks were automated, the cost of making clothing fell and the demand for cloth grew. More clothes meant more jobs. Although the textile industry was considered “low-skilled,” as it expanded dramatically in size it also brought on new workers who had to learn to operate complicated machinery. While jobs were outsourced to developing countries, there was no net loss of jobs. It is only when demand is satisfied that the number of jobs starts declining.

    China, for now, is cheap relative to the US and it has the labour to take advantage of this.

    The work is also expanding beyond picture labeling. Many companies are also paying for sound recognition, video labeling, and even raw data. Zhou and his team have collected children’s voice recordings and recordings of people speaking a Henan dialect.

    For some workers, there’s a distinct sense of pride at being a part of a new industry. “We are doing something very basic, but we are [also] a very important part of it, helping the robots learn and see a bunch of data,” said Wang.

    What happens when, one day, the algorithms have learned to recognize things on their own? Will the tens of thousands of low-skilled workers in AI lose their jobs?

    Han seems unconcerned. “If it’s really at that stage, then maybe humans won’t be alive anymore. Do you think mankind will let something that’s not even alive control mankind? We would only teach it to serve us. I wouldn’t teach it so well that one day I serve the machine.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The future is wonderful, the future is terrifying. We should know, we live there. Whether on the ground or on the web, Motherboard travels the world to uncover the tech and science stories that define what’s coming next for this quickly-evolving planet of ours.

    Motherboard is a multi-platform, multimedia publication, relying on longform reporting, in-depth blogging, and video and film production to ensure every story is presented in its most gripping and relatable format. Beyond that, we are dedicated to bringing our audience honest portraits of the futures we face, so you can be better informed in your decision-making today.

     
  • richardmitnick 1:40 pm on November 27, 2018 Permalink | Reply
    Tags: Motherboard, NASA’s InSight Lander Is Already Snapping Amazing Pictures of Mars

    From Motherboard: “NASA’s InSight Lander Is Already Snapping Amazing Pictures of Mars” 


    From Motherboard

    Nov 27 2018
    Becky Ferreira

    On its first sol on the red planet, the mission sent home images of a dusty landscape, a lander selfie, and a wide shot of Mars from space.

    NASA’s InSight lander takes its first selfie on November 26, 2018. Image: NASA/JPL-Caltech.

    Shortly after it successfully touched down on Mars on Monday, NASA’s InSight lander took a selfie showing off its new home in the Elysium Planitia region. The picture was taken by the mission’s Instrument Deployment Camera (IDC), mounted on the lander’s robotic arm, and captures the upper deck of InSight’s instrument package against a backdrop of flat Martian terrain.

    Though it was InSight’s first selfie on the red planet, it was not the first picture the lander sent back to Earth. Just minutes after its nail-biting touchdown, InSight sent a quick landscape shot home to the mission control team at NASA’s Jet Propulsion Laboratory.

    This image was taken with the Instrument Context Camera (ICC), which is attached directly to the lander’s deck and provides a wide-angle fisheye view of the landscape. The ICC lens is speckled with dust kicked up by the retrorockets that guided the craft safely down to its landing site.

    But the lander wasn’t the only mission component busy snapping exhilarating new pictures of Mars. Perhaps the most groundbreaking snapshot came from MarCO-B, a trailblazing satellite that imaged Mars during its flyby at a distance of about 3,700 miles (6,000 kilometers).


    JPL Cubesat MarCO Mars Cube

    MarCO-B and its twin MarCO-A—nicknamed “Wall-E” and “EVE,” respectively—are both CubeSats, a class of miniaturized cubic satellite introduced to reduce the cost of spaceflight. Hundreds of CubeSats have been deployed in low Earth orbit, but the MarCO satellites are the first to voyage into deep space.

    The CubeSats are about the size of a shoebox, and were launched with the InSight lander back in May, before separating from the main spacecraft to pursue their own trajectories to Mars. Just a few days into the trip, MarCO-B took this picture of Earth with its wide field camera.

    The MarCO satellites were not essential for the mission, and were bundled into InSight to test out CubeSat performance in deep space. Their successful communications performance and the dazzling shots bode well for the use of CubeSats in interplanetary missions.

    Given how many fascinating visuals InSight has sent home on its very first sol on Mars, it seems like the mission is already paying off. No doubt the lander will produce many more stunning pictures—not to mention tantalizing data about Mars’ interior—in the years to come.

    See the full article here.


     
  • richardmitnick 5:34 pm on August 30, 2018 Permalink | Reply
    Tags: Borexino observatory, DarkSide experiment, Davide D’Angelo-physical scientist, Motherboard, Possible dark matter candidates-axions gravitinos Massive Astrophysical Compact Halo Objects (MACHOs) and Weakly Interacting Massive Particles (WIMPs), SABRE-Sodium Iodide with Active Background Rejection Experiment, Solar neutrinos-recently caught at U Wisconsin IceCube at the South Pole, WIMPs that go by names like the gravitino sneutrino and neutralino

    From Gran Sasso via Motherboard: “The New Hunt for Dark Matter Is Taking Place Under a Mountain” 

    From Gran Sasso

    via

    Motherboard


    Aug 30 2018
    Daniel Oberhaus

    Davide D’Angelo wasn’t always interested in dark matter, but now he’s at the forefront of the hunt to find the most elusive particle in the universe.

    About an hour outside of Rome there’s a dense cluster of mountains known as the Gran Sasso d’Italia. Renowned for their natural beauty, the Gran Sasso are a popular tourist destination year round, offering world-class skiing in the winter and plenty of hiking and swimming opportunities in the summer. For the 43-year-old Italian physicist Davide D’Angelo, these mountains are like a second home. Unlike most people who visit Gran Sasso, however, D’Angelo spends more time under the mountains than on top of them.

    It’s here, in a cavernous hall thousands of feet beneath the earth, that D’Angelo works on a new generation of experiments dedicated to the hunt for dark matter particles, an exotic form of matter whose existence has been hypothesized for decades but never proven experimentally.

    Dark matter is thought to make up about 27 percent of the universe and characterizing this elusive substance is one of the most profound problems in contemporary physics. Although D’Angelo is optimistic that a breakthrough will occur in his lifetime, so was the last generation of physicists. In fact, there’s a decent chance that the particles D’Angelo is looking for don’t exist at all. Yet for physicists probing the fundamental nature of the universe, the possibility that they might spend their entire career “hunting ghosts,” as D’Angelo put it, is the price of advancing science.

    WHAT’S UNDER THE ‘GREAT STONE’?

    In 1989, Italy’s National Institute for Nuclear Physics opened the Gran Sasso National Laboratory, the world’s largest underground laboratory dedicated to astrophysics. Gran Sasso’s three cavernous halls were purposely built for physics, which is something of a luxury as far as research centers go. Most other underground astrophysics laboratories like SNOLAB are ad hoc facilities that repurpose old or active mine shafts, which limits the amount of time that can be spent in the lab and the types of equipment that can be used.


    SNOLAB, Sudbury, Ontario, Canada.

    Buried nearly a mile underground to protect it from the noisy cosmic rays that bathe the Earth, Gran Sasso is home to a number of particle physics experiments that are probing the foundations of the universe. For the last few years, D’Angelo has divided his time between the Borexino observatory and the Sodium Iodide with Active Background Rejection Experiment (SABRE), which are investigating solar neutrinos and dark matter, respectively.

    Borexino Solar Neutrino detector

    SABRE experiment at INFN Gran Sasso

    Davide D’Angelo with the SABRE proof of concept. Image: Xavier Aaronson/Motherboard

    Over the last 100 years, characterizing solar neutrinos and dark matter has ranked among the most important tasks of particle physics. Today, the mystery of solar neutrinos is resolved, but the particles are still of great interest to physicists for the insight they provide into the fusion process occurring in our Sun and other stars. The composition of dark matter, however, is still considered one of the biggest questions in particle physics. Despite the radically different nature of the particles, they are united insofar as both can only be discovered in environments where the background radiation is at a minimum: thousands of feet beneath the Earth’s surface.

    “The mountain acts as a shield so if you go below it, you have so-called ‘cosmic silence,’” D’Angelo said. “That’s the part of my research I like most: Going into the cave, putting my hands on the detector and trying to understand the signals I’m seeing.”

    After finishing grad school, D’Angelo got a job with Italy’s National Institute for Nuclear Physics where his research focused on solar neutrinos, a subatomic particle with no charge that is produced by fusion in the Sun. For the better part of four decades, solar neutrinos [recently caught at U Wisconsin IceCube at the South Pole] were at the heart of one of the largest mysteries in astrophysics.

    IceCube neutrino detector interior


    U Wisconsin ICECUBE neutrino detector at the South Pole

    The problem was that instruments measuring the energy from solar neutrinos returned results much lower than predicted by the Standard Model, the most accurate theory of fundamental particles in physics.

    Given how accurate the Standard Model had proven to be for other aspects of cosmology, physicists were reluctant to make alterations to it to account for the discrepancy. One possible explanation was that physicists had faulty models of the Sun and better measurements of its core pressure and temperature were needed. Yet after a string of observations in the 60s and 70s demonstrated that the models of the sun were essentially correct, physicists sought alternative explanations by turning to the neutrino.

    A TALE OF THREE NEUTRINOS

    Ever since they were first proposed by the Austrian physicist Wolfgang Pauli in 1930, neutrinos have been called upon to patch holes in theories. In Pauli’s case, he first posited the existence of an extremely light, chargeless particle as a “desperate remedy” to explain why the law of the conservation of energy appeared to be violated during radioactive decay. Three years later, the Italian physicist Enrico Fermi gave these hypothetical particles a name. He called them “neutrinos,” Italian for “little neutrons.”

    A quarter of a century after Pauli posited their existence, two American physicists reported the first evidence of neutrinos produced in a fission reactor. The following year, in 1957, Bruno Pontecorvo, an Italian physicist working in the Soviet Union, developed a theory of neutrino oscillations. At the time, little was known about the properties of neutrinos and Pontecorvo suggested that there might be more than one type of neutrino. If this were the case, Pontecorvo theorized that it could be possible for the neutrinos to switch between types.

    By 1975, part of Pontecorvo’s theory had been proven correct. Three different types, or “flavors,” of neutrino had been discovered: electron neutrinos, muon neutrinos, and tau neutrinos. Importantly, observations from an experiment in a South Dakota mineshaft had confirmed that the Sun produced electron neutrinos. The only issue was that the experiment detected far fewer neutrinos than the Standard Model predicted.

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA


    FNAL DUNE Argon tank at SURF


    Surf-Dune/LBNF Caverns at Sanford



    SURF building in Lead SD USA

    Prior to the late 90s, there was scant indirect evidence that neutrinos could change from one flavor to another. In 1998, a group of researchers working at Japan’s Super-Kamiokande Observatory observed oscillations in atmospheric neutrinos, which are mostly produced by the interactions between cosmic rays and the Earth’s atmosphere.

    Super-Kamiokande experiment. located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan

    Three years later, Canada’s Sudbury Neutrino Observatory (SNO) provided the first direct evidence of oscillations from solar neutrinos.

    Sudbury Neutrino Observatory, no longer operating

    This was, to put it lightly, a big deal in cosmological physics. It effectively resolved the mystery of the missing solar neutrinos, or why experiments only observed about a third as many neutrinos radiating from the Sun as predicted by the Standard Model. If neutrinos can oscillate between flavors, a neutrino emitted in the Sun’s core could be a different type of neutrino by the time it reaches Earth. Prior to the mid-80s, most experiments on Earth were only looking for electron neutrinos, which meant they were missing the two flavors that solar neutrinos had oscillated into en route from the Sun to the Earth.
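    The oscillation mechanism can be summarized by the standard textbook two-flavor survival probability (a simplification; the full solar case also involves matter effects inside the Sun):

    ```latex
    P(\nu_e \to \nu_e) = 1 - \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2 \, L}{4E}\right)
    ```

    where \(\theta\) is the mixing angle, \(\Delta m^2\) the difference of the squared neutrino masses, \(L\) the distance traveled, and \(E\) the neutrino energy. A \(\nu_e\) produced in the Sun’s core therefore has a nonzero probability of arriving at Earth as a different flavor.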

    When SNO was dreamt up in the 80s, it was designed so that it would be capable of detecting all three types of neutrinos, instead of just electron neutrinos. This decision paid off. In 2015, the directors of the experiments at Super-Kamiokande and SNO shared the Nobel Prize in physics for resolving the mystery of the missing solar neutrinos.

    Although the mystery of solar neutrinos has been solved, there’s still plenty of science to be done to better understand them. Since 2007, Gran Sasso’s Borexino observatory has been refining the measurements of solar neutrino flux, which has given physicists unprecedented insight into the fusion process powering the Sun. From the outside, the Borexino observatory looks like a large metal sphere, but on the inside it looks like a technology transplanted from an alien world.

    Borexino detector. Image INFN

    In the center of the sphere is basically a large, transparent nylon sack that is almost 30 feet in diameter and only half a millimeter thick. This sack contains a liquid scintillator, a chemical mixture that releases energy when a neutrino passes through it. This nylon sphere is suspended in 1,000 metric tons of a purified buffer liquid and surrounded by 2,200 sensors to detect energy released by electrons that are freed when neutrinos interact with the liquid scintillator. Finally, an outer buffer of nearly 3,000 tons of ultrapure water helps provide additional shielding for the detector. Taken together, the Borexino observatory has the most protection from outside radiation interference of any liquid scintillator in the world.

    For the last decade, physicists at Borexino—including D’Angelo, who joined the project in 2011—have been using this one-of-a-kind device to observe low energy solar neutrinos produced by proton collisions during the fusion process in the Sun’s core. Given how difficult it is to detect these chargeless, ultralight particles that hardly ever interact with matter, detecting the low energy solar neutrinos would be virtually impossible without such a sensitive machine. When SNO directly detected the first solar neutrino oscillations, for instance, it could only observe the highest energy solar neutrinos due to interference from background radiation. This amounted to only about 0.01 percent of all the neutrinos emitted by the Sun. Borexino’s sensitivity allows it to observe solar neutrinos whose energy is a full order of magnitude lower than those detected by SNO, opening the door for an incredibly refined model of solar processes as well as more exotic events like supernovae.

    “It took physicists 40 years to understand solar neutrinos and it’s been one of the most interesting puzzles in particle physics,” D’Angelo told me. “It’s kind of like how dark matter is now.”

    SHINING A LIGHT ON DARK MATTER

    If neutrinos were the mystery particle of the twentieth century, then dark matter is the particle conundrum for the new millennium. Just like Pauli proposed neutrinos as a “desperate remedy” to explain why experiments seemed to be violating one of the most fundamental laws of nature, the existence of dark matter particles is inferred because cosmological observations just don’t add up.

    In the early 1930s, the American astronomer Fritz Zwicky was studying the movement of a handful of galaxies in the Coma cluster, a collection of over 1,000 galaxies approximately 320 million light years from Earth.

    Fritz Zwicky, the father of dark matter research. No image credit found.

    Vera Rubin did much of the work on proving the existence of Dark Matter. She and Fritz were both overlooked for the Nobel prize.

    Vera Rubin measuring spectra (Emilio Segre Visual Archives AIP SPL)


    Astronomer Vera Rubin at the Lowell Observatory in 1965. (The Carnegie Institution for Science)

    Using data published by Edwin Hubble, Zwicky calculated the mass of the entire Coma galaxy cluster.

    Coma cluster via NASA/ESA Hubble

    When he did, Zwicky noticed something odd about the velocity dispersion—the statistical distribution of the speeds of a group of objects—of the galaxies: The velocity distribution was about 12 times higher than it should be based on the amount of matter in the galaxies.
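    The logic behind that discrepancy follows from the virial theorem, which ties a cluster’s velocity dispersion to the mass holding it together (a schematic form; Zwicky’s actual estimate included geometric factors of order unity):

    ```latex
    M_{\mathrm{cluster}} \approx \frac{\sigma^2 R}{G}
    ```

    where \(\sigma\) is the velocity dispersion, \(R\) the cluster radius, and \(G\) the gravitational constant. A dispersion far larger than the luminous matter can account for implies a correspondingly larger gravitating mass.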

    Inside Gran Sasso. Image: Xavier Aaronson/Motherboard

    This was a surprising calculation and its significance wasn’t lost on Zwicky. “If this would be confirmed,” he wrote, “we would get the surprising result that dark matter is present in much greater amount than luminous matter.”

    The idea that the universe was made up mostly of invisible matter was a radical idea in Zwicky’s time and still is today. The main difference, however, is that astronomers now have much stronger empirical evidence pointing to its existence. This is mostly due to the American astronomer Vera Rubin, whose measurement of galactic rotations in the 1960s and 70s put the existence of dark matter beyond a doubt. In fact, based on Rubin’s measurements and subsequent observations, physicists now think dark matter makes up about 27 percent of the “stuff” in the universe, about seven times more than the regular, baryonic matter we’re all familiar with. The burning question, then, is what is it made of?

    Since Rubin’s pioneering observations, a number of dark matter candidate particles have been proposed, but so far all of them have eluded detection by some of the world’s most sensitive instruments. Part of the reason for this is that physicists aren’t exactly sure what they’re looking for. In fact, a small minority of physicists think dark matter might not be a particle at all and is just an exotic gravitational effect. This makes designing dark matter experiments kind of like finding a car key in a stadium parking lot and trying to track down the vehicle it pairs with. There’s a pretty good chance the car is somewhere in the parking lot, but you’re going to have to try a lot of doors before you find your ride—if it even exists.

    Among the candidates for dark matter are subatomic particles with goofy names like axions, gravitinos, Massive Astrophysical Compact Halo Objects (MACHOs), and Weakly Interacting Massive Particles (WIMPs). D’Angelo and his colleagues at Gran Sasso have placed their bets on WIMPs, which until recently were considered to be the leading particle candidate for dark matter.

    Over the last few years, however, physicists have started to look at other possibilities after some critical tests failed to confirm the existence of WIMPs. WIMPs are a class of hypothetical elementary particles that hardly ever interact with regular baryonic matter and don’t emit light, which makes them exceedingly hard to detect. This problem is compounded by the fact that no one is really sure how to characterize a WIMP. Needless to say, it’s hard to find something if you’re not even really sure what you’re looking for.

    So why would physicists think WIMPs exist at all? In the 1970s, physicists conceptualized the Standard Model of particle physics, which posited that everything in the universe was made out of a handful of fundamental particles.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    The Standard Model works great at explaining almost everything the universe throws at it, but it’s still incomplete since it doesn’t incorporate gravity into the model.

    Gravity measured with two slightly different torsion pendulum setups and slightly different results

    In the 1980s, an extension of the Standard Model called Supersymmetry emerged, which hypothesizes that each fundamental particle in the Standard Model has a partner.

    Standard model of Supersymmetry DESY

    These particle pairs are known as supersymmetric particles and are used as the theoretical explanation for a number of mysteries in Standard Model physics, such as the mass of the Higgs boson and the existence of dark matter. Some of the most complex and expensive experiments in the world like the Large Hadron Collider particle accelerator were created in an effort to discover these supersymmetric particles, but so far there’s been no experimental evidence that these particles actually exist.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    Many of the lightest particles theorized in the supersymmetric model are WIMPs and go by names like the gravitino, sneutrino and neutralino. The latter is still considered to be the leading candidate for dark matter by many physicists and is thought to have formed in abundance in the early universe. Detecting evidence of this ancient theoretical particle is the goal of many dark matter experiments, including the one D’Angelo works on at Gran Sasso.

    D’Angelo told me he became interested in dark matter a few years after joining the Gran Sasso laboratory and began contributing to the laboratory’s DarkSide experiment, which seemed like a natural extension of his work on solar neutrinos. DarkSide is essentially a large tank filled with liquid argon and equipped with incredibly sensitive sensors. If WIMPs exist, physicists expect to detect them from the ionization produced through their collision with the argon nuclei.

    Dark Side-50 Dark Matter Experiment at Gran Sasso

    The setup of the SABRE experiment is deliberately similar to that of another experiment that has been running at Gran Sasso since 1995, called DAMA. In 2003, the DAMA experiment began looking for seasonal fluctuations in dark matter particles that were predicted in the 1980s as a consequence of the relative motion of the Sun and Earth to the rest of the galaxy. The theory posited that the relative speed of any dark matter particles detected on Earth should peak in June and bottom out in December.
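    In the standard form used in the direct-detection literature, this prediction is an annually modulated event rate:

    ```latex
    S(t) = S_0 + S_m \cos\!\left(\frac{2\pi\,(t - t_0)}{T}\right)
    ```

    with period \(T = 1\) year and phase \(t_0\) in early June, when the Earth’s orbital velocity adds most strongly to the Sun’s motion through the galactic dark matter halo; \(S_0\) is the constant part of the rate and \(S_m\) the modulation amplitude.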

    The DarkSide experiment has been running at Gran Sasso since 2013 and D’Angelo said it is expected to continue for several more years. These days, however, he’s found himself involved with a different dark matter experiment at Gran Sasso called SABRE [above], which will also look for direct evidence of dark matter particles based on the light produced when energy is released through their collision with Sodium-Iodide crystals.

    Over the course of nearly 15 years, DAMA did in fact register seasonal fluctuations in its detectors that were in accordance with this theory and the expected signature of a dark matter particle. In short, it seemed as if DAMA was the first experiment in the world to detect a dark matter particle. The problem, however, was that DAMA couldn’t completely rule out the possibility that the signature it had detected was in fact due to some other seasonal variation on Earth, rather than the ebb and flow of dark matter as the Earth revolved around the Sun.

    SABRE aims to remove the ambiguities in DAMA’s data. After all the kinks are worked out in the testing equipment, the Gran Sasso experiment will become one half of SABRE. The other half will be located in Australia in a converted gold mine. By having a laboratory in the northern hemisphere and another in the southern hemisphere, this should help eliminate any false positives that result from normal seasonal fluctuations. At the moment, the SABRE detector is still in a proof of principle phase and is expected to begin observations in both hemispheres within the next few years.

    When it comes to SABRE, it’s possible that the experiment may disprove the best evidence physicists have found so far for a dark matter particle. But as D’Angelo pointed out, this type of disappointment is a fundamental part of science.

    “Of course I am afraid that there might not be any dark matter there and we are hunting ghosts, but science is like this,” D’Angelo said. “Sometimes you spend several years looking for something and in the end it’s not there so you have to change the way you were thinking about things.”

    For D’Angelo, probing the subatomic world with neutrino and dark matter research from a cave in Italy is his way of connecting to the universe writ large.

    “The tiniest elements of nature are bonded to the most macroscopic phenomena, like the expansion of the universe,” D’Angelo said. “The infinitely small touches the infinitely big in this sense, and I find that fascinating. The physics I do, its goal is to push over the boundary of human knowledge.”

    See the full article here.


    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    INFN Gran Sasso National Laboratory (LNGS) is the largest underground laboratory in the world devoted to neutrino and astroparticle physics, a worldwide research facility for scientists working in this field of research, where particle physics, cosmology and astrophysics meet. It is unequalled anywhere else, as it offers the most advanced underground infrastructures in terms of dimensions, complexity and completeness.

    LNGS is funded by the National Institute for Nuclear Physics (INFN), the Italian institution in charge of coordinating and supporting research in elementary particle physics and nuclear and subnuclear physics.

    Located between L’Aquila and Teramo, about 120 kilometres from Rome, the underground structures lie on one side of the 10-kilometre-long highway tunnel that crosses the Gran Sasso massif (towards Rome); the underground complex consists of three huge experimental halls (each 100 metres long, 20 metres wide and 18 metres high) and bypass tunnels, for a total volume of about 180,000 m³.

    Access to the experimental halls is horizontal, made easier by the highway tunnel. The halls are equipped with all the technical and safety equipment and plants necessary for the experimental activities and to ensure proper working conditions for the people involved.

    The 1,400 metres of rock above the Laboratory represent a natural shield that reduces the cosmic ray flux by a factor of one million; moreover, the flux of neutrons in the underground halls is about a thousand times less than at the surface, owing to the very small amount of uranium and thorium in the Dolomite calcareous rock of the mountain.

    The shielding from cosmic radiation provided by the rock coverage, together with the huge dimensions and the impressive basic infrastructure, makes the Laboratory unmatched in the detection of the weak or rare signals that are relevant to astroparticle, subnuclear and nuclear physics.

    Outside, immersed in a National Park of exceptional environmental and naturalistic interest on the slopes of the Gran Sasso mountain chain, an area of more than 23 acres hosts laboratories and workshops, the Computing Centre, the Directorate and several other Offices.

    Currently 1,100 scientists from 29 different countries are taking part in the experimental activities of LNGS.
    LNGS research activities range from neutrino physics to dark matter searches and nuclear astrophysics, and also to earth physics, biology and fundamental physics.

     
    • Marco Pereira 2:43 pm on September 1, 2018 Permalink | Reply

      I created a theory called the Hypergeometrical Universe Theory (HU). This theory uses three hypotheses:
      a) The Universe is a lightspeed expanding hyperspherical hypersurface. This was later proven correct by observations by the Sloan Digital Sky Survey
      https://hypergeometricaluniverse.quora.com/Proof-of-an-Extra-Spatial-Dimension
      b) Matter is made directly and simply from coherences between stationary states of deformation of the local metric called Fundamental Dilator or FD.
      https://hypergeometricaluniverse.quora.com/The-Fundamental-Dilator
      c) FDs obey the Quantum Lagrangian Principle (QLP). Yves Couder had a physical implementation (approximation) of the Fundamental Dilator and was perplexed that it would behave Quantum Mechanically. FDs and the QLP are the reason for Quantum Mechanics. QLP replaces Newtonian Dynamics and allows for the derivation of Quantum Gravity or Gravity as applied to Black Holes.

      HU derives a new law of gravitation that is epoch-dependent. That makes Type 1a supernovae epoch-dependent (within the context of the theory). HU then derives the absolute luminosity of SN1a as a function of G and shows that absolute luminosity scales with G^{-3}.
      Once the photometrically determined SN1a distances are corrected, HU CORRECTLY PREDICTS all SN1a distances given their redshifts z.
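      To make the arithmetic of the claimed correction concrete: a standard-candle distance is inferred from flux via d ∝ sqrt(L), so if the true absolute luminosity scaled as G^{-3} (the commenter's claimed relation, not an established result), the inferred distance would rescale by (G_epoch/G_now)^{-3/2}. A minimal sketch, with an illustrative G ratio:

      ```python
      def corrected_distance(standard_distance_mpc: float, g_ratio: float) -> float:
          """Rescale a photometric SN1a distance under the commenter's claim
          that absolute luminosity scales as G**-3 (hypothetical relation).

          Since d_L is proportional to sqrt(L), a luminosity rescaling of
          g_ratio**-3 rescales the inferred distance by g_ratio**-1.5.
          """
          return standard_distance_mpc * g_ratio ** -1.5

      # If G at the supernova's epoch were 10% larger than today (g_ratio = 1.1),
      # a nominal 100 Mpc standard-candle distance would shrink to ~86.7 Mpc.
      print(corrected_distance(100.0, 1.1))
      ```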

      The extra dimension refutes all 4D spacetime theories, including General Relativity and L-CDM. HU also falsifies all Dark Matter evidence:
      https://www.quora.com/Are-dark-matter-and-dark-energy-falsifiable/answer/Marco-Pereira-1
      including the Spiral Galaxy Conundrum and the Coma Cluster Conundrum.

      Somehow, my theory is still being censored by the community as a whole (either directly or by omission).

      I hope this posting will help correct this situation.


  • richardmitnick 3:58 pm on December 6, 2017 Permalink | Reply
    Tags: , Future of fossil fuels, IPCC- Intergovernmental Panel on Climate Change, Last month 20 countries including the United Kingdom and Canada declared they were phasing out coal altogether, Motherboard, Oil has begun to go down the same road, Paris climate targets   

    From Motherboard: “Earth Will Likely Be Much Warmer in 2100 Than We Anticipated, Scientists Warn” 

    Motherboard

    Dec 6 2017
    Stephen Leahy

    But there’s some good news, too. Countries are starting to move away from fossil fuels.

    Death Valley National Park. Image: Shutterstock

    Global temperature rise by 2100 could be 15 percent higher than the highest projections from the United Nations’ Intergovernmental Panel on Climate Change (IPCC), according to a new analysis of the most realistic climate models to date. This means cuts in greenhouse gases like carbon dioxide (CO2) will have to be even greater than expected to meet the Paris climate target of keeping global warming to less than 2℃.

    The world is a long way from making sufficient emission reductions to meet the Paris climate targets to begin with—never mind cutting out another 15 percent. But there’s some good news, too. Both rich and poor countries have begun to move away from coal and oil, the two biggest CO2 sources, according to many energy analysts.

    “Coal and oil are too dirty, too expensive and too risky to invest in,” said Tom Sanzillo, Director of Finance at the Institute for Energy Economics and Financial Analysis, in an interview.

    Patrick Brown, a researcher at the Carnegie Institution for Science in Pasadena, California, is a co-author of the study published Wednesday in Nature. “Our results imply 15 percent less cumulative emissions than previously calculated [are needed] in order to stay below 2℃,” he told me. Brown and co-authors focused on finding out what future warming might be, using only the climate models that best replicate observations over the last 15-20 years.

    On a business-as-usual emissions trajectory, they found that the mean global temperature rise would be 4.8℃ by 2100, compared to the IPCC estimate of 4.3℃. The latter estimate is considered catastrophic for our planet, and would lead to sea level rise of over 30 feet, potentially putting the homes of 600 million people underwater.

    The IPCC uses some 40 different climate models in its projections of future temperature increases, but some don’t replicate recent temperature observations well.

    “Our study shows that climate models that simulate the current climate with the most skill, tend to be models that project more global warming in the future,” Brown said.
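    The general idea behind the approach Brown describes can be sketched as a skill-weighted average: give each model's 2100 projection a weight based on how closely it reproduces recent observations, so better-fitting models count more. The numbers below are illustrative placeholders, not the study's data or its actual algorithm:

    ```python
    def skill_weighted_projection(models):
        """models: list of (projection_2100_celsius, rmse_vs_observations).
        Weight each projection by 1/rmse, so models with smaller error
        against the observational record contribute more to the mean."""
        weights = [1.0 / rmse for _, rmse in models]
        total = sum(weights)
        return sum(p * w for (p, _), w in zip(models, weights)) / total

    # Hypothetical model ensemble: (projection in C, error vs observations).
    models = [(4.3, 0.30), (4.9, 0.12), (5.1, 0.15), (3.9, 0.40)]
    # The two best-fitting models run warm, so the weighted mean (~4.75 C)
    # sits above the simple ensemble mean (4.55 C).
    print(round(skill_weighted_projection(models), 2))
    ```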

    The Nature study combines the best knowledge of observations with state-of-the-art climate modelling, said Joeri Rogelj, a climate scientist at the International Institute for Applied Systems Analysis, a scientific institute focused on critical issues of global environmental, economic, technological, and social change, and based in Austria. (He was not involved in the paper.)

    “It shows that if we do not strongly cut emissions the risks of high warming could be at the higher end of earlier expected ranges,” Rogelj said in an email. “This heightens the urgency and need for global efforts to limit greenhouse gas emissions.”

    Fortunately, massive reductions in the cost of solar and wind energy, along with the growing electric vehicle market, show that the cost of rapid CO2 reductions need not be as great as feared. In fact, it’s now cheaper to build and operate new solar and wind facilities than to keep old coal and nuclear plants running, according to a new report by Lazard, one of the world’s biggest financial advisory and asset management firms.

    The Trump administration plans to force electricity customers to pay for a multi-billion dollar annual bailout of old and uncompetitive coal and nuclear plants through surcharges on their monthly energy bills. The handout, which is estimated to amount to between $311 million and $288 billion by independent analysts, would only go to a handful of energy companies. The plan is to go before the Federal Energy Regulatory Commission on December 11.

    There’s no future for coal, said Sanzillo, the energy finance expert. Between 2002 and 2016, 531 coal-generating units were retired, according to the US Department of Energy. Coal now generates less than 30 percent of US electricity compared to 35 percent from gas.

    Last month 20 countries, including the United Kingdom and Canada, declared they were phasing out coal altogether. Another 30 are expected to join the coal phaseout by 2018.

    Oil has begun to go down the same road. Last month the trillion-dollar Government Pension Fund of Norway said oil is now a risky investment with poor long term prospects and is removing its $36 billion in oil stocks, said Sanzillo.

    “Norway’s pension fund knows as much as any investor about the oil market,” he told Motherboard. For other investors holding oil stocks, Norway’s declaration is a “sign written in neon,” he said.

    Since the 2008 recession, oil and gas stocks have been falling while the broader stock market has been gaining. For decades, big oil led the markets. Now they lag, he said. “Oil is in decline. There’s a major structural change underway in energy markets.”

    There is a growing understanding that energy no longer has to be expensive because there are now better alternatives to fossil fuels, he said. On top of that, the world’s automobile companies are investing heavily in electric vehicles. This means a diminishing number of investors will put their money into high-cost oil production like Canada’s oil sands or drilling in the Arctic. Since 2014, energy companies have delayed or canceled at least 64 oil sands projects.

    Last week the US Senate passed a bill allowing oil drilling in Alaska’s Arctic National Wildlife Refuge as part of a sweeping tax overhaul bill. However, oil prices would have to be substantially higher to make it profitable to drill there.

    “It will be interesting to see who wants to go there,” said Sanzillo.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The future is wonderful, the future is terrifying. We should know, we live there. Whether on the ground or on the web, Motherboard travels the world to uncover the tech and science stories that define what’s coming next for this quickly-evolving planet of ours.

    Motherboard is a multi-platform, multimedia publication, relying on longform reporting, in-depth blogging, and video and film production to ensure every story is presented in its most gripping and relatable format. Beyond that, we are dedicated to bringing our audience honest portraits of the futures we face, so you can be better informed in your decision-making today.

     
  • richardmitnick 12:44 pm on September 15, 2017 Permalink | Reply
    Tags: , , , , , Motherboard, Timescape cosmology, Universe   

    From Motherboard: “New Supernova Analysis Questions Dark Energy, Cosmic Acceleration” 

    Motherboard

    Sep 15 2017
    Michael Byrne

    Timescape cosmology offers a way around one of the universe’s best mysteries.

    Andrew Pontzen and Fabio Governato/ Wikimedia Commons

    One of my personal favorite features of the universe is that it is at this moment being ripped to shreds. Granted, it’s so far a very slow ripping, but, thanks to a peculiar property often referred to as dark energy, the universe is not just expanding, but it is accelerating in its expansion.

    Lambda-Cold Dark Matter, accelerated expansion of the universe, Big Bang-inflation (timeline of the universe), 2010. Credit: Alex Mittelmann, Coldcreation

    It will continue to do so, which means that as time increases, it will expand faster and faster. Eventually all of this ripping will render existence an endless expanse of cold nothingness. Space will have been shredded and scattered to infinity.
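    Why "faster and faster"? In a dark-energy-dominated universe the scale factor grows exponentially, a(t) = a0 · exp(H·t), so every cosmic distance doubles on a fixed timescale of ln(2)/H. A back-of-the-envelope sketch, assuming a round present-day Hubble constant of 70 km/s/Mpc:

    ```python
    import math

    KM_PER_MPC = 3.086e19        # kilometres in one megaparsec
    SECONDS_PER_GYR = 3.156e16   # seconds in a billion years

    # Hubble constant ~70 km/s/Mpc (an assumed round value), converted to 1/s.
    H = 70.0 / KM_PER_MPC

    # Doubling time of distances in a pure exponential (de Sitter) expansion.
    doubling_time_gyr = math.log(2) / H / SECONDS_PER_GYR
    # Roughly 10 billion years per doubling: slow at first, but growth in
    # absolute terms compounds without limit.
    print(round(doubling_time_gyr, 1))
    ```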

    This is still a pretty new understanding. Though Einstein kinda-sorta predicted it, it wasn’t until the 1990s that observations of distant supernovae indicated to astronomers that space is receding from itself, that there is some fundamental-seeming driver, commonly referred to as dark energy, that makes empty spaces want to become bigger and emptier. The evidence was that light from these supernovae appeared to be redshifted, a phenomenon in which light waves become stretched out as a light source moves away from the observer.
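    Redshift is simple to quantify: z = (λ_observed − λ_emitted) / λ_emitted, and for small z the recession velocity is roughly c·z. A minimal sketch with an illustrative measurement (the wavelengths below are made-up example values, not data from the supernova surveys):

    ```python
    def redshift(observed_nm: float, emitted_nm: float) -> float:
        """z = (lambda_obs - lambda_emit) / lambda_emit.
        Positive z means the source is receding from the observer."""
        return (observed_nm - emitted_nm) / emitted_nm

    # Hydrogen-alpha is emitted at 656.3 nm; suppose it arrives at 721.9 nm:
    z = redshift(721.9, 656.3)   # ~0.1
    # For small z, recession velocity ~ c * z ~ 30,000 km/s.
    print(round(z, 3))
    ```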

    Universe map: Sloan Digital Sky Survey (SDSS); 2dF Galaxy Redshift Survey

    According to a paper published this week in the Monthly Notices of the Royal Astronomical Society, we might just be wrong about all of this. The accelerating expansion may just be a sort of illusion driven by an incorrect assumption about the nature of the distribution of mass across the universe. As cosmological assumptions go, it’s a big one: that the universe is, on average, smooth and uniform in all locations and from all perspectives.

    Maybe not?

    In more technical terms, we’re talking about the cosmological properties of isotropy and homogeneity. Together, they form the cosmological principle, which is mostly supported by the apparent uniformity of the cosmic microwave background. What the authors behind the current study suggest is that maybe the cosmological principle is bunk, and, if this is the case, then observations of distant supernovae take on a different meaning because we can no longer assume that the universe looks about the same for every observer in every location.

    “While the remarkable isotropy of the CMB points to an initial state with a very high degree of smoothness, the late epoch Universe encompasses a complex cosmic web of structures,” the paper notes.

    CMB per ESA/Planck

    “It is dominated in volume by voids that are threaded and surrounded by clusters of galaxies distributed in sheets, knots and filaments.” In other words, space doesn’t really look all that smooth and uniform after all.

    The researchers’ alternative has a name: the timescape scenario. Because matter distributions may differ across the universe, different observers and different points within that space can be imagined to have their own relatively independent clocks (per general relativity, gravity affects the rate at which time passes). With different notions of time, these different locations will then have different notions of cosmic expansion.
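    How different can those clocks be? In the weak-field limit of general relativity, a clock at distance r from a mass M ticks at a rate sqrt(1 − 2GM/(r·c²)) relative to a distant observer. A minimal sketch using the Sun as an illustrative mass (this is the standard weak-field formula, not the timescape paper's own calculation):

    ```python
    import math

    G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
    C = 2.998e8     # speed of light, m/s

    def clock_rate(mass_kg: float, radius_m: float) -> float:
        """Weak-field GR: proper time elapsed per unit coordinate time at
        distance radius_m from mass mass_kg. Returns a factor <= 1."""
        return math.sqrt(1.0 - 2.0 * G * mass_kg / (radius_m * C * C))

    # A clock at the Sun's surface versus one far from any mass:
    rate = clock_rate(1.989e30, 6.96e8)
    # 1 - rate is about 2e-6: roughly two parts per million slower. Deep
    # voids versus dense clusters give analogous, cumulative offsets.
    print(1.0 - rate)
    ```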

    The study doesn’t read like a classic crank/contrarian screed and the authors seem willing enough to concede that there may well be nothing to it. It will just depend on more data. In the meantime, the timescape scenario may at least serve as a “diagnostic tool” or alternative perspective that can help astronomers better test current understandings of the large-scale structure of the universe.

    See the full article here.


     