
Collected Friday Feed, 2019

Anthony Starks
December 27, 2019

The Friday Feed for 2019

Transcript

  1. Friday Feed January 4, 2019

     Software Engineering at Google: There are many reasons for Google’s success, including enlightened leadership, great people, a high hiring bar, and the financial strength that comes from successfully taking advantage of an early lead in a very rapidly growing market. But one of these reasons is that Google has developed excellent software engineering practices, which have helped it to succeed. These practices have evolved over time based on the accumulated and distilled wisdom of many of the most talented software engineers on the planet. We would like to share knowledge of our practices with the world, and to share some of the lessons that we have learned from our mistakes along the way.

     How the Artificial-Intelligence Program AlphaZero Mastered Its Games: While their system is general-purpose enough to work for many two-person games, the researchers had adapted it specifically for Go, chess, and shogi (“Japanese chess”); it was given no knowledge beyond the rules of each game. At first it made random moves. Then it started learning through self-play. Over the course of nine hours, the chess version of the program played forty-four million games against itself on a massive cluster of specialized Google hardware. After two hours, it began performing better than human players; after four, it was beating the best chess engine in the world.

     Overlooked No More: Karen Sparck Jones, Who Established the Basis for Search Engines: When most scientists were trying to make people use code to talk to computers, Karen Sparck Jones taught computers to understand human language instead. In so doing, her technology established the basis of search engines like Google.

     Towards a Human Artificial Intelligence for Human Development: This paper discusses the possibility of applying the key principles and tools of current artificial intelligence (AI) to design future human systems in ways that could make them more efficient, fair, responsive, and inclusive.
  2. Childhood’s End: The digital revolution isn’t over but has turned into something else

     The next revolution will be the ascent of analog systems over which the dominion of digital programming comes to an end. Nature’s answer to those who sought to control nature through programmable machines is to allow us to build machines whose nature is beyond programmable control.
  3. Friday Feed January 11, 2019

     Cardiologist-level arrhythmia detection and classification in ambulatory electrocardiograms using a deep neural network: Here, we develop a deep neural network (DNN) to classify 12 rhythm classes using 91,232 single-lead ECGs from 53,549 patients who used a single-lead ambulatory ECG monitoring device. When validated against an independent test dataset annotated by a consensus committee of board-certified practicing cardiologists, the DNN achieved an average area under the receiver operating characteristic curve (ROC) of 0.97. The average F1 score, which is the harmonic mean of the positive predictive value and sensitivity, for the DNN (0.837) exceeded that of average cardiologists (0.780).

     The State Of Software Security In 2019: The good, bad, ugly... and the future.

     2019 UI and UX Design Trends: Deep flat, better browsers, 3D...

     Google Knows You Better Than Your Doctor Ever Could: It’s not intended as a substitute for medical professionals, but it has become one — free, convenient and crowdsourced.

     Computer History Museum: Yesterday’s Computer of Tomorrow: The Xerox Alto: How did personal computing start? Many credit Apple and IBM for this radical shift, but in 1973, years before the Apple II and IBM PC, Xerox built the Alto, a computer its makers thought could become the “computer of tomorrow.”
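The F1 comparison above (0.837 for the DNN vs. 0.780 for cardiologists) is just the harmonic mean the summary describes. A minimal sketch, with the function name and sample values mine rather than the paper's:

```python
def f1_score(precision: float, recall: float) -> float:
    # Harmonic mean of positive predictive value (precision)
    # and sensitivity (recall); high only when both are high.
    return 2 * precision * recall / (precision + recall)

# When precision equals recall, F1 is that same value:
print(round(f1_score(0.837, 0.837), 3))  # 0.837
```

Because it is a harmonic mean, F1 punishes imbalance: a model with precision 0.9 but recall 0.6 scores only 0.72, not the 0.75 an arithmetic mean would suggest.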
  4. Friday Feed January 18, 2019

     CES 2019: A Show Report: Steven Sinofsky’s massive annual CES review: This year was interesting and thought-provoking. Many technologies are maturing and what is needed now is a great deal of work to make them far more usable, useful, and desirable for the mass market.

     More Data is Not Better and Machine Learning is a Grind… Just Ask Amazon: Amazon had made all of these mistakes in the past (and more), and an important rule of thumb for machine learning programmers is to have a robust set of “guardrails”: standardizing on a single framework, creating environments for experimentation that mirror production, defining standard interfaces that models must conform to, and encapsulating systems that abstract both experimentation and production.

     Does AI make strong tech companies stronger? But today, if you started a retailer and said “…and we’re going to use databases”, that would not make you different or interesting - SQL became part of everything and then disappeared. The same will happen with machine learning.

     Giving algorithms a sense of uncertainty could make them more ethical: “Our behavior as moral beings is full of uncertainty. But when we try to take that ethical behavior and apply it in AI, it tends to get concretized and made more precise.” Instead, Eckersley proposes, why not explicitly design our algorithms to be uncertain about the right thing to do?

     The Route of a Text Message: This installment is about a single text message: how it was typed, stored, sent, received, and displayed. I sprinkle in some history and context to break up the alphabet soup of protocols, but though the piece gets technical, it should all be easily understood.
  5. You are thinking about serverless costs all wrong

     Most companies won’t have the luxury of worrying about the cost of using AWS Lambda to train their ML models. They can’t even hope to compete for the engineering talent they need to design these workloads in the first place. Which brings me to the biggest cost saving from serverless that has barely been talked about: the saving in personnel cost.
  6. Friday Feed January 25, 2019

     Amazon Architecture: What is it that we really mean by scalability? A service is said to be scalable if, when we increase the resources in a system, it results in increased performance in a manner proportional to resources added. Increasing performance in general means serving more units of work, but it can also be to handle larger units of work, such as when datasets grow.

     Notes from a 1984 trip to Xerox PARC: The idea of personal computers is an abreaction to time-sharing, with the unfortunate consequence that personal computers (to be specific, D machines) look like personal time-sharing computers, albeit without time-sharing software.

     Influencing mainstream software—Applying programming language research ideas to transform spreadsheets: One of the joys of working at Microsoft Research is the ability to directly influence mainstream software technologies – in this case, Microsoft Excel.

     The engineer’s engineer: Computer industry luminaries salute Dave Cutler’s five-decade-long quest for quality: “One of Dave’s unspoken contributions is that he has created generations of incredibly strong engineers, simply as a side-effect of being who he is and doing what he does... There was only one way to do things ‘right’ and ‘optimally,’ and it was intensely rigorous. He taught them. They taught me. I taught others. Generations have learned rigorous methodology, and have become far better engineers, even without having directly worked with him.”

     Looking Back at Google’s Research Efforts in 2018: 2018 was an exciting year for Google’s research teams, with our work advancing technology in many ways, including fundamental computer science research results and publications, the application of our research to emerging areas new to Google (such as healthcare and robotics), open source software contributions and strong collaborations with Google product teams, all aimed at providing useful tools and services.
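The scalability definition above is easy to check numerically. A minimal sketch (the function name and sample figures are illustrative, not from the article): divide the observed speedup by the factor of resources added, and a result of 1.0 is the proportional case the definition requires.

```python
def scaling_efficiency(old_resources, old_throughput, new_resources, new_throughput):
    # Observed speedup relative to the resource increase.
    # 1.0 = perfectly proportional scaling; < 1.0 = sub-linear.
    speedup = new_throughput / old_throughput
    resource_factor = new_resources / old_resources
    return speedup / resource_factor

# Doubling servers but gaining only 60% more throughput:
print(scaling_efficiency(4, 1000, 8, 1600))  # 0.8
```

The same ratio also captures the "larger units of work" case: hold throughput fixed, grow the dataset, and see whether resource needs grow in proportion.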
  7. Friday Feed February 1, 2019

     The state of AI in 2019: While the phrase “artificial intelligence” is unquestionably, undoubtedly misused, the technology is doing more than ever — for both good and bad. It’s being deployed in health care and warfare; it’s helping people make music and books; it’s scrutinizing your resume, judging your creditworthiness, and tweaking the photos you take on your phone. In short, it’s making decisions that affect your life whether you like it or not.

     Applied Machine Learning is a Meritocracy: Machine learning is a multidisciplinary field, meaning that you have people coming to it with backgrounds all across the fields of science and engineering. It also means that there is no archetype for the “machine learning practitioner”.

     Want To Design Great Digital Experiences? Start Working With Architects: “There are two majority roles in our life right now. Half the world is digital. Half the world is physical,” Neubert explains. “So I said, ‘How can I bring these two things together, where can I find a place to do that?’”

     The Experience Vision: A Self-Fulfilling UX Strategy: The easiest way to create an effective experience vision story is to start with the current experience. What makes today’s experience with our product or service frustrating for our users?

     Explainer: What is a quantum computer? Thanks to this counterintuitive phenomenon, a quantum computer with several qubits in superposition can crunch through a vast number of potential outcomes simultaneously. The final result of a calculation emerges only once the qubits are measured, which immediately causes their quantum state to “collapse” to either 1 or 0.
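The "vast number of potential outcomes" in the quantum-computer explainer grows exponentially: a register of n qubits in superposition spans 2^n basis states. A minimal sketch (names are mine) of that growth:

```python
def basis_states(n_qubits: int) -> int:
    # A register of n qubits in full superposition spans 2**n basis
    # states, so each added qubit doubles the state space.
    return 2 ** n_qubits

for n in (1, 10, 50):
    print(n, basis_states(n))
# 50 qubits already span 2**50 basis states, over a quadrillion.
```

This doubling is why even a few dozen qubits are hard to simulate classically: the classical memory needed tracks the 2^n amplitudes, not the n qubits.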
  8. Amazon Web Services brought in more money than McDonald’s in 2018

     Amazon Web Services, in other words, is the size of a large and fully independent company.
  9. Friday Feed February 8, 2019

     A High-Tech Pill to End Drug Injections: Here was the challenge for bioengineers: Find a way for patients to take drugs — like insulin or monoclonal antibodies used to treat cancers and other diseases — without injections.

     Making New Drugs With a Dose of Artificial Intelligence: DeepMind specializes in “deep learning,” a type of artificial intelligence that is rapidly changing drug discovery science. A growing number of companies are applying similar methods to other parts of the long, enormously complex process that produces new medicines. These A.I. techniques can speed up many aspects of drug discovery and, in some cases, perform tasks typically handled by scientists.

     Guidelines for Human-AI Interaction: We propose 18 generally applicable design guidelines for human-AI interaction. These guidelines are validated through multiple rounds of evaluation including a user study with 49 design practitioners who tested the guidelines against 20 popular AI-infused products.

     We analyzed data breaches and cyber-attacks of 2018, here are the key insights: 2018 also saw a rising trend in data leaks and exposures. Specifically, data with a lack of proper protection was often found to be just floating around online without even the most basic security, such as a password, needed to access it. Fully exposed databases and PII data were easily found, yet this could have been entirely prevented.

     User Experience Designers — The true fandom of Marie Kondo: The Kondo method is about choices. Much like UX designers making choices for customers, Kondo forces her clients to make decisions about which items to keep and which to give or throw away. The things that remain, according to Kondo, must spark joy, the primary measure of the process.
  10. Friday Feed February 15, 2019

     ‘Virtual Pharmacology’ Advance Tackles Universe of Unknown Drugs: Scientists at UC San Francisco, in collaboration with colleagues at the University of North Carolina (UNC), have developed the world’s largest virtual pharmacology platform and shown it is capable of identifying extremely powerful new drugs. The platform, soon to contain over a billion virtual molecules never before synthesized and not found in nature, is poised to dramatically change early drug discovery and send waves through the pharmaceutical industry.

     The Hard Truth About Innovative Cultures: Innovative cultures are misunderstood. The easy-to-like behaviors that get so much attention are only one side of the coin. They must be counterbalanced by some tougher and frankly less fun behaviors.

     “Reverse Innovation” Could Save Lives. Why Aren’t We Embracing It? Cheap and simple medical devices could improve performance and lower health-care costs, but first they have to overcome deeply rooted biases.

     Cloud Programming Simplified: A Berkeley View on Serverless Computing: This paper gives a quick history of cloud computing, including an accounting of the predictions of the 2009 Berkeley View of Cloud Computing paper, explains the motivation for serverless computing, describes applications that stretch the current limits of serverless, and then lists obstacles and research opportunities required for serverless computing to fulfill its full potential.

     Evaluation and accurate diagnoses of pediatric diseases using artificial intelligence: Our model demonstrates high diagnostic accuracy across multiple organ systems and is comparable to experienced pediatricians in diagnosing common childhood diseases. Our study provides a proof of concept for implementing an AI-based system as a means to aid physicians in tackling large amounts of data, augmenting diagnostic evaluations, and to provide clinical decision support in cases of diagnostic uncertainty or complexity.
  11. Nightscout #wearenotwaiting

     Nightscout (Continuous Glucose Monitoring in the Cloud) is an open source, DIY project that allows real-time access to CGM data via a personal website, smartwatch viewers, or apps and widgets available for smartphones.

     This is why AI has yet to reshape most businesses: Even if a company gets data flowing from many sources, it takes lots of experimentation and oversight to be sure that the information is accurate and meaningful. When Genpact, an IT services company, helps businesses launch what they consider AI projects, “10% of the work is AI,” says Sanjay Srivastava, the chief digital officer. “Ninety percent of the work is actually data extraction, cleansing, normalizing, wrangling.”
  12. Friday Feed February 22, 2019

     Once hailed as unhackable, blockchains are now getting hacked: More and more security holes are appearing in cryptocurrency and smart contract platforms, and some are fundamental to the way they were built.

     Amazon Alexa and the Search for the One Perfect Answer: One-shot answers were unfashionable back when Tunstall-Pedoe started programming at Cambridge. But that was no longer the case by the time the Echo came out. In the era of voice computing, offering a single answer is not merely a nice-to-have feature; it’s a need-to-have one. “You can’t provide 10 blue links by voice,” Tunstall-Pedoe says, echoing prevailing industry sentiment. “That’s a terrible user experience.”

     From Novelty to Table Stakes: How to Get Started with Voice in Health Care: At Moonshot, we have found that voice serves four key demographics particularly well: children, parents, the elderly, and the physically impaired. After all, voice is best utilized when solving real problems, and voice offers unique people-centric solutions to challenges faced by these groups.

     Supposedly ‘Fair’ Algorithms Can Perpetuate Discrimination: The system should help us understand, rather than obscure, the impact of algorithms on society. We must provide a mechanism for civil society to be informed and engaged in the way in which algorithms are used, optimizations set, and data collected and interpreted.

     Principles of Technology Leadership: The kind of talk that only Bryan Cantrill of Joyent can give, one which (enthusiastically) explores the importance of leadership principles ranging from the Gettysburg Address to Uber and exhorts the audience to think about how they might consider and apply these lessons to their own organizations.
  13. Friday Feed March 1, 2019

     10 Breakthrough Technologies 2019: Robot dexterity, new-wave nuclear power, predicting preemies, gut probe in a pill, custom cancer vaccines, the cow-free burger, carbon dioxide catcher, an ECG on your wrist, sanitation without sewers, and smooth-talking AI assistants.

     Meet Amanda Cox, Who Brings Life to Data on Our Pages: The Times’s newly named data editor spends her time thinking about how best to leverage data for journalism and present it with reader-friendly tools like charts, graphs and interactives.

     Visualization in Deep Learning: How interactive interfaces and visualizations help people use and understand neural networks.

     Microsoft Built a Bot to Match Patients to Clinical Trials: The Clinical Trials Bot lets patients and doctors search for studies related to a disease and then answer a succession of text questions. The bot then suggests links to trials that best match the patients’ needs. Drugmakers can also use it to find test subjects.
  14. Friday Feed March 8, 2019

     Put Humans at the Center of AI: At Stanford and Google, Fei-Fei Li is leading the development of artificial intelligence—and working to diversify the field.

     Best of Machine Learning: 40 of the best resources for Machine Learning and AI.

     Introducing GPipe, an Open Source Library for Efficiently Training Large-scale Neural Network Models: The ongoing development and success of many practical machine learning applications, such as autonomous driving and medical imaging, depend on achieving the highest accuracy possible. As this often requires building larger and even more complex models, we are happy to provide GPipe to the broader research community, and hope it is a useful infrastructure for efficient training of large-scale DNNs.

     Wristwatch heart monitors might save your life—and change medicine, too: Making complex heart tests available at the push of a button has far-reaching consequences.

     The Big Data Revolution Will Be Sampled: How ‘Big Data’ Has Come To Mean ‘Small Sampled Data’: It is just as true that many of our beliefs regarding the size of the datasets we use are absolutely wrong. In particular, many of the vanguards of the big data era that we hold up as benchmarks of just what it means to work with “big data”, like Facebook and Twitter, are actually vastly smaller than we have been led to believe.
  15. Friday Feed March 15, 2019

     Lessons learned building natural language processing systems in health care: NLP systems in health care are hard—they require broad general and medical knowledge, must handle a large variety of inputs, and need to understand context.

     Welcome to Health News: Health News is built and maintained by the team behind Nukleosome. The ultimate goal of this online community is to create an intelligent discourse on human health.

     Iodide: an experimental tool for scientific communication and exploration on the web: But to date, very few tools have focused on helping scientists gain unfiltered access to the full communication potential of modern web browsers. So today we’re excited to introduce Iodide, an experimental tool meant to help scientists write beautiful interactive documents using web technologies, all within an iterative workflow that will be familiar to many scientists.

     A Peek into the Future of Wearables: Mind reading glasses, goggles that erase chronic pain, a wristband that can hear what the wearer can’t, and more futuristic wearables are on the horizon.
  16. Friday Feed March 22, 2019

     Apple Watch detects irregular heart beat in large U.S. study: The Apple Watch was able to detect irregular heart pulse rates that could signal the need for further monitoring for a serious heart rhythm problem, according to data from a large study funded by Apple Inc (AAPL.O), demonstrating a potential future role for wearable consumer technology in healthcare.

     AI Algorithms are Now Shockingly Good at Doing Science: Of course, the use of computers to aid in scientific research goes back about 75 years, and the method of manually poring over data in search of meaningful patterns originated millennia earlier. But some scientists are arguing that the latest techniques in machine learning and AI represent a fundamentally new way of doing science. One such approach, known as generative modeling, can help identify the most plausible theory among competing explanations for observational data, based solely on the data, and, importantly, without any preprogrammed knowledge of what physical processes might be at work in the system under study. Proponents of generative modeling see it as novel enough to be considered a potential “third way” of learning about the universe.

     Nvidia announces $99 AI computer for developers, makers, and researchers: In recent years, advances in AI have produced algorithms for everything from image recognition to instantaneous translation. But when it comes to applying these advances in the real world, we’re only just getting started. A new product from Nvidia announced today at GTC — a $99 AI computer called the Jetson Nano — should help speed that process.

     Microsoft, Facebook, trust and privacy: In other words, where Microsoft put better locks and a motion sensor on the windows, the world is moving to a model where the windows are 200 feet off the ground and don’t open... Much like moving from Windows to cloud and ChromeOS, you could see this as an attempt to remove the problem rather than patch it. Russians can’t go viral in your newsfeed if there is no newsfeed. ‘Researchers’ can’t scrape your data if Facebook doesn’t have your data. You solve the problem by making it irrelevant.
  17. Cybersecurity is not important

     Adaptations to cyberspace of techniques that worked to protect the traditional physical world have been the main means of mitigating the problems that occurred. This “chewing gum and baling wire” approach is likely to continue to be the basic method of handling problems that arise, and to provide adequate levels of security.
  18. Friday Feed March 29, 2019

     A new study shows what it might take to make AI useful in health care: Researchers used machine vision to help nurses monitor ICU patients. The way they approached their work shows the value of asking what people actually need artificial intelligence for.

     Fathers of the Deep Learning Revolution Receive ACM A.M. Turing Award: Working independently and together, Hinton, LeCun and Bengio developed conceptual foundations for the field, identified surprising phenomena through experiments, and contributed engineering advances that demonstrated the practical advantages of deep neural networks.

     Computer Latency at a Human Scale: How fast are computer systems really?
  19. Friday Feed April 5, 2019

     Engineering Proteins in the Cloud with Python and Transcriptic, or, How to Make Any Protein You Want for $360: In this article I’ll develop Python code that will take me from an idea for a protein all the way to expression of the protein in a bacterial cell, all without touching a pipette or talking to a human. The total cost will only be a few hundred dollars! Using Vijay Pande from A16Z’s terminology, this is Bio 2.0.

     A Clever New Strategy for Treating Cancer, Thanks to Darwin: In advanced-stage cancer, it’s generally a matter of when, not if, the pugnacious surviving cells will become an unstoppable force. Gatenby thought this deadly outcome might be prevented. His idea was to expose a tumor to medication intermittently, rather than in a constant assault, thereby reducing the pressure on its cells to evolve resistance.

     Death by a Thousand Clicks: Where Electronic Health Records Went Wrong: Until this point, Foster, like most Americans, knew next to nothing about electronic medical records, but he was quickly amassing clues that eCW’s software had major problems—some of which put patients, like Annette Monachelli, at risk.

     7 Ways Machine Learning Projects Fail: Through our years of experience in this field, we’ve identified several common ways that machine learning projects fail. Understanding these problems – and why they occur – will help you better assess the viability of your next machine learning project, and, most importantly, align the expectations of your team with actual outcomes.

     How Health Apps Let Women Down: Even health apps for the general public have gendered design. These design blunders happen when designers and technologists don’t take time to understand people’s needs.
  20. Friday Feed April 12, 2019

     The FDA wants to regulate machine learning in health care: The agency released a white paper proposing a regulatory framework to decide how medical products that use AI should seek approval before they can go on the market. It is the biggest step the FDA has taken to date toward formalizing oversight of products that use machine learning (ML).

     Cloud Healthcare API: Cloud Healthcare API bridges the gap between care systems and applications built on Google Cloud. By supporting standards-based data formats and protocols of existing healthcare technologies, Cloud Healthcare API connects your data to advanced Google Cloud capabilities, including streaming data processing with Cloud Dataflow, scalable analytics with BigQuery, and machine learning with Cloud Machine Learning Engine.

     How IBM Watson Overpromised and Underdelivered on AI Health Care: IBM has learned these painful lessons in the marketplace, as the world watched. While the company isn’t giving up on its moon shot, its launch failures have shown technologists and physicians alike just how difficult it is to build an AI doctor.

     Google AI Platform: AI Platform makes it easy for machine learning developers, data scientists, and data engineers to take their ML projects from ideation to production and deployment, quickly and cost-effectively.

     Ethically Aligned Design: A vision for prioritizing human well-being with autonomous and intelligent systems.

     Amazon’s Alexa now handles patient health information: Amazon has invited six health care companies to build tools using Alexa.
  21. Friday Feed April 19, 2019

     AI Helps Classify Lung Cancer at the Pathologist Level: “Considering the quick turnaround time of our model, it could be integrated into existing laboratory information management systems to automatically pre-populate diagnoses for histologic patterns on slides or provide a second opinion on more challenging patterns. In addition, a visualization of the entire slide, examined by our model at the piecewise level, could highlight elusive areas of high-grade patterns as well as primary regions of tumor cells.”

     3D Printing of Personalized Thick and Perfusable Cardiac Patches and Hearts: A simple approach to 3D-print thick, vascularized, and perfusable cardiac patches that completely match the immunological, cellular, biochemical, and anatomical properties of the patient is reported.

     This is how AI bias really happens—and why it’s so hard to fix: Bias can creep in at many stages of the deep-learning process, and the standard practices in computer science aren’t designed to detect it.

     How Algorithms Know What You’ll Type Next: This article is about how text predictors work, and how crucial the input language dataset is for the resulting predictions. To see this in action, we will predict tweets by four Twitter accounts: Barack Obama, Justin Timberlake, Kim Kardashian, and Lady Gaga.
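A text predictor of the kind the last article describes can be sketched in a few lines. This is a minimal bigram model, not the article's actual method; the function names and the tiny training string are illustrative:

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    # Count, for every word, which words follow it in the training text.
    words = corpus.lower().split()
    following = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        following[prev][nxt] += 1
    return following

def predict_next(model, word):
    # Most frequent continuation seen in training, or None if unseen.
    counts = model.get(word.lower())
    return counts.most_common(1)[0][0] if counts else None

model = train_bigrams("the cat sat on the mat and the cat ran")
print(predict_next(model, "the"))  # cat  ("cat" follows "the" twice, "mat" once)
```

The predictions are only as good as the training corpus, which is the article's point: train on one Twitter account and the model echoes that account's phrasing.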
  22. Friday Feed April 26, 2019

     In African Villages, These Phones Become Ultrasound Scanners: A hand-held device brings medical imaging to remote communities, often for the first time.

     AI Now Report 2018: AI Now works with a broad coalition of stakeholders, including academic researchers, industry, civil society, policy makers, and affected communities, to identify and address issues raised by the rapid introduction of AI across core social domains. AI Now produces interdisciplinary research to help ensure that AI systems are accountable to the communities and contexts they are meant to serve, and that they are applied in ways that promote justice and equity.

     The HeartBleed vulnerability happened five years ago. Here is the story of how AWS reacted: “I think right around this minute is just about exactly 5 years since the Heartbleed vulnerability in OpenSSL became public.”

     Zoom, Zoom, Zoom! The Exclusive Inside Story Of The New Billionaire Behind Tech’s Hottest IPO: “Customers have always said, ‘Eric, we’ll become your very important customer, you’ve got to visit us,’” says Yuan. “I say, ‘Fine, I’m going to visit you, but let’s have a Zoom call first.’” That’s usually enough.

     Notes on AI Bias: Machine learning finds patterns in data. ‘AI Bias’ means that it might find the wrong patterns - a system for spotting skin cancer might be paying more attention to whether the photo was taken in a doctor’s office. ML doesn’t ‘understand’ anything - it just looks for patterns in numbers, and if the sample data isn’t representative, the output won’t be either. Meanwhile, the mechanics of ML might make this hard to spot.
  23. Friday Feed May 3, 2019

     People Are Clamoring to Buy Old Insulin Pumps: How an obsolete medical device with a security flaw became a must-have for some patients with type 1 diabetes.

     How Microsoft Learned from the Past to Redesign its Future: Every Thursday, Microsoft’s Surface, Windows, and app teams get together to discuss what they’re working on. During one of these many meetings in a sunny conference room at Microsoft’s Redmond HQ, designers sat around debating how playful Microsoft should be with its designs. What’s the tone of voice? What’s the visual representation of the personality of the product? Ultimately, how should Microsoft’s voice be expressed in the form of illustrations and design?

     ThoughtWorks Technology Radar Vol 20: An opinionated guide to technology frontiers: an interesting rating system for a bunch of different tools, techniques, platforms, and languages/frameworks.

     The Most Valuable Company (for Now) Is Having a Nadellaissance: According to a former executive, Nadella, frustrated with hand-wringing about the new cloud-vs.-Windows hierarchy, scolded a group of top executives early in his tenure. At his Microsoft, there would be only “fixers,” no “complainers.” If people didn’t buy into his vision, he’d tell them, “Don’t stay. Time to move on.”

     Microsoft, Slack, Zoom, and the SaaS Opportunity: The most important factor that made all of this possible, though, is that for all of the disruption that the enterprise market has faced thanks to the rise of Software-as-a-Service (SaaS), Microsoft was remarkably well-placed to take advantage of this new paradigm, if only they could get out of their own way.
  24. Friday Feed May 10, 2019

     MIT AI model is ‘significantly’ better at predicting breast cancer: The model can find breast cancer earlier and eliminates racial disparities in screening.

     Tukey, Design Thinking, and Better Questions: Far better an approximate answer to the right question, which is often vague, than an exact answer to the wrong question, which can always be made precise.

     Google backs a bid to use CRISPR to prevent heart disease: The startup, Verve Therapeutics, said it had raised $58.5 million from investors including Alphabet’s venture fund, GV. What makes Verve different? Most gene-therapy companies have gone after rare diseases like hemophilia. But Verve thinks editing people’s DNA could instead help solve the most common cause of death.

     Highlights of Microsoft’s Build Conference: Build 2019 summary.

     Highlights of Google I/O 2019: Summary of Google’s annual developer event.
  25. Friday Feed May 17, 2019

     The inside story of why Amazon bought PillPack in its effort to crack the $500 billion prescription market: PillPack is just a piece of Amazon’s expansive plan to uproot the $3 trillion U.S. health-care industry. The company is also working with J.P. Morgan Chase and Berkshire Hathaway on a joint venture called Haven aiming to improve care and bring down the costs.

     How AI could save lives without spilling medical secrets: Oasis stores the private patient data on a secure chip, designed in collaboration with other researchers at Berkeley. The data remains within the Oasis cloud; outsiders are able to run algorithms on the data, and receive the results, without its ever leaving the system. A smart contract—software that runs on top of a blockchain—is triggered when a request to access the data is received. This software logs how the data was used and also checks to make sure the machine-learning computation was carried out correctly.

     What is a Neural Network? A YouTube series on the basics of neural networks and deep learning.

     Stanford University: Illustrated Machine Learning Cheatsheets: A set of illustrated Machine Learning cheatsheets covering the content of the CS 229 class, which I TA-ed in Fall 2018 at Stanford. They can (hopefully!) be useful to all future students of this course as well as to anyone else interested in Machine Learning.

     A new way to build tiny neural networks could create powerful AI on your phone: We’ve been wasting our processing power to train neural networks that are ten times too big.

     People + AI Guidebook: This Guidebook will help you build human-centered AI products. It’ll enable you to avoid common mistakes, design excellent experiences, and focus on people as you build AI-driven applications.
  26. Friday Feed May 24, 2019 End-to-end lung cancer screening with

    three-dimensional deep learning on low-dose chest computed tomography We propose a deep learning algorithm that uses a patient’s current and prior computed tomography volumes to predict the risk of lung cancer. Our model achieves a state-of-the-art performance (94.4% area under the curve) on 6,716 National Lung Cancer Screening Trial cases, and performs similarly on an independent clinical validation set of 1,139 cases. The Conversational AI Playbook This playbook represents a first step toward defining the governing principles and best practices which will enable developers to build great conversational applications. It is the result of several years of practical experience building and deploying dozens of the most advanced conversational applications achievable. Amazon Is Working on a Device That Can Read Human Emotions The wrist-worn gadget is described as a health and wellness product in internal documents reviewed by Bloomberg. It’s a collaboration between Lab126, the hardware development group behind Amazon’s Fire phone and Echo smart speaker, and the Alexa voice software team. I'd blush if I could: closing gender divides in digital skills through education Most AI voice assistants are gendered as young women, and they’re mostly used to answer questions or carry out tasks like checking the weather, playing music, or setting reminders. This sends a signal that women are docile, eager-to-please helpers, without any agency, and always on hand to help their masters.
  27. Friday Feed May 31, 2019 Open Insulin A team of

    Bay Area biohackers working on newer, simpler, less expensive ways to make insulin. Introducing Mercury OS A speculative vision of the operating system, driven by humane design principles. Insecure and Unintuitive: How We Need to Fix the UX of Security The most expensive dialog box in the world costs an Australian bank $750,000,000/year for password resets. The end of mobile How many people would have these things? Now, we know the answer: everyone. Everyone would have one.
  28. Friday Feed June 7, 2019 The 9 biggest highlights from

    Apple WWDC 2019 At WWDC 2019, Apple announced a slew of software updates headed toward the Apple family of devices, including the iPhone, iPad, Mac desktops and laptops, Apple TV, and Apple Watch. There’s also the announcement of the new Mac Pro, a powerful computing device that hasn’t been refreshed since 2013. Software Engineering for Machine Learning: A Case Study Recent advances in machine learning have stimulated widespread interest within the Information Technology sector on integrating AI capabilities into software and services. This goal has forced organizations to evolve their development processes. We report on a study that we conducted on observing software teams at Microsoft as they develop AI-based applications. Flexible systems are the next frontier of machine learning Two important contributors to the field of artificial intelligence and machine learning – Jeff Dean, head of Google AI and the co-founder of Google Brain, and Chris Ré, associate professor of computer science at Stanford – discussed the future of flexible machine learning at a recent session of the AI Salon, hosted by the Stanford AI Lab and the Stanford Institute for Human-Centered Artificial Intelligence. Google Tried to Prove Managers Don't Matter. Instead, They Discovered 10 Traits of the Very Best Ones So what follows are the 10 characteristics Google believes make for the best managers (and that they expect from managers): (1) Coach well, (2) Empower teams, (3) Create an inclusive team environment, (4) Communicate well, (5) Have a clear strategy, (6) Be productive and results-oriented, (7) Support career development and discuss performance, (8) Advise the team with expertise, (9) Collaborate across the company, (10) Make strong decisions
  29. Friday Feed June 14, 2019 Internet Trends 2019 The annual

    Mary Meeker report. Besides the usual trends: some 51 percent of the world — 3.8 billion people — were internet users last year, up from 49 percent (3.6 billion) in 2017, and Americans are spending more time with digital media than ever: 6.3 hours a day in 2018, up 7 percent from the year before. Note the healthcare section: "Digital Impact expanding across the ecosystem". CRISPR scientists are teaming up with a pharma giant to look for new drug clues GlaxoSmithKline will pour $67 million into a new laboratory at the University of California to industrialize the search for drug clues using the gene-editing tool CRISPR. Benchling: Evolving to Enterprise-Grade Permissions Benchling is a data platform to help scientists do research. Hundreds of thousands of scientists across academic labs and enterprise companies use Benchling to store and analyze scientific data, record notes, and share findings with each other. How 5 Data Dynamos Do Their Jobs Our news reporters are increasingly choosing to level up their data skills as well in order to find stories hidden in the numbers, organize their reporting and check government conclusions. The demand for this knowledge has been so great that our digital transition team now runs a training program to help reporters work on these skills.
  30. Friday Feed June 21, 2019 AI could be the key

    to catching Type 1 diabetes much earlier At the American Diabetes Association's 79th Scientific Sessions in early June, IBM and JDRF (formerly known as the Juvenile Diabetes Research Foundation), a nonprofit that spearheads Type 1 diabetes research, unveiled a predictive AI tool that has mapped the presence of Type 1 diabetes antibodies in blood to figure out exactly when and how the condition could develop. Google Assistant bests Alexa and Siri in most-used medication knowledge Research released today in Nature Digital Medicine, examining AI assistant intelligence about the 50 most-used medications in the United States, gave Google Assistant a comprehension score of 91.8% accuracy on brand-name medication and 84.3% for generic names, followed by 58.5% and 51.2% for Siri, and 54.6% and 45.5% for Alexa. End-User Probabilistic Programming From Microsoft Research: The purpose of this paper is to bring together two rather distinct approaches to decision making under uncertainty: spreadsheets and probabilistic programming. We start by introducing these two approaches. How to design AI tools for the workplace, according to Google To ensure we’re building the right type of AI tools for employees, we tested AI products among our employee base of nearly 100,000 people to see how different machine-learning models played out in real circumstances. We identified the following core principles as the ones people most want to see in workplace AI. Nines are Not Enough: Meaningful Metrics for Clouds Providers also must define internal metrics to allow them to operate their systems without violating customer promises or expectations. We explore why these guarantees are hard to define. We show that this problem shares some similarities with the challenges of applying statistics to make decisions based on sampled data.
We also suggest that defining guarantees in terms of defense against threats, rather than guarantees for application-visible outcomes, can reduce the complexity of these problems. Overall, we offer a partial framework for thinking about Service Level Objectives (SLOs), and discuss some unsolved challenges.
  31. Friday Feed June 28, 2019 The Two Technologies Changing the

    Future of Cancer Treatment Now scientists are busy exploring better ways forward. “As an oncologist, I just couldn’t accept chemotherapy toxicity,” said Miriam Merad, another panelist and the director of the Precision Immunology Institute at Mount Sinai School of Medicine. “We have an immune system that’s been shaped for millennia to fight against damage, and I thought that we should be able to use it to fight against cancer cells.” The STRIDE Threat Model You can group threats into categories to help you formulate these kinds of pointed questions. One model you may find useful is STRIDE, derived from an acronym for the following six threat categories: Spoofing, Tampering, Repudiation, Information disclosure, Denial of service, Elevation of privilege. The future of AI research is in Africa In the last few years, the machine-learning community has blossomed, applying the technology to challenges like food security and health care. Using AWK and R to parse 25tb of DNA data Recently I was put in charge of setting up a workflow for dealing with a large amount of raw DNA sequencing (well technically a SNP chip) data for my lab. The goal was to be able to quickly get data for a given genetic location (called a SNP) for use in modeling, etc. Using vanilla R and AWK I was able to clean up and organize the data in a natural way, massively speeding up the querying. It certainly wasn’t easy and it took lots of iterations. This post is meant to help others avoid some of the same mistakes and show what did eventually work.
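The core trick in the AWK-and-R post, reorganizing flat sequencing output once so that any single SNP can be fetched quickly, can be illustrated with a toy Python version. The field names and tiny dataset here are invented for the example; the author's actual pipeline ran AWK and R over terabytes:

```python
from collections import defaultdict

# Raw rows as they might arrive: one row per (sample, snp) measurement.
raw_rows = [
    ("sample1", "rs123", "AA"),
    ("sample2", "rs123", "AG"),
    ("sample1", "rs456", "CC"),
    ("sample2", "rs456", "CT"),
]

# Group by SNP in one pass (the expensive reorganization), so later
# lookups are O(1) instead of a full scan of the dataset per query.
by_snp = defaultdict(dict)
for sample, snp, genotype in raw_rows:
    by_snp[snp][sample] = genotype

print(by_snp["rs123"])  # {'sample1': 'AA', 'sample2': 'AG'}
```

At 25TB the same idea means splitting the raw files into per-location chunks on disk, but the payoff is identical: pay the reshaping cost once, then every per-SNP query is cheap.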
  32. Friday Feed July 5, 2019 This changes everything for the

    DIY Diabetes Community We the DIY #diabetes community declared #WeAreNotWaiting and, dammit, we'd do this ourselves. And now TidePool expressing the intent to put an Artificial Pancreas in the damn App Store - along with Angry Birds - WITH SUPPORT FOR WARRANTIED NEW BLE PUMPS. I could cry. Tracking and Visualizing Faces Detect faces in a camera feed, overlay matching virtual content, and animate facial expressions in real-time. Formal Foundations of Serverless Computing First, to ease reasoning about code, we present a simplified naive semantics of serverless execution and precisely characterize when the naive semantics and λΛ coincide. Second, we augment λΛ with a key-value store to allow reasoning about stateful serverless functions. Third, since a handful of serverless platforms support serverless function composition, we show how to extend λΛ with a composition language. We have implemented this composition language and show that it outperforms prior work. Accelerating Science: A Computing Research Agenda This calls for a concerted research agenda aimed at: development, analysis, integration, sharing, and simulation of algorithmic or information processing abstractions of natural processes, coupled with formal methods and tools for their analyses and simulation; innovations in cognitive tools that augment and extend human intellect and partner with humans in all aspects of science. Abigail Echo-Hawk on the art and science of 'decolonizing data' The chief research officer of the Seattle Indian Health Board is creating programs and databases that are not based on Western concepts to better serve indigenous communities.
  33. Amazon is 25 years old. What will the company’s next

    chapter look like? Amazon turns 25 this week. Who would’ve thought back in 1994 that a tiny online bookseller might turn into a company that touches pretty much every aspect of our lives — what we buy, how we compute, how we watch movies and even get groceries?
  34. Friday Feed July 12, 2019 DNA Data Storage Is Closer

    Than You Think Life’s information-storage system is being adapted to handle massive amounts of information A new way to use the AI behind deepfakes could improve cancer diagnosis Generative adversarial networks, the algorithms responsible for deepfakes, have developed a bit of a bad rap of late. But their ability to synthesize highly realistic images could also have important benefits for medical diagnosis. Your Pa$$word doesn't matter Every week I have at least one conversation with a security decision maker explaining why a lot of the hyperbole about passwords – “never use a password that has ever been seen in a breach,” “use really long passwords”, “passphrases-will-save-us”, and so on – is inconsistent with our research and with the reality our team sees as we defend against 100s of millions of password-based attacks every day. Focusing on password rules, rather than things that can really help – like multi-factor authentication (MFA), or great threat detection – is just a distraction. Because here’s the thing: When it comes to composition and length, your password (mostly) doesn’t matter. The Computer Maverick Who Modeled the Evolution of Life Barricelli programmed some of the earliest computer algorithms that resemble real-life processes: a subdivision of what we now call “artificial life,” which seeks to simulate living systems—evolution, adaptation, ecology—in computers. Barricelli presented a bold challenge to the standard Darwinian model of evolution by competition by demonstrating that organisms evolved by symbiosis and cooperation. The False Musk-Jobs Parallel Comparing Elon Musk to Steve Jobs makes for easy headlines, but it’s not a helpful way to look at Tesla’s perennial mix of progress and trouble.
  35. Friday Feed July 19, 2019 Neil And Buzz Go For

    A Walk This is the story about what happened on that 1969 Moonwalk, told with highlights from the original transmission log. Scroll to see the adventure. MIT Science Reporter—"Computer for Apollo" (1965) In commemoration of the 50th anniversary of the first manned moon landing, an overview of the computing equipment that got us there. Margaret Hamilton: ‘They worried that the men might rebel. They didn’t’ Computer pioneer Margaret Hamilton was critical to landing astronauts on the moon for the first time on 20 July 1969 and returning them safely a few days later. The young Massachusetts Institute of Technology (MIT) computer programmer and working mother led the team that created the onboard flight software for the Apollo missions, including Apollo 11. Africa Is Tech’s Next Great Frontier, Github A.I. Expert Says The bleeding edge of technology innovation is increasingly shifting from Silicon Valley to places that are also at the frontiers of economic development, including sub-Saharan Africa, says Omoju Miller, who works on machine learning at software development platform Github. The future of work in America: People and places, today and tomorrow The day-to-day nature of work could change for nearly everyone as intelligent machines become fixtures in the American workplace.
  36. Friday Feed July 26, 2019 The Hidden Costs of Automated

    Thinking This approach to discovery—answers first, explanations later—accrues what I call intellectual debt. It’s possible to discover what works without knowing why it works, and then to put that insight to use immediately, assuming that the underlying mechanism will be figured out later. In some cases, we pay off this intellectual debt quickly. But, in others, we let it compound, relying, for decades, on knowledge that’s not fully known. Vaccination rates - data exploration Data, analysis and source code for a story on vaccination rates in New Zealand Microsoft invests in and partners with OpenAI to support us building beneficial AGI We’re partnering to develop a hardware and software platform within Microsoft Azure which will scale to AGI. We’ll jointly develop new Azure AI supercomputing technologies, and Microsoft will become our exclusive cloud provider—so we’ll be working hard together to further extend Microsoft Azure’s capabilities in large-scale AI systems. Apple’s Heir Apparent Is Much More Like Tim Cook Than Steve Jobs When Apple announced the pending departure of Chief Design Officer Jony Ive last month, it threw the spotlight on an executive few outsiders know: Chief Operating Officer Jeff Williams, who has now also taken over the company’s legendary design studio. This added fiefdom makes Williams unambiguously the second-most important person at Apple and Tim Cook’s heir apparent as CEO. Saving the World from Spreadsheets Spreadsheets are one of the most widely used programming environments, with roughly 1 billion users of Microsoft Excel alone. Unfortunately, spreadsheets make it all too easy to make errors that go unnoticed. These errors can have catastrophic consequences because spreadsheets are widely deployed in domains like finance and government. For instance, the infamous “London Whale” incident in 2012 cost JP Morgan approximately $2 billion; this was due in part to a spreadsheet programming error.
  37. Friday Feed August 2, 2019 Fast Software, the Best Software

    I love fast software. That is, software speedy both in function and interface. Software with minimal to no lag between wanting to activate or manipulate something and the thing happening. Lightness. How Phones Made the World Your Office, Like It or Not The telephone began to pervade our lives at the end of the 19th century, and then — as you can see in these photos from The New York Times’s archives — it became our lives. Cellphones were a significant inflection point. They made it possible for us to be available at virtually any moment, which was so extraordinary that most of us tacitly accepted that we should be available at virtually any moment. Machine Learning That’s Light Enough for an Arduino I want to find out what happens when we bring machine learning to cheap, robust devices that can have all kinds of sensors and work in all kinds of environments. And I want you to help. The kind of AI we can squeeze into a US $30 or $40 system won’t beat anyone at Go, but it opens the door to applications we might never even imagine otherwise. London Lab Advances Use of A.I. in Health Care, but Raises Privacy Concerns In a paper published on Wednesday in the science journal Nature, researchers from DeepMind, a London artificial intelligence lab owned by Google’s parent company, detail a system that can analyze a patient’s health records, including blood tests, vital signs and past medical history, and predict acute kidney injury (A.K.I.) up to 48 hours before onset. Would You Want a Computer to Judge Your Risk of H.I.V. Infection? A new software algorithm decides which patients are most likely to become infected with the virus. Does the assessment stigmatize patients?
  38. Friday Feed August 9, 2019 I'm a cyborg now! (On

    Building My Own Artificial Pancreas) Given the choice between a) changing my lifestyle to be boring and b) hacking my metabolism, I chose the easier option. I built an artificial pancreas using OpenAPS! And it's changing my life. Apple and Eli Lilly are studying whether data from iPhones and Apple Watches can detect signs of dementia According to research published this week, the two companies teamed up with health-tech start-up Evidation to find ways to more quickly and precisely detect cognitive impairments like Alzheimer’s disease with the help of popular consumer gadgets. Cinematic scientific visualization: the art of communicating science Visualization is a wide field, and in this course we are focusing specifically on “cinematic scientific visualization”. Cinematic scientific visualization focuses on using Hollywood storytelling techniques to communicate spatial computational data with the public. This type of visualization is production-quality, data-driven imagery created with movie-making tools with good composition, camera direction, and artistic aesthetics suitable for distribution in immersive giant screen theaters. Work Ruined Email The inbox became a to-do list, and everything started to flow through it. The ting of a new email elicits panic because it signals the arrival of a new toil: a new assignment from a boss, a request from a colleague, a policy notice from human resources, an announcement from management, a networking request from a stranger. You didn’t ask for any of these, but now you have to deal with them—even if just to press delete. The Lonely Work of Moderating Hacker News The question facing Hacker News is whether the site’s original tech-intellectual culture can be responsibly scaled up to make space for a more inclusive, wider-ranging vision of technology.
  39. A Framework for Moderation The question of when and why

    to moderate or ban has been an increasingly frequent one for tech companies, although the circumstances and content to be banned have often varied greatly. Some examples from the last several years:
  40. Friday Feed August 16, 2019 First They Came for the

    Black Feminists In 2013, a series of disinformation campaigns was spread on Twitter, primarily targeting black women. The manipulation tactics developed during these campaigns served as a blueprint for future movements like Gamergate and continue to be shared across anonymous message boards and far-right blogs, shaping the online world. Training bias in AI "hate speech detector" means that tweets by Black people are far more likely to be censored The authors do a pretty good job of pinpointing the cause: the people who hand-labeled the training data for the algorithm were themselves biased, and systematically misidentified AAE writing as offensive. And since machine learning models are no better than their training data (though they are often worse!), the bias in the data propagated through the model. Algorithmic Bias? An Empirical Study of Apparent Gender-Based Discrimination in the Display of STEM Career Ads We explore data from a field test of how an algorithm delivered ads promoting job opportunities in the science, technology, engineering and math fields. This ad was explicitly intended to be gender neutral in its delivery. Empirically, however, fewer women saw the ad than men. This happened because younger women are a prized demographic and are more expensive to show ads to. An algorithm that simply optimizes cost-effectiveness in ad delivery will deliver ads that were intended to be gender neutral in an apparently discriminatory way, because of crowding out. We show that this empirical regularity extends to other major digital platforms. The world’s top deepfake artist is wrestling with the monster he created Hao Li has spent his career perfecting digital trickery. Now he’s working to confront the problem of increasingly seamless off-the-shelf deception.
Serena Versus the Drones Here’s a portion of the chapter “How to Catch a Drone”, in which Serena helped test whether tennis serves could be an effective countermeasure against flying robots … by taking a drone out onto a court and hitting tennis balls at it until it crashed.
  41. Friday Feed August 23, 2019 Recreating W.E.B. Du Bois’s Data

    Portraits The idea of the project is to re-create the visualizations with as much precision as possible, preserving the original look, colors, and layout. The principle is to respect the 120-year-old design choices, using modern digital techniques. You can think of these as covers, not re-mixes (more on that later). I Visited 47 Sites. Hundreds of Trackers Followed Me. What did we find? The big story is as you’d expect: that everything you do online is logged in obscene detail, that you have no privacy. And yet, even expecting this, I was bowled over by the scale and detail of the tracking; even for short stints on the web, when I logged into Invasive Firefox just to check facts and catch up on the news, the amount of information collected about my endeavors was staggering. (note the splendid graphics from Nadieh Bremer) A.I. Is Learning From Humans. Many Humans. A.I., most people in the tech industry would tell you, is the future of their industry, and it is improving fast thanks to something called machine learning. But tech executives rarely discuss the labor-intensive process that goes into its creation. A.I. is learning from humans. Lots and lots of humans. Why I Printed My Facebook It was thousands of pages’ worth of data—and it was illuminating. Data: Past, Present, and Future Data and data-empowered algorithms now shape our professional, personal, and political realities. This course introduces students both to critical thinking and practice in understanding how we got here, and the future we now are building together as scholars, scientists, and citizens.
  42. Friday Feed August 30, 2019 Forget single genes: CRISPR now

    cuts and splices whole chromosomes But a long-sought goal remained out of reach: manipulating much larger chunks of chromosomes in Escherichia coli, the workhorse bacterium. Now, researchers report they've adapted CRISPR and combined it with other tools to cut and splice large genome fragments with ease. “Bicycle for the Mind” “When we invented the personal computer, we created a new kind of bicycle…a new man-machine partnership…a new generation of entrepreneurs.” Steve Jobs said this and a lot more in 1980 as explored in this annotated twitter thread. From Laptop to Lambda: Outsourcing Everyday Jobs to Thousands of Transient Functional Containers We present gg, a framework and a set of command-line tools that helps people execute everyday applications—e.g., software compilation, unit tests, video encoding, or object recognition—using thousands of parallel threads on a cloud functions service to achieve near-interactive completion times. In the future, instead of running these tasks on a laptop, or keeping a warm cluster running in the cloud, users might push a button that spawns 10,000 parallel cloud functions to execute a large job in a few seconds from start. gg is designed to make this practical and easy. How the Anthony Levandowski Indictment Helps Big Tech Stifle Innovation in Silicon Valley Much of the history of innovation is, in fact, also a history of theft: Microsoft stole the basic idea for the graphical user interface (think Windows’ on-screen icons) from Apple; Apple had stolen it from Xerox; the researchers at Xerox, most likely, stole it from someone else. Innovation, in many ways, is not about creation but about iteration, about building on ideas that have come before. Innovation is also about betrayal: much of Silicon Valley’s genesis can be traced to Fairchild Semiconductor, which was founded by a group of young engineers who came to be known as the Traitorous Eight after they left their previous employer, en masse, to set up a rival company.
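The push-button fan-out that gg describes can be mimicked locally with a thread pool. This is only a rough analogy: gg targets cloud-functions services and handles the dependency tracking, placement, and data movement that this sketch ignores, and compile_unit is a made-up stand-in for one small task:

```python
from concurrent.futures import ThreadPoolExecutor

def compile_unit(name):
    # Stand-in for one small independent task gg would ship to a
    # cloud function, e.g. compiling one source file or encoding
    # one video chunk.
    return f"{name}.o"

units = [f"file{i}" for i in range(100)]

# Fan the independent tasks out across workers and gather the
# results in order; gg does the same across thousands of lambdas.
with ThreadPoolExecutor(max_workers=16) as pool:
    objects = list(pool.map(compile_unit, units))

print(len(objects))  # 100
print(objects[0])    # file0.o
```

The interesting part of gg is everything this omits: knowing which tasks depend on which outputs, and moving only the needed bytes between transient containers.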
  43. Privacy Fundamentalism Indeed, that is why my critique of Manjoo’s

    article specifically and the ongoing privacy hysteria broadly is not simply about definitions or philosophy. It’s about fundamental assumptions. The default state of the Internet is the endless propagation and collection of data: you have to do work to not collect data on one hand, or leave a data trail on the other. This is the exact opposite of how things work in the physical world: there, data collection is an explicit positive action, and anonymity the default.
  44. Friday Feed September 6, 2019 Those Hurricane Maps Don’t Mean

    What You Think They Mean We use hurricane forecasts to warn people. Why do we misinterpret them so often? The Internet is for End Users As the Internet increasingly mediates essential functions in societies, it has unavoidably become profoundly political; it has helped people overthrow governments and revolutionize social orders, control populations, collect data about individuals, and reveal secrets. It has created wealth for some individuals and companies while destroying others’. All of this raises the question: Who do we go through the pain of gathering rough consensus and writing running code for? Why We Find Joy and Value in Creating Data Visualizations But are there certain sorts of problems that data visualization can help with or needs it’s especially good at meeting? In a recent conversation, we asked members of the Data Visualization Society (DVS) to give their take. Welcome to a rough guide on knowing where data visualization can help you. An AI system identified a potential new drug in just 46 days A team from AI pharma startup Insilico Medicine, working with researchers at the University of Toronto, took 21 days to create 30,000 designs for molecules that target a protein linked with fibrosis (tissue scarring). They synthesized six of these molecules in the lab and then tested two in cells; the most promising one was tested in mice. The researchers concluded it was potent against the protein and showed “drug-like” qualities. All in all, the process took just 46 days. A very deep dive into iOS Exploit chains found in the wild TAG was able to collect five separate, complete and unique iPhone exploit chains, covering almost every version from iOS 10 through to the latest version of iOS 12. This indicated a group making a sustained effort to hack the users of iPhones in certain communities over a period of at least two years.
  45. What Is a Tech Company? Of course, it is fair

    to ask, “What isn’t a tech company?” Surely that is the endpoint of software eating the world; I think, though, to classify a company as a tech company because it utilizes software is just as unhelpful today as it would have been decades ago.
  46. Friday Feed September 13, 2019 What Makes A Data Visualisation

    Elegant? How do people define and describe elegance in visualisation as well as other creative or communicated work? Why Aren’t Cancer Drugs Better? The Targets Might Be Wrong Drugs can stop cancer cells if they attack the right proteins. But many of these targets were chosen with dated, imprecise technology, a new study suggests. The iPhone and Apple's Services Strategy It does feel like there is one more shoe yet to drop when it comes to Apple’s strategic shift. The fact that Apple is bundling a for-pay service (Apple TV+) with a product purchase is interesting, but what if Apple started including products with paid subscriptions?
  47. Friday Feed September 20, 2019 At Dynamicland, The Building Is

    The Computer This computational research lab is reinventing computer programming A doc as powerful as an app. Coda is a new doc that grows with your ideas. People have made Coda docs that do everything from launch products, to scale small businesses, to help them study for tests. What will you Coda? The Five Types of People Who Use Visualization The Doer, the Analyzer, the Decider, the Casual Learner, the Museum Goer My takeaway from working with SwiftUI Overall the experience with SwiftUI was great. However, you have to try it yourself to know what it has to offer. For me, I found it almost too fast to work with, easy to iterate, and very approachable. The animations are a breeze to add. More importantly, I can drop down to UIKit whenever I feel like it. iphone 11 pro camera review: china While many camera/smartphone manufacturers continue to approach digital photography as a technical problem to solve, I see the team at Apple has graduated beyond technical problems and is asking deeper questions like “How can we give artists (everyone) the creative tools to express their vision?”
  48. Friday Feed September 27, 2019 A.I. Researchers See Danger of

    Haves and Have-Nots Computer scientists say A.I. research is becoming increasingly expensive, requiring complex calculations done by giant data centers, leaving fewer people with easy access to the computing firepower necessary to develop the technology behind futuristic products like self-driving cars or digital assistants that can see, talk and reason. A fairer way forward for AI in health care Without careful implementation, artificial intelligence could widen health-care inequality. Deep Learning with Electronic Health Record (EHR) Systems Walk into any machine learning conference and ask people about the applications of ML in healthcare and most will respond with the canonical example of using computer vision to diagnose diseases from medical scans (followed by a prolonged debate about “should radiologists be worried about their jobs”). But there exists another source of data, beyond imaging studies, that can change the way we approach health: the electronic health record (EHR). Amazon launches Amazon Care, a virtual medical clinic for employees Health care represents a $3.5 trillion sector for Amazon, which is looking at ways to bring in technology ranging from cloud computing to medical record technology. The biggest announcements from Amazon’s fall 2019 hardware event Amazon’s huge 2019 hardware event has wrapped up. The company announced 15 new products, including the Echo Buds truly wireless headphones, the Dolby Atmos-equipped Echo Studio speaker, and the Echo Frames, which have built-in microphones so you can chat with Alexa.
49. The Cloud and Open Source

But on the operations side, the picture is really unique. First of all, there are very few places in the world where you can get operational experience at this scale. Second, AWS doesn't run on SRE culture; the same engineers who write the code live by the dashboards and alarms and metrics that try to reflect and protect the customers' experience (not perfectly, but we make progress).
50. Friday Feed October 4, 2019

Africa Is Building an A.I. Industry That Doesn't Look Like Silicon Valley
Over the last three years, academics and industry researchers from around the African continent have begun sketching the future of their own A.I. industry at a conference called Deep Learning Indaba. The conference brings together hundreds of researchers from more than 40 African countries to present their work, and discuss everything from natural language processing to A.I. ethics.

The Massive, Overlooked Potential of African DNA
Around the world, tissue and blood banks have sprung up to catalog human DNA's many mysteries. But not in Africa. About 80 percent of the human DNA used in genetic studies comes from people of European descent. When researchers survey vast numbers of genomes to unearth a disease's genetic causes, they use almost no African data. Pharmaceutical companies, too, develop new drugs based overwhelmingly on the genomes of white people.

When we watch TV, our TVs watch us back and track our habits
This practice has exploded recently since it hasn't faced much public scrutiny. But in the last few days, not one but *three* papers have dropped that uncover the extent of tracking on TVs. Let me tell you about them.

Intro to Creative Coding
This repository includes resources and course notes for students attending my Intro to Creative Coding workshops, demonstrating p5.js and Tone.js.

Just Enough Research
Good research is about asking more and better questions, and thinking critically about the answers. Done well, it will save your team time and money by reducing unknowns and creating a solid foundation to build the right thing, in the most effective way.
51. Microsoft Surface Neo First Look: The Future Of Windows 10X Is Dual-Screen

Microsoft is creating a new version of Windows 10, designed exclusively for dual-screen and foldable devices. Windows 10X, also known by the codename "Santorini," is an ambitious effort from Microsoft to redesign Windows 10 for devices that don't even exist yet — flexible tablets that morph into devices capable of laptop-like tasks — devices like the Surface Neo.
52. Friday Feed October 11, 2019

A Patient Hopes Gene-Editing Can Help With Pain Of Sickle Cell Disease
Scientists used CRISPR to modify a gene in the cells to make them produce fetal hemoglobin, a protein that babies usually stop making shortly after birth. The hope is that the protein produced through the gene-editing treatment will give sickle cell patients like Gray healthy red blood cells.

'Treatise': A Visual Symphony Of Information Design
Why Cornelius Cardew's legendary 193-page graphic score might not actually be about music

Any Sufficiently Advanced Neglect is Indistinguishable from Malice: Assumptions and Bias in Algorithmic Systems
I want to spend time on it because I think what doesn't get through in many of our discussions is that it's not just about how Artificial Intelligence, Machine Learning, or Algorithmic instances get trained, but the processes for how, and the cultural environments in which, HUMANS are increasingly taught/shown/environmentally encouraged/socialized to think is the "right way" to build and train said systems.

150 successful machine learning models: 6 lessons learned at Booking.com
We found that driving true business impact is amazingly hard, plus it is difficult to isolate and understand the connection between efforts on modeling and the observed impact… Our main conclusion is that an iterative, hypothesis-driven process, integrated with other disciplines, was fundamental to build 150 successful products enabled by machine learning.

Eight Habits of Expert Software Designers: An Illustrated Guide
Involve the user, design elegant abstractions, focus on the essence, simulate continually, look around, reshape the problem space, see error as opportunity, think about what they are not designing
53. Friday Feed October 18, 2019

Canada's Decision To Make Public More Clinical Trial Data Puts Pressure On FDA
Transparency advocates say clinical study reports need to be made public in order to understand how regulators make decisions and to independently assess the safety and efficacy of a drug or device. They also say the reports provide medical societies with more thorough data to establish guidelines for a treatment's use, and to determine whether articles about clinical trials published in medical journals — a key source of information for clinicians and medical societies — are accurate.

Most Popular Programming Languages 1965 - 2019
A bar chart race for the ages

Most Popular Websites 1996 - 2019
Watch this bar chart race to understand the winners and losers over the last two decades

Google and Ambient Computing
Now you heard me talk about this idea with Baratunde, that helpful computing can be all around you — ambient computing. Your devices work together with services and AI, so help is anywhere you want it, and it's fluid. The technology just fades into the background when you don't need it. So the devices aren't the center of the system, you are. That's our vision for ambient computing.

The Lines of Code That Changed Everything
Apollo 11, the JPEG, the first pop-up ad, and 33 other bits of software that have transformed our world.

Can you make AI fairer than a judge? Play our courtroom algorithm game
The US criminal legal system uses predictive algorithms to try to make the judicial process less biased. But there's a deeper problem.
54. Friday Feed October 25, 2019

Millions of black people affected by racial bias in health-care algorithms
An algorithm widely used in US hospitals to allocate health care to patients has been systematically discriminating against black people, a sweeping analysis has found.

Faces of Open Source
This project is my attempt to highlight a revolution whose importance is not broadly understood by a world that relies heavily upon the fruits of its labor.

createCanvas: Interview with Dan Shiffman
My official, fancy-sounding title is Associate Arts Professor, and most of the courses that I teach are — the way I think of them is they're computer science-like classes. I look at a lot of topics that you would find in a computer science course, but through the lens of art and creativity and open source.

How We Built the World Wide Web in Five Days
Our story begins with the Big Bang.

Quantum supremacy using a programmable superconducting processor
Our Sycamore processor takes about 200 seconds to sample one instance of a quantum circuit a million times—our benchmarks currently indicate that the equivalent task for a state-of-the-art classical supercomputer would take approximately 10,000 years.

On "Quantum Supremacy"
Recent advances in quantum computing have resulted in two 53-qubit processors: one from our group at IBM and a device described by Google in a paper published in the journal Nature. In the paper, it is argued that their device reached "quantum supremacy" and that "a state-of-the-art supercomputer would require approximately 10,000 years to perform the equivalent task." We argue that an ideal simulation of the same task can be performed on a classical system in 2.5 days and with far greater fidelity.
55. Friday Feed November 1, 2019

50 years ago today, the internet was born in Room 3420
Here's the story of the creation of ARPANET, the groundbreaking precursor to the internet—as told by the people who were there.

'Graphic detail', @TheEconomist's print section dedicated to data journalism, just turned one! I wrote about what we learned in our first year here: https://medium.economist.com/a-year-in-graphic-detail-d1825b28e06f

Rethinking Patient Data at the Mayo Transform Conference
Earlier this year, we collaborated with Mayo Clinic's Center for Innovation to develop prototypes of data-driven healthcare applications for the future. Our design and development process is unusual in the healthcare field — as it is in many fields — and Mayo, impressed with the results, asked us to run a breakout session at Transform outlining how we use our process to approach problems surrounding patient data.

قلب: a non-ASCII programming language written entirely in Arabic
قلب is a programming language exploring the role of human culture in coding. Code is written entirely in Arabic, highlighting cultural biases of computer science and challenging the assumptions we make about programming.

The Pentagon's AI Ethics Draft Is Actually Pretty Good
The much-anticipated draft document released on Thursday by a Pentagon advisory group goes beyond similar lists of principles promulgated by big tech companies. If the military manages to adopt, implement, and follow the guidelines, it would leap into an increasingly rare position as a leader in establishing standards for the wider tech world.
56. Friday Feed November 8, 2019

What W. E. B. Du Bois Conveyed in His Captivating Infographics
The most arresting visuals in the exhibition were handmade charts and graphics that illustrated the evolution of black life since emancipation. They featured facts and figures about population growth and political participation, educational attainment and financial clout. A line depicting the urban and rural populations of Georgia in 1890 breaks into a sudden spiral, resembling an elegant snake.

Crispr Takes Its First Steps in Editing Genes to Fight Cancer
Doctors have for the first time in the United States tested a powerful gene-editing technique in people with cancer.

Understanding the World with Program Synthesis
The most prominent existing application of program synthesis in the natural sciences is in executable biology. Here, one models cellular behavior using stateful, concurrent programs that represent biological entities like proteins and genes. Each "component", or process, of such a program interacts with neighboring processes and changes state when certain events take place (e.g., when a molecular signal is received). Because of the inherent complexity of concurrency, even a small number of components can collectively describe highly nontrivial system behaviors.

An Indie Approach to DataViz—Creating Market Cafe Magazine
Co-Founders Tiziana Alocci and Piero Zagami talk about their love for print and tackling activism through data visualization

How terrible software design decisions led to Uber's deadly 2018 crash
Two things are noteworthy about this sequence of events. First, at no point did the system classify her as a pedestrian. According to the NTSB, that's because "the system design did not include consideration for jaywalking pedestrians."
57. Friday Feed November 15, 2019

Behind the Scenes of a Radical New Cancer Cure
What if we could go one step further? What if we could genetically engineer a patient's own immune cells to spot and fight cancer, as a sort of "best hits" of genetic therapy and immunotherapy?

Apple just released an app that tracks your heart, hearing, and menstrual cycles
The app is meant to collect user data and send it to research partnerships that Apple has set up—three of them, each of which will focus on a separate issue. The first, on women's health, links the company with Harvard's public health school and the National Institutes of Health. The second study focuses on noise pollution and headphone usage, sharing data with the University of Michigan and the World Health Organization. The last is on heart and movement, with researchers from Brigham and Women's Hospital and the American Heart Association.

Does AI Have a Place in Medicine?
Currently, our team is using AI to find the hidden pearls of wisdom buried inside massive reams of data. At the same time, we are striving to create a new, hybrid role—what we call "physician data scientists"—who understand machine learning, AI and how these technologies can be applied to medical research and clinical practice. Our goal is to improve patient outcomes and drive down costs.

The Greatest Library You've Never Heard Of: 'The Visual Telling Of Stories'
Dr. Chris Mullen's drive to collect and share information has created a fascinating archive with a focus on Fortune Magazine and the history of visual communication

How the Dumb Design of a WWII Plane Led to the Macintosh
But Chapanis and Fitts were proposing something deeper than a solution for airplane crashes. Faced with the prospect of soldiers losing their lives to poorly designed machinery, they invented a new paradigm for viewing human behavior. That paradigm lies behind the user-friendly world that we live in every day. They realized that it was absurd to train people to operate a machine and assume they would act perfectly under perfect conditions.
58. Friday Feed November 22, 2019

Visualizing Artificial Intelligence
From robots to brains and lightning bolts, representing concepts related to Artificial Intelligence (AI) can be complicated. Noun Project recently teamed up with Essence, a data science and measurement-driven agency, for an Iconathon to create a new collection of icons that better represent key concepts in AI.

How to recognize AI snake oil
Much of what's being sold as "AI" today is snake oil — it does not and cannot work. Why is this happening? How can we recognize flawed AI claims and push back?

Thoughtworks Technology Radar, Vol 21
Cloud: Is More Less?, Protecting the Software Supply Chain, Interpreting the Black Box of ML, Software Development as a Team Sport

Let's Practice Storytelling With Data
A conversation with Cole Nussbaumer Knaflic, who literally wrote the book on data storytelling

The Architect of Modern Algorithms
Barbara Liskov pioneered the modern approach to writing code. She warns that the challenges facing computer science today can't be overcome with good design alone.
59. Friday Feed November 29, 2019

Big Calculator: How Texas Instruments Monopolized Math Class
These $100 calculators have been required in classrooms for more than 20 years, as students and teachers still struggle to afford them

Digital Tools I Wish Existed
...the core issue is an extraordinarily high level of friction in the process of finding, organizing, and sharing digital content.

The Deep Learning Revolution and Its Implications for Computer Architecture and Chip Design
This paper is a companion paper to a keynote talk at the 2020 International Solid-State Circuits Conference (ISSCC) discussing some of the advances in machine learning, and their implications on the kinds of computational devices we need to build, especially in the post-Moore's Law era. It also discusses some of the ways that machine learning may also be able to help with some aspects of the circuit design process.

Endless River: An Overview of DataViz for Categorical Data
Let us explore some flow and network chart types that are ready-made for visual storytelling using categorical data

Big Tech's Big Defector
Roger McNamee made a fortune as one of Silicon Valley's earliest champions. Now he's one of its most fervent critics.
60. Friday Feed December 6, 2019

A Distributed Meeting Primer
My advice, which is both lived and collected via Twitter, falls into three categories: Pre-Meeting, During, and Post-Meeting.

Less Reporting, More Visualization
Visualization, on the other hand, is an active participant in wringing the truth out of data. No analytical work is complete without it. But it's not just an add-on at the end or a pleasant veneer to a perfectly fine set of analytics. It's vital to how analytics are understood and perceived. Unlike reporting, visualization has a unique ability to clarify murky ideas and make connections that otherwise would have been missed.

The Deep Learning Revolution and Its Implications for Computer Architecture and Chip Design
This paper is a companion paper to a keynote talk at the 2020 International Solid-State Circuits Conference (ISSCC) discussing some of the advances in machine learning, and their implications on the kinds of computational devices we need to build, especially in the post-Moore's Law era.

The One-Traffic-Light Town with Some of the Fastest Internet in the U.S.
There's a sit-down restaurant, Opal's, that serves the weekday breakfast-and-lunch crowd, one traffic light, a library, a few health clinics, eight churches, a Dairy Queen, a pair of dollar stores, and some of the fastest Internet in the United States. Subscribers to Peoples Rural Telephone Cooperative (P.R.T.C.), which covers all of Jackson County and the adjacent Owsley County, can get speeds of up to one gigabit per second, and the coöperative is planning to upgrade the system to ten gigabits.

How I Taught My Computer to Write Its Own Music
I was thrilled and astonished. It was exactly what I was hoping for: The computer had created alluring music—music I wanted to listen to!—from a completely unexpected manipulation of the sonic information I had given it. The music was at once futuristic and nostalgic, slightly melancholy, and quite subtle: Even the digital noise samples it used—basically sonic detritus—seemed sensitively integrated.
61. Friday Feed December 13, 2019

"My Car does not start when I buy Vanilla Ice Cream", said a Man to General Motors.
Did you ever imagine that an "Ice Cream" could shake the entire General Motors? In 2010, the Pontiac Division of General Motors received a very funny complaint from one of its customers. It was so weird and bizarre that it took the entire General Motors by storm. However, on reading the entire case, it definitely caught our interest, and we realized that this is by far an epic case of customer care. It teaches us that however weird the complaint is, never underestimate your client!

The Verge's Gadgets Of The Decade
The 100 gadgets that made a difference and defined the 2010s

Understanding searches better than ever before
With the latest advancements from our research team in the science of language understanding, made possible by machine learning, we're making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.

Failure Modes in Machine Learning
Intentional failures, wherein the failure is caused by an active adversary attempting to subvert the system to attain her goals: either to misclassify the result, infer private training data, or to steal the underlying algorithm. Unintentional failures, wherein the failure is because an ML system produces a formally correct but completely unsafe outcome.

Rules of Machine Learning: Best Practices for ML Engineering
This document is intended to help those with a basic knowledge of machine learning get the benefit of best practices in machine learning from around Google. It presents a style for machine learning, similar to the Google C++ Style Guide and other popular guides to practical programming. If you have taken a class in machine learning, or built or worked on a machine-learned model, then you have the necessary background to read this document.
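The "intentional failure" mode in the Failure Modes taxonomy, where an adversary nudges an input just enough to force a misclassification, can be illustrated with a toy sketch. Everything below (the weights, the input, the step size) is invented for illustration; it assumes a hand-built linear classifier, not any real system described in the paper:

```python
import numpy as np

# Toy linear classifier: score = w . x + b, predict class 1 if score > 0
w = np.array([1.0, -2.0, 0.5])
b = 0.1

def predict(x):
    return int(w @ x + b > 0)

# An input the model classifies correctly as class 1
x = np.array([2.0, 0.5, 0.0])

# FGSM-style perturbation: step each feature against the class score.
# For a linear model, the gradient of the score w.r.t. x is simply w,
# so stepping along -sign(w) pushes the score toward the other class.
eps = 0.8
x_adv = x - eps * np.sign(w)

print(predict(x))      # 1: original prediction
print(predict(x_adv))  # 0: prediction flipped by a small crafted change
```

The same idea scales to deep networks, where the gradient is computed by backpropagation instead of being the weight vector itself.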
62. Friday Feed December 20, 2019

Twelve Million Phones, One Dataset, Zero Privacy
What we learned from the spy in your pocket.

What Happens After Prisoners Learn to Code?
Slack, one of Silicon Valley's more diverse companies, has hired three formerly incarcerated coders.

Google AI chief Jeff Dean interview: Machine learning trends in 2020
At the Neural Information Processing Systems (NeurIPS) conference this week in Vancouver, Canada, machine learning took center stage as 13,000 researchers explored things like neuroscience, how to interpret neural network outputs, and how AI can help solve big real-world problems.

Coolest Things I Learned In 2019
The most popular part of the newsletter is a section called "Coolest Things I Learned This Week." It's fun and eclectic, interesting and intriguing. This is a collection of the most popular ideas I shared in 2019.

How AI will eat UI
The inevitable day when machines learn to design our apps
63. Friday Feed December 27, 2019

Trends 2020 from Fjord, design and innovation from Accenture Interactive
Many faces of growth, Money changers, Walking barcodes, Liquid people, Designing intelligence, Digital doubles, Life-centered design

On the Measure of Intelligence
To make deliberate progress towards more intelligent and more human-like artificial systems, we need to be following an appropriate feedback signal: we need to be able to define and evaluate intelligence in a way that enables comparisons between two systems, as well as comparisons with humans.

Getting Started With 'Small Multiples' — an Underused but Powerful Form of Data Viz
Small multiple charts are widely underused, though they are one of the best tools to derive insights from data. It is estimated that only 1 in 10 data analysts use small multiples.

How to ask good questions
Asking good questions is a super important skill when writing software. I've gotten way better at it over the years (to the extent that it's something my coworkers comment on a lot). Here are a few guidelines that have worked well for me!

Netflix was the best-performing stock of the decade, delivering a more than 4,000% return
A $1 million bet on Netflix's stock placed on Jan. 1, 2010, would be worth close to $43 million today. The 4,181% return, as of Friday's close, beats all current members of the S&P 500, which Netflix joined in December 2010, replacing The New York Times. The index as a whole is up 189% over the past 10 years.
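The small-multiples form is straightforward to sketch in code: one mini-chart per category, all sharing the same axes so the panels can be compared at a glance. A minimal matplotlib sketch, with invented region names and random-walk data that are not from the article:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical data: one 12-point series per category
rng = np.random.default_rng(0)
categories = ["North", "South", "East", "West"]
months = np.arange(12)
series = {c: np.cumsum(rng.normal(size=12)) for c in categories}

# Small multiples: a grid of mini-charts with shared x and y axes,
# so differences between panels reflect the data, not the scales
fig, axes = plt.subplots(2, 2, figsize=(6, 4), sharex=True, sharey=True)
for ax, cat in zip(axes.flat, categories):
    ax.plot(months, series[cat], lw=1.5)
    ax.set_title(cat, fontsize=9)
fig.suptitle("Monthly trend by region (small multiples)")
fig.tight_layout()
fig.savefig("small_multiples.png")
```

The `sharex`/`sharey` arguments are the key design choice: without a common scale, side-by-side panels invite false comparisons.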