Here is a selection of tweets we collected during February 2020.
RIP Katherine Johnson (1918-2020). What a life:
🙏 One of the first African-American women to work at NASA.
🙏 Worked as a “human computer” for the Mercury, Apollo, and Shuttle programs.
🙏 John Glenn requested she personally re-check computer calculations before his Friendship 7 mission
— Popular Science (@PopSci) February 24, 2020
Every year, we pick 10 recent technological breakthroughs that we predict will have a big impact in the years to come. We're proud to release this year’s list of the technologies that we believe will make a real difference in solving important problems. https://t.co/ct3KYjwebB
— MIT Technology Review (@techreview) February 26, 2020
— AIhub (@aihuborg) February 8, 2020
A thread of some #FAT2020 paper presentations that translated between academic disciplines 👇 As an interdisciplinary organisation, we’re excited about work that intersects and communicates across experience, which is crucial but can be under-recognised in academic contexts.
— Ada Lovelace Institute (@AdaLovelaceInst) January 30, 2020
⚡️ We also have a fantastic #Viewpoint in which we gathered six scientists to discuss challenges, hopes and hypes of #AI in cancer research, diagnosis and care. https://t.co/6IgFW6mfsT
— Nature Cancer (@NatureCancer) February 24, 2020
— Nobel Laureate Danny Kahneman
The Next Decade in AI: Four Steps Towards Robust Artificial Intelligence
It's what I wish I had had time to say at the #AIDebate 🙂
Finally ready, free, on arXiv. Happy Reading! https://t.co/rbeWGMvqMO
— Gary Marcus (@GaryMarcus) February 17, 2020
The fastai paper (with @GuggerSylvain) is now available on arXiv and on our site! It's been peer reviewed and will appear in the journal Information soon.
The paper covers v2, which is a from-scratch rewrite that focuses on usability and hackability.
— Jeremy Howard (@jeremyphoward) February 13, 2020
Introducing the TensorFlow Constrained Optimization library, a new tool to configure and train #MachineLearning models based on combinations of metrics, making it easy to formulate and solve problems of interest to the ML fairness community. Learn more at https://t.co/vVdqfHgWnL
— Google AI (@GoogleAI) February 21, 2020
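To give a flavour of the kind of problem this library targets, here is a minimal conceptual sketch in plain NumPy (deliberately not the TensorFlow Constrained Optimization API, and all names and data are hypothetical): training a classifier subject to a fairness constraint by alternating gradient descent on the model weights with gradient ascent on a Lagrange multiplier.

```python
import numpy as np

# Illustrative sketch only (NOT the TFCO API). We fit a logistic model
# while constraining the gap in positive-prediction rates between two
# groups to at most eps, via a multiplier-weighted Lagrangian penalty.
rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 2))
group = rng.integers(0, 2, size=n)  # hypothetical protected attribute
y = (X[:, 0] + 0.5 * group + rng.normal(scale=0.5, size=n) > 0).astype(float)

w = np.zeros(2)
lam = 0.0      # Lagrange multiplier for the rate-gap constraint
eps = 0.05     # allowed gap in positive prediction rates
lr, lr_lam = 0.1, 0.05

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(500):
    p = sigmoid(X @ w)
    # Gradient of the average logistic loss.
    grad = X.T @ (p - y) / n
    # Smooth surrogate for the group positive-rate gap.
    gap = p[group == 0].mean() - p[group == 1].mean()
    # Gradient of the gap w.r.t. w (chain rule through the sigmoid).
    s = p * (1 - p)
    g0 = (X[group == 0] * s[group == 0, None]).mean(axis=0)
    g1 = (X[group == 1] * s[group == 1, None]).mean(axis=0)
    grad += lam * np.sign(gap) * (g0 - g1)
    w -= lr * grad
    # Ascent on the multiplier: it grows while the constraint is violated.
    lam = max(0.0, lam + lr_lam * (abs(gap) - eps))

p = sigmoid(X @ w)
final_gap = abs(p[group == 0].mean() - p[group == 1].mean())
```

The real library generalises this pattern to arbitrary rate-based constraints inside TensorFlow training loops; see the linked announcement for the actual API.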
Microsoft researchers and engineers release Zero Redundancy Optimizer (ZeRO) and DeepSpeed library, a system able to train 100-billion-parameter deep learning models. Learn about this breakthrough and how it led to Turing Natural Language Generation: https://t.co/NY75qYd07a
— Microsoft Research (@MSFTResearch) February 10, 2020
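The core idea behind ZeRO can be sketched in a few lines of plain Python (this is a conceptual illustration, NOT the DeepSpeed API, and all names here are made up): instead of every data-parallel worker holding a full copy of the optimizer state, each worker stores only its own shard, cutting per-worker optimizer memory roughly by the number of workers.

```python
import numpy as np

# Conceptual sketch of ZeRO-style optimizer-state partitioning.
# Plain data parallelism replicates the momentum buffer on every worker;
# here each of n_workers owns a contiguous 1/n_workers shard of it.
n_params = 12
n_workers = 4
params = np.zeros(n_params)

shard = n_params // n_workers
momentum_shards = [np.zeros(shard) for _ in range(n_workers)]

def step(grads, lr=0.1, beta=0.9):
    """One momentum-SGD step with sharded optimizer state."""
    global params
    updates = np.empty_like(params)
    for w in range(n_workers):
        lo, hi = w * shard, (w + 1) * shard
        # Worker w updates only its own slice of the momentum buffer...
        momentum_shards[w] = beta * momentum_shards[w] + grads[lo:hi]
        updates[lo:hi] = lr * momentum_shards[w]
    # ...then the computed slices are all-gathered into the full update.
    params -= updates

step(np.ones(n_params))
```

In the real system this sharding is combined with communication scheduling and further partitioning of gradients and parameters, which is what makes 100-billion-parameter training feasible.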
The new language model our teams built is the largest and most powerful one ever created – a milestone with the promise to transform how technology understands and assists us. https://t.co/YvLM0HAr8u
— Satya Nadella (@satyanadella) February 12, 2020
— Facebook AI (@facebookai) February 6, 2020
Introducing the RoboTurk Real Robot Dataset – one of the largest, richest, and most diverse robot manipulation datasets ever collected using human creativity and dexterity!
54 non-expert demonstrators
— Stanford Vision and Learning Lab (@StanfordSVL) February 26, 2020