Entries from February 2020 ↓

pinboard February 25, 2020

  • What’s Really Holding Women Back?

    A carefully done study of gender differences in responding to unrealistic workloads, combined with overwhelming confirmation bias on the part of management.
    https://hbr.org/2020/03/whats-really-holding-women-back

  • The Open Images Dataset V4 (arXiv:1811.00982v2)

    ✨🖼️@GoogleAI presents #OpenImagesV4, a dataset of 9.2M images with unified annotations for image classification, object detection and visual relationship detection.

    30.1M image-level labels for 19.8k concepts, 15.4M bounding boxes for 600 object classes

    https://arxiv.org/pdf/1811.00982v2.pdf https://twitter.com/DynamicWebPaige/status/1232174690925301761/photo/1

  • Google AI Blog: Exploring Transfer Learning with T5: the Text-To-Text Transfer Transformer
    Over the past few years, transfer learning has led to a new wave of state-of-the-art results in natural language processing (NLP). Transfer learning’s effectiveness comes from pre-training a model on abundantly-available unlabeled text data with a self-supervised task, such as language modeling or filling in missing words. After that, the model can be fine-tuned on smaller labeled datasets, often resulting in (far) better performance than training on the labeled data alone. The recent success of transfer learning was ignited in 2018 by GPT, ULMFiT, ELMo, and BERT, and 2019 saw the development of a huge diversity of new methods like XLNet, RoBERTa, ALBERT, Reformer, and MT-DNN. The rate of progress in the field has made it difficult to evaluate which improvements are most meaningful and how effective they are when combined.

    In “Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer”, we present a large-scale empirical survey to determine which transfer learning techniques work best and apply these insights at scale to create a new model that we call the Text-To-Text Transfer Transformer (T5). We also introduce a new open-source pre-training dataset, called the Colossal Clean Crawled Corpus (C4). The T5 model, pre-trained on C4, achieves state-of-the-art results on many NLP benchmarks while being flexible enough to be fine-tuned to a variety of important downstream tasks. In order for our results to be extended and reproduced, we provide the code and pre-trained models, along with an easy-to-use Colab Notebook to help get started.
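
    (A minimal code sketch of the text-to-text pattern described here follows this list of links.)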

  • Speakers
    Women in tech speakers
  • Twitter
    RT @mikiebarb: What a lead!
  • Fate and Fury in James McBride’s “Deacon King Kong” | The New Yorker
  • 100 Brilliant Women in AI Ethics for 2020 – Lighthouse3
  • Twitter
    RT @mattdpearce: The coronavirus story will also become a health-insurance story in America. This guy went to China, caught the flu,…
  • Twitter
    RT @nycsouthpaw: The damage the private insurance industry, particularly short term junk plans, could do by dissuading vulnerable or…
  • Twitter
    RT @JessicaValenti: That cheering you hear is the sound of female journalists finally being able to drop the "alleged" before "rapist H…
  • Twitter
    RT @EricMGarcia: Credit to the women who were willing to speak to @jodikantor @mega2e and @ronanfarrow. I hope their loved ones are…
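
  Following up on the T5 entry above, here is a minimal sketch of the text-to-text pattern the Google post describes, using the publicly released T5 checkpoints through the Hugging Face transformers library. The checkpoint name ("t5-small"), the "summarize:" task prefix, and the generation settings are illustrative assumptions, not code from the post.

    # Minimal sketch: T5 casts every task as text-to-text, so a task prefix on
    # the input string selects the behavior (here, summarization).
    from transformers import T5Tokenizer, T5ForConditionalGeneration

    tokenizer = T5Tokenizer.from_pretrained("t5-small")  # assumed public checkpoint
    model = T5ForConditionalGeneration.from_pretrained("t5-small")

    text = ("summarize: Transfer learning pre-trains a model on abundant "
            "unlabeled text with a self-supervised task, then fine-tunes it "
            "on a smaller labeled dataset for the target task.")
    inputs = tokenizer(text, return_tensors="pt")

    # Generate output text; decoding settings here are illustrative defaults.
    output_ids = model.generate(**inputs, max_length=40)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))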

Digest powered by RSS Digest
