pinboard October 6, 2018

  • Health tech pioneer Deborah Estrin named MacArthur fellow | Cornell Chronicle
  • Twitter
    RT @shondarhimes: I. HAVE. HAD. IT. WITH. THE. MISOGYNY.
  • A New Study Shows How Mushrooms Could Save Bees. Yes, Mushrooms. – Mother Jones
  • Twitter
    RT @dizwire: Holy shit, Mad Magazine 👏👏👏
    Pulling no punches.
  • Twitter
    RT @a_stylin: Who Run The World
  • Cloud poetry: training and hyperparameter tuning custom text models on Cloud ML Engine | Google Cloud Blog
    Machine learning models for interpreting and processing natural language have made tremendous advances in recent years thanks to deep learning methods. Recurrent models continue to be a common choice for textual data, but newer models based on fully convolutional architectures like ByteNet, and even more recently models based on attention like the Transformer, have yielded impressive results. All this complexity, combined with the fast pace of research, has made it hard to keep current and apply the latest methods to your own problems.

    This is why the open-sourcing of Tensor2Tensor (T2T), a library of best-in-class machine learning models by the Google Brain team, was so exciting—you now have at your disposal a standard interface that ties together all the pieces needed in a deep learning system: datasets, model architectures, optimizers, and hyperparameters in a coherent and standardized way that enables you to try many models on the same dataset, or apply the same model to many datasets.

    Now that we’ve established that the software tools exist, how should you go about setting up a training environment to launch many experiments? In this blog post, we provide a tutorial on how to use T2T and Google Cloud ML Engine, Google Cloud Platform’s fully managed ML service, to train a text-to-text model on your own data. With T2T and ML Engine, you won’t have to manage any infrastructure for training or hyperparameter tuning. You will be able to train a sophisticated, custom natural language model from just a Jupyter notebook.

    Throughout this blog post, we will examine code blocks from this Jupyter notebook—we strongly encourage you to fire up Cloud Datalab (you don’t need an instance with a GPU because we’ll submit jobs to Cloud ML Engine) and try out the notebook on Google Cloud Platform.
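    The workflow the post describes, submitting a T2T training job to Cloud ML Engine so no training infrastructure is managed by hand, might look roughly like the sketch below. The bucket, region, package path, and job name are placeholders (not taken from the post); the trailing T2T flags (--problem, --model, --hparams_set) are standard t2t-trainer options.

    ```shell
    # Hypothetical sketch: submit t2t-trainer as a Cloud ML Engine training job.
    # All gs:// paths and the package path are placeholders.
    JOB_NAME=t2t_transformer_$(date +%Y%m%d_%H%M%S)
    BUCKET=gs://your-bucket   # placeholder bucket

    gcloud ml-engine jobs submit training $JOB_NAME \
      --runtime-version 1.10 \
      --region us-central1 \
      --job-dir $BUCKET/output \
      --module-name tensor2tensor.bin.t2t_trainer \
      --packages path/to/tensor2tensor.tar.gz \
      -- \
      --problem=translate_ende_wmt32k \
      --model=transformer \
      --hparams_set=transformer_base \
      --data_dir=$BUCKET/data \
      --output_dir=$BUCKET/output \
      --train_steps=1000
    ```

    The same command can be issued from a Datalab notebook cell with a `!` prefix, which is what makes the notebook-only workflow in the post possible.
    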

  • Google AI Blog: Accelerating Deep Learning Research with the Tensor2Tensor Library
    Deep Learning (DL) has enabled the rapid advancement of many useful technologies, such as machine translation, speech recognition and object detection. In the research community, one can find code open-sourced by the authors to help in replicating their results and further advancing deep learning. However, most of these DL systems use unique setups that require significant engineering effort and may only work for a specific problem or architecture, making it hard to run new experiments and compare the results.

    Today, we are happy to release Tensor2Tensor (T2T), an open-source system for training deep learning models in TensorFlow. T2T facilitates the creation of state-of-the-art models for a wide variety of ML applications, such as translation, parsing, image captioning and more, enabling the exploration of various ideas much faster than previously possible. This release also includes a library of datasets and models, including the best models from a few recent papers (Attention Is All You Need, Depthwise Separable Convolutions for Neural Machine Translation and One Model to Learn Them All) to help kick-start your own DL research.
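    The "standard interface" T2T provides boils down to naming a registered dataset, model, and hyperparameter set on the command line. A rough local sketch of that workflow (directory paths are placeholders; the flags shown are real t2t-datagen / t2t-trainer options, but the exact problem and step count here are illustrative):

    ```shell
    # Hypothetical local T2T run: generate data for a registered problem,
    # then train one of the library's models on it.
    pip install tensor2tensor

    DATA_DIR=$HOME/t2t/data     # placeholder paths
    TMP_DIR=$HOME/t2t/tmp
    OUTPUT_DIR=$HOME/t2t/train

    t2t-datagen \
      --data_dir=$DATA_DIR \
      --tmp_dir=$TMP_DIR \
      --problem=translate_ende_wmt32k

    t2t-trainer \
      --data_dir=$DATA_DIR \
      --problem=translate_ende_wmt32k \
      --model=transformer \
      --hparams_set=transformer_base \
      --output_dir=$OUTPUT_DIR \
      --train_steps=1000
    ```

    Swapping --model or --hparams_set is all it takes to try a different architecture on the same dataset, which is the comparison workflow the release is aimed at.
    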

  • We were Brett Kavanaugh’s drinking buddies. We don’t think he should be confirmed. – The Washington Post
    RT @donnabrazile: Wow……..

Digest powered by RSS Digest