- “America is becoming more like Russia” says Russian-born novelist Gary Shteyngart | Salon.com
- Seattle-area women of color share how they navigate the workplace | The Seattle Times
- Literary awards season heats up with $50,000 Kirkus Prize finalists – Los Angeles Times
- BBC – Future – How to use seawater to grow food – in the desert
- Federal judge restores grizzly protections, canceling bear hunt
- Justice Brett Kavanaugh Would Represent an Immediate Threat to LGBT Rights
- New PPP Poll: Jess King Makes PA-11 Single-Digit Race – Jess King for Congress
- The Most Promising Migraine Drug in Years Is Being Held …
- Stormwater management: How Dutch solutions could have mitigated Hurricane Florence damage – 60 Minutes – CBS News
- Kavanaugh’s Yearbook Page Is ‘Horrible, Hurtful’ to a Woman It Named – The New York Times
- [1705.05355] Probabilistic Matrix Factorization for Automated Machine Learning
In order to achieve state-of-the-art performance, modern machine learning techniques require careful data pre-processing and hyperparameter tuning. Moreover, given the ever-increasing number of machine learning models being developed, model selection is becoming increasingly important. Automating the selection and tuning of machine learning pipelines, consisting of data pre-processing methods and machine learning models, has long been one of the goals of the machine learning community. In this paper, we tackle this meta-learning task by combining ideas from collaborative filtering and Bayesian optimization. Using probabilistic matrix factorization techniques and acquisition functions from Bayesian optimization, we exploit experiments performed on hundreds of different datasets to guide the exploration of the space of possible pipelines. In our experiments, we show that our approach quickly identifies high-performing pipelines across a wide range of datasets, significantly outperforming the current state of the art.
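A minimal, self-contained sketch of the idea — not the paper's implementation: a (datasets × pipelines) performance matrix with missing entries is factorized, a new dataset's latent vector is estimated from a few already-evaluated pipelines, and an expected-improvement acquisition picks the next pipeline to try. The synthetic data, fixed predictive variance, and least-squares embedding below are simplifications for illustration.

```python
# Illustrative sketch of PMF-style pipeline selection (not the paper's exact method).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Y[d, p] = accuracy of pipeline p on dataset d; only a fraction of entries are observed.
n_datasets, n_pipelines, rank = 50, 30, 5
true_D = rng.normal(size=(n_datasets, rank))
true_P = rng.normal(size=(n_pipelines, rank))
Y = true_D @ true_P.T + 0.1 * rng.normal(size=(n_datasets, n_pipelines))
mask = rng.random(Y.shape) < 0.3              # ~30% of (dataset, pipeline) pairs observed
Y_obs = np.where(mask, Y, np.nan)

# Fit a simple matrix factorization by SGD on the observed entries.
D = 0.1 * rng.normal(size=(n_datasets, rank))
P = 0.1 * rng.normal(size=(n_pipelines, rank))
rows, cols = np.where(mask)
lr, reg = 0.02, 0.01
for epoch in range(200):
    for d, p in zip(rows, cols):
        d_old, p_old = D[d].copy(), P[p].copy()
        err = Y_obs[d, p] - d_old @ p_old
        D[d] += lr * (err * p_old - reg * d_old)
        P[p] += lr * (err * d_old - reg * p_old)

# For a "new" dataset with a few evaluated pipelines, pick the next pipeline to run
# using an expected-improvement acquisition over the predicted scores.
observed_idx = [0, 1, 2]                       # pipelines already run on the new dataset
y_new = Y[0, observed_idx]                     # pretend dataset 0 is the new dataset
# Estimate the new dataset's latent vector by least squares against observed pipelines.
d_new, *_ = np.linalg.lstsq(P[observed_idx], y_new, rcond=None)
mu = P @ d_new                                 # predicted score for every pipeline
sigma = np.full(n_pipelines, 0.1)              # placeholder predictive std (PMF would supply this)
best = y_new.max()
z = (mu - best) / sigma
ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)
ei[observed_idx] = -np.inf                     # do not re-evaluate pipelines already run
print("next pipeline to evaluate:", int(np.argmax(ei)))
```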
- Microsoft unveils AI capability that automates AI development – The AI Blog
- kubeflow/kubebench: Repository for benchmarking
The goal of Kubebench is to make it easy to run benchmark jobs on Kubeflow with various system and model settings. Kubebench enables benchmarks by leveraging Kubeflow's ability to manage TFJobs, as well as Argo-based workflows.
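Since Kubebench jobs ultimately run as Argo workflows on the cluster, the snippet below is a hedged illustration (not Kubebench's own API) of submitting a minimal Argo Workflow through the official Kubernetes Python client. The namespace, image, and workflow template are placeholders; an actual Kubebench run would use the manifests shipped in the repository.

```python
# Illustrative only: submit a minimal Argo Workflow with the Kubernetes Python client.
# Kubebench ships its own manifests and templates; the fields below are placeholders.
from kubernetes import client, config

config.load_kube_config()                      # or config.load_incluster_config() inside a pod
api = client.CustomObjectsApi()

workflow = {
    "apiVersion": "argoproj.io/v1alpha1",
    "kind": "Workflow",
    "metadata": {"generateName": "benchmark-demo-"},
    "spec": {
        "entrypoint": "run-benchmark",
        "templates": [
            {
                "name": "run-benchmark",
                "container": {
                    "image": "python:3.9",     # placeholder benchmark image
                    "command": ["python", "-c", "print('benchmark step')"],
                },
            }
        ],
    },
}

api.create_namespaced_custom_object(
    group="argoproj.io",
    version="v1alpha1",
    namespace="kubeflow",                      # assumed namespace
    plural="workflows",
    body=workflow,
)
```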
- chainer/chainer: A flexible framework of neural networks for deep learning
Chainer is a Python-based deep learning framework aiming at flexibility. It provides automatic differentiation APIs based on the define-by-run approach (a.k.a. dynamic computational graphs) as well as object-oriented high-level APIs to build and train neural networks. It also supports CUDA/cuDNN via CuPy for high-performance training and inference.
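As a small illustration of the define-by-run style (a generic sketch, not taken from Chainer's documentation), the following builds a tiny two-layer MLP and runs one training step; the layer sizes, optimizer, and random data are arbitrary.

```python
# Minimal define-by-run example with Chainer: a tiny MLP and one SGD update.
import numpy as np
import chainer
import chainer.functions as F
import chainer.links as L


class MLP(chainer.Chain):
    def __init__(self, n_hidden=32, n_out=10):
        super().__init__()
        with self.init_scope():
            self.l1 = L.Linear(None, n_hidden)   # input size inferred at first call
            self.l2 = L.Linear(n_hidden, n_out)

    def __call__(self, x):
        # The computational graph is built dynamically as this code executes.
        h = F.relu(self.l1(x))
        return self.l2(h)


model = MLP()
optimizer = chainer.optimizers.SGD(lr=0.01)
optimizer.setup(model)

# One training step on a random mini-batch.
x = np.random.rand(8, 100).astype(np.float32)        # 8 samples, 100 features
t = np.random.randint(0, 10, size=8).astype(np.int32)

y = model(x)                                         # forward pass records the graph
loss = F.softmax_cross_entropy(y, t)
model.cleargrads()
loss.backward()                                      # backprop through the recorded graph
optimizer.update()
print("loss:", float(loss.array))
```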
- Microsoft Learn | Microsoft Docs
The skills required to advance your career and earn your spot at the top do not come easily. Now there’s a more rewarding approach to hands-on learning that helps you achieve your goals faster.
- Country-level social cost of carbon | Nature Climate Change
Digest powered by RSS Digest