Researchers affiliated with Uber AI and OpenAI proposed a new approach to neural architecture search (NAS), a technique that involves evaluating hundreds or thousands of AI models to identify the top performers. In a preprint paper, they claim their technique, called Synthetic Petri Dish, accelerates the most computationally intensive NAS steps while predicting model performance […]
OpenAI debuts gigantic GPT-3 language model with 175 billion parameters
A team of more than 30 OpenAI researchers has released a paper about GPT-3, a language model capable of achieving state-of-the-art results on a range of benchmark and novel natural language processing tasks, from language translation to generating news articles to answering SAT questions. GPT-3 is a whopping 175-billion-parameter model. By […]