4 ways startups will drive GPT-3 adoption in 2021

Oren Etzioni

Contributor

Professor Emeritus at the University of Washington, Oren Etzioni is an entrepreneur and CEO of the non-profit Allen Institute for AI.

The introduction of GPT-3 in 2020 was a tipping point for artificial intelligence. In 2021, this technology will power the launch of a thousand new startups and applications. GPT-3 and similar models have brought the power of AI into the hands of those looking to experiment — and the results have been extraordinary.

Trained on hundreds of billions of words, GPT-3 is a 175-billion-parameter transformer model — the third such model released by OpenAI. GPT-3 is remarkable in its ability to generate human-like text and responses — in some respects, it’s eerie. When prompted by a user with text, GPT-3 can return coherent and topical emails, tweets, trivia and much more.

Suddenly, authoring emails, customer interactions, social media exchanges and even news stories can be automated — at least in part. While large companies are pondering the pitfalls and risks of generating text (remember Microsoft’s disastrous Tay bot?), startups have already begun sweeping in with novel applications — and they will continue to lead the charge in transformer-based innovation.

OpenAI researchers first released the paper introducing GPT-3 in May 2020, and what started out as some nifty use cases on Twitter has quickly become a hotbed of startup activity. Companies have been formed on top of GPT-3, using the model to generate emails and marketing copy, to create an interactive nutrition tracker or chatbot, and more. Let OthersideAI take a first pass at writing your emails, or try out Broca or Snazzy for your ad copy and campaign content, for instance.

Other young companies are harnessing the API to accelerate their existing efforts, augmenting their technical teams’ capabilities with the power of 175 billion parameters and bringing otherwise difficult products to market with much greater speed and data than previously possible. With some clever prompt engineering (combining an instruction to the model with a sample output that guides its response), these companies leverage the underlying GPT-3 system to improve or extend an existing application’s capabilities.

Sure, a text expander can be a useful tool for shorthand notation — but powered by GPT-3, that shorthand can be transformed into a product that generates contextually aware emails in your own style of writing.
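
For concreteness, here is a minimal sketch of what that kind of prompt engineering might look like against the OpenAI beta API: an instruction plus one worked example used to steer the completion. The engine name, prompt wording and parameters below are illustrative assumptions, not any particular product’s implementation.

```python
# Minimal sketch of prompt engineering against the OpenAI beta API (circa 2021).
# The engine name, prompt wording and parameters are illustrative assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; keys were issued to beta testers

# An instruction plus one example output helps steer the model's completion.
prompt = (
    "Write a short, friendly follow-up email based on the notes.\n\n"
    "Notes: met Dana at the conference, wants the pricing deck by Friday.\n"
    "Email: Hi Dana, great meeting you at the conference! As promised, "
    "I'll send over our pricing deck before Friday. Best, Sam\n\n"
    "Notes: call with Lee went well, schedule a demo for next week.\n"
    "Email:"
)

response = openai.Completion.create(
    engine="davinci",   # the largest GPT-3 engine offered in the beta
    prompt=prompt,
    max_tokens=80,
    temperature=0.7,
    stop=["\n\n"],      # stop once the generated email ends
)

print(response["choices"][0]["text"].strip())
```

In practice, much of the work lies in iterating on the prompt wording and on sampling parameters such as temperature, since small changes can noticeably shift the quality and tone of the output.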

As early-stage technology investors, we are inspired to see AI broadly, and natural language processing specifically, become more accessible via the next generation of large-scale transformer models like GPT-3. We expect they will unlock new use cases and capabilities we have yet to even contemplate.

It’s worth noting that while impressive, GPT-3 is far from perfect. Access is expensive (especially for the most robust version). Reliability is a well-known issue. And the model is often criticized for generating ridiculous, nonsensical and repetitive statements. Users need to become adept at prompt engineering (steering the model with an instruction and a sample of ideal output) to refine and improve what the model produces.

More broadly, threats around fake news, documents and bias are real — we as an industry, and OpenAI as an organization, have big questions ahead to address.

So, what will become of GPT-3 in 2021?

What OpenAI — and crucially, the beta testers with access to GPT-3 and other models — are able to accomplish continues to surprise and, in many cases, unexpectedly delight us.

Here are our key predictions for GPT-3 in the coming year:

Transformer models will become more accessible: While it’s true that 175 billion is an exceptional number of parameters, we expect a slew of competing models to emerge that will help drive down the cost of access to GPT-3 and models like it. Researchers at Google Brain have announced a 1.6-trillion-parameter language model, and a grassroots collective of researchers is working together on GPT-Neo. These alternatives, alongside GPT-3 itself, will give users improved output, increased reliability and speed, and likely more affordable access to large-scale transformer models. As this happens, startups will leverage these models to rapidly and iteratively develop new applications.

Text will become the command line: As new applications on top of transformer models continue to emerge, they will primarily center on text as the core input that translates into a variety of outputs. Historically, we have had to learn new languages (spoken and coded); we think next-generation transformer models will begin to act as a “universal translator” of sorts, putting text front and center and bringing the power to build new applications to the “codeless” among us. We think this will empower a whole new generation of creators, with trillions of parameters at their fingertips, in an entirely low-code/no-code way.

Specialized “models as a service” will emerge for more specific application areas: Large models like GPT-3 that are pre-trained on vast datasets reduce the need for task-specific fine-tuning in general use cases. This presents an opportunity both for “prompt engineering” on top of the general models and for more specialized models that are ready to deploy. We are already seeing examples where prompt engineering on top of GPT-3, which requires very little user-specific training data, generates valuable and contextually relevant output. We expect to see more of these types of applications emerge, particularly those built on specialized, ready-to-deploy models from hubs such as Hugging Face (see the sketch following these predictions).

Data will start becoming a differentiator: As models like GPT-3 become more widely available, a critical component of differentiation in final product output will be the datasets that “public” models are trained on. Companies that invest early in building high-quality, proprietary datasets will establish a strong competitive moat — and those using GPT-3 will be able to focus on finding and curating that dataset.
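
As a concrete illustration of the “models as a service” pattern described above, here is a minimal sketch using the open-source Hugging Face transformers library to pull a ready-to-deploy, task-specific model in a few lines. The task, input text and generation parameters are illustrative assumptions.

```python
# Minimal sketch: a specialized, ready-to-deploy model pulled from the
# Hugging Face Hub, as opposed to prompting a general-purpose model.
# The task, input text and length limits here are illustrative assumptions.
from transformers import pipeline

# pipeline() downloads a pre-trained checkpoint fine-tuned for the named task;
# with no model specified, it falls back to a default summarization model.
summarizer = pipeline("summarization")

article = (
    "GPT-3 is a 175-billion-parameter transformer model released by OpenAI. "
    "Startups are building on top of it to generate emails, marketing copy "
    "and chatbot responses, often with little more than prompt engineering."
)

summary = summarizer(article, max_length=40, min_length=10)
print(summary[0]["summary_text"])
```

Swapping the task string (for example, to "sentiment-analysis" or "question-answering") pulls down a different specialized model without changing the surrounding application code.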

As evidenced by GPT-3 and emerging competitive models, modern NLP is a breakthrough technology. The new generation of transformer language models is unlocking new use cases by the day and, within a matter of months, redefining the standards by which we evaluate their capabilities.

Powered by deep learning and by open-source sharing of models and datasets, natural language processing capabilities continue to accelerate, promising an exciting year ahead for AI startups broadly and emerging NLP companies specifically. Companies and organizations with substantial resources will keep investing and innovating at the “transformer” infrastructure level. And venture investors are paying attention primarily at the application level.

Authoring text has always been the exclusive domain of humans. While we are not suggesting in the slightest that New York Times journalists and best-selling authors will be replaced, we foresee the authoring of mass communications becoming increasingly automated with technology like GPT-3.
