These are key drivers transitioning LLMs from single-player, generic use to broad commercial adoption in the workplace — but what are they, and who supports each?

Let’s start by defining these concepts and aligning terminology:

LLMs’ transition to broad workplace adoption

Every week now, I meet people and teams — sometimes those not at the forefront of technical adoption — who are setting up their own Custom GPTs in ChatGPT. They’re feeding in their best-prepared documents and then using these workspaces to draft emails, create on-brand copy and help write new documents.

Custom GPTs have emerged as the most prominent example of AI Workspaces, and they are perhaps the most pivotal in shifting LLM use from single-player, generic use-cases to broad, commercially driven usage in the workplace.

Equally crucial to this transition are function calls, which enable LLMs to pull from structured data sources or interact with third-party systems directly on behalf of users. While function calls have already gained traction within custom tools built by developers using foundational model providers’ APIs, the use of function calls in AI Workspaces is only beginning to emerge — a combination we expect to see more of in the coming months.
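Regardless of provider, function calling follows the same round trip: the model emits a structured request instead of prose, the application executes the matching function, and the result is handed back for the model to answer with. The sketch below illustrates that loop; the function name, arguments and `tool_call` shape are all hypothetical stand-ins, since each provider wraps this exchange in its own API.

```python
import json

# 1. The application registers functions the model may request.
def get_order_status(order_id: str) -> dict:
    """Hypothetical business function the LLM cannot execute itself."""
    return {"order_id": order_id, "status": "shipped"}

AVAILABLE_FUNCTIONS = {"get_order_status": get_order_status}

# 2. The model responds not with text but with a structured request to
#    call one of those functions (the exact shape varies by provider).
tool_call = {
    "name": "get_order_status",
    "arguments": json.dumps({"order_id": "A-1001"}),
}

# 3. The application executes the function and returns the result to
#    the model, which then drafts the final user-facing answer.
func = AVAILABLE_FUNCTIONS[tool_call["name"]]
result = func(**json.loads(tool_call["arguments"]))
print(result)  # {'order_id': 'A-1001', 'status': 'shipped'}
```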

One example of this powerful combination is how we are bringing spreadsheets to Custom GPTs at GRID, bridging the gap to make spreadsheets and spreadsheet calculations available to LLMs.

Who Does What and by What Name?

All information is as of writing on October 28th, 2024

ChatGPT

OpenAI is good at many things. Naming is not one of them.

Officially, OpenAI refers to ChatGPT’s AI Workspaces as simply GPTs (see original announcement), though most users know them as Custom GPTs. GPTs support file uploads, with optimized handling for text files, images, and tabular data formats. Users can also specify website URLs in the GPTs’ instructions, allowing the model to incorporate content from those sites into responses.

Function calls within Custom GPTs are branded as Actions, while on OpenAI’s API platforms (Chat Completions API, Assistants API, and Batch API), the same capability goes by the straightforward name Function calling.
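On the API side, Function calling is configured by passing tool definitions to the model. Below is a minimal sketch of one such definition as used with the Chat Completions API; the function name and its parameters are hypothetical, while the surrounding structure follows OpenAI's documented schema.

```python
# A hypothetical tool definition in OpenAI's Function calling format.
lookup_invoice_tool = {
    "type": "function",
    "function": {
        "name": "lookup_invoice",  # hypothetical function
        "description": "Fetch an invoice by its ID.",
        "parameters": {  # standard JSON Schema
            "type": "object",
            "properties": {
                "invoice_id": {
                    "type": "string",
                    "description": "Invoice identifier",
                },
            },
            "required": ["invoice_id"],
        },
    },
}
```

In practice this would be passed along with the conversation, e.g. `client.chat.completions.create(..., tools=[lookup_invoice_tool])`, and the model decides whether to call it.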

Claude

Claude’s AI Workspaces are known as Projects. Projects are limited to uploaded text documents and directly inputted text. They lack the ability to incorporate website URLs.

Claude’s Projects do not support function calling, but the capability is available on Claude’s broader platform, where it’s referred to as Tool use.
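A tool for Claude's Tool use is declared with a flatter structure than OpenAI's: a name, a description, and a JSON Schema under `input_schema`, sent via the Messages API's `tools` parameter. The weather tool below is hypothetical; only the structure reflects Anthropic's format.

```python
# A hypothetical tool definition in Anthropic's Tool use format.
get_weather_tool = {
    "name": "get_weather",  # hypothetical tool
    "description": "Get the current weather for a city.",
    "input_schema": {  # JSON Schema describing the tool's input
        "type": "object",
        "properties": {
            "city": {"type": "string"},
        },
        "required": ["city"],
    },
}
```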

Google

Over the past two years, Google has explored numerous approaches in the LLM space. A notable project, which can be considered an AI Workspace, is NotebookLM. NotebookLM accepts a variety of sources: uploaded files, files in Google Drive and text from website links. It handles text documents and images effectively, as well as, impressively, video and audio files.

NotebookLM does not support function calls, but Google’s Gemini platform does.
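On Gemini, functions are exposed to the model as function declarations grouped under a tool. The sketch below follows the shape of the API's `function_declarations`; the function is hypothetical, and details such as type-name casing vary between SDK versions, so treat it as illustrative rather than copy-paste ready.

```python
# A hypothetical function declaration in the Gemini API's format.
find_meeting_room = {
    "name": "find_meeting_room",  # hypothetical function
    "description": "Find a free meeting room for a given time slot.",
    "parameters": {
        "type": "object",
        "properties": {
            "start_time": {
                "type": "string",
                "description": "ISO 8601 start time",
            },
            "attendees": {"type": "integer"},
        },
        "required": ["start_time"],
    },
}

# Declarations are grouped into a tool and passed with the request.
gemini_tools = [{"function_declarations": [find_meeting_room]}]
```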

Cohere

Cohere focuses on enterprise solutions, and as such, their default mode of operation is as an AI Workspace. They offer a range of deployment options, but you can test their offering through the Chat or Playground on their Dashboard. The platform accepts text files and can be configured to access web content based on specified domains.

Cohere’s function calling capability, termed Tool Use, allows for interaction with external tools. Within the Playground, only a limited selection of pre-built tools is available, but in a full deployment, customers can define their own tools using Cohere’s Chat API.
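A custom tool for Cohere's Chat API is defined with `parameter_definitions`, which uses Python-style type names rather than JSON Schema. The report tool below is hypothetical; the structure follows Cohere's v1 Chat API conventions, which may differ in newer API versions.

```python
# A hypothetical tool definition in Cohere's Tool Use format.
sales_report_tool = {
    "name": "daily_sales_report",  # hypothetical tool
    "description": "Retrieve the sales report for a given day.",
    "parameter_definitions": {
        "day": {
            "description": "Date in YYYY-MM-DD format",
            "type": "str",  # Python-style type name, per Cohere's convention
            "required": True,
        },
    },
}
```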

Glean

Like Cohere, Glean is enterprise-focused, with a default setup functioning as an AI Workspace. Additionally, Glean enables users to create custom Apps with dedicated sources and instructions, effectively allowing for further tailored AI Workspaces within a platform setup.

The platform supports function calling, referred to as Actions, which are accessible both in custom Apps and across the broader platform.

Perplexity

Perplexity positions itself as an AI Search Engine, offering a distinct approach compared to general-purpose LLMs.

Recently, Perplexity introduced an AI Workspace feature called Spaces, which supports uploaded text files and tabular data alongside its core web search functionality.

Perplexity does not currently support function calling, and given its search-centric focus, it may not have plans to introduce this capability.

Meta AI / Llama

Meta AI takes a different approach, focused on integrating with existing social media accounts and content, and as such does not have an AI Workspace as per the definition above.

Meta’s open source LLM model, Llama, does support function calling.

Mistral

Mistral primarily operates as a model provider, so it’s unsurprising that neither their online chat environment, Le Chat, nor their platform, La Plateforme, offers a configurable AI Workspace. However, developers can fine-tune Mistral’s models to work with any selection of data.

Mistral’s models support function calling.
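Mistral's function calling uses an OpenAI-compatible schema passed via the chat API's `tools` parameter, which makes tool definitions portable between the two. The function below is hypothetical; only the structure reflects the format.

```python
# A hypothetical tool definition for Mistral's function calling
# (OpenAI-compatible schema).
check_stock_tool = {
    "type": "function",
    "function": {
        "name": "check_stock",  # hypothetical function
        "description": "Check warehouse stock for a product SKU.",
        "parameters": {
            "type": "object",
            "properties": {
                "sku": {"type": "string"},
            },
            "required": ["sku"],
        },
    },
}
```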