Streaming Made Easy

The AI Distribution Playbook

Lessons Learned From SaaS & Streaming Giants

Marion Ranchet
Feb 18, 2025

Show me any business and I will look at it from a distribution strategy angle; force of habit, I guess.

It was about time I took a closer look at how LLMs like OpenAI’s ChatGPT, Google’s Gemini, Mistral AI’s Le Chat, Claude and DeepSeek design their partnership strategies.

Much has been said about the partnership deals they strike to feed their models (e.g. News Corp, Axel Springer and most recently The Guardian), so let me focus on the distribution strategy of the models themselves. As much as we like to think they are unstoppable, they need to answer the same crucial question streamers do: how do I reach users at scale? This is the question I’ll try to answer today.

Today at a glance:

  • Key LLM Models Overview

  • Lessons Learned From SaaS & Streaming Giants

  • A New Kind Of Bundle

Mentioned in this edition: OpenAI, Google, Mistral AI, DeepSeek, Claude, Free, Microsoft, Amazon, Netflix, Apple, Samsung



Key LLM Models Overview

LLMs adopt a SaaS (Software as a Service) model, following the traditional cloud software distribution approach: users subscribe directly to access AI services via web, mobile apps or APIs. Users only need to focus on utilising the software, as maintenance and infrastructure management are handled by the service provider.

→ Key characteristics of SaaS distribution:

• Subscription tiers (free, premium, enterprise).

• Direct web access & mobile apps (self-service).

• API integrations (for businesses and developers).

• Feature-based and pay-per-use models (e.g., OpenAI API pricing based on token usage); see the quick sketch below.
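
To make that last point concrete, here is a minimal sketch of how usage-based API billing adds up. The model names and per-1,000-token rates are illustrative placeholders rather than any vendor’s actual price list; the point is simply that, unlike a flat subscription tier, the bill scales with how many tokens a business pushes through the API.

```python
# Minimal sketch of pay-per-use API pricing: the bill scales with tokens consumed.
# Model names and per-1,000-token rates are illustrative placeholders, not real vendor prices.

ILLUSTRATIVE_RATES = {
    "small-model": (0.0005, 0.0015),  # (input, output) USD per 1,000 tokens -- hypothetical
    "large-model": (0.0100, 0.0300),
}

def estimate_monthly_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate a monthly API bill from token usage under the assumed rates."""
    rate_in, rate_out = ILLUSTRATIVE_RATES[model]
    return (input_tokens / 1_000) * rate_in + (output_tokens / 1_000) * rate_out

# Example: an integration sending 20M input and 5M output tokens per month.
print(f"${estimate_monthly_cost('large-model', 20_000_000, 5_000_000):,.2f}")  # $350.00
```

Contrast that with the consumer tiers above, where the price is fixed regardless of usage.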

→ Breakdown of each key LLM’s user base, pricing and distribution:

How many users do they have? Let’s go with a visual representation to give you a quick snapshot of where they’re at:

This post is for paid subscribers
