The 'Bring Your Own Coach' Pattern: Building AI-Powered Apps Without Integrating AI

January 9, 2026 · urjit

I had a lifetime subscription to a strength training app. Then the company decided lifetime subscriptions weren’t profitable enough and forced everyone into a monthly subscription model.

So I built my own app. Focus Strength.

I have 10 years of workout data - every set, every rep, every weight - in a CSV file. I’d been using Claude and ChatGPT to analyze my training history and suggest workout programs. “Look at my squat progression. I’ve stalled at 315 for three months. What should I change?”

The LLMs are good at this. They understand periodization, volume, recovery. But I had a problem: how do I get these AI-generated workout programs into my app?

The obvious answer? Integrate OpenAI. Or Anthropic. Or Google’s Gemini. Add API keys, streaming responses, prompt engineering, token management, error handling, cost controls, privacy policies…

I said no.

Instead, I built something simpler: a way to bring your own AI coach.

The Problem With “Just Add AI”

Here’s what adding AI to my app would have meant:

  • Complexity I didn’t need - streaming responses, error handling, retry logic
  • Costs that would make a personal app expensive (LLM API pricing adds up fast)
  • Vendor lock-in to whatever LLM I chose today
  • Privacy concerns - I’d be sending my workout data through third-party APIs
  • Maintenance burden - prompt engineering that breaks when models update

And here’s the thing: I already had Claude Pro. I was already using it to analyze my workout data. The LLMs were already helping me - I just needed to get their output into my app.

So why build all that infrastructure when I already had access to better AI than I could integrate?

How It Actually Works

The solution is dead simple:

  1. I open Claude (or ChatGPT, or whatever LLM I want to use that day)
  2. I click a button in the app to generate a prompt with my current workout data
  3. I paste the prompt into the LLM: “Here’s my training history and current exercises. Create a 4-day program focusing on breaking through my squat plateau.”
  4. The LLM generates a JSON configuration with exercises and workout templates
  5. I copy the JSON back into the app
  6. Done. The workout program is now in my app.

No API integration. No keys. No streaming. No AI code at all.

What I Actually Built

Rather than integrating AI, I built something more interesting:

A JSON import/export system with a structured format for exercises and workout templates. This is the foundation - define your domain model clearly and make it easy to serialize.
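
To make that concrete, here's roughly what the domain model looks like, sketched in TypeScript for illustration. The field names mirror the JSON example further down; they're not the app's actual code.

// Illustrative domain model mirroring the JSON format shown below.
interface Exercise {
  name: string;
  barWeight: number;              // e.g. 45 for a standard barbell
  category: string;               // e.g. "upper" or "lower"
  recommendedStartWeight: number;
}

interface WorkoutTemplate {
  name: string;                   // e.g. "Upper Body A"
  exercises: string[];            // exercise names; the real shape is elided in the example below
}

interface AppConfig {
  metadata: { version: string; createdBy: string; createdAt: string };
  exercises: Exercise[];
  workoutTemplates: WorkoutTemplate[];
}

// Export is just serialization: the result is a file (or a paste) the user owns.
function exportConfig(config: AppConfig): string {
  return JSON.stringify(config, null, 2);
}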

An LLM prompt generator that creates comprehensive prompts including my current app state, exercise statistics, and the JSON schema with examples. The prompt does the heavy lifting - it teaches the LLM exactly what format I need.
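
A sketch of what that generator does, reusing the types and exportConfig from the sketch above. The wording and the stats fields here are illustrative, not the exact prompt the app produces.

// Hypothetical per-exercise statistics included for context (names are illustrative).
interface ExerciseStats {
  name: string;
  sessionCount: number;
  maxWeight: number;
  lastPerformed: string; // ISO date, e.g. "2026-01-05"
}

// Bundle current state, statistics, the expected schema, and the user's goal
// into one pasteable prompt. The current config doubles as the schema example.
function buildCoachPrompt(config: AppConfig, stats: ExerciseStats[], goal: string): string {
  return [
    "You are designing a strength training program for me.",
    "Here is my current configuration. Your reply must use this exact JSON format:",
    exportConfig(config),
    "Here are my per-exercise statistics:",
    JSON.stringify(stats, null, 2),
    `My goal: ${goal}`,
    "Reply with a single JSON document in the format above and nothing else.",
  ].join("\n\n");
}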

Configuration versioning where I can save up to 10 versions, rollback anytime, and factory reset if I want to start over. I experiment with different AI-generated programs, so I needed an undo button.
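
The versioning itself is just a bounded history. A rough sketch, again in TypeScript and with persistence (files, a database, whatever the app uses) left out:

const MAX_VERSIONS = 10;

class ConfigHistory {
  private versions: AppConfig[] = [];

  // Snapshot the current config before applying an import, dropping the oldest past the cap.
  save(config: AppConfig): void {
    this.versions.push(structuredClone(config));
    if (this.versions.length > MAX_VERSIONS) this.versions.shift();
  }

  // Roll back to the most recent snapshot, or undefined if there is none.
  rollback(): AppConfig | undefined {
    return this.versions.pop();
  }

  // Factory reset: forget everything and start from a clean default.
  reset(defaultConfig: AppConfig): AppConfig {
    this.versions = [];
    return structuredClone(defaultConfig);
  }
}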

Smart merge mode that preserves my workout history and personal records while adding new exercises and templates from imported JSON. This was critical - I don’t want to lose 10 years of data just to try a new program.
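
Conceptually, merge mode is "add anything new, never replace anything that exists." A simplified sketch, assuming the types above:

// Merge an imported config into the current one without losing existing data.
function mergeConfig(current: AppConfig, imported: AppConfig): AppConfig {
  // Comparison key: trimmed, case-insensitive (the duplicate-detection sketch below goes further).
  const key = (name: string) => name.trim().toLowerCase();

  const exercises = new Map<string, Exercise>();
  for (const ex of [...current.exercises, ...imported.exercises]) {
    if (!exercises.has(key(ex.name))) exercises.set(key(ex.name), ex); // existing entries win
  }

  const templates = new Map<string, WorkoutTemplate>();
  for (const t of [...current.workoutTemplates, ...imported.workoutTemplates]) {
    if (!templates.has(key(t.name))) templates.set(key(t.name), t);
  }

  // Nothing that already exists is replaced, so workout history and personal
  // records attached to existing exercises survive the import untouched.
  return {
    metadata: imported.metadata,
    exercises: [...exercises.values()],
    workoutTemplates: [...templates.values()],
  };
}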

Duplicate detection because LLMs aren’t perfectly consistent. “Pull Ups” vs “Pullups” vs “Pull-Ups” would create three separate exercises without this. I normalize names on import.
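
The normalization is small but it matters. A sketch of the kind of cleanup I mean:

// Key used for duplicate detection: case-insensitive, ignores spaces and hyphens,
// so "Pull Ups", "Pullups", and "Pull-Ups" all collapse to "pullups".
function normalizeName(name: string): string {
  return name.toLowerCase().replace(/[^a-z0-9]/g, "");
}

// Display form stored after import: title case, single spaces, no stray whitespace.
function canonicalDisplayName(name: string): string {
  return name
    .trim()
    .replace(/\s+/g, " ")
    .split(" ")
    .map((w) => w.charAt(0).toUpperCase() + w.slice(1).toLowerCase())
    .join(" ");
}

// Example: these three inputs are treated as the same exercise.
["Pull Ups", "Pullups", "Pull-Ups"].map(normalizeName); // -> ["pullups", "pullups", "pullups"]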

The Technical Details

The JSON format looks like this:

{
  "metadata": {
    "version": "1.0",
    "createdBy": "Claude",
    "createdAt": "2026-01-09"
  },
  "exercises": [
    {
      "name": "Squat",
      "barWeight": 45,
      "category": "lower",
      "recommendedStartWeight": 135
    }
  ],
  "workoutTemplates": [
    {
      "name": "Upper Body A",
      "exercises": [...]
    }
  ]
}

The prompt I generate includes:

  • My current exercises and templates
  • Workout statistics (session count, max weight, last performed date)
  • The complete JSON schema with examples
  • Space for my specific goals (“I want to break through my squat plateau”)
  • Step-by-step import instructions

LLMs are excellent at generating structured JSON. They understand fitness concepts. They can create personalized programs. I just needed to give them the format.

Here’s the thing: modern LLMs are really good at this. Claude, GPT-4, Gemini - they all handle structured output reliably. The schema acts as a contract, and they respect it.

Why This Pattern Actually Wins

Privacy by default. My 10 years of workout data never leaves my device. I control exactly what I share with which LLM, when I want to. No third-party servers. No API keys to manage.

Use the best AI available. I can use Claude today, GPT-4 tomorrow, whatever new model launches next week. I’m not locked into whatever LLM I chose in 2026. The app just parses JSON - it doesn’t care where it came from.

Simpler codebase. JSON parsing vs. streaming responses, token counting, rate limiting, retry logic, error handling for 15 different API failure modes. I built this in a few days. A proper LLM integration would have taken weeks.

Zero infrastructure costs. No API bills. No surprise invoices. No rate limits. I already pay for Claude Pro - why pay again for AI features in every app?

Works offline. Once I’ve imported a workout program, everything is local. No internet required. I can work out in my basement with terrible cell reception.

Full control. I can review and edit the JSON before importing. If the LLM suggests something stupid, I can fix it. If I want to tweak the program, I can edit the JSON directly.

Version control. I keep up to 10 versions of my config. I can rollback anytime. Trying a new program doesn’t mean losing the old one. Experimentation is cheap.

Future-proof. When GPT-6 or Claude Opus 5 launches with better fitness knowledge, my app just works. No code changes needed.

Shareable configurations. Here’s something I didn’t initially consider: JSON files are just files. A coach can use an LLM to create a personalized program for their client, then send them the JSON file. A friend who knows about training can help you design a program and just share the file. No need for the app to have user accounts, social features, or program sharing infrastructure. It’s just files.

What I Learned

LLMs are great at structured output. JSON generation is reliable and getting better. I rarely see malformed JSON from Claude or GPT-4. They understand the schema and respect it.

Copy-paste is underrated. It “just works” across all platforms. No SDK integrations needed. No OAuth flows. No API key management. Everyone knows how to copy and paste.

Version control matters. I experiment a lot with different programs. Having the ability to rollback was critical. I can try a new AI-generated program and revert if it doesn’t work for me.

Name normalization is critical. LLMs aren’t perfectly consistent with naming. “Bench Press” vs “Bench press” vs “bench press” need to be treated as the same exercise. I normalize to title case and strip extra whitespace on import.

Simplicity scales. One JSON import feature vs. entire AI infrastructure. The simpler system won, both for building and maintaining.

The Trade-offs

I gave up the “magic” feeling of instant AI responses in-app. I gave up real-time conversational AI. I gave up being able to type “suggest a workout” and have it just work.

What I gained: zero AI costs, perfect privacy, freedom to use any LLM, a simpler codebase, faster shipping, and a future-proof architecture.

I’ll take that trade every time.

Beyond Fitness Apps

This pattern works anywhere users want AI-generated configuration rather than real-time conversation.

Recipe apps: “Generate a weekly meal plan for my dietary restrictions and what I have in my fridge.” Import the JSON meal plan.

Study apps: “Create a study schedule for my exams with these topics and time constraints.” Import the schedule.

Budget apps: “Design a savings plan for my goals based on my income and expenses.” Import the financial plan.

Travel apps: “Build a 7-day itinerary for Tokyo with my interests and budget.” Import the itinerary.

Task managers: “Organize these projects into a GTD system with contexts and priorities.” Import the task structure.

Any app where configuration is the end goal can use this pattern. You don’t need AI in your app - you need a good JSON schema and a way to import it.

And because it’s just JSON files, configurations become shareable. Nutritionists can create meal plans for clients. Study coaches can design schedules for students. Financial advisors can generate budget templates. All without building sharing infrastructure into the app.

The Bigger Picture

We’re seeing a shift in how AI gets integrated into apps. Agentic AI patterns are everywhere. Companies are building LLM-powered fitness coaches, and there are LangChain implementations of the same concept.

But they all integrate AI directly. Complex infrastructure. Scaling costs. Privacy concerns.

This pattern is different: configuration-driven AI where LLMs generate structured data that apps consume, but the AI never lives in the app itself.

It’s simpler. It’s cheaper. It’s more private. And it works with any LLM that exists now or will exist in the future.

Should You Use This Pattern?

Ask yourself: “Do I need real-time AI conversation, or do I just need AI-generated configuration?”

If users need back-and-forth conversation with the AI, this pattern won’t work. Build a proper integration.

But if the AI’s job is to generate a config once (or occasionally), and then the user works with that config in your app, this pattern is perfect.

Your users might already have better AI access than you can provide. Your codebase will be simpler. Your costs will be zero. And your users will have complete control.

Implementation Notes

If you want to try this pattern, here’s what mattered:

  1. Define a clear JSON schema for your domain. Make it comprehensive. Include examples. The LLMs need to know exactly what format you want.

  2. Build import/export with validation. Parse the JSON strictly. Give clear error messages when something’s malformed (see the sketch after this list).


  3. Generate rich LLM prompts with current state + schema + examples. The better your prompt, the better the AI’s output. Don’t skimp on context.

  4. Add versioning/rollback. You’ll experiment. You need an undo button. I keep 10 versions because that’s enough to try different programs without worrying.

  5. Handle duplicates. LLMs aren’t perfectly consistent with naming. Normalize on import or you’ll have “Pull Ups” and “Pullups” as separate exercises.

  6. Use merge mode. Preserve existing data when importing new configurations. Don’t make people choose between trying a new program and keeping their history.
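
For step 2, here's roughly what "parse strictly, with clear error messages" can look like. This is a hand-rolled sketch; a schema validator (zod, JSON Schema, etc.) would be more thorough. The point is that the user sees "exercises[2].barWeight must be a number", not a stack trace.

// Strict import: reject malformed JSON with a message the user can act on.
function importConfig(raw: string): AppConfig {
  let data: unknown;
  try {
    data = JSON.parse(raw);
  } catch {
    throw new Error("That doesn't look like valid JSON. Copy the full code block from the LLM and try again.");
  }

  const cfg = data as Partial<AppConfig>;
  if (!Array.isArray(cfg.exercises)) {
    throw new Error('Missing "exercises" array.');
  }
  cfg.exercises.forEach((ex, i) => {
    if (typeof ex.name !== "string" || ex.name.trim() === "") {
      throw new Error(`exercises[${i}].name must be a non-empty string.`);
    }
    if (typeof ex.barWeight !== "number") {
      throw new Error(`exercises[${i}].barWeight must be a number.`);
    }
  });
  if (!Array.isArray(cfg.workoutTemplates)) {
    throw new Error('Missing "workoutTemplates" array.');
  }
  return cfg as AppConfig;
}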

I’m working on getting Focus Strength ready for release. This pattern made it possible to build a fitness app that’s both powerful and simple.


The “Bring Your Own Coach” pattern isn’t just about avoiding AI integration costs. It’s about building tools that respect user agency, preserve privacy, and stay simple.

Sometimes the best way to add AI to your app is to not add AI to your app at all.
