jrm4
So, I'll be honest, I don't understand this market. I get that one can be profitable selling shovels during the gold rush, sure. But I have trouble understanding who is knowledgeable/dedicated enough to try to get their AI app going, but would pay to abstract/outsource this part of the chain.

(I suppose, relatedly, I have trouble understanding why anyone would just sort of presume OpenAI would forever be the best backend here as well?)

ukuina
This is cool, congrats on launching!

How is it different from Puter AI, which offers auth + free inference?

https://docs.puter.com/AI/chat/

stroupwaffle
Please hire a real artist; the graphics on the home page are disturbing.
CuriouslyC
If you don't want to pay for this service, keytrustee.org does this for free.
madamelic
To potentially save you some headache, take a look at serverless.com and weigh the likelihood they come after you about that name if you are planning on making this a business.

(And yes, I hate their name too. I honestly don't know how defensible an entire technology term actually is. It also makes for terrible Googling.)

jdmoreira
This is a great idea. You should market to app devs as well.

I would also build this on top of firebase marketplace: https://extensions.dev

friendly_chap
Hah! Nice idea! I built something with a similar mindset, but instead of calling cloud AI providers, my aim is to provide a self-hostable, complete AI platform: https://github.com/singulatron/singulatron

I know that might sound like putting the server back into serverless. But I would say it's being your own serverless provider: once you have the platform installed on your servers, you can build frontend-only AI apps on top.

Hope you don't mind the self-plug. Your approach definitely has a ton of advantages when starting out (no infra to manage, etc.).

sf-wy
Great idea! I like the ergonomics of this on the developer side: it's easy to add, and it puts the onus on the developer to have a robust auth system that keeps users from creating thousands of accounts to get unlimited LLM access.
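One piece of that "robust auth" could be a per-user quota on the developer's backend. A minimal sketch, assuming the app already authenticates users (the names `allow_request` and `DAILY_LIMIT` are hypothetical, not part of any product here):

```python
from collections import defaultdict
from datetime import date

# Hypothetical per-user daily request quota. The user_id must come from
# the app's own authenticated session, not anything the client can mint
# freely -- otherwise account-farming sidesteps the limit entirely.
DAILY_LIMIT = 50
_usage: dict = defaultdict(int)

def allow_request(user_id: str) -> bool:
    """Count the request and return True if the user is under today's quota."""
    key = (user_id, date.today())
    if _usage[key] >= DAILY_LIMIT:
        return False
    _usage[key] += 1
    return True
```

This only caps per-account usage, of course; stopping mass account creation itself still needs things like email verification or payment on file.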

One challenge with frontend-only apps: if the prompt is proprietary, it will be exposed unless you offer prompt templating or prompt mapping on your side, i.e. the frontend sends prompt: Template_123 and that maps to the actual prompt somehow. Prompting is still important, and may be for a while, so having those internals externally visible could be sensitive.
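The Template_123 mapping described above could look something like this server-side sketch (the names `PROMPT_TEMPLATES` and `resolve_prompt` are hypothetical, just to illustrate the idea):

```python
# Hypothetical server-side prompt registry: the frontend only ever sends
# an opaque template ID plus user-supplied variables; the actual prompt
# text never leaves the server.
PROMPT_TEMPLATES = {
    "Template_123": (
        "You are a helpful travel agent. "
        "Plan a trip to {destination} for {days} days."
    ),
}

def resolve_prompt(template_id: str, variables: dict) -> str:
    """Look up a template by its public ID and fill in the caller's variables."""
    template = PROMPT_TEMPLATES.get(template_id)
    if template is None:
        raise KeyError(f"unknown template: {template_id}")
    return template.format(**variables)

# A frontend request like {"prompt": "Template_123", "vars": {...}}
# would be resolved server-side before calling the model provider:
resolve_prompt("Template_123", {"destination": "Lisbon", "days": 3})
# -> "You are a helpful travel agent. Plan a trip to Lisbon for 3 days."
```

The frontend stays "dumb" and shippable as static files, while the proprietary prompt lives only behind the proxy.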