Hacker News

The GPT Store is poised to be the biggest platform since the iOS App Store.

Reason: the GPT-4 API is too expensive for most use cases. This encourages app developers to build custom GPTs instead, letting their users pay $20/month for a ChatGPT Plus subscription. To provide the same service in a standalone webapp using the GPT-4 API, you'd have to charge ~$50/mo for your app alone, and that's if you settle for ~50-75% margins (SaaS usually has much higher margins).
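The margin math above can be sketched as a back-of-envelope calculation. The $25/month per-user API cost is an illustrative assumption (chosen so a 50% margin lands on the ~$50/mo figure in the comment), not a number from OpenAI:

```python
# Back-of-envelope unit economics for a GPT-4 wrapper app.
# Assumed: a heavy user's monthly GPT-4 API usage costs the
# developer ~$25 (illustrative figure, not an OpenAI price).
monthly_api_cost = 25.0

def required_price(api_cost: float, target_margin: float) -> float:
    """Price needed so that (price - api_cost) / price == target_margin."""
    return api_cost / (1.0 - target_margin)

# At a 50% margin you'd need to charge ~$50/mo, versus $20/mo
# for ChatGPT Plus; at SaaS-typical 75% margins, ~$100/mo.
print(required_price(monthly_api_cost, 0.50))  # 50.0
print(required_price(monthly_api_cost, 0.75))  # 100.0
```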

By the time GPT-4 level (proprietary or open source) models proliferate and come down in price, OpenAI will have GPT-4.5/GPT-5 for $20/mo in ChatGPT Plus, and it will be tough to ask users to pay to use a subpar model.




Nice to see they are working on reducing the pricing. GPT-4 is just too expensive right now, in my opinion. A long conversation can quickly end up costing tens of dollars if not more, so lower model costs plus a stateful API are urgently needed. I think OpenAI itself would actually gain a lot by reducing the pricing; right now I wouldn't be surprised if many uses of GPT-4 simply aren't viable because of the cost.

The world needs a way to pay for thin AI wrappers without having to turn them into subscription services. There are lots of potential applications where an indie developer can add value on top of LLMs, but can't subsidize GPT-4 usage for the world. OpenAI still doesn't approve of stored bring-your-own-key solutions, nor allow monitoring and approval of API costs at the individual consumer level (the new project keys are a step in that direction). The GPT Store covers some use cases, and hopefully will get both monetization and a way to expose non-chat interfaces. But until then, everyone with a solution that needs to pass through API costs has to make pricing decisions inconvenient to everyone (in this case, asking too high a price; in other cases, risking overuse or having to heavily restrict usage).

GPT-4 is way too expensive at the moment for anyone to offer it in their products, unless they're charging _a lot_ or delegating to the user's API key. It's also very slow and unreliable for any production app.

I'd just like to see GPT-4 more widely available, even on the free ChatGPT tier, although I wonder if that will ever fully happen, with ChatGPT getting so much use and GPT-3.5 being cheaper to run.

Plus seems expensive to me, and it is still quite heavily rate-limited.

I guess it's going to take further optimisation to make it worthwhile for OpenAI.


I think this is a really big deal for at least two reasons. A lot of people in large companies can't or won't use ChatGPT or Copilot because management is worried that their data and code will be used to train new versions of OpenAI's models. Also, although pricing for ChatGPT Business isn't yet available, I'm guessing they will make orders of magnitude more money from enterprise customers than from ChatGPT Plus, which might make it possible for them to make GPT-4 tokens less expensive. As I see it, the current cost of GPT-4 is what is limiting its massive adoption.

Unless you're an extremely heavy user, it's cheaper to just use the API. I've been tempted to do that, but OpenAI doesn't have a free trial for me to see the quality of GPT-4 first.

OpenAI has just released the GPT Store, and creating GPTs is one of the biggest opportunities of 2024.

Yeah, hopefully OpenAI will launch a GPT Store or something similar in the future. I was thinking about whether I should create a webapp or extension, so that a ChatGPT subscription would no longer be needed.

Agreed, and I can't wait for GPT-4 to have great competition in terms of ease, price, and performance. I was responding to this:

> something that should just be completely on-device or self-hosted if you don't trust cloud-based AI models like ChatGPT Enterprise and want it all private and low cost


Last I checked:

- GPT-4 (ChatGPT Plus): max 4K tokens?

- GPT-4 API: max 8K tokens (for most users at the moment)

- GPT-3.5 API: max 16K tokens

I'd consider the 32K GPT-4 context the most valuable feature. In my opinion, OpenAI shouldn't discriminate in favor of large enterprises; it should be equally available to normal (paying) customers.


Sure, but GPT-4 through the UI costs $20 per month, which buys a lot of API calls.
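To put "a lot of API calls" in rough numbers: using the widely cited GPT-4 8K launch prices ($0.03 per 1K input tokens, $0.06 per 1K output tokens; check https://openai.com/pricing, as these may be out of date), $20 buys on the order of 400K tokens:

```python
# How many GPT-4 API tokens a $20 budget buys, assuming the
# 8K-context launch prices and an even split between prompt
# and completion tokens (both are assumptions).
budget = 20.0
input_price_per_1k = 0.03   # $ per 1K prompt tokens
output_price_per_1k = 0.06  # $ per 1K completion tokens

blended_per_1k = (input_price_per_1k + output_price_per_1k) / 2
tokens = budget / blended_per_1k * 1000
print(int(tokens))  # 444444
```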

Most of the products announced (and the price cuts) appear to be more about increasing lock-in to the OpenAI API platform, which is not surprising given increased competition in the space. The GPTs/GPT Agents and Assistants demos in particular showed that they are a black box within a black box within a black box that you can't port anywhere else.

I'm mixed on the presentation and will need to read the fine print on the API docs on all of these things, which have been updated just now: https://platform.openai.com/docs/api-reference

The pricing page has now updated as well: https://openai.com/pricing

Notably, the DALL-E 3 API is $0.04 per image which is an order of magnitude above everyone else in the space.

EDIT: One interesting observation about the new OpenAI pricing structure not mentioned during the keynote: fine-tuned GPT-3.5 is now 3x the cost of base GPT-3.5, down from 8x. That makes fine-tuning a much more compelling option.
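The size of that drop is easy to quantify from the multipliers alone (8x the base price before, 3x after; absolute base prices don't matter for the ratio):

```python
# Relative cost of a fine-tuned GPT-3.5 call versus before the
# price cut, using only the multipliers quoted above.
old_multiplier = 8  # fine-tuned price was 8x the base model
new_multiplier = 3  # fine-tuned price is now 3x the base model

cost_ratio = new_multiplier / old_multiplier
# Fine-tuned calls now cost 37.5% of what they did before.
print(cost_ratio)  # 0.375
```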


You can still run the original gpt-4-0314 model (March 14th) on the API playground:

https://platform.openai.com/playground?mode=chat&model=gpt-4...

Costs $0.12 per thousand tokens (~words), and I find even fairly heavy use rarely exceeds a dollar a day.
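The "rarely exceeds a dollar a day" claim checks out at the quoted rate. Taking the comment's $0.12 per 1K tokens figure at face value (and its rough token-to-word equivalence), a sketch of the daily cost:

```python
# Daily-cost estimate at the $0.12 / 1K tokens rate quoted above.
# The 8,000-tokens-per-day usage figure is an assumption chosen
# to represent "fairly heavy use".
price_per_token = 0.12 / 1000

def daily_cost(tokens_per_day: int) -> float:
    return tokens_per_day * price_per_token

print(round(daily_cost(8_000), 2))  # 0.96 -- just under a dollar
```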


The GPT API is a successful product. All those startups that are just a thin layer over GPT and are funded by Y Combinator are paying for API use, and that's profitable for OpenAI.

Their main advantage for now is their super clean API. Open-source alternatives are already on par with GPT-3.5 and GPT-4 capabilities; they just don't have as good a package, but that could change rather quickly too.

Because running GPT-4 for hundreds of millions of iOS users is not an easy task, especially if there is no subscription model behind it.

Probably still not worth it for me; I use GPT-4 fairly heavily via the API, and my charges still only come to about $5-7 a month.

I heard of people using the GPT-4 API for personal use because it's a lot cheaper than paying for the subscription since it's pay-per-use. Maybe they don't want people to do that.

We need a simpler way for GPT to reach mass adoption, and I don't think OpenAI's GPT Store will do that.

We need a unique interface with a standardized process to help people make use-case-specific GPTs.

Let's see what the future holds.

