Here’s an anecdote. I was subscribed to ChatGPT Plus for a while, to get access to GPT4.
I stopped subscribing after I got GPT-4 API access because I developed a little personal app that uses the OpenAI API to read and write directly from plain text files, and that suits my workflow better than the ChatGPT website.
But it sucks because I’m constantly thinking about how much I’m using and how many tokens I’m putting into each query, because each API call costs me money. It was way nicer just paying a flat fee and using it “as much as I want”, even though the API actually costs me way less, since I don’t use $20 USD worth of API calls in a month, even with GPT-4.
It would be a nightmare to use Reddit if it cost money to scroll down or post a comment. On the other hand, that might actually be a good disincentive to help me spend less time on it.
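For what it’s worth, a minimal sketch of the kind of plain-text workflow described above might look like this, assuming the official openai Python SDK (v1+) with an OPENAI_API_KEY in the environment; the file names and model choice are illustrative, not the commenter’s actual app:

```python
# Minimal sketch: read a prompt from a plain text file, append the reply to another.
# Assumes the official `openai` Python SDK (v1+); file names here are made up.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

with open("prompt.txt") as f:
    prompt = f.read()

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
)

with open("reply.txt", "a") as f:
    f.write(response.choices[0].message.content + "\n")
```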
You keep saying “chatgpt-4”; I wonder if you’re aware that you don’t need to pay $20 a month (which is what that costs) to use GPT-4. If you use it through the API instead, even fairly heavy use comes out to a dollar or two a month.
Since I do not use it every day, I only pay for API access directly and it costs me a fraction of that. You can trivially make your own ChatGPT frontend (and from what people write you could make GPT write most of the code, although that's never been my experience).
Ever since I got API access to GPT-4, I don't even bother with using ChatGPT proper (I need to cancel the subscription one of these days). I tend to use it in bursts, so smoothing usage by rate limits doesn't work for me.
Is it more expensive this way? At the rate I'm using it, probably 2-3x. Is it worth it? Absolutely. I've never in my life felt so happy to pay for something per request.
If you're paying per token for ChatGPT, I am surprised. You pay nothing to get access to ChatGPT. Plus subscribers get access to GPT-4, but they pay per month (with rate limits of N requests per X hours), not per token.
If you're paying for the API, you have text-davinci, and it does not behave the way the free ChatGPT does.
I think I pay less than a dollar a month for my use of GPT-3 at the moment. I only used ChatGPT once but it seemed inferior and more restrictive than just using the API directly. Am I missing something?
No wonder; is this just the chat interface, or the API too? I guess GPT-4 was never sustainable at $20 a month. It's annoying to be charged the same subscription fee while the product is made inferior.
If you're even a moderate user of ChatGPT, the equivalent cost of using the smallest GPT-4 model would run you a hell of a lot more. Let's actually run the numbers.
The cost of the GPT-4 API is in the ballpark of $0.05 per 1,000 tokens. If you include a rolling context window, you will easily hit 1,000+ tokens per query, if not a huge amount more.
ChatGPT Plus gives you 50 GPT-4 queries every three hours. If you're using it all day, you might average about 100 daily queries. Using the GPT-4 API for the same thing would run you approximately five dollars a day - that's $150 a month as opposed to a flat $20.
How hard is it to hit 100 queries in a day? Pretty damn easy once you realize that most queries aren't standalone - instead they involve a back-and-forth that necessitates a rolling context window as you explore the problem space.
When a query (plus back and forth) might cost as much as $0.25, paying for GPT-4 via the API would only net you a whopping 80-100 queries per month before you exceeded the cost of a Plus subscription.
The API certainly has its advantages over the web based ChatGPT, but price is definitely not one of them.
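As a back-of-the-envelope check on the numbers above (all figures are the commenter's ballpark assumptions, not official pricing):

```python
# Rough cost comparison using the commenter's assumed figures.
price_per_1k_tokens = 0.05   # assumed blended GPT-4 API price, $ per 1,000 tokens
tokens_per_query = 1000      # a rolling context easily pushes a "query" to 1k+ tokens
queries_per_day = 100        # heavy, all-day usage
days_per_month = 30

api_cost_per_day = queries_per_day * tokens_per_query / 1000 * price_per_1k_tokens
api_cost_per_month = api_cost_per_day * days_per_month
print(f"API: ${api_cost_per_day:.2f}/day, ${api_cost_per_month:.2f}/month vs $20 flat")
# -> API: $5.00/day, $150.00/month vs $20 flat
```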
The GPT-4 API costs significantly more than the $20/mo ChatGPT Plus subscription, but it does have benefits. Namely, 128k tokens of context with the latest GPT-4-turbo model, so you can put a whole (small) software project or a short book into your prompt.
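As a rough illustration of that long-context use case (a sketch only, not the commenter's setup; the directory, glob pattern, and model name are assumptions), you can concatenate a small project into a single prompt:

```python
# Sketch: stuff a small project into one long-context prompt.
# Assumes the `openai` Python SDK and a 128k-context model such as gpt-4-turbo.
from pathlib import Path
from openai import OpenAI

client = OpenAI()

# Concatenate every Python file in the (hypothetical) project into one context block.
project = "\n\n".join(
    f"# {path}\n{path.read_text()}" for path in Path("my_project").rglob("*.py")
)

response = client.chat.completions.create(
    model="gpt-4-turbo",
    messages=[
        {"role": "system", "content": "You are reviewing the following codebase."},
        {"role": "user", "content": project + "\n\nSummarize the architecture."},
    ],
)
print(response.choices[0].message.content)
```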
Tangentially, I lowered my personal cost by more than 3x, and was able to share GPT-4 with my friends and family.
I installed LibreChat on a Linode droplet, put $10 on my OpenAI account for API usage, and cancelled my ChatGPT subscription. Since neither I nor my friends and family use ChatGPT a ton, I let them sign up on my server, and the API costs are under $1 so far.
Can’t you get by with ChatGPT-4 for these personal-assistant-type questions? That’s what I do, and my $20 a month goes a long way. I’d be interested to see if I am missing out on anything using GPT this way in contrast to the API.
I'm surprised nobody is mentioning one of the largest factors. If you're a heavy user of ChatGPT, the equivalent cost of using the smallest GPT-4 model would run you a hell of a lot more.
The cost of the GPT-4 API is in the ballpark of $0.05 per 1,000 tokens. If you include a rolling context window, you will easily hit 1,000+ tokens, if not a huge amount more. ChatGPT Plus gives you 50 GPT-4 queries every three hours. If you're using it all day, you might average about 100 daily queries. Using the GPT-4 API for the same thing would run you approximately five dollars a day - that's $150 a month as opposed to a flat $20.
I stopped my ChatGPT Plus subscription and replaced it with pure GPT-4 API calls because the "product" built around the API just got dumber and dumber over time.
- ChatGPT pros:
  - Has some bells and whistles like code interpreter (which I can easily get via Open Interpreter).
  - Has plugins (although I found web browsing to be inferior compared to Poe/Perplexity).
- Pure GPT-4 API pros:
  - Is not dumbed down or forced to "forget" things or be lazy in coding.

I use the API either programmatically or through Poe.
I'm astonished how often this comes up and also how wrong it is.
The cost of the GPT-4 API is in the ballpark of $0.05 per 1,000 tokens. If you include a rolling context window, which you basically HAVE TO DO if you want to maintain a persistent conversation, you will easily meet or exceed 1,000 tokens per query.
ChatGPT Plus gives you 50 GPT-4 queries every three hours. If you're using it all day, you might average about 100 daily queries. Using the GPT-4 API for the same thing would run you approximately five dollars a day - that's $150 a month as opposed to a flat $20.
No, because my workplace bans it, so I don't find a subscription for personal use worth it. Before my workplace banned it, I used the free version last year, and I found it slightly useful but not essential - more of a smart autocomplete than anything truly intelligent. ChatGPT and the GPT API have been much more useful for my personal coding (mostly as an enhanced Stack Overflow).
I actually used to use ChatGPT but switched to the API once I had GPT-4 access. Mainly it’s because I simply didn’t use $20 worth of GPT-4 at the time. It was extremely slow, and the questions-per-hour limit was annoying and stressful: I would always worry I would need it for something unexpected, so I never used more than 15 questions at a time (though this has probably changed over the last couple of months). In addition, the privacy implications are better with the API, since the terms for how they handle your data are better. I also like how I can tie GPT in anywhere. I use the Matrix bridge so I can give access to people like my parents, who are not tech literate enough to sign up and get used to the ChatGPT interface; I let them talk to it as a bot through the WhatsApp bridge.