The most prestigious law school admissions discussion board in the world.

AI wrapper "companies" going to ZERO when token prices triple


Date: February 19th, 2026 10:40 PM
Author: Clear meetinghouse

lol no way are OpenAI and Anthropic going to keep the free ride going forever

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681739)




Date: February 20th, 2026 12:52 PM
Author: offensive field roast beef

🚨 this is a WLMAS account 🚨


(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49682918)




Date: February 19th, 2026 10:43 PM
Author: Opaque cuckold

They’re already going to zero, man, the wrappers are so fuckin done here

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681746)




Date: February 19th, 2026 10:45 PM
Author: Clear meetinghouse

yeah good point. but also other kinds of AI startups are going to ZERO when token prices go up

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681749)




Date: February 19th, 2026 10:47 PM
Author: Opaque cuckold

When do you think they’re gonna do the rug pull and 10x token prices

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681754)




Date: February 19th, 2026 11:11 PM
Author: Dark beady-eyed home famous landscape painting

gross margin on tokens is already pretty good (around 50%), and it's a competitive market requiring billions in R&D and training to keep your tokens worth anything
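A toy version of that margin arithmetic, with made-up numbers just to show the shape of the claim (the ~50% figure above is the poster's, not a published one):

```python
# Toy inference unit economics -- every number here is invented for illustration.
price_per_m_tokens = 10.00  # hypothetical price charged per 1M output tokens, USD
cost_per_m_tokens = 5.00    # hypothetical GPU/energy cost to serve 1M tokens, USD

gross_margin = (price_per_m_tokens - cost_per_m_tokens) / price_per_m_tokens
print(f"gross margin on tokens: {gross_margin:.0%}")  # prints "gross margin on tokens: 50%"
```

The catch, per the post, is that this margin is on serving alone: the billions in R&D and training needed to keep the tokens worth selling sit on top of it.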

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681777)




Date: February 19th, 2026 11:15 PM
Author: Vigorous senate toaster

Models are only going to get cheaper to run relative to their strength, there’s far too much competition. The shit coming out of China on a relative shoestring budget (and price to run) is very impressive. Anthropic is rushing to build products on top and establish a dominant position in B2B early on for that exact reason

AI wrappers go to zero not because of token prices but because they only get easier to build the better the models get, and the frontier labs are obviously always a step ahead in access to the best models and talent

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681782)




Date: February 19th, 2026 11:19 PM
Author: Opaque cuckold

How tf do you figure that models are only getting cheaper to run relative to strength when all of the gains in capabilities in the past 18 months have come from throwing as much inference compute as possible at whatever you want the model to do

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681785)




Date: February 19th, 2026 11:25 PM
Author: Clear meetinghouse

maybe they are assuming that even if raw compute per query increases, efficiency per outcome is still increasing faster overall?

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681795)




Date: February 19th, 2026 11:35 PM
Author: Opaque cuckold

I think ultimately we’re going to end up at a spot with LLMs where you can keep getting marginal capability gains by spending tons of inference compute but they’re increasingly inefficient gains relative to cost
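That diminishing-returns picture can be sketched with a toy curve. The specific shape here (capability growing with the log of inference compute while cost grows linearly) is an assumption for illustration, not a measured scaling law:

```python
import math

# Toy model: capability ~ 1 + log10(compute), cost proportional to compute.
# All numbers are illustrative assumptions, not benchmarks.
for compute in (1, 10, 100, 1000):        # relative inference compute per query
    capability = 1 + math.log10(compute)  # toy capability score
    cost = float(compute)
    print(f"{compute:>4}x compute: capability {capability:.0f}, capability per unit cost {capability / cost:.3f}")
```

Under these assumptions, each extra point of capability costs 10x the compute of the last one, so capability per unit cost collapses even as absolute capability keeps rising.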

I worry that like capital itself, access to the most powerful AI capabilities will end up gated behind prohibitively high costs and access to “borrowed” Monopoly money. The rich get richer, the poor get poorer, etc

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681803)




Date: February 20th, 2026 1:00 AM
Author: Opaque cuckold

https://x.com/gdb/status/2024662197692223857

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681868)




Date: February 20th, 2026 5:14 AM
Author: Vigorous senate toaster

Extreme compute scarcity is the only scenario in which our GPU capex frenzy stands a chance in the long run relative to China and its insurmountable lead in the meatspace supply chain, so of course they are going to say this.

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681995)




Date: February 20th, 2026 5:09 AM
Author: Vigorous senate toaster

"all of the gains in capabilities in the past 18 months have come from throwing as much inference compute as possible at whatever you want the model to do "

Because that's just obviously not true? Minimax 2.5 is far more capable than Sonnet 3.5 or whatever

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681994)




Date: February 20th, 2026 12:52 PM
Author: Opaque cuckold

i mean it's an exaggeration, yes. not literally all of the gains have come from throwing more inference compute at the prompt, but most of them have, for sure

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49682914)




Date: February 19th, 2026 11:21 PM
Author: Clear meetinghouse

the trend is toward cheaper tokens, yes. but people would still pay for the frontier if they tripled prices, so I wouldn't rule it out yet. plus it's not profitable right now; OpenAI and Anthropic are basically giving people a free ride. I think the main thing stopping them is that open source models are improving and distillation makes "good enough" very cheap. but still i think if they keep the gap wide enough at the frontier the prices could rise, especially in industries that are high stakes
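A toy wrapper P&L shows why a tripling of token prices would matter so much to the wrappers (every number here is a hypothetical, invented for illustration):

```python
# Hypothetical wrapper unit economics -- numbers invented for illustration.
revenue_per_user = 20.00   # assumed monthly subscription, USD
token_spend_today = 8.00   # assumed monthly inference cost per user at current prices, USD

for multiplier in (1, 3):  # current prices vs a hypothetical 3x repricing
    cost = token_spend_today * multiplier
    print(f"{multiplier}x token prices: margin ${revenue_per_user - cost:.2f}/user/month")
```

Under these assumed numbers, a wrapper with no pricing power flips from +$12 to -$4 per user per month, which is the squeeze the thread is arguing about.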

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681790)




Date: February 20th, 2026 1:53 AM
Author: supple becky stag film

This is why I'm furiously building a local llm with amd 9 and 4090 this weekend; then preordering M4 ultra with maximum RAM config on day 1

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49681900)




Date: February 20th, 2026 12:53 PM
Author: offensive field roast beef

🚨 this is a WLMAS account 🚨


(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=3986969#49682919)