  The most prestigious law school admissions discussion board in the world.

AI wrapper "companies" going to ZERO when token prices triple



Date: February 19th, 2026 10:40 PM
Author: Excitant angry nowag state

lol no way are open ai and anthropic going to keep the free ride going forever

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681739)




Date: February 20th, 2026 12:52 PM
Author: Frum Gaping Giraffe

🚨 this is a WLMAS account 🚨


(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49682918)




Date: February 19th, 2026 10:43 PM
Author: Azure national

They’re already going to zero, man, the wrappers are so fuckin done here

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681746)




Date: February 19th, 2026 10:45 PM
Author: Excitant angry nowag state

yeah good point. but also other kinds of AI startups are going to ZERO when token prices go up

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681749)




Date: February 19th, 2026 10:47 PM
Author: Azure national

When do you think they’re gonna do the rug pull and 10x token prices

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681754)




Date: February 19th, 2026 11:11 PM
Author: heady underhanded temple pisswyrm

gross margin on tokens is already pretty good (around 50%), and it's a competitive market requiring billions in R&D and training to keep your tokens worth anything
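The arithmetic behind a margin figure like that is simple (a toy sketch; the price and serving-cost numbers here are illustrative assumptions, not actual lab pricing):

```python
# Toy unit economics for an inference API. All numbers are assumed
# for illustration, not taken from any real provider's price sheet.
PRICE_PER_M_TOKENS = 3.00   # assumed list price per million output tokens ($)
SERVE_COST_PER_M = 1.50     # assumed GPU + overhead cost to serve them ($)

def gross_margin(price: float, cost: float) -> float:
    """Gross margin as a fraction of revenue: (price - cost) / price."""
    return (price - cost) / price

print(f"gross margin: {gross_margin(PRICE_PER_M_TOKENS, SERVE_COST_PER_M):.0%}")
```

At these assumed numbers the margin comes out to 50%, but note gross margin ignores the R&D and training spend the post mentions, which is exactly why "50% gross" can still mean deeply unprofitable overall.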

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681777)




Date: February 19th, 2026 11:15 PM
Author: Unhinged nibblets locus

Models are only going to get cheaper to run relative to their strength, there’s far too much competition. The shit coming out of China on a relative shoestring budget (and price to run) is very impressive. Anthropic is rushing to build products on top and establish a dominant position in B2B early on for that exact reason

AI wrappers go to zero not because of token prices but because they only get easier to build the better the models get, and the frontier labs are obviously always a step ahead in access to the best models and talent

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2\u0026mark_id=5310906#49681782)




Date: February 19th, 2026 11:19 PM
Author: Azure national

How tf do you figure that models are only getting cheaper to run relative to strength when all of the gains in capabilities in the past 18 months have come from throwing as much inference compute as possible at whatever you want the model to do

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681785)




Date: February 19th, 2026 11:25 PM
Author: Excitant angry nowag state

maybe they are assuming that even if raw compute per query increases, efficiency per outcome is still increasing faster overall?

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681795)




Date: February 19th, 2026 11:35 PM
Author: Azure national

I think ultimately we’re going to end up at a spot with LLMs where you can keep getting marginal capability gains by spending tons of inference compute, but the gains are increasingly inefficient relative to cost

I worry that like capital itself, access to the most powerful AI capabilities will end up gated behind prohibitively high costs and access to “borrowed” Monopoly money. The rich get richer, the poor get poorer, etc
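That "inefficient gains" intuition can be sketched with a toy diminishing-returns model (the log relationship and the constant are assumptions for illustration, not a measured scaling law):

```python
import math

# Toy model: assume capability grows with the log of inference spend.
# The log form and the coefficient are illustrative assumptions only.
def capability(compute_dollars: float, a: float = 10.0) -> float:
    return a * math.log10(compute_dollars)

# Under this assumption, every 10x jump in spend buys the same absolute
# capability gain, so the gain per marginal dollar keeps shrinking.
for spend in (1, 10, 100, 1_000):
    gain = capability(spend * 10) - capability(spend)
    print(f"${spend:>5} -> ${spend * 10:>6}: +{gain:.1f} capability")
```

If something like this holds, the last few points of capability cost orders of magnitude more than the first, which is the "gated behind prohibitively high costs" scenario in a nutshell.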

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681803)




Date: February 20th, 2026 1:00 AM
Author: Azure national

https://x.com/gdb/status/2024662197692223857

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681868)




Date: February 20th, 2026 5:14 AM
Author: Unhinged nibblets locus

Extreme compute scarcity is the only scenario in which our GPU capex frenzy stands a chance in the long run against China and its insurmountable lead in the meatspace supply chain; of course they’re going to say this.

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681995)




Date: February 20th, 2026 5:09 AM
Author: Unhinged nibblets locus

"all of the gains in capabilities in the past 18 months have come from throwing as much inference compute as possible at whatever you want the model to do "

Because that's just obviously not true? Minimax 2.5 is far more capable than Sonnet 3.5 or whatever

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681994)




Date: February 20th, 2026 12:52 PM
Author: Azure national

i mean it's an exaggeration, yes. not literally all of the gains have come from throwing more inference compute at the prompt, but most of them have, for sure

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49682914)




Date: February 19th, 2026 11:21 PM
Author: Excitant angry nowag state

the trend is toward cheaper tokens, yes. but people would still pay for the frontier if they tripled prices, so I wouldn't rule it out yet. plus it's not profitable right now; open ai and anthropic are basically giving people a free ride. I think the main thing stopping them is that open source models are improving and distillation makes "good enough" very cheap. but still, i think if they keep the gap wide enough at the frontier, prices could rise, especially in high-stakes industries
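For a sense of what a price triple would do to a wrapper, here is a hypothetical subscription business (every number below is made up for illustration):

```python
# Hypothetical wrapper unit economics; all figures are illustrative
# assumptions, not data from any real company.
SUBSCRIPTION = 20.00          # what the wrapper charges per user per month ($)
TOKENS_PER_USER = 5_000_000   # assumed monthly token usage per subscriber
API_PRICE_PER_M = 3.00        # assumed current API price per million tokens

def monthly_margin(api_price_per_m: float) -> float:
    """Per-user monthly profit after token costs."""
    token_cost = TOKENS_PER_USER / 1_000_000 * api_price_per_m
    return SUBSCRIPTION - token_cost

print(monthly_margin(API_PRICE_PER_M))      # margin at current assumed prices
print(monthly_margin(API_PRICE_PER_M * 3))  # margin if token prices tripled
```

At these assumed numbers, the per-user margin goes from +$5 to -$25 a month: a business that cannot raise prices or cut usage simply flips to losing money on every subscriber.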

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681790)




Date: February 20th, 2026 1:53 AM
Author: marvelous whorehouse

This is why I'm furiously building a local llm rig with an amd 9 and a 4090 this weekend; then preordering the M4 ultra with the maximum RAM config on day 1

(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49681900)




Date: February 20th, 2026 12:53 PM
Author: Frum Gaping Giraffe

🚨 this is a WLMAS account 🚨


(http://www.autoadmit.com/thread.php?thread_id=5836559&forum_id=2&mark_id=5310906#49682919)