
The 14-inch Macbook Pros are DEFECTIVE and need to go back


Date: July 6th, 2025 11:42 AM
Author: https://imgur.com/a/o2g8xYK


Only the oafish 16" models have proper cooling designs. The 14-inch models have fans, but they're not enough to keep the CPU within tolerable limits. Run LLMs or anything demanding and it's going to spike to 95°C and throttle down to 1.8 GHz. People are paying $2,000-$5,000 for laptops full of CPUs that throttle 24/7, and the only reason they don't notice is that they never use the computer to do shit except email.
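
If you want to watch it happen instead of taking my word for it, here's a minimal sketch: it shells out to macOS's powermetrics and prints the thermal lines while a model is loaded. It assumes the "thermal" sampler reports a pressure/throttling line on Apple Silicon; the exact wording of its output varies by machine and OS version.

import subprocess

# Sketch: stream macOS thermal readings once per second while an LLM is running.
# Assumes `powermetrics` (ships with macOS, requires sudo); the exact lines the
# "thermal" sampler prints vary by hardware and OS version.
proc = subprocess.Popen(
    ["sudo", "powermetrics", "--samplers", "thermal", "-i", "1000"],
    stdout=subprocess.PIPE, text=True,
)
for line in proc.stdout:
    if "pressure" in line.lower() or "throttl" in line.lower():
        print(line.strip())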

(http://www.autoadmit.com/thread.php?thread_id=5746690&forum_id=2...id.#49076031)




Date: July 6th, 2025 11:50 AM
Author: Oh, you travel? ( )

I have the 14” M3 pro model and I agree.

(http://www.autoadmit.com/thread.php?thread_id=5746690&forum_id=2...id.#49076042)




Date: July 6th, 2025 12:02 PM
Author: https://imgur.com/a/o2g8xYK


Flip it on Swappa and buy a 16" (also on Swappa)

(http://www.autoadmit.com/thread.php?thread_id=5746690&forum_id=2...id.#49076067)




Date: July 6th, 2025 12:28 PM
Author: cucumbers

why are you running LLMs locally

(http://www.autoadmit.com/thread.php?thread_id=5746690&forum_id=2...id.#49076109)




Date: July 6th, 2025 12:29 PM
Author: https://imgur.com/a/o2g8xYK


DeepSeek is the only cloud AI I trust with my data. I would never let an American company read my prompts, especially when I'm doing image generation. Gemma 12B is 180 for legal writing so I don't need the cloud for that, and ljl@ any lawyer sharing sensitive info with a cloud service for any reason.
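
For what it's worth, keeping it local is trivial once the model is pulled; a minimal sketch assuming an Ollama install and its Python client (the model tag below is a guess, swap in whatever Gemma build you actually pulled):

import ollama  # assumes the `ollama` Python client and a local Ollama server

# Sketch: draft against a locally hosted Gemma model so no prompt leaves the laptop.
# The model tag is an assumption; substitute the tag your install actually uses.
resp = ollama.chat(
    model="gemma3:12b",
    messages=[{"role": "user", "content": "Draft a one-paragraph engagement letter."}],
)
print(resp["message"]["content"])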

(http://www.autoadmit.com/thread.php?thread_id=5746690&forum_id=2...id.#49076111)




Date: July 6th, 2025 12:32 PM
Author: cucumbers

the problem isn't the MacBook; the problem is that you're a lawyer

(http://www.autoadmit.com/thread.php?thread_id=5746690&forum_id=2...id.#49076115)




Date: July 6th, 2025 12:33 PM
Author: https://imgur.com/a/o2g8xYK


The problem is you can't use more than 8 GB of RAM without your MacBook overheating, and Apple lets people buy MacBooks with up to 128 GB. You'll be in a nursing home before it gets done chewing through half that RAM because it has to throttle to 1/3 its rated speed.

(http://www.autoadmit.com/thread.php?thread_id=5746690&forum_id=2...id.#49076117)




Date: July 6th, 2025 12:54 PM
Author: cucumbers

my work MacBook Pro (14") typically uses around 14+ GB of RAM without issue. there's more to running LLMs than RAM

plus it's a fucking laptop. it's not a watercooled vapor chamber cryogenic desktop that can be overclocked without issue. you bought the wrong computer for running anything beyond a modest LLM
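
The back-of-envelope numbers back that up; a rough sketch (weights-only arithmetic, ignoring KV cache and runtime overhead, so real usage runs higher):

# Rough footprint of model weights alone: parameter count x bytes per weight.
# Ignores KV cache, activations, and runtime overhead, so actual use is higher.
def weights_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * bits_per_weight / 8  # billions of bytes ≈ GB

for bits in (16, 8, 4):
    print(f"12B model at {bits}-bit ≈ {weights_gb(12, bits):.0f} GB of weights")

# Even when the weights fit comfortably in RAM, generation speed is bounded by
# how fast the chip can stream them from memory each token, not by capacity alone.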

(http://www.autoadmit.com/thread.php?thread_id=5746690&forum_id=2...id.#49076161)




Date: July 6th, 2025 1:11 PM
Author: Police Boner (🧐)

why on earth would you be paranoid about storing shit on the cloud, then go with DeepSeek of all options

(http://www.autoadmit.com/thread.php?thread_id=5746690&forum_id=2...id.#49076207)