• 0 Posts
  • 6 Comments
Joined 10 days ago
Cake day: January 24th, 2025

  • @llama@lemmy.dbzer0.com Depends on the inference engine. Some of them will try to load the model until it blows up and runs out of memory, which can cause its own problems, but it won't overheat the phone, no. If you DO use a model the phone can actually run, though, then like any intense computation it can cause the phone to heat up. Best not to run a long inference prompt while the phone is in your pocket, I think.
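    A minimal sketch of the kind of pre-flight check an engine (or a wrapper script) could do before loading, assuming a Linux-like system where `os.sysconf` exposes page counts; the `headroom` factor is my own rough assumption for runtime buffers beyond the raw weights:

    ```python
    import os

    def fits_in_memory(model_path: str, headroom: float = 1.5) -> bool:
        """Estimate whether a model file can be loaded without exhausting RAM.

        headroom: hypothetical multiplier for KV cache and other runtime
        buffers on top of the raw weight file size (assumption, not a spec).
        """
        model_bytes = os.path.getsize(model_path)
        page_size = os.sysconf("SC_PAGE_SIZE")
        # Currently available physical memory, not total installed memory.
        avail_bytes = os.sysconf("SC_AVPHYS_PAGES") * page_size
        return model_bytes * headroom <= avail_bytes
    ```

    Engines that skip a check like this just `mmap`/allocate until the OS kills them, which is the "blows up" behavior above.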