![](https://upvote.au/pictrs/image/84065ed2-43b1-4ab2-8678-dfcd7e760a7b.jpeg)
![](https://fry.gs/pictrs/image/c6832070-8625-4688-b9e5-5d519541e092.png)
> I’m pretty sure Google uses their TPU chips
The Coral ones? They don’t have nearly enough RAM to handle LLMs - they only have 8 MB of on-chip RAM and only support small TensorFlow Lite models.
Google does have larger custom TPUs in their data centers though (only available through Google Cloud, not as hardware you can buy) - a lot of the big tech companies are working on their own AI chips.
> instead of a regular GPU
I wouldn’t call them regular GPUs… AI workloads typically run on accelerators like the Nvidia H100, which are specifically designed for AI and don’t even have video output ports.
The Coral is fantastic for use cases that don’t need large models. Object recognition for security cameras (using Blue Iris or Frigate) is a common one, but you can also do things like object tracking (tracking where individual objects move in a video), pose estimation, keyphrase detection, sound classification, and more.
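For a rough idea of what that looks like, here’s a minimal sketch of object detection on a Coral using Google’s pycoral library - the model filename and image path are just placeholders, and it assumes a model that’s already been compiled for the Edge TPU (like the SSD MobileNet ones from Coral’s model zoo):

```python
from PIL import Image
from pycoral.adapters import common, detect
from pycoral.utils.edgetpu import make_interpreter

# Load a model that was already compiled for the Edge TPU
# (the filename here is just an example from Coral's model zoo).
interpreter = make_interpreter('ssd_mobilenet_v2_coco_quant_postprocess_edgetpu.tflite')
interpreter.allocate_tensors()

# Resize the camera frame to the model's expected input size.
image = Image.open('camera_frame.jpg')
_, scale = common.set_resized_input(
    interpreter, image.size, lambda size: image.resize(size, Image.LANCZOS))

# Run inference on the Edge TPU and read back the detections.
interpreter.invoke()
for obj in detect.get_objects(interpreter, score_threshold=0.5, image_scale=scale):
    print(obj.id, obj.score, obj.bbox)
```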
It runs TensorFlow Lite, so you can also build your own models.
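The rough workflow for that (just a sketch, with a tiny stand-in model instead of a real trained one) is: train a small model, quantize it to int8 with the TFLite converter, then run the result through Google’s edgetpu_compiler so it actually executes on the Coral:

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model; in practice this would be your trained network.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation='relu', input_shape=(96, 96, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(4, activation='softmax'),
])

# Representative samples let the converter pick int8 quantization ranges.
# (Random data here; you'd use real inputs.)
def representative_dataset():
    for _ in range(100):
        yield [np.random.rand(1, 96, 96, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# The Edge TPU only runs fully integer-quantized ops.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

with open('model.tflite', 'wb') as f:
    f.write(converter.convert())

# Then compile it for the Edge TPU:  edgetpu_compiler model.tflite
```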
Pretty good for a $25 device!