
Tag Archive

Below is a list of all posts tagged “AI Inference”.

Google Built a TPU for the Age of Inference. Meet Ironwood.

TPU Ironwood is Google’s 7th-generation custom AI chip, and unlike its predecessors, it was built for inference first. Here’s what that means and why it matters.

AI Inference · AI Infrastructure · Custom Silicon · Google Cloud · Ironwood · ISV · TPU

GPU Inference Without the Cluster. Cloud Run Finally Makes That Real.

Cloud Run now supports GPUs with scale-to-zero billing. For AI inference workloads that are bursty, sporadic, or just getting started, that changes the math entirely.

AI Inference · Cloud Run · Google Cloud · GPU · ISV · LLM Inference · Serverless
BRANDONSEPPA.COM © 2026