
Your AI Model Needs a Bouncer. Google Built One.

Prompt injection is the SQL injection of the AI era. Model Armor is the first cloud-native solution that protects any LLM, on any cloud, without locking you into a single vendor.

AI Security, Google Cloud, LLM Security, Model Armor, Prompt Injection

Your Load Balancer Has No Idea What an LLM Is (Google Fixed That)

Standard load balancers treat LLM inference like any other HTTP traffic. That is expensive and slow. GKE Inference Gateway knows the difference.

AI Infrastructure, GKE, Google Cloud, Kubernetes, LLM Inference

PostgreSQL Got a Supercharger. Your Database Bill Didn’t Get the Memo.

AlloyDB collapses three separate database systems into one managed PostgreSQL instance. The benchmarks are embarrassing for Aurora.

AlloyDB, Cloud Database, Enterprise AI, Google Cloud, PostgreSQL

The AI Pilot Trap, and How to Get Out of It

Most enterprise AI projects don’t fail because the technology doesn’t work. They fail because no one built the infrastructure to let it work at scale.

AI Strategy, Digital Transformation, Enterprise AI, Google Cloud, Vertex AI

The ETL Pipeline You’re Running Probably Shouldn’t Exist

BigQuery can now run AI models directly inside SQL. The implications for how you’ve been architecting your data stack are a little uncomfortable.

BigQuery, Data Engineering, Enterprise AI, ETL, Google Cloud

Most Enterprise AI Is Blind to 80% of Your Data

Your AI reads text just fine. It’s the contracts, recordings, and images it can’t touch that are going to cost you.

Enterprise AI, Google Cloud, Multimodal AI, Unstructured Data, Vertex AI

Your AI Inference Bill Is About to Become a Strategy Problem

Google Research just published a way to cut AI serving costs by 50% with zero accuracy loss. The interesting part is what happens to the ISVs who figure this out first.

AI Infrastructure, Cost Optimization, Google Cloud, LLM Inference, TPU
BRANDONSEPPA.COM © 2026