AI Glossary

GPU Poor

The state of having insufficient GPU compute resources for training or serving AI models.

Overview

"GPU poor" is a colloquial term in the AI community describing the reality faced by most researchers, startups, and organizations that lack access to the massive GPU clusters needed for training frontier AI models. As model sizes grow, the compute divide between GPU-rich organizations (such as major tech companies) and everyone else continues to widen.

Key Details

Being 'GPU poor' has driven innovation in efficiency: quantization, LoRA fine-tuning, knowledge distillation, and efficient architectures all help maximize the value of limited compute. Cloud GPU rentals, open-source models optimized for consumer hardware, and community efforts like distributed training projects help democratize access. The term highlights a key challenge in AI: ensuring that advanced AI capabilities aren't concentrated only in well-resourced organizations.
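To make one of these efficiency techniques concrete, here is a minimal sketch of symmetric int8 quantization, which shrinks model weights from 32-bit floats to 8-bit integers (roughly a 4x memory saving) at the cost of some precision. This is an illustrative, from-scratch example, not the API of any particular library; the function names are hypothetical.

```python
def quantize_int8(weights):
    """Map float weights to int8 values plus a per-tensor scale factor.

    Hypothetical helper for illustration: symmetric quantization maps the
    largest absolute weight to +/-127, the int8 range.
    """
    scale = max(abs(w) for w in weights) / 127 or 1.0  # avoid zero scale
    quantized = [round(w / scale) for w in weights]
    return quantized, scale


def dequantize(quantized, scale):
    """Recover approximate float weights from int8 values and the scale."""
    return [q * scale for q in quantized]


weights = [0.42, -1.27, 0.08, 0.9]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# Each restored weight differs from the original by at most half a
# quantization step (scale / 2), while storage drops from 4 bytes to
# 1 byte per weight.
```

Real quantization schemes (per-channel scales, 4-bit formats, calibration on activation statistics) are more involved, but the core idea is the same: trade a small amount of precision for a large reduction in memory and bandwidth, which is exactly what limited hardware demands.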

Related Concepts

GPU, Quantization, LoRA


Last updated: March 5, 2026