Parameters
The adjustable settings inside an AI model that determine how it processes information.
In Plain English
Parameters are the internal values that an AI model learns during training. Think of them as the model's "knowledge" encoded as numbers. More parameters generally mean a more capable model, but they also require more computing power. Leading models now range from a few billion parameters (for fast, lightweight tasks) to over a trillion for the most powerful systems. Newer architectures such as mixture-of-experts activate only a fraction of their parameters per query, improving efficiency.
💡Real-World Example
When people say a model has "70B parameters," they mean it has 70 billion adjustable values that were tuned during training.
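Where do those billions come from? They are just the weights and biases of the model's layers added up. A minimal sketch, using a hypothetical small feedforward network (the layer sizes here are illustrative, not from any real model):

```python
# Minimal sketch: counting the parameters of a small feedforward network.
# Each dense layer with n_in inputs and n_out outputs contributes
# n_in * n_out weights plus n_out biases -- all values the model
# adjusts ("learns") during training.

def count_parameters(layer_sizes):
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out  # weights + biases per layer
    return total

# A toy network: 784 inputs -> 256 hidden units -> 10 outputs
print(count_parameters([784, 256, 10]))  # -> 203530
```

A "70B" model is the same idea at vastly larger scale: its layers together hold roughly 70 billion such values.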