Model Training vs Model Inference

With the recent advances in LLMs, many customers are interested in using LLMs, sometimes fine-tuning them, and even building their own.

In this short article I will outline the difference between model training and model inference:

Model Training:

  • Compare it to teaching a new employee. You provide them with extensive historical data (the training data), and they learn patterns, knowledge, and skills from this information.

  • Training is a one-time, resource-intensive process that requires significant computational power and time (see the sketch after this list).
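
To make the training side concrete, here is a minimal fine-tuning sketch in Python using the Hugging Face transformers and datasets libraries. The gpt2 checkpoint, the company_docs.txt file, and the hyperparameters are illustrative assumptions rather than a recipe; the point is that the learning step, trainer.train(), is where the heavy compute goes.

```python
# Minimal fine-tuning sketch (assumptions: gpt2 checkpoint, a local
# company_docs.txt file, toy hyperparameters).
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          Trainer, TrainingArguments,
                          DataCollatorForLanguageModeling)

model_name = "gpt2"  # placeholder; any causal LM checkpoint works
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# The "historical data" the model learns patterns from
dataset = load_dataset("text", data_files={"train": "company_docs.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True,
                                 remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="tuned-model",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)

trainer.train()                 # the slow, compute-heavy part
trainer.save_model("tuned-model")
```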

Model Hosting (or Inference):

  • This is the new employee (now the LLM) doing their job day to day, applying what they learned during training to new, unseen work (user prompts).

  • Inference is much faster and less resource-intensive than training. It's the practical use of the model to generate predictions, answers, or content based on new inputs (see the sketch after this list).
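
And here is the inference side of the same sketch: the already-trained model is loaded once and then answers new prompts. The "tuned-model" path and the prompt are carried over from the hypothetical example above; no learning happens here, only forward passes.

```python
# Minimal inference sketch (assumes the "tuned-model" directory saved above).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("tuned-model")
model = AutoModelForCausalLM.from_pretrained("tuned-model")

prompt = "Summarize our refund policy in one sentence:"
inputs = tokenizer(prompt, return_tensors="pt")

# Forward passes only -- no gradient updates, so this is far cheaper
# and faster than training.
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```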
