14 must-know generative AI terms

Improve your generative AI vocabulary - 14 must-know terms.

  1. Generative AI: machine learning (ML) techniques that allow computer models to create new, realistic content such as images, text, audio, and video

  2. Large Models: the engine behind the power of generative AI - composed of neural networks with billions of parameters

  3. Foundation Model: reusable, modularly trained ML models that serve as generalized starting points for developing generative AI applications

  4. Large Language Model: natural language processing (NLP) system of deep neural networks with billions of parameters, trained on massive text datasets

  5. Multimodal: combining multiple modes or types of data such as text, images, speech, and video to build integrated AI models and applications

  6. Fine-Tuning: the process of taking a pre-trained model and adapting it to a downstream task by updating the weights through additional training on a smaller, task-specific dataset

  7. Parameter-Efficient Fine-Tuning (PEFT): adapts a pre-trained model to new tasks using limited task-specific data, without substantially changing the original model's parameters (see the LoRA-style sketch after this list)

  8. Prompt Engineering: process of constructing effective natural language prompts to provide as inputs to LLMs and guide them toward the desired task or response (a minimal template sketch appears after this list)

  9. RLHF (Reinforcement Learning from Human Feedback): trains AI by letting humans positively or negatively reinforce its behavior in an interactive loop

  10. Embedding: vector (an array of numbers) representation of an entity like a word, image, or video that encodes key information

  11. Vector Database: repository of vector representations of entities like words, images, documents, and videos that enables similarity search and other vector-based operations (see the similarity-search sketch after this list)

  12. Agent: system that uses a foundation model to reason, observe, plan, and act, accelerating the delivery of generative AI applications (see the agent-loop sketch after this list)

  13. Token: discrete atomic unit of data like a word, image patch, or audio snippet used as input to or output from a foundation model - generally 750 English words is around 1,000 tokens (see the token-estimate sketch after this list)

  14. Context length: the maximum number of tokens provided as context to a model when making predictions or generations - typically a 4K-token context holds a few pages of text, while a 100K-token context holds a novel
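
To make terms 6 and 7 concrete, here is a minimal numpy sketch of the LoRA idea behind many PEFT methods: the pre-trained weight matrix stays frozen, and only two small low-rank matrices are trained. The shapes and names (`W`, `A`, `B`, `r`) are illustrative, not taken from any particular library.

```python
import numpy as np

# Frozen pre-trained weight matrix (stands in for one layer of a large model).
d_in, d_out, r = 512, 512, 8          # r is the low-rank bottleneck
rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))    # frozen: never updated during fine-tuning

# Trainable low-rank adapter: B @ A has the same shape as W, but only
# (d_out*r + r*d_in) parameters instead of d_out*d_in.
A = rng.normal(scale=0.01, size=(r, d_in))   # trained
B = np.zeros((d_out, r))                     # trained; zero init leaves W unchanged at start

def forward(x):
    # Effective weights are W + B @ A; only A and B receive gradient updates.
    return (W + B @ A) @ x

x = rng.normal(size=d_in)
print(forward(x).shape)                                    # (512,)
print("full params:", W.size, "adapter params:", A.size + B.size)
```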
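Term 8 in practice often amounts to assembling a structured prompt string. A minimal few-shot template sketch; the sentiment task and the example reviews are invented for illustration:

```python
def build_prompt(examples, query):
    """Assemble a few-shot classification prompt from (text, label) examples."""
    shots = "\n".join(f"Review: {text}\nSentiment: {label}" for text, label in examples)
    return (
        "Classify the sentiment of each review as positive or negative.\n\n"
        f"{shots}\n"
        f"Review: {query}\nSentiment:"
    )

prompt = build_prompt(
    [("Loved every minute of it.", "positive"),
     ("Dull and far too long.", "negative")],
    "Surprisingly good soundtrack.",
)
print(prompt)  # this string would be sent to an LLM as its input
```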
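Terms 10 and 11 together: a toy similarity search over a handful of hand-made 4-dimensional embeddings. Real systems get their vectors from an embedding model and store them in a dedicated vector database; the numbers and entity names here are purely illustrative.

```python
import numpy as np

# Toy "vector database": entity -> embedding (real embeddings typically have
# hundreds or thousands of dimensions, not 4).
db = {
    "cat":   np.array([0.9, 0.1, 0.0, 0.2]),
    "dog":   np.array([0.8, 0.2, 0.1, 0.3]),
    "car":   np.array([0.1, 0.9, 0.8, 0.0]),
    "truck": np.array([0.0, 0.8, 0.9, 0.1]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def nearest(query_vec, k=2):
    # Rank stored entities by cosine similarity to the query embedding.
    return sorted(db, key=lambda name: cosine(db[name], query_vec), reverse=True)[:k]

query = np.array([0.85, 0.15, 0.05, 0.25])  # pretend this embeds the word "kitten"
print(nearest(query))                       # ['cat', 'dog']
```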
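Term 12's reason/observe/plan/act cycle, reduced to its skeleton. The `llm_decide` stub stands in for a real model call, and the tool names and hard-coded logic are invented so the loop runs on its own:

```python
# Hypothetical tools the agent can act with.
TOOLS = {
    "calculator": lambda expr: str(eval(expr)),  # toy only: never eval untrusted input
    "finish": lambda answer: answer,
}

def llm_decide(goal, observations):
    """Stub for an LLM call that plans the next action from what it has observed."""
    if not observations:
        return ("calculator", "17 * 24")                 # plan: compute first
    return ("finish", f"17 * 24 = {observations[-1]}")   # then answer

def run_agent(goal, max_steps=5):
    observations = []
    for _ in range(max_steps):                     # reason -> act -> observe loop
        tool, arg = llm_decide(goal, observations)  # reason / plan
        result = TOOLS[tool](arg)                   # act
        if tool == "finish":
            return result
        observations.append(result)                 # observe
    return "gave up"

print(run_agent("What is 17 * 24?"))               # 17 * 24 = 408
```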
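Term 13's rule of thumb (roughly 100 tokens per 75 English words) as arithmetic. For exact counts you would run a real tokenizer; this word-count heuristic is only a rough estimate for English prose:

```python
def estimate_tokens(text):
    """Rough token estimate for English text: ~100 tokens per 75 words."""
    words = len(text.split())
    return round(words * 100 / 75)     # i.e. ~1,000 tokens per ~750 words

paragraph = "The quick brown fox jumps over the lazy dog. " * 20  # 180 words
print(estimate_tokens(paragraph))      # 240 -> well inside a 4K-token context window
```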
