ChatGLM Prompt Engineering Course Notes (1) - Background
Notes from the ChatGLM prompt engineering training course.
What is an LLM?
A large language model (LLM) is a type of artificial intelligence (AI) algorithm that uses deep learning techniques and massively large data sets to understand, summarize, generate, and predict new content. The term generative AI is also closely connected with LLMs, which are, in fact, a type of generative AI specifically architected to generate text-based content.
The sheer scale of their parameters is the most distinctive feature of today’s LLMs. For example:
- ChatGLM - 130B.
- ChatGPT - 175B.
Why are LLMs so incredible?
Capabilities that do not exist in smaller models emerge once the parameter count reaches hundreds of billions.
The Power of Few-Shot
Few-shot prompting means giving the model a handful of examples so that it can produce an answer that better matches our needs; a minimal sketch follows the list of pros below.
Pros:
- Fewer Examples to Learn From: an LLM can learn and understand a task from just a few given examples. Compared with traditional machine learning, few-shot prompting needs far less data, which lowers the cost of collecting and processing it.
- Generalization Capability: an LLM can extract common features and patterns from a small number of examples and apply them to new, similar tasks. It can also reason about and classify new problems using the experience and language understanding it gained from training on large-scale datasets.
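To make this concrete, here is a minimal Python sketch of a few-shot prompt; the example reviews and the `build_few_shot_prompt` helper are invented for illustration and are not part of the course material.

```python
# Minimal few-shot prompt sketch: two labelled examples precede the new input,
# so the model can infer the task format without any fine-tuning.
# The reviews below are made-up illustrations.

FEW_SHOT_PROMPT = """Classify the sentiment of each review as Positive or Negative.

Review: The battery lasts two full days and the screen is gorgeous.
Sentiment: Positive

Review: It stopped working after a week and support never replied.
Sentiment: Negative

Review: {review}
Sentiment:"""


def build_few_shot_prompt(review: str) -> str:
    """Insert the new review after the worked examples."""
    return FEW_SHOT_PROMPT.format(review=review)


if __name__ == "__main__":
    print(build_few_shot_prompt("Great value for the price, would buy again."))
```

The resulting string can be sent to any completion-style API; the model only has to continue the pattern set by the two examples.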
Chain of Thought Prompt
When the examples in a prompt include a chain of thought, the LLM can produce a more reasonable answer together with an explanation of its reasoning. Spelling out the reasoning steps generally leads to more accurate results.
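A small sketch of what such a prompt can look like, assuming plain string prompts; the arithmetic problem and its worked reasoning are illustrative only.

```python
# Chain-of-thought prompt sketch: the worked example spells out the reasoning
# steps, nudging the model to explain its own steps before the final answer.
# The questions below are made up for illustration.

COT_PROMPT = """Q: A shop sells pens in packs of 12. If I buy 3 packs and give away 7 pens,
how many pens do I have left?
A: 3 packs contain 3 * 12 = 36 pens. Giving away 7 leaves 36 - 7 = 29 pens.
The answer is 29.

Q: {question}
A:"""


def build_cot_prompt(question: str) -> str:
    """Append the new question after the worked, step-by-step example."""
    return COT_PROMPT.format(question=question)


if __name__ == "__main__":
    print(build_cot_prompt(
        "A train has 8 cars with 40 seats each. 25 seats are empty. "
        "How many seats are taken?"
    ))
```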
What can an LLM do?
Text Generation
Text Extraction
Text Translation
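As a rough illustration of the three task types above (the wording of each prompt is my own, not from the course):

```python
# One illustrative prompt per task type; these are just strings you would send
# to whichever LLM you use.

prompts = {
    "generation": "Write a three-sentence product description for a solar-powered desk lamp.",
    "extraction": (
        "Extract every person name and date from the following text and return them "
        "as a JSON list:\n\nAlice met Bob on 2023-05-04 in Berlin."
    ),
    "translation": "Translate the following sentence into French: 'The meeting starts at noon.'",
}

for task, prompt in prompts.items():
    print(f"--- {task} ---\n{prompt}\n")
```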
What is Prompt?
A prompt is an instruction given to an LLM that directs the model to generate an appropriate answer.
What is Prompt Engineering?
Prompt engineering is similar to machine learning: both are iterative processes.
Rules of Writing a Prompt
Clear and Unambiguous Instructions
For example, "Please tell me what AI is." is much better than "Talk about technology."
You can always include a few examples in your prompt to give the model a better understanding of the task.
Let the Model Think
You should minimize conflicting requirements in your prompt and divide the task into stages, which gives the model more room to think. Adding a chain of thought also improves the quality of its answers.
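A sketch of what splitting a task into stages can look like, assuming a generic `complete(prompt)` wrapper around whatever LLM API is available; the stub and the stage wording below are illustrative, not from the course.

```python
# Staged prompting sketch: instead of one large prompt, the task is split into
# two smaller calls so each prompt has a single, unambiguous job.

def complete(prompt: str) -> str:
    """Stub standing in for a real LLM call (e.g. ChatGLM or ChatGPT)."""
    return f"<model answer to: {prompt[:40]}...>"


def summarize_article(article: str) -> str:
    # Stage 1: ask only for the key points.
    key_points = complete(f"List the key points of the following article:\n\n{article}")
    # Stage 2: turn the key points into a short summary.
    return complete(f"Write a two-sentence summary based on these key points:\n\n{key_points}")


if __name__ == "__main__":
    print(summarize_article("Large language models learn statistical patterns from huge text corpora..."))
```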
Components of a Prompt
Component | Required? | Contents |
---|---|---|
Context | Optional | Role / Task / Knowledge |
Instruction | Required | Steps / Chain-of-Thought / Examples |
Input Data | Required | Sentences / Articles / Questions |
Output Indicator | Optional | Anything |
For example:
You are a text classification model that classifies movie reviews into two categories: positive and negative.
Please categorize the text based on the following context and input, and give the corresponding output category.
Example:
Input Text: This movie was fantastic! The actors did a great job, the plot was gripping, and it is highly recommended!
Output category: Positive
Input Text: This movie is wonderful! The actors are great, the plot is exciting, and it's highly recommended!
Output category:
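Here is a small sketch of how that example maps onto the four components in the table above; the `complete` stub stands in for whichever LLM API you actually call, and is not a real client.

```python
# Assemble the movie-review prompt from its components:
# Context, Instruction (with an example), Input Data, and Output Indicator.

def complete(prompt: str) -> str:
    """Stub for a real LLM call; a real model should answer 'Positive' here."""
    return "Positive"


context = ("You are a text classification model that classifies movie reviews "
           "into two categories: positive and negative.")
instruction = ("Please categorize the text based on the following context and input, "
               "and give the corresponding output category.\n\n"
               "Example:\n"
               "Input text: This movie was fantastic! The actors did a great job, "
               "the plot was gripping, and it is highly recommended!\n"
               "Output category: Positive")
input_data = ("Input text: This movie is wonderful! The actors are great, "
              "the plot is exciting, and it's highly recommended!")
output_indicator = "Output category:"

prompt = "\n\n".join([context, instruction, input_data, output_indicator])
print(complete(prompt))
```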