POSTS / ChatGLM Prompt Engineering Course Notes (3) - Advanced
Published: 2023-12-13
Notes from ChatGLM 大模型应用构建 & Prompt 工程. Last Note is Here #4 #6.
Basic Knowledge
To-C vs. To-B
| To C | To B |
|---|---|
| General Assistant | Specialized Tool |
| Prompt Exploring | Prompt Engineering |
| Content First | Format First |
Pain Points of To-B
- How to make LLM embedded in the process?
- How to break down the workflow into actions?
Model & System
- Manage user’s input so that model can understand better.
- Manage model’s output so that it can interact with system well.
- Make model focus on one specific kind of issues.
- Make system control model for better questions.
- Multiple model to co-work with each other.
- Split long context to small pieces then summary (kind of like Map-Reduce).
Some Methods
- Use flowchart to get a global view of whole process.
- Evaluate the result then enhance.
- Better test data makes better evaluation.
Prompt Engineering
Background
- Lack of systematization and more relies on personal experience.
- Hard to modify a prompt shared by others.
- Depends on input data and needs evaluation tools.
- Not reusable across models.
Techniques
Zero Shot
Few Shot
Chain of Thought
Zero Shot + CoT
Parameters
Temperature
Control the randomness of the output. Higher temperature gives more diverse answers.
Top-P
Kind of like Temperature.
Tricks
Splitter
- Avoid other contents disturb the instruction for model.
- Separate different contexts and modules.
- Avoid irrelevant instructions.
Sub-Entry
- Help model think step by step following the instructions.
- Entry-by-entry writing & testing and iterative maintenance.
Knowledge Injection
Improve model’s accuracy and timeliness.
Formatted Output
- May avoid overly diffuse and fabricated contents.
- Easy to interact with other tools.
Avoid Peculiar Logic
Peculiar logic may confuse model.
Avoid Regularized Conversions
Leave it for regular program.
Nothing means None
Don’t force model to output.