Prompt Engineering Module 6
Introduction to Prompt Engineering
Improving Prompts
Prompts are the instructions you give to a generative AI app; they tell the app what you want it to
do. The better your prompts, the better the results you will get from the app.
LLMs are trained on massive datasets of text and code. This data is used to teach the LLM how to
generate text, translate languages, write different kinds of creative content, and answer your questions in
an informative way.
When you give an LLM a prompt, it first processes the prompt by breaking it down into individual words
and phrases (tokens). It then looks for patterns in those words and phrases and uses the patterns it
learned during training to generate a response that is consistent with them.
For example, if you give the LLM the prompt "Write me a poem about a cat", it will first break the prompt
down into the words "write", "me", "poem", and "cat". It will then look for patterns in these words. For
example, it will notice that the words "write" and "poem" are often used together. This pattern tells the
LLM that you are asking it to generate a poem.
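To make this concrete, here is a toy sketch in Python of the idea described above: splitting a prompt into individual words and spotting a simple pattern such as "write" plus "poem". This is only an illustration; real LLMs use learned subword tokenizers and neural networks rather than hand-written rules, and the function names here are made up for the example.

import re

# Toy illustration only: real LLMs use learned subword tokenizers and neural
# networks, not hand-written rules like these.

def split_into_words(prompt):
    # Lowercase the prompt and split it into individual words.
    return re.findall(r"[a-z']+", prompt.lower())

def detect_task(words):
    # Guess the task from a simple word pattern such as "write" + "poem".
    if "write" in words and "poem" in words:
        return "generate a poem"
    if "translate" in words:
        return "translate text"
    return "answer a question"

words = split_into_words("Write me a poem about a cat")
print(words)               # ['write', 'me', 'a', 'poem', 'about', 'a', 'cat']
print(detect_task(words))  # generate a poem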
The LLM will then use this pattern to generate a response that is consistent with it, such as a short
poem about a cat.
Such a poem is consistent with the patterns the LLM has learned: the words "cat", "furry", "eyes", "tail",
"play", "mice", "sleep", and "sun" are all words that are often associated with cats.
The LLM is able to process prompts literally because it is trained on a massive dataset of text and code
that contains a huge variety of patterns. When you give the LLM a prompt, it draws on those learned
patterns to generate a response that is consistent with the prompt.
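As a rough intuition for what "patterns in text" means, the toy sketch below counts which words appear together in a tiny made-up corpus of cat sentences. A real LLM learns far richer statistical patterns with neural networks trained on billions of documents; this example only conveys the idea of word associations.

from collections import Counter
from itertools import combinations

# A tiny made-up corpus; real training data spans billions of documents.
corpus = [
    "the furry cat chased the mice",
    "the cat likes to sleep in the sun",
    "my cat has green eyes and a long tail",
]

# Count how often each pair of distinct words appears in the same sentence.
pair_counts = Counter()
for sentence in corpus:
    words = sorted(set(sentence.split()))
    for pair in combinations(words, 2):
        pair_counts[pair] += 1

# Show the words most strongly associated with "cat" in this corpus.
cat_pairs = [(pair, n) for pair, n in pair_counts.items() if "cat" in pair]
print(sorted(cat_pairs, key=lambda item: -item[1])[:5])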
However, it is important to note that LLMs are not perfect. They can sometimes make mistakes, especially
if the prompt is not clear or concise. Additionally, LLMs are still under development, and their capabilities
are constantly evolving. As LLMs continue to develop, they will become better at processing prompts
literally.
By applying these ideas, you can improve your prompts and get better results from generative AI apps.