Note 5 — Prompt Engineering for Developers

Topic: Inferring

Mars Wang
3 min read · Apr 23, 2024
Photo by Markus Winkler on Unsplash

Analyzing Input Content

Inferring means analyzing the input content: for example, extracting labels or names, or understanding the sentiment of the text.

The method we used to understand content

Under the traditional machine-learning methodology, we need to train a separate model for each task:

Collecting a labeled dataset → Training a model → Deploying the model on the cloud

=> Cons: it takes a lot of time to collect datasets and train models.

However, one of the advantages of an LLM is that you can write a prompt and get results right away!

Let's get into some examples to show what we're talking about:

Example 1 — sentiment of the text
Suppose we have a review of a lamp. As the seller of the product, we would like to understand and summarize the client's review.

The raw review of the lamp
  • Sentiment:
Utilizing a prompt to infer the sentiment of the review
  • Emotions:
Listing the emotions conveyed in the review
  • Does the reviewer express any negative emotion?
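As a rough sketch, the sentiment question in Example 1 can be asked with a single prompt. The review text, delimiter style, model name, and helper function below are illustrative assumptions, not the post's exact code:

```python
# Hypothetical example: inferring sentiment with one prompt instead of
# training a dedicated sentiment model.
lamp_review = """
Needed a nice lamp for my bedroom, and this one arrived quickly.
The string to the lamp broke during shipping, but the company
happily sent over a new one. Easy to assemble and very bright!
"""

def build_sentiment_prompt(review: str) -> str:
    # One prompt replaces the collect-labels -> train -> deploy pipeline.
    return (
        "What is the sentiment of the following product review?\n"
        "Answer with a single word: 'positive' or 'negative'.\n\n"
        f"Review: ```{review}```"
    )

prompt = build_sentiment_prompt(lamp_review)
# The prompt would then be sent to an LLM, e.g. via the OpenAI API:
# response = client.chat.completions.create(
#     model="gpt-3.5-turbo",
#     messages=[{"role": "user", "content": prompt}],
#     temperature=0,
# )
print(prompt)
```

The same pattern works for the emotions and negative-emotion questions: only the instruction sentence changes, not the pipeline.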

Example 2 — Information Extraction:
We use the same review as in Example 1 and ask the model to extract the important information, so that the merchant can learn about their product. Finally, the output should be formatted so that it is ready for analysis.

A single prompt can extract everything we need, including sentiment and item information
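One way to make the output ready for analysis is to ask for JSON and parse it. This is a sketch; the field names ("Sentiment", "Item", "Brand") and the sample model response are illustrative assumptions:

```python
import json

# Hypothetical example: one prompt extracts several fields at once and
# requests machine-readable JSON output.
def build_extraction_prompt(review: str) -> str:
    return (
        "Identify the following items from the review text:\n"
        "- Sentiment (positive or negative)\n"
        "- Item purchased by the reviewer\n"
        "- Company that made the item\n\n"
        "Format your response as a JSON object with the keys "
        '"Sentiment", "Item", and "Brand". '
        'Use "unknown" if the information is not present.\n\n'
        f"Review: ```{review}```"
    )

# A typical model response (invented here for illustration) can then be
# parsed directly into a Python dict for analysis:
sample_response = '{"Sentiment": "positive", "Item": "lamp", "Brand": "unknown"}'
extracted = json.loads(sample_response)
print(extracted["Sentiment"])  # positive
```

Because the response is structured, downstream analysis (dashboards, aggregation over many reviews) needs no extra parsing logic.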

Example 3 — Inferring a Story

Summarize the story's topics in one or two words each.

In Example 3, the model can list the topics we care about and categorize the story accordingly. Finally, we use Python to process the result:

Using topic_list to define the topics we care about and using Python to classify the story
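A minimal sketch of that topic check, assuming the model is asked to answer with 0/1 flags for each topic (the topic names and the sample response string are illustrative assumptions):

```python
# Hypothetical example: zero-shot topic classification with a topic_list,
# then plain Python to turn the model's answer into a usable dict.
topic_list = ["nasa", "local government", "engineering",
              "employee satisfaction", "federal government"]

def build_topic_prompt(story: str, topics: list) -> str:
    return (
        "Determine whether each topic in the list below appears in the "
        "following text. Give your answer as a comma-separated list of "
        "0 or 1 for each topic, in order.\n\n"
        f"List of topics: {', '.join(topics)}\n\n"
        f"Text: ```{story}```"
    )

def parse_topic_response(response: str, topics: list) -> dict:
    # Turn a reply like "1, 0, 0, 1, 1" into {"nasa": 1, ...}
    flags = [int(x.strip()) for x in response.split(",")]
    return dict(zip(topics, flags))

# Pretend the model replied "1, 0, 0, 1, 1" for a given story:
topic_dict = parse_topic_response("1, 0, 0, 1, 1", topic_list)
if topic_dict["nasa"] == 1:
    print("ALERT: new NASA story!")
```

Notice that the model is never shown a labeled example of any topic; the prompt alone defines the classification task.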

The technique in the example above is called "zero-shot learning": we give the model no labeled examples at all. We will discuss this topic in a future post. Stay tuned.
