Valentine's gift idea using RAG: Roided-out LLMs

Ritesh Shergill
4 min read · Feb 13, 2024

It's Valentine's Day again. (The bane of a lover's existence.)

For on this fated day, the perfect gift must be chosen for your lover, or you might incur a fury like Hell hath none. 👿👿👿

You browse some sites for quick gifting ideas but nothing seems to be good enough!

You zero in on a couple of options. Will she like it?

Will she or won't she? 😓😓😓

Why kill yourself? Make the easy choice: teach an LLM to pick the gift for you.

I browsed the internet for sites with good gifting ideas. Then I shamelessly pulled the HTML content, scrubbed it for my purposes, and shoved the text into PDF files.

Hint: BeautifulSoup
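The scraping step is glossed over above. Here's a minimal, dependency-free sketch of the "pull the HTML and scrub it" part using only the standard library's html.parser (the post uses BeautifulSoup, which does the same job with less code); the HTML snippet is a made-up stand-in for a real gifting site:

```python
from html.parser import HTMLParser

class TextScrubber(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.skip = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

# Toy page standing in for a scraped gifting site
html = ("<html><body><h1>Gift ideas</h1>"
        "<script>var x = 1;</script>"
        "<p>Flowers and dinner.</p></body></html>")

scrubber = TextScrubber()
scrubber.feed(html)
text = "\n".join(scrubber.chunks)
print(text)  # → Gift ideas\nFlowers and dinner.
```

From here the cleaned text just needs to be written into PDFs (any PDF library will do) so the loader in Step 3 can pick them up.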

Now that I have the PDF files, what next?

Well, here's a recipe to get your favorite model generating gifting ideas.

๐—ฆ๐˜๐—ฒ๐—ฝ ๐Ÿญ : ๐—”๐—ฑ๐—ฑ ๐—ฎ ๐—ฏ๐˜‚๐—ป๐—ฐ๐—ต ๐—ผ๐—ณ ๐—Ÿ๐—ฎ๐—ป๐—ด๐—ฐ๐—ต๐—ฎ๐—ถ๐—ป ๐—ฑ๐—ฒ๐—ฝ๐—ฒ๐—ป๐—ฑ๐—ฒ๐—ป๐—ฐ๐—ถ๐—ฒ๐˜€

import os
from langchain.document_loaders import PyPDFDirectoryLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.retrievers.self_query.base import SelfQueryRetriever
from langchain.vectorstores import Chroma
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.llms import OpenAI
from langchain.chains.query_constructor.base import AttributeInfo
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

๐—ฆ๐˜๐—ฒ๐—ฝ ๐Ÿฎ : ๐—š๐—ฒ๐˜ ๐—ฎ๐—ป ๐—ข๐—ฝ๐—ฒ๐—ป ๐—”๐—œ ๐—ธ๐—ฒ๐˜† ๐—ฎ๐—ป๐—ฑ ๐—น๐—ผ๐—ฎ๐—ฑ ๐—ถ๐˜ ๐—ถ๐—ป๐˜๐—ผ ๐˜๐—ต๐—ฒ ๐—ฒ๐—ป๐˜ƒ

with open("./openai-key.txt") as oakf:
    # strip() guards against a trailing newline in the key file
    os.environ["OPENAI_API_KEY"] = oakf.read().strip()

๐—ฆ๐˜๐—ฒ๐—ฝ ๐Ÿฏ : ๐—Ÿ๐—ผ๐—ฎ๐—ฑ ๐˜๐—ต๐—ฒ ๐—ฝ๐—ฑ๐—ณ๐˜€

loader = PyPDFDirectoryLoader("data")
data = loader.load()

๐—ฆ๐˜๐—ฒ๐—ฝ ๐Ÿฐ : ๐—ฆ๐—ฝ๐—น๐—ถ๐˜ ๐˜๐—ต๐—ฒ ๐—ฐ๐—ผ๐—ป๐˜๐—ฒ๐—ป๐˜ ๐˜‚๐˜€๐—ถ๐—ป๐—ด ๐—ฅ๐—ฒ๐—ฐ๐˜‚๐—ฟ๐˜€๐—ถ๐˜ƒ๐—ฒ๐—ง๐—ฒ๐˜…๐˜๐—ฆ๐—ฝ๐—น๐—ถ๐˜๐˜๐—ฒ๐—ฟ

r_splitter = RecursiveCharacterTextSplitter(
    chunk_size=450,      # max characters per chunk
    chunk_overlap=0,     # no shared text between consecutive chunks
    separators=["\n\n", "\n", " "]
)

splits = r_splitter.split_documents(data)
print(splits[0])
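To see roughly what the splitter is doing, here's a hand-rolled approximation (not LangChain's actual algorithm, which recurses through the separator list): pick the coarsest separator present in the text, then greedily pack the pieces into chunks of at most chunk_size characters.

```python
def simple_split(text, chunk_size=450, separators=("\n\n", "\n", " ")):
    """Greedy approximation of recursive character splitting."""
    # Use the coarsest separator that actually occurs in the text.
    sep = separators[-1]
    for s in separators:
        if s in text:
            sep = s
            break
    chunks, current = [], ""
    for piece in text.split(sep):
        candidate = piece if not current else current + sep + piece
        if current and len(candidate) > chunk_size:
            chunks.append(current)   # current chunk is full; start a new one
            current = piece
        else:
            current = candidate
    if current:
        chunks.append(current)
    return chunks

print(simple_split("one two three four five", chunk_size=10))
# → ['one two', 'three four', 'five']
```

Small chunks (here, 450 characters) keep each embedding focused on one idea, which makes the similarity search in Step 6 more precise.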

๐—ฆ๐˜๐—ฒ๐—ฝ ๐Ÿฑ : ๐—š๐—ฒ๐—ป๐—ฒ๐—ฟ๐—ฎ๐˜๐—ฒ ๐—ฒ๐—บ๐—ฏ๐—ฒ๐—ฑ๐—ฑ๐—ถ๐—ป๐—ด๐˜€ ๐˜๐—ผ ๐—ฑ๐˜‚๐—บ๐—ฝ ๐—ถ๐—ป๐˜๐—ผ ๐—ฎ ๐—ฉ๐—ฒ๐—ฐ๐˜๐—ผ๐—ฟ ๐—ฑ๐—ฎ๐˜๐—ฎ๐—ฏ๐—ฎ๐˜€๐—ฒ (๐—–๐—ต๐—ฟ๐—ผ๐—บ๐—ฎ ๐——๐—•)

embedding = OpenAIEmbeddings()

# save chroma db embeddings in this directory
persist_directory = 'docs/chroma/'

# Create the vector store
vectordb = Chroma.from_documents(
    documents=splits,
    embedding=embedding,
    persist_directory=persist_directory
)

๐—ฆ๐˜๐—ฒ๐—ฝ ๐Ÿฒ : ๐—™๐—ถ๐—ป๐—ฑ ๐˜€๐—ถ๐—บ๐—ถ๐—น๐—ฎ๐—ฟ ๐—ฟ๐—ฒ๐—น๐—ฎ๐˜๐—ฒ๐—ฑ ๐—ฐ๐—ผ๐—ป๐˜๐—ฒ๐—ป๐˜ ๐—ณ๐—ฟ๐—ผ๐—บ ๐˜๐—ต๐—ฒ ๐—ฑ๐—ฏ

question = "What would be a memorable valentine gift for a woman aged 39 years?"

docs = vectordb.similarity_search(question, k=2)  # the 2 closest chunks
vectordb.persist()  # flush the embeddings to disk for reuse
content = docs[0].page_content

print(content)

I get:

Valentine's Day flowers

and a romantic homemade dinner.

So it seems from my knowledge base that flowers and a romantic homemade dinner might be a good idea.
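Under the hood, similarity_search is just nearest-neighbour lookup over embedding vectors: embed the query, then rank the stored chunks by cosine similarity. A toy sketch with made-up 3-dimensional "embeddings" (real OpenAI embeddings have 1,536 dimensions):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Pretend vector store: chunk text -> fake embedding
store = {
    "Valentine's Day flowers and a romantic homemade dinner.": [0.9, 0.1, 0.2],
    "Best lawnmowers of 2024.": [0.1, 0.8, 0.3],
}

query_vec = [0.85, 0.15, 0.25]  # pretend embedding of the gift question
ranked = sorted(store, key=lambda doc: cosine(store[doc], query_vec), reverse=True)
print(ranked[0])  # → the flowers-and-dinner chunk
```

The chunk whose vector points in nearly the same direction as the query wins, which is why the gift question pulls back the flowers-and-dinner text rather than, say, lawnmowers.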

๐—ฆ๐˜๐—ฒ๐—ฝ ๐Ÿณ : ๐—•๐˜‚๐—ถ๐—น๐—ฑ ๐—ฎ ๐—ฟ๐—ฒ๐˜๐—ฟ๐—ถ๐—ฒ๐˜ƒ๐—ฒ๐—ฟ ๐˜๐—ผ ๐—พ๐˜‚๐—ฒ๐—ฟ๐˜† ๐˜๐—ต๐—ฒ ๐—Ÿ๐—Ÿ๐—  ๐—ฎ๐—ป๐—ฑ ๐˜€๐˜‚๐—ฝ๐—ฝ๐—ผ๐—ฟ๐˜ ๐˜๐—ต๐—ฒ ๐—บ๐—ผ๐˜€๐˜ ๐—ฟ๐—ฒ๐—น๐—ฒ๐˜ƒ๐—ฎ๐—ป๐˜ ๐—ฟ๐—ฒ๐˜€๐˜‚๐—น๐˜ ๐—ณ๐—ฟ๐—ผ๐—บ ๐˜๐—ต๐—ฒ ๐—ฉ๐—ฒ๐—ฐ๐˜๐—ผ๐—ฟ ๐——๐—•

# build some metadata to support the query to the LLM
# the preferable source file of the gift
metadata_field_info = [
    AttributeInfo(
        name="source",
        description="The source of the gift. One of ['file3.pdf']",
        type="string",
    ),
]

document_content_description = "memorable gift for a woman"
llm = OpenAI(temperature=0)
retriever = SelfQueryRetriever.from_llm(
    llm,
    vectordb,
    document_content_description,
    metadata_field_info,
    verbose=True
)

#get the most relevant result for the query based on our own corpus
question = "What would be a memorable valentine gift for a woman aged 39 years?"
docs = retriever.get_relevant_documents(question)

print(docs)

I get:

page_content='Valentine's Day flowers\n \nand a romantic homemade dinner.'

So it seems (according to the knowledge base) that a combination of flowers and a homemade dinner would be an excellent choice.
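What SelfQueryRetriever adds over plain similarity search: it asks the LLM to translate the natural-language question into a structured query, a semantic search term plus a metadata filter built from metadata_field_info. Here's a hand-rolled sketch of the filtering half; the structured_query dict is a stand-in for what the LLM would actually produce:

```python
docs = [
    {"page_content": "Valentine's Day flowers and a romantic homemade dinner.",
     "metadata": {"source": "file3.pdf"}},
    {"page_content": "Ten budget office chairs reviewed.",
     "metadata": {"source": "file1.pdf"}},
]

# Stand-in for the LLM's translation of the question:
# a semantic query term plus a metadata filter.
structured_query = {
    "query": "memorable valentine gift",
    "filter": {"source": "file3.pdf"},
}

def apply_metadata_filter(documents, metadata_filter):
    """Keep only documents whose metadata matches every filter key."""
    return [d for d in documents
            if all(d["metadata"].get(k) == v for k, v in metadata_filter.items())]

filtered = apply_metadata_filter(docs, structured_query["filter"])
print(filtered[0]["page_content"])
```

The semantic half of the query is then run as an ordinary similarity search over just the filtered documents.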

๐—ฆ๐˜๐—ฒ๐—ฝ ๐Ÿด : ๐—Ÿ๐—ฒ๐˜๐˜€ ๐—ฏ๐˜‚๐—ถ๐—น๐—ฑ ๐˜๐—ต๐—ฒ ๐—ฝ๐—ฟ๐—ผ๐—บ๐—ฝ๐˜

template = """Given a person with the following profile:
{user_profile}
Use the following pieces of context to answer the question at the end.
{context}
Question: {question}
Helpful Answer:"""

user_profile = {
    "age": 39,
    "gender": "female",
    "interests": ["reading", "gardening", "music", "clothes", "outdoors"],
    "profession": ["teacher"]
}

# add context to the query template
prompt = PromptTemplate(template=template, input_variables=["user_profile", "context", "question"])

๐—ฆ๐˜๐—ฒ๐—ฝ ๐Ÿต : ๐—Ÿ๐—ฒ๐˜๐˜€ ๐—ณ๐—ถ๐—ป๐—ฎ๐—น๐—น๐˜† ๐—ฎ๐˜€๐—ธ ๐—–๐—ต๐—ฎ๐˜๐—š๐—ฃ๐—ง

llm = ChatOpenAI(model_name="gpt-3.5-turbo-0125", temperature=0.0002)
llm_chain = LLMChain(prompt=prompt, llm=llm)

question = "What would be a memorable valentine's day gift for my wife aged 39?"

generated = llm_chain.run(user_profile=user_profile, context=docs, question=question)

# generate results
print("Gift ideas: " + generated)

And I get:

Gift ideas: Based on the profile of your wife, a memorable Valentine's Day gift could be a combination of things she enjoys such as a book from her favorite author, a new plant or gardening tool for her garden, a vinyl record of her favorite music artist, a stylish piece of clothing, or a gift card for outdoor activities or experiences. You could also consider planning a romantic homemade dinner or surprising her with Valentine's Day flowers. Ultimately, the most memorable gift will be something thoughtful and personalized to her interests and preferences.

I actually followed the advice.

What do you think?

Which gift did I get her?

And did she like the gift?

Follow me Ritesh Shergill

for more articles on

๐Ÿ‘จโ€๐Ÿ’ป Tech

๐Ÿ‘ฉโ€๐ŸŽ“ Career advice

๐Ÿ“ฒ User Experience

๐Ÿ† Leadership

I also do

โœ… Career Guidance counselling โ€” https://topmate.io/ritesh_shergill/149890

โœ… Mentor Startups as a Fractional CTO โ€” https://topmate.io/ritesh_shergill/193786


Ritesh Shergill

Cybersec and Software Architecture Consultations | Career Guidance | Ex Vice President at JP Morgan Chase | Startup Mentor | Angel Investor | Author