AI Tavily
Tavily is not an AI itself, but a search engine powered by AI. It’s designed to be a more intuitive and powerful alternative to traditional keyword-based search engines like Google.
Here’s how Tavily uses AI:
- Natural Language Processing (NLP): Tavily understands the meaning and intent behind your search queries, allowing you to search using natural language like you would ask a question to a person.
- Contextual Understanding: It remembers your previous searches and uses them to refine future results, providing more relevant information.
- Source Analysis: Tavily analyzes and ranks sources based on credibility and reliability, helping you find trustworthy information.
Here are some key features of Tavily:
- Ask in your own words: You can ask questions naturally, just like you would with a human.
- Get direct answers: Tavily aims to provide concise answers to your questions, often pulling information directly from the most relevant source.
- Explore different perspectives: It gathers information from various sources to offer a well-rounded view on the topic.
- Access credible sources: Tavily prioritizes trustworthy sources and provides information about the source’s credibility.
In essence, Tavily leverages AI to make searching for information online easier, faster, and more reliable.
HANDS-ON
The search API has two search depth options: basic and advanced. Basic search is optimized for performance, yielding faster response times. Advanced search may take longer (around 5-10 seconds per response) but optimizes for result quality.
Look out for the content field of the response. Using the ‘advanced’ search depth narrows the retrieved content down to only the most relevant passages from each site, selected by a relevance score.
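The relevance-score filtering described above can be sketched on a mocked response. The field names (url, content, score) follow the Tavily docs, but the dict below is an invented illustration, not real API output:

```python
# Mocked response in the shape the Tavily search API returns
# (url/content/score fields per the docs; the values are invented).
response = {
    "results": [
        {"url": "https://example.com/a", "content": "Highly relevant text.", "score": 0.93},
        {"url": "https://example.com/b", "content": "Loosely related text.", "score": 0.41},
    ]
}

# Keep only the results above an (arbitrary) relevance threshold.
relevant = [r for r in response["results"] if r["score"] > 0.5]
for r in relevant:
    print(r["url"], r["score"])
```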
code sample
"""
python3 -m venv env-tavily
source env-tavily/bin/activate
# `-I` ignore the installed packages, overwriting them.
# `-U`/`--upgrade` upgrade all specified packages to the newest available version.
pip install -I tavily-python==0.3.3
# or force-reinstall the latest version:
pip install --upgrade --force-reinstall tavily-python
pip show tavily-python
pip index versions tavily-python
https://docs.tavily.com/docs/tavily-api/python-sdk
"""
from pprint import pprint
# debugging: uncomment `pu.db` below to drop into the pudb debugger
import pudb
# pu.db
# set up required api keys
from tavily import TavilyClient
tavily = TavilyClient(api_key="tvly-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx")
# for basic search:
response = tavily.search(query="Should I invest in Apple in 2024?")
# for advanced search:
response = tavily.search(query="Should I invest in Apple in 2024?", search_depth="advanced")
# get the search results as context to pass to an LLM:
# pu.db
context = [{"url": obj["url"], "content": obj["content"]} for obj in response['results']]
# pprint(context)
# You can also get search-result context trimmed to a token budget, ready to feed
# straight into your RAG pipeline. The response is a string containing the context
# within the max_tokens limit.
tavily.get_search_context(query="What happened in the burning man floods?", search_depth="advanced", max_tokens=1500)
# You can also get a simple answer to a question including relevant sources all with a simple function call:
print(tavily.qna_search(query="""
Where does Messi play right now?
"""))
Resources
- A blog post on how to use Tavily and Groq for AI-powered search
- Langchain Tooling using Groq, with the API keys set Colab-style:
  os.environ["TAVILY_API_KEY"] = userdata.get("TAVILY_API_KEY")
  os.environ["GROQ_API_KEY"] = userdata.get("GROQ_API_KEY")
- Tool Calling with LangChain: tool calling enables a model to generate responses to a prompt according to a specific user-defined format or schema. For instance, to extract structured information from unstructured text, you could provide the model with an “extraction” tool whose parameters fit the desired schema. The output the model generates against that schema can then be used as the final result.
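The tool-calling flow described above can be sketched without any provider SDK: the application publishes a JSON schema for each tool, the model replies with a call matching that schema, and the application dispatches it. The schema layout below follows the common OpenAI-style convention; the tool name and arguments are illustrative, not from LangChain or Groq:

```python
import json

# An OpenAI-style tool schema the model would be given (illustrative).
extract_tool = {
    "name": "extract_person",
    "description": "Extract structured info about a person from text.",
    "parameters": {
        "type": "object",
        "properties": {
            "name": {"type": "string"},
            "team": {"type": "string"},
        },
        "required": ["name"],
    },
}

# What a model's tool call might look like: the tool name plus
# arguments serialized as JSON conforming to the schema above.
model_tool_call = {
    "name": "extract_person",
    "arguments": json.dumps({"name": "Lionel Messi", "team": "Inter Miami"}),
}

def extract_person(name, team=None):
    """The application-side function behind the "extract_person" tool."""
    return {"name": name, "team": team}

def dispatch(tool_call, tools):
    """Route a model tool call to the matching Python function."""
    args = json.loads(tool_call["arguments"])
    return tools[tool_call["name"]](**args)

result = dispatch(model_tool_call, {"extract_person": extract_person})
print(result)
```

Frameworks like LangChain automate the schema generation and dispatch steps, but the underlying contract is the same: the model's output is constrained to the tool's parameter schema.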