crewai + openai + serper.dev Web Search Crawler
May 9, 2025

Requirements:

```bash
pip install crewai openai requests
```
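
If you prefer not to hardcode credentials, here is a minimal sketch that reads the Serper key from an environment variable (the `SERPER_API_KEY` name is an assumption of this example, not something either library requires):

```python
import os

# Assumption: keys are exported in the shell before running the script.
# crewai's default OpenAI LLM reads OPENAI_API_KEY from the environment on its own;
# the Serper key is read here so search_web can send it in the X-API-KEY header.
SERPER_API_KEY = os.environ.get("SERPER_API_KEY", "your-serper-api-key")
```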

🧠 Example Code

```python
from crewai import Agent, Task, Crew
import requests
import os

# Set your OpenAI API key
os.environ["OPENAI_API_KEY"] = "sk-..."

# Optional: use Serper.dev or a similar web search API
def search_web(query):
    # Serper.dev expects a POST request with the query in the JSON body
    resp = requests.post(
        "https://google.serper.dev/search",
        headers={"X-API-KEY": "your-serper-api-key"},
        json={"q": query}
    )
    return "\n".join([r["link"] for r in resp.json().get("organic", [])])

# Define a WebCrawler agent
web_crawler = Agent(
    role="Web Crawler",
    goal="Search the internet and extract the most relevant results for a given topic.",
    backstory="You are a helpful web researcher agent that uses search tools and summarizes pages.",
    verbose=True,
    allow_delegation=False
)

# Task for the crawler
crawl_task = Task(
    description="Search the web and summarize 3 latest trends in AI startup funding.",
    expected_output="A short summary of three recent trends in AI startup funding.",
    agent=web_crawler,
    tools=[search_web]  # A Tool wrapper (shown below) gives cleaner integration
)

# Run the crew
crew = Crew(
    agents=[web_crawler],
    tasks=[crawl_task],
    verbose=True
)

result = crew.kickoff()
print(result)
```
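
The agent's backstory mentions summarizing pages, but `search_web` only returns links. As a rough illustration, here is a minimal page-fetch helper that could be wrapped as a second tool; it is a sketch only, `fetch_page_text` is a name introduced here, and a real crawler would use an HTML parser such as beautifulsoup4 instead of regex stripping:

```python
import re
import requests

def fetch_page_text(url, max_chars=2000):
    """Download a page and return a rough plain-text excerpt for summarization."""
    resp = requests.get(url, timeout=10)
    html = resp.text
    html = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)  # drop scripts and styles
    text = re.sub(r"(?s)<[^>]+>", " ", html)                    # drop remaining tags
    text = re.sub(r"\s+", " ", text).strip()                    # collapse whitespace
    return text[:max_chars]
```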

🧩 Optional: Wrap the search function as a Tool for better crew integration

```python
from crewai.tools import tool

# The @tool decorator turns a plain function into a CrewAI tool;
# the docstring is used as the tool's description.
@tool("Web Search")
def web_search_tool(query: str) -> str:
    """Searches Google via Serper.dev and returns a list of top links."""
    return search_web(query)
```
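
With the decorated tool in place, the task (or the agent) can reference it instead of the bare function; a sketch reusing the definitions from above:

```python
crawl_task = Task(
    description="Search the web and summarize 3 latest trends in AI startup funding.",
    expected_output="A short summary of three recent trends in AI startup funding.",
    agent=web_crawler,
    tools=[web_search_tool],  # the @tool-decorated function from above
)
```

crewai_tools also ships a prebuilt SerperDevTool for the same Serper.dev search, which can replace the custom function entirely if you prefer not to maintain it.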