Firecrawl MCP Upgrades: Launch Week III - Day 6

Welcome to Launch Week III, Day 6 — Firecrawl MCP Upgrades.
Today, we’re rolling out a major set of updates to our Firecrawl MCP server — our implementation of the Model Context Protocol (MCP) that powers scraping workflows with LLMs and web data.
FIRE-1 Web Action Agent Support
The Firecrawl MCP server now supports our new FIRE-1 model. This means you can now:
- Use FIRE-1 via the MCP scrape and extract endpoints (see the sketch after this list).
- Seamlessly collect data behind interaction barriers — logins, buttons, modals, and more.
- Incorporate intelligent, agent-driven scraping into any MCP-compatible tool.
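As a concrete example, here is a minimal sketch of calling the scrape tool with FIRE-1 from an MCP client, written with the TypeScript MCP SDK. The tool name (`firecrawl_scrape`), the shape of the `agent` argument, and the demo URL and prompt are assumptions for illustration and may differ slightly from the server version you run.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the Firecrawl MCP server over stdio and connect to it.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "firecrawl-mcp"],
  env: { FIRECRAWL_API_KEY: "fc-YOUR_API_KEY" }, // placeholder key
});

const client = new Client({ name: "fire-1-demo", version: "1.0.0" });
await client.connect(transport);

// Call the scrape tool with FIRE-1 agent options so the agent can click
// through buttons, modals, or login flows before returning content.
// The "agent" argument shape below is an assumption for illustration.
const result = await client.callTool({
  name: "firecrawl_scrape",
  arguments: {
    url: "https://example.com/dashboard",
    formats: ["markdown"],
    agent: {
      model: "FIRE-1",
      prompt: "Open the reports modal and return its contents.",
    },
  },
});

console.log(result.content);
await client.close();
```

The same pattern applies to the extract endpoint: swap the tool name and pass an extraction schema alongside the FIRE-1 agent options.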
Server-Sent Events (SSE)
We’ve added Server-Sent Events (SSE) support to the MCP server, making real-time communication and data flow smoother.
- SSE is now available for local use.
- This means you can plug into a running Firecrawl MCP server with minimal overhead, as sketched below.
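Here is a minimal sketch of an MCP client attaching to a locally running Firecrawl MCP server over SSE, again using the TypeScript MCP SDK. The `SSE_LOCAL=true` start command and the `http://localhost:3000/sse` endpoint are assumptions about the local setup and may need adjusting for your environment.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// Assumes a Firecrawl MCP server is already running locally in SSE mode,
// e.g. started with: env SSE_LOCAL=true FIRECRAWL_API_KEY=fc-... npx -y firecrawl-mcp
// and listening at http://localhost:3000/sse (port and path are assumptions).
const transport = new SSEClientTransport(new URL("http://localhost:3000/sse"));

const client = new Client({ name: "sse-demo", version: "1.0.0" });
await client.connect(transport);

// List the tools the running server exposes, then hand them to your LLM app.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

await client.close();
```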
Learn More
Check out our updated docs and MCP repo.
About the Author

Eric Ciarla is the Chief Operating Officer (COO) of Firecrawl and leads marketing. He also worked on Mendable.ai, selling it to companies like Snapchat, Coinbase, and MongoDB. He previously worked at Ford and Fracta as a Data Scientist. Eric also co-founded SideGuide, a tool for learning code within VS Code with 50,000 users.