Akhilesh Yadav: Being a software engineer, I love documenting my journey and sharing my learnings with other developers. I have a passion for exploring JavaScript, AWS Cloud, and much more.

Building an LLM MCP Server to update static website pages hosted with AWS S3 and CloudFront


Managing cloud-based resources efficiently is critical in modern application development. In this article, we explore how the Model Context Protocol (MCP) can be leveraged with AWS S3 and CloudFront to manage static files dynamically. We will demonstrate how MCP can facilitate seamless file updates, retrievals, and cache invalidation using an asynchronous approach.

What is Model Context Protocol (MCP)?

Model Context Protocol (MCP) is an open protocol that standardizes how LLM applications such as Claude connect to external tools and data sources. Because MCP servers can expose asynchronous tools, MCP is particularly useful for automation and serverless environments, making it a great fit for cloud-based workflows involving AWS S3 and CloudFront.

Why Use MCP with AWS S3 and CloudFront?

AWS S3 is a highly scalable object storage service used for hosting static files, whereas AWS CloudFront is a content delivery network (CDN) that accelerates the distribution of content globally. MCP can automate:

  • Uploading or updating files in an S3 bucket
  • Fetching file contents from S3
  • Purging CloudFront cache to reflect the latest updates

 

Setting Up MCP with AWS S3 and CloudFront

Prerequisites

Before proceeding, ensure you have:

  • AWS credentials (Access Key ID and Secret Access Key)
  • An S3 bucket configured for static file hosting
  • A CloudFront distribution linked to your S3 bucket
  • Python with Boto3 and asyncio installed

 

AWS Resource Creation Guide

Create a new S3 bucket and upload a sample index.html file.
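If you don't have a page handy, a minimal index.html like this is enough for testing (the content itself is just a placeholder):

```html
<!DOCTYPE html>
<html>
  <head>
    <title>AWS LLM S3 Site</title>
  </head>
  <body>
    <h1>Hello from S3 + CloudFront</h1>
  </body>
</html>
```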

 

Go to AWS CloudFront and create a new distribution. Disable WAF (Web Application Firewall) if prompted; it is not needed for this demo.

 

Note down the CloudFront distribution ID and the distribution domain name.
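If you prefer the CLI, `aws cloudfront list-distributions` returns JSON from which you can pull both values. A sketch of extracting them with Python's standard library (the sample payload below is illustrative, not real output from your account):

```python
import json

# Illustrative fragment of `aws cloudfront list-distributions` output
sample = """
{
  "DistributionList": {
    "Items": [
      {"Id": "E123EXAMPLE", "DomainName": "d111111abcdef8.cloudfront.net"}
    ]
  }
}
"""

data = json.loads(sample)
for dist in data["DistributionList"]["Items"]:
    # The Id goes into DISTRIBUTION_ID; the DomainName is your site URL
    print(dist["Id"], dist["DomainName"])
```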

Now click on the Origins tab and create a new origin. Select your S3 bucket as the origin and copy the bucket policy that CloudFront generates.

 

Paste the copied policy into the Bucket Policy editor under the S3 Permissions tab and click Save.
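The generated policy will look roughly like this, allowing the CloudFront service principal to read objects from your bucket (the account ID and distribution ID below are placeholders; use the exact policy CloudFront gives you):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCloudFrontServicePrincipal",
      "Effect": "Allow",
      "Principal": {"Service": "cloudfront.amazonaws.com"},
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::aws-llm-s3-site/*",
      "Condition": {
        "StringEquals": {
          "AWS:SourceArn": "arn:aws:cloudfront::111122223333:distribution/E123EXAMPLE"
        }
      }
    }
  ]
}
```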

 

Implementing MCP Tools

The following code demonstrates how MCP can be integrated with AWS S3 and CloudFront.

 

1. Install uv

uv is a fast Python package manager. If you haven’t installed it yet, do so with:

pip install uv


2. Initialize uv

Navigate to your project directory and initialize uv:

uv init

This will set up a pyproject.toml file if it doesn’t already exist.

3. Activate the Virtual Environment

If the .venv folder does not exist yet, create it first with uv venv. Then activate it:

  • Linux/macOS:

    source .venv/bin/activate

     

  • Windows (PowerShell):
    .venv\Scripts\Activate


4. Add Dependencies

Run the following command to add mcp[cli] and boto3 (quote the brackets so your shell does not treat them as a glob pattern):

uv add "mcp[cli]" boto3

This will automatically update pyproject.toml and install the packages.
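After this step, the [project] table of pyproject.toml should list the new dependencies, roughly like the following (the project name and version pins are illustrative; yours will reflect whatever uv resolved):

```toml
[project]
name = "aws-site-updater"
version = "0.1.0"
requires-python = ">=3.10"
dependencies = [
    "boto3>=1.34",
    "mcp[cli]>=1.0",
]
```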

5. Create the server.py File

 

Below code covers:

  • Configuring AWS credentials and service clients
  • Defining helper functions for file handling:
    Uploading and updating files in S3 -> async def update_s3_file(content: str) -> bool:
    Fetching file contents from S3 -> async def fetch_s3_file_content() -> str:
  • Exposing MCP tools for external usage:
    @mcp.tool()
    async def update_create_my_web_file(content: str) -> str:
    
    @mcp.tool()
    async def fetch_my_web_file() -> str:
    
    @mcp.tool()
    async def purge_cloudfront_cache() -> bool:

Full code of server.py:

from typing import Any
import asyncio
from mcp.server.fastmcp import FastMCP
import boto3
import time


# Initialize FastMCP server
mcp = FastMCP()

# AWS S3 Configuration
BUCKET_NAME = "aws-llm-s3-site"
FILE_KEY = "index.html"  # File to update on S3
# AWS Cloudfront
DISTRIBUTION_ID = ""


# Create a session with explicit credentials
# AWS Creds
aws_access_key = ""
aws_secret_key = ""
aws_region = "us-east-1"
session = boto3.Session(
    aws_access_key_id=aws_access_key,
    aws_secret_access_key=aws_secret_key,
    region_name=aws_region,
)

# Initialize S3 client
s3 = session.client("s3")

# Create a CloudFront client
client = session.client("cloudfront")

async def update_s3_file(content: str) -> bool:
    """Upload updated content to S3."""
    try:
        s3.put_object(Bucket=BUCKET_NAME, Key=FILE_KEY, Body=content, ContentType="text/html")
        print(f"File {FILE_KEY} updated successfully in S3.")
        return True
    except Exception as e:
        print(f"Error updating S3 file: {e}")
        return False

async def fetch_s3_file_content() -> str:
    """Fetch content of the file from S3."""
    try:
        response = s3.get_object(Bucket=BUCKET_NAME, Key=FILE_KEY)
        content = response["Body"].read().decode("utf-8")
        return content
    except Exception as e:
        print(f"Error fetching S3 file: {e}")
        return ""


@mcp.tool()
async def purge_cloudfront_cache() -> bool:
    """Purge the CloudFront cache for the updated file."""
    try:
        client.create_invalidation(
            DistributionId=DISTRIBUTION_ID,
            InvalidationBatch={
                "Paths": {
                    "Quantity": 1,
                    "Items": ["/*"]  # Invalidate everything, or list specific files like ["/index.html"]
                },
                "CallerReference": str(time.time()),  # Unique value to avoid duplicate requests
            },
        )
        return True
    except Exception as e:
        print(f"Error purging CloudFront cache: {e}")
        return False

@mcp.tool()
async def update_create_my_web_file(content: str) -> str:
    """
    Pass file contents and update/create the file on the web.
    
    Args:
        content (str): The new content of the file.
    
    Returns:
        str: Success or failure message.
    """
    success_s3 = await update_s3_file(content)
    if success_s3:
        await purge_cloudfront_cache()  # Invalidate the CDN so the update is visible
        return "File updated in S3 and CloudFront cache purged."
    return "Failed to update file in S3."

@mcp.tool()
async def fetch_my_web_file() -> str:
    """
    Fetch the contents of the file from S3.
    
    Returns:
        str: The content of the file or an error message.
    """
    content = await fetch_s3_file_content()
    return content if content else "Failed to fetch file content from S3."


## To test each method directly from the CLI first, uncomment one of these:

# asyncio.run(fetch_s3_file_content())
# asyncio.run(update_s3_file("Content"))
# asyncio.run(purge_cloudfront_cache())

if __name__ == "__main__":
    print("MCP Server started!")
    mcp.run(transport="stdio")
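The invalidation payload the server sends can be sanity-checked without touching AWS. A small sketch that builds the same structure (the helper name is mine, not part of boto3):

```python
import time

def build_invalidation_batch(paths):
    # Same shape as the InvalidationBatch passed to create_invalidation above
    return {
        "Paths": {"Quantity": len(paths), "Items": list(paths)},
        "CallerReference": str(time.time()),  # unique per request, avoids duplicates
    }

batch = build_invalidation_batch(["/index.html"])
print(batch["Paths"])  # {'Quantity': 1, 'Items': ['/index.html']}
```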

6. Run the Application

Once the environment is activated, run:

uv run server.py


7. Set up MCP client tools in Claude Desktop.

 

 

Open the claude_desktop_config.json file and add the following content, replacing the directory path with the location of your server.py:

{
  "allowDevTools": true,
  "mcpServers": {
    "aws-site-updater": {
      "command": "uv",
      "args": [
          "--directory",
          "/Users/akyadav/Downloads/Code",
          "run",
          "server.py"
      ]
    }
  }
}

8. Restart Claude Desktop and verify the server was installed by clicking on the hammer icon.

 

9. Give a Prompt in Claude Desktop


10. Check the CloudFront Domain Site

11. Deactivate the Virtual Environment

When you’re done, deactivate the environment:

deactivate

 

Integrating MCP with AWS S3 and CloudFront enables seamless file updates and real-time content distribution while automating cache invalidation. This approach enhances performance, reduces manual intervention, and ensures end-users always get the latest content.

By leveraging asynchronous programming with Python’s asyncio, this setup maximizes efficiency while keeping operations lightweight. Start using MCP today to optimize your cloud workflows!

Next Steps

  • Expand the MCP tools to handle multiple file types
  • Implement authentication and access controls
  • Automate deployment with AWS Lambda and event triggers

Would you like additional features integrated into this setup? Let me know!

