A fast, plug-and-play MCP server for image processing and cloud uploads, designed for AI assistants and automation workflows.
A lightweight server implementing the Model Context Protocol (MCP) for automated image manipulation and uploads. It makes resizing, converting, optimizing, and uploading images seamless for developers, AI tools, and automated pipelines.
- All-in-One Image Processing: Resize, convert, optimize, and transform images with the powerful sharp library.
- Effortless Cloud Uploads: Integrates with AWS S3, Cloudflare R2, and Google Cloud Storage.
- AI & Workflow Ready: Built for MCP, so it works with any MCP-compatible AI assistant or workflow runner.
- Flexible Input: Works with file paths, URLs, or base64 images.
- Automatable: Scriptable for batch tasks or as a backend service.
Use npm (or yarn/pnpm):
```bash
npm install -g @boomlinkai/image-worker-mcp
# or
yarn global add @boomlinkai/image-worker-mcp
# or
pnpm add -g @boomlinkai/image-worker-mcp
```
Or use it instantly (no install):
```bash
npx @boomlinkai/image-worker-mcp
```
Resize an image:
```json
{
  "tool_code": "use_mcp_tool",
  "tool_name": "resize_image",
  "server_name": "image-worker",
  "arguments": {
    "imageUrl": "https://example.com/original.jpg",
    "width": 800,
    "format": "webp",
    "outputPath": "./resized_image.webp"
  }
}
```
Upload an image:
```json
{
  "tool_code": "use_mcp_tool",
  "tool_name": "upload_image",
  "server_name": "image-worker",
  "arguments": {
    "imagePath": "./resized_image.webp",
    "service": "s3",
    "filename": "my-optimized-image",
    "folder": "website-assets"
  }
}
```
The MCP server works via stdio, making it easy to plug into AI tools and code editors.
Platform setup guides are available for Cursor, Windsurf, VSCode, Zed, Claude, BoltAI, and Roo Code. For Cursor, add to `~/.cursor/mcp.json`:
```json
{
  "mcpServers": {
    "image-worker": {
      "command": "npx",
      "args": ["-y", "@boomlinkai/image-worker-mcp"]
    }
  }
}
```
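Other platforms follow a similar pattern. For example, Claude Desktop reads the same `mcpServers` structure from its `claude_desktop_config.json` (on macOS typically `~/Library/Application Support/Claude/claude_desktop_config.json`); exact file locations and key names can vary by editor and version:

```json
{
  "mcpServers": {
    "image-worker": {
      "command": "npx",
      "args": ["-y", "@boomlinkai/image-worker-mcp"]
    }
  }
}
```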
resize_image: resize and transform images (example below).
- Input: `imagePath`, `imageUrl`, or `base64Image`
- Options: `width`, `height`, `fit`, `format`, `quality`, `rotate`, etc.
- Returns the path or base64 of the processed image
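For instance, a resize_image call that combines several of these options; the argument values are illustrative, and the `fit` value assumes sharp's standard resize modes (`cover`, `contain`, `fill`, `inside`, `outside`):

```json
{
  "tool_code": "use_mcp_tool",
  "tool_name": "resize_image",
  "server_name": "image-worker",
  "arguments": {
    "imagePath": "./photos/banner.png",
    "width": 1200,
    "height": 630,
    "fit": "cover",
    "format": "webp",
    "quality": 80,
    "rotate": 90
  }
}
```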
upload_image: upload any image (by path, URL, or base64) to cloud storage (example below).
- `service`: `s3` | `cloudflare` | `gcloud`
- Options: `filename`, `folder`, `public`, etc.
- Credentials are read from environment variables (see the next section)
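A sketch of an upload call that sets these options explicitly; the filename and folder values are illustrative, and the `public` flag is assumed to be a boolean:

```json
{
  "tool_code": "use_mcp_tool",
  "tool_name": "upload_image",
  "server_name": "image-worker",
  "arguments": {
    "imageUrl": "https://example.com/original.jpg",
    "service": "cloudflare",
    "filename": "hero-banner",
    "folder": "marketing",
    "public": true
  }
}
```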
Set these for your chosen cloud provider:
AWS S3
```bash
export AWS_ACCESS_KEY_ID=xxx
export AWS_SECRET_ACCESS_KEY=xxx
export S3_BUCKET=your-bucket
export S3_REGION=us-east-1
# Optional: S3_ENDPOINT=https://...
```
Cloudflare R2
```bash
export CLOUDFLARE_R2_ACCESS_KEY_ID=xxx
export CLOUDFLARE_R2_SECRET_ACCESS_KEY=xxx
export CLOUDFLARE_R2_BUCKET=your-bucket
export CLOUDFLARE_R2_ENDPOINT=https://...
```
Google Cloud Storage
```bash
export GCLOUD_PROJECT_ID=xxx
export GCLOUD_BUCKET=your-bucket
# Optionally: GCLOUD_CREDENTIALS_PATH=/path/to/key.json
```
Default upload service:
```bash
export UPLOAD_SERVICE=s3
```
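Assuming the server falls back to `UPLOAD_SERVICE` when the `service` argument is omitted (implied by the default above, not verified here), an upload call can then leave it out:

```json
{
  "tool_code": "use_mcp_tool",
  "tool_name": "upload_image",
  "server_name": "image-worker",
  "arguments": {
    "imagePath": "./resized_image.webp",
    "filename": "my-optimized-image"
  }
}
```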
⚠️ Never commit credentials to source control. Use environment variables or secret managers.
- Node.js 18.x or higher
- No system dependencies; `sharp` is auto-installed
- Install fails on ARM/Apple Silicon? Run `brew install vips` (a sharp dependency) or use Node 18+.
- Credentials not working? Check env var spelling/casing.
- Image output is blank or corrupt? Confirm input image type and size.
PRs and issues welcome! Please open an issue or submit a pull request.
Vuong Ngo – BoomLink.ai
- Join our Discord for support, feedback, and community discussions
- Follow us on X.com for updates and news
- Connect on LinkedIn for company news and insights
MIT