deco-cx/chat
Public MCP Server

An open-source SDK for agentic workflows based on MCPs. Integrated LLM cost management and one-click deploy to Cloudflare.

Repository Info

Stars: 141
Forks: 7
Watchers: 141
Issues: 165
Language: TypeScript
License: GNU Affero General Public License v3.0

About This Server

This server implements the Model Context Protocol (MCP), so it can be integrated with AI applications to provide additional context and capabilities.

Documentation

deco.chat

deco.chat is an open-source foundation for building AI-native software.
We equip developers, engineers, and AI enthusiasts with robust tools to rapidly prototype, develop, and deploy AI-powered applications.

Official docs: https://docs.deco.page

Tip: If you have questions or want to learn more, join our Discord community: https://deco.chat/discord

Who is it for?

  • Vibecoders prototyping ideas
  • Agentic engineers deploying scalable, secure, and sustainable production systems

Why deco.chat?

Our goal is simple: empower teams with Generative AI by giving builders the tools to create AI applications that scale beyond the initial demo to thousands of users, securely and cost-effectively.

Core capabilities

  • Open-source Runtime – Easily compose tools, workflows, and views within a single codebase
  • MCP Mesh (Model Context Protocol) – Securely integrate models, data sources, and APIs, with observability and cost control
  • Unified TypeScript Stack – Combine backend logic and custom React/Tailwind frontends seamlessly using typed RPC
  • Global, Modular Infrastructure – Built on Cloudflare for low-latency, infinitely scalable deployments. Self-host with your Cloudflare API Key
  • Visual Workspace – Build agents, connect tools, manage permissions, and orchestrate everything built in code

Creating a new Deco project

A Deco project extends a standard Cloudflare Worker with our building blocks and defaults for MCP servers.
It runs a type-safe API out of the box and can also serve views — front-end apps deployed alongside the server.

Currently, views can be any Vite app that outputs a static build. Soon, they’ll support components declared as tools, callable by app logic or LLMs.
Views can call server-side tools via typed RPC.

Requirements

  • Your preferred JavaScript runtime:
    • Recommended: Bun
    • Supported: Node.js, Deno

Quick Start

  1. Install the CLI
npm i -g deco-cli

or

bun i -g deco-cli
  2. Log in to deco.chat. Don’t have an account? Sign up first.
deco login
  3. Create a new project
deco create              # create new project, select workspace and choose template
cd my-project
npm install              # or bun, deno, pnpm
  4. Start the dev server
npm run dev               # → http://localhost:8787 (hot reload)

Need pre‑built MCP integrations? Explore deco-cx/apps.

Project Layout

my-project/
├── server/         # MCP tools & workflows (Cloudflare Workers)
│   ├── main.ts
│   ├── deco.gen.ts  # Typed bindings (auto-generated)
│   └── wrangler.toml
├── view/           # React + Tailwind UI (optional)
│   └── src/
├── package.json    # Root workspace scripts
└── README.md

Skip view/ if you don’t need a frontend.

CLI Essentials

Command          Purpose
deco dev         Run server & UI with hot reload
deco deploy      Deploy to Cloudflare Workers
deco gen         Generate types for external integrations
deco gen:self    Generate types for your own tools
For the full command list, run deco --help or see the CLI README.

Building Blocks

A Deco project is built using tools and workflows — the core primitives for connecting integrations, APIs, models, and business logic.

Tools

Atomic functions that call external APIs, databases, or AI models. All templates include the necessary imports from the Deco Workers runtime.

import { createTool, Env, z } from "deco/mod.ts";

// Tool factory: receives the typed environment so it can call bound integrations.
const createMyTool = (env: Env) =>
  createTool({
    id: "MY_TOOL",
    description: "Describe what it does",
    inputSchema: z.object({ query: z.string() }),
    outputSchema: z.object({ answer: z.string() }),
    execute: async ({ context }) => {
      // `context` is the validated input; OPENAI is an integration bound on `env`.
      const res = await env.OPENAI.CHAT_COMPLETIONS({
        model: "gpt-4o",
        messages: [{ role: "user", content: context.query }],
      });
      return { answer: res.choices[0].message.content };
    },
  });

Tools can be used independently or within workflows. Golden rule: one tool call per step — keep logic in the workflow.
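To make the rule concrete, here is a minimal sketch of a second, equally atomic tool. The FORMAT_ANSWER id and its formatting logic are illustrative, not part of any template; chaining it after MY_TOOL is the workflow's job, as the next section shows.

// Hypothetical companion tool: it does exactly one thing (formatting) and makes
// no chained API calls. (env is unused here but kept to match the factory pattern.)
const createFormatTool = (env: Env) =>
  createTool({
    id: "FORMAT_ANSWER",
    description: "Wrap a raw answer in a short, user-facing sentence",
    inputSchema: z.object({ answer: z.string() }),
    outputSchema: z.object({ formatted: z.string() }),
    execute: async ({ context }) => ({
      formatted: `Here is what I found: ${context.answer}`,
    }),
  });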


Workflows

Orchestrate tools using Mastra operators like .then, .parallel, .branch, and .dountil.

Tip: Add Mastra docs to your AI code assistant for autocomplete and examples.

import { createStepFromTool, createWorkflow, Env, z } from "deco/mod.ts";

// Workflow factory: wraps the tool above as a step and reshapes its output.
const createMyWorkflow = (env: Env) =>
  createWorkflow({
    id: "HELLO_WORLD",
    inputSchema: z.object({ query: z.string() }),
    outputSchema: z.object({ greeting: z.string() }),
  })
    .then(createStepFromTool(createMyTool(env)))
    .map(({ inputData }) => ({ greeting: `Hello, ${inputData.answer}!` }))
    .commit();
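Building on the hypothetical FORMAT_ANSWER tool sketched earlier, a workflow that chains two steps might look like the sketch below. Step and workflow names are illustrative; .branch, .parallel, and .dountil slot into the same builder chain.

// Sketch: two atomic tools chained in one workflow, with .map reshaping data
// between steps. Assumes the createMyTool and createFormatTool factories above.
const createAnswerAndFormatWorkflow = (env: Env) =>
  createWorkflow({
    id: "ANSWER_AND_FORMAT",
    inputSchema: z.object({ query: z.string() }),
    outputSchema: z.object({ formatted: z.string() }),
  })
    .then(createStepFromTool(createMyTool(env)))
    // Reshape step 1's output into step 2's input.
    .map(({ inputData }) => ({ answer: inputData.answer.trim() }))
    .then(createStepFromTool(createFormatTool(env)))
    .commit();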

Views

Build React + Tailwind frontends served by the same Cloudflare Worker.

  • Routing with TanStack Router
  • Typed RPC via @deco/workers-runtime/client (see the sketch after this list)
  • Preconfigured with shadcn/ui and lucide-react
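As a rough illustration of the typed-RPC idea only: the createClient name and the call shape below are assumptions made for this sketch, not the confirmed @deco/workers-runtime/client API. The generated deco.gen.ts bindings and the official docs are the source of truth.

// Hypothetical view-side sketch: a React component invoking the HELLO_WORLD
// workflow through a typed RPC client. `createClient` and the call shape are
// assumptions; check deco.gen.ts and the docs for the real surface.
import { useState } from "react";
import { createClient } from "@deco/workers-runtime/client";

const client = createClient(); // typed against the server's tools/workflows (assumption)

export function Greeting() {
  const [greeting, setGreeting] = useState<string | null>(null);

  async function sayHello() {
    // Illustrative call: run the server-side HELLO_WORLD workflow via RPC.
    const result = await client.HELLO_WORLD({ query: "world" });
    setGreeting(result.greeting);
  }

  return <button onClick={sayHello}>{greeting ?? "Say hello"}</button>;
}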

Development Flow

  1. Add an integration via the deco.chat dashboard (improved UX coming soon)

  2. Run npm run gen → updates deco.gen.ts with typed clients

  3. Write tools in server/main.ts

  4. Compose workflows using .map, .branch, .parallel, etc.

  5. (Optional) Run npm run gen:self → typed RPC clients for your tools

  6. Build views in /view and call workflows via the typed client

  7. Run locally

    npm run dev   # → http://localhost:8787
    
  8. Deploy to Cloudflare

    npm run deploy
    

How to Contribute

We welcome contributions! Check out CONTRIBUTING.md for guidelines and tips.


Made with ❤️ by the Deco community — helping teams build AI-native systems that scale.

Quick Start

  1. Clone the repository
git clone https://github.com/deco-cx/chat
  2. Install dependencies
cd chat
npm install
  3. Follow the documentation
Check the repository's README.md file for specific installation and usage instructions.

Repository Details

Owner: deco-cx
Repo: chat
Language: TypeScript
License: GNU Affero General Public License v3.0
Last fetched: 8/10/2025

Recommended MCP Servers

  • 💬 Discord MCP – Enable AI assistants to seamlessly interact with Discord servers, channels, and messages. (integrations, discord, chat)
  • 🔗 Knit MCP – Connect AI agents to 200+ SaaS applications and automate workflows. (integrations, automation, saas)
  • 🕷️ Apify MCP Server – Deploy and interact with Apify actors for web scraping and data extraction. (apify, crawler, data)
  • 🌐 BrowserStack MCP – BrowserStack MCP Server for automated testing across multiple browsers. (testing, qa, browsers)
  • Zapier MCP – A Zapier server that provides automation capabilities for various apps. (zapier, automation)