Vercel Runs Your Backend Too

Most engineers I talk to think of Vercel as a place to deploy Next.js frontends. That's underselling it. Vercel runs full backend services: Express APIs, scheduled cron jobs, async queues, and multi-step workflows. The primitives are all there. You just have to know they exist.

Here's what running a real backend on Vercel actually looks like.

Running Express on Vercel

Vercel supports Express and other Node.js HTTP frameworks via Vercel Functions. Your Express app becomes a serverless function. Vercel handles routing, scaling, and infrastructure.

Start with a minimal Express app:

npm install express

// api/index.js
import express from 'express'

const app = express()
app.use(express.json())

app.get('/api/health', (req, res) => {
  res.json({ status: 'ok' })
})

app.post('/api/orders', async (req, res) => {
  const { item, quantity } = req.body
  // process the order...
  res.status(201).json({ orderId: crypto.randomUUID() })
})

export default app

vercel.json is the config file that controls how Vercel's CDN routes requests for your project. Rewrites work like a reverse proxy: the URL the client sees stays the same, but Vercel forwards the request internally to a different path.

This step is optional. If your entry file is at a conventional location (app.js, index.js, server.js, or the src/ equivalents), Vercel detects it automatically with zero config. The rewrite is only needed here because the file lives at api/index.js, which is outside those defaults.

// vercel.json
{
  "rewrites": [
    { "source": "/api/(.*)", "destination": "/api/index" }
  ]
}

Deploy with vercel deploy. Your Express routes are live. Vercel runs each invocation in an isolated function instance, scales to zero when idle, and scales up automatically under load.

If you are already on Next.js, you get the same result with Route Handlers, without needing Express at all. But if you have an existing Express API or need to deploy your API separately, Vercel supports it.

Cron Jobs

On your own infrastructure you'd reach for a crontab or a separate scheduler process. On Vercel, Cron Jobs are configured in vercel.json and invoke any function on a schedule.

// vercel.json
{
  "crons": [
    {
      "path": "/api/jobs/sync-inventory",
      "schedule": "0 */6 * * *"
    },
    {
      "path": "/api/jobs/send-digest",
      "schedule": "0 8 * * 1"
    }
  ]
}

The schedules are standard cron expressions, evaluated in UTC: 0 */6 * * * runs every six hours, and 0 8 * * 1 runs Mondays at 08:00. The function at that path receives a standard HTTP request. No special SDK, no new interface to learn.

// api/jobs/sync-inventory.js
export default async function handler(req, res) {
  // Vercel sends a GET request from its scheduler
  // Verify the request is from Vercel if needed
  const authHeader = req.headers.authorization
  if (authHeader !== `Bearer ${process.env.CRON_SECRET}`) {
    return res.status(401).json({ error: 'Unauthorized' })
  }

  await syncInventoryFromUpstream()
  res.json({ synced: true })
}

Set CRON_SECRET in your project environment variables and Vercel will include it on every cron invocation. You can verify it to prevent the endpoint from being called by anything else.
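If you want to harden that check against timing attacks, Node's built-in crypto.timingSafeEqual can replace the plain !== comparison. A small sketch; isAuthorized is a hypothetical helper, not part of any Vercel API:

```javascript
import { timingSafeEqual } from 'node:crypto'

// Hypothetical helper: constant-time check of the cron Authorization header
function isAuthorized(authHeader, secret) {
  const expected = Buffer.from(`Bearer ${secret}`)
  const received = Buffer.from(authHeader ?? '')
  // timingSafeEqual throws if the buffers differ in length,
  // so compare lengths first and short-circuit on mismatch
  return received.length === expected.length && timingSafeEqual(received, expected)
}
```

Inside the handler, the check becomes: if (!isAuthorized(req.headers.authorization, process.env.CRON_SECRET)) return res.status(401).json({ error: 'Unauthorized' }).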

Cron jobs show up in the Vercel dashboard with last run time, status, and logs. No separate monitoring tool required.

Background Queues

Cron jobs cover "run this at a time." Queues cover "run this in response to an event, but not in the request path."

Vercel Queues gives you a managed message queue with a producer/consumer model. The producer enqueues a message during an HTTP request. A separate consumer function picks it up asynchronously and does the heavy work.

npm install @vercel/queue

Producer (inside your API handler):

// api/orders.js
import { Queue } from '@vercel/queue'

const emailQueue = new Queue('order-confirmation-emails')

export default async function handler(req, res) {
  const order = await createOrder(req.body)

  // Enqueue the confirmation email for background processing.
  // Enqueueing is fast, so do it before responding — in a serverless
  // function, work scheduled after the response may never run.
  await emailQueue.sendMessage({ orderId: order.id, email: req.body.email })

  // Respond immediately — the user never waits for the email itself
  res.status(201).json({ orderId: order.id })
}

Consumer (a separate function that processes the queue):

// api/consumers/order-email.js
import { consumeQueue } from '@vercel/queue'

export default consumeQueue(async (message) => {
  const { orderId, email } = message.body
  await sendOrderConfirmationEmail({ orderId, email })
})

Wire up the consumer in vercel.json:

{
  "queues": [
    {
      "name": "order-confirmation-emails",
      "consumer": "/api/consumers/order-email"
    }
  ]
}

The queue handles retries, backoff, and dead-letter routing. Your consumer just processes the message. If it throws, Vercel retries it automatically according to the queue configuration.
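Retries mean a message can be delivered more than once, so consumers should be idempotent. One common approach is a dedupe check keyed on a stable ID from the message. A sketch: hasProcessed and markProcessed are hypothetical helpers backed by an in-memory Set here for illustration; in production they would hit a database or KV store.

```javascript
// Hypothetical dedupe store — in-memory for illustration only.
// A real consumer would persist this in a database or KV store,
// since each function instance has its own memory.
const processed = new Set()

async function hasProcessed(id) {
  return processed.has(id)
}

async function markProcessed(id) {
  processed.add(id)
}

// Idempotent consumer body: a retried delivery becomes a no-op.
// Returns true if the work ran, false if it was a duplicate.
async function handleOnce(message, work) {
  if (await hasProcessed(message.orderId)) return false
  await work(message)
  await markProcessed(message.orderId)
  return true
}
```

With this wrapper, a redelivered order-confirmation message skips sendOrderConfirmationEmail instead of emailing the customer twice.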

This pattern keeps your API responses fast. Anything that does not need to happen in the request/response cycle: send it to a queue.

HTTP POST /api/orders
         ↓
  [Create order in DB]
         ↓
  [Enqueue email job] ← returns immediately to caller

  ... later ...

  Queue → [Consumer Function] → [Send email]

Combining Them: Multi-Step Workflows

Cron jobs and queues compose well. A common pattern is a cron job that fans out work to a queue, then a consumer that processes each item independently.

Say you need to send weekly reports to 10,000 users. Running that in a single function will hit timeout limits. Instead:

  1. A cron job fires once a week and enqueues one message per user.
  2. Each message is processed by a consumer function independently.
  3. Each consumer sends one report and exits.

The cron-side producer:

// api/jobs/queue-weekly-reports.js (runs via cron)
import { Queue } from '@vercel/queue'

const reportQueue = new Queue('weekly-reports')

export default async function handler(req, res) {
  const users = await getAllActiveUsers()

  for (const user of users) {
    await reportQueue.sendMessage({ userId: user.id })
  }

  res.json({ queued: users.length })
}

The matching consumer:

// api/consumers/weekly-report.js (processes each message)
import { consumeQueue } from '@vercel/queue'

export default consumeQueue(async (message) => {
  const { userId } = message.body
  const report = await generateReport(userId)
  await sendReport(userId, report)
})

The cron job runs in seconds. The consumers run in parallel, each within its own function timeout, across as many instances as needed. No configuration changes required to scale it up.
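One caveat on the producer: awaiting sendMessage once per user is sequential, and at 10,000 users even a few milliseconds per round trip adds up. Batching the awaits with Promise.all keeps the fan-out step fast. A sketch of the general pattern; sendMessage is stubbed here since the real call would be reportQueue.sendMessage from @vercel/queue:

```javascript
// Stub standing in for reportQueue.sendMessage — replace with the
// real queue call inside the cron handler.
async function sendMessage(msg) {
  return msg
}

// Enqueue in parallel chunks instead of one message at a time.
// Each chunk's sends run concurrently; chunks run in sequence,
// so we never hold thousands of requests in flight at once.
async function enqueueInChunks(items, chunkSize = 100) {
  let queued = 0
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize)
    await Promise.all(chunk.map((item) => sendMessage({ userId: item.id })))
    queued += chunk.length
  }
  return queued
}
```

The chunk size is a tuning knob: larger chunks finish faster but put more concurrent load on the queue endpoint.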

For workflows that need explicit orchestration, things like branching logic, waiting on external events, or conditional retries, check the Vercel Workflow docs. Workflow handles durable execution across multiple steps.

What This Means in Practice

You do not need a separate backend service running on EC2 or a managed container platform for any of this. The same Vercel project that serves your frontend can run your API, your scheduled jobs, and your background queue consumers.

I've seen backend teams lose entire sprints to infrastructure work that had nothing to do with their actual product. Keeping everything on one platform, with one set of logs and one place to manage environment variables, cuts that overhead significantly. Not to zero, but enough to notice.

If you are already deploying to Vercel, check the Backends on Vercel docs. The primitives are there. Use them.

Hi! I'm Charlie, a Solutions Architect at Vercel. Opinions are my own.