
Bug report

Describe the bug

When using Next's API routes, chunks that are written with res.write aren't sent until after res.end() is called.

To Reproduce

Steps to reproduce the behavior, please provide code snippets or a repository:

  • Create the following API route in a Next app:

    export default async (req, res) => {
      let intervalID = null
      res.setHeader('Content-Type', 'text/event-stream')
      res.write('data: CONNECTION ESTABLISHED\n\n')
      const end = () => {
        if (intervalID) {
          clearInterval(intervalID)
        }
      }
      req.on('aborted', end)
      req.on('close', end)
      const sendData = () => {
        const timestamp = new Date().toISOString()
        res.write(`data: ${timestamp}\n\n`)
      }
      intervalID = setInterval(sendData, 1000)
    }
  • Connect to the route with a tool that supports Server-Sent Events (e.g. Postwoman).

    Expected behavior

    The route sends a new event to the connection every second.

    Actual behavior

    The route doesn't send any data to the connection unless a call to res.end() is added to the route.

    System information

  • OS: macOS
  • Version of Next.js: 9.1.5

    Additional context

    When using other HTTP frameworks (Express, Koa, plain http, etc.), this method works as expected. It's explicitly supported by Node's http.IncomingMessage and http.ServerResponse classes, which, from what I understand, Next uses as the base for the req and res that are passed into Next API routes.

    I'd hazard a guess that #5855 was caused by the same issue, but considered unrelated because the issue was obscured by the express-sse library.

    There are also two Spectrum topics about this (here and here) that haven't garnered much attention yet.

    Supporting WebSockets and SSE in Next API routes may be related, but fixing support for SSE should be a lower barrier than adding support for WebSockets. All of the inner workings are there; we just need to get the plumbing repaired.

    For those stumbling onto this through Google, this is working as of Next.js 13 + Route Handlers:

    // app/api/route.ts
    import { Configuration, OpenAIApi } from 'openai';
    export const runtime = 'nodejs';
    // This is required to enable streaming
    export const dynamic = 'force-dynamic';
    export async function GET() {
      const configuration = new Configuration({
        apiKey: process.env.OPENAI_API_KEY,
      });
      const openai = new OpenAIApi(configuration);
      let responseStream = new TransformStream();
      const writer = responseStream.writable.getWriter();
      const encoder = new TextEncoder();
      writer.write(encoder.encode('Vercel is a platform for....'));
      try {
        const openaiRes = await openai.c…
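    The snippet above is cut off; for reference, here is a minimal self-contained version of the same Route Handler pattern with the OpenAI parts left out (the tick payload and timings are just illustrations). The `export` keywords are shown in comments so the sketch also runs standalone; in a real `app/api/route.js` you would export both `dynamic` and `GET`.

```javascript
// Minimal SSE streaming sketch for Next.js 13+ Route Handlers: return a
// Response wrapping a ReadableStream, and enqueue SSE-framed chunks into it.
const dynamic = 'force-dynamic'; // in a real route: export const dynamic = ...

async function GET() { // in a real route: export async function GET() {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      let count = 0;
      const timer = setInterval(() => {
        count += 1;
        // SSE events are delimited by a blank line, hence the '\n\n'.
        controller.enqueue(encoder.encode(`data: tick ${count}\n\n`));
        if (count === 3) {
          clearInterval(timer);
          controller.close(); // ends the response stream
        }
      }, 50);
    },
  });
  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
    },
  });
}
```
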

    You can use a custom server.js to work around this for now:

    require('dotenv').config();
    const app = require('express')();
    const server = require('http').Server(app);
    const next = require('next');
    const DSN = process.env.DSN || 'postgres://postgres:postgres@localhost/db';
    const dev = process.env.NODE_ENV !== 'production';
    const nextApp = next({ dev });
    const nextHandler = nextApp.getRequestHandler();
    nextApp.prepare().then(() => {
      app.get('*', (req, res) => {
        if (req.url === '/stream') {
          res.writeHead(200, {
            Connection: 'keep-alive',
            'Cache-Control': 'no-cache',
            'Content-Type': 'text/event-stream',
          });
          res.write('data: Processing...\n\n');
          setTimeout(() => {
            res.write('data: Processing2...\n\n');
          }, 10000);
        } else {
          return nextHandler(req, res);
        }
      });
      require('../websocket/initWebSocketServer')(server, DSN);
      const port = 8080;
      server.listen(port, err => {
        if (err) throw err;
        console.log('> Ready on http://localhost:' + port);
      });
    });

    And on the client:

    componentDidMount() {
      this.source = new EventSource('/stream')
      this.source.onmessage = function(e) {
        console.log(e)
      }
    }

    I would still recommend keeping any server-sent event and WebSocket handlers in separate processes in production. It's very likely that the frequency of updates to those parts of the business logic is quite different. Your front-end most likely changes more often than the types of events you handle / need to push to the clients from the servers. If you only make changes to one, you probably don't want to restart the processes responsible for the other(s). Better to keep the connections alive than to cause a flood of reconnections / server restarts for changes which have no effect.

    @msand The main reason I'm trying to avoid using a custom server is that I'm deploying to Now. Using a custom server would break all of the wonderful serverless functionality I get there.

    Your second point is fair. What I'm trying to do is create an SSE stream for data that would otherwise be handled with basic polling. The server is already dealing with constant reconnections in that case, so an SSE stream actually results in fewer reconnections.

    I suppose I could set up a small webserver in the same repo that just uses a separate Now builder. That would allow the processes to remain separate, though it'd still cause all of the SSE connections to abort and reconnect when there are any changes to the project.

    Even with those points, I can see plenty of scenarios in which it makes sense to be able to run an SSE endpoint from one of Next's API routes. Additionally, in the docs it's specifically stated that...

  • req: An instance of http.IncomingMessage, plus some pre-built middlewares you can see here
  • res: An instance of http.ServerResponse, plus some helper functions you can see here
    Since it's specifically stated that res is an instance of http.ServerResponse, I'd expect it to behave exactly the way http.ServerResponse behaves in any other circumstance. Either the documentation should change to reflect the quirks of the implementation or, preferably, res.write should be fixed to behave the way it does everywhere else.

    @trezy It seems the issue is that the middleware adds a gzip encoding which the browser has negotiated using the header:

    Accept-Encoding: gzip, deflate, br
    

    If you add

    Content-Encoding: none
    

    then it seems to work:

      res.writeHead(200, {
        Connection: 'keep-alive',
        'Content-Encoding': 'none',
        'Cache-Control': 'no-cache',
        'Content-Type': 'text/event-stream',
      });
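    Putting that header together with the original repro route gives a complete sketch (the file path is hypothetical, and the double newlines are the SSE event delimiter):

```javascript
// pages/api/sse.js (hypothetical path) -- the repro route from the top of
// this issue plus 'Content-Encoding': 'none', which stops the compression
// middleware from buffering the stream.
const handler = (req, res) => {
  res.writeHead(200, {
    Connection: 'keep-alive',
    'Content-Encoding': 'none',
    'Cache-Control': 'no-cache',
    'Content-Type': 'text/event-stream',
  });
  res.write('data: CONNECTION ESTABLISHED\n\n');
  const intervalID = setInterval(() => {
    res.write(`data: ${new Date().toISOString()}\n\n`);
  }, 1000);
  // Stop the interval when the client disconnects.
  req.on('close', () => clearInterval(intervalID));
};

module.exports = handler; // in a Next API route: export default handler
```
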

    Actually, this seems to be documented here: https://github.com/expressjs/compression#server-sent-events

    You have to call res.flush() when you think there's enough data for the compression to work efficiently:

    export default (req, res) => {
      res.writeHead(200, {
        'Cache-Control': 'no-cache',
        'Content-Type': 'text/event-stream',
      });
      res.write('data: Processing...\n\n');
      /* https://github.com/expressjs/compression#server-sent-events
        Because of the nature of compression this module does not work out of the box with
        server-sent events. To compress content, a window of the output needs to be
        buffered up in order to get good compression. Typically when using server-sent
        events, there are certain blocks of data that need to reach the client.
        You can achieve this by calling res.flush() when you need the data written to
        actually make it to the client.
      */
      res.flush();
      setTimeout(() => {
        res.write('data: Processing2...\n\n');
        res.flush();
      }, 1000);
    };