OpenAI Stream response, not working as expected

In my Node.js app, I have set up a POST handler that looks like this:

exports.complete = async (req, res, next) => {
  const { prompt } = req.body
  res.writeHead(200, {
    'Content-Type': 'text/plain',
    'Transfer-Encoding': 'chunked'
  })
  const result = await openai.createCompletion(
    {
      model: 'text-davinci-003',
      prompt: prompt,
      temperature: 0.6,
      max_tokens: 700,
      stream: true
    },
    { responseType: 'stream' }
  )

  result.data.on('data', data => {
    const lines = data
      .toString()
      .split('\n')
      .filter(line => line.trim() !== '')
    for (const line of lines) {
      const message = line.replace(/^data: /, '')
      if (message === '[DONE]') {
        res.end()
        return
      }
      try {
        const parsed = JSON.parse(message)
        res.write(parsed.choices[0].text)
      } catch (error) {
        // Ignore lines that are not complete JSON (e.g. a payload split
        // across two 'data' events)
        // console.error('Could not JSON parse stream message', message, error)
      }
    }
  })
}
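For reference, the chunk-parsing step above can be pulled out into a standalone helper (the function name and return shape here are my own, not from any SDK), which makes the logic easier to test in isolation:

```javascript
// Extracts completion text fragments from one SSE 'data' buffer.
// Returns { texts, done } — a hypothetical helper mirroring the
// handler's parsing logic above.
function parseSSEChunk (chunk) {
  const texts = []
  let done = false
  const lines = chunk
    .toString()
    .split('\n')
    .filter(line => line.trim() !== '')
  for (const line of lines) {
    const message = line.replace(/^data: /, '')
    if (message === '[DONE]') {
      done = true
      continue
    }
    try {
      const parsed = JSON.parse(message)
      texts.push(parsed.choices[0].text)
    } catch (error) {
      // A JSON payload split across two chunks fails to parse and is
      // dropped here — a known weakness of this line-by-line approach.
    }
  }
  return { texts, done }
}
```

Note that a single `data` event is not guaranteed to contain whole lines, so a more robust version would buffer any trailing partial line and prepend it to the next chunk.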

As the OpenAI Node SDK doesn’t natively support streaming responses, I pieced this code together from a few sources.

To some extent it is working, I guess. But when I call this endpoint from Postman and also from the command line (using curl), instead of actually getting the response in chunks, I get one final response after the entire completion call finishes.


I am not sure what I am doing wrong here. Note: the code above is part of a Firebase Function where I have set up Express. I don’t think that affects anything, but I’m mentioning it anyway.

Edit 1: The curl request

Here’s my curl request:

curl --location 'http://localhost:5001/testapp-7aca9/us-central1/api/completion/complete' \
--header 'Content-Type: application/json' \
--data '{
    "prompt": "Say two random lines"
}'

Solution:

You can refer to the solution posted here: What is the correct way to send a long string by an HTTP stream in ExpressJS/NestJS?

It might be caused by one of two things:

  1. With curl, you need to add the -N flag to disable output buffering, so each chunk is printed as it arrives.
  2. In a browser, Chrome does not render the data until at least 1024 bytes have arrived, unless you add an X-Content-Type-Options header with the value nosniff.
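Applied to the handler above, the header change is a one-line addition to the `writeHead` call. A minimal sketch (the helper name here is mine, not from the original post):

```javascript
// Headers for a chunked plain-text stream; X-Content-Type-Options: nosniff
// stops Chrome from buffering the first ~1024 bytes before rendering.
function streamHeaders () {
  return {
    'Content-Type': 'text/plain',
    'Transfer-Encoding': 'chunked',
    'X-Content-Type-Options': 'nosniff'
  }
}

// In the handler above it would be used as:
//   res.writeHead(200, streamHeaders())
```

On the curl side, adding -N to the command shown earlier disables output buffering, so the chunks appear in the terminal as the server writes them.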