[BUG] [SSE] Prompt Flow Endpoint doesn't work in streaming mode #3878
Replies: 1 comment
-
We figured it out: the output of our "last" node was being consumed by another node, so the flow output no longer came straight from the LLM node.
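For anyone hitting the same thing, here is a hedged sketch of what the fix looks like in `flow.dag.yaml` (node and output names are illustrative, not from our actual flow). As far as I understand, streaming only works when the flow output references the LLM node's output directly; if another node consumes that output, the generator is exhausted downstream and the endpoint falls back to a non-streamed response.

```yaml
# Illustrative sketch only -- node and output names are hypothetical.
# For streaming, map the flow output straight to the LLM node's output:
outputs:
  answer:
    type: string
    reference: ${chat_llm.output}   # final LLM node; no other node consumes it
    is_chat_output: true
nodes:
- name: chat_llm
  type: llm
  # ... llm node configuration ...
```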
-
Hi, we've used multiple prompt-flow endpoints in streaming mode. After some tinkering it worked fine once we followed those instructions, and we later wrapped the endpoint with an API Management service to configure CORS so it would work with the app.
Unfortunately the issue occurred again with a flow that is configured as in the previously mentioned guidelines: it has an LLM node at the end with
{"type":"text"}
as the response format. We're using an Azure OpenAI connection. I know that in the past streaming worked only with an OpenAI connection, as described in this issue, but now it works with Azure OpenAI on our other endpoints as well.
We test the prompt-flow endpoint with curl like this,
and we also test the managed API endpoint both with curl and by connecting it to the app; neither works. So far this happens for only one of our flows. I'm not sure whether the issue is with prompt flow itself or with the underlying Azure infrastructure breaking the deployment.
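For reference, this is roughly the shape of the request we use (the endpoint name, region, and key are placeholders, and the input field name depends on your flow). The `Accept: text/event-stream` header is what asks the prompt flow endpoint to stream; without it, the endpoint returns a single JSON response instead of SSE.

```shell
# Placeholders: substitute <endpoint-name>, <region>, $API_KEY, and the
# input field for your own deployment. -N disables curl's output buffering
# so SSE chunks are printed as they arrive.
curl -N "https://<endpoint-name>.<region>.inference.ml.azure.com/score" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -H "Accept: text/event-stream" \
  -d '{"question": "Hello"}'
```

If the flow streams correctly, the response arrives as `data: ...` lines; the broken flow instead returns one complete JSON body.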