fix: csharp graceful shutdown (#2770)
# Description

In the C# executor service, traces are sent in parallel with the execution response, so they may not be fully sent by the time we try to destroy the executor proxy. To avoid this, we need to enable graceful shutdown for C#.

To enable graceful shutdown (supported on Windows only for now), C# requires a `CTRL_C_EVENT` signal instead of a `SIGTERM` signal, which would forcefully shut down the C# service.

On the other hand, sending `CTRL_C_EVENT` directly touches every process in the target process group, so the main process would be terminated as well.

This PR fixes the issue by starting the C# service in a new process group (`CREATE_NEW_PROCESS_GROUP`) and sending `CTRL_BREAK_EVENT` to that group on shutdown.
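
For reference, here is a minimal, self-contained sketch of the pattern this change relies on (not the promptflow implementation itself; the `start_service`/`stop_service` helpers and the sample command are illustrative only):

```python
import platform
import signal
import subprocess
import sys


def start_service(args):
    # On Windows, put the child in its own process group so that a console
    # control event sent to that group does not also reach the parent.
    creationflags = subprocess.CREATE_NEW_PROCESS_GROUP if platform.system() == "Windows" else 0
    return subprocess.Popen(args, creationflags=creationflags)


def stop_service(process, timeout=5):
    if platform.system() == "Windows":
        # CTRL_BREAK_EVENT goes to the child's (new) process group only; per the
        # description above, a bare CTRL_C_EVENT would also hit the main process.
        process.send_signal(signal.CTRL_BREAK_EVENT)
    else:
        # On Linux/macOS, Popen.terminate() sends SIGTERM.
        process.terminate()
    try:
        # give the service a chance to flush pending work (e.g. traces) and exit
        process.wait(timeout=timeout)
    except subprocess.TimeoutExpired:
        process.kill()
        process.wait()


if __name__ == "__main__":
    proc = start_service([sys.executable, "-c", "import time; time.sleep(60)"])
    stop_service(proc)
    print("service exit code:", proc.returncode)
```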

# All Promptflow Contribution checklist:
- [x] **The pull request does not introduce [breaking changes].**
- [x] **CHANGELOG is updated for new features, bug fixes or other
significant changes.**
- [x] **I have read the [contribution guidelines](../CONTRIBUTING.md).**
- [x] **Create an issue and link to the pull request to get dedicated
review from promptflow team. Learn more: [suggested
workflow](../CONTRIBUTING.md#suggested-workflow).**

## General Guidelines and Best Practices
- [x] Title of the pull request is clear and informative.
- [x] There are a small number of commits, each of which has an
informative message. This means that previously merged commits do not
appear in the history of the PR. For more information on cleaning up the
commits in your PR, [see this
page](https://github.com/Azure/azure-powershell/blob/master/documentation/development-docs/cleaning-up-commits.md).

### Testing Guidelines
- [ ] Pull request includes test coverage for the included changes.
elliotzh authored Apr 12, 2024
1 parent a16623c commit 5180776
Showing 2 changed files with 10 additions and 3 deletions.
The first changed file (the C# executor proxy):

```diff
@@ -123,7 +123,8 @@ async def create(
                     log_path=log_path,
                     error_file_path=init_error_file,
                     yaml_path=flow_file.as_posix(),
-                )
+                ),
+                creationflags=subprocess.CREATE_NEW_PROCESS_GROUP if platform.system() == OSType.WINDOWS else 0,
             )
         else:
             # if port is provided, assume the execution service is already started
@@ -177,10 +178,13 @@ async def destroy(self):
         if self._process and self._is_executor_active():
             if platform.system() == OSType.WINDOWS:
                 # send CTRL_C_EVENT to the process to gracefully terminate the service
-                self._process.send_signal(signal.CTRL_C_EVENT)
+                self._process.send_signal(signal.CTRL_BREAK_EVENT)
             else:
                 # for Linux and MacOS, Popen.terminate() will send SIGTERM to the process
                 self._process.terminate()
+
+            # TODO: there is a potential issue that graceful shutdown won't work for streaming chat flow for now,
+            # because the response will not be fully consumed before we destroy the executor proxy
             try:
                 self._process.wait(timeout=5)
             except subprocess.TimeoutExpired:
```
The second changed file (the C# inspector proxy):

```diff
@@ -11,7 +11,7 @@
 from promptflow._constants import FlowEntryRegex
 from promptflow._core._errors import UnexpectedError
 from promptflow._sdk._constants import ALL_CONNECTION_TYPES, FLOW_TOOLS_JSON, PROMPT_FLOW_DIR_NAME
-from promptflow._utils.flow_utils import read_json_content
+from promptflow._utils.flow_utils import is_flex_flow, read_json_content
 from promptflow._utils.yaml_utils import load_yaml
 
 from ._base_inspector_proxy import AbstractInspectorProxy
@@ -26,6 +26,9 @@ def __init__(self):
     def get_used_connection_names(
         self, flow_file: Path, working_dir: Path, environment_variables_overrides: Dict[str, str] = None
     ) -> List[str]:
+        if is_flex_flow(flow_path=flow_file):
+            # in flex mode, csharp will always directly get connections from local pfs
+            return []
         # TODO: support environment_variables_overrides
         flow_tools_json_path = working_dir / PROMPT_FLOW_DIR_NAME / FLOW_TOOLS_JSON
         tools_meta = read_json_content(flow_tools_json_path, "meta of tools")
```
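
For completeness, the inspector-proxy change above boils down to a short-circuit like the sketch below. This is illustrative only: `is_flex_flow` here is a stub rather than promptflow's `promptflow._utils.flow_utils.is_flex_flow`, and the constant values are assumptions mirroring the names used in the diff.

```python
from pathlib import Path
from typing import List

PROMPT_FLOW_DIR_NAME = ".promptflow"  # assumed value of the promptflow constant
FLOW_TOOLS_JSON = "flow.tools.json"   # assumed value of the promptflow constant


def is_flex_flow(flow_path: Path) -> bool:
    # Stub for illustration: treat a flow as "flex" when its yaml declares an
    # `entry` instead of a DAG of nodes.
    return flow_path.exists() and "entry:" in flow_path.read_text()


def get_used_connection_names(flow_file: Path, working_dir: Path) -> List[str]:
    if is_flex_flow(flow_path=flow_file):
        # flex mode: the C# service gets connections from the local pfs directly,
        # so there is nothing to collect from flow.tools.json
        return []
    # non-flex flows keep the existing lookup based on the generated tools meta
    flow_tools_json_path = working_dir / PROMPT_FLOW_DIR_NAME / FLOW_TOOLS_JSON
    print(f"would read tools meta from {flow_tools_json_path}")  # placeholder only
    return []
```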
