Commit
update llmlingua tool doc (#3319)
# Description

Update the llmlingua tool documentation: add the `llmlingua-promptflow` package to the custom tool index in `docs/integrations/tools/index.md`, and add a new page, `llmlingua-prompt-compression-tool.md`, covering requirements, prerequisites, inputs, and outputs.

# All Promptflow Contribution checklist:
- [x] **The pull request does not introduce [breaking changes].**
- [x] **CHANGELOG is updated for new features, bug fixes or other
significant changes.**
- [x] **I have read the [contribution guidelines](../CONTRIBUTING.md).**
- [ ] **Create an issue and link to the pull request to get dedicated
review from promptflow team. Learn more: [suggested
workflow](../CONTRIBUTING.md#suggested-workflow).**

## General Guidelines and Best Practices
- [x] Title of the pull request is clear and informative.
- [x] There are a small number of commits, each of which have an
informative message. This means that previously merged commits do not
appear in the history of the PR. For more information on cleaning up the
commits in your PR, [see this
page](https://github.com/Azure/azure-powershell/blob/master/documentation/development-docs/cleaning-up-commits.md).

### Testing Guidelines
- [x] Pull request includes test coverage for the included changes.
SiyunZhao authored May 21, 2024
1 parent f80a4d9 commit 2350f41
Showing 2 changed files with 44 additions and 0 deletions.
2 changes: 2 additions & 0 deletions docs/integrations/tools/index.md
@@ -14,10 +14,12 @@ The table below provides an index of custom tool packages. The columns contain:
| Package Name | Description | Owner | Support Contact |
|-|-|-|-|
| promptflow-azure-ai-language | Collection of Azure AI Language Prompt flow tools. | Sean Murray | taincidents@microsoft.com |
| llmlingua-promptflow | Compresses prompts with minimal performance loss, speeding up large language model inference and sharpening the model's perception of key information. | LLMLingua Team | llmlingua@microsoft.com |

```{toctree}
:maxdepth: 1
:hidden:
azure-ai-language-tool
llmlingua-prompt-compression-tool
```
42 changes: 42 additions & 0 deletions docs/integrations/tools/llmlingua-prompt-compression-tool.md
@@ -0,0 +1,42 @@
# LLMLingua Prompt Compression

## Introduction
The LLMLingua Prompt Compression tool compresses prompts with minimal performance loss, speeding up large language model inference and sharpening the model's perception of key information.

## Requirements
PyPI package: [`llmlingua-promptflow`](https://pypi.org/project/llmlingua-promptflow/).
- For Azure users:
follow [the wiki for AzureML](https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-custom-tool-package-creation-and-usage?view=azureml-api-2#prepare-runtime) or [the wiki for AI Studio](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/prompt-flow-tools/prompt-flow-tools-overview#custom-tools), starting from `Prepare runtime`.
- For local users:
```bash
pip install llmlingua-promptflow
```
You may also want to install the [Prompt flow for VS Code extension](https://marketplace.visualstudio.com/items?itemName=prompt-flow.prompt-flow).

## Prerequisite
Create a MaaS (model as a service) deployment for a large language model in the Azure model catalog. Taking Llama as an example, you can learn how to deploy and consume Meta Llama models with model as a service from [the guidance for Azure AI Studio](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-llama?tabs=llama-three#deploy-meta-llama-models-with-pay-as-you-go)
or
[the guidance for Azure Machine Learning Studio
](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-deploy-models-llama?view=azureml-api-2&tabs=llama-three#deploy-meta-llama-models-with-pay-as-you-go).
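
After deploying, the endpoint details can be stored in a custom connection for the tool to use. The sketch below is illustrative only, with hypothetical field names (`api_base`, `api_key`); consult the `llmlingua-promptflow` package documentation for the exact configs and secrets it expects.

```yaml
# Hypothetical custom connection definition for the MaaS deployment.
# The configs/secrets keys below are illustrative, not authoritative.
$schema: https://azuremlschemas.azureedge.net/promptflow/latest/CustomConnection.schema.json
name: myconn
type: custom
configs:
  api_base: <your-maas-endpoint-url>
secrets:
  api_key: <your-maas-api-key>
```

Locally, a connection file like this can be registered with the `pf connection create --file <path>` CLI command.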

## Inputs
The tool accepts the following inputs:

| Name | Type | Description | Required |
| ---- | ---- | ----------- | -------- |
| prompt | string | The prompt that needs to be compressed. | Yes |
| myconn | CustomConnection | The created connection to a MaaS resource for calculating log probability. | Yes |
| rate | float | The maximum compression rate target to be achieved. Default value is 0.5. | No |
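
The `rate` input bounds how much of the prompt survives compression: a rate of 0.5 targets a compressed prompt of at most roughly half the original tokens. The stdlib-only sketch below illustrates only that budget arithmetic; the actual tool selects which tokens to keep using the LLMLingua algorithm and model log probabilities, not naive truncation.

```python
def target_token_budget(prompt_tokens: int, rate: float = 0.5) -> int:
    """Illustrative only: the token budget implied by a compression rate.

    `rate` is the maximum fraction of original tokens to keep, so a
    1000-token prompt compressed at rate=0.5 targets at most ~500 tokens.
    """
    if not 0.0 < rate <= 1.0:
        raise ValueError("rate must be in (0, 1]")
    return int(prompt_tokens * rate)


print(target_token_budget(1000))       # default rate of 0.5 -> 500
print(target_token_budget(1000, 0.2))  # more aggressive target -> 200
```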

## Outputs

| Return Type | Description |
| ----------- | ----------- |
| string | The resulting compressed prompt. |

## Sample Flows
Find example flows using the `llmlingua-promptflow` package [here](https://github.com/microsoft/promptflow/tree/main/examples/flows/integrations/llmlingua-prompt-compression).

## Contact
Please reach out to the LLMLingua Team (<llmlingua@microsoft.com>) with any issues.
