
Commit

Merge pull request #98 from nitya/docs/oct-2024-refresh
initial commit / restructure docs
sethjuarez authored Oct 10, 2024
2 parents 9cd965f + fa26918 commit 27c4a4f
Showing 34 changed files with 1,244 additions and 465 deletions.
1 change: 1 addition & 0 deletions .env
Original file line number Diff line number Diff line change
@@ -0,0 +1 @@
AZURE_OPENAI_ENDPOINT=${env:AZURE_OPENAI_ENDPOINT}
4 changes: 2 additions & 2 deletions .vscode/settings.json
Original file line number Diff line number Diff line change
@@ -1,6 +1,6 @@
 {
   "files.exclude": {
-    "**/web": true,
-    "**/runtime": true
+    "**/web": false,
+    "**/runtime": false
   }
 }
4 changes: 0 additions & 4 deletions web/docs/contributing/code/page.mdx

This file was deleted.

4 changes: 0 additions & 4 deletions web/docs/contributing/documentation/page.mdx

This file was deleted.

4 changes: 0 additions & 4 deletions web/docs/contributing/documentation/simple/page.mdx

This file was deleted.

33 changes: 31 additions & 2 deletions web/docs/contributing/page.mdx
Original file line number Diff line number Diff line change
@@ -1,4 +1,33 @@
 ---
 title: Contributing
-index: 4
+index: 5
+authors:
+- bethanyjep
+- nitya
+date: 2024-10-09
+tags:
+- documentation
+- contributing
 ---

<br/>
## About Prompty
[Prompty](https://github.com/microsoft/prompty) is an open-source project from Microsoft that makes it easy for developers to _create, manage, debug, and evaluate_ LLM prompts for generative AI applications. We welcome contributions from the community that help make the technology more useful and usable for developers of all backgrounds. Before you get started, review this page for contributor guidelines.

<br/>
## Code Of Conduct
Read the project's [Code of Conduct](https://github.com/microsoft/prompty/blob/main/CODE_OF_CONDUCT.md) and adhere to it. The project is also governed by the Microsoft Open Source Code of Conduct - [read their FAQ](https://opensource.microsoft.com/codeofconduct/faq/) to learn why the CoC matters and how you can raise concerns or provide feedback.

<br/>
## Providing feedback

Feedback can come in several forms:
- Tell us about a missing feature or enhancement request
- Let us know if you found errors or ambiguity in the documentation
- Report bugs or inconsistent behavior seen with Prompty tools and usage

The easiest way to give us feedback is by [filing an issue](https://github.com/microsoft/prompty/issues/new?template=Blank+issue). **Please check previously logged issues (open and closed) to make sure the topic or bug has not already been raised.** If one already exists, weigh in on that discussion thread to add any additional context of value.

<br/>
## Contributor guidelines
The repository contains both the code and the documentation for the project. Each requires a different set of tools and processes to build and preview outcomes. We hope to document these soon - so check back for **contributor guidelines** that will cover the requirements in more detail.
File renamed without changes
File renamed without changes
File renamed without changes
140 changes: 140 additions & 0 deletions web/docs/getting-started/concepts/page.mdx
Original file line number Diff line number Diff line change
@@ -0,0 +1,140 @@
---
title: Concepts
authors:
- sethjuarez
- wayliums
- cassiebreviu
- bethanyjep
- nitya
date: 2024-10-09
tags:
- getting-started
- documentation
- overview
index: 2
---

_In this section, we cover the core building blocks of Prompty (specification, tooling, and runtime) and walk you through the developer flow and mindset for going from "prompt" to "prototype"_.

<br/>


## 1. Prompty Components

The Prompty implementation consists of three core components - the _specification_ (file format), the _tooling_ (developer experience) and the _runtime_ (executable code). Let's review these briefly.
<br/>

![What is Prompty?](01-what-is-prompty.png)
<br/>


### 1.1 The Prompty Specification

The [Prompty specification](https://github.com/microsoft/prompty/blob/main/Prompty.yaml) defines the core `.prompty` asset file format. We'll look at this in more detail in the [Prompty File Spec](/docs/prompty-file-spec) section of the documentation. For now, click to expand the section below to see a _basic.prompty_ sample and get an intuitive sense for what an asset file looks like.

<details>
<summary> **Learn More**: The `basic.prompty` asset file </summary>

```markdown
---
name: Basic Prompt
description: A basic prompt that uses the GPT-3 chat API to answer questions
authors:
- sethjuarez
- jietong
model:
api: chat
configuration:
api_version: 2023-12-01-preview
azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}
azure_deployment: ${env:AZURE_OPENAI_DEPLOYMENT:gpt-35-turbo}
sample:
firstName: Jane
lastName: Doe
question: What is the meaning of life?
---
system:
You are an AI assistant who helps people find information.
As the assistant, you answer questions briefly, succinctly,
and in a personable manner using markdown and even add some personal flair with appropriate emojis.

# Customer
You are helping {{firstName}} {{lastName}} to find answers to their questions.
Use their name to address them in your responses.

user:
{{question}}
```

</details>
<br/>
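To make the frontmatter-plus-template structure of the asset file concrete, here is a minimal, dependency-free Python sketch. It is illustrative only (the real runtime uses a proper YAML parser and richer templating): it separates a `.prompty`-style document into its YAML frontmatter and templated body, then renders `{{placeholder}}` values from sample data:

```python
import re

# A trimmed-down .prompty-style document (illustrative, not the full sample above).
PROMPTY = """\
---
name: Basic Prompt
description: A basic prompt that answers questions
---
system:
You are an AI assistant who helps people find information.

user:
{{question}}
"""

def split_prompty(text):
    # A .prompty asset is YAML frontmatter between '---' fences,
    # followed by the templated prompt body.
    _, frontmatter, body = text.split("---\n", 2)
    return frontmatter, body.lstrip("\n")

def render(body, data):
    # Substitute {{name}} placeholders with values from `data`.
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(data[m.group(1)]), body)

frontmatter, body = split_prompty(PROMPTY)
prompt = render(body, {"question": "What is the meaning of life?"})
print(prompt)
```

The rendered `prompt` is what ultimately gets sent to the model, with the frontmatter supplying metadata and model configuration.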

### 1.2 The Prompty Tooling

The [Prompty Visual Studio Code Extension](https://marketplace.visualstudio.com/items?itemName=ms-toolsai.prompty) helps you create, manage, and execute your `.prompty` assets - effectively giving you a _playground_ right in your editor to streamline your prompt engineering workflow and speed up your prototype iterations. We'll get hands-on experience with this in the [Tutorials](/docs/tutorials) section. For now, click to expand the section and get an intuitive sense of how this enhances your developer experience.

<details>
<summary> **Learn More**: The Prompty Visual Studio Code Extension </summary>

- [Install the extension](https://marketplace.visualstudio.com/items?itemName=ms-toolsai.prompty) in your Visual Studio Code environment to get the following features out-of-the-box:
- Create a default `basic.prompty` starter asset - then configure models and customize content.
- Create the "pre-configured" starter assets for [GitHub Marketplace Models](https://github.com/marketplace/models) in _serverless_ mode.
- Create "starter code" from the asset for popular frameworks (e.g., LangChain)
- Use the _prompty_ command-line tool to execute a `.prompty` asset and "chat" with your model.
- Use _settings_ to create _named_ model configurations for reuse
- Use toolbar icon to view and switch quickly between named configurations
- View the "runs" history, and drill down into a run with a built-in trace viewer.

</details>
<br/>

### 1.3 The Prompty Runtime

The Prompty Runtime helps you make the transition from _static asset_ (`.prompty` file) to _executable code_ (using a preferred language and framework) that you can test interactively from the command line and integrate seamlessly into end-to-end development workflows for automation. We'll have dedicated documentation on this **soon**. In the meantime, click to expand the section below to learn about the **supported runtimes** available today, and check back for updates on new runtime releases.

<details>
<summary> **Learn More**: Available Prompty Runtimes </summary>

Core runtimes provide the base package needed to run the Prompty asset with code. Prompty currently has two core runtimes, with more support coming.
* [Prompty Core (python)](https://pypi.org/project/prompty/) → Available _in preview_.
* Prompty Core (csharp) → In _active development_.
<br/>

Enhanced runtimes add support for orchestration frameworks, enabling complex workflows with Prompty assets:
* [Prompt flow](https://microsoft.github.io/promptflow/) → Python core
* [LangChain (python)](https://pypi.org/project/langchain-prompty/) → Python core (_experimental_)
* [Semantic Kernel](https://learn.microsoft.com/semantic-kernel/) → C# core

</details>
<br/>
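The steps a core runtime performs can be sketched end-to-end with stand-ins: resolve `${env:...}` configuration values, render the template with data, and hand the finished prompt to a model client. The helper names below (`resolve_config`, `execute`, the echo model) are illustrative assumptions, not the actual `prompty` package API:

```python
import os
import re

def resolve_config(value):
    # Resolve "${env:VAR}" / "${env:VAR:default}" placeholders like those in
    # the asset's `configuration:` block (illustrative, not the official resolver).
    m = re.fullmatch(r"\$\{env:(\w+)(?::([^}]*))?\}", value)
    if not m:
        return value
    return os.environ.get(m.group(1), m.group(2))

def execute(template, data, invoke):
    # A core runtime renders the template with the supplied data, then hands
    # the finished prompt to a model client (`invoke`).
    prompt = re.sub(r"\{\{(\w+)\}\}", lambda m: str(data[m.group(1)]), template)
    return invoke(prompt)

deployment = resolve_config("${env:AZURE_OPENAI_DEPLOYMENT:gpt-35-turbo}")

# Stub model client so the flow runs without a live endpoint.
echo_model = lambda prompt: f"[{deployment} saw {len(prompt)} chars]"

result = execute("user:\n{{question}}", {"question": "What is Prompty?"}, echo_model)
print(result)
```

Swapping `echo_model` for a real client (Azure OpenAI, GitHub Models, etc.) is what distinguishes one runtime configuration from another; the asset itself stays unchanged.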


## 2. Developer Workflow

Prompty is ideal for rapid prototyping and iteration of a new generative AI application, using rich developer tooling and a local development runtime. It fits best into the _ideation_ and _evaluation_ phases of the GenAIOps application lifecycle as shown:

1. **Start** by creating & testing a simple prompt in VS Code
2. **Develop** by iterating config & content, use tracing to debug
3. **Evaluate** prompts with AI assistance, saving results locally or to the cloud

<br/>
![How do we use prompty?](02-build-with-prompty.png)
<br/>


## 3. Developer Mindset

As an AI application developer, you are likely already using a number of tools and frameworks to enhance your developer experience. So, where does Prompty fit into your developer toolchain?

Think of it as a **micro-orchestrator focused on a single LLM invocation** - a step above the basic _API call_, and positioned to support more complex orchestration frameworks above it. With Prompty, you can:
- _configure_ the right model for that specific invocation
- _engineer_ the prompt (system, user, context, instructions) for that request
- _shape_ the data used to "render" the template on execution by the runtime
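A rough sketch of this mindset in code: each step below packages one model invocation - its configuration, prompt template, and data shaping - behind a small function that a larger orchestration can compose. The model names and the `fake_llm` client are hypothetical stand-ins:

```python
import re

def make_step(model_name, template):
    # "Configure" the model and "engineer" the prompt once, up front.
    def step(data, invoke):
        # "Shape" the data into the template at execution time.
        prompt = re.sub(r"\{\{(\w+)\}\}", lambda m: str(data[m.group(1)]), template)
        return invoke(model_name, prompt)
    return step

# Two micro-orchestrated steps, each owning one LLM invocation.
summarize = make_step("gpt-4o-mini", "Summarize: {{text}}")
answer = make_step("gpt-4o", "Answer using: {{context}}")

# Stand-in for a real model client: echoes which model saw which prompt.
fake_llm = lambda model, prompt: f"{model}::{prompt.upper()}"

summary = summarize({"text": "Prompty is an asset format."}, fake_llm)
final = answer({"context": summary}, fake_llm)
print(final)
```

Each step is independently configurable and testable, which is exactly the property that lets frameworks like Prompt flow or LangChain compose Prompty assets into larger workflows.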

<br/>
![Where does this fit?](03-micro-orchestrator-mindset.png)

<br/>
---
<br/>
[Want to Contribute To the Project?](/docs/contributing/) - _Updated Guidance Coming Soon_.