# Instructor, Generating Structure from LLMs

Structured outputs powered by LLMs. Designed for simplicity, transparency, and control.

---

Instructor makes it easy to reliably get structured data like JSON from Large Language Models (LLMs) like GPT-3.5, GPT-4, and GPT-4-Vision, as well as open source models like Mistral/Mixtral from Together, Anyscale, Ollama, and llama-cpp-python.

By leveraging various modes like Function Calling and Tool Calling, and even constrained sampling modes like JSON mode and JSON Schema, Instructor stands out for its simplicity, transparency, and user-centric design. We leverage Pydantic to do the heavy lifting, and we've built a simple, easy-to-use API on top of it by helping you manage validation context, retries with Tenacity, and streaming Lists and Partial responses. We also provide libraries in TypeScript, Elixir and PHP.

## Why use Instructor?

The question of whether to use Instructor is fundamentally a question of why to use Pydantic.

1. Powered by type hints — Instructor is powered by Pydantic, which is powered by type hints. Schema validation and prompting are controlled by type annotations; there is less to learn, less code to write, and it integrates with your IDE.
2. Customizable — Pydantic is highly customizable. You can define your own validators, custom error messages, and more (a short sketch appears just before the examples below).
3. Ecosystem — Pydantic is the most widely used data validation library for Python, with over 100M downloads a month. It's used by FastAPI, Typer, and many other popular libraries.

## Getting Started

```shell
pip install -U instructor
```

If you ever get stuck, you can always run `instructor docs` to open the documentation in your browser. It even supports searching for specific topics.

```shell
instructor docs [QUERY]
```

You can also check out our cookbooks and concepts to learn more about how to use Instructor.
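Before seeing Instructor in action, here is roughly what points 1 and 2 above look like in plain Pydantic, without any LLM involved. This is a minimal sketch; the model, the validator rule, and the values are made up for illustration.

```python
from pydantic import BaseModel, ValidationError, field_validator


class UserInfo(BaseModel):
    name: str
    age: int  # the type annotation alone enforces that age is an integer

    @field_validator("age")
    @classmethod
    def age_must_be_non_negative(cls, v: int) -> int:
        # A custom validator with a custom error message.
        if v < 0:
            raise ValueError("age must be non-negative")
        return v


try:
    UserInfo(name="John Doe", age=-5)
except ValidationError as e:
    print(e)  # reports which field failed and why
```

Instructor builds on exactly this machinery: the same model that validates your data also describes the schema the LLM is asked to fill.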
Now, let's see Instructor in action with a simple example:

## Using OpenAI

```python
import instructor
from pydantic import BaseModel
from openai import OpenAI


# Define your desired output structure
class UserInfo(BaseModel):
    name: str
    age: int


# Patch the OpenAI client
client = instructor.from_openai(OpenAI())

# Extract structured data from natural language
user_info = client.chat.completions.create(
    model="gpt-3.5-turbo",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)

print(user_info.name)
#> John Doe
print(user_info.age)
#> 30
```

## Using Anthropic

```python
import instructor
from anthropic import Anthropic
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


client = instructor.from_anthropic(Anthropic())

# note that client.chat.completions.create will also work
resp = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Extract Jason is 25 years old.",
        }
    ],
    response_model=User,
)

assert isinstance(resp, User)
assert resp.name == "Jason"
assert resp.age == 25
```

## Using Litellm

```python
import instructor
from litellm import completion
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


client = instructor.from_litellm(completion)

resp = client.chat.completions.create(
    model="claude-3-opus-20240229",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Extract Jason is 25 years old.",
        }
    ],
    response_model=User,
)

assert isinstance(resp, User)
assert resp.name == "Jason"
assert resp.age == 25
```

## Correct Typing

This was the dream of Instructor, but due to the patching of openai, it wasn't possible for me to get typing to work well. Now, with the new client, we can get typing to work well! We've also added a few `create_*` methods to make it easier to create iterables and partials, and to access the original completion.

## Calling create

```python
import openai
import instructor
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


client = instructor.from_openai(openai.OpenAI())

user = client.chat.completions.create(
    model="gpt-4-turbo-preview",
    messages=[
        {"role": "user", "content": "Create a user"},
    ],
    response_model=User,
)
```

Now if you use an IDE, you can see that the type is correctly inferred.

## Handling async: await create

This will also work correctly with asynchronous clients.

```python
import openai
import instructor
from pydantic import BaseModel

client = instructor.from_openai(openai.AsyncOpenAI())


class User(BaseModel):
    name: str
    age: int


async def extract():
    return await client.chat.completions.create(
        model="gpt-4-turbo-preview",
        messages=[
            {"role": "user", "content": "Create a user"},
        ],
        response_model=User,
    )
```

Notice that because we simply return the result of the create call, the `extract()` function will return the correct `User` type.

## Returning the original completion: create_with_completion

You can also return the original completion object.

```python
import openai
import instructor
from pydantic import BaseModel

client = instructor.from_openai(openai.OpenAI())


class User(BaseModel):
    name: str
    age: int


user, completion = client.chat.completions.create_with_completion(
    model="gpt-4-turbo-preview",
    messages=[
        {"role": "user", "content": "Create a user"},
    ],
    response_model=User,
)
```

## Streaming Partial Objects: create_partial

In order to handle streams, we still support `Iterable[T]` and `Partial[T]`, but to simplify the type inference, we've added `create_iterable` and `create_partial` methods as well!

```python
import openai
import instructor
from pydantic import BaseModel

client = instructor.from_openai(openai.OpenAI())


class User(BaseModel):
    name: str
    age: int


user_stream = client.chat.completions.create_partial(
    model="gpt-4-turbo-preview",
    messages=[
        {"role": "user", "content": "Create a user"},
    ],
    response_model=User,
)

for user in user_stream:
    print(user)
    #> name=None age=None
    #> name='John' age=None
    #> name='John Doe' age=None
    #> name='John Doe' age=30
```

Notice now that the inferred type is `Generator[User, None]`.

## Streaming Iterables: create_iterable

We get an iterable of objects when we want to extract multiple objects.

```python
import openai
import instructor
from pydantic import BaseModel

client = instructor.from_openai(openai.OpenAI())


class User(BaseModel):
    name: str
    age: int


users = client.chat.completions.create_iterable(
    model="gpt-4-turbo-preview",
    messages=[
        {"role": "user", "content": "Create 2 users"},
    ],
    response_model=User,
)

for user in users:
    print(user)
    #> name='Alice' age=30
    #> name='Bob' age=25
```

## Validation

You can also use Pydantic to validate your outputs and get the LLM to retry on failure. Check out our docs on retrying and validation context.
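As a rough sketch of how this fits together, the example below re-asks the model when a field-level validator fails; the validator's error message is sent back to the LLM so it can correct itself. The uppercase rule, the prompt, and the retry count are illustrative, and the setup assumes the OpenAI client from the earlier examples.

```python
import instructor
import openai
from pydantic import BaseModel, field_validator

client = instructor.from_openai(openai.OpenAI())


class User(BaseModel):
    name: str
    age: int

    @field_validator("name")
    @classmethod
    def name_must_be_uppercase(cls, v: str) -> str:
        # On failure, this message is fed back to the LLM on the retry.
        if v != v.upper():
            raise ValueError("name must be spelled in uppercase, e.g. 'JASON'")
        return v


user = client.chat.completions.create(
    model="gpt-4-turbo-preview",
    response_model=User,
    max_retries=2,  # re-ask the model if validation fails
    messages=[{"role": "user", "content": "Extract: jason is 25 years old"}],
)

assert user.name == "JASON"
```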
## More Examples

If you'd like to see more, check out our cookbook.

Installing Instructor is a breeze. Just run `pip install instructor`.

## Contributing

If you want to help out, check out some of the issues marked as good-first-issue or help-wanted, found here. They could be anything from code improvements, a guest blog post, or a new cookbook.

## License

This project is licensed under the terms of the MIT License.