You can read the docs or look directly at the test files for multiple examples.
OpenHosta is a powerful Python extension designed to seamlessly integrate the semantic capabilities of Large Language Models (LLMs) into traditional development environments, enabling AI-powered function emulation that keeps native Python syntax and paradigms. Its strength lies in its simplicity and flexibility, allowing developers to easily create AI-enhanced applications while maintaining a clean, Pythonic code structure.
OpenHosta can run fully offline with a local model, or use a remote model via API key.
- The future of development is human -
For this project, we have adopted a Code of Conduct to ensure a respectful and inclusive environment for all contributors. Please take a moment to read it.
The simplest use of OpenHosta is adding semantic tests to your code, like this:

```python
from OpenHosta import test

sentence = "You are a nice person."
# You can also try this:
# sentence = "You are a stupid #@!!~uk."

if test(f"this contains an insult: {sentence}"):
    print("The sentence is considered an insult.")
else:
    print("The sentence is not considered an insult.")
# The sentence is not considered an insult.
```

But the most powerful feature of OpenHosta is the `emulate` function, which lets you define a function with a docstring and have OpenHosta implement it for you using AI. `emulate` supports basic Python types, dataclasses, Pydantic models, enums, and images. All of these types can be used as function inputs and outputs, except images, which can only be inputs.
```python
from enum import Enum

from PIL.Image import Image, open

from OpenHosta import emulate

class DocumentType(Enum):
    OLD_BOOK = "old_book"
    ARTICLE = "article"
    REPORT = "report"
    THESIS = "thesis"

def classify_document(page: Image) -> DocumentType:
    """
    This function classifies a document based on the content of the page given as a parameter.

    Arguments:
        page: An image of the document page to classify.

    Returns:
        DocumentType: The type of the document.
    """
    return emulate()
```
```python
import requests

url = r"https://www.inria.fr/sites/default/files/2024-01/A_outil_innovant_caracte%CC%81riser_plantes_1827x1026_bonnier-2.png"
img = open(requests.get(url, stream=True).raw)

result = classify_document(img)
result
# <DocumentType.OLD_BOOK: 'old_book'>
```

For workloads requiring high concurrency (such as web servers or batch processing), you can use `emulate_async` to perform non-blocking LLM calls:
```python
import asyncio

from OpenHosta import emulate_async

async def translate_batch(texts: list[str], target_language: str) -> list[str]:
    """Translates a list of texts into the specified language."""
    # Process multiple LLM calls in parallel!
    return await emulate_async()

# To run it from synchronous code:
# asyncio.run(translate_batch(["Hello World!", "Goodbye!"], "French"))
```

You can install OpenHosta either via pip or via GitHub.
We encourage you to use a virtual environment. You can create one with:
```shell
python -m venv .venv
source .venv/bin/activate  # On Windows use `.venv\Scripts\activate`
```

Then you can install OpenHosta with one of the following commands:

```shell
pip install OpenHosta
```

or

```shell
pip install "git+https://github.com/hand-e-fr/OpenHosta.git"
```

or, for a specific branch:

```shell
pip install "git+https://github.com/hand-e-fr/OpenHosta.git@unstable"  # for the latest unstable version
```

See the full installation guide.
Running OpenHosta is incredibly simple. You can execute models entirely locally (for free and privately) or connect to a remote API (like OpenAI).

Running locally is perfect when privacy is paramount. Ensure you have Ollama installed and run `ollama run qwen3.5:4b` in your terminal.
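As a quick setup sketch, assuming Ollama is installed with its default port (11434):

```shell
# Download the model and open an interactive session (Ctrl+D to exit)
ollama run qwen3.5:4b

# Ollama exposes an OpenAI-compatible API; this should list the model
curl http://localhost:11434/v1/models
```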
```python
from OpenHosta import emulate, OpenAICompatibleModel, config

# 1. Point OpenHosta to your local Ollama instance
local_model = OpenAICompatibleModel(
    model_name="qwen3.5:4b",
    base_url="http://localhost:11434/v1",
    api_key="none"  # Ollama does not require a key
)
config.DefaultModel = local_model

# 2. Emulate your Python function
def translate(text: str, language: str) -> str:
    """Translates the text into the specified language."""
    return emulate()

print(translate("Hello World!", "French"))
# 'Bonjour le monde !'
```

Use a remote API when you want the highest-capability models with zero setup. Set your API credentials via `.env`:
```shell
OPENHOSTA_DEFAULT_MODEL_NAME="gpt-4.1"
OPENHOSTA_DEFAULT_MODEL_API_KEY="put-your-api-key-here"
# OPENHOSTA_DEFAULT_MODEL_BASE_URL="https://api.openai.com/v1"
```

```python
from OpenHosta import emulate

def translate(text: str, language: str) -> str:
    """Translates the text into the specified language."""
    return emulate()

print(translate("Hello World!", "French"))
# 'Bonjour le monde !'
```

Curious? Read our comprehensive Documentation Hub to discover Pydantic integrations, observability tracking, and advanced examples like using glm-ocr locally.
We warmly welcome contributions from the community. Whether you are an experienced developer or a beginner, your contributions are welcome.
If you wish to contribute to this project, please refer to our Contribution Guide and our Code of Conduct.
Browse the existing issues to see if someone is already working on what you have in mind or to find contribution ideas.
This project is licensed under the MIT License. This means you are free to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the software, subject to the following conditions:
- The license text must be included in all copies or substantial portions of the software.
See the LICENSE file for more details.
For further questions or assistance, please refer to our partner hand-e or contact us directly via GitHub.
Authors:
- Emmanuel Batt: Manager and Coordinator, Founder of Hand-e
- William Jolivet: DevOps, SysAdmin
- Léandre Ramos: AI developer
- Merlin Devillard: UX designer, Product Owner
GitHub: https://github.com/hand-e-fr/OpenHosta
Thank you for your interest in our project and your potential contributions!
The OpenHosta Team