Documents all required and optional env vars for inference endpoint configuration, LLM settings, CORS, and SSL verification.
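A configuration file covering those categories might look like the sketch below. Aside from INFERENCE_PROVIDER (which appears in the test plan) and Ollama's default port 11434, every variable name and value here is an illustrative assumption, not copied from the actual .env.example.

```ini
# Illustrative sketch only -- names other than INFERENCE_PROVIDER are assumptions.
INFERENCE_PROVIDER=ollama                              # "ollama" for local, or a remote provider
INFERENCE_ENDPOINT=http://host.docker.internal:11434   # remote URL, or host-native Ollama
INFERENCE_API_KEY=changeme                             # token for GenAI/APISIX gateways
LLM_MODEL=codellama:7b                                 # model served by the endpoint
CORS_ORIGINS=http://localhost:3000                     # allowed browser origins
SSL_VERIFY=true                                        # disable only for self-signed endpoints
```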
Implements code translation across Java, C, C++, Python, Rust, and Go via CodeLlama inference endpoints. Includes PDF code extraction, token-based auth for GenAI/APISIX gateways, input validation, and a health check.
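The validation and prompt-construction steps described above could be sketched as follows. This is a minimal illustration, assuming a simple language allowlist and size limit; all function names, the character limit, and the prompt wording are assumptions, not taken from the PR's actual code.

```python
# Sketch of request validation and prompt construction for a
# CodeLlama-backed translation API. All names here are illustrative
# assumptions, not taken from the PR's actual implementation.

SUPPORTED_LANGUAGES = {"java", "c", "c++", "python", "rust", "go"}
MAX_INPUT_CHARS = 20_000  # assumed limit; the real limit is not stated in the PR


def validate_request(code: str, source_lang: str, target_lang: str) -> None:
    """Raise ValueError for unsupported languages or bad input."""
    for lang in (source_lang, target_lang):
        if lang.lower() not in SUPPORTED_LANGUAGES:
            raise ValueError(f"unsupported language: {lang}")
    if source_lang.lower() == target_lang.lower():
        raise ValueError("source and target languages must differ")
    if not code.strip():
        raise ValueError("empty code submission")
    if len(code) > MAX_INPUT_CHARS:
        raise ValueError(f"input exceeds {MAX_INPUT_CHARS} characters")


def build_prompt(code: str, source_lang: str, target_lang: str) -> str:
    """Build a single-turn instruction prompt for the inference endpoint."""
    return (
        f"Translate the following {source_lang} code to {target_lang}. "
        f"Return only the translated {target_lang} code, no explanation.\n\n"
        f"{code}"
    )
```

Validating before calling the inference endpoint keeps malformed requests from consuming model time and gives the UI immediate, specific error messages.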
Side-by-side code editor with language pill selectors (6 languages), PDF drag-and-drop upload, real-time character counter, dark mode with localStorage persistence, and copy-to-clipboard. Built with Vite, Tailwind CSS, and served via Nginx.
Wires transpiler-api (port 5001) and transpiler-ui (port 3000→8080) on a shared network. Nginx proxies /api/ to the backend. Supports both remote inference and local Ollama via host.docker.internal.
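A compose file consistent with that wiring might look like the sketch below. The service names, ports, and host.docker.internal usage come from the description above; the build contexts, network name, and extra_hosts mapping are assumptions.

```yaml
# Minimal sketch, not the PR's actual docker-compose.yml.
services:
  transpiler-api:
    build: ./api
    ports:
      - "5001:5001"
    extra_hosts:
      - "host.docker.internal:host-gateway"   # lets the container reach host-native Ollama
    networks:
      - transpiler-net

  transpiler-ui:
    build: ./ui
    ports:
      - "3000:8080"   # host 3000 -> Nginx 8080; Nginx proxies /api/ to transpiler-api
    depends_on:
      - transpiler-api
    networks:
      - transpiler-net

networks:
  transpiler-net:
```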
README covers architecture, prerequisites, quick-start deployment, environment configuration, and UI usage. TROUBLESHOOTING covers common Docker, inference endpoint, and CORS issues.
…port
- Rename project title to CodeTrans and update clone URL to cld2labs/CodeTrans
- Document dual inference paths: remote OpenAI-compatible APIs and local Ollama
- Explain Ollama host-native requirement for Metal GPU acceleration on macOS
- Update architecture diagram and service names (transpiler-api, transpiler-ui)
- Replace duplicate .env blocks with a single cp .env.example .env quick-start
- Add key settings reference table and Ollama model pull commands
- Update log command service names and validated models table
- Remove stale opea-project references and broken image embed
- Add .github/workflows/code-scans.yaml (Trivy + Bandit SDLE scans)
- Add CONTRIBUTING, DISCLAIMER, LICENSE, TERMS_AND_CONDITIONS docs
- Add docs/assets company header image
- Extend .gitignore with testing artifacts, Audify local dir, and pytest/coverage entries
Force-pushed from 6511db7 to a88fc01
Summary
- .gitignore covering Python, Node, IDE, env secrets, mypy cache, bandit outputs, and local working files
- .env.example documenting all configuration options for both remote and Ollama inference
- Backend API (api/) with dual inference paths, PDF extraction, and health check
- Frontend UI (ui/) with dark mode, pill language selectors, and PDF upload
- Docker Compose wiring transpiler-api and transpiler-ui via Nginx

Test plan
- cp .env.example .env, fill in inference credentials, run docker compose up --build
- ollama pull codellama:7b, set INFERENCE_PROVIDER=ollama, retranslate
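The test-plan steps above can be sketched as a shell session. The commands cp .env.example .env, docker compose up --build, and ollama pull codellama:7b come straight from the plan; the health-check path and port usage in the curl lines are assumptions about the endpoints described earlier.

```shell
# Quick-start sketch -- curl paths are assumptions, not confirmed endpoints.
cp .env.example .env            # then fill in inference credentials
docker compose up --build       # starts transpiler-api (5001) and transpiler-ui (3000)

# For the local-inference path (Ollama must run host-native for Metal GPU on macOS):
ollama pull codellama:7b
# set INFERENCE_PROVIDER=ollama in .env, then restart:
docker compose up --build

# Smoke checks (paths assumed):
curl http://localhost:5001/health   # backend health check
open http://localhost:3000          # UI, proxied to the API via Nginx /api/
```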