Multi-language code converter contribution
# CodeXchange AI Documentation

This directory contains comprehensive documentation for the CodeXchange AI project.

## Contents

- [Supported Languages](./languages.md) - Details on all supported programming languages
- [Configuration Guide](./configuration.md) - How to configure the application
- [Development Guide](./development.md) - Guide for developers extending the application
- [Contributing Guidelines](./contributing.md) - How to contribute to the project
- [Project Structure](./project_structure.md) - Overview of the codebase architecture
- [Architecture Diagram](./architecture_diagram.md) - Visual representation of the application architecture and component relationships
# Architecture Diagram

This diagram illustrates the architecture and component relationships of the CodeXchange AI application.

> **Note:** For detailed information about the CI/CD pipeline, see [CI/CD Pipeline Architecture](ci_cd_pipeline.md).

## Application Flow Diagram

```mermaid
graph TD
    %% Main Entry Points
    A[run.py] --> B[main.py]
    B --> C[CodeConverterApp]

    %% Core Components
    C --> D[Gradio UI]
    C --> E[AIModelStreamer]
    C --> F[CodeExecutor]
    C --> G[LanguageDetector]
    C --> H[FileHandler]

    %% AI Models
    E --> I[OpenAI GPT]
    E --> J[Anthropic Claude]
    E --> K[Google Gemini]
    E --> L[DeepSeek]
    E --> M[GROQ]

    %% Language Processing
    G --> N[Language Validation]
    F --> O[Code Execution]

    %% File Operations
    H --> P[File Upload/Download]
    H --> Q[ZIP Creation]

    %% Configuration
    R[config.py] --> C
    R --> E
    R --> F

    %% Template
    S[template.j2] --> C

    %% User Interactions
    D --> T[Code Input]
    D --> U[Language Selection]
    D --> V[Model Selection]
    D --> W[Code Conversion]
    D --> X[Code Execution]
    D --> Y[File Download]

    %% Logging
    Z[logger.py] --> C
    Z --> E
    Z --> F
    Z --> G
    Z --> H

    %% Styling
    style A fill:#f9d77e,stroke:#333,stroke-width:2px
    style B fill:#f9d77e,stroke:#333,stroke-width:2px
    style C fill:#f9d77e,stroke:#333,stroke-width:2px
    style D fill:#a8d5ba,stroke:#333,stroke-width:2px
    style E fill:#a8d5ba,stroke:#333,stroke-width:2px
    style F fill:#a8d5ba,stroke:#333,stroke-width:2px
    style G fill:#a8d5ba,stroke:#333,stroke-width:2px
    style H fill:#a8d5ba,stroke:#333,stroke-width:2px
    style I fill:#ffb6c1,stroke:#333,stroke-width:2px
    style J fill:#ffb6c1,stroke:#333,stroke-width:2px
    style K fill:#ffb6c1,stroke:#333,stroke-width:2px
    style L fill:#ffb6c1,stroke:#333,stroke-width:2px
    style M fill:#ffb6c1,stroke:#333,stroke-width:2px
    style R fill:#d0e0e3,stroke:#333,stroke-width:2px
    style S fill:#d0e0e3,stroke:#333,stroke-width:2px
    style Z fill:#d0e0e3,stroke:#333,stroke-width:2px
```

## Component Interaction Sequence

```mermaid
sequenceDiagram
    participant User
    participant UI as Gradio UI
    participant App as CodeConverterApp
    participant AI as AIModelStreamer
    participant Executor as CodeExecutor
    participant Detector as LanguageDetector
    participant Files as FileHandler

    User->>UI: Enter Source Code
    User->>UI: Select Source Language
    User->>UI: Select Target Language
    User->>UI: Select AI Model
    User->>UI: Click Convert

    UI->>App: Request Code Conversion
    App->>Detector: Validate Source Language
    Detector-->>App: Validation Result

    App->>App: Create Prompt from Template
    App->>AI: Send Prompt to Selected Model
    AI-->>App: Stream Converted Code
    App-->>UI: Display Converted Code

    User->>UI: Click Run Original
    UI->>App: Request Code Execution
    App->>Executor: Execute Original Code
    Executor-->>App: Execution Result
    App-->>UI: Display Execution Result

    User->>UI: Click Run Converted
    UI->>App: Request Code Execution
    App->>Executor: Execute Converted Code
    Executor-->>App: Execution Result
    App-->>UI: Display Execution Result

    User->>UI: Click Download
    UI->>App: Request Download
    App->>Files: Create ZIP with Files
    Files-->>App: ZIP File
    App-->>UI: Provide Download Link
    UI-->>User: Download Files
```
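The conversion leg of this sequence can be sketched in a few lines of Python. This is an illustration only: the method names (`validate_language`, `stream_completion`) follow the class diagram below, but the signatures and the `app` attribute names are assumptions, not the actual API.

```python
from typing import Generator


def convert_code(app, source_code: str, source_lang: str,
                 target_lang: str, model: str) -> Generator[str, None, None]:
    """Illustrative sketch of the Convert sequence: validate, prompt, stream."""
    # 1. Validate that the input really is the declared source language.
    if not app.detector.validate_language(source_code, source_lang):
        yield f"Error: code does not appear to be {source_lang}"
        return
    # 2. Create the prompt from the template (template.j2 in the diagram).
    prompt = app.template.render(code=source_code,
                                 source_language=source_lang,
                                 target_language=target_lang)
    # 3. Stream the converted code back to the UI as chunks arrive.
    yield from app.ai_streamer.stream_completion(model, prompt)
```

The generator shape matters: Gradio can consume it directly, so partial output appears in the UI while the model is still responding.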

## Class Diagram

```mermaid
classDiagram
    class CodeConverterApp {
        -AIModelStreamer ai_streamer
        -CodeExecutor executor
        -LanguageDetector detector
        -FileHandler file_handler
        -GradioInterface demo
        +__init__()
        +_setup_environment()
        +_initialize_components()
        +_create_gradio_interface()
        +convert_code()
        +execute_code()
        +download_files()
        +run()
    }

    class AIModelStreamer {
        -OpenAI openai
        -Anthropic claude
        -OpenAI deepseek
        -OpenAI groq
        -GenerativeModel gemini
        +__init__()
        +stream_gpt()
        +stream_claude()
        +stream_gemini()
        +stream_deepseek()
        +stream_groq()
        +stream_completion()
    }

    class CodeExecutor {
        -dict executors
        +__init__()
        +execute()
        +execute_python()
        +execute_javascript()
        +execute_java()
        +execute_cpp()
        +execute_julia()
        +execute_go()
        +execute_ruby()
        +execute_swift()
        +execute_rust()
        +execute_csharp()
        +execute_typescript()
        +execute_r()
        +execute_perl()
        +execute_lua()
        +execute_php()
        +execute_kotlin()
        +execute_sql()
    }

    class LanguageDetector {
        +detect_python()
        +detect_javascript()
        +detect_java()
        +detect_cpp()
        +detect_julia()
        +detect_go()
        +detect_ruby()
        +detect_swift()
        +detect_rust()
        +detect_csharp()
        +detect_typescript()
        +detect_r()
        +detect_perl()
        +detect_lua()
        +detect_php()
        +detect_kotlin()
        +detect_sql()
        +validate_language()
    }

    class FileHandler {
        +create_readme()
        +create_zip()
        +handle_upload()
        +handle_download()
    }

    CodeConverterApp --> AIModelStreamer
    CodeConverterApp --> CodeExecutor
    CodeConverterApp --> LanguageDetector
    CodeConverterApp --> FileHandler
```

## Supported Languages and Models

```mermaid
graph LR
    subgraph "AI Models"
        M1[GPT]
        M2[Claude]
        M3[Gemini]
        M4[DeepSeek]
        M5[GROQ]
    end

    subgraph "Supported Languages"
        L1[Python]
        L2[JavaScript]
        L3[Java]
        L4[C++]
        L5[Julia]
        L6[Go]
        L7[Ruby]
        L8[Swift]
        L9[Rust]
        L10[C#]
        L11[TypeScript]
        L12[R]
        L13[Perl]
        L14[Lua]
        L15[PHP]
        L16[Kotlin]
        L17[SQL]
    end

    subgraph "Fully Implemented"
        L1
        L2
        L3
        L4
        L5
        L6
    end

    subgraph "Template Ready"
        L7
        L8
        L9
        L10
        L11
        L12
        L13
        L14
        L15
        L16
        L17
    end

    style M1 fill:#ffb6c1,stroke:#333,stroke-width:2px
    style M2 fill:#ffb6c1,stroke:#333,stroke-width:2px
    style M3 fill:#ffb6c1,stroke:#333,stroke-width:2px
    style M4 fill:#ffb6c1,stroke:#333,stroke-width:2px
    style M5 fill:#ffb6c1,stroke:#333,stroke-width:2px

    style L1 fill:#a8d5ba,stroke:#333,stroke-width:2px
    style L2 fill:#a8d5ba,stroke:#333,stroke-width:2px
    style L3 fill:#a8d5ba,stroke:#333,stroke-width:2px
    style L4 fill:#a8d5ba,stroke:#333,stroke-width:2px
    style L5 fill:#a8d5ba,stroke:#333,stroke-width:2px
    style L6 fill:#a8d5ba,stroke:#333,stroke-width:2px

    style L7 fill:#d0e0e3,stroke:#333,stroke-width:2px
    style L8 fill:#d0e0e3,stroke:#333,stroke-width:2px
    style L9 fill:#d0e0e3,stroke:#333,stroke-width:2px
    style L10 fill:#d0e0e3,stroke:#333,stroke-width:2px
    style L11 fill:#d0e0e3,stroke:#333,stroke-width:2px
    style L12 fill:#d0e0e3,stroke:#333,stroke-width:2px
    style L13 fill:#d0e0e3,stroke:#333,stroke-width:2px
    style L14 fill:#d0e0e3,stroke:#333,stroke-width:2px
    style L15 fill:#d0e0e3,stroke:#333,stroke-width:2px
    style L16 fill:#d0e0e3,stroke:#333,stroke-width:2px
    style L17 fill:#d0e0e3,stroke:#333,stroke-width:2px
```

**Task Overview:**
Thoroughly analyze the entire codebase and gain a deep understanding of the application flow diagram and the interaction sequence between all components.

**Objective:**
Develop a comprehensive understanding of how different modules interact, their dependencies, and how data flows across the system. This foundational knowledge will be essential for implementing future updates accurately and efficiently without breaking the existing codebase.

**Important Note:**
⚠️ **No code should be written or modified at this stage.** The primary focus is on understanding the architecture, relationships, and dependencies to ensure a seamless update process in subsequent phases.

# CI/CD Pipeline Architecture

This document outlines the continuous integration and continuous deployment (CI/CD) pipeline implemented for CodeXchange AI.

## Pipeline Overview

The CI/CD pipeline automates testing, building, and deployment processes using GitHub Actions. It follows a trunk-based development model with protection for the main branch.

```mermaid
graph TD
    Code[Developer Code] --> PR[Pull Request]
    PR --> Tests[Automated Tests]
    Tests --> DockerBuild[Docker Build & Validation]
    DockerBuild --> |On merge to develop| StagingDeploy[Staging Deployment]
    StagingDeploy --> |On release creation| ProdDeploy[Production Deployment]

    %% Pipeline Components
    subgraph "CI Pipeline"
        Tests
        DockerBuild
    end

    subgraph "CD Pipeline"
        StagingDeploy
        ProdDeploy
    end
```

## Workflow Components

### 1. Python Testing Workflow

The testing workflow (`python-test.yml`) performs:
- Syntax validation and linting with flake8
- Unit and integration tests with pytest
- Code coverage analysis
- Matrix testing across Python 3.9, 3.10, and 3.11

### 2. Docker Build Workflow

The Docker workflow (`docker-build.yml`) performs:
- Dockerfile validation
- Container image building
- Security scanning with Trivy
- Vulnerability assessment

### 3. Staging Deployment

The staging workflow (`deploy-staging.yml`):
- Triggered on pushes to the develop branch
- Builds and tags Docker images with branch name and commit hash
- Pushes images to Docker Hub
- Deploys to the staging environment

### 4. Production Deployment

The production workflow (`deploy-production.yml`):
- Triggered on release publication
- Builds and tags Docker images with semantic version
- Pushes images to Docker Hub with version tags
- Deploys to the production environment with approval gate

## Security Considerations

- Sensitive credentials stored as GitHub Secrets
- Docker Hub authentication using access tokens, not passwords
- Security scanning for both code and container images
- Protected branches requiring CI checks to pass before merging

## Development Workflow

1. Create feature branch from develop
2. Implement changes with tests
3. Open pull request to develop
4. Pass all CI checks
5. Merge to develop (triggers staging deployment)
6. Create release for production deployment

## Required Repository Secrets

The following secrets must be configured in the GitHub repository:

| Secret Name | Purpose |
|-------------|---------|
| `DOCKERHUB_USERNAME` | Docker Hub username for image publishing |
| `DOCKERHUB_TOKEN` | Docker Hub authentication token |

# Configuration Guide

## Environment Variables

Create a `.env` file in the root directory:

```bash
OPENAI_API_KEY=your_openai_key_here
ANTHROPIC_API_KEY=your_anthropic_key_here
GOOGLE_API_KEY=your_google_key_here
DEEPSEEK_API_KEY=your_deepseek_key_here
GROQ_API_KEY=your_groq_key_here
PORT=7860 # Optional, default port for the web interface
```
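At startup the application needs these variables in its environment; a loader such as python-dotenv typically populates `os.environ` from the `.env` file first. The helper below is a minimal sketch of that read step, not the project's actual code: the function name and return shape are assumptions, though the variable names match the example above.

```python
import os

REQUIRED_KEYS = ("OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GOOGLE_API_KEY",
                 "DEEPSEEK_API_KEY", "GROQ_API_KEY")


def load_settings(env: dict) -> tuple:
    """Collect the API keys plus the optional PORT, defaulting to 7860."""
    keys = {name: env.get(name) for name in REQUIRED_KEYS}  # missing -> None
    port = int(env.get("PORT", "7860"))
    return keys, port


# e.g. keys, port = load_settings(dict(os.environ))
```

Missing keys come back as `None`, so the application can disable the corresponding model rather than crash at import time.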

## Model Configuration

Model names are configured in `src/ai_code_converter/config.py`:

```python
# Model configurations
OPENAI_MODEL = "gpt-4o-mini"  # OpenAI model name
CLAUDE_MODEL = "claude-3-sonnet-20240307"  # Anthropic Claude model
DEEPSEEK_MODEL = "deepseek-chat"  # DeepSeek model
GEMINI_MODEL = "gemini-1.5-flash"  # Google Gemini model
GROQ_MODEL = "llama3-70b-8192"  # GROQ model
```

You can modify these values to use different model versions based on your requirements and API access.

## Prerequisites

The following dependencies are required for full functionality:

- Python 3.10+
- Node.js and npm (with TypeScript)
- Java JDK 17+
- Julia 1.9+
- Go 1.21+
- GCC/G++
- Perl
- Lua 5.3+
- PHP
- R
- Ruby
- Rust (rustc and cargo)
- Mono (for C#)
- Swift 5.9+
- Kotlin
- SQLite3

Using Docker is recommended as it includes all necessary dependencies.

## Docker Configuration

The application includes Docker support for easy deployment. The `docker-compose.yml` file defines the service configuration:

```yaml
version: '3'
services:
  ai_code_converter:
    build: .
    ports:
      - "${PORT:-7860}:7860"
    volumes:
      - ./logs:/app/logs
    env_file:
      - .env
    restart: unless-stopped
```

You can customize the port mapping and volume mounts as needed.

## Application Settings

Additional application settings can be configured in `src/ai_code_converter/config.py`:

- UI theme and styling
- Default language selections
- Model temperature settings
- Execution timeouts
- Logging configuration
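
As a rough illustration, such settings might appear in `config.py` as plain module-level constants. The names and values below are hypothetical examples, not the actual contents of the file:

```python
# Hypothetical examples of application settings; the real names in
# config.py may differ.
TEMPERATURE = 0.7               # model temperature for conversions
EXECUTION_TIMEOUT = 30          # seconds before code execution is aborted
DEFAULT_SOURCE_LANGUAGE = "Python"
DEFAULT_TARGET_LANGUAGE = "JavaScript"
LOG_LEVEL = "INFO"              # logging verbosity
```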

# Contributing Guidelines

Thank you for your interest in contributing to the CodeXchange AI project! This document provides guidelines and instructions for contributing to the project.

## How to Contribute

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

## Development Workflow

1. Check the issues page for open tasks or create a new issue for the feature/bug you want to work on
2. Assign yourself to the issue
3. Implement your changes following the best practices outlined below
4. Write tests for your changes
5. Update documentation as needed
6. Submit a pull request referencing the issue

## Best Practices

### Code Style

- Follow existing patterns for consistency
- Follow PEP 8 style guidelines for Python code
- Use descriptive variable and function names
- Add type hints for all new functions
- Keep functions small and focused on a single responsibility
- Use docstrings for all public functions and classes

### Error Handling

- Add comprehensive error handling
- Use specific exception types
- Provide helpful error messages
- Log errors with appropriate context

### Logging

- Include detailed logging
- Use the existing logging framework
- Log at appropriate levels (DEBUG, INFO, WARNING, ERROR)
- Include relevant context in log messages

### Documentation

- Update documentation for any changes
- Document new features, configuration options, and APIs
- Keep the README and docs directory in sync
- Use clear, concise language

### Testing

- Write unit tests for new functionality
- Ensure all tests pass before submitting a PR
- Test edge cases and error conditions
- Aim for good test coverage

## Pull Request Process

1. Ensure your code follows the style guidelines
2. Update documentation as needed
3. Include tests for new functionality
4. Link the PR to any related issues
5. Wait for code review and address any feedback

## Code Review

All submissions require review. We use GitHub pull requests for this purpose:

1. Reviewers will check code quality, test coverage, and documentation
2. Address any comments or requested changes
3. Once approved, maintainers will merge your PR

## Acknowledgments

We would like to thank the following organizations and projects that make CodeXchange AI possible:

- OpenAI for GPT models
- Anthropic for Claude
- Google for Gemini
- DeepSeek and GROQ for their AI models
- The Gradio team for the web interface framework

## License

By contributing to this project, you agree that your contributions will be licensed under the project's MIT License.

# Development Guide

This guide provides instructions for extending the CodeXchange AI application with new languages and AI models.

Before diving into development, it's recommended to review the [Architecture Diagram](./architecture_diagram.md) to understand the component relationships and application flow.

## Adding New Programming Languages

1. Update Language Configuration (`config.py`):

```python
SUPPORTED_LANGUAGES = [..., "NewLanguage"]
LANGUAGE_MAPPING = {..., "NewLanguage": "language_highlight_name"}
```

2. Add Language Detection (`core/language_detection.py`):

```python
class LanguageDetector:
    @staticmethod
    def detect_new_language(code: str) -> bool:
        patterns = [r'pattern1', r'pattern2', r'pattern3']
        return any(re.search(pattern, code) for pattern in patterns)
```

3. Add Execution Support (`core/code_execution.py`):

```python
def execute_new_language(self, code: str) -> tuple[str, Optional[bytes]]:
    with tempfile.NamedTemporaryFile(suffix='.ext', mode='w', delete=False) as f:
        f.write(code)
        file_path = f.name
    try:
        result = subprocess.run(
            ["compiler/interpreter", file_path],
            capture_output=True,
            text=True,
            check=True
        )
        return result.stdout, None
    except Exception as e:
        return f"Error: {str(e)}", None
    finally:
        os.unlink(file_path)
```

4. Update the Dockerfile:

```dockerfile
# Add necessary dependencies for the new language
RUN apt-get update && apt-get install -y --no-install-recommends \
    new-language-package \
    && rm -rf /var/lib/apt/lists/*

# Verify the installation
RUN echo "New Language: $(new-language --version 2>/dev/null || echo 'NOT VERIFIED')"
```

5. Update the UI Components in `app.py`:

```python
def _initialize_components(self):
    # Add the new language to dropdown options
    self.source_language = gr.Dropdown(
        choices=["Python", "JavaScript", ..., "NewLanguage"],
        ...
    )
    self.target_language = gr.Dropdown(
        choices=["Python", "JavaScript", ..., "NewLanguage"],
        ...
    )
```

6. Add Language-Specific Instructions in `template.j2`:

```jinja
{% if target_language == "NewLanguage" %}
# NewLanguage-specific conversion instructions
- Follow NewLanguage best practices
- Use idiomatic NewLanguage patterns
- Handle NewLanguage-specific edge cases
{% endif %}
```

## Adding New AI Models

1. Update Model Configuration (`config.py`):

```python
NEW_MODEL = "model-name-version"
MODELS = [..., "NewModel"]
```

2. Add Model Integration (`models/ai_streaming.py`):

```python
def stream_new_model(self, prompt: str) -> Generator[str, None, None]:
    try:
        response = self.new_model_client.generate(
            prompt=prompt,
            stream=True
        )
        reply = ""
        for chunk in response:
            fragment = chunk.text
            reply += fragment
            yield reply
    except Exception as e:
        logger.error(f"New Model API error: {str(e)}", exc_info=True)
        yield f"Error with New Model API: {str(e)}"
```

3. Add API Client Initialization:

```python
def __init__(self, api_keys: dict):
    # Initialize existing clients
    ...

    # Initialize new model client
    if "NEW_MODEL_API_KEY" in api_keys:
        self.new_model_client = NewModelClient(
            api_key=api_keys["NEW_MODEL_API_KEY"]
        )
```

4. Update the Model Selection Logic:

```python
def stream_completion(self, model: str, prompt: str) -> Generator[str, None, None]:
    if model == "NewModel":
        yield from self.stream_new_model(prompt)
    else:
        # Existing model handling
        ...
```

## Testing

1. Add test cases for new components:
   - Unit tests for language detection
   - Integration tests for code execution
   - End-to-end tests for UI components

2. Test language detection with sample code:
   - Positive examples (valid code in the target language)
   - Negative examples (code from other languages)
   - Edge cases (minimal valid code snippets)

3. Test code execution with various examples:
   - Simple "Hello World" programs
   - Programs with external dependencies
   - Programs with different runtime characteristics
   - Error handling cases

4. Test model streaming with different prompts:
   - Short prompts
   - Long prompts
   - Edge cases (empty prompts, very complex code)
   - Error handling

5. Verify error handling and edge cases:
   - API rate limiting
   - Network failures
   - Invalid inputs
   - Resource constraints
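
Item 2 above can be sketched as a small pytest-style test module. The detector here is a toy stand-in (the real patterns live in `core/language_detection.py`), and `detect_ruby` with these three regexes is a hypothetical example:

```python
import re


def detect_ruby(code: str) -> bool:
    """Toy detector for illustration only; not the project's actual patterns."""
    patterns = [r'\bdef\s+\w+', r'\bputs\b', r'\bend\b']
    return any(re.search(p, code) for p in patterns)


def test_positive_example():
    # Valid code in the target language should be detected.
    assert detect_ruby('def greet\n  puts "hi"\nend')


def test_negative_example():
    # Code from another language (C here) should not match.
    assert not detect_ruby('printf("hi\\n");')


def test_edge_case_minimal_snippet():
    # A minimal valid snippet should still be detected.
    assert detect_ruby('puts 1')
```

Keeping one test per category (positive, negative, edge case) makes a failing detector pattern easy to localize.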

## Logging

The application uses a structured logging system:

- JSON formatted logs with timestamps
- Stored in the `logs` directory
- Separate console and file logging
- Detailed execution metrics

To add logging for new components:

```python
import logging

logger = logging.getLogger(__name__)

def new_function():
    try:
        logger.info("Starting operation", extra={"component": "new_component"})
        # Function logic
        logger.info("Operation completed", extra={"component": "new_component", "metrics": {...}})
    except Exception as e:
        logger.error(f"Error: {str(e)}", exc_info=True, extra={"component": "new_component"})
```

# Supported Languages

CodeXchange AI currently supports the following programming languages:

## Language Support Table

| Language   | Execution Method                   | File Extension |
|------------|------------------------------------|----------------|
| Python     | Direct execution in restricted env | .py            |
| JavaScript | Node.js                            | .js            |
| Java       | javac + java                       | .java          |
| C++        | g++ + executable                   | .cpp           |
| Julia      | julia                              | .jl            |
| Go         | go run                             | .go            |
| Ruby       | ruby                               | .rb            |
| Swift      | swift                              | .swift         |
| Rust       | rustc + executable                 | .rs            |
| C#         | csc (Mono)                         | .cs            |
| TypeScript | tsc + node                         | .ts            |
| R          | Rscript                            | .R             |
| Perl       | perl                               | .pl            |
| Lua        | lua5.3                             | .lua           |
| PHP        | php                                | .php           |
| Kotlin     | kotlinc + kotlin                   | .kt            |
| SQL        | sqlite3                            | .sql           |

## Currently Implemented Languages

While the application has templates and instructions for all the languages listed above, the following languages are currently fully implemented with language detection and execution support:

- Python
- JavaScript
- Java
- C++
- Julia
- Go

## Language-Specific Notes

### Python
- Executed directly in a restricted environment
- Supports most standard libraries
- Execution timeout: 30 seconds

### JavaScript
- Executed using Node.js
- Supports ES6+ features
- No external npm packages are installed during execution

### Java
- Requires a class with a main method
- Class name must match filename
- Compiled with javac before execution

### C++
- Compiled with g++
- Standard C++17 support
- Execution timeout: 30 seconds

### Julia
- Executed with the julia interpreter
- Supports Julia 1.9+
- Limited package support during execution

### Go
- Executed with go run
- Supports Go 1.21+
- Standard library support only
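
The 30-second execution timeouts mentioned in the notes above can be enforced with `subprocess`. This is a sketch of the general pattern, not the project's actual executor; the command list is a placeholder for whichever interpreter or compiled binary a language uses:

```python
import subprocess


def run_with_timeout(cmd, timeout: int = 30) -> str:
    """Run an interpreter/executable, capping wall-clock time at `timeout` seconds."""
    try:
        result = subprocess.run(cmd, capture_output=True, text=True,
                                timeout=timeout)
        # Surface stderr when the program fails, stdout otherwise.
        return result.stdout if result.returncode == 0 else result.stderr
    except subprocess.TimeoutExpired:
        return f"Error: execution exceeded {timeout} seconds"


# e.g. run_with_timeout(["python3", "script.py"])
```

`subprocess.run` kills the child process when `TimeoutExpired` is raised, so a runaway program cannot hold the UI indefinitely.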

# Project Structure

This document provides an overview of the CodeXchange AI project structure and architecture.

For a visual representation of the application architecture and component relationships, please refer to the [Architecture Diagram](./architecture_diagram.md).

## Directory Structure

```
ai_code_converter/
├── README.md              # Project documentation
├── requirements.txt       # Python dependencies
├── run.py                 # Application entry point
├── Dockerfile             # Docker configuration
├── docker-compose.yml     # Docker Compose configuration
├── .env.example           # Environment variables template
├── docs/                  # Detailed documentation
└── src/                   # Source code directory
    └── ai_code_converter/
        ├── main.py        # Main application logic
        ├── app.py         # Gradio interface setup
        ├── config.py      # Model and app configuration
        ├── template.j2    # Prompt template
        ├── core/          # Core functionality
        │   ├── __init__.py
        │   ├── language_detection.py   # Language validation
        │   └── code_execution.py       # Code execution
        ├── models/        # AI model integration
        │   ├── __init__.py
        │   └── ai_streaming.py         # API clients and streaming
        └── utils/         # Utility functions
            ├── __init__.py
            └── logging.py # Logging configuration
```

## Component Descriptions

### Entry Points

- **run.py**: The main entry point for the application. It imports and initializes the necessary modules and starts the application.

- **main.py**: Contains the main application logic, initializes the application components, and starts the Gradio interface.

### Core Components

- **app.py**: Sets up the Gradio interface and defines UI components. Contains the `CodeConverterApp` class that handles the UI and code conversion logic.

- **config.py**: Contains configuration for models, languages, and application settings. Defines supported languages, model names, and UI styling.

- **template.j2**: A Jinja2 template used to create prompts for the LLMs with language-specific instructions for code conversion.

### Core Directory

The `core` directory contains modules for core functionality:

- **language_detection.py**: Contains the `LanguageDetector` class with static methods to validate if code matches the expected language patterns.

- **code_execution.py**: Handles the execution of code in different programming languages. Contains language-specific execution methods.

### Models Directory

The `models` directory contains modules for AI model integration:

- **ai_streaming.py**: Handles API calls to various LLMs (GPT, Claude, Gemini, DeepSeek, GROQ). Contains methods for streaming responses from different AI providers.

### Utils Directory

The `utils` directory contains utility modules:

- **logging.py**: Configures the logging system for the application. Sets up console and file handlers with appropriate formatting.

## Application Flow

1. **Initialization**:
   - `run.py` imports the main module
   - `main.py` initializes the application components
   - `app.py` sets up the Gradio interface

2. **User Interaction**:
   - User selects source and target languages
   - User enters or uploads source code
   - User selects AI model and temperature
   - User clicks "Convert"

3. **Code Conversion**:
   - Source language is validated using `language_detection.py`
   - Prompt is created using `template.j2`
   - AI model is called using `ai_streaming.py`
   - Response is streamed back to the UI

4. **Code Execution** (optional):
   - User clicks "Run" on original or converted code
   - Code is executed using appropriate method in `code_execution.py`
   - Output is displayed in the UI

## Design Patterns

- **Singleton Pattern**: Used for API clients to ensure only one instance exists
- **Factory Pattern**: Used for creating language-specific execution methods
- **Strategy Pattern**: Used for selecting the appropriate AI model
- **Observer Pattern**: Used for streaming responses from AI models
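
The strategy pattern for model selection can be illustrated with a plain dispatch table. The streamer method names come from the class diagram in the architecture document, but this factory function and its use of a dict are an illustrative sketch, not the project's actual implementation:

```python
def make_dispatch(streamer) -> dict:
    """Map UI model names to the streaming strategy that handles them."""
    return {
        "GPT": streamer.stream_gpt,
        "Claude": streamer.stream_claude,
        "Gemini": streamer.stream_gemini,
        "DeepSeek": streamer.stream_deepseek,
        "GROQ": streamer.stream_groq,
    }


# e.g. dispatch = make_dispatch(ai_streamer); dispatch[model](prompt)
```

Adding a model then means adding one entry here plus its `stream_*` method, rather than growing an if/elif chain.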

## Dependencies

- **Gradio**: Web interface framework
- **Jinja2**: Template engine for creating prompts
- **OpenAI, Anthropic, Google, DeepSeek, GROQ APIs**: AI model providers
- **Various language interpreters and compilers**: For code execution