AI-powered code review assistant that enhances pull request analysis with intelligent insights and automated suggestions.
- AI-Powered Analysis: Integrates with Claude and Azure AI Foundry for intelligent code review
- Multi-Platform Support: Works with GitHub and Azure DevOps
- Real-time Processing: Asynchronous job processing with background services
- Enterprise Ready: Clean Architecture, comprehensive testing, and production-grade features
- Rich Analytics: Detailed analysis reports with metrics and telemetry
- CLI Integration: Command-line tool for CI/CD pipeline integration
- RESTful API: Complete REST API with OpenAPI documentation
- Quick Start
- Installation
- Usage
- API Reference
- Configuration
- Architecture
- Development
- Git Workflow
- Release Process
- Contributing
- License
- .NET 10 SDK
- Visual Studio 2022 or VS Code
- Docker Desktop (for Aspire orchestration)
git clone https://github.com/your-org/lintellect.git
cd lintellect

The application is designed to run with .NET Aspire for an optimal development experience:
# Start the Aspire AppHost
cd src/AppHost
dotnet run

This will:
- Start PostgreSQL in a container
- Launch the API service
- Provide Aspire dashboard at https://localhost:15000
- Handle service discovery and configuration
The Aspire AppHost will automatically configure the environment. For custom configuration, modify src/AppHost/appsettings.json:
{
"ClaudeAnalyzer": {
"ApiKey": "your-claude-api-key"
},
"GitCredentials": {
"GitHub": {
"Token": "your-github-token"
},
"AzureDevOps": {
"Pat": "your-pat-token",
"OrgUrl": "https://dev.azure.com/your-org"
}
}
}

- Aspire Dashboard: https://localhost:15000
- API: https://localhost:7000
- API Documentation: https://localhost:7000/scalar-api-reference
- Health Check: https://localhost:7000/health
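The appsettings.json keys shown above can typically also be supplied as environment variables instead of being committed to the repository; .NET's configuration system maps a double underscore (`__`) to a section separator. The variable names below are derived from the configuration keys above under that standard convention (verify against your deployment setup):

```shell
# Standard .NET configuration convention: "__" maps to a section separator,
# so these mirror the appsettings.json keys shown above.
export ClaudeAnalyzer__ApiKey="your-claude-api-key"
export GitCredentials__GitHub__Token="your-github-token"
export GitCredentials__AzureDevOps__Pat="your-pat-token"
export GitCredentials__AzureDevOps__OrgUrl="https://dev.azure.com/your-org"
```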
# Build and run
cd src/Lintellect.Api
dotnet run
# Or with Docker
docker build -t lintellect:latest -f src/Lintellect.Api/Dockerfile .
docker run -p 7000:7000 lintellect:latest

# Install globally
dotnet tool install --global Lintellect.Cli
# Verify installation
Lintellect --help

# Basic C# analysis with AI features (Semgrep disabled by default)
Lintellect analyze \
--language "csharp" \
--enable-summary-comment \
--enable-inline-suggestions \
--enable-description-summary
# C# analysis with Semgrep (MIT-licensed security analysis)
Lintellect analyze \
--language "csharp" \
--enable-semgrep \
--enable-summary-comment \
--enable-inline-suggestions
# C# analysis WITHOUT Semgrep (AI features only)
Lintellect analyze \
--language "csharp" \
--enable-semgrep false \
--enable-summary-comment \
--enable-inline-suggestions \
--enable-description-summary
# Multi-language analysis with exclusions
Lintellect analyze \
--language "csharp" \
--exclude "**/bin/**" \
--exclude "**/obj/**" \
--exclude "**/test/**" \
--exclude "**/Generated/**" \
--enable-summary-comment \
--enable-inline-suggestions \
--enable-azure-devops-code-owners
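The `--exclude` options above take standard glob patterns. As a rough illustration of how such patterns filter paths — this is a sketch using Python's `fnmatch`, not Lintellect's actual matcher, whose `**` semantics may differ:

```python
import fnmatch

# The same exclusion patterns used in the command above
EXCLUDES = ["**/bin/**", "**/obj/**", "**/test/**", "**/Generated/**"]

def is_excluded(path: str) -> bool:
    # fnmatch's "*" matches across "/" as well, which loosely approximates "**"
    return any(fnmatch.fnmatch(path, pattern) for pattern in EXCLUDES)

print(is_excluded("src/bin/Debug/App.dll"))     # True  - under a bin/ directory
print(is_excluded("src/Services/Analyzer.cs"))  # False - regular source file
```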
# Linked work items / issues are used as PR context by default.
# Azure DevOps: linked work items resolved server-side via the WIT REST API.
# GitHub: PR body parsed for "Closes/Fixes/Resolves #N" keywords.
# To opt out:
Lintellect analyze \
--language "csharp" \
--enable-summary-comment \
--enable-work-item-context false
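For GitHub, the keyword scan described above can be pictured roughly like this — an illustrative sketch, not Lintellect's actual implementation, and GitHub's full closing-keyword grammar covers more cases (e.g. cross-repository references):

```python
import re

# Closing keywords in PR bodies: close(s|d), fix(es|ed), resolve(s|d)
CLOSING_RE = re.compile(
    r"\b(?:close[sd]?|fix(?:e[sd])?|resolve[sd]?)\s+#(\d+)",
    re.IGNORECASE,
)

def linked_issues(pr_body: str) -> list[int]:
    """Return issue numbers referenced with a closing keyword."""
    return [int(num) for num in CLOSING_RE.findall(pr_body)]

print(linked_issues("Fixes #42 and closes #7, but only mentions #99."))  # [42, 7]
```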
# Python analysis with Semgrep
Lintellect analyze \
--language "python" \
--enable-semgrep \
--exclude "**/__pycache__/**" \
--exclude "**/venv/**" \
--exclude "**/node_modules/**" \
--enable-summary-comment
# JavaScript/TypeScript analysis
Lintellect analyze \
--language "javascript" \
--enable-semgrep \
--exclude "**/node_modules/**" \
--exclude "**/dist/**" \
--exclude "**/build/**" \
--enable-summary-comment \
  --enable-inline-suggestions

name: PR Analysis

# Environment Variables Required:
# - LINTELLECT_API_URL: Your Lintellect API endpoint URL
# - LINTELLECT_API_KEY: Your Lintellect API key
#
# Optional Environment Variables:
# - LINTELLECT_API_URL and LINTELLECT_API_KEY can be provided via command-line arguments instead

on:
  pull_request:
    types: [opened, synchronize, reopened]

jobs:
  analyze:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          fetch-depth: 0 # Fetch full history for better analysis

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: "10.0.x"

      - name: Install Lintellect CLI
        run: dotnet tool install --global Lintellect.Cli

      - name: Basic C# Analysis
        run: |
          Lintellect analyze \
            --language "csharp" \
            --enable-summary-comment \
            --enable-inline-suggestions \
            --enable-description-summary
        env:
          LINTELLECT_API_URL: ${{ secrets.LINTELLECT_API_URL }}
          LINTELLECT_API_KEY: ${{ secrets.LINTELLECT_API_KEY }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

trigger: none # PR runs are triggered via build validation in the branch policies instead
# Environment Variables Required:
# - LINTELLECT_API_URL: Your Lintellect API endpoint URL
# - LINTELLECT_API_KEY: Your Lintellect API key
#
# Optional Environment Variables:
# - LINTELLECT_API_URL and LINTELLECT_API_KEY can be provided via command-line arguments instead
# - AZURE_DEVOPS_PAT: Azure DevOps Personal Access Token (for Azure DevOps integration)

pool:
  vmImage: "ubuntu-latest"

variables:
  buildConfiguration: "Release"

stages:
  - stage: Analyze
    displayName: "Analyze PR"
    jobs:
      - job: AnalyzePR
        displayName: "Analyze Pull Request"
        steps:
          - task: UseDotNet@2
            displayName: "Use .NET 10"
            inputs:
              packageType: "sdk"
              version: "10.0.x"

          - task: DotNetCoreCLI@2
            displayName: "Install Lintellect CLI"
            inputs:
              command: "custom"
              custom: "tool"
              arguments: "install --global Lintellect.Cli"

          - task: DotNetCoreCLI@2
            displayName: "Basic C# Analysis"
            inputs:
              command: "custom"
              custom: "Lintellect"
              arguments: 'analyze --language "csharp" --enable-summary-comment --enable-inline-suggestions --enable-description-summary'
            env:
              LINTELLECT_API_URL: $(LINTELLECT_API_URL)
              LINTELLECT_API_KEY: $(LINTELLECT_API_KEY)
              GITHUB_TOKEN: $(GITHUB_TOKEN)

All API endpoints require authentication using an API key:
API-Key: your-api-key

{
"id": "550e8400-e29b-41d4-a716-446655440000",
"status": "Completed",
"startedAt": "2024-01-15T10:30:00Z",
"completedAt": "2024-01-15T10:35:00Z",
"summary": "Analysis completed successfully",
"detailedAnalysis": "Detailed analysis results...",
"analyzerUsed": "Claude"
}

{
"ConnectionStrings": {
"postgresdb": "Host=localhost;Database=lintellect;Username=postgres;Password=password"
},
"ApiKey": "your-secure-api-key"
}

{
"ClaudeAnalyzer": {
"ApiKey": "sk-ant-api03-...",
"Model": "claude-3-5-sonnet-20241022",
"MaxTokens": 4000,
"Temperature": 0.1
}
}

{
"AzureOpenAIAnalyzer": {
"ApiKey": "your-azure-ai-key",
"Endpoint": "https://your-resource.openai.azure.com/",
"DeploymentName": "gpt-4o"
}
}

Git provider credentials are configured at the application level and used for all analysis requests.
{
"GitCredentials": {
"GitHub": {
"Token": "ghp_..."
}
}
}

Or via environment variable:
export GITHUB_TOKEN="ghp_..."

{
"GitCredentials": {
"AzureDevOps": {
"Pat": "your-pat-token",
"OrgUrl": "https://dev.azure.com/your-org"
}
}
}

Or via environment variables:
export AZURE_DEVOPS_PAT="your-pat-token"
export AZURE_DEVOPS_ORG_URL="https://dev.azure.com/your-org"

- Domain Layer: Core business logic, entities, and domain events
- Application Layer: CQRS with Mediator pattern, commands, queries, and handlers
- Infrastructure Layer: Database, external services, Git clients, and AI integrations
- API Layer: REST endpoints, authentication, and middleware
- .NET 10: Primary framework with C# 14.0
- PostgreSQL: Database with JSONB support
- Mediator: Source-generator based CQRS implementation
- FluentValidation: Input validation
- Polly: Resilience patterns for external API calls
- OpenTelemetry: Metrics and observability
- Testcontainers: Integration testing with real database
- NSubstitute: Mocking framework for unit tests
- .NET 10 SDK
- Visual Studio 2022 or VS Code
- PostgreSQL 15+
- Docker (optional)
- Clone and Setup

  git clone https://github.com/your-org/lintellect.git
  cd lintellect
  dotnet restore

- Start with Aspire (Recommended)

  # Start the Aspire AppHost - this handles everything automatically
  cd src/AppHost
  dotnet run
The Aspire AppHost will:
- Start PostgreSQL in a container
- Launch the API service
- Provide the Aspire dashboard at https://localhost:15000
- Handle all service discovery and configuration
- Alternative: Manual Setup

  # Database setup (if not using Aspire)
  docker run --name postgres-dev -e POSTGRES_PASSWORD=password -p 5432:5432 -d postgres:15
  createdb lintellect

  # Run migrations
  cd src/Lintellect.Api
  dotnet ef database update

  # Start API manually
  dotnet run
# Run all tests
dotnet test
# Run with coverage
dotnet test --collect:"XPlat Code Coverage"
# Run integration tests
dotnet test tests/Lintellect.Api.FunctionalTests/

# Build all projects
dotnet build
# Build specific project
dotnet build src/Lintellect.Api/Lintellect.Api.csproj
# Publish for production
dotnet publish src/Lintellect.Api/Lintellect.Api.csproj -c Release -o ./publish

Lintellect uses GitHub Flow with release branches. See the Git Workflow Documentation for complete details.
- main - Production-ready code, always stable
- feature/* - New features
- bugfix/* - Bug fixes
- hotfix/api/* or hotfix/cli/* - Critical fixes
- release/api/v* or release/cli/v* - Release preparation
Lintellect has two independent releases:
- API - Docker images tagged as api/v1.2.3
- CLI - NuGet package tagged as cli/v2.1.0
# API Release
./scripts/create-release-api.sh 1.2.0
# CLI Release
./scripts/create-release-cli.sh 2.1.0

See the Git Workflow Documentation for the detailed release process.
For setting up the repository on GitHub, see GitHub Setup Guide.
We welcome contributions! Please see our Contributing Guide for details.
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes using conventional commits
- Push to the branch and open a Pull Request
- Follow C# Coding Conventions
- Add XML documentation for public APIs
- Include unit tests for new functionality
- Ensure all tests pass before submitting PR
This project is licensed under the MIT License - see the LICENSE.txt file for details.
- .NET Aspire for application orchestration
- Mediator for CQRS implementation
- Polly for resilience patterns
- OpenTelemetry for observability