Changelog
Stay up to date with the latest updates, features, and improvements to Chronalog.
Performance Optimization: Batch Fetching and Parallel Processing
Features
- Implemented GraphQL batch fetching for changelog entries via GitHub API
- Added parallel fetching fallback using Promise.all() for REST API calls
- Optimized listChangelogEntriesViaGitHub to reduce API calls from N+1 to 1-2 queries
- Added getMultipleFileContents function for efficient batch file retrieval
- Updated init script to generate optimized API routes with caching
- Updated init script to generate ISR-enabled changelog pages
Performance Optimization Release
This release addresses critical performance issues in production environments, where changelog pages were taking 15+ seconds to load. The optimizations cut load times to roughly 1-2 seconds through intelligent batching and parallel processing.
Technical Implementation
GraphQL Batch Fetching
Problem: The previous implementation fetched each MDX file sequentially via individual API calls, resulting in an N+1 query problem. For 50 changelog entries, this meant 51 API calls (1 for directory listing + 50 for each file).
Solution: Implemented getMultipleFileContents() in src/utils/github-api.ts that uses GitHub's GraphQL API to fetch multiple files in a single query. The function:
- Batches files into chunks of 50 to respect GitHub's query complexity limits
- Uses GraphQL aliases (`file0`, `file1`, etc.) to fetch multiple files in parallel within a single query
- Constructs expressions in the format `branch:path` for each file
- Returns a `Map<string, string | null>` mapping file paths to their contents
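The chunking and alias construction described above can be sketched as follows. This is an illustrative reconstruction, not Chronalog's actual implementation; `chunk` and `buildBatchQuery` are hypothetical helper names.

```typescript
// Hypothetical sketch of batched GraphQL query construction.
// `chunk` and `buildBatchQuery` are illustrative names, not Chronalog's API.

function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

function buildBatchQuery(branch: string, paths: string[]): string {
  // One aliased repository query per file: file0, file1, ...
  const fields = paths
    .map(
      (path, i) => `file${i}: repository(owner: $owner, name: $name) {
    object(expression: "${branch}:${path}") { ... on Blob { text } }
  }`
    )
    .join("\n  ");
  return `query GetMultipleFiles($owner: String!, $name: String!) {\n  ${fields}\n}`;
}

// Example: 120 files become three batches (50 + 50 + 20).
const allPaths = Array.from({ length: 120 }, (_, i) => `changelog/entry-${i}.mdx`);
const batches = chunk(allPaths, 50);
console.log(batches.length); // 3
console.log(buildBatchQuery("main", batches[2].slice(0, 2)));
```

Mapping results back is then a matter of reading each `fileN` alias off the response and pairing it with `paths[N]`.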
Code Location: src/utils/github-api.ts:195-273
Performance Impact: Reduces 50+ individual API calls to 1-2 batch queries, resulting in a 10-15x reduction in API calls and ~90% reduction in load time when an access token is available.
Parallel Fetching Fallback
Problem: For public repositories without authentication, the system couldn't use GraphQL batch fetching and still fetched files sequentially.
Solution: Modified listChangelogEntriesViaGitHub() in src/utils/github-filesystem.ts to use Promise.all() for parallel fetching when batch fetching isn't available:
- Maps all MDX files to promises that fetch content in parallel
- Uses `Promise.all()` to execute all fetches simultaneously
- Falls back to individual GraphQL or REST API calls based on authentication status
Code Location: src/utils/github-filesystem.ts:320-410
Performance Impact: Reduces sequential wait time from ~15 seconds (50 files × 300ms each) to ~1-2 seconds (all files fetched in parallel).
Implementation Details
Batch Fetching Flow
- Detection: When `accessToken` is available, the system attempts batch fetching
- Batching: Files are grouped into batches of 50 to avoid GraphQL complexity limits
- Query Construction: Each batch creates a GraphQL query with aliased repository queries:

  ```graphql
  query GetMultipleFiles($owner: String!, $name: String!) {
    file0: repository(owner: $owner, name: $name) {
      object(expression: "main:path/to/file1.mdx") { ... on Blob { text } }
    }
    file1: repository(owner: $owner, name: $name) {
      object(expression: "main:path/to/file2.mdx") { ... on Blob { text } }
    }
    # ... up to 50 files per batch
  }
  ```

- Error Handling: If batch fetching fails, the system gracefully falls back to parallel fetching
- Result Mapping: File paths are mapped back to their contents using the alias indices
Parallel Fetching Flow
- File Mapping: All MDX files are mapped to async functions that fetch their content
- Parallel Execution: `Promise.all()` executes all fetches simultaneously
- Content Retrieval: Each file uses either:
  - GraphQL `getFileContent()` if an access token is available
  - REST API `fetch()` for public repositories
- Error Isolation: Individual file fetch failures don't block other files
- Result Aggregation: All successful fetches are collected and parsed
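The parallel flow above, including per-file error isolation, can be sketched like this. It is a minimal illustration under stated assumptions: `fetchOne` stands in for the GraphQL or REST call, and `fetchAllContents` is a hypothetical name, not Chronalog's actual API.

```typescript
// Sketch of parallel fetching with per-file error isolation.
// `fetchOne` stands in for the GraphQL/REST call (assumption for illustration).

type Fetcher = (path: string) => Promise<string>;

async function fetchAllContents(
  paths: string[],
  fetchOne: Fetcher
): Promise<Map<string, string | null>> {
  // Kick off every fetch at once; a failure resolves to null instead of
  // rejecting, so one bad file cannot take down the whole Promise.all().
  const results: [string, string | null][] = await Promise.all(
    paths.map(async (path): Promise<[string, string | null]> => {
      try {
        return [path, await fetchOne(path)];
      } catch {
        return [path, null];
      }
    })
  );
  return new Map(results);
}

// Usage with a fake fetcher: one file fails, the others still resolve.
const fakeFetch: Fetcher = async (path) => {
  if (path.endsWith("broken.mdx")) throw new Error("404");
  return `# contents of ${path}`;
};

fetchAllContents(["a.mdx", "broken.mdx", "c.mdx"], fakeFetch).then((map) => {
  console.log(map.get("a.mdx"));      // "# contents of a.mdx"
  console.log(map.get("broken.mdx")); // null
});
```

Because each promise catches its own error, total wall-clock time is bounded by the slowest single file rather than the sum of all fetches.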
Caching and ISR Updates
The init script (cli/init.ts) was updated to generate optimized route handlers and pages:
API Route Optimization
File: app/api/changelog/list/route.ts
- Added `export const revalidate = 300` for Next.js route segment config caching
- Added `Cache-Control: public, s-maxage=300, stale-while-revalidate=600` headers for CDN caching
- Enables a 5-minute cache with a 10-minute stale-while-revalidate window
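A generated route handler combining both settings might look roughly like this. The entry-loading call is stubbed so the sketch is self-contained; in the real generated file it would call `listChangelogEntries()`, and the exact shape of the generated code is an assumption.

```typescript
// Hedged sketch of the generated app/api/changelog/list/route.ts.
// loadEntries() is a stand-in stub for listChangelogEntries() (assumption).

async function loadEntries(): Promise<{ title: string }[]> {
  return [{ title: "Performance Optimization Release" }];
}

// Route segment config: Next.js re-renders this route at most every 300s.
export const revalidate = 300;

export async function GET(): Promise<Response> {
  const entries = await loadEntries();
  return Response.json(entries, {
    headers: {
      // CDN caching: fresh for 5 minutes, then served stale for up to
      // 10 more minutes while revalidating in the background.
      "Cache-Control": "public, s-maxage=300, stale-while-revalidate=600",
    },
  });
}
```

The `revalidate` export caches the server render itself, while the `Cache-Control` header lets CDNs serve the response without touching the origin at all.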
Page Optimization
File: app/changelog/page.tsx
- Replaced `export const dynamic = 'force-dynamic'` with `export const revalidate = 300`
- Enables ISR (Incremental Static Regeneration) for pre-rendered pages
- Pages are generated at build time and regenerated every 5 minutes in the background
Backward Compatibility
- Breaking Changes: None - all changes are internal optimizations
- API Compatibility: The `listChangelogEntries()` function signature remains unchanged
- Migration: Existing projects need to update their route files and pages to enable caching (or re-run `chronalog init`)
Performance Metrics
| Scenario | Before | After | Improvement |
|---|---|---|---|
| With Token (50 entries) | ~15s (51 sequential calls) | ~0.5-1s (1-2 batch queries) | 15-30x faster |
| Without Token (50 entries) | ~15s (51 sequential calls) | ~1-2s (51 parallel calls) | 7.5-15x faster |
| Cached/ISR | N/A | ~0.1s (served from cache) | 150x faster |
Files Modified
- `src/utils/github-api.ts`: Added `getMultipleFileContents()` function
- `src/utils/github-filesystem.ts`: Updated `listChangelogEntriesViaGitHub()` with batch and parallel fetching
- `cli/init.ts`: Updated generated API route and page templates with caching and ISR
Testing Recommendations
- Test with access token to verify batch fetching works correctly
- Test without access token to verify parallel fetching fallback
- Test with large changelogs (50+ entries) to verify batching logic
- Verify caching behavior in production environment
- Monitor API rate limits to ensure batch fetching doesn't exceed GitHub limits
Initial Release
Features
- Git-backed changelog management system
- Beautiful admin interface for managing changelog entries
- MDX support for rich content formatting
- Automatic Git commits for all changes
- GitHub OAuth authentication
- TypeScript support with full type safety
- No database required - completely file-based
- Customisable configuration options
- Tag system for categorising entries
- Version tracking support
- Quick setup with init command
Initial Release
Welcome to Chronalog! This is the first official release of our Git-backed changelog management system for Next.js.
What is Chronalog?
Chronalog is a powerful, file-based changelog management system designed specifically for Next.js projects. It provides a beautiful admin interface for managing your changelog entries while keeping everything in your Git repository.
Key Features
This initial release includes all the core functionality you need to manage your changelog:
- Git Integration: All changes are automatically committed to your Git repository
- MDX Support: Write rich content with markdown and custom components
- No Database: Everything is stored as MDX files in your repository
- TypeScript: Fully typed for the best developer experience
- Customisable: Configure paths, routes, and commit message formats
- Quick Setup: Get started in minutes with a single command
Getting Started
To get started with Chronalog, simply install it in your Next.js project:
```bash
pnpm add chronalog
pnpm chronalog init
```
Then set up GitHub OAuth authentication and you're ready to start managing your changelog entries!
What's Next?
We're excited to continue improving Chronalog based on your feedback. Stay tuned for future updates and enhancements.