Changelog

All notable changes to CursorLens will be documented in this file.

[0.1.2-alpha] - 2024-08-22

⚠️ ALPHA RELEASE

Added

  • Anthropic prompt cache support for context messages
  • Increased the Anthropic token limit to 8192 tokens
  • Improved statistics page: you can now select which data points to display
  • Collapsible log details
  • Full responses are now captured in the logs
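The cache support above builds on Anthropic's prompt-caching feature, which lets stable context blocks be marked with a `cache_control` field so repeated requests can reuse them instead of re-processing the same tokens. The sketch below shows the general shape of such a request body; the `buildCachedRequest` helper and model name are illustrative assumptions, not CursorLens's actual implementation.

```typescript
// Sketch: marking a long-lived context block as cacheable in an
// Anthropic Messages API request body. The `cache_control` shape follows
// Anthropic's prompt-caching feature; the helper itself is hypothetical.

interface ContentBlock {
  type: "text";
  text: string;
  cache_control?: { type: "ephemeral" };
}

interface MessagesRequest {
  model: string;
  max_tokens: number;
  system: ContentBlock[];
  messages: { role: "user" | "assistant"; content: string }[];
}

// Hypothetical helper: flag the shared context prompt for caching so
// repeated proxied requests can hit the provider-side cache.
function buildCachedRequest(context: string, userPrompt: string): MessagesRequest {
  return {
    model: "claude-3-5-sonnet-20240620", // assumed model name
    max_tokens: 8192, // matches the raised token limit above
    system: [
      { type: "text", text: context, cache_control: { type: "ephemeral" } },
    ],
    messages: [{ role: "user", content: userPrompt }],
  };
}
```

Only the stable context block carries `cache_control`; the per-turn user message stays uncached so it can vary freely between requests.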

[0.1.1-alpha] - 2024-08-18

⚠️ ALPHA RELEASE

Added

  • Support for Mistral AI, Cohere, Groq, and Ollama

[0.1.0-alpha] - 2024-08-17

This is the initial alpha release of CursorLens. As an alpha version, it may contain bugs and is not yet feature-complete; avoid using it in production environments.

Added

  • Initial project setup with Next.js
  • Basic proxy functionality between Cursor and AI providers (OpenAI, Anthropic)
  • Simple dashboard for viewing AI interaction logs
  • Token usage tracking for OpenAI and Anthropic models
  • Basic cost estimation based on token usage
  • Support for PostgreSQL database with Prisma ORM
  • Environment variable configuration for API keys and database connection
  • Basic error handling and logging
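The "basic cost estimation based on token usage" item above amounts to multiplying input and output token counts by per-model rates. A minimal sketch of that calculation follows; the rate table and function name are illustrative assumptions, not CursorLens's actual pricing data.

```typescript
// Sketch of token-based cost estimation. The per-million-token rates
// below are placeholder assumptions, not a real pricing table; actual
// provider prices vary by model and change over time.

interface ModelRates {
  inputPerMTok: number;  // USD per 1M input tokens (assumed)
  outputPerMTok: number; // USD per 1M output tokens (assumed)
}

const RATES: Record<string, ModelRates> = {
  "gpt-4o": { inputPerMTok: 5, outputPerMTok: 15 },
  "claude-3-5-sonnet": { inputPerMTok: 3, outputPerMTok: 15 },
};

// Multiply token counts (as reported in the provider's usage field)
// by the configured per-million-token rates.
function estimateCost(
  model: string,
  inputTokens: number,
  outputTokens: number,
): number {
  const rates = RATES[model];
  if (!rates) throw new Error(`no rates configured for model: ${model}`);
  return (
    (inputTokens / 1_000_000) * rates.inputPerMTok +
    (outputTokens / 1_000_000) * rates.outputPerMTok
  );
}
```

A real implementation would load rates from configuration rather than hard-coding them, so pricing changes don't require a code change.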

Known Issues

  • Limited error handling for edge cases
  • Incomplete test coverage
  • Basic UI with limited customization options
  • Potential performance issues with large volumes of requests

Upcoming Features (planned for future releases)

  • Enhanced analytics dashboard with more detailed insights
  • Support for additional AI providers
  • Improved error handling and logging
  • User authentication and multi-user support
  • Customizable alerts for token usage thresholds
  • Export functionality for logs and analytics data
  • Performance optimizations for handling larger datasets
  • More comprehensive test suite

CursorLens is currently in active development. We appreciate your patience and feedback as we work to improve the tool. Please report any bugs or feature requests on our GitHub repository.