Design an in-memory LRU (Least Recently Used) Cache that achieves O(1) average time complexity for both get and put operations. When the cache reaches capacity, it automatically evicts the least recently used item. The solution combines a HashMap for O(1) key lookups with a Doubly Linked List that tracks access order, so any entry can be moved to the front or evicted from the back in constant time. You'll also implement thread safety for concurrent access and support an optional TTL (Time-To-Live) for cache entries.
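One possible sketch of this design in Python, under the assumptions stated in the comments: a plain `dict` plays the HashMap role, sentinel head/tail nodes simplify the linked-list surgery, a `threading.Lock` provides thread safety, and TTL is checked lazily on read using a monotonic clock. The class and method names (`LRUCache`, `get`, `put`) follow the problem statement; everything else (`_Node`, `ttl` parameter, lazy expiry) is one illustrative choice among several.

```python
import threading
import time

class _Node:
    """Doubly-linked-list node; expires_at is None when the entry has no TTL."""
    __slots__ = ("key", "value", "expires_at", "prev", "next")
    def __init__(self, key, value, expires_at):
        self.key = key
        self.value = value
        self.expires_at = expires_at
        self.prev = None
        self.next = None

class LRUCache:
    """Thread-safe LRU cache with optional per-entry TTL.

    A dict maps each key to a node in a doubly linked list ordered from
    most- to least-recently used, giving O(1) get and put.
    """
    def __init__(self, capacity):
        if capacity <= 0:
            raise ValueError("capacity must be positive")
        self.capacity = capacity
        self._map = {}
        self._lock = threading.Lock()
        # Sentinel head/tail avoid edge cases at the list ends.
        self._head = _Node(None, None, None)  # MRU side
        self._tail = _Node(None, None, None)  # LRU side
        self._head.next = self._tail
        self._tail.prev = self._head

    def _unlink(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _push_front(self, node):
        node.next = self._head.next
        node.prev = self._head
        self._head.next.prev = node
        self._head.next = node

    def get(self, key, default=None):
        with self._lock:
            node = self._map.get(key)
            if node is None:
                return default
            # Lazy expiry: drop the entry if its TTL has passed.
            if node.expires_at is not None and time.monotonic() >= node.expires_at:
                self._unlink(node)
                del self._map[key]
                return default
            # This key is now the most recently used.
            self._unlink(node)
            self._push_front(node)
            return node.value

    def put(self, key, value, ttl=None):
        expires_at = time.monotonic() + ttl if ttl is not None else None
        with self._lock:
            node = self._map.get(key)
            if node is not None:
                # Update in place and promote to MRU.
                node.value = value
                node.expires_at = expires_at
                self._unlink(node)
                self._push_front(node)
                return
            if len(self._map) >= self.capacity:
                # Evict the least recently used entry (just before the tail).
                lru = self._tail.prev
                self._unlink(lru)
                del self._map[lru.key]
            node = _Node(key, value, expires_at)
            self._map[key] = node
            self._push_front(node)
```

For example, with `capacity=2`, after `put("a", 1)`, `put("b", 2)`, and `get("a")`, a subsequent `put("c", 3)` evicts `"b"`, since `"a"` was touched more recently. A coarse single-lock design like this is the simplest way to make both operations atomic; finer-grained alternatives (sharding, lock striping) trade simplicity for concurrency.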