JAVA-----------Easy Version To Understand!!!!


  • 65
    H

    1. The key to solving this problem is a doubly linked list, which lets us move nodes quickly.
    2. The LRU cache is a hash table of keys to doubly linked nodes. The hash table makes get() O(1), and the doubly linked list makes adding/removing nodes O(1).

    class Node {
    int key;
    int value;
    Node pre;
    Node next;

    public Node(int key, int value) {
    	this.key = key;
    	this.value = value;
    }
    }
    public class LRUCache {

    HashMap<Integer, Node> map;
    int capacity, count;
    Node head, tail;

    public LRUCache(int capacity) {
    	this.capacity = capacity;
    	map = new HashMap<>();
    	head = new Node(0, 0);
    	tail = new Node(0, 0);
    	head.next = tail;
    	tail.pre = head;
    	head.pre = null;
    	tail.next = null;
    	count = 0;
    }
    
    public void deleteNode(Node node) {
    	node.pre.next = node.next;
    	node.next.pre = node.pre;
    }
    
    public void addToHead(Node node) {
    	node.next = head.next;
    	node.next.pre = node;
    	node.pre = head;
    	head.next = node;
    }
    
    public int get(int key) {
    	if (map.get(key) != null) {
    		Node node = map.get(key);
    		int result = node.value;
    		deleteNode(node);
    		addToHead(node);
    		return result;
    	}
    	return -1;
    }
    
    public void set(int key, int value) {
    	if (map.get(key) != null) {
    		Node node = map.get(key);
    		node.value = value;
    		deleteNode(node);
    		addToHead(node);
    	} else {
    		Node node = new Node(key, value);
    		map.put(key, node);
    		if (count < capacity) {
    			count++;
    			addToHead(node);
    		} else {
    			map.remove(tail.pre.key);
    			deleteNode(tail.pre);
    			addToHead(node);
    		}
    	}
    }
    

    }
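
    For comparison, the same eviction behavior can be reproduced with the JDK's LinkedHashMap in access order. This is a sanity-check sketch, not the interview-expected answer; LruSketch is just an illustrative name:

    ```java
    import java.util.LinkedHashMap;
    import java.util.Map;

    class LruSketch {
        static Map<Integer, Integer> buildCache(final int capacity) {
            // accessOrder = true: iteration order runs from least- to most-recently accessed
            return new LinkedHashMap<Integer, Integer>(16, 0.75f, true) {
                @Override
                protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                    return size() > capacity; // evict once we exceed capacity
                }
            };
        }

        public static void main(String[] args) {
            Map<Integer, Integer> cache = buildCache(2);
            cache.put(1, 1);
            cache.put(2, 2);
            cache.get(1);    // touch key 1 so it becomes most recently used
            cache.put(3, 3); // evicts key 2, the least recently used
            System.out.println(cache.keySet()); // prints [1, 3]
        }
    }
    ```

    The hand-rolled list above does the same work explicitly, which is what the problem is really testing.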


  • 10
    G

    Great solution. One thing that could be improved is to use map.size() directly for the capacity check instead of keeping a separate counter of the nodes.
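
    Concretely, the suggestion amounts to dropping the count field and letting the map track occupancy. A sketch of the original solution with that change (LRUCacheNoCount is an illustrative name; the structure otherwise mirrors the post above):

    ```java
    import java.util.HashMap;

    class LRUCacheNoCount {
        private static class Node {
            int key, value;
            Node pre, next;
            Node(int key, int value) { this.key = key; this.value = value; }
        }

        private final HashMap<Integer, Node> map = new HashMap<>();
        private final int capacity;
        private final Node head = new Node(0, 0), tail = new Node(0, 0);

        LRUCacheNoCount(int capacity) {
            this.capacity = capacity;
            head.next = tail;
            tail.pre = head;
        }

        private void deleteNode(Node node) {
            node.pre.next = node.next;
            node.next.pre = node.pre;
        }

        private void addToHead(Node node) {
            node.next = head.next;
            node.next.pre = node;
            node.pre = head;
            head.next = node;
        }

        public int get(int key) {
            Node node = map.get(key);
            if (node == null) return -1;
            deleteNode(node);
            addToHead(node);
            return node.value;
        }

        public void put(int key, int value) {
            Node node = map.get(key);
            if (node != null) {
                node.value = value;
                deleteNode(node);
                addToHead(node);
                return;
            }
            node = new Node(key, value);
            map.put(key, node);
            addToHead(node);
            if (map.size() > capacity) {  // map.size() replaces the count field
                map.remove(tail.pre.key); // remove from map before unlinking
                deleteNode(tail.pre);
            }
        }
    }
    ```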


  • 0
    A

    Could you explain your solution please...What does new Node(0, 0) do? Is that some kind of dummy? Some more explanation on the various methods would be nice...thanks


  • 5
    D

    @aoben10 That's a dummy head and a dummy tail


  • 0
    I

    I think the question should also state that the key cannot be 0; otherwise it would fail, since the keys of both the dummy head and tail are 0. And I'm still confused: once the <key, node> pair has been removed from the HashMap, how can that node still link to others? Wouldn't it be collected by the garbage collector? Thanks.


  • 2
    R

    @icezhou0784 You don't have to worry about other keys being 0, since head and tail are never added to the map; they're only dummy nodes used for the linked-list operations.
    When you remove a pair from a map, you are only removing the map's references to the key and value objects; the key and value objects are collected at some later point only if there are no other references to them. (It helps to think of these references as C pointers.)
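
    The point about references can be demonstrated directly. A toy sketch, unrelated to the cache itself (RemoveVsGc is an illustrative name):

    ```java
    import java.util.HashMap;

    class RemoveVsGc {
        // Removing a map entry drops only the map's reference;
        // any other reference to the same object stays valid.
        static int readAfterRemove() {
            HashMap<Integer, int[]> map = new HashMap<>();
            int[] payload = {42};
            map.put(1, payload);      // the map now references the array
            int[] alias = map.get(1); // a second reference to the same array
            map.remove(1);            // only the map's reference is gone
            return alias[0];          // still reachable through alias
        }

        public static void main(String[] args) {
            System.out.println(readAfterRemove()); // prints 42
        }
    }
    ```

    In the cache, the linked list plays the role of alias: the nodes keep referencing each other regardless of what the map does.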


  • 0
    N

    When I run this answer, it shows "runtime error". Could anyone explain it?


  • 0
    C

    @nsbdsxh Could be because method "set" was renamed to "put" recently.


  • 0

    Very concise solution. However, I felt uncomfortable using the two (0, 0) nodes as head and tail, because they don't actually exist in the cache. So I modified your code to use ordinary head and tail nodes instead of predefined dummies. Hope it helps.

    class Node{
            private int key;
            private int value;
            Node pre, next;
            public Node(int key, int value){
                this.key = key;
                this.value = value;
                this.pre = null;
                this.next = null;
            }
        }
        HashMap<Integer, Node> map;
        Node head, tail;
        int capacity, count;
        public LRUCache(int capacity) {
            this.capacity = capacity;
            this.count = 0;
            this.map = new HashMap<>();
            this.head = null;
            this.tail = null;
        }
        public void deleteNode(Node node){
            if(node == head && node == tail){
                head = tail = null;
            }else if(node == head){
                head = head.next;
            }else if(node == tail){
                tail = node.pre;
                tail.next = null;
            }else{
                node.pre.next = node.next;
                node.next.pre = node.pre;
            }
        }
        public void addToHead(Node node){
            if(head == null && tail == null){
                head = tail = node;
                return ;
            }
            node.next = head;
            node.next.pre = node;
            node.pre = null;
            head = node;
        }
        public int get(int key) {
            if(map.get(key) == null) return -1;
            Node node = map.get(key);
            int res = node.value;
            deleteNode(node);
            addToHead(node);
            return res;
        }
        
        public void put(int key, int value) {
            if(map.get(key) != null){
                Node node = map.get(key);
                node.value = value;
                deleteNode(node);
                addToHead(node);
            }else{
                Node node = new Node(key, value);
                map.put(key, node);
                if(count < capacity){
                    count ++;
                    addToHead(node);
                }else{
                    map.remove(tail.key);
                    deleteNode(tail);
                    addToHead(node);
                }
            }
        }
    

  • 0
    D

    Solution doesn't seem to meet the requirements of an LRU cache. Order matters, and hash sets do not order entries according to usage.


  • 0
    M

    Why does Node have both a key and a value? Why can't it just have one field, say data?


  • 0
    S

    Simple version beats 97%

    class LRUCache {
        
        int size;
        int limit;
        HashMap<Integer, Node> map;
        Node head;
        Node tail;
        
        public LRUCache(int capacity) {
            limit = capacity;
            map = new HashMap<>();
            head = new Node(0, 0);
            tail = new Node(0, 0);
            head.next = tail;
            tail.prev = head;
        }
        
        public int get(int key) {
            if(map.containsKey(key)){
                Node n = map.get(key);
                remove(n);
                addFirst(head, n);
                return n.val;
            }
            return -1;
        }
        
        public void put(int key, int value) {
            if(map.containsKey(key)){
                Node n = map.get(key);
                n.val = value;
                remove(n);
                addFirst(head, n);
                return;
            }
            Node node = new Node(key, value);
            if(limit == 0) return;
            addFirst(head, node);
            if(size < limit)
                ++size;
            else
                removeLast(tail, map);
            map.put(key, node);
        }
        
        public void addFirst(Node head, Node node){
            node.next = head.next;
            head.next.prev = node;
            node.prev = head;
            head.next = node;
        }
    
        public void removeLast(Node tail, HashMap<Integer, Node> map){
            Node prev = tail.prev;
            Node prepre = prev.prev;
            prev.next = null;
            prev.prev = null;
            prepre.next = tail;
            tail.prev = prepre;
            map.remove(prev.key);
        }
    
        public void remove(Node node){
            Node prev = node.prev;
            Node next = node.next;
            node.prev = null;
            node.next = null;
            prev.next = next;
            next.prev = prev;
        }
        
        class Node{
            int key;
            int val;
            Node next;
            Node prev;
            public Node(int k, int v) { key = k; val = v; next = null; prev = null; }
        }
    }
    

  • 0
    W

    @Tōsaka-Rin Look at all the null checks in your code! Sometimes, dummy nodes make code clear and eliminate the need to handle special cases.


  • 0

    My code doesn't define its own node class, but the put method might not be O(1) since there is a while loop. I did this because I didn't think of creating a new node class. Could anyone tell me if my put method is considered O(1) or O(n)? Thanks in advance.
    class LRUCache {
        int capacity;
        HashMap<Integer,Integer> map;
        HashMap<Integer,Integer> duplicateMap;
        Queue<Integer> orderQ;

        public LRUCache(int capacity) {
            map = new HashMap<Integer,Integer>();
            duplicateMap = new HashMap<Integer,Integer>();
            orderQ = new LinkedList<Integer>();
            this.capacity = capacity;
        }

        public int get(int key) {
            if(map.containsKey(key)){
                orderQ.add(key);
                if(duplicateMap.containsKey(key))
                    duplicateMap.put(key,duplicateMap.get(key)+1);
                else
                    duplicateMap.put(key,1);
                return map.get(key);
            }
            else
                return -1;
        }

        public void put(int key, int value) {
            if(map.size() < capacity){
                map.put(key,value);
                orderQ.add(key);
                if(duplicateMap.containsKey(key))
                    duplicateMap.put(key,duplicateMap.get(key)+1);
                else
                    duplicateMap.put(key,1);
            }
            else{
                if(!map.containsKey(key)){
                    //remove the least recently used key
                    boolean isRemove = false;
                    int rKey;
                    while(!isRemove){
                        rKey = orderQ.poll();
                        if(duplicateMap.get(rKey) == 1){
                            duplicateMap.remove(rKey);
                            map.remove(rKey);
                            isRemove = true;
                        }
                        else
                            duplicateMap.put(rKey,duplicateMap.get(rKey)-1);
                    }
                }
                //insert
                map.put(key,value);
                orderQ.add(key);
                if(duplicateMap.containsKey(key))
                    duplicateMap.put(key,duplicateMap.get(key)+1);
                else
                    duplicateMap.put(key,1);
            }
        }
    }


  • 0
    J

    Can someone explain why map.remove must come before deleteNode?

     map.remove(tail.prev.key);
     deleteNode(tail.prev);
    
