Design a data structure that supports insert, delete, search and getRandom in constant time.

• Design a data structure that supports the following operations in Θ(1) time.

insert(x): Inserts an item x into the data structure if not already present.

remove(x): Removes an item x from the data structure if present.

search(x): Searches for an item x in the data structure.

getRandom(): Returns a random element from the current set of elements.

Example:

insert(3); ds becomes {3}
insert(4); ds becomes {3,4}
insert(5); ds becomes {3,4,5}
insert(6); ds becomes {3,4,5,6}

search(4); - returns true
search(10); - returns false

getRandom() - can return any one of the items currently in the data structure. It may return 3, 4, 5 or 6, since all of them are present now, so returning any one of them is correct.

remove(4) - removes element 4 from the data structure, now data structure becomes {3,5,6}

getRandom() - returns any one of the elements from the set {3,5,6}. Returning any one of these 3 existing elements is correct.

And so on... insert(x), remove(x), search(x) and getRandom() are applied repeatedly to the data structure, and each must be a constant-time operation.

Note: uniformity wasn't stressed for the getRandom() operation, as the question was asked during a discussion round; it was merely asked to return any of the existing elements at random, so Java's Random class came to the rescue for the interviewee. I am not sure how uniformity could be checked by an online judge. Still, this question has been asked repeatedly at Amazon.

• I would suggest adding at least one example to your problem description.

For example,

Assume we initially call:
`insert(3), insert(4), insert(6)`.

Calling `search(4)` should return true, while `search(5)` returns false.

Calling `getRandom()` should return elements from `[3, 4, 6]` uniformly.

Now we call `remove(4)`.

Calling `search(4)` should return false.

Calling `getRandom()` should return elements from `[3, 6]` uniformly (i.e., each of `3` and `6` gets a 50-50 chance).

• Note: uniformity wasn't stressed in the getRandom() operation as it was asked during a discussion round. It was merely asked to get any of the existing elements randomly.

• Getting any existing element randomly implies a uniform distribution: each element should have an equal chance of being returned.

• One way to solve it is to use two basic data structures:

A resizable array `elements` with O(1) amortized append (`std::vector` in C++, `ArrayList` in Java)
A hash table `elemToIndex` that maps each element to its index in `elements`.

Inserting an element `val` into the data structure involves two operations:

1. `elements.add(val)`
2. `elemToIndex.put(val, elements.size() - 1)`

`GetRandom` is just randomly choosing an index from `0` to `elements.size() - 1`.

Searching for an element is easy: just look it up in the hash table.

Now the tricky part: `remove`. You've probably heard that removing an element from an array is O(n). Is it?

Not true for the case of removing the last element, which is O(1)! Since we do not care about ordering, we can simply swap the element we want to delete with the last element, and pop off the last. Then update the map as well and we are done. :)
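The approach above can be sketched in Python (class and method names here are illustrative, not from the original posts):

```python
import random

class RandomizedSet:
    def __init__(self):
        self.elements = []       # resizable array of values
        self.elem_to_index = {}  # value -> its index in elements

    def insert(self, val):
        if val in self.elem_to_index:
            return False
        self.elem_to_index[val] = len(self.elements)
        self.elements.append(val)
        return True

    def remove(self, val):
        if val not in self.elem_to_index:
            return False
        idx = self.elem_to_index[val]
        last = self.elements[-1]
        # Overwrite the removed slot with the last element,
        # then pop the tail -- both are O(1).
        self.elements[idx] = last
        self.elem_to_index[last] = idx
        self.elements.pop()
        del self.elem_to_index[val]
        return True

    def search(self, val):
        return val in self.elem_to_index

    def get_random(self):
        # Picking a random index into the array is O(1).
        return random.choice(self.elements)
```

Note that `remove` also works when the element to delete is itself the tail: the overwrite is then a no-op and only the pop matters.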

This is the simplest approach I can think of. I am interested if someone can come up with another approach.

• Here is my implementation. It's linear space; ~2N to be exact, but given the time complexity constraints, can we do better?

``````
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.Random;

/**
 * All operations work in constant time.
 */
public class RandomizedSet<T> {

    private Map<T, Integer> valueIndexes;
    private List<T> values;
    private Random rand;

    public RandomizedSet() {
        valueIndexes = new HashMap<>();
        values = new ArrayList<>();
        rand = new Random(System.currentTimeMillis());
    }

    public void add(T value) {
        if (!contains(value)) {
            int lastIndex = values.size();
            valueIndexes.put(value, lastIndex);
            values.add(value);
        }
    }

    public boolean contains(T value) {
        if (value == null) {
            throw new NullPointerException();
        }
        return valueIndexes.containsKey(value);
    }

    public T getRandom() {
        if (valueIndexes.isEmpty()) {
            throw new NoSuchElementException();
        }
        int randomIndex = rand.nextInt(values.size());
        return values.get(randomIndex);
    }

    public T deleteRandom() {
        if (valueIndexes.isEmpty()) {
            throw new NoSuchElementException();
        }
        int randomIndex = rand.nextInt(values.size());
        return deleteValue(randomIndex);
    }

    public T delete(T value) {
        if (!contains(value)) {
            throw new NoSuchElementException();
        }
        int index = valueIndexes.get(value);
        return deleteValue(index);
    }

    private T deleteValue(int currentIndex) {
        // Swap the element to remove with the last one, pop the tail,
        // and update the moved element's index in the map.
        T currentValue = values.get(currentIndex);
        int lastIndex = values.size() - 1;
        T lastVal = values.get(lastIndex);
        Collections.swap(values, currentIndex, lastIndex);
        // Removing the last element of an ArrayList is constant time.
        values.remove(lastIndex);
        valueIndexes.put(lastVal, currentIndex);
        valueIndexes.remove(currentValue);
        return currentValue;
    }

    public int size() {
        if (values.size() != valueIndexes.size()) {
            // Should never happen.
            throw new IllegalStateException();
        }
        return values.size();
    }
}
``````

Basic test:

``````
@Test
public void testBasic() {
    RandomizedSet<Character> set = new RandomizedSet<>();
    for (char val = 'A'; val <= 'Z'; val++) {
        set.add(val);
    }
    for (char val = 'A'; val <= 'Z'; val++) {
        System.out.print(set.deleteRandom());
    }
    System.out.println();
    assertEquals(0, set.size());

    for (char val = 'A'; val <= 'Z'; val++) {
        set.add(val);
    }
    for (char val = 'A'; val <= 'Z'; val++) {
        System.out.print(set.delete(val));
    }
    System.out.println();
    assertEquals(0, set.size());
}
``````

• @Ronny.Pena I like how you use generics and handle exceptions; the code is nicely written and definitely production quality. Upvoted, great job!

• @1337c0d3r Hi, Admin.

As I understand it, a hash table by itself supports insert, search and delete in Θ(1) time.

The `elements` container and the `elemToIndex` hash table, which maps elements to their indices, are used together to support the getRandom function.

Is my understanding of your solution correct? Thanks.

• @Tony_Zhang Yes, that is correct. A hash table alone can't be used to support getRandom efficiently.
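As a concrete illustration of why a hash table alone is not enough (shown in Python, since the thread mixes languages): a plain hash set is not indexable, so sampling from it means iterating or copying it first, which costs O(n) on every call.

```python
import random

s = {3, 4, 5, 6}

# A set cannot be indexed directly, so we must materialize its
# members into a list before sampling -- an O(n) step per call.
x = random.choice(list(s))
assert x in s
```

The auxiliary array in the accepted solution exists precisely to replace this O(n) copy with an O(1) index lookup.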

• This is extremely easy in python:

``````
import random

class Solution():
    def __init__(self):
        self.data = []
        self.hash = {}

    def insert(self, a):
        if not self.search(a):
            self.data.append(a)
            self.hash[a] = len(self.data) - 1

    def search(self, a):
        return a in self.hash

    def remove(self, a):
        if self.search(a):
            del self.data[self.hash[a]]
            self.hash.pop(a, None)

    def getRandom(self):
        if len(self.data) > 0:
            return self.data[random.randint(0, len(self.data) - 1)]
        return None
``````

• @raydai Welcome to the new Discuss!

Please format your code using 3 backquote signs:

``````
```
your code block
```
``````

will be rendered as:

``````
your code block
``````

• @1337c0d3r Thanks for the correction.

• @raydai What's the efficiency of `del self.data[self.hash[a]]` in Python? Is the data list reduced by one? My understanding is that it's O(n), which doesn't meet the requirement. (source: https://wiki.python.org/moin/TimeComplexity)
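For reference, the swap-with-last trick described earlier avoids that O(n) deletion (and the stale indices it leaves behind in the hash). A minimal standalone sketch, with function and variable names of my own choosing:

```python
def remove_in_constant_time(data, index_of, a):
    """Remove a from the list/dict pair without shifting elements."""
    if a not in index_of:
        return
    idx = index_of[a]
    last = data[-1]
    data[idx] = last     # move the tail into the vacated slot
    index_of[last] = idx
    data.pop()           # popping the tail of a list is O(1)
    del index_of[a]
```

For example, removing 4 from `data = [3, 4, 5, 6]` with `index_of = {3: 0, 4: 1, 5: 2, 6: 3}` leaves `data = [3, 6, 5]`, with every remaining index in `index_of` still valid.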

• Can I do something like this?

public Dictionary<int, int> DS { get; set; }

``````
void Insert(Dictionary<int, int> dataStrt, int number)
{
    if (!Search(dataStrt, number))
    {
        dataStrt.Add(number, number);
    }
}

void Remove(Dictionary<int, int> dataStrt, int number)
{
    dataStrt.Remove(number);
}

bool Search(Dictionary<int, int> dataStrt, int number)
{
    int dicval = 0;
    if (dataStrt.TryGetValue(number, out dicval))
        return true;
    else
        return false;
}

int GetRandom(Dictionary<int, int> dataStrt)
{
    Random rand = new Random();
    int index = rand.Next(0, dataStrt.Count);
    int randomKey = dataStrt.Keys.ElementAt(index);  // hat tip @yshuditelu
    int randomValue = dataStrt[randomKey];
    // int a = dataStrt[rand.Next(dataStrt.Count())];
    return randomValue;
}
``````

• Removing an element from a list in Python is O(n), so maybe you can just mark that slot as empty with an invalid value.

Hash maps like Python dictionaries are O(1) for most operations, but not in the case of key collisions; at worst they are O(n)!!

We are talking about integers here, and the only quick solution I see is a big bit array as big as the biggest possible number.

If this number is too big (and the set therefore sparse), we can arrange several bit arrays as leaves of a tree.

• @1337c0d3r nice solution

• Shouldn't std::unordered_set, which is implemented on top of a hash table, solve this problem gracefully?

``````
std::unordered_set<int> myset;
myset.insert(2);
myset.insert(3);
myset.insert(4);
myset.erase(2);
myset.erase(3);
myset.find(4);
``````

For getRandom, would it need to be implemented with a static iterator inside the unordered_set, or by getting the total number of hash buckets and using a modulus to pick a random element?

• It is an existing problem: 380. Insert Delete GetRandom O(1).

The best way to solve it is to use a HashMap and an ArrayList together. The value field of the HashMap should be the element's position in the ArrayList. Every time an element is deleted, the ArrayList should overwrite that position with its tail element and reduce the total length by 1 (or simply reduce the length by 1 if the position is already at the tail). The time complexity of every operation is then O(1).

• A simple Java solution by using HashSet:

``````
class RandomSet {
    private HashSet<Integer> set = new HashSet<Integer>();

    public boolean insert(int x) {
        return set.add(x);
    }

    public boolean remove(int x) {
        return set.remove(x);
    }

    public boolean search(int x) {
        return set.contains(x);
    }

    public int getRandom() {
        int key = (int) (Math.random() * set.size());
        Integer[] nums = toArray();
        return nums[key];
    }

    public Integer[] toArray() {
        Integer[] nums = new Integer[set.size()];
        set.toArray(nums);
        return nums;
    }

    public int size() {
        return set.size();
    }
}
``````

• @laqxs I believe getRandom() has O(N) complexity in the above solution, since it copies the whole set into an array on every call.
