Many people have used binary search to guarantee O(lg n) time -- which I never thought about, as I stopped once I had the following O(n) solution.

The general idea is to do one linear pass over the array, check the condition num[i-1] < num[i] && num[i] > num[i+1], and return the first i that satisfies it. So at first glance, it's O(n) time, as the only peak may be at the far end from where the scan starts.

```
public class Solution {
    public int findPeakElement(int[] num) {
        if (num == null || num.length == 0) return -1;
        if (num.length == 1) return 0;
        // Check both ends first: a boundary element is a peak
        // if it beats its single neighbor.
        if (num[0] > num[1]) return 0;
        if (num[num.length - 1] > num[num.length - 2]) return num.length - 1;
        // Scan the interior elements.
        for (int i = num.length - 2; i > 0; i--) {
            if (num[i] > num[i + 1] && num[i] > num[i - 1]) return i;
        }
        return -1; // unreachable when adjacent elements are distinct: a peak always exists
    }
}
```

However, here comes something interesting. On second thought, I noticed that, assuming random input, my solution actually runs in constant time on average. Here's why:

Assumption: P(num[i] < num[i+1]) = P(num[i] > num[i+1]) = 1/2, independently for each i -- i.e., random input with distinct adjacent elements.

Notation: let p(x) be the probability that the program finds its first peak after checking exactly x indices.

p(1) = P(num[0] > num[1]) = 1/2;

p(2) = (1 - p(1)) * P(num[1] > num[2]) = 1/2 * 1/2 = 1/4;

p(3) = 1/8

...

p(x) = 1/2^x

so on average, the number of indices checked before finding a peak is 1\*p(1) + 2\*p(2) + 3\*p(3) + ... = 2
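For completeness, that sum has a closed form via the standard arithmetico-geometric identity:

```latex
\sum_{x=1}^{\infty} x\,p(x)
  \;=\; \sum_{x=1}^{\infty} \frac{x}{2^{x}}
  \;=\; \frac{1/2}{(1-1/2)^{2}}
  \;=\; 2,
\qquad \text{using}\quad \sum_{x=1}^{\infty} x\,r^{x} = \frac{r}{(1-r)^{2}}, \;\; |r|<1 .
```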

so actually, this first-glance O(n) bound is pretty "loose" in some sense, as the average time is constant -- just 2 checks in expectation!

However, this analysis does rely on the "randomness" of the input -- an adversarial input (e.g., a sorted array) still forces a full O(n) scan -- but I think it's still worth mentioning.
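As a quick sanity check of the average-case claim, here is a small Monte-Carlo sketch (my own harness, not part of the solution above): it generates random arrays of distinct values and counts how many indices a front-to-back scan inspects before hitting a peak. Under the randomness assumption, the average should come out very close to 2.

```
import java.util.Random;

public class PeakSimulation {
    // Average number of indices a left-to-right scan inspects before
    // finding a peak, over `trials` random arrays of length n.
    static double averageChecks(int trials, int n, long seed) {
        Random rnd = new Random(seed);
        long total = 0;
        for (int t = 0; t < trials; t++) {
            double[] a = new double[n];
            for (int i = 0; i < n; i++) a[i] = rnd.nextDouble(); // distinct with high probability
            for (int i = 0; i < n; i++) {
                boolean leftOk = (i == 0) || a[i - 1] < a[i];
                boolean rightOk = (i == n - 1) || a[i] > a[i + 1];
                if (leftOk && rightOk) {
                    total += i + 1; // inspected indices 0..i
                    break;
                }
            }
        }
        return (double) total / trials;
    }

    public static void main(String[] args) {
        System.out.printf("average indices checked: %.3f%n",
                averageChecks(100_000, 100, 42L));
    }
}
```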

Having fun solving!