Given a matrix A, write a matrix M in which every element M[i,j] is the sum of all elements A[k,l] with k ≤ i and l ≤ j — that is, everything above and to the left of position [i,j], including A[i,j] itself.


  • 4
    G

    Consider the following matrix A:

    [
      [3, 7, 1],
      [2, 4, 0],
      [9, 4, 2]
    ]
    

    Compute M:

    [
      [3,  10, 11],
      [5,  16, 17],
      [14, 29, 32]
    ]
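
    As a sanity check, M can also be computed directly from this definition by brute force (the helper name below is my own, and this is O(n²·m²) — far slower than the solutions in the replies):

    ```python
    def brute_force_integral_image(a):
        # Direct evaluation of the definition: M[i][j] is the sum of
        # a[k][l] over all k <= i and l <= j.
        rows, cols = len(a), len(a[0])
        return [[sum(a[k][l] for k in range(i + 1) for l in range(j + 1))
                 for j in range(cols)]
                for i in range(rows)]

    A = [[3, 7, 1],
         [2, 4, 0],
         [9, 4, 2]]
    M = brute_force_integral_image(A)
    # M == [[3, 10, 11], [5, 16, 17], [14, 29, 32]], matching the table above
    ```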

  • 0
    Y

    I ran into this problem in a phone interview; it is known as the "integral image" or "summed-area table" problem.


  • 0
    G

    Here's a solution in JavaScript. The if statement tries to pick the cheaper way to compute the surrounding sum: reuse the already-computed value from the row above, or the one to the left, whichever requires summing fewer raw elements. It still performs the same row or column sums multiple times, so it would be more efficient to store each previously calculated column sum and row sum and reuse those values, but I didn't do that optimization here. You could use a temporary dictionary that stores, for each (x, y) point, the column sum and row sum up to that point.

    function leftOfCellRowSum(row, columnIndex) {
      // Sum of the raw A values in this row, strictly left of columnIndex.
      var sum = 0;
      for (var i = 0; i < columnIndex; i++) {
        sum += row[i];
      }
      return sum;
    }
    function calculatedAboveValue(matrix, rowIndex, columnIndex) {
      // Already-computed result value directly above, or 0 on the first row.
      var rowAbove = matrix[rowIndex - 1];
      if (!rowAbove) { return 0; }
      return rowAbove[columnIndex];
    }
    function calculatedLeftValue(row, columnIndex) {
      // Already-computed result value directly to the left, or 0 in the first column.
      var leftCell = columnIndex - 1;
      return row[leftCell] || 0;
    }
    function aboveCellColumnSum(matrix, rowIndex, columnIndex) {
      // Sum of the raw A values in this column, strictly above rowIndex.
      var sum = 0;
      for (var i = 0; i < rowIndex; i++) {
        sum += matrix[i][columnIndex];
      }
      return sum;
    }
    /**
      @param {number[][]} matrix
      @return {number[][]}
    */
    function integralImage(matrix) {
      var result = [];
      for (var rowIndex = 0; rowIndex < matrix.length; rowIndex++) {
        var row = matrix[rowIndex];
        var newRow = [];
        for (var columnIndex = 0; columnIndex < row.length; columnIndex++) {
          var cellValue = row[columnIndex];
          var surroundingValue;
          if (rowIndex > columnIndex) {
            // The row-left sum has fewer terms here; combine it with the computed value above.
            surroundingValue = leftOfCellRowSum(row, columnIndex) + calculatedAboveValue(result, rowIndex, columnIndex);
          } else {
            // The column-above sum has fewer terms; combine it with the computed value to the left.
            surroundingValue = aboveCellColumnSum(matrix, rowIndex, columnIndex) + calculatedLeftValue(newRow, columnIndex);
          }
          newRow.push(cellValue + surroundingValue);
        }
        result.push(newRow);
      }
      return result;
    }
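
    The caching optimization described in the lead paragraph can be sketched as follows: keep a running sum of the current row and reuse the already-computed row above of the result, so no element is ever re-summed. (This is my own illustration in Python rather than JavaScript, and the function name is mine.)

    ```python
    def integral_image_cached(a):
        # Summed-area table in O(n * m): M[i][j] = (sum of a[i][0..j]) + M[i-1][j].
        result = []
        for i, row in enumerate(a):
            new_row = []
            running = 0  # cumulative sum of a[i][0..j]
            for j, value in enumerate(row):
                running += value
                above = result[i - 1][j] if i > 0 else 0  # M[i-1][j], already computed
                new_row.append(running + above)
            result.append(new_row)
        return result
    ```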
    

  • 7
    G

    Actually the key is to notice that in M, M[i,j] = M[i-1, j] + M[i, j-1] - M[i-1, j-1]. Pretty straightforward afterwards.


  • 0
    G

    Oh great point. Then we add A[i][j]. I didn't see that at all. That's much easier than my solution.


  • 2
    M

    Simple Python solution using the idea mentioned above by @gratien and @ysonglc:

    class Solution(object):
        def integral_image(self, matrix):
            rows = len(matrix)
            cols = len(matrix[0])

            for i in range(rows):
                for j in range(cols):
                    if i == 0 and j == 0:
                        continue  # top-left cell is already its own sum
                    elif i > 0 and j > 0:
                        # General case: inclusion-exclusion over the three neighbors.
                        matrix[i][j] += matrix[i-1][j] + matrix[i][j-1] - matrix[i-1][j-1]
                    elif i == 0:
                        matrix[i][j] += matrix[i][j-1]  # first row: left prefix only
                    else:
                        matrix[i][j] += matrix[i-1][j]  # first column: top prefix only
            return matrix


    def main():
        sol = Solution()
        # matrix = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
        matrix = [[3, 7, 1], [2, 4, 0], [9, 4, 2]]
        print(sol.integral_image(matrix))

    if __name__ == '__main__':
        main()
    

  • 1
    I

    I did it for fun:

    def integral_image(matrix):
        if len(matrix) == 0 or len(matrix[0]) == 0:
            return None

        rows = len(matrix)
        cols = len(matrix[0])

        # First pass: prefix sums along each row.
        for y in range(rows):
            for x in range(1, cols):
                matrix[y][x] += matrix[y][x - 1]

        # Second pass: prefix sums down each column.
        for y in range(1, rows):
            for x in range(cols):
                matrix[y][x] += matrix[y - 1][x]

        return matrix
    

  • 0

    Yup, same solution as above. You just have to realize that each position [i,j] needs the cumulative sum up to the previous column and the cumulative sum up to the previous row added to it.

    def sum_up(matrix):
        # Builds the summed-area table in place (mutates `matrix`; returns nothing).
        for r in matrix:
            for c in range(1, len(r)):
                r[c] += r[c-1]
        for r in range(1, len(matrix)):
            for c in range(len(matrix[0])):
                matrix[r][c] += matrix[r-1][c]
    

  • 0
    typedef vector<vector<int> > Matrix;
    
    Matrix solve(const Matrix & a) {
      int m = a.size();
      int n = a[0].size();
      Matrix b(m);
      for (int i = 0; i < m; i++) {
        b[i].resize(n);
      }
      for (int i = 0; i < m; i++) {
        int s = 0;  // running sum of a[i][0..j]
        for (int j = 0; j < n; j++) {
          s += a[i][j];
          if (i == 0)
            b[i][j] = s;
          else
            b[i][j] = s + b[i - 1][j];  // row prefix plus the result row above
        }
      }
      return b;
    }
    
