Memset and fill_n


  • 1
    Q
    class Solution {
    public:
        int minCut(string s) {
            int n = s.size();
            // f[i] = minimum cuts needed for the suffix s[i..n-1]
            int f[n + 1];
            for (int i = 0; i <= n; ++i)
                f[i] = n - i - 1;
            // p[i][j] = true if s[i..j] is a palindrome
            bool p[n][n];
            memset(p, false, sizeof(p));

            // fill_n(&p[0][0], n * n, 0);

            for (int i = n - 1; i >= 0; --i)
                for (int j = i; j <= n - 1; ++j) {
                    if (s[j] == s[i] && (j - i < 2 || p[i + 1][j - 1])) {
                        p[i][j] = true;
                        f[i] = min(f[i], 1 + f[j + 1]);
                    }
                }
            return f[0];
        }
    };
    

    The code above passes. However, if I change

    bool p[n][n];
    memset(p,false,sizeof(p));
    

    to

    int p[n][n];
    memset(p, 0, sizeof(p));
    

    or

    int p[n][n];
    fill_n(&p[0][0], n * n, 0);
    

    it results in a runtime error. Why? I still don't understand this clearly; I thought they were the same.


  • 0

    The sizes of bool and int are implementation-defined, but if you check sizeof(bool) and sizeof(int) in the OJ environment, the results are 1 and 4 respectively.
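    A quick way to check this (a minimal sketch; the exact values depend on the compiler, these are just the typical results in the OJ environment):

    #include <iostream>

    int main() {
        // Implementation-defined, but typically prints 1 and 4 on the OJ's compiler.
        std::cout << "sizeof(bool) = " << sizeof(bool) << '\n';
        std::cout << "sizeof(int)  = " << sizeof(int)  << '\n';
    }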

    Since the int table takes four times the space of the bool table, and you allocate it on the stack (as a variable-length array), you risk exceeding the limited stack size, which ultimately causes the Runtime Error. The way around this is to allocate the n x n table on the heap.
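    For example, keeping the same DP but putting the table on the heap with std::vector (a sketch based on the code above; note that vector<bool> is bit-packed, so it is even more compact than the original bool VLA):

    #include <string>
    #include <vector>
    #include <algorithm>
    using namespace std;

    class Solution {
    public:
        int minCut(string s) {
            int n = s.size();
            // f[i] = minimum cuts needed for the suffix s[i..n-1]
            vector<int> f(n + 1);
            for (int i = 0; i <= n; ++i)
                f[i] = n - i - 1;
            // Palindrome table lives on the heap, not the stack.
            vector<vector<bool>> p(n, vector<bool>(n, false));
            for (int i = n - 1; i >= 0; --i)
                for (int j = i; j < n; ++j)
                    if (s[j] == s[i] && (j - i < 2 || p[i + 1][j - 1])) {
                        p[i][j] = true;
                        f[i] = min(f[i], 1 + f[j + 1]);
                    }
            return f[0];
        }
    };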

