ios - Detection of sharpness of a photo
I'm looking for a framework that helps with detecting the sharpness of a photo. I have read this post, which points to a methodology for doing so, but I'd rather work with a library than get my hands dirty.
In the Core Image documentation, Apple says:
Core Image can analyze the quality of an image and provide a set of filters with optimal settings for adjusting such things as hue, contrast, and tone color, and for correcting for flash artifacts such as red eye. It does all this with one method call on your part.
How can I use the "analyze image quality" part? I'd love to see some example code.
We did it with the GPUImage framework (calculating brightness and sharpness). Here are some snippets that might help you:
- (BOOL)calculateBrightness:(UIImage *)image {
    float result = 0;
    int i = 0;
    for (int y = 0; y < image.size.height; y++) {
        for (int x = 0; x < image.size.width; x++) {
            UIColor *color = [self colorAt:image atX:x andY:y];
            const CGFloat *colors = CGColorGetComponents(color.CGColor);
            float r = colors[0];
            float g = colors[1];
            float b = colors[2];
            // Rec. 601 luma weights
            result += 0.299 * r + 0.587 * g + 0.114 * b;
            i++;
        }
    }
    float brightness = result / (float)i;
    NSLog(@"Image brightness: %f", brightness);
    if (brightness > 0.8 || brightness < 0.3) {
        return NO;
    }
    return YES;
}
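The per-pixel `colorAt:atX:andY:` lookup through `UIColor` is a large part of the cost here. The same average-luma computation can run directly over a raw pixel buffer (e.g. one copied out once via `CGBitmapContextCreate`). A minimal sketch in plain C, assuming a tightly packed RGBA8 buffer:

```c
#include <stdio.h>

/* Average Rec. 601 luma over an RGBA8 pixel buffer (channel values 0-255).
   Returns brightness normalized to [0, 1], matching the thresholds above. */
static float average_brightness(const unsigned char *rgba, int width, int height) {
    double sum = 0.0;
    long count = (long)width * height;
    for (long p = 0; p < count; p++) {
        const unsigned char *px = rgba + p * 4;
        float r = px[0] / 255.0f;
        float g = px[1] / 255.0f;
        float b = px[2] / 255.0f;
        sum += 0.299f * r + 0.587f * g + 0.114f * b;
    }
    return (float)(sum / count);
}
```

For example, a two-pixel image with one white and one black pixel averages to a brightness of 0.5, which the thresholds above would accept.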
- (BOOL)calculateSharpness:(UIImage *)image {
    GPUImageCannyEdgeDetectionFilter *filter = [[GPUImageCannyEdgeDetectionFilter alloc] init];
    BinaryImageDistanceTransform *binImgTrans = [[BinaryImageDistanceTransform alloc] init];
    NSArray *resultArray = [binImgTrans twoDimDistanceTransform:
        [self getBinaryImageAsArray:[filter imageByFilteringImage:image]]];
    if (resultArray == nil) {
        return NO;
    }
    int sum = 0;
    for (int x = 0; x < resultArray.count; x++) {
        NSMutableArray *col = resultArray[x];
        sum += (int)[col valueForKeyPath:@"@max.intValue"];
    }
    NSLog(@"Image sharpness: %i", sum);
    if (sum < 26250000) {
        // Threshold found by testing: bad sharpness was under ca. 26250000
        return NO;
    }
    return YES;
}
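If the Canny-plus-distance-transform pipeline proves too heavy, a much cheaper CPU-side sharpness score (not the GPUImage method above, just a widely used alternative) is the variance of the Laplacian: blurry images have weak second derivatives, so their Laplacian responses cluster near zero. A sketch in plain C, assuming an 8-bit grayscale buffer:

```c
#include <stdio.h>

/* Variance of the 3x3 Laplacian over an 8-bit grayscale image.
   Higher values indicate stronger edges, i.e. a sharper image. */
static double laplacian_variance(const unsigned char *gray, int width, int height) {
    long n = 0;
    double sum = 0.0, sumsq = 0.0;
    for (int y = 1; y < height - 1; y++) {
        for (int x = 1; x < width - 1; x++) {
            int c = gray[y * width + x];
            /* 4-neighbour Laplacian: up + down + left + right - 4*center */
            int lap = gray[(y - 1) * width + x] + gray[(y + 1) * width + x]
                    + gray[y * width + x - 1] + gray[y * width + x + 1]
                    - 4 * c;
            sum += lap;
            sumsq += (double)lap * lap;
            n++;
        }
    }
    if (n == 0) return 0.0;
    double mean = sum / n;
    return sumsq / n - mean * mean;
}
```

A perfectly flat image scores 0; an image with a hard edge scores high. As with the threshold above, the cutoff between "sharp" and "blurry" has to be tuned against sample images.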
But it is slow. It takes ca. 40 seconds for one image from the iPad camera.
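Besides reading pixels from a raw buffer instead of per-pixel `UIColor` objects, a common mitigation is to run the metrics on a downsampled copy, since both scores are statistics over the whole frame. A minimal 2x box downsample for a grayscale buffer (a sketch, not part of the original pipeline) could look like:

```c
/* 2x box downsample: each output pixel is the average of a 2x2 input block.
   dst must hold (w/2) * (h/2) bytes; odd trailing rows/columns are dropped. */
static void downsample2x(const unsigned char *src, int w, int h, unsigned char *dst) {
    int dw = w / 2, dh = h / 2;
    for (int y = 0; y < dh; y++) {
        for (int x = 0; x < dw; x++) {
            int s = src[(2 * y) * w + 2 * x] + src[(2 * y) * w + 2 * x + 1]
                  + src[(2 * y + 1) * w + 2 * x] + src[(2 * y + 1) * w + 2 * x + 1];
            dst[y * dw + x] = (unsigned char)(s / 4);
        }
    }
}
```

Each halving step cuts the pixel count by four, so one or two steps already reduce the work dramatically; any fixed sharpness threshold would of course need retuning for the smaller resolution.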