How to efficiently determine the most common color



I need to determine which NSColor is dominant on the currently displayed screen (the highest count in the current bitmap's palette)... I built something that works, but it is terribly slow... I need to run it roughly once per second (processing currently takes over 6 seconds), and I would like it not to hog the CPU (which it currently does).

The part that kills it is the two nested loops (width x height) that analyze every pixel. Is there a more efficient way to do this? I'm sure there is... Any examples?

Thanks!

#import "ScreenCapture.h"
#import <AVFoundation/AVFoundation.h>
@implementation ScreenCapture
@synthesize captureSession;
@synthesize stillImageOutput;
@synthesize stillImage;
//-----------------------------------------------------------------------------------------------------------------
- (id) init
{
    if ((self = [super init]))
    {
        [self setCaptureSession:[[AVCaptureSession alloc] init]];
        // main screen input
        CGDirectDisplayID displayId = kCGDirectMainDisplay;
        AVCaptureScreenInput *input = [[AVCaptureScreenInput alloc] initWithDisplayID:displayId];
        [input setMinFrameDuration:CMTimeMake(1, 1)];
        input.capturesCursor = NO;
        input.capturesMouseClicks = NO;
        if ([[self captureSession] canAddInput:input])
            [[self captureSession] addInput:input];
        // still image output
        [self setStillImageOutput:[[AVCaptureStillImageOutput alloc] init]];
        NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
        [[self stillImageOutput] setOutputSettings:outputSettings];
        if ([[self captureSession] canAddOutput:[self stillImageOutput]])
            [[self captureSession] addOutput:[self stillImageOutput]];
        // start capturing
        [[self captureSession] startRunning];
    }
    return self;
}
//-----------------------------------------------------------------------------------------------------------------
- (NSColor *) currentlyDominantColor
{
    [self captureImage];
    if ([self stillImage] != nil)
    {
        NSBitmapImageRep* imageRep = [[NSBitmapImageRep alloc] initWithCIImage:[self stillImage]];
        NSInteger pixelsWide = [imageRep pixelsWide];
        NSInteger pixelsHigh = [imageRep pixelsHigh];
        NSCountedSet* imageColors = [[NSCountedSet alloc] initWithCapacity:pixelsWide * pixelsHigh];
        NSColor* dominantColor = nil;
        NSUInteger highCount = 0;
        for (NSUInteger x = 0; x < pixelsWide; x++)
        {
            for (NSUInteger y = 0; y < pixelsHigh; y++)
            {
                NSColor* color = [imageRep colorAtX:x y:y];
                [imageColors addObject:color];
                NSUInteger count = [imageColors countForObject:color];
                if (count > highCount)
                {
                    dominantColor = color;
                    highCount = count;
                }
            }
        }
        return dominantColor;
    }
    else
    {
        // dummy random color until an actual color gets computed
        double r1 = ((double) arc4random() / 0x100000000);
        double r2 = ((double) arc4random() / 0x100000000);
        double r3 = ((double) arc4random() / 0x100000000);
        return [NSColor colorWithCalibratedRed:r1 green:r2 blue:r3 alpha:1.0f];
    }
}
//-----------------------------------------------------------------------------------------------------------------
- (void) captureImage
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections])
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection)
            break;
    }
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        CIImage *image = [[CIImage alloc] initWithData:imageData];
        [self setStillImage:image];
    }];
}
//-----------------------------------------------------------------------------------------------------------------
- (void) dealloc
{
    [[self captureSession] stopRunning];
    captureSession = nil;
    stillImageOutput = nil;
    stillImage = nil;
}
@end

Here is an outline of an algorithm that is much faster. Most of the slowness in your code comes from all the calls to colorAtX:y: (fetching the pixel, creating an NSColor, and so on; profile your app to confirm), and all of it goes through message dispatch. You can do much better by accessing the bitmap data directly.

For example, suppose your bitmap is meshed (interleaved) rather than planar (use isPlanar to find out) and has 32-bit pixels (bitsPerPixel); you can adapt the approach for other layouts.

  1. Check your assumptions as above.
  2. Get a pointer to the pixels (bitmapData). This is effectively a C array of uint32 pixels whose length is the number of pixels (totalBytes/4).
  3. Sort the pixels (e.g. with qsort). This gives you runs of identical pixel values. Yes, it scrambles your image, but who cares, you created it just for this purpose.
  4. Walk the array and find the pixel value with the longest run. You're just looking for runs of the same uint32 value, which is a simple algorithm.
  5. After the loop, create the NSColor with colorWithColorSpace:components:count:, taking the color space from the bitmap (colorSpace) and the float components by extracting each byte from the pixel (shift and mask) and converting it to a float in the 0-1 range (see the sketch after this list).
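
Below is a minimal sketch of those steps, assuming a meshed bitmap with 32 bits per pixel and no row padding. The helper name dominantColorOfBitmap is mine, not an AppKit API, and the in-memory byte order is an assumption, so check your rep's bitmapFormat and adjust the component extraction if needed:

#import <Cocoa/Cocoa.h>
#include <stdlib.h>

// qsort comparator: orders raw 32-bit pixel values so that identical
// pixels end up in contiguous runs.
static int comparePixels(const void *a, const void *b)
{
    uint32_t pa = *(const uint32_t *)a;
    uint32_t pb = *(const uint32_t *)b;
    return (pa > pb) - (pa < pb);
}

// Hypothetical helper implementing steps 1-5 above.
static NSColor *dominantColorOfBitmap(NSBitmapImageRep *imageRep)
{
    // Step 1: verify the layout assumptions.
    if ([imageRep isPlanar] || [imageRep bitsPerPixel] != 32)
        return nil; // adapt for other layouts
    if ([imageRep bytesPerRow] != 4 * [imageRep pixelsWide])
        return nil; // row padding would corrupt the run counting

    // Step 2: treat the bitmap as a C array of uint32 pixels.
    uint32_t *pixels = (uint32_t *)[imageRep bitmapData];
    NSUInteger pixelCount = (NSUInteger)([imageRep pixelsWide] * [imageRep pixelsHigh]);
    if (pixelCount == 0)
        return nil;

    // Step 3: sorting scrambles the image, but it was created just for this.
    qsort(pixels, pixelCount, sizeof(uint32_t), comparePixels);

    // Step 4: one pass to find the longest run of identical values.
    uint32_t bestPixel = pixels[0];
    NSUInteger bestRun = 1, run = 1;
    for (NSUInteger i = 1; i < pixelCount; i++)
    {
        run = (pixels[i] == pixels[i - 1]) ? run + 1 : 1;
        if (run > bestRun)
        {
            bestRun = run;
            bestPixel = pixels[i];
        }
    }

    // Step 5: extract each byte (equivalent to shift & mask) and scale
    // into the 0-1 range. Memory byte order assumed RGBA; adjust to match
    // your bitmap's actual format.
    const uint8_t *bytes = (const uint8_t *)&bestPixel;
    CGFloat components[4] = {
        bytes[0] / 255.0, bytes[1] / 255.0, bytes[2] / 255.0, bytes[3] / 255.0
    };
    return [NSColor colorWithColorSpace:[imageRep colorSpace]
                             components:components
                                  count:4];
}

Sorting is O(n log n), but it runs entirely on raw integers, so it avoids the per-pixel NSColor allocation and message dispatch that dominate the original nested loops.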

Consider using Core Image's CIAreaAverage filter (via CIFilter). It knows high-speed math far better than mere mortals do!
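
A minimal sketch of that suggestion, assuming a CIImage and a CIContext are already at hand (the helper name averageColorOfImage is hypothetical). Note that CIAreaAverage reduces the region to its average color, a fast GPU-side approximation rather than the single most frequent color:

#import <Cocoa/Cocoa.h>
#import <CoreImage/CoreImage.h>

// Hypothetical helper: average color of a CIImage via CIAreaAverage,
// which reduces the image's whole extent to a single output pixel.
static NSColor *averageColorOfImage(CIImage *image, CIContext *context)
{
    CGRect extent = [image extent];
    CIVector *extentVector = [CIVector vectorWithX:extent.origin.x
                                                 Y:extent.origin.y
                                                 Z:extent.size.width
                                                 W:extent.size.height];
    CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage"
                                  keysAndValues:kCIInputImageKey, image,
                                                kCIInputExtentKey, extentVector,
                                                nil];
    CIImage *output = [filter outputImage];

    // The filter's output is a 1x1 image; read its single RGBA8 pixel back.
    uint8_t rgba[4];
    [context render:output
           toBitmap:rgba
           rowBytes:4
             bounds:CGRectMake(0, 0, 1, 1)
             format:kCIFormatRGBA8
         colorSpace:nil];

    return [NSColor colorWithCalibratedRed:rgba[0] / 255.0
                                     green:rgba[1] / 255.0
                                      blue:rgba[2] / 255.0
                                     alpha:rgba[3] / 255.0];
}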

This code is not exactly what you asked for, but the pixel values you get will not be accurate unless you retrieve them this way. I don't know why.

In any case, it answers a range of related questions: getting image metrics, specifically the minimum, average, and maximum. Note how I obtain the pixel values; you need to do it that way. The only change you would make to the code is to add a loop that walks every pixel based on the height and width (a basic for loop is all you need here).

Here is my output...

2015-07-17 14:58:03.751 Chroma Photo Editing Extension[1945:155358] CIAreaMinimum output: 255, 27, 0, 0

2015-07-17 15:00:08.086 Chroma Photo Editing Extension[2156:157963] CIAreaAverage output: 255, 191, 166, 155

2015-07-17 15:01:24.047 Chroma Photo Editing Extension[2253:159246] CIAreaMaximum output: 255, 255, 255, 238

...from the following code (written for iOS):

- (CIImage *)outputImage
{
    [GlobalCIImage sharedSingleton].ciImage = self.inputImage;
    
    CGRect inputExtent = [[GlobalCIImage sharedSingleton].ciImage extent];
    CIVector *extent = [CIVector vectorWithX:inputExtent.origin.x
                                           Y:inputExtent.origin.y
                                           Z:inputExtent.size.width
                                           W:inputExtent.size.height];
    // Note: despite the variable name, this runs CIAreaMaximum; swap in
    // CIAreaMinimum or CIAreaAverage to produce the other outputs logged above.
    CIImage *inputAverage = [CIFilter filterWithName:@"CIAreaMaximum" keysAndValues:kCIInputImageKey, [GlobalCIImage sharedSingleton].ciImage, kCIInputExtentKey, extent, nil].outputImage;
    // Render the 1x1 filter output into a 4-byte RGBA buffer.
    size_t rowBytes = 4;
    uint8_t byteBuffer[rowBytes];
    
    [[GlobalContext sharedSingleton].ciContext render:inputAverage toBitmap:byteBuffer rowBytes:rowBytes bounds:[inputAverage extent] format:kCIFormatRGBA8 colorSpace:nil];
    
    int width = inputAverage.extent.size.width;
    int height = inputAverage.extent.size.height;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, width * 4, colorSpace, kCGBitmapAlphaInfoMask & kCGImageAlphaPremultipliedFirst);
    
    CGColorSpaceRelease(colorSpace);
    
    CGImageRef cgImage = [[GlobalContext sharedSingleton].ciContext createCGImage:inputAverage fromRect:CGRectMake(0, 0, width, height)];
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGImageRelease(cgImage); // release the intermediate CGImage to avoid a leak
    
    unsigned int *colorData = CGBitmapContextGetData(context);
    unsigned int color = *colorData;
    
    float inputRed = 0.0;
    float inputGreen = 0.0;
    float inputBlue = 0.0;
    short a = color & 0xFF;
    short r = (color >> 8) & 0xFF;
    short g = (color >> 16) & 0xFF;
    short b = (color >> 24) & 0xFF;
    NSLog(@"CIAreaMaximum output: %d, %d, %d, %d", a, r, g, b);
        
    *colorData = (unsigned int)(r << 8) + ((unsigned int)(g) << 16) + ((unsigned int)(b) << 24) + ((unsigned int)(a));
    //NSLog(@"Second read: %i", colorData);
        
    inputRed = r / 255.0;
    inputGreen = g / 255.0;
    inputBlue = b / 255.0;
    
    CGContextRelease(context);
    
    return [[self dissimilarityKernel] applyWithExtent:[GlobalCIImage sharedSingleton].ciImage.extent roiCallback:^CGRect(int index, CGRect rect) {
        return CGRectMake(0, 0, CGRectGetWidth([GlobalCIImage sharedSingleton].ciImage.extent), CGRectGetHeight([GlobalCIImage sharedSingleton].ciImage.extent));
    } arguments:@[[GlobalCIImage sharedSingleton].ciImage, [NSNumber numberWithFloat:inputRed], [NSNumber numberWithFloat:inputGreen], [NSNumber numberWithFloat:inputBlue]]];
}
