
macOS Development: Accessing the Camera to Get Real-Time Images (AVCaptureDevice)


A while back I worked on a project that used a Mac client to monitor the live feed from cameras at intersections and front doors. The project was delivered in a hurry, and when I recently revisited it to add some new features, I found that the test account I had been given had expired. With no other option, I fell back on my own laptop's camera.
Since I had written both the data-access and display code myself, I figured it would be quick work. But... disaster struck: the QTCaptureDevice class from QTKit was nowhere to be found. I dug through the web without success; only after checking Apple's documentation did I learn that Apple had migrated the QTKit APIs into AVFoundation, where I easily found AVCaptureDevice.

To access the camera, you need to do the following:
1. Load the video input source (the camera).
2. Create a capture session.
3. Create the video output (for display).
4. Connect the input and output through the session.


The core code is as follows:

// Set up the video input and output
- (void)setupVideoInputOutput
{
    // Add the video input source (the default camera)
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice) {
        if (!self.session) {
            self.session = [[AVCaptureSession alloc] init];
        }
        NSError *error = nil;
        AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (deviceInput) {
            if ([self.session canAddInput:deviceInput]) {
                [self.session addInput:deviceInput];
                self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
                // The sample buffer delegate queue must be serial, not concurrent,
                // so that frames are delivered in order
                dispatch_queue_t queue = dispatch_queue_create("myQueue", DISPATCH_QUEUE_SERIAL);
                // Implement the delegate method to receive frame data in real time
                [self.videoDataOutput setSampleBufferDelegate:self queue:queue];
                if ([self.session canAddOutput:self.videoDataOutput]) {
                    [self.session addOutput:self.videoDataOutput];
                    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
                    captureVideoPreviewLayer.frame = self.view.bounds;
                    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
                    // [self.view.layer addSublayer:captureVideoPreviewLayer];
                    // Note: the content view must be layer-backed (wantsLayer = YES)
                    [[NSApplication sharedApplication].keyWindow.contentView.layer addSublayer:captureVideoPreviewLayer];
                    [self.session startRunning];
                } else {
                    NSLog(@"ERROR: Session cannot add output");
                }
            } else {
                NSLog(@"ERROR: Session cannot add input");
            }
        } else {
            NSLog(@"ERROR: Create Device Input error: %@", error);
        }
    } else {
        NSLog(@"ERROR: Cannot find video device");
    }
}
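One thing the code above does not cover: starting with macOS 10.14 (Mojave), an app must be granted camera permission before the session will deliver any frames, and Info.plist needs an NSCameraUsageDescription entry. A minimal sketch of requesting access before running the setup method above:

// Request camera access first (macOS 10.14+); assumes Info.plist
// contains an NSCameraUsageDescription string
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                         completionHandler:^(BOOL granted) {
    if (granted) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self setupVideoInputOutput];
        });
    } else {
        NSLog(@"ERROR: Camera access was denied by the user");
    }
}];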

The delegate method is as follows:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Throttle: convert roughly one frame per second instead of every frame
    static NSTimeInterval lastCaptureTime = 0;
    NSTimeInterval now = [NSDate date].timeIntervalSinceReferenceDate;
    if (now - lastCaptureTime < 1.0) {
        return;
    }
    lastCaptureTime = now;

    NSLog(@"get image Start");
    CGImageRef imageRef = [self DataFromCMSampleBufferRef:sampleBuffer];
    NSImage *image = [[self class] imageFromCGImageRef:imageRef];
    CGImageRelease(imageRef); // the conversion method returns a +1 image
    NSLog(@"get image End: %@", image);
}
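For this callback to fire at all, the class must adopt AVCaptureVideoDataOutputSampleBufferDelegate. A minimal sketch of the interface (the class name CameraViewController is made up here; the property names match the code above):

#import <AVFoundation/AVFoundation.h>
#import <AVKit/AVKit.h>

@interface CameraViewController : NSViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoDataOutput;
@property (nonatomic, strong) AVCaptureView *captureView;
@end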

The method that converts the video stream into an image is as follows:

// CMSampleBufferRef -> CVImageBufferRef -> CGContextRef -> CGImageRef -> NSImage
// Note: the caller owns the returned CGImageRef and must CGImageRelease() it
- (CGImageRef)DataFromCMSampleBufferRef:(CMSampleBufferRef)sampleBuffer
{
    // Get the sample buffer's Core Video image buffer
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer while we read it
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the number of bytes per row of the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent gray color space.
    // This interprets the raw buffer as 8-bit grayscale; it only renders
    // correctly if the output's pixel format is a matching single-channel one.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
    // Create a bitmap graphics context over the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace, kCGImageAlphaNone);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer (exactly once, after we are done with its memory)
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // Free the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    return quartzImage;
}
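The grayscale context above only renders correctly when the pixel buffer really holds single-channel 8-bit data. If you pin the data output to BGRA instead (a common choice), the matching setup might look like this sketch, assuming videoSettings were applied when the session was configured:

// When configuring the output, request BGRA frames:
self.videoDataOutput.videoSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};

// ...and build the bitmap context with an RGB color space instead:
CGColorSpaceRef rgbSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef rgbContext = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                bytesPerRow, rgbSpace,
                                                kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef rgbImage = CGBitmapContextCreateImage(rgbContext);
CGContextRelease(rgbContext);
CGColorSpaceRelease(rgbSpace);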

Note:
To display the video stream in a window, you must host the video in an AVCaptureView; otherwise nothing is shown. (Of course, some people instead convert the frames to images and display them in an NSImageView, but that is a separate approach.)

    // Either add the capture view as a subview...
    [self.view addSubview:self.captureView];
    self.captureView.controlsStyle = AVCaptureViewControlsStyleFloating;
    self.captureView.delegate = self;
    // ...or make it the window's content view directly
    [[NSApplication sharedApplication].mainWindow setContentView:self.captureView];
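If self.captureView has not been given a session yet, AVCaptureView can bind one directly; a sketch using the session built earlier:

// Attach the running session, previewing video but not audio
[self.captureView setSession:self.session
            showVideoPreview:YES
            showAudioPreview:NO];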

Extension:
Some might argue that you could simply take a screenshot to get the image. That also works, but when capturing a view this way, remember to lock focus first:

+ (NSImage *)viewToImage:(NSView *)m_view
{
    // Lock focus on the view
    [m_view lockFocus];
    // Render the view's contents into an image
    NSImage *image = [[NSImage alloc] initWithData:[m_view dataWithPDFInsideRect:[m_view bounds]]];
    [m_view unlockFocus];
    return image;
}
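For example, to snapshot the whole content view (assuming this class method lives on the same controller):

NSImage *snapshot = [[self class] viewToImage:self.view];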

For further processing (for example with OpenCV), you will also need to convert between NSImage and CGImageRef.
NSImage -> CGImageRef:

- (CGImageRef)nsImageToCGImageRef:(NSImage *)image
{
    NSData *imageData = [image TIFFRepresentation];
    CGImageRef imageRef = NULL;
    if (imageData) {
        CGImageSourceRef imageSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
        if (imageSource) {
            imageRef = CGImageSourceCreateImageAtIndex(imageSource, 0, NULL);
            CFRelease(imageSource); // release the source; the caller owns imageRef
        }
    }
    return imageRef;
}
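On macOS 10.6 and later there is also a one-line alternative; note that the image returned this way is not owned by the caller, unlike the method above:

// Let NSImage produce the CGImage directly (not a +1 reference)
CGImageRef cgImage = [image CGImageForProposedRect:NULL context:nil hints:nil];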


CGImageRef -> NSImage:

+ (NSImage *)imageFromCGImageRef:(CGImageRef)image
{
    // Get the image dimensions
    NSRect imageRect = NSMakeRect(0.0, 0.0, CGImageGetWidth(image), CGImageGetHeight(image));
    // Create a new image to receive the Quartz image data
    NSImage *newImage = [[NSImage alloc] initWithSize:imageRect.size];

    [newImage lockFocus];
    // Get the current Quartz context and draw
    CGContextRef imageContext = [[NSGraphicsContext currentContext] CGContext];
    CGContextDrawImage(imageContext, NSRectToCGRect(imageRect), image);
    [newImage unlockFocus];

    return newImage;
}
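Similarly, on macOS 10.6 and later the reverse conversion has a one-line form:

// Wrap the CGImage directly instead of redrawing it
NSImage *newImage = [[NSImage alloc] initWithCGImage:image
                                                size:NSMakeSize(CGImageGetWidth(image),
                                                                CGImageGetHeight(image))];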