Obtaining Depth Data from an RGBD Camera with OpenNI 2

  • NiViewer

  After installing the camera driver and OpenNI, you will find a program called NiViewer in the Tools folder. Its basic controls are:
  1. ESC quits NiViewer

  2. Right-click brings up the control options

  3. The keys 0 through 9 plus "-" and "=" (12 keys in total) switch between display modes

  4. M toggles mirroring of the displayed image

  5. F toggles full-screen mode

  When running NiViewer with a device that has no RGB camera, only the depth image is shown and the RGB panel stays disabled. With a device that does have an RGB camera, the left half of the NiViewer window shows the depth image from the 3D camera and the right half shows the RGB image.

  Because an RGBD camera streams a large amount of image and depth data, pay attention to the available USB bandwidth. When bandwidth is insufficient the stream may stutter or misbehave, so avoid sharing the bus with other USB devices that transfer large volumes of data.

  • How the depth sensor works

  Like any optical device, a depth sensor has inherent limits on accuracy and working range, and knowing them is decisive for any development built on the device. The Xtion uses a structured-light technique based on a projected speckle pattern, so whether a usable speckle pattern forms determines whether depth can be measured at all. In a captured depth image, black regions mark pixels where no reliable depth could be obtained. Very close to the lens no depth is returned, because at such short range no valid speckle pattern can form. Ceiling light fixtures also return no depth: the Xtion is an active optical sensor, and direct light sources strongly interfere with the receiver. Finally, the most distant parts of the scene are black as well, because beyond the working range the speckle pattern again cannot form reliably.

  Depth is measured by projecting infrared laser light, which forms a speckle pattern on object surfaces through diffuse reflection; a CMOS sensor then receives the returned infrared light. Consequently, surface material affects the measurement: the camera cannot obtain valid depth from transparent or mirror-like materials. Occlusion is another failure case. When an object is hidden behind an opaque object, the infrared laser cannot reach it, nothing is reflected back to the receiver, and the occluded object gets no valid depth value.
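Such invalid pixels show up in the raw data as a depth value of 0. As a minimal sketch (the helper name countInvalidDepth is our own, not part of OpenNI), a frame can be scanned to gauge how much of it carries no valid measurement:

```cpp
#include <cstdint>
#include <cstddef>

// OpenNI reports "no depth" as the value 0. Given a raw depth buffer
// (DepthPixel is a 16-bit unsigned integer), count the pixels that
// carry no valid measurement.
std::size_t countInvalidDepth(const uint16_t* pDepth, std::size_t pixelCount)
{
    std::size_t invalid = 0;
    for (std::size_t i = 0; i < pixelCount; ++i)
        if (pDepth[i] == 0)
            ++invalid;
    return invalid;
}
```

Running this over each frame gives a quick sanity check for range, lighting, and occlusion problems before any further processing.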

  • Reading depth data

  The steps for creating a project are covered in the Getting Started chapter of the OpenNI documentation. There are two ways to read data from a stream, polling and event-driven:

  1. Polling:

#include <stdio.h>
#include <OpenNI.h> 

#include "OniSampleUtilities.h"

#define SAMPLE_READ_WAIT_TIMEOUT 2000 //2000ms

using namespace openni;  // The entire C++ API is available under the openni namespace.

int main(int argc, char* argv[])
{
    // Be sure to call openni::OpenNI::initialize(), to make sure all drivers are loaded
    // If no drivers are found, this function will fail. If it does, you can get some basic
    // log by calling openni::OpenNI::getExtendedError() (which returns a string)
    Status rc = OpenNI::initialize();
    if (rc != STATUS_OK)
    {
        // getExtendedError() method returns additional, human-readable information about the error.
        printf("Initialize failed\n%s\n", OpenNI::getExtendedError());
        return 1;
    }

    // Provides an interface to a single sensor device connected to the system. Requires OpenNI
    // to be initialized before it can be created. Devices provide access to Streams.
    Device device; 

     // Device::open():  connects to a physical hardware device
     // This function returns a status code indicating either success or what error occurred.
    if (argc < 2)
        rc = device.open(ANY_DEVICE);
    else
        rc = device.open(argv[1]);

    if (rc != STATUS_OK)
    {
        printf("Couldn't open device\n%s\n", OpenNI::getExtendedError());
        return 2;
    }

    VideoStream depth;  // VideoStream: Abstracts a single video stream. Obtained from a specific Device.

    // Get the SensorInfo for a specific sensor type on this device
    // The SensorInfo is useful for determining which video modes are supported by the sensor
    // Parameters: sensorType of sensor to get information
    // Returns: SensorInfo object corresponding to the sensor type specified, or NULL if such a sensor is not available from this device.
    if (device.getSensorInfo(SENSOR_DEPTH) != NULL)
    {
        // Before VideoStream object can be used, this object must be initialized with the VideoStream::create() function.
        // The create() function requires a valid initialized device. Once created,
        // you should call the VideoStream::start() function to start the flow of data
        rc = depth.create(device, SENSOR_DEPTH);
        if (rc != STATUS_OK)
        {
            printf("Couldn't create depth stream\n%s\n", OpenNI::getExtendedError());
            return 3;
        }
    }

    rc = depth.start();  // Start data generation from the video stream
    if (rc != STATUS_OK)
    {
        printf("Couldn't start the depth stream\n%s\n", OpenNI::getExtendedError());
        return 4;
    }

    VideoFrameRef frame;  // VideoFrameRef: Abstracts a single video frame and related meta-data. Obtained from a specific Stream.

    // Polling based data reading
    while (!wasKeyboardHit())
    {
        int changedStreamDummy;
        VideoStream* pStream = &depth;

        // A system of polling for stream access can be implemented by using the OpenNI::waitForAnyStream() function.
        // When called, it blocks until any of the streams in the list have new data available, or the timeout has passed.
        // It then returns a status code and indicates which stream has data available.
        rc = OpenNI::waitForAnyStream(&pStream, 1, &changedStreamDummy, SAMPLE_READ_WAIT_TIMEOUT);
        if (rc != STATUS_OK)
        {
            printf("Wait failed! (timeout is %d ms)\n%s\n", SAMPLE_READ_WAIT_TIMEOUT, OpenNI::getExtendedError());
            continue;
        }

        // Once a VideoStream has been created, data can be read from it directly with the VideoStream::readFrame() function.
        rc = depth.readFrame(&frame);
        if (rc != STATUS_OK)
        {
            printf("Read failed!\n%s\n", OpenNI::getExtendedError());
            continue;
        }

        // getVideoMode() can be used to determine the video mode settings of the sensor
        // This information includes the pixel format and resolution of the image, as well as the frame rate
        // PIXEL_FORMAT_DEPTH_1_MM: The values are in depth pixel with 1mm accuracy
        if (frame.getVideoMode().getPixelFormat() != PIXEL_FORMAT_DEPTH_1_MM && frame.getVideoMode().getPixelFormat() != PIXEL_FORMAT_DEPTH_100_UM)
        {
            printf("Unexpected frame format\n");
            continue;
        }

        // VideoFrameRef::getData() function that returns a pointer directly to the underlying frame data.
        // This will be a void pointer, so it must be cast using the data type of the individual pixels in order to be properly indexed.
        DepthPixel* pDepth = (DepthPixel*)frame.getData();

        // getHeight() and getWidth() functions are provided to easily determine the resolution of the frame
        int middleIndex = (frame.getHeight() + 1) * frame.getWidth() / 2;

        // getTimestamp: Provides timestamp of frame, measured in microseconds from an arbitrary zero
        printf("[%08llu] %8d\n", (long long)frame.getTimestamp(), pDepth[middleIndex]); 

        // printf format notes: a width like "%8d" pads with spaces to at least
        // 8 characters (wider values print in full); a leading 0, as in "%08llu",
        // pads with zeros instead; "u" prints an unsigned decimal ("%llu" for long long).
    }

    depth.stop();    // Stop the flow of data

    depth.destroy(); // Destroy the stream

    // The close() function properly shuts down the hardware device. As a best practice, any device
    // that is opened should be closed. This will leave the driver and hardware device in a known
    // state so that future applications will not have difficulty connecting to them.
    device.close();

    // When an application is ready to exit, the OpenNI::shutdown() function
    // should be called to shutdown all drivers and properly clean up.
    OpenNI::shutdown();

    return 0;
}

  2. Event-driven:

#include <stdio.h>
#include "OpenNI.h"

#include "OniSampleUtilities.h"

using namespace openni;

// VideoFrameRef class encapsulates all data related to a single frame read from a VideoStream
void analyzeFrame(const VideoFrameRef& frame)
{
    DepthPixel* pDepth;
    RGB888Pixel* pColor;  // This structure stores a single color pixel value(in 24-bit RGB format).

    int middleIndex = (frame.getHeight()+1)*frame.getWidth()/2;

    // getVideoMode() can be used to determine the video mode settings of the sensor
    // This information includes the pixel format and resolution of the image, as well as the frame rate
    switch (frame.getVideoMode().getPixelFormat())
    {
        case PIXEL_FORMAT_DEPTH_1_MM:
        case PIXEL_FORMAT_DEPTH_100_UM:
            // VideoFrameRef::getData() returns a pointer directly to the underlying frame data.
            pDepth = (DepthPixel*)frame.getData();
            printf("[%08llu] %8d\n", (long long)frame.getTimestamp(),
                pDepth[middleIndex]);
            break;
        case PIXEL_FORMAT_RGB888:
            pColor = (RGB888Pixel*)frame.getData();
            printf("[%08llu] 0x%02x%02x%02x\n", (long long)frame.getTimestamp(),
                pColor[middleIndex].r&0xff,
                pColor[middleIndex].g&0xff,
                pColor[middleIndex].b&0xff);
            break;
        default:
            printf("Unknown format\n");
    }
}

// The VideoStream::NewFrameListener class is provided to allow the implementation of event driven frame reading.
// To use it, create a class that inherits from it and override the onNewFrame() method
class PrintCallback : public VideoStream::NewFrameListener
{
public:
    void onNewFrame(VideoStream& stream)
    {

        // Once a VideoStream has been created, data can be read from it directly with the VideoStream::readFrame() function.
        stream.readFrame(&m_frame);

        // VideoFrameRef objects are obtained from calling VideoStream::readFrame()
        analyzeFrame(m_frame);
    }
private:
    VideoFrameRef m_frame;
};

class OpenNIDeviceListener : public OpenNI::DeviceConnectedListener,
                                    public OpenNI::DeviceDisconnectedListener,
                                    public OpenNI::DeviceStateChangedListener
{
public:
    // All three events provide a pointer to an openni::DeviceInfo object. This object can be used to get
    // details and identification of the device referred to by the event
    virtual void onDeviceStateChanged(const DeviceInfo* pInfo, DeviceState state)
    {
        printf("Device \"%s\" error state changed to %d\n", pInfo->getUri(), state);
    }

    virtual void onDeviceConnected(const DeviceInfo* pInfo)
    {
        printf("Device \"%s\" connected\n", pInfo->getUri());
    }

    virtual void onDeviceDisconnected(const DeviceInfo* pInfo)
    {
        printf("Device \"%s\" disconnected\n", pInfo->getUri());
    }
};

int main()
{
    Status rc = OpenNI::initialize();
    if (rc != STATUS_OK)
    {
        printf("Initialize failed\n%s\n", OpenNI::getExtendedError());
        return 1;
    }

    OpenNIDeviceListener devicePrinter;

    // OpenNI defines three events: onDeviceConnected, onDeviceDisconnected, and onDeviceStateChanged
    // Listener classes can be added or removed from the list of event handlers
    OpenNI::addDeviceConnectedListener(&devicePrinter);
    OpenNI::addDeviceDisconnectedListener(&devicePrinter);
    OpenNI::addDeviceStateChangedListener(&devicePrinter);

    openni::Array<openni::DeviceInfo> deviceList;
    // OpenNI::enumerateDevices() returns a list of all available devices connected to the system.
    openni::OpenNI::enumerateDevices(&deviceList);
    for (int i = 0; i < deviceList.getSize(); ++i)
    {
        // DeviceInfo::getUri returns the device URI. The URI can be used by Device::open to open a specific device.
        printf("Device \"%s\" already connected\n", deviceList[i].getUri());
    }

    Device device;
    rc = device.open(ANY_DEVICE);
    if (rc != STATUS_OK)
    {
        printf("Couldn't open device\n%s\n", OpenNI::getExtendedError());
        return 2;
    }

    VideoStream depth;

    if (device.getSensorInfo(SENSOR_DEPTH) != NULL)
    {
        rc = depth.create(device, SENSOR_DEPTH);
        if (rc != STATUS_OK)
        {
            printf("Couldn't create depth stream\n%s\n", OpenNI::getExtendedError());
        }
    }
    rc = depth.start();
    if (rc != STATUS_OK)
    {
        printf("Couldn't start the depth stream\n%s\n", OpenNI::getExtendedError());
    }

    PrintCallback depthPrinter;

    // Register your created class with an active VideoStream using the VideoStream::addNewFrameListener() function.
    // Once this is done, the event handler function you implemented will be called whenever a new frame becomes available.
    depth.addNewFrameListener(&depthPrinter);

    // Wait while we're getting frames through the printer
    while (!wasKeyboardHit())
    {
        Sleep(100);
    }

    // Removes a Listener from this video stream list. The listener removed will no longer receive new frame events from this stream
    depth.removeNewFrameListener(&depthPrinter);

    depth.stop();
    depth.destroy();
    device.close();
    OpenNI::shutdown();

    return 0;
}

  Each pixel in the depth image stores the distance between that point and the camera, that is, its depth. The frame can be treated as a one-dimensional array of depth pixels of size nXRes*nYRes (X-axis pixels * Y-axis pixels). With the output mode set to 320*240 at 30 frames per second, once the depth stream starts it produces a 320*240 depth map 30 times per second.

  The code declares a pointer pDepth of type DepthPixel; in essence it points to the one-dimensional array holding the depth map. The index of the image's center pixel, middleIndex, is then computed as:

        // getHeight() and getWidth() functions are provided to easily determine the resolution of the frame
        int middleIndex = (frame.getHeight() + 1) * frame.getWidth() / 2;
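The frame is stored row-major, so the general rule for locating the pixel at column x, row y is y * width + x; the middleIndex formula above is that rule applied to the center pixel. A small sketch (depthIndex is our own helper name):

```cpp
// A depth frame is a flat row-major array of DepthPixel values.
// The pixel at column x, row y of a frame of the given width lives
// at index y * width + x.
int depthIndex(int width, int x, int y)
{
    return y * width + x;
}
```

For a 320*240 frame, the center pixel (x = 160, y = 120) lands at index 38560, which matches the (height + 1) * width / 2 expression used in the sample.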

  Note that each pixel value in the depth map is the depth itself, in mm. When converting to real-world coordinates (World coordinates) with convertDepthToWorld, the Z coordinate is unchanged: given (X, Y, Z) in the Depth coordinate system, the function computes the World-coordinate position (X', Y', Z), where X and Y are indices into the depth map (X the column, Y the row). In other words, Z means the same thing in OpenNI's Depth and World coordinate systems. So if you only need the depth of a single point, there is no need to convert to World coordinates; you can work with the Z value directly in the projective coordinate system. To measure the distance between two points, however, you do need the conversion, so that the World-coordinate X and Y values become available.
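As a sketch of the two-point case: convert each endpoint with openni::CoordinateConverter::convertDepthToWorld() (which needs a live depth stream, so it appears only in the comment below), then take the ordinary Euclidean distance. The struct and function names here are our own illustrations, not OpenNI API:

```cpp
#include <cmath>

// Coordinates in millimetres, as produced by e.g.
// openni::CoordinateConverter::convertDepthToWorld(depthStream, x, y, z,
//                                                  &p.x, &p.y, &p.z);
struct WorldPoint { float x, y, z; };

// Euclidean distance between two points expressed in World coordinates.
float worldDistance(const WorldPoint& a, const WorldPoint& b)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}
```

Because Z passes through the conversion unchanged, only the X and Y components actually depend on convertDepthToWorld.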

  OpenNI applications commonly use two different coordinate systems to represent depth. These two systems are referred to as Depth and World representation. Depth coordinates are the native data representation. In this system, the frame is a map (two dimensional array), and each pixel is assigned a depth value. This depth value represents the distance between the camera plane and whatever object is in the given pixel. The X and Y coordinates are simply the location in the map, where the origin is the top-left corner of the field of view.

  World coordinates superimpose a more familiar 3D Cartesian coordinate system on the world, with the camera lens at the origin. In this system, every point is specified by 3 points – x, y and z. The x axis of this system is along a line that passes through the infrared projector and CMOS imager of the camera. The y axis is parallel to the front face of the camera, and perpendicular to the x axis (it will also be perpendicular to the ground if the camera is upright and level). The z axis runs into the scene, perpendicular to both the x and y axis. From the perspective of the camera, an object moving from left to right is moving along the increasing x axis. An object moving up is moving along the increasing y axis, and an object moving away from the camera is moving along the increasing z axis.

  Mathematically, the Depth coordinate system is the projection of the scene on the CMOS. If the sensor's angular field of view and resolution are known, then an angular size can be calculated for each pixel. This is how the conversion algorithms work.
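To make that concrete, here is one plausible form of the per-pixel conversion, assuming a known horizontal and vertical field of view. This is an illustrative reconstruction, not OpenNI's actual implementation; in real use the FOV values would come from the device rather than being hard-coded:

```cpp
#include <cmath>

// Illustrative depth-to-world conversion derived from the field of view.
// resX/resY: depth map resolution; hFov/vFov: horizontal and vertical
// field of view in radians. Z passes through unchanged.
void depthToWorld(float depthX, float depthY, float z,
                  int resX, int resY, float hFov, float vFov,
                  float* wx, float* wy, float* wz)
{
    *wx = (depthX / resX - 0.5f) * z * 2.0f * std::tan(hFov / 2.0f);
    // Image Y grows downward while world Y grows upward, hence the flip.
    *wy = (0.5f - depthY / resY) * z * 2.0f * std::tan(vFov / 2.0f);
    *wz = z;
}
```

The center pixel maps to world (0, 0, z), and the per-pixel angular size the text mentions falls out of dividing the field of view by the resolution.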

  Note that converting from Depth to World coordinates is relatively expensive computationally. It is generally not practical to convert the entire raw depth map to World coordinates. A better approach is to have your computer vision algorithm work in Depth coordinates for as long as possible, and only converting a few specific points to World coordinates right before output. Note that when converting from Depth to World or vice versa, the Z value remains the same.

