Making Stereo Images (2): Rendering Stereo Images with Ogre

Once you understand how red-cyan glasses work, the rest is simple.

If you are not familiar with the red-cyan glasses principle, please read the previous post first: Making Stereo Images (1): The Red-Cyan Glasses Principle

You should also have a pair of red-cyan glasses ready.

Now put on the glasses and take a look at the final result we are aiming for, a rotating stereoscopic Earth:

(This is, of course, a static screenshot.)

First, the implementation principle:

  1. Create a sphere model at the coordinate origin and apply an Earth texture to it
  2. Create two cameras at suitable positions and render each camera's output to a left and a right texture
  3. Draw a full-screen quad with a stereo material applied; in the material, a shader mixes the red, green, and blue channels of the two textures from step 2. This full-screen quad is the final result we want

The steps in detail:

  1. Create the 3D model
    The most important part of this step is a high-resolution Earth texture, something like the one below.

    Fortunately NASA has long since prepared such images; you can download anything you want from here (they really are impressive).
    The code for creating the Earth mesh has also been written for you; see this function in the attached source:

    //Create a mesh with the given name, radius, and number of rings and segments
    void MyApplication::createSphere(const std::string& meshName, const float r, const int nRings, const int nSegments)
  2. Camera setup
    Render to texture: the left eye uses the main camera mCamera, and a separate camera must be created for the right eye

        //Left-eye texture
        Ogre::TexturePtr textureLeft = Ogre::TextureManager::getSingleton().createManual("textureLeft", Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME, Ogre::TEX_TYPE_2D, mWindow->getWidth(), mWindow->getHeight(), 0, Ogre::PF_R8G8B8, Ogre::TU_RENDERTARGET);
        Ogre::RenderTexture *targetLeft = textureLeft->getBuffer()->getRenderTarget();
        targetLeft->addViewport(mCamera);
    
        //Right-eye texture
        Ogre::Camera *cameraRight = mSceneMgr->createCamera("cameraRight");
        ...same as above...;

    Set up the cameras:

    //Set the camera positions and focal length
        const int x = 10, y = 150, z = 400;
        mCamera->setPosition(-x, y, z);
        cameraRight->setPosition(x, y, z);
        mCamera->lookAt(0, 0, 0);
        cameraRight->lookAt(0, 0, 0);
        mCamera->setFrustumOffset(x + x);
        mCamera->setFocalLength(Ogre::Math::Sqrt(x*x + y*y + z*z));

    setFrustumOffset and setFocalLength are helper methods Ogre provides for stereo rendering; they shift the view frustum sideways to account for the eye offset.
    By choosing a very long focal length together with a very small fovy, you can make the Earth appear very far away and very large.

  3. The full-screen quad: the final output
    Here we use Ogre::Rectangle2D:

        mScreen = new Ogre::Rectangle2D(true);
        mScreen->setCorners(-1, 1, 1, -1, true);
        mScreen->setMaterial("stereo/fpCG");

    The material stereo/fpCG is defined as follows (a Cg script is used so the same code works under both Direct3D and OpenGL, and it keeps the script short):

    fragment_program fpCG cg
    {
        source stereo.cg
        entry_point RedCyan
        profiles ps_2_0 arbfp1
    }
    
    material stereo/fpCG
    {
        technique
        {
            pass
            {
                fragment_program_ref fpCG
                {
                }

                texture_unit
                {
                    texture textureLeft
                }

                texture_unit
                {
                    texture textureRight
                }
            }
        }
    }

    The material script references the two render textures, textureLeft and textureRight, produced by the left and right cameras, and the RedCyan fragment shader

    The Cg code, stereo.cg:

    void RedCyan(
        float2 uv : TEXCOORD0,
        out float4 color :COLOR,
        uniform sampler2D t1 : register(s0),
        uniform sampler2D t2 : register(s1))
    {
        color = float4(tex2D(t1, uv) * float4(1, 0, 0, 0) + tex2D(t2, uv) * float4(0, 1, 1, 1));
    }

    It simply takes the red component from the left texture and the green and blue components from the right texture.
    Note the multiply-then-add form used here: fetching both texture colors first and then extracting components, e.g. color = float4(c1.r, c2.g, c2.b, 1), turns out to be incompatible with Direct3D, and I don't know why :(

  4. Other details
    Because we use a full-screen quad, it must be hidden while the left and right cameras render to their textures; otherwise the quad itself may end up in those textures.
    To handle this, implement the RenderTargetListener interface and toggle the quad's visibility before and after each render-to-texture update:

        virtual void preRenderTargetUpdate(const Ogre::RenderTargetEvent& evt)
        {
            mScreen->setVisible(false);
        }
        virtual void postRenderTargetUpdate(const Ogre::RenderTargetEvent& evt)
        {
            mScreen->setVisible(true);
        }

    Also register the corresponding listeners in createScene:

        targetLeft->addListener(this);
        targetRight->addListener(this);
  5. Finally, the icing on the cake: make the Earth spin

    bool MyApplication::frameRenderingQueued(const Ogre::FrameEvent &evt)
    {
        mEarthNode->yaw(Ogre::Radian(evt.timeSinceLastFrame * 0.5));
        return true;
    }

    Full program source:

    #pragma once
    
    #include <vector>
    #include <fstream>
    #include <string>
    
    #include <Ogre/Ogre.h>
    #include <OIS/OIS.h>
    
    class MyApplication: public Ogre::RenderTargetListener, public Ogre::FrameListener, public OIS::KeyListener
    {
    public:
        MyApplication(void){
            mSceneMgr = NULL;
            mRoot = NULL;
        }
    
        ~MyApplication(void){
            mInputManager->destroyInputObject(mKeyboard);
            mInputManager->destroyInputObject(mMouse);
            OIS::InputManager::destroyInputSystem(mInputManager);
            delete mRoot;
        }
    
        int startup();
    
    private:
        void createScene();
    
        virtual void preRenderTargetUpdate(const Ogre::RenderTargetEvent& evt)
        {
            mScreen->setVisible(false);
        }
    
        virtual void postRenderTargetUpdate(const Ogre::RenderTargetEvent& evt)
        {
            mScreen->setVisible(true);
        }
    
        Ogre::MovableObject* createSphere();
    
        void createSphere(const std::string& meshName, const float r, const int nRings = 16, const int nSegments = 16);
    
        bool frameStarted(const Ogre::FrameEvent& evt);
    
        bool frameEnded(const Ogre::FrameEvent& evt);
    
        bool frameRenderingQueued(const Ogre::FrameEvent &evt);
    
        bool keyPressed(const OIS::KeyEvent &e);
    
        bool keyReleased(const OIS::KeyEvent &e) { return true; }
    
    void _createAxis(const int length); //Create the coordinate axes: x red, y green, z blue
    
    void _loadResources(const char* resourceFile);
    
        void _createInput();
    
        void _showDebugOverlay(bool show);
    
        void _updateStats(void);
    
        void _keyPressedDefault(const OIS::KeyEvent &e);
    
    //Default keyboard and mouse navigation
        bool _navigateDefault(const Ogre::FrameEvent& evt);
    
        Ogre::SceneManager* mSceneMgr;
        Ogre::RenderWindow* mWindow;
        Ogre::Camera* mCamera;
        Ogre::Root* mRoot;
    Ogre::SceneNode* mRootNode;        //Root node
    
        OIS::InputManager* mInputManager;
        OIS::Keyboard* mKeyboard;
        OIS::Mouse* mMouse;
    
        Ogre::SceneNode* mEarthNode;
        Ogre::Rectangle2D* mScreen;
    
    int mNumScreenShots;    //Screenshot sequence number
    
        bool mStatsOn;
        Ogre::Overlay* mDebugOverlay;
    };

    MyApplication.h

    //Parts that change often
    #include "MyApplication.h"
    
    void MyApplication::createScene()
    {
        mEarthNode = mRootNode->createChildSceneNode();
        mEarthNode->attachObject(createSphere());
    
        //Left-eye texture
        Ogre::TexturePtr textureLeft = Ogre::TextureManager::getSingleton().createManual("textureLeft", Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME, Ogre::TEX_TYPE_2D, mWindow->getWidth(), mWindow->getHeight(), 0, Ogre::PF_R8G8B8, Ogre::TU_RENDERTARGET);
        Ogre::RenderTexture *targetLeft = textureLeft->getBuffer()->getRenderTarget();
        targetLeft->addViewport(mCamera);
    
        //Right-eye texture
        Ogre::Camera *cameraRight = mSceneMgr->createCamera("cameraRight");
        cameraRight->setAspectRatio(Ogre::Real(mWindow->getWidth()) / Ogre::Real(mWindow->getHeight()));
        Ogre::TexturePtr textureRight = Ogre::TextureManager::getSingleton().createManual("textureRight", Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME, Ogre::TEX_TYPE_2D, mWindow->getWidth(), mWindow->getHeight(), 0, Ogre::PF_R8G8B8, Ogre::TU_RENDERTARGET);
        Ogre::RenderTexture *targetRight = textureRight->getBuffer()->getRenderTarget();
        targetRight->addViewport(cameraRight);
    
        //Set the camera positions and focal length
        const int x = 10, y = 150, z = 400;
        mCamera->setPosition(-x, y, z);
        cameraRight->setPosition(x, y, z);
        mCamera->lookAt(0, 0, 0);
        cameraRight->lookAt(0, 0, 0);
        mCamera->setFrustumOffset(x + x);
        mCamera->setFocalLength(Ogre::Math::Sqrt(x*x + y*y + z*z));
    
        mScreen = new Ogre::Rectangle2D(true);
        mScreen->setCorners(-1, 1, 1, -1, true);
        mScreen->setMaterial("stereo/fpCG");
        mRootNode->attachObject(mScreen);
    
        targetLeft->addListener(this);
        targetRight->addListener(this);
    }
    
    bool MyApplication::keyPressed(const OIS::KeyEvent &e)
    {
        _keyPressedDefault(e);
    
        return true;
    }
    
    bool MyApplication::frameStarted(const Ogre::FrameEvent& evt)
    {
        //if(!_navigateDefault(evt)) return false;
        mKeyboard->capture();
        if(mKeyboard->isKeyDown(OIS::KC_ESCAPE)){
            return false;
        }
    
        return true;
    }
    
    bool MyApplication::frameEnded(const Ogre::FrameEvent& evt){
        _updateStats();
    
        return true;
    }
    
    bool MyApplication::frameRenderingQueued(const Ogre::FrameEvent &evt)
    {
        mEarthNode->yaw(Ogre::Radian(evt.timeSinceLastFrame * 0.5));
        return true;
    }
    
    Ogre::MovableObject* MyApplication::createSphere(){
        createSphere("mySphereMesh", 100, 100, 100);
        Ogre::Entity* sphereEntity = mSceneMgr->createEntity ("mySphereEntity", "mySphereMesh");
        sphereEntity->setMaterialName("Test/earth");
    
        return sphereEntity;
    }
    
    //Create a mesh with the given name, radius, and number of rings and segments
    void MyApplication::createSphere(const std::string& meshName, const float r, const int nRings, const int nSegments)
    {
        Ogre::MeshPtr pSphere = Ogre::MeshManager::getSingleton().createManual(meshName, Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME);
        Ogre::SubMesh *pSphereVertex = pSphere->createSubMesh();
    
        Ogre::VertexData* vertexData = new Ogre::VertexData();
        pSphere->sharedVertexData = vertexData;
    
        // define the vertex format
        Ogre::VertexDeclaration* vertexDecl = vertexData->vertexDeclaration;
        size_t currOffset = 0;
        // positions
        vertexDecl->addElement(0, currOffset, Ogre::VET_FLOAT3, Ogre::VES_POSITION);
        currOffset += Ogre::VertexElement::getTypeSize(Ogre::VET_FLOAT3);
        //// DIFFUSE
        //vertexDecl->addElement(0, currOffset, VET_FLOAT3, Ogre::VES_DIFFUSE);
        //currOffset += VertexElement::getTypeSize(VET_FLOAT3);
        // normals
        vertexDecl->addElement(0, currOffset, Ogre::VET_FLOAT3, Ogre::VES_NORMAL);
        currOffset += Ogre::VertexElement::getTypeSize(Ogre::VET_FLOAT3);
        //// two dimensional texture coordinates
        vertexDecl->addElement(0, currOffset, Ogre::VET_FLOAT2, Ogre::VES_TEXTURE_COORDINATES, 0);
        currOffset += Ogre::VertexElement::getTypeSize(Ogre::VET_FLOAT2);
    
        // allocate the vertex buffer
        vertexData->vertexCount = (nRings + 1) * (nSegments+1);
        Ogre::HardwareVertexBufferSharedPtr vBuf = Ogre::HardwareBufferManager::getSingleton().createVertexBuffer(vertexDecl->getVertexSize(0), vertexData->vertexCount, Ogre::HardwareBuffer::HBU_STATIC_WRITE_ONLY, false);
        Ogre::VertexBufferBinding* binding = vertexData->vertexBufferBinding;
        binding->setBinding(0, vBuf);
        float* pVertex = static_cast<float*>(vBuf->lock(Ogre::HardwareBuffer::HBL_DISCARD));
    
        // allocate index buffer
        pSphereVertex->indexData->indexCount = 6 * nRings * (nSegments + 1);
        pSphereVertex->indexData->indexBuffer = Ogre::HardwareBufferManager::getSingleton().createIndexBuffer(Ogre::HardwareIndexBuffer::IT_16BIT, pSphereVertex->indexData->indexCount, Ogre::HardwareBuffer::HBU_STATIC_WRITE_ONLY, false);
        Ogre::HardwareIndexBufferSharedPtr iBuf = pSphereVertex->indexData->indexBuffer;
        unsigned short* pIndices = static_cast<unsigned short*>(iBuf->lock(Ogre::HardwareBuffer::HBL_DISCARD));
    
        float fDeltaRingAngle = float(Ogre::Math::PI / nRings);
        float fDeltaSegAngle = float(2 * Ogre::Math::PI / nSegments);
        unsigned short wVerticeIndex = 0 ;
    
        // Generate the group of rings for the sphere
        for( int ring = 0; ring <= nRings; ring++ ) {
            float r0 = r * sinf (ring * fDeltaRingAngle);
            float y0 = r * cosf (ring * fDeltaRingAngle);
    
            // Generate the group of segments for the current ring
            for(int seg = 0; seg <= nSegments; seg++) {
                float x0 = r0 * sinf(seg * fDeltaSegAngle);
                float z0 = r0 * cosf(seg * fDeltaSegAngle);
    
                // Add one vertex to the strip which makes up the sphere
                *pVertex++ = x0;
                *pVertex++ = y0;
                *pVertex++ = z0;
    
                Ogre::Vector3 vNormal = Ogre::Vector3(x0, y0, z0).normalisedCopy();
                *pVertex++ = vNormal.x;
                *pVertex++ = vNormal.y;
                *pVertex++ = vNormal.z;
    
                *pVertex++ = (float) seg / (float) nSegments;
                *pVertex++ = (float) ring / (float) nRings;
    
                if (ring != nRings) {
                    // each vertex (except the last) has six indices pointing to it
                    *pIndices++ = wVerticeIndex + nSegments + 1;
                    *pIndices++ = wVerticeIndex;
                    *pIndices++ = wVerticeIndex + nSegments;
                    *pIndices++ = wVerticeIndex + nSegments + 1;
                    *pIndices++ = wVerticeIndex + 1;
                    *pIndices++ = wVerticeIndex;
                    wVerticeIndex ++;
                }
            }; // end for seg
        } // end for ring
    
        // Unlock
        vBuf->unlock();
        iBuf->unlock();
        // Generate face list
        pSphereVertex->useSharedVertices = true;
    
        // the original code was missing this line:
        pSphere->_setBounds( Ogre::AxisAlignedBox(Ogre::Vector3(-r, -r, -r), Ogre::Vector3(r, r, r) ), false );
        pSphere->_setBoundingSphereRadius(r);
        // this line makes clear the mesh is loaded (avoids memory leaks)
        pSphere->load();
    }

    MyApplication.cpp

    //Implementation of the parts that rarely change
    #include "MyApplication.h"
    #include "windows.h"
    
    int main(int argc, char *argv[])
    {
        //Set the current working directory (needed when the app is opened via file association)
        std::string file(argv[0]);
        SetCurrentDirectoryA(file.substr(0, file.find_last_of("\\")).c_str());
    
        MyApplication app;
        app.startup();
    }
    
    int MyApplication::startup()
    {
    
    #ifdef _DEBUG
        mRoot = new Ogre::Root("../plugins_d.cfg", "../ogre.cfg", "../Ogre.log");
    #else
        mRoot = new Ogre::Root("../plugins.cfg", "../ogre.cfg", "../Ogre.log");
    #endif
    
        if(!mRoot->showConfigDialog()){
            return -1;
        }
    
        mWindow = mRoot->initialise(true, "Ogre3D");
        mSceneMgr = mRoot->createSceneManager(Ogre::ST_EXTERIOR_CLOSE);
    
        mCamera = mSceneMgr->createCamera("camera");
        mCamera->setPosition(Ogre::Vector3(100, 200, 300));
        mCamera->lookAt(Ogre::Vector3(0, 0, 0));
        mCamera->setNearClipDistance(10); //default [100, 100 * 1000]
    
        Ogre::Viewport* viewport = mWindow->addViewport(mCamera);
        viewport->setBackgroundColour(Ogre::ColourValue(0.0, 0.0, 0.0));
        mCamera->setAspectRatio(Ogre::Real(viewport->getActualWidth())/Ogre::Real(viewport->getActualHeight()));
    
        mRootNode = mSceneMgr->getRootSceneNode();
    
        _loadResources("../resources_testStereo.cfg");
    
        createScene();
        _createAxis(100);
        _createInput();
    
        mDebugOverlay = Ogre::OverlayManager::getSingleton().getByName("Core/DebugOverlay");
        _showDebugOverlay(true);
    
        mRoot->addFrameListener(this);
    
        mRoot->startRendering();
        return 0;
    }
    
    void MyApplication::_createAxis(const int length)
    {
        Ogre::ManualObject *mo = mSceneMgr->createManualObject();
        mo->begin("BaseWhiteNoLighting", Ogre::RenderOperation::OT_LINE_LIST);
        mo->position(length, 0, 0);
        mo->colour(1.0, 0, 0);
        mo->position(0, 0, 0);
        mo->colour(1.0, 0, 0);
        mo->position(0, length, 0);
        mo->colour(0, 1.0, 0);
        mo->position(0, 0, 0);
        mo->colour(0, 1.0, 0);
        mo->position(0, 0, length);
        mo->colour(0, 0, 1.0);
        mo->position(0 , 0, 0);
        mo->colour(0, 0, 1.0);
        mo->end();
        mRootNode->attachObject(mo);
    }
    
    void MyApplication::_loadResources(const char* resourceFile)
    {
        Ogre::ConfigFile cf;
        cf.load(resourceFile);
    
        Ogre::ConfigFile::SectionIterator sectionIter = cf.getSectionIterator();
        Ogre::String sectionName, typeName, dataName;
        while(sectionIter.hasMoreElements()){
            sectionName = sectionIter.peekNextKey();
            Ogre::ConfigFile::SettingsMultiMap *settings = sectionIter.getNext();
            Ogre::ConfigFile::SettingsMultiMap::iterator i;
            for(i=settings->begin(); i!=settings->end(); i++){
                typeName =i->first;
                dataName = i->second;
    
                Ogre::ResourceGroupManager::getSingleton().addResourceLocation(dataName, typeName, sectionName);
            }
        }
    
        Ogre::ResourceGroupManager::getSingleton().initialiseAllResourceGroups();
    }
    
    void MyApplication::_updateStats(void)
    {
        static Ogre::String currFps = "Current FPS: ";
        static Ogre::String avgFps = "Average FPS: ";
        static Ogre::String bestFps = "Best FPS: ";
        static Ogre::String worstFps = "Worst FPS: ";
        static Ogre::String tris = "Triangle Count: ";
        static Ogre::String batches = "Batch Count: ";
    
        // update stats when necessary
        try {
            Ogre::OverlayElement* guiAvg = Ogre::OverlayManager::getSingleton().getOverlayElement("Core/AverageFps");
            Ogre::OverlayElement* guiCurr = Ogre::OverlayManager::getSingleton().getOverlayElement("Core/CurrFps");
            Ogre::OverlayElement* guiBest = Ogre::OverlayManager::getSingleton().getOverlayElement("Core/BestFps");
            Ogre::OverlayElement* guiWorst = Ogre::OverlayManager::getSingleton().getOverlayElement("Core/WorstFps");
    
            const Ogre::RenderTarget::FrameStats& stats = mWindow->getStatistics();
            guiAvg->setCaption(avgFps + Ogre::StringConverter::toString(stats.avgFPS));
            guiCurr->setCaption(currFps + Ogre::StringConverter::toString(stats.lastFPS));
            guiBest->setCaption(bestFps + Ogre::StringConverter::toString(stats.bestFPS)
                +" "+Ogre::StringConverter::toString(stats.bestFrameTime)+" ms");
            guiWorst->setCaption(worstFps + Ogre::StringConverter::toString(stats.worstFPS)
                +" "+Ogre::StringConverter::toString(stats.worstFrameTime)+" ms");
    
            Ogre::OverlayElement* guiTris = Ogre::OverlayManager::getSingleton().getOverlayElement("Core/NumTris");
            guiTris->setCaption(tris + Ogre::StringConverter::toString(std::max((int)stats.triangleCount, 230) - 230));
    
            Ogre::OverlayElement* guiBatches = Ogre::OverlayManager::getSingleton().getOverlayElement("Core/NumBatches");
            guiBatches->setCaption(batches + Ogre::StringConverter::toString((int)stats.batchCount - 10));
    
            //Ogre::OverlayElement* guiDbg = Ogre::OverlayManager::getSingleton().getOverlayElement("Core/DebugText");
            //guiDbg->setCaption("mDebugText");
        }
        catch(...) { /* ignore */ }
    }
    
    void MyApplication::_showDebugOverlay(bool show)
    {
        if (mDebugOverlay)
        {
            if (show)
                mDebugOverlay->show();
            else
                mDebugOverlay->hide();
        }
    }
    
    void MyApplication::_createInput()
    {
        OIS::ParamList parameters;
        unsigned int windowHandle = 0;
        std::ostringstream windowHandleString;
    
        mWindow->getCustomAttribute("WINDOW", &windowHandle);
        windowHandleString<<windowHandle;
        parameters.insert(std::make_pair("WINDOW", windowHandleString.str()));
    
        mInputManager = OIS::InputManager::createInputSystem(parameters);
        mKeyboard = static_cast<OIS::Keyboard*>(mInputManager->createInputObject(OIS::OISKeyboard, true));
        mMouse = static_cast<OIS::Mouse*>(mInputManager->createInputObject(OIS::OISMouse, true));
        mKeyboard->setEventCallback(this);
    }
    
    void MyApplication::_keyPressedDefault(const OIS::KeyEvent &e)
    {
        if(e.key == OIS::KC_SYSRQ)
        {
            std::ostringstream ss;
            ss << "screenshot_" << ++mNumScreenShots << ".png";
            mWindow->writeContentsToFile(ss.str());
        }
        else if(e.key == OIS::KC_G)
        {
            mStatsOn = !mStatsOn;
            _showDebugOverlay(mStatsOn);
        }
        else if(e.key == OIS::KC_R)
        {
            if(mCamera->getPolygonMode() == Ogre::PM_SOLID)
            {
                mCamera->setPolygonMode(Ogre::PM_WIREFRAME);
            }
            else
            {
                mCamera->setPolygonMode(Ogre::PM_SOLID);
            }
        }
    }
    
    //Default keyboard and mouse navigation
    bool MyApplication::_navigateDefault(const Ogre::FrameEvent& evt)
    {
        mKeyboard->capture();
        if(mKeyboard->isKeyDown(OIS::KC_ESCAPE)){
            return false;
        }
    
        Ogre::Vector3 translate(0, 0, 0);
        if(mKeyboard->isKeyDown(OIS::KC_W)){
            translate +=Ogre::Vector3(0, 0, -1);
        }
        if(mKeyboard->isKeyDown(OIS::KC_S)){
            translate += Ogre::Vector3(0, 0, 1);
        }
        if(mKeyboard->isKeyDown(OIS::KC_A)){
            translate += Ogre::Vector3(-1, 0, 0);
        }
        if(mKeyboard->isKeyDown(OIS::KC_D)){
            translate += Ogre::Vector3(1, 0, 0);
        }
        if(mKeyboard->isKeyDown(OIS::KC_Q)){
            translate += mCamera->getOrientation().Inverse() * Ogre::Vector3(0, 1, 0);
        }
        if(mKeyboard->isKeyDown(OIS::KC_E)){
            translate += mCamera->getOrientation().Inverse() *  Ogre::Vector3(0, -1, 0);
        }
    
        Ogre::Real speed = mCamera->getPosition().y;
        if(speed < 5) speed =5;
        mCamera->moveRelative(translate * evt.timeSinceLastFrame *  speed);
    
        if(mKeyboard->isKeyDown(OIS::KC_UP)){
            mCamera->pitch(Ogre::Radian(-evt.timeSinceLastFrame));
        }else if(mKeyboard->isKeyDown(OIS::KC_DOWN)){
            mCamera->pitch(Ogre::Radian(evt.timeSinceLastFrame));
        }
        if(mKeyboard->isKeyDown(OIS::KC_LEFT)){
            mCamera->yaw(Ogre::Radian(evt.timeSinceLastFrame));
        }
        if(mKeyboard->isKeyDown(OIS::KC_RIGHT)){
            mCamera->yaw(Ogre::Radian(-evt.timeSinceLastFrame * 0.3f));
        }
    
        mMouse->capture();
        Ogre::Real rotX = Ogre::Math::Clamp(mMouse->getMouseState().X.rel * evt.timeSinceLastFrame * -1, -0.1f, 0.1f);
        Ogre::Real rotY = Ogre::Math::Clamp(mMouse->getMouseState().Y.rel * evt.timeSinceLastFrame * -1, -0.1f, 0.1f);
    
        mCamera->yaw(Ogre::Radian(rotX));
        mCamera->pitch(Ogre::Radian(rotY));
    
        return true;
    }

    MyApplicationConst.cpp

    

  

Date: 2024-08-29 14:00:24
