A New Webcam API Tutorial in C++ for Windows (Windows Media Foundation, WMF)

Sample source code: http://pan.baidu.com/s/1o60VAEA

Forwarded from: http://www.dreamincode.net/forums/topic/347938-a-new-webcam-api-tutorial-in-c-for-windows/page__st__0%26

Well,

A long time ago I introduced a Webcam Tutorial.

It was for Video for Windows (VFW), an old XP-era API.

To be fair to myself, I was running XP at the time, on a desktop PC with a USB webcam.

The program worked fine under these circumstances. It did what it said on the tin.

Then came Vista, Windows 7 and finally Windows 8.

This muddied the waters somewhat, as VFW was deprecated and performance became patchy

depending on your machine setup, especially on laptops.

Microsoft did their best to keep up with changing technology: they introduced DirectShow,

with GraphBuilders, FilterGraphs and so on. It was possible to make a similar webcam program

with this technology to the one I had made with VFW, but it was much harder to understand and get to grips with.

Then this technology too was deprecated by Microsoft, in favour of a new API called Windows Media Foundation (WMF).

This technology was much harder to understand than DirectShow: it dealt with streams in a very different way and talked about topologies and sink writers. There was a book produced, a very hard to understand book I must say, called "Developing
Microsoft Media Foundation Applications" by Anton Polinger, which eased the pain of understanding the new API a little.

There was an easy to understand and implement side of WMF called MFPlay, with which I could again write a similar program that did the same things I had implemented under VFW.

Then Microsoft deprecated MFPlay... yes, even though it was a brand new API.

So I have watched all this, considered all the options and weighed the best course of action for developing a new webcam tutorial, and I have decided to take the newest still-not-deprecated route: Direct3D 9
rendering fed from WMF. This currently seems to be Microsoft's preferred option, and it has a lot of merit; as you will see, you can do a lot more with WMF and Direct3D than you ever could with VFW.

However, this means this tutorial will be hard to understand; this once simple subject is now very complicated.

I will try my best to guide you safely through it.

I am going to start light, with the resource.rc file.

resource.rc

#include "resource.h"
#include "windows.h"

/////////////////////////////////////////////////////////////////////////////
//
// Menu
//

IDR_MENU1 MENU
BEGIN
    POPUP "&File"
    BEGIN
        MENUITEM "Choose &Device",              ID_FILE_CHOOSEDEVICE
        MENUITEM "Capture Image",               ID_FILE_CAPTURE
    END
END

/////////////////////////////////////////////////////////////////////////////
//
// Dialog
//

IDD_CHOOSE_DEVICE DIALOGEX 0, 0, 186, 90
STYLE DS_SETFONT | DS_MODALFRAME | DS_FIXEDSYS | WS_POPUP | WS_CAPTION | WS_SYSMENU
CAPTION "Select Device"
FONT 8, "MS Shell Dlg", 400, 0, 0x1
BEGIN
    DEFPUSHBUTTON   "OK",IDOK,129,7,50,14
    PUSHBUTTON      "Cancel",IDCANCEL,129,24,50,14
    LISTBOX         IDC_DEVICE_LIST,7,7,110,76,LBS_SORT | LBS_NOINTEGRALHEIGHT | WS_VSCROLL | WS_TABSTOP
END

///////////////////////////////////////////////
//
//Images
//

IMAGE  RCDATA  "text.png"

This code sets up a menu. It has two functions:

  • Choose Device, to choose a webcam device
  • Capture Image, which will grab a still image from the webcam.

It also includes a dialog box for choosing which webcam device to collect data from.

and finally a mystery IMAGE called text.png, which I will supply later and talk about much later.

moving swiftly on...

resource.h

#define IDR_MENU1                       101
#define IDD_CHOOSE_DEVICE               102
#define IDC_LIST1                       1001
#define IDC_DEVICE_LIST                 1002
#define ID_FILE_CHOOSEDEVICE            40001
#define ID_FILE_CAPTURE                 50001
#define IDC_STATIC                      -1

#define IMAGE                          301

This sets up some defines for the resource.rc file: the menu, the dialog box and finally the mystery image.

Before I go any further it would be wise to explain a little about WMF.

WMF uses COM extensively, but it is not a pure COM API; instead, MF is a mix of COM and normal objects. Because it does use COM, at the beginning of your program you must initialize COM by calling CoInitializeEx(), and you must also initialize
WMF by calling MFStartup(). This means that on exiting your application you must shut both of these down again.

Since the headers and libs I will be using are not available outside the MS compilers, this build will not, as is usual for my tutorials, run on any MinGW compiler. I am truly sorry for this, but there was no way to make it so.

I will, however, be aiming for this to run on any Express version of the MS compiler with the Windows 7 SDK and .NET Framework 4.0 installed. Since we are using MS compilers only, I have decided to make this build UNICODE.

The Direct3D version that I will use is 9. This is for backwards compatibility. Those who wish to reprogram it for Direct3D 11 can do so; it is perfectly possible to translate this code.

Most of the common headers I have included in a file called, imaginatively, 'Common.h'. And here it is.

Common.h

#include <d3d9.h>
#include <Windows.h>
#include <WindowsX.h>
#include <mfapi.h>
#include <mfidl.h>
#include <mfreadwrite.h>
#include <mferror.h>
#include <assert.h>
#include <d3dx9.h>
#include <ks.h>
#include <ksmedia.h>
#include <Dbt.h>
#include <tchar.h>
#include <strsafe.h>
#include <AclAPI.h>

template <class T> void SafeRelease(T **ppT)
{
    if (*ppT)
    {
        (*ppT)->Release();
        *ppT = NULL;
    }
}

#define BREAK_ON_FAIL(value)            if(FAILED(value)) break;
#define BREAK_ON_NULL(value, newHr)     if(value == NULL) { hr = newHr; break; }

#include "resource.h"
#include "Device.h"
#include "Preview.h"

Basically, all the headers we need in one file, kept this way for simplicity's sake.

A SafeRelease template, which releases a COM pointer and sets it to NULL.

Some defines, BREAK_ON_FAIL and BREAK_ON_NULL, and two headers, Device.h and Preview.h, which contain classes.

Next up is main.cpp

main.cpp

#if defined(UNICODE) && !defined(_UNICODE)
    #define _UNICODE
#elif defined(_UNICODE) && !defined(UNICODE)
    #define UNICODE
#endif

// Include the v6 common controls in the manifest
#pragma comment(linker, "\"/manifestdependency:type='Win32' " \
                        "name='Microsoft.Windows.Common-Controls' " \
                        "version='6.0.0.0' " \
                        "processorArchitecture='*' " \
                        "publicKeyToken='6595b64144ccf1df' " \
                        "language='*'\"")

#include "Common.h"
#include "resource.h"

#pragma comment(lib,"mfplat.lib")
#pragma message("linking with Microsoft's Media Foundation mfplat library ...")
#pragma comment(lib,"mf.lib")
#pragma message("linking with Microsoft's Media Foundation mf library ...")
#pragma comment(lib,"mfreadwrite.lib")
#pragma message("linking with Microsoft's Media Foundation mfreadwrite library ...")
#pragma comment(lib,"mfuuid.lib")
#pragma message("linking with Microsoft's Media Foundation mfuuid library ...")
#pragma comment(lib,"d3d9.lib")
#pragma message("linking with Microsoft's DirectX 3D 9 library ...")
#pragma comment(lib,"shlwapi.lib")
#pragma message("linking with Microsoft's shlwapi library ...")
#pragma comment(lib,"D3dx9.lib")
#pragma message("linking with Microsoft's DirectX 3DX 9 library ...")
#pragma comment(lib,"Advapi32.lib")
#pragma message("linking with Microsoft's Advapi32 library ...")

//
// ChooseDeviceParam structure
//
// Holds an array of IMFActivate pointers that represent video
// capture devices.
//

struct ChooseDeviceParam
{
    IMFActivate **ppDevices;    // Array of IMFActivate pointers.
    UINT32      count;          // Number of elements in the array.
    UINT32      selection;      // Selected device, by array index.
};

BOOL    InitializeApplication();
BOOL    InitializeWindow(HWND *pHwnd);
void    CleanUp();
INT     MessageLoop(HWND hwnd);

INT_PTR CALLBACK DlgProc(HWND hwnd, UINT uMsg, WPARAM wParam, LPARAM lParam);
void    ShowErrorMessage(PCWSTR format, HRESULT hr);

// Window message handlers
BOOL    OnCreate(HWND hwnd, LPCREATESTRUCT lpCreateStruct);
void    OnClose(HWND hwnd);
void    OnCommand(HWND hwnd, int id, HWND hwndCtl, UINT codeNotify);
void    OnSize(HWND hwnd, UINT state, int cx, int cy);
void    OnDeviceChange(HWND hwnd, DEV_BROADCAST_HDR *pHdr);

// Command handlers
void    OnChooseDevice(HWND hwnd, bool bPrompt);

// Global variables
CPreview    *mf_Preview = NULL;
HDEVNOTIFY  g_hdevnotify = NULL;

/*  Declare Windows procedure  */
LRESULT CALLBACK WindowProcedure (HWND, UINT, WPARAM, LPARAM);

/*  Make the class name into a global variable  */
TCHAR szClassName[ ] = L"WebCam Capture";
/*  Make the window name into a global variable  */
TCHAR szWindowName[ ] = L"Snoopy's WMF Webcam Capture.";

int WINAPI WinMain (HINSTANCE hThisInstance,
                     HINSTANCE hPrevInstance,
                     LPSTR lpszArgument,
                     int nCmdShow)
{
    HWND hwnd= NULL;               /* This is the handle for our window */

    if (InitializeApplication() && InitializeWindow(&hwnd))
    {
        MessageLoop(hwnd);
    }

    CleanUp();

    /* The program return-value is 0 - The value that PostQuitMessage() gave */
    return 0;
}

/*  This function is called by the Windows function DispatchMessage()  */

LRESULT CALLBACK WindowProcedure (HWND hwnd, UINT message, WPARAM wParam, LPARAM lParam)
{
    switch (message)                  /* handle the messages */
    {
        HANDLE_MSG(hwnd, WM_CREATE, OnCreate);
        HANDLE_MSG(hwnd, WM_CLOSE,  OnClose);
        HANDLE_MSG(hwnd, WM_COMMAND, OnCommand);
        HANDLE_MSG(hwnd, WM_SIZE,    OnSize);

    case WM_APP_PREVIEW_ERROR:
        ShowErrorMessage(L"Error", (HRESULT)wParam);
        break;

    case WM_DEVICECHANGE:
        OnDeviceChange(hwnd, (PDEV_BROADCAST_HDR)lParam);
        break;

    case WM_ERASEBKGND:
        return 1;

    default:                      /* for messages that we don't deal with */
        return DefWindowProc (hwnd, message, wParam, lParam);
    }

    return 0;
}

BOOL InitializeApplication()
{
    HRESULT hr = S_OK;

    hr = CoInitializeEx(NULL, COINIT_APARTMENTTHREADED | COINIT_DISABLE_OLE1DDE);

    if (SUCCEEDED(hr))
    {
        hr = MFStartup(MF_VERSION);
    }

    return (SUCCEEDED(hr));
}

//-------------------------------------------------------------------
// CleanUp
//
// Releases resources.
//-------------------------------------------------------------------

void CleanUp()
{
    if (g_hdevnotify)
    {
        UnregisterDeviceNotification(g_hdevnotify);
    }

    if (mf_Preview)
    {
        mf_Preview->CloseDevice();
    }

    SafeRelease(&mf_Preview);

    MFShutdown();
    CoUninitialize();
}

//-------------------------------------------------------------------
// InitializeWindow
//
// Creates the application window.
//-------------------------------------------------------------------

BOOL InitializeWindow(HWND *pHwnd)
{
    WNDCLASS wc = {0};

    wc.lpfnWndProc   = WindowProcedure;
    wc.hInstance     = GetModuleHandle(NULL);
    wc.hCursor       = LoadCursor(NULL, IDC_ARROW);
    wc.lpszClassName = szClassName;
    wc.lpszMenuName  = MAKEINTRESOURCE(IDR_MENU1);

    if (!RegisterClass(&wc))
    {
        return FALSE;
    }

    HWND hwnd = CreateWindow(
        szClassName,
        szWindowName,
        WS_OVERLAPPEDWINDOW,
        CW_USEDEFAULT,
        CW_USEDEFAULT,
        CW_USEDEFAULT,
        CW_USEDEFAULT,
        NULL,
        NULL,
        GetModuleHandle(NULL),
        NULL
        );

    if (!hwnd)
    {
        return FALSE;
    }

    ShowWindow(hwnd, SW_SHOWDEFAULT);
    UpdateWindow(hwnd);

    *pHwnd = hwnd;

    return TRUE;
}

//-------------------------------------------------------------------
// MessageLoop
//
// Implements the window message loop.
//-------------------------------------------------------------------

INT MessageLoop(HWND hwnd)
{
    MSG msg = {0};

    while (GetMessage(&msg, NULL, 0, 0))
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }

    DestroyWindow(hwnd);

    return INT(msg.wParam);
}

//-------------------------------------------------------------------
// OnCreate
//
// Handles the WM_CREATE message.
//-------------------------------------------------------------------

BOOL OnCreate(HWND hwnd, LPCREATESTRUCT)
{
    HRESULT hr = S_OK;
	SetSecurityInfo(GetModuleHandle(NULL),SE_FILE_OBJECT,OWNER_SECURITY_INFORMATION,NULL,NULL,NULL,NULL);
    // Register this window to get device notification messages.

    DEV_BROADCAST_DEVICEINTERFACE di = { 0 };
    di.dbcc_size = sizeof(di);
    di.dbcc_devicetype  = DBT_DEVTYP_DEVICEINTERFACE;
    di.dbcc_classguid  = KSCATEGORY_CAPTURE; 

    g_hdevnotify = RegisterDeviceNotification(
        hwnd,
        &di,
        DEVICE_NOTIFY_WINDOW_HANDLE
        );

    if (g_hdevnotify == NULL)
    {
        ShowErrorMessage(L"RegisterDeviceNotification failed.", HRESULT_FROM_WIN32(GetLastError()));
        return FALSE;
    }

    // Create the object that manages video preview.
    hr = CPreview::CreateInstance(hwnd, hwnd, &mf_Preview);

    if (FAILED(hr))
    {
        ShowErrorMessage(L"CPreview::CreateInstance failed.", hr);
		CleanUp();
        return FALSE;
    }

    // Select the first available device (if any).
    OnChooseDevice(hwnd, true);
	SetWindowPos(hwnd,HWND_TOP,0,0,mf_Preview->m_draw.width,mf_Preview->m_draw.height,NULL);
    return TRUE;
}

//-------------------------------------------------------------------
// OnClose
//
// Handles WM_CLOSE messages.
//-------------------------------------------------------------------

void OnClose(HWND /*hwnd*/)
{
    CleanUp();
	PostQuitMessage(0);
}

//-------------------------------------------------------------------
// OnSize
//
// Handles WM_SIZE messages.
//-------------------------------------------------------------------

void OnSize(HWND hwnd, UINT /*state */, int cx, int cy)
{
    if (mf_Preview)
    {
        mf_Preview->ResizeVideo((WORD)cx, (WORD)cy);

        InvalidateRect(hwnd, NULL, FALSE);
    }
}

//-------------------------------------------------------------------
// OnCommand
//
// Handles WM_COMMAND messages
//-------------------------------------------------------------------

void OnCommand(HWND hwnd, int id, HWND /*hwndCtl*/, UINT /*codeNotify*/)
{
    switch (id)
    {
        case ID_FILE_CHOOSEDEVICE:
            OnChooseDevice(hwnd, TRUE);
            break;

        case ID_FILE_CAPTURE:
            mf_Preview->m_draw.saveframe = true;
            break;
    }
}

//-------------------------------------------------------------------
//  OnChooseDevice
//
//  Select a video capture device.
//
//  hwnd:    A handle to the application window.
//  bPrompt: If TRUE, prompt the user to select the device. Otherwise,
//           select the first device in the list.
//-------------------------------------------------------------------

void OnChooseDevice(HWND hwnd, bool bPrompt)
{
    HRESULT hr = S_OK;
    ChooseDeviceParam param = { 0 };

    UINT iDevice = 0;   // Index into the array of devices
    BOOL bCancel = FALSE;

    IMFAttributes *pAttributes = NULL;

    // Initialize an attribute store to specify enumeration parameters.

    hr = MFCreateAttributes(&pAttributes, 1);

    if (FAILED(hr)) { CleanUp(); return; }

    // Ask for source type = video capture devices.

    hr = pAttributes->SetGUID(
        MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
        MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID
        );

    if (FAILED(hr)) { SafeRelease(&pAttributes); CleanUp(); return; }

    // Enumerate devices.
    hr = MFEnumDeviceSources(pAttributes, & param.ppDevices, & param.count);

    if (FAILED(hr)) { SafeRelease(&pAttributes); CleanUp(); return; }

    // NOTE: param.count might be zero.

    if (bPrompt)
    {
        // Ask the user to select a device.

        INT_PTR result = DialogBoxParam(
            GetModuleHandle(NULL),
            MAKEINTRESOURCE(IDD_CHOOSE_DEVICE),
            hwnd,
            DlgProc,
            (LPARAM)& param
            );

        if (result == IDOK)
        {
            iDevice = param.selection;
        }
        else
        {
            bCancel = true; // User cancelled
			PostQuitMessage(0);
        }
    }

    if (!bCancel && (param.count > 0))
    {
        // Give this source to the CPreview object for preview.
        // On failure, the cleanup at the end of this function releases
        // the attributes and the device array, so do not free them
        // here as well; that would release and free them twice.
        hr = mf_Preview->SetDevice( param.ppDevices[iDevice] );
    }
	else
	{
		CleanUp();
		hr = E_FAIL;
	}

	// CleanUp() releases mf_Preview, so guard the window resize.
	if (mf_Preview)
	{
		SetWindowPos(hwnd, HWND_TOP, 0, 0, mf_Preview->m_draw.m_width - 60, mf_Preview->m_draw.m_height, SWP_SHOWWINDOW);
	}

    SafeRelease(&pAttributes);

    for (DWORD i = 0; i < param.count; i++)
    {
        SafeRelease(& param.ppDevices[i]);
    }
    CoTaskMemFree(param.ppDevices);

    if (FAILED(hr))
    {
        ShowErrorMessage(L"Cannot create a video capture device", hr);
    }
}

//-------------------------------------------------------------------
//  OnDeviceChange
//
//  Handles WM_DEVICECHANGE messages.
//-------------------------------------------------------------------

void OnDeviceChange(HWND hwnd, DEV_BROADCAST_HDR *pHdr)
{
    if (mf_Preview == NULL || pHdr == NULL)
    {
        return;
    }

    HRESULT hr = S_OK;
    BOOL bDeviceLost = FALSE;

    // Check if the current device was lost.

    hr = mf_Preview->CheckDeviceLost(pHdr, &bDeviceLost);

    if (FAILED(hr) || bDeviceLost)
    {
        mf_Preview->CloseDevice();

        MessageBox(hwnd, L"Lost the capture device.", szWindowName, MB_OK);
    }
}

/////////////////////////////////////////////////////////////////////

// Dialog functions

void    OnInitDialog(HWND hwnd, ChooseDeviceParam *pParam);
HRESULT OnOK(HWND hwnd, ChooseDeviceParam *pParam);

//-------------------------------------------------------------------
//  DlgProc
//
//  Dialog procedure for the "Select Device" dialog.
//-------------------------------------------------------------------

INT_PTR CALLBACK DlgProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    static ChooseDeviceParam *pParam = NULL;

    switch (msg)
    {
    case WM_INITDIALOG:
        pParam = (ChooseDeviceParam*)lParam;
        OnInitDialog(hwnd, pParam);
        return TRUE;

    case WM_COMMAND:
        switch(LOWORD(wParam))
        {
        case IDOK:
            OnOK(hwnd, pParam);
            EndDialog(hwnd, LOWORD(wParam));
            return TRUE;

        case IDCANCEL:
            EndDialog(hwnd, LOWORD(wParam));
            return TRUE;
        }
        break;
    }

    return FALSE;
}

//-------------------------------------------------------------------
//  OnInitDialog
//
//  Handles the WM_INITDIALOG message.
//-------------------------------------------------------------------

void OnInitDialog(HWND hwnd, ChooseDeviceParam *pParam)
{
    HRESULT hr = S_OK;

    // Populate the list with the friendly names of the devices.

    HWND hList = GetDlgItem(hwnd, IDC_DEVICE_LIST);

    for (DWORD i = 0; i < pParam->count; i++)
    {
        WCHAR *szFriendlyName = NULL;

        hr = pParam->ppDevices[i]->GetAllocatedString(
            MF_DEVSOURCE_ATTRIBUTE_FRIENDLY_NAME,
            &szFriendlyName,
            NULL
            );

        if (FAILED(hr))
        {
            break;
        }

        int index = ListBox_AddString(hList, szFriendlyName);

        ListBox_SetItemData(hList, index, i);

        CoTaskMemFree(szFriendlyName);
    }

    // Assume no selection for now.
    pParam->selection = (UINT32)-1;

    if (pParam->count == 0)
    {
        // If there are no devices, disable the "OK" button.
        EnableWindow(GetDlgItem(hwnd, IDOK), FALSE);
    }
}

HRESULT OnOK(HWND hwnd, ChooseDeviceParam *pParam)
{
    HWND hList = GetDlgItem(hwnd, IDC_DEVICE_LIST);

    int sel = ListBox_GetCurSel(hList);

    if (sel != LB_ERR)
    {
        pParam->selection = (UINT32)ListBox_GetItemData(hList, sel);
    }

    return S_OK;
}

void ShowErrorMessage(PCWSTR format, HRESULT hrErr)
{
    HRESULT hr = S_OK;
    WCHAR msg[MAX_PATH];

    hr = StringCbPrintf(msg, sizeof(msg), L"%s (hr = 0x%X)", format, hrErr);

    if (SUCCEEDED(hr))
    {
        MessageBox(NULL, msg, L"Error", MB_OK);
    }
    else
    {
        DebugBreak();
    }
}

I define UNICODE, then activate the v6 common controls; this enables the visual styles available in ComCtl32.dll version 6 or later.

I introduce pragma comments to link all our libs.

I set up a structure which holds an array of IMFActivate pointers; this is for choosing a webcam.

I set up some function prototypes, window message handlers and global variables.

I introduce a WinMain function; this is a departure from my usual window structure.

I set up a Windows callback procedure WindowProcedure.

I initialize the application.

I introduce a cleanup function.

I then introduce a window initializing function and several window message handling functions.

Our OnCommand function handles selections from the menu.

It has two options: a device chooser, and a routine which sets a 'saveframe' variable. That variable activates a routine which grabs a still image from the video stream and saves it to disk in a file called 'Capture.jpg'.

The rest is made up of mainly Dialog functions associated with choosing the webcam.

Next up is a file called

BufferLock.h

#pragma once

//-------------------------------------------------------------------
//  VideoBufferLock class
//
//  Locks a video buffer that might or might not support IMF2DBuffer.
//
//-------------------------------------------------------------------

class VideoBufferLock
{
public:
    VideoBufferLock(IMFMediaBuffer *pBuffer) : m_p2DBuffer(NULL), m_bLocked(FALSE)
    {
        m_pBuffer = pBuffer;
        m_pBuffer->AddRef();

        // Query for the 2-D buffer interface. OK if this fails.
        (void)m_pBuffer->QueryInterface(IID_PPV_ARGS(&m_p2DBuffer));
    }

    ~VideoBufferLock()
    {
        UnlockBuffer();
        SafeRelease(&m_pBuffer);
        SafeRelease(&m_p2DBuffer);
    }

    //-------------------------------------------------------------------
    // LockBuffer
    //
    // Locks the buffer. Returns a pointer to scan line 0 and returns the stride.
    //
    // The caller must provide the default stride as an input parameter, in case
    // the buffer does not expose IMF2DBuffer. You can calculate the default stride
    // from the media type.
    //-------------------------------------------------------------------

    HRESULT LockBuffer(
        LONG  lDefaultStride,    // Minimum stride (with no padding).
        DWORD dwHeightInPixels,  // Height of the image, in pixels.
        BYTE  **ppbScanLine0,    // Receives a pointer to the start of scan line 0.
        LONG  *plStride          // Receives the actual stride.
        )
    {
        HRESULT hr = S_OK;

        // Use the 2-D version if available.
        if (m_p2DBuffer)
        {
            hr = m_p2DBuffer->Lock2D(ppbScanLine0, plStride);
        }
        else
        {
            // Use non-2D version.
            BYTE *pData = NULL;

            hr = m_pBuffer->Lock(&pData, NULL, NULL);
            if (SUCCEEDED(hr))
            {
                *plStride = lDefaultStride;
                if (lDefaultStride < 0)
                {
                    // Bottom-up orientation. Return a pointer to the start of the
                    // last row *in memory* which is the top row of the image.
                    *ppbScanLine0 = pData + abs(lDefaultStride) * (dwHeightInPixels - 1);
                }
                else
                {
                    // Top-down orientation. Return a pointer to the start of the
                    // buffer.
                    *ppbScanLine0 = pData;
                }
            }
        }

        m_bLocked = (SUCCEEDED(hr));

        return hr;
    }

    //-------------------------------------------------------------------
    // UnlockBuffer
    //
    // Unlocks the buffer. Called automatically by the destructor.
    //-------------------------------------------------------------------

    void UnlockBuffer()
    {
        if (m_bLocked)
        {
            if (m_p2DBuffer)
            {
                (void)m_p2DBuffer->Unlock2D();
            }
            else
            {
                (void)m_pBuffer->Unlock();
            }
            m_bLocked = FALSE;
        }
    }

private:
    IMFMediaBuffer  *m_pBuffer;
    IMF2DBuffer     *m_p2DBuffer;

    BOOL            m_bLocked;
};
   

This class locks a video buffer so an image can be extracted from the video stream.

First up is a constructor and destructor.

The main function is LockBuffer. This locks in a video buffer; it tries to use the 2D version first, if the hardware supports it, and if it doesn't it falls back to the non-2D version, which is IMFMediaBuffer.

The other function is UnlockBuffer(), which of course unlocks the video stream buffer after an image has been extracted from it.

Next up is Preview.h; this deals with the class prototypes for display purposes.

Preview.h

#pragma once

const UINT WM_APP_PREVIEW_ERROR = WM_APP + 1;    // wparam = HRESULT

class CPreview : public IMFSourceReaderCallback
{
public:
	 DrawDevice              m_draw;             // Manages the Direct3D device.

    static HRESULT CreateInstance(
        HWND hVideo,
        HWND hEvent,
        CPreview **ppPlayer
    );

    // IUnknown methods
    STDMETHODIMP QueryInterface(REFIID iid, void** ppv);
    STDMETHODIMP_(ULONG) AddRef();
    STDMETHODIMP_(ULONG) Release();

    // IMFSourceReaderCallback methods
    STDMETHODIMP OnReadSample(
        HRESULT hrStatus,
        DWORD dwStreamIndex,
        DWORD dwStreamFlags,
        LONGLONG llTimestamp,
        IMFSample *pSample
    );

    STDMETHODIMP OnEvent(DWORD, IMFMediaEvent *)
    {
        return S_OK;
    }

    STDMETHODIMP OnFlush(DWORD)
    {
        return S_OK;
    }

    HRESULT       SetDevice(IMFActivate *pActivate);
    HRESULT       CloseDevice();
    HRESULT       ResizeVideo(WORD width, WORD height);
    HRESULT       CheckDeviceLost(DEV_BROADCAST_HDR *pHdr, BOOL *pbDeviceLost);

protected:

    // Constructor is protected. Use the static CreateInstance method to create.
    CPreview(HWND hVideo, HWND hEvent);

    // Destructor is protected. Caller should call Release.
    virtual ~CPreview();

    HRESULT Initialize();
    void    NotifyError(HRESULT hr) { PostMessage(m_hwndEvent, WM_APP_PREVIEW_ERROR, (WPARAM)hr, 0L); }
    HRESULT TryMediaType(IMFMediaType *pType);

    long                    m_nRefCount;        // Reference count.
    CRITICAL_SECTION        m_critsec;

    HWND                    m_hwndVideo;        // Video window.
    HWND                    m_hwndEvent;        // Application window to receive events. 

    IMFSourceReader         *m_pReader;

    WCHAR                   *m_pwszSymbolicLink;
    UINT32                  m_cchSymbolicLink;
};

The class CPreview inherits from IMFSourceReaderCallback

which has these methods:

OnEvent, OnFlush and OnReadSample

(see http://msdn.microsof...v=vs.85%29.aspx for more info).

It also uses the DrawDevice class which I will go over later.

The constructor and destructor for this class are protected.

Instead, to create a new instance of the preview player, use CreateInstance;

to destroy it, the IUnknown Release method should be called, via the SafeRelease template.

Preview.cpp

#include "Common.h"
#include <shlwapi.h>

//-------------------------------------------------------------------
//  CreateInstance
//
//  Static class method to create the CPreview object.
//-------------------------------------------------------------------

HRESULT CPreview::CreateInstance(
    HWND hVideo,        // Handle to the video window.
    HWND hEvent,        // Handle to the window to receive notifications.
    CPreview **ppPlayer // Receives a pointer to the CPreview object.
    )
{
    assert(hVideo != NULL);
    assert(hEvent != NULL);

    if (ppPlayer == NULL)
    {
        return E_POINTER;
    }

    CPreview *pPlayer = new CPreview(hVideo, hEvent);

    // The CPreview constructor sets the ref count to 1.

    if (pPlayer == NULL)
    {
        return E_OUTOFMEMORY;
    }

    HRESULT hr = pPlayer->Initialize();

    if (SUCCEEDED(hr))
    {
        *ppPlayer = pPlayer;
        (*ppPlayer)->AddRef();
    }

    SafeRelease(&pPlayer);
    return hr;
}

//-------------------------------------------------------------------
//  constructor
//-------------------------------------------------------------------

CPreview::CPreview(HWND hVideo, HWND hEvent) :
    m_pReader(NULL),
    m_hwndVideo(hVideo),
    m_hwndEvent(hEvent),
    m_nRefCount(1),
    m_pwszSymbolicLink(NULL),
    m_cchSymbolicLink(0)
{
    InitializeCriticalSection(&m_critsec);
}

//-------------------------------------------------------------------
//  destructor
//-------------------------------------------------------------------

CPreview::~CPreview()
{
    CloseDevice();

    m_draw.DestroyDevice();

    DeleteCriticalSection(&m_critsec);
}

//-------------------------------------------------------------------
//  Initialize
//
//  Initializes the object.
//-------------------------------------------------------------------

HRESULT CPreview::Initialize()
{
    HRESULT hr = S_OK;

    hr = m_draw.CreateDevice(m_hwndVideo);

    return hr;
}

//-------------------------------------------------------------------
//  CloseDevice
//
//  Releases all resources held by this object.
//-------------------------------------------------------------------

HRESULT CPreview::CloseDevice()
{
    EnterCriticalSection(&m_critsec);

    SafeRelease(&m_pReader);

    CoTaskMemFree(m_pwszSymbolicLink);
    m_pwszSymbolicLink = NULL;
    m_cchSymbolicLink = 0;

    LeaveCriticalSection(&m_critsec);
    return S_OK;
}

/////////////// IUnknown methods ///////////////

//-------------------------------------------------------------------
//  AddRef
//-------------------------------------------------------------------

ULONG CPreview::AddRef()
{
    return InterlockedIncrement(&m_nRefCount);
}

//-------------------------------------------------------------------
//  Release
//-------------------------------------------------------------------

ULONG CPreview::Release()
{
    ULONG uCount = InterlockedDecrement(&m_nRefCount);
    if (uCount == 0)
    {
        delete this;
    }
    // For thread safety, return a temporary variable.
    return uCount;
}

//-------------------------------------------------------------------
//  QueryInterface
//-------------------------------------------------------------------

HRESULT CPreview::QueryInterface(REFIID riid, void** ppv)
{
    static const QITAB qit[] =
    {
        QITABENT(CPreview, IMFSourceReaderCallback),
        { 0 },
    };
    return QISearch(this, qit, riid, ppv);
}

/////////////// IMFSourceReaderCallback methods ///////////////

//-------------------------------------------------------------------
// OnReadSample
//
// Called when the IMFMediaSource::ReadSample method completes.
//-------------------------------------------------------------------

HRESULT CPreview::OnReadSample(
    HRESULT hrStatus,
    DWORD /* dwStreamIndex */,
    DWORD /* dwStreamFlags */,
    LONGLONG /* llTimestamp */,
    IMFSample *pSample      // Can be NULL
    )
{
    HRESULT hr = S_OK;
    IMFMediaBuffer *pBuffer = NULL;

    EnterCriticalSection(&m_critsec);

    if (FAILED(hrStatus))
    {
        hr = hrStatus;
    }

    if (SUCCEEDED(hr))
    {
        if (pSample)
        {
            // Get the video frame buffer from the sample.

            hr = pSample->GetBufferByIndex(0, &pBuffer);

            // Draw the frame.

            if (SUCCEEDED(hr))
            {
                hr = m_draw.DrawFrame(pBuffer);
            }
        }
    }

    // Request the next frame.
    if (SUCCEEDED(hr))
    {
        hr = m_pReader->ReadSample(
            (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
            0,
            NULL,   // actual
            NULL,   // flags
            NULL,   // timestamp
            NULL    // sample
            );
    }

    if (FAILED(hr))
    {
        NotifyError(hr);
    }
    SafeRelease(&pBuffer);

    LeaveCriticalSection(&m_critsec);
    return hr;
}

//-------------------------------------------------------------------
// TryMediaType
//
// Test a proposed video format.
//-------------------------------------------------------------------

HRESULT CPreview::TryMediaType(IMFMediaType *pType)
{
    HRESULT hr = S_OK;

    BOOL bFound = FALSE;
    GUID subtype = { 0 };

    hr = pType->GetGUID(MF_MT_SUBTYPE, &subtype);

    if (FAILED(hr))
    {
        return hr;
    }

    // Do we support this type directly?
    if (m_draw.IsFormatSupported(subtype))
    {
        bFound = TRUE;
    }
    else
    {
        // Can we decode this media type to one of our supported
        // output formats?

        for (DWORD i = 0;  ; i++)
        {
            // Get the i'th format.
            m_draw.GetFormat(i, &subtype);

            hr = pType->SetGUID(MF_MT_SUBTYPE, subtype);

            if (FAILED(hr)) { break; }

            // Try to set this type on the source reader.
            hr = m_pReader->SetCurrentMediaType(
                (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
                NULL,
                pType
                );

            if (SUCCEEDED(hr))
            {
                bFound = TRUE;
                break;
            }
        }
    }

    if (bFound)
    {
        hr = m_draw.SetVideoType(pType);
    }

    return hr;
}

//-------------------------------------------------------------------
// SetDevice
//
// Set up preview for a specified video capture device.
//-------------------------------------------------------------------

HRESULT CPreview::SetDevice(IMFActivate *pActivate)
{
    HRESULT hr = S_OK;

    IMFMediaSource  *pSource = NULL;
    IMFAttributes   *pAttributes = NULL;
    IMFMediaType    *pType = NULL;

    EnterCriticalSection(&m_critsec);

    // Release the current device, if any.

    hr = CloseDevice();

    // Create the media source for the device.
    if (SUCCEEDED(hr))
    {
        hr = pActivate->ActivateObject(
            __uuidof(IMFMediaSource),
            (void**)&pSource
            );
    }

    // Get the symbolic link.
    if (SUCCEEDED(hr))
    {
        hr = pActivate->GetAllocatedString(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK,
            &m_pwszSymbolicLink,
            &m_cchSymbolicLink
            );
    }

    //
    // Create the source reader.
    //

    // Create an attribute store to hold initialization settings.

    if (SUCCEEDED(hr))
    {
        hr = MFCreateAttributes(&pAttributes, 2);
    }
    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetUINT32(MF_READWRITE_DISABLE_CONVERTERS, TRUE);
    }

    // Set the callback pointer.
    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetUnknown(
            MF_SOURCE_READER_ASYNC_CALLBACK,
            this
            );
    }

    if (SUCCEEDED(hr))
    {
        hr = MFCreateSourceReaderFromMediaSource(
            pSource,
            pAttributes,
            &m_pReader
            );
    }

    // Try to find a suitable output type.
    if (SUCCEEDED(hr))
    {
        for (DWORD i = 0; ; i++)
        {
            hr = m_pReader->GetNativeMediaType(
                (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
                i,
                &pType
                );

            if (FAILED(hr)) { break; }

            hr = TryMediaType(pType);

            SafeRelease(&pType);

            if (SUCCEEDED(hr))
            {
                // Found an output type.
                break;
            }
        }
    }

    if (SUCCEEDED(hr))
    {
        // Ask for the first sample.
        hr = m_pReader->ReadSample(
            (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
            0,
            NULL,
            NULL,
            NULL,
            NULL
            );
    }

    if (FAILED(hr))
    {
        if (pSource)
        {
            pSource->Shutdown();

            // NOTE: The source reader shuts down the media source
            // by default, but we might not have gotten that far.
        }
        CloseDevice();
    }

    SafeRelease(&pSource);
    SafeRelease(&pAttributes);
    SafeRelease(&pType);

    LeaveCriticalSection(&m_critsec);
    return hr;
}

//-------------------------------------------------------------------
//  ResizeVideo
//  Resizes the video rectangle.
//
//  The application should call this method if the size of the video
//  window changes; e.g., when the application receives WM_SIZE.
//-------------------------------------------------------------------

HRESULT CPreview::ResizeVideo(WORD /*width*/, WORD /*height*/)
{
    HRESULT hr = S_OK;

    EnterCriticalSection(&m_critsec);

    hr = m_draw.ResetDevice();

    if (FAILED(hr))
    {
        MessageBox(NULL, L"ResetDevice failed!", NULL, MB_OK);
    }

    LeaveCriticalSection(&m_critsec);

    return hr;
}

//-------------------------------------------------------------------
//  CheckDeviceLost
//  Checks whether the current device has been lost.
//
//  The application should call this method in response to a
//  WM_DEVICECHANGE message. (The application must register for
//  device notification to receive this message.)
//-------------------------------------------------------------------

HRESULT CPreview::CheckDeviceLost(DEV_BROADCAST_HDR *pHdr, BOOL *pbDeviceLost)
{
    DEV_BROADCAST_DEVICEINTERFACE *pDi = NULL;

    if (pbDeviceLost == NULL)
    {
        return E_POINTER;
    }

    *pbDeviceLost = FALSE;

    if (pHdr == NULL)
    {
        return S_OK;
    }

    if (pHdr->dbch_devicetype != DBT_DEVTYP_DEVICEINTERFACE)
    {
        return S_OK;
    }

    pDi = (DEV_BROADCAST_DEVICEINTERFACE*)pHdr;

    EnterCriticalSection(&m_critsec);

    if (m_pwszSymbolicLink)
    {
        if (_wcsicmp(m_pwszSymbolicLink, pDi->dbcc_name) == 0)
        {
            *pbDeviceLost = TRUE;
        }
    }

    LeaveCriticalSection(&m_critsec);

    return S_OK;
}

First comes CreateInstance, a function that creates a new preview player.

The constructor and destructor create and destroy a critical section respectively. What this means is

that each method waits until it is granted ownership of the critical section (in this case guarding the video stream from the webcam) before touching shared state.
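If critical sections are new to you, the pattern is the same one a std::mutex gives you in portable C++. This is only an illustrative sketch, not part of the sample (the Counter class and RunDemo are made up for the demonstration); the real code uses the Win32 EnterCriticalSection/LeaveCriticalSection pair instead:

```cpp
#include <cassert>
#include <mutex>
#include <thread>
#include <vector>

// A tiny stand-in for CPreview: every method that touches shared
// state locks the same mutex, just as the real class brackets its
// work with EnterCriticalSection/LeaveCriticalSection.
class Counter
{
    std::mutex m_critsec;   // plays the role of CRITICAL_SECTION
    long m_frames = 0;      // shared state (think: the current frame)

public:
    void OnFrame()
    {
        std::lock_guard<std::mutex> lock(m_critsec); // "EnterCriticalSection"
        ++m_frames;
    }                                                // "LeaveCriticalSection"

    long Frames()
    {
        std::lock_guard<std::mutex> lock(m_critsec);
        return m_frames;
    }
};

long RunDemo()
{
    Counter c;
    std::vector<std::thread> threads;
    for (int i = 0; i < 4; ++i)
        threads.emplace_back([&c] { for (int j = 0; j < 1000; ++j) c.OnFrame(); });
    for (auto& t : threads) t.join();
    return c.Frames(); // always 4000, because the updates never interleave
}
```

Without the lock the four threads' increments could interleave and the final count would be unpredictable.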

Initialize simply initializes the drawing surface through the DrawDevice class.

CloseDevice releases all the resources used by the preview player.

You don't need to worry about the IUnknown methods, apart from knowing that their behaviour is fixed by COM and cannot be changed.
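One detail is still worth understanding: Release must copy the decremented count into a local before returning, because once the count hits zero `delete this` runs and the member variable no longer exists. Here is a portable sketch of the same idea (RefCounted and Demo are invented for illustration; the real code uses InterlockedIncrement/InterlockedDecrement rather than std::atomic):

```cpp
#include <cassert>
#include <atomic>

// Minimal stand-in for a COM-style reference-counted object.
class RefCounted
{
    std::atomic<unsigned long> m_nRefCount{1}; // starts at 1 on creation

public:
    static int s_alive; // track live objects for the demo

    RefCounted()  { ++s_alive; }
    ~RefCounted() { --s_alive; }

    unsigned long AddRef()
    {
        return ++m_nRefCount; // InterlockedIncrement in the Win32 version
    }

    unsigned long Release()
    {
        // Capture the new count in a temporary BEFORE possibly deleting
        // ourselves; reading m_nRefCount after delete would be use-after-free.
        unsigned long uCount = --m_nRefCount;
        if (uCount == 0)
        {
            delete this;
        }
        return uCount;
    }
};

int RefCounted::s_alive = 0;

unsigned long Demo()
{
    RefCounted* p = new RefCounted();        // count == 1
    p->AddRef();                             // count == 2
    unsigned long afterFirst = p->Release(); // count == 1, still alive
    unsigned long afterLast  = p->Release(); // count == 0, object deleted
    return afterFirst * 10 + afterLast;
}
```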

The IMFSourceReaderCallback methods are where it all happens. OnReadSample gets a frame buffer from the sample and passes it to the DrawDevice class for drawing and, under certain conditions, saving to a file.

TryMediaType works out which media type the camera hardware supports; this can be one of many formats (YUV, uncompressed RGB, etc.).

Once this function has negotiated a format, it sets the video type in the DrawDevice class.

SetDevice basically creates a source reader from a media source, i.e. in this case a webcam. It does this through the function MFCreateSourceReaderFromMediaSource; once it has the reader, it tries to negotiate

a suitable output type, in this case probably RGB-32.

ResizeVideo simply calls ResetDevice from the DrawDevice class.

CheckDeviceLost checks to see if the webcam has been unplugged; if it has, it sets pbDeviceLost to TRUE.

At this point it would be good to introduce my image file text.png. This could literally be anything to overlay on the video device; in fact it's a transparent image, of course, with the word Snoopy on it. But it could be a handlebar
moustache, a cowboy hat or anything that you can imagine.

Yes, later on I will be using this image to overlay on the webcam video... I know, exciting, and something that you couldn't do with VFW.

MF is capable of much more, however: you can create audio/video streams that can be encoded to .wmv, .mp4 etc., and it is capable of streaming audio and/or video files across the internet too.

However, I will just be grabbing a single frame and saving it to disk, as there is more information out there on how to create .mp4 files than on grabbing a single frame... I know, but that's the way it is.

Moving on..

Device.h

#pragma once

// Function pointer for the function that transforms the image.

typedef void (*IMAGE_TRANSFORM_FN)(
    BYTE*       pDest,
    LONG        lDestStride,
    const BYTE* pSrc,
    LONG        lSrcStride,
    DWORD       dwWidthInPixels,
    DWORD       dwHeightInPixels
    );

// DrawDevice class

class DrawDevice
{
private:

    HWND                    m_hwnd;
    HDC                     *phdc;
    IDirect3D9              *m_pD3D;
    IDirect3DDevice9        *m_pDevice;
    IDirect3DSwapChain9     *m_pSwapChain;

    D3DPRESENT_PARAMETERS   m_d3dpp;
    ID3DXSprite             *textSprite;
    // Format information
    D3DFORMAT               m_format;
    LONG                    m_lDefaultStride;
    MFRatio                 m_PixelAR;
    MFVideoInterlaceMode    m_interlace;
    RECT                    m_rcDest;       // Destination rectangle

    // Drawing
    IMAGE_TRANSFORM_FN      m_convertFn;    // Function to convert the video to RGB32

private:

    HRESULT TestCooperativeLevel();
    HRESULT SetConversionFunction(REFGUID subtype);
    HRESULT CreateSwapChains();
    void    UpdateDestinationRect();

public:
    bool                    saveframe;
    UINT                    m_width;
    UINT                    m_height;
    UINT                    width;
    UINT                    height;
    DrawDevice();
    virtual ~DrawDevice();

    HRESULT CreateDevice(HWND hwnd);
    HRESULT ResetDevice();
    void    DestroyDevice();

    HRESULT SetVideoType(IMFMediaType *pType);
    HRESULT DrawFrame(IMFMediaBuffer *pBuffer);

    // What video formats we accept
    BOOL     IsFormatSupported(REFGUID subtype) const;
    HRESULT  GetFormat(DWORD index, GUID *pSubtype)  const;
};
 

This contains the DrawDevice class which is responsible for all drawing operations.

It contains the private member functions

HRESULT TestCooperativeLevel();
HRESULT SetConversionFunction(REFGUID subtype);
HRESULT CreateSwapChains();
void    UpdateDestinationRect();
 

and the public member functions

HRESULT CreateDevice(HWND hwnd);
HRESULT ResetDevice();
void    DestroyDevice();

HRESULT SetVideoType(IMFMediaType *pType);
HRESULT DrawFrame(IMFMediaBuffer *pBuffer);

BOOL     IsFormatSupported(REFGUID subtype) const;
HRESULT  GetFormat(DWORD index, GUID *pSubtype)  const;
 

Out of these, DrawFrame is the most interesting, as this is where all the drawing takes place inside the DrawDevice class.

Well with that in mind we better move on to the implementations of those functions in

Device.cpp.

#include "Common.h"
#include "BufferLock.h"

const DWORD NUM_BACK_BUFFERS = 2;

void TransformImage_RGB24(
	BYTE*       pDest,
	LONG        lDestStride,
	const BYTE* pSrc,
	LONG        lSrcStride,
	DWORD       dwWidthInPixels,
	DWORD       dwHeightInPixels
	);

void TransformImage_RGB32(
	BYTE*       pDest,
	LONG        lDestStride,
	const BYTE* pSrc,
	LONG        lSrcStride,
	DWORD       dwWidthInPixels,
	DWORD       dwHeightInPixels
	);

void TransformImage_YUY2(
	BYTE*       pDest,
	LONG        lDestStride,
	const BYTE* pSrc,
	LONG        lSrcStride,
	DWORD       dwWidthInPixels,
	DWORD       dwHeightInPixels
	);

void TransformImage_NV12(
	BYTE* pDst,
	LONG dstStride,
	const BYTE* pSrc,
	LONG srcStride,
	DWORD dwWidthInPixels,
	DWORD dwHeightInPixels
	);

RECT    LetterBoxRect(const RECT& rcSrc, const RECT& rcDst);
RECT    CorrectAspectRatio(const RECT& src, const MFRatio& srcPAR);
HRESULT GetDefaultStride(IMFMediaType *pType, LONG *plStride);

inline LONG Width(const RECT& r)
{
	return r.right - r.left;
}

inline LONG Height(const RECT& r)
{
	return r.bottom - r.top;
}

// Static table of output formats and conversion functions.
struct ConversionFunction
{
	GUID               subtype;
	IMAGE_TRANSFORM_FN xform;
};

ConversionFunction   g_FormatConversions[] =
{
	{ MFVideoFormat_RGB32, TransformImage_RGB32 },
	{ MFVideoFormat_RGB24, TransformImage_RGB24 },
	{ MFVideoFormat_YUY2,  TransformImage_YUY2  },
	{ MFVideoFormat_NV12,  TransformImage_NV12  }
};

const DWORD   g_cFormats = ARRAYSIZE(g_FormatConversions);

//-------------------------------------------------------------------
// Constructor
//-------------------------------------------------------------------

DrawDevice::DrawDevice() :
m_hwnd(NULL),
	m_pD3D(NULL),
	m_pDevice(NULL),
	m_pSwapChain(NULL),
	m_format(D3DFMT_UNKNOWN),
	m_width(0),
	m_height(0),
	m_lDefaultStride(0),
	m_interlace(MFVideoInterlace_Unknown),
	m_convertFn(NULL)
{
	m_PixelAR.Denominator = m_PixelAR.Numerator = 1; 

	ZeroMemory(&m_d3dpp, sizeof(m_d3dpp));
}

//-------------------------------------------------------------------
// Destructor
//-------------------------------------------------------------------

DrawDevice::~DrawDevice()
{
	DestroyDevice();
}

//-------------------------------------------------------------------
// GetFormat
//
// Get a supported output format by index.
//-------------------------------------------------------------------

HRESULT DrawDevice::GetFormat(DWORD index, GUID *pSubtype) const
{
	if (index < g_cFormats)
	{
		*pSubtype = g_FormatConversions[index].subtype;
		return S_OK;
	}
	return MF_E_NO_MORE_TYPES;
}

//-------------------------------------------------------------------
//  IsFormatSupported
//
//  Query if a format is supported.
//-------------------------------------------------------------------

BOOL DrawDevice::IsFormatSupported(REFGUID subtype) const
{
	for (DWORD i = 0; i < g_cFormats; i++)
	{
		if (subtype == g_FormatConversions[i].subtype)
		{
			return TRUE;
		}
	}
	return FALSE;
}

//-------------------------------------------------------------------
// CreateDevice
//
// Create the Direct3D device.
//-------------------------------------------------------------------

HRESULT DrawDevice::CreateDevice(HWND hwnd)
{
	if (m_pDevice)
	{
		return S_OK;
	}

	// Create the Direct3D object.
	if (m_pD3D == NULL)
	{
		m_pD3D = Direct3DCreate9(D3D_SDK_VERSION);

		if (m_pD3D == NULL)
		{
			return E_FAIL;
		}
	}

	HRESULT hr = S_OK;
	D3DPRESENT_PARAMETERS pp = { 0 };
	D3DDISPLAYMODE mode = { 0 };

	hr = m_pD3D->GetAdapterDisplayMode(
		D3DADAPTER_DEFAULT,
		&mode
		);

	if (FAILED(hr)) { return hr; }

	hr = m_pD3D->CheckDeviceType(
		D3DADAPTER_DEFAULT,
		D3DDEVTYPE_HAL,
		mode.Format,
		D3DFMT_X8R8G8B8,
		TRUE    // windowed
		);

	if (FAILED(hr)) { return hr; }

	pp.BackBufferFormat = D3DFMT_X8R8G8B8;
	pp.SwapEffect = D3DSWAPEFFECT_COPY;
	pp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;
	pp.Windowed = TRUE;
	pp.hDeviceWindow = hwnd;

	hr = m_pD3D->CreateDevice(
		D3DADAPTER_DEFAULT,
		D3DDEVTYPE_HAL,
		hwnd,
		D3DCREATE_HARDWARE_VERTEXPROCESSING | D3DCREATE_FPU_PRESERVE,
		&pp,
		&m_pDevice
		);

	if (FAILED(hr)) { return hr; }

	m_hwnd = hwnd;
	m_d3dpp = pp;

	return hr;
}

//-------------------------------------------------------------------
// SetConversionFunction
//
// Set the conversion function for the specified video format.
//-------------------------------------------------------------------

HRESULT DrawDevice::SetConversionFunction(REFGUID subtype)
{
	m_convertFn = NULL;

	for (DWORD i = 0; i < g_cFormats; i++)
	{
		if (g_FormatConversions[i].subtype == subtype)
		{
			m_convertFn = g_FormatConversions[i].xform;
			return S_OK;
		}
	}

	return MF_E_INVALIDMEDIATYPE;
}

//-------------------------------------------------------------------
// SetVideoType
//
// Set the video format.
//-------------------------------------------------------------------

HRESULT DrawDevice::SetVideoType(IMFMediaType *pType)
{
	HRESULT hr = S_OK;
	GUID subtype = { 0 };
	MFRatio PAR = { 0 };

	// Find the video subtype.
	hr = pType->GetGUID(MF_MT_SUBTYPE, &subtype);

	if (FAILED(hr))
	{
		m_format = D3DFMT_UNKNOWN;
		m_convertFn = NULL;
		return hr;
	}

	// Choose a conversion function.
	// (This also validates the format type.)

	hr = SetConversionFunction(subtype); 

	if (FAILED(hr))
	{
		m_format = D3DFMT_UNKNOWN;
		m_convertFn = NULL;
		return hr;
	}

	//
	// Get some video attributes.
	//

	// Get the frame size.
	hr = MFGetAttributeSize(pType, MF_MT_FRAME_SIZE, &m_width, &m_height);

	if (FAILED(hr))
	{
		m_format = D3DFMT_UNKNOWN;
		m_convertFn = NULL;
		return hr;
	}

	// Get the interlace mode. Default: assume progressive.
	m_interlace = (MFVideoInterlaceMode)MFGetAttributeUINT32(
		pType,
		MF_MT_INTERLACE_MODE,
		MFVideoInterlace_Progressive
		);

	// Get the image stride.
	hr = GetDefaultStride(pType, &m_lDefaultStride);

	if (FAILED(hr))
	{
		m_format = D3DFMT_UNKNOWN;
		m_convertFn = NULL;
		return hr;
	}

	// Get the pixel aspect ratio. Default: Assume square pixels (1:1)
	hr = MFGetAttributeRatio(
		pType,
		MF_MT_PIXEL_ASPECT_RATIO,
		(UINT32*)&PAR.Numerator,
		(UINT32*)&PAR.Denominator
		);

	if (SUCCEEDED(hr))
	{
		m_PixelAR = PAR;
	}
	else
	{
		m_PixelAR.Numerator = m_PixelAR.Denominator = 1;
	}

	m_format = (D3DFORMAT)subtype.Data1;

	// Create Direct3D swap chains.

	hr = CreateSwapChains();

	if (FAILED(hr))
	{
		m_format = D3DFMT_UNKNOWN;
		m_convertFn = NULL;
		return hr;
	}

	// Update the destination rectangle for the correct
	// aspect ratio.

	UpdateDestinationRect();

	return hr;
}

//-------------------------------------------------------------------
//  UpdateDestinationRect
//
//  Update the destination rectangle for the current window size.
//  The destination rectangle is letterboxed to preserve the
//  aspect ratio of the video image.
//-------------------------------------------------------------------

void DrawDevice::UpdateDestinationRect()
{
	RECT rcClient;
	RECT rcSrc = { 0, 0, m_width, m_height };

	GetClientRect(m_hwnd, &rcClient);

	rcSrc = CorrectAspectRatio(rcSrc, m_PixelAR);

	m_rcDest = LetterBoxRect(rcSrc, rcClient);

	width=Width(m_rcDest)-60;

	height=Height(m_rcDest);
}

//-------------------------------------------------------------------
// CreateSwapChains
//
// Create Direct3D swap chains.
//-------------------------------------------------------------------

HRESULT DrawDevice::CreateSwapChains()
{
	HRESULT hr = S_OK;

	D3DPRESENT_PARAMETERS pp = { 0 };

	SafeRelease(&m_pSwapChain);

	pp.BackBufferWidth  = m_width;
	pp.BackBufferHeight = m_height;
	pp.Windowed = TRUE;
	pp.SwapEffect = D3DSWAPEFFECT_FLIP;
	pp.hDeviceWindow = m_hwnd;
	pp.BackBufferFormat = D3DFMT_X8R8G8B8;
	pp.Flags =
		D3DPRESENTFLAG_VIDEO | D3DPRESENTFLAG_DEVICECLIP |
		D3DPRESENTFLAG_LOCKABLE_BACKBUFFER;
	pp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;
	pp.BackBufferCount = NUM_BACK_BUFFERS;

	hr = m_pDevice->CreateAdditionalSwapChain(&pp, &m_pSwapChain);

	return hr;
}

//-------------------------------------------------------------------
// DrawFrame
//
// Draw the video frame.
//-------------------------------------------------------------------

HRESULT DrawDevice::DrawFrame(IMFMediaBuffer *pBuffer)
{
	if (m_convertFn == NULL)
	{
		return MF_E_INVALIDREQUEST;
	}

	HRESULT hr = S_OK;
	BYTE *pbScanline0 = NULL;
	LONG lStride = 0;
	D3DLOCKED_RECT lr;

	IDirect3DSurface9 *pSurf = NULL;
	IDirect3DSurface9 *pBB = NULL;

	if (m_pDevice == NULL || m_pSwapChain == NULL)
	{
		return S_OK;
	}

	VideoBufferLock buffer(pBuffer);    // Helper object to lock the video buffer.

	hr = TestCooperativeLevel();

	if (FAILED(hr))
	{
		SafeRelease(&pBB);
		SafeRelease(&pSurf);
		return hr;
	}

	// Lock the video buffer. This method returns a pointer to the first scan
	// line in the image, and the stride in bytes.

	hr = buffer.LockBuffer(m_lDefaultStride, m_height, &pbScanline0, &lStride);

	if (FAILED(hr))
	{
		SafeRelease(&pBB);
		SafeRelease(&pSurf);
		return hr;
	}

	// Get the swap-chain surface.
	hr = m_pSwapChain->GetBackBuffer(0, D3DBACKBUFFER_TYPE_MONO, &pSurf);

	if (FAILED(hr))
	{
		SafeRelease(&pBB);
		SafeRelease(&pSurf);
		return hr;
	}

	// Lock the swap-chain surface.
	hr = pSurf->LockRect(&lr, NULL, D3DLOCK_NOSYSLOCK );

	if (FAILED(hr))
	{
		SafeRelease(&pBB);
		SafeRelease(&pSurf);
		return hr;
	}

	// Convert the frame. This also copies it to the Direct3D surface.

	m_convertFn(
		(BYTE*)lr.pBits,
		lr.Pitch,
		pbScanline0,
		lStride,
		m_width,
		m_height
		);

	hr = pSurf->UnlockRect();

	if (FAILED(hr))
	{
		SafeRelease(&pBB);
		SafeRelease(&pSurf);
		return hr;
	}

	// Color fill the back buffer.
	hr = m_pDevice->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &pBB);

	if (FAILED(hr))
	{
		SafeRelease(&pBB);
		SafeRelease(&pSurf);
		return hr;
	}

	hr = m_pDevice->ColorFill(pBB, NULL, D3DCOLOR_XRGB(0, 0,0x80 ));

	if (FAILED(hr))
	{
		SafeRelease(&pBB);
		SafeRelease(&pSurf);
		return hr;
	}

	IDirect3DSurface9* pRenderTarget;
	LPDIRECT3DTEXTURE9 imagetex; // texture our overlay image will be loaded into

	D3DXVECTOR3 imagepos; // vector for the position of the sprite

	m_pDevice->CreateOffscreenPlainSurface(1040, 690, D3DFMT_A8R8G8B8, D3DPOOL_SYSTEMMEM, &pRenderTarget, 0);

	RECT rect;
	rect.left = 0;
	rect.top = 0;
	rect.right = 1040;  // matches the surface width
	rect.bottom = 690;  // matches the surface height

	m_pDevice->BeginScene();
	D3DXCreateSprite(m_pDevice, &textSprite);
	D3DXCreateTextureFromResource(m_pDevice, GetModuleHandle(NULL), MAKEINTRESOURCE(IMAGE), &imagetex);

	// Draw the overlay image with alpha blending.
	textSprite->Begin(D3DXSPRITE_ALPHABLEND);

	imagepos.x = 0.0f;  // x coord of our sprite
	imagepos.y = 18.0f; // y coord of our sprite
	imagepos.z = 0.0f;  // z coord of our sprite
	textSprite->Draw(imagetex, NULL, NULL, &imagepos, 0xFFFFFFFF);
	textSprite->End();

	m_pDevice->SetRenderTarget(0, pSurf);
	POINT p;
	p.x = 0;
	p.y = 0;

	m_pDevice->EndScene();

	// Use UpdateSurface to copy the offscreen surface onto the swap-chain surface.
	m_pDevice->UpdateSurface(pRenderTarget, &rect, pSurf, &p);

	// Release the per-frame resources; we don't need them any more.
	SafeRelease(&imagetex);
	SafeRelease(&textSprite);
	if (pRenderTarget)
	{
		pRenderTarget->Release();
		pRenderTarget = 0;
	}

	// save the frame
	if (saveframe)
	{
		D3DXSaveSurfaceToFile(L"Capture.jpg", D3DXIFF_JPG, pSurf, nullptr, nullptr);
		saveframe = false;
	}
	// Blit the frame.

	hr = m_pDevice->StretchRect(pSurf, NULL, pBB, &m_rcDest, D3DTEXF_LINEAR);

	if (FAILED(hr))
	{
		SafeRelease(&pBB);
		SafeRelease(&pSurf);
		return hr;
	}

	// Present the frame.

	hr = m_pDevice->Present(NULL, NULL, NULL, NULL);

	SafeRelease(&pBB);
	SafeRelease(&pSurf);
	return hr;
}

//-------------------------------------------------------------------
// TestCooperativeLevel
//
// Test the cooperative-level status of the Direct3D device.
//-------------------------------------------------------------------

HRESULT DrawDevice::TestCooperativeLevel()
{
	if (m_pDevice == NULL)
	{
		return E_FAIL;
	}

	HRESULT hr = S_OK;

	// Check the current status of D3D9 device.
	hr = m_pDevice->TestCooperativeLevel();

	switch (hr)
	{
	case D3D_OK:
		break;

	case D3DERR_DEVICELOST:
		// The device is lost and cannot be reset yet; try again later.
		hr = S_OK;
		break;

	case D3DERR_DEVICENOTRESET:
		hr = ResetDevice();
		break;

	default:
		// Some other failure.
		break;
	}

	return hr;
}

//-------------------------------------------------------------------
// ResetDevice
//
// Resets the Direct3D device.
//-------------------------------------------------------------------

HRESULT DrawDevice::ResetDevice()
{
	HRESULT hr = S_OK;

	if (m_pDevice)
	{
		D3DPRESENT_PARAMETERS d3dpp = m_d3dpp;

		hr = m_pDevice->Reset(&d3dpp);

		if (FAILED(hr))
		{
			DestroyDevice();
		}
	}

	if (m_pDevice == NULL)
	{
		hr = CreateDevice(m_hwnd);

		if (FAILED(hr)) { goto done; }
	}

	if ((m_pSwapChain == NULL) && (m_format != D3DFMT_UNKNOWN))
	{
		hr = CreateSwapChains();

		if (FAILED(hr)) { goto done; }

		UpdateDestinationRect();
	}

done:

	return hr;
}

//-------------------------------------------------------------------
// DestroyDevice
//
// Release all Direct3D resources.
//-------------------------------------------------------------------

void DrawDevice::DestroyDevice()
{
	SafeRelease(&m_pSwapChain);
	SafeRelease(&m_pDevice);
	SafeRelease(&m_pD3D);
}

//-------------------------------------------------------------------
//
// Conversion functions
//
//-------------------------------------------------------------------

__forceinline BYTE Clip(int clr)
{
	return (BYTE)(clr < 0 ? 0 : ( clr > 255 ? 255 : clr ));
}

__forceinline RGBQUAD ConvertYCrCbToRGB(
	int y,
	int cr,
	int cb
	)
{
	RGBQUAD rgbq;

	int c = y - 16;
	int d = cb - 128;
	int e = cr - 128;

	rgbq.rgbRed =   Clip(( 298 * c           + 409 * e + 128) >> 8);
	rgbq.rgbGreen = Clip(( 298 * c - 100 * d - 208 * e + 128) >> 8);
	rgbq.rgbBlue =  Clip(( 298 * c + 516 * d           + 128) >> 8);

	return rgbq;
}

//-------------------------------------------------------------------
// TransformImage_RGB24
//
// RGB-24 to RGB-32
//-------------------------------------------------------------------

void TransformImage_RGB24(
	BYTE*       pDest,
	LONG        lDestStride,
	const BYTE* pSrc,
	LONG        lSrcStride,
	DWORD       dwWidthInPixels,
	DWORD       dwHeightInPixels
	)
{
	for (DWORD y = 0; y < dwHeightInPixels; y++)
	{
		RGBTRIPLE *pSrcPel = (RGBTRIPLE*)pSrc;
		DWORD *pDestPel = (DWORD*)pDest;

		for (DWORD x = 0; x < dwWidthInPixels; x++)
		{
			pDestPel[x] = D3DCOLOR_XRGB(
				pSrcPel[x].rgbtRed,
				pSrcPel[x].rgbtGreen,
				pSrcPel[x].rgbtBlue
				);
		}

		pSrc += lSrcStride;
		pDest += lDestStride;
	}
}

//-------------------------------------------------------------------
// TransformImage_RGB32
//
// RGB-32 to RGB-32
//
// Note: This function is needed to copy the image from system
// memory to the Direct3D surface.
//-------------------------------------------------------------------

void TransformImage_RGB32(
	BYTE*       pDest,
	LONG        lDestStride,
	const BYTE* pSrc,
	LONG        lSrcStride,
	DWORD       dwWidthInPixels,
	DWORD       dwHeightInPixels
	)
{
	MFCopyImage(pDest, lDestStride, pSrc, lSrcStride, dwWidthInPixels * 4, dwHeightInPixels);
}

//-------------------------------------------------------------------
// TransformImage_YUY2
//
// YUY2 to RGB-32
//-------------------------------------------------------------------

void TransformImage_YUY2(
	BYTE*       pDest,
	LONG        lDestStride,
	const BYTE* pSrc,
	LONG        lSrcStride,
	DWORD       dwWidthInPixels,
	DWORD       dwHeightInPixels
	)
{
	for (DWORD y = 0; y < dwHeightInPixels; y++)
	{
		RGBQUAD *pDestPel = (RGBQUAD*)pDest;
		WORD    *pSrcPel = (WORD*)pSrc;

		for (DWORD x = 0; x < dwWidthInPixels; x += 2)
		{
			// Byte order is U0 Y0 V0 Y1

			int y0 = (int)LOBYTE(pSrcPel[x]);
			int u0 = (int)HIBYTE(pSrcPel[x]);
			int y1 = (int)LOBYTE(pSrcPel[x + 1]);
			int v0 = (int)HIBYTE(pSrcPel[x + 1]);

			pDestPel[x] = ConvertYCrCbToRGB(y0, v0, u0);
			pDestPel[x + 1] = ConvertYCrCbToRGB(y1, v0, u0);
		}

		pSrc += lSrcStride;
		pDest += lDestStride;
	}

}

//-------------------------------------------------------------------
// TransformImage_NV12
//
// NV12 to RGB-32
//-------------------------------------------------------------------

void TransformImage_NV12(
	BYTE* pDst,
	LONG dstStride,
	const BYTE* pSrc,
	LONG srcStride,
	DWORD dwWidthInPixels,
	DWORD dwHeightInPixels
	)
{
	const BYTE* lpBitsY = pSrc;
	const BYTE* lpBitsCb = lpBitsY  + (dwHeightInPixels * srcStride);
	const BYTE* lpBitsCr = lpBitsCb + 1;

	for (UINT y = 0; y < dwHeightInPixels; y += 2)
	{
		const BYTE* lpLineY1 = lpBitsY;
		const BYTE* lpLineY2 = lpBitsY + srcStride;
		const BYTE* lpLineCr = lpBitsCr;
		const BYTE* lpLineCb = lpBitsCb;

		LPBYTE lpDibLine1 = pDst;
		LPBYTE lpDibLine2 = pDst + dstStride;

		for (UINT x = 0; x < dwWidthInPixels; x += 2)
		{
			int  y0 = (int)lpLineY1[0];
			int  y1 = (int)lpLineY1[1];
			int  y2 = (int)lpLineY2[0];
			int  y3 = (int)lpLineY2[1];
			int  cb = (int)lpLineCb[0];
			int  cr = (int)lpLineCr[0];

			RGBQUAD r = ConvertYCrCbToRGB(y0, cr, cb);
			lpDibLine1[0] = r.rgbBlue;
			lpDibLine1[1] = r.rgbGreen;
			lpDibLine1[2] = r.rgbRed;
			lpDibLine1[3] = 0; // Alpha

			r = ConvertYCrCbToRGB(y1, cr, cb);
			lpDibLine1[4] = r.rgbBlue;
			lpDibLine1[5] = r.rgbGreen;
			lpDibLine1[6] = r.rgbRed;
			lpDibLine1[7] = 0; // Alpha

			r = ConvertYCrCbToRGB(y2, cr, cb);
			lpDibLine2[0] = r.rgbBlue;
			lpDibLine2[1] = r.rgbGreen;
			lpDibLine2[2] = r.rgbRed;
			lpDibLine2[3] = 0; // Alpha

			r = ConvertYCrCbToRGB(y3, cr, cb);
			lpDibLine2[4] = r.rgbBlue;
			lpDibLine2[5] = r.rgbGreen;
			lpDibLine2[6] = r.rgbRed;
			lpDibLine2[7] = 0; // Alpha

			lpLineY1 += 2;
			lpLineY2 += 2;
			lpLineCr += 2;
			lpLineCb += 2;

			lpDibLine1 += 8;
			lpDibLine2 += 8;
		}

		pDst += (2 * dstStride);
		lpBitsY   += (2 * srcStride);
		lpBitsCr  += srcStride;
		lpBitsCb  += srcStride;
	}
}

//-------------------------------------------------------------------
// LetterBoxRect
//
// Takes a src rectangle and constructs the largest possible
// destination rectangle within the specified destination rectangle
// such that the video maintains its current shape.
//
// This function assumes that pels are the same shape within both the
// source and destination rectangles.
//
//-------------------------------------------------------------------

RECT    LetterBoxRect(const RECT& rcSrc, const RECT& rcDst)
{
	// figure out src/dest scale ratios
	int iSrcWidth  = Width(rcSrc);
	int iSrcHeight = Height(rcSrc);

	int iDstWidth  = Width(rcDst);
	int iDstHeight = Height(rcDst);

	int iDstLBWidth;
	int iDstLBHeight;

	if (MulDiv(iSrcWidth, iDstHeight, iSrcHeight) <= iDstWidth) {

		// Column letter boxing ("pillar box")

		iDstLBWidth  = MulDiv(iDstHeight, iSrcWidth, iSrcHeight);
		iDstLBHeight = iDstHeight;
	}
	else {

		// Row letter boxing.

		iDstLBWidth  = iDstWidth;
		iDstLBHeight = MulDiv(iDstWidth, iSrcHeight, iSrcWidth);
	}

	// Create a centered rectangle within the current destination rect

	RECT rc;

	LONG left = rcDst.left + ((iDstWidth - iDstLBWidth) / 2);
	LONG top = rcDst.top + ((iDstHeight - iDstLBHeight) / 2);

	SetRect(&rc, left, top, left + iDstLBWidth, top + iDstLBHeight);

	return rc;
}

//-----------------------------------------------------------------------------
// CorrectAspectRatio
//
// Converts a rectangle from the source's pixel aspect ratio (PAR) to 1:1 PAR.
// Returns the corrected rectangle.
//
// For example, a 720 x 486 rect with a PAR of 9:10, when converted to 1x1 PAR,
// is stretched to 720 x 540.
//-----------------------------------------------------------------------------

RECT CorrectAspectRatio(const RECT& src, const MFRatio& srcPAR)
{
	// Start with a rectangle the same size as src, but offset to the origin (0,0).
	RECT rc = {0, 0, src.right - src.left, src.bottom - src.top};

	if ((srcPAR.Numerator != 1) || (srcPAR.Denominator != 1))
	{
		// Correct for the source's PAR.

		if (srcPAR.Numerator > srcPAR.Denominator)
		{
			// The source has "wide" pixels, so stretch the width.
			rc.right = MulDiv(rc.right, srcPAR.Numerator, srcPAR.Denominator);
		}
		else if (srcPAR.Numerator < srcPAR.Denominator)
		{
			// The source has "tall" pixels, so stretch the height.
			rc.bottom = MulDiv(rc.bottom, srcPAR.Denominator, srcPAR.Numerator);
		}
		// else: PAR is 1:1, which is a no-op.
	}
	return rc;
}

//-----------------------------------------------------------------------------
// GetDefaultStride
//
// Gets the default stride for a video frame, assuming no extra padding bytes.
//
//-----------------------------------------------------------------------------

HRESULT GetDefaultStride(IMFMediaType *pType, LONG *plStride)
{
	LONG lStride = 0;

	// Try to get the default stride from the media type.
	HRESULT hr = pType->GetUINT32(MF_MT_DEFAULT_STRIDE, (UINT32*)&lStride);
	if (FAILED(hr))
	{
		// Attribute not set. Try to calculate the default stride.
		GUID subtype = GUID_NULL;

		UINT32 width = 0;
		UINT32 height = 0;

		// Get the subtype and the image size.
		hr = pType->GetGUID(MF_MT_SUBTYPE, &subtype);
		if (SUCCEEDED(hr))
		{
			hr = MFGetAttributeSize(pType, MF_MT_FRAME_SIZE, &width, &height);
		}
		if (SUCCEEDED(hr))
		{
			hr = MFGetStrideForBitmapInfoHeader(subtype.Data1, width, &lStride);
		}

		// Set the attribute for later reference.
		if (SUCCEEDED(hr))
		{
			(void)pType->SetUINT32(MF_MT_DEFAULT_STRIDE, UINT32(lStride));
		}
	}

	if (SUCCEEDED(hr))
	{
		*plStride = lStride;
	}
	return hr;
}
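Before moving on to the walkthrough, here is a quick sanity check of the numbers in the CorrectAspectRatio comment above. This is a portable sketch, not the real Win32 code: it stands in for MulDiv with plain 64-bit arithmetic (MulDiv rounds to nearest, so results can differ by one in edge cases).

```cpp
#include <cstdint>

// Portable stand-in for the PAR correction above: "wide" pixels (num > den)
// stretch the width, "tall" pixels (num < den) stretch the height.
struct FrameSize { long w, h; };

inline FrameSize CorrectPar(FrameSize s, long num, long den)
{
    if (num > den)
        s.w = (long)((int64_t)s.w * num / den);   // wide pixels: stretch width
    else if (num < den)
        s.h = (long)((int64_t)s.h * den / num);   // tall pixels: stretch height
    return s;                                     // 1:1 PAR is a no-op
}
```

For the 720 x 486 frame with 9:10 PAR mentioned in the listing, the height becomes 486 * 10 / 9 = 540, matching the comment.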

First we set the number of back buffers (which is 2), a blue screen colour and a surface for the webcam video information.

Then we have a set of image transforms, from TransformImage_RGB to TransformImage_NV12.

This translates the color information from one format to another.
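To give a flavour of what these transforms do per pixel, here is a sketch of the standard fixed-point BT.601 conversion used in the MSDN capture samples. This is an illustration rather than the tutorial's exact code: the real TransformImage_NV12 walks the Y plane and the interleaved UV plane row by row and writes X8R8G8B8 pixels using a helper much like this one.

```cpp
#include <algorithm>
#include <cstdint>

// Clamp an intermediate value into the 0..255 byte range.
static inline uint8_t Clip(int x)
{
    return (uint8_t)std::min(255, std::max(0, x));
}

// Convert one Y'CbCr triple (video range, BT.601) to an X8R8G8B8 pixel.
inline uint32_t YuvToRgb32(int y, int u, int v)
{
    int c = y - 16, d = u - 128, e = v - 128;
    uint8_t r = Clip((298 * c + 409 * e + 128) >> 8);
    uint8_t g = Clip((298 * c - 100 * d - 208 * e + 128) >> 8);
    uint8_t b = Clip((298 * c + 516 * d + 128) >> 8);
    return ((uint32_t)r << 16) | ((uint32_t)g << 8) | (uint32_t)b;
}
```

Video-range black (Y=16, U=V=128) maps to RGB 0,0,0 and video-range white (Y=235, U=V=128) to RGB 255,255,255.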

The first really interesting bit after that is GetFormat, which sets the pSubType GUID.

IsFormatSupported returns true or false depending on whether the format is supported.

CreateDevice creates the Direct3D 9 device which we will use to draw stuff.

SetConversionFunction works out which Conversion function to use to translate the video.

SetVideoType uses the conversion function to work out and set the video format.

UpdateDestinationRect letterboxes the destination rectangle to preserve the aspect ratio of the video image.

This ensures a good image even when the image is resized.
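Here is a worked example of that letterboxing, using the same branch logic as the LetterBoxRect listing above. It uses a truncating 64-bit stand-in for Win32's MulDiv (which rounds to nearest, so the two can differ by a pixel).

```cpp
#include <cstdint>

// Truncating stand-in for Win32 MulDiv: (a * b) / c computed in 64 bits.
inline long MulDivT(long a, long b, long c)
{
    return (long)((int64_t)a * b / c);
}

struct DstBox { long x, y, w, h; };

// Fit srcW x srcH into dstW x dstH, preserving aspect ratio, centered.
inline DstBox LetterBox(long srcW, long srcH, long dstW, long dstH)
{
    long w, h;
    if (MulDivT(srcW, dstH, srcH) <= dstW) {
        // Pillar box: the height fills the window, bars at the sides.
        w = MulDivT(dstH, srcW, srcH);
        h = dstH;
    } else {
        // Letter box: the width fills the window, bars top and bottom.
        w = dstW;
        h = MulDivT(dstW, srcH, srcW);
    }
    return { (dstW - w) / 2, (dstH - h) / 2, w, h };
}
```

For example, a 1280 x 720 camera frame shown in a 640 x 480 window comes out as a 640 x 360 strip with 60-pixel bars above and below.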

CreateSwapChains creates the D3D swap chains. A little explanation is required, however, on exactly what a swap chain is and why it is so important. A swap chain is a collection of buffers that are used for displaying frames to the user. Each time an application presents a new frame for display, the first buffer in the swap chain takes the place of the displayed buffer. This process is called swapping or flipping.

A graphics adapter holds a pointer to a surface that represents the image being displayed on the monitor, called a front buffer. As the monitor is refreshed, the graphics card sends the contents of the front buffer to the monitor to be displayed. However, this
leads to a problem when rendering real-time graphics. The heart of the problem is that monitor refresh rates are very slow in comparison to the rest of the computer. Common refresh rates range from 60 Hz (60 times per second) to 100 Hz. If your application
is updating the front buffer while the monitor is in the middle of a refresh, the image that is displayed will be cut in half with the upper half of the display containing the old image and the lower half containing the new image. This problem is referred
to as tearing.

Direct3D implements two options to avoid tearing:

1. An option to only allow updates of the monitor on the vertical retrace (or vertical sync) operation. A monitor typically refreshes its image by moving a light pin horizontally, zigzagging from the top left of the monitor and ending at the bottom right. When
the light pin reaches the bottom, the monitor recalibrates the light pin by moving it back to the upper left so that the process can start again. This recalibration is called a vertical sync. During a vertical sync, the monitor is not drawing anything, so
any update to the front buffer will not be seen until the monitor starts to draw again. The vertical sync is relatively slow; however, not slow enough to render a complex scene while waiting. What is needed to avoid tearing and be able to render complex scenes
is a process called back buffering.

2. An option to use a technique called back buffering. Back buffering is the process of drawing a scene to an off-screen surface, called a back buffer. Note that any surface other than the front buffer is called an off-screen surface because it is never directly
viewed by the monitor. By using a back buffer, an application has the freedom to render a scene whenever the system is idle (that is, no windows messages are waiting) without having to consider the monitor's refresh rate. Back buffering brings in an additional
complication of how and when to move the back buffer to the front buffer.

It is option 2 back buffering that we will be using to display our webcam image.
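As a toy illustration of the flip itself, nothing D3D-specific, just the buffer bookkeeping: the back buffer the application renders into trades places with the front buffer the monitor scans out each time the frame is presented. (This is a model for explanation only; in the real program Direct3D manages the buffers for us.)

```cpp
#include <utility>

// Minimal model of a two-buffer swap chain: index 0 is the front buffer
// (being displayed), index 1 is the back buffer (being rendered into).
struct ToySwapChain {
    int buffers[2] = {0, 0};
    int& Front() { return buffers[0]; }
    int& Back()  { return buffers[1]; }
    // Presenting swaps the roles of the two buffers: the "flip".
    void Present() { std::swap(buffers[0], buffers[1]); }
};
```

After rendering a frame into Back() and calling Present(), that frame is what Front() now holds, i.e. what the monitor displays.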

DrawFrame is next and this is where most of the fun things are done.

We set up some IDirect3DSurface9 pointers and lock the video buffer.

We get the swapchain surface and lock it.

We convert the frame. This also copies it to the Direct3D surface.

This is a frame from the webcam video stream.

We color the back buffer blue.

We set up another D3D9 surface; this will hold our text.png.

We set up a D3D9 texture; this is for drawing our textSprite.

We use our D3D9 device to begin a new scene.

We create a sprite, create a blank texture then load from a resource our text.png into our blank texture.

We call Begin on textSprite and set the x, y, z position of our sprite.

We draw the textSprite onto our texture.

We then call end on our textSprite.

We set the render target to pSurf and set up a POINT object called p.

We call EndScene on our device.

We then update the surface.

If saveframe is true we save the file using D3DXSaveSurfaceToFile.

We then Present the Frame.

The rest of the functions are pretty much self-explanatory up until the image transforms.

These functions should not be altered; they cannot be guessed at and have to be learnt. They are the type of function you will use over and over in your WMF apps.

You will need the DirectX SDK for June 2010, which can be found here:

http://www.microsoft...ls.aspx?id=6812

If you are using VS2012 or VS2013 you will need to do some things outlined and explained in this article:

http://blogs.msdn.co...1023-error.aspx

As you can see, you need to go into Control Panel -> Uninstall Programs,

find the Visual C++ 2010 Redistributable Package version 10.0.30319 and delete both the (x86) and (x64) versions.

Install the DirectX SDK for June 2010.

Then download the Microsoft Visual C++ 2010 SP1 Redistributable Package (x86)

found here

http://www.microsoft...ls.aspx?id=8328

and install it.

Well I think that is about it.

You should have the following files before you press compile.

BufferLock.h

Common.h

Device.h

Preview.h

resource.h

resource.rc

Device.cpp

Preview.cpp

main.cpp

The program should link and compile without warnings or errors.

The libs are linked in by pragma comments, so I will not go over the libs required to run this program, as it is unnecessary.

What the program does: it allows you to connect to a webcam through a dialog, displays that webcam's image in a window, overlays a graphic over it and allows you to save the composite image to a file called Capture.jpg.

The code base comes from Anton Polinger's book "Developing Microsoft Media Foundation Applications"; however, that code and book do not show you how to save an image to disk or overlay a graphic. They do, however, show you how to display a webcam image in a window.

Also, the code on MSDN and in Polinger's book is liberally sprinkled with gotos, which I have removed, and I have redesigned the class structures somewhat to accommodate this.

The saving of the image and the graphic overlay come from my own understanding, and I believe the tutorial to be unique in the world of C++ tutorials.

References

Developing Microsoft Media Foundation Applications (Anton Polinger)

MSDN

Link to text.png file for downloading

https://app.box.com/...sd4e4910oia30g6

I hope you have found this Tutorial useful.

Regards

Snoopy.

Finally, a minimal capture snippet from another piece of code, showing the same device enumeration and a synchronous frame grab with an IMFSourceReader (error checking omitted for brevity):

CoInitializeEx(NULL, COINIT_APARTMENTTHREADED | COINIT_DISABLE_OLE1DDE);
MFStartup(MF_VERSION);

// Ask for the list of video capture devices.
IMFActivate **devices = nullptr;
UINT32 count = 0;

IMFAttributes *attributes = nullptr;
MFCreateAttributes(&attributes, 1);
attributes->SetGUID(MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE, MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID);
MFEnumDeviceSources(attributes, &devices, &count);

// Activate the first device and wrap it in a source reader.
IMFSourceReader *reader = nullptr;
IMFMediaSource *source = nullptr;
devices[0]->ActivateObject(__uuidof(IMFMediaSource), (void**)&source);
MFCreateSourceReaderFromMediaSource(source, NULL, &reader);

// Synchronously read one sample from the first video stream.
IMFSample *sample = nullptr;
reader->ReadSample((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0, NULL, NULL, NULL, &sample);
if (sample)
{
  IMFMediaBuffer *buffer = nullptr;
  BYTE *data = nullptr;
  DWORD max = 0, current = 0;
  sample->GetBufferByIndex(0, &buffer);
  buffer->Lock(&data, &max, &current);
  // ... use the frame bytes in data here ...
  buffer->Unlock();
  SafeRelease(&buffer);
}
SafeRelease(&sample);

// Release everything in reverse order of creation.
SafeRelease(&reader);
SafeRelease(&source);
for (UINT32 i = 0; i < count; ++i) SafeRelease(&devices[i]);
CoTaskMemFree(devices);
SafeRelease(&attributes);
MFShutdown();
CoUninitialize();
