On Windows 8 there is a bug when switching an Image control's Source: if we switch at short intervals on a timer, the Image does not reliably display the corresponding picture. On top of that, the Image control does not support GIF playback, so playing GIF files is a real headache.
As it happens, I have recently been studying MFSource, and it occurred to me: could an MFSource be used to play GIF images?
It worked. The project lives at https://gifwin8player.codeplex.com/ and you are welcome to join me in maintaining it.
In what follows I will share the technical problems I ran into during development, in the hope that it helps you.
First, an MFSource lets us extend the set of media types MediaElement supports and add new encoders and decoders. Microsoft has an excellent sample, http://code.msdn.microsoft.com/windowsapps/Media-extensions-sample-8e1b8275 , which demonstrates many of these techniques. An MFSource can add protocols the player understands (say we need it to play an encrypted URL) as well as new decoders, while an MFT can add video effects.
My project is based on modifying the sample's MFSource, and it contains only three MFSource-related classes: GIFByteStreamHandler, GIFSrc, and GIFStream. GIFByteStreamHandler handles opening the media, which can arrive either as a stream or as a URL; it also receives the parameters passed in by the application (IPropertyStore *pProps). GIFSrc does the main work: it fills in information about the video, such as its dimensions, and supplies the data for each frame. GIFStream manages the request/response traffic: the player calls RequestSample(IUnknown* pToken) to ask for a frame, and we answer with m_pEventQueue->QueueEventParamUnk(MEMediaSample, GUID_NULL, S_OK, pSample).
Media Foundation works on a message/response mechanism. Unlike Win32 MF, the MF available to Windows Store apps has no message-queue API, but the sample already implements a queue of its own.
My modification strategy is simple: supply the data and the timestamp for a frame whenever one is needed.
The first thing to modify is CGIFByteStreamHandler::BeginCreateObject. If the app sets MediaElement.Source = new Uri(...), we can read that Uri from the LPCWSTR pwszURL parameter. If the app calls MediaElement.SetSource() instead, what arrives is a stream. Note that MediaElement.Source cannot point at arbitrary files (it cannot reference a picture in the Pictures library, for instance), but paths such as ms-appx work.
CGIFSource runs as a singleton, and CGIFSource::BeginOpen opens the stream. The stream WIC wants is an IStream. On the desktop an IMFByteStream can be converted to an IStream directly, but that conversion function cannot be used in Windows Store apps, so we have to go through an IRandomAccessStream as an intermediary. The code looks like this:
```cpp
if (SUCCEEDED(hr))
{
    Microsoft::WRL::ComPtr<ABI::Windows::Storage::Streams::IRandomAccessStream> iRandomstream;

    // Wrap the IMFByteStream in a WinRT IRandomAccessStream...
    hr = MFCreateStreamOnMFByteStreamEx(pStream, IID_PPV_ARGS(&iRandomstream));

    // ...then expose it as the IStream that WIC expects.
    if (SUCCEEDED(hr))
    {
        hr = CreateStreamOverRandomAccessStream(
            reinterpret_cast<IUnknown*>(iRandomstream.Get()),
            IID_PPV_ARGS(&istream));
    }
}
```
Some extra wiring is needed for this: add import "windows.storage.idl"; to GIFSource.idl, include the header #include <Shcore.h>, and link against Shcore.lib. I tried adding #include <Windows.winmd>, but that only produces compile errors. Also note that the import names in the .idl must match the file casing under C:\Program Files (x86)\Windows Kits\8.0\Include\winrt exactly.
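Collected in one place, that wiring looks roughly like this (a sketch; the pragma is just one way to pull in the library, linker settings in the project work too):

```
// --- GIFSource.idl ---
import "windows.storage.idl";

// --- the .cpp that calls CreateStreamOverRandomAccessStream ---
#include <Shcore.h>
#pragma comment(lib, "Shcore.lib")
```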
MF中的异步操做是经过callback模式来作的,
```cpp
// Create an async result object. We'll use it later to invoke the callback.
if (SUCCEEDED(hr))
{
    hr = MFCreateAsyncResult(NULL, pCB, pState, &m_pBeginOpenResult);
}

if (m_pBeginOpenResult)
{
    hr = m_pBeginOpenResult->SetStatus(hr);
    if (SUCCEEDED(hr))
    {
        hr = MFInvokeCallback(m_pBeginOpenResult);
    }
}
```
This code creates an asynchronous result object and then fires it; the actual completion work runs in CGIFByteStreamHandler::Invoke.
After the file is open, we need to create the video's stream description. The following code, in CGIFSource::CreateStream, is called from DeliverPayload():
```cpp
HRESULT CGIFSource::CreateStream(long stream_id)
{
    OutputDebugString(L"call CreateStream function\n");

    HRESULT hr = S_OK;

    IMFMediaType *pType = NULL;
    IMFStreamDescriptor *pSD = NULL;
    CGIFStream *pStream = NULL;
    IMFMediaTypeHandler *pHandler = NULL;

    hr = MFCreateMediaType(&pType);

    if (SUCCEEDED(hr))
    {
        hr = pType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    }
    if (SUCCEEDED(hr))
    {
        hr = pType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB24);
    }

    // Format details.
    if (SUCCEEDED(hr))
    {
        // Frame size
        hr = MFSetAttributeSize(pType, MF_MT_FRAME_SIZE,
                                m_pImage->GetWidth(), m_pImage->GetHeight());
    }
    if (SUCCEEDED(hr))
    {
        // Frame rate
        hr = MFSetAttributeRatio(pType, MF_MT_FRAME_RATE, 24000, 1001); // 23.976 fps
    }
    if (SUCCEEDED(hr))
    {
        // Every uncompressed frame stands alone (all key frames).
        hr = pType->SetUINT32(MF_MT_ALL_SAMPLES_INDEPENDENT, 1);
    }
    if (SUCCEEDED(hr))
    {
        // Create the stream descriptor from the media type.
        hr = MFCreateStreamDescriptor(stream_id, 1, &pType, &pSD);
    }

    // Set the default media type on the stream handler.
    if (SUCCEEDED(hr))
    {
        hr = pSD->GetMediaTypeHandler(&pHandler);
    }
    if (SUCCEEDED(hr))
    {
        hr = pHandler->SetCurrentMediaType(pType);
    }

    // Create the new stream.
    if (SUCCEEDED(hr))
    {
        pStream = new (std::nothrow) CGIFStream(this, pSD, hr);
        if (pStream == NULL)
        {
            hr = E_OUTOFMEMORY;
        }
    }

    if (SUCCEEDED(hr))
    {
        m_streams.AddStream(stream_id, pStream);
        pStream->AddRef();
    }

    SafeRelease(&pType);
    SafeRelease(&pSD);
    SafeRelease(&pHandler);
    SafeRelease(&pStream);
    return hr;
}
```
Once the stream is created, the MF presentation descriptor is built by the function InitPresentationDescriptor().
When that function returns, initialization is complete.
When the user presses play, the system calls CGIFSource::CreatePresentationDescriptor to obtain the video information, then calls CGIFSource::Start. The internal message queue then runs CGIFSource::DoStart, which posts a MESourceStarted event to MF. After that event, CGIFStream::RequestSample is triggered; if the queue maintained by CGIFStream is empty, it asks CGIFSource for data, which ultimately lands in DeliverPayload():
```cpp
HRESULT CGIFSource::DeliverPayload()
{
    // When this method is called, the read buffer contains a complete
    // payload, and the payload belongs to a stream whose type we support.
    wchar_t buf[256];
    swprintf_s(buf, L"call do DeliverPayload function with frame %d", m_CurrentIndex);
    OutputDebugString(buf);

    HRESULT hr = S_OK;

    CGIFStream *pStream = NULL;          // not AddRef'd
    IMFMediaBuffer *pBuffer = NULL;
    IMFSample *pSample = NULL;
    BYTE *pData = NULL;                  // Pointer to the IMFMediaBuffer data.
    IWICBitmapFrameDecode *pWicFrame = NULL;

    // If we are still opening the file, then we might need to create this stream.
    if (SUCCEEDED(hr))
    {
        if (m_state == STATE_OPENING)
        {
            hr = CreateStream(0);
        }
    }

    // Create a media buffer for the payload.
    if (SUCCEEDED(hr))
    {
        hr = MFCreateMemoryBuffer(m_pImage->GetBuffSize(), &pBuffer);
    }
    if (SUCCEEDED(hr))
    {
        hr = pBuffer->Lock(&pData, NULL, NULL);
    }
    if (SUCCEEDED(hr))
    {
        hr = m_pImage->GetFrameBuffByIndex(m_CurrentIndex, &pData);
    }
    if (SUCCEEDED(hr))
    {
        hr = pBuffer->Unlock();
    }
    if (SUCCEEDED(hr))
    {
        hr = pBuffer->SetCurrentLength(m_pImage->GetBuffSize());
    }

    // Create a sample to hold the buffer.
    if (SUCCEEDED(hr))
    {
        hr = MFCreateSample(&pSample);
    }
    if (SUCCEEDED(hr))
    {
        hr = pSample->AddBuffer(pBuffer);
    }

    // Time stamp the sample (MF time is in 100-ns units).
    if (SUCCEEDED(hr))
    {
        hr = pSample->SetSampleTime(m_SampleTimespan);
        m_SampleTimespan += m_pImage->GetFrameDelay() * 10000;
    }

    // Deliver the payload to the stream.
    if (SUCCEEDED(hr))
    {
        hr = m_streams[0]->DeliverPayload(pSample);
    }

    // If the open operation is still pending, check if we're done.
    if (SUCCEEDED(hr))
    {
        if (m_state == STATE_OPENING)
        {
            hr = InitPresentationDescriptor();
            goto done;
        }
    }

    m_CurrentIndex++;
    if (m_CurrentIndex >= m_pImage->GetFrameNumbers())
    {
        hr = m_pEventQueue->QueueEventParamVar(MEEndOfStream, GUID_NULL, S_OK, NULL);
        if (FAILED(hr)) { goto done; }

        hr = QueueAsyncOperation(SourceOp::OP_END_OF_STREAM);
        if (FAILED(hr)) { goto done; }
    }
    else
    {
        if (SUCCEEDED(hr) && StreamsNeedData())
        {
            hr = RequestSample();
        }
    }

done:
    SafeRelease(&pBuffer);
    SafeRelease(&pSample);
    return hr;
}
```
This function converts the image data into a sample, stamps it with a time, and hands it back to CGIFStream, which finally sends hr = m_pEventQueue->QueueEventParamUnk(MEMediaSample, GUID_NULL, S_OK, pSample); to MF. That is how a frame's data gets delivered.
Once the source is written, it still has to be wired into the project. One thing to watch for: extra information must be added to Package.appxmanifest. Right-click the file, choose View Code, and add the following node under Package:
```xml
<Extensions>
  <Extension Category="windows.activatableClass.inProcessServer">
    <InProcessServer>
      <Path>GIFSource.dll</Path>
      <ActivatableClass ActivatableClassId="GIFSource.GIFByteStreamHandler" ThreadingModel="both" />
    </InProcessServer>
  </Extension>
</Extensions>
```
Then register the byte-stream handler with the MediaExtensionManager:

```csharp
_extensionManager.RegisterByteStreamHandler("GIFSource.GIFByteStreamHandler", ".gif", "image/gif");
```