The Stagefright Framework
Stagefright Framework (1): The Video Playback Flow
On Android, the default multimedia framework has been OpenCORE. OpenCORE's strengths are cross-platform portability and extensive field testing, which make it comparatively stable; its weakness is that it is large and complex, and costs considerable time to maintain. Starting with Android 2.0, Google introduced Stagefright, a somewhat leaner architecture that has been gradually displacing OpenCORE (Note 1).
[Figure 1] Stagefright's position in the Android multimedia architecture.
[Figure 2] The modules covered by Stagefright (Note 2).
Let's first look at how Stagefright plays a video file.
Stagefright lives in Android as a shared library (libstagefright.so); its module AwesomePlayer handles video/audio playback (Note 3). AwesomePlayer exposes a number of APIs that upper-layer applications (Java/JNI) can call. We will use a simple program to walk through the video playback flow.
In Java, to play a video file we would write:
MediaPlayer mp = new MediaPlayer();
mp.setDataSource(PATH_TO_FILE); ...... (1)
mp.prepare(); ........................ (2)、(3)
mp.start(); .......................... (4)
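Between this Java code and Stagefright sit several glue layers (JNI, the native MediaPlayer proxy, Binder, MediaPlayerService). As a minimal sketch of the idea, assuming the Froyo/Gingerbread sources (names simplified), MediaPlayerService wraps AwesomePlayer in a thin StagefrightPlayer adapter that just forwards each call:
status_t StagefrightPlayer::setDataSource(const char *url)
{
return mPlayer->setDataSource(url); // mPlayer is the underlying AwesomePlayer
}
status_t StagefrightPlayer::prepare()
{
return mPlayer->prepare();
}
So the four numbered calls above map directly onto AwesomePlayer methods, which is what the rest of this article traces.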
In Stagefright, the corresponding handling looks like this:
(1) Assign the file's absolute path to mUri
status_t AwesomePlayer::setDataSource(const char* uri, ...)
{
return setDataSource_l(uri, ...);
}
status_t AwesomePlayer::setDataSource_l(const char* uri, ...)
{
mUri = uri;
}
(2) Start mQueue, which serves as the event handler
status_t AwesomePlayer::prepare()
{
return prepare_l();
}
status_t AwesomePlayer::prepare_l()
{
prepareAsync_l();
while (mFlags & PREPARING)
{
mPreparedCondition.wait(mLock);
}
}
status_t AwesomePlayer::prepareAsync_l()
{
mQueue.start();
mFlags |= PREPARING;
mAsyncPrepareEvent = new AwesomeEvent(
this,
&AwesomePlayer::onPrepareAsyncEvent);
mQueue.postEvent(mAsyncPrepareEvent);
}
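AwesomeEvent is a small adapter around TimedEventQueue::Event; when mQueue fires the event, it calls back into the AwesomePlayer member function it was constructed with. A rough sketch, following the AOSP sources (details abridged):
struct AwesomeEvent : public TimedEventQueue::Event
{
AwesomeEvent(AwesomePlayer *player, void (AwesomePlayer::*method)())
: mPlayer(player), mMethod(method) {}
virtual void fire(TimedEventQueue *queue, int64_t now_us)
{
(mPlayer->*mMethod)(); // e.g. AwesomePlayer::onPrepareAsyncEvent
}
AwesomePlayer *mPlayer;
void (AwesomePlayer::*mMethod)();
};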
(3) onPrepareAsyncEvent is triggered
void AwesomePlayer::onPrepareAsyncEvent()
{
finishSetDataSource_l();
initVideoDecoder(); ...... (3.3)
initAudioDecoder();
}
status_t AwesomePlayer::finishSetDataSource_l()
{
dataSource = DataSource::CreateFromURI(mUri.string(), ...);
sp<MediaExtractor> extractor =
MediaExtractor::Create(dataSource); ..... (3.1)
return setDataSource_l(extractor); ......................... (3.2)
}
(3.1) Parse the file specified by mUri and pick the matching extractor based on its header
sp<MediaExtractor> MediaExtractor::Create(const sp<DataSource> &source, ...)
{
source->sniff(&tmp, ...);
mime = tmp.string();
if (!strcasecmp(mime, MEDIA_MIMETYPE_CONTAINER_MPEG4))
{
return new MPEG4Extractor(source);
}
else if (!strcasecmp(mime, MEDIA_MIMETYPE_AUDIO_MPEG))
{
return new MP3Extractor(source);
}
else if (!strcasecmp(mime, MEDIA_MIMETYPE_AUDIO_AMR_NB))
{
return new AMRExtractor(source);
}
}
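sniff is how the container format gets detected. A sketch of the mechanism, assuming the usual AOSP scheme (signatures simplified): each extractor registers a sniffer function, and DataSource::sniff keeps the answer with the highest confidence.
typedef bool (*SnifferFunc)(const sp<DataSource> &source, String8 *mimeType, float *confidence);
static List<SnifferFunc> gSniffers; // SniffMPEG4, SniffMP3, SniffAMR, ...
bool DataSource::sniff(String8 *mimeType, float *confidence)
{
*confidence = 0.0f;
for (List<SnifferFunc>::iterator it = gSniffers.begin(); it != gSniffers.end(); ++it)
{
String8 newMimeType;
float newConfidence;
if ((*it)(this, &newMimeType, &newConfidence) && newConfidence > *confidence)
{
*mimeType = newMimeType; // keep the most confident guess
*confidence = newConfidence;
}
}
return *confidence > 0.0f;
}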
(3.2) Use the extractor to split the file into audio and video tracks (mVideoTrack/mAudioTrack)
status_t AwesomePlayer::setDataSource_l(const sp<MediaExtractor> &extractor)
{
for (size_t i = 0; i < extractor->countTracks(); ++i)
{
sp<MetaData> meta = extractor->getTrackMetaData(i);
CHECK(meta->findCString(kKeyMIMEType, &mime));
if (!haveVideo && !strncasecmp(mime, "video/", 6))
{
setVideoSource(extractor->getTrack(i));
haveVideo = true;
}
else if (!haveAudio && !strncasecmp(mime, "audio/", 6))
{
setAudioSource(extractor->getTrack(i));
haveAudio = true;
}
}
}
void AwesomePlayer::setVideoSource(sp<MediaSource> source)
{
mVideoTrack = source;
}
(3.3) Choose the video decoder (mVideoSource) according to the codec type recorded in mVideoTrack
status_t AwesomePlayer::initVideoDecoder()
{
mVideoSource = OMXCodec::Create(mClient.interface(),
mVideoTrack->getFormat(),
false,
mVideoTrack);
}
(4) Post mVideoEvent to mQueue to start decoding and playback, handing the decoded frames to mVideoRenderer to be drawn
status_t AwesomePlayer::play()
{
return play_l();
}
status_t AwesomePlayer::play_l()
{
postVideoEvent_l();
}
void AwesomePlayer::postVideoEvent_l(int64_t delayUs)
{
mQueue.postEventWithDelay(mVideoEvent, delayUs);
}
void AwesomePlayer::onVideoEvent()
{
mVideoSource->read(&mVideoBuffer, &options);
[Check Timestamp]
mVideoRenderer->render(mVideoBuffer);
postVideoEvent_l();
}
(Note 1) Starting with Android 2.3 (Gingerbread), the default multimedia framework is Stagefright.
(Note 2) Stagefright's architecture is still evolving; this series does not cover every module.
(Note 3) Audio playback is handled by AudioPlayer; see "Stagefright Framework (6): The Audio Playback Flow".
Stagefright Framework (2): Working with OpenMAX
Stagefright's encode/decode capability is built on the OpenMAX framework, and it actually uses OpenCORE's OMX implementation. Let's look at how Stagefright and OMX work together.
(1) OMX_Init
OMXClient mClient;
AwesomePlayer::AwesomePlayer()
{
mClient.connect();
}
status_t OMXClient::connect()
{
mOMX = service->getOMX();
}
sp<IOMX> MediaPlayerService::getOMX()
{
mOMX = new OMX;
}
OMX::OMX() : mMaster(new OMXMaster)
OMXMaster::OMXMaster()
{
addPlugin(new OMXPVCodecsPlugin);
}
OMXPVCodecsPlugin::OMXPVCodecsPlugin()
{
OMX_MasterInit();
}
OMX_ERRORTYPE OMX_MasterInit() <-- under OpenCORE
{
return OMX_Init();
}
(2) OMX_SendCommand
OMXCodec::function_name()
{
mOMX->sendCommand(mNode, OMX_CommandStateSet, OMX_StateIdle);
}
status_t OMX::sendCommand(node, cmd, param)
{
return findInstance(node)->sendCommand(cmd, param);
}
status_t OMXNodeInstance::sendCommand(cmd, param)
{
OMX_SendCommand(mHandle, cmd, param, NULL);
}
(3) Other commands acting on OMX components
Other commands acting on OMX components follow the same call path as OMX_SendCommand; see the table below:
OMXCodec       OMX            OMXNodeInstance
useBuffer      useBuffer      useBuffer (OMX_UseBuffer)
getParameter   getParameter   getParameter (OMX_GetParameter)
fillBuffer     fillBuffer     fillBuffer (OMX_FillThisBuffer)
emptyBuffer    emptyBuffer    emptyBuffer (OMX_EmptyThisBuffer)
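Each row follows the forwarding pattern already shown for sendCommand. For instance, a sketch of the useBuffer path (parameter lists abridged, as elsewhere in this series):
status_t OMX::useBuffer(node, port_index, params, buffer)
{
return findInstance(node)->useBuffer(port_index, params, buffer);
}
status_t OMXNodeInstance::useBuffer(port_index, params, buffer)
{
OMX_UseBuffer(mHandle, ...); // the actual OpenMAX entry point
}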
(4) Callback Functions
OMX_CALLBACKTYPE OMXNodeInstance::kCallbacks =
{
&OnEvent, <--------------- omx_message::EVENT
&OnEmptyBufferDone, <----- omx_message::EMPTY_BUFFER_DONE
&OnFillBufferDone <------- omx_message::FILL_BUFFER_DONE
}
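These three entries are static functions. Each one recovers the owning OMXNodeInstance from pAppData and repackages the OpenMAX callback as an omx_message, which eventually reaches OMXCodec::on_message (see part four). A rough sketch of one of them, assuming the AOSP plumbing (names may differ slightly):
OMX_ERRORTYPE OMXNodeInstance::OnEmptyBufferDone(OMX_HANDLETYPE hComponent, OMX_PTR pAppData, OMX_BUFFERHEADERTYPE *pBuffer)
{
OMXNodeInstance *instance = static_cast<OMXNodeInstance *>(pAppData);
return instance->owner()->OnEmptyBufferDone(instance->nodeID(), pBuffer);
}
OMX_ERRORTYPE OMX::OnEmptyBufferDone(node_id node, OMX_BUFFERHEADERTYPE *pBuffer)
{
omx_message msg;
msg.type = omx_message::EMPTY_BUFFER_DONE;
msg.u.buffer_data.buffer = ...; // identifies pBuffer
// dispatched to the observer, ending up in OMXCodec::on_message
}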
Stagefright Framework (3): Choosing the Video Decoder
In "Stagefright Framework (1): The Video Playback Flow" we did not go into how Stagefright chooses a suitable video decoder for a given file type; let's look at that now.
(1) The video decoder is decided in initVideoDecoder, which runs inside onPrepareAsyncEvent
OMXCodec::Create() returns the video decoder, which is assigned to mVideoSource.
status_t AwesomePlayer::initVideoDecoder()
{
mVideoSource = OMXCodec::Create(mClient.interface(),
mVideoTrack->getFormat(),
false,
mVideoTrack);
}
sp<MediaSource> OMXCodec::Create(&omx, &meta, createEncoder, &source, matchComponentName)
{
meta->findCString(kKeyMIMEType, &mime);
findMatchingCodecs(mime, ..., &matchingCodecs); ........ (2)
for (size_t i = 0; i < matchingCodecs.size(); ++i)
{
componentName = matchingCodecs[i].string();
softwareCodec =
InstantiateSoftwareCodec(componentName, ...); ..... (3)
if (softwareCodec != NULL) return softwareCodec;
err = omx->allocateNode(componentName, ..., &node); ... (4)
if (err == OK)
{
codec = new OMXCodec(..., componentName, ...); ...... (5)
return codec;
}
}
}
(2) Based on mVideoTrack's MIME type, pick the suitable components out of kDecoderInfo
void OMXCodec::findMatchingCodecs(mime, ..., matchingCodecs)
{
for (int index = 0;; ++index)
{
componentName = GetCodec(
kDecoderInfo,
sizeof(kDecoderInfo)/sizeof(kDecoderInfo[0]),
mime,
index);
if (!componentName) break;
matchingCodecs->push(String8(componentName));
}
}
static const CodecInfo kDecoderInfo[] =
{
...
{ MEDIA_MIMETYPE_VIDEO_MPEG4, "OMX.qcom.video.decoder.mpeg4" },
{ MEDIA_MIMETYPE_VIDEO_MPEG4, "OMX.TI.Video.Decoder" },
{ MEDIA_MIMETYPE_VIDEO_MPEG4, "M4vH263Decoder" },
...
}
GetCodec picks every matching component name out of kDecoderInfo according to the mime and stores them in matchingCodecs, as sketched below.
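A sketch of GetCodec itself, matching the role just described (close to the AOSP version): it walks the table and returns the index-th component whose MIME type matches, or NULL once there are no more matches.
static const char *GetCodec(const CodecInfo *info, size_t numInfos, const char *mime, int index)
{
for (size_t i = 0; i < numInfos; ++i)
{
if (!strcasecmp(mime, info[i].mMime))
{
if (index == 0) return info[i].mCodec;
--index;
}
}
return NULL; // no more matching components
}
Note that table order is preference order: hardware components (OMX.qcom..., OMX.TI...) are listed before the software fallback (M4vH263Decoder).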
(3) Walking matchingCodecs in order, we first check whether each component is a software decoder
static sp<MediaSource> InstantiateSoftwareCodec(name, ...)
{
FactoryInfo kFactoryInfo[] =
{
...
FACTORY_REF(M4vH263Decoder)
...
};
for (i = 0; i < sizeof(kFactoryInfo)/sizeof(kFactoryInfo[0]); ++i)
{
if (!strcmp(name, kFactoryInfo[i].name))
return (*kFactoryInfo[i].CreateFunc)(source);
}
}
All software decoders are listed in kFactoryInfo; the name passed in is matched against it to find the right decoder, via the factory macros sketched below.
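FACTORY_REF is a small macro trick. A sketch of how kFactoryInfo is typically populated, following the AOSP pattern (assumed here): FACTORY_CREATE generates a Make##name factory per decoder class, and FACTORY_REF pairs the class name string with that factory function.
#define FACTORY_CREATE(name) \
static sp<MediaSource> Make##name(const sp<MediaSource> &source) \
{ \
return new name(source); \
}
#define FACTORY_REF(name) { #name, Make##name },
FACTORY_CREATE(M4vH263Decoder)
So matching the string "M4vH263Decoder" yields MakeM4vH263Decoder, which instantiates the software decoder around the track's MediaSource.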
(4) If the component is not a software decoder, try to allocate the corresponding OMX component
status_t OMX::allocateNode(name, ..., node)
{
mMaster->makeComponentInstance(
name,
&OMXNodeInstance::kCallbacks,
instance,
handle);
}
OMX_ERRORTYPE OMXMaster::makeComponentInstance(name, ...)
{
plugin->makeComponentInstance(name, ...);
}
OMX_ERRORTYPE OMXPVCodecsPlugin::makeComponentInstance(name, ...)
{
return OMX_MasterGetHandle(..., name, ...);
}
OMX_ERRORTYPE OMX_MasterGetHandle(...)
{
return OMX_GetHandle(...);
}
(5) If the component is an OMX decoder, it is returned; otherwise the next component is checked
Stagefright Framework (4): The Video Buffer Transfer Flow
This article describes how Stagefright passes buffers to and from the OMX video decoder.
(1) At the start, OMXCodec sends undecoded data to the decoder through the read function and asks the decoder to send the decoded data back
status_t OMXCodec::read(...)
{
if (mInitialBufferSubmit)
{
mInitialBufferSubmit = false;
drainInputBuffers(); <----- OMX_EmptyThisBuffer
fillOutputBuffers(); <----- OMX_FillThisBuffer
}
...
}
void OMXCodec::drainInputBuffers()
{
Vector<BufferInfo> *buffers = &mPortBuffers[kPortIndexInput];
for (i = 0; i < buffers->size(); ++i)
{
drainInputBuffer(&buffers->editItemAt(i));
}
}
void OMXCodec::drainInputBuffer(BufferInfo *info)
{
mOMX->emptyBuffer(...);
}
void OMXCodec::fillOutputBuffers()
{
Vector<BufferInfo> *buffers = &mPortBuffers[kPortIndexOutput];
for (i = 0; i < buffers->size(); ++i)
{
fillOutputBuffer(&buffers->editItemAt(i));
}
}
void OMXCodec::fillOutputBuffer(BufferInfo *info)
{
mOMX->fillBuffer(...);
}
(2) After reading data from the input port, the decoder starts decoding and replies with EmptyBufferDone to notify OMXCodec
void OMXCodec::on_message(const omx_message &msg)
{
switch (msg.type)
{
case omx_message::EMPTY_BUFFER_DONE:
{
IOMX::buffer_id buffer = msg.u.extended_buffer_data.buffer;
drainInputBuffer(&buffers->editItemAt(i));
}
}
}
After receiving EMPTY_BUFFER_DONE, OMXCodec sends the next chunk of undecoded data to the decoder.
(3) The decoder delivers the decoded data to the output port and replies with FillBufferDone to notify OMXCodec
void OMXCodec::on_message(const omx_message &msg)
{
switch (msg.type)
{
case omx_message::FILL_BUFFER_DONE:
{
IOMX::buffer_id buffer = msg.u.extended_buffer_data.buffer;
fillOutputBuffer(info);
mFilledBuffers.push_back(i);
mBufferFilled.signal();
}
}
}
After receiving FILL_BUFFER_DONE, OMXCodec puts the decoded data into mFilledBuffers, signals mBufferFilled, and asks the decoder to keep sending data.
(4) The latter half of the read function waits on the mBufferFilled signal. Once mFilledBuffers has been filled, read assigns it to the buffer pointer and returns it to AwesomePlayer
status_t OMXCodec::read(MediaBuffer **buffer, ...)
{
...
while (mFilledBuffers.empty())
{
mBufferFilled.wait(mLock);
}
BufferInfo *info = &mPortBuffers[kPortIndexOutput].editItemAt(index);
info->mMediaBuffer->add_ref();
*buffer = info->mMediaBuffer;
}
Stagefright Framework (5): Video Rendering
Besides obtaining decoded data through OMXCodec::read, AwesomePlayer::onVideoEvent must also pass that data (mVideoBuffer) to the video renderer so it can be drawn on the screen.
(1) Before the data in mVideoBuffer can be drawn, mVideoRenderer has to be created
void AwesomePlayer::onVideoEvent()
{
...
if (mVideoRenderer == NULL)
{
initRenderer_l();
}
...
}
void AwesomePlayer::initRenderer_l()
{
if (!strncmp("OMX.", component, 4))
{
mVideoRenderer = new AwesomeRemoteRenderer(
mClient.interface()->createRenderer(
mISurface,
component,
...)); .......... (2)
}
else
{
mVideoRenderer = new AwesomeLocalRenderer(
...,
component,
mISurface); ............................ (3)
}
}
(2) If the video decoder is an OMX component, an AwesomeRemoteRenderer is created as mVideoRenderer
As the code in step (1) shows, AwesomeRemoteRenderer is in essence created by OMX::createRenderer. createRenderer first tries to create a hardware renderer, SharedVideoRenderer (libstagefrighthw.so); if that fails, it creates a software renderer, SoftwareRenderer (surface).
sp<IOMXRenderer> OMX::createRenderer(...)
{
VideoRenderer *impl = NULL;
libHandle = dlopen("libstagefrighthw.so", RTLD_NOW);
if (libHandle)
{
CreateRendererFunc func = dlsym(libHandle, ...);
impl = (*func)(...); <----------------- Hardware Renderer
}
if (!impl)
{
impl = new SoftwareRenderer(...); <---- Software Renderer
}
}
(3) If the video decoder is a software component, an AwesomeLocalRenderer is created as mVideoRenderer
AwesomeLocalRenderer's constructor calls its own init function, which does exactly the same thing as OMX::createRenderer.
void AwesomeLocalRenderer::init(...)
{
mLibHandle = dlopen("libstagefrighthw.so", RTLD_NOW);
if (mLibHandle)
{
CreateRendererFunc func = dlsym(...);
mTarget = (*func)(...); <---------------- Hardware Renderer
}
if (mTarget == NULL)
{
mTarget = new SoftwareRenderer(...); <--- Software Renderer
}
}
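AwesomeLocalRenderer::render then simply forwards each decoded buffer to whichever target init picked. A sketch, simplified from the AOSP sources:
void AwesomeLocalRenderer::render(MediaBuffer *buffer)
{
render((const uint8_t *)buffer->data() + buffer->range_offset(),
buffer->range_length());
}
void AwesomeLocalRenderer::render(const void *data, size_t size)
{
mTarget->render(data, size, NULL); // mTarget: hardware or software renderer
}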
(4) Once mVideoRenderer has been created, the decoded data can be handed to it
void AwesomePlayer::onVideoEvent()
{
if (!mVideoBuffer)
{
mVideoSource->read(&mVideoBuffer, ...);
}
[Check Timestamp]
if (mVideoRenderer == NULL)
{
initRenderer_l();
}
mVideoRenderer->render(mVideoBuffer); <----- Render Data
}
Stagefright Framework (6): The Audio Playback Flow
So far we have concentrated on the video side without saying a word about audio. This article starts on the audio processing flow.
The audio side of Stagefright is handled by AudioPlayer, which is created in AwesomePlayer::play_l.
(1) When the upper-layer application requests audio/video playback, AudioPlayer is created and started along with it
status_t AwesomePlayer::play_l()
{
...
mAudioPlayer = new AudioPlayer(mAudioSink, ...);
mAudioPlayer->start(...);
...
}
(2) While starting up, AudioPlayer first reads the first chunk of decoded data and opens the audio output
status_t AudioPlayer::start(...)
{
mSource->read(&mFirstBuffer);
if (mAudioSink.get() != NULL)
{
mAudioSink->open(..., &AudioPlayer::AudioSinkCallback, ...);
mAudioSink->start();
}
else
{
mAudioTrack = new AudioTrack(..., &AudioPlayer::AudioCallback, ...);
mAudioTrack->start();
}
}
Judging from the code of AudioPlayer::start, AudioPlayer does not appear to pass mFirstBuffer to the audio output here.
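In fact, judging from the AOSP sources, mFirstBuffer is consumed later, inside fillBuffer: the first time the callback asks for data, the pre-read buffer is used before anything new is read. A sketch of that detail (simplified):
size_t AudioPlayer::fillBuffer(void *data, size_t size)
{
if (mInputBuffer == NULL)
{
if (mFirstBuffer != NULL)
{
mInputBuffer = mFirstBuffer; // the buffer pre-read in start()
mFirstBuffer = NULL;
}
else
{
mSource->read(&mInputBuffer, ...);
}
}
// ... copy mInputBuffer into data, as shown in step (3) below
}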
(3) When opening the audio output, AudioPlayer registers a callback function with it; each time the callback fires, AudioPlayer reads decoded data from the audio decoder
size_t AudioPlayer::AudioSinkCallback(audioSink, buffer, size, ...)
{
return fillBuffer(buffer, size);
}
void AudioPlayer::AudioCallback(..., info)
{
buffer = info;
fillBuffer(buffer->raw, buffer->size);
}
size_t AudioPlayer::fillBuffer(data, size)
{
mSource->read(&mInputBuffer, ...);
memcpy(data, mInputBuffer->data(), ...);
}
The reading of decoded audio data is thus driven by the callback function; exactly how the audio output drives that callback is not visible from the code here (in the AOSP sources, AudioTrack runs an internal thread that invokes the registered callback whenever the output needs more data). On the other hand, the snippet above shows that once fillBuffer has copied the data (mInputBuffer) into data, the audio output consumes data from there.
(4) The audio decoder's workflow is the same as the video decoder's; see "Stagefright Framework (4): The Video Buffer Transfer Flow".
Stagefright Framework (7): Audio and Video Synchronization
Having covered the audio and video processing flows, we now turn to audio/video synchronization. OpenCORE's approach is to maintain a master clock against which audio and video each pace their output. In Stagefright, audio output is driven by the callback function, and video synchronizes itself to audio's timestamps. The details follow:
(1) When the callback drives AudioPlayer to read decoded data, AudioPlayer obtains two timestamps: mPositionTimeMediaUs and mPositionTimeRealUs
size_t AudioPlayer::fillBuffer(data, size)
{
...
mSource->read(&mInputBuffer, ...);
mInputBuffer->meta_data()->findInt64(kKeyTime, &mPositionTimeMediaUs);
mPositionTimeRealUs = ((mNumFramesPlayed + size_done / mFrameSize) * 1000000) / mSampleRate;
...
}
mPositionTimeMediaUs is the timestamp carried inside the data itself; mPositionTimeRealUs is the actual playback time of that data, derived from the number of frames played and the sample rate.
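For example, with mSampleRate = 44100, once 88200 frames in total have been handed to the output, mPositionTimeRealUs = 88200 * 1000000 / 44100 = 2,000,000 µs: by the audio clock, two seconds of audio have actually been played, regardless of what timestamp the data itself carries.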
(2) Video in Stagefright then uses the difference between these two timestamps from AudioPlayer as its playback reference
void AwesomePlayer::onVideoEvent()
{
...
mVideoSource->read(&mVideoBuffer, ...);
mVideoBuffer->meta_data()->findInt64(kKeyTime, &timeUs);
mAudioPlayer->getMediaTimeMapping(&realTimeUs, &mediaTimeUs);
mTimeSourceDeltaUs = realTimeUs - mediaTimeUs;
nowUs = ts->getRealTimeUs() - mTimeSourceDeltaUs;
latenessUs = nowUs - timeUs;
...
}
AwesomePlayer obtains realTimeUs (i.e., mPositionTimeRealUs) and mediaTimeUs (i.e., mPositionTimeMediaUs) from AudioPlayer and computes their difference, mTimeSourceDeltaUs.
(3) Finally, the video data is scheduled accordingly
void AwesomePlayer::onVideoEvent()
{
...
if (latenessUs > 40000)
{
mVideoBuffer->release();
mVideoBuffer = NULL;
postVideoEvent_l();
return;
}
if (latenessUs < -10000)
{
postVideoEvent_l(10000);
return;
}
mVideoRenderer->render(mVideoBuffer);
...
}
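In other words: if a frame is more than 40 ms late relative to the audio clock, it is dropped and the next video event is posted immediately; if it is more than 10 ms early, rendering is retried after a 10 ms delay; otherwise the frame is handed to mVideoRenderer right away. Video thus continually chases the audio clock, which is what keeps the two streams in sync.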