

AudioFlinger is a system service that starts during boot and enables audio use-cases on the platform in two main ways: it provides the access interface through which the upper layers use audio, and it uses the HAL to manage the audio devices. Only by understanding AudioFlinger can we properly dig into the other modules that are built on top of it, so we place it at the forefront of our analysis. Let us proceed.

AudioFlinger Service: Startup and Operation

The mediaserver process is started by init.

It is worth mentioning that AudioFlinger::instantiate is not a static member defined inside AudioFlinger itself, but an implementation inherited from the BinderService class. Several services, including AudioFlinger and AudioPolicyService, derive from this common Binder service helper.
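The shape of this helper can be illustrated with a self-contained sketch. This is a toy simulation, not the real Binder machinery: gServiceRegistry is a stand-in for ServiceManager, and only the CRTP pattern and the service-name lookup mirror the actual code.

```cpp
#include <cassert>
#include <map>
#include <memory>
#include <string>

// Stand-in for ServiceManager: service name -> service instance.
static std::map<std::string, std::shared_ptr<void>> gServiceRegistry;

// Simplified analogue of the BinderService<T> helper (CRTP):
// instantiate() creates the service and registers it under its name.
template <typename SERVICE>
class BinderService {
public:
    static void instantiate() {
        gServiceRegistry[SERVICE::getServiceName()] =
            std::make_shared<SERVICE>();
    }
};

// A toy AudioFlinger that inherits the registration machinery.
class AudioFlinger : public BinderService<AudioFlinger> {
public:
    static const char* getServiceName() { return "media.audio_flinger"; }
};
```

Calling AudioFlinger::instantiate() is then all a process needs to do to publish the service; the base class handles registration.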

BinderService, after completing all the Binder-related procedures [please check the tutorial on native service addition for details], adds the service to the ServiceManager registry. The constructor itself simply initializes some internal variables; there is no code beyond that. AudioFlinger is nonetheless an important entity: other processes obtain it through the ServiceManager interface and call createTrack, openOutput, and a series of other interfaces to drive AudioFlinger's audio processing operations, which we will explain.

At this point it is still just statically allocated state and does no concrete work yet. In terms of the division of responsibilities, AudioPolicyService is the policy maker.

It decides the important questions, such as when to open an audio interface device, and which device each stream type should be routed to. AudioFlinger is the executor of the policy so decided, and carries out tasks such as the following.

How to communicate with the audio device in concrete terms; how to keep the audio state in the existing system sane; how to mix multiple audio streams; and so on. The audio interfaces supported by the audio system [AudioFlinger] fall into three categories.

Each supported audio device interface is implemented as a corresponding shared library. So how does AudioFlinger know which interfaces the current device supports, and which specific audio devices each interface supports?

One of the main responsibilities of AudioPolicyService is to guide AudioFlinger to load device interfaces according to the user configuration. This process eventually calls AudioFlinger's loadHwModule, which works as follows. It first checks whether mAudioHwDevs already contains the audio interface indicated by the name argument, and if so returns directly. Otherwise it loads the specified audio interface, such as "primary", "a2dp" or "usb". The library file name corresponding to an audio interface device follows a fixed format.

For example, the module name of a2dp may be audio. There are two main search paths. Before each device operation, we must first change the mHardwareStatus value. The loaded device is then added to the mAudioHwDevs key-value map, where the key is generated by nextUniqueId; this guarantees that each audio interface has a globally unique id. Completing the module loading of the audio interface is only the first step of a long journey, since each interface contains more than one device.
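The lookup-load-register flow just described can be sketched as follows. This is a simplified simulation, not the real AudioFlinger code: the audio.&lt;name&gt;.&lt;variant&gt;.so naming pattern and the /vendor/lib/hw and /system/lib/hw search directories are assumptions based on common Android conventions, and the actual dlopen and mHardwareStatus handling are omitted.

```cpp
#include <map>
#include <string>
#include <vector>

using audio_module_handle_t = int;

struct AudioHwDevice {          // toy stand-in for the real device entry
    std::string name;
};

static std::map<audio_module_handle_t, AudioHwDevice> mAudioHwDevs;

// Mirrors nextUniqueId(): every loaded interface gets a globally unique id.
static audio_module_handle_t nextUniqueId() {
    static audio_module_handle_t id = 0;
    return ++id;
}

// Candidate library paths for a module such as "primary" or "a2dp".
// Naming pattern and directories are assumed conventions, not verified
// against any particular Android release.
std::vector<std::string> candidatePaths(const std::string& name,
                                        const std::string& variant) {
    const std::string file = "audio." + name + "." + variant + ".so";
    return { "/vendor/lib/hw/" + file, "/system/lib/hw/" + file };
}

// Simplified loadHwModule: return the existing handle if the interface
// was already loaded; otherwise "load" it and record it under a new id.
audio_module_handle_t loadHwModule(const std::string& name) {
    for (const auto& kv : mAudioHwDevs)
        if (kv.second.name == name) return kv.first;   // already loaded
    audio_module_handle_t handle = nextUniqueId();
    mAudioHwDevs[handle] = AudioHwDevice{name};
    return handle;
}
```

Loading the same module name twice returns the original handle, which is exactly the early-return behavior described above.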

When we play back or record an audio stream, which device should be chosen? These decisions are made by AudioPolicyService; we will discuss them in detail later. Let us see how AudioFlinger opens an output channel (an audio interface may contain a number of outputs). The corresponding interface in AudioFlinger is openOutput. We break this function into steps as below.

We have seen that outHwDev is used to store an open audio interface device, as the structure definition above shows. Let us revisit the functions we saw earlier, this time in slightly more detail. When module equals 0, all known audio interface devices are loaded first, and the choice is then made according to devices. When module is non-zero, it indicates that AudioPolicy has specified a particular device id.

In that case, the global mAudioHwDevs map is checked for a device that meets the requirements. Each handle value uniquely identifies an audio device that has been added.

In the first case, when module is 0, all potential devices are loaded. If module is non-zero but no matching entry is found in mAudioHwDevs, the program does not terminate there: it does its best, traversing all the elements in the collection and looking for any audio interface that supports the requested devices.
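The selection logic of the last few paragraphs can be approximated with this hedged sketch. The types and the bitmask matching are simplified stand-ins for the real audio_devices_t handling, and findSuitableHwDev is written here from the description above rather than copied from the source.

```cpp
#include <map>
#include <string>

using audio_module_handle_t = int;
using audio_devices_t = unsigned;

struct AudioHwDevice {
    std::string name;
    audio_devices_t supportedDevices;  // bitmask of devices it can drive
};

static std::map<audio_module_handle_t, AudioHwDevice> mAudioHwDevs;

// Sketch of the selection logic:
//  - module != 0: prefer the exact entry identified by the handle;
//  - module == 0, or the handle was not found: do our best and
//    traverse all loaded interfaces looking for one that supports
//    the requested devices.
AudioHwDevice* findSuitableHwDev(audio_module_handle_t module,
                                 audio_devices_t devices) {
    if (module != 0) {
        auto it = mAudioHwDevs.find(module);
        if (it != mAudioHwDevs.end()) return &it->second;
    }
    for (auto& kv : mAudioHwDevs)
        if (kv.second.supportedDevices & devices) return &kv.second;
    return nullptr;  // nothing supports the requested devices
}
```

Note the fall-through: a bogus non-zero handle degrades gracefully into the full traversal instead of failing outright, matching the "do its best" behavior described above.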

After AudioFlinger::openOutput returns, the channel is open. The next piece of the puzzle is: who puts data into the channel? That is the job of PlaybackThread. There are two different situations: DirectOutput, used when the streams do not need to be mixed, and Mixer, used when they do.

Let us analyze the working of PlaybackThread taking the latter, mixing case as an example. AudioFlinger keeps two global collections of threads, one for recording and one for playback. Looking at the constructor: the first step is to create an AudioMixer object. This is the key to mixing, which we will cover in more detail later. It then checks the number of channels; in the Mixer case there must be more than one channel.
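A minimal sketch of those two constructor steps, with a toy AudioMixer and simplified parameters (the real constructor takes an AudioFlinger pointer, an output stream, a sample format, and more, all omitted here):

```cpp
#include <cassert>
#include <memory>

struct AudioMixer {                 // stand-in for the real AudioMixer
    explicit AudioMixer(size_t frameCount) : frameCount(frameCount) {}
    size_t frameCount;
};

// Sketch of the MixerThread constructor logic described above:
// create the AudioMixer up front, then sanity-check the channel count.
class MixerThread {
public:
    MixerThread(size_t frameCount, unsigned channelCount)
        : mAudioMixer(std::make_unique<AudioMixer>(frameCount)),
          mChannelCount(channelCount) {
        // In the Mixer case there must be more than one channel.
        assert(mChannelCount >= 2 && "MixerThread requires >= 2 channels");
    }
    unsigned channelCount() const { return mChannelCount; }
private:
    std::unique_ptr<AudioMixer> mAudioMixer;
    unsigned mChannelCount;
};
```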

Finally, according to the configuration, initFastMixer determines whether to use the fast mixer. Keep in mind that the task of a playback thread is to continuously serve data requests from the upper layer, pass the data down to the next layer, and eventually write it to the hardware device.

However, the function above does not seem to start a new thread: we see neither an entry into a thread loop nor a call to any function that might cause thread creation. So under what circumstances does MixerThread actually enter its thread loop?

You may have noticed the earlier definition of mPlaybackThreads; we list it again here. At the same time, we can also see that the ancestor of the PlaybackThread class is RefBase.

Thread inherits from RefBase, and by the semantics of strong pointers, the target object's onFirstRef is called when it is first referenced; that is where the thread loop is started, as the implementation below shows. Let us summarize this section: as long as AudioTrack and AudioFlinger keep passing data back and forth, audio playback continues. We will continue the analysis in the next tutorial.
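The onFirstRef mechanism can be demonstrated with a tiny simulation of RefBase and sp&lt;&gt;. The real RefBase is thread-safe and also drives object destruction through the reference count; both aspects are omitted in this sketch, and the names are illustrative.

```cpp
#include <cassert>

// Minimal simulation: onFirstRef() fires the first time an object
// acquires a strong reference, which is the hook a Thread uses to
// actually start running its loop.
class RefBase {
public:
    void incStrong() {
        if (mStrong++ == 0) onFirstRef();   // first reference: notify
    }
    virtual ~RefBase() = default;
protected:
    virtual void onFirstRef() {}
private:
    int mStrong = 0;
};

template <typename T>
class sp {                                  // toy strong pointer
public:
    explicit sp(T* p) : mPtr(p) { if (mPtr) mPtr->incStrong(); }
    T* operator->() const { return mPtr; }
private:
    T* mPtr;
};

// A playback thread that "enters its loop" only on first reference.
class PlaybackThread : public RefBase {
public:
    bool running = false;
protected:
    void onFirstRef() override { running = true; }  // would call run()
};
```

This is why nothing appears to start the thread inside the constructor: the moment the new MixerThread is assigned into a strong-pointer container such as mPlaybackThreads, onFirstRef fires and the loop begins.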

