# Full Guide To Oboe
Oboe is a C++ library which makes it easy to build high-performance audio apps on Android. Apps communicate with Oboe by reading and writing data to streams.

## Audio streams

Oboe moves audio data between your app and the audio inputs and outputs on your Android device. Your app passes data in and out using a callback function or by reading from and writing to *audio streams*, represented by the class `AudioStream`. The read/write calls can be blocking or non-blocking.

A stream is defined by the following:

*   The *audio device* that is the source or sink for the data in the stream.
*   The *sharing mode* that determines whether a stream has exclusive access to an audio device that might otherwise be shared among multiple streams.
*   The *format* of the audio data in the stream.

### Audio device

Each stream is attached to a single audio device.

An audio device is a hardware interface or virtual endpoint that acts as a source or sink for a continuous stream of digital audio data. Don't confuse an *audio device*
(a built-in mic or Bluetooth headset) with the *Android device* (the phone or watch) that is running your app.

On API 23 and above you can use the `AudioManager` method [getDevices()](https://developer.android.com/reference/android/media/AudioManager.html#getDevices(int)) to discover the audio devices that are available on your Android device. The method returns information about the [type](https://developer.android.com/reference/android/media/AudioDeviceInfo.html) of each device.

Each audio device has a unique ID on the Android device. You can use the ID to bind an audio stream to a specific audio device. However, in most cases you can let Oboe choose the default primary device rather than specifying one yourself.

The audio device attached to a stream determines whether the stream is for input or output. A stream can only move data in one direction. When you define a stream you also set its direction. When you open a stream, Android checks to ensure that the audio device and stream direction agree.

### Sharing mode

A stream has a sharing mode:

*   `SharingMode::Exclusive` (available on API 26+) means the stream has exclusive access to an endpoint on its audio device; the endpoint cannot be used by any other audio stream. If the exclusive endpoint is already in use, it might not be possible for the stream to obtain access to it. Exclusive streams provide the lowest possible latency by bypassing the mixer stage, but they are also more likely to get disconnected. You should close exclusive streams as soon as you no longer need them, so that other apps can access that endpoint. Not all audio devices provide exclusive endpoints. System sounds and sounds from other apps can still be heard when an exclusive stream is in use because they use a different endpoint.

![Oboe exclusive sharing mode diagram](images/oboe-sharing-mode-exclusive.jpg)

*   `SharingMode::Shared` allows Oboe streams to share an endpoint. The operating system will mix all the shared streams assigned to the same endpoint on the audio device.

![Oboe shared sharing mode diagram](images/oboe-sharing-mode-shared.jpg)


You can explicitly request the sharing mode when you create a stream, although you are not guaranteed to receive that mode. By default, the sharing mode is `Shared`.
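
For example, a minimal sketch (using the `AudioStreamBuilder` described later in this guide): request an exclusive stream, then check which mode was actually granted.

```
AudioStreamBuilder builder;
builder.setSharingMode(SharingMode::Exclusive);

std::shared_ptr<oboe::AudioStream> stream;
if (builder.openStream(stream) == Result::OK &&
    stream->getSharingMode() != SharingMode::Exclusive) {
    // The system fell back to Shared mode; the stream still works,
    // but with the mixer (and its extra latency) in the data path.
}
```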

### Audio format

The data passed through a stream has the usual digital audio attributes, which you must specify when you define a stream. These are as follows:

*   Sample format
*   Samples per frame
*   Sample rate

Oboe permits these sample formats:

| AudioFormat | C data type | Notes |
| :------------ | :---------- | :---- |
| I16 | int16_t | common 16-bit samples, [Q0.15 format](https://source.android.com/devices/audio/data_formats#androidFormats) |
| Float | float | -1.0 to +1.0 |

Oboe might perform sample conversion on its own. For example, if an app is writing `AudioFormat::Float` data but the HAL uses `AudioFormat::I16`, Oboe might convert the samples automatically. Conversion can happen in either direction. If your app processes audio input, it is wise to verify the input format and be prepared to convert data if necessary, as in this example:

    AudioFormat dataFormat = stream->getFormat();
    //... later
    if (dataFormat == AudioFormat::I16) {
        convertFloatToPcm16(...)
    }

## Creating an audio stream

The Oboe library follows a [builder design pattern](https://en.wikipedia.org/wiki/Builder_pattern) and provides the class `AudioStreamBuilder`.

### Set the audio stream configuration using an AudioStreamBuilder

Use the builder functions that correspond to the stream parameters. These optional set functions are available:

    AudioStreamBuilder streamBuilder;

    streamBuilder.setDeviceId(deviceId);
    streamBuilder.setDirection(direction);
    streamBuilder.setSharingMode(shareMode);
    streamBuilder.setSampleRate(sampleRate);
    streamBuilder.setChannelCount(channelCount);
    streamBuilder.setFormat(format);
    streamBuilder.setPerformanceMode(perfMode);

Note that these methods do not report errors, such as an undefined constant or a value out of range. The values are checked when the stream is opened.

If you do not specify the deviceId, the default is the primary output device.
If you do not specify the stream direction, the default is an output stream.
For all parameters, you can explicitly set a value, or let the system
assign the optimal value by not specifying the parameter at all or setting
it to `kUnspecified`.
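
For example, this sketch lets the system choose the sample rate and channel count, which is equivalent to not calling these setters at all:

```
streamBuilder.setSampleRate(kUnspecified);
streamBuilder.setChannelCount(kUnspecified);
```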

To be safe, check the state of the audio stream after you create it, as explained in the verification section below.

### Open the Stream

Declare a **shared pointer** for the stream. Make sure it is declared with the appropriate scope. The best place is as a member variable in a managing class or as a global. Avoid declaring it as a local variable because the stream may get deleted when the function returns.

    std::shared_ptr<oboe::AudioStream> mStream;

After you've configured the `AudioStreamBuilder`, call `openStream()` to open the stream:

    Result result = streamBuilder.openStream(mStream);
    if (result != Result::OK) {
        __android_log_print(ANDROID_LOG_ERROR,
                            "AudioEngine",
                            "Error opening stream %s",
                            convertToText(result));
    }


### Verifying stream configuration and additional properties

You should verify the stream's configuration after opening it.

The following properties are guaranteed to be set. However, if a property was
left unspecified, a default value is still chosen, and it should be queried using the
appropriate accessor.

* framesPerCallback
* sampleRate
* channelCount
* format
* direction

The following properties may be changed by the underlying stream construction
*even if explicitly set* and therefore should always be queried by the appropriate
accessor. The property settings will depend on device capabilities.

* bufferCapacityInFrames
* sharingMode (exclusive provides lowest latency)
* performanceMode

The following properties are only set by the underlying stream. They cannot be
set by the application, but should be queried by the appropriate accessor.

* framesPerBurst

The following properties have unusual behavior:

* deviceId is respected when the underlying API is AAudio (API level >= 28), but not when it
is OpenSLES. It can still be set, but no error is reported when an OpenSLES stream
is used; the default device is used rather than the one specified.

* mAudioApi is only a property of the builder; however,
AudioStream::getAudioApi() can be used to query the underlying API that the
stream uses. The API requested in the builder is not guaranteed. In
general, the API should be chosen by Oboe for the best performance and
stability. Since Oboe is designed to be as uniform across both
APIs as possible, this property should not generally be needed.

* mBufferSizeInFrames can only be set on an already open stream (as opposed to a
builder), since it depends on run-time behavior, as shown in the sketch below.
The actual size used may not be what was requested.
Oboe or the underlying API will limit the size between zero and the buffer capacity.
It may also be limited further to reduce glitching on particular devices.
This feature is not supported when using a callback with OpenSL ES.
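
A minimal sketch, assuming an output stream `mStream` that has already been opened: request a buffer size of two bursts and read back the size that was actually granted.

```
int32_t framesPerBurst = mStream->getFramesPerBurst();
auto result = mStream->setBufferSizeInFrames(2 * framesPerBurst);
if (result == Result::OK) {
    int32_t grantedSize = result.value();  // may differ from the requested size
}
```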

Many of the stream's properties may vary (whether or not you set
them) depending on the capabilities of the audio device and the Android device on
which it's running. If you need to know these values, you must query them using
the accessor after the stream has been opened. Additionally, it is useful to know
the parameters the stream was actually granted when you left them unspecified.
As a matter of good defensive programming, you
should check the stream's configuration before using it.
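
For example, this sketch (which assumes the opened stream `mStream` and that `<android/log.h>` is included) logs a few of the granted values:

```
__android_log_print(ANDROID_LOG_INFO, "AudioEngine",
                    "sampleRate = %d, channels = %d, framesPerBurst = %d",
                    mStream->getSampleRate(),
                    mStream->getChannelCount(),
                    mStream->getFramesPerBurst());
```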


There are functions to retrieve the stream setting that corresponds to each
builder setting:


| AudioStreamBuilder set methods | AudioStream get methods |
| :------------------------ | :----------------- |
| `setDataCallback()` |  `getDataCallback()` |
| `setErrorCallback()` |  `getErrorCallback()` |
| `setDirection()` | `getDirection()` |
| `setSharingMode()` | `getSharingMode()` |
| `setPerformanceMode()` | `getPerformanceMode()` |
| `setSampleRate()` | `getSampleRate()` |
| `setChannelCount()` | `getChannelCount()` |
| `setFormat()` | `getFormat()` |
| `setBufferCapacityInFrames()` | `getBufferCapacityInFrames()` |
| `setFramesPerCallback()` | `getFramesPerCallback()` |
|  --  | `getFramesPerBurst()` |
| `setDeviceId()` (not respected on OpenSLES) | `getDeviceId()` |
| `setAudioApi()` (mainly for debugging) | `getAudioApi()` |

The following AudioStreamBuilder fields were added in API 28 to
specify additional information about the AudioStream to the device. Currently,
they have little effect on the stream, but setting them helps applications
interact better with other services.

For more information see: [Usage/ContentTypes](https://source.android.com/devices/audio/attributes).
The InputPreset may be used by the device to process the input stream (such as gain control). By default
it is set to VoiceRecognition, which is optimized for low latency. A short example follows the list below.

* `setUsage(oboe::Usage usage)` - The purpose for creating the stream.
* `setContentType(oboe::ContentType contentType)` - The type of content carried
  by the stream.
* `setInputPreset(oboe::InputPreset inputPreset)` - The recording configuration
  for an audio input.
* `setSessionId(SessionId sessionId)` - Allocates a session ID to connect to the
  Java AudioEffects API.
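
Here is a minimal sketch of these attributes for a hypothetical game that also records speech; the specific enum values are illustrative choices, not requirements:

```
AudioStreamBuilder outputBuilder;
outputBuilder.setUsage(Usage::Game);
outputBuilder.setContentType(ContentType::Music);

AudioStreamBuilder inputBuilder;
inputBuilder.setDirection(Direction::Input);
inputBuilder.setInputPreset(InputPreset::VoiceRecognition);
inputBuilder.setSessionId(SessionId::Allocate);
```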


## Using an audio stream

### State transitions

An Oboe stream is usually in one of five stable states (the error state, Disconnected, is described at the end of this section):

*   Open
*   Started
*   Paused
*   Flushed
*   Stopped

Data only flows through a stream when the stream is in the Started state. To
move a stream between states, use one of the functions that request a state
transition:

    Result result;
    result = stream->requestStart();
    result = stream->requestStop();
    result = stream->requestPause();
    result = stream->requestFlush();

Note that you can only request pause or flush on an output stream.

These functions are asynchronous, and the state change doesn't happen
immediately. When you request a state change, the stream moves to one of the
corresponding transient states:

*   Starting
*   Pausing
*   Flushing
*   Stopping
*   Closing

The state diagram below shows the stable states as rounded rectangles, and the transient states as dotted rectangles.
Though it's not shown, you can call `close()` from any state.

![Oboe Lifecycle](images/oboe-lifecycle.png)

Oboe doesn't provide callbacks to alert you to state changes. One special
function,
`AudioStream::waitForStateChange()`, can be used to wait for a state change.
Note that most apps will not need to call `waitForStateChange()` and can just
request state changes whenever they are needed.

The function does not detect a state change on its own, and does not wait for a
specific state. It waits until the current state
is *different* from `inputState`, which you specify.

For example, after requesting to pause, a stream should immediately enter
the transient state Pausing, and arrive sometime later at the Paused state, though there's no guarantee it will.
Since you can't wait for the Paused state, use `waitForStateChange()` to wait for *any state
other than Pausing*. Here's how that's done:

```
StreamState inputState = StreamState::Pausing;
StreamState nextState = StreamState::Uninitialized;
int64_t timeoutNanos = 100 * kNanosPerMillisecond;
result = stream->requestPause();
result = stream->waitForStateChange(inputState, &nextState, timeoutNanos);
```


If the stream's state is not Pausing (the `inputState`, which we assumed was the
current state at call time), the function returns immediately. Otherwise, it
blocks until the state is no longer Pausing or the timeout expires. When the
function returns, the parameter `nextState` shows the current state of the
stream.

You can use this same technique after requesting start, stop, or flush,
using the corresponding transient state as the inputState. Do not call
`waitForStateChange()` after calling `AudioStream::close()`, since the underlying stream resources
will be deleted as soon as the stream closes. And do not call `close()`
while `waitForStateChange()` is running in another thread.

### Reading and writing to an audio stream

There are two ways to move data in or out of a stream:
1) Read from or write directly to the stream.
2) Specify a data callback object that will get called when the stream is ready.

The callback technique offers the lowest latency performance because the callback code can run in a high-priority thread.
Also, attempting to open a low latency output stream without an audio callback (with the intent to use writes)
may result in a non-low-latency stream.

The read/write technique may be easier when you do not need low latency. Or, when doing both input and output, it is common to use a callback for output and then just do a non-blocking read from the input stream. Then you have both the input and output data available in one high-priority thread.

After the stream is started you can read or write to it using the methods
`AudioStream::read(buffer, numFrames, timeoutNanos)`
and
`AudioStream::write(buffer, numFrames, timeoutNanos)`.

For a blocking read or write that transfers the specified number of frames, set timeoutNanos greater than zero. For a non-blocking call, set timeoutNanos to zero. In either case, the result carries the actual number of frames transferred.
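
For example, a sketch of a blocking write on an output stream (`mStream`, `floatData`, and `numFrames` are assumed to be provided by the app, with `floatData` matching the stream's format):

```
int64_t timeoutNanos = 100 * kNanosPerMillisecond;
auto result = mStream->write(floatData, numFrames, timeoutNanos);
if (result != Result::OK) {
    // Error, for example Result::ErrorDisconnected
} else {
    int32_t framesWritten = result.value(); // less than numFrames only if the call timed out
}
```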

When you read input, you should verify the correct number of
frames was read. If not, the buffer might contain unknown data that could cause an
audio glitch. You can pad the buffer with zeros to create a
silent dropout:

    // sample_type and samplesPerFrame are defined by the app to match the stream's format.
    auto result = mStream->read(audioData, numFrames, timeout);
    if (result != Result::OK) {
        // Error!
    } else if (result.value() != numFrames) {
        // pad the rest of the buffer with zeros
        memset(static_cast<sample_type*>(audioData) + result.value() * samplesPerFrame, 0,
               (numFrames - result.value()) * mStream->getBytesPerFrame());
    }

You can prime the stream's buffer before starting the stream by writing data or silence into it. This must be done in a non-blocking call with timeoutNanos set to zero.

The data in the buffer must match the data format returned by `mStream->getFormat()`.
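
For example, a minimal sketch that primes an output stream (`mStream`) with one burst of silence before starting it, assuming the stream format is `AudioFormat::Float` and using `std::vector` for the buffer:

```
int32_t framesPerBurst = mStream->getFramesPerBurst();
std::vector<float> silence(framesPerBurst * mStream->getChannelCount(), 0.0f);
auto result = mStream->write(silence.data(), framesPerBurst, 0 /* non-blocking */);
if (result == Result::OK) {
    mStream->requestStart();
}
```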

### Closing an audio stream

When you are finished using a stream, close it:

    stream->close();

Do not close a stream while it is being written to or read from in another thread, as this will cause your app to crash. After you close a stream you should not call any of its methods except for querying its properties.

### Disconnected audio stream

An audio stream can become disconnected at any time if one of these events happens:

*   The associated audio device is no longer connected (for example when headphones are unplugged).
*   An error occurs internally.
*   An audio device is no longer the primary audio device.

When a stream is disconnected, it has the state "Disconnected" and calls to `write()` or other functions will return `Result::ErrorDisconnected`. Once a stream is disconnected, all you can do is close it.
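
For example, an app that uses blocking writes (rather than the error callback described below) might detect the disconnect like this; `restartStream()` is a hypothetical application function, not part of Oboe:

```
auto result = mStream->write(floatData, numFrames, timeoutNanos);
if (result == Result::ErrorDisconnected) {
    mStream->close();
    restartStream();  // hypothetical: build and open a new stream
}
```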

If you need to be informed when an audio device is disconnected, write a class
which extends `AudioStreamErrorCallback` and then register your class using `builder.setErrorCallback(yourCallbackClass)`. It is recommended to pass a `shared_ptr`.
If you register a callback, Oboe will automatically close the stream in a separate thread if the stream is disconnected.

Your callback can implement the following methods (called in a separate thread); a minimal sketch follows the list:

* `onErrorBeforeClose(stream, error)` - called when the stream has been disconnected but not yet closed,
  so you can still reference the underlying stream (e.g. `getXRunCount()`).
You can also inform any other threads that may be calling the stream to stop doing so.
Do not delete the stream or modify its stream state in this callback.
* `onErrorAfterClose(stream, error)` - called when the stream has been stopped and closed by Oboe, so the stream cannot be used and calling `getState()` will return `StreamState::Closed`.
During this callback, stream properties (those requested by the builder) can be queried, as well as frames written and read.
The stream can be deleted at the end of this method (as long as it is not referenced in other threads).
Methods that reference the underlying stream should not be called (e.g. `getTimestamp()`, `getXRunCount()`, `read()`, `write()`, etc.).
Opening a separate stream is also a valid use of this callback, especially if the error received is `Result::ErrorDisconnected`.
However, it is important to note that the new audio device may have vastly different properties than the stream that was disconnected.
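
Here is that sketch. `scheduleRestart()` stands in for whatever mechanism your app uses to open a new stream; it is not part of Oboe:

```
class MyErrorCallback : public AudioStreamErrorCallback {
public:
    void onErrorAfterClose(AudioStream *oboeStream, Result error) override {
        if (error == Result::ErrorDisconnected) {
            // The disconnected stream is already closed; ask the app to
            // build and open a new stream, possibly on the new default device.
            scheduleRestart();  // hypothetical application function
        }
    }
};

// When configuring the builder:
auto errorCallback = std::make_shared<MyErrorCallback>();
builder.setErrorCallback(errorCallback);
```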

See the SoundBoard sample for an example of `setErrorCallback()`.

## Optimizing performance

You can optimize the performance of an audio application by using special high-priority threads.

### Using a high priority data callback

If your app reads or writes audio data from an ordinary thread, it may be preempted or experience timing jitter. This can cause audio glitches.
Using larger buffers might guard against such glitches, but a large buffer also introduces longer audio latency.
For applications that require low latency, an audio stream can use an asynchronous callback function to transfer data to and from your app.
The callback runs in a high-priority thread that has better performance.

Your code can access the callback mechanism by implementing the abstract class
`AudioStreamDataCallback`. The stream periodically executes `onAudioReady()` (the
callback function) to acquire the data for its next burst.

The total number of samples that you need to fill is `numFrames * numChannels`.

    class AudioEngine : public AudioStreamDataCallback {
    public:
        DataCallbackResult onAudioReady(
                AudioStream *oboeStream,
                void *audioData,
                int32_t numFrames) override {
            // Fill the output buffer with random white noise.
            const int numChannels = oboeStream->getChannelCount();
            // This code assumes the format is AudioFormat::Float.
            float *output = (float *)audioData;
            for (int frameIndex = 0; frameIndex < numFrames; frameIndex++) {
                for (int channelIndex = 0; channelIndex < numChannels; channelIndex++) {
                    float noise = (float)(drand48() - 0.5);
                    *output++ = noise;
                }
            }
            return DataCallbackResult::Continue;
        }

        bool start() {
            ...
            // register the callback with the builder
            streamBuilder.setDataCallback(this);
        }
    private:
        // application data goes here
    };


Note that the callback must be registered on the builder with `setDataCallback()`. Any
application-specific data can be included within the class itself.

The callback function should not perform a read or write on the stream that invoked it. If the callback belongs to an input stream, your code should process the data that is supplied in the audioData buffer (specified as the second argument). If the callback belongs to an output stream, your code should place data into the buffer.

It is possible to process more than one stream in the callback. You can use one stream as the master, and pass pointers to other streams in the class's private data. Register a callback for the master stream. Then use non-blocking I/O on the other streams. Here is an example of a round-trip callback that passes an input stream to an output stream. The master calling stream is the output
stream. The input stream is included in the class.

The callback does a non-blocking read from the input stream, placing the data into the buffer of the output stream.

    class AudioEngine : public AudioStreamDataCallback {
    public:

        DataCallbackResult onAudioReady(
                AudioStream *oboeStream,
                void *audioData,
                int32_t numFrames) override {
            const int64_t timeoutNanos = 0; // for a non-blocking read
            auto result = recordingStream->read(audioData, numFrames, timeoutNanos);
            // result has type ResultWithValue<int32_t>, which for convenience is coerced
            // to a Result type when compared with another Result.
            if (result == Result::OK) {
                if (result.value() < numFrames) {
                    // replace the missing data with silence
                    // (sample_type and samplesPerFrame are defined by the app to match the format)
                    memset(static_cast<sample_type*>(audioData) + result.value() * samplesPerFrame, 0,
                        (numFrames - result.value()) * oboeStream->getBytesPerFrame());

                }
                return DataCallbackResult::Continue;
            }
            return DataCallbackResult::Stop;
        }

        bool start() {
            ...
            streamBuilder.setDataCallback(this);
        }

        void setRecordingStream(AudioStream *stream) {
          recordingStream = stream;
        }

    private:
        AudioStream *recordingStream;
    };


Note that in this example it is assumed that the input and output streams have the same number of channels, format, and sample rate. The formats of the streams can be mismatched, as long as the code handles the conversion properly.

#### Data Callback - Do's and Don'ts
You should never perform an operation which could block inside `onAudioReady`. Examples of blocking operations include:

- allocating memory using, for example, malloc() or new
- file operations such as opening, closing, reading or writing
- network operations such as streaming
- using mutexes or other synchronization primitives
- sleeping
- stopping or closing the stream
- calling read() or write() on the stream which invoked the callback

The following methods are OK to call:

- AudioStream::get*()
- oboe::convertToText()

### Setting performance mode

Every AudioStream has a *performance mode* which has a large effect on your app's behavior. There are three modes:

* `PerformanceMode::None` is the default mode. It uses a basic stream that balances latency and power savings.
* `PerformanceMode::LowLatency` uses smaller buffers and an optimized data path for reduced latency.
* `PerformanceMode::PowerSaving` uses larger internal buffers and a data path that trades off latency for lower power.

You can select the performance mode by calling `setPerformanceMode()`,
and discover the current mode by calling `getPerformanceMode()`.

If low latency is more important than power savings in your application, use `PerformanceMode::LowLatency`.
This is useful for apps that are very interactive, such as games or keyboard synthesizers.

If saving power is more important than low latency in your application, use `PerformanceMode::PowerSaving`.
This is typical for apps that play back previously generated music, such as streaming audio or MIDI file players.

In the current version of Oboe, in order to achieve the lowest possible latency you must use the `PerformanceMode::LowLatency` performance mode along with a high-priority data callback. Follow this example:

```
// Create a callback object
MyOboeStreamCallback myCallback;

// Create a stream builder
AudioStreamBuilder builder;
builder.setDataCallback(&myCallback);
builder.setPerformanceMode(PerformanceMode::LowLatency);
```

## Thread safety

The Oboe API is not completely [thread safe](https://en.wikipedia.org/wiki/Thread_safety).
You cannot call some of the Oboe functions concurrently from more than one thread at a time.
This is because Oboe avoids using mutexes, which can cause thread preemption and glitches.

To be safe, don't call `waitForStateChange()` or read or write to the same stream from two different threads. Similarly, don't close a stream in one thread while reading or writing to it in another thread.

Calls that return stream settings, like `AudioStream::getSampleRate()` and `AudioStream::getChannelCount()`, are thread safe.

These calls are also thread safe:

* `convertToText()`
* `AudioStream::get*()` except for `getTimestamp()` and `getState()`

<b>Note:</b> When a stream uses an error callback, it's safe to read/write from the callback thread while also closing the stream from the thread in which it is running.


## Code samples

Code samples are available in the [samples folder](../samples).

## Known Issues

The following methods are defined, but will return `Result::ErrorUnimplemented` for OpenSLES streams:

* `getFramesRead()`
* `getFramesWritten()`
* `getTimestamp()`

Additionally, `setDeviceId()` will not be respected by OpenSLES streams.