⚠️ Sunset Notice: This service will be discontinued as of September 30th, 2023.

Did you come here for Live Video Shopping?

This is the documentation for the Bambuser Live Streaming SDK.

If you're looking for documentation about Live Video Shopping, see these pages.

Optional broadcasting features

Switching cameras

The Bambuser broadcasting SDKs for Android and iOS let your app use any of the available cameras on a device.

iOS

You can check for a front camera through the BambuserView hasFrontCamera property, optionally expose a Switch camera button in your app UI, and then hook it up to the BambuserView swapCamera method.

Android

You can either check for the number of available cameras through the Broadcaster getCameraCount() method, or get more detailed information about the cameras through the getSupportedCameras() method. Based on the available cameras you can show suitable UI components in your app, then switch to the next camera using the switchCamera() method or switch to a specific camera through the setCameraId(id) method.
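
For example, a minimal Android sketch (assuming a Broadcaster instance named mBroadcaster, as in the examples further down) that only cycles cameras when the device actually has more than one:

// Switch to the next available camera, but only if the device has more than one.
if (mBroadcaster.getCameraCount() > 1) {
    mBroadcaster.switchCamera();
}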

Setting a title on a broadcast

When your app starts broadcasting you can give it a title. The title will be included in the broadcast metadata you get from the API. When using the API, the titleContains option lets you search for broadcasts with a specific title.

Using hashtags

If your title contains hashtags, we will use those to create individual tags and place them on your broadcast. Those tags will be included in the broadcast metadata you get from the API. Use the hasAnyTags or hasAllTags options in the API to search for broadcasts with tags.

iOS example

[bambuserView setBroadcastTitle:@"Test broadcast #testing"];

Android example

mBroadcaster.setTitle("Test broadcast #testing");

Hashtags

In the examples above, we would pick up #testing from the title and turn it into a tag. The response from the broadcast metadata API would then show the tags in a list: {"tags": [{"text": "testing"}]}. The title itself is left as-is.

Setting the author of a broadcast

Your app probably has many users, and you want to be able to tell which of your broadcasts were made by which user. The easiest way of doing that is to have your app define the author for each broadcast. The author will be included in the broadcast metadata you get from the API. Use the byAuthors option to search for broadcasts by a specific author (or a list of authors).

iOS example

[bambuserView setAuthor:@"John Smith"];

Android example

mBroadcaster.setAuthor("John Smith");

Enabling geolocation

If you want to keep track of your broadcasts' geographical location, simply enable geo positioning. The position will be included as separate lat and lon values in the broadcast metadata you get from the API.

iOS example

[bambuserView setSendPosition:YES];

Android example

mBroadcaster.setSendPosition(true);

Attaching additional metadata

You may add any additional data, structured as you prefer (for example JSON or XML), in a custom string attached to each broadcast. The data you insert will be exposed by the broadcast metadata API in a property named customData.

iOS example

[bambuserView setCustomData:@"any custom metadata you want to attach and parse later"];

Android example

mBroadcaster.setCustomData("any custom metadata you want to attach and parse later");

On-demand archiving

By default, all broadcasts are stored on the Bambuser backend, to support later on-demand viewing. You may override this behaviour in the broadcasting apps if you are only interested in live content.

Note

When server-side archiving is disabled, the Bambuser backend does not store the broadcast content at all. Such broadcasts can only be accessed live, through the API and video player; once the broadcast ends they are gone.

iOS

Disable server-side archiving by setting the BambuserView saveOnServer property to NO.

Android

Disable server-side archiving by calling Broadcaster setSaveOnServer(false).
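
For example, on Android (a minimal sketch; mBroadcaster is a Broadcaster instance as in the earlier examples):

// Keep this broadcast live-only; nothing is archived for later on-demand viewing.
mBroadcaster.setSaveOnServer(false);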

Saving a local copy of a broadcast

Due to constantly changing mobile network conditions around the broadcaster, a live broadcast will usually vary both in frame rate and video resolution. In parallel with the live broadcast, the broadcaster SDKs for Android and iOS can store a full quality video file locally, independent of network conditions.

iOS

Use the BambuserView saveLocally and localFilename properties to store a video file on the device.

Android

Use the Broadcaster storeLocalMedia(File, LocalMediaObserver) method and LocalMediaObserver interface to store a video file on the device. Your app may need the WRITE_EXTERNAL_STORAGE permission.
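
A minimal Android sketch, assuming the code runs in an Activity or other Context; the file name is a placeholder and localMediaObserver stands for your implementation of the LocalMediaObserver interface (its callbacks are omitted here):

// Store a full-quality copy of the broadcast on the device, in parallel with the live stream.
File localCopy = new File(getExternalFilesDir(null), "broadcast-copy.mp4");
mBroadcaster.storeLocalMedia(localCopy, localMediaObserver);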

Taking pictures

In addition to live broadcasting, the SDKs for Android and iOS have basic functionality for taking still pictures.

iOS

The iOS broadcast SDK supports taking simple snapshots at the currently active camera resolution. See the takeSnapshot method.

Android

The Android broadcast SDK supports storing high-resolution pictures or simple snapshots to files through the takePicture(File, Resolution, PictureObserver) method. A list of supported picture resolutions is available through the getSupportedPictureResolutions() method.
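
A minimal Android sketch, assuming the code runs in an Activity or other Context; the file name is a placeholder, pictureResolution stands for one of the values reported by getSupportedPictureResolutions(), and pictureObserver stands for your implementation of the PictureObserver interface:

// Save a still picture to a file while previewing or broadcasting.
File photo = new File(getExternalFilesDir(null), "snapshot.jpg");
mBroadcaster.takePicture(photo, pictureResolution, pictureObserver);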

Broadcast dimensions

The SDKs support setting the maximum broadcast dimensions and optionally enforcing a specific aspect ratio, e.g. 16:9. This is especially useful on Android, where manufacturers offer very different cameras with different supported resolutions.

iOS

The current default maximum video resolution used for live broadcasting is 1280x720. If you want to conserve bandwidth, you can set a lower value through the BambuserView maxBroadcastDimension property.

If you want to enforce a specific aspect ratio, e.g. 16:9, 4:3 or 1:1, see the setOrientation:previewOrientation:withAspect:by: method and the previewFrame property.

Android

The current default camera resolution is 1280x720, but never higher than the camera supports. Older or slower devices may, for example, offer 640x480.

Optionally use getSupportedResolutions() to see which resolutions the current camera supports, then apply a suitable resolution through the setResolution(width, height) method.
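
For example, a minimal Android sketch (1280x720 is used for illustration only; pick a value that getSupportedResolutions() actually reports for the current camera):

// Ask the camera for a capture resolution it supports.
mBroadcaster.setResolution(1280, 720);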

The default maximum video resolution used for live broadcasting is 960x540 and the maximum supported is currently 1280x720. If you want to raise or lower the limit, see the setMaxLiveResolution() method.

If you want to enforce a specific aspect ratio, e.g. 16:9, 4:3 or 1:1, see the Broadcaster setRotation(preview, capture, width, height) and SurfaceViewWithAutoAR setCropToParent(crop) methods.

Uplink speed test and stream health

Before starting a broadcast you can optionally do an uplink speed test to determine whether the current network is fast enough for streaming to the Bambuser servers.

iOS

The BambuserView will do an uplink speed test automatically after initialization. Implement the BambuserViewDelegate uplinkTestComplete delegate method to get the results whenever an uplink speed test is completed. Optionally invoke the BambuserView startLinktest method to initiate uplink speed tests manually.

Android

Implement the UplinkSpeedObserver interface and call Broadcaster setUplinkSpeedObserver(observer) to enable automatic uplink speed testing and get the results whenever an uplink speed test is completed. Optionally invoke the Broadcaster startUplinkTest() method to initiate uplink speed tests manually.
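
A minimal Android sketch, assuming uplinkSpeedObserver stands for your implementation of the UplinkSpeedObserver interface (its callbacks are omitted here):

// Register an observer for uplink speed test results, then kick off a test manually before going live.
mBroadcaster.setUplinkSpeedObserver(uplinkSpeedObserver);
mBroadcaster.startUplinkTest();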

While broadcasting live you can instead observe the stream health, measured automatically at regular intervals by the broadcast library.

iOS

Implement the BambuserViewDelegate healthUpdated delegate method to observe the stream health while broadcasting.

Android

Implement the onStreamHealthUpdate method in the Broadcaster.Observer interface to observe the stream health while broadcasting.
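
A minimal Android sketch of the relevant callback; the int parameter shown here is an assumption, and the Broadcaster.Observer interface has additional methods that are omitted:

// Inside your Broadcaster.Observer implementation (other callbacks omitted):
@Override
public void onStreamHealthUpdate(int health) {
    // Reflect the reported stream health in your broadcasting UI,
    // for example with a signal-strength style indicator.
}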

Additional camera features

The broadcast SDKs for Android and iOS offer simple APIs for camera functionality such as focusing, zooming and toggling an LED torch.

iOS

The BambuserView automatically uses continuous auto-focus. Check the hasLedTorch and maxZoom properties to determine the available features on the current device, then use the torch and zoom properties to apply the desired values.

Android

The Broadcaster by default uses continuous auto-focus on modern devices that support it. Check the hasFocus() method to determine whether the current camera supports focusing, then use the focus() method to toggle between continuous auto-focus and locked focus.

Check the hasZoom(), getZoomRatios() and hasTorch() methods to determine what the current camera supports. Use the setZoom(zoom) and toggleTorch() methods to apply the desired values.
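
For example, a minimal Android sketch (zoomValue is a placeholder for one of the values reported by getZoomRatios()):

// Apply zoom and toggle the LED torch only when the current camera supports them.
if (mBroadcaster.hasZoom()) {
    mBroadcaster.setZoom(zoomValue);
}
if (mBroadcaster.hasTorch()) {
    mBroadcaster.toggleTorch();
}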

Text and audio feedback

The Bambuser Content Manager lets moderators chat with the broadcaster or initiate a voice talkback session. The broadcast SDKs for Android and iOS, which the Bambuser broadcaster apps build upon, come with built-in support for both incoming text and audio talkback.

Incoming text can be shown in a broadcasting app either by using the provided example chat UI or by designing your own. Voice talkback can be integrated through a simple API available in the SDKs, as demonstrated in the included example code.