FAQ for Larix family of applications

Some frequently asked questions about Larix Broadcaster, Larix Screencaster, Larix Player and their SDKs.

Have general setup and usage questions about Larix apps? Check the Larix documentation reference for the full list of instructions for various platforms and servers.

Some specific questions are answered below.

Q1: How does adaptive bitrate (ABR) work?

Adaptive video bitrate is supported in 3 modes:

  • Logarithmic descend - gracefully descends from the maximum bitrate step by step, retrying to raise it back to the previous step every minute. Best fit for good networks.
  • Ladder ascend - first cuts the bitrate by 2/3, then raises it back to normal as far as possible. Retries to raise back to previous steps after 15 seconds, 1.5 minutes and then 5 minutes. Best fit for networks with big losses.
  • Hybrid approach calculates the percentage of actually delivered packets and decreases the target bitrate by that ratio. The minimum bitrate is 25% of the target. Larix tries to restore the bitrate every 30 seconds in 500Kbps steps.
    For TCP protocols (RTMP/RTSP), Hybrid mode also takes latency into account (the period of time needed to send all queued data).
    For SRT it depends only on the ratio between the amount of data queued for sending and the amount actually sent.
    On iOS, Hybrid supports RTMP, RTSP and SRT; on Android it is supported only for SRT.
  • Variable FPS can be used as an option: it reduces bitrate by decreasing FPS in addition to changing the bitrate value.

The trigger for switching to a lower bitrate or frame rate is the number of lost packets within a certain period of time.
For Logarithmic descend it's 4 packets over the last 10 seconds.
For Ladder ascend it equals "bitrate/300000" over the last 10 seconds, e.g. for 2Mbps it's 6 packets (2,000,000 / 300,000, rounded down).
Example for Hybrid: you set the target bitrate to 6000Kbps, and the actual outgoing bitrate is 5000Kbps. Due to network failures the real delivery bitrate drops by 50% to 2500Kbps, so the target bitrate is reduced by half to 3000Kbps. 30 seconds later Larix tries to restore it to 3500Kbps.
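A minimal sketch of that Hybrid calculation (illustrative only; the actual logic lives in StreamConditioner.swift and the Android sources, and may differ):

// Illustrative sketch of the Hybrid ABR rule described above; all names are ours.
func hybridAdjustedBitrate(targetKbps: Double,
                           kbpsQueued: Double,
                           kbpsDelivered: Double) -> Double {
    // Ratio of actually delivered data to data queued for sending.
    let deliveredRatio = min(1.0, kbpsDelivered / max(kbpsQueued, 1))
    // Scale the target by that ratio, but never go below 25% of the target.
    return max(targetKbps * deliveredRatio, targetKbps * 0.25)
}

// Example from the text: 50% delivery halves the target (6000 -> 3000).
// Every 30 seconds Larix would then try adding 500Kbps back until
// the configured target is reached again.
let newTarget = hybridAdjustedBitrate(targetKbps: 6000,
                                      kbpsQueued: 5000,
                                      kbpsDelivered: 2500) // 3000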

For RTMP and RTSP connections we count the lost packets' stats ourselves.
On iOS 11+, the packets are not lost but are kept in the system buffer, so if ABR is not used, there will be an increase in delivery delay.
For SRT, packet loss is defined by the pktSndDrop property and depends on how SRT handled the loss according to its latency and other internal factors.

Can the ABR implementation be changed?
For iOS, the ABR logic is in StreamConditioner.swift.
For Android, it's defined across StreamConditionerBase.java, StreamConditionerLogarithmicDescend.java and StreamConditionerLadderAscend.java.

Q2: Framerate issues: no 24FPS, no 60FPS, wrong FPS, etc.

A typical issue: Larix FPS is set to 25FPS, but the decoder on the receiver side shows 30FPS or no FPS data at all.
First of all, mobile encoders do not add proper SPS information into the content, which causes some decoders to get confused and fall back to some default value like "30". There's no way to set a definite FPS at the moment: the output content will not have proper information in its SPS because the encoder simply doesn't provide it, so we cannot provide it either.

Larix always uses the system encoder, so we cannot precisely control the framerate.

On the iOS platform the produced framerate may be variable; we cannot control it. At the moment we can only use the encoder setting which "recommends" a certain frame rate to the encoder.
This is how it's described in Apple docs: "This is not used to control the frame rate; it is provided as a hint to the video encoder so that it can set up internal configuration before compression begins. The actual frame rate will depend on frame durations and may vary."
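For illustration, this is roughly how such a hint is set on a VideoToolbox compression session (a minimal sketch, not Larix's actual encoder code):

import VideoToolbox

// Create an H.264 compression session and hint the expected frame rate.
// Per Apple's docs this is only a hint, not an enforced rate.
var session: VTCompressionSession?
let status = VTCompressionSessionCreate(allocator: kCFAllocatorDefault,
                                        width: 1280, height: 720,
                                        codecType: kCMVideoCodecType_H264,
                                        encoderSpecification: nil,
                                        imageBufferAttributes: nil,
                                        compressedDataAllocator: nil,
                                        outputCallback: nil, refcon: nil,
                                        compressionSessionOut: &session)
if status == noErr, let session = session {
    VTSessionSetProperty(session,
                         key: kVTCompressionPropertyKey_ExpectedFrameRate,
                         value: NSNumber(value: 25))
}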

The same applies to Android. There we can select a frame rate range from a pre-defined list of ranges supported by the encoder. Some ranges contain just one value (30..30) - that's a "fixed frame rate"; some contain actual ranges (1..25) - that's a "variable frame rate". But that's also just a recommendation.

Android 60FPS support notice: most Android devices with 60fps cameras do not expose this capability to third-party apps, so only the native camera app can use it. So even if your device has 60fps support, most probably Larix won't be able to use it.

Known issues: on the Samsung S21 Ultra with a Qualcomm chip (usually shipped for the North American market) most streaming apps, including Larix, will not stay at 30fps or any other frame rate. After a few seconds of encoding it drops to 15fps regardless of any settings. This must be a hardware encoder issue which we unfortunately cannot control.

Q3: Can I set specific profile and level for output stream encoding?

iOS supports the following video formats: H.264 Baseline Level 3.0, Baseline Level 3.1, Main Level 3.1, and High Profile Level 4.1. Please refer to this article and this article for details.
On Android please refer to this article on profile and this article on level. You need Android 5.0 for profile and Android 6.0 for level. Note that support for profile/level combinations depends on the device's hardware.
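On iOS, for example, a profile/level can be requested from the VideoToolbox encoder roughly like this (a sketch assuming an existing VTCompressionSession; the encoder may still pick the closest combination it supports):

import VideoToolbox

// Hypothetical helper: request H.264 High Profile, Level 4.1 for encoding.
func requestHighProfile(on session: VTCompressionSession) {
    VTSessionSetProperty(session,
                         key: kVTCompressionPropertyKey_ProfileLevel,
                         value: kVTProfileLevel_H264_High_4_1)
}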

Q4: How does Larix handle bitrate setup?

On Android the bitrate parameter is defined by simply typing the value.
On iOS the bitrate parameter has a predefined set of values, and by default it's selected based on the resolution.

In general, once you define a bitrate (or use the default one), the device encoder will use it as the target bitrate for encoding and Larix will publish the stream with that bitrate. If network conditions get worse, at some point you will see frame loss for RTMP and RTSP. SRT will try to compensate for that with its error recovery within the "latency" parameter period.

If you expect your network to be unreliable, you can enable the Adaptive bitrate feature.

The Bitrate matches resolution option is also available in case you don't know the exact value. The following rules apply here (see the sketch after this list):

  • Bitrate is selected according to this table based on resolution:
    ["2160":4500, "1080":3000, "720":2000, "540":1500, "480":1000, "360":700, "288":500, "144":300]
  • If you use HEVC then the bitrate is multiplied by 0.5.
  • If you use 50FPS and above then the bitrate is multiplied by 1.6.
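These rules can be summarized in a short sketch (illustrative only, not Larix's actual code; the fallback value is our assumption):

// Default bitrate (Kbps) by vertical resolution, per the table above.
let bitrateTable = [2160: 4500, 1080: 3000, 720: 2000, 540: 1500,
                    480: 1000, 360: 700, 288: 500, 144: 300]

func defaultBitrateKbps(height: Int, hevc: Bool, fps: Double) -> Int {
    var kbps = Double(bitrateTable[height] ?? 2000) // fallback is hypothetical
    if hevc { kbps *= 0.5 }       // HEVC compresses better, so halve it
    if fps >= 50 { kbps *= 1.6 }  // high frame rate needs more bits
    return Int(kbps)
}

// Example: 1080p HEVC at 60FPS -> 3000 * 0.5 * 1.6 = 2400Kbps
let kbps = defaultBitrateKbps(height: 1080, hevc: true, fps: 60)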

Q5: Can I make my application perform streaming from the background?

How can we stream when the app is closed, is in the background, or the device is locked?

On Android it's available as a non-default feature. To use it, go to the "Advanced options" menu and enable "Background streaming". After the app is restarted, it will keep working in the background. Also notice the "Quit if inactive in background" option.
On iOS the background mode imposes a lot of limitations: the encoder and camera are not accessible, only audio is available. Larix Screencaster is built as an application extension and cannot be used as a foundation for a full-featured background streaming app.

Q6: Can I set input gain (incoming audio volume)?

On iOS you can set input gain using the standard AVAudioSession API. Please refer to this article and this article.
On Android we plan to implement this soon.
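For iOS, a minimal sketch using that AVAudioSession API (not Larix-specific code):

import AVFoundation

// Set the microphone input gain if the current route allows it.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setActive(true)
    if session.isInputGainSettable {
        try session.setInputGain(0.5) // 0.0 (min) ... 1.0 (max)
    }
} catch {
    print("Audio session setup failed: \(error)")
}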

Q7: How can I apply image or text or animation overlay on the outgoing stream?

On iOS image overlays are available as part of the default functionality; you can refer to the SDK sources for implementation details.
In general, you can apply any CoreImage filter to the outgoing video stream. You implement CoreImage filters directly, the same way you'd apply them to a photo. Please refer to this Apple article. You can see an example of an overlay in StreamerSingleCam.swift of LarixSample - there is an internal function overlay() inside rotateAndEncode; uncomment its call to see the result.
On Android it's possible to stream any picture, check MainActivityGLES.java / mCameraLogo. You can find more extended examples in the Camera2demo and CameraFX sample applications from the Android SDK.
HTML overlays: if you'd like to make an overlay based on a web page, you can convert the browser window into a bitmap and then make an overlay with it. On Android it's described here. On iOS, you'll need to convert a WebView into a bitmap.
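On iOS, the core of such an image overlay boils down to Core Image compositing; a minimal sketch (function and variable names are ours, not the SDK's):

import CoreImage

// Composite a logo CIImage over a video frame CIImage.
// `frame` and `logo` are assumed to be created elsewhere,
// e.g. the logo from a UIImage or a web-view snapshot.
func composite(frame: CIImage, logo: CIImage) -> CIImage {
    // Offset the logo from the bottom-left corner, then draw it
    // over the frame using source-over compositing.
    let placed = logo.transformed(by: CGAffineTransform(translationX: 20, y: 20))
    return placed.composited(over: frame)
}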

Q8: My Android device has multiple rear and frontal cameras, how can I use them?

Larix shows all cameras which are available via the system API. Android 10+ provides extended capabilities for capturing from physical cameras, and Larix supports that. However, many manufacturers allow using the additional cameras mostly in their own apps only. So if a device doesn't expose this information, Larix won't be able to work with those cameras.

Q9: What is the minimum latency for outgoing streams?

Glass-to-glass latency between the sensor of the capturing device and the playback device consists of multiple components. We described all stages of delivery in this article. Mobile device hardware, Larix Broadcaster and the delivery protocol are responsible for the first few stages, and Larix can control only a part of them.

The device's sensor and encoder add some latency, but it's insignificant compared to later stages - just dozens of milliseconds.

Interleaving buffer. Having received frames from the encoder, the Larix streaming library first collects them in a buffer to perform interleaving compensation. This step is needed to align video and audio frames for better processing on the receiver side. The interleaving buffer usually collects about 16 frames of audio and video, which for 30FPS video means about 250ms of delay. Using the Larix SDK it's possible to disable interleaving compensation, but we strongly recommend keeping it enabled to avoid issues on the receiver side.
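As a rough sanity check of that figure (assuming about half of the 16 interleaved frames are video frames, which is our assumption rather than an exact spec):

8 video frames / 30 FPS ≈ 267 ms, i.e. about 250 ms of added delay.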

Then there is a buffer that prepares data for sending via the designated protocol, so the further parts depend on the protocol. Also, each platform has its specifics.
Notice that Larix is capable of streaming via multiple simultaneous connections, and each connection will have its own latency.

iOS-specific buffering.
For RTMP and RTSP the buffer is currently 200 items, which is 3000ms at maximum. However, the latency may be lower if the stream starts close to the first frame of a GOP; the GOP size is 2000ms, so there's a high chance of getting lower latency. You may reduce the buffer in the SDK via a library method.
For SRT and RIST, data from the interleaving buffer is sent directly to the respective protocol libraries, so no additional delay is introduced there. However, SRT has a "latency" parameter and RIST has a "buffer" parameter for their own packet loss compensation algorithms. Read the respective protocols' docs to learn their recommended values.

Android-specific buffering. RTMP, RTSP, SRT and RIST take data from the same buffer, which is 200 items, or 3000ms. You may reduce it in the Larix app via the Advanced options / Enable custom encoder buffer menu item, as well as via a library method.
If you use only SRT or RIST, you may reduce the buffer down to 250ms, which is still enough for interleaving compensation. Just as mentioned for iOS above, SRT has a "latency" parameter and RIST has a "buffer" parameter for their own packet loss compensation algorithms.

The approaches described above may change over time because our team keeps improving them.

Q10: How can I stream from Larix to OBS on my desktop PC over mobile network?

Usually, LAN traffic is separated from the WAN by a router. The OBS computer is in the LAN, but the mobile device connects via the WAN. To let the mobile device reach a computer in your LAN, a translation rule must be created to redirect connections from WAN to LAN. This rule simply tells the router to redirect connections from a WAN IP:port to some LAN computer's IP and port. The rule is set on the router and can usually be created in its "NAT" or "Port forwarding" section. When creating rules, define the UDP protocol for SRT or RIST, and TCP for RTMP or RTSP connections. For example, a rule like "WAN UDP port 5000 -> 192.168.1.10:5000" would let Larix reach an SRT listener on that LAN computer (the addresses here are just an illustration).
The setup details of this NAT rule heavily depend on the router model and network configuration. Routers are different, so check their respective manuals for details on creating translation rules.

Q11: Cannot stream in my local network from Larix for iOS.

Trying to stream over a local WiFi network without Internet access to a local RTMP server, but getting a "No Internet connection" error. All devices use manual IPs and there is no DHCP or DNS server.
This is a known issue of the latest iOS versions. You must specify some DNS server setting (e.g. 1.1.1.1) to make the local WiFi connection work, even without an actual Internet connection.

Q12: Where can I find saved files after video recording?

On iOS, the recordings can be saved to iCloud Drive and Photo Library.
If you want to customize the path via the SDK, refer to Streamer.swift / startRecord() and customize the code block below:

// Build the target file URL in the app's Documents directory,
// named like "MVI_20240131123045.mp4" from the current timestamp.
let documents = try FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
let df = DateFormatter()
df.dateFormat = "yyyyMMddHHmmss"
let fileName = "MVI_" + df.string(from: Date()) + ".mp4"
let fileUrl = documents.appendingPathComponent(fileName)

On Android 8+ you can select any destination for recording, including an SD card. On earlier OS versions we use /DCIM/LarixBroadcaster/ on the internal storage due to an Android limitation.

Q13: How can I combine together the files with split recording?

You may install MP4Box and run a command like "MP4Box -add input_file1.mp4 -cat input_file2.mp4 output_file.mp4".

Q14: USB OTG camera (UVC) does not work on my Android, any chance to make it work?

We've put together a USB OTG support overview to answer this question.

Q15: Why Larix Screencaster does not provide audio from my apps?

Android 9 and earlier do not allow applications to capture audio from other applications. This is a security constraint, so Larix is not able to do that.
On Android 10, recording app audio is allowed from apps which support external recording. To use this option, choose Audio -> Sound settings -> Media sounds.

The iOS platform allows capturing the screen of the user's device but puts some limitations on audio: if you stream your screen, you can only use your microphone.
If the currently opened application supports ReplayKit, then you'll be able to stream its sound as well.

Q16: How can I remove the white bar at the bottom of Larix Player for iOS?

If you're an end user, you can follow this article to disable that white bar.

If you're a subscriber of the Player SDK for iOS, add the following to PlayerView:

override var prefersHomeIndicatorAutoHidden: Bool {
    return true
}

Q17: What do Advanced connections settings mean?

The Settings > Connections > Advanced settings menu has the following options:

  • Max. active connections is the maximum number of simultaneously streamed connections. This value limits the number of connections you can select in the Connections menu. By default it's "3".
  • Reconnect timeout is the time in seconds that Larix will wait before trying to connect again.
  • Don't check network presence. By default, a stream won't start if no network is available, and no retries are performed if the network occasionally goes offline. If this option is enabled, Larix attempts to connect regardless of network presence, both when starting and when resuming a stream.
  • Reconnect timeout w/o network is the time in seconds that Larix will wait before trying to connect again when no network is present. If set to Never, Larix will not perform reconnection.
  • Idle timeout: Larix will stop streaming if no data can be sent to a publishing point during this period; it's set in seconds.
  • Unsent threshold (iOS only) defines the time in seconds for storing unsent data. If more data accumulates in Larix's buffer, Larix will stop streaming.

Q18: Why does low bitrate setting not take effect?

Q: I specified a low bitrate in the Video encoding settings, but my stream doesn't honor it. Why is the bitrate much higher?

This is how quality floor enforcement works on Android: the encoder always tries to produce a perfect-quality picture or preserve details in fast-motion scenes by increasing the bitrate. There's no way to control it in VBR encoding mode, which most mobile devices are equipped with and use by default.

You can find more details in the official Android documentation.

If you need a low bitrate regardless of the quality, please look for a mobile device with CBR encoding support, and specify CBR in the Larix encoder parameters along with the low bitrate setting.

Q19: Is there a trial for your SDKs?

Our SDKs can be tried in action using the corresponding freeware applications available in the app stores; the SDKs contain the source code of those apps.
Also take a look at the architecture overview of Larix Broadcaster for Android and for iOS.
We also provide a couple of tutorial apps as part of the SDKs to show how to start development.

Q20: Are there any limitations on SDKs usage?

No. You may create any number of applications for publishing by your company based on the purchased SDKs. If you unsubscribe, your apps will remain fully functional.

Q21: Can I use your SDK if my SDK subscription is canceled?

Yes, you can use the SDK and release your apps even if the subscription was canceled after one or more months of payments. However, you will not receive SDK updates, nor will you be able to get our technical support.

Q22: I used to be subscribed to SDK before, then canceled, how can I get updates now?

Just purchase the SDK again.

Q23: Can you make a branded app for me and submit it in stores?

No. Our team is concentrated on the core product, and we recommend hiring mobile development professionals who can do custom branding for you.

You may consider contacting integrator companies which have experience with our products. They are not affiliated with Softvelum, but you should definitely try contacting them.

Q24: Can I develop apps with your SDK for my customers?

You must have a separate SDK purchase for each of your customers.
Re-sale or re-distribution of SDKs or their parts outside of the original subscriber organization is forbidden.
So yes, you can develop apps for other companies, but for each new customer you will have to make a separate subscription, under the same terms as for your own development.

Helpdesk

If you still have questions, contact us.