Common issues with stream publishing

Products / Plugins: Video Call / Voice Call / Live Streaming

Platform / Framework: iOS / Android / macOS / Windows / Web

Last updated: 2021-09-26 16:12


After enabling the traffic control feature (by calling the enableTrafficControl method), why does it only take effect on streams of the main channel when the network condition is poor?

Cause: The enableTrafficControl method of the SDK for Android, iOS, macOS, and Windows does not provide a channel property for now. Therefore, the traffic control feature only takes effect on streams of the main channel by default. For example, when you publish streams on both channels at the same time, traffic control won't take effect on streams of the secondary channel.

In the web app, no images are displayed in the video object after setting its srcObject property to the stream returned by the createStream method.

You can try either of the following two methods:

  • Set the autoplay property on the video object to enable auto playback.
  • Set the controls property on the video object so that users can click the playback control to play manually.

The autoplay property doesn't take effect in Safari on iOS, so the second method is required there.
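The two methods above can be combined in a small helper. This is a minimal sketch, assuming `videoEl` is the `<video>` element you render into and `stream` is the MediaStream returned by createStream; the helper name is ours, not an SDK API:

```javascript
// Sketch: bind a captured stream to a <video> element so it renders.
// attachStream is a hypothetical helper, not part of the SDK.
function attachStream(videoEl, stream) {
  videoEl.srcObject = stream;   // the stream returned by createStream
  videoEl.autoplay = true;      // method 1: enable auto playback
  videoEl.playsInline = true;   // keep inline playback on iOS
  videoEl.controls = true;      // method 2: let users start playback manually
  return videoEl;
}
```

Since autoplay alone is not enough in Safari on iOS, keeping controls enabled gives users a manual fallback there.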

How to publish audio-only streams on your web app?

The SDK publishes both audio and video streams by default. To publish audio-only streams, set the video property of the camera object to false when calling createStream.

let localStream = await zg.createStream({
    camera: {
        video: false, // Do not publish video streams.
        audio: true,
    }
});

Can an empty string be passed to the audioInput and videoInput properties when calling the createStream method on the web app?

  • audioInput: Audio input device. Optional; the default device is used if it is not specified.
  • videoInput: Video input device. Optional; the default device is used if it is not specified.

The SDK uses the default devices when you don't pass values to the audioInput and videoInput properties.

However, the SDK can't identify or access a device when you pass an empty string to these properties. Instead, set them to audioInput: undefined and videoInput: undefined.
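One way to guard against this is to normalize possibly-empty device IDs before building the camera config. A minimal sketch; the helper name and the device ID are ours:

```javascript
// Sketch: an empty string breaks device lookup, so map '' (or null) to
// undefined, which tells the SDK to fall back to the default device.
function normalizeDeviceId(id) {
  return id ? id : undefined;
}

const camera = {
  audioInput: normalizeDeviceId(''),            // '' → undefined: default microphone
  videoInput: normalizeDeviceId('camera-id-1'), // hypothetical device ID, kept as-is
  audio: true,
  video: true,
};
// In the app: await zg.createStream({ camera });
```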

The error NotReadableError: Could not start video source occurs when creating streams on the web app.

You can refer to the following solutions:

  1. The browser may not have been granted media device permission. Check whether the web page is allowed to access the media devices.

  2. The media device may be in use. For example, when the camera is not available, check whether it is being used by another process.
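For the first point, you can query the permission state before creating the stream. A hedged sketch: the Permissions API's 'camera' permission name is not supported by every browser, and the helper takes the permissions object as a parameter so it can be exercised outside the browser; the function name is ours:

```javascript
// Sketch: resolves to 'granted', 'denied', or 'prompt' for camera access.
async function cameraPermissionState(permissions) {
  const status = await permissions.query({ name: 'camera' });
  return status.state;
}
// In the browser: await cameraPermissionState(navigator.permissions)
```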

The error https or localhost required occurs on the web app.

  • Due to the privacy requirements of browser security policies, the Web platform requires HTTPS when invoking the camera; this is enforced by the browser.
  • You can use localhost for integration in the test environment, but remember to use HTTPS when your web app goes live.
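The rule above can be sketched as a quick check before attempting capture. Browsers treat https: pages and localhost as secure contexts; the hostname list here is illustrative, not exhaustive, and the function name is ours:

```javascript
// Sketch: is media capture likely allowed in this context?
function isCaptureContextOk(protocol, hostname) {
  return protocol === 'https:'
    || hostname === 'localhost'
    || hostname === '127.0.0.1';
}
// In the browser: isCaptureContextOk(location.protocol, location.hostname)
```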

How to share the system sounds and microphone sounds when sharing the screen on the web app?

Screen sharing on the web app allows you to select a target window for sharing. The behavior differs by platform:

Windows:

  • When sharing the whole screen: system sounds can be published.
  • When sharing the browser tab: only the sound of the current tab can be published.
  • When sharing the app: app sounds can't be published.
  • The screen sharing feature does not support capturing sound from the microphone. To publish the microphone sound, use the SDK to publish a secondary-channel stream that carries the sound captured by the microphone.

macOS: System sounds can't be published; only video data can be published.
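Putting the parts together, here is a hedged sketch of publishing a screen stream (with whatever sound the platform allows) plus a second microphone-only stream. We assume `zg` is an initialized ZegoExpressEngine instance and that createStream accepts a screen option; the stream IDs and the function name are made up:

```javascript
// Sketch: screen stream (system/tab sound where supported) + mic stream.
async function publishScreenAndMic(zg) {
  const screenStream = await zg.createStream({
    screen: { audio: true },               // system/tab sound, subject to the rules above
  });
  zg.startPublishingStream('screen-stream', screenStream);

  const micStream = await zg.createStream({
    camera: { video: false, audio: true }, // microphone only, no video
  });
  zg.startPublishingStream('mic-stream', micStream);

  return { screenStream, micStream };
}
```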

After calling enablePublishDirectToCDN to publish streams directly to the CDN, is it necessary to set enablePublishDirectToCDN to false again after stopping the stream publishing?

The answer is YES. We recommend calling these two methods in pairs: set enablePublishDirectToCDN back to false every time you stop publishing the stream.
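A sketch of keeping the calls paired around a publish session. The method names follow the Express SDK, but the exact signatures vary by platform, so treat this as an illustration; the wrapper names are ours:

```javascript
// Sketch: enablePublishDirectToCDN(true) before publishing,
// enablePublishDirectToCDN(false) after stopping, always in pairs.
function startCdnPublish(engine, streamId, stream) {
  engine.enablePublishDirectToCDN(true);
  engine.startPublishingStream(streamId, stream);
}

function stopCdnPublish(engine, streamId) {
  engine.stopPublishingStream(streamId);
  engine.enablePublishDirectToCDN(false); // reset so later sessions start clean
}
```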

Is it supported to publish PNG images with transparent channels when capturing data with your own devices?

If you choose to capture audio or video data with your own media devices rather than using the SDK's capturing capability:

The answer is YES; images with a transparent channel can be published. However, if you render the published images with the SDK (including local preview and stream playing), rendering may fail and the transparent parts of the image will turn black. Here are the reasons:

  • The SDK doesn't support rendering the transparent channel.
  • Other media players don't support rendering the transparent channel when playing streams from the CDN.

Is setting keys for stream publishing supported? Is the corresponding key required to play the stream?

The Express SDK 1.19.0 or later supports this feature. You can set a key for stream publishing by calling the setPublishStreamEncryptionKey method, and the stream subscriber can then play the stream by calling the setPlayStreamDecryptionKey method with the corresponding key.

This feature is only supported when using the ZEGO co-hosting service. It's not supported when forwarding streams or publishing streams directly to the CDN.
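A sketch of the pairing between publisher and player. The method names come from the Express SDK, but the exact signatures differ across platforms, so this is illustrative only; the wrapper names, stream ID, and key are ours:

```javascript
// Sketch: the publisher sets the encryption key before publishing; the player
// sets the matching decryption key before playing the same stream ID.
function publishEncrypted(engine, streamId, stream, key) {
  engine.setPublishStreamEncryptionKey(key);
  engine.startPublishingStream(streamId, stream);
}

function playEncrypted(engine, streamId, key) {
  engine.setPlayStreamDecryptionKey(streamId, key);
  return engine.startPlayingStream(streamId);
}
```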

How to mirror the image for the local preview and stream playing on the web app?

You can add a CSS transform, for example transform: scale(-1, 1);, to the video object used for the local preview and stream playing.
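For example, a one-line sketch that applies the transform from script, assuming `videoEl` is the preview `<video>` element; the helper name is ours:

```javascript
// Sketch: mirror a preview <video> element horizontally via CSS transform.
function mirrorVideo(videoEl) {
  videoEl.style.transform = 'scale(-1, 1)';
  return videoEl;
}
```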
