Simple code sample for iOS WebRTC Video Chat (video calling) via QuickBlox SDK API


The VideoChat code sample allows you to easily add video calling and audio calling features into your iOS app. Enable a video call function similar to FaceTime or Skype using this code sample as a basis.

It is built on top of WebRTC technology.

Check out the new feature of the QuickbloxWebRTC SDK — Screen sharing.

System requirements

  • The QuickbloxWebRTC.framework supports the following:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad 2+.
    • iPod Touch 5+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Started with the Video Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following commands in Terminal.app:
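The standard CocoaPods installation command is:

```shell
sudo gem install cocoapods
```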

Step 2: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:
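A minimal way to create and open that file from Terminal (these are conventional commands for this step; your editor of choice may differ):

```shell
touch Podfile
open -a TextEdit Podfile
```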

TextEdit should open showing an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:
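A minimal Podfile sketch. The `QuickBlox` pod name comes from the requirements above; the WebRTC pod name and platform version are assumptions, so check them against the current QuickBlox docs:

```ruby
platform :ios, '8.0'

target 'YourProjectName' do
  pod 'QuickBlox'
  pod 'Quickblox-WebRTC'
end
```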

Step 3: Installing Dependencies

Now you can install the dependencies in your project:
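Run the standard CocoaPods install command in the directory containing the Podfile:

```shell
pod install
```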

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:
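For example (substitute your own project name):

```shell
open YourProjectName.xcworkspace
```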

Step 4: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in the <YourProjectName-Prefix>.pch file:
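The imports below follow the standard umbrella-header convention for these frameworks; verify the header names against your SDK version:

```objectivec
#import <Quickblox/Quickblox.h>
#import <QuickbloxWebRTC/QuickbloxWebRTC.h>
```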

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step 2: Add the framework to your Xcode Project

Drag the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step 3: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step 4: Embedded binary for Dynamic framework

From version 2.4, QuickbloxWebRTC must be added as an embedded binary, as it is a dynamic framework.

Step 5: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in the <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in the build phases of your project. Paste the following snippet in the script:
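A sketch of such a script, adapted from widely shared community scripts for stripping simulator slices from embedded frameworks; it is not necessarily QuickBlox's exact snippet:

```shell
APP_PATH="${TARGET_BUILD_DIR}/${WRAPPER_NAME}"

# Strip architectures that are not being built for (e.g. simulator slices)
# from every embedded framework, so App Store validation accepts the archive.
find "$APP_PATH" -name '*.framework' -type d | while read -r FRAMEWORK
do
    FRAMEWORK_EXECUTABLE_NAME=$(defaults read "$FRAMEWORK/Info.plist" CFBundleExecutable)
    FRAMEWORK_EXECUTABLE_PATH="$FRAMEWORK/$FRAMEWORK_EXECUTABLE_NAME"
    EXTRACTED_ARCHS=()
    for ARCH in $ARCHS
    do
        lipo -extract "$ARCH" "$FRAMEWORK_EXECUTABLE_PATH" -o "$FRAMEWORK_EXECUTABLE_PATH-$ARCH"
        EXTRACTED_ARCHS+=("$FRAMEWORK_EXECUTABLE_PATH-$ARCH")
    done
    lipo -o "$FRAMEWORK_EXECUTABLE_PATH-merged" -create "${EXTRACTED_ARCHS[@]}"
    rm "${EXTRACTED_ARCHS[@]}"
    mv "$FRAMEWORK_EXECUTABLE_PATH-merged" "$FRAMEWORK_EXECUTABLE_PATH"
done
```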

This fixes a known Apple bug that prevents publishing archives to the App Store when a dynamic framework contains simulator slices. The script only runs when archiving.

Life cycle

Call users

To call users just use this method:
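A sketch based on the QuickbloxWebRTC 2.x Objective-C API; exact selector names may vary between SDK versions:

```objectivec
// Opponent user IDs as in the example below.
NSArray *opponentsIDs = @[@2123, @2123, @3122];
QBRTCSession *newSession =
    [[QBRTCClient instance] createNewSessionWithOpponents:opponentsIDs
                                       withConferenceType:QBRTCConferenceTypeVideo];
self.session = newSession;
// Optional userInfo dictionary travels with the call request.
[self.session startCall:nil];
```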

After this your opponents (users with IDs 2123, 2123 and 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to the current session. Each audio/video call has a unique sessionID. This allows you to have more than one independent audio/video conference.

If you want to increase the call timeout, e.g. set it to 60 seconds:
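Using the QBRTCConfig setter mentioned above:

```objectivec
[QBRTCConfig setAnswerTimeInterval:60];
```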

Accept a call

To accept a call request just use this method:
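A sketch using the 2.x session API (an optional userInfo dictionary may be passed instead of nil):

```objectivec
[self.session acceptCall:nil];
```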

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:
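A sketch using the 2.x session API:

```objectivec
[self.session rejectCall:nil];
```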

After this your opponent will receive a reject signal:

Connection life-cycle

Called when a connection is initiated with a user:

Called when a connection is closed for a user:

Called when a connection is established with a user:

Called when a user is disconnected:

Called when a user did not respond to your call within the timeout.

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval

Called when a connection with a user has failed.
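A combined sketch of the QBRTCClientDelegate connection callbacks described above; the selector names follow the 2.x API, so verify them against your SDK headers:

```objectivec
- (void)session:(QBRTCSession *)session startedConnectingToUser:(NSNumber *)userID { /* initiated */ }
- (void)session:(QBRTCSession *)session connectionClosedForUser:(NSNumber *)userID { /* closed */ }
- (void)session:(QBRTCSession *)session connectedToUser:(NSNumber *)userID { /* established */ }
- (void)session:(QBRTCSession *)session disconnectedFromUser:(NSNumber *)userID { /* disconnected */ }
- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID { /* no answer in time */ }
- (void)session:(QBRTCSession *)session connectionFailedForUser:(NSNumber *)userID { /* failed */ }
```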

States

Called when the QBRTCSession state changes. The session state can be new, pending, connecting, connected or closed.

Called when the session connection state changes for a specific user. The connection state can be unknown, new, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no answer, rejected, hangup or failed.

Manage remote media tracks

In order to display video views with streams received from your opponents, you should create QBRTCRemoteVideoView views on the storyboard and then use the following code:
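A sketch of the delegate callback that delivers the remote video track; `videoViewForUser:` is a hypothetical helper that returns the storyboard view for that user:

```objectivec
- (void)session:(QBRTCSession *)session
    receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack
                    fromUser:(NSNumber *)userID {
    // Look up the QBRTCRemoteVideoView created on the storyboard for this user.
    QBRTCRemoteVideoView *remoteVideoView = [self videoViewForUser:userID];
    [remoteVideoView setVideoTrack:videoTrack];
}
```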

You can also get the remote audio track for a specific user in the call using this QBRTCClientDelegate method (use it, for example, to mute a specific user's audio in the call):

You can always get both remote video and audio tracks for a specific user ID in the call using these QBRTCSession methods:

Manage local video track

In order to display your local video track from the camera, you should create a UIView on the storyboard and then use the following code:
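A sketch of setting up the camera capture and attaching it to the session's local video track (2.x API; rendering the preview into your UIView is SDK-version-specific and omitted here):

```objectivec
// Create a camera capture with the default format and front camera.
QBRTCVideoFormat *videoFormat = [QBRTCVideoFormat defaultFormat];
self.cameraCapture =
    [[QBRTCCameraCapture alloc] initWithVideoFormat:videoFormat
                                           position:AVCaptureDevicePositionFront];
// Attach the capture to the session's local video track.
self.session.localMediaStream.videoTrack.videoCapture = self.cameraCapture;
```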

Hang up

To hang up a call:
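A sketch using the 2.x session API (an optional userInfo dictionary may be passed instead of nil):

```objectivec
[self.session hangUp:nil];
```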

After this your opponents will receive a hangUp signal.

Then, if all opponents are inactive, QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:
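For example, toggling the local audio track:

```objectivec
self.session.localMediaStream.audioTrack.enabled =
    !self.session.localMediaStream.audioTrack.enabled;
```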

Please note: due to WebRTC limitations, silence will be placed into the stream content if audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:
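For example, toggling the local video track:

```objectivec
self.session.localMediaStream.videoTrack.enabled =
    !self.session.localMediaStream.videoTrack.enabled;
```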

Please note: due to WebRTC limitations, black frames will be placed into the stream content if video is disabled.

Switch camera

You can switch the video capture position during a call (default: front camera):
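A sketch of toggling the camera position; `hasCameraForPosition:` follows the 2.x QBRTCCameraCapture API, so verify it against your SDK headers:

```objectivec
AVCaptureDevicePosition current = self.videoCapture.position;
AVCaptureDevicePosition opposite = (current == AVCaptureDevicePositionFront)
    ? AVCaptureDevicePositionBack
    : AVCaptureDevicePositionFront;
// Only switch if the device actually has a camera at the opposite position.
if ([self.videoCapture hasCameraForPosition:opposite]) {
    self.videoCapture.position = opposite;
}
```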

‘videoCapture’ here is the QBRTCCameraCapture described in CallController above.

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated as of version 2.3. From now on you should use the QBRTCAudioSession class instead. Audio Session methods look almost the same as the Sound Router ones, but are more customizable and cover more requirements.

QBRTCAudioSession also has a delegate protocol with helpful methods:

QBRTCAudioSession also introduces some new properties that might be helpful:

Background mode

Use the QuickbloxWebRTC.framework in applications running in the background state.

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is significant not to skip this step.

There is also a UI for setting app background modes in Xcode 5. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a red background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are glad to introduce a new feature of the QuickbloxWebRTC SDK — Screen sharing.

It gives you the ability to promote your product, share a screen with formulas to students, broadcast podcasts, or share video/audio/photo moments of your life in real time all over the world.

To implement this feature in your application, we give you the ability to create a custom video capture.

Video capture is a base class you should inherit from in order to send frames to your opponents.

Custom video capture

The QBRTCVideoCapture class allows you to send frames to your opponents.

By inheriting this class you are able to provide custom logic to create frames, modify them, and then send them to your opponents.

Below you can find an example of how to implement a custom video capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.

For the full source code of the custom capture and additional methods, please refer to the sample-videochat-webrtc sample.

To link this capture to your local video track, simply use:
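For example, where `self.screenCapture` is an instance of your QBRTCVideoCapture subclass (the property name is hypothetical):

```objectivec
self.session.localMediaStream.videoTrack.videoCapture = self.screenCapture;
```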

Calling offline users

We made it easy to call offline users.

Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

You can find out how to configure push notifications in your application here.

Assuming you have working push notifications, it is very easy to notify users about a new call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If the application is in the background, the opponent will see a push notification.

If the application is in the foreground, nothing will happen in the UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To start collecting report information, do the following:
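Using the QBRTCConfig setter for stats reports:

```objectivec
// Interval in seconds between stat reports; 0 disables reporting.
[QBRTCConfig setStatsReportTimeInterval:1.f];
```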

Classes that adopt the QBRTCClientDelegate protocol will then be notified with:

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check whether the user is currently speaking.

You can also use an already parsed and readable string that we provide with the most significant stats for the current report; just use this method:

Recording audio and video calls

From SDK version 2.6 there is a class called QBRTCRecorder. You cannot allocate it yourself; it is stored in each instance of QBRTCSession under the property named recorder if the requirements are met. Otherwise, the recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in the low-performance category, use the UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one-to-one audio and video calls are supported for now.

Usage

Once you have created a new RTC session, you can start the recorder by accessing the recorder property of the session instance. Call the start method and pass the desired file URL:
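A sketch of starting the recorder; the `startRecordWithFileURL:` selector name and the output path are assumptions to verify against your SDK headers:

```objectivec
QBRTCRecorder *recorder = self.session.recorder;
if (recorder != nil) {
    // Record into a temporary file for illustration; choose your own location.
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"call.mp4"];
    [recorder startRecordWithFileURL:[NSURL fileURLWithPath:path]];
}
```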

You can configure the output file video settings and video orientation using these methods:

Once the call is finished, or whenever you want before that, simply call the stop method:

Note that the stop method is asynchronous and will take some time to finalize the recording file. Once the completion block is called, the recording file should be available at the expected URL unless an error occurred. To handle any recorder errors, simply subscribe to the QBRTCRecorder delegate and handle this method:

Accessing remote audio data

From SDK version 2.6 the QBRTCAudioTrack class (which represents a remote audio track for a specific user) supports an audio data sink through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using these methods:

Now implement the protocol method to access the audio data:

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription description of the audio data, the number of frames in the current packet, and the current media time that corresponds to each packet.

Settings

You can change different settings for a session.

Set answer time interval

If an opponent does not answer you within the answer time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called.

Default value: 45 seconds

Minimum value: 10 seconds

If the user does not respond within the given interval, the following delegate method will be called:

Set dialing time interval

Indicates how often we send notifications to your opponents about your call.

Default value: 5 seconds

Minimum value: 3 seconds
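Both intervals are set through QBRTCConfig, for example:

```objectivec
[QBRTCConfig setDialingTimeInterval:5];
[QBRTCConfig setAnswerTimeInterval:45];
```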

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. It establishes a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS-encrypted connection.

Set custom ICE servers

You can customize a list of ICE servers.

By default, the server in Northern Virginia (turn.quickblox.com) is used, but you can add or set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings that we use; you can customize all of them or only particular ones:
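A sketch of configuring ICE servers with the 2.x API; the credentials are placeholders you must replace with your own, and the `serverWithURL:username:password:` selector should be verified against your SDK headers:

```objectivec
QBRTCICEServer *stunServer =
    [QBRTCICEServer serverWithURL:@"stun:turn.quickblox.com"
                         username:@""
                         password:@""];
// Placeholder credentials; substitute your own TURN username/password.
QBRTCICEServer *turnServer =
    [QBRTCICEServer serverWithURL:@"turn:turn.quickblox.com:3478?transport=udp"
                         username:@"yourUsername"
                         password:@"yourPassword"];
[QBRTCConfig setICEServers:@[stunServer, turnServer]];
```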

Video codecs: VP8 vs H264

H264 is the preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.

Video quality

1. Video quality depends on the hardware you use. An iPhone 4S will not handle FullHD rendering, but an iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use our QBRTCCameraCapture formatsWithPosition: method in order to get all supported formats for the current device:

WebRTC has automatic scaling of video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If some opponent's device in the call does not support H264, then VP8 will be used automatically.

2. If both the caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice for an audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

This codec was developed specially for VoIP applications and audio streaming.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rates: 32 kHz.

Good for voice data, but not as good as OPUS.

This audio codec is well known; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC; it should be robust in such cases.

Supported bitrates: fixed bitrate, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong, reliable internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
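A sketch of selecting a codec through the media stream configuration (2.x API; verify the enum and selector names against your SDK headers):

```objectivec
QBRTCMediaStreamConfiguration *mediaConfig =
    [QBRTCMediaStreamConfiguration defaultConfiguration];
// Pick the desired codec, e.g. iSAC; OPUS and iLBC enum values also exist.
mediaConfig.audioCodec = QBRTCAudioCodecISAC;
[QBRTCConfig setMediaStreamConfiguration:mediaConfig];
```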

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed issue with a disappearing user in a room when the internet connection is slow.
    • Added ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method for this with the desired conference type enum.
    • Fixed ability to subscribe to a user in a session without being required to join the room (this introduces the ability to receive someone’s media without sending your own).
  • Fixed a potential memory leak in video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to the QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession finishes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to the QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to the QBRTCVideoCapture class. This method allows you to adapt frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to the QBRTCMediaStreamTrack class. This means that both the QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed the old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participation in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to the QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added new audioLevelControlEnabled property to the QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing the audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed methods from the QBRTCCameraCapture class that were deprecated in 2.3.
    • Removed startSession deprecated method; use startSession: instead.
    • Removed stopSession deprecated method; use stopSession: instead.
    • Removed stopSessionAndTeardownOutputs: deprecated method; use stopSession: instead.
    • Removed selectCameraPosition: deprecated method; use setPosition: instead.
    • Removed currentPosition deprecated method; use position instead.
  • Deprecated deinitializeRTC method in the QBRTCClient class. From now on QBRTCClient manages the deinitialization of WebRTC by itself after the initial initialization via the initializeRTC method. Just remove usage of this method.
  • Removed the old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in the QBRTCConnectionState enum.
  • Removed QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in the QBRTCPixelFormat enum. Those formats weren’t implemented by the SDK and were completely unsupported.
  • Removed initWithPixelBuffer: deprecated method in the QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Elementary code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Instruments
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Commenced with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following directions in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open displaying an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Trio: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to embark using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Haul the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Three: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

From version Two.Four QuickbloxWebRTC is required to be added as Embedded binary as it is dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to commence using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug, that does not permitting to publish archives to the App store with dynamic frameworks that contains simulator platforms. Script will only work for archiving.

Life cycle

Call users

To call users just use this method:

After this your opponents (users with IDs= 2123, 2123, 3122) will receive one call request per five 2nd for a duration of forty five seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to this session. Each particular audio – movie call has a unique sessionID. This permits you to have more than one independent audio-video conferences.

If you want to increase the call timeout, e.g. set to sixty seconds:

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called in case when user did not react to your call within timeout .

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set reaction time interval

Called in case when connection failed with user.

States

Called when QBRTCSession state was switched. Session’s state might be fresh, pending, connecting, connected and closed.

Called when session connection state switched for a specific user. Connection state might be unknown, fresh, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no response, rejected, hangup and failed.

Manage remote media tracks

In order to display movie views with rivulets which you have received from your opponents you should create QBRTCRemoteVideoView views on storyboard and then use the following code:

You can as well get remote audio track for a specific user in call using this QBRTCClientDelegate method (use it, for example, to mute a specific user audio in call:

You can always get both remote movie and audio tracks for a specific user ID in call using these QBRTCSession methods:

Manage local movie track

In order to showcase your local movie track from camera you should create UIView on storyboard and then use the following code:

String up up

To suspend a up call:

After this your opponent’s will receive a hangUp signal

In the next step if all opponents are inactive then QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:

Please note: due to webrtc limitations muffle will be placed into stream content if audio is disabled.

Disable / enable movie stream

You can disable / enable the movie stream during a call:

Please note: due to webrtc limitations black frames will be placed into stream content if movie is disabled.

Switch camera

You can switch the movie capture position during a call (Default: front camera):

‘videoCapture’ below is QBRTCCameraCapture described in CallController above

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated from version Two.Trio. Instead from now on you should use QBRTCAudioSession class. Audio Session methods looks almost the same as Sound Router ones, with exception of being more customizable and conform to many requirements.

QBRTCAudioSession also does have a delegate protocol with helpful methods:

Also QBRTCAudioSession introducing some fresh properties, that might be also helpful in any case:

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is significant not to skip this step.

There is also a UI for setting app background modes in XCode Five. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a crimson background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are glad to introduce you a fresh feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you an capability to promote your product, share a screen with formulas to students, distribute podcasts, share movie/audio/photo moments of your life in real-time all over the world.

To implement this feature in your application we give you the capability to create custom-built movie capture.

Movie capture is a base class you should inherit from in order to send frames you your opponents.

Custom-built movie capture

QBRTCVideoCapture class permits to send frames to your opponents.

By inheriting this class you are able to provide custom-made logic to create frames, modify them and then send to your opponents.

Below you can find an example of how to implement a custom-built movie capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that permits your application to synchronize its drawing to the refresh rate of the display.

For utter source code of custom-made capture and extra methods please refer to sample-videochat-webrtc sample

To link this capture to your local movie track simply use:

Calling offline users

We made it effortless to call offline users.

Quickblox iOS SDK provides methods to notify an application about fresh events even if application is closed.

How to configure Push-notifications in your application you can find here

Assuming you have working shove notifications it is very effortless to notify users about fresh call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If application is in background, opponent will see a thrust notification.

If application is in foreground, nothing will happen in UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To commence collecting report information do the following:

And classes that adopt QBRTCClientDelegate protocol will be notified with

For example, audioSendInputLevel property indicates mic input level even while audio track disabled, so you can check if user is presently speaking/talking.

You can also use already parsed and readable string that we are providing with most significant stats for current report, just use this method:

Recording audio and movie calls

From SDK version Two.6 there is a class, called QBRTCRecorder. You cannot allocate it by yourself, but it is stored in each example of QBRTCSession by the property named recorder if the requirements conform. Otherwise, recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in low spectacle category use UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one to one audio and movie calls are supported for now.

Usage

Once you have created fresh rtc session, you can embark recorder by accessing recorder property in session example. Call embark method and input desired file url:

You can configure output file movie settings and movie orientation using these methods:

Once the call is finished or whenever you want before that you need to simply call stop method:

Note that stop method is asynchronous and will take some time to finalize record file. Once the completion block is called, recording file should be ready by expected url unless some error happens. In order to treat any recorder errors, simply subscribe to delegate of QBRTCRecorder and treat this method:

Accessing remote audio data

From SDK version 2.6 the QBRTCAudioTrack class (which represents the remote audio track for a specific user) supports an audio data sink through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using these methods:

Now handle the protocol method to access the audio data:

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription describing the audio data, the number of frames in the current packet, and the current media time that corresponds to each packet.
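Putting the pieces together, a sketch of subscribing to a remote track and handling the sink callback (the callback signature is reconstructed from the description above; verify it against the QBRTCAudioTrack header):

```objectivec
// Subscribe, e.g. when the remote audio track is received:
//   [audioTrack addSink:self];
// Unsubscribe when done:
//   [audioTrack removeSink:self];

// QBRTCAudioTrackSinkInterface callback delivering raw audio data in real time.
- (void)audioTrack:(QBRTCAudioTrack *)audioTrack
    didSinkAudioBufferList:(const AudioBufferList *)audioBufferList
    audioBufferDescription:(const AudioStreamBasicDescription)audioBufferDescription
            numberOfFrames:(size_t)numberOfFrames
                      time:(CMTime)time {
    // Process the PCM data here, e.g. level metering or custom recording.
}
```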

Settings

You can change various settings for a session.

Set answer time interval

If an opponent does not answer you within the answer time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called.

Default value: 45 seconds

Minimum value: 10 seconds

If the user does not answer within the given interval, the following delegate method will be called:
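Both the configuration call and the delegate method are sketched below (setAnswerTimeInterval: is referenced elsewhere in these docs; the delegate selector should be verified against QBRTCClientDelegate):

```objectivec
// In your app setup: raise the answer timeout to 60 seconds (default 45, minimum 10).
[QBRTCConfig setAnswerTimeInterval:60];

// In your QBRTCClientDelegate: called when a user does not answer in time.
- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID {
    NSLog(@"User %@ did not respond", userID);
}
```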

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: 5 seconds

Minimum value: 3 seconds
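For example:

```objectivec
// Re-send the call notification to opponents every 5 seconds (minimum 3).
[QBRTCConfig setDialingTimeInterval:5];
```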

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This establishes a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS-encrypted connection.
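DTLS can be toggled through QBRTCConfig; since it is on by default, this call is only needed if it was disabled earlier:

```objectivec
[QBRTCConfig setDTLSEnabled:YES];
```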

Set custom ICE servers

You can customize a list of ICE servers.

By default, the server in Northern Virginia (turn.quickblox.com) is used, but you can add or set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if numerous options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting numerous TURN servers permits your application to scale-up in terms of bandwidth and number of users.

Here is a list of the default settings that we use; you can customize all of them or only some particular ones:
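A sketch of supplying a custom server list (the QBRTCICEServer factory selector is an assumption based on older QuickBlox headers, and the credentials are placeholders; substitute your own TURN username and password):

```objectivec
QBRTCICEServer *stunServer =
    [QBRTCICEServer serverWithURL:@"stun:turn.quickblox.com"
                         username:@""
                         password:@""];
QBRTCICEServer *turnServer =
    [QBRTCICEServer serverWithURL:@"turn:turn.quickblox.com:3478?transport=udp"
                         username:@"user"
                         password:@"password"];
[QBRTCConfig setICEServers:@[stunServer, turnServer]];
```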

Video codecs: VP8 vs H264

H264 is the preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.

Video quality

1. Video quality depends on the hardware you use. An iPhone 4S will not handle Full HD rendering, but an iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use our QBRTCCameraCapture formatsWithPosition: method to get all supported formats for the current device:
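For example (assuming the class method is formatsWithPosition: and the capture initializer takes a format and a position; check your QBRTCCameraCapture header):

```objectivec
// Pick a supported capture format for the front camera.
NSArray<QBRTCVideoFormat *> *formats =
    [QBRTCCameraCapture formatsWithPosition:AVCaptureDevicePositionFront];
QBRTCVideoFormat *format = formats.lastObject; // e.g. the richest supported format
self.videoCapture =
    [[QBRTCCameraCapture alloc] initWithVideoFormat:format
                                           position:AVCaptureDevicePositionFront];
```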

WebRTC automatically scales video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If any opponent's device in the call does not support H264, then VP8 will be used automatically.

2. If both the caller and the callee have H264 support, then H264 will be used.
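A sketch of preferring H264 through the media stream configuration (enum and property names follow the QuickBlox docs; verify them against your SDK version):

```objectivec
QBRTCMediaStreamConfiguration *mediaConfig =
    [QBRTCMediaStreamConfiguration defaultConfiguration];
mediaConfig.videoCodec = QBRTCVideoCodecH264; // falls back to VP8 when unsupported
[QBRTCConfig setMediaStreamConfiguration:mediaConfig];
```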

Audio codecs: OPUS vs iSAC vs iLBC

OPUS

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice of audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

iSAC

This codec was developed specially for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.

iLBC

This audio codec is well known; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC; it should be robust in such cases.

Supported bitrate: fixed, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong, reliable internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
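A sketch of selecting the audio codec (the enum values are assumptions based on the codec names above; check QBRTCMediaStreamConfiguration for the exact spelling):

```objectivec
QBRTCMediaStreamConfiguration *mediaConfig =
    [QBRTCMediaStreamConfiguration defaultConfiguration];
mediaConfig.audioCodec = QBRTCAudioCodecOpus; // or QBRTCAudioCodecISAC / QBRTCAudioCodeciLBC
[QBRTCConfig setMediaStreamConfiguration:mediaConfig];
```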

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed issue with disappearing user in a room when the internet connection is slow.
    • Added ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method for this with the desired conference type enum.
    • Fixed ability to subscribe to a user in a session without being required to join the room (this introduces the ability to receive someone's media without sending your own).
  • Fixed a potential memory leak in video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added QBRTCRecorder class. This class represents the WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession ends an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method allows you to adapt frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participating in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added new audioLevelControlEnabled property in QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed old methods deprecated in 2.3 from QBRTCCameraCapture class.
    • Removed startSession deprecated method; use startSession: instead.
    • Removed stopSession deprecated method; use stopSession: instead.
    • Removed stopSessionAndTeardownOutputs: deprecated method; use stopSession: instead.
    • Removed selectCameraPosition: deprecated method; use setPosition: instead.
    • Removed currentPosition deprecated method; use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on QBRTCClient manages deinitialization of WebRTC itself after the initial initialization by the initializeRTC method. Just remove usage of this method.
  • Removed old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in QBRTCConnectionState enum.
  • Removed QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in QBRTCPixelFormat enum. Those formats weren't implemented by the SDK and were entirely unsupported.
  • Removed initWithPixelBuffer: deprecated method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Plain code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Instruments
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Commenced with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following directions in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open demonstrating an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Three: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to begin using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Haul the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Three: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step 4: Embedded binary for Dynamic framework

From version 2.4, QuickbloxWebRTC must be added as an embedded binary because it is a dynamic framework.

Step 5: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in the <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug, that does not permitting to publish archives to the App store with dynamic frameworks that contains simulator platforms. Script will only work for archiving.

Life cycle

Call users

To call users just use this method:
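The code snippet did not survive extraction here; a minimal sketch of creating a session and starting a call, based on the QuickbloxWebRTC 2.x API used in the public QuickBlox samples (verify exact names against your SDK headers):

```objectivec
#import <QuickbloxWebRTC/QuickbloxWebRTC.h>

// One-time setup: initialize WebRTC and subscribe to delegate callbacks.
[QBRTCClient initializeRTC];
[[QBRTCClient instance] addDelegate:self]; // self conforms to QBRTCClientDelegate

// Create a session with the opponents' user IDs and start a video call.
QBRTCSession *session =
    [[QBRTCClient instance] createNewSessionWithOpponents:@[@2123, @2123, @3122]
                                       withConferenceType:QBRTCConferenceTypeVideo];
self.session = session;

NSDictionary *userInfo = @{@"key" : @"value"}; // optional custom data
[self.session startCall:userInfo];
```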

After this, your opponents (users with IDs 2123, 2123 and 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings with QBRTCConfig):

self.session refers to the current session. Each audio/video call has a unique sessionID. This allows you to have more than one independent audio/video conference.

If you want to increase the call timeout, e.g. set it to 60 seconds:
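For example, using the +[QBRTCConfig setAnswerTimeInterval:] setter referenced later in this document:

```objectivec
// Raise the answer timeout from the default 45 seconds to 60.
[QBRTCConfig setAnswerTimeInterval:60];
```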

Accept a call

To accept a call request just use this method:
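A sketch of accepting the session delivered in didReceiveNewSession:userInfo: (method name taken from the QuickBlox samples):

```objectivec
// Accept the incoming call, optionally passing custom data to the caller.
NSDictionary *userInfo = @{@"key" : @"value"};
[self.session acceptCall:userInfo];
```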

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:
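A sketch, assuming the same session object as in the accept example:

```objectivec
// Reject the incoming call, optionally explaining why.
NSDictionary *userInfo = @{@"reason" : @"busy"};
[self.session rejectCall:userInfo];
```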

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called in case when a user did not respond to your call within the timeout.

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval

Called in case when connection failed with user.
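The individual delegate snippets are missing from this page; the life-cycle callbacks described above can be sketched together as one QBRTCClientDelegate implementation (signatures may differ between SDK versions):

```objectivec
// QBRTCClientDelegate connection life-cycle callbacks (sketch).
- (void)session:(QBRTCSession *)session startedConnectingToUser:(NSNumber *)userID {
    NSLog(@"Connection initiated with user %@", userID);
}

- (void)session:(QBRTCSession *)session connectedToUser:(NSNumber *)userID {
    NSLog(@"Connection established with user %@", userID);
}

- (void)session:(QBRTCSession *)session disconnectedFromUser:(NSNumber *)userID {
    NSLog(@"User %@ disconnected", userID);
}

- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID {
    NSLog(@"User %@ did not respond within the answer time interval", userID);
}

- (void)session:(QBRTCSession *)session connectionFailedForUser:(NSNumber *)userID {
    NSLog(@"Connection failed with user %@", userID);
}

- (void)session:(QBRTCSession *)session connectionClosedForUser:(NSNumber *)userID {
    NSLog(@"Connection closed for user %@", userID);
}
```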

States

Called when the QBRTCSession state changes. The session’s state can be new, pending, connecting, connected or closed.

Called when the session connection state changes for a specific user. The connection state can be unknown, new, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no answer, rejected, hangup or failed.

Manage remote media tracks

In order to display video views with streams which you have received from your opponents, you should create QBRTCRemoteVideoView views on the storyboard and then use the following code:
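A sketch of rendering a received remote track; the opponentVideoView outlet of type QBRTCRemoteVideoView is an assumption:

```objectivec
// Called by the SDK when a remote video track arrives; attach it to the view.
- (void)session:(QBRTCSession *)session
    receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack
                    fromUser:(NSNumber *)userID {
    [self.opponentVideoView setVideoTrack:videoTrack];
}
```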

You can also get the remote audio track for a specific user in the call using this QBRTCClientDelegate method (use it, for example, to mute a specific user's audio in the call):

You can always get both remote video and audio tracks for a specific user ID in the call using these QBRTCSession methods:

Manage local video track

In order to display your local video track from the camera, you should create a UIView on the storyboard and then use the following code:
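A sketch following the QuickBlox sample code; the localVideoView outlet is an assumption, and the QBRTCCameraCapture calls should be checked against your SDK version:

```objectivec
// Create a camera capture with the default format and the front camera.
QBRTCVideoFormat *videoFormat = [QBRTCVideoFormat defaultFormat];
self.cameraCapture =
    [[QBRTCCameraCapture alloc] initWithVideoFormat:videoFormat
                                           position:AVCaptureDevicePositionFront];
[self.cameraCapture startSession:nil];

// Feed the capture into the session's local video track.
self.session.localMediaStream.videoTrack.videoCapture = self.cameraCapture;

// Show the camera preview inside the storyboard UIView.
self.cameraCapture.previewLayer.frame = self.localVideoView.bounds;
[self.localVideoView.layer insertSublayer:self.cameraCapture.previewLayer atIndex:0];
```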

Hang up

To hang up a call:
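A sketch, using the hangUp method implied by the hangUp signal described below:

```objectivec
// End the call, optionally passing custom data to the opponents.
NSDictionary *userInfo = @{@"reason" : @"finished"};
[self.session hangUp:userInfo];
```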

After this your opponents will receive a hangUp signal

Then, if all opponents are inactive, QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:
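A sketch via the local media stream's audio track (property names from the QuickBlox samples):

```objectivec
// Mute / unmute the local audio stream mid-call.
self.session.localMediaStream.audioTrack.enabled = NO;  // mute
self.session.localMediaStream.audioTrack.enabled = YES; // unmute
```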

Please note: due to WebRTC limitations, silence will be placed into the stream content if audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:
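The same toggle applies to the video track (a sketch, mirroring the audio case):

```objectivec
// Stop / resume sending the local video stream mid-call.
self.session.localMediaStream.videoTrack.enabled = NO;  // disable
self.session.localMediaStream.videoTrack.enabled = YES; // enable
```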

Please note: due to WebRTC limitations, black frames will be placed into the stream content if video is disabled.

Switch camera

You can switch the video capture position during a call (default: front camera):

‘videoCapture’ below is the QBRTCCameraCapture described in CallController above
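A sketch of the toggle; hasCameraForPosition: is assumed from the QuickBlox sample code:

```objectivec
// Toggle the camera on the existing capture between front and back.
AVCaptureDevicePosition position = self.videoCapture.position;
AVCaptureDevicePosition newPosition =
    (position == AVCaptureDevicePositionBack) ? AVCaptureDevicePositionFront
                                              : AVCaptureDevicePositionBack;
if ([self.videoCapture hasCameraForPosition:newPosition]) {
    self.videoCapture.position = newPosition;
}
```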

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated since version 2.3. From now on you should use the QBRTCAudioSession class instead. Audio Session methods look almost the same as the Sound Router ones, except for being more customizable and conforming to more requirements.

QBRTCAudioSession also has a delegate protocol with helpful methods:

QBRTCAudioSession also introduces some new properties that might be helpful:
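A sketch of two of those delegate callbacks, with names as listed in the v2.6 changelog in this document (exact signatures may differ in your SDK version):

```objectivec
// QBRTCAudioSessionDelegate callbacks (sketch).
- (void)audioSessionDidStartPlayOrRecord:(QBRTCAudioSession *)audioSession {
    // The audio device was notified to begin playback or recording.
}

- (void)audioSessionDidStopPlayOrRecord:(QBRTCAudioSession *)audioSession {
    // The audio device was notified to stop playback or recording.
}
```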

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is important not to skip this step.

There is also a UI for setting app background modes in Xcode 5. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a red background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are happy to introduce a new feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you the ability to promote your product, share a screen with formulas to students, broadcast podcasts, or share video/audio/photo moments of your life in real time all over the world.

To implement this feature in your application, we give you the ability to create a custom video capture.

Video capture is a base class you should inherit from in order to send frames to your opponents.

Custom video capture

The QBRTCVideoCapture class allows you to send frames to your opponents.

By inheriting this class you are able to provide custom logic to create frames, modify them, and then send them to your opponents.

Below you can find an example of how to implement a custom video capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.
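The example itself is missing from this page; a sketch of a QBRTCVideoCapture subclass driven by a CADisplayLink, modeled on the public sample-videochat-webrtc code. The override names and the currentScreenPixelBuffer helper are assumptions to check against your SDK:

```objectivec
// Sketch: a capture that pushes one frame per screen refresh.
@interface ScreenCapture : QBRTCVideoCapture
@property (strong, nonatomic) CADisplayLink *displayLink;
@end

@implementation ScreenCapture

// Called when the capture is attached to a video track: start producing frames.
- (void)didSetToVideoTrack:(QBRTCLocalVideoTrack *)videoTrack {
    [super didSetToVideoTrack:videoTrack];
    self.displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(sendFrame)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                           forMode:NSRunLoopCommonModes];
}

// Called when the capture is detached: stop the timer.
- (void)didRemoveFromVideoTrack:(QBRTCLocalVideoTrack *)videoTrack {
    [super didRemoveFromVideoTrack:videoTrack];
    [self.displayLink invalidate];
    self.displayLink = nil;
}

- (void)sendFrame {
    // currentScreenPixelBuffer is a hypothetical helper: render your screen
    // content into a CVPixelBufferRef here.
    CVPixelBufferRef pixelBuffer = [self currentScreenPixelBuffer];
    QBRTCVideoFrame *frame =
        [[QBRTCVideoFrame alloc] initWithPixelBuffer:pixelBuffer
                                       videoRotation:QBRTCVideoRotation_0];
    [super sendVideoFrame:frame];
}

@end
```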

For the full source code of the custom capture and extra methods, please refer to the sample-videochat-webrtc sample

To link this capture to your local video track simply use:
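A one-line sketch, assuming screenCapture is an instance of your custom capture subclass:

```objectivec
// Replace the camera capture with the custom capture on the local video track.
self.session.localMediaStream.videoTrack.videoCapture = self.screenCapture;
```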

Calling offline users

We made it easy to call offline users.

The Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

You can find out how to configure push notifications in your application here

Assuming you have working push notifications, it is very easy to notify users about a new call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If the application is in the background, the opponent will see a push notification.

If the application is in the foreground, nothing will happen in the UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To start collecting report information, do the following:
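A sketch using the QBRTCConfig setter (the interval value is an example):

```objectivec
// Deliver a stats report once per second; 0 disables collecting.
[QBRTCConfig setStatsReportTimeInterval:1];
```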

Classes that adopt the QBRTCClientDelegate protocol will then be notified with
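A sketch of that callback (name assumed from the QuickBlox documentation):

```objectivec
// QBRTCClientDelegate stats callback: fires once per configured interval.
- (void)session:(QBRTCSession *)session
    updatedStatsReport:(QBRTCStatsReport *)report
             forUserID:(NSNumber *)userID {
    NSLog(@"Stats for user %@: %@", userID, report);
}
```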

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check whether the user is currently speaking.

You can also use an already parsed and readable string that we provide with the most significant stats for the current report; just use this method:

Recording audio and movie calls

From SDK version 2.6 there is a class called QBRTCRecorder. You cannot allocate it yourself; it is stored in each instance of QBRTCSession in the property named recorder if the requirements are met. Otherwise, the recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in the low-performance category, use the UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one-to-one audio and video calls are supported for now.

Usage

Once you have created a new RTC session, you can start the recorder by accessing the recorder property on the session instance. Call the start method and pass in the desired file URL:
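A sketch; the startRecordWithFileURL: selector is an assumption based on the description above:

```objectivec
// Start recording to a temporary file; recorder is nil if unsupported.
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"call.mp4"];
NSURL *recordURL = [NSURL fileURLWithPath:path];
[self.session.recorder startRecordWithFileURL:recordURL];
```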

You can configure the output file video settings and video orientation using these methods:

Once the call is finished, or at any point before that, simply call the stop method:
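A sketch; the stopRecord: completion-block form is an assumption consistent with the asynchronous behavior described below:

```objectivec
// Stop recording; finalization happens asynchronously.
[self.session.recorder stopRecord:^(NSURL *file) {
    NSLog(@"Recording finalized at %@", file);
}];
```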

Note that the stop method is asynchronous and will take some time to finalize the recorded file. Once the completion block is called, the recording file should be available at the expected URL unless an error occurred. To handle any recorder errors, simply subscribe to the QBRTCRecorder delegate and handle this method:
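A sketch of the error callback; the selector name is an assumption:

```objectivec
// QBRTCRecorder delegate error callback.
- (void)recorder:(QBRTCRecorder *)recorder didFailWithError:(NSError *)error {
    NSLog(@"Recorder failed: %@", error.localizedDescription);
}
```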

Accessing remote audio data

From SDK version 2.6, the QBRTCAudioTrack class (which represents a remote audio track for a specific user) supports an audio data sink through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using these methods:
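A sketch of subscribing and unsubscribing; addSink:/removeSink: names follow the protocol description:

```objectivec
// Subscribe self (conforming to QBRTCAudioTrackSinkInterface) to a remote track.
QBRTCAudioTrack *audioTrack = [self.session remoteAudioTrackWithUserID:@2123];
[audioTrack addSink:self];

// Unsubscribe when you no longer need raw audio data.
[audioTrack removeSink:self];
```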

Now handle the protocol method to access the audio data:

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription describing the audio data, the number of frames in the current packet, and the current media time that corresponds to each packet.

Settings

You can change various settings for a session

Set answer time interval

If an opponent does not answer you within the answer time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: 45 seconds

Minimum value: 10 seconds

If the user does not answer within the given interval, the following delegate method will be called

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: 5 seconds

Minimum value: 3 seconds
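A sketch using the QBRTCConfig setter (the value shown is the default):

```objectivec
// Send a call request to opponents every 5 seconds while dialing.
[QBRTCConfig setDialingTimeInterval:5];
```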

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This ensures a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS-encrypted connection.

Set custom ICE servers

You can customize a list of ICE servers.

By default, the server in North Virginia (turn.quickblox.com) is used, but you can add/set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.
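A sketch of overriding the ICE server list; the QBRTCICEServer factory signature and the credentials are assumptions to replace with your own:

```objectivec
// Replace the default ICE server list; use your own TURN credentials.
QBRTCICEServer *stun = [QBRTCICEServer serverWithURL:@"stun:turn.quickblox.com"
                                            username:@""
                                            password:@""];
QBRTCICEServer *turn = [QBRTCICEServer serverWithURL:@"turn:turn.quickblox.com"
                                            username:@"user"
                                            password:@"secret"];
[QBRTCConfig setICEServers:@[stun, turn]];
```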

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings we use; you can customize all of them or only particular ones:

Video codecs: VP8 vs H264

H264 is the most preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.

Video quality

1. Video quality depends on the hardware you use. An iPhone 4s will not handle FullHD rendering, but an iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use our QBRTCCameraCapture formatsWithPosition: method in order to get all supported formats for the current device:
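A sketch; the exact return type of formatsWithPosition: should be checked against your SDK headers:

```objectivec
// Enumerate supported video formats for the front camera and pick one.
NSArray *formats =
    [QBRTCCameraCapture formatsWithPosition:AVCaptureDevicePositionFront];
QBRTCVideoFormat *format = formats.firstObject;
```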

WebRTC automatically scales video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If some opponent's device in the call does not support H264, then VP8 will be used automatically.

2. If both the caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

OPUS

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice of audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

iSAC

This codec was developed specially for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.

iLBC

This audio codec is well-known; it was released in 2004 and became part of the WebRTC project in 2011, when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC — it should be robust in such cases.

Supported bitrates: fixed, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong, reliable internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
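The snippet is missing here; a sketch via QBRTCMediaStreamConfiguration, whose audioCodec property and enum values are assumptions to check against your SDK headers:

```objectivec
// Prefer a specific audio codec via the media stream configuration.
QBRTCMediaStreamConfiguration *conf =
    [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.audioCodec = QBRTCAudioCodecISAC; // or QBRTCAudioCodecOpus / QBRTCAudioCodeciLBC
[QBRTCConfig setMediaStreamConfiguration:conf];
```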

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed an issue with a disappearing user in a room when the internet connection is slow.
    • Added the ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method for this with the desired conference type enum.
    • Fixed the ability to subscribe to a user in a session without being required to join the room (this introduces the ability to receive someone’s media without sending your own).
  • Fixed a potential memory leak in video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added the QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to the QBRTCAudioSession class.
    • Added the audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added the audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added the audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added the audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession completes an interruption event.
  • Added the QBRTCAudioTrackSinkInterface protocol to the QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added the adaptOutputFormatToWidth:height:fps: method to the QBRTCVideoCapture class. This method allows you to adapt the frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added the userIDNSNumber property to the QBRTCMediaStreamTrack class. This means that both the QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed the old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participation in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added a volume property to the QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added a new audioLevelControlEnabled property in the QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing the audio volume for all tracks while you are talking (the local audio track is receiving sound). The default value is NO.
  • Removed old methods deprecated in 2.3 from the QBRTCCameraCapture class.
    • Removed the startSession deprecated method; use startSession: instead.
    • Removed the stopSession deprecated method; use stopSession: instead.
    • Removed the stopSessionAndTeardownOutputs: deprecated method; use stopSession: instead.
    • Removed the selectCameraPosition: deprecated method; use setPosition: instead.
    • Removed the currentPosition deprecated method; use position instead.
  • Deprecated the deinitializeRTC method in the QBRTCClient class. From now on, QBRTCClient manages deinitialization of WebRTC itself after the initial initialization by the initializeRTC method. Just remove usage of this method.
  • Removed the old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in the QBRTCConnectionState enum.
  • Removed the QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in the QBRTCPixelFormat enum. These formats were not implemented by the SDK and were completely unsupported.
  • Removed the initWithPixelBuffer: deprecated method in the QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Elementary code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Implements
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Commenced with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following instructions in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open displaying an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Trio: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to commence using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Haul the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Three: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

From version Two.Four QuickbloxWebRTC is required to be added as Embedded binary as it is dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to embark using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug, that does not permitting to publish archives to the App store with dynamic frameworks that contains simulator platforms. Script will only work for archiving.

Life cycle

Call users

To call users just use this method:

After this your opponents (users with IDs= 2123, 2123, 3122) will receive one call request per five 2nd for a duration of forty five seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to this session. Each particular audio – movie call has a unique sessionID. This permits you to have more than one independent audio-video conferences.

If you want to increase the call timeout, e.g. set to sixty seconds:

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called in case when user did not react to your call within timeout .

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set reaction time interval

Called in case when connection failed with user.

States

Called when QBRTCSession state was switched. Session’s state might be fresh, pending, connecting, connected and closed.

Called when session connection state switched for a specific user. Connection state might be unknown, fresh, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no reaction, rejected, hangup and failed.

Manage remote media tracks

In order to display movie views with rivulets which you have received from your opponents you should create QBRTCRemoteVideoView views on storyboard and then use the following code:

You can as well get remote audio track for a specific user in call using this QBRTCClientDelegate method (use it, for example, to mute a specific user audio in call:

You can always get both remote movie and audio tracks for a specific user ID in call using these QBRTCSession methods:

Manage local movie track

In order to display your local movie track from camera you should create UIView on storyboard and then use the following code:

Dangle up

To drape a up call:

After this your opponent’s will receive a hangUp signal

In the next step if all opponents are inactive then QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:

Please note: due to webrtc limitations muffle will be placed into stream content if audio is disabled.

Disable / enable movie stream

You can disable / enable the movie stream during a call:

Please note: due to webrtc limitations black frames will be placed into stream content if movie is disabled.

Switch camera

You can switch the movie capture position during a call (Default: front camera):

‘videoCapture’ below is QBRTCCameraCapture described in CallController above

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated from version Two.Trio. Instead from now on you should use QBRTCAudioSession class. Audio Session methods looks almost the same as Sound Router ones, with exception of being more customizable and conform to many requirements.

QBRTCAudioSession also does have a delegate protocol with helpful methods:

Also QBRTCAudioSession introducing some fresh properties, that might be also helpful in any case:

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is significant not to skip this step.

There is also a UI for setting app background modes in XCode Five. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a crimson background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are blessed to introduce you a fresh feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you an capability to promote your product, share a screen with formulas to students, distribute podcasts, share movie/audio/photo moments of your life in real-time all over the world.

To implement this feature in your application we give you the capability to create custom-made movie capture.

Movie capture is a base class you should inherit from in order to send frames you your opponents.

Custom-made movie capture

QBRTCVideoCapture class permits to send frames to your opponents.

By inheriting this class you are able to provide custom-made logic to create frames, modify them and then send to your opponents.

Below you can find an example of how to implement a custom-made movie capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that permits your application to synchronize its drawing to the refresh rate of the display.

For total source code of custom-built capture and extra methods please refer to sample-videochat-webrtc sample

To link this capture to your local movie track simply use:

Calling offline users

We made it effortless to call offline users.

Quickblox iOS SDK provides methods to notify an application about fresh events even if application is closed.

How to configure Push-notifications in your application you can find here

Assuming you have working thrust notifications it is very effortless to notify users about fresh call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If application is in background, opponent will see a shove notification.

If application is in foreground, nothing will happen in UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To embark collecting report information do the following:

And classes that adopt QBRTCClientDelegate protocol will be notified with

For example, audioSendInputLevel property indicates mic input level even while audio track disabled, so you can check if user is presently speaking/talking.

You can also use already parsed and readable string that we are providing with most significant stats for current report, just use this method:

Recording audio and movie calls

From SDK version Two.6 there is a class, called QBRTCRecorder. You cannot allocate it by yourself, but it is stored in each example of QBRTCSession by the property named recorder if the requirements conform. Otherwise, recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in low spectacle category use UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one to one audio and movie calls are supported for now.

Usage

Once you have created fresh rtc session, you can commence recorder by accessing recorder property in session example. Call begin method and input desired file url:

You can configure output file movie settings and movie orientation using these methods:

Once the call is finished or whenever you want before that you need to simply call stop method:

Note that stop method is asynchronous and will take some time to finalize record file. Once the completion block is called, recording file should be ready by expected url unless some error happens. In order to treat any recorder errors, simply subscribe to delegate of QBRTCRecorder and treat this method:

Accessing remote audio data

From SDK version Two.6 QBRTCAudioTrack class (that represents remote audio track for a specific user) supports audio data submerge through freshly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to submerge interface using methods:

Now treat protocol method to access audio data:

This interface provides AudioBufferList with audio data, AudioStreamBasicDescription description of audio data, a number of frames in current packet, and current media time that conforms to each packet.

Settings

You can switch different settings for a session

Set reaction time interval

If an opponent did not reaction you within dialing time interval, then userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: forty five seconds

Minimum value: ten seconds

If user did not react within the given interval, then a following delegate method will be called

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: five seconds

Minimum value: three seconds

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.

Set custom-built ICE servers

You can customize a list of ICE servers.

By default, the server in North Virginia turn.quickblox.com is used, but you can add/setup more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if numerous options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting numerous TURN servers permits your application to scale-up in terms of bandwidth and number of users.

Here is a list with default settings that we use, you can customize all of them or only some particular:

Movie codecs: VP8 vs H264

H264 is the most preferred movie codec for iOS.

Chrome added support for H264 movie codec in fifty revision.

H264 is the only one movie codec for iOS that has hardware support.

Movie quality

1. Movie quality depends on hardware you use. iPhone 4s will not treat FullHD rendering. But iPhone 6+ will.

Two. Movie quality depends on network you use and how many connections you have.

For multi-calls set lower movie quality. For peer-to-peer calls you can set higher quality.

You can use our QBRTCCameraCapture formats with position method in order to get all supported formats for current device:

WebRTC automatically scales video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If any opponent's device in the call does not support H264, VP8 will be used automatically.

2. If both caller and callee have H264 support, then H264 will be used.
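The preferred video codec can be selected through the media stream configuration; a sketch, assuming the videoCodec property and the QBRTCVideoCodecH264 enum value:

```objectivec
// Prefer H264 (hardware-accelerated on iOS); the SDK falls back to VP8
// when an opponent's device lacks H264 support.
QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.videoCodec = QBRTCVideoCodecH264;
[QBRTCConfig setMediaStreamConfiguration:conf];
```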

Audio codecs: OPUS vs iSAC vs iLBC

OPUS

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice of audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

iSAC

This codec was developed specifically for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.

iLBC

This audio codec is well-known; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC; it should be robust in such cases.

Supported bitrate: fixed, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong, reliable internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
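A sketch of selecting the audio codec, assuming the audioCodec property on QBRTCMediaStreamConfiguration:

```objectivec
// Choose OPUS for high-quality audio; switch to QBRTCAudioCodecISAC or
// QBRTCAudioCodeciLBC for constrained networks (see the guidance above).
QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.audioCodec = QBRTCAudioCodecOpus;
[QBRTCConfig setMediaStreamConfiguration:conf];
```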

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed an issue with a disappearing user in a room when the internet connection is slow.
    • Added the ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method with the desired conference type enum.
    • Fixed the ability to subscribe to a user in a session without being required to join the room (this introduces the ability to receive someone's media without sending your own).
  • Fixed a potential memory leak in video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession completes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method allows you to adapt frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed the old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participation in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added new audioLevelControlEnabled property in QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed methods from QBRTCCameraCapture class that were deprecated in 2.3.
    • Removed startSession deprecated method; use startSession: instead.
    • Removed stopSession deprecated method; use stopSession: instead.
    • Removed stopSessionAndTeardownOutputs: deprecated method; use stopSession: instead.
    • Removed selectCameraPosition: deprecated method; use setPosition: instead.
    • Removed currentPosition deprecated method; use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on, QBRTCClient manages WebRTC deinitialization itself after the initial initializeRTC call. Just remove usage of this method.
  • Removed the old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated values from the QBRTCConnectionState enum.
  • Removed the deprecated QBRTCPixelFormat420v and QBRTCPixelFormatBGRA values from the QBRTCPixelFormat enum. Those formats weren't implemented by the SDK and were entirely unsupported.
  • Removed the deprecated initWithPixelBuffer: method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Elementary code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Devices
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Began with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following guidelines in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open demonstrating an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Three: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to commence using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Haul the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Trio: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

From version Two.Four QuickbloxWebRTC is required to be added as Embedded binary as it is dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to begin using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug, that does not permitting to publish archives to the App store with dynamic frameworks that contains simulator platforms. Script will only work for archiving.

Life cycle

Call users

To call users just use this method:

After this your opponents (users with IDs= 2123, 2123, 3122) will receive one call request per five 2nd for a duration of forty five seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to this session. Each particular audio – movie call has a unique sessionID. This permits you to have more than one independent audio-video conferences.

If you want to increase the call timeout, e.g. set to sixty seconds:

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called in case when user did not react to your call within timeout .

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set reaction time interval

Called in case when connection failed with user.

States

Called when QBRTCSession state was switched. Session’s state might be fresh, pending, connecting, connected and closed.

Called when session connection state switched for a specific user. Connection state might be unknown, fresh, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no reaction, rejected, hangup and failed.

Manage remote media tracks

In order to display movie views with flows which you have received from your opponents you should create QBRTCRemoteVideoView views on storyboard and then use the following code:

You can as well get remote audio track for a specific user in call using this QBRTCClientDelegate method (use it, for example, to mute a specific user audio in call:

You can always get both remote movie and audio tracks for a specific user ID in call using these QBRTCSession methods:

Manage local movie track

In order to showcase your local movie track from camera you should create UIView on storyboard and then use the following code:

Drape up

To suspend a up call:

After this your opponent’s will receive a hangUp signal

In the next step if all opponents are inactive then QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:

Please note: due to webrtc limitations muffle will be placed into stream content if audio is disabled.

Disable / enable movie stream

You can disable / enable the movie stream during a call:

Please note: due to webrtc confinements black frames will be placed into stream content if movie is disabled.

Switch camera

You can switch the movie capture position during a call (Default: front camera):

‘videoCapture’ below is QBRTCCameraCapture described in CallController above

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated from version Two.Three. Instead from now on you should use QBRTCAudioSession class. Audio Session methods looks almost the same as Sound Router ones, with exception of being more customizable and conform to many requirements.

QBRTCAudioSession also does have a delegate protocol with helpful methods:

Also QBRTCAudioSession introducing some fresh properties, that might be also helpful in any case:

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is significant not to skip this step.

There is also a UI for setting app background modes in XCode Five. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a crimson background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are glad to introduce you a fresh feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you an capability to promote your product, share a screen with formulas to students, distribute podcasts, share movie/audio/photo moments of your life in real-time all over the world.

To implement this feature in your application we give you the capability to create custom-built movie capture.

Movie capture is a base class you should inherit from in order to send frames you your opponents.

Custom-built movie capture

QBRTCVideoCapture class permits to send frames to your opponents.

By inheriting this class you are able to provide custom-built logic to create frames, modify them and then send to your opponents.

Below you can find an example of how to implement a custom-made movie capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that permits your application to synchronize its drawing to the refresh rate of the display.

For utter source code of custom-made capture and extra methods please refer to sample-videochat-webrtc sample

To link this capture to your local movie track simply use:

Calling offline users

We made it effortless to call offline users.

Quickblox iOS SDK provides methods to notify an application about fresh events even if application is closed.

How to configure Push-notifications in your application you can find here

Assuming you have working thrust notifications it is very effortless to notify users about fresh call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If application is in background, opponent will see a shove notification.

If application is in foreground, nothing will happen in UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To begin collecting report information do the following:

And classes that adopt QBRTCClientDelegate protocol will be notified with

For example, audioSendInputLevel property indicates mic input level even while audio track disabled, so you can check if user is presently speaking/talking.

You can also use already parsed and readable string that we are providing with most significant stats for current report, just use this method:

Recording audio and movie calls

From SDK version Two.6 there is a class, called QBRTCRecorder. You cannot allocate it by yourself, but it is stored in each example of QBRTCSession by the property named recorder if the requirements conform. Otherwise, recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in low spectacle category use UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one to one audio and movie calls are supported for now.

Usage

Once you have created fresh rtc session, you can commence recorder by accessing recorder property in session example. Call embark method and input desired file url:

You can configure output file movie settings and movie orientation using these methods:

Once the call is finished or whenever you want before that you need to simply call stop method:

Note that stop method is asynchronous and will take some time to finalize record file. Once the completion block is called, recording file should be ready by expected url unless some error happens. In order to treat any recorder errors, simply subscribe to delegate of QBRTCRecorder and treat this method:

Accessing remote audio data

From SDK version Two.6 QBRTCAudioTrack class (that represents remote audio track for a specific user) supports audio data drown through freshly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to bury interface using methods:

Now treat protocol method to access audio data:

This interface provides AudioBufferList with audio data, AudioStreamBasicDescription description of audio data, a number of frames in current packet, and current media time that conforms to each packet.

Settings

You can switch different settings for a session

Set response time interval

If an opponent did not reaction you within dialing time interval, then userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: forty five seconds

Minimum value: ten seconds

If user did not react within the given interval, then a following delegate method will be called

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: five seconds

Minimum value: three seconds

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.

Set custom-built ICE servers

You can customize a list of ICE servers.

By default, the server in North Virginia turn.quickblox.com is used, but you can add/setup more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if numerous options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting numerous TURN servers permits your application to scale-up in terms of bandwidth and number of users.

Here is a list with default settings that we use, you can customize all of them or only some particular:

Movie codecs: VP8 vs H264

H264 is the most preferred movie codec for iOS.

Chrome added support for H264 movie codec in fifty revision.

H264 is the only one movie codec for iOS that has hardware support.

Movie quality

1. Movie quality depends on hardware you use. iPhone 4s will not treat FullHD rendering. But iPhone 6+ will.

Two. Movie quality depends on network you use and how many connections you have.

For multi-calls set lower movie quality. For peer-to-peer calls you can set higher quality.

You can use our QBRTCCameraCapture formats with position method in order to get all supported formats for current device:

WebRTC has auto scaling of movie resolution and quality to keep network connection active.

To get best quality and spectacle you should use H264.

1. If some opponent user in call devices do not support H264, then automatically VP8 will be used

Two. If both caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio rivulets. This codec is relatively fresh (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from six kbit/s to five hundred ten kbit/s Supported sampling rates: from eight kHz to forty eight kHz

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice on audio codecs is OPUS.v

OPUS requires has the best quality, but it also requires a good internet connection.

This codec was developed special for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable. From ten kbit/s to fifty two kbit/s. Supported sampling rates: thirty two kHz

Good for voice data, but not as good as OPUS.

This audio codec is well-known, it was released in 2004, and became part of the WebRTC project in two thousand eleven when Google acquired Global IP Solutions (the company that developed iLIBC).

When you have very bad channels and low bandwidth, you undoubtedly should attempt iLBC — it should be strong on such cases.

Supported bitrates: immobile bitrate. 15.Two kbit/s or 13.33 kbit/s Supported sampling rate: eight kHz

When you have a strong reliable and good internet connection – then use OPUS.

If you use WebRTC on 3g networks – use iSAC. If you still have problems – attempt iLBC.

Enable specified audio codec

Framework changelog

  • Conference module (Enterprise-only feature):
    • Stationary issue with disappearing user in a room when the internet connection is slow.
    • Added capability to perform audio-only calls. Use fresh createSessionWithChatDialogID:conferenceType: method for this with desired conference type enum.
    • Immobile capability to subscribe to the user in session without being required to join the room (this introduces the capability to receive someone’s media without sending own).
  • Stationary potential memory leak with for movie calls when the recorder (introduced in Two.6) was not in use.

v2.6 – May 30, two thousand seventeen (DEPRECATED – use Two.6.0.1)

  • WebRTC r 18213
  • Added QBRTCRecorder class. This class represents WebRTC audio and movie calls recorder. Check out this link for more information on how to use it.
  • Added fresh delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession completes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to drown audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method permits you to adapt frames in your capture to any possible dimension you want. Note that this method adapts existing captured framework, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID roped to them. Property will be nil if track is local.
  • Liquidated old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature permits to participate in movie calls with up to ten people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to switch volume for a specific remote audio track, which you can get in client for a specific user in call.
  • Added fresh audioLevelControlEnabled property in QBRTCMediaStreamConfiguration class. Determines whether webrtc audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Eliminated old deprecated in Two.Three methods from QBRTCCameraCapture class.
    • Eliminated startSession deprecated method, use startSession: instead.
    • Liquidated stopSession deprecated method, use stopSession: instead.
    • Liquidated stopSessionAndTeardownOutputs: deprecated method, use stopSession: instead.
    • Liquidated selectCameraPosition: deprecated method, use setPosition: instead.
    • Eliminated currentPosition deprecated method, use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on QBRTCCLient managing deinitialization of webrtc on itself after initial initialization by initializeRTC method. Just eliminate usage of this method.
  • Eliminated old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Liquidated old deprecated enums in QBRTCConnectionState enum.
  • Eliminated QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in QBRTCPixelFormat enum. Those formats weren’t implemented by SDK and were entirely unsupported.
  • Eliminated initWithPixelBuffer: deprecated method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Plain code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Implements
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Embarked with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following directions in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open showcasing an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Three: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to begin using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Haul the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Trio: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

From version Two.Four QuickbloxWebRTC is required to be added as Embedded binary as it is dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to begin using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug, that does not permitting to publish archives to the App store with dynamic frameworks that contains simulator platforms. Script will only work for archiving.

Life cycle

Call users

To call users just use this method:

After this your opponents (users with IDs= 2123, 2123, 3122) will receive one call request per five 2nd for a duration of forty five seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to this session. Each particular audio – movie call has a unique sessionID. This permits you to have more than one independent audio-video conferences.

If you want to increase the call timeout, e.g. set to sixty seconds:

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called in case when user did not react to your call within timeout .

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set response time interval

Called in case when connection failed with user.

States

Called when QBRTCSession state was switched. Session’s state might be fresh, pending, connecting, connected and closed.

Called when session connection state switched for a specific user. Connection state might be unknown, fresh, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no reaction, rejected, hangup and failed.

Manage remote media tracks

In order to display movie views with rivulets which you have received from your opponents you should create QBRTCRemoteVideoView views on storyboard and then use the following code:

You can as well get remote audio track for a specific user in call using this QBRTCClientDelegate method (use it, for example, to mute a specific user audio in call:

You can always get both remote movie and audio tracks for a specific user ID in call using these QBRTCSession methods:

Manage local movie track

In order to demonstrate your local movie track from camera you should create UIView on storyboard and then use the following code:

String up up

To drape a up call:

After this your opponent’s will receive a hangUp signal

In the next step if all opponents are inactive then QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:

Please note: due to WebRTC limitations, silence will be placed into the stream content if audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:
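A minimal sketch, symmetric to the audio case:

```objectivec
// Pause / resume the local video stream mid-call.
self.session.localMediaStream.videoTrack.enabled = NO;   // disable
self.session.localMediaStream.videoTrack.enabled = YES;  // enable
```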

Please note: due to WebRTC limitations, black frames will be placed into the stream content if video is disabled.

Switch camera

You can switch the video capture position during a call (default: front camera):
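A sketch assuming the post-2.3 `position` property and `hasCameraForPosition:` check on QBRTCCameraCapture:

```objectivec
// Toggle between front and back cameras mid-call.
AVCaptureDevicePosition newPosition =
    (self.videoCapture.position == AVCaptureDevicePositionBack)
        ? AVCaptureDevicePositionFront
        : AVCaptureDevicePositionBack;
if ([self.videoCapture hasCameraForPosition:newPosition]) {
    self.videoCapture.position = newPosition;
}
```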

‘videoCapture’ below is the QBRTCCameraCapture described in CallController above

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated since version 2.3. From now on you should use the QBRTCAudioSession class instead. The Audio Session methods look almost the same as the Sound Router ones, except that they are more customizable and conform to more requirements.

QBRTCAudioSession also has a delegate protocol with helpful methods:

QBRTCAudioSession also introduces some new properties that might be helpful:

Background mode

Use the QuickbloxWebRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is important not to skip this step.

There is also a UI for setting app background modes in Xcode 5. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a red background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are happy to introduce a new feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you the ability to promote your product, share a screen with formulas to students, distribute podcasts, and share video/audio/photo moments of your life in real time all over the world.

To implement this feature in your application, we give you the ability to create a custom video capture.

Video capture is a base class you should inherit from in order to send frames to your opponents.

Custom video capture

The QBRTCVideoCapture class allows you to send frames to your opponents.

By inheriting this class, you are able to provide custom logic to create frames, modify them, and then send them to your opponents.

Below you can find an example of how to implement a custom video capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.

For the full source code of the custom capture and additional methods, please refer to the sample-videochat-webrtc sample

To link this capture to your local video track, simply use:

Calling offline users

We made it easy to call offline users.

The Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

You can find out how to configure push notifications in your application here

Assuming you have working push notifications, it is very easy to notify users about a new call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If the application is in the background, the opponent will see a push notification.

If the application is in the foreground, nothing will happen in the UI.
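-sendPushToOpponentsAboutNewCall comes from the sample project; a rough equivalent using the QuickBlox push API might look like this (the payload keys and user IDs are illustrative assumptions):

```objectivec
// Send a one-shot push to the call opponents via the QuickBlox events API.
QBMEvent *event = [QBMEvent event];
event.notificationType = QBMNotificationTypePush;
event.usersIDs = @"2123,3122";   // opponents' user IDs, comma-separated
event.type = QBMEventTypeOneShot;

NSDictionary *payload = @{@"message": @"Incoming call"};  // illustrative payload
NSData *json = [NSJSONSerialization dataWithJSONObject:payload options:0 error:nil];
event.message = [[NSString alloc] initWithData:json encoding:NSUTF8StringEncoding];

[QBRequest createEvent:event
          successBlock:^(QBResponse *response, NSArray *events) { /* push queued */ }
            errorBlock:^(QBResponse *response) {
                NSLog(@"Push failed: %@", response.error);
            }];
```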

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To start collecting report information, do the following:
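A sketch assuming the QBRTCConfig stats interval setter and the stats delegate callback (names per the QuickbloxWebRTC headers):

```objectivec
// Deliver a stats report once per second; 0 disables collection.
[QBRTCConfig setStatsReportTimeInterval:1.f];

// Classes adopting QBRTCClientDelegate then receive:
- (void)session:(QBRTCSession *)session
    updatedStatsReport:(QBRTCStatsReport *)report
             forUserID:(NSNumber *)userID {
    NSLog(@"%@", [report statsString]);  // pre-parsed, human-readable summary
}
```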

And classes that adopt QBRTCClientDelegate protocol will be notified with

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check whether the user is currently speaking.

You can also use an already parsed and readable string with the most important stats for the current report; just use this method:

Recording audio and video calls

From SDK version 2.6 there is a class called QBRTCRecorder. You cannot allocate it yourself, but it is stored in each instance of QBRTCSession in the property named recorder if the requirements are met. Otherwise, the recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in the low-performance category, use the UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one-to-one audio and video calls are supported for now.

Usage

Once you have created a new RTC session, you can start the recorder by accessing the recorder property of the session instance. Call the start method and pass the desired file URL:

You can configure output file video settings and video orientation using these methods:

Once the call is finished, or whenever you want before that, simply call the stop method:

Note that the stop method is asynchronous and will take some time to finalize the record file. Once the completion block is called, the recording file should be ready at the expected URL unless some error happened. In order to handle any recorder errors, simply subscribe to the delegate of QBRTCRecorder and handle this method:
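The start/stop/error flow above can be sketched as follows (the file path is illustrative; method names per the QBRTCRecorder headers — verify against your SDK version):

```objectivec
// Start recording to a file URL of your choice.
NSURL *url = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"call.mp4"]];
[self.session.recorder startRecordWithFileURL:url];

// ...later, stop and wait for the file to be finalized.
[self.session.recorder stopRecord:^(NSURL *file) {
    NSLog(@"Recording saved to %@", file);
}];

// Error handling via the recorder delegate:
- (void)recorder:(QBRTCRecorder *)recorder didFailWithError:(NSError *)error {
    NSLog(@"Recorder error: %@", error);
}
```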

Accessing remote audio data

From SDK version 2.6, the QBRTCAudioTrack class (which represents a remote audio track for a specific user) supports an audio data sink through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using these methods:

Now handle the protocol method to access the audio data:

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription describing the audio data, the number of frames in the current packet, and the current media time that corresponds to each packet.
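A sketch of the sink subscription and callback, based on the QBRTCAudioTrackSinkInterface protocol (the exact selector signature may differ between SDK versions — check the headers):

```objectivec
// Subscribe to the remote audio track's sink (and removeSink: when done).
[audioTrack addSink:self];

// QBRTCAudioTrackSinkInterface callback delivering raw audio data.
- (void)audioTrack:(QBRTCAudioTrack *)audioTrack
    didSinkAudioBufferList:(const AudioBufferList *)audioBufferList
    audioStreamDescription:(const AudioStreamBasicDescription)audioStreamDescription
            numberOfFrames:(size_t)numberOfFrames
                      time:(CMTime)time {
    // Process PCM samples here; this is called off the main thread.
}
```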

Settings

You can change various settings for a session

Set answer time interval

If an opponent does not answer you within the answer time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: 45 seconds

Minimum value: 10 seconds

If the user does not respond within the given interval, the following delegate method will be called

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: 5 seconds

Minimum value: 3 seconds
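Both intervals are set through QBRTCConfig; a minimal sketch:

```objectivec
[QBRTCConfig setAnswerTimeInterval:45];  // default 45 s, minimum 10 s
[QBRTCConfig setDialingTimeInterval:5];  // default 5 s, minimum 3 s
```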

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.
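A one-line sketch (shown only for completeness, since DTLS is already on by default):

```objectivec
[QBRTCConfig setDTLSEnabled:YES];
```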

Set custom ICE servers

You can customize the list of ICE servers.

By default, the server in Northern Virginia turn.quickblox.com is used, but you can add/set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list with the default settings that we use; you can customize all of them or only particular ones:
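A sketch of setting ICE servers (the factory method name is assumed from the QuickbloxWebRTC headers, and the TURN credentials are placeholders — substitute your own):

```objectivec
NSArray *iceServers = @[
    [QBRTCICEServer serverWithURL:@"stun:turn.quickblox.com"
                         username:@"" password:@""],
    [QBRTCICEServer serverWithURL:@"turn:turn.quickblox.com:3478?transport=udp"
                         username:@"user" password:@"password"],
];
[QBRTCConfig setICEServers:iceServers];
```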

Video codecs: VP8 vs H264

H264 is the preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.
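A sketch of selecting H264 through the media stream configuration:

```objectivec
QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.videoCodec = QBRTCVideoCodecH264;  // VP8 is used automatically if a peer lacks H264
[QBRTCConfig setMediaStreamConfiguration:conf];
```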

Video quality

1. Video quality depends on the hardware you use. An iPhone 4s will not handle Full HD rendering, but an iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use our QBRTCCameraCapture formatsWithPosition: method in order to get all supported formats for the current device:
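A sketch of querying the supported formats (how the array is ordered is an assumption — inspect the returned formats before picking one):

```objectivec
NSArray<QBRTCVideoFormat *> *formats =
    [QBRTCCameraCapture formatsWithPosition:AVCaptureDevicePositionFront];
// Pick a format appropriate for the device and call type.
QBRTCVideoFormat *format = formats.firstObject;
```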

WebRTC automatically scales video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If any opponent's device in the call does not support H264, then VP8 will be used automatically.

2. If both the caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

OPUS

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice among audio codecs is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

iSAC

This codec was developed specially for VoIP applications and audio streaming.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz

Good for voice data, but not as good as OPUS.

iLBC

This audio codec is well known; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC — it should be robust in such cases.

Supported bitrate: fixed, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz

When you have a strong, reliable, and good internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
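A sketch of selecting the audio codec through the media stream configuration (the iSAC/iLBC enum spellings may differ by SDK version):

```objectivec
QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.audioCodec = QBRTCAudioCodecOpus;  // corresponding enum values exist for iSAC and iLBC
[QBRTCConfig setMediaStreamConfiguration:conf];
```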

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed issue with disappearing user in a room when the internet connection is slow.
    • Added ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method for this with the desired conference type enum.
    • Fixed ability to subscribe to the user in a session without being required to join the room (this introduces the ability to receive someone’s media without sending your own).
  • Fixed potential memory leak for video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession finishes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method allows you to adapt frames in your capture to any possible dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participating in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to change the volume for a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added new audioLevelControlEnabled property in QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed old methods from QBRTCCameraCapture class that were deprecated in 2.3.
    • Removed deprecated startSession method, use startSession: instead.
    • Removed deprecated stopSession method, use stopSession: instead.
    • Removed deprecated stopSessionAndTeardownOutputs: method, use stopSession: instead.
    • Removed deprecated selectCameraPosition: method, use setPosition: instead.
    • Removed deprecated currentPosition method, use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on, QBRTCClient manages deinitialization of WebRTC by itself after the initial initialization by the initializeRTC method. Just remove usage of this method.
  • Removed old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in QBRTCConnectionState enum.
  • Removed QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in QBRTCPixelFormat enum. Those formats weren’t implemented by the SDK and were completely unsupported.
  • Removed deprecated initWithPixelBuffer: method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step 2: Add the framework to your Xcode Project

Drag the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step 3: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step 4: Embedded binary for Dynamic framework

From version 2.4, QuickbloxWebRTC must be added as an embedded binary, since it is a dynamic framework.

Step 5: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in the <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" to the build phases of your project. Paste the following snippet into the script:
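The exact snippet from the original page is not reproduced here; a commonly used equivalent build-phase script that strips simulator slices from embedded frameworks looks like this (treat it as a starting point, not the official script):

```bash
# Strip simulator (i386/x86_64) slices from embedded frameworks.
APP_PATH="${TARGET_BUILD_DIR}/${WRAPPER_NAME}"
find "$APP_PATH" -name '*.framework' -type d | while read -r FRAMEWORK; do
    FRAMEWORK_EXECUTABLE_NAME=$(defaults read "$FRAMEWORK/Info.plist" CFBundleExecutable)
    FRAMEWORK_EXECUTABLE_PATH="$FRAMEWORK/$FRAMEWORK_EXECUTABLE_NAME"
    EXTRACTED_ARCHS=()
    for ARCH in $ARCHS; do
        lipo -extract "$ARCH" "$FRAMEWORK_EXECUTABLE_PATH" \
             -o "$FRAMEWORK_EXECUTABLE_PATH-$ARCH"
        EXTRACTED_ARCHS+=("$FRAMEWORK_EXECUTABLE_PATH-$ARCH")
    done
    # Re-merge only the device architectures.
    lipo -o "$FRAMEWORK_EXECUTABLE_PATH-merged" -create "${EXTRACTED_ARCHS[@]}"
    rm "${EXTRACTED_ARCHS[@]}"
    mv "$FRAMEWORK_EXECUTABLE_PATH-merged" "$FRAMEWORK_EXECUTABLE_PATH"
done
```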

This fixes a known Apple bug that prevents publishing archives to the App Store with dynamic frameworks that contain simulator platforms. The script will only run when archiving.

Life cycle

Call users

To call users just use this method:
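A sketch of initiating a call with the QBRTCClient API (the user IDs and userInfo contents are illustrative):

```objectivec
// One-time setup: initialize WebRTC and register for delegate callbacks.
[QBRTCClient initializeRTC];
[[QBRTCClient instance] addDelegate:self];

// Start a video call to two opponents; userInfo is an optional custom dictionary.
QBRTCSession *session =
    [[QBRTCClient instance] createNewSessionWithOpponents:@[@2123, @3122]
                                       withConferenceType:QBRTCConferenceTypeVideo];
[session startCall:@{@"key": @"value"}];
self.session = session;
```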

After this your opponents (users with IDs= 2123, 2123, 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings with QBRTCConfig):

self.session refers to the current session. Each particular audio–video call has a unique sessionID. This allows you to have more than one independent audio-video conference.

If you want to increase the call timeout, e.g. set to sixty seconds:

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called in case when user did not react to your call within timeout .

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set response time interval

Called in case when connection failed with user.

States

Called when QBRTCSession state was switched. Session’s state might be fresh, pending, connecting, connected and closed.

Called when session connection state switched for a specific user. Connection state might be unknown, fresh, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no response, rejected, hangup and failed.

Manage remote media tracks

In order to demonstrate movie views with flows which you have received from your opponents you should create QBRTCRemoteVideoView views on storyboard and then use the following code:

You can as well get remote audio track for a specific user in call using this QBRTCClientDelegate method (use it, for example, to mute a specific user audio in call:

You can always get both remote movie and audio tracks for a specific user ID in call using these QBRTCSession methods:

Manage local movie track

In order to display your local movie track from camera you should create UIView on storyboard and then use the following code:

Suspend up

To drape a up call:

After this your opponent’s will receive a hangUp signal

In the next step if all opponents are inactive then QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:

Please note: due to webrtc limitations muffle will be placed into stream content if audio is disabled.

Disable / enable movie stream

You can disable / enable the movie stream during a call:

Please note: due to webrtc confinements black frames will be placed into stream content if movie is disabled.

Switch camera

You can switch the movie capture position during a call (Default: front camera):

‘videoCapture’ below is QBRTCCameraCapture described in CallController above

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated from version Two.Trio. Instead from now on you should use QBRTCAudioSession class. Audio Session methods looks almost the same as Sound Router ones, with exception of being more customizable and conform to many requirements.

QBRTCAudioSession also does have a delegate protocol with helpful methods:

Also QBRTCAudioSession introducing some fresh properties, that might be also helpful in any case:

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is significant not to skip this step.

There is also a UI for setting app background modes in XCode Five. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a crimson background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are blessed to introduce you a fresh feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you an capability to promote your product, share a screen with formulas to students, distribute podcasts, share movie/audio/photo moments of your life in real-time all over the world.

To implement this feature in your application we give you the capability to create custom-built movie capture.

Movie capture is a base class you should inherit from in order to send frames you your opponents.

Custom-built movie capture

QBRTCVideoCapture class permits to send frames to your opponents.

By inheriting this class you are able to provide custom-built logic to create frames, modify them and then send to your opponents.

Below you can find an example of how to implement a custom-made movie capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that permits your application to synchronize its drawing to the refresh rate of the display.

For total source code of custom-made capture and extra methods please refer to sample-videochat-webrtc sample

To link this capture to your local movie track simply use:

Calling offline users

We made it effortless to call offline users.

Quickblox iOS SDK provides methods to notify an application about fresh events even if application is closed.

How to configure Push-notifications in your application you can find here

Assuming you have working shove notifications it is very effortless to notify users about fresh call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If application is in background, opponent will see a shove notification.

If application is in foreground, nothing will happen in UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To embark collecting report information do the following:

And classes that adopt QBRTCClientDelegate protocol will be notified with

For example, audioSendInputLevel property indicates mic input level even while audio track disabled, so you can check if user is presently speaking/talking.

You can also use already parsed and readable string that we are providing with most significant stats for current report, just use this method:

Recording audio and movie calls

From SDK version Two.6 there is a class, called QBRTCRecorder. You cannot allocate it by yourself, but it is stored in each example of QBRTCSession by the property named recorder if the requirements conform. Otherwise, recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in low spectacle category use UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one to one audio and movie calls are supported for now.

Usage

Once you have created fresh rtc session, you can begin recorder by accessing recorder property in session example. Call begin method and input desired file url:

You can configure output file movie settings and movie orientation using these methods:

Once the call is finished or whenever you want before that you need to simply call stop method:

Note that stop method is asynchronous and will take some time to finalize record file. Once the completion block is called, recording file should be ready by expected url unless some error happens. In order to treat any recorder errors, simply subscribe to delegate of QBRTCRecorder and treat this method:

Accessing remote audio data

From SDK version Two.6 QBRTCAudioTrack class (that represents remote audio track for a specific user) supports audio data bury through freshly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to drown interface using methods:

Now treat protocol method to access audio data:

This interface provides AudioBufferList with audio data, AudioStreamBasicDescription description of audio data, a number of frames in current packet, and current media time that conforms to each packet.

Settings

You can switch different settings for a session

Set reaction time interval

If an opponent did not reaction you within dialing time interval, then userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: forty five seconds

Minimum value: ten seconds

If user did not react within the given interval, then a following delegate method will be called

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: five seconds

Minimum value: three seconds

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.

Set custom-made ICE servers

You can customize a list of ICE servers.

By default, the server in North Virginia turn.quickblox.com is used, but you can add/setup more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if numerous options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting numerous TURN servers permits your application to scale-up in terms of bandwidth and number of users.

Here is a list with default settings that we use, you can customize all of them or only some particular:

Movie codecs: VP8 vs H264

H264 is the most preferred movie codec for iOS.

Chrome added support for H264 movie codec in fifty revision.

H264 is the only one movie codec for iOS that has hardware support.

Movie quality

1. Movie quality depends on hardware you use. iPhone 4s will not treat FullHD rendering. But iPhone 6+ will.

Two. Movie quality depends on network you use and how many connections you have.

For multi-calls set lower movie quality. For peer-to-peer calls you can set higher quality.

You can use our QBRTCCameraCapture formats with position method in order to get all supported formats for current device:

WebRTC has auto scaling of movie resolution and quality to keep network connection active.

To get best quality and spectacle you should use H264.

1. If some opponent user in call devices do not support H264, then automatically VP8 will be used

Two. If both caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio rivulets. This codec is relatively fresh (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from six kbit/s to five hundred ten kbit/s Supported sampling rates: from eight kHz to forty eight kHz

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice on audio codecs is OPUS.v

OPUS requires has the best quality, but it also requires a good internet connection.

This codec was developed special for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable. From ten kbit/s to fifty two kbit/s. Supported sampling rates: thirty two kHz

Good for voice data, but not as good as OPUS.

This audio codec is well-known, it was released in 2004, and became part of the WebRTC project in two thousand eleven when Google acquired Global IP Solutions (the company that developed iLIBC).

When you have very bad channels and low bandwidth, you undoubtedly should attempt iLBC — it should be strong on such cases.

Supported bitrates: stationary bitrate. 15.Two kbit/s or 13.33 kbit/s Supported sampling rate: eight kHz

When you have a strong reliable and good internet connection – then use OPUS.

If you use WebRTC on 3g networks – use iSAC. If you still have problems – attempt iLBC.

Enable specified audio codec

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed issue with a disappearing user in a room when the internet connection is slow.
    • Added ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method with the desired conference type enum.
    • Fixed ability to subscribe to a user in the session without being required to join the room (this introduces the ability to receive someone’s media without sending your own).
  • Fixed a potential memory leak in video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession ends an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method allows you to adapt the frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participation in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added new audioLevelControlEnabled property in QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed old methods from QBRTCCameraCapture class that were deprecated in 2.3.
    • Removed startSession deprecated method; use startSession: instead.
    • Removed stopSession deprecated method; use stopSession: instead.
    • Removed stopSessionAndTeardownOutputs: deprecated method; use stopSession: instead.
    • Removed selectCameraPosition: deprecated method; use setPosition: instead.
    • Removed currentPosition deprecated method; use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on, QBRTCClient manages WebRTC deinitialization itself after the initial initialization via the initializeRTC method. Just remove usage of this method.
  • Removed old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in QBRTCConnectionState enum.
  • Removed QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in QBRTCPixelFormat enum. Those formats weren’t implemented by the SDK and were fully unsupported.
  • Removed initWithPixelBuffer: deprecated method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.
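
For the last migration note above, a minimal sketch of the replacement initializer. Here `pixelBuffer` is an assumed pre-existing CVPixelBufferRef, `QBRTCVideoRotation_0` is assumed to be the no-rotation enum value, and the send call is shown as it would appear inside a custom QBRTCVideoCapture subclass:

```objectivec
// Sketch: wrap an existing CVPixelBufferRef in a QBRTCVideoFrame using the
// replacement initializer that takes an explicit rotation, then hand it to
// the capture (assumed sendVideoFrame: method on QBRTCVideoCapture).
QBRTCVideoFrame *videoFrame =
    [[QBRTCVideoFrame alloc] initWithPixelBuffer:pixelBuffer
                                   videoRotation:QBRTCVideoRotation_0];
[super sendVideoFrame:videoFrame];
```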


Elementary code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Instruments
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Embarked with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following guidelines in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open showcasing an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Trio: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to commence using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Haul the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Three: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

From version Two.Four QuickbloxWebRTC is required to be added as Embedded binary as it is dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to commence using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug, that does not permitting to publish archives to the App store with dynamic frameworks that contains simulator platforms. Script will only work for archiving.

Life cycle

Call users

To call users just use this method:

After this your opponents (users with IDs= 2123, 2123, 3122) will receive one call request per five 2nd for a duration of forty five seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to this session. Each particular audio – movie call has a unique sessionID. This permits you to have more than one independent audio-video conferences.

If you want to increase the call timeout, e.g. set to sixty seconds:

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called in case when user did not react to your call within timeout .

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set response time interval

Called in case when connection failed with user.

States

Called when QBRTCSession state was switched. Session’s state might be fresh, pending, connecting, connected and closed.

Called when session connection state switched for a specific user. Connection state might be unknown, fresh, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no reaction, rejected, hangup and failed.

Manage remote media tracks

In order to display movie views with flows which you have received from your opponents you should create QBRTCRemoteVideoView views on storyboard and then use the following code:

You can as well get remote audio track for a specific user in call using this QBRTCClientDelegate method (use it, for example, to mute a specific user audio in call:

You can always get both remote movie and audio tracks for a specific user ID in call using these QBRTCSession methods:

Manage local movie track

In order to display your local movie track from camera you should create UIView on storyboard and then use the following code:

Drape up

To suspend a up call:

After this your opponent’s will receive a hangUp signal

In the next step if all opponents are inactive then QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:

Please note: due to webrtc limitations muffle will be placed into stream content if audio is disabled.

Disable / enable movie stream

You can disable / enable the movie stream during a call:

Please note: due to webrtc limitations black frames will be placed into stream content if movie is disabled.

Switch camera

You can switch the movie capture position during a call (Default: front camera):

‘videoCapture’ below is QBRTCCameraCapture described in CallController above

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated from version Two.Trio. Instead from now on you should use QBRTCAudioSession class. Audio Session methods looks almost the same as Sound Router ones, with exception of being more customizable and conform to many requirements.

QBRTCAudioSession also does have a delegate protocol with helpful methods:

Also QBRTCAudioSession introducing some fresh properties, that might be also helpful in any case:

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is significant not to skip this step.

There is also a UI for setting app background modes in XCode Five. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a crimson background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are blessed to introduce you a fresh feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you an capability to promote your product, share a screen with formulas to students, distribute podcasts, share movie/audio/photo moments of your life in real-time all over the world.

To implement this feature in your application we give you the capability to create custom-made movie capture.

Movie capture is a base class you should inherit from in order to send frames you your opponents.

Custom-built movie capture

QBRTCVideoCapture class permits to send frames to your opponents.

By inheriting this class you are able to provide custom-built logic to create frames, modify them and then send to your opponents.

Below you can find an example of how to implement a custom-made movie capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.
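As a sketch of this approach (the snapshot helper and the QBRTCVideoRotation enum value below are illustrative assumptions, not documented SDK API), a screen-sharing capture might look like:

```objc
#import <QuickbloxWebRTC/QuickbloxWebRTC.h>

// Subclass QBRTCVideoCapture and push a frame on every display refresh.
@interface ScreenCapture : QBRTCVideoCapture
- (instancetype)initWithView:(UIView *)view;
@end

@implementation ScreenCapture {
    UIView *_view;
    CADisplayLink *_displayLink;
}

- (instancetype)initWithView:(UIView *)view {
    if (self = [super init]) {
        _view = view;
        // CADisplayLink fires in sync with the display refresh rate.
        _displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(sendScreenFrame)];
        [_displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                           forMode:NSRunLoopCommonModes];
    }
    return self;
}

- (void)sendScreenFrame {
    // pixelBufferFromView: is a hypothetical helper that renders the view
    // into a CVPixelBuffer (e.g. via UIGraphicsImageRenderer).
    CVPixelBufferRef buffer = [self pixelBufferFromView:_view];
    QBRTCVideoFrame *frame =
        [[QBRTCVideoFrame alloc] initWithPixelBuffer:buffer
                                       videoRotation:QBRTCVideoRotation_0];
    // Hand the frame to WebRTC for delivery to opponents.
    [super sendVideoFrame:frame];
    CVPixelBufferRelease(buffer);
}

@end
```

For the complete, tested implementation, use the sample project referenced below rather than this sketch.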

For the full source code of the custom capture and additional methods, please refer to the sample-videochat-webrtc sample.

To link this capture to your local video track, simply use:
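For instance (assuming a capture instance like the sketch above, and that the session holds a local media stream with a videoCapture property, as in the QuickBlox samples):

```objc
// Attach the custom capture to the session's local video track.
ScreenCapture *capture = [[ScreenCapture alloc] initWithView:self.view];
self.session.localMediaStream.videoTrack.videoCapture = capture;
```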

Calling offline users

We made it easy to call offline users.

The Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

You can find out how to configure push notifications in your application here.

Assuming you have working push notifications, it is very easy to notify users about a new call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If the application is in the background, the opponent will see a push notification.

If the application is in the foreground, nothing will happen in the UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To begin collecting report information do the following:

And classes that adopt QBRTCClientDelegate protocol will be notified with

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check whether the user is currently speaking.

You can also use an already parsed, readable string with the most significant stats for the current report; just use this method:
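A sketch of the stats flow (the setStatsReportTimeInterval: setter, the updatedStatsReport delegate, and the statsString accessor are our reading of the SDK headers and should be verified against your SDK version):

```objc
// Start collecting stats reports, e.g. once per second (interval in seconds).
[QBRTCConfig setStatsReportTimeInterval:1.f];

// QBRTCClientDelegate callback delivering the collected report:
- (void)session:(QBRTCSession *)session
    updatedStatsReport:(QBRTCStatsReport *)report
             forUserID:(NSNumber *)userID {
    // Mic input level is reported even while the audio track is disabled.
    NSLog(@"audioSendInputLevel: %@", @(report.audioSendInputLevel));
    // Pre-parsed, human-readable summary of the most significant stats:
    NSLog(@"%@", [report statsString]);
}
```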

Recording audio and video calls

From SDK version 2.6 there is a class called QBRTCRecorder. You cannot allocate it yourself, but it is stored in each instance of QBRTCSession in the property named recorder if the requirements are met. Otherwise, the recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in the low-performance category, use the UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one-to-one audio and video calls are supported for now.

Usage

Once you have created a new RTC session, you can start the recorder by accessing the recorder property of the session instance. Call the start method and pass the desired file URL:

You can configure the output file video settings and video orientation using these methods:

Once the call is finished, or at any point before that, simply call the stop method:

Note that the stop method is asynchronous and will take some time to finalize the recorded file. Once the completion block is called, the recording file should be available at the expected URL unless an error happened. In order to handle any recorder errors, simply subscribe to the QBRTCRecorder delegate and handle this method:
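A sketch of the recorder lifecycle (the exact method names startRecordWithFileURL:, stopRecord:, and recorder:didFailWithError: are our reading of the SDK headers; the output path is a placeholder):

```objc
// recorder is nil if the device or call type does not meet the requirements.
QBRTCRecorder *recorder = session.recorder;

NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"call.mp4"];
[recorder startRecordWithFileURL:[NSURL fileURLWithPath:path]];

// ... later, when the call ends (asynchronous; the file is ready in the block):
[recorder stopRecord:^(NSURL *file) {
    NSLog(@"Recording saved to %@", file);
}];

// QBRTCRecorderDelegate error handling:
- (void)recorder:(QBRTCRecorder *)recorder didFailWithError:(NSError *)error {
    NSLog(@"Recorder error: %@", error);
}
```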

Accessing remote audio data

From SDK version 2.6, the QBRTCAudioTrack class (which represents the remote audio track for a specific user) supports an audio data sink through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using these methods:

Now handle the protocol method to access the audio data:

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription describing the audio data, the number of frames in the current packet, and the current media time that corresponds to each packet.
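A sketch of the sink subscription and callback (the addSink:/removeSink: methods and the callback signature are our reading of the QBRTCAudioTrackSinkInterface headers; verify against your SDK version):

```objc
// Subscribe to the sink interface of a remote audio track:
[audioTrack addSink:self];
// ... and unsubscribe when you no longer need the data:
[audioTrack removeSink:self];

// QBRTCAudioTrackSinkInterface callback, delivering raw audio in real time:
- (void)audioTrack:(QBRTCAudioTrack *)audioTrack
    didSinkAudioBufferList:(const AudioBufferList *)audioBufferList
     audioBufferListFormat:(AudioStreamBasicDescription)audioBufferListFormat
            numberOfFrames:(size_t)numberOfFrames
                      time:(CMTime)time {
    // Process the AudioBufferList here (e.g. level metering, custom effects).
}
```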

Settings

You can change different settings for a session

Set answer time interval

If an opponent does not answer you within the answer time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: 45 seconds

Minimum value: 10 seconds

If the user does not answer within the given interval, the following delegate method will be called
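For example, configuring the answer timeout and handling the no-answer case might look like this (the delegate signature is our reading of QBRTCClientDelegate):

```objc
// Answer time interval in seconds (default 45, minimum 10).
[QBRTCConfig setAnswerTimeInterval:45];

// Called when the opponent did not answer within the interval:
- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID {
    NSLog(@"User %@ did not respond", userID);
}
```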

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: 5 seconds

Minimum value: 3 seconds
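For example:

```objc
// Resend the call request to opponents every 5 seconds (minimum 3).
[QBRTCConfig setDialingTimeInterval:5];
```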

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. It establishes a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS-encrypted connection.
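Since DTLS is already on by default, you would only touch this setting to re-enable it explicitly (the setter name is our reading of QBRTCConfig):

```objc
// DTLS is enabled by default; this call makes the default explicit.
[QBRTCConfig setDTLSEnabled:YES];
```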

Set custom ICE servers

You can customize a list of ICE servers.

By default, the server in North Virginia (turn.quickblox.com) is used, but you can add or set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings that we use; you can customize all of them or only particular ones:
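A sketch of setting the TURN servers named above (the QBRTCICEServer factory method is our reading of the SDK; the credentials are placeholders for your own):

```objc
// Build the default US server plus a regional one, then register both.
QBRTCICEServer *turnUS =
    [QBRTCICEServer serverWithURL:@"turn:turn.quickblox.com"
                         username:@"user"        // placeholder credentials
                         password:@"password"];
QBRTCICEServer *turnAsia =
    [QBRTCICEServer serverWithURL:@"turn:turnsingapore.quickblox.com"
                         username:@"user"
                         password:@"password"];
[QBRTCConfig setICEServers:@[turnUS, turnAsia]];
```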

Video codecs: VP8 vs H264

H264 is the most preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.

Video quality

1. Video quality depends on the hardware you use. An iPhone 4S will not handle FullHD rendering, but an iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use our QBRTCCameraCapture formatsWithPosition: method in order to get all supported formats for the current device:
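For example (assuming the class method returns an array of QBRTCVideoFormat objects, per our reading of the SDK):

```objc
// List all camera formats supported by the current device's front camera.
NSArray<QBRTCVideoFormat *> *formats =
    [QBRTCCameraCapture formatsWithPosition:AVCaptureDevicePositionFront];
```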

WebRTC has auto-scaling of video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If any opponent's device in the call does not support H264, VP8 will be used automatically.

2. If both caller and callee have H264 support, H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice for an audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

This codec was developed specially for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.

This audio codec is well known; it was released in 2004 and became part of the WebRTC project in 2011, when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC — it should be robust in such cases.

Supported bitrates: fixed bitrate, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong, reliable internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
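A sketch of selecting an audio codec through the media stream configuration (the property and enum names are our reading of QBRTCMediaStreamConfiguration):

```objc
// Pick a specific audio codec for all new sessions.
QBRTCMediaStreamConfiguration *conf =
    [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.audioCodec = QBRTCAudioCodecISAC; // or QBRTCAudioCodecOpus / QBRTCAudioCodeciLBC
[QBRTCConfig setMediaStreamConfiguration:conf];
```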

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed issue with a disappearing user in a room when the internet connection is slow.
    • Added ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method for this with the desired conference type enum.
    • Fixed ability to subscribe to the user in session without being required to join the room (this introduces the ability to receive someone's media without sending your own).
  • Fixed potential memory leak for video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r 18213
  • Added QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession finishes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method allows you to adapt frames in your capture to any possible dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participation in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added new audioLevelControlEnabled property in QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed methods deprecated in 2.3 from QBRTCCameraCapture class.
    • Removed startSession deprecated method, use startSession: instead.
    • Removed stopSession deprecated method, use stopSession: instead.
    • Removed stopSessionAndTeardownOutputs: deprecated method, use stopSession: instead.
    • Removed selectCameraPosition: deprecated method, use setPosition: instead.
    • Removed currentPosition deprecated method, use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on, QBRTCClient manages WebRTC deinitialization itself after the initial initialization by the initializeRTC method. Just remove usage of this method.
  • Removed old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in QBRTCConnectionState enum.
  • Removed QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in QBRTCPixelFormat enum. Those formats weren't implemented by the SDK and were completely unsupported.
  • Removed initWithPixelBuffer: deprecated method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Drag the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination's group folder" checkbox is checked.

Step Trio: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

From version 2.4, QuickbloxWebRTC must be added as an embedded binary, as it is a dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to begin using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug that prevents publishing archives to the App Store when dynamic frameworks contain simulator platforms. The script only runs when archiving.

Life cycle

Call users

To call users just use this method:
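For example (assuming the createNewSessionWithOpponents:withConferenceType: and startCall: API, per the QuickBlox samples; the IDs and userInfo values are placeholders):

```objc
// Create a session with the opponents' user IDs and start the call.
QBRTCSession *session =
    [QBRTCClient.instance createNewSessionWithOpponents:@[@2123, @3122]
                                     withConferenceType:QBRTCConferenceTypeVideo];
// Optional custom data delivered to opponents with the call request.
NSDictionary *userInfo = @{@"key" : @"value"};
[session startCall:userInfo];
```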

After this your opponents (users with IDs 2123, 2123, 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to the current session. Each particular audio-video call has a unique sessionID. This allows you to have more than one independent audio-video conference.

If you want to increase the call timeout, e.g. set it to 60 seconds:

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:
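A sketch of both sides (the acceptCall: method and the acceptedByUser delegate are our reading of the SDK headers):

```objc
// Callee side: accept the incoming session (userInfo is optional custom data).
[self.session acceptCall:nil];

// Caller side, QBRTCClientDelegate: the accept signal arrives here.
- (void)session:(QBRTCSession *)session
    acceptedByUser:(NSNumber *)userID
          userInfo:(NSDictionary *)userInfo {
    NSLog(@"User %@ accepted the call", userID);
}
```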

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:
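A sketch of both sides (the rejectCall: method and the rejectedByUser delegate are our reading of the SDK headers):

```objc
// Callee side: reject the incoming session (userInfo is optional custom data).
[self.session rejectCall:nil];

// Caller side, QBRTCClientDelegate: the reject signal arrives here.
- (void)session:(QBRTCSession *)session
    rejectedByUser:(NSNumber *)userID
          userInfo:(NSDictionary *)userInfo {
    NSLog(@"User %@ rejected the call", userID);
}
```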

Connection life-cycle

Called when a connection is initiated with a user:

Called when a connection is closed for a user:

Called when a connection is established with a user:

Called when a user is disconnected:

Called when a user did not answer your call within the timeout.

Note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval

Called when the connection with a user failed.
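Together, the lifecycle callbacks above look roughly like this in QBRTCClientDelegate (signatures are our reading of the SDK headers and may differ between versions):

```objc
- (void)session:(QBRTCSession *)session connectedToUser:(NSNumber *)userID {
    // Connection established with the user.
}
- (void)session:(QBRTCSession *)session disconnectedFromUser:(NSNumber *)userID {
    // User temporarily disconnected.
}
- (void)session:(QBRTCSession *)session connectionClosedForUser:(NSNumber *)userID {
    // Connection closed for the user.
}
- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID {
    // User did not answer within the timeout.
}
- (void)session:(QBRTCSession *)session connectionFailedForUser:(NSNumber *)userID {
    // Connection with the user failed.
}
```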

States

Called when the QBRTCSession state was changed. The session's state might be new, pending, connecting, connected, or closed.

Called when the session connection state changed for a specific user. The connection state might be unknown, new, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no answer, rejected, hangup, or failed.

Manage remote media tracks

In order to display video views with streams which you have received from your opponents, you should create QBRTCRemoteVideoView views on the storyboard and then use the following code:

You can also get the remote audio track for a specific user in the call using this QBRTCClientDelegate method (use it, for example, to mute a specific user's audio in a call):

You can always get both remote video and audio tracks for a specific user ID in the call using these QBRTCSession methods:
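A sketch covering the delegate callback and the on-demand getters (method names per our reading of the SDK; remoteVideoView is an assumed QBRTCRemoteVideoView outlet, and the user ID is a placeholder):

```objc
// QBRTCClientDelegate: a remote video track arrives for display.
- (void)session:(QBRTCSession *)session
    receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack
                    fromUser:(NSNumber *)userID {
    [self.remoteVideoView setVideoTrack:videoTrack];
}

// Fetch tracks for a specific user on demand:
QBRTCVideoTrack *video = [session remoteVideoTrackWithUserID:@2123];
QBRTCAudioTrack *audio = [session remoteAudioTrackWithUserID:@2123];
audio.enabled = NO; // e.g. mute this particular user
```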

Manage local video track

In order to display your local video track from the camera, you should create a UIView on the storyboard and then use the following code:
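A sketch of setting up the camera capture and its preview (API names per our reading of the SDK; localVideoView is an assumed UIView outlet):

```objc
// Create a camera capture with the default format and front camera.
QBRTCVideoFormat *format = [QBRTCVideoFormat defaultFormat];
QBRTCCameraCapture *videoCapture =
    [[QBRTCCameraCapture alloc] initWithVideoFormat:format
                                           position:AVCaptureDevicePositionFront];
[videoCapture startSession:nil];

// Show the camera preview inside the storyboard view.
videoCapture.previewLayer.frame = self.localVideoView.bounds;
[self.localVideoView.layer insertSublayer:videoCapture.previewLayer atIndex:0];

// Feed the capture into the session's local video track.
self.session.localMediaStream.videoTrack.videoCapture = videoCapture;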

Hang up

To hang up a call:
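For example (userInfo is optional custom data; the dictionary contents here are placeholders):

```objc
[self.session hangUp:@{@"reason" : @"user ended call"}];
```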

After this your opponent’s will receive a hangUp signal

In the next step, if all opponents are inactive, the QBRTCClient delegates will be notified with:

Disable / enable audio stream

You can disable / enable the audio stream during a call:
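For example (assuming the localMediaStream audio track exposes an enabled flag, per the QuickBlox samples):

```objc
self.session.localMediaStream.audioTrack.enabled = NO;  // mute
self.session.localMediaStream.audioTrack.enabled = YES; // unmute
```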

Please note: due to WebRTC limitations, silence will be placed into the stream content if audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:
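For example (same assumed enabled flag as on the audio track):

```objc
self.session.localMediaStream.videoTrack.enabled = NO;  // stop sending video
self.session.localMediaStream.videoTrack.enabled = YES; // resume sending video
```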

Please note: due to WebRTC limitations, black frames will be placed into the stream content if video is disabled.

Switch camera

You can switch the video capture position during a call (default: front camera):

'videoCapture' below is the QBRTCCameraCapture described in CallController above
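Toggling between the front and back camera can be sketched via the capture's position property (per the SDK changelog, position/setPosition: replaced the older camera-selection methods):

```objc
AVCaptureDevicePosition current = videoCapture.position;
videoCapture.position = (current == AVCaptureDevicePositionFront)
    ? AVCaptureDevicePositionBack
    : AVCaptureDevicePositionFront;
```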

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated since version 2.3. From now on you should use the QBRTCAudioSession class instead. The Audio Session methods look almost the same as the Sound Router ones, except for being more customizable and conforming to more requirements.

QBRTCAudioSession also does have a delegate protocol with helpful methods:

QBRTCAudioSession also introduces some new properties that might be helpful:

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is significant not to skip this step.

There is also a UI for setting app background modes in XCode Five. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a crimson background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are blessed to introduce you a fresh feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you an capability to promote your product, share a screen with formulas to students, distribute podcasts, share movie/audio/photo moments of your life in real-time all over the world.

To implement this feature in your application we give you the capability to create custom-made movie capture.

Movie capture is a base class you should inherit from in order to send frames you your opponents.

Custom-made movie capture

QBRTCVideoCapture class permits to send frames to your opponents.

By inheriting this class you are able to provide custom-built logic to create frames, modify them and then send to your opponents.

Below you can find an example of how to implement a custom-built movie capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that permits your application to synchronize its drawing to the refresh rate of the display.

For utter source code of custom-built capture and extra methods please refer to sample-videochat-webrtc sample

To link this capture to your local movie track simply use:

Calling offline users

We made it effortless to call offline users.

Quickblox iOS SDK provides methods to notify an application about fresh events even if application is closed.

How to configure Push-notifications in your application you can find here

Assuming you have working shove notifications it is very effortless to notify users about fresh call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If application is in background, opponent will see a shove notification.

If application is in foreground, nothing will happen in UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To begin collecting report information do the following:

And classes that adopt QBRTCClientDelegate protocol will be notified with

For example, audioSendInputLevel property indicates mic input level even while audio track disabled, so you can check if user is presently speaking/talking.

You can also use already parsed and readable string that we are providing with most significant stats for current report, just use this method:

Recording audio and movie calls

From SDK version Two.6 there is a class, called QBRTCRecorder. You cannot allocate it by yourself, but it is stored in each example of QBRTCSession by the property named recorder if the requirements conform. Otherwise, recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in low spectacle category use UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one to one audio and movie calls are supported for now.

Usage

Once you have created fresh rtc session, you can embark recorder by accessing recorder property in session example. Call begin method and input desired file url:

You can configure output file movie settings and movie orientation using these methods:

Once the call is finished or whenever you want before that you need to simply call stop method:

Note that stop method is asynchronous and will take some time to finalize record file. Once the completion block is called, recording file should be ready by expected url unless some error happens. In order to treat any recorder errors, simply subscribe to delegate of QBRTCRecorder and treat this method:

Accessing remote audio data

From SDK version Two.6 QBRTCAudioTrack class (that represents remote audio track for a specific user) supports audio data submerge through freshly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to submerge interface using methods:

Now treat protocol method to access audio data:

This interface provides AudioBufferList with audio data, AudioStreamBasicDescription description of audio data, a number of frames in current packet, and current media time that conforms to each packet.

Settings

You can switch different settings for a session

Set response time interval

If an opponent did not reaction you within dialing time interval, then userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: forty five seconds

Minimum value: ten seconds

If user did not react within the given interval, then a following delegate method will be called

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: five seconds

Minimum value: three seconds

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.

Set custom-made ICE servers

You can customize a list of ICE servers.

By default, the server in North Virginia turn.quickblox.com is used, but you can add/setup more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if numerous options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting numerous TURN servers permits your application to scale-up in terms of bandwidth and number of users.

Here is a list with default settings that we use, you can customize all of them or only some particular:

Movie codecs: VP8 vs H264

H264 is the most preferred movie codec for iOS.

Chrome added support for H264 movie codec in fifty revision.

H264 is the only one movie codec for iOS that has hardware support.

Movie quality

1. Movie quality depends on hardware you use. iPhone 4s will not treat FullHD rendering. But iPhone 6+ will.

Two. Movie quality depends on network you use and how many connections you have.

For multi-calls set lower movie quality. For peer-to-peer calls you can set higher quality.

You can use our QBRTCCameraCapture formats with position method in order to get all supported formats for current device:

WebRTC has auto scaling of movie resolution and quality to keep network connection active.

To get best quality and spectacle you should use H264.

1. If some opponent user in call devices do not support H264, then automatically VP8 will be used

Two. If both caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio flows. This codec is relatively fresh (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from six kbit/s to five hundred ten kbit/s Supported sampling rates: from eight kHz to forty eight kHz

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice on audio codecs is OPUS.v

OPUS requires has the best quality, but it also requires a good internet connection.

This codec was developed special for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable. From ten kbit/s to fifty two kbit/s. Supported sampling rates: thirty two kHz

Good for voice data, but not as good as OPUS.

This audio codec is well-known, it was released in 2004, and became part of the WebRTC project in two thousand eleven when Google acquired Global IP Solutions (the company that developed iLIBC).

When you have very bad channels and low bandwidth, you undoubtedly should attempt iLBC — it should be strong on such cases.

Supported bitrates: immobilized bitrate. 15.Two kbit/s or 13.33 kbit/s Supported sampling rate: eight kHz

When you have a strong reliable and good internet connection – then use OPUS.

If you use WebRTC on 3g networks – use iSAC. If you still have problems – attempt iLBC.

Enable specified audio codec

Framework changelog

  • Conference module (Enterprise-only feature):
    • Immovable issue with disappearing user in a room when the internet connection is slow.
    • Added capability to perform audio-only calls. Use fresh createSessionWithChatDialogID:conferenceType: method for this with desired conference type enum.
    • Motionless capability to subscribe to the user in session without being required to join the room (this introduces the capability to receive someone’s media without sending own).
  • Motionless potential memory leak with for movie calls when the recorder (introduced in Two.6) was not in use.

v2.6 – May 30, two thousand seventeen (DEPRECATED – use Two.6.0.1)

  • WebRTC r 18213
  • Added QBRTCRecorder class. This class represents WebRTC audio and movie calls recorder. Check out this link for more information on how to use it.
  • Added fresh delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession finishes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to submerge audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method permits you to adapt frames in your capture to any possible dimension you want. Note that this method adapts existing captured framework, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID roped to them. Property will be nil if track is local.
  • Liquidated old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature permits to participate in movie calls with up to ten people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to switch volume for a specific remote audio track, which you can get in client for a specific user in call.
  • Added fresh audioLevelControlEnabled property in QBRTCMediaStreamConfiguration class. Determines whether webrtc audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Liquidated old deprecated in Two.Three methods from QBRTCCameraCapture class.
    • Eliminated startSession deprecated method, use startSession: instead.
    • Eliminated stopSession deprecated method, use stopSession: instead.
    • Eliminated stopSessionAndTeardownOutputs: deprecated method, use stopSession: instead.
    • Liquidated selectCameraPosition: deprecated method, use setPosition: instead.
    • Eliminated currentPosition deprecated method, use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on QBRTCCLient managing deinitialization of webrtc on itself after initial initialization by initializeRTC method. Just eliminate usage of this method.
  • Eliminated old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Eliminated old deprecated enums in QBRTCConnectionState enum.
  • Liquidated QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in QBRTCPixelFormat enum. Those formats weren’t implemented by SDK and were entirely unsupported.
  • Liquidated initWithPixelBuffer: deprecated method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step 2: Add the framework to your Xcode Project

Drag the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step 3: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step 4: Embedded binary for Dynamic framework

From version 2.4, QuickbloxWebRTC must be added as an embedded binary because it is a dynamic framework.

Step 5: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in your <YourProjectName-Prefix>.pch file:
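Assuming the standard umbrella header naming convention, the precompiled header ends up looking like this:

```objectivec
// <YourProjectName-Prefix>.pch
#import <Quickblox/Quickblox.h>
#import <QuickbloxWebRTC/QuickbloxWebRTC.h>
```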

Run Script Phase for Dynamic framework

Add a "Run Script Phase" to the build phases of your project. Paste the following snippet into the script:

This fixes a known Apple bug that prevents publishing archives to the App Store when dynamic frameworks contain simulator platforms. The script will only run when archiving.
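The original snippet is not reproduced here; a commonly used sketch of such a build phase (paths and variable names follow Xcode's standard build settings, but verify against your own setup) looks roughly like this:

```shell
# Hedged sketch: strip simulator (i386/x86_64) slices from embedded frameworks
# so the archive can be submitted to the App Store.
APP_PATH="${TARGET_BUILD_DIR}/${WRAPPER_NAME}"
find "$APP_PATH" -name '*.framework' -type d | while read -r FRAMEWORK; do
    EXECUTABLE_NAME=$(defaults read "$FRAMEWORK/Info.plist" CFBundleExecutable)
    EXECUTABLE_PATH="$FRAMEWORK/$EXECUTABLE_NAME"
    for ARCH in i386 x86_64; do
        if lipo -info "$EXECUTABLE_PATH" | grep -q "$ARCH"; then
            lipo -remove "$ARCH" -output "$EXECUTABLE_PATH" "$EXECUTABLE_PATH"
        fi
    done
done
```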

Life cycle

Call users

To call users just use this method:

After this, your opponents (users with IDs 2123, 2123 and 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings with QBRTCConfig):

self.session refers to the current session. Each particular audio/video call has a unique sessionID, which allows you to have more than one independent audio/video conference.

If you want to increase the call timeout, e.g. set it to 60 seconds:
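Put together, the calling flow above can be sketched as follows (user IDs and userInfo contents are placeholders; verify the method signatures against your QuickbloxWebRTC headers):

```objectivec
// Initialize WebRTC once at app start and subscribe to delegate callbacks.
[QBRTCClient initializeRTC];
[[QBRTCClient instance] addDelegate:self];

// Optionally raise the answer timeout to 60 seconds before calling.
[QBRTCConfig setAnswerTimeInterval:60];

// Create a session with your opponents and start the call.
NSArray *opponentIDs = @[@2123, @2123, @3122];
QBRTCSession *session =
    [[QBRTCClient instance] createNewSessionWithOpponents:opponentIDs
                                       withConferenceType:QBRTCConferenceTypeVideo];
self.session = session;
[session startCall:@{@"startCallMessage" : @"hello"}]; // userInfo is arbitrary
```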

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:
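Accepting and rejecting can be sketched together like this (delegate and method names as found in the QuickbloxWebRTC headers; verify against your SDK version):

```objectivec
// Incoming call: QBRTCClientDelegate hands you the new session.
- (void)didReceiveNewSession:(QBRTCSession *)session userInfo:(NSDictionary *)userInfo {
    self.session = session;
    [session acceptCall:nil]; // accept, with optional userInfo
    // ...or reject instead:
    // [session rejectCall:nil];
}

// On the caller's side the matching callbacks fire:
- (void)session:(QBRTCSession *)session acceptedByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo {
    NSLog(@"User %@ accepted", userID);
}

- (void)session:(QBRTCSession *)session rejectedByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo {
    NSLog(@"User %@ rejected", userID);
}
```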

Connection life-cycle

Called when a connection is initiated with a user:

Called when a connection is closed for a user:

Called when a connection is established with a user:

Called when a user is disconnected:

Called when a user did not respond to your call within the timeout:

Note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval

Called when a connection with a user has failed:
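The life-cycle callbacks above belong to QBRTCClientDelegate; a sketch of the set (signatures from the QuickbloxWebRTC headers; verify against your SDK version):

```objectivec
- (void)session:(QBRTCSession *)session startedConnectingToUser:(NSNumber *)userID {
    NSLog(@"Connecting to %@", userID);
}
- (void)session:(QBRTCSession *)session connectedToUser:(NSNumber *)userID {
    NSLog(@"Connected to %@", userID);
}
- (void)session:(QBRTCSession *)session disconnectedFromUser:(NSNumber *)userID {
    NSLog(@"Disconnected from %@", userID);
}
- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID {
    NSLog(@"No answer from %@", userID);
}
- (void)session:(QBRTCSession *)session connectionClosedForUser:(NSNumber *)userID {
    NSLog(@"Connection closed for %@", userID);
}
- (void)session:(QBRTCSession *)session connectionFailedForUser:(NSNumber *)userID {
    NSLog(@"Connection failed for %@", userID);
}
```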

States

Called when the QBRTCSession state changes. The session state can be new, pending, connecting, connected or closed.

Called when the session connection state changes for a specific user. The connection state can be unknown, new, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no answer, rejected, hang up or failed.

Manage remote media tracks

To display video views with the streams you receive from your opponents, create QBRTCRemoteVideoView views in your storyboard and then use the following code:

You can also get the remote audio track for a specific user in the call using this QBRTCClientDelegate method (use it, for example, to mute a specific user's audio during a call):

You can always get both the remote video and audio tracks for a specific user ID in the call using these QBRTCSession methods:
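A sketch of the pieces above (delegate and accessor names as in the QuickbloxWebRTC headers; view and property names are placeholders):

```objectivec
// Render a remote video track when it arrives (QBRTCClientDelegate).
- (void)session:(QBRTCSession *)session
    receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack
                    fromUser:(NSNumber *)userID {
    // self.opponentVideoView is a QBRTCRemoteVideoView created in the storyboard.
    [self.opponentVideoView setVideoTrack:videoTrack];
}

// Mute a specific user's audio via the remote audio track.
- (void)session:(QBRTCSession *)session
    receivedRemoteAudioTrack:(QBRTCAudioTrack *)audioTrack
                    fromUser:(NSNumber *)userID {
    audioTrack.enabled = NO;
}
```

Or fetch the tracks on demand: `[self.session remoteAudioTrackWithUserID:@2123]` and `[self.session remoteVideoTrackWithUserID:@2123]`.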

Manage local video track

To display your local video track from the camera, create a UIView in your storyboard and then use the following code:
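A sketch of the capture setup (class and property names from the QuickbloxWebRTC headers; the preview view is a placeholder):

```objectivec
// Configure the camera capture and attach it to the session's local video track.
QBRTCVideoFormat *format = [QBRTCVideoFormat defaultFormat];
self.videoCapture =
    [[QBRTCCameraCapture alloc] initWithVideoFormat:format
                                           position:AVCaptureDevicePositionFront];

// Show the camera preview inside a plain UIView from the storyboard.
[self.videoCapture startSession:nil];
self.videoCapture.previewLayer.frame = self.localVideoView.bounds;
[self.localVideoView.layer insertSublayer:self.videoCapture.previewLayer atIndex:0];

// Feed the capture into the call.
self.session.localMediaStream.videoTrack.videoCapture = self.videoCapture;
```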

Hang up

To hang up a call:

After this, your opponents will receive a hangUp signal:

Then, if all opponents are inactive, QBRTCClient delegates will be notified with:
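Sketched in code (delegate names as in the QuickbloxWebRTC headers; userInfo is a placeholder):

```objectivec
[self.session hangUp:@{@"hangup" : @"bye"}]; // userInfo is optional

// Opponents receive:
- (void)session:(QBRTCSession *)session hungUpByUser:(NSNumber *)userID userInfo:(NSDictionary *)userInfo {
    NSLog(@"User %@ hung up", userID);
}

// And once every connection is closed:
- (void)sessionDidClose:(QBRTCSession *)session {
    self.session = nil;
}
```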

Disable / enable audio stream

You can disable / enable the audio stream during a call:

Please note: due to WebRTC limitations, silence will be placed into the stream content if audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:

Please note: due to WebRTC limitations, black frames will be placed into the stream content if video is disabled.
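Both toggles are simple property flips on the local media stream:

```objectivec
// Toggle local audio (silence is sent while disabled):
self.session.localMediaStream.audioTrack.enabled =
    !self.session.localMediaStream.audioTrack.enabled;

// Toggle local video (black frames are sent while disabled):
self.session.localMediaStream.videoTrack.enabled =
    !self.session.localMediaStream.videoTrack.enabled;
```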

Switch camera

You can switch the video capture position during a call (default: front camera):

'videoCapture' below is the QBRTCCameraCapture described in CallController above.
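A sketch of the camera flip (the hasCameraForPosition: check is assumed from the QBRTCCameraCapture headers; verify against your SDK version):

```objectivec
// Flip between front and back cameras, checking availability first.
AVCaptureDevicePosition position = self.videoCapture.position;
AVCaptureDevicePosition newPosition =
    (position == AVCaptureDevicePositionBack) ? AVCaptureDevicePositionFront
                                              : AVCaptureDevicePositionBack;
if ([self.videoCapture hasCameraForPosition:newPosition]) {
    self.videoCapture.position = newPosition;
}
```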

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated as of version 2.3. From now on you should use the QBRTCAudioSession class instead. Its methods look almost the same as the Sound Router ones, except that the class is more customizable and conforms to more requirements.

QBRTCAudioSession also has a delegate protocol with helpful methods:

QBRTCAudioSession also introduces some new properties that might be helpful:

Background mode

Use the QuickbloxWebRTC.framework in applications running in the background state.

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is important not to skip this step.

There is also a UI for setting app background modes in Xcode 5. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a red background of the status bar, as well as an additional bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are happy to introduce a new feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you the ability to promote your product, share a screen with formulas to students, broadcast podcasts, and share video/audio/photo moments of your life in real time all over the world.

To implement this feature in your application, we give you the ability to create a custom video capture.

Video capture is a base class you should inherit from in order to send frames to your opponents.

Custom video capture

The QBRTCVideoCapture class allows you to send frames to your opponents.

By inheriting this class you are able to provide custom logic to create frames, modify them, and then send them to your opponents.

Below you can find an example of how to implement a custom video capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.

For the full source code of the custom capture and additional methods, please refer to the sample-videochat-webrtc sample.

To link this capture to your local video track, simply use:

Calling offline users

We made it easy to call offline users.

Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

You can find out how to configure push notifications in your application here

Assuming you have working push notifications, it is very easy to notify users about a new call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If the application is in the background, the opponent will see a push notification.

If the application is in the foreground, nothing will happen in the UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To start collecting report information, do the following:

And classes that adopt the QBRTCClientDelegate protocol will be notified with:

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check whether the user is currently speaking.

You can also use an already parsed, readable string that we provide with the most significant stats for the current report; just use this method:
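A sketch of stats collection (the delegate signature and report class name are taken from the QuickbloxWebRTC headers; the statsString accessor is an assumption, so verify it against your SDK version):

```objectivec
// Enable stats collection once per second.
[QBRTCConfig setStatsReportTimeInterval:1.f];

// QBRTCClientDelegate stats callback.
- (void)session:(QBRTCSession *)session
    updatedStatsReport:(QBRTCStatsReport *)report
             forUserID:(NSNumber *)userID {
    // Log the pre-formatted summary, or inspect individual properties
    // such as audioSendInputLevel.
    NSLog(@"%@", [report statsString]);
}
```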

Recording audio and video calls

Starting from SDK version 2.6 there is a class called QBRTCRecorder. You cannot allocate it yourself; it is stored in each instance of QBRTCSession in the property named recorder if the requirements are met. Otherwise, the recorder property value will be nil.

Recorder requirements

  • The device must not be in the low-performance category. To check whether your device is in the low-performance category, use the UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one-to-one audio and video calls are supported for now.

Usage

Once you have created a new RTC session, you can start the recorder by accessing the recorder property of the session instance. Call the start method and pass the desired file URL:

You can configure the output file video settings and video orientation using these methods:

Once the call is finished, or whenever you want before that, simply call the stop method:

Note that the stop method is asynchronous and will take some time to finalize the recorded file. Once the completion block is called, the recording file should be available at the expected URL unless an error occurs. In order to handle any recorder errors, simply subscribe to the QBRTCRecorder delegate and handle this method:
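A sketch of the recorder flow (method and delegate names follow the QuickbloxWebRTC 2.6 headers; the output path is a placeholder, so verify the exact signatures against your SDK version):

```objectivec
// Start recording to a file as soon as the session is created. The recorder
// may be nil if the device or call type does not meet the requirements.
NSURL *fileURL = [NSURL fileURLWithPath:outputPath]; // outputPath: your choice
[self.session.recorder startRecordWithFileURL:fileURL];

// When the call ends (or earlier), finalize the file asynchronously.
[self.session.recorder stopRecord:^(NSURL *file) {
    NSLog(@"Recording saved at %@", file);
}];

// Handle recorder errors via the recorder's delegate.
- (void)recorder:(QBRTCRecorder *)recorder didFailWithError:(NSError *)error {
    NSLog(@"Recorder error: %@", error);
}
```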

Accessing remote audio data

Starting from SDK version 2.6, the QBRTCAudioTrack class (which represents a remote audio track for a specific user) supports an audio data sink through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using the methods:

Now handle the protocol method to access the audio data:

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription of the audio data, the number of frames in the current packet, and the current media time that corresponds to each packet.

Settings

You can change different settings for a session.

Set answer time interval

If an opponent does not answer you within the dialing time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called.

Default value: 45 seconds

Minimum value: 10 seconds

If the user does not answer within the given interval, the following delegate method will be called:

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: 5 seconds

Minimum value: 3 seconds
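Both intervals are set through QBRTCConfig class methods, for example:

```objectivec
// Answer time interval: how long we wait for an answer (default 45 s, min 10 s).
[QBRTCConfig setAnswerTimeInterval:45];

// Dialing time interval: how often call requests are re-sent (default 5 s, min 3 s).
[QBRTCConfig setDialingTimeInterval:5];
```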

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.

Set custom ICE servers

You can customize the list of ICE servers.

By default, the server in North Virginia (turn.quickblox.com) is used, but you can add/set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings we use; you can customize all of them or only some:
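A sketch of supplying a custom server list (the QBRTCICEServer factory name follows the QuickbloxWebRTC headers; URLs and credentials below are placeholders):

```objectivec
// Replace the default ICE server list with your own STUN/TURN entries.
QBRTCICEServer *stun = [QBRTCICEServer serverWithURL:@"stun:turn.quickblox.com"
                                            username:@""
                                            password:@""];
QBRTCICEServer *turn = [QBRTCICEServer serverWithURL:@"turn:turn.quickblox.com:3478?transport=udp"
                                            username:@"user"
                                            password:@"secret"];
[QBRTCConfig setICEServers:@[stun, turn]];
```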

Video codecs: VP8 vs H264

H264 is the preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.

Video quality

1. Video quality depends on the hardware you use. iPhone 4S will not handle FullHD rendering, but iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-party calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use our QBRTCCameraCapture formatsWithPosition: method to get all supported formats for the current device:

WebRTC automatically scales video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If some opponent's device in the call does not support H264, then VP8 will be used automatically.

2. If both the caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

OPUS

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice on audio codecs is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

iSAC

This codec was developed specially for VoIP applications and audio streaming.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rates: 32 kHz.

Good for voice data, but not as good as OPUS.

iLBC

This audio codec is well known; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you certainly should try iLBC — it should be strong in such cases.

Supported bitrates: fixed bitrate, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong, reliable, good internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
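Codec selection goes through the media stream configuration; a sketch (enum spellings follow the QuickbloxWebRTC headers, so verify against your SDK version):

```objectivec
// Select a preferred audio (and video) codec via the media stream configuration.
QBRTCMediaStreamConfiguration *conf = [QBRTCConfig mediaStreamConfiguration];
conf.audioCodec = QBRTCAudioCodecOpus;  // or the iSAC / iLBC enum values
conf.videoCodec = QBRTCVideoCodecH264;  // hardware-accelerated on iOS
[QBRTCConfig setMediaStreamConfiguration:conf];
```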

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed an issue with a disappearing user in a room when the internet connection is slow.
    • Added the ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method for this with the desired conference type enum.
    • Fixed the ability to subscribe to a user in the session without being required to join the room (this introduces the ability to receive someone’s media without sending your own).
  • Fixed a potential memory leak in video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession finishes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method allows you to adapt frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed the old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participation in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get on the client for a specific user in a call.
  • Added new audioLevelControlEnabled property to QBRTCMediaStreamConfiguration class. Determines whether the WebRTC audio level control is enabled. Rough example: slightly reducing the audio volume for all tracks while you are talking (the local audio track is receiving sound). Default value is NO.
  • Removed old methods deprecated in 2.3 from QBRTCCameraCapture class.
    • Removed startSession deprecated method; use startSession: instead.
    • Removed stopSession deprecated method; use stopSession: instead.
    • Removed stopSessionAndTeardownOutputs: deprecated method; use stopSession: instead.
    • Removed selectCameraPosition: deprecated method; use setPosition: instead.
    • Removed currentPosition deprecated method; use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on, QBRTCClient manages the deinitialization of WebRTC by itself after the initial initialization by the initializeRTC method. Just remove usage of this method.
  • Removed old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in QBRTCConnectionState enum.
  • Removed QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in QBRTCPixelFormat enum. Those formats weren’t implemented by the SDK and were entirely unsupported.
  • Removed initWithPixelBuffer: deprecated method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Ordinary code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Contraptions
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Began with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following guidelines in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open showcasing an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Trio: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to embark using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Haul the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Three: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

From version Two.Four QuickbloxWebRTC is required to be added as Embedded binary as it is dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to commence using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug, that does not permitting to publish archives to the App store with dynamic frameworks that contains simulator platforms. Script will only work for archiving.

Life cycle

Call users

To call users just use this method:

After this your opponents (users with IDs= 2123, 2123, 3122) will receive one call request per five 2nd for a duration of forty five seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to this session. Each particular audio – movie call has a unique sessionID. This permits you to have more than one independent audio-video conferences.

If you want to increase the call timeout, e.g. set to sixty seconds:

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called in case when user did not react to your call within timeout .

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set reaction time interval

Called in case when connection failed with user.

States

Called when QBRTCSession state was switched. Session’s state might be fresh, pending, connecting, connected and closed.

Called when session connection state switched for a specific user. Connection state might be unknown, fresh, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no reaction, rejected, hangup and failed.

Manage remote media tracks

In order to display movie views with rivulets which you have received from your opponents you should create QBRTCRemoteVideoView views on storyboard and then use the following code:

You can as well get remote audio track for a specific user in call using this QBRTCClientDelegate method (use it, for example, to mute a specific user audio in call:

You can always get both remote movie and audio tracks for a specific user ID in call using these QBRTCSession methods:

Manage local movie track

In order to demonstrate your local movie track from camera you should create UIView on storyboard and then use the following code:

Suspend up

To string up a up call:

After this your opponent’s will receive a hangUp signal

In the next step if all opponents are inactive then QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:

Please note: due to webrtc limitations muffle will be placed into stream content if audio is disabled.

Disable / enable movie stream

You can disable / enable the movie stream during a call:

Please note: due to webrtc confinements black frames will be placed into stream content if movie is disabled.

Switch camera

You can switch the movie capture position during a call (Default: front camera):

‘videoCapture’ below is QBRTCCameraCapture described in CallController above

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated from version Two.Trio. Instead from now on you should use QBRTCAudioSession class. Audio Session methods looks almost the same as Sound Router ones, with exception of being more customizable and conform to many requirements.

QBRTCAudioSession also does have a delegate protocol with helpful methods:

Also QBRTCAudioSession introducing some fresh properties, that might be also helpful in any case:

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is significant not to skip this step.

There is also a UI for setting app background modes in XCode Five. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a crimson background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are glad to introduce you a fresh feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you an capability to promote your product, share a screen with formulas to students, distribute podcasts, share movie/audio/photo moments of your life in real-time all over the world.

To implement this feature in your application we give you the capability to create custom-built movie capture.

Movie capture is a base class you should inherit from in order to send frames you your opponents.

Custom-made movie capture

QBRTCVideoCapture class permits to send frames to your opponents.

By inheriting this class you are able to provide custom-built logic to create frames, modify them and then send to your opponents.

Below you can find an example of how to implement a custom-made movie capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that permits your application to synchronize its drawing to the refresh rate of the display.

For total source code of custom-made capture and extra methods please refer to sample-videochat-webrtc sample

To link this capture to your local movie track simply use:

Calling offline users

We made it effortless to call offline users.

Quickblox iOS SDK provides methods to notify an application about fresh events even if application is closed.

How to configure Push-notifications in your application you can find here

Assuming you have working thrust notifications it is very effortless to notify users about fresh call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If application is in background, opponent will see a shove notification.

If application is in foreground, nothing will happen in UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To begin collecting report information do the following:

And classes that adopt QBRTCClientDelegate protocol will be notified with

For example, audioSendInputLevel property indicates mic input level even while audio track disabled, so you can check if user is presently speaking/talking.

You can also use already parsed and readable string that we are providing with most significant stats for current report, just use this method:

Recording audio and movie calls

From SDK version Two.6 there is a class, called QBRTCRecorder. You cannot allocate it by yourself, but it is stored in each example of QBRTCSession by the property named recorder if the requirements conform. Otherwise, recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in low spectacle category use UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one to one audio and movie calls are supported for now.

Usage

Once you have created fresh rtc session, you can commence recorder by accessing recorder property in session example. Call embark method and input desired file url:

You can configure output file movie settings and movie orientation using these methods:

Once the call is finished or whenever you want before that you need to simply call stop method:

Note that stop method is asynchronous and will take some time to finalize record file. Once the completion block is called, recording file should be ready by expected url unless some error happens. In order to treat any recorder errors, simply subscribe to delegate of QBRTCRecorder and treat this method:

Accessing remote audio data

From SDK version 2.6, the QBRTCAudioTrack class (which represents the remote audio track for a specific user) supports an audio data sink through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using these methods:

Now implement the protocol method to access the audio data:

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription describing the audio data, the number of frames in the current packet, and the current media time that corresponds to each packet.
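The subscription and the sink callback might look like this. The callback below mirrors the parameters listed above but should be treated as an approximation of the actual QBRTCAudioTrackSinkInterface method, not its definitive signature:

```objectivec
// Subscribe self (conforming to QBRTCAudioTrackSinkInterface) to the track.
[audioTrack addSink:self];
// Unsubscribe when done: [audioTrack removeSink:self];

// Sink callback invoked in real time with raw audio data.
- (void)audioTrack:(QBRTCAudioTrack *)audioTrack
    didSinkAudioBufferList:(const AudioBufferList *)audioBufferList
    audioBufferDescription:(const AudioStreamBasicDescription)audioBufferDescription
            numberOfFrames:(size_t)numberOfFrames
                      time:(CMTime)time {
    // Process the PCM buffers here (level metering, custom effects, etc.).
}
```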

Settings

You can change various settings for a session.

Set answer time interval

If an opponent does not answer you within the answer time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called.

Default value: 45 seconds

Minimum value: 10 seconds

If the user does not answer within the given interval, the following delegate method will be called:
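For example, to raise the answer timeout to 60 seconds, assuming the QBRTCConfig setter is named setAnswerTimeInterval: as referenced in this SDK's API:

```objectivec
// Opponents now have 60 seconds to answer before
// userDidNotRespond: and connectionClosedForUser: fire.
[QBRTCConfig setAnswerTimeInterval:60];
```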

Set dialing time interval

Indicates how often we send notifications to your opponents about your call.

Default value: 5 seconds

Minimum value: 3 seconds
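A one-line sketch, assuming the corresponding QBRTCConfig setter is named setDialingTimeInterval::

```objectivec
// Re-send the call notification to opponents every 5 seconds (minimum 3).
[QBRTCConfig setDialingTimeInterval:5];
```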

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. It establishes a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS-encrypted connection.
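Since DTLS is on by default, an explicit call is rarely needed; assuming a QBRTCConfig setter named setDTLSEnabled:, it would look like:

```objectivec
// DTLS is enabled by default; shown only for completeness.
[QBRTCConfig setDTLSEnabled:YES];
```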

Set custom ICE servers

You can customize a list of ICE servers.

By default, the server in Northern Virginia (turn.quickblox.com) is used, but you can add or set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings we use; you can customize all of them or only particular ones:
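A sketch of replacing the ICE server list; the QBRTCICEServer factory method name and the placeholder credentials are assumptions to be checked against your SDK:

```objectivec
// Placeholder credentials; substitute your own TURN username/password.
QBRTCICEServer *us = [QBRTCICEServer serverWithURL:@"turn:turn.quickblox.com"
                                          username:@"user"
                                          password:@"secret"];
QBRTCICEServer *asia = [QBRTCICEServer serverWithURL:@"turn:turnsingapore.quickblox.com"
                                            username:@"user"
                                            password:@"secret"];
// WebRTC will pick the relay with the lowest round-trip time.
[QBRTCConfig setICEServers:@[us, asia]];
```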

Video codecs: VP8 vs H264

H264 is the preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.

Video quality

1. Video quality depends on the hardware you use. iPhone 4S will not handle FullHD rendering, but iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-user calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use our QBRTCCameraCapture formatsWithPosition: method in order to get all supported formats for the current device:
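For instance, enumerating the supported formats and building a capture from one of them (the initializer name is assumed from SDK conventions):

```objectivec
// All video formats supported by the front camera on this device.
NSArray *formats =
    [QBRTCCameraCapture formatsWithPosition:AVCaptureDevicePositionFront];

// Pick a format (e.g. the last, typically the highest resolution)
// and create the capture with it.
QBRTCVideoFormat *format = formats.lastObject;
QBRTCCameraCapture *capture =
    [[QBRTCCameraCapture alloc] initWithVideoFormat:format
                                           position:AVCaptureDevicePositionFront];
```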

WebRTC automatically scales video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If any opponent's device in the call does not support H264, VP8 will automatically be used.

2. If both caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

OPUS

In the latest versions of Firefox and Chrome, this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice of audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

iSAC

This codec was developed specifically for VoIP applications and audio streaming.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.

iLBC

This audio codec is well-known; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC; it should be robust in such cases.

Supported bitrate: fixed, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong and reliable internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
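A sketch of selecting a codec through the media stream configuration; the property and enum names below follow QuickbloxWebRTC naming conventions but should be verified against your SDK headers:

```objectivec
// Start from the default configuration and override the audio codec.
QBRTCMediaStreamConfiguration *conf =
    [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.audioCodec = QBRTCAudioCodecOpus;   // or the iSAC / iLBC enum variants
[QBRTCConfig setMediaStreamConfiguration:conf];
```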

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed an issue with a disappearing user in a room when the internet connection is slow.
    • Added the ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method with the desired conference type enum.
    • Fixed the ability to subscribe to a user in the session without being required to join the room (this introduces the ability to receive someone's media without sending your own).
  • Fixed a potential memory leak in video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to the QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession completes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to the QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to the QBRTCVideoCapture class. This method allows you to adapt the frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to the QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed the old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participation in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to the QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added new audioLevelControlEnabled property in the QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed the old methods deprecated in 2.3 from the QBRTCCameraCapture class.
    • Removed deprecated startSession method; use startSession: instead.
    • Removed deprecated stopSession method; use stopSession: instead.
    • Removed deprecated stopSessionAndTeardownOutputs: method; use stopSession: instead.
    • Removed deprecated selectCameraPosition: method; use setPosition: instead.
    • Removed deprecated currentPosition method; use position instead.
  • Deprecated the deinitializeRTC method in the QBRTCClient class. From now on QBRTCClient manages the deinitialization of WebRTC itself after the initial initialization by the initializeRTC method. Just remove usage of this method.
  • Removed the old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in the QBRTCConnectionState enum.
  • Removed QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in the QBRTCPixelFormat enum. Those formats weren't implemented by the SDK and were completely unsupported.
  • Removed deprecated initWithPixelBuffer: method in the QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Ordinary code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Instruments
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Commenced with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following directions in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open displaying an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Trio: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to embark using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Haul the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Trio: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

From version Two.Four QuickbloxWebRTC is required to be added as Embedded binary as it is dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to embark using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug that prevents publishing archives to the App Store when dynamic frameworks contain simulator slices. The script runs only when archiving.

Life cycle

Call users

To call users just use this method:
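A minimal sketch of creating and starting a session (selector names follow the QuickBlox 2.x API; treat them as assumptions if your SDK version differs):

```objectivec
// Create a session with the opponents and start a video call.
QBRTCSession *session =
    [[QBRTCClient instance] createNewSessionWithOpponents:@[@2123, @2123, @3122]
                                       withConferenceType:QBRTCConferenceTypeVideo];
self.session = session;

// The optional userInfo dictionary travels with the call request.
[session startCall:@{@"key" : @"value"}];
```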

After this, your opponents (users with IDs 2123, 2123, 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings with QBRTCConfig):

self.session refers to the current session. Each particular audio/video call has a unique sessionID, which allows you to have more than one independent audio-video conference.

If you want to increase the call timeout, e.g. set it to 60 seconds:
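For example, using the +[QBRTCConfig setAnswerTimeInterval:] method referenced later in this document:

```objectivec
// Raise the answer timeout from the default 45 seconds to 60 seconds.
[QBRTCConfig setAnswerTimeInterval:60];
```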

Accept a call

To accept a call request just use this method:
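A sketch, assuming the incoming session is delivered through the QBRTCClientDelegate callback:

```objectivec
// QBRTCClientDelegate: an incoming call request arrives as a new session.
- (void)didReceiveNewSession:(QBRTCSession *)session userInfo:(NSDictionary *)userInfo {
    self.session = session;
    [session acceptCall:nil]; // optionally pass a userInfo dictionary instead of nil
}
```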

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:
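A sketch (rejectCall: takes an optional userInfo dictionary; the key below is a placeholder):

```objectivec
// Decline the incoming call; the userInfo dictionary reaches the caller.
[self.session rejectCall:@{@"reason" : @"busy"}];
```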

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user:

Called when connection is established with user:

Called when user is disconnected:

Called when a user did not respond to your call within the timeout.

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval

Called when connection failed with user.

States

Called when the QBRTCSession state is changed. The session state can be new, pending, connecting, connected or closed.

Called when the session connection state changes for a specific user. The connection state can be unknown, new, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no answer, rejected, hangup or failed.

Manage remote media tracks

In order to display video views with the streams you have received from your opponents, you should create QBRTCRemoteVideoView views on the storyboard and then use the following code:
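A sketch of the delegate callback, assuming a QBRTCRemoteVideoView outlet named opponentVideoView:

```objectivec
// QBRTCClientDelegate: attach a newly received remote video track to a view.
- (void)session:(QBRTCSession *)session
    receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack
                    fromUser:(NSNumber *)userID {
    [self.opponentVideoView setVideoTrack:videoTrack];
}
```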

You can also get the remote audio track for a specific user in the call using this QBRTCClientDelegate method (use it, for example, to mute a specific user's audio in the call):

You can always get both remote video and audio tracks for a specific user ID in the call using these QBRTCSession methods:

Manage local video track

In order to display your local video track from the camera, you should create a UIView on the storyboard and then use the following code:
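A sketch, assuming a UIView outlet named localVideoView; the QBRTCCameraCapture initializer and startSession: match the changelog below, the rest are assumptions:

```objectivec
// Create a camera capture and link it to the session's local video track.
QBRTCVideoFormat *format = [QBRTCVideoFormat defaultFormat];
self.videoCapture =
    [[QBRTCCameraCapture alloc] initWithVideoFormat:format
                                           position:AVCaptureDevicePositionFront];
self.session.localMediaStream.videoTrack.videoCapture = self.videoCapture;

// Show the camera preview inside the storyboard view.
self.videoCapture.previewLayer.frame = self.localVideoView.bounds;
[self.videoCapture startSession:nil];
[self.localVideoView.layer insertSublayer:self.videoCapture.previewLayer atIndex:0];
```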

Hang up

To hang up a call:
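A sketch (hangUp: takes an optional userInfo dictionary; the key below is a placeholder):

```objectivec
// End the call; the userInfo dictionary is optional.
[self.session hangUp:@{@"hangup" : @"bye"}];
```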

After this your opponents will receive a hangUp signal

Then, if all opponents are inactive, QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:
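A sketch, assuming the track is reachable through the session's local media stream:

```objectivec
// Mute and unmute the local audio track during a call.
self.session.localMediaStream.audioTrack.enabled = NO;  // mute
self.session.localMediaStream.audioTrack.enabled = YES; // unmute
```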

Please note: due to WebRTC limitations, silence will be placed into the stream content if audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:

Please note: due to WebRTC limitations, black frames will be placed into the stream content if video is disabled.

Switch camera

You can switch the video capture position during a call (default: front camera):
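A sketch; setPosition: and position match the changelog below, while hasCameraForPosition: is an assumption:

```objectivec
// Toggle between the front and back cameras on the local capture.
AVCaptureDevicePosition newPosition =
    self.videoCapture.position == AVCaptureDevicePositionBack
        ? AVCaptureDevicePositionFront
        : AVCaptureDevicePositionBack;

if ([self.videoCapture hasCameraForPosition:newPosition]) {
    [self.videoCapture setPosition:newPosition];
}
```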

‘videoCapture’ below is the QBRTCCameraCapture described in CallController above

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated since version 2.3. From now on you should use the QBRTCAudioSession class instead. Audio Session methods look almost the same as the Sound Router ones, except that the class is more customizable and conforms to more requirements.

QBRTCAudioSession also has a delegate protocol with helpful methods:

QBRTCAudioSession also introduces some new properties that might be helpful:

Background mode

Use the QuickbloxWebRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is significant not to skip this step.

There is also a UI for setting app background modes in Xcode 5. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a red background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are happy to introduce a new feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you the ability to promote your product, share a screen with formulas to students, distribute podcasts, and share video/audio/photo moments of your life in real time all over the world.

To implement this feature in your application, we give you the ability to create a custom video capture.

Video capture is a base class you should inherit from in order to send frames to your opponents.

Custom video capture

The QBRTCVideoCapture class allows you to send frames to your opponents.

By inheriting this class you are able to provide custom logic to create frames, modify them, and then send them to your opponents.

Below you can find an example of how to implement a custom video capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.

For the full source code of the custom capture and additional methods, please refer to the sample-videochat-webrtc sample

To link this capture to your local video track, simply use:

Calling offline users

We made it easy to call offline users.

Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

You can find out how to configure push notifications in your application here

Assuming you have working push notifications, it is very easy to notify users about a new call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If the application is in the background, the opponent will see a push notification.

If the application is in the foreground, nothing will happen in the UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To start collecting report information, do the following:

And classes that adopt the QBRTCClientDelegate protocol will be notified with

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check if the user is currently speaking.

You can also use an already parsed and readable string that we provide with the most significant stats for the current report; just use this method:

Recording audio and movie calls

From SDK version 2.6 there is a class called QBRTCRecorder. You cannot allocate it by yourself, but it is stored in each instance of QBRTCSession in the property named recorder if the requirements are met. Otherwise, the recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in the low-performance category, use the UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one-to-one audio and video calls are supported for now.

Usage

Once you have created a new RTC session, you can start the recorder by accessing the recorder property of the session instance. Call the start method and pass the desired file URL:
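A sketch; the selector name startRecordWithFileURL: is an assumption based on the description above:

```objectivec
// Start recording the call to a temporary local file.
NSString *path =
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"call_record.mp4"];
[self.session.recorder startRecordWithFileURL:[NSURL fileURLWithPath:path]];
```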

You can configure the output file video settings and video orientation using these methods:

Once the call is finished, or whenever you want before that, simply call the stop method:
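A sketch; the selector name stopRecord: and its block signature are assumptions:

```objectivec
// Stop asynchronously; the block fires once the record file is finalized.
[self.session.recorder stopRecord:^(NSURL *file) {
    NSLog(@"Recording saved to %@", file);
}];
```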

Note that the stop method is asynchronous and will take some time to finalize the record file. Once the completion block is called, the recording file should be ready at the expected URL unless some error happened. In order to handle any recorder errors, simply subscribe to the delegate of QBRTCRecorder and handle this method:

Accessing remote audio data

From SDK version 2.6, the QBRTCAudioTrack class (which represents the remote audio track for a specific user) supports an audio data sink through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using these methods:

Now handle the protocol method to access the audio data:
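A sketch of subscribing and of the sink callback; the exact selector spellings are assumptions based on the description:

```objectivec
// Subscribe this object (conforming to QBRTCAudioTrackSinkInterface) to the track,
// and call removeSink: later when you no longer need the data.
[audioTrack addSink:self];

// Sink callback: raw audio data arrives here in real time.
- (void)audioTrack:(QBRTCAudioTrack *)audioTrack
    didSinkAudioBufferList:(const AudioBufferList *)audioBufferList
    audioStreamDescription:(const AudioStreamBasicDescription)audioStreamDescription
            numberOfFrames:(size_t)numberOfFrames
                      time:(CMTime)time {
    // Inspect or process the AudioBufferList here.
}
```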

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription of the audio data, the number of frames in the current packet, and the current media time that corresponds to each packet.

Settings

You can change different settings for a session

Set answer time interval

If an opponent did not answer you within the answer time interval, then the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: 45 seconds

Minimum value: 10 seconds

If the user did not answer within the given interval, then the following delegate method will be called

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: 5 seconds

Minimum value: 3 seconds
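The two settings above can be sketched as follows (setAnswerTimeInterval: appears earlier in this document; setDialingTimeInterval: is the assumed counterpart):

```objectivec
// How long to wait for an answer (default 45 s, minimum 10 s).
[QBRTCConfig setAnswerTimeInterval:45];

// How often call requests are re-sent while dialing (default 5 s, minimum 3 s).
[QBRTCConfig setDialingTimeInterval:5];
```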

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.

Set custom ICE servers

You can customize a list of ICE servers.

By default, the server in Northern Virginia (turn.quickblox.com) is used, but you can add/set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.
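A sketch with placeholder credentials; the QBRTCICEServer factory selector is an assumption:

```objectivec
// Configure custom ICE (TURN) servers; username/password are placeholders.
QBRTCICEServer *us = [QBRTCICEServer serverWithURL:@"turn:turn.quickblox.com"
                                          username:@"user"
                                          password:@"secret"];
QBRTCICEServer *asia = [QBRTCICEServer serverWithURL:@"turn:turnsingapore.quickblox.com"
                                            username:@"user"
                                            password:@"secret"];
[QBRTCConfig setICEServers:@[us, asia]];
```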

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings that we use; you can customize all of them or only some:

Video codecs: VP8 vs H264

H264 is the preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.

Video quality

1. Video quality depends on the hardware you use. iPhone 4s will not handle FullHD rendering, but iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use our QBRTCCameraCapture formats with the position method in order to get all supported formats for the current device:

WebRTC has auto scaling of video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If some opponent's device in the call does not support H264, then VP8 will be used automatically.

2. If both caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice of audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

This codec was developed specially for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.

This audio codec is well-known; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC — it should be robust in such cases.

Supported bitrates: fixed bitrate, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a stable, reliable and good internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
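A sketch; the configuration class appears in the changelog below, while the enum value is an assumption:

```objectivec
// Prefer the Opus audio codec for new sessions.
QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.audioCodec = QBRTCAudioCodecOpus;
[QBRTCConfig setMediaStreamConfiguration:conf];
```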

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed issue with disappearing user in a room when the internet connection is slow.
    • Added ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method for this with the desired conference type enum.
    • Fixed ability to subscribe to the user in session without being required to join the room (this introduces the ability to receive someone’s media without sending your own).
  • Fixed potential memory leak for video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r 18213
  • Added QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to the QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession ends an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to the QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to the QBRTCVideoCapture class. This method allows you to adapt frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to the QBRTCMediaStreamTrack class. This means that both the QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participation in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to the QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in the call.
  • Added new audioLevelControlEnabled property to the QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing the audio volume for all tracks while you are talking (the local audio track is receiving sound). Default value is NO.
  • Removed methods from the QBRTCCameraCapture class that were deprecated in 2.3.
    • Removed startSession deprecated method, use startSession: instead.
    • Removed stopSession deprecated method, use stopSession: instead.
    • Removed stopSessionAndTeardownOutputs: deprecated method, use stopSession: instead.
    • Removed selectCameraPosition: deprecated method, use setPosition: instead.
    • Removed currentPosition deprecated method, use position instead.
  • Deprecated deinitializeRTC method in the QBRTCClient class. From now on QBRTCClient manages WebRTC deinitialization itself after the initial initialization by the initializeRTC method. Just remove any usage of this method.
  • Removed old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in the QBRTCConnectionState enum.
  • Removed QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in the QBRTCPixelFormat enum. Those formats weren’t implemented by the SDK and were totally unsupported.
  • Removed initWithPixelBuffer: deprecated method in the QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Elementary code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Devices
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Embarked with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following directions in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open demonstrating an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Trio: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to embark using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Haul the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Trio: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

From version Two.Four QuickbloxWebRTC is required to be added as Embedded binary as it is dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to begin using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug, that does not permitting to publish archives to the App store with dynamic frameworks that contains simulator platforms. Script will only work for archiving.

Life cycle

Call users

To call users just use this method:

After this your opponents (users with IDs= 2123, 2123, 3122) will receive one call request per five 2nd for a duration of forty five seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to this session. Each particular audio – movie call has a unique sessionID. This permits you to have more than one independent audio-video conferences.

If you want to increase the call timeout, e.g. set to sixty seconds:

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called in case when user did not react to your call within timeout .

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set response time interval

Called in case when connection failed with user.

States

Called when QBRTCSession state was switched. Session’s state might be fresh, pending, connecting, connected and closed.

Called when session connection state switched for a specific user. Connection state might be unknown, fresh, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no response, rejected, hangup and failed.

Manage remote media tracks

In order to demonstrate movie views with flows which you have received from your opponents you should create QBRTCRemoteVideoView views on storyboard and then use the following code:

You can as well get remote audio track for a specific user in call using this QBRTCClientDelegate method (use it, for example, to mute a specific user audio in call:

You can always get both remote movie and audio tracks for a specific user ID in call using these QBRTCSession methods:

Manage local movie track

In order to display your local movie track from camera you should create UIView on storyboard and then use the following code:

Drape up

To suspend a up call:

After this your opponent’s will receive a hangUp signal

In the next step if all opponents are inactive then QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:

Please note: due to webrtc confinements muffle will be placed into stream content if audio is disabled.

Disable / enable movie stream

You can disable / enable the movie stream during a call:

Please note: due to webrtc confinements black frames will be placed into stream content if movie is disabled.

Switch camera

You can switch the movie capture position during a call (Default: front camera):

‘videoCapture’ below is QBRTCCameraCapture described in CallController above

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated from version Two.Trio. Instead from now on you should use QBRTCAudioSession class. Audio Session methods looks almost the same as Sound Router ones, with exception of being more customizable and conform to many requirements.

QBRTCAudioSession also does have a delegate protocol with helpful methods:

Also QBRTCAudioSession introducing some fresh properties, that might be also helpful in any case:

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is significant not to skip this step.

There is also a UI for setting app background modes in XCode Five. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a crimson background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are glad to introduce you a fresh feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you an capability to promote your product, share a screen with formulas to students, distribute podcasts, share movie/audio/photo moments of your life in real-time all over the world.

To implement this feature in your application we give you the capability to create custom-built movie capture.

Movie capture is a base class you should inherit from in order to send frames you your opponents.

Custom-made movie capture

QBRTCVideoCapture class permits to send frames to your opponents.

By inheriting this class you are able to provide custom-made logic to create frames, modify them and then send to your opponents.

Below you can find an example of how to implement a custom-built movie capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that permits your application to synchronize its drawing to the refresh rate of the display.

For utter source code of custom-made capture and extra methods please refer to sample-videochat-webrtc sample

To link this capture to your local movie track simply use:

Calling offline users

We made it effortless to call offline users.

Quickblox iOS SDK provides methods to notify an application about fresh events even if application is closed.

How to configure Push-notifications in your application you can find here

Assuming you have working shove notifications it is very effortless to notify users about fresh call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If application is in background, opponent will see a shove notification.

If application is in foreground, nothing will happen in UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To commence collecting report information do the following:

And classes that adopt QBRTCClientDelegate protocol will be notified with

For example, audioSendInputLevel property indicates mic input level even while audio track disabled, so you can check if user is presently speaking/talking.

You can also use already parsed and readable string that we are providing with most significant stats for current report, just use this method:

Recording audio and movie calls

From SDK version Two.6 there is a class, called QBRTCRecorder. You cannot allocate it by yourself, but it is stored in each example of QBRTCSession by the property named recorder if the requirements conform. Otherwise, recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in low spectacle category use UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one to one audio and movie calls are supported for now.

Usage

Once you have created fresh rtc session, you can begin recorder by accessing recorder property in session example. Call begin method and input desired file url:

You can configure output file movie settings and movie orientation using these methods:

Once the call is finished or whenever you want before that you need to simply call stop method:

Note that stop method is asynchronous and will take some time to finalize record file. Once the completion block is called, recording file should be ready by expected url unless some error happens. In order to treat any recorder errors, simply subscribe to delegate of QBRTCRecorder and treat this method:

Accessing remote audio data

From SDK version Two.6 QBRTCAudioTrack class (that represents remote audio track for a specific user) supports audio data bury through freshly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to bury interface using methods:

Now treat protocol method to access audio data:

This interface provides AudioBufferList with audio data, AudioStreamBasicDescription description of audio data, a number of frames in current packet, and current media time that conforms to each packet.

Settings

You can switch different settings for a session

Set reaction time interval

If an opponent did not response you within dialing time interval, then userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: forty five seconds

Minimum value: ten seconds

If user did not react within the given interval, then a following delegate method will be called

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: five seconds

Minimum value: three seconds

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.

Set custom-made ICE servers

You can customize a list of ICE servers.

By default, the server in North Virginia turn.quickblox.com is used, but you can add/setup more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if numerous options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting numerous TURN servers permits your application to scale-up in terms of bandwidth and number of users.

Here is a list with default settings that we use, you can customize all of them or only some particular:

Movie codecs: VP8 vs H264

H264 is the most preferred movie codec for iOS.

Chrome added support for H264 movie codec in fifty revision.

H264 is the only one movie codec for iOS that has hardware support.

Movie quality

1. Movie quality depends on hardware you use. iPhone 4s will not treat FullHD rendering. But iPhone 6+ will.

Two. Movie quality depends on network you use and how many connections you have.

For multi-calls set lower movie quality. For peer-to-peer calls you can set higher quality.

You can use our QBRTCCameraCapture formats with position method in order to get all supported formats for current device:

WebRTC has auto scaling of movie resolution and quality to keep network connection active.

To get best quality and spectacle you should use H264.

1. If some opponent user in call devices do not support H264, then automatically VP8 will be used

Two. If both caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice on audio codecs is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

This codec was developed specifically for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.

This audio codec is well-known; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC; it should be robust in such cases.

Supported bitrates: fixed bitrate, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong, reliable and good internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
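A sketch of enabling a specific audio codec via QBRTCMediaStreamConfiguration (the audioCodec property and the enum names follow the public QuickBlox samples and may differ per SDK version):

```objectivec
// Configure the preferred audio codec before starting a call.
QBRTCMediaStreamConfiguration *mediaConfig = [QBRTCMediaStreamConfiguration defaultConfiguration];
mediaConfig.audioCodec = QBRTCAudioCodecISAC; // or the Opus / iLBC enum variants
[QBRTCConfig setMediaStreamConfiguration:mediaConfig];
```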

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed an issue with a disappearing user in a room when the internet connection is slow.
    • Added the ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method for this with the desired conference type enum.
    • Fixed the ability to subscribe to a user in the session without being required to join the room (this introduces the ability to receive someone’s media without sending your own).
  • Fixed a potential memory leak in video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession ends an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method allows you to adapt frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed the old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participating in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added new audioLevelControlEnabled property to QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed old methods deprecated in 2.3 from QBRTCCameraCapture class.
    • Removed startSession deprecated method, use startSession: instead.
    • Removed stopSession deprecated method, use stopSession: instead.
    • Removed stopSessionAndTeardownOutputs: deprecated method, use stopSession: instead.
    • Removed selectCameraPosition: deprecated method, use setPosition: instead.
    • Removed currentPosition deprecated method, use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on, QBRTCClient manages deinitialization of WebRTC by itself after the initial initializeRTC call. Just remove usage of this method.
  • Removed the old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in QBRTCConnectionState enum.
  • Removed QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in QBRTCPixelFormat enum. Those formats weren’t implemented by the SDK and were fully unsupported.
  • Removed initWithPixelBuffer: deprecated method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step 2: Add the framework to your Xcode Project

Drag the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step 3: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step 4: Embedded binary for Dynamic framework

From version 2.4, QuickbloxWebRTC must be added as an Embedded binary as it is a dynamic framework.

Step 5: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in the <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug that prevents publishing archives with dynamic frameworks that contain simulator platforms to the App Store. The script will only run when archiving.

Life cycle

Call users

To call users just use this method:
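A sketch of starting a call. The session factory and startCall: names follow the public QuickBlox samples but may differ per SDK version; the opponent IDs and conference type are illustrative.

```objectivec
// Opponent IDs are illustrative; pass the user IDs you want to call.
NSArray *opponentIDs = @[@2123, @3122];
QBRTCSession *newSession =
    [[QBRTCClient instance] createNewSessionWithOpponents:opponentIDs
                                       withConferenceType:QBRTCConferenceTypeVideo];
self.session = newSession;
// A userInfo dictionary with custom data may be passed instead of nil.
[self.session startCall:nil];
```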

After this your opponents (users with IDs 2123, 2123, 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings with QBRTCConfig):

self.session refers to the current session. Each audio/video call has a unique sessionID. This allows you to have more than one independent audio/video conference.

If you want to increase the call timeout, e.g. set it to 60 seconds:
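For example, using the +[QBRTCConfig setAnswerTimeInterval:] method mentioned in the connection life-cycle notes:

```objectivec
// Wait up to 60 seconds for the opponents to answer.
[QBRTCConfig setAnswerTimeInterval:60];
```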

Accept a call

To accept a call request just use this method:
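A sketch (the acceptCall: name follows the public QuickBlox samples; the userInfo dictionary is optional and illustrative):

```objectivec
// Accept the incoming call; a custom userInfo dictionary may be passed instead of nil.
[self.session acceptCall:nil];
```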

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:
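A sketch (the rejectCall: name follows the public QuickBlox samples):

```objectivec
// Reject the incoming call; a custom userInfo dictionary may be passed instead of nil.
[self.session rejectCall:nil];
```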

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called when a user did not respond to your call within the timeout.

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval

Called in case when connection failed with user.

States

Called when the QBRTCSession state has changed. The session’s state can be new, pending, connecting, connected or closed.

Called when the session connection state has changed for a specific user. The connection state can be unknown, new, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no answer, rejected, hangup or failed.

Manage remote media tracks

In order to display video views with streams which you have received from your opponents you should create QBRTCRemoteVideoView views on the storyboard and then use the following code:
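A sketch of the QBRTCClientDelegate callback (the method signature follows the public QuickBlox samples; self.opponentVideoView is an assumed QBRTCRemoteVideoView outlet from your storyboard):

```objectivec
// Called when a remote video track is received from an opponent.
- (void)session:(QBRTCSession *)session
    receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack
                    fromUser:(NSNumber *)userID {
    // Attach the remote track to the storyboard view.
    [self.opponentVideoView setVideoTrack:videoTrack];
}
```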

You can also get the remote audio track for a specific user in the call using this QBRTCClientDelegate method (use it, for example, to mute a specific user’s audio in a call):

You can always get both remote video and audio tracks for a specific user ID in the call using these QBRTCSession methods:

Manage local video track

In order to display your local video track from the camera you should create a UIView on the storyboard and then use the following code:
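A sketch of setting up the camera preview (initializer and previewLayer names follow the public QuickBlox samples; self.localVideoView is an assumed UIView outlet from your storyboard):

```objectivec
// Create a camera capture with the default format and front camera.
QBRTCVideoFormat *format = [QBRTCVideoFormat defaultFormat];
self.videoCapture = [[QBRTCCameraCapture alloc] initWithVideoFormat:format
                                                           position:AVCaptureDevicePositionFront];
// Show the local preview inside the storyboard view.
self.videoCapture.previewLayer.frame = self.localVideoView.bounds;
[self.videoCapture startSession:nil];
[self.localVideoView.layer insertSublayer:self.videoCapture.previewLayer atIndex:0];
```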

Hang up

To hang up a call:
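A sketch (the hangUp: name matches the hangUp signal described below; the userInfo contents are illustrative):

```objectivec
// End the call; the dictionary is optional custom data for your opponents.
[self.session hangUp:@{@"reason" : @"user ended the call"}];
```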

After this your opponents will receive a hangUp signal

Then, if all opponents are inactive, QBRTCClient delegates will be notified with:

Disable / enable audio stream

You can disable / enable the audio stream during a call:
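Assuming the session's local media stream exposes an audio track with an enabled flag (as in the public QuickBlox samples):

```objectivec
// Mute the local audio track.
self.session.localMediaStream.audioTrack.enabled = NO;
// Unmute it again.
self.session.localMediaStream.audioTrack.enabled = YES;
```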

Please note: due to WebRTC limitations, silence will be placed into the stream content if audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:
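Assuming the local media stream exposes a video track with the same enabled flag as the audio track:

```objectivec
// Stop sending local video.
self.session.localMediaStream.videoTrack.enabled = NO;
// Resume sending local video.
self.session.localMediaStream.videoTrack.enabled = YES;
```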

Please note: due to WebRTC limitations, black frames will be placed into the stream content if video is disabled.

Switch camera

You can switch the video capture position during a call (default: front camera):

‘videoCapture’ below is the QBRTCCameraCapture described in CallController above
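A sketch of toggling the camera (position is the QBRTCCameraCapture property mentioned in the changelog; hasCameraForPosition: follows the public samples and may differ per SDK version):

```objectivec
// Flip between front and back cameras.
AVCaptureDevicePosition newPosition =
    (self.videoCapture.position == AVCaptureDevicePositionBack)
        ? AVCaptureDevicePositionFront
        : AVCaptureDevicePositionBack;
if ([self.videoCapture hasCameraForPosition:newPosition]) {
    self.videoCapture.position = newPosition;
}
```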

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated since version 2.3. From now on you should use the QBRTCAudioSession class instead. Audio Session methods look almost the same as the Sound Router ones, except for being more customizable and conforming to more requirements.

QBRTCAudioSession also has a delegate protocol with helpful methods:

QBRTCAudioSession also introduces some new properties that might be helpful:

Background mode

Use the QuickbloxWebRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is important not to skip this step.

There is also a UI for setting app background modes in Xcode 5. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist above. For completeness, we describe both methods, but the results are identical; you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a red background of the status bar, as well as an additional bar indicating the name of the app holding the active audio session, in this case, your app.

Screen sharing

We are happy to introduce a new feature of the QuickbloxWebRTC SDK: Screen sharing.

It gives you the ability to promote your product, share a screen with formulas to students, distribute podcasts, and share video/audio/photo moments of your life in real time all over the world.

To implement this feature in your application, we give you the ability to create a custom video capture.

Video capture is a base class you should inherit from in order to send frames to your opponents.

Custom video capture

The QBRTCVideoCapture class allows you to send frames to your opponents.

By inheriting this class you are able to provide custom logic to create frames, modify them, and then send them to your opponents.

Below you can find an example of how to implement a custom video capture and send frames to your opponents.
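A minimal sketch of such a capture. The hook names (didSetToVideoTrack:, didRemoveFromVideoTrack:), the sendVideoFrame: call, and the rotation enum follow the public QuickBlox samples but may differ per SDK version; currentScreenPixelBuffer is a hypothetical helper you would implement with your own rendering code.

```objectivec
// Sketch: a screen-sharing capture driven by CADisplayLink.
@interface ScreenCapture : QBRTCVideoCapture
@property (nonatomic, strong) CADisplayLink *displayLink;
@end

@implementation ScreenCapture

- (void)didSetToVideoTrack:(QBRTCLocalVideoTrack *)videoTrack {
    [super didSetToVideoTrack:videoTrack];
    // Fire on every display refresh to grab a new frame.
    self.displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(captureFrame)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)didRemoveFromVideoTrack:(QBRTCLocalVideoTrack *)videoTrack {
    [super didRemoveFromVideoTrack:videoTrack];
    [self.displayLink invalidate];
    self.displayLink = nil;
}

- (void)captureFrame {
    // currentScreenPixelBuffer is a hypothetical helper: render your screen
    // content into a CVPixelBufferRef here.
    CVPixelBufferRef pixelBuffer = [self currentScreenPixelBuffer];
    QBRTCVideoFrame *frame =
        [[QBRTCVideoFrame alloc] initWithPixelBuffer:pixelBuffer
                                       videoRotation:QBRTCVideoRotation_0];
    [super sendVideoFrame:frame];
}

@end
```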

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.

For the full source code of the custom capture and additional methods, please refer to the sample-videochat-webrtc sample.

To link this capture to your local video track, simply use:
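Assuming the local video track exposes a videoCapture property (as in the public QuickBlox samples), and self.screenCapture is an instance of the custom capture:

```objectivec
// Route frames from the custom capture into the local video track.
self.session.localMediaStream.videoTrack.videoCapture = self.screenCapture;
```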

Calling offline users

We made it easy to call offline users.

Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

How to configure push notifications in your application you can find here

Assuming you have working push notifications, it is very easy to notify users about a new call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If the application is in the background, the opponent will see a push notification.

If the application is in the foreground, nothing will happen in the UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To start collecting report information, do the following:
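A sketch of enabling stats collection (the setStatsReportTimeInterval: name follows the public QuickBlox samples):

```objectivec
// Report stats once per second; an interval of 0 disables reporting.
[QBRTCConfig setStatsReportTimeInterval:1];
```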

And classes that adopt QBRTCClientDelegate protocol will be notified with

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check if the user is currently speaking.

You can also use an already parsed and readable string that we provide with the most significant stats for the current report; just use this method:

Recording audio and movie calls

From SDK version 2.6 there is a class called QBRTCRecorder. You cannot allocate it yourself, but it is stored in each instance of QBRTCSession in the property named recorder if the requirements are met. Otherwise, the recorder property value will be nil.

Recorder requirements

  • The device must not be in a low-performance category. To check whether your device is in the low-performance category, use the UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one-to-one audio and video calls are supported for now.

Usage

Once you have created a new RTC session, you can start the recorder by accessing the recorder property of the session instance. Call the start method with the desired file URL:
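A sketch (the recorder property is described above; the startRecordWithFileURL: name is illustrative, so consult the QBRTCRecorder reference for the exact signature):

```objectivec
// Record the call to a temporary file; choose a persistent URL in production.
NSURL *recordURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"call_record.mp4"]];
[self.session.recorder startRecordWithFileURL:recordURL];
```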

You can configure the output file video settings and video orientation using these methods:

Once the call is finished, or whenever you want before that, simply call the stop method:

Note that the stop method is asynchronous and will take some time to finalize the record file. Once the completion block is called, the recording file should be ready at the expected URL unless some error happened. In order to handle any recorder errors, simply subscribe to the QBRTCRecorder delegate and handle this method:

Accessing remote audio data

From SDK version 2.6, the QBRTCAudioTrack class (which represents a remote audio track for a specific user) supports audio data sinking through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using the methods:
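A sketch of subscribing and unsubscribing (the addSink:/removeSink: names are assumed from common sink-interface conventions; check QBRTCAudioTrackSinkInterface in your SDK version):

```objectivec
// Start receiving audio buffers for this remote track.
[audioTrack addSink:self];
// Stop receiving audio buffers when you are done.
[audioTrack removeSink:self];
```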

Now handle the protocol method to access the audio data:

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription describing the audio data, the number of frames in the current packet, and the current media time that corresponds to each packet.

Settings

You can change different settings for a session

Set answer time interval

If an opponent did not answer you within the dialing time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: 45 seconds

Minimum value: 10 seconds

If the user did not answer within the given interval, the following delegate method will be called

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: 5 seconds

Minimum value: 3 seconds
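A sketch (the setDialingTimeInterval: name is assumed by analogy with the +[QBRTCConfig setAnswerTimeInterval:] method named earlier):

```objectivec
// Send a call request to the opponents every 5 seconds.
[QBRTCConfig setDialingTimeInterval:5];
```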

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.

Set custom-made ICE servers

You can customize a list of ICE servers.

By default, the server in North Virginia turn.quickblox.com is used, but you can add/setup more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if numerous options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting numerous TURN servers permits your application to scale-up in terms of bandwidth and number of users.

Here is a list with default settings that we use, you can customize all of them or only some particular:

Movie codecs: VP8 vs H264

H264 is the most preferred movie codec for iOS.

Chrome added support for H264 movie codec in fifty revision.

H264 is the only one movie codec for iOS that has hardware support.

Movie quality

1. Movie quality depends on hardware you use. iPhone 4s will not treat FullHD rendering. But iPhone 6+ will.

Two. Movie quality depends on network you use and how many connections you have.

For multi-calls set lower movie quality. For peer-to-peer calls you can set higher quality.

You can use our QBRTCCameraCapture formats with position method in order to get all supported formats for current device:

WebRTC has auto scaling of movie resolution and quality to keep network connection active.

To get best quality and spectacle you should use H264.

1. If some opponent user in call devices do not support H264, then automatically VP8 will be used

Two. If both caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio rivulets. This codec is relatively fresh (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from six kbit/s to five hundred ten kbit/s Supported sampling rates: from eight kHz to forty eight kHz

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice on audio codecs is OPUS.v

OPUS requires has the best quality, but it also requires a good internet connection.

This codec was developed special for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable. From ten kbit/s to fifty two kbit/s. Supported sampling rates: thirty two kHz

Good for voice data, but not as good as OPUS.

This audio codec is well-known, it was released in 2004, and became part of the WebRTC project in two thousand eleven when Google acquired Global IP Solutions (the company that developed iLIBC).

When you have very bad channels and low bandwidth, you undoubtedly should attempt iLBC — it should be strong on such cases.

Supported bitrates: motionless bitrate. 15.Two kbit/s or 13.33 kbit/s Supported sampling rate: eight kHz

When you have a strong reliable and good internet connection – then use OPUS.

If you use WebRTC on 3g networks – use iSAC. If you still have problems – attempt iLBC.

Enable specified audio codec

Framework changelog

  • Conference module (Enterprise-only feature):
    • Motionless issue with disappearing user in a room when the internet connection is slow.
    • Added capability to perform audio-only calls. Use fresh createSessionWithChatDialogID:conferenceType: method for this with desired conference type enum.
    • Stationary capability to subscribe to the user in session without being required to join the room (this introduces the capability to receive someone’s media without sending own).
  • Motionless potential memory leak with for movie calls when the recorder (introduced in Two.6) was not in use.

v2.6 – May 30, two thousand seventeen (DEPRECATED – use Two.6.0.1)

  • WebRTC r 18213
  • Added QBRTCRecorder class. This class represents WebRTC audio and movie calls recorder. Check out this link for more information on how to use it.
  • Added fresh delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession completes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to submerge audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method permits you to adapt frames in your capture to any possible dimension you want. Note that this method adapts existing captured framework, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID strapped to them. Property will be nil if track is local.
  • Liquidated old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature permits to participate in movie calls with up to ten people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to switch volume for a specific remote audio track, which you can get in client for a specific user in call.
  • Added fresh audioLevelControlEnabled property in QBRTCMediaStreamConfiguration class. Determines whether webrtc audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Liquidated old deprecated in Two.Trio methods from QBRTCCameraCapture class.
    • Liquidated startSession deprecated method, use startSession: instead.
    • Liquidated stopSession deprecated method, use stopSession: instead.
    • Eliminated stopSessionAndTeardownOutputs: deprecated method, use stopSession: instead.
    • Liquidated selectCameraPosition: deprecated method, use setPosition: instead.
    • Liquidated currentPosition deprecated method, use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on QBRTCCLient managing deinitialization of webrtc on itself after initial initialization by initializeRTC method. Just liquidate usage of this method.
  • Eliminated old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Eliminated old deprecated enums in QBRTCConnectionState enum.
  • Eliminated QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in QBRTCPixelFormat enum. Those formats weren’t implemented by SDK and were entirely unsupported.
  • Eliminated initWithPixelBuffer: deprecated method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Ordinary code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Instruments
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Commenced with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following directives in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open displaying an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Three: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to commence using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Haul the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Trio: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

From version Two.Four QuickbloxWebRTC is required to be added as Embedded binary as it is dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to begin using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug, that does not permitting to publish archives to the App store with dynamic frameworks that contains simulator platforms. Script will only work for archiving.

Life cycle

Call users

To call users just use this method:

After this your opponents (users with IDs= 2123, 2123, 3122) will receive one call request per five 2nd for a duration of forty five seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to this session. Each particular audio – movie call has a unique sessionID. This permits you to have more than one independent audio-video conferences.

If you want to increase the call timeout, e.g. set to sixty seconds:

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called in case when user did not react to your call within timeout .

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set reaction time interval

Called in case when connection failed with user.

States

Called when QBRTCSession state was switched. Session’s state might be fresh, pending, connecting, connected and closed.

Called when session connection state switched for a specific user. Connection state might be unknown, fresh, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no response, rejected, hangup and failed.

Manage remote media tracks

In order to showcase movie views with flows which you have received from your opponents you should create QBRTCRemoteVideoView views on storyboard and then use the following code:

You can as well get remote audio track for a specific user in call using this QBRTCClientDelegate method (use it, for example, to mute a specific user audio in call:

You can always get both remote movie and audio tracks for a specific user ID in call using these QBRTCSession methods:

Manage local movie track

In order to showcase your local movie track from camera you should create UIView on storyboard and then use the following code:

Drape up

To suspend a up call:

After this your opponent’s will receive a hangUp signal

In the next step if all opponents are inactive then QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:

Please note: due to WebRTC limitations, silence will be placed into the stream content if the audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:

Please note: due to WebRTC limitations, black frames will be placed into the stream content if the video is disabled.

Switch camera

You can switch the video capture position during a call (default: front camera):

‘videoCapture’ below is the QBRTCCameraCapture described in CallController above

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated as of version 2.3. From now on you should use the QBRTCAudioSession class instead. The Audio Session methods look almost the same as the Sound Router ones, but are more customizable and conform to more requirements.

QBRTCAudioSession also has a delegate protocol with helpful methods:

QBRTCAudioSession also introduces some new properties that might be helpful:
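A sketch of toggling between the front and back cameras (hasCameraForPosition: is assumed from the QBRTCCameraCapture header):

```objectivec
AVCaptureDevicePosition newPosition =
    (self.videoCapture.position == AVCaptureDevicePositionFront)
        ? AVCaptureDevicePositionBack
        : AVCaptureDevicePositionFront;
if ([self.videoCapture hasCameraForPosition:newPosition]) {
    self.videoCapture.position = newPosition;
}
```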
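A sketch of typical usage (property and constant names assumed from the QBRTCAudioSession header):

```objectivec
QBRTCAudioSession *audioSession = [QBRTCAudioSession instance];
[audioSession initialize];
// Route call audio to the loudspeaker instead of the earpiece.
audioSession.currentAudioDevice = QBRTCAudioDeviceSpeaker;
```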

Background mode

Use the QuickbloxWebRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is important not to skip this step.

There is also a UI for setting app background modes in Xcode 5. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist above. For completeness, we describe both methods, but the results are identical; you only need to use one of them.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This appears as a red background of the status bar, as well as an additional bar indicating the name of the app holding the active audio session (in this case, your app).

Screen sharing

We are glad to introduce a new feature of the QuickbloxWebRTC SDK: Screen sharing.

It gives you the ability to promote your product, share a screen with formulas to students, broadcast podcasts, or share video/audio/photo moments of your life in real time all over the world.

To implement this feature in your application, we give you the ability to create a custom video capture.

Video capture is a base class you should inherit from in order to send frames to your opponents.

Custom video capture

The QBRTCVideoCapture class allows you to send frames to your opponents.

By inheriting this class you are able to provide custom logic to create frames, modify them, and then send them to your opponents.

Below you can find an example of how to implement a custom video capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.

For the full source code of the custom capture and additional methods, please refer to the sample-videochat-webrtc sample

To link this capture to your local video track, simply use:
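Assuming `screenCapture` is an instance of your QBRTCVideoCapture subclass, the linking might look like:

```objectivec
// Replace the camera feed with the custom capture for this call.
self.session.localMediaStream.videoTrack.videoCapture = self.screenCapture;
```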

Calling offline users

We made it easy to call offline users.

The Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

You can find out how to configure push notifications in your application here

Assuming you have working push notifications, it is very easy to notify users about a new call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If the application is in the background, the opponent will see a push notification.

If the application is in the foreground, nothing will happen in the UI.

WebRTC Stats reporting

From v2.1 you are able to observe the stats provided by WebRTC.

To start collecting report information, do the following:

And classes that adopt the QBRTCClientDelegate protocol will be notified with

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check whether the user is currently speaking.

You can also use an already parsed, readable string that we provide with the most significant stats for the current report; just use this method:
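A sketch of the flow; `sendPushToOpponentsAboutNewCall` here is assumed to be a helper on your call controller, as in the QuickBlox code samples:

```objectivec
QBRTCSession *session =
    [[QBRTCClient instance] createNewSessionWithOpponents:@[@2123, @2124]
                                       withConferenceType:QBRTCConferenceTypeVideo];
[session startCall:nil];
// Offline or backgrounded opponents get notified via push.
[self sendPushToOpponentsAboutNewCall];
```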
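Putting the pieces together (a sketch; the delegate selector and the statsString method are assumptions based on this description):

```objectivec
// Somewhere in setup: collect a report every second.
[QBRTCConfig setStatsReportTimeInterval:1.f];

// QBRTCClientDelegate callback with the collected report:
- (void)session:(QBRTCSession *)session
    updatedStatsReport:(QBRTCStatsReport *)report
             forUserID:(NSNumber *)userID {
    NSLog(@"%@", [report statsString]); // parsed, human-readable summary
}
```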

Recording audio and movie calls

From SDK version 2.6 there is a class called QBRTCRecorder. You cannot allocate it yourself; it is stored in each instance of QBRTCSession under the property named recorder if the requirements are met. Otherwise, the recorder property value will be nil.

Recorder requirements

  • The device must not be in the low-performance category. To check whether your device is in the low-performance category, use the UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one-to-one audio and video calls are supported for now.

Usage

Once you have created a new RTC session, you can start the recorder by accessing the recorder property of the session instance. Call the start method and pass the desired file URL:

You can configure the output file video settings and video orientation using these methods:

Once the call is finished, or at any point before that, simply call the stop method:

Note that the stop method is asynchronous and will take some time to finalize the recorded file. Once the completion block is called, the recording file should be available at the expected URL unless an error occurred. In order to handle any recorder errors, simply subscribe to the QBRTCRecorder delegate and handle this method:
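A sketch of the whole recorder flow (method names such as startRecordWithFileURL: and stopRecord: are assumptions based on the description above):

```objectivec
NSURL *recordURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"call_record.mp4"]];
[self.session.recorder startRecordWithFileURL:recordURL];

// ... when the call is over:
[self.session.recorder stopRecord:^(NSURL *fileURL) {
    // The file at fileURL is finalized once this block fires.
}];
```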

Accessing remote audio data

From SDK version 2.6, the QBRTCAudioTrack class (which represents a remote audio track for a specific user) supports an audio data sink through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using the methods:

Now handle the protocol method to access the audio data:

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription describing the audio data, the number of frames in the current packet, and the current media time that corresponds to each packet.
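A sketch of subscribing and handling the sink callback (the selector shape is an assumption based on the parameters described above):

```objectivec
// Subscribe the receiver to the remote track's audio data.
[remoteAudioTrack addSink:self];

// QBRTCAudioTrackSinkInterface callback with raw PCM data:
- (void)audioTrack:(QBRTCAudioTrack *)audioTrack
    didSinkAudioBufferList:(const AudioBufferList *)audioBufferList
    audioStreamDescription:(const AudioStreamBasicDescription)audioStreamDescription
            numberOfFrames:(size_t)numberOfFrames
                      time:(CMTime)time {
    // Process or analyze the audio samples here.
}
```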

Settings

You can change different settings for a session

Set answer time interval

If an opponent does not answer you within the dialing time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: 45 seconds

Minimum value: 10 seconds

If the user does not answer within the given interval, the following delegate method will be called

Set dialing time interval

Indicates how often to send notifications to your opponents about your call

Default value: 5 seconds

Minimum value: 3 seconds
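Both intervals are set through QBRTCConfig, e.g.:

```objectivec
[QBRTCConfig setAnswerTimeInterval:60]; // default 45 s, minimum 10 s
[QBRTCConfig setDialingTimeInterval:5]; // default 5 s, minimum 3 s
```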

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. It establishes a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS-encrypted connection.

Set custom ICE servers

You can customize the list of ICE servers.

By default, the server in North Virginia (turn.quickblox.com) is used, but you can add/set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings we use; you can customize all of them or just particular ones:
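A sketch of supplying custom TURN servers (the credentials shown are placeholders):

```objectivec
QBRTCICEServer *usServer =
    [QBRTCICEServer serverWithURL:@"turn:turn.quickblox.com"
                         username:@"user"
                         password:@"secret"];
QBRTCICEServer *sgServer =
    [QBRTCICEServer serverWithURL:@"turn:turnsingapore.quickblox.com"
                         username:@"user"
                         password:@"secret"];
[QBRTCConfig setICEServers:@[usServer, sgServer]];
```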

Video codecs: VP8 vs H264

H264 is the preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.

Video quality

1. Video quality depends on the hardware you use. An iPhone 4s will not handle FullHD rendering, but an iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use our QBRTCCameraCapture formatsWithPosition: method in order to get all supported formats for the current device:

WebRTC automatically scales video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If any opponent's device in the call does not support H264, then VP8 will be used automatically.

2. If both caller and callee have H264 support, then H264 will be used.
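For example (a sketch):

```objectivec
NSArray<QBRTCVideoFormat *> *formats =
    [QBRTCCameraCapture formatsWithPosition:AVCaptureDevicePositionFront];
// Pick a format appropriate for the call type,
// e.g. a lower resolution for group calls.
QBRTCVideoFormat *chosenFormat = formats.firstObject;
```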

Audio codecs: OPUS vs iSAC vs iLBC

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice of audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

This codec was developed specifically for VoIP applications and audio streaming.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.

This audio codec is well known; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC; it should be robust in such cases.

Supported bitrates: fixed bitrate, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong, reliable, good internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
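A sketch of forcing a codec via the media stream configuration (enum spellings assumed from the SDK headers):

```objectivec
QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.audioCodec = QBRTCAudioCodecOpus; // or QBRTCAudioCodeciSAC / QBRTCAudioCodeciLBC
[QBRTCConfig setMediaStreamConfiguration:conf];
```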

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed an issue with a disappearing user in a room when the internet connection is slow.
    • Added the ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method for this with the desired conference type enum.
    • Fixed the ability to subscribe to a user in a session without being required to join the room (this introduces the ability to receive someone’s media without sending your own).
  • Fixed a potential memory leak for video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added the QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to the QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession ends an interruption event.
  • Added the QBRTCAudioTrackSinkInterface protocol to the QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added the adaptOutputFormatToWidth:height:fps: method to the QBRTCVideoCapture class. This method allows you to adapt frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added the userIDNSNumber property to the QBRTCMediaStreamTrack class. This means that both the QBRTCAudioTrack and QBRTCVideoTrack classes now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed the old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participating in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added the volume property to the QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added the new audioLevelControlEnabled property to the QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing the audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed the methods from the QBRTCCameraCapture class that were deprecated in 2.3.
    • Removed the startSession deprecated method; use startSession: instead.
    • Removed the stopSession deprecated method; use stopSession: instead.
    • Removed the stopSessionAndTeardownOutputs: deprecated method; use stopSession: instead.
    • Removed the selectCameraPosition: deprecated method; use setPosition: instead.
    • Removed the currentPosition deprecated method; use position instead.
  • Deprecated the deinitializeRTC method in the QBRTCClient class. From now on, QBRTCClient manages the deinitialization of WebRTC itself after the initial initialization by the initializeRTC method. Just remove usage of this method.
  • Removed the old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums from the QBRTCConnectionState enum.
  • Removed the QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums from the QBRTCPixelFormat enum. Those formats were never implemented by the SDK and were completely unsupported.
  • Removed the initWithPixelBuffer: deprecated method from the QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.


Ordinary code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Contraptions
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Embarked with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following directives in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open displaying an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Trio: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to commence using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step 2: Add the framework to your Xcode Project

Drag the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step 3: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step 4: Embedded binary for Dynamic framework

From version 2.4, QuickbloxWebRTC must be added as an embedded binary, as it is a dynamic framework.

Step 5: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in the build phases of your project. Paste the following snippet in the script:

This fixes a known Apple bug that prevents publishing archives to the App Store when dynamic frameworks contain simulator platforms. The script runs only when archiving.

Life cycle

Call users

To call users just use this method:
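As a sketch (method names such as createNewSessionWithOpponents:withConferenceType: and startCall: follow the QuickBlox Objective-C API; verify the exact signatures against your SDK version):

```objectivec
// Sketch: start a video call to two opponents.
QBRTCSession *session =
    [[QBRTCClient instance] createNewSessionWithOpponents:@[@2123, @3122]
                                       withConferenceType:QBRTCConferenceTypeVideo];

// Optional user info dictionary, delivered to opponents with the call request.
NSDictionary *userInfo = @{@"key" : @"value"};
[session startCall:userInfo];

self.session = session;
```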

After this your opponents (users with IDs 2123, 2123, 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings with QBRTCConfig):

self.session refers to the current session. Each particular audio/video call has a unique sessionID. This allows you to have more than one independent audio/video conference.

If you want to increase the call timeout, e.g. set it to 60 seconds:
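A minimal sketch using the QBRTCConfig class method named in the note further below:

```objectivec
// Raise the answer timeout from the default 45 seconds to 60 seconds.
[QBRTCConfig setAnswerTimeInterval:60];
```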

Accept a call

To accept a call request just use this method:
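A minimal sketch (acceptCall: per the QuickBlox API; verify against your SDK version):

```objectivec
// Accept the incoming call, optionally passing custom user info back to the caller.
NSDictionary *userInfo = @{}; // any custom data
[self.session acceptCall:userInfo];
```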

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:
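A minimal sketch (rejectCall: per the QuickBlox API; verify against your SDK version):

```objectivec
// Reject the incoming call; the userInfo dictionary is optional.
[self.session rejectCall:@{@"reject" : @"busy"}];
```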

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with a user:

Called when connection is closed for a user:

Called when connection is established with a user:

Called when a user is disconnected:

Called when a user did not respond to your call within the timeout.

Note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval.

Called when connection failed with a user.
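The callbacks above can be sketched as QBRTCClientDelegate methods (names follow the QuickBlox docs; treat the exact signatures as assumptions to verify):

```objectivec
// QBRTCClientDelegate sketch: connection life-cycle callbacks.
- (void)session:(QBRTCSession *)session startedConnectingToUser:(NSNumber *)userID {
    NSLog(@"Connection initiated with user %@", userID);
}

- (void)session:(QBRTCSession *)session connectedToUser:(NSNumber *)userID {
    NSLog(@"Connection established with user %@", userID);
}

- (void)session:(QBRTCSession *)session disconnectedFromUser:(NSNumber *)userID {
    NSLog(@"User %@ disconnected", userID);
}

- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID {
    NSLog(@"User %@ did not respond within the answer interval", userID);
}

- (void)session:(QBRTCSession *)session connectionClosedForUser:(NSNumber *)userID {
    NSLog(@"Connection closed for user %@", userID);
}

- (void)session:(QBRTCSession *)session connectionFailedForUser:(NSNumber *)userID {
    NSLog(@"Connection failed with user %@", userID);
}
```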

States

Called when the QBRTCSession state changes. Session state can be new, pending, connecting, connected or closed.

Called when the session connection state changes for a specific user. Connection state can be unknown, new, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no answer, rejected, hangup or failed.

Manage remote media tracks

In order to display video views with streams received from your opponents, you should create QBRTCRemoteVideoView views on the storyboard and then use the following code:
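A sketch of the relevant delegate callback (receivedRemoteVideoTrack:fromUser: per the QuickBlox docs; verify against your SDK version):

```objectivec
// Sketch: render a received remote video track into a QBRTCRemoteVideoView.
- (void)session:(QBRTCSession *)session
    receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack
                    fromUser:(NSNumber *)userID {
    // self.remoteVideoView is a QBRTCRemoteVideoView created on the storyboard.
    [self.remoteVideoView setVideoTrack:videoTrack];
}
```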

You can also get the remote audio track for a specific user in a call using this QBRTCClientDelegate method (use it, for example, to mute a specific user's audio in a call):

You can always get both remote video and audio tracks for a specific user ID in a call using these QBRTCSession methods:
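A sketch of those accessors (method names are assumptions based on the QuickBlox docs):

```objectivec
// Sketch: fetch remote tracks for a given user ID during a call.
QBRTCVideoTrack *videoTrack = [self.session remoteVideoTrackWithUserID:@2123];
QBRTCAudioTrack *audioTrack = [self.session remoteAudioTrackWithUserID:@2123];
```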

Manage local video track

In order to display your local video track from the camera, you should create a UIView on the storyboard and then use the following code:
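A sketch under the assumption that QBRTCCameraCapture exposes a previewLayer and is attached through the session's local media stream (verify property names against your SDK version):

```objectivec
// Sketch: capture local camera video and attach it to the session's local stream.
QBRTCVideoFormat *format = [QBRTCVideoFormat defaultFormat];
self.videoCapture =
    [[QBRTCCameraCapture alloc] initWithVideoFormat:format
                                           position:AVCaptureDevicePositionFront];

// Show the camera preview in a plain UIView created on the storyboard.
self.videoCapture.previewLayer.frame = self.localVideoView.bounds;
[self.localVideoView.layer insertSublayer:self.videoCapture.previewLayer atIndex:0];

// Route captured frames into the call.
self.session.localMediaStream.videoTrack.videoCapture = self.videoCapture;
```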

Hang up

To hang up a call:
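A minimal sketch (hangUp: per the QuickBlox API; verify against your SDK version):

```objectivec
// Hang up the call, optionally passing user info to the opponents.
[self.session hangUp:@{@"hangup" : @"bye"}];
```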

After this your opponents will receive a hangUp signal.

Then, if all opponents are inactive, QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:
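For example, using the enabled flag on the local media stream's audio track:

```objectivec
// Mute the local audio stream mid-call; set back to YES to unmute.
self.session.localMediaStream.audioTrack.enabled = NO;
```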

Please note: due to WebRTC limitations, silence will be placed into the stream content if audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:
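For example, using the enabled flag on the local media stream's video track:

```objectivec
// Stop sending local video mid-call; set back to YES to resume.
self.session.localMediaStream.videoTrack.enabled = NO;
```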

Please note: due to WebRTC limitations, black frames will be placed into the stream content if video is disabled.

Switch camera

You can switch the video capture position during a call (default: front camera):
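A sketch using the position property and setPosition: method of QBRTCCameraCapture (both referenced elsewhere in this document):

```objectivec
// Toggle between the front and back cameras.
AVCaptureDevicePosition newPosition =
    (self.videoCapture.position == AVCaptureDevicePositionBack)
        ? AVCaptureDevicePositionFront
        : AVCaptureDevicePositionBack;
[self.videoCapture setPosition:newPosition];
```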

‘videoCapture’ below is the QBRTCCameraCapture described in CallController above.

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated as of version 2.3. From now on you should use the QBRTCAudioSession class instead. The Audio Session methods look almost the same as the Sound Router ones, but are more customizable and conform to more requirements.

QBRTCAudioSession also has a delegate protocol with helpful methods:

QBRTCAudioSession also introduces some new properties that might be helpful:
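As a sketch (the property and enum names here are assumptions drawn from QuickBlox samples; verify against your SDK version):

```objectivec
// Sketch: route call audio to the loudspeaker via QBRTCAudioSession.
QBRTCAudioSession *audioSession = [QBRTCAudioSession instance];
audioSession.currentAudioDevice = QBRTCAudioDeviceSpeaker;
```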

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is important not to skip this step.

There is also a UI for setting app background modes in Xcode 5. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and check the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist above. For completeness, we describe both methods, but the results are identical; you only need to use one of them.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a red background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session; in this case, your app.

Screen sharing

We are glad to introduce a new feature of the QuickbloxWebRTC SDK: screen sharing.

It gives you the ability to promote your product, share a screen with formulas to students, broadcast podcasts, and share video/audio/photo moments of your life in real time all over the world.

To implement this feature in your application, we give you the ability to create a custom video capture.

Video capture is a base class you should inherit from in order to send frames to your opponents.

Custom video capture

The QBRTCVideoCapture class allows you to send frames to your opponents.

By inheriting this class you are able to provide custom logic to create frames, modify them, and then send them to your opponents.

Below you can find an example of how to implement a custom video capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.

For the full source code of the custom capture and additional methods, please refer to the sample-videochat-webrtc sample.

To link this capture to your local video track, simply use:
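For example (self.screenCapture here is a hypothetical QBRTCVideoCapture subclass instance from the screen-sharing example above):

```objectivec
// Attach a custom capture (e.g. a screen capture subclass) to the local video track.
self.session.localMediaStream.videoTrack.videoCapture = self.screenCapture;
```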

Calling offline users

We made it easy to call offline users.

The Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

You can find out how to configure push notifications in your application here.

Assuming you have working push notifications, it is very easy to notify users about a new call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If the application is in the background, the opponent will see a push notification.

If the application is in the foreground, nothing will happen in the UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To start collecting report information, do the following:
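A minimal sketch using QBRTCConfig:

```objectivec
// Deliver a stats report roughly once per second (0 disables reporting).
[QBRTCConfig setStatsReportTimeInterval:1.0];
```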

Classes that adopt the QBRTCClientDelegate protocol will then be notified with:

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check whether the user is currently speaking.

You can also use the already parsed, readable string we provide with the most important stats for the current report; just use this method:

Recording audio and video calls

From SDK version 2.6 there is a class called QBRTCRecorder. You cannot allocate it yourself; it is stored in each instance of QBRTCSession under the property named recorder if the requirements are met. Otherwise, the recorder property will be nil.

Recorder requirements

  • The device must not be in the low-performance category. To check whether your device is in the low-performance category, use the UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one-to-one audio and video calls are supported for now.

Usage

Once you have created a new RTC session, you can start the recorder by accessing the recorder property of the session instance. Call the start method and pass the desired file URL:
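A sketch (the method name startRecordWithFileURL: is an assumption; verify against your SDK version):

```objectivec
// Sketch: start recording the call to a file in the temporary directory.
NSURL *url = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"call.mp4"]];
[self.session.recorder startRecordWithFileURL:url];
```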

You can configure the output file video settings and video orientation using these methods:

Once the call is finished, or at any point before that, simply call the stop method:

Note that the stop method is asynchronous and takes some time to finalize the recorded file. Once the completion block is called, the recording file will be ready at the expected URL unless an error occurred. In order to handle any recorder errors, simply subscribe to the delegate of QBRTCRecorder and handle this method:
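A sketch of both steps (stopRecord: and the delegate method name are assumptions to verify against your SDK version):

```objectivec
// Sketch: stop recording and wait for the file to be finalized (asynchronous).
- (void)finishCall {
    [self.session.recorder stopRecord:^(NSURL *fileURL) {
        NSLog(@"Recording ready at %@", fileURL);
    }];
}

// QBRTCRecorderDelegate: handle recorder errors.
- (void)recorder:(QBRTCRecorder *)recorder didFailWithError:(NSError *)error {
    NSLog(@"Recorder error: %@", error);
}
```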

Accessing remote audio data

From SDK version 2.6, the QBRTCAudioTrack class (which represents a remote audio track for a specific user) supports an audio data sink through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using these methods:

Now handle the protocol method to access the audio data:

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription describing the audio data, the number of frames in the current packet, and the current media time corresponding to each packet.
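The two steps above can be sketched as follows (addSink:/removeSink: and the sink callback signature follow the QuickBlox docs; treat them as assumptions to verify):

```objectivec
// Sketch: subscribe to a remote audio track's sink to receive raw audio data.
[audioTrack addSink:self];   // later: [audioTrack removeSink:self]

// QBRTCAudioTrackSinkInterface
- (void)audioTrack:(QBRTCAudioTrack *)audioTrack
    didSinkAudioBufferList:(const AudioBufferList *)audioBufferList
    audioStreamDescription:(const AudioStreamBasicDescription)audioStreamDescription
            numberOfFrames:(size_t)numberOfFrames
                      time:(CMTime)time {
    // Process raw audio here (e.g. level metering); called off the main thread.
}
```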

Settings

You can change various settings for a session.

Set answer time interval

If an opponent does not answer you within the dialing time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called.

Default value: 45 seconds

Minimum value: 10 seconds

If the user does not answer within the given interval, the following delegate method will be called:

Set dialing time interval

Indicates how often we send notifications to your opponents about your call.

Default value: 5 seconds

Minimum value: 3 seconds
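For example, using QBRTCConfig:

```objectivec
// Send a call request to opponents every 3 seconds instead of the default 5.
[QBRTCConfig setDialingTimeInterval:3];
```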

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This provides a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS-encrypted connection.
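As a sketch (the setter name is an assumption; DTLS is already on by default, so this only makes the setting explicit):

```objectivec
// Keep DTLS enabled for the media connection (the default).
[QBRTCConfig setDTLSEnabled:YES];
```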

Set custom ICE servers

You can customize the list of ICE servers.

By default, the server in North Virginia (turn.quickblox.com) is used, but you can add or set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity-checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings we use; you can customize all of them or only particular ones:
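A sketch of adding a regional TURN server (serverWithURL:username:password: follows the QuickBlox docs; the credentials here are placeholders, not real values):

```objectivec
// Sketch: point the SDK at a TURN server closer to your users.
QBRTCICEServer *turn =
    [QBRTCICEServer serverWithURL:@"turn:turnsingapore.quickblox.com"
                         username:@"user"      // placeholder
                         password:@"password"]; // placeholder
[QBRTCConfig setICEServers:@[turn]];
```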

Video codecs: VP8 vs H264

H264 is the preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.

Video quality

1. Video quality depends on the hardware you use. An iPhone 4S will not handle FullHD rendering, but an iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-user calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use the QBRTCCameraCapture formatsWithPosition: method to get all supported formats for the current device:
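A sketch (formatsWithPosition: is the class method this section appears to describe; verify against your SDK version):

```objectivec
// Sketch: list supported capture formats for the front camera and pick one.
NSArray<QBRTCVideoFormat *> *formats =
    [QBRTCCameraCapture formatsWithPosition:AVCaptureDevicePositionFront];
QBRTCVideoFormat *format = formats.lastObject; // e.g. the highest available resolution
```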

WebRTC automatically scales video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If any opponent's device in the call does not support H264, then VP8 will be used automatically.

2. If both the caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

OPUS

In the latest versions of Firefox and Chrome, this codec is used by default for encoding audio streams. The codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice for audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

iSAC

This codec was developed specially for VoIP applications and audio streaming.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.

iLBC

This audio codec is well known; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should definitely try iLBC; it should be robust in such cases.

Supported bitrate: fixed, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong and reliable internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
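A sketch of selecting a codec via the media stream configuration (the enum names are assumptions drawn from the QuickBlox headers; verify against your SDK version):

```objectivec
// Sketch: prefer iSAC audio (e.g. for 3G networks).
QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.audioCodec = QBRTCAudioCodecISAC; // or QBRTCAudioCodecOpus / QBRTCAudioCodeciLBC
[QBRTCConfig setMediaStreamConfiguration:conf];
```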

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed an issue with a user disappearing from a room when the internet connection is slow.
    • Added the ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method with the desired conference type enum.
    • Fixed the ability to subscribe to a user in a session without being required to join the room (this introduces the ability to receive someone's media without sending your own).
  • Fixed a potential memory leak in video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added the QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to the QBRTCAudioSession class.
    • Added the audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added the audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added the audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added the audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession ends an interruption event.
  • Added the QBRTCAudioTrackSinkInterface protocol to the QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added the adaptOutputFormatToWidth:height:fps: method to the QBRTCVideoCapture class. This method allows you to adapt frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added the userIDNSNumber property to the QBRTCMediaStreamTrack class. This means that both the QBRTCAudioTrack and QBRTCVideoTrack classes now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed the old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participation in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added the volume property to the QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added the new audioLevelControlEnabled property to the QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled; a rough example: slightly reducing the audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed methods from the QBRTCCameraCapture class that were deprecated in 2.3.
    • Removed the deprecated startSession method; use startSession: instead.
    • Removed the deprecated stopSession method; use stopSession: instead.
    • Removed the deprecated stopSessionAndTeardownOutputs: method; use stopSession: instead.
    • Removed the deprecated selectCameraPosition: method; use setPosition: instead.
    • Removed the deprecated currentPosition method; use position instead.
  • Deprecated the deinitializeRTC method in the QBRTCClient class. From now on, QBRTCClient manages WebRTC deinitialization itself after the initial initialization via the initializeRTC method. Just remove usage of this method.
  • Removed the old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated values from the QBRTCConnectionState enum.
  • Removed the deprecated QBRTCPixelFormat420v and QBRTCPixelFormatBGRA values from the QBRTCPixelFormat enum. Those formats were never implemented by the SDK and were entirely unsupported.
  • Removed the deprecated initWithPixelBuffer: method from the QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Elementary code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Contraptions
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Commenced with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following guidelines in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open demonstrating an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Three: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to begin using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Haul the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Trio: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

From version Two.Four QuickbloxWebRTC is required to be added as Embedded binary as it is dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to commence using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug, that does not permitting to publish archives to the App store with dynamic frameworks that contains simulator platforms. Script will only work for archiving.

Life cycle

Call users

To call users just use this method:

After this your opponents (users with IDs= 2123, 2123, 3122) will receive one call request per five 2nd for a duration of forty five seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to this session. Each particular audio – movie call has a unique sessionID. This permits you to have more than one independent audio-video conferences.

If you want to increase the call timeout, e.g. set to sixty seconds:

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called in case when user did not react to your call within timeout .

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set reaction time interval

Called in case when connection failed with user.

States

Called when QBRTCSession state was switched. Session’s state might be fresh, pending, connecting, connected and closed.

Called when session connection state switched for a specific user. Connection state might be unknown, fresh, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no reaction, rejected, hangup and failed.

Manage remote media tracks

In order to display movie views with flows which you have received from your opponents you should create QBRTCRemoteVideoView views on storyboard and then use the following code:

You can as well get remote audio track for a specific user in call using this QBRTCClientDelegate method (use it, for example, to mute a specific user audio in call:

You can always get both remote movie and audio tracks for a specific user ID in call using these QBRTCSession methods:

Manage local movie track

In order to showcase your local movie track from camera you should create UIView on storyboard and then use the following code:

Drape up

To string up a up call:

After this your opponent’s will receive a hangUp signal

In the next step if all opponents are inactive then QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:

Please note: due to webrtc limitations muffle will be placed into stream content if audio is disabled.

Disable / enable movie stream

You can disable / enable the movie stream during a call:

Please note: due to webrtc limitations black frames will be placed into stream content if movie is disabled.

Switch camera

You can switch the movie capture position during a call (Default: front camera):

‘videoCapture’ below is QBRTCCameraCapture described in CallController above

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated from version Two.Trio. Instead from now on you should use QBRTCAudioSession class. Audio Session methods looks almost the same as Sound Router ones, with exception of being more customizable and conform to many requirements.

QBRTCAudioSession also does have a delegate protocol with helpful methods:

Also QBRTCAudioSession introducing some fresh properties, that might be also helpful in any case:

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is significant not to skip this step.

There is also a UI for setting app background modes in XCode Five. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a crimson background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are glad to introduce you a fresh feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you an capability to promote your product, share a screen with formulas to students, distribute podcasts, share movie/audio/photo moments of your life in real-time all over the world.

To implement this feature in your application we give you the capability to create custom-made movie capture.

Movie capture is a base class you should inherit from in order to send frames you your opponents.

Custom-made movie capture

QBRTCVideoCapture class permits to send frames to your opponents.

By inheriting this class you are able to provide custom-built logic to create frames, modify them and then send to your opponents.

Below you can find an example of how to implement a custom-built movie capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that permits your application to synchronize its drawing to the refresh rate of the display.

For utter source code of custom-made capture and extra methods please refer to sample-videochat-webrtc sample

To link this capture to your local movie track simply use:

Calling offline users

We made it easy to call offline users.

The Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

You can find out how to configure push notifications in your application here.

Assuming you have working push notifications, it is very easy to notify users about a new call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.
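In the official sample, -sendPushToOpponentsAboutNewCall is a helper implemented in the app itself that creates a push event via QBRequest. A rough sketch, with the user IDs and message payload as placeholder assumptions:

```objectivec
// Hypothetical helper mirroring the sample: push a notification to opponents.
- (void)sendPushToOpponentsAboutNewCall {
    QBMEvent *event = [QBMEvent event];
    event.notificationType = QBMNotificationTypePush;
    event.usersIDs = @"2123,2123,3122";          // comma-separated opponent IDs
    event.type = QBMEventTypeOneShot;
    event.message = @"Somebody is calling you";  // placeholder payload
    [QBRequest createEvent:event successBlock:nil errorBlock:nil];
}
```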

If the application is in the background, the opponent will see a push notification.

If the application is in the foreground, nothing will happen in the UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To start collecting report information, do the following:

Classes that adopt the QBRTCClientDelegate protocol will then be notified with:

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check whether the user is currently speaking.

You can also use an already parsed, readable string that we provide with the most significant stats for the current report; just use this method:
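A sketch of the pieces above; the QBRTCConfig setter, delegate callback, and stats-string method names are assumptions based on the SDK headers:

```objectivec
// Enable stats collection with a 1-second reporting interval.
[QBRTCConfig setStatsReportTimeInterval:1.f];

// QBRTCClientDelegate callback with the collected report.
- (void)session:(QBRTCSession *)session
    updatedStatsReport:(QBRTCStatsReport *)report
             forUserID:(NSNumber *)userID {
    // Parsed, human-readable summary of the most significant stats.
    NSLog(@"%@", [report statsString]);
}
```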

Recording audio and video calls

From SDK version 2.6 there is a class called QBRTCRecorder. You cannot allocate it yourself; it is stored in each instance of QBRTCSession in the property named recorder if the requirements are met. Otherwise, the recorder property value will be nil.

Recorder requirements

  • The device must not be in the low-performance category. To check whether your device is in the low-performance category, use the UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one-to-one audio and video calls are supported for now.

Usage

Once you have created a new RTC session, you can start the recorder by accessing the recorder property of the session instance. Call the start method and pass the desired file URL:

You can configure the output file video settings and video orientation using these methods:

Once the call is finished, or whenever you want before that, simply call the stop method:

Note that the stop method is asynchronous and will take some time to finalize the recorded file. Once the completion block is called, the recording file should be available at the expected URL unless an error happens. In order to handle any recorder errors, simply subscribe to the QBRTCRecorder delegate and handle this method:
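A sketch of the recorder flow; the start/stop method names and the error-delegate signature below are assumptions modeled on the 2.6 API description:

```objectivec
// Start recording (recorder may be nil if requirements are not met,
// e.g. on low-performance devices or group calls).
NSURL *url = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"call.mp4"]];
[self.session.recorder startRecordWithFileURL:url]; // assumed method name

// Stop asynchronously; the block fires once the file is finalized.
[self.session.recorder stopRecord:^(NSURL *file) {
    NSLog(@"Recording saved to %@", file);
}];

// QBRTCRecorder delegate error callback (assumed signature).
- (void)recorder:(QBRTCRecorder *)recorder didFailWithError:(NSError *)error {
    NSLog(@"Recorder error: %@", error);
}
```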

Accessing remote audio data

From SDK version 2.6 the QBRTCAudioTrack class (which represents a remote audio track for a specific user) supports audio data sinks through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using the methods:

Now handle the protocol method to access the audio data:
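A sketch, assuming addSink:/removeSink: subscription methods and the sink callback implied by the description below:

```objectivec
// Subscribe self (conforming to QBRTCAudioTrackSinkInterface) to the track.
[audioTrack addSink:self];
// ... and unsubscribe when done:
[audioTrack removeSink:self];

// Sink callback with raw audio data (assumed signature).
- (void)audioTrack:(QBRTCAudioTrack *)audioTrack
    didSinkAudioBufferList:(const AudioBufferList *)audioBufferList
    audioStreamDescription:(const AudioStreamBasicDescription)audioStreamDescription
            numberOfFrames:(size_t)numberOfFrames
                      time:(CMTime)time {
    // Process the PCM data here.
}
```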

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription describing the audio data, the number of frames in the current packet, and the current media time that corresponds to each packet.

Settings

You can change different settings for a session.

Set answer time interval

If an opponent does not answer you within the dialing time interval, the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called.

Default value: 45 seconds

Minimum value: 10 seconds

If the user does not answer within the given interval, the following delegate method will be called:
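For example, assuming the QBRTCConfig setter and the delegate callback named above:

```objectivec
// Wait up to 60 seconds for an answer before giving up.
[QBRTCConfig setAnswerTimeInterval:60];

// Called when the opponent did not answer within the interval.
- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID {
    NSLog(@"User %@ did not respond", userID);
}
```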

Set dialing time interval

Indicates how often we send notifications to your opponents about your call.

Default value: 5 seconds

Minimum value: 3 seconds
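Assuming the corresponding QBRTCConfig setter:

```objectivec
// Re-send the call request to opponents every 5 seconds.
[QBRTCConfig setDialingTimeInterval:5];
```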

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.

Set custom ICE servers

You can customize the list of ICE servers.

By default, the server in Northern Virginia (turn.quickblox.com) is used, but you can add or set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.
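A sketch using QBRTCICEServer and QBRTCConfig; the username and password values are placeholders for your own TURN credentials:

```objectivec
// Placeholder TURN credentials; substitute your own.
NSString *user = @"username";
NSString *password = @"password";

QBRTCICEServer *stun =
    [QBRTCICEServer serverWithURL:@"stun:turn.quickblox.com"
                         username:@"" password:@""];
QBRTCICEServer *turn =
    [QBRTCICEServer serverWithURL:@"turn:turn.quickblox.com:3478?transport=udp"
                         username:user password:password];

[QBRTCConfig setICEServers:@[stun, turn]];
```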

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings that we use; you can customize all of them or only particular ones:

Video codecs: VP8 vs H264

H264 is the preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.

Video quality

1. Video quality depends on the hardware you use. An iPhone 4S will not handle Full HD rendering, but an iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use the QBRTCCameraCapture formatsWithPosition: method in order to get all supported formats for the current device:
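For example (the class method name is assumed from the SDK headers):

```objectivec
// All supported AVCaptureDeviceFormat values for the front camera.
NSArray *formats =
    [QBRTCCameraCapture formatsWithPosition:AVCaptureDevicePositionFront];
```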

WebRTC automatically scales video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If any opponent's device in the call does not support H264, then VP8 will be used automatically.

2. If both caller and callee have H264 support, then H264 will be used.
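Assuming the QBRTCMediaStreamConfiguration API, the preferred video codec can be set like this:

```objectivec
QBRTCMediaStreamConfiguration *conf =
    [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.videoCodec = QBRTCVideoCodecH264; // hardware-accelerated on iOS
[QBRTCConfig setMediaStreamConfiguration:conf];
```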

Audio codecs: OPUS vs iSAC vs iLBC

In the latest versions of Firefox and Chrome, Opus is the default codec for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrates: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice of audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

iSAC was developed specifically for VoIP applications and audio streaming.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.

iLBC is a well-known audio codec; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should certainly try iLBC; it should be robust in such cases.

Supported bitrates: fixed, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong, reliable, good internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
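Assuming the same QBRTCMediaStreamConfiguration API as for video codecs:

```objectivec
QBRTCMediaStreamConfiguration *conf =
    [QBRTCMediaStreamConfiguration defaultConfiguration];
// Pick one of the audio codec enum values (names assumed from the headers):
// QBRTCAudioCodecOpus, QBRTCAudioCodecISAC, QBRTCAudioCodeciLBC.
conf.audioCodec = QBRTCAudioCodecOpus;
[QBRTCConfig setMediaStreamConfiguration:conf];
```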

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed an issue with a disappearing user in a room when the internet connection is slow.
    • Added the ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method for this with the desired conference type enum.
    • Added the ability to subscribe to a user in a session without being required to join the room (this introduces the ability to receive someone's media without sending your own).
  • Fixed a potential memory leak for video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to the QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession finishes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to the QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to the QBRTCVideoCapture class. This method allows you to adapt frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to the QBRTCMediaStreamTrack class. This means that both the QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed the old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participation in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to the QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added new audioLevelControlEnabled property to the QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed methods from the QBRTCCameraCapture class that were deprecated in 2.3.
    • Removed the deprecated startSession method; use startSession: instead.
    • Removed the deprecated stopSession method; use stopSession: instead.
    • Removed the deprecated stopSessionAndTeardownOutputs: method; use stopSession: instead.
    • Removed the deprecated selectCameraPosition: method; use setPosition: instead.
    • Removed the deprecated currentPosition method; use position instead.
  • Deprecated the deinitializeRTC method in the QBRTCClient class. From now on, QBRTCClient manages deinitialization of WebRTC on its own after the initial initialization by the initializeRTC method. Just remove usage of this method.
  • Removed the old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in the QBRTCConnectionState enum.
  • Removed the deprecated QBRTCPixelFormat420v and QBRTCPixelFormatBGRA enums in the QBRTCPixelFormat enum. Those formats weren't implemented by the SDK and were completely unsupported.
  • Removed the deprecated initWithPixelBuffer: method in the QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Add the Framework to your Xcode Project

Please note that the Quickblox iOS SDK is required for apps using QuickbloxWebRTC.

Step 1: Download & unzip the Framework

Step 2: Add the framework to your Xcode Project

Drag the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination's group folder" checkbox is checked.

Step 3: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step 4: Embedded binary for Dynamic framework

From version 2.4, QuickbloxWebRTC must be added as an Embedded Binary, as it is a dynamic framework.

Step 5: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in the <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" to the build phases of your project. Paste the following snippet into the script:

This fixes a known Apple bug that prevents publishing archives to the App Store with dynamic frameworks that contain simulator platforms. The script will only work for archiving.
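The original snippet is not reproduced here; a commonly used variant that strips simulator architectures from embedded frameworks during archiving (it relies on Xcode-provided build variables such as TARGET_BUILD_DIR, WRAPPER_NAME, and ARCHS) looks like this:

```shell
APP_PATH="${TARGET_BUILD_DIR}/${WRAPPER_NAME}"

# Strip simulator slices from every embedded framework so that
# App Store validation of the archive succeeds.
find "$APP_PATH" -name '*.framework' -type d | while read -r FRAMEWORK
do
    EXECUTABLE_NAME=$(defaults read "$FRAMEWORK/Info.plist" CFBundleExecutable)
    EXECUTABLE_PATH="$FRAMEWORK/$EXECUTABLE_NAME"
    EXTRACTED_ARCHS=()

    for ARCH in $ARCHS
    do
        # Keep only the architectures the app is actually built for.
        lipo -extract "$ARCH" "$EXECUTABLE_PATH" -o "$EXECUTABLE_PATH-$ARCH"
        EXTRACTED_ARCHS+=("$EXECUTABLE_PATH-$ARCH")
    done

    lipo -o "$EXECUTABLE_PATH-merged" -create "${EXTRACTED_ARCHS[@]}"
    rm "${EXTRACTED_ARCHS[@]}"
    mv "$EXECUTABLE_PATH-merged" "$EXECUTABLE_PATH"
done
```

This is a build-phase fragment that only makes sense inside Xcode's environment, so it cannot be run standalone.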

Life cycle

Call users

To call users, just use this method:
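Assuming the QBRTCClient session-creation API:

```objectivec
// Create a session with the opponents' user IDs and start the call.
QBRTCSession *newSession =
    [QBRTCClient.instance createNewSessionWithOpponents:@[@2123, @2123, @3122]
                                     withConferenceType:QBRTCConferenceTypeVideo];
// The dictionary argument is optional user info attached to the call.
[newSession startCall:nil];
```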

After this, your opponents (users with IDs 2123, 2123, 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings with QBRTCConfig):

self.session refers to the current session. Each particular audio/video call has a unique sessionID. This allows you to have more than one independent audio/video conference.

If you want to increase the call timeout, e.g. set it to 60 seconds:
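Assuming the QBRTCConfig setter:

```objectivec
// Opponents now have 60 seconds to answer before the call times out.
[QBRTCConfig setAnswerTimeInterval:60];
```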

Accept a call

To accept a call request, just use this method:
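Assuming the QBRTCSession API:

```objectivec
// The dictionary argument is optional user info returned to the caller.
[self.session acceptCall:nil];
```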

After this, your opponent will receive an accept signal:

Reject a call

To reject a call request, just use this method:
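Assuming the QBRTCSession API:

```objectivec
// The dictionary argument is optional user info returned to the caller.
[self.session rejectCall:nil];
```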

After this, your opponent will receive a reject signal:

Connection life-cycle

Called when a connection is initiated with a user:

Called when a connection is closed for a user:

Called when a connection is established with a user:

Called when a user is disconnected:

Called when a user did not respond to your call within the timeout.

Note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval.

Called when a connection failed with a user:
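The QBRTCClientDelegate callbacks corresponding to the events above could look like this (signatures assumed from the SDK headers):

```objectivec
- (void)session:(__kindof QBRTCBaseSession *)session
    startedConnectingToUser:(NSNumber *)userID { /* connection initiated */ }

- (void)session:(__kindof QBRTCBaseSession *)session
    connectionClosedForUser:(NSNumber *)userID { /* connection closed */ }

- (void)session:(__kindof QBRTCBaseSession *)session
    connectedToUser:(NSNumber *)userID { /* connection established */ }

- (void)session:(__kindof QBRTCBaseSession *)session
    disconnectedFromUser:(NSNumber *)userID { /* user disconnected */ }

- (void)session:(QBRTCSession *)session
    userDidNotRespond:(NSNumber *)userID { /* no answer within timeout */ }

- (void)session:(__kindof QBRTCBaseSession *)session
    connectionFailedForUser:(NSNumber *)userID { /* connection failed */ }
```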

States

Called when the QBRTCSession state is changed. A session's state might be new, pending, connecting, connected, or closed.

Called when the session connection state changes for a specific user. The connection state might be unknown, new, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no answer, rejected, hangup, or failed.

Manage remote media tracks

In order to show video views with streams you have received from your opponents, you should create QBRTCRemoteVideoView views on the storyboard and then use the following code:
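For example, assuming a remoteVideoView outlet and the received-track delegate callback:

```objectivec
// QBRTCClientDelegate: render the opponent's stream in a storyboard view.
- (void)session:(__kindof QBRTCBaseSession *)session
    receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack
                    fromUser:(NSNumber *)userID {
    // self.remoteVideoView is a QBRTCRemoteVideoView created on the storyboard.
    [self.remoteVideoView setVideoTrack:videoTrack];
}
```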

You can also get the remote audio track for a specific user in the call using this QBRTCClientDelegate method (use it, for example, to mute a specific user's audio in the call):

You can always get both remote video and audio tracks for a specific user ID in the call using these QBRTCSession methods:
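A sketch of both approaches (method names assumed from the SDK headers):

```objectivec
// QBRTCClientDelegate: mute a specific user's audio as soon as it arrives.
- (void)session:(__kindof QBRTCBaseSession *)session
    receivedRemoteAudioTrack:(QBRTCAudioTrack *)audioTrack
                    fromUser:(NSNumber *)userID {
    audioTrack.enabled = NO;
}

// Look the tracks up later by user ID:
QBRTCAudioTrack *audio = [self.session remoteAudioTrackWithUserID:@2123];
QBRTCVideoTrack *video = [self.session remoteVideoTrackWithUserID:@2123];
```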

Manage local video track

In order to display your local video track from the camera, you should create a UIView on the storyboard and then use the following code:
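A sketch, assuming a localVideoView outlet and the QBRTCCameraCapture API:

```objectivec
// Create a camera capture with the default format and the front camera.
QBRTCVideoFormat *format = [QBRTCVideoFormat defaultFormat];
self.videoCapture =
    [[QBRTCCameraCapture alloc] initWithVideoFormat:format
                                           position:AVCaptureDevicePositionFront];

// Show the camera preview in a plain UIView from the storyboard.
self.videoCapture.previewLayer.frame = self.localVideoView.bounds;
[self.localVideoView.layer insertSublayer:self.videoCapture.previewLayer
                                  atIndex:0];

// Attach the capture to the session's local video track.
self.session.localMediaStream.videoTrack.videoCapture = self.videoCapture;
```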

Hang up

To hang up a call:
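Assuming the QBRTCSession API:

```objectivec
// The dictionary argument is optional user info sent with the hang-up.
[self.session hangUp:nil];
```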

After this, your opponents will receive a hangUp signal.

In the next step, if all opponents are inactive, then the QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:
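Assuming the localMediaStream accessors:

```objectivec
// Mute the microphone; set back to YES to unmute.
self.session.localMediaStream.audioTrack.enabled = NO;
```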

Please note: due to WebRTC limitations, silence will be placed into the stream content if audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:
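Assuming the localMediaStream accessors:

```objectivec
// Stop sending camera frames; set back to YES to resume.
self.session.localMediaStream.videoTrack.enabled = NO;
```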

Please note: due to WebRTC limitations, black frames will be placed into the stream content if video is disabled.

Switch camera

You can switch the video capture position during a call (default: front camera):

'videoCapture' below is the QBRTCCameraCapture described in CallController above.
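A sketch, assuming the position and hasCameraForPosition: members of QBRTCCameraCapture:

```objectivec
// Toggle between front and back cameras if the device has both.
AVCaptureDevicePosition current = self.videoCapture.position;
AVCaptureDevicePosition other =
    (current == AVCaptureDevicePositionBack) ? AVCaptureDevicePositionFront
                                             : AVCaptureDevicePositionBack;
if ([self.videoCapture hasCameraForPosition:other]) {
    self.videoCapture.position = other;
}
```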

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated since version 2.3. From now on you should use the QBRTCAudioSession class instead. The Audio Session methods look almost the same as the Sound Router ones, except that they are more customizable and conform to more requirements.

QBRTCAudioSession also has a delegate protocol with helpful methods:

QBRTCAudioSession also introduces some new properties that might be helpful:

Background mode

Use the QuickbloxRTC.framework in applications running in the background state.

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is important not to skip this step.
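The resulting Info.plist entry looks like this:

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```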

There is also a UI for setting app background modes in XCode Five. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a crimson background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are blessed to introduce you a fresh feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you an capability to promote your product, share a screen with formulas to students, distribute podcasts, share movie/audio/photo moments of your life in real-time all over the world.

To implement this feature in your application we give you the capability to create custom-built movie capture.

Movie capture is a base class you should inherit from in order to send frames you your opponents.

Custom-made movie capture

QBRTCVideoCapture class permits to send frames to your opponents.

By inheriting this class you are able to provide custom-made logic to create frames, modify them and then send to your opponents.

Below you can find an example of how to implement a custom-built movie capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that permits your application to synchronize its drawing to the refresh rate of the display.

For total source code of custom-made capture and extra methods please refer to sample-videochat-webrtc sample

To link this capture to your local movie track simply use:

Calling offline users

We made it effortless to call offline users.

Quickblox iOS SDK provides methods to notify an application about fresh events even if application is closed.

How to configure Push-notifications in your application you can find here

Assuming you have working thrust notifications it is very effortless to notify users about fresh call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If application is in background, opponent will see a thrust notification.

If application is in foreground, nothing will happen in UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To commence collecting report information do the following:

And classes that adopt QBRTCClientDelegate protocol will be notified with

For example, audioSendInputLevel property indicates mic input level even while audio track disabled, so you can check if user is presently speaking/talking.

You can also use already parsed and readable string that we are providing with most significant stats for current report, just use this method:

Recording audio and movie calls

From SDK version Two.6 there is a class, called QBRTCRecorder. You cannot allocate it by yourself, but it is stored in each example of QBRTCSession by the property named recorder if the requirements conform. Otherwise, recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in low spectacle category use UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one to one audio and movie calls are supported for now.

Usage

Once you have created fresh rtc session, you can commence recorder by accessing recorder property in session example. Call commence method and input desired file url:

You can configure output file movie settings and movie orientation using these methods:

Once the call is finished or whenever you want before that you need to simply call stop method:

Note that stop method is asynchronous and will take some time to finalize record file. Once the completion block is called, recording file should be ready by expected url unless some error happens. In order to treat any recorder errors, simply subscribe to delegate of QBRTCRecorder and treat this method:

Accessing remote audio data

From SDK version Two.6 QBRTCAudioTrack class (that represents remote audio track for a specific user) supports audio data submerge through freshly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to bury interface using methods:

Now treat protocol method to access audio data:

This interface provides AudioBufferList with audio data, AudioStreamBasicDescription description of audio data, a number of frames in current packet, and current media time that conforms to each packet.

Settings

You can switch different settings for a session

Set reaction time interval

If an opponent did not response you within dialing time interval, then userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: forty five seconds

Minimum value: ten seconds

If user did not react within the given interval, then a following delegate method will be called

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: five seconds

Minimum value: three seconds

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This fosters a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS encrypted connection.

Set custom-built ICE servers

You can customize a list of ICE servers.

By default, the server in North Virginia turn.quickblox.com is used, but you can add/setup more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if numerous options are given?

During the connectivity checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting numerous TURN servers permits your application to scale-up in terms of bandwidth and number of users.

Here is a list with default settings that we use, you can customize all of them or only some particular:

Movie codecs: VP8 vs H264

H264 is the most preferred movie codec for iOS.

Chrome added support for H264 movie codec in fifty revision.

H264 is the only one movie codec for iOS that has hardware support.

Movie quality

1. Movie quality depends on hardware you use. iPhone 4s will not treat FullHD rendering. But iPhone 6+ will.

Two. Movie quality depends on network you use and how many connections you have.

For multi-calls set lower movie quality. For peer-to-peer calls you can set higher quality.

You can use our QBRTCCameraCapture formats with position method in order to get all supported formats for current device:

WebRTC has auto scaling of movie resolution and quality to keep network connection active.

To get best quality and spectacle you should use H264.

1. If some opponent user in call devices do not support H264, then automatically VP8 will be used

Two. If both caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio flows. This codec is relatively fresh (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from six kbit/s to five hundred ten kbit/s Supported sampling rates: from eight kHz to forty eight kHz

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice on audio codecs is OPUS.v

OPUS requires has the best quality, but it also requires a good internet connection.

This codec was developed special for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable. From ten kbit/s to fifty two kbit/s. Supported sampling rates: thirty two kHz

Good for voice data, but not as good as OPUS.

This audio codec is well-known, it was released in 2004, and became part of the WebRTC project in two thousand eleven when Google acquired Global IP Solutions (the company that developed iLIBC).

When you have very bad channels and low bandwidth, you undoubtedly should attempt iLBC — it should be strong on such cases.

Supported bitrates: immobile bitrate. 15.Two kbit/s or 13.33 kbit/s Supported sampling rate: eight kHz

When you have a strong reliable and good internet connection – then use OPUS.

If you use WebRTC on 3g networks – use iSAC. If you still have problems – attempt iLBC.

Enable specified audio codec

Framework changelog

  • Conference module (Enterprise-only feature):
    • Immobile issue with disappearing user in a room when the internet connection is slow.
    • Added capability to perform audio-only calls. Use fresh createSessionWithChatDialogID:conferenceType: method for this with desired conference type enum.
    • Motionless capability to subscribe to the user in session without being required to join the room (this introduces the capability to receive someone’s media without sending own).
  • Motionless potential memory leak with for movie calls when the recorder (introduced in Two.6) was not in use.

v2.6 – May 30, two thousand seventeen (DEPRECATED – use Two.6.0.1)

  • WebRTC r 18213
  • Added QBRTCRecorder class. This class represents WebRTC audio and movie calls recorder. Check out this link for more information on how to use it.
  • Added fresh delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession finishes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to submerge audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method permits you to adapt frames in your capture to any possible dimension you want. Note that this method adapts existing captured framework, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID corded to them. Property will be nil if track is local.
  • Eliminated old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature permits to participate in movie calls with up to ten people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to switch volume for a specific remote audio track, which you can get in client for a specific user in call.
  • Added fresh audioLevelControlEnabled property in QBRTCMediaStreamConfiguration class. Determines whether webrtc audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Liquidated old deprecated in Two.Trio methods from QBRTCCameraCapture class.
    • Liquidated startSession deprecated method, use startSession: instead.
    • Eliminated stopSession deprecated method, use stopSession: instead.
    • Eliminated stopSessionAndTeardownOutputs: deprecated method, use stopSession: instead.
    • Liquidated selectCameraPosition: deprecated method, use setPosition: instead.
    • Eliminated currentPosition deprecated method, use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on QBRTCCLient managing deinitialization of webrtc on itself after initial initialization by initializeRTC method. Just eliminate usage of this method.
  • Eliminated old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Eliminated old deprecated enums in QBRTCConnectionState enum.
  • Liquidated QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in QBRTCPixelFormat enum. Those formats weren’t implemented by SDK and were downright unsupported.
  • Eliminated initWithPixelBuffer: deprecated method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.


Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Drag the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Three: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

As of version 2.4, QuickbloxWebRTC must be added as an embedded binary, as it is a dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to start using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in the <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" to the build phases of your project. Paste the following snippet into the script:

This fixes a known Apple bug that prevents publishing archives to the App Store with dynamic frameworks that contain simulator platforms. The script will only run when archiving.
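The snippet itself is missing from this page. A commonly used variant of this strip-simulator-slices script (adapted from QuickBlox samples and similar Run Script phases; it relies on Xcode build environment variables and the exact shape may differ in your SDK version) looks like:

```shell
# Run only when archiving; strip simulator slices from embedded frameworks.
if [ "${ACTION}" != "install" ]; then exit 0; fi

APP_PATH="${TARGET_BUILD_DIR}/${WRAPPER_NAME}"

find "$APP_PATH" -name '*.framework' -type d | while read -r FRAMEWORK; do
  FRAMEWORK_EXECUTABLE_NAME=$(defaults read "$FRAMEWORK/Info.plist" CFBundleExecutable)
  FRAMEWORK_EXECUTABLE_PATH="$FRAMEWORK/$FRAMEWORK_EXECUTABLE_NAME"
  EXTRACTED_ARCHS=()
  # Keep only the architectures the app is actually built for.
  for ARCH in $ARCHS; do
    lipo -extract "$ARCH" "$FRAMEWORK_EXECUTABLE_PATH" -o "$FRAMEWORK_EXECUTABLE_PATH-$ARCH"
    EXTRACTED_ARCHS+=("$FRAMEWORK_EXECUTABLE_PATH-$ARCH")
  done
  lipo -o "$FRAMEWORK_EXECUTABLE_PATH-merged" -create "${EXTRACTED_ARCHS[@]}"
  rm "${EXTRACTED_ARCHS[@]}"
  mv "$FRAMEWORK_EXECUTABLE_PATH-merged" "$FRAMEWORK_EXECUTABLE_PATH"
done
```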

Life cycle

Call users

To call users just use this method:
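The method itself is not shown on this page; in the QuickBlox iOS samples the call setup looks roughly like this (the opponent IDs and the userInfo payload are illustrative):

```objectivec
// Subscribe to call events first (e.g. in -viewDidLoad).
[QBRTCClient.instance addDelegate:self];

// Create a session with your opponents' user IDs and start a video call.
NSArray *opponentIDs = @[@2123, @3122];
QBRTCSession *session =
    [QBRTCClient.instance createNewSessionWithOpponents:opponentIDs
                                     withConferenceType:QBRTCConferenceTypeVideo];
self.session = session;

// Optional user info delivered with the call request.
NSDictionary *userInfo = @{@"key" : @"value"};
[session startCall:userInfo];
```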

After this, your opponents (users with IDs 2123, 2123, 3122) will receive one call request every 5 seconds for a duration of 45 seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to the current session. Each particular audio/video call has a unique sessionID. This allows you to have more than one independent audio/video conference.

If you want to increase the call timeout, e.g. set it to 60 seconds:
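A sketch of the corresponding configuration call (the value is in seconds):

```objectivec
// Raise the answer timeout from the 45-second default to 60 seconds.
[QBRTCConfig setAnswerTimeInterval:60];
```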

Accept a call

To accept a call request just use this method:
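The accept call, as used in the QuickBlox samples, is a single method on the incoming session:

```objectivec
// Accept the incoming session, optionally passing a userInfo dictionary
// back to the caller instead of nil.
[self.session acceptCall:nil];
```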

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:
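The reject call mirrors the accept call:

```objectivec
// Reject the incoming session; pass nil or a userInfo dictionary.
[self.session rejectCall:nil];
```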

After this your opponent will receive a reject signal:

Connection life-cycle

Called when a connection is initiated with a user:

Called when a connection is closed for a user:

Called when a connection is established with a user:

Called when a user is disconnected:

Called when a user did not respond to your call within the timeout.

Note: use +[QBRTCConfig setAnswerTimeInterval:value] to set the answer time interval.

Called when a connection has failed with a user.
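The connection life-cycle callbacks above can be sketched as QBRTCClientDelegate methods (names as used in the QuickBlox samples; verify against the headers of your SDK version):

```objectivec
- (void)session:(QBRTCSession *)session startedConnectingToUser:(NSNumber *)userID {
    // connection initiated
}

- (void)session:(QBRTCSession *)session connectionClosedForUser:(NSNumber *)userID {
    // connection closed
}

- (void)session:(QBRTCSession *)session connectedToUser:(NSNumber *)userID {
    // connection established
}

- (void)session:(QBRTCSession *)session disconnectedFromUser:(NSNumber *)userID {
    // user disconnected
}

- (void)session:(QBRTCSession *)session userDidNotRespond:(NSNumber *)userID {
    // no answer within +[QBRTCConfig answerTimeInterval]
}

- (void)session:(QBRTCSession *)session connectionFailedForUser:(NSNumber *)userID {
    // connection failed
}
```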

States

Called when the QBRTCSession state changes. A session's state can be new, pending, connecting, connected or closed.

Called when the session connection state changes for a specific user. The connection state can be unknown, new, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no answer, rejected, hangup or failed.

Manage remote media tracks

In order to display video views with the streams you receive from your opponents, you should create QBRTCRemoteVideoView views on the storyboard and then use the following code:
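A minimal sketch, assuming an outlet named `opponentVideoView` (the outlet name is illustrative; the delegate method is from the QuickBlox samples):

```objectivec
// In the QBRTCClientDelegate callback, attach the received remote video
// track to a QBRTCRemoteVideoView created on the storyboard.
- (void)session:(QBRTCSession *)session
    receivedRemoteVideoTrack:(QBRTCVideoTrack *)videoTrack
                    fromUser:(NSNumber *)userID {
    [self.opponentVideoView setVideoTrack:videoTrack];
}
```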

You can also get the remote audio track for a specific user in the call using this QBRTCClientDelegate method (use it, for example, to mute a specific user's audio in a call):

You can always get both the remote video and audio tracks for a specific user ID in the call using these QBRTCSession methods:

Manage local video track

In order to display your local video track from the camera, you should create a UIView on the storyboard and then use the following code:
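A sketch of the capture setup, assuming outlets/properties named `localVideoView` and `videoCapture` (the names are illustrative; the classes and methods are from the QuickbloxWebRTC SDK):

```objectivec
// Create a camera capture with the default format and the front camera.
QBRTCVideoFormat *format = [QBRTCVideoFormat defaultFormat];
self.videoCapture =
    [[QBRTCCameraCapture alloc] initWithVideoFormat:format
                                           position:AVCaptureDevicePositionFront];
[self.videoCapture startSession:nil];

// Render the camera preview into a plain UIView.
self.videoCapture.previewLayer.frame = self.localVideoView.bounds;
[self.localVideoView.layer insertSublayer:self.videoCapture.previewLayer atIndex:0];

// Attach the capture to the session's local video track.
self.session.localMediaStream.videoTrack.videoCapture = self.videoCapture;
```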

Hang up

To hang up a call:
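As in the QuickBlox samples, hanging up is a single call on the session:

```objectivec
// End the call; all opponents receive a hangUp signal.
// Pass nil or a userInfo dictionary.
[self.session hangUp:nil];
```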

After this your opponents will receive a hangUp signal.

In the next step, if all opponents are inactive, QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:
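In the QuickBlox samples this is done through the local media stream's audio track:

```objectivec
// Mute or unmute your outgoing audio.
self.session.localMediaStream.audioTrack.enabled = NO;  // mute
self.session.localMediaStream.audioTrack.enabled = YES; // unmute
```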

Please note: due to WebRTC limitations, silence will be placed into the stream content if audio is disabled.

Disable / enable video stream

You can disable / enable the video stream during a call:
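This mirrors the audio case, via the local video track:

```objectivec
// Stop or resume sending your video.
self.session.localMediaStream.videoTrack.enabled = NO;  // disable
self.session.localMediaStream.videoTrack.enabled = YES; // enable
```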

Please note: due to WebRTC limitations, black frames will be placed into the stream content if video is disabled.

Switch camera

You can switch the video capture position during a call (default: front camera):

‘videoCapture’ below is the QBRTCCameraCapture described in CallController above.
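A sketch of a front/back toggle, assuming a `videoCapture` property as above (`hasCameraForPosition:` appears in the QuickBlox samples; verify it against your SDK headers):

```objectivec
// Toggle between the front and back cameras.
AVCaptureDevicePosition position = self.videoCapture.position;
AVCaptureDevicePosition newPosition =
    (position == AVCaptureDevicePositionBack) ? AVCaptureDevicePositionFront
                                              : AVCaptureDevicePositionBack;
if ([self.videoCapture hasCameraForPosition:newPosition]) {
    self.videoCapture.position = newPosition;
}
```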

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated as of version 2.3. From now on you should use the QBRTCAudioSession class instead. The Audio Session methods look almost the same as the Sound Router ones, except that they are more customizable and conform to more requirements.

QBRTCAudioSession also has a delegate protocol with helpful methods:

QBRTCAudioSession also introduces some new properties that might be helpful:

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is important not to skip this step.

There is also a UI for setting app background modes in Xcode 5. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist above. For completeness we describe both methods, but the results are identical; you only need to use one of them.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a red background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session; in this case, your app.

Screen sharing

We are happy to introduce a new feature of the QuickbloxWebRTC SDK: screen sharing.

It gives you the ability to promote your product, share a screen with formulas to students, broadcast podcasts, and share video/audio/photo moments of your life in real time all over the world.

To implement this feature in your application, we give you the ability to create a custom video capture.

Video capture is a base class you should inherit from in order to send frames to your opponents.

Custom video capture

The QBRTCVideoCapture class allows you to send frames to your opponents.

By inheriting this class you are able to provide custom logic to create frames, modify them, and then send them to your opponents.

Below you can find an example of how to implement a custom video capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that allows your application to synchronize its drawing to the refresh rate of the display.

For the full source code of the custom capture and additional methods, please refer to the sample-videochat-webrtc sample.

To link this capture to your local video track, simply use:
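A one-line sketch, assuming your custom capture instance is stored in a property named `screenCapture` (the property name is illustrative):

```objectivec
// Replace the camera capture with your custom capture instance.
self.session.localMediaStream.videoTrack.videoCapture = self.screenCapture;
```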

Calling offline users

We made it easy to call offline users.

The Quickblox iOS SDK provides methods to notify an application about new events even if the application is closed.

How to configure push notifications in your application is described here.

Assuming you have working push notifications, it is very easy to notify users about a new call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If the application is in the background, the opponent will see a push notification.

If the application is in the foreground, nothing will happen in the UI.

WebRTC Stats reporting

From v2.1 you are able to observe the stats provided by WebRTC.

To start collecting report information, do the following:
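A sketch of the configuration call (the interval is in seconds):

```objectivec
// Report stats every second; set 0 to disable collection.
[QBRTCConfig setStatsReportTimeInterval:1.0];
```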

Classes that adopt the QBRTCClientDelegate protocol will then be notified with:

For example, the audioSendInputLevel property indicates the mic input level even while the audio track is disabled, so you can check whether the user is currently speaking.

You can also use the already parsed, human-readable string that we provide with the most significant stats for the current report; just use this method:

Recording audio and video calls

From SDK version 2.6 there is a class called QBRTCRecorder. You cannot allocate it yourself; it is stored in each instance of QBRTCSession in the property named recorder if the requirements are met. Otherwise, the recorder property value will be nil.

Recorder requirements

  • The device must not be in a low-performance category. To check whether your device is in the low-performance category, use the UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one-to-one audio and video calls are supported for now.

Usage

Once you have created a new RTC session, you can start the recorder by accessing the recorder property on the session instance. Call the start method and pass the desired file URL:
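A minimal sketch (the output path is illustrative; the method name follows the QuickBlox 2.6 recorder API, so verify it against your SDK headers):

```objectivec
QBRTCRecorder *recorder = self.session.recorder;
if (recorder != nil) {
    // Record into a temporary file; choose a persistent URL in real code.
    NSURL *url = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"call.mp4"]];
    [recorder startRecordWithFileURL:url];
}
```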

You can configure the output file video settings and video orientation using these methods:

Once the call is finished, or at any point before that, simply call the stop method:

Note that the stop method is asynchronous and will take some time to finalize the recording file. Once the completion block is called, the recording file should be available at the expected URL unless an error occurred. In order to handle any recorder errors, simply subscribe to the delegate of QBRTCRecorder and handle this method:

Accessing remote audio data

From SDK version 2.6 the QBRTCAudioTrack class (which represents a remote audio track for a specific user) supports an audio data sink through the newly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to the sink interface using the methods:

Now handle the protocol method to access the audio data:

This interface provides an AudioBufferList with the audio data, an AudioStreamBasicDescription describing the audio data, the number of frames in the current packet, and the current media time that corresponds to each packet.

Settings

You can change different settings for a session.

Set answer time interval

If an opponent does not answer you within the dialing time interval, then the userDidNotRespond: and then connectionClosedForUser: delegate methods will be called.

Default value: 45 seconds

Minimum value: 10 seconds

If the user did not answer within the given interval, the following delegate method will be called:

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: 5 seconds

Minimum value: 3 seconds
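A sketch of the configuration call (the value is in seconds):

```objectivec
// Send a call request to opponents every 5 seconds (minimum 3).
[QBRTCConfig setDialingTimeInterval:5];
```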

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. It establishes a secure signaling channel that cannot be tampered with; in other words, no eavesdropping or message forgery can occur on a DTLS-encrypted connection.

Set custom ICE servers

You can customize the list of ICE servers.

By default, the server in Northern Virginia (turn.quickblox.com) is used, but you can add or set up more: turnsingapore.quickblox.com if you are located in Asia, and turnireland.quickblox.com if you are located in Europe.
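A sketch of how such a list can be set (the URLs match the servers named above; the username and password are placeholders; use the credentials issued for your account):

```objectivec
QBRTCICEServer *turn =
    [QBRTCICEServer serverWithURL:@"turn:turn.quickblox.com:3478?transport=udp"
                         username:@"user"
                         password:@"secret"];
QBRTCICEServer *turnAsia =
    [QBRTCICEServer serverWithURL:@"turn:turnsingapore.quickblox.com:3478?transport=udp"
                         username:@"user"
                         password:@"secret"];
[QBRTCConfig setICEServers:@[ turn, turnAsia ]];
```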

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity-checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings we use; you can customize all of them or only some:

Video codecs: VP8 vs H264

H264 is the preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.

Video quality

1. Video quality depends on the hardware you use. An iPhone 4S will not handle Full HD rendering, but an iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-calls, set a lower video quality. For peer-to-peer calls you can set a higher quality.

You can use our QBRTCCameraCapture formatsWithPosition: method in order to get all supported formats for the current device:
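A sketch of the lookup (picking the last element as "the highest format" is an assumption about the ordering; inspect the returned array in your SDK version):

```objectivec
// List the capture formats the current device supports for the given camera.
NSArray<QBRTCVideoFormat *> *formats =
    [QBRTCCameraCapture formatsWithPosition:AVCaptureDevicePositionFront];
QBRTCVideoFormat *format = formats.lastObject;
```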

WebRTC automatically scales video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If the device of some opponent user in the call does not support H264, then VP8 will be used automatically.

2. If both the caller and callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

Opus

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. The codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice of audio codec is Opus.

Opus has the best quality, but it also requires a good internet connection.

iSAC

This codec was developed specifically for VoIP applications and audio streaming.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as Opus.

iLBC

This audio codec is well known; it was released in 2004 and became part of the WebRTC project in 2011 when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should certainly try iLBC; it should hold up well in such cases.

Supported bitrates: fixed bitrate, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong, reliable internet connection, use Opus. If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
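A sketch of selecting a codec through the media stream configuration (enum spellings follow the QuickbloxWebRTC headers; verify them in your SDK version):

```objectivec
// Select Opus (default), iSAC, or iLBC for the session's audio.
QBRTCMediaStreamConfiguration *conf =
    [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.audioCodec = QBRTCAudioCodecOpus; // or QBRTCAudioCodecISAC / QBRTCAudioCodeciLBC
[QBRTCConfig setMediaStreamConfiguration:conf];
```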

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed an issue with a disappearing user in a room when the internet connection is slow.
    • Added ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method for this with the desired conference type enum.
    • Fixed ability to subscribe to a user in a session without being required to join the room (this introduces the ability to receive someone’s media without sending your own).
  • Fixed a potential memory leak in video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added QBRTCRecorder class. This class represents a WebRTC audio and video call recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession completes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method allows you to adapt the frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participation in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added new audioLevelControlEnabled property to QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled; for example, slightly reducing the audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed old methods from QBRTCCameraCapture class that were deprecated in 2.3.
    • Removed startSession deprecated method, use startSession: instead.
    • Removed stopSession deprecated method, use stopSession: instead.
    • Removed stopSessionAndTeardownOutputs: deprecated method, use stopSession: instead.
    • Removed selectCameraPosition: deprecated method, use setPosition: instead.
    • Removed currentPosition deprecated method, use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on QBRTCClient manages WebRTC deinitialization itself after the initial initialization by the initializeRTC method. Just remove usage of this method.
  • Removed old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in QBRTCConnectionState enum.
  • Removed QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in QBRTCPixelFormat enum. Those formats were never implemented by the SDK and were completely unsupported.
  • Removed initWithPixelBuffer: deprecated method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.

Plain code sample for iOS WebRTC Movie Talk (movie calling) via QuickBlox SDK API

Quickblox Docs

Enterprise
Devices
  • Home
  • Documentation
  • Pricing
  • Enterprise
  • Contact

Sources

The VideoChat code sample permits you to lightly add movie calling and audio calling features into your iOS app. Enable a movie call function similar to FaceTime or Skype using this code sample as a basis.

It is built on the top of WebRTC technology.

Check out our fresh feature of QuickbloxWebRTC SDK — Screen sharing

System requirements

  • The QuickbloxWebRTC.framework supports the next:
    • Quickblox.framework v2.7 (pod QuickBlox)
    • iPhone 4S+.
    • iPad Two+.
    • iPod Touch Five+.
    • iOS 8+.
    • iOS simulator 32/64 bit (audio might not work on simulators).
    • Wi-Fi and 4G/LTE connections.

Getting Commenced with Movie Calling API

Installation with CocoaPods

CocoaPods is a dependency manager for Objective-C, which automates and simplifies the process of using 3rd-party frameworks or libraries like QuickbloxWebRTC.framework in your projects.

Step 1: Downloading CocoaPods

CocoaPods is distributed as a ruby gem, and is installed by running the following instructions in Terminal.app:

Step Two: Creating a Podfile

Project dependencies to be managed by CocoaPods are specified in the Podfile. Create this file in the same directory as your Xcode project (.xcodeproj) file:

TextEdit should open displaying an empty file. You just created the pod file and opened it! Ready to add some content to the empty pod file?

Copy and paste the following lines into the TextEdit window:

Step Trio: Installing Dependencies

Now you can install the dependencies in your project:

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

Step Four: Importing Headers

At this point, everything is in place for you to commence using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Add the Framework to your Xcode Project

Please note that Quickblox iOS SDK is required for apps using QuickbloxWebRTC

Step 1: Download & unzip the Framework

Step Two: Add the framework to your Xcode Project

Haul the QuickbloxWebRTC.framework folder you downloaded into your Xcode project. Make sure the "Copy items to destination’s group folder" checkbox is checked.

Step Trio: Link Binary With Library Frameworks

Click on Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

  • Here is the list of required Apple library frameworks:
    • libicucore.dylib
    • libc++.dylib
    • libresolv.dylib
    • libxml2.dylib
    • libz.dylib
    • CFNetwork.framework
    • GLKit.framework
    • MobileCoreServices.framework
    • SystemConfiguration.framework
    • VideoToolbox.framework
    • Accelerate.framework

Step Four: Embedded binary for Dynamic framework

From version Two.Four QuickbloxWebRTC is required to be added as Embedded binary as it is dynamic framework.

Step Five: Importing Headers

At this point, everything is in place for you to embark using the Quickblox and QuickbloxWebRTC frameworks. Just #import the headers in <YourProjectName-Prefix>.pch file:

Run Script Phase for Dynamic framework

Add a "Run Script Phase" in build phases of your project. Past the following snippet in the script:

This fixes a known Apple bug, that does not permitting to publish archives to the App store with dynamic frameworks that contains simulator platforms. Script will only work for archiving.

Life cycle

Call users

To call users just use this method:

After this your opponents (users with IDs= 2123, 2123, 3122) will receive one call request per five 2nd for a duration of forty five seconds (you can configure these settings with QBRTCConfig):

self.session – this refers to this session. Each particular audio – movie call has a unique sessionID. This permits you to have more than one independent audio-video conferences.

If you want to increase the call timeout, e.g. set to sixty seconds:

Accept a call

To accept a call request just use this method:

After this your opponent will receive an accept signal:

Reject a call

To reject a call request just use this method:

After this your opponent will receive a reject signal:

Connection life-cycle

Called when connection is initiated with user:

Called when connection is closed for user

Called in case when connection is established with user:

Called in case when user is disconnected:

Called in case when user did not react to your call within timeout .

note: use +[QBRTCConfig setAnswerTimeInterval:value] to set reaction time interval

Called in case when connection failed with user.

States

Called when QBRTCSession state was switched. Session’s state might be fresh, pending, connecting, connected and closed.

Called when session connection state switched for a specific user. Connection state might be unknown, fresh, pending, connecting, checking, connected, disconnected, closed, count, disconnect timeout, no response, rejected, hangup and failed.

Manage remote media tracks

In order to display movie views with rivulets which you have received from your opponents you should create QBRTCRemoteVideoView views on storyboard and then use the following code:

You can as well get remote audio track for a specific user in call using this QBRTCClientDelegate method (use it, for example, to mute a specific user audio in call:

You can always get both remote movie and audio tracks for a specific user ID in call using these QBRTCSession methods:

Manage local movie track

In order to demonstrate your local movie track from camera you should create UIView on storyboard and then use the following code:

String up up

To suspend a up call:

After this your opponent’s will receive a hangUp signal

In the next step if all opponents are inactive then QBRTCClient delegates will be notified about:

Disable / enable audio stream

You can disable / enable the audio stream during a call:

Please note: due to webrtc limitations muffle will be placed into stream content if audio is disabled.

Disable / enable movie stream

You can disable / enable the movie stream during a call:

Please note: due to webrtc limitations black frames will be placed into stream content if movie is disabled.

Switch camera

You can switch the movie capture position during a call (Default: front camera):

‘videoCapture’ below is QBRTCCameraCapture described in CallController above

Audio Session (Previously Sound Router)

QBRTCSoundRouter is deprecated from version Two.Trio. Instead from now on you should use QBRTCAudioSession class. Audio Session methods looks almost the same as Sound Router ones, with exception of being more customizable and conform to many requirements.

QBRTCAudioSession also does have a delegate protocol with helpful methods:

Also QBRTCAudioSession introducing some fresh properties, that might be also helpful in any case:

Background mode

Use the QuickbloxRTC.framework in applications running in the background state

Set the app permissions

In the Info.plist file for your app, set up the background mode permissions as described in the Apple documentation for creating VoIP apps. The key is UIBackgroundModes. Add the audio value to this dictionary. Do not add voip to this dictionary. We have seen applications rejected from the App Store specifically for the use of the voip flag, so it is significant not to skip this step.

There is also a UI for setting app background modes in XCode Five. Under the app build settings, open the "Capabilities" tab. In this tab, turn on "Background Modes" and set the "Audio and AirPlay" checkbox to set the audio background mode, just as in the method for editing Info.plist, above. For completeness, we describe both methods, but the results are identical — you only need to use one of the methods.

When correctly configured, iOS provides an indicator that your app is running in the background with an active audio session. This is seen as a crimson background of the status bar, as well as an extra bar indicating the name of the app holding the active audio session — in this case, your app.

Screen sharing

We are blessed to introduce you a fresh feature of QuickbloxWebRTC SDK — Screen sharing.

It gives you an capability to promote your product, share a screen with formulas to students, distribute podcasts, share movie/audio/photo moments of your life in real-time all over the world.

To implement this feature in your application we give you the capability to create custom-made movie capture.

Movie capture is a base class you should inherit from in order to send frames you your opponents.

Custom-built movie capture

QBRTCVideoCapture class permits to send frames to your opponents.

By inheriting this class you are able to provide custom-made logic to create frames, modify them and then send to your opponents.

Below you can find an example of how to implement a custom-made movie capture and send frames to your opponents.

Note: a CADisplayLink object is a timer object that permits your application to synchronize its drawing to the refresh rate of the display.

For total source code of custom-made capture and extra methods please refer to sample-videochat-webrtc sample

To link this capture to your local movie track simply use:

Calling offline users

We made it effortless to call offline users.

Quickblox iOS SDK provides methods to notify an application about fresh events even if application is closed.

How to configure Push-notifications in your application you can find here

Assuming you have working shove notifications it is very effortless to notify users about fresh call.

You can call -sendPushToOpponentsAboutNewCall whenever you make a call.

If application is in background, opponent will see a thrust notification.

If application is in foreground, nothing will happen in UI.

WebRTC Stats reporting

From v2.1 you are able to observe stats provided by WebRTC.

To commence collecting report information do the following:

And classes that adopt QBRTCClientDelegate protocol will be notified with

For example, audioSendInputLevel property indicates mic input level even while audio track disabled, so you can check if user is presently speaking/talking.

You can also use already parsed and readable string that we are providing with most significant stats for current report, just use this method:

Recording audio and movie calls

From SDK version Two.6 there is a class, called QBRTCRecorder. You cannot allocate it by yourself, but it is stored in each example of QBRTCSession by the property named recorder if the requirements conform. Otherwise, recorder property value will be nil.

Recorder requirements

  • Device must not be in a low-performance category. To check whether your device is in low spectacle category use UIDevice+QBPerformance category method qbrtc_lowPerformance.
  • Only one to one audio and movie calls are supported for now.

Usage

Once you have created fresh rtc session, you can commence recorder by accessing recorder property in session example. Call embark method and input desired file url:

You can configure output file movie settings and movie orientation using these methods:

Once the call is finished or whenever you want before that you need to simply call stop method:

Note that stop method is asynchronous and will take some time to finalize record file. Once the completion block is called, recording file should be ready by expected url unless some error happens. In order to treat any recorder errors, simply subscribe to delegate of QBRTCRecorder and treat this method:

Accessing remote audio data

From SDK version Two.6 QBRTCAudioTrack class (that represents remote audio track for a specific user) supports audio data submerge through freshly added QBRTCAudioTrackSinkInterface protocol.

In order to access audio data in real time, simply subscribe to submerge interface using methods:

Now treat protocol method to access audio data:

This interface provides AudioBufferList with audio data, AudioStreamBasicDescription description of audio data, a number of frames in current packet, and current media time that conforms to each packet.

Settings

You can switch different settings for a session

Set reaction time interval

If an opponent did not response you within dialing time interval, then userDidNotRespond: and then connectionClosedForUser: delegate methods will be called

Default value: forty five seconds

Minimum value: ten seconds

If user did not react within the given interval, then a following delegate method will be called

Set dialing time interval

Indicates how often we send notifications to your opponents about your call

Default value: 5 seconds

Minimum value: 3 seconds
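For example, assuming QBRTCConfig's setDialingTimeInterval: setter:

```objectivec
// Re-send the call notification to opponents every 5 seconds (minimum: 3).
[QBRTCConfig setDialingTimeInterval:5];
```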

Enable DTLS (Datagram Transport Layer Security)

DTLS is enabled by default.

Datagram Transport Layer Security (DTLS) is used to provide communications privacy for datagram protocols. This provides a secure signaling channel that cannot be tampered with. In other words, no eavesdropping or message forgery can occur on a DTLS-encrypted connection.
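A sketch, assuming QBRTCConfig's setDTLSEnabled: setter:

```objectivec
// DTLS is on by default; there is rarely a reason to turn it off.
[QBRTCConfig setDTLSEnabled:YES];
```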

Set custom ICE servers

You can customize a list of ICE servers.

By default, the server in North Virginia (turn.quickblox.com) is used, but you can add or set up more: turnsingapore.quickblox.com if you are located in Asia and turnireland.quickblox.com if you are located in Europe.

How does WebRTC select which TURN server to use if multiple options are given?

During the connectivity-checking phase, WebRTC will choose the TURN relay with the lowest round-trip time. Thus, setting multiple TURN servers allows your application to scale up in terms of bandwidth and number of users.

Here is a list of the default settings that we use; you can customize all of them or only particular ones:
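A sketch of configuring ICE servers, assuming QBRTCICEServer's serverWithURL:username:password: factory method; the credentials below are placeholders, not real values:

```objectivec
QBRTCICEServer *stun = [QBRTCICEServer serverWithURL:@"stun:turn.quickblox.com"
                                            username:@""
                                            password:@""];
QBRTCICEServer *turn = [QBRTCICEServer serverWithURL:@"turn:turn.quickblox.com"
                                            username:@"your_turn_username"
                                            password:@"your_turn_password"];
[QBRTCConfig setICEServers:@[stun, turn]];
```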

Video codecs: VP8 vs H264

H264 is the preferred video codec for iOS.

Chrome added support for the H264 video codec in revision 50.

H264 is the only video codec for iOS that has hardware support.
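A sketch of selecting H264, assuming QBRTCMediaStreamConfiguration's videoCodec property and the QBRTCVideoCodecH264 enum value:

```objectivec
QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.videoCodec = QBRTCVideoCodecH264; // VP8 is used automatically if a peer lacks H264
[QBRTCConfig setMediaStreamConfiguration:conf];
```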

Video quality

1. Video quality depends on the hardware you use. An iPhone 4S will not handle Full HD rendering, but an iPhone 6+ will.

2. Video quality depends on the network you use and how many connections you have.

For multi-party calls, set lower video quality. For peer-to-peer calls you can set higher quality.

You can use our QBRTCCameraCapture formatsWithPosition: method in order to get all supported formats for the current device:
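A sketch of querying supported formats and initializing a capture with one of them, assuming QBRTCCameraCapture's formatsWithPosition: and initWithVideoFormat:position: methods:

```objectivec
#import <AVFoundation/AVFoundation.h>

NSArray<QBRTCVideoFormat *> *formats =
    [QBRTCCameraCapture formatsWithPosition:AVCaptureDevicePositionBack];

// Pick one of the supported formats, e.g. the first one:
QBRTCVideoFormat *format = formats.firstObject;
QBRTCCameraCapture *capture =
    [[QBRTCCameraCapture alloc] initWithVideoFormat:format
                                           position:AVCaptureDevicePositionBack];
```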

WebRTC automatically scales video resolution and quality to keep the network connection active.

To get the best quality and performance you should use H264.

1. If any opponent device in the call does not support H264, then VP8 will be used automatically.

2. If both the caller and the callee have H264 support, then H264 will be used.

Audio codecs: OPUS vs iSAC vs iLBC

In the latest versions of Firefox and Chrome this codec is used by default for encoding audio streams. This codec is relatively new (released in 2012). It implements lossy audio compression. Opus can be used for both low and high bitrates.

Supported bitrate: constant and variable, from 6 kbit/s to 510 kbit/s. Supported sampling rates: from 8 kHz to 48 kHz.

If you develop a WebRTC application that is supposed to work with high-quality audio, the only choice of audio codec is OPUS.

OPUS has the best quality, but it also requires a good internet connection.

This codec was developed specifically for VoIP applications and streaming audio.

Supported bitrates: adaptive and variable, from 10 kbit/s to 52 kbit/s. Supported sampling rate: 32 kHz.

Good for voice data, but not as good as OPUS.

This audio codec is well-known; it was released in 2004 and became part of the WebRTC project in 2011, when Google acquired Global IP Solutions (the company that developed iLBC).

When you have very bad channels and low bandwidth, you should certainly try iLBC; it should be robust in such cases.

Supported bitrates: fixed bitrate, 15.2 kbit/s or 13.33 kbit/s. Supported sampling rate: 8 kHz.

When you have a strong, reliable internet connection, use OPUS.

If you use WebRTC on 3G networks, use iSAC. If you still have problems, try iLBC.

Enable specified audio codec
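A sketch of selecting a codec, assuming QBRTCMediaStreamConfiguration's audioCodec property and the QBRTCAudioCodec enum values:

```objectivec
QBRTCMediaStreamConfiguration *conf = [QBRTCMediaStreamConfiguration defaultConfiguration];
conf.audioCodec = QBRTCAudioCodecOpus; // or QBRTCAudioCodeciSAC / QBRTCAudioCodeciLBC
[QBRTCConfig setMediaStreamConfiguration:conf];
```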

Framework changelog

  • Conference module (Enterprise-only feature):
    • Fixed issue with disappearing user in a room when the internet connection is slow.
    • Added ability to perform audio-only calls. Use the new createSessionWithChatDialogID:conferenceType: method for this with the desired conference type enum.
    • Fixed ability to subscribe to a user in a session without being required to join the room (this introduces the ability to receive someone's media without sending your own).
  • Fixed potential memory leak in video calls when the recorder (introduced in 2.6) was not in use.

v2.6 – May 30, 2017 (DEPRECATED – use 2.6.0.1)

  • WebRTC r18213
  • Added QBRTCRecorder class. This class represents the WebRTC audio and video calls recorder. Check out this link for more information on how to use it.
  • Added new delegate methods to QBRTCAudioSession class.
    • Added audioSessionDidStartPlayOrRecord: delegate. Called when the audio device is notified to begin playback or recording.
    • Added audioSessionDidStopPlayOrRecord: delegate. Called when the audio device is notified to stop playback or recording.
    • Added audioSessionDidBeginInterruption: delegate. Called when AVAudioSession starts an interruption event.
    • Added audioSessionDidEndInterruption:shouldResumeSession: delegate. Called when AVAudioSession finishes an interruption event.
  • Added QBRTCAudioTrackSinkInterface protocol to QBRTCAudioTrack class. Use this protocol to sink audio data for a specific remote audio track in real time. Check out this link for more information on how to use it.
  • Added adaptOutputFormatToWidth:height:fps: method to QBRTCVideoCapture class. This method allows you to adapt frames in your capture to any dimension you want. Note that this method adapts the existing captured frame, not the camera format.
  • Added userIDNSNumber property to QBRTCMediaStreamTrack class. This means that both QBRTCAudioTrack and QBRTCVideoTrack classes will now have a specific user ID tied to them. The property will be nil if the track is local.
  • Removed old deprecated QBRTCFrameConverter class.

Added Enterprise-only feature: WebRTC Conference calls. This feature allows participation in video calls with up to 10 people. See https://quickblox.com/plans/.

  • Added volume property to QBRTCAudioTrack class. Use it to change the volume of a specific remote audio track, which you can get in the client for a specific user in a call.
  • Added new audioLevelControlEnabled property in QBRTCMediaStreamConfiguration class. Determines whether WebRTC audio level control is enabled. Rough example: slightly reducing audio volume for all tracks while you are talking (local audio track receiving sound). Default value is NO.
  • Removed old methods from QBRTCCameraCapture class that were deprecated in 2.3.
    • Removed startSession deprecated method, use startSession: instead.
    • Removed stopSession deprecated method, use stopSession: instead.
    • Removed stopSessionAndTeardownOutputs: deprecated method, use stopSession: instead.
    • Removed selectCameraPosition: deprecated method, use setPosition: instead.
    • Removed currentPosition deprecated method, use position instead.
  • Deprecated deinitializeRTC method in QBRTCClient class. From now on, QBRTCClient manages deinitialization of WebRTC by itself after the initial initialization by the initializeRTC method. Simply remove usage of this method.
  • Removed old deprecated QBRTCSoundRouter class. Use QBRTCAudioSession instead.
  • Removed old deprecated enums in QBRTCConnectionState enum.
  • Removed QBRTCPixelFormat420v and QBRTCPixelFormatBGRA deprecated enums in QBRTCPixelFormat enum. Those formats weren't implemented by the SDK and were completely unsupported.
  • Removed initWithPixelBuffer: deprecated method in QBRTCVideoFrame class. Use initWithPixelBuffer:videoRotation: instead.
