Active Liveness iOS SDK (Advanced Implementation)

This SDK version will no longer be supported as of June 2022; please move to the latest release, version 1.3.0, and refer to the new documentation.

About Active Liveness iOS SDK

IdentifAI Active Liveness iOS SDK is an extension of IdentifAI Cloud Face Liveness that checks whether a person is ‘live’ by asking them to perform a series of expressions, from which the AI predicts liveness. This prevents artificial input (e.g. an image from the photo library, a print attack) from entering your identity verification system.

Getting Started

Requirement

  • Recommended Xcode version: 13
  • Minimum iOS version: 11
  • This documentation uses Swift
  • The IDE used in this document is Xcode

Prerequisite

Please install CocoaPods version 1.0.0. You can find the installation guide here.

Get Access Key

Before starting the integration, you will need an API access_key and secret_key. You can get them on your application dashboard; please visit dashboard.identifai.id/application.

Initialize Cocoapods

  1. Include the following Nodeflux pod in your Podfile: pod 'LivenessSDK', git: 'https://github.com/nodefluxio/ios-sdk-face-liveness.git', tag: '1.2.6'. Alternatively, you can download the latest iOS Liveness framework and add the xcframework file to your project.
  2. Open a terminal, go to the root of your project, and run "pod install"; the SDK and its dependencies should be fetched automatically.
  3. Open the Swift file where you want to use the Nodeflux SDK and add import LivenessSDK (a minimal snippet follows this list).
  4. Your SDK is now ready.
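As a quick check, a minimal view controller importing the SDK might look like the sketch below (ViewController is just an example name):

import UIKit
import LivenessSDK

class ViewController: UIViewController {
    // If this file compiles, the pod was fetched and linked correctly
}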

Advanced Implementation Using Submission Token

By using the token integration, you still need to access our API to get the result, but it adds a security layer with the following benefits:
  1. You can keep your access key and secret key on your backend instead of in the mobile/frontend app, which reduces the risk of leaking the keys.
  2. Verification results are retrieved through token validation and backend-to-backend interactions, which minimizes the risk of interception between the mobile/frontend and the backend.

API Sequence Diagram

Submission token sequence diagram
Description:
  1. The end user initiates liveness verification through the app, and the client mobile triggers a request for a submission token to perform the liveness check.
  2. The client backend invokes a request to generate a one-time submission token.
  3. The Nodeflux backend validates the HMAC authorization (check here for HMAC authorization; a sketch of computing the HMAC follows this list) and returns a submission token to the client backend.
  4. The client backend returns the submission token to the client mobile, which runs the SDK. The submission token expires in 5 minutes.
  5. With the submission token, the SDK requests a job to activate the liveness activity.
  6. After the liveness check is performed, the Nodeflux backend returns a job_id.
  7. To get the job result, the client mobile triggers the client backend to request the liveness status by job_id.
  8. The client backend invokes a GET result request by job_id to the Nodeflux backend.
  9. The Nodeflux backend returns the status and the liveness result to the client backend.
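As an illustration of step 3, the client backend could compute an HMAC-SHA256 token with Apple's CryptoKit, as sketched below. This is only a sketch: the exact string to sign and the final token format are defined in the HMAC authorization documentation referenced above, and hmacToken and stringToSign are our placeholder names.

import Foundation
import CryptoKit

// Illustrative only: the canonical string to sign is defined by the
// Nodeflux HMAC authorization docs; stringToSign is a placeholder here.
func hmacToken(secretKey: String, stringToSign: String) -> String {
    let key = SymmetricKey(data: Data(secretKey.utf8))
    let mac = HMAC<SHA256>.authenticationCode(for: Data(stringToSign.utf8), using: key)
    // Hex-encode the MAC bytes
    return mac.map { String(format: "%02x", $0) }.joined()
}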

Obtain Submission Token

Request Header:

Parameter               Value
Authorization           HMAC token
x-nodeflux-timestamp    your x-nodeflux-timestamp

Request Body: null
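As a sketch of requesting the token from the client backend, assuming a placeholder endpoint URL and HTTP method (the real values are in the API reference), with the header values from the table above:

import Foundation

// Placeholder endpoint; substitute the real submission-token endpoint
let url = URL(string: "https://api.example.com/submission-token")!
var request = URLRequest(url: url)
request.httpMethod = "POST" // assumption; confirm against the API reference
request.setValue("<HMAC token>", forHTTPHeaderField: "Authorization")
request.setValue("<your x-nodeflux-timestamp>", forHTTPHeaderField: "x-nodeflux-timestamp")
// Request body is null, so no httpBody is set

URLSession.shared.dataTask(with: request) { data, _, error in
    guard let data = data, error == nil else { return }
    print(String(data: data, encoding: .utf8) ?? "")
}.resume()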

Configure Submission Token on the Application

After requesting the submission token, you will get a response:
{
  "ok": true,
  "submission_token": "9268e8f3-1c2e-4b39-8604-61fe30a29908-GXYDEMJNGA4S2MRS"
}
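A minimal sketch of decoding this response with Codable (SubmissionTokenResponse is our own type name):

import Foundation

struct SubmissionTokenResponse: Decodable {
    let ok: Bool
    let submissionToken: String

    enum CodingKeys: String, CodingKey {
        case ok
        case submissionToken = "submission_token"
    }
}

// `data` is the raw response body from the token request:
// let token = try JSONDecoder().decode(SubmissionTokenResponse.self, from: data).submissionToken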
Example Submission Token implementation:
import UIKit
import AVFoundation
import LivenessSDK

class ViewController: UIViewController {
    @IBOutlet var scoreLabel: UILabel!

    private var accessKey: String?
    private var secretKey: String?
    private var submissionToken: String = "Your Submission Token"

    // Create a variable for the Liveness SDK
    var sdkController: SDKViewController?

    // You can start the Nodeflux Liveness SDK via a button or any other screen lifecycle event
    @IBAction func startSDK(_ sender: UIButton) {
        sdkController = SDKViewController(accessKey: accessKey, secretKey: secretKey, submissionToken: submissionToken)
        sdkController?.delegate = self
        present(sdkController!, animated: true)
    }

    // Remember to declare camera usage approval (NSCameraUsageDescription) in the Info.plist file
    func setMediaApproval() {
        AVCaptureDevice.requestAccess(for: AVMediaType.video) { response in
            if response {
                // Camera permission granted
                print("Log message: successfully granted camera permission")
            } else {
                // Camera permission denied
                print("Log message: failed to get camera permission")
            }
        }
    }
}

extension ViewController: SDKViewControllerDelegate {
    func onSuccess(score: Float, isLive: Bool, images: [String]) {
        // You can ignore this delegate function (it is used in the basic implementation)
    }

    func onSuccessWithSubmissionToken(jobID: String, images: [String]) {
        // Handle the submission token success flow from this delegate function
    }

    func onError(message: String, errors: [String: Any]) {
        // Handle errors from this delegate function
    }
}
Variable description:
  • images: the captured images, returned as base64-encoded strings.
  • jobID: the job_id to be used to get the verification result.
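For example, the delegate callback could forward the jobID to your own backend, which then performs the backend-to-backend result request. This is a sketch; the /liveness-result path below is a hypothetical endpoint on your backend:

func onSuccessWithSubmissionToken(jobID: String, images: [String]) {
    // Hypothetical client-backend endpoint that will fetch the result from Nodeflux
    let url = URL(string: "https://your-backend.example.com/liveness-result")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(withJSONObject: ["job_id": jobID])
    URLSession.shared.dataTask(with: request).resume()
}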

Get Verification Result

Send a GET request to the liveness result endpoint to get the liveness result:

Request Header:

Header Parameter        Value
Authorization           your submission token

You will get a result response:
{
  "job": {
    "id": "22cb223e9cc2fa2c6bfad0526119827e71be1b6de5123b7fGIYDEMJNGA4S2MRS",
    "result": {
      "status": "success",
      "analytic_type": "FACE_LIVENESS",
      "result": [
        {
          "face_liveness": {
            "live": true,
            "liveness": 0.7660550475120544
          }
        }
      ]
    }
  },
  "message": "Face Liveness Success",
  "ok": true
}
Tips:
To get the liveness result, we suggest polling the request, because the API runs asynchronously and the result may not be ready immediately (see the sketch below).
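A minimal polling sketch from the client backend, assuming a placeholder result endpoint and our own JobStatusResponse type that models just enough of the JSON above to read the status; the delay and retry limit are arbitrary choices:

import Foundation

// Minimal shape of the response, just enough to read the status
struct JobStatusResponse: Decodable {
    struct Job: Decodable {
        struct Result: Decodable { let status: String }
        let result: Result
    }
    let job: Job
}

// Poll until the job reaches a final status.
// The endpoint URL, delay, and retry limit are illustrative assumptions.
func pollLivenessResult(jobID: String, submissionToken: String, attempt: Int = 0) {
    guard attempt < 10 else { return } // stop after 10 tries

    let url = URL(string: "https://api.example.com/jobs/\(jobID)")! // placeholder endpoint
    var request = URLRequest(url: url)
    request.setValue(submissionToken, forHTTPHeaderField: "Authorization")

    URLSession.shared.dataTask(with: request) { data, _, error in
        guard let data = data, error == nil,
              let parsed = try? JSONDecoder().decode(JobStatusResponse.self, from: data) else {
            return
        }
        if parsed.job.result.status == "success" || parsed.job.result.status == "incompleted" {
            // Final status reached: hand the body to your result handling
            print(String(data: data, encoding: .utf8) ?? "")
        } else {
            // Still processing: retry after a short delay
            DispatchQueue.global().asyncAfter(deadline: .now() + 2) {
                pollLivenessResult(jobID: jobID, submissionToken: submissionToken, attempt: attempt + 1)
            }
        }
    }.resume()
}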

Get Error Result

Error because no face is detected

{
  "job": {
    "id": "<job_id>",
    "result": {
      "status": "incompleted",
      "analytic_type": "FACE_LIVENESS",
      "result": []
    }
  },
  "message": "No face detected",
  "ok": true
}

Error because the face is occluded

{
  "job": {
    "id": "<job_id>",
    "result": {
      "status": "incompleted",
      "analytic_type": "FACE_LIVENESS",
      "result": []
    }
  },
  "message": "the face is most likely occluded",
  "ok": true
}

Error because multiple faces are detected

{
  "job": {
    "id": "<job_id>",
    "result": {
      "status": "incompleted",
      "analytic_type": "FACE_LIVENESS",
      "result": []
    }
  },
  "message": "Multiple faces are detected",
  "ok": true
}