Active Liveness iOS SDK V2 with Token Submission (Advance Implementation)

About Active Liveness iOS SDK

IdentifAI Active Liveness iOS SDK is an extension of IdentifAI Cloud Face Liveness that checks whether the person in front of the camera is ‘live’ by asking them to perform a set of expressions, after which AI predicts the liveness. This prevents artificial input (e.g. an image from the photo library or a print attack) from entering your identity verification system.
The following content assumes you are using SDK version 2.0.0 or later. If you are currently on version 1.2.6 or below, please refer to the migration guide for instructions on updating to the latest version.

Getting Started

Requirement

  • Recommended Xcode version: 13
  • Minimum iOS version: 11
  • This documentation uses Swift
  • The IDE used in this document is Xcode

Prerequisite

Please install CocoaPods version 1.0.0 or later. You can find the installation guide here.

Get Access Key

Before starting the integration, you will need an API access_key and secret_key. You can get both from your application dashboard; please visit dashboard.identifai.id/application.

Initialize Cocoapods

  1. Include the following Nodeflux pod in your Podfile: pod 'LivenessSDK', git: 'https://github.com/nodefluxio/ios-sdk-face-liveness.git', tag: '2.1.1'. Alternatively, you can download the latest iOS Liveness framework and add the .xcframework file to your project.
  2. Open a terminal at the root of your project and run pod install; the SDK and its dependencies will be fetched automatically.
  3. Open the Swift file where you want to use the Nodeflux SDK and add import LivenessSDK.
  4. Your SDK is now ready to use.

Advanced Implementation Using Submission Token

With the token-based integration you still need to call our API to get the result, but it offers additional benefits on the security layer:
  1. You can keep your access key and secret key on your backend instead of in your mobile/frontend app, which reduces the risk of leaking the keys.
  2. To get verification results, the mechanism combines token validation with backend-to-backend interactions, minimizing the risk of interception between the mobile/frontend and the backend.

API Sequence Diagram

Submission token sequence diagram
Description:
  1. The end user initiates liveness verification through the app, and the client mobile app triggers the request for a submission token to perform the liveness check.
  2. The client backend sends a request to generate a one-time submission token.
  3. The Nodeflux backend validates the HMAC authorization (check here for HMAC authorization) and returns a submission token to the client backend.
  4. The client backend returns the submission token to the client mobile app, which runs the SDK. The submission token expires in 5 minutes.
  5. After receiving the submission token, the SDK performs a request job to start the liveness activity.
  6. After the liveness check is performed, the Nodeflux backend returns a job_id.
  7. To get the job result, the client mobile app passes the job_id to the client backend to request the liveness status.
  8. The client backend sends a GET result request with the job_id to the Nodeflux backend.
  9. The Nodeflux backend returns the status and the liveness result to the client backend.
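The mobile-side part of this flow can be sketched as below. This is a minimal illustration using Swift concurrency; the LivenessBackendClient type and the liveness/submission-token and liveness/result endpoints are hypothetical placeholders for your own client backend, not part of the Nodeflux API.
import Foundation

// Hypothetical client for your own backend (type name and endpoints are placeholders).
struct LivenessBackendClient {
    let baseURL: URL  // your client backend base URL

    // Steps 1-4: ask your backend for a one-time submission token.
    func fetchSubmissionToken() async throws -> String {
        let url = baseURL.appendingPathComponent("liveness/submission-token")
        let (data, _) = try await URLSession.shared.data(from: url)
        // Assumes your backend relays Nodeflux's response: {"ok": true, "submission_token": "..."}
        let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
        return json?["submission_token"] as? String ?? ""
    }

    // Step 7: forward the job_id returned by the SDK so your backend can fetch the result.
    func submitJobID(_ jobID: String) async throws {
        var request = URLRequest(url: baseURL.appendingPathComponent("liveness/result"))
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONSerialization.data(withJSONObject: ["job_id": jobID])
        _ = try await URLSession.shared.data(for: request)
    }
}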

Obtain Submission Token

Request Header:
| Parameter | Value |
| --- | --- |
| Authorization | HMAC token |
| x-nodeflux-timestamp | your x-nodeflux-timestamp |

Request Body: null
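For illustration only, a Swift sketch of constructing this request. The endpoint URL is not shown in this section, so it is taken as a parameter; the POST method with an empty body is an assumption based on "Request Body: null", and the HMAC token is assumed to have been computed beforehand following the HMAC authorization guide.
import Foundation

// Sketch only: pass the actual submission-token endpoint and a precomputed HMAC token.
func buildSubmissionTokenRequest(endpoint: URL, hmacToken: String, timestamp: String) -> URLRequest {
    var request = URLRequest(url: endpoint)
    request.httpMethod = "POST"  // assumption: token generation is a POST with an empty body
    request.setValue(hmacToken, forHTTPHeaderField: "Authorization")
    request.setValue(timestamp, forHTTPHeaderField: "x-nodeflux-timestamp")
    return request
}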

Configure Submission Token on the Application

From the token request above, you will receive a response containing the submission token:
{
  "ok": true,
  "submission_token": "9268e8f3-1c2e-4b39-8604-61fe30a29908-GXYDEMJNGA4S2MRS"
}
Example Submission Token implementation:
import UIKit
import AVFoundation
import LivenessSDK

class ViewController: UIViewController {

    @IBOutlet var ScoreLabel: UILabel!

    private var accessKey: String?
    private var secretKey: String?
    private var submissionToken: String = "Your Submission Token"

    // Create a variable for the Liveness SDK
    var SDKController: SDKViewController?

    // You can start the Nodeflux Liveness SDK via a button or any other screen lifecycle event
    @IBAction func startSDK(_ sender: UIButton) {
        SDKController = SDKViewController(accessKey: self.accessKey,
                                          secretKey: self.secretKey,
                                          submissionToken: self.submissionToken)
        SDKController?.delegate = self
        self.present(SDKController!, animated: true)
    }

    // Please declare camera usage (NSCameraUsageDescription) in the Info.plist file
    func setMediaApproval() {
        AVCaptureDevice.requestAccess(for: AVMediaType.video) { response in
            if response {
                // camera permission granted
                print("Log message: successfully granted camera permission")
            } else {
                // camera permission denied
                print("Log message: failed to get camera permission")
            }
        }
    }
}

extension ViewController: SDKViewControllerDelegate {

    func onSuccess(score: Float, isLive: Bool, images: [String]) {
        // You can ignore this delegate function (it is used by the basic implementation)
    }

    func onSuccessWithSubmissionToken(jobID: String, images: [String]) {
        // Handle the submission token success flow from this delegate function
    }

    func onError(message: String, errors: [String: Any]) {
        // Handle errors from this delegate function
    }
}
Variable description:
images: returns images in base64 format.
jobID: the job_id used to retrieve the verification result.
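A minimal sketch of handling the submission token success callback, reusing the hypothetical LivenessBackendClient shown in the sequence diagram section; the backend URL is a placeholder:
extension ViewController {
    // Call this from onSuccessWithSubmissionToken to hand the job_id over to your own backend.
    func forwardJobID(_ jobID: String) {
        let client = LivenessBackendClient(baseURL: URL(string: "https://your-backend.example.com")!)
        Task {
            do {
                try await client.submitJobID(jobID)
                print("Log message: job_id forwarded to backend: \(jobID)")
            } catch {
                print("Log message: failed to forward job_id: \(error)")
            }
        }
    }
}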
Send a GET request to the result endpoint to retrieve the liveness result.
Request Header:

| Parameter | Value |
| --- | --- |
| Authorization | Your submission token |
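For illustration, the same request expressed in Swift; the result endpoint URL (which includes the job_id) is passed in as a parameter since it is not shown here:
import Foundation

// Sketch only: resultEndpoint is the GET result URL for the job_id obtained earlier.
func fetchLivenessResult(resultEndpoint: URL, submissionToken: String) async throws -> [String: Any]? {
    var request = URLRequest(url: resultEndpoint)
    request.httpMethod = "GET"
    request.setValue(submissionToken, forHTTPHeaderField: "Authorization")
    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONSerialization.jsonObject(with: data) as? [String: Any]
}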
You will get a result response for a real (live) selfie verification:
{
  "job": {
    "id": "77cd223e419f9dabb8ac5a2cc2fd5f845412bed220ade789GIRNGAYS2MRX",
    "result": {
      "status": "success",
      "analytic_type": "FACE_LIVENESS_SDK_V2",
      "result": [
        {
          "face_liveness": {
            "doubt": true,
            "liveness": 0.7963000535964966
          },
          "image_quality": {
            "doubt": false,
            "contrast": true,
            "face_size": true,
            "sharpness": true,
            "brightness": true
          },
          "spoof_components": {
            "artifact": 0.08577598432699851,
            "spoof_edge": 0.05006326002342876,
            "deformation": 0.4036999464035034
          }
        }
      ]
    }
  },
  "message": "It is likely detect certain live selfie",
  "ok": true
}
You will get a result response for a spoof selfie verification:
{
  "job": {
    "id": "77cd223e419f9dabb8ac5a2cc2fd5f845412bed220ade789GIYDEMRYS2MRX",
    "result": {
      "status": "success",
      "analytic_type": "FACE_LIVENESS_SDK_V2",
      "result": [
        {
          "face_liveness": {
            "doubt": false,
            "liveness": 0.3963000535964966
          },
          "image_quality": {
            "doubt": false,
            "contrast": true,
            "face_size": true,
            "sharpness": true,
            "brightness": true
          },
          "spoof_components": {
            "artifact": 0.08577598432699851,
            "spoof_edge": 0.05006326002342876,
            "deformation": 0.4036999464035034
          }
        }
      ]
    }
  },
  "message": "It is likely detect a spoof object",
  "ok": true
}
You will get a result response for a doubtful selfie verification if the liveness score is between 0.5 and 0.6:
{
  "job": {
    "id": "3086aa23dedcdcb50c365e0aa6d8f4203063999692ab7b87GIYDEMRNGAYRX",
    "result": {
      "status": "success",
      "analytic_type": "FACE_LIVENESS_SDK_V2",
      "result": [
        {
          "face_liveness": {
            "doubt": false,
            "liveness": 0.8388728320598602
          },
          "image_quality": {
            "doubt": false,
            "contrast": true,
            "face_size": true,
            "sharpness": true,
            "brightness": true
          },
          "spoof_components": {
            "artifact": 0.050710720817248145,
            "spoof_edge": 0.05008247632164675,
            "deformation": 0.1611271679401398
          }
        }
      ]
    }
  },
  "message": "It is likely detect certain live selfie",
  "ok": true
}
Tips:
To get the liveness result, we suggest polling the GET result request, because the API is asynchronous and the result may not be available immediately.
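A minimal polling sketch built on the fetchLivenessResult helper above; the attempt count and delay are arbitrary example values:
// Poll until the job reports a final status ("success" or "incompleted" as shown in this document).
func pollLivenessResult(resultEndpoint: URL,
                        submissionToken: String,
                        maxAttempts: Int = 10,
                        delaySeconds: UInt64 = 2) async throws -> [String: Any]? {
    for _ in 0..<maxAttempts {
        if let response = try await fetchLivenessResult(resultEndpoint: resultEndpoint,
                                                        submissionToken: submissionToken),
           let job = response["job"] as? [String: Any],
           let result = job["result"] as? [String: Any],
           let status = result["status"] as? String,
           ["success", "incompleted"].contains(status) {
            return response
        }
        try await Task.sleep(nanoseconds: delaySeconds * 1_000_000_000)
    }
    return nil  // no final result after maxAttempts
}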

Get Error Result

Error because no face is detected

{
  "job": {
    "id": "<job_id>",
    "result": {
      "status": "incompleted",
      "analytic_type": "FACE_LIVENESS",
      "result": []
    }
  },
  "message": "No face detected",
  "ok": true
}

Error because the face is occluded

{
  "job": {
    "id": "<job_id>",
    "result": {
      "status": "incompleted",
      "analytic_type": "FACE_LIVENESS",
      "result": []
    }
  },
  "message": "the face is most likely occluded",
  "ok": true
}

Error because multiple faces are detected

{
  "job": {
    "id": "<job_id>",
    "result": {
      "status": "incompleted",
      "analytic_type": "FACE_LIVENESS",
      "result": []
    }
  },
  "message": "Multiple faces are detected",
  "ok": true
}

Error because face pose is over tilted

{
  "job": {
    "id": "30f6b786c09436d104cbc4edcdc21279579d7326ad5dda98GIYDENGAYS2MRX",
    "result": {
      "status": "incompleted",
      "analytic_type": "FACE_LIVENESS_SDK_V2",
      "result": []
    }
  },
  "message": "the face is most likely over tilted, please ensure your face is straight facing the camera",
  "ok": true
}

Configurable Parameter

You can configure image quality checking in two ways: by setting the thresholds or by changing the image quality filter behavior.
The image quality assessment checks the quality of the face area. The attributes checked by the algorithm are:
  • Sharpness: evaluates whether the face area is blurred. A value below 0.1 indicates a very blurry image, a value between 0.1 and 0.2 indicates a doubtful condition, and sharpness is best above 0.2.
  • Contrast: evaluates whether the face area has enough contrast. A value below 0.5 indicates low contrast, a value between 0.5 and 0.6 indicates a doubtful condition, and sufficient contrast is above 0.6.
  • Brightness: evaluates whether the face area has enough lighting. A value below 0.3 indicates a very dark condition, a value between 0.3 and 0.4 indicates a doubtful condition, and the best brightness condition is above 0.4.
| Face Quality Attribute | Reject | Doubt | Accept |
| --- | --- | --- | --- |
| Sharpness | s < 0.1 | 0.1 ≤ s < 0.2 | s ≥ 0.2 |
| Contrast | s < 0.5 | 0.5 ≤ s < 0.6 | s ≥ 0.6 |
| Brightness | s < 0.3 | 0.3 ≤ s < 0.4 | s ≥ 0.4 |
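As a worked example of the table above, a small helper that maps a score to its accept/doubt/reject band; the enum and function are illustrative, not part of the SDK:
// Illustrative only: classify an image quality score against the documented thresholds.
enum QualityVerdict { case reject, doubt, accept }

func classify(score: Double, rejectBelow: Double, acceptFrom: Double) -> QualityVerdict {
    if score < rejectBelow { return .reject }
    if score < acceptFrom { return .doubt }
    return .accept
}

// Thresholds taken from the table above.
let sharpness  = classify(score: 0.15, rejectBelow: 0.1, acceptFrom: 0.2)  // .doubt
let contrast   = classify(score: 0.70, rejectBelow: 0.5, acceptFrom: 0.6)  // .accept
let brightness = classify(score: 0.25, rejectBelow: 0.3, acceptFrom: 0.4)  // .reject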

Image Quality Filter

Setting imageQualityFilter on the SDKController changes the image quality filter behavior. When activated (true), the system rejects images whose quality does not comply with our standard. When deactivated (false), the filter only produces a warning.
// Create a variable for the Liveness SDK
var SDKController: SDKViewController?

// You can start the Nodeflux Liveness SDK via a button or any other screen lifecycle event
@IBAction func startSDK(_ sender: UIButton) {
    SDKController = SDKViewController(imageQualityFilter: true,
                                      accessKey: self.accessKey,
                                      secretKey: self.secretKey,
                                      submissionToken: self.submissionToken)
    SDKController?.delegate = self
    self.present(SDKController!, animated: true)
}

Image Quality Assessment Threshold

To configure the image quality assessment thresholds, you can add an extra parameter to the SDKController by passing imageQualityAssessment: self.imageQualityAssessment.
// Create a variable for the Liveness SDK
var SDKController: SDKViewController?

// You can start the Nodeflux Liveness SDK via a button or any other screen lifecycle event
@IBAction func startSDK(_ sender: UIButton) {
    SDKController = SDKViewController(imageQualityAssessment: self.imageQualityAssessment,
                                      imageQualityFilter: true,
                                      accessKey: self.accessKey,
                                      secretKey: self.secretKey,
                                      submissionToken: self.submissionToken)
    SDKController?.delegate = self
    self.present(SDKController!, animated: true)
}
The imageQualityAssessment threshold value is passed as a stringified JSON:
{\n \"brightness_threshold\": 0.3,\n \"sharpness_threshold\": 0.3,\n \"contrast_threshold\": 0.3\n}
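A minimal sketch of building that string in Swift with a multiline literal, using the same values as the example above:
// Stringified JSON expected by the imageQualityAssessment parameter.
let imageQualityAssessment = """
{
  "brightness_threshold": 0.3,
  "sharpness_threshold": 0.3,
  "contrast_threshold": 0.3
}
"""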

Configurable Parameter

| Setting | Description | Data Type | Default Value |
| --- | --- | --- | --- |
| setAccessKey | Optional (required if you are using the basic implementation) | String | nil |
| setSecretKey | Optional (required if you are using the basic implementation) | String | nil |
| setSubmissionToken | Optional (required if you are using the submission_token implementation) | String | nil |
| setThreshold | Optional (threshold for the acceptance score) | Double | 0.7 |
| setActiveLivenessFlag | Optional (boolean flag to activate or deactivate active liveness) | Boolean | true |
| setImageQualityFilter | Optional (boolean flag to activate or deactivate IQA) | Boolean | nil |
| setImageQualityAssessment | Optional (custom values for the IQA parameters) | String | nil |
| setTimeoutThreshold | Optional (active liveness duration in milliseconds) | Int | 30000 ms |
| setGestureToleranceThreshold | Optional (tolerance for each gesture in milliseconds) | Int | 10000 ms |
| setSessionID | Optional (session ID used for tracking and reconciliation) | String (max 128 characters) | nil |