Active Liveness Detection
The active liveness check provides a set of activities for verifying face liveness. The Nodeflux active liveness SDK offers several benefits for building a better onboarding flow during the verification process:
- Double spoof checking: an AI model combination of the active liveness process and passive liveness.
- A carefully designed, simple UI that guides your customers through the motion checks to prevent liveness attacks.
- Protection against spoofing by filtering out artificial inputs such as masks, print attacks, and replay attacks.
The SDK verifies face liveness twice: first by checking the motion, then by performing the passive liveness check. The motion check runs in the SDK (on the device), while the passive check is sent to the Nodeflux backend for verification.
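The two-stage flow above can be sketched as follows. This is a minimal illustration only; the names (`LivenessOutcome`, `motionCheckPassed`, `passiveCheck`, the 0.8 threshold) are assumptions for the sketch, not the actual Nodeflux SDK API.

```kotlin
// Illustrative two-stage liveness decision. All names and thresholds
// here are hypothetical, not the real Nodeflux SDK interface.

enum class LivenessOutcome { LIVE, SPOOF, RETRY }

// Stage 1: on-device motion check, simplified to counting completed gestures.
fun motionCheckPassed(gesturesCompleted: Int, required: Int = 3): Boolean =
    gesturesCompleted >= required

// Stage 2: passive liveness check. In the real flow this is a call to the
// Nodeflux backend; here it is stubbed as a score-threshold decision.
fun passiveCheck(livenessScore: Double, threshold: Double = 0.8): Boolean =
    livenessScore >= threshold

fun verify(gesturesCompleted: Int, passiveScore: Double): LivenessOutcome =
    when {
        // Motion check failed on-device: the session restarts, nothing is sent.
        !motionCheckPassed(gesturesCompleted) -> LivenessOutcome.RETRY
        // Motion passed and the passive score clears the threshold.
        passiveCheck(passiveScore) -> LivenessOutcome.LIVE
        // Motion passed but the passive model flags the input as artificial.
        else -> LivenessOutcome.SPOOF
    }
```

The key design point is that only frames that already passed the on-device motion check reach the backend passive model.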
Active Liveness Gesture Overview
During the active liveness step, users must follow on-screen instructions to be verified as a live person. Users should not wear accessories while taking the selfie, as they can cause false results. The SDK issues 3 random instructions drawn from head poses (pitch, yaw, and roll movements), eye blinking, and smiling.
1. To be verified as "Live", the user must complete the liveness instructions.
2. All instructions must be completed within 10 seconds; otherwise, the process restarts from the beginning.
3. Only a single face is allowed in the frame; otherwise, the process cannot continue and restarts from the beginning.
4. The user's face ID must stay consistent. If the face changes during the process, the SDK detects a different person and the process restarts from the beginning.
5. Make sure the face and neck are captured inside the guidance frame.
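The session rules above (the 10-second limit, single face, and consistent face ID) can be sketched as a per-frame restart check. The `Frame` type and `shouldRestart` function are hypothetical illustrations, not part of the SDK.

```kotlin
// Illustrative restart logic for an active liveness session.
// Names and structure are assumptions, not the Nodeflux SDK API.

data class Frame(val faceCount: Int, val faceId: String?, val timestampMs: Long)

// Returns true if any frame violates the session rules, forcing a restart.
fun shouldRestart(frames: List<Frame>, startMs: Long, timeoutMs: Long = 10_000): Boolean {
    // The first detected face ID becomes the reference identity.
    val firstId = frames.firstOrNull { it.faceId != null }?.faceId
    return frames.any { frame ->
        frame.timestampMs - startMs > timeoutMs ||          // rule 2: over 10 seconds
        frame.faceCount != 1 ||                             // rule 3: exactly one face
        (frame.faceId != null && frame.faceId != firstId)   // rule 4: same person throughout
    }
}
```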
The active liveness step captures the selfie movement corresponding to each instruction. If the user performs the wrong pose, the attempt is rejected and verification fails. The user must pass verification within 10 seconds; when the time is up, the process repeats from the beginning.
To check head pose movement, we use pitch for X-axis rotation, yaw for Y-axis rotation, and roll for Z-axis rotation.
Head Movement Axis
The pitch movement consists of nodding up and down, the yaw movement consists of looking right and left, and the roll movement consists of tilting the head right and left. To change the pose instructions, please check the details in our string.xml; you can modify the wording for a better experience. Here is the list of possible random head poses in Nodeflux Active Liveness.
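Mapping the three rotation axes to the six head poses can be sketched as a simple threshold classifier over Euler angles in degrees. The pose names and the 15-degree threshold are illustrative assumptions; the SDK's actual values are not documented here.

```kotlin
// Illustrative head pose classification from Euler angles (degrees).
// Pose names and the threshold are hypothetical, not the SDK's values.

enum class HeadPose { NOD_UP, NOD_DOWN, LOOK_RIGHT, LOOK_LEFT, TILT_RIGHT, TILT_LEFT, NEUTRAL }

fun classifyHeadPose(
    pitch: Double,  // X-axis rotation
    yaw: Double,    // Y-axis rotation
    roll: Double,   // Z-axis rotation
    threshold: Double = 15.0,
): HeadPose = when {
    pitch > threshold  -> HeadPose.NOD_UP
    pitch < -threshold -> HeadPose.NOD_DOWN
    yaw > threshold    -> HeadPose.LOOK_RIGHT
    yaw < -threshold   -> HeadPose.LOOK_LEFT
    roll > threshold   -> HeadPose.TILT_RIGHT
    roll < -threshold  -> HeadPose.TILT_LEFT
    else               -> HeadPose.NEUTRAL
}
```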
We use facial landmarks to validate each instruction. Object occlusion, such as face masks, sunglasses, and regular eyeglasses, can contribute false positives in pose detection.
- Face masks will most likely, though not always, affect mouth pose detection by lowering the probability of smile detection. Different masks produce different pose quality readings and can cause false positives. In our benchmark, the mask types we tested were medical masks and N95-type masks in solid colors with no printed texture.
- Sunglasses affect eye pose detection. Dark sunglasses interfere with eye detection, while transparent sunglasses can cause false detection, meaning verification may pass even though the user is wearing sunglasses.
- Eyeglasses, although transparent, can still cause occlusion through glare on the lenses. Avoid this condition to reduce the possibility of false detection.