Box Oval Interface for Selfie Capture Screen (#250)
* improve code and file structure

* add the liveness guides

* add some config values to liveness guides view.

* change lottie animation frame

* run pod install

* setup animation progress time for the liveness guide lottie animation.

* fix HomeView ForEach warnings.

* connect face bounds detection to the indicator.

* add throttling to camera feed.

* use full screen cover to present home screen products.

* control progress arc visibility based on progress.

* add a dummy submit function.

* setup timers.

* add animation to the progress fill arcs.

* refactor the components of liveness guides view.

* setup delay timer and introduce state for the current animation that should be displayed with instructions after delay.

* update arc shape init

* add new lottie files. define a guide animation enum to hold the animation details.

* some refactoring.

* create a validator class for the face observation data.

* refactor buffer image processing and communication between face detector, face validator and selfie view model. some code formatting.

* remove some debug views. some refactoring and improvements.

* make capture instruction strings localizable

* introduce current liveness task into the face validator to set the right capture instruction.

* fix cropping for selfie quality check.

* show or hide the circular ring and the liveness guides based on whether face is within bounds.

* add processing view to view captured images.

* reset animation as user is completing liveness checks

* inject current liveness task into liveness guide to control which progress is showing. update reset delay timer on main thread. use appropriate error for face detection during cropping.

* processing screen layout. introduce a backport of StateObject.

* run pod install to import missing lottie files.

* present selfie capture flow in navigation view, programmatically navigate to processing view based on capture status. add a circular progress view with a loader image for the selfie processing view.

* refactor view appear setup and reset selfie capture state variables

* import submit method from selfie viewmodel.

* extract submit selfie functionality into a new class to manage the submission processes.

* some refactoring

* move Backport and StateObject to helper folder.

* code formatting.

* restore threshold value

* replace ObservedObject with StateObject in HomeView so that it's initialised once.

* use proxy size instead of frame for window size calculation and face layout guide positioning.

* remove StateObject backport

* use a delegate to communicate selfie submission updates to selfie view model.
add the right title to processing screen.

* update processing changes on main thread.

* inject failure reason into the api call for submitting selfie

* rename selfie submission manager

* reset the threshold for timeout for liveness check.

* remove presentation mode variable.

* introduce an environment key to manage dismissing of the selfie capture flow.

* improve error handling of selfie capture.

* localise strings.

* code formatting.

* pod install

* make loader background color themeable.

* run pod install.

* fix missing files and build errors.

* redesign selfie capture screen to use box and oval for camera area and face bounding area

* add a view to preview selfie image, also add an actions view.

* redesign the progress arcs for active liveness.

* improvements to validating face bounding box. add a frame to selfie preview image.

* remove processing view, add a view state to control visibility of different items based on capture state.

* adjust face size and position evaluation. fix layout for attribution and retry button.

* change active liveness progress colours

* code formatting.

* update faceBoundsMultiplier constant.

* refactor task timer.

* improve submission handling.
tobitech authored Nov 6, 2024
1 parent 1c642f5 commit a93950f
Showing 35 changed files with 956 additions and 330 deletions.
100 changes: 50 additions & 50 deletions Example/SmileID.xcodeproj/project.pbxproj

Large diffs are not rendered by default.

26 changes: 12 additions & 14 deletions Example/SmileID/Home/HomeView.swift
@@ -10,16 +10,16 @@ struct HomeView: View {
_viewModel = StateObject(wrappedValue: HomeViewModel(config: config))
}

let columns = [GridItem(.flexible()), GridItem(.flexible())]

var body: some View {
NavigationView {
VStack(spacing: 24) {
Text("Test Our Products")
.font(SmileID.theme.header2)
.foregroundColor(.black)

MyVerticalGrid(
maxColumns: 2,
items: [
ScrollView(showsIndicators: false) {
LazyVGrid(columns: columns) {
ProductCell(
image: "smart_selfie_enroll",
name: "SmartSelfie™ Enrollment",
@@ -38,7 +38,7 @@ struct HomeView: View {
)
)
}
),
)
ProductCell(
image: "smart_selfie_authentication",
name: "SmartSelfie™ Authentication",
@@ -51,7 +51,7 @@ struct HomeView: View {
delegate: viewModel
)
}
),
)
ProductCell(
image: "smart_selfie_enroll",
name: "SmartSelfie™ Enrollment (Strict Mode)",
@@ -71,7 +71,7 @@ struct HomeView: View {
)
)
}
),
)
ProductCell(
image: "smart_selfie_authentication",
name: "SmartSelfie™ Authentication (Strict Mode)",
@@ -85,7 +85,7 @@ struct HomeView: View {
delegate: viewModel
)
}
),
)
ProductCell(
image: "enhanced_kyc",
name: "Enhanced KYC",
@@ -101,7 +101,7 @@ struct HomeView: View {
)
)
}
),
)
ProductCell(
image: "biometric",
name: "Biometric KYC",
@@ -117,7 +117,7 @@ struct HomeView: View {
)
)
}
),
)
ProductCell(
image: "document",
name: "\nDocument Verification",
@@ -131,7 +131,7 @@ struct HomeView: View {
delegate: viewModel
)
}
),
)
ProductCell(
image: "enhanced_doc_v",
name: "Enhanced Document Verification",
@@ -146,10 +146,8 @@ struct HomeView: View {
)
}
)
].map {
AnyView($0)
}
)
}

Text("Partner \(viewModel.partnerId) - Version \(version) - Build \(build)")
.font(SmileID.theme.body)
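The net change in this file replaces the custom `MyVerticalGrid` (which took an array of `AnyView`s) with a `ScrollView` wrapping a two-column `LazyVGrid`, so the product cells keep their concrete types and are laid out lazily. A minimal, self-contained sketch of the new layout pattern — the cells here are plain placeholders rather than the Example app's `ProductCell`:

```swift
import SwiftUI

// Sketch of the layout pattern adopted in HomeView: a two-column
// LazyVGrid inside a ScrollView, replacing MyVerticalGrid + AnyView.
struct TwoColumnGridSketch: View {
    let columns = [GridItem(.flexible()), GridItem(.flexible())]
    let items = ["SmartSelfie™ Enrollment", "SmartSelfie™ Authentication",
                 "Document Verification", "Biometric KYC"]

    var body: some View {
        ScrollView(showsIndicators: false) {
            LazyVGrid(columns: columns) {
                ForEach(items, id: \.self) { name in
                    Text(name)
                        .frame(maxWidth: .infinity, minHeight: 120)
                        .background(Color.gray.opacity(0.1))
                        .cornerRadius(8)
                }
            }
            .padding()
        }
    }
}
```

At the real call sites the grid items are the `ProductCell` views shown in the hunks above.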
11 changes: 7 additions & 4 deletions Example/SmileID/Home/ProductCell.swift
@@ -1,18 +1,18 @@
import SmileID
import SwiftUI

struct ProductCell: View {
struct ProductCell<Content: View>: View {
let image: String
let name: String
let onClick: (() -> Void)?
@ViewBuilder let content: () -> any View
@ViewBuilder let content: () -> Content
@State private var isPresented: Bool = false

init(
image: String,
name: String,
onClick: (() -> Void)? = nil,
@ViewBuilder content: @escaping () -> any View
@ViewBuilder content: @escaping () -> Content
) {
self.image = image
self.name = name
@@ -44,7 +44,10 @@ struct ProductCell: View {
.fullScreenCover(
isPresented: $isPresented,
content: {
AnyView(content())
NavigationView {
content()
}
.environment(\.modalMode, $isPresented)
}
)
}
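Because `ProductCell` is now generic over `Content`, call sites pass the destination view directly and the old `AnyView(content())` type erasure disappears; the destination is wrapped in a `NavigationView` with the `modalMode` binding injected into the environment (see NavigationHelper.swift below). A hedged sketch of a call site, assuming the `ProductCell` defined above is in scope — `DemoDestination` is a placeholder, not one of the SDK's screens:

```swift
import SwiftUI

// Illustrative destination only, not a real screen from the Example app.
struct DemoDestination: View {
    var body: some View { Text("Product flow") }
}

struct ProductCellUsageSketch: View {
    var body: some View {
        ProductCell(
            image: "smart_selfie_enroll",
            name: "SmartSelfie™ Enrollment",
            content: {
                // Content is inferred as DemoDestination at the call site;
                // no AnyView wrapping is needed any more.
                DemoDestination()
            }
        )
    }
}
```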
11 changes: 8 additions & 3 deletions Sources/SmileID/Classes/FaceDetector/FaceDetectorV2.swift
@@ -67,12 +67,14 @@ class FaceDetectorV2: NSObject {
let faceObservation = faceDetections.first,
let faceQualityObservation = faceQualityObservations.first
else {
self.resultDelegate?.faceDetector(self, didFailWithError: FaceDetectorError.noFaceDetected)
self.resultDelegate?.faceDetector(
self, didFailWithError: FaceDetectorError.noFaceDetected)
return
}

let convertedBoundingBox =
self.viewDelegate?.convertFromMetadataToPreviewRect(rect: faceObservation.boundingBox) ?? .zero
self.viewDelegate?.convertFromMetadataToPreviewRect(
rect: faceObservation.boundingBox) ?? .zero

let uiImage = UIImage(pixelBuffer: imageBuffer)
let brightness = self.calculateBrightness(uiImage)
@@ -166,7 +168,10 @@ class FaceDetectorV2: NSObject {
}

let croppedImage = UIImage(cgImage: croppedCGImage)
guard let resizedImage = croppedImage.pixelBuffer(width: cropSize.width, height: cropSize.height) else {
guard
let resizedImage = croppedImage.pixelBuffer(
width: cropSize.width, height: cropSize.height)
else {
throw FaceDetectorError.unableToCropImage
}

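The second hunk reformats the selfie-quality crop: the face region is cut out of the frame as a `CGImage`, wrapped back into a `UIImage`, and resized into a `CVPixelBuffer` via the SDK's `pixelBuffer(width:height:)` helper, throwing `FaceDetectorError.unableToCropImage` when either step fails. A rough sketch of the crop step only, assuming the bounding box has already been converted to image-space pixel coordinates — the function name and error type here are illustrative, not the SDK's:

```swift
import UIKit

enum CropSketchError: Error { case unableToCropImage }

// Crops a face rectangle (already in pixel coordinates) out of a UIImage
// using CoreGraphics, mirroring the pattern used in FaceDetectorV2 before
// the result is resized into a CVPixelBuffer for the quality model.
func cropFaceRegion(from image: UIImage, pixelRect: CGRect) throws -> UIImage {
    guard
        let cgImage = image.cgImage,
        let croppedCGImage = cgImage.cropping(to: pixelRect)
    else {
        throw CropSketchError.unableToCropImage
    }
    return UIImage(cgImage: croppedCGImage)
}
```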
37 changes: 21 additions & 16 deletions Sources/SmileID/Classes/FaceDetector/FaceValidator.swift
@@ -17,7 +17,7 @@ final class FaceValidator {
// MARK: Constants
private let selfieQualityThreshold: Float = 0.5
private let luminanceThreshold: ClosedRange<Int> = 80...200
private let faceBoundsMultiplier: CGFloat = 1.2
private let faceBoundsMultiplier: CGFloat = 1.5
private let faceBoundsThreshold: CGFloat = 50

init() {}
@@ -33,7 +33,10 @@ final class FaceValidator {
currentLivenessTask: LivenessTask?
) {
// check face bounds
let faceBoundsState = checkAcceptableBounds(using: faceGeometry.boundingBox)
let faceBoundsState = checkFaceSizeAndPosition(
using: faceGeometry.boundingBox,
shouldCheckCentering: currentLivenessTask == nil
)
let isAcceptableBounds = faceBoundsState == .detectedFaceAppropriateSizeAndPosition

// check brightness
@@ -98,22 +101,26 @@ final class FaceValidator {
}

// MARK: Validation Checks
private func checkAcceptableBounds(using boundingBox: CGRect) -> FaceBoundsState {
if boundingBox.width > faceBoundsMultiplier * faceLayoutGuideFrame.width {
private func checkFaceSizeAndPosition(using boundingBox: CGRect, shouldCheckCentering: Bool) -> FaceBoundsState {
let maxFaceWidth = faceLayoutGuideFrame.width - 20
let minFaceWidth = faceLayoutGuideFrame.width / faceBoundsMultiplier

if boundingBox.width > maxFaceWidth {
return .detectedFaceTooLarge
} else if boundingBox.width * faceBoundsMultiplier < faceLayoutGuideFrame.width {
} else if boundingBox.width < minFaceWidth {
return .detectedFaceTooSmall
} else {
if abs(
boundingBox.midX - faceLayoutGuideFrame.midX
) > faceBoundsThreshold {
return .detectedFaceOffCentre
} else if abs(boundingBox.midY - faceLayoutGuideFrame.midY) > faceBoundsThreshold {
}

if shouldCheckCentering {
let horizontalOffset = abs(boundingBox.midX - faceLayoutGuideFrame.midX)
let verticalOffset = abs(boundingBox.midY - faceLayoutGuideFrame.midY)

if horizontalOffset > faceBoundsThreshold || verticalOffset > faceBoundsThreshold {
return .detectedFaceOffCentre
} else {
return .detectedFaceAppropriateSizeAndPosition
}
}

return .detectedFaceAppropriateSizeAndPosition
}

private func checkSelfieQuality(_ value: SelfieQualityData) -> Bool {
Expand All @@ -125,8 +132,6 @@ final class FaceValidator {
_ isAcceptableBrightness: Bool,
_ isAcceptableSelfieQuality: Bool
) -> Bool {
return isAcceptableBounds &&
isAcceptableBrightness &&
isAcceptableSelfieQuality
return isAcceptableBounds && isAcceptableBrightness && isAcceptableSelfieQuality
}
}
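After this change the bounds check is two independent gates: the detected face width must fall between `faceLayoutGuideFrame.width / faceBoundsMultiplier` (1.5) and `faceLayoutGuideFrame.width - 20`, and centring is only enforced when no liveness task is active, since a head turn naturally pushes the face off-centre. A standalone restatement of the same decision logic, simplified from the new `checkFaceSizeAndPosition` for illustration:

```swift
import CoreGraphics

enum FaceBoundsState {
    case detectedFaceTooLarge
    case detectedFaceTooSmall
    case detectedFaceOffCentre
    case detectedFaceAppropriateSizeAndPosition
}

// Same thresholds as the diff: multiplier 1.5, 20pt upper margin,
// 50pt centring tolerance. Centring is skipped during liveness tasks.
func evaluateFaceBounds(
    boundingBox: CGRect,
    guideFrame: CGRect,
    shouldCheckCentering: Bool,
    faceBoundsMultiplier: CGFloat = 1.5,
    faceBoundsThreshold: CGFloat = 50
) -> FaceBoundsState {
    let maxFaceWidth = guideFrame.width - 20
    let minFaceWidth = guideFrame.width / faceBoundsMultiplier

    if boundingBox.width > maxFaceWidth {
        return .detectedFaceTooLarge
    } else if boundingBox.width < minFaceWidth {
        return .detectedFaceTooSmall
    }

    if shouldCheckCentering {
        let horizontalOffset = abs(boundingBox.midX - guideFrame.midX)
        let verticalOffset = abs(boundingBox.midY - guideFrame.midY)
        if horizontalOffset > faceBoundsThreshold || verticalOffset > faceBoundsThreshold {
            return .detectedFaceOffCentre
        }
    }

    return .detectedFaceAppropriateSizeAndPosition
}
```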
19 changes: 14 additions & 5 deletions Sources/SmileID/Classes/FaceDetector/LivenessCheckManager.swift
@@ -61,17 +61,26 @@ class LivenessCheckManager: ObservableObject {

/// Resets the task timer to the initial timeout duration.
private func resetTaskTimer() {
stopTaskTimer()
taskTimer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: false) { _ in
self.elapsedTime += 1
if self.elapsedTime == self.taskTimeoutDuration {
self.handleTaskTimeout()
guard taskTimer == nil else { return }
DispatchQueue.main.async {
self.taskTimer = Timer.scheduledTimer(
withTimeInterval: 1.0,
repeats: true) { [weak self] _ in
self?.taskTimerFired()
}
}
}

private func taskTimerFired() {
self.elapsedTime += 1
if self.elapsedTime == self.taskTimeoutDuration {
self.handleTaskTimeout()
}
}

/// Stops the current task timer.
private func stopTaskTimer() {
guard taskTimer != nil else { return }
taskTimer?.invalidate()
taskTimer = nil
}
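The refactored timer is created on the main queue (where a run loop is guaranteed to be running, which `Timer.scheduledTimer` needs in order to fire), repeats every second, captures `self` weakly, and is guarded so a second `resetTaskTimer` call cannot stack a duplicate timer. A minimal standalone sketch of that pattern, using a plain countdown in place of the SDK's liveness task handling:

```swift
import Foundation

// Sketch of the repeating-timer pattern adopted here: scheduled on the
// main queue, weak self in the callback, and guarded against creating a
// second timer while one is already running.
final class CountdownSketch {
    private var timer: Timer?
    private var elapsedSeconds = 0
    private let timeoutSeconds = 10

    func start() {
        guard timer == nil else { return } // don't stack timers
        DispatchQueue.main.async {
            self.timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
                self?.tick()
            }
        }
    }

    private func tick() {
        elapsedSeconds += 1
        if elapsedSeconds == timeoutSeconds {
            stop()
            print("timed out") // stand-in for handleTaskTimeout()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```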
11 changes: 11 additions & 0 deletions Sources/SmileID/Classes/Helpers/NavigationHelper.swift
@@ -16,3 +16,14 @@ extension View {
}
}
}

public struct ModalModeKey: EnvironmentKey {
public static let defaultValue = Binding<Bool>.constant(false)
}

extension EnvironmentValues {
public var modalMode: Binding<Bool> {
get { self[ModalModeKey.self] }
set { self[ModalModeKey.self] = newValue }
}
}
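`ProductCell` injects its `isPresented` binding through `.environment(\.modalMode, $isPresented)`, so any screen inside the full-screen cover can end the whole selfie capture flow without a binding being threaded through every intermediate view. A short example of a child view using the key, assuming the `modalMode` environment extension defined above is in scope — the view itself is hypothetical, not one of the SDK's screens:

```swift
import SwiftUI

// Hypothetical child screen: reads the modalMode binding injected by
// ProductCell's fullScreenCover and flips it to dismiss the entire flow.
struct DoneScreenSketch: View {
    @Environment(\.modalMode) private var modalMode

    var body: some View {
        Button("Done") {
            modalMode.wrappedValue = false // closes the full-screen cover
        }
    }
}
```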
@@ -76,6 +76,7 @@ public class SmileIDResourcesHelper {
public static var ConsentContactDetails = SmileIDResourcesHelper.image("ConsentContactDetails")!
public static var ConsentDocumentInfo = SmileIDResourcesHelper.image("ConsentDocumentInfo")!
public static var ConsentPersonalInfo = SmileIDResourcesHelper.image("ConsentPersonalInfo")!
public static var Loader = SmileIDResourcesHelper.image("Loader")!

/// Size of font.
public static let pointSize: CGFloat = 16
11 changes: 11 additions & 0 deletions Sources/SmileID/Classes/Networking/Models/FailureReason.swift
@@ -0,0 +1,11 @@
import Foundation

public enum FailureReason {
case activeLivenessTimedOut

var key: String {
switch self {
case .activeLivenessTimedOut: return "mobile_active_liveness_timed_out"
}
}
}
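`FailureReason` currently has a single case, and `key` is the string the backend expects for an active-liveness timeout. How the key is attached to the submission payload is handled by the (renamed) selfie submission manager, which is not part of this excerpt; the sketch below only illustrates turning the enum into request metadata, and the `submissionMetadata` helper is made up for this example rather than the SDK's real API:

```swift
import Foundation

// Illustrative only: the real submission call lives in the selfie
// submission manager, which is not shown in this excerpt.
func submissionMetadata(for failureReason: FailureReason?) -> [String: Bool] {
    guard let failureReason = failureReason else { return [:] }
    return [failureReason.key: true]
}

// e.g. submissionMetadata(for: .activeLivenessTimedOut)
// -> ["mobile_active_liveness_timed_out": true]
```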