Enhancing Security: Infused Hybrid Vision Transformer for Signature Verification

Ishfaq, Muhammad and Saadia, Ayesha and Alserhani, Faeiz M. and Gul, Ammara (2024) Enhancing Security: Infused Hybrid Vision Transformer for Signature Verification. IEEE Access, 12. pp. 137504-137521. ISSN 2169-3536

Full text: Enhancing_Security_Infused_Hybrid_Vision_Transformer_for_Signature_Verification.pdf (Published Version, 1MB download)
Available under License Creative Commons Attribution Non-commercial No Derivatives.

Abstract

Handwritten signature verification is challenging because handwritten signatures vary widely in orientation, thickness, and appearance. A strong signature verification system is essential to improve the accuracy of confirming user authentication. This investigation introduces an inclusive framework for training and evaluating hybrid vision transformer models on diverse signature datasets, aiming to refine the accuracy of confirming user authentication. In previous studies, transformers and MobileNet were used separately for computer vision classification and signature verification. Drawing inspiration from the Convolutional Neural Network (CNN), the hybrid model combines a deep-learning feature extractor (ResNet-18 and MobileNetV2) with the Vision Transformer model (proposed method 1 and proposed method 2). To bring originality to this study, we excluded the final layer of the feature extractor and smoothly integrated it with the initial layer of the vision transformer. Within the scope of this research, we introduce a unique hybrid vision transformer model. Furthermore, we incorporated swish and tangent hyperbolic (tanh) activation functions into the validation model to enhance its performance. Experimental results showcase the effectiveness of the proposed hybrid model, which achieves 92.33% accuracy on the BHSig-Bengali dataset, 99.89% on BHSig-Hindi, 99.96% on CEDAR, and 74.09% on UTSig-Persian. The practical implications of this research extend to real-time signature verification for secure and efficient user authentication, particularly in mobile applications. This advancement in signature verification technology presents new possibilities for practical use in diverse scenarios beyond academia.
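
As a rough illustration of the fusion described in the abstract (a CNN feature extractor with its final layer removed, feeding the initial layer of a vision transformer), the following minimal PyTorch sketch shows one way such a hybrid could be wired. It is not the authors' published implementation: the class name HybridViT, the token dimension, encoder depth, head count, the omission of positional embeddings, and the placement of the swish (SiLU) and tanh activations in the classification head are all illustrative assumptions.

import torch
import torch.nn as nn
from torchvision.models import resnet18

class HybridViT(nn.Module):
    def __init__(self, num_classes=2, embed_dim=256, depth=4, heads=8):
        super().__init__()
        backbone = resnet18(weights=None)
        # Keep only the convolutional stages; drop average pooling and the final FC layer.
        self.backbone = nn.Sequential(*list(backbone.children())[:-2])  # (B, 512, H/32, W/32)
        self.proj = nn.Conv2d(512, embed_dim, kernel_size=1)  # map CNN channels to token width
        self.cls_token = nn.Parameter(torch.zeros(1, 1, embed_dim))
        layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=heads,
                                           dim_feedforward=4 * embed_dim, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        # Hypothetical head using swish (SiLU) and tanh, as mentioned in the abstract.
        self.head = nn.Sequential(nn.Linear(embed_dim, embed_dim), nn.SiLU(),
                                  nn.Linear(embed_dim, embed_dim), nn.Tanh(),
                                  nn.Linear(embed_dim, num_classes))

    def forward(self, x):
        feats = self.proj(self.backbone(x))              # (B, D, h, w) CNN feature map
        tokens = feats.flatten(2).transpose(1, 2)        # (B, h*w, D) patch-like tokens
        cls = self.cls_token.expand(x.size(0), -1, -1)   # prepend a learnable class token
        z = self.encoder(torch.cat([cls, tokens], dim=1))
        return self.head(z[:, 0])                        # genuine vs. forged logits

if __name__ == "__main__":
    model = HybridViT()
    print(model(torch.randn(2, 3, 224, 224)).shape)  # torch.Size([2, 2])

Swapping the ResNet-18 backbone for MobileNetV2 features (e.g. mobilenet_v2(weights=None).features, which outputs 1280 channels) would mirror the paper's second feature-extractor variant under the same assumptions.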

Item Type: Article
Identification Number: 10.1109/ACCESS.2024.3447083
Dates:
Accepted: 28 August 2024
Published Online: 28 August 2024
Uncontrolled Keywords: Vision transformer, ResNet-18, MobileNetV2, handwritten character verification, signature verification, hybrid vision transformer, handwritten signature verification, UTSig-Persian
Subjects: CAH11 - computing > CAH11-01 - computing > CAH11-01-01 - computer science
Divisions: Architecture, Built Environment, Computing and Engineering > Computer Science
Depositing User: Gemma Tonks
Date Deposited: 22 Aug 2025 10:45
Last Modified: 22 Aug 2025 10:45
URI: https://www.open-access.bcu.ac.uk/id/eprint/16617
