In the digital world, signatures must be uploaded with most online documents. However, these signatures can easily be scanned or imitated to create counterfeit documents. Forgeries are produced from whatever details are available to the forger; using this data, a forger can imitate a signature. Such imitated signatures can often be distinguished only by the signer, but the signer cannot be consulted every time a signature's originality must be checked. Hence, distinguishing genuine signatures from forged ones has become an important area of study. Signature identification from scanned or photographed images remains an important unsolved problem in pattern recognition.
This article proposes a novel algorithm for signature identification. The algorithm constructs a structural graph using the midpoint traverse method (MPTM). From the structural graphs, features of the signatures are computed and classified using a support vector machine (SVM). The False Acceptance Ratio (FAR) and False Rejection Ratio (FRR) of genuine and forged signatures determine the accuracy of the identification.
In general, there are two major approaches to signature identification: online and offline. The online approach captures sequential data, such as handwriting dynamics and pen pressure, with a special device. In contrast, the offline approach uses an optical device (scanner, mobile camera, etc.) to obtain an image of handwriting on paper.
There are many identification methods to verify whether a signature in a document is genuine. Vohra K et al.
For structural analysis of a signature, graph theory techniques can be employed, as they analyse the signature more accurately: every point in the signature is considered during verification, which helps to identify its genuineness more precisely. A summary of the existing methods and their classification techniques is given in the table below.
| Methods | Features | Classifier |
|---|---|---|
| Guerbai et al. 2015 | Curvelet transform | RBF-SVM, MLP |
| Pham et al. 2015 | Geometry-based features | Likelihood ratio |
| Serdouk et al. 2016 | Gradient Local Binary Patterns (GLBP) and LRF | kNN |
| Pal et al. 2016 | Uniform Local Binary Patterns (ULBP) | Nearest Neighbor |
| Loka et al. 2017 | Long range correlation (LRC) | SVM |
| Zois et al. 2019 | Lattice arrangements and pixel distribution | Decision tree |
| Sharif et al. 2020 | Local pixel distribution | GA, SVM |
| Batool et al. 2020 | GLCM, geometric features | SVM |
| Ajij M et al. 2021 | Quasi-straight-line segments | SVM |
| Proposed Method | Average edge distance | SVM |
This paper is organised as follows. Section 2 describes the preprocessing of the input signature image and the extraction of its points using a grid merging algorithm and the MidPoint Traverse Method (MPTM). Section 3 discusses constructing the bipartite graph and complete bipartite graph from the extracted points, followed by classification with an SVM. Section 4 validates the proposed method.
This section proposes an automatic offline signature identification method that can be used to verify a signature's authenticity.
The extraction of points and the construction of the graphs are illustrated in the following steps:
The input signature image is first preprocessed (typically grayscale conversion, binarisation, and noise removal) before the grid is applied.
Extract the vertex points from the grid-merged images. These points are used to draw a bipartite graph and a complete bipartite graph of the input image. The points are extracted using the MPTM.
S = 1 if the signature is traversed to the midpoint; S = 0 if the signature is not traversed to the midpoint.
The size of the grid depends on the size of the input image. The width of each row and column is constant and chosen according to system requirements. The points of the signature image that coincide with the grid cells' centre points are extracted. The points are extracted by traversing the signature from left to right.
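A minimal sketch of this grid-based midpoint extraction, assuming a binary image in which 1 marks signature pixels; the cell size and the helper name `extract_midpoints` are illustrative choices, not taken from the paper:

```python
import numpy as np

def extract_midpoints(binary_img, cell=10):
    """Collect grid-cell centre points that coincide with the signature stroke.

    binary_img: 2-D array, 1 = signature pixel, 0 = background.
    cell: width/height of each grid cell in pixels (illustrative value).
    """
    h, w = binary_img.shape
    points = []
    for r in range(0, h - cell + 1, cell):         # traverse top to bottom
        for c in range(0, w - cell + 1, cell):     # and left to right
            cr, cc = r + cell // 2, c + cell // 2  # cell centre (midpoint)
            if binary_img[cr, cc]:                 # stroke hits the centre
                points.append((cr, cc))
    return points

# Tiny example: a horizontal stroke along row 5 of a 20x30 image
img = np.zeros((20, 30), dtype=np.uint8)
img[5, :] = 1
pts = extract_midpoints(img, cell=10)
```

With a 10-pixel cell, only the three cell centres lying on row 5 coincide with the stroke, so three vertex points are extracted.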
A bipartite graph is constructed based on the extracted vertex points and the interactions among them. Each point is treated as a vertex of the graph.
The complete bipartite graph of the input signature is shown in the corresponding figure.
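As a sketch, the complete bipartite graph can be represented by its edge list with edge lengths. How the extracted points are partitioned into the two vertex sets is not specified in the text above, so the midline split used below is only an assumption for illustration:

```python
from itertools import product
from math import dist  # Euclidean distance (Python 3.8+)

def complete_bipartite_edges(U, V):
    """All |U| * |V| edges of the complete bipartite graph K_{|U|,|V|},
    each paired with its Euclidean length."""
    return [((u, v), dist(u, v)) for u, v in product(U, V)]

# Illustrative partition: split the extracted points at the image midline.
points = [(5, 5), (5, 15), (5, 25), (15, 10)]
mid_col = 15
U = [p for p in points if p[1] < mid_col]   # left-half vertices
V = [p for p in points if p[1] >= mid_col]  # right-half vertices
edges = complete_bipartite_edges(U, V)      # 2 x 2 = 4 edges
```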
The dataset has 21 different sets of 60 signatures each.
| Signatures | Training Set | Testing Set | Total |
|---|---|---|---|
| Genuine Signatures | 26 | 4 | 30 |
| Forged Signatures | 25 | 5 | 30 |
| Total | 51 | 9 | 60 |
All image data selected for testing were processed in MATLAB R2018b. The program was run on a Windows machine with a 2.30 GHz processor and 8.00 GB of RAM.
For the 26 genuine and 25 forged training signatures, a threshold T was calculated. The threshold high (TH) is the highest feature value in the set of signatures, and the threshold low (TL) is the lowest. For each signature set, the midpoint of TH and TL is computed; this is the threshold (T) used for detecting false acceptance and false rejection.
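The threshold computation just described can be sketched as follows; the feature values are made-up stand-ins for those computed from the training signatures:

```python
def thresholds(feature_values):
    """TH = highest feature value, TL = lowest feature value,
    T = their midpoint, following the description above."""
    th, tl = max(feature_values), min(feature_values)
    return th, tl, (th + tl) / 2

# e.g. feature values from a handful of training signatures (illustrative)
th, tl, t = thresholds([12.0, 18.5, 9.5, 15.0])
# th = 18.5, tl = 9.5, t = 14.0
```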
For the bipartite graph and complete bipartite graph constructed from the extracted vertex points of the input signature, the pairwise distances between the vertices are computed as features.
Three feature scores are computed from the constructed graphs.

Edges in the bipartite graph: let E be the edge set of the bipartite graph constructed from the input signature. Then the score value is |E|, the number of edges, which is compared against the threshold T.

Average edge distance in the complete bipartite graph: let d(u, v) denote the Euclidean distance between the endpoints of an edge (u, v) of the complete bipartite graph. Then the score value is the mean of d(u, v) taken over all edges.

Average edge D-distance in the complete bipartite graph: the score value is computed analogously from the D-distances of the edges.
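Under the reading that the average edge distance is the mean Euclidean length over all edges of the complete bipartite graph (an assumption, since the exact formula is not reproduced here), the score can be computed as:

```python
from math import dist

def average_edge_distance(U, V):
    """Mean Euclidean length over all |U| * |V| edges of the complete
    bipartite graph K_{|U|,|V|} on vertex sets U and V."""
    lengths = [dist(u, v) for u in U for v in V]
    return sum(lengths) / len(lengths)

# Four edges with lengths 3, 5, 5, 3 -> average 4.0
U = [(0, 0), (0, 4)]
V = [(3, 0), (3, 4)]
avg = average_edge_distance(U, V)
```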
Vapnik et al. introduced the Support Vector Machine (SVM), which is used here as the classifier.
The proposed model uses 10-fold cross-validation to set the cost parameter. This cross-validation reduces the misclassification error on the training and testing sets. The model is trained with the ICDAR dataset.
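A sketch of 10-fold cross-validation for tuning the SVM cost parameter; the synthetic feature matrix stands in for the real graph features, and scikit-learn's `GridSearchCV` is one way to realise this setup (the paper's actual implementation was in MATLAB):

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.svm import SVC

# Synthetic stand-in features: one row per signature, columns such as
# [number of edges, average edge distance] (values are made up).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(10, 1, (30, 2)),    # "genuine" class
               rng.normal(14, 1, (30, 2))])   # "forged" class
y = np.array([0] * 30 + [1] * 30)             # 0 = genuine, 1 = forged

# 10-fold cross-validation to choose the cost parameter C.
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [0.1, 1, 10]},
                    cv=StratifiedKFold(n_splits=10))
grid.fit(X, y)
best_c = grid.best_params_["C"]
```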
As an initial setup, each individual had a training size of <26+25>, i.e., 26 genuine signatures and 25 forged signatures were used for training, as mentioned in Sect. 4.2. In addition, to record the error rates (FAR, FRR), the experiments were completed for 21 distinct sets of signature images with the training size <26+25>. The results obtained for six sets of signatures, verified using the features edges in the bipartite graph method, average edge D-distance in the complete bipartite graph method, and average edge distance in the complete bipartite graph method on the ICDAR dataset, are shown in the tables below.
Edges in bipartite graph method:

| Set | FAR (%) | FRR (%) | Accuracy (%) |
|---|---|---|---|
| 1 | 7.69 | 10 | 91.15 |
| 2 | 7.69 | 3.33 | 94.48 |
| 3 | 7.69 | 0 | 96.15 |
| 4 | 19.23 | 0 | 90.38 |
| 5 | 3.84 | 0 | 98.07 |
| 6 | 3.84 | 3.33 | 96.41 |
| Mean | 8.33 | 2.11 | 94.44 |
Average edge D-distance in complete bipartite graph method:

| Set | FAR (%) | FRR (%) | Accuracy (%) |
|---|---|---|---|
| 1 | 3.84 | 0 | 98.07 |
| 2 | 0 | 3.33 | 98.33 |
| 3 | 0 | 3.33 | 98.33 |
| 4 | 7.69 | 0 | 96.15 |
| 5 | 7.69 | 3.33 | 94.48 |
| 6 | 0 | 13.33 | 93.33 |
| Mean | 3.20 | 3.88 | 96.44 |
Average edge distance in complete bipartite graph method:

| Set | FAR (%) | FRR (%) | Accuracy (%) |
|---|---|---|---|
| 1 | 3.84 | 0 | 98.07 |
| 2 | 7.69 | 0 | 96.15 |
| 3 | 3.84 | 0 | 93.07 |
| 4 | 0 | 3.33 | 98.33 |
| 5 | 3.84 | 0 | 98.07 |
| 6 | 0 | 3.33 | 98.33 |
| Mean | 3.20 | 1.11 | 97.00 |
For each set, error rates are recorded, and the resultant FAR and FRR are reported. Further, with the training size <26+25> described in Sect. 4.2, we have also tested the performance of our algorithm with different distance methods. The performance of the proposed method compared with existing methods is shown in the table below.
| Dataset | Method | FAR (%) | FRR (%) | Accuracy (%) |
|---|---|---|---|---|
| ICDAR | Sharvari, K.S et al.: VGG16 | – | – | 67.9 |
| ICDAR | Sharvari, K.S et al.: ResNet50 | – | – | 62.32 |
| ICDAR | Sharvari, K.S et al.: MobileNetV2 | – | – | 59.52 |
| ICDAR | Sharvari, K.S et al.: DenseNet121 | – | – | 60.73 |
| ICDAR | Sharvari, K.S et al.: Xception | – | – | 60.29 |
| ICDAR | KAO HH et al.: Deep learning | – | – | 94.37 |
| ICDAR | Navid SM et al.: VGG19 | – | – | 94 |
| ICDAR | Proposed: SVM (Mean) & edges in bipartite graph method | 8.33 | 2.11 | 94.44 |
| ICDAR | Proposed: SVM (Mean) & average edge D-distance in complete bipartite graph method | 3.20 | 3.88 | 96.44 |
| ICDAR | Proposed: SVM (Mean) & average edge distance in complete bipartite graph method | 3.20 | 1.11 | 97.00 |
[Table 7] summarises the true labels of the genuine and forged signatures. The proposed method extracts features using the edge distance methods for the bipartite and complete bipartite graphs of all signatures. The signature parameters divide the genuine and forged sets, labelled 0 and 1 respectively. The SVM model predicts genuine and forged signatures with an accuracy above 95%.
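The FAR, FRR, and accuracy figures reported above can be computed from predicted labels as follows (labels 0 = genuine, 1 = forged, as in the text; the sample predictions are purely illustrative):

```python
def far_frr_accuracy(y_true, y_pred):
    """FAR = forged signatures accepted as genuine / total forged,
    FRR = genuine signatures rejected as forged / total genuine,
    all as percentages, with 0 = genuine and 1 = forged."""
    genuine = [p for t, p in zip(y_true, y_pred) if t == 0]
    forged = [p for t, p in zip(y_true, y_pred) if t == 1]
    far = sum(p == 0 for p in forged) / len(forged) * 100
    frr = sum(p == 1 for p in genuine) / len(genuine) * 100
    acc = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true) * 100
    return far, frr, acc

# 4 genuine + 5 forged test signatures per individual, as in the dataset table
y_true = [0, 0, 0, 0, 1, 1, 1, 1, 1]
y_pred = [0, 0, 0, 1, 1, 1, 1, 1, 0]   # one FRR error, one FAR error
far, frr, acc = far_frr_accuracy(y_true, y_pred)
# far = 20.0, frr = 25.0
```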
The graph-based method is developed for identifying the genuineness of offline signatures. In this paper, a set of genuine and forged signatures was collected from a signatory. The signatures' points were extracted, and the bipartite graph and complete bipartite graph were constructed. Features were calculated from these graphs using the edges, average edge distance, and average edge D-distance methods. From these features of the genuine and forged training signatures, a threshold value was computed and used with the SVM classifier to distinguish genuine signatures from forgeries.
The authors wish to thank the management of Sri Sivasubramaniya Nadar College of Engineering for their continuous support and encouragement.