CVBL_IRIS Gender Classification Database
Image Processing and Biometric Research, Computer Vision And Biometric Laboratory (CVBL)
Saeed Aryanmehr
Computer Department
Islamic Azad University, Isfahan (Khorasgan) Branch
Isfahan, Iran
e-mail: Saeed.aryanmehr@khuisf.ac.ir
Mohsen Karimi
Electric Department
Islamic Azad University, Dolatabad Branch
Isfahan, Iran
e-mail: mohsenkarimi.ht@iauda.ac.ir
Farsad Zamani Boroujeni
Computer Department
Islamic Azad University, Isfahan (Khorasgan) Branch
Isfahan, Iran
e-mail: f.zamani@khuisf.ac.ir
Abstract—Iris recognition has been an active research subject for the last two decades and still raises many challenges. One new and interesting challenge in iris studies is gender recognition from iris images. Gender classification can be applied to reduce the processing time of the identification process; it can also be used in applications such as access control systems and gender-based marketing. To the best of our knowledge, only a few studies have been conducted on gender recognition through the analysis of iris images. Considering the importance of this research area and its commercial applications, it is essential for researchers to exploit efficient color features in their algorithms, which necessitates the production of color iris image databases. The present study introduces an iris image database for gender classification and proposes a new gender classification algorithm for its evaluation. The database consists of iris images taken from 720 subjects, 370 female and 350 male, among university students. For each subject, more than 6 images were taken from both the left and the right eye. After examining the images, the 3 most appropriate images from each eye were selected and included in the database. All 4320 images in this database were taken under the same conditions and with the same color camera. Finally, the quality and efficiency of the introduced database are evaluated using a new method that extracts Zernike moments from spectral texture features, together with two well-known classifiers, namely SVM and KNN. The results reveal a significant improvement in gender classification compared with similar databases.
Keywords-iris; gender classification; texture spectral feature; Zernike moment
I. INTRODUCTION
Gender classification using biometric features is an interesting topic in biometric identification that attracts many researchers around the world. There are some studies in the literature on gender recognition; however, most of them rely only on visual features extracted from gray-level images. Comprehensive studies on all important aspects of the gender classification problem require appropriate databases for pre-processing, proper feature extraction, and training classifiers; the resulting algorithms would be more efficient and more suitable for industry. Among the different biometrics applied in gender recognition, such as face, iris, and voice, iris patterns are the hardest to forge, providing a highly reliable authentication tool for important and sensitive systems. In recent years, a vast amount of research has been devoted to producing efficient algorithms and databases for recognizing gender from iris images. Nevertheless, there are only a few iris image databases for gender recognition. Among them, the two most widely used are UND_V [1] and GFI [2], which are described in the following section. Images in both the UND_V and GFI databases were taken in the infrared spectrum, which requires expensive equipment. The UND_V database has 1944 images from 324 subjects, with 3 images taken from each of the left and right eyes; it covers both genders, with 175 male and 149 female subjects. The GFI database has 3000 images from 1500 subjects (750 males and 750 females), and both left and right eyes were acquired and saved. Despite the many advantages of these two databases, they have some shortcomings as well. Firstly, both were produced with infrared cameras, so algorithms that process this type of image cannot benefit from the color features of the iris. Secondly, although the UND_V database has sufficient left- and right-eye images, it was collected from a small population, which provides too few examples to train algorithms for real-world applications. The GFI database has a sufficient number of subjects, but only one image was taken from each eye, which makes it unreliable: there is no alternative if the quality of an image is not appropriate for a particular application. Nevertheless, thanks to the databases produced so far, there are some comprehensive research studies with interesting findings.
Research on gender recognition and classification using iris images was first introduced by Thomas et al. in 2007 [3]. Since then, only a few studies have been conducted on the problem [1]-[5]. Most of them follow a similar procedure consisting of image acquisition, pre-processing, feature extraction, and finally classification. The method proposed by Thomas et al. relies mainly on the iris segmentation algorithm proposed by Daugman [6]. The authors of [3] were the first to propose gender classification based on iris images. They examined 5000 left-eye irises; in their method, 7 geometric features were chosen from the iris and cornea, and a random-tree classifier achieved 80% accuracy. The study was conducted only on left eyes, which limits the applicability of their database compared with those that include iris images of both eyes. Two years later, Hollingsworth et al. [7] showed that in iris images segmented by the Daugman method, pixels contain different amounts of information. In other words, not all segments of the iris carry the same information, and the intermediate bands usually carry more. In general, identifying the bands with more information can lead to higher speed and accuracy in the recognition process. Following this study, the method suggested in [8] used statistical and spectral features extracted by the wavelet transform, followed by a support vector machine, and achieved 83.3% accuracy for iris recognition. The information was extracted from all segments of the iris, which causes a remarkable increase in processing time. A total of 300 images was used in their study, taken from both eyes at one image per eye, from 150 subjects including 50 females and 100 males.
The method presented in [1] uses local binary patterns (LBP) for gender recognition and reports a great improvement: different variants of LBP were examined on the iris images, resulting in 91% accuracy. Tapia et al. [2] found that some extracted bands carry more information for gender recognition than others. To select the most informative bands, the mutual-information technique was used, resulting in 89% accuracy. In their method, the most informative bands were selected from 20 bands, and they found that increasing the number of bands makes it more likely to find informative ones. Recently, many researchers have become interested in Zernike features, because Zernike moments extract strong features from images with soft textures. Zernike moments have proved efficient in fingerprint identification [9], palm print recognition [10], [11], and face recognition [12], which are common biometric modalities for human identification. Thus, it is expected that features extracted from moments near the center of the iris serve as major descriptors for human identification and gender recognition, and it is important to know the properties of the spectral features extracted from the regions close to the center of the iris. Considering the circular shape of the iris, this region contains a significant amount of texture with spectral features [2]. The present study proposes a new Zernike moment-based feature extraction method that is able to improve the gender classification rate significantly. The paper is organized as follows: Section II introduces the CVBL_IRIS database in detail. Section III presents the proposed method, which extracts Zernike moments of spectral texture features for gender classification. Section IV compares the results obtained by applying the proposed method to the CVBL_IRIS and UND_V databases and discusses the findings. Finally, Section V presents the conclusions.
II. CVBL_IRIS DATABASE
The iris images in the existing databases are acquired using infrared cameras, which only produce gray-level images. The present study proposes a new database containing color iris images for human identification and gender classification. The database was produced in 2017, and the images were taken with a Canon D550 camera with an 18-55 mm lens. To prevent the reflection of light from the iris, a MEIKE MK-FC110 ring flash [13] was used. The flash was installed on the lens to improve the image quality and enhance the salient patterns of the iris. Figure 1 shows the ring flash mounted on the camera.
Figure 1. The MEIKE MK-FC110 flash ring.
The original resolution of the acquired images was 5184x3456 pixels. The images were then resized to 320x280 for the final processing steps. This pre-processing was done with the ImageJ software (version 1.43u) [14], with which the iris region was extracted from the images. The database consists of 4320 samples from 720 subjects, selected from university students of Islamic Azad University, Isfahan (Khorasgan) branch. For each subject six images were taken: 3 from the left eye and 3 from the right eye. The numbers of male and female subjects were almost equal. The age range of the sample population was 18 to 50 years, but the majority were students aged 18 to 25. Table I compares the existing databases for gender recognition with the proposed database.
As shown in Table I, the sample population and image quality are similar to those of the standard databases. Figure 2 shows three iris images taken from the left eye of a female subject, and Figure 3 shows three taken from the right eye of a male subject, after the cropping step.
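The cropping, resizing, and grayscale conversion above were performed in ImageJ; as a rough illustration of the two array operations involved (a sketch, not the actual ImageJ processing), they can be written as:

```python
import numpy as np

def rgb2gray(rgb):
    """Convert an H x W x 3 RGB array to grayscale using the
    standard ITU-R 601 luminance weights."""
    return rgb @ np.array([0.2989, 0.5870, 0.1140])

def resize_nn(img, out_h=280, out_w=320):
    """Nearest-neighbour resize of a 2-D array to the 320x280
    working resolution used for the database images."""
    in_h, in_w = img.shape
    rows = np.arange(out_h) * in_h // out_h   # source row for each output row
    cols = np.arange(out_w) * in_w // out_w   # source column for each output column
    return img[rows[:, None], cols]
```

A higher-quality interpolation (bilinear or Lanczos) would normally be preferred for real pre-processing; nearest-neighbour is shown only because it is the simplest to express.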
TABLE I. SUMMARY OF THE PROPOSED DATABASE AND THE GFI AND UND_V DATABASES

Database                   | UND_V              | GFI                | Our Database
---------------------------|--------------------|--------------------|--------------------
Image type                 | NIR                | NIR                | VIS
No. of subjects            | 324                | 1500               | 720
No. of images per subject  | 6                  | 2                  | 6
No. of images per eye      | 3                  | 1                  | 3
No. of subjects per gender | 175 men, 149 women | 750 men, 750 women | 350 men, 370 women
Resolution                 | 640x480            | 640x480            | 640x580
Format                     | TIFF               | TIFF               | JPG
Available                  | Yes                | Yes                | Yes
Figure 2. Three left eye iris images taken from a female subject
Figure 3. Three right eye iris images taken from a male subject
All the images in this database were acquired under similar conditions, and the distance between the lens and the subject's eye was fixed and controlled by a tripod and a fixing instrument. Figure 4 shows how the imaging instrument was used.
The images of the proposed database are accessible on IEEE Dataport (https://ieee-dataport.org/documents/cvbl-iris-super-resolution-dataset). All the data files are labeled by a very simple rule that makes searching and querying the images easy and straightforward. Table II presents the naming rule together with an example.
Figure 4. The imaging equipment used for acquiring CVBL_IRIS
database
TABLE II. IMAGE FILE NAME DESCRIPTION

Subject ID | Gender      | Orientation | Eye image ID
-----------|-------------|-------------|-------------
ID         | GF (Female) | LE (Left)   | 01 to 03
           | GM (Male)   | RI (Right)  |

Example file name: 122.GM.LE.00 (subject ID 122, male, left eye).
Figure 5. An example of naming the images in the CVBL database
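The naming rule in Table II can be decoded programmatically. A small sketch (the field meanings follow Table II; the helper name and dictionary keys are ours, not part of the dataset):

```python
def parse_cvbl_name(stem):
    """Split a CVBL_IRIS file name such as '122.GM.LE.01' into its
    four dot-separated fields: subject ID, gender code (GF/GM),
    eye orientation (LE/RI), and per-eye image number."""
    subject, gender, eye, shot = stem.split(".")
    return {
        "subject": int(subject),
        "gender": "female" if gender == "GF" else "male",  # GF / GM
        "eye": "left" if eye == "LE" else "right",         # LE / RI
        "shot": int(shot),                                 # per-eye image ID
    }
```

For example, `parse_cvbl_name("122.GM.LE.01")` identifies the file as the first left-eye image of male subject 122.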
III. THE EVALUATION ALGORITHM
The spectral features of the iris texture include a huge amount of information that can be used to increase identification accuracy. Texture information intuitively represents the degree of smoothness, coarseness, and homogeneity. There are two types of texture features: spatial-domain features and frequency-domain features. Three main classes of methods can be used to describe texture:
Statistical Methods
Spectral Methods
Structural Methods
The mean, standard deviation, smoothness, third moment, homogeneity, and entropy are statistical moments. Spectral features involve frequency-domain transforms such as the Fourier transform, or multi-resolution transforms such as the wavelet transform. The recognition and interpretation of spectral features becomes simpler by expressing the spectrum in polar coordinates.
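The statistical descriptors named above can be computed directly from the gray-level histogram. A minimal sketch following the standard Gonzalez and Woods formulation (not necessarily the authors' exact implementation):

```python
import numpy as np

def statistical_texture(gray, bins=256):
    """Compute the six statistical texture descriptors named in the
    text (mean, standard deviation, smoothness, third moment,
    uniformity/homogeneity, entropy) from a grayscale image."""
    hist, _ = np.histogram(gray, bins=bins, range=(0, bins))
    p = hist / hist.sum()                    # gray-level probabilities
    levels = np.arange(bins)
    m = (levels * p).sum()                   # mean
    var = ((levels - m) ** 2 * p).sum()
    sigma = np.sqrt(var)                     # standard deviation
    R = 1 - 1 / (1 + var / (bins - 1) ** 2)  # smoothness (variance normalized)
    mu3 = ((levels - m) ** 3 * p).sum()      # third moment (skewness measure)
    U = (p ** 2).sum()                       # uniformity / homogeneity
    nz = p[p > 0]
    H = -(nz * np.log2(nz)).sum()            # entropy in bits
    return m, sigma, R, mu3, U, H
```

A perfectly flat region gives zero standard deviation, smoothness, third moment, and entropy, and a uniformity of 1, which is a quick sanity check for the implementation.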
Almost all identification methods in the different biometric domains use a set of particular features in either the spatial or the frequency domain. This holds for gender classification from iris images as well. The main objective of the present study is to propose a new database for gender classification, as well as to introduce a method for evaluating gender classification on this new database. The proposed algorithm, called SP_ZR, is applied to the UND_V and CVBL_IRIS databases in order to compare the results and the quality of the two databases. In the proposed method, a spectral texture feature based on Zernike moments is used. The following section presents and elaborates the basic concepts of the spectral feature extractor and Zernike moments.
A. The Basic Concepts of Zernike and Spectral Texture
Zernike features are highly popular because they can extract strong features from highly delicate textures, and the iris has delicate, coherent textures with a huge variety of patterns. Zernike moments have proved to be efficient features for fingerprint images, palm lines, palm vessels, and face images; thus, it is expected that they can efficiently extract useful features for gender recognition. Also, Zernike moments carry no redundant information because their basis functions are orthogonal, so, in comparison with other features, their computational load is reduced as a result of a significant reduction in the data. Furthermore, Zernike moments are invariant to rotation and translation and are robust to noise. The core of the Zernike moments are the Zernike polynomials. The two-dimensional Zernike moment of order p and repetition q of an image f(r, \theta) in polar coordinates is

Z_{pq} = \frac{p+1}{\pi} \int_{0}^{2\pi}\int_{0}^{1} f(r,\theta)\, V_{pq}^{*}(r,\theta)\, r\, dr\, d\theta \qquad (1)

where

V_{pq}(r,\theta) = R_{pq}(r)\, e^{jq\theta} \qquad (2)

and R_{pq}(r) is the real-valued radial polynomial

R_{pq}(r) = \sum_{s=0}^{(p-|q|)/2} \frac{(-1)^{s}(p-s)!}{s!\left(\frac{p+|q|}{2}-s\right)!\left(\frac{p-|q|}{2}-s\right)!}\, r^{p-2s} \qquad (3)

In equation (3), p \geq |q| and p - |q| is even.
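A direct, unoptimized discretization of the moments above can be sketched as follows, assuming a square image mapped onto the unit disk (this is an illustration of the definition, not the paper's implementation):

```python
import math
import numpy as np

def radial_poly(p, q, r):
    """Zernike radial polynomial R_pq(r); requires p >= |q| and
    p - |q| even, as stated for Eq. (3)."""
    q = abs(q)
    assert p >= q and (p - q) % 2 == 0
    R = np.zeros_like(r, dtype=float)
    for s in range((p - q) // 2 + 1):
        c = ((-1) ** s * math.factorial(p - s)
             / (math.factorial(s)
                * math.factorial((p + q) // 2 - s)
                * math.factorial((p - q) // 2 - s)))
        R += c * r ** (p - 2 * s)
    return R

def zernike_moment(img, p, q):
    """Order-p, repetition-q Zernike moment of a square image whose
    pixel grid is mapped onto the unit disk (discretization of Eq. (1))."""
    n = img.shape[0]
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]   # unit-square coordinates
    r = np.hypot(x, y)
    theta = np.arctan2(y, x)
    mask = r <= 1                                # keep only the unit disk
    V = radial_poly(p, q, r) * np.exp(-1j * q * theta)  # conjugate basis
    dx = 2 / (n - 1)                             # grid spacing
    return (p + 1) / np.pi * (img * V * mask).sum() * dx * dx
```

As a sanity check, the zeroth-order moment of a constant image equal to 1 is approximately 1, since the integral in Eq. (1) then reduces to the disk area divided by pi.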
IV. AN EXPLANATION OF THE PROPOSED METHOD
The present paper introduces a new iris database for gender recognition. It also proposes a new method, called SP_ZR, based on Zernike moment extraction from spectral texture features. Figure 6 shows the block diagram of the implementation steps of this algorithm.
In the proposed method, the images of both the CVBL_IRIS and UND_V databases are first divided into training and testing partitions. From each set of three images of an eye, two were used for training and one for testing. In the next step, the Daugman algorithm is used for iris segmentation. Figure 7 shows the original image, the segmented region, and the normalized image for a sample iris image from the CVBL_IRIS database; similarly, Figure 8 shows the cropped, segmented, and normalized iris image from the UND_V database. After the segmentation step, the Zernike moments were extracted from the segmented iris image. The final feature vectors were divided into male and female classes, providing the training and testing samples for the SVM and KNN classifiers.
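The 2-of-3 split and dual-classifier evaluation described above can be sketched with scikit-learn (the classifier hyper-parameters here are assumptions, not the paper's settings):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

def evaluate(features, labels, shot):
    """For each eye, images 1 and 2 train and image 3 tests, as in
    the paper's protocol; returns the identification rate of both
    classifiers on the held-out images."""
    train, test = shot < 3, shot == 3
    rates = {}
    for name, clf in (("KNN", KNeighborsClassifier(n_neighbors=3)),
                      ("SVM", SVC(kernel="rbf"))):
        clf.fit(features[train], labels[train])
        rates[name] = clf.score(features[test], labels[test])
    return rates
```

Here `features` is the matrix of Zernike-based feature vectors, `labels` holds the gender classes, and `shot` is the per-eye image number (1 to 3) from the file-naming rule.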
[Figure 6 block diagram: image acquisition -> crop image -> resize image and rgb2gray -> segmentation & normalization -> extraction of texture and Zernike features; the training features populate a features database that, together with the test-image features, feeds the SVM & KNN classifiers for gender classification.]
Figure 6. Block diagram of the proposed method
Figure 7. (a) A sample cropped image, (b) the segmented image, and (c) the normalized image in the CVBL_IRIS database
V. RESULTS AND DISCUSSION
The proposed method, SP_ZR, was applied to the CVBL_IRIS database. It first extracts the spectral features from the iris texture; then the coefficients near the spectral features of the texture are computed. To evaluate the proposed method, the identification rate was used as the evaluation measure, to demonstrate the superiority of the CVBL_IRIS database over the UND_V database. Table III shows the results of applying the proposed method to CVBL_IRIS and UND_V with the KNN classifier.
Figure 8. (a) A sample cropped image, (b) the segmented image, and (c) the normalized image in the UND_V database
TABLE III. IDENTIFICATION RATE OF THE PROPOSED METHOD FOR THE RIGHT AND LEFT EYES WITH 1ST- TO 9TH-ORDER MOMENTS AND THE KNN CLASSIFIER

Order | CVBL Right | CVBL Left | UND_V Right | UND_V Left
------|------------|-----------|-------------|-----------
1     | 78%        | 68%       | 59%         | 60%
2     | 77%        | 78%       | 63%         | 64%
3     | 80%        | 79%       | 66%         | 68%
4     | 85%        | 82%       | 71%         | 69%
5     | 83%        | 83%       | 70%         | 71%
6     | 81%        | 85%       | 74%         | 72%
7     | 80%        | 88%       | 70%         | 73%
8     | 80%        | 86%       | 70%         | 71%
9     | 82%        | 82%       | 71%         | 73%
In this phase, the first- to ninth-order moments were tested. As shown in Table III, the best results were obtained with the 7th- and 6th-order moments for the left eye of CVBL_IRIS, with identification rates of 88% and 85%, respectively. For the UND_V database, the identification rate for the 6th-order moment is about 74% for the right eye, and an identification rate of 73% was obtained with the 9th-order moment for the left eye.
Table IV compares the recognition rates of the proposed method for the left and right eyes with first- to 9th-order moments and the SVM classifier on both databases.
On CVBL_IRIS, the best result was the 77% identification rate obtained with the 9th-order moment for the right eye; the corresponding rate for the left eye was 76%. Applying SP_ZR to the UND_V database, the rates with the 9th-order moment were 71% for the right eye and 69% for the left eye.
The findings reveal that the proposed method, based on extracting Zernike moment-based features, performs much better on CVBL_IRIS than on the UND_V database. The presence of dominant features in the textures recorded in CVBL_IRIS apparently justifies its superiority. Although the computation time increases with the order of the Zernike moment, the improvement in the identification rate seems to compensate for the increase in computation time.
TABLE IV. IDENTIFICATION RATE OF THE PROPOSED METHOD FOR THE RIGHT AND LEFT EYES WITH FIRST- TO 9TH-ORDER MOMENTS AND THE SVM CLASSIFIER

Order | CVBL Right | CVBL Left | UND_V Right | UND_V Left
------|------------|-----------|-------------|-----------
1     | 68%        | 61%       | 58%         | 59%
2     | 69%        | 66%       | 61%         | 63%
3     | 68%        | 70%       | 63%         | 61%
4     | 69%        | 70%       | 62%         | 59%
5     | 69%        | 69%       | 65%         | 63%
6     | 73%        | 73%       | 57%         | 60%
7     | 70%        | 74%       | 69%         | 68%
8     | 73%        | 76%       | 70%         | 72%
9     | 77%        | 76%       | 71%         | 69%
VI. CONCLUSION
The present study proposes a new database, called CVBL_IRIS, for gender classification based on iris images. To evaluate the proposed database, a new feature-extraction method called SP_ZR, based on extracting Zernike moments from spectral texture features, was proposed and applied to the CVBL_IRIS database. To evaluate the proposed method and compare the databases, it was also applied to the UND_V database. The identification-rate results show that the extracted features are more discriminative in the proposed database than in its counterpart. Furthermore, using our database, future studies can exploit color features for iris recognition and gender classification.
REFERENCES
[1] J. E. Tapia, C. A. Perez, and K. W. Bowyer, “Gender classification
from iris images using fusion of uniform local binary patterns,” in
European Conference on Computer Vision, 2014, pp. 751–763.
[2] J. E. Tapia, C. A. Perez, and K. W. Bowyer, “Gender Classification
From the Same Iris Code Used for Recognition,” IEEE Trans. Inf.
Forensics Secur., vol. 11, no. 8, pp. 1760–1770, 2016.
[3] V. Thomas, N. V Chawla, K. W. Bowyer, and P. J. Flynn, “Learning
to predict gender from iris images,” in Biometrics: Theory,
Applications, and Systems, 2007. BTAS 2007. First IEEE International
Conference on, 2007, pp. 1–5.
[4] A. Bansal, R. Agarwal, and R. K. Sharma, “SVM based gender
classification using iris images,” in Computational Intelligence and
Communication Networks (CICN), 2012 Fourth International
Conference on, 2012, pp. 425–429.
[5] S. Lagree and K. W. Bowyer, “Predicting ethnicity and gender from
iris texture,” in Technologies for Homeland Security (HST), 2011
IEEE International Conference on, 2011, pp. 440–445.
[6] J. Daugman, “How iris recognition works,” IEEE Trans. Circuits Syst. Video Technol., vol. 14, no. 1, pp. 21–30, 2004.
[7] K. P. Hollingsworth, K. W. Bowyer, and P. J. Flynn, “The best bits in
an iris code,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 31, no. 6,
pp. 964–973, 2009.
[8] A. Bansal, R. Agarwal, and R. K. Sharma, “SVM based gender
classification using iris images,” in Computational Intelligence and
Communication Networks (CICN), 2012 Fourth International
Conference on, 2012, pp. 425–429.
[9] S. Kahyaei and M. S. Moin, “Robust matching of fingerprints using
pseudo-Zernike moments,” 2016 4th Int. Conf. Control.
Instrumentation, Autom. ICCIA 2016, no. January, pp. 116–120, 2016.
[10] J. Svoboda, J. Masci, and M. M. Bronstein, “Palmprint recognition via
discriminative index learning,” Proc. - Int. Conf. Pattern Recognit., pp.
4232–4237, 2017.
[11] W. L. Yang and L. L. Wang, “Research of palmprint identification
method using Zernike moment and neural network,” Proc. - 2010 6th
Int. Conf. Nat. Comput. ICNC 2010, vol. 3, no. Icnc, pp. 1310–1313,
2010.
[12] D. N. Satange, A. Alsubari, and R. J. Ramteke, “Composite Feature
Extraction based on Gabor and Zernike Moments for Face
Recognition,” IOSR J. Comput. Eng., no. August, pp. 2278–661, 2017.
[13] “Meike FC-110, Meike FC-110 LED Macro Ring Flash Light FC110
for Canon EOS Nikon Pentax Olympus camera | MeiKe Store.”
[Online]. Available: http://www.meikestore.com/product/meike-fc-
110-meike-fc-110-led-macro-ring-flash-light-fc110-for-canon-eos-
nikon-pentax-olympus-camera/2094.html. [Accessed: 21-Dec-2017].
[14] “ImageJ Download.” [Online]. Available:
https://imagej.nih.gov/ij/download.html. [Accessed: 21-Dec-2017].