Lulu Filterized Lin’s Correlative Theil–Sen Regression-Based Fully Connected Deep Multilayer Perceptive Neural Network for Eye Gaze Pattern Recognition
Abstract
Gaze estimation is the process of finding the point of gaze along the visual axis of the eye. Gaze tracking schemes are mainly employed in human–computer interaction (HCI) and in the study of visual scanning patterns. Traditional tracking schemes usually require an accurate personal calibration procedure to estimate subject-specific eye metrics. To improve gaze estimation accuracy, a Lulu Filterized Lin's Correlative Theil–Sen Regression-based Fully Connected Deep Multilayer Perceptive Neural Network (LFLCTR-FCDMPNN) is designed for accurate gaze pattern identification with reduced time consumption. The fully connected deep multilayer perceptive neural network consists of an input layer, three hidden layers, and an output layer. The input layer receives the collected gaze images. In the first hidden layer, a LULU nonlinear smoothing filter is applied to remove noise and enhance image quality. In the second hidden layer, eye-gaze points are estimated using a polar coordinate system. Finally, gaze pattern matching is carried out in the third hidden layer using Lin's concordance correlative Theil–Sen regression: the estimated gaze points are arranged on the gaze plane to identify gaze patterns, and pattern matching is performed with Lin's concordance correlation. In this way, eye gaze patterns are correctly recognized at the output layer. An experimental evaluation is conducted to analyze the performance of the LFLCTR-FCDMPNN technique with metrics such as gaze pattern recognition accuracy, gaze pattern recognition time, and false-positive rate for different numbers of eye images. The results illustrate that the LFLCTR-FCDMPNN method improves gaze pattern recognition accuracy and reduces time consumption compared with conventional prediction methods. On the SynthesEyes dataset, the false-positive rate of the proposed LFLCTR-FCDMPNN was 63% lower than that of existing methods.
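The three algorithmic components named in the abstract — LULU nonlinear smoothing, Theil–Sen regression, and Lin's concordance correlation — can be sketched in isolation as follows. This is an illustrative sketch, not the paper's implementation: the function names (`lulu_smooth`, `theil_sen`, `lins_ccc`), the smoother width `n`, and the treatment of gaze data as 1-D sequences are all assumptions for demonstration.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def lulu_smooth(x, n=1):
    """LULU smoother U_n(L_n(x)): removes impulsive spikes of width <= n
    from a 1-D signal (e.g., an image scanline or gaze-coordinate trace)."""
    x = np.asarray(x, dtype=float)

    def L(s):  # floor operator: max of sliding minima (removes up-spikes)
        p = np.pad(s, n, mode="edge")
        mins = sliding_window_view(p, n + 1).min(axis=1)
        return sliding_window_view(mins, n + 1).max(axis=1)

    def U(s):  # ceiling operator: min of sliding maxima (removes down-spikes)
        p = np.pad(s, n, mode="edge")
        maxs = sliding_window_view(p, n + 1).max(axis=1)
        return sliding_window_view(maxs, n + 1).min(axis=1)

    return U(L(x))

def theil_sen(x, y):
    """Theil-Sen estimator: median of all pairwise slopes, with the
    intercept taken as the median residual; robust to outlying gaze points."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x)) for j in range(i + 1, len(x))
              if x[j] != x[i]]
    slope = np.median(slopes)
    intercept = np.median(y - slope * x)
    return slope, intercept

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between two sequences:
    1.0 indicates perfect agreement, values near 0 indicate no concordance."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    sxy = ((x - mx) * (y - my)).mean()
    return 2.0 * sxy / (x.var() + y.var() + (mx - my) ** 2)
```

In a pipeline of this shape, a noisy gaze trace would first be denoised with `lulu_smooth`, `theil_sen` would give a robust linear trend of the estimated gaze points on the gaze plane, and `lins_ccc` would score how closely an observed gaze pattern agrees with a stored template pattern.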
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.