Xception transfer learning with early stopping for facial age estimation

Authors

  • Marina V. Polyakova, Odessa Polytechnic National University, 1, Shevchenko Ave., Odessa, 65044, Ukraine
  • Vladyslav V. Rogachko, Odessa Polytechnic National University, 1, Shevchenko Ave., Odessa, 65044, Ukraine
  • Oleksandr H. Nesteriuk, Odessa Polytechnic National University, 1, Shevchenko Ave., Odessa, 65044, Ukraine
  • Natalia A. Huliaieva, Odessa Polytechnic National University, 1, Shevchenko Ave., Odessa, 65044, Ukraine

DOI:

https://doi.org/10.15276/aait.07.2024.6

Keywords:

Facial age estimation, Xception, parameter tuning, early stopping, unevenly illuminated images, transfer learning

Abstract

The rapid development of deep learning has attracted increasing attention to the analysis of facial images. Deep learning methods of facial age estimation are more effective than methods based on anthropometric models, active appearance models, texture models, and subspaces of aging patterns. However, deep learning networks require more computing power to process images. Pre-trained models do not need a large training set, and their training time is shorter. However, the parameters obtained as a result of transfer learning of the pre-trained network significantly affect its efficiency. It is also necessary to take into account the properties of the processed images, in particular the conditions under which they were obtained. Recently, facial age estimation has been implemented in applications on devices with limited computing resources, for example, smartphones. The memory size and power consumption of such applications are constrained by the computing power of mobile devices. In addition, when photographing a person's face with a smartphone camera, it is very difficult to ensure uniform lighting. The aim of the research is to reduce the error of facial age estimation from unevenly illuminated images by applying early stopping to transfer learning of the Xception network. The proposed transfer learning technique stops training early if no improvement of the results is observed within a certain number of epochs; the network weights from the epoch with the lowest validation loss are then saved. With the proposed technique, the average absolute error of age estimation on unevenly illuminated test images was about five years. The Xception network used here has fewer parameters than other deep learning neural networks applied to the age estimation problem, so using it reduces the resource consumption of devices with limited computing power. Prospects for further research are to reduce the unevenness of facial image lighting in order to decrease the age estimation error and, to reduce the required computing resources, to use fast transforms in the Xception convolutional layers.
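As a rough illustration of the technique described in the abstract, the sketch below shows Xception transfer learning in Keras with early stopping that restores the weights of the epoch with the lowest validation loss. All hyperparameters (patience, learning rate, input size, the regression head, and the train_ds/val_ds dataset names) are illustrative assumptions, not the settings reported in the paper.

# Minimal sketch, assuming a regression formulation of age estimation.
import tensorflow as tf
from tensorflow.keras import layers, models

# Xception pre-trained on ImageNet, without the classification head.
base = tf.keras.applications.Xception(
    weights="imagenet", include_top=False, input_shape=(299, 299, 3))
base.trainable = False  # freeze the convolutional base for transfer learning

# Regression head producing a single continuous age value.
inputs = tf.keras.Input(shape=(299, 299, 3))
x = tf.keras.applications.xception.preprocess_input(inputs)
x = base(x, training=False)
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dropout(0.3)(x)
outputs = layers.Dense(1, activation="linear")(x)
model = models.Model(inputs, outputs)

# MAE corresponds to the "average absolute error" quoted in the abstract.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="mean_absolute_error", metrics=["mae"])

# Early stopping: halt if validation loss does not improve for `patience`
# epochs, then keep the weights from the epoch with the lowest val_loss.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=10, restore_best_weights=True)

# train_ds / val_ds are assumed tf.data.Dataset objects of (image, age) pairs.
# model.fit(train_ds, validation_data=val_ds, epochs=100, callbacks=[early_stop])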


Author Biographies

Marina V. Polyakova, Odessa Polytechnic National University, 1, Shevchenko Ave., Odessa, 65044, Ukraine

Doctor of Engineering Sciences, Associate Professor, Professor of the Department of Applied Mathematics and Information Technology

Scopus Author ID: 57017879200

Vladyslav V. Rogachko, Odessa Polytechnic National University, 1, Shevchenko Ave., Odessa, 65044, Ukraine

Bachelor, Graduate Student of the Department of Applied Mathematics and Information Technology

Oleksandr H. Nesteriuk, Odessa Polytechnic National University, 1, Shevchenko Ave., Odessa, 65044, Ukraine

PhD, Associate Professor of the Department of Computer Systems

Scopus Author ID: 57207314663

Natalia A. Huliaieva, Odessa Polytechnic National University, 1, Shevchenko Ave., Odessa, 65044, Ukraine

Senior Lecturer of the Department of Applied Mathematics and Information Technology

Scopus Author ID: 57202228675

Published

2024-04-03

How to Cite

[1]
Polyakova M. V., Rogachko V. V., Nesteriuk O. H., Huliaieva N. A. "Xception transfer learning with early stopping for facial age estimation". Applied Aspects of Information Technology. 2024; Vol. 7, No. 1: 69–80. DOI: https://doi.org/10.15276/aait.07.2024.6.