Enlightening deep neural networks with knowledge of confounding factors

Y Zhong, G Ettinger - Proceedings of the IEEE International …, 2017 - openaccess.thecvf.com
Abstract
Despite the popularity of deep neural networks, we still strive to better understand the underlying mechanism that drives their success. Motivated by observations that neurons in trained deep nets predict variation-explaining factors only indirectly related to the training tasks, we recognize that a deep network learns representations more general than the task at hand in order to disentangle the impacts of the multiple confounding factors governing the data and isolate the effects of the factors of interest. Consequently, we propose to augment the training of deep models with information on auxiliary explanatory data factors to boost this disentanglement and improve the generalizability of the trained models, yielding better feature representations. We adopt this principle to build a pose-aware DCNN and demonstrate that auxiliary pose information improves classification accuracy. The approach is readily applicable to improving recognition and classification performance in various deep-learning applications.
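To make the idea concrete, below is a minimal sketch (not the authors' implementation) of training a CNN on the main classification task while an auxiliary head predicts a confounding factor such as pose from the shared features. The class `PoseAwareCNN`, the layer sizes, the yaw/pitch/roll pose parameterization, and the loss weight `lambda_aux` are illustrative assumptions, not details taken from the paper.

```python
# Sketch of multi-task training with an auxiliary pose head; all names and
# hyperparameters here are assumptions for illustration only.
import torch
import torch.nn as nn

class PoseAwareCNN(nn.Module):
    def __init__(self, num_classes: int, pose_dim: int = 3):
        super().__init__()
        # Shared convolutional trunk serving both tasks, encouraging
        # representations that disentangle class identity from pose.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(64, num_classes)  # main task head
        self.pose_head = nn.Linear(64, pose_dim)      # auxiliary pose head

    def forward(self, x):
        features = self.backbone(x)
        return self.classifier(features), self.pose_head(features)

def multitask_loss(class_logits, pose_pred, labels, pose_targets, lambda_aux=0.1):
    # Main cross-entropy loss plus a weighted regression loss on the
    # auxiliary pose annotations; lambda_aux balances the two objectives.
    return (nn.functional.cross_entropy(class_logits, labels)
            + lambda_aux * nn.functional.mse_loss(pose_pred, pose_targets))

# Usage with dummy data:
model = PoseAwareCNN(num_classes=10)
images = torch.randn(8, 3, 64, 64)
labels = torch.randint(0, 10, (8,))
poses = torch.randn(8, 3)  # e.g. yaw/pitch/roll annotations
logits, pose_pred = model(images)
loss = multitask_loss(logits, pose_pred, labels, poses)
loss.backward()
```

The auxiliary head is only needed during training; at test time the shared backbone provides the improved feature representation for the main classification task.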