Greedy Pruning for Continually Adapting Networks

Institution

University of Alberta (http://id.loc.gov/authorities/names/n79058482)

Degree Level

Master's

Degree

Master of Science

Department

Department of Computing Science


Abstract

Gradient descent algorithms suffer from several problems when learning representations with fixed neural network architectures, such as reduced plasticity on non-stationary continual tasks and difficulty training sparse architectures from scratch. A common workaround is to continually adapt the network by generating and pruning features, a process often called Generate and Test. This thesis focuses on neural network pruning in the online, continual setting. We examine existing pruning metrics and propose a novel pruner that attempts to estimate the ideal greedy pruner. We further observe that greedy pruning can be ineffective when features are highly correlated, since it fails to remove these redundant features. To mitigate this issue, we also propose online feature decorrelation. Through empirical experiments in the online supervised learning setting, we show that a greedy pruner combined with the proposed feature decorrelator continually replaces useless parts of the network with new features while producing a statistically significant performance improvement.
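
As a reading aid, the generate-and-test loop the abstract describes can be sketched in Python. This is a minimal illustration, not the thesis's actual algorithm: the weight-magnitude utility, the correlation-based redundancy penalty, and all names below (feature_utility, redundancy, generate_and_test_step) are assumptions made for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    def feature_utility(w_out):
        # Illustrative utility: total outgoing weight magnitude per feature.
        # The thesis proposes an estimator of the ideal greedy pruner; this
        # magnitude proxy merely stands in for any such utility metric.
        return np.abs(w_out).sum(axis=1)

    def redundancy(features):
        # features: (batch, n_features) array of recent activations.
        # Mean absolute correlation of each feature with all others; highly
        # correlated (redundant) features receive a high score.
        corr = np.corrcoef(features, rowvar=False)
        np.fill_diagonal(corr, 0.0)
        return np.abs(corr).mean(axis=0)

    def generate_and_test_step(w_in, w_out, features,
                               prune_frac=0.05, redundancy_weight=1.0):
        # Rank features by utility minus a redundancy penalty, prune the
        # lowest-ranked fraction, and reinitialize ("generate") replacements.
        n_features = w_out.shape[0]
        n_prune = max(1, int(prune_frac * n_features))
        score = feature_utility(w_out) - redundancy_weight * redundancy(features)
        pruned = np.argsort(score)[:n_prune]
        w_in[:, pruned] = rng.normal(0.0, 0.1, size=(w_in.shape[0], n_prune))
        w_out[pruned, :] = 0.0  # new features start with zero output influence
        return pruned

    # Example with a single hidden layer of 64 features on 10 inputs:
    w_in = rng.normal(size=(10, 64))
    w_out = rng.normal(size=(64, 1))
    acts = rng.normal(size=(256, 64))  # recent activation history
    replaced = generate_and_test_step(w_in, w_out, acts)

Zeroing the outgoing weights of freshly generated features lets them be tested before they influence the network's predictions, in the spirit of generate-and-test methods.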

Item Type

Thesis (http://purl.org/coar/resource_type/c_46ec)

License

This thesis is made available by the University of Alberta Libraries with permission of the copyright owner solely for non-commercial purposes. This thesis, or any portion thereof, may not otherwise be copied or reproduced without the written consent of the copyright owner, except to the extent permitted by Canadian copyright law.

Language

en
