Distilled Gaussianization
[1] David Duvenaud, et al. FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models, 2018, ICLR.
[2] Adam M. Oberman, et al. How to Train Your Neural ODE: the World of Jacobian and Kinetic Regularization, 2020, ICML.
[3] Rich Caruana, et al. Model compression, 2006, KDD '06.
[4] Geoffrey E. Hinton, et al. Distilling the Knowledge in a Neural Network, 2015, ArXiv.
[5] Stefania Matteoli, et al. An Overview of Background Modeling for Detection of Targets and Anomalies in Hyperspectral Remotely Sensed Imagery, 2014, IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing.
[6] Vladimir Vapnik. The Nature of Statistical Learning Theory, 1995.
[7] Valero Laparra, et al. Iterative Gaussianization: From ICA to Random Rotations, 2011, IEEE Transactions on Neural Networks.
[8] Ramesh A. Gopinath, et al. Gaussianization, 2000, NIPS.
[9] Natalia Gimelshein, et al. PyTorch: An Imperative Style, High-Performance Deep Learning Library, 2019, NeurIPS.
[10] Valero Laparra, et al. Information Theory Measures via Multidimensional Gaussianization, 2020, ArXiv.
[11] Stefano Ermon, et al. Gaussianization Flows, 2020, AISTATS.