Differences between revisions 9 and 18 (spanning 9 versions)
Revision 9 as of 2017-06-11 13:52:04
Size: 1983
Editor: DavidOwen
Comment: ML generating ML
Revision 18 as of 2019-08-02 15:31:04
Size: 3443
Editor: DavidOwen
Comment:
Line 11: Line 11:
 * [[https://arxiv.org/abs/1706.03741|Deep reinforcement learning from human preferences]]: Learns a reward model from pairwise human preferences over behaviour, aiming to minimize the amount of feedback time a human must spend for the system to train itself correctly (see the reward-model sketch after this list).
 * [[https://arxiv.org/abs/1708.00630|ProjectionNet: Learning Efficient On-Device Deep Networks Using Neural Projections]]: Trains a small "projection" ANN alongside a more traditional ANN for image recognition; the small network achieves good results with greatly reduced memory requirements (see the joint-training sketch after this list).
 * [[https://www.theatlantic.com/business/archive/2012/05/when-correlation-is-not-causation-but-something-much-more-screwy/256918/|When Correlation Is Not Causation, But Something Much More Screwy]]
 * [[https://dmitryulyanov.github.io/deep_image_prior|Deep Image Prior]]
 * [[http://www.argmin.net/2018/01/25/optics/|Lessons from Optics, The Other Deep Learning]]: Phenomena noticed in training deep ANNs, with an analogy to optics.
 * [[http://ai2.ethz.ch/|AI2 (Abstract Interpretation for AI Safety)]]: Uses abstract interpretation to prove that a network's output cannot be flipped by small adversarial perturbations of its input (a simplified interval-arithmetic sketch follows this list).
 * [[https://arxiv.org/abs/1806.04743|INFERNO: Inference-Aware Neural Optimisation]]
 * [[https://thomas-tanay.github.io/post--L2-regularization/|A New Angle on L2 Regularization]]
 * [[https://arxiv.org/abs/1908.00200|KiloGrams: Very Large N-Grams for Malware Classification]]: Clever use of multiple passes over a large dataset, with an approximating hash-based first pass that narrows down the candidates for an exact second pass, to reduce computational and memory requirements (see the two-pass sketch after this list).
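
The sketches below are minimal, hedged illustrations of the ideas summarised above; none of them is the authors' code, and all class, function, and variable names are invented for illustration.

First, the preference-learning idea: a reward model is trained so that, under a Bradley-Terry (logistic) model, it assigns higher summed reward to the trajectory segment the human preferred. The PyTorch module, the segment shapes, and the synthetic "human" below are all assumptions.

{{{#!python
# Hedged sketch: learn a reward model from pairwise human preferences using a
# Bradley-Terry (logistic) model over summed segment rewards. Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RewardNet(nn.Module):
    """Maps one observation (a flat vector here) to a scalar reward."""
    def __init__(self, obs_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))

    def forward(self, segments):                       # (batch, steps, obs_dim)
        return self.net(segments).squeeze(-1).sum(-1)  # summed reward per segment

def preference_loss(reward_net, seg_a, seg_b, prefer_a):
    """Cross-entropy on P(A preferred over B) = sigmoid(R(A) - R(B))."""
    logits = reward_net(seg_a) - reward_net(seg_b)
    return F.binary_cross_entropy_with_logits(logits, prefer_a)

# Toy loop on synthetic data: 10-step segments of 4-dim observations, and a
# pretend human who prefers whichever segment has the larger first feature.
net = RewardNet(obs_dim=4)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(200):
    seg_a, seg_b = torch.randn(32, 10, 4), torch.randn(32, 10, 4)
    prefer_a = (seg_a[..., 0].mean(-1) > seg_b[..., 0].mean(-1)).float()
    loss = preference_loss(net, seg_a, seg_b, prefer_a)
    opt.zero_grad(); loss.backward(); opt.step()
}}}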
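
Next, the ProjectionNet idea: a tiny model built from fixed random ("locality-sensitive") projections is trained jointly with, and distilled from, the full-size network. The architecture, loss weighting, and names below are assumptions, not the paper's exact setup.

{{{#!python
# Hedged sketch of the ProjectionNet idea: a tiny "projection" model trained
# jointly with (and distilled from) a larger trainer network. Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TrainerNet(nn.Module):                     # the full-size network
    def __init__(self, in_dim, n_classes):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                 nn.Linear(256, n_classes))
    def forward(self, x):
        return self.net(x)

class ProjectionNet(nn.Module):
    """Fixed random sign projections -> bit features -> one small trainable layer."""
    def __init__(self, in_dim, n_bits, n_classes):
        super().__init__()
        self.register_buffer("proj", torch.randn(in_dim, n_bits))  # not trained
        self.out = nn.Linear(n_bits, n_classes)                    # the only weights
    def forward(self, x):
        bits = (x @ self.proj > 0).float()       # LSH-style binary features
        return self.out(bits)

def joint_loss(trainer_logits, proj_logits, labels, distill_weight=1.0):
    # Both networks fit the labels; the projection net is also pulled toward the
    # trainer's soft predictions (a distillation term).
    ce_trainer = F.cross_entropy(trainer_logits, labels)
    ce_proj = F.cross_entropy(proj_logits, labels)
    distill = F.kl_div(F.log_softmax(proj_logits, dim=-1),
                       F.softmax(trainer_logits.detach(), dim=-1),
                       reduction="batchmean")
    return ce_trainer + ce_proj + distill_weight * distill

trainer, proj = TrainerNet(784, 10), ProjectionNet(784, n_bits=128, n_classes=10)
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
loss = joint_loss(trainer(x), proj(x), y)
}}}

In this sketch the memory saving comes from deploying only the projection model: the random projection can be regenerated from a seed, so little more than the small output layer needs to be stored on device.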
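
For AI2, the paper itself uses relatively precise abstract domains (such as zonotopes); the sketch below substitutes the simplest sound domain, interval arithmetic, to show what certification means: propagate a box around the input through the network and check that the target class wins for every point in it. The toy network and threat model are assumptions.

{{{#!python
# Hedged sketch of certification by abstract interpretation. AI2 uses richer
# abstract domains; here plain interval arithmetic is propagated through a
# small ReLU network. Illustrative only.
import numpy as np

def interval_affine(lo, hi, W, b):
    """Sound bounds on W @ x + b given lo <= x <= hi (elementwise)."""
    W_pos, W_neg = np.maximum(W, 0), np.minimum(W, 0)
    return W_pos @ lo + W_neg @ hi + b, W_pos @ hi + W_neg @ lo + b

def interval_relu(lo, hi):
    return np.maximum(lo, 0), np.maximum(hi, 0)

def certify(layers, x, eps, target):
    """True if every input within L-inf distance eps of x is classified as target."""
    lo, hi = x - eps, x + eps
    for i, (W, b) in enumerate(layers):
        lo, hi = interval_affine(lo, hi, W, b)
        if i < len(layers) - 1:                  # ReLU on hidden layers only
            lo, hi = interval_relu(lo, hi)
    # Robust if the target logit's lower bound beats every other logit's upper bound.
    return lo[target] > max(hi[j] for j in range(len(hi)) if j != target)

# Toy example: a random 2-layer network on a 4-dim input.
rng = np.random.default_rng(0)
layers = [(rng.normal(size=(8, 4)), np.zeros(8)),
          (rng.normal(size=(3, 8)), np.zeros(3))]
x = rng.normal(size=4)
clean_logits = layers[1][0] @ np.maximum(layers[0][0] @ x, 0)
print(certify(layers, x, eps=0.01, target=int(np.argmax(clean_logits))))
}}}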
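
Finally, a sketch of the KiloGrams-style two-pass trick: an approximate first pass counts hash buckets in a fixed-size table, and an exact second pass counts only the n-grams whose buckets made the top-k cut. The real pipeline uses rolling hashes and far more engineering; the helper names and the use of Python's built-in hash() are assumptions.

{{{#!python
# Hedged sketch of a two-pass top-k n-gram count. Pass 1 counts hash buckets in
# a fixed-size table (approximate, collisions allowed); pass 2 counts exact
# n-grams only where the bucket count made the cut. Illustrative only.
from collections import Counter

def ngrams(data: bytes, n: int):
    for i in range(len(data) - n + 1):
        yield data[i:i + n]

def top_k_ngrams(files, n=8, k=1000, table_size=1 << 20):
    # Pass 1: bucketed approximate counts in O(table_size) memory.
    buckets = [0] * table_size
    for data in files:
        for g in ngrams(data, n):
            buckets[hash(g) % table_size] += 1
    # Collisions can only inflate a bucket, so genuinely frequent n-grams land
    # in high-count buckets and are unlikely to be filtered out here.
    threshold = sorted(buckets, reverse=True)[k - 1]
    # Pass 2: exact counts, but only for the surviving candidates.
    exact = Counter()
    for data in files:
        for g in ngrams(data, n):
            if buckets[hash(g) % table_size] >= threshold:
                exact[g] += 1
    return exact.most_common(k)

corpus = [b"abababababcdcdcdcd", b"ababababab", b"cdcdcdcdcdcd"]
print(top_k_ngrams(corpus, n=2, k=3, table_size=1024))
}}}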

Papers for discussion
