🐲 LAIR Publications & Preprints 📔

Published work

Launay, Julien et al. “Direct Feedback Alignment Scales to Modern Deep Learning Tasks and Architectures.” Advances in Neural Information Processing Systems 33 (2020): 9346-9360.

Dong, Jonathan and Ohana, Ruben et al. “Reservoir Computing meets Recurrent Kernels and Structured Transforms.” Advances in Neural Information Processing Systems 33 (2020): 16785-16796.

Keriven, N. et al. “NEWMA: A New Method for Scalable Model-Free Online Change-Point Detection.” IEEE Transactions on Signal Processing 68 (2020): 3515-3528.

Ohana, Ruben et al. “Kernel Computations from Large-Scale Random Features Obtained by Optical Processing Units.” ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (2020): 9294-9298.

Preprints

Hesslow, Daniel et al. “Photonic co-processors in HPC: using LightOn OPUs for Randomized Numerical Linear Algebra.” arXiv abs/2104.14429 (2021).

Hesslow, Daniel and Iacopo Poli. “Contrastive Embeddings for Neural Architectures.” arXiv abs/2102.04208 (2021).

Cappelli, A. et al. “Adversarial Robustness by Design through Analog Computing and Synthetic Gradients.” arXiv abs/2101.02115 (2021).

Refinetti, Maria et al. “The Dynamics of Learning with Feedback Alignment.” arXiv abs/2011.12428 (2020).

Launay, Julien et al. “Light-in-the-loop: Using a Photonics Co-Processor for Scalable Training of Neural Networks.” arXiv abs/2006.01475 (2020).

Chatelain, A. et al. “Online Change Point Detection in Molecular Dynamics With Optical Random Features.” arXiv abs/2006.08697 (2020).

Launay, Julien et al. “Principled Training of Neural Networks with Direct Feedback Alignment.” arXiv abs/1906.04554 (2019).