Postdoctoral Researcher at the Foundations of Data Science Institute (FODSI)
Contact
MIT D32-588, Cambridge, USA
Boston University, 665 Commonwealth Ave., Boston, USA
dhadi at mit dot edu
I do research on the foundations of machine learning at FODSI, as a postdoctoral researcher hosted by
MIT and Boston University. Before that, I was an SNSF postdoctoral fellow at Princeton University and INRIA Paris. I completed my PhD in computer science at ETH Zurich.
Publications
[19] Towards Training Without Depth Limits: Batch Normalization Without Gradient Explosion with Alexandru Meterez, Amir Joudaki, Francesco Orabona, Alexander Immer and Gunnar Ratsch. ICLR 24
[17] On the impact of activation and normalization in obtaining isometric embeddings at initialization with Amir Joudaki and Francis Bach. NeurIPS 23
[16] On bridging the gap between mean field and finite width in deep random neural networks with batch normalization with Amir Joudaki and Francis Bach. ICML 23 (code, poster)
[15] Efficient displacement convex optimization with particle gradient descent with Jason D. Lee and Chi Jin. ICML 23 (code, poster)
[14] Batch normalization orthogonalizes representations in deep random networks with Amir Joudaki and Francis Bach. (spotlight) NeurIPS 21 (code, poster)
[13] Batch normalization provably avoids ranks collapse for randomly initialised deep networks with Jonas Kohler, Francis Bach, Thomas Hofmann and Aurelien Lucchi. NeurIPS 20 (code, poster)
[12] Exponential convergence rates for batch normalization: The power of length-direction decoupling in non-convex optimization with Jonas Kohler, Aurelien Lucchi, Thomas Hofmann, Ming Zhou and Klaus Neymeyr. AISTATS 19
[11] Local saddle point optimization: A curvature exploitation approach with Leonard Adolphs, Aurelien Lucchi and Thomas Hofmann. AISTATS 19 (poster)
[10] Polynomial-time Sparse Measure Recovery: From Mean Field Theory to Algorithm Design with Francis Bach. arXiv 22
[9] Rethinking the Variational Interpretation of Accelerated Optimization Methods with Peiyuan Zhang and Antonio Orvieto. NeurIPS 21 (poster)
[8] Revisiting the role of Euler numerical integration on acceleration and stability in convex optimization with Peiyuan Zhang, Antonio Orvieto, Thomas Hofmann and Roy S Smith. AISTATS 21 (poster)
[7] Escaping saddles with stochastic gradients with Jonas Kohler, Aurelien Lucchi and Thomas Hofmann. (long presentation) ICML 18 (slides)
[6] Adaptive Newton method for empirical risk minimization to statistical accuracy with Aryan Mokhtari, Aurelien Lucchi, Thomas Hofmann and Alejandro Ribeiro. NeurIPS 16
[5] Starting small - learning with adaptive sample sizes with Aurelien Lucchi and Thomas Hofmann. ICML 16
[4] Estimating diffusion network structures: Recovery conditions, sample complexity & soft-thresholding algorithm with Manuel Gomez-Rodriguez, Le Song and Bernhard Schoelkopf. ICML 14 (Recommended to JMLR)
[3] Estimating diffusion networks: Recovery conditions, sample complexity & soft-thresholding algorithm with Manuel Gomez-Rodriguez, Le Song and Bernhard Schoelkopf. JMLR 16
[2] A Time-Aware Recommender System Based on Dependency Network of Items with Amin Javari, Seyed Ebrahim Abtahi and Mahdi Jalili. The Computer Journal 14
[1] Inferring causal molecular networks: empirical assessment through a community-based effort with Steven M Hill, Laura M Heiser et al. Nature Methods 16