LoRAINNe 2024

LoRAINNe’24: workshop on LOw-Rank Approximations and their Interactions with Neural NEtworks

Location: IDMC (Institut des Sciences du Digital), Nancy, France.

Date: 26 and 27 November, 2024.

About the Workshop

This workshop will explore low-rank matrix and tensor decompositions/approximations and their interactions with neural networks and machine learning at large. The workshop will cover theoretical foundations as well as practical applications, with the main goal to connect researchers working in these fields. This workshop is also a closing event for the ANR LeaFleT project (project ANR-19-CE23-0021).

Schedule

The workshop will start between 13:00 and 14:00 on 26 November and should finish on the afternoon of the 27th. Stay tuned for the detailed schedule of talks!

Registration

Registration is free but mandatory (capacity is limited). Click here to register!

Confirmed speakers

We will be announcing the full list of speakers shortly.

Preliminary Program

André de Almeida

Overview of tensor decompositions and some applications to wireless communications

Christophe Cerisara

Study of a few properties of LLM pruning

Jérémy Cohen

Implicit Regularization in Regularized (Nonnegative) Low-Rank Approximations

Abstract: Regularized nonnegative low-rank approximations, such as sparse Nonnegative Matrix Factorization or sparse Nonnegative Tucker Decomposition, are an important branch of dimensionality reduction models with enhanced interpretability. From a practical perspective, however, the choice of regularizers and regularization coefficients is often challenging because of the multifactor nature of these models and the lack of theory to back these choices. The work presented aims to improve on these issues. By studying a more general model called the Homogeneous Regularized Scale-Invariant model, we prove that the scale-invariance inherent to low-rank approximation models causes an implicit regularization with unexpected effects. This observation enables us to better understand the effect of regularization functions in low-rank approximation models, to guide the choice of the regularization hyperparameters, and to design balancing strategies that enhance the convergence speed of dedicated optimization algorithms. We showcase our contributions on sparse Nonnegative Matrix Factorization, ridge-regularized Canonical Polyadic decomposition, and sparse Nonnegative Tucker Decomposition.

Mariya Ishteva

Decoupling multivariate functions using tensors

Bernard Mourrain

Low rank approximation of moment matrices and tensors

Yang Qi

On the multi-spiked random tensor model

Yassine Zniyed

Network compression using tensor decompositions and pruning

Tentative schedule:

Date Time Session
Tue 26 Nov   (Day 1)
  13:45-14:00 Registration
  14:00-14:15 Opening remarks
  14:15-15:15 André de Almeida
  15:15-15:45 Mariya Ishteva
  15:45-16:15 Coffee break
  16:15-17:15 Francesco Tudisco
  17:15-17:30 Spotlight posters
  17:30-19:00 Cocktail + posters
Wed 27 Nov   (Day 2)
  09:00-09:30 Welcome coffee
  09:30-10:30 Bernard Mourrain
  10:30-11:00 Jérémy Cohen
  11:00-11:30 Coffee break
  11:30-12:30 Julia Gusak
  12:30-14:00 Lunch break
  14:00-15:00 Christophe Cerisara
  15:00-15:30 Invited talk
  15:30-16:00 Invited talk
  16:00-17:00 Farewell coffee break

Contact Us

Organizing committee:

You can email the organizers at: firstname.lastname@univ-lorraine.fr


© 2024 Workshop LoRAINNe