Is ICML a good conference?

Many believe that the International Conference on Machine Learning (ICML) is one of the best machine learning conferences.

Yann LeCun
But they rejected Yann LeCun? And nobody at the conference is within light years of LeCun!

As a result of my recent ICML rejection, I started thinking about new ways to describe what we are doing with Machine Learning, and more specifically with Deep Learning.

Thus, here’s a shot at it. 

Our theory of Heavy-Tailed Self-Regularization (HT-SR) is really a phenomenology: one based on Random Matrix Theory (RMT), motivated by the theory of the statistical mechanics of learning, and built to explain extensive empirical studies of the spectral properties of the weight matrices of Deep Neural Networks.

For nearly all well-trained, SOTA DNN models (in CV and NLP), the individual layer weight matrices W become well regularized, in the sense that their Correlation Matrices X = (1/N) WᵀW are Heavy-Tailed.
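To make this concrete, here is a minimal sketch, in plain NumPy, of what it means to examine the spectral properties of X. The weight matrix W below is randomly generated purely as a stand-in; in practice you would extract W from a trained layer:

import numpy as np

# Stand-in for a trained layer's weight matrix (use a real layer's W in practice).
rng = np.random.default_rng(0)
W = rng.standard_normal((4096, 512))

# Correlation matrix X = (1/N) W^T W, with N the larger dimension of W.
N = max(W.shape)
X = (W.T @ W) / N

# The empirical spectral density (ESD) is the distribution of the eigenvalues of X.
eigs = np.linalg.eigvalsh(X)

# For a purely random W like this one, the ESD follows the Marchenko-Pastur law;
# for a well-trained layer, the tail of the ESD instead decays like a power law.
print("largest eigenvalues:", np.sort(eigs)[-5:])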

Moreover, the HT-SR theory indicates that as training proceeds, and/or as regularization is increased, the Heavy-Tailedness of the correlations, however it is measured, generally increases.

Using these facts, HT-SR allows one to construct various generalization capacity metrics for DNNs (implemented in the weightwatcher open-source tool) that measure the average Heavy-Tailedness of the layers.
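Concretely, the canonical such metric fits the tail of each layer's ESD to a power law, ρ(λ) ~ λ^(−α), and reports the fitted exponent α for each layer (smaller α means a heavier tail), along with averages of α across layers.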

The more Heavy-Tailed a layer is, the better it generalizes, up to a point, after which the layer becomes overfit.

pip install weightwatcher

If you are training or fine-tuning your own DNN models, you can use it to visualize how Heavy-Tailed each of your model's layers is. You can see which layers are converging well, which are stalled, and which are overfit. And all without needing any test data!
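For example, here is a minimal sketch of that workflow, assuming a pretrained torchvision ResNet purely for illustration (any trained PyTorch or Keras model of your own works the same way):

import weightwatcher as ww
import torchvision.models as tvm

# Any trained model will do; a pretrained ResNet-18 is used here only as an example.
model = tvm.resnet18(weights="IMAGENET1K_V1")

watcher = ww.WeightWatcher(model=model)
details = watcher.analyze(plot=True)  # per-layer metrics; plot=True also draws each layer's ESD
summary = watcher.get_summary(details)

# 'alpha' is the fitted power-law exponent of each layer's ESD.
# Rule of thumb from the weightwatcher docs: alpha roughly in [2, 6] indicates
# a well-trained layer, while alpha < 2 suggests the layer is over-trained/overfit.
print(details[["layer_id", "alpha"]])
print(summary)

Run periodically during training, this lets you watch each layer's α evolve, with no test set required.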

Written by Charles H. Martin
