Unsolved · Major Unsolved Problem

Tight PAC-Bayes Bounds for Deep Neural Networks

§ Problem Statement

§ Discussion


§ Significance & Implications

§ Known Partial Results

§ References

[1]

Computing nonvacuous generalization bounds for deep (stochastic) neural networks with many more parameters than training data

Gintare Karolina Dziugaite, Daniel M. Roy (2017)

UAI

📍 Section 7 (Conclusion and Future Work), paragraph beginning “Our PAC-Bayes bound can be tightened in several ways...”, p. 9. (A sketch of this style of bound computation appears after the reference list.)

[2]

PAC-Bayesian stochastic model selection

David McAllester (2003)

Machine Learning

📍 Section 3 (PAC-Bayesian theorems), Theorem 1 (original PAC-Bayes generalization bound with KL divergence), p. 9. (A representative form of this bound is reproduced after the reference list.)

[3]

A Primer on PAC-Bayesian Learning

Benjamin Guedj (2019)

📍 Section 4 (Open problems and perspectives), enumeration of key open challenges including tightness and scalability, pp. 15–17.
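For orientation, the following is a commonly quoted form of the PAC-Bayes bound behind Theorem 1 of [2]; the exact logarithmic term and constants vary slightly across statements in the literature. With probability at least 1 − δ over an i.i.d. sample S of size m, simultaneously for every posterior Q over hypotheses,

$$
L_{\mathcal{D}}(Q) \;\le\; \hat{L}_S(Q) \;+\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{m}{\delta}}{2(m-1)}},
$$

where P is a prior fixed before seeing the data, $\hat{L}_S(Q)$ is the empirical 0–1 loss of the Gibbs classifier drawn from Q, and $L_{\mathcal{D}}(Q)$ is its expected loss on the data distribution. Bounds of this shape are informative only when KL(Q‖P) is small relative to m, which is precisely what is hard to achieve for heavily overparameterized networks.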
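To make the “computing a nonvacuous bound” step in [1] concrete, below is a minimal Python sketch (not the authors' code; the function names and the example numbers are hypothetical) of the standard numerical KL-inversion used with the tighter PAC-Bayes-kl variant, kl(empirical ‖ true) ≤ (KL(Q‖P) + ln(m/δ)) / (m − 1), to turn an empirical Gibbs error and a posterior–prior KL into a risk certificate.

```python
import math

def binary_kl(q, p):
    """kl(q || p) between Bernoulli distributions with means q and p."""
    eps = 1e-12
    q = min(max(q, eps), 1 - eps)
    p = min(max(p, eps), 1 - eps)
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

def kl_inverse_upper(q_hat, budget, iters=100):
    """Largest p >= q_hat with kl(q_hat || p) <= budget, found by bisection.
    Returning the upper endpoint keeps the certificate conservative."""
    lo, hi = q_hat, 1.0 - 1e-12
    if binary_kl(q_hat, hi) <= budget:
        return 1.0  # budget so large that the bound is vacuous
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if binary_kl(q_hat, mid) <= budget:
            lo = mid
        else:
            hi = mid
    return hi

def pac_bayes_certificate(empirical_gibbs_error, kl_q_p, m, delta=0.05):
    """Hypothetical helper: with probability >= 1 - delta, upper-bounds the
    expected 0-1 error of the Gibbs classifier via the PAC-Bayes-kl bound
    kl(empirical || true) <= (KL(Q || P) + ln(m / delta)) / (m - 1)."""
    budget = (kl_q_p + math.log(m / delta)) / (m - 1)
    return kl_inverse_upper(empirical_gibbs_error, budget)

if __name__ == "__main__":
    # Illustrative numbers only (not taken from [1]): 55,000 training points,
    # 3% empirical error for the stochastic network, KL(Q || P) = 5000 nats.
    print(pac_bayes_certificate(0.03, 5000.0, m=55000))  # approx. 0.16 -> nonvacuous
```

In [1], a bound of this type is optimized directly over the parameters of the posterior Q; the sketch above covers only the final certification step, under the stated assumptions. Tightening the certificate amounts to shrinking the KL(Q‖P) term without inflating the empirical error.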

§ Tags