San Francisco: A team from Facebook AI Research (FAIR) has developed a novel low-dimensional design space called ‘RegNet’ that outperforms comparable models, including Google’s, and runs up to five times faster on GPUs.
RegNet produces simple, fast and versatile networks, and in experiments it outperformed Google’s SOTA EfficientNet models, the researchers said in a paper titled ‘Designing Network Design Spaces’, published on the pre-print repository arXiv.
The researchers aimed for “interpretability and to discover general design principles that describe networks that are simple, work well, and generalize across settings”.
The Facebook AI team conducted controlled comparisons against EfficientNet, with no training-time enhancements and under the same training setup. Introduced in 2019, Google’s EfficientNet uses a combination of neural architecture search (NAS) and model scaling rules, and represents the current SOTA.
With comparable training settings and Flops, RegNet models outperformed EfficientNet models while being up to 5× faster on GPUs.
Rather than designing and developing individual networks, the team focused on designing network design spaces: huge, possibly infinite populations of model architectures. Design space quality is analyzed using the error empirical distribution function (EDF).
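To make the idea concrete, an error EDF simply reports, for each error threshold, the fraction of sampled models from a design space whose error falls below that threshold. The snippet below is a minimal sketch of that computation; the error values and the `error_edf` helper are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical top-1 error rates (%) for a small sample of models
# drawn from a design space; in practice these come from training runs.
errors = np.array([34.2, 31.5, 29.8, 33.1, 30.4, 28.9, 32.7, 31.0])

def error_edf(errors, thresholds):
    """Empirical distribution function of model error:
    for each threshold e, the fraction of sampled models with error < e."""
    errors = np.asarray(errors)
    return np.array([(errors < e).mean() for e in thresholds])

# Evaluate the EDF at a few thresholds spanning the observed errors.
thresholds = np.linspace(errors.min(), errors.max(), 5)
print(error_edf(errors, thresholds))  # non-decreasing values in [0, 1]
```

A design space whose EDF curve rises earlier (more mass at low error) is judged better, since a larger fraction of its models achieve low error.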
Analyzing the RegNet design space also gave the researchers other unexpected insights into network design. They noticed, for example, that the depth of the best models is stable across compute regimes, with an optimal depth of 20 blocks (60 layers). (IANS)