Field Note #51. On "Deep Computational Burden"

Source: Neil Thompson et al., "The Computational Limits of Deep Learning," available on arXiv (July 2022)

The Main Idea: Computational Gauntlet.

The fate of deep learning is tied to solving its computational burden. The expense stems from the two ingredients that make deep learning effective: extremely large data sets and heavy overparameterization, which increases model flexibility (and thus the chances that the predictive attributes are captured).
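To make the two cost drivers concrete, here is a rough back-of-envelope sketch (my illustration, not a calculation from the paper): assume generalization error falls roughly as n^(-1/2) with n training examples, that parameter count grows in proportion to n in the overparameterized regime, and that training compute scales with parameters times examples. Under those assumptions, compute grows as roughly the fourth power of the desired error reduction.

```python
# Back-of-envelope sketch of why compute explodes when both data and parameters grow.
# Assumptions (illustrative only, not measurements from Thompson et al.):
#   - generalization error shrinks roughly as n**(-1/2) with n training examples
#   - parameter count d grows roughly in proportion to n (overparameterization)
#   - training compute scales roughly with n * d (examples seen x work per example)

def compute_multiplier(error_reduction: float) -> float:
    """How much training compute grows when error shrinks by `error_reduction`x."""
    data_multiplier = error_reduction ** 2     # n must grow as error**-2
    param_multiplier = data_multiplier         # d tracks n in the overparameterized regime
    return data_multiplier * param_multiplier  # compute ~ n * d  ->  error**-4

if __name__ == "__main__":
    for factor in (2, 5, 10):
        print(f"Cutting error by {factor}x needs roughly {compute_multiplier(factor):,.0f}x the compute")
```

Under these admittedly crude assumptions, halving error costs about 16x the compute and a 10x error reduction costs about 10,000x, which is what makes the trajectory concern in the quote below bite.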

"[T]he growing computational burden of deep learning will soon be constraining for a range of applications, making the achievement of important benchmark milestones impossible if current trajectories hold." (Id. at 15)

The Visuals.

The Details.
