Maryland U & Google Introduce LilNetX: Simultaneously Optimizing DNN Size, Cost, Structured Sparsity & Accuracy | Synced
Source: Synced | AI Technology & Industry Review
A team from the University of Maryland and Google Research proposes LilNetX, an end-to-end trainable technique that jointly optimizes neural network parameters for task accuracy, on-disk model size, and computational cost on any given task.
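The general idea of such a joint objective can be sketched as a task loss augmented with two regularizers: a rate-like proxy for on-disk size and a group penalty encouraging structured sparsity. This is only an illustrative sketch, not LilNetX's actual formulation; the functions `rate_proxy` and `group_l1` and the weights `lambda_rate` and `lambda_group` are hypothetical stand-ins.

```python
import numpy as np

def group_l1(weights, group_size=4):
    """Structured-sparsity penalty: sum of L2 norms over weight groups.
    Driving whole groups to zero lets entire channels/blocks be pruned."""
    w = np.asarray(weights).reshape(-1, group_size)
    return float(np.sum(np.linalg.norm(w, axis=1)))

def rate_proxy(weights, bins=16):
    """Crude model-size proxy: empirical entropy (bits) of quantized weights,
    scaled by the number of parameters. Hypothetical, for illustration only."""
    w = np.asarray(weights).ravel()
    q = np.digitize(w, np.linspace(w.min(), w.max(), bins))
    _, counts = np.unique(q, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)) * w.size)

def joint_loss(task_loss, weights, lambda_rate=1e-4, lambda_group=1e-3):
    """Joint objective trading off accuracy, size, and structured sparsity."""
    return task_loss + lambda_rate * rate_proxy(weights) + lambda_group * group_l1(weights)

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 8))
print(joint_loss(task_loss=0.7, weights=w))
```

Because all three terms are differentiable (or have standard surrogates), the whole objective can be minimized end-to-end with gradient descent, which is the key property the article highlights.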