Dual Rectified Linear Units (DReLUs): A replacement for tanh activation functions in Quasi-Recurrent Neural Networks
Statistics
Views: 1335