imec Publications
Dual Rectified Linear Units (DReLUs): A replacement for tanh activation functions in Quasi-Recurrent Neural Networks
Date: 2018-12
Type: Journal article
Files: 40680.pdf (477.7 KB)
Author(s): Godin, F.; Degrave, Jonas; Dambre, Joni; De Neve, Wesley
Journal: Pattern Recognition Letters
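The abstract is not reproduced on this page. As context for the title: a Dual Rectified Linear Unit is, per the paper, a two-input activation obtained by subtracting two rectified linear units, DReLU(a, b) = max(a, 0) − max(b, 0), which (unlike a single ReLU) can produce negative outputs and can therefore stand in for tanh in a QRNN's candidate computation. A minimal NumPy sketch of that definition follows; the function name `drelu` is our own, not taken from the paper's code.

```python
import numpy as np

def drelu(a, b):
    """Dual rectified linear unit: the difference of two ReLUs.

    Unlike ReLU, the output can be negative (when b > 0 > a's rectified
    part), giving a tanh-like signed range while staying unbounded.
    """
    return np.maximum(a, 0) - np.maximum(b, 0)

# Both inputs positive: the two rectified branches are subtracted.
print(drelu(2.0, 1.0))    # 1.0
# Both inputs negative: both branches rectify to zero.
print(drelu(-1.0, -2.0))  # 0.0
# Second input dominates: a negative output, which ReLU cannot produce.
print(drelu(-1.0, 2.0))   # -2.0
```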
Metrics
Views: 1921 (since deposited on 2021-10-25; acquired 2025-10-27)
Citations