Publication:

Dual Rectified Linear Units (DReLUs): A replacement for tanh activation functions in Quasi-Recurrent Neural Networks

Date: 2018-12
dc.contributor.author: Godin, F.
dc.contributor.author: Degrave, Jonas
dc.contributor.author: Dambre, Joni
dc.contributor.author: De Neve, Wesley
dc.contributor.imecauthor: Dambre, Joni
dc.date.accessioned: 2021-10-25T19:03:37Z
dc.date.available: 2021-10-25T19:03:37Z
dc.date.embargo: 9999-12-31
dc.date.issued: 2018-12
dc.identifier.issn: 0167-8655
dc.identifier.uri: https://imec-publications.be/handle/20.500.12860/30773
dc.identifier.url: https://www.sciencedirect.com/science/article/abs/pii/S0167865518305646
dc.source.beginpage: 8
dc.source.endpage: 14
dc.source.journal: Pattern Recognition Letters
dc.source.volume: 116
dc.title: Dual Rectified Linear Units (DReLUs): A replacement for tanh activation functions in Quasi-Recurrent Neural Networks
dc.type: Journal article
dspace.entity.type: Publication
Files

Original bundle

Name: 40680.pdf
Size: 477.7 KB
Format: Adobe Portable Document Format