Download and view articles related to DReLU :: Page 1


Search results - DReLU

Number of articles found: 1
Row | Title | Type
1 | Enhancing batch normalized convolutional networks using displaced rectifier linear units: A systematic comparative study (2019)
A substantial number of expert and intelligent systems rely on deep learning methods to solve problems in areas such as economics, physics, and medicine. Improving the accuracy of the activation functions used by such methods can directly and positively impact the overall performance and quality of the mentioned systems at no cost whatsoever. In this sense, enhancing the design of such theoretical fundamental blocks is of great significance, as it immediately impacts a broad range of current and future real-world deep learning based applications. Therefore, in this paper, we turn our attention to the interworking between the activation functions and batch normalization, which is currently a practically mandatory technique for training deep networks. We propose the activation function Displaced Rectifier Linear Unit (DReLU) by conjecturing that extending the identity function of ReLU into the third quadrant enhances compatibility with batch normalization. Moreover, we used statistical tests to compare the impact of using distinct activation functions (ReLU, LReLU, PReLU, ELU, and DReLU) on the learning speed and test accuracy of standardized VGG and Residual Network state-of-the-art models. These Convolutional Neural Networks were trained on CIFAR-100 and CIFAR-10, the most commonly used deep learning computer vision datasets. The results showed that DReLU sped up learning in all models and datasets. Besides, statistically significant performance assessments (p < 0.05) showed that DReLU enhanced the test accuracy obtained with ReLU in all scenarios. Furthermore, DReLU showed better test accuracy than any other tested activation function in all experiments with one exception, in which case it presented the second best performance. Therefore, this work demonstrates that it is possible to increase performance by replacing ReLU with an enhanced activation function.
Keywords: DReLU | Activation function | Batch normalization | Comparative study | Convolutional Neural Networks | Deep learning
English article
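
The abstract above describes DReLU only informally, as extending the identity function of ReLU into the third quadrant. The sketch below is one way to read that description, assuming the form DReLU(x) = max(x, -delta) for a small displacement delta; the delta value (0.05), the class name, and the surrounding Conv -> BatchNorm -> activation block are illustrative assumptions, not details taken from the paper itself.

    import torch
    import torch.nn as nn

    class DReLU(nn.Module):
        """Displaced Rectifier Linear Unit (sketch).

        Assumes DReLU(x) = max(x, -delta): the identity is kept for
        x >= -delta, extending it into the third quadrant, and outputs
        are clamped at -delta below that point. delta = 0.05 is an
        assumed default, not a value stated in the abstract above.
        """

        def __init__(self, delta: float = 0.05):
            super().__init__()
            self.delta = delta

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Clamp from below at -delta; identical to ReLU when delta = 0.
            return torch.clamp(x, min=-self.delta)

    if __name__ == "__main__":
        # Example: DReLU after a batch-normalized convolution, mirroring the
        # interworking of batch normalization and the activation function
        # that the abstract discusses.
        block = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.BatchNorm2d(16),
            DReLU(delta=0.05),
        )
        out = block(torch.randn(8, 3, 32, 32))  # CIFAR-sized input
        print(out.shape)  # torch.Size([8, 16, 32, 32])

With delta = 0 this reduces to standard ReLU, which is why the study treats ReLU as the baseline when comparing learning speed and test accuracy.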