Free English article download: Enhancing batch normalized convolutional networks using displaced rectifier linear units: A systematic comparative study - 2019
  • Enhancing batch normalized convolutional networks using displaced rectifier linear units: A systematic comparative study

    Year of publication:

    2019


    English article title:

    Enhancing batch normalized convolutional networks using displaced rectifier linear units: A systematic comparative study


    Persian translation of the article title:

    افزایش شبکه های کانولوشنی نرمال سازی شده دسته ای با استفاده از واحدهای خطی یکسو کننده جابجا شده: یک مطالعه مقایسه ای سیستماتیک


    Source:

    ScienceDirect - Elsevier - Expert Systems With Applications, 124 (2019) 271-281, doi:10.1016/j.eswa.2019.01.066


    Authors:

    David Macêdo*, Cleber Zanchettin, Adriano L.I. Oliveira, Teresa Ludermir


    English abstract:

    A substantial number of expert and intelligent systems rely on deep learning methods to solve problems in areas such as economics, physics, and medicine. Improving the accuracy of the activation functions used by such methods can directly and positively impact the overall performance and quality of the mentioned systems at no cost whatsoever. In this sense, enhancing the design of such theoretical fundamental blocks is of great significance as it immediately impacts a broad range of current and future real-world deep learning based applications. Therefore, in this paper, we turn our attention to the interworking between the activation functions and batch normalization, which is currently a practically mandatory technique for training deep networks. We propose the activation function Displaced Rectifier Linear Unit (DReLU) by conjecturing that extending the identity function of ReLU to the third quadrant enhances compatibility with batch normalization. Moreover, we used statistical tests to compare the impact of using distinct activation functions (ReLU, LReLU, PReLU, ELU, and DReLU) on the learning speed and test accuracy of standardized VGG and Residual Networks state-of-the-art models. These Convolutional Neural Networks were trained on CIFAR-100 and CIFAR-10, the most commonly used deep learning computer vision datasets. The results showed that DReLU sped up learning in all models and datasets. Besides, statistically significant performance assessments (p < 0.05) showed that DReLU enhanced the test accuracy presented by ReLU in all scenarios. Furthermore, DReLU showed better test accuracy than any other tested activation function in all experiments with one exception, in which case it presented the second best performance. Therefore, this work demonstrates that it is possible to increase performance by replacing ReLU with an enhanced activation function.
    Keywords: DReLU | Activation function | Batch normalization | Comparative study | Convolutional Neural Networks | Deep learning
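
    The abstract defines DReLU only informally, as extending the identity function of ReLU into the third quadrant. Below is a minimal PyTorch sketch consistent with that description, assuming the displaced form f(x) = max(x, -delta) with a small positive displacement delta; the default value 0.05 used here is an illustrative assumption, not a figure taken from the abstract.

        import torch
        import torch.nn as nn

        class DReLU(nn.Module):
            """Displaced Rectifier Linear Unit (sketch).

            Shifts the rectification point of ReLU from 0 to -delta, so the
            identity segment extends into the third quadrant: inputs in
            (-delta, 0) pass through unchanged instead of being zeroed.
            """

            def __init__(self, delta: float = 0.05):  # default delta is an assumption
                super().__init__()
                self.delta = delta

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                # f(x) = max(x, -delta): identity for x >= -delta,
                # constant -delta below.
                return torch.clamp(x, min=-self.delta)

    Given the paper's focus on the interplay between activations and batch normalization, such a unit would typically replace ReLU immediately after a batch normalization layer, e.g. Conv2d -> BatchNorm2d -> DReLU.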


    Level: Intermediate
    Number of pages (English PDF): 11
    File size: 688 KB

    Price: Free


    Additional notes:



