No. | Title | Type |
---|---|---|
1 | Log-sum enhanced sparse deep neural network (2020) | English article |

**Abstract:** How to design deep neural networks (DNNs) for the representation and analysis of high-dimensional, small-sample-size data remains a major challenge. One solution is to construct a sparse network. Many existing approaches achieve sparsity in DNNs through regularization, but most are applied only in the pre-training process because explicit formulae are difficult to derive for the fine-tuning process. In this paper, a log-sum function is used as the regularization term for both the responses of hidden neurons and the network connections in the loss function of the fine-tuning process. It provides a better approximation to the L0-norm than several commonly used norms. Based on the gradient formula of the loss function, the fine-tuning process can be executed more efficiently. Specifically, the gradient calculation commonly performed in deep learning platforms such as PyTorch or TensorFlow can be accelerated. Given the analytic formula for calculating gradients in any layer of the DNN, the error accumulated from successive numerical approximations in the differentiation process can be avoided. With the proposed log-sum enhanced sparse deep neural network (LSES-DNN), the sparsity of the responses and the connections can be well controlled to improve the adaptivity of DNNs. The proposed model is applied to MRI data for both the diagnosis of schizophrenia and the study of brain development. Numerical experiments demonstrate its superior performance over several classical classifiers.

**Keywords:** Deep neural network | Log-sum enhanced sparsity | Back-propagation algorithm | Concise gradient formula | Magnetic resonance imaging |
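To illustrate the idea in the abstract, the following is a minimal sketch of a log-sum sparsity penalty together with its closed-form gradient. It assumes the common form Σ log(1 + |w|/ε) from the sparse-recovery literature; the parameter `eps` and this exact functional form are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def log_sum_penalty(w, eps=1e-2):
    """Log-sum penalty sum(log(1 + |w|/eps)): a tighter surrogate
    for the L0-norm than the L1-norm (assumed common form)."""
    return np.sum(np.log1p(np.abs(w) / eps))

def log_sum_grad(w, eps=1e-2):
    """Analytic gradient sign(w) / (eps + |w|).

    Having a closed form like this is what lets fine-tuning updates
    be written explicitly instead of relying only on numerical
    differentiation."""
    return np.sign(w) / (eps + np.abs(w))

# The penalty saturates for large weights, so small (noise-level)
# weights are penalized relatively more strongly, which drives them
# toward zero and yields sparse responses/connections.
w = np.array([0.0, 0.01, 1.0])
print(log_sum_penalty(w))
print(log_sum_grad(w))
```

Note how the gradient magnitude 1/(ε + |w|) is largest near zero and shrinks for large weights, the opposite of an L2 penalty; this is the mechanism by which log-sum regularization promotes sparsity more aggressively than L1 or L2.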