Visualizing the knowledge structure and evolution of big data research in healthcare informatics (2017)
Background: In recent years, the literature associated with healthcare big data has grown rapidly, but few studies have used bibliometrics and a visualization approach to conduct deep mining and reveal a panorama of the healthcare big data field.
Methods: To explore the foundational knowledge and research hotspots of big data research in the field of healthcare informatics, this study conducted a series of bibliometric analyses on the related literature, including the production trends of papers in the field and the trend of each paper's co-author number, the distribution of core institutions and countries, the core literature distribution, the related information of prolific authors and innovation paths in the field, a keyword co-occurrence analysis, and research hotspots and trends for the future.
Results: By conducting a literature content analysis and structure analysis, we found the following: (a) In the early stage, researchers from the United States, the People's Republic of China, the United Kingdom, and Germany made the most contributions to the literature associated with healthcare big data research and the innovation path in this field. (b) The innovation path in healthcare big data consists of three stages: the disease early detection, diagnosis, treatment, and prognosis phase; the life and health promotion phase; and the nursing phase. (c) Research hotspots are mainly concentrated in three dimensions: the disease dimension (e.g., epidemiology, breast cancer, obesity, and diabetes), the technical dimension (e.g., data mining and machine learning), and the health service dimension (e.g., customized service and elderly nursing).
Conclusion: This study will provide scholars in the healthcare informatics community with panoramic knowledge of healthcare big data research, as well as research hotspots and future research directions.
Keywords: Big data | Healthcare informatics | Bibliometrics | Knowledge structure | Knowledge management
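The keyword co-occurrence analysis mentioned in the Methods can be sketched in a few lines: count how often each pair of keywords appears together across papers, which yields the edge weights of a co-occurrence network. The per-paper keyword lists below are hypothetical placeholders, not the study's corpus.

```python
from collections import Counter
from itertools import combinations

# Hypothetical per-paper keyword lists (placeholders, not the study's data)
papers = [
    ["big data", "machine learning", "epidemiology"],
    ["big data", "data mining", "diabetes"],
    ["machine learning", "data mining", "big data"],
]

# Count each unordered keyword pair once per paper
pairs = Counter()
for kws in papers:
    for a, b in combinations(sorted(set(kws)), 2):
        pairs[(a, b)] += 1

# The most frequent pairs form the densest edges of the co-occurrence network
print(pairs.most_common(3))
```

In a real analysis the pair counts would be visualized as a network, with frequently co-occurring keywords clustered into the disease, technical, and health-service dimensions the study reports.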
Large scale deep learning for computer aided detection of mammographic lesions (2017)
Article history: Received 11 February 2016; Revised 12 July 2016; Accepted 20 July 2016; Available online 2 August 2016.
Recent advances in machine learning yielded new techniques to train deep neural networks, which resulted in highly successful applications in many pattern recognition tasks such as object detection and speech recognition. In this paper we provide a head-to-head comparison between a state-of-the-art mammography CAD system, relying on a manually designed feature set, and a Convolutional Neural Network (CNN), aiming for a system that can ultimately read mammograms independently. Both systems are trained on a large data set of around 45,000 images, and results show the CNN outperforms the traditional CAD system at low sensitivity and performs comparably at high sensitivity. We subsequently investigate to what extent features such as location and patient information and commonly used manual features can still complement the network, and see improvements at high specificity over the CNN, especially with location and context features, which contain information not available to the CNN. Additionally, a reader study was performed, where the network was compared to certified screening radiologists on a patch level, and we found no significant difference between the network and the readers.
© 2016 Elsevier B.V. All rights reserved.
Keywords: Computer aided detection | Mammography | Deep learning | Machine learning | Breast cancer | Convolutional neural networks
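The abstract's finding that location and context features complement the network can be illustrated with a toy second-stage "fusion" classifier: combine a per-patch score with extra features and compare against the score alone. Everything below is synthetic; `net_score` merely stands in for a CNN output and `location` for manual features, so this is a sketch of the idea, not the paper's pipeline.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Hypothetical per-patch "network score" plus location/context features
net_score = rng.normal(size=n)
location = rng.normal(size=(n, 2))
# Synthetic labels: the location feature carries signal the score does not
y = (net_score + 0.8 * location[:, 0] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X = np.column_stack([net_score, location])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Second-stage classifier that fuses the network score with the extra features
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc_fused = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])

# Baseline: ranking patches by the network score alone
auc_score_only = roc_auc_score(y_te, X_te[:, 0])
print(auc_fused, auc_score_only)
```

Because the synthetic labels depend on information absent from the score, the fused model's AUC exceeds the score-only baseline, mirroring the complementarity effect the abstract reports.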
Lightweight adaptive Random-Forest for IoT rule generation and execution (2017)
Article history: Available online 30 March 2017.
The area of the Internet of Things is growing rapidly. The volume of data transmitted over the various sensors is growing accordingly. Sensors are typically low in storage, memory and processing power. Data security and privacy are among the major concerns and drawbacks of this growing domain. Sensor traffic analysis has become an increasingly important means to protect IoT infrastructures from intruders. An IoT network intrusion detection system is required to monitor and analyze the traffic and predict possible attacks. Machine learning techniques can automatically extract normal and abnormal patterns from a large set of training sensor data. Due to the high volume of traffic and the need for real-time reaction, accurate threat discovery is mandatory. This work focuses on designing a lightweight, comprehensive IoT rule generation and execution framework. It is composed of three components: a machine learning rule discovery module, a threat prediction model builder, and tools to ensure a timely reaction to rule violations and to non-standard, ongoing changes in traffic behavior. The generated detection model is expected to identify exceptions in real time and notify the system accordingly. We use Random-Forest (RF) as the machine learning platform for rule discovery and real-time anomaly detection. To allow RF adaptation to IoT, we propose several improvements to make it lightweight, and propose a process that combines IoT network capabilities, messaging and resource sharing, to build a comprehensive and efficient IoT security framework.
© 2017 Elsevier Ltd. All rights reserved.
Keywords: Internet of Things | Security | Rules extraction | Random-Forest | Active learning
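A minimal sketch of the rule-discovery idea, assuming a scikit-learn Random Forest: train a small, shallow forest on sensor-traffic features and export each tree as human-readable if/else rules that a resource-constrained device could evaluate cheaply. The feature names and the attack pattern are invented for illustration; the paper's lightweight RF adaptations are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import export_text

rng = np.random.default_rng(1)
n = 1000
# Hypothetical sensor-traffic features: packet rate, mean size, dest-port entropy
X = rng.normal(size=(n, 3))
# Synthetic ground truth: high packet rate plus high port entropy looks like an attack
y = ((X[:, 0] > 0.5) & (X[:, 2] > 0.0)).astype(int)

# Keep the forest small and shallow so the rules stay cheap to evaluate on-device
rf = RandomForestClassifier(n_estimators=5, max_depth=3, max_features=None,
                            random_state=0).fit(X, y)

# Export each tree as readable if/else rules; these could be compiled into
# lightweight checks executed on the sensor itself
names = ["pkt_rate", "pkt_size", "port_entropy"]
rules = [export_text(t, feature_names=names) for t in rf.estimators_]
print(rules[0])
print("train accuracy:", rf.score(X, y))
```

Shallow depth and few trees trade a little accuracy for rules that fit in tiny memory budgets, which is the trade-off the framework's "lightweight" adaptations target.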
Shopping with a robotic companion (2017)
In this paper, we present a robotic shopping assistant, designed with a cognitive architecture grounded in machine learning systems, in order to study how human-robot interaction (HRI) is changing shopping behavior in smart technological stores. In the software environment of the NAO robot, connected to the Internet with cloud services, we designed a social-like interaction in which the robot carries out actions with the customer. In particular, we focused our design on two main skills the robot has to learn. The first is the ability to acquire social input communicated by relevant clues that humans provide about their emotional state (emotions, emotional speech) or collected from social media (such as information on the customer's tastes, cultural background, etc.). The second is the skill to express in turn its own emotional state, so that it can affect the customer's buying decision, refining in the user the sense of interacting with a human-like companion. By combining social robotics and machine learning systems, the potential of robotics to assist people in real-life situations will increase, fostering gentle customer acceptance of advanced technologies.
© 2017 Elsevier Ltd. All rights reserved.
Keywords: Social robotics | Human Robot Interaction (HRI) | Emotion and Gesture Recognition | Machine learning | Smart retail settings
Adaptive synthesis of dynamically feasible full-body movements for the humanoid robot HRP-2 by flexible combination of learned dynamic movement primitives (2017)
Article history: Available online 9 February 2017.
Skilled human full-body movements are often planned in a highly predictive manner. For example, when walking while reaching towards a goal object, steps and body postures are adapted to the goal position multiple steps before the goal contact. The realization of such highly predictive behaviors for humanoid robots is a challenge because standard approaches, such as optimal control, result in computation times that are prohibitive for the predictive control of complex coordinated full-body movements over multiple steps. We devised a new architecture that combines the online planning of complex coordinated full-body movements, based on the flexible combination of learned dynamic movement primitives, with a Walking Pattern Generator (WPG), based on Model Predictive Control (MPC), which generates dynamically feasible locomotion of the humanoid robot HRP-2. A dynamic filter corrects the Zero Moment Point (ZMP) trajectories in order to guarantee the dynamic feasibility of the executed behavior, taking into account the upper-body movements while ensuring an accurate approximation of the planned motion trajectories. We demonstrate the high flexibility of the chosen movement planning approach, and the accuracy and feasibility of the generated motion. In addition, we show that a naïve approach, which generates adaptive motion by machine learning methods through interpolation between feasible training motion examples, fails to guarantee the stability and dynamic feasibility of the generated behaviors.
© 2017 Elsevier B.V. All rights reserved.
Keywords: Robotics | Navigation | Walking pattern generator | Goal-directed movements | Movement primitives | Motor coordination | Action sequences
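The "learned dynamic movement primitives" the architecture combines are, at their core, goal-directed point-attractor systems. The sketch below integrates a single one-dimensional primitive without the learned forcing term, so it shows only the attractor backbone that pulls the state to an adjustable goal, not the paper's trained primitives or the WPG coupling.

```python
import numpy as np

def run_dmp(g, x0=0.0, tau=1.0, alpha=25.0, beta=6.25, dt=0.001, T=2.0):
    """Minimal dynamic-movement-primitive backbone (no learned forcing term):
    a critically damped spring pulling state x towards the goal g.
        tau * v' = alpha * (beta * (g - x) - v)
        tau * x' = v
    """
    x, v = x0, 0.0
    traj = []
    for _ in range(int(T / dt)):
        a = alpha * (beta * (g - x) - v) / tau   # acceleration towards the goal
        v += a * dt
        x += v * dt / tau
        traj.append(x)
    return np.array(traj)

traj = run_dmp(g=1.0)
print(traj[-1])  # converges to the goal
```

Changing `g` retargets the whole movement without replanning, which is what makes primitives like this attractive for the fast online planning the abstract describes; a learned forcing term would additionally shape the transient into a demonstrated movement.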
Combining expert knowledge with machine learning based on fuzzy training
Publication year: 2017 - English PDF pages: 5 - Persian DOC pages: 18
This paper introduces a fuzzy training approach based on non-linear regularization, whose aim is to constrain the training process. The core idea is to restrict training so that expert knowledge serves as the basis for building a model that remains plausible. This idea is implemented with a new non-linear regularization method that is applicable to any type of training data set. The method is demonstrated using a large crop-yield data set (> 4,500 crops) for sugar beet, collected from agricultural fields in eastern Germany over a 14-year period (1976-1989). The software was implemented in SAMT2, a free and open-source tool, using the Python programming language.
Keywords: Fuzzy modeling | Expert knowledge | Machine learning | Non-linear regularization | Optimization | Yield modeling
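One way to picture "constraining training with expert knowledge" is a regularizer that pulls model parameters towards expert-specified values. The closed-form sketch below is an illustrative linear analogue, not the paper's non-linear fuzzy method; the data and the expert prior are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + rng.normal(scale=0.3, size=n)

# Hypothetical expert prior on the coefficients (e.g. agronomic knowledge
# about how each input should influence yield)
w_expert = np.array([1.5, -0.5, 0.0])

def fit_towards_expert(X, y, w_expert, lam):
    """Least squares with a penalty pulling weights towards the expert's values:
    minimize ||y - X w||^2 + lam * ||w - w_expert||^2   (closed form)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y + lam * w_expert)

w_free = fit_towards_expert(X, y, w_expert, lam=0.0)         # ordinary least squares
w_constrained = fit_towards_expert(X, y, w_expert, lam=1e4)  # expert-dominated fit
print(w_free, w_constrained)
```

As `lam` grows, the fitted model is pulled away from the purely data-driven solution and towards the expert's prior, which is the same trade-off (data fit versus plausibility to the expert) that the paper's regularization navigates.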
Granular Computing Based Machine Learning in the Era of Big Data (2017)
Big Data have attracted much attention from a variety of circles: scientific research, marketing, business management, and government decision making. It is a huge and challenging task to extract knowledge and build models with big data [1, 2]. Structuralized knowledge organization and reasoning are considered an efficient and effective paradigm for dealing with large-scale tasks. The idea underlying structuralized reasoning has been widely used for a long time. Granular Computing, a term coined by Prof. L. A. Zadeh, refers to a new computational paradigm which focuses on knowledge representation and reasoning with information granules. By information granules one regards a collection of elements drawn together by their closeness (resemblance, proximity, functionality, etc.) articulated in terms of some useful spatial, temporal, or functional relationships. As put lucidly in the literature: information granulation and Granular Computing play a pivotal role in human cognitive processes and decision-making activities. We perceive complex phenomena by organizing existing knowledge along with available experimental evidence and structuring it in the form of meaningful, semantically sound entities, which are central to all processes of describing the world, reasoning about the environment and supporting decision-making activities. Granular Computing leads to several important advantages of robustness, simplification and efficiency. Fuzzy sets, rough sets, computing with words, and shadowed sets are some of the formal frameworks among the active branches of Granular Computing. We have been witnessing significant advances of Granular Computing in scientific and engineering domains. To come up with some facts, we retrieved literature about Granular Computing in Thomson Reuters (http://apps.webofknowledge.com/): more than 2,500 hits have been reported, and these studies are gaining more attention across different research areas.
Big Data Analytics for User-Activity Analysis and User-Anomaly Detection in Mobile Wireless Network (2017)
The next generation wireless networks are expected to operate in a fully automated fashion to meet the burgeoning capacity demand and to serve users with a superior quality of experience. Mobile wireless networks can leverage spatiotemporal information about user and network condition to embed the system with end-to-end visibility and intelligence. Big data analytics has emerged as a promising approach to unearth meaningful insights and to build artificially intelligent models with the assistance of machine learning tools. Utilizing the aforementioned tools and techniques, this paper contributes in two ways. First, we utilize mobile network data (big data) - the call detail record (CDR) - to analyze anomalous behavior of a mobile wireless network. For anomaly detection purposes, we use unsupervised clustering techniques, namely k-means clustering and hierarchical clustering. We compare the detected anomalies with ground truth information to verify their correctness. From the comparative analysis, we observe that when the network experiences an abruptly high (unusual) traffic demand at any location and time, it is identified as an anomaly. This helps in identifying regions of interest (RoI) in the network for special action, such as resource allocation and fault-avoidance solutions. Second, we train a neural-network based prediction model with anomalous and anomaly-free data to highlight the effect of anomalies in data while training/building intelligent models. In this phase, we transform our anomalous data into anomaly-free data and observe that the prediction error when training the model with anomaly-free data decreases considerably compared to the case when the model is trained with anomalous data.
Index Terms: Next generation wireless networks | 5G | Anomaly detection | Call detail record | Machine learning | Network analytics | Network behavior analysis | Wireless cellular network
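The k-means stage of the anomaly detection can be sketched on synthetic CDR-like data: cells whose traffic shows an abrupt, unusually high spike fall into their own small cluster, which is then flagged as the anomalous region of interest. The activity matrix below is invented for illustration, not real CDR data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Hypothetical hourly CDR-style activity counts for 200 cells (24 hours each)
normal = rng.poisson(lam=50, size=(200, 24))
anomalous = rng.poisson(lam=50, size=(5, 24))
anomalous[:, 18:21] += 400  # abrupt, unusual evening traffic spike at a few cells

X = np.vstack([normal, anomalous]).astype(float)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Flag the smaller cluster as anomalous (unusual traffic pattern)
labels = km.labels_
small = np.argmin(np.bincount(labels))
flagged = np.where(labels == small)[0]
print(flagged)
```

The flagged indices correspond exactly to the cells with the injected spike; in the paper's setting such flags are cross-checked against ground truth and used to mark regions of interest for resource allocation or fault avoidance.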
An anomaly detection system based on variable N-gram features and one-class SVM (2017)
Article history: Received 3 July 2016; Revised 9 June 2017; Accepted 21 July 2017; Available online xxx.
Context: Run-time detection of system anomalies at the host level remains a challenging task. Existing techniques suffer from high rates of false alarms, hindering large-scale deployment of anomaly detection techniques in commercial settings.
Objective: To reduce the false alarm rate, we present a new anomaly detection system based on a novel feature extraction technique, which combines frequency with temporal information from system call traces, and on a one-class support vector machine (OC-SVM) detector.
Method: The proposed feature extraction approach starts by segmenting the system call traces into multiple n-grams of variable length and mapping them to fixed-size sparse feature vectors, which are then used to train OC-SVM detectors.
Results: The results achieved on a real-world system call dataset show that our feature vectors with up to 6-grams outperform the term vector models (using the most common weighting schemes) proposed in related work. More importantly, our anomaly detection system using OC-SVM with a Gaussian kernel, trained on our feature vectors, achieves a higher level of detection accuracy (with a lower false alarm rate) than that achieved by Markovian and n-gram based models, as well as by the state-of-the-art anomaly detection techniques.
Conclusion: The proposed feature extraction approach from traces of events provides new and general data representations that are suitable for training standard one-class machine learning algorithms, while preserving the temporal dependencies among these events.
© 2017 Elsevier B.V. All rights reserved.
Keywords: Software security | Anomaly detection systems | Intrusion detection and prevention | Feature extraction | Tracing | System calls
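A minimal sketch of the pipeline's shape, assuming scikit-learn: extract variable-length n-grams (here 1- to 3-grams) over system-call sequences as sparse count vectors, then train a one-class SVM with a Gaussian kernel on normal traces only. The traces are toy examples, and `CountVectorizer` is a generic stand-in for the paper's feature extraction, not its actual implementation.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import OneClassSVM

# Hypothetical system-call traces as space-separated call names
normal_traces = ["open read read write close"] * 40 + ["open read write close"] * 40
attack_traces = ["open exec exec socket connect"] * 5

# Variable-length n-grams (1- to 3-grams) over the call sequence, mapped to
# sparse frequency vectors; the n-gram range bounds the "variable length"
vec = CountVectorizer(analyzer="word", ngram_range=(1, 3))
X_train = vec.fit_transform(normal_traces)

# One-class SVM with a Gaussian (RBF) kernel, trained on normal behavior only
oc = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(X_train.toarray())

X_test = vec.transform(attack_traces).toarray()
pred = oc.predict(X_test)  # -1 = anomaly, +1 = normal
print(pred)
```

Because the attack traces share almost no n-grams with the normal training set, their feature vectors lie far from the learned region of normality and the detector labels them anomalous.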
GPU-based parallel optimization of immune convolutional neural network and embedded system (2017)
Up to now, image recognition systems have been utilized more and more widely in security monitoring, industrial intelligent monitoring, unmanned vehicles, and even space exploration. In designing an image recognition system, the traditional convolutional neural network has some defects such as a long training time, easy over-fitting and a high misclassification rate. In order to overcome these defects, we first used an immune mechanism to improve the convolutional neural network and put forward a novel immune convolutional neural network algorithm, after analyzing the network structure and parameters of the convolutional neural network. Our algorithm not only integrated the location data of the network nodes and the adjustable parameters, but also dynamically adjusted the smoothing factor of the basis function. In addition, we utilized an NVIDIA GPU (Graphics Processing Unit) to accelerate the new immune convolutional neural network (ICNN) with parallel computing and built a real-time embedded image recognition system for this ICNN. The immune convolutional neural network algorithm was improved with CUDA programming and was tested with the sample data in the GPU-based environment. The GPU-based implementation of the novel immune convolutional neural network algorithm was made with cuDNN, which was designed by NVIDIA for GPU-based acceleration of DNNs in machine learning. Experimental results show that our new immune convolutional neural network has a higher recognition rate, more stable performance and faster computing speed than the traditional convolutional neural network.
© 2016 Elsevier Ltd. All rights reserved.
Keywords: Immune algorithm | Convolutional neural network | Image recognition | Parallel computing | Embedded system | Security monitoring