Toward modeling and optimization of features selection in Big Data based social Internet of Things
Toward modeling and optimization of feature selection in Big Data-based social Internet of Things (2018)
The growing gap between users and Big Data analytics requires innovative tools that address the challenges posed by big data volume, variety, and velocity, since it is computationally inefficient to analyze and select features from such a massive volume of data. Moreover, advances in Big Data applications and data science pose additional challenges, where the selection of appropriate features and of a High-Performance Computing (HPC) solution has become a key issue and has attracted attention in recent years. There is therefore a need for a system that can efficiently select features from and analyze a stream of Big Data within its requirements. Hence, this paper presents a system architecture that selects features using the Artificial Bee Colony (ABC) algorithm. A Kalman filter is used within the Hadoop ecosystem to remove noise, and traditional MapReduce combined with ABC enhances processing efficiency. A complete four-tier architecture is also proposed that efficiently aggregates the data, eliminates unnecessary data, and analyzes the data with the proposed Hadoop-based ABC algorithm. To check the efficiency of the algorithms exploited in the proposed system architecture, we have implemented the system using Hadoop and MapReduce with the ABC algorithm: ABC selects features, while MapReduce is supported by a parallel algorithm that efficiently processes huge data sets. The system runs on the MapReduce tool on top of Hadoop parallel nodes in near real time. The proposed system is compared with Swarm approaches and evaluated in terms of efficiency, accuracy, and throughput on ten different data sets. The results show that the proposed system is more scalable and efficient in selecting features.
Keywords: SIoT, Big Data, ABC algorithm, Feature selection
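The entry above selects features with an Artificial Bee Colony. As a hedged illustration of the core idea only, the sketch below runs a minimal single-node ABC over binary feature masks; the toy fitness function, population size, and stagnation limit are illustrative assumptions, not the paper's settings, and the Hadoop/MapReduce layer is omitted entirely.

```python
import random

def abc_feature_select(fitness, n_features, n_bees=10, n_iters=30, limit=5, seed=0):
    """Minimal Artificial Bee Colony sketch for binary feature selection.
    Each food source is a 0/1 mask over features; bees flip one bit of a
    source, and a scout re-initialises any source stagnant past `limit`."""
    rng = random.Random(seed)
    sources = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(n_bees)]
    scores = [fitness(s) for s in sources]
    trials = [0] * n_bees
    best_mask, best_score = max(zip(sources, scores), key=lambda t: t[1])
    best_mask = best_mask[:]
    for _ in range(n_iters):
        for i in range(n_bees):
            cand = sources[i][:]
            cand[rng.randrange(n_features)] ^= 1   # flip one feature in/out
            c = fitness(cand)
            if c > scores[i]:
                sources[i], scores[i], trials[i] = cand, c, 0
            else:
                trials[i] += 1
            if trials[i] > limit:                  # scout: abandon stagnant source
                sources[i] = [rng.randint(0, 1) for _ in range(n_features)]
                scores[i] = fitness(sources[i])
                trials[i] = 0
            if scores[i] > best_score:
                best_mask, best_score = sources[i][:], scores[i]
    return best_mask, best_score

# Toy fitness standing in for a classifier's cross-validated accuracy:
# reward the three "informative" features, penalise subset size.
informative = {0, 1, 2}
fit = lambda m: sum(m[k] for k in informative) - 0.1 * sum(m)
mask, score = abc_feature_select(fit, n_features=8)
```

In the full system each fitness evaluation would be a MapReduce job over the cluster rather than an in-memory call.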
A new architecture of Internet of Things and big data ecosystem for secured smart healthcare monitoring and alerting system
A new architecture of Internet of Things and big data ecosystem for secured smart healthcare monitoring and alerting system (2018)
Wearable medical devices with sensors continuously generate enormous amounts of data, often called big data, a mix of structured and unstructured data. Because of its complexity, it is difficult to process and analyze this big data to find valuable information useful for decision making. On the other hand, data security is a key requirement in a healthcare big data system. To overcome these issues, this paper proposes a new architecture for implementing IoT to store and process scalable sensor data (big data) for healthcare applications. The proposed architecture consists of two main sub-architectures, namely the Meta Fog-Redirection (MF-R) architecture and the Grouping and Choosing (GC) architecture. The MF-R architecture uses big data technologies such as Apache Pig and Apache HBase to collect and store the sensor data (big data) generated by different sensor devices. The proposed GC architecture is used for securing the integration of fog computing with cloud computing. This architecture also uses a key management service and a data categorization function (Sensitive, Critical, and Normal) to provide security services. The framework further uses a MapReduce-based prediction model to predict heart diseases. Performance evaluation parameters such as throughput, sensitivity, accuracy, and F-measure are calculated to prove the efficiency of the proposed architecture as well as the prediction model.
Keywords: Wireless sensor networks, Internet of Things, Big data analytics, Cloud computing, Healthcare
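The entry above builds its prediction on MapReduce. The sketch below is a minimal in-memory illustration of the map/reduce pattern such a pipeline rests on; the heart-rate field, the 100 bpm threshold, and the risk flag are illustrative assumptions, not the paper's prediction model.

```python
from collections import defaultdict

def map_phase(record):
    """Map: emit a (patient_id, reading) pair from a raw sensor record."""
    patient_id, heart_rate = record
    yield patient_id, heart_rate

def reduce_phase(patient_id, readings, threshold=100):
    """Reduce: average one patient's readings and flag a potential risk.
    The 100 bpm threshold is illustrative, not taken from the paper."""
    avg = sum(readings) / len(readings)
    return patient_id, avg, avg > threshold

def run_mapreduce(records):
    """Shuffle emitted pairs by key, then reduce each group."""
    groups = defaultdict(list)
    for record in records:
        for key, value in map_phase(record):
            groups[key].append(value)
    return [reduce_phase(k, v) for k, v in sorted(groups.items())]
```

On a real cluster the shuffle step is distributed across nodes; here it is a single dictionary.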
Role of Big Data and Machine Learning in Diagnostic Decision Support in Radiology
Role of Big Data and Machine Learning in Diagnostic Decision Support in Radiology (2018)
The field of diagnostic decision support in radiology is undergoing rapid transformation with the availability of large amounts of patient data and the development of new artificial intelligence methods such as deep learning. These methods hold the promise of providing imaging specialists with tools that improve the accuracy and efficiency of diagnosis and treatment. In this article, we describe the growth of this field in radiology and outline general trends, highlighting progress in diagnostic decision support from the early days of rule-based expert systems to the cognitive assistants of the modern era.
Keywords: Diagnostic decision support, artificial intelligence, deep learning, machine learning, cognitive assistants, medical image analysis, knowledge and reasoning
High-order possibilistic c-means algorithms based on tensor decompositions for big data in IoT
High-order possibilistic c-means algorithms based on tensor decompositions for big data in IoT (2018)
The Internet of Things (IoT) connects the physical world and the cyber world to offer intelligent services by mining big data. Each big data sample typically involves a large number of attributes, posing a remarkable challenge to the high-order possibilistic c-means algorithm (HOPCM). Specifically, HOPCM requires high-performance servers with large-scale memory and powerful computing units to cluster big samples, limiting its applicability in IoT systems built on low-end devices, such as portable computing units and embedded devices, which have only limited memory space and computing power. In this paper, we propose two high-order possibilistic c-means algorithms, based on the canonical polyadic decomposition (CP-HOPCM) and the tensor-train network (TT-HOPCM), for clustering big data. In detail, we use the canonical polyadic decomposition and the tensor-train network to compress the attributes of each big data sample. To evaluate the performance of our algorithms, we conduct experiments on two representative big data datasets, NUS-WIDE-14 and SNAE2, comparing against the conventional high-order possibilistic c-means algorithm in terms of attribute reduction, execution time, memory usage, and clustering accuracy. The results imply that CP-HOPCM and TT-HOPCM are promising for big data clustering in IoT systems with low-end devices, since they achieve a high compression rate for heterogeneous samples, saving memory space significantly without a significant drop in clustering accuracy.
Keywords: Big data, IoT, Possibilistic c-means clustering, Canonical polyadic decomposition, Tensor-train network
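The entry above compresses each sample's attributes with the canonical polyadic decomposition. As a hedged sketch of that compression idea, the code below fits a rank-1 CP approximation of a 3-way tensor by alternating least squares; the tensor sizes and the rank-1 restriction are illustrative simplifications, not the paper's CP-HOPCM algorithm.

```python
import numpy as np

def cp_rank1(T, n_iter=30, seed=0):
    """Rank-1 CP approximation of a 3-way tensor via alternating least
    squares: T[i,j,k] ~ a[i]*b[j]*c[k]. Each update is the closed-form
    least-squares solution for one factor with the other two fixed."""
    rng = np.random.default_rng(seed)
    b = rng.standard_normal(T.shape[1])
    c = rng.standard_normal(T.shape[2])
    for _ in range(n_iter):
        a = np.einsum('ijk,j,k->i', T, b, c) / ((b @ b) * (c @ c))
        b = np.einsum('ijk,i,k->j', T, a, c) / ((a @ a) * (c @ c))
        c = np.einsum('ijk,i,j->k', T, a, b) / ((a @ a) * (b @ b))
    return a, b, c

# A genuinely rank-1 "sample" is recovered exactly; storing (a, b, c)
# takes d1+d2+d3 numbers instead of d1*d2*d3.
a0, b0, c0 = np.arange(1, 5.0), np.arange(1, 4.0), np.arange(1, 3.0)
T = np.einsum('i,j,k->ijk', a0, b0, c0)
a, b, c = cp_rank1(T)
recon = np.einsum('i,j,k->ijk', a, b, c)
```

Real samples need rank R > 1 and yield an approximation rather than an exact recovery, which is the accuracy/memory trade-off the paper evaluates.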
A unique feature extraction using MRDWT for automatic classification of abnormal heartbeat from ECG big data with Multilayered Probabilistic Neural Network classifier
Unique feature extraction using MRDWT for automatic classification of abnormal heartbeats from ECG big data with a Multilayered Probabilistic Neural Network classifier (2018)
This paper employs a novel adaptive feature extraction technique for the electrocardiogram (ECG) signal to detect cardiac arrhythmias, using the multiresolution discrete wavelet transform on ECG big data. Five types of ECG arrhythmias, including normal beats, are classified. The MIT-BIH database of 48 patient records is used for the detection and analysis of cardiac arrhythmias. The proposed feature extraction uses Daubechies wavelets and extracts 21 feature points that include the QRS complex of the ECG signal. The Multilayered Probabilistic Neural Network (MPNN) classifier is proposed as the classifier best suited to the proposed feature. A total of 1700 ECG beats were tested with the MPNN classifier and compared with three other classifiers: Back Propagation Neural Network (BPNN), Multilayered Perceptron (MLP), and Support Vector Machine (SVM). System efficiency and performance were evaluated using the following criteria: precision (PR), F-score, positive predictivity (PP), sensitivity (SE), classification error rate (CER), and specificity (SP). The overall system accuracy obtained with the MPNN technique on the proposed feature is 99.53%, whereas BPNN, MLP, and SVM provide 97.94%, 98.53%, and 99%, respectively. The processing time with the MPNN classifier is only 3 s, which shows that the proposed technique is not only very accurate and efficient but also very fast.
Keywords: Signal processing, Artificial intelligence, Pattern recognition, Soft computing, Wavelet transform
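The entry above extracts multiresolution wavelet features from ECG beats. The sketch below shows the multiresolution idea with the simpler Haar wavelet instead of Daubechies, and emits per-level detail energies rather than the paper's 21 feature points; both substitutions are illustrative assumptions.

```python
def haar_dwt(x):
    """One level of the orthonormal Haar DWT:
    returns (approximation, detail) coefficients."""
    a = [(x[i] + x[i + 1]) / 2 ** 0.5 for i in range(0, len(x) - 1, 2)]
    d = [(x[i] - x[i + 1]) / 2 ** 0.5 for i in range(0, len(x) - 1, 2)]
    return a, d

def mr_features(x, levels=3):
    """Multiresolution features: energy of the detail coefficients at each
    level plus energy of the final approximation (a stand-in for the
    paper's 21-point Daubechies feature vector)."""
    feats = []
    for _ in range(levels):
        x, d = haar_dwt(x)
        feats.append(sum(v * v for v in d))
    feats.append(sum(v * v for v in x))
    return feats

feats = mr_features([1, 2, 3, 4, 5, 6, 7, 8])
```

Because the Haar transform is orthonormal, the feature energies sum to the signal's total energy, a handy sanity check for any DWT implementation.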
Fault-diagnosis for reciprocating compressors using big data and machine learning
Fault diagnosis for reciprocating compressors using big data and machine learning (2018)
Reciprocating compressors are widely used in the petroleum industry. A small fault in a reciprocating compressor may cause serious operational issues, and traditional regular maintenance and fault diagnosis solutions cannot efficiently detect potential faults. This paper proposes a fault-diagnosis system for reciprocating compressors that applies machine-learning techniques to data analysis and fault diagnosis. The raw data is denoised first; the denoised data is then sparse-coded to train a dictionary. Based on the learned dictionary, potential faults are finally recognized and classified by a support vector machine (SVM). The system is evaluated using five years of operation data collected from an offshore oil corporation in a cloud environment. The collected data is evenly divided into two halves, one used for training and the other for testing. The results demonstrate that the proposed system can efficiently diagnose potential faults in compressors with more than 80% accuracy, a better result than the current practice.
Keywords: Reciprocating compressor, Big data, Cloud computing, Deep learning, RPCA, SVM
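The entry above sparse-codes denoised signals against a learned dictionary before SVM classification. As a hedged sketch of the encoding step alone, the code below runs greedy matching pursuit against a fixed unit-norm dictionary; the dictionary, the signal, and the two-atom budget are illustrative, and both the dictionary learning and the SVM stage are omitted.

```python
def matching_pursuit(x, D, n_atoms=2):
    """Greedy matching pursuit: encode signal x as a sparse combination of
    dictionary atoms (rows of D, assumed unit-norm).
    Returns the sparse coefficient vector."""
    r = list(x)
    coef = [0.0] * len(D)
    for _ in range(n_atoms):
        # pick the atom most correlated with the current residual
        scores = [sum(a * b for a, b in zip(atom, r)) for atom in D]
        k = max(range(len(D)), key=lambda i: abs(scores[i]))
        coef[k] += scores[k]
        r = [ri - scores[k] * ai for ri, ai in zip(r, D[k])]
    return coef

# Toy orthonormal dictionary: the code recovers the signal's components.
D = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
coef = matching_pursuit([3.0, 0.0, 4.0], D)
```

In the paper's pipeline the sparse coefficient vectors, not the raw signals, become the inputs to the SVM classifier.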
An efficient fuzzy c-means approach based on canonical polyadic decomposition for clustering big data in IoT
An efficient fuzzy c-means approach based on canonical polyadic decomposition for clustering big data in IoT (2018)
The Internet of Things (IoT) attempts to better human life by integrating physical devices into the information space, and mining smart data from the big data it collects is a central task. As one of the most important clustering techniques for drilling smart data, the fuzzy c-means algorithm (FCM) assigns each object to multiple groups by calculating a membership matrix. However, each big data object has a large number of attributes, posing a remarkable challenge to FCM for real-time clustering of IoT big data. In this paper, we propose an efficient fuzzy c-means approach based on the tensor canonical polyadic decomposition for clustering big data in the Internet of Things. In the presented scheme, the traditional fuzzy c-means algorithm is converted to the high-order tensor fuzzy c-means algorithm (HOFCM) via a bijection function. Furthermore, the tensor canonical polyadic decomposition is used to reduce the attributes of every object, enhancing clustering efficiency. Finally, extensive experiments compare the developed scheme with the traditional fuzzy c-means algorithm on two large IoT datasets, sWSN and eGSAD, in terms of clustering accuracy and clustering efficiency. The results show that the developed scheme achieves significantly higher clustering efficiency with a slight drop in clustering accuracy compared with the traditional algorithm, indicating its potential for drilling smart data from IoT big data.
Keywords: Big data, Internet of Things, Smart data, Fuzzy c-means algorithm, Canonical polyadic decomposition
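The entry above builds on the fuzzy c-means algorithm it converts to tensor form. The sketch below is a minimal single-node FCM baseline with the standard membership and centroid updates; the toy data, cluster count, and fuzzifier m=2 are illustrative, and the paper's bijection to tensors and CP-based attribute reduction are not included.

```python
import numpy as np

def fcm(X, c=2, m=2.0, n_iter=50, seed=0):
    """Minimal fuzzy c-means: returns the membership matrix U (n x c,
    rows summing to 1) and the c centroids."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        W = U ** m
        C = (W.T @ X) / W.sum(axis=0)[:, None]        # membership-weighted centroids
        d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))                 # u_ik proportional to d_ik^{-2/(m-1)}
        U = inv / inv.sum(axis=1, keepdims=True)
    return U, C

# Two well-separated toy groups: memberships split them cleanly.
X = np.array([[0, 0], [0, 1], [1, 0], [10, 10], [10, 11], [11, 10]], float)
U, C = fcm(X)
labels = U.argmax(axis=1)
```

The per-iteration cost grows with the number of attributes per object, which is exactly what the paper's CP-based reduction targets.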
Fall detection system for elderly people using IoT and Big Data
Fall detection system for elderly people using IoT and Big Data (2018)
Falls represent a major public health risk for elderly people worldwide. A fall not assisted in time can cause functional impairment in an elderly person and a significant decrease in mobility, independence, and quality of life. In that sense, the present work proposes an innovative IoT-based system for detecting falls of elderly people in indoor environments, which takes advantage of low-power wireless sensor networks, smart devices, big data, and cloud computing. For this purpose, a 3-axis accelerometer embedded into a wearable 6LowPAN device is used, which collects data on the movements of elderly people in real time. To provide high efficiency in fall detection, the sensor readings are processed and analyzed using a decision-tree-based Big Data model running on a Smart IoT Gateway. If a fall is detected, an alert is activated and the system reacts automatically by sending notifications to the groups responsible for the care of the elderly. Finally, the system provides services built on the cloud. From a medical perspective, a storage service enables healthcare professionals to access fall data for further analysis. The system also provides a service that leverages this data to create a new machine learning model each time a fall is detected. The experimental results have shown high success rates in fall detection in terms of accuracy, precision, and gain.
Keywords: Fall detection; Internet of Things; Big Data; 6LowPAN; wearable sensor; Smart IoT Gateway; decision tree learning algorithm; accelerometer; elderly people
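The entry above classifies accelerometer readings with a learned decision tree. The sketch below is a fixed-threshold stand-in for that classifier: it flags a near-free-fall magnitude followed by a hard impact. The 0.4 g and 2.5 g thresholds and the ten-sample window are illustrative assumptions, not values from the paper.

```python
def detect_fall(samples, free_fall=0.4, impact=2.5, window=10):
    """Threshold sketch over 3-axis accelerometer readings (in g):
    a fall is flagged when a near-free-fall magnitude is followed by a
    high-magnitude impact within `window` samples."""
    mags = [(x * x + y * y + z * z) ** 0.5 for x, y, z in samples]
    for i, m in enumerate(mags):
        if m < free_fall:                      # brief weightlessness
            if any(m2 > impact for m2 in mags[i + 1:i + 1 + window]):
                return True                    # followed by an impact spike
    return False

# Steady ~1 g walking vs. a free-fall dip followed by an impact spike.
walking = [(0.0, 0.1, 1.0)] * 20
fall = [(0.0, 0.0, 1.0)] * 5 + [(0.0, 0.0, 0.2)] + [(0.0, 0.0, 3.0)] + [(0.0, 0.0, 1.0)] * 5
```

A trained decision tree replaces these hand-picked thresholds with splits learned from labeled fall data, which is what lets the paper's system improve as new falls are recorded.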
Conception and Exploration of Using Data as a Service in Tunnel Construction with the NATM
Conception and exploration of using data as a service in tunnel construction with the NATM (2018)
The New Austrian Tunneling Method (NATM) has been widely used in the construction of mountain tunnels, urban metro lines, underground storage tanks, underground power houses, mining roadways, and so on. The variation patterns of advance geological prediction data, stress–strain data of supporting structures, and deformation data of the surrounding rock are vitally important in assessing the rationality and reliability of construction schemes, and provide essential information to ensure the safety and scheduling of tunnel construction. However, as the quantity of these data increases significantly, the uncertainty and discreteness of the mass data make it extremely difficult to produce a reasonable construction scheme; they also reduce the forecast accuracy of accidents and dangerous situations, creating huge challenges in tunnel construction safety. In order to solve this problem, a novel data service system is proposed that uses data-association technology and the NATM, with the support of a big data environment. This system can integrate data resources from distributed monitoring sensors during the construction process, and then identify associations and build relations among data resources under the same construction conditions. These data associations and relations are then stored in a data pool. With the development and supplementation of the data pool, similar relations can then be used under similar conditions, in order to provide data references for construction schematic designs and resource allocation. The proposed data service system also provides valuable guidance for the construction of similar projects.
Keywords: New Austrian Tunneling Method, Big data environments, Data as a service, Tunnel construction
SLA based healthcare big data analysis and computing in cloud network
SLA-based healthcare big data analysis and computing in cloud network (2018)
Large volumes of multi-structured, low-latency patient data are generated in healthcare services, and processing and analyzing them within the Service Level Agreement (SLA) is a challenging task. In this paper, a Parallel Semi-Naive Bayes (PSNB) based probabilistic method is used to process healthcare big data in the cloud for future health condition prediction. To improve the accuracy of the PSNB method, a Modified Conjunctive Attribute (MCA) algorithm is proposed for dimensionality reduction. The emergency condition of patients is considered by setting a global priority among them, and an Optimal Data Distribution (ODD) algorithm is proposed to place both batch and streaming patient data on the Spark nodes. Further, a Dynamic Job Scheduling (DJS) algorithm is designed to schedule jobs efficiently to the most suitable nodes for processing the data, taking the SLA into account. The proposed PSNB algorithm provides a better accuracy of 87.8% for both batch and streaming data, which is 12.8% higher than the original Naive Bayes (NB) algorithm, and can conveniently be employed in various patient monitoring applications.
Keywords: Big Data, Cloud computing, Healthcare, Spark
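The entry above compares its parallel semi-naive Bayes against the original Naive Bayes baseline. The sketch below is a single-node categorical Naive Bayes with Laplace smoothing standing in for that baseline; the tiny feature values and labels are made up for illustration, and neither the Spark parallelism nor the semi-naive attribute conjunctions are modeled.

```python
from collections import Counter, defaultdict

def train_nb(rows, labels):
    """Categorical Naive Bayes with Laplace smoothing. Returns a predict()
    closure mapping a feature tuple to the most probable class."""
    classes = Counter(labels)
    counts = defaultdict(Counter)          # (class, feature index) -> value counts
    for row, y in zip(rows, labels):
        for j, v in enumerate(row):
            counts[(y, j)][v] += 1

    def predict(row):
        def score(y):
            p = classes[y] / len(labels)   # class prior
            for j, v in enumerate(row):    # times smoothed per-feature likelihoods
                c = counts[(y, j)]
                p *= (c[v] + 1) / (classes[y] + len(c) + 1)
            return p
        return max(classes, key=score)

    return predict

# Hypothetical patient records: (blood-pressure level, chest-pain flag).
rows = [("high", "yes"), ("high", "no"), ("low", "no"), ("low", "no")]
labels = ["risk", "risk", "ok", "ok"]
predict = train_nb(rows, labels)
```

The semi-naive variant would additionally join strongly dependent attributes into conjunctions before counting, relaxing the independence assumption that plain NB makes.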