Download and view articles related to Data integrity :: Page 1

Search results - Data integrity

Number of articles found: 34
Row | Title | Type
1 DOPIV: Post-Quantum Secure Identity-Based Data Outsourcing with Public Integrity Verification in Cloud Storage (2022)
Public verification enables cloud users to employ a third-party auditor (TPA) to check data integrity. However, recent breakthrough results on quantum computers indicate that the application of quantum computers in clouds may soon be realized. A majority of existing public verification schemes are based on conventional hardness assumptions, which are vulnerable to adversaries equipped with quantum computers in the near future. Moreover, new security issues need to be solved when an original data owner is restricted or cannot access the remote cloud server flexibly. In this paper, we propose an efficient identity-based data outsourcing with public integrity verification scheme (DOPIV) in cloud storage. DOPIV is designed on lattice-based cryptography, which achieves post-quantum security. DOPIV enables an original data owner to delegate a proxy to generate the signatures of data and outsource them to the cloud server. Any TPA can perform data integrity verification efficiently on behalf of the original data owner, without retrieving the entire data set. Additionally, DOPIV possesses the advantages of identity-based systems, avoiding complex certificate management procedures. We provide security proofs of DOPIV in the random oracle model, and conduct a comprehensive performance evaluation to show that DOPIV is more practical in post-quantum secure cloud storage systems.
Index Terms: Cloud storage | public verification | lattice-based cryptography | identity-based data outsourcing | post-quantum security
English article
2 Efficient Implementation of Lightweight Hash Functions on GPU and Quantum Computers for IoT Applications (2022)
Secure communication is important for Internet of Things (IoT) applications, to avoid cybersecurity attacks. One of the key security aspects is data integrity, which can be protected by employing cryptographic hash functions. Recently, the US National Institute of Standards and Technology (NIST) announced a competition to standardize lightweight hash functions, which can be used in IoT applications. IoT communication involves various hardware platforms, from low-end microcontrollers to high-end cloud servers with GPU accelerators. Since many sensor nodes are connected to the gateway devices and cloud servers, performing high-throughput integrity checks is important to secure IoT applications. However, this is a time-consuming task even for high-end servers, which may affect the response time in IoT systems. Moreover, no prior work has evaluated the performance of NIST candidates on contemporary processors like GPUs and quantum computers. In this study, we showed that with carefully crafted implementation techniques, all the finalist hash function candidates in the NIST standardization competition can achieve high throughput (up to 1,000 Gbps) on an RTX 3080 GPU. This research output can be used by IoT gateway devices and cloud servers to perform data integrity checks at high speed, thus ensuring a timely response. In addition, this is also the first study that showcases the implementation of NIST lightweight hash functions on a quantum computer (ProjectQ). Besides securing communication in IoT, these efficient implementations on a GPU and quantum computer can be used to evaluate the strength of the respective hash functions against brute-force attacks.
Index Terms: Graphics processing units (GPU) | hash function | lightweight cryptography | quantum computer.
English article
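The gateway-side integrity check described in the abstract above can be sketched with Python's standard library. Note the assumption: the NIST lightweight finalists (such as ASCON) are not available in `hashlib`, so `sha3_256` is used here purely as a stand-in for whichever lightweight hash a deployment would choose.

```python
import hashlib

def digest(payload: bytes) -> str:
    """Hex digest of a sensor payload.

    sha3_256 is a stand-in here: the NIST lightweight finalists
    (e.g. ASCON) are not in the standard library.
    """
    return hashlib.sha3_256(payload).hexdigest()

def verify(payload: bytes, expected: str) -> bool:
    """Gateway recomputes the digest and compares it to the received tag."""
    return digest(payload) == expected

# A sensor node sends (payload, tag); the gateway verifies on arrival.
reading = b'{"temp": 21.4, "node": 7}'
tag = digest(reading)
assert verify(reading, tag)                            # intact payload
assert not verify(b'{"temp": 99.9, "node": 7}', tag)   # tampered payload
```

A real deployment would also authenticate the tag itself (e.g. with a keyed MAC); a bare hash only detects accidental or in-transit corruption, not a forger who can replace both payload and tag.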
3 Disintegration testing augmented by computer vision technology (2022)
Oral solid dosage forms, specifically immediate release tablets, are prevalent in the pharmaceutical industry. Disintegration testing is often the first step of commercialization and large-scale production of these dosage forms. Current disintegration testing in the pharmaceutical industry, according to United States Pharmacopeia (USP) chapter 〈701〉, only gives information about the duration of the tablet disintegration process. This information is subjective, variable, and prone to human error due to manual or physical data collection methods via the human eye or contact disks. To lessen the data integrity risk associated with this process, efforts have been made to automate the analysis of the disintegration process using digital lenses and other imaging technologies. This would provide a non-invasive method to quantitatively determine disintegration time through computer algorithms. The main challenges associated with developing such a system involve visualization of tablet pieces through cloudy and turbid liquid. The Computer Vision for Disintegration (CVD) system has been developed to be used along with traditional pharmaceutical disintegration testing devices to monitor tablet pieces and distinguish them from the surrounding liquid. The software written for CVD utilizes data captured by cameras or other lenses, then uses mobile SSD and CNN, with an OpenCV and FRCNN machine learning model, to analyze and interpret the data. This technology is capable of consistently identifying tablets with ≥ 99.6% accuracy. Not only is the data produced by CVD more reliable, but it opens the possibility of a deeper understanding of disintegration rates and mechanisms in addition to duration.
Keywords: Disintegration | Oral Solid Dosage Forms | Disintegration Test | Machine Learning | Neural Networks
English article
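The core idea above, deciding disintegration time by tracking how much of each frame the tablet still occupies, can be illustrated without any vision library. This is a toy sketch, not the CVD system: a grayscale frame is a plain 2D list, "tablet" pixels are simply the bright ones, and the brightness threshold (200) and area cutoff (0.05) are made-up example parameters standing in for the learned SSD/FRCNN detectors.

```python
def tablet_fraction(frame, threshold=200):
    """Fraction of pixels at or above `threshold` (treated as tablet)."""
    total = sum(len(row) for row in frame)
    tablet = sum(1 for row in frame for px in row if px >= threshold)
    return tablet / total

def disintegration_frame(frames, area_cutoff=0.05, threshold=200):
    """Index of the first frame whose tablet area drops below the cutoff,
    i.e. an estimated disintegration time in frame units; None if never."""
    for i, frame in enumerate(frames):
        if tablet_fraction(frame, threshold) < area_cutoff:
            return i
    return None

# A tablet shrinking over three 4x4 frames: 4 bright pixels, then 1, then 0.
frames = [
    [[250, 250, 10, 10], [250, 250, 10, 10], [10] * 4, [10] * 4],
    [[250, 10, 10, 10], [10] * 4, [10] * 4, [10] * 4],
    [[10] * 4, [10] * 4, [10] * 4, [10] * 4],
]
```

With these frames, `tablet_fraction` falls from 0.25 to 0.0625 to 0.0, so the estimated disintegration happens at frame index 2; multiplying the index by the capture interval would give a time in seconds.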
4 A Survey on Post-Quantum Public-Key Signature Schemes for Secure Vehicular Communications (2022)
Basic security requirements such as confidentiality, user authentication and data integrity are assured by using public-key cryptography (PKC). In particular, public-key signature schemes provide non-repudiation, integrity of transmitted messages and authentication. The presence of a large-scale quantum computer would be a real threat to break the most widely used public-key cryptographic algorithms in practice: the RSA, DSA and ECDSA signature schemes and Diffie-Hellman key exchange. Thus, all security protocols and applications where these public-key cryptographic algorithms are used are vulnerable to quantum-computer attacks. There are five directions of cryptographic primitives secure against a quantum computer: multivariate quadratic equation-based, hash-based, lattice-based, code-based and supersingular isogeny-based cryptography. These primitives could serve as replacements for current public-key cryptographic algorithms to prepare for the post-quantum era. It is important to prioritize the fields to be replaced by post-quantum cryptography (PQC), since it is hard to replace all currently deployed PKC with PQC at the same time. The fields directly connected to human life, such as vehicular communications, should be the primary targets of PQC applications. This survey is dedicated to providing guidelines for adapting the most suitable post-quantum candidates to the requirements of various devices and suggesting efficient and physically secure implementations that can be built into existing embedded applications as easily as traditional PKC. It focuses on the five types of post-quantum signature schemes and investigates their theoretical backgrounds, structures, state-of-the-art constructions and implementation aspects on various platforms, ranging from resource-constrained IoT devices to powerful servers connected to the devices for secure communications in the post-quantum era. It offers appropriate solutions to find tradeoffs between key sizes, signature lengths, performance, and security for practical applications.
Index Terms: Implementation attack | post-quantum cryptography | public-key signature scheme | quantum algorithm | Shor algorithm | side-channel attack.
English article
5 A systematic quality assurance framework for the upgrade of radiation oncology information systems (2020)
In spite of its importance, no systematic and comprehensive quality assurance (QA) program is available for radiation oncology information systems (ROIS) to verify clinical and treatment data integrity and mitigate data errors/corruption and/or data loss risks. Based on data organization, format and purpose, data in ROISs falls into five different categories: (1) the ROIS relational database and associated files; (2) the ROIS DICOM data stream; (3) treatment machine beam data and machine configuration data; (4) electronic medical record (EMR) documents; and (5) user-generated clinical and treatment reports from the ROIS. For each data category, this framework proposes a corresponding data QA strategy to verify data integrity. This approach verified every bit of data in the ROIS, including billions of data records in the ROIS SQL database, tens of millions of ROIS database-associated files, tens of thousands of DICOM data files for a group of selected patients, almost half a million EMR documents, and tens of thousands of machine configuration files and beam data files. The framework has been validated through intentional modifications with test patient data. Despite the 'big data' nature of ROIS, the multiprocess and multithread nature of our QA tools enabled the whole ROIS data QA process to be completed within hours without clinical interruptions. The QA framework suggested in this study proved to be robust, efficient and comprehensive without labor-intensive manual checks, and has been implemented for our routine ROIS QA and ROIS upgrades.
Keywords: Quality assurance | Radiation oncology information system | Clinical data integrity and safety | Radiation oncology data management | Integrated oncology system
English article
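The file-category QA strategy above amounts to digesting every file before and after an upgrade and diffing the two manifests. A minimal sketch, assuming in-memory byte strings stand in for the ROIS database-associated files (the paths here are invented examples, not the paper's actual layout):

```python
import hashlib

def manifest(files):
    """Map each path to a SHA-256 hex digest, so pre- and post-upgrade
    states can be compared. `files` maps a path string to raw bytes."""
    return {path: hashlib.sha256(data).hexdigest() for path, data in files.items()}

def diff(before, after):
    """Classify integrity failures between two manifests."""
    return {
        "missing":  [p for p in before if p not in after],
        "added":    [p for p in after if p not in before],
        "modified": [p for p in before if p in after and before[p] != after[p]],
    }

# One document silently changed during the simulated upgrade.
pre  = manifest({"plan/1.dcm": b"beam data", "emr/note.pdf": b"report"})
post = manifest({"plan/1.dcm": b"beam data", "emr/note.pdf": b"REPORT"})
```

Here `diff(pre, post)["modified"]` flags `emr/note.pdf`, mirroring the paper's validation by intentional modification of test patient data; scaling to millions of files is then a matter of parallelizing the digest step, as the multiprocess design in the abstract suggests.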
6 Based blockchain-PSO-AES techniques in finger vein biometrics: A novel verification secure framework for patient authentication (2019)
The main objective of this study is to propose a novel verification secure framework for patient authentication between an access point (patient enrolment device) and a node database. For this purpose, two stages are used. Firstly, we propose a new hybrid biometric pattern model based on a merge algorithm to combine radio frequency identification and finger vein (FV) biometric features to increase the randomisation and security levels in the pattern structure. Secondly, we developed a combination of encryption, blockchain and steganography techniques for the hybrid pattern model. When sending the pattern from an enrolment device (access point) to the node database, this process ensures that the FV biometric verification system remains secure during authentication by meeting the information security standard requirements of confidentiality, integrity and availability. Blockchain is used to achieve data integrity and availability. Particle swarm optimisation steganography and advanced encryption standard techniques are used for confidentiality in a transmission channel. Then, we discussed how the proposed framework can be implemented on a decentralised network architecture, including an access point and various database nodes without a central point. The proposed framework was evaluated on 106 samples chosen from a dataset that comprises 6000 samples of FV images. Results showed that (1) the high-resistance verification framework is protected against spoofing and brute-force attacks, to which most biometric verification systems are vulnerable; and (2) the proposed framework had an advantage over the benchmark of 55.56% in securing biometric templates during data transmission between the enrolment device and the node database.
Keywords: Finger vein | Blockchain | Cryptography | Steganography | RFID | CIA
English article
7 Blockchain in healthcare and health sciences: A scoping review (2019)
Background: Blockchain can be described as an immutable ledger, logging data entries in a decentralized manner. This new technology has been suggested to disrupt a wide range of data-driven domains, including the health domain. Objective: The purpose of this study was to systematically review, assess and synthesize peer-reviewed publications utilizing/proposing to utilize blockchain to improve processes and services in healthcare, health sciences and health education. Method: A structured literature search on the topic was conducted in October 2018 in relevant bibliographic databases. Result: 39 publications fulfilled the inclusion criteria. The result indicates that Electronic Health Records and Personal Health Records are the areas most targeted by blockchain technology. Access control, interoperability, provenance and data integrity are all issues that are meant to be improved by blockchain technology in this field. Ethereum and Hyperledger Fabric seem to be the most used platforms/frameworks in this domain. Conclusion: This study shows that the endeavors of using blockchain technology in the health domain are increasing exponentially. There are areas within the health domain that potentially could be highly impacted by blockchain technology.
Keywords: Blockchain | Health systems | Scoping review | Distributed ledger
English article
8 Efficient, dynamic and identity-based Remote Data Integrity Checking for multiple replicas (2019)
Nowadays, cloud storage plays an increasingly important role in our daily life. However, cloud users no longer have physical possession of their own data. To confirm whether the outsourced files are maintained intact without downloading them entirely, a mechanism named Remote Data Integrity Checking (RDIC) was invented. Currently, some RDIC schemes allow data owners with limited computation or communication power to delegate the checking task to a third-party verifier. However, most of these schemes rely on the complicated and resource-consuming public key infrastructure (PKI). In this paper, we propose a novel identity-based RDIC scheme, namely Efficient, Dynamic and Identity-based Multiple Replication Provable Data Possession (EDID-MRPDP), without the burden of PKI. We introduce a new construction of Homomorphic Verifiable Tag (HVT) and a novel data structure named Compressed Authentication Array (CAA), which allow EDID-MRPDP to perform batch verification for multiple data owners and cloud servers simultaneously and efficiently, in both computation and communication aspects. To the best of our knowledge, EDID-MRPDP is the first ID-based RDIC scheme with full dynamic updates and multi-replica batch checking. We provide comprehensive correctness and soundness proofs of EDID-MRPDP. Meanwhile, the detailed performance analyses and simulations show that EDID-MRPDP is practical for large-scale cloud applications.
Keywords: Cloud storage | Dynamic data update | Identity-based cryptography | Multi-replica | Batch checking | Provable data possession
English article
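The RDIC flow described above, checking outsourced files without downloading them entirely, can be sketched as a challenge-response audit over randomly sampled blocks. This is a deliberately simplified stand-in: real PDP/RDIC schemes like EDID-MRPDP use homomorphic verifiable tags so the verifier keeps only constant state, whereas this sketch has the verifier retain one digest per block purely to make the protocol shape visible.

```python
import hashlib
import secrets

def tag_blocks(blocks):
    """Owner-side setup: one digest per block, retained by the verifier.
    (A homomorphic tag scheme would avoid storing per-block state.)"""
    return [hashlib.sha256(b).hexdigest() for b in blocks]

def challenge(num_blocks, sample_size):
    """Verifier picks a random subset of block indices to audit."""
    return secrets.SystemRandom().sample(range(num_blocks), sample_size)

def prove(blocks, indices):
    """Cloud responds with digests of only the challenged blocks,
    so the full file never travels back to the verifier."""
    return [hashlib.sha256(blocks[i]).hexdigest() for i in indices]

def verify(tags, indices, proof):
    """Verifier checks each response against its stored tags."""
    return all(tags[i] == p for i, p in zip(indices, proof))

blocks = [b"block-%d" % i for i in range(100)]
tags = tag_blocks(blocks)
idx = challenge(len(blocks), 10)
assert verify(tags, idx, prove(blocks, idx))            # intact replica
corrupted = blocks[:]
corrupted[idx[0]] = b"garbage"
assert not verify(tags, idx, prove(corrupted, idx))     # corruption caught
```

Random sampling is what makes this cheap: auditing 10 of 100 blocks catches a single corrupted block only probabilistically per challenge, but repeated independent challenges drive the detection probability toward 1, which is the usual soundness argument in PDP-style schemes.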
9 Detailed analysis and improvement of an efficient and secure identity-based public auditing for dynamic outsourced data with proxy (2019)
For data owners of restricted cloud access with a delegated proxy, public auditing technology for cloud data integrity plays a critical role in ensuring the powerful productivity that flexible cloud services provide for their business. In order to address the scalability of data owners and storage clouds for secure public auditing, Yu et al. (2017) proposed an Identity-Based Public Auditing for Dynamic Outsourced Data with Proxy Processing (https://doi.org/10.3837/tiis.2017.10.019), which also overcomes the complicated public key certificate management issue. In this article, we show that this scheme is vulnerable to a data loss attack, where clouds can pass integrity auditing without the original data. Meanwhile, a threat to system security is demonstrated: any entity is able to recover proxy private keys and impersonate the proxy to forge proxy tags, given two arbitrary data-tag pairs of the same data owner. To enable secure identity-based batch public auditing with proxy processing, we propose an improved scheme without these security flaws and prove its security under the CDH hard problem in the random oracle model. With complexity analysis, our scheme shows better efficiency over identity-based proxy-oriented data uploading and remote data integrity checking in public cloud (ID-PUIC) for a single owner on a single cloud. In particular, we give a detailed analysis of how efficiently the attacks on Yu et al.'s scheme can be launched with an experiment, and demonstrate a complete reduction on probability and time for proving the security of our improved scheme. For potential application in big data storage, we first evaluate the error detection probability varying with the number of audited blocks, and then conduct detailed performance analysis by simulating our scheme and the ID-PUIC scheme on different numbers of data owners and storage clouds, with up to 10^6 data blocks.
Keywords: Cloud storage | Proxy | Public data auditing | Identity-based cryptography | Provable security
English article
10 Blockchain data-based cloud data integrity protection mechanism (2019)
Despite the rapid development of cloud computing for many years, data security and trusted computing are still the main challenges in current cloud computing applications. In order to solve this problem, many scholars have carried out a lot of research on this topic and proposed many models, including data integrity testing and secure multi-party computation. However, most of these solutions face problems such as excessive computational complexity or lack of scalability. This paper studies the use of blockchain techniques to improve this situation. Blockchain is a decentralized new distributed computing paradigm. Applying blockchain technology to cloud computing, using the security mechanism of the former to improve the performance of the latter's secure storage and secure computing, is a promising research topic. In this paper, a distributed virtual machine agent model is deployed in the cloud by using mobile agent technology. The virtual machine agent enables multi-tenants to cooperate with each other to ensure data trust verification. The tasks of reliable data storage, monitoring and verification are completed by the virtual machine agent mechanism. This is also a necessary condition for building a blockchain integrity protection mechanism. The blockchain-based integrity protection framework is built on the virtual machine proxy model, and the unique hash value corresponding to the file, generated by the Merkle hash tree, is used to monitor data changes by means of a smart contract on the blockchain, so that the data owner is warned of data tampering in time; in addition, a ''block-and-response'' mode is used to construct a blockchain-based cloud data integrity verification scheme.
Keywords: Blockchain | Cloud data | Integrity verification | Merkle hash tree
English article
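The Merkle-hash-tree monitoring described above rests on one property: a single root digest commits to every file, so any change to any leaf changes the root. A minimal sketch of root computation and tamper detection (duplicating the last node on odd levels is one common convention, not necessarily the paper's):

```python
import hashlib

def h(data: bytes) -> bytes:
    """SHA-256 digest used for both leaves and internal nodes."""
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Fold leaf digests pairwise up to a single root digest.
    The last node is duplicated when a level has odd length."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

files = [b"record-1", b"record-2", b"record-3"]
root = merkle_root(files)
assert merkle_root(files) == root                               # stable
assert merkle_root([b"record-1", b"TAMPERED", b"record-3"]) != root
```

In the scheme above, a smart contract would store only `root`; re-deriving a different root from the current files is what triggers the tampering warning, and Merkle paths let the contract check a single file against the root in logarithmic rather than linear work.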