No. | Title | Type |
---|---|---|
1 |
Hybrid Classical-Quantum Optimization Techniques for Solving Mixed-Integer Programming Problems in Production Scheduling
(2022) Quantum computing (QC) holds great promise to open up a new era of computing and has been
receiving significant attention recently. To overcome the performance limitations of near-term QC, utilizing
the current quantum computers to complement classical techniques for solving real-world problems is of
utmost importance. In this article, we develop QC-based solution strategies that exploit quantum annealing
and classical optimization techniques for solving large-scale scheduling problems in manufacturing systems.
The applications of the proposed algorithms are illustrated through two case studies in production scheduling.
First, we present a hybrid QC-based solution approach for the job-shop scheduling problem. Second, we propose a hybrid QC-based parametric method for the multipurpose batch scheduling problem with a fractional
objective. The proposed hybrid algorithms can tackle optimization problems formulated as mixed-integer
linear and mixed-integer fractional programs, respectively, and provide feasibility guarantees. Performance
comparison between state-of-the-art exact and heuristic solvers and the proposed QC-based hybrid solution
techniques is presented for both job-shop and batch scheduling problems. Unlike conventional classical
solution techniques, the proposed hybrid frameworks harness quantum annealing to supplement established
deterministic optimization algorithms and demonstrate performance efficiency over standard off-the-shelf
optimization solvers.
INDEX TERMS: Hybrid techniques | optimization | quantum annealing | quantum computing (QC) | scheduling. |
English article |
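The hybrid workflow in this first paper hands the scheduling problem to a quantum annealer as a quadratic model in binary variables. A minimal sketch of such a QUBO encoding for a toy single-machine schedule; the job and slot counts, penalty weights, and the exhaustive enumeration standing in for the annealer are all assumptions, not the paper's formulation:

```python
from itertools import product

# Toy single-machine schedule: 2 jobs, 2 time slots.
# Binary variable x[j][t] = 1 if job j starts in slot t.
# QUBO energy = A * (one-start-per-job penalty)
#             + B * (machine-conflict penalty) + completion-time cost.
A, B = 10.0, 10.0
JOBS, SLOTS = 2, 2

def energy(bits):
    x = [bits[j*SLOTS:(j+1)*SLOTS] for j in range(JOBS)]
    e = 0.0
    for j in range(JOBS):                       # each job starts exactly once
        e += A * (sum(x[j]) - 1) ** 2
    for t in range(SLOTS):                      # at most one job per slot
        n = sum(x[j][t] for j in range(JOBS))
        e += B * n * (n - 1) / 2
    for j in range(JOBS):                       # prefer early starts
        e += sum(t * x[j][t] for t in range(SLOTS))
    return e

# Exhaustive search stands in for the annealer on this 4-bit toy.
best = min(product((0, 1), repeat=JOBS*SLOTS), key=energy)
print(best, energy(best))
```

At this scale brute force is trivial; the point of the hybrid schemes above is that the same penalty-plus-cost energy can be sampled by quantum annealing once the bitstring grows beyond classical enumeration.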
2 |
Implementing Graph-Theoretic Feature Selection by Quantum Approximate Optimization Algorithm
(2022) Feature selection plays a significant role in computer science; nevertheless, the task is intractable since its search space scales exponentially with the number of dimensions. Motivated by the potential advantages of near-term quantum computing, three graph-theoretic feature selection (GTFS) methods, including minimum-cut (MinCut)-based, densest k-subgraph (DkS)-based, and maximal independent set/minimal vertex cover (MIS/MVC)-based, are investigated in this article, where the original graph-theoretic problems are naturally formulated as quadratic problems in binary variables and then solved using the quantum approximate optimization algorithm (QAOA). Specifically, three separate graphs are created from the raw feature set, where the vertex set consists of individual features and a pairwise measure defines each edge. The corresponding feature subset is generated by deriving a subgraph from the established graph using QAOA. For the above three GTFS approaches, the solving procedure and quantum circuit for the corresponding graph-theoretic problems are formulated within the framework of QAOA. In addition, these proposals can be employed as local solvers and integrated with the Tabu search algorithm for solving large-scale GTFS problems using limited quantum-bit resources. Finally, extensive numerical experiments are conducted on 20 publicly available datasets, and the results demonstrate that each model is superior to its classical counterpart. In addition, the complexity of each model is only O(pn²) even in the worst case, where p is the number of layers in QAOA and n is the number of features.
Index Terms: Feature selection | graph theory | parameterized quantum circuit | quantum approximate optimization algorithm | quantum computing. |
English article |
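The MinCut-based variant described above reduces feature selection to a quadratic problem in binary variables, the form QAOA consumes. A toy sketch, where the four features, the redundancy weights, and the exhaustive search replacing the QAOA sampler are all made-up assumptions:

```python
from itertools import product

# Toy MinCut-based feature selection: vertices = 4 features,
# edge weight w[i][j] = pairwise redundancy between features i and j.
# With binary side labels z[i], the cut value
#   C(z) = sum_{i<j} w[i][j] * (z[i] + z[j] - 2*z[i]*z[j])
# is quadratic in binary variables.
w = {(0, 1): 0.9, (0, 2): 0.1, (1, 3): 0.2, (2, 3): 0.8}

def cut_value(z):
    return sum(wij * (z[i] + z[j] - 2*z[i]*z[j]) for (i, j), wij in w.items())

# Exhaustive search replaces the QAOA sampler on this 4-feature toy;
# the trivial all-same assignments are excluded so the cut is non-empty.
assignments = [z for z in product((0, 1), repeat=4) if 0 < sum(z) < 4]
best = min(assignments, key=cut_value)
print(best, cut_value(best))
```

The minimum cut separates the two tightly coupled pairs, which is the intuition the GTFS methods above exploit: features on one side of a cheap cut are weakly redundant with the other side.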
3 |
Layer VQE: A Variational Approach for Combinatorial Optimization on Noisy Quantum Computers
(2022) Combinatorial optimization on near-term quantum devices is a promising path to demonstrating quantum advantage. However, the capabilities of these devices are constrained by high noise or
error rates. In this article, inspired by the variational quantum eigensolver (VQE), we propose an iterative
layer VQE (L-VQE) approach. We present a large-scale numerical study, simulating circuits with up to
40 qubits and 352 parameters, that demonstrates the potential of the proposed approach. We evaluate
quantum optimization heuristics on the problem of detecting multiple communities in networks, for which we
introduce a novel qubit-frugal formulation. We numerically compare L-VQE with the quantum approximate
optimization algorithm (QAOA) and demonstrate that QAOA achieves lower approximation ratios while
requiring significantly deeper circuits. We show that L-VQE is more robust to finite sampling errors and has
a higher chance of finding the solution as compared with standard VQE approaches. Our simulation results
show that L-VQE performs well under realistic hardware noise.
INDEX TERMS: Combinatorial optimization | hybrid quantum-classical algorithm | quantum optimization. |
English article |
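The layer-wise strategy behind L-VQE, grow the ansatz one layer at a time and initialize each new layer near the identity so the enlarged circuit starts where the smaller one left off, can be mimicked classically. In this sketch the quadratic objective and the random coordinate search are made-up stand-ins for the circuit energy and the classical optimizer:

```python
import random

# Classical caricature of layer-wise VQE: each "layer" adds one
# parameter, initialized to 0 so the new model reproduces the old
# optimum, then the whole vector is re-optimized (warm start).
random.seed(1)
TARGET = [0.6, -0.3, 0.9]               # made-up optimum

def energy(theta):
    return sum((t - g) ** 2 for t, g in zip(theta, TARGET))

def optimize(theta, iters=200, step=0.1):
    best = list(theta)
    for _ in range(iters):
        cand = [t + random.uniform(-step, step) for t in best]
        if energy(cand) < energy(best):
            cand, best = best, cand      # keep only improving moves
    return best

theta = []
for _ in range(len(TARGET)):
    theta.append(0.0)                    # new layer starts as identity
    theta = optimize(theta)              # warm-started re-optimization
print(round(energy(theta), 4))
```

The warm start is the essential point: the optimizer never has to escape a randomly initialized deep ansatz, which is the robustness advantage over standard VQE that the abstract reports.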
4 |
Quantum Approximate Optimization Algorithm Based Maximum Likelihood Detection
(2022) Recent advances in quantum technologies pave the
way for noisy intermediate-scale quantum (NISQ) devices, where
the quantum approximate optimization algorithm (QAOA)
constitutes a promising candidate for demonstrating tangible
quantum advantages based on NISQ devices. In this paper,
we consider the maximum likelihood (ML) detection problem of
binary symbols transmitted over a multiple-input and multiple-output (MIMO) channel, where finding the optimal solution is
exponentially hard using classical computers. Here, we apply the
QAOA for the ML detection by encoding the problem of interest
into a level-p QAOA circuit having 2p variational parameters,
which can be optimized by classical optimizers. This level-p
QAOA circuit is constructed by alternately applying the Hamiltonian prepared for our problem and the initial Hamiltonian
in p consecutive rounds. More explicitly, we first encode the
optimal solution of the ML detection problem into the ground
state of a problem Hamiltonian. Using the quantum adiabatic
evolution technique, we provide both analytical and numerical
results for characterizing the evolution of the eigenvalues of
the quantum system used for ML detection. Then, for level-1
QAOA circuits, we derive the analytical expressions of the
expectation values of the QAOA and discuss the complexity
of the QAOA based ML detector. Explicitly, we evaluate the
computational complexity of the classical optimizer used and the
storage requirement of simulating the QAOA. Finally, we evaluate
the bit error rate (BER) of the QAOA based ML detector and
compare it both to the classical ML detector and to the classical
minimum mean squared error (MMSE) detector, demonstrating
that the QAOA based ML detector is capable of approaching the
performance of the classical ML detector.
Index Terms: Quantum technology | maximum likelihood (ML) detection | quantum approximate optimization algorithm (QAOA) | bit error rate (BER). |
English article |
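The ML detection problem whose optimum the QAOA circuit above encodes is a search over binary symbol vectors minimizing the residual. A toy 2x2 real-valued instance, where the channel matrix and received vector are invented and exhaustive search plays the role of the ground-state solver:

```python
from itertools import product

# Toy ML detection for a real-valued 2x2 MIMO channel y = Hx + n,
# symbols x in {-1,+1}^2.  A QAOA-based detector encodes
# argmin_x ||y - Hx||^2 as the ground state of a problem Hamiltonian;
# here brute force over the 4 candidates plays that role.
H = [[1.0, 0.3],
     [0.2, 0.9]]
y = [1.25, -0.75]            # received vector for transmitted x = (+1, -1)

def residual(x):
    return sum((y[r] - sum(H[r][c] * x[c] for c in range(2))) ** 2
               for r in range(2))

x_ml = min(product((-1, 1), repeat=2), key=residual)
print(x_ml)
```

The classical search space is 2^N for N transmitted bits, which is exactly the exponential hardness the abstract cites as motivation for a quantum detector.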
5 |
Quantum Computing for Applications in Data Fusion
(2022) Quantum computing promises significant improvements in computation capabilities in various fields such as machine
learning and complex optimization problems. Rapid technological advancements suggest that adiabatic and gate-based quantum
computing may see practical applications in the near future. In
this work, we adopt quantum computing paradigms to develop
solvers for two well-known combinatorial optimization problems
in information fusion and resource management: multi-target
data association (MTDA) and weapon target assignment (WTA).
These problems are NP-hard (non-)linear integer programming
optimization tasks which become computationally expensive for
large problem sizes. We derive the problem formulations adapted
for use in quantum algorithms and present solvers based on
adiabatic quantum computing (AQC) and the Quantum Approximate Optimization Algorithm (QAOA). The feasibility of the models
is demonstrated by numerical simulation and first experiments on
quantum hardware.
Index Terms: adiabatic quantum computing | weapon-target assignment | data association | multi-target tracking | quantum gates | Ising model |
English article |
6 |
Solving Vehicle Routing Problem Using Quantum Approximate Optimization Algorithm
(2022) Intelligent transportation systems (ITS) are a critical component of Industry 4.0 and 5.0, particularly having
applications in logistics management. One of their crucial uses is in supply-chain management and scheduling for
optimally routing transportation of goods by vehicles at a given
set of locations. This paper discusses the broader problem of
vehicle traffic management, more popularly known as the Vehicle
Routing Problem (VRP), and investigates the possible use of
near-term quantum devices for solving it. For this purpose,
we give the Ising formulation for VRP and some of its constrained
variants. Then, we present a detailed procedure to solve VRP
by minimizing its corresponding Ising Hamiltonian using a
hybrid quantum-classical heuristic called Quantum Approximate
Optimization Algorithm (QAOA), implemented on the IBM
Qiskit platform. We compare the performance of QAOA with
classical solvers such as CPLEX on problem instances of up to
15 qubits. We find that the performance of QAOA has a multifaceted
dependence on the classical optimization routine used, the depth
of the ansatz parameterized by p, initialization of variational
parameters, and the problem instance itself.
Index Terms: Vehicle routing problem | Ising model | combinatorial optimization | quantum approximate algorithms | variational quantum algorithms. |
English article |
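The Ising formulation of VRP mentioned above assigns a binary variable to each location/step pair and folds the routing constraints into penalty terms of one Hamiltonian. A single-vehicle toy instance with a made-up distance matrix, solved by enumeration in place of QAOA:

```python
from itertools import product

# Toy QUBO for a single-vehicle route: binary x[i][t] = 1 if location
# i is visited at step t (3 locations, 3 steps).  Constraint penalties
# plus travel cost form one energy function whose minimum exhaustive
# search finds here, standing in for minimizing the Ising Hamiltonian.
d = [[0, 4, 1],
     [4, 0, 2],
     [1, 2, 0]]            # made-up symmetric distance matrix
P = 10.0                   # penalty weight, larger than any tour cost
N = 3

def qubo_energy(bits):
    x = [bits[i*N:(i+1)*N] for i in range(N)]
    e = 0.0
    for i in range(N):                         # visit each location once
        e += P * (sum(x[i]) - 1) ** 2
    for t in range(N):                         # one location per step
        e += P * (sum(x[i][t] for i in range(N)) - 1) ** 2
    for t in range(N - 1):                     # travel cost between steps
        for i in range(N):
            for j in range(N):
                e += d[i][j] * x[i][t] * x[j][t+1]
    return e

best = min(product((0, 1), repeat=N*N), key=qubo_energy)
route = [next(i for i in range(N) if best[i*N + t]) for t in range(N)]
print(route, qubo_energy(best))
```

With penalties dominating, any minimizing assignment is a valid permutation and its energy equals the route length, which is why the abstract's constrained variants reduce to choosing extra penalty terms.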
7 |
Wireless Sensor Network coverage optimization based on Yin–Yang pigeon-inspired optimization algorithm for Internet of Things
(2022) As an important technology of the Internet of Things (IoT), the wireless sensor network (WSN) suffers
from low coverage caused by uneven node distribution. To address this problem,
a WSN coverage optimization method based on the Yin–Yang pigeon-inspired optimization
algorithm (Yin–YangPIO) is proposed. Firstly, a good point set is introduced into the initialization
phase, which makes the pigeon population more evenly distributed in the solution space; then, the Yin–
Yang-pair optimization algorithm (YYPO) and pigeon-inspired optimization algorithm (PIO) are
combined, and different strategies are used in the map and compass operator and the landmark
operator to improve the optimization ability; later on, opposition-based learning is added to
PIO to expand the search range; finally, several functions are selected to prove the optimization
ability of the Yin–YangPIO. Through three sets of WSN coverage optimization experiments with
different parameters, the effectiveness of the proposed method in WSN coverage optimization
is demonstrated.
Keywords: Wireless sensor network | Pigeon-inspired optimization algorithm | Yin–Yang-pair optimization algorithm | Opposition-based learning | Internet of Things |
English article |
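The opposition-based learning step added to PIO evaluates, for each candidate x in [lo, hi], its opposite point lo + hi - x and keeps the better of the two, widening the search at no extra modeling cost. A minimal sketch, where the sphere-style objective and the bounds are made-up stand-ins for the WSN coverage measure:

```python
import random

# Minimal sketch of the opposition-based learning (OBL) step:
# for each candidate in [LO, HI]^d, also score its opposite
# LO + HI - x and keep whichever is better.
random.seed(7)
LO, HI, DIM = -2.0, 8.0, 3

def sphere(x):                      # toy objective to minimize
    return sum(v * v for v in x)

def obl_step(x):
    opposite = [LO + HI - v for v in x]
    return min(x, opposite, key=sphere)

pop = [[random.uniform(LO, HI) for _ in range(DIM)] for _ in range(20)]
pop = [obl_step(x) for x in pop]    # OBL never worsens a candidate
best = min(pop, key=sphere)
print(round(sphere(best), 3))
```

Because the step is a pointwise min over a pair, it can only help, which is why it composes cleanly with the map-and-compass and landmark operators described above.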
8 |
Deep learning-based transceiver design for multi-user MIMO systems
(2022) Multi-user multiple-input multiple-output (MIMO) is a key technique to increase both the
channel capacity and the number of users that can be served simultaneously. One of the main
challenges related to the deployment of such systems is the complexity of the transceiver
processing. Although the conventional optimization algorithms are able to provide excellent
performance, they generally require considerable computational complexity, which gets in the
way of their practical application in real-time systems. In contrast to existing work, we study
a deep learning (DL)-based transceiver design scheme for a downlink MIMO broadcasting channel (MIMO BC)
system, which consists of a base station (BS) serving multi-users. The objective of this work
is to maximize the sum-rate of all users by jointly optimizing the transmitter and receivers
under the total power constraint, while suppressing interference as much as possible. Due to
the inter-user interference in such a system, the considered problem is nonconvex and NP-hard.
Different from traditional optimization algorithms, we rely on the convolutional neural networks
(CNNs) to optimize the transceivers in an adaptive way. In the proposed scheme, we develop an
unsupervised learning strategy, where a novel loss function is constructed to reduce
the inter-user interference. Simulation results show that the inter-user interference is reduced
effectively by our proposed CNN-based transceiver optimization method.
Keywords: Transceiver design | MIMO BC | Deep learning | Convolutional neural networks |
English article |
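The unsupervised loss used to train the CNN transceiver is essentially the negative sum-rate, so minimizing it trades transmit power against inter-user interference without labeled data. A scalar two-user toy, where the channel gains, noise power, and grid search are assumed stand-ins for the MIMO setting and the network's training loop:

```python
import math

# Toy version of the unsupervised objective: for a 2-user scalar
# broadcast channel, the loss is the negative sum-rate, so the
# optimizer balances useful power against inter-user interference.
h = [1.0, 0.6]        # made-up channel gain of each user
sigma2 = 0.1          # noise power
P = 1.0               # total power budget

def sum_rate(alpha):
    # alpha = fraction of power for user 0; the rest goes to user 1
    p = [alpha * P, (1 - alpha) * P]
    r = 0.0
    for k in range(2):
        interference = h[k] ** 2 * p[1 - k]
        r += math.log2(1 + h[k] ** 2 * p[k] / (sigma2 + interference))
    return r

def loss(alpha):
    return -sum_rate(alpha)     # unsupervised: no labels needed

best_alpha = min((a / 100 for a in range(101)), key=loss)
print(best_alpha, round(sum_rate(best_alpha), 3))
```

At these made-up gains the loss drives all power to the stronger user; the CNN scheme above learns the analogous trade-off over full precoding matrices via gradient descent on the same kind of loss.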
9 |
Reconfiguration of electrical distribution network-based DG and capacitors allocations using artificial ecosystem optimizer: Practical case study
(2021) In this article, a new implementation of the Artificial Ecosystem Optimizer (AEO) technique
is developed for distributed generator (DG) and capacitor allocation considering the Reconfiguration of Power Distribution Systems (RPDS). The AEO is inspired by three energy transfer
mechanisms involving production, consumption, and decomposition in an ecosystem. In the production mechanism, the production operator allows AEO to produce a new individual randomly,
whereas the search space exploration can be improved as illustrated in the consumption mechanism
and exploitation is performed in the decomposition mechanism. A practical case study of the 59-bus Cairo distribution system in Egypt is simulated with different loading percentages. To optimize the performance of this practical network, the AEO algorithm is employed for different scenarios. In addition,
the results obtained by recent optimization techniques which are Jellyfish Search Optimizer (JFS),
Supply Demand Optimizer (SDO), Crow Search Optimizer (CSO), Particle Swarm Optimization
(PSO), Grey Wolf Optimizer (GWO) and Whale Optimization Algorithm (WOA) are compared
with the developed AEO. The simulation results demonstrate the efficacy and superiority of the
AEO compared to the others. It surpasses the other algorithms in terms of the best, mean,
worst, and standard-deviation values. After optimal RPDS and DG placements, the power losses are
decreased by 78.4, 77.84, and 71.4% at low, nominal, and high loading levels, respectively. However, the best
scenario, with its application prospects, is obtained after optimal RPDS, DG, and capacitor allocation,
where the power losses are decreased by 68.8, 85.87, and 89.91% at low, nominal, and high levels,
respectively.
KEYWORDS: Artificial ecosystem optimizer | Distributed generators | Electrical systems | Power losses | Reconfiguration |
English article |
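The three AEO mechanisms named in the abstract (production, consumption, decomposition) can be caricatured on a toy objective. The update rules below are simplified stand-ins, not the published AEO equations, and the two-variable quadratic is a made-up proxy for the power-loss objective:

```python
import random

# Loose sketch of the three mechanisms: production injects a random
# individual biased toward the best, consumption moves individuals
# toward better-ranked ones, decomposition perturbs around the best.
random.seed(3)
LO, HI = -5.0, 5.0

def f(x):                                    # toy "power loss" objective
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def clip(v):
    return max(LO, min(HI, v))

pop = [[random.uniform(LO, HI) for _ in range(2)] for _ in range(12)]
for _ in range(150):
    pop.sort(key=f)                          # pop[0] is the current best
    # production: replace the worst with a randomized individual
    a = random.random()
    pop[-1] = [clip(a * pop[0][d] + (1 - a) * random.uniform(LO, HI))
               for d in range(2)]
    # consumption: each middle individual moves toward a better one
    for i in range(1, len(pop) - 1):
        j = random.randrange(i)              # a better-ranked individual
        pop[i] = [clip(pop[i][d] +
                       random.uniform(0, 1) * (pop[j][d] - pop[i][d]))
                  for d in range(2)]
    # decomposition: keep the best of small perturbations around pop[0]
    pop[0] = min(pop[0],
                 [clip(pop[0][d] + random.gauss(0, 0.1)) for d in range(2)],
                 key=f)
print(round(f(pop[0]), 3))
```

The division of labor mirrors the abstract: production supplies exploration, consumption spreads good information through the population, and decomposition does the fine-grained exploitation.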
10 |
Deep belief network-based hybrid model for multimodal biometric system for futuristic security applications
(2021) Biometrics is the technology to identify humans uniquely based on face, iris, fingerprints, etc. Biometric authentication allows automatic person recognition on the basis of behavioral or physiological characteristics. Biometrics are broadly employed in several commercial as well as official identification systems for automatic access control. This paper introduces a model for multimodal biometric recognition based on a score-level fusion method. The overall procedure of the proposed method involves five steps: pre-processing, feature extraction, recognition scoring using a multi-support vector neural network (Multi-SVNN) for all traits, score-level fusion, and recognition using a deep belief neural network (DBN). The first step is to input the training images into pre-processing. Thus, the pre-processing of three traits, namely iris, ear, and finger vein, is done. Then, feature extraction is performed for each modality. After that, texture features are extracted from the pre-processed images of the ear, iris, and finger vein, and BiComp features are acquired from individual images using a BiComp mask. Then, the recognition score is computed based on the Multi-SVNN classifier to provide a score individually for all three traits, and the three scores are provided to the DBN. The DBN is trained using the chicken earthworm optimization algorithm (CEWA), an integration of chicken swarm optimization (CSO) and the earthworm optimization algorithm (EWA), for optimal authentication of the person. The analysis proves that the developed method acquired a maximal accuracy of 95.36%, a maximal sensitivity of 95.85%, and a specificity of 98.79%.
Keywords: Multi-modal biometric system | Chicken Swarm Optimization | Earthworm Optimization algorithm | Deep Belief Network | Multi-SVNN |
English article |