Deep reinforcement learning based AGVs real-time scheduling with mixed rule for flexible shop floor in Industry 4.0
(2020)
Driven by recent advances in Industry 4.0 and industrial artificial intelligence, Automated Guided Vehicles (AGVs) have been widely used on flexible shop floors for material handling. However, great challenges arising from the high dynamics, complexity, and uncertainty of the shop-floor environment still exist in AGV real-time scheduling. To address these challenges, an adaptive deep reinforcement learning (DRL) based AGV real-time scheduling approach with a mixed rule is proposed for the flexible shop floor to minimize the makespan and delay ratio. First, the AGV real-time scheduling problem is formulated as a Markov Decision Process (MDP) in which the state representation, action representation, reward function, and optimal mixed-rule policy are described in detail. Then a novel deep Q-network (DQN) method is developed to learn the optimal mixed-rule policy, with which suitable dispatching rules and AGVs can be selected to execute the scheduling under various states. Finally, a case study based on a real-world flexible shop floor is presented, and the results validate the feasibility and effectiveness of the proposed approach.
Keywords: Automated guided vehicles | Real-time scheduling | Deep reinforcement learning | Industry 4.0
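The MDP formulation above can be illustrated with a toy sketch: a tabular Q-table stands in for the paper's deep network, and the "mixed rule" action space pairs a dispatching rule with an AGV, as the abstract describes. The rule names, the queue-length state, the reward shaping, and the toy environment are all illustrative assumptions, not the paper's actual design.

```python
import random

RULES = ["FIFO", "EDD", "SPT"]           # assumed dispatching rules
AGVS = [0, 1]                            # two AGVs
ACTIONS = [(r, v) for r in RULES for v in AGVS]  # mixed rule: (rule, AGV)

def toy_step(state, action, rng):
    """State = number of queued transport tasks (0..5). Reward penalises
    queue length (a proxy for makespan) and random delay events (a proxy
    for the delay ratio)."""
    rule, _agv = action
    served = 1 + (rule == "SPT")         # assume SPT clears short jobs faster
    arrivals = rng.randint(0, 2)
    next_state = min(5, max(0, state - served + arrivals))
    delay = 1 if rng.random() < 0.1 * next_state else 0
    reward = -next_state - 5 * delay
    return next_state, reward

def train(episodes=300, alpha=0.1, gamma=0.9, eps=0.2, seed=0):
    """Epsilon-greedy tabular Q-learning over the mixed-rule action space."""
    rng = random.Random(seed)
    Q = {(s, a): 0.0 for s in range(6) for a in ACTIONS}
    for _ in range(episodes):
        s = rng.randint(0, 5)
        for _ in range(20):
            if rng.random() < eps:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda act: Q[(s, act)])
            s2, r = toy_step(s, a, rng)
            best_next = max(Q[(s2, a2)] for a2 in ACTIONS)
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
            s = s2
    return Q

Q = train()
# Greedy policy: for each state, the (rule, AGV) pair with the highest Q-value.
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(6)}
```

In the paper's setting, the Q-table would be replaced by a neural network trained on replayed transitions, but the state-action-reward loop has the same shape.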
Energy-cognizant scheduling for preference-oriented fixed-priority real-time tasks
(2020)
Energy management is one of the crucial design issues when executing real-time applications with stringent timing requirements. Dynamic slowdown of the processor voltage, when accompanied by a processor shutdown method, helps save more energy. Traditionally, energy management has been applied to real-time scheduling algorithms that prioritize tasks based on timing parameters only; recently, however, applications whose tasks have different execution preferences on the same computing unit have gained significant importance in various areas. In this paper, dynamic voltage scaling (DVS) and dynamic power management (DPM) techniques are used for energy management while scheduling preference-oriented fixed-priority periodic real-time tasks. Preference-oriented energy-aware rate-monotonic scheduling (PER) and preference-oriented extended energy-aware rate-monotonic scheduling (PEER) algorithms are proposed that maximize energy savings while fulfilling the preference values of tasks. Extensive simulations show that PER and PEER outperform several related approaches in terms of energy savings.
Keywords: Dynamic voltage scaling | Energy management | Fixed-priority | Preference-oriented | Rate-monotonic | Real-time tasks | Task scheduling
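The DVS idea the abstract builds on can be sketched minimally. This is not the paper's PER/PEER algorithms, only the standard building block they extend: run all tasks at the lowest uniform speed that still passes the rate-monotonic (RM) utilization bound, since dynamic CMOS power falls roughly with the cube of speed. The task set, the energy model, and the function names are illustrative assumptions.

```python
def rm_bound(n):
    """Liu & Layland schedulability bound for n RM-scheduled tasks."""
    return n * (2 ** (1.0 / n) - 1)

def uniform_slowdown(tasks):
    """tasks: list of (wcet, period) pairs measured at full speed.
    Returns the minimal speed (as a fraction of f_max) such that the
    scaled utilization u / speed still meets the RM bound; falls back
    to full speed when even that is insufficient."""
    u = sum(c / p for c, p in tasks)
    return min(1.0, u / rm_bound(len(tasks)))

def dynamic_energy(speed, duration=1.0):
    """Toy model: dynamic power scales as speed**3. DPM-style shutdown
    during idle intervals, which the paper also exploits, is ignored."""
    return speed ** 3 * duration
```

For example, two tasks with total utilization 0.375 fit under the two-task RM bound of 2(2^(1/2) - 1), about 0.83, at roughly 45% of full speed, which in this toy cubic model cuts dynamic energy by about an order of magnitude.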