
Efficient Task Scheduling in Cloud using Double Deep Q-Network


dc.contributor.author Radhika, S
dc.contributor.author Keshari Swain, Sangram
dc.contributor.author Adinarayana, S
dc.contributor.author Ramesh Babu, BSSV
dc.date.accessioned 2024-04-24T15:32:52Z
dc.date.available 2024-04-24T15:32:52Z
dc.date.issued 2024-04-24
dc.identifier.uri https://journal.uob.edu.bh:443/handle/123456789/5604
dc.description.abstract Cloud computing has transformed data management through its scale and flexibility, yet cloud resources are transient and heterogeneous, which makes task scheduling difficult. This paper proposes a Double Deep Q-Network (DDQN) reinforcement learning model for the cloud computing task scheduling problem. DDQN improves on the Deep Q-Network (DQN) by maintaining two distinct neural networks, an online network and a target network. The target network is updated periodically to track the Q-value estimates of the online network, yielding a more stable, less erratic learning process and mitigating the overestimation bias that affects traditional DQN. By iteratively refining its Q-value estimates, DDQN learns effective scheduling policies and provides a robust, efficient framework for resource allocation in cloud environments, making it a valuable tool for managing the complexities of modern cloud infrastructures. en_US
dc.language.iso en en_US
dc.publisher University of Bahrain en_US
dc.subject Cloud computing, Data management, Task scheduling, Double Deep Q-Network (DDQN), Reinforcement learning, Deep Q-Networks (DQN), Target network, Online network en_US
dc.title Efficient Task Scheduling in Cloud using Double Deep Q-Network en_US
dc.identifier.doi http://dx.doi.org/10.12785/ijcds/XXXXXX
dc.identifier.issn 2210-142X
dc.volume 16 en_US
dc.issue 1 en_US
dc.pagestart 1 en_US
dc.pageend 11 en_US
dc.contributor.authorcountry India en_US
dc.contributor.authorcountry India en_US
dc.contributor.authorcountry India en_US
dc.contributor.authorcountry India en_US
dc.contributor.authoraffiliation Department of CSE, Centurion University of Technology and Management & Department of CSE, Raghu Engineering College en_US
dc.contributor.authoraffiliation Department of CSE, Centurion University of Technology and Management en_US
dc.contributor.authoraffiliation Department of CSSE, Andhra University College of Engineering en_US
dc.contributor.authoraffiliation Department of ECE, Raghu Engineering College en_US
dc.source.title International Journal of Computing and Digital Systems en_US
dc.abbreviatedsourcetitle IJCDS en_US
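
The abstract summarizes the double Q-learning update: the online network selects the greedy next action and the periodically synced target network evaluates it. As an illustration only (this is not code from the paper; the function names, array shapes, and the NumPy formulation are assumptions), a minimal sketch of that target computation:

    import numpy as np

    def ddqn_targets(rewards, next_q_online, next_q_target, dones, gamma=0.99):
        # The online network selects the greedy next action for each sample...
        best_actions = np.argmax(next_q_online, axis=1)
        # ...and the target network evaluates that action, which is what
        # mitigates the overestimation bias of vanilla DQN.
        evaluated_q = next_q_target[np.arange(len(best_actions)), best_actions]
        # Bootstrapped return; terminal transitions keep only the reward.
        return rewards + gamma * evaluated_q * (1.0 - dones)

    # The target network is synced to the online network every fixed number of
    # steps, e.g. target_weights = [w.copy() for w in online_weights].

In the scheduling setting, the state would encode pending tasks and machine loads and each action would map a task to a resource; those encodings are specific to the paper and are not reproduced here.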

