A Reinforcement Learning Framework for Utility-Based Scheduling in Resource-Constrained Systems (February 2005)
This paper presents a general methodology for scheduling jobs in soft real-time systems, where the utility of completing each job decreases over time. This scheduling problem is known to be NP-hard, so a heuristic solution is required to operate in real time. We present a utility-based framework for making repeated scheduling decisions based on dynamically observed information about unscheduled jobs and system resources. This framework generalizes the standard scheduling problem to a resource-constrained environment, where resource allocation (RA) decisions (how many CPUs to allocate to each job) must be made concurrently with the scheduling decisions (when to execute each job). We then use discrete-time optimal control theory to formulate the optimization problem of finding the scheduling/RA policy that maximizes the average utility per time step obtained from completed jobs. We propose a Reinforcement Learning (RL) architecture for solving this NP-hard optimal control problem in real time, and our experimental results demonstrate the feasibility and benefits of the proposed approach.
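To make the problem setting concrete, the following is a minimal sketch of the joint scheduling/resource-allocation decision the abstract describes. It is not the paper's method: instead of a learned RL policy, it uses a simple one-step greedy heuristic as a stand-in baseline. The `Job` fields (linear utility decay, fixed CPU demand and duration) and the choice to credit a job's time-decayed utility when it starts are illustrative assumptions, not details from the paper.

```python
from dataclasses import dataclass

@dataclass
class Job:
    initial_utility: float  # utility if completed immediately (assumed)
    decay: float            # utility lost per time step while waiting (assumed linear)
    cpus_needed: int        # CPUs the job occupies while running
    duration: int           # time steps the job takes to finish

def utility(job: Job, wait: int) -> float:
    """Time-decayed utility of a job that has waited `wait` steps, floored at 0."""
    return max(0.0, job.initial_utility - job.decay * wait)

def greedy_schedule(jobs: list[Job], total_cpus: int, horizon: int) -> float:
    """One-step greedy baseline: at each step, start the waiting jobs with the
    highest current utility that still fit in the free CPUs.  A learned RL
    policy would replace this myopic rule with one optimizing long-run
    average utility per time step."""
    waiting = list(enumerate(jobs))        # all jobs assumed to arrive at t = 0
    arrival = {i: 0 for i, _ in waiting}
    running: list[tuple[int, int]] = []    # (finish_time, cpus_held)
    free = total_cpus
    gained = 0.0
    for t in range(horizon):
        # Release CPUs held by jobs that have finished by time t.
        for item in [r for r in running if r[0] <= t]:
            free += item[1]
            running.remove(item)
        # Greedily start the highest-utility waiting jobs that fit.
        waiting.sort(key=lambda p: utility(p[1], t - arrival[p[0]]), reverse=True)
        started = []
        for i, job in waiting:
            if job.cpus_needed <= free:
                gained += utility(job, t - arrival[i])  # credit utility at start (assumption)
                free -= job.cpus_needed
                running.append((t + job.duration, job.cpus_needed))
                started.append((i, job))
        for p in started:
            waiting.remove(p)
    return gained
```

Because the rule is myopic, it can start a large slow-decaying job first and starve fast-decaying ones, which is exactly the kind of trade-off the paper's RL policy is meant to learn.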