Deep reinforcement learning algorithm for wellbore cleaning across drilling operation
Publications: Chapter in book/report/conference proceedings › Conference contribution
Standard
Fourth EAGE Digitalization Conference & Exhibition, Mar 2024, Volume 2024, p. 1-5, 2024.
RIS (suitable for import to EndNote)
TY - GEN
T1 - Deep reinforcement learning algorithm for wellbore cleaning across drilling operation
AU - Keshavarz, Sahar
AU - Elmgerbi, Asad
AU - Thonhauser, Gerhard
PY - 2024/3/25
Y1 - 2024/3/25
N2 - We propose a novel framework for real-time drilling operation planning updates using a deep reinforcement learning algorithm, enabling drilling process reactions to be automated. The framework includes a decision tree algorithm to represent the environment's dynamic changes under the imposed actions, in parallel with a Gaussian process algorithm to quantify the safe operating window in real time. Combining these two algorithms yields a Markov Decision Process (MDP) environment for a decision-making system. We demonstrate the effectiveness of our framework by implementing an off-policy deep reinforcement learning algorithm, using a deep Q-learning network to create experiences and employing synchronous updates on the agent. Given the essence of reinforcement learning, the framework can be efficiently implemented for on-the-spot decision-making, allowing the driller to receive an effective sequence of actions that respects company policies. Our algorithm achieves state-of-the-art performance on the weight-to-slip hole conditioning operation, a wellbore cleaning operation performed after drilling a stand and before connecting the next pipe. The performance evaluation demonstrates its efficiency in real-time operation overhaul, eliminating non-value-added activities. Our framework thus opens the door to automating the process based on the operating parameters obtained in real time.
AB - We propose a novel framework for real-time drilling operation planning updates using a deep reinforcement learning algorithm, enabling drilling process reactions to be automated. The framework includes a decision tree algorithm to represent the environment's dynamic changes under the imposed actions, in parallel with a Gaussian process algorithm to quantify the safe operating window in real time. Combining these two algorithms yields a Markov Decision Process (MDP) environment for a decision-making system. We demonstrate the effectiveness of our framework by implementing an off-policy deep reinforcement learning algorithm, using a deep Q-learning network to create experiences and employing synchronous updates on the agent. Given the essence of reinforcement learning, the framework can be efficiently implemented for on-the-spot decision-making, allowing the driller to receive an effective sequence of actions that respects company policies. Our algorithm achieves state-of-the-art performance on the weight-to-slip hole conditioning operation, a wellbore cleaning operation performed after drilling a stand and before connecting the next pipe. The performance evaluation demonstrates its efficiency in real-time operation overhaul, eliminating non-value-added activities. Our framework thus opens the door to automating the process based on the operating parameters obtained in real time.
UR - https://www.earthdoc.org/content/papers/10.3997/2214-4609.202439018
U2 - 10.3997/2214-4609.202439018
DO - 10.3997/2214-4609.202439018
M3 - Conference contribution
VL - 2024
BT - Fourth EAGE Digitalization Conference & Exhibition, Mar 2024, Volume 2024, p.1 - 5
ER -
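
For readers who want a concrete picture of the setup described in the abstract above, the following is a minimal Python sketch of an off-policy deep Q-learning agent interacting with a toy MDP environment. The environment, its state variables (cuttings concentration and flow rate), the fixed safe-flow band that stands in for the Gaussian-process safe operating window, the network sizes, and all hyperparameters are illustrative assumptions made for this sketch, not the authors' implementation; the decision-tree dynamics model is likewise replaced by a hand-written transition rule.

import random
from collections import deque

import numpy as np
import torch
import torch.nn as nn


class ToyHoleCleaningEnv:
    """Illustrative MDP: state = (cuttings concentration, flow rate); 3 actions
    (decrease / hold / increase flow). Stands in for the decision-tree dynamics
    and Gaussian-process safe window described in the abstract."""

    def __init__(self):
        self.safe_flow = (0.4, 0.8)  # assumed safe operating window on flow rate

    def reset(self):
        self.state = np.array([0.9, 0.5], dtype=np.float32)
        return self.state.copy()

    def step(self, action):
        cuttings, flow = self.state
        flow = float(np.clip(flow + (action - 1) * 0.1, 0.0, 1.0))
        cuttings = float(np.clip(cuttings - 0.2 * flow + 0.02, 0.0, 1.0))
        self.state = np.array([cuttings, flow], dtype=np.float32)
        in_window = self.safe_flow[0] <= flow <= self.safe_flow[1]
        reward = -cuttings - (0.0 if in_window else 1.0)  # clean the hole, stay in the safe window
        done = cuttings < 0.05
        return self.state.copy(), reward, done


q_net = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 3))
target_net = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 3))
target_net.load_state_dict(q_net.state_dict())
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
buffer, gamma, eps, batch_size = deque(maxlen=10_000), 0.99, 0.1, 64

env = ToyHoleCleaningEnv()
for episode in range(200):
    s, done = env.reset(), False
    for _ in range(100):
        # Epsilon-greedy behaviour policy: off-policy experience collection.
        if random.random() < eps:
            a = random.randrange(3)
        else:
            with torch.no_grad():
                a = int(q_net(torch.from_numpy(s)).argmax())
        s2, r, done = env.step(a)
        buffer.append((s, a, r, s2, float(done)))
        s = s2
        if len(buffer) >= batch_size:
            # Sample a replay batch and take one Q-learning step.
            st, at, rt, st2, dn = map(np.array, zip(*random.sample(buffer, batch_size)))
            st, st2 = torch.from_numpy(st), torch.from_numpy(st2)
            q = q_net(st).gather(1, torch.from_numpy(at).long().unsqueeze(1)).squeeze(1)
            with torch.no_grad():
                target = (torch.from_numpy(rt).float()
                          + gamma * target_net(st2).max(1).values * (1 - torch.from_numpy(dn).float()))
            loss = nn.functional.mse_loss(q, target)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        if done:
            break
    if episode % 10 == 0:
        # Periodic copy of the online weights into the target network.
        target_net.load_state_dict(q_net.state_dict())

The periodic copy of the online network into the target network is one common reading of the "synchronous updates on the agent" mentioned in the abstract; the paper itself should be consulted for the actual environment model, reward design, and training scheme.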