PDC cutter wear and temperature modeling

Bridges, Douglas
Since their development, Polycrystalline Diamond Compact (PDC) drill bits have been widely adopted in the drilling industry because they tend to produce a higher rate of penetration (ROP) than roller-cone bits. Although a PDC bit is often more efficient, bit wear still limits PDC bit life. For both geothermal and oil/gas wells, bit wear has been a persistent setback throughout the drilling process. Drilling a well efficiently is the key to profitability, and bit wear is a key factor in drilling efficiency. The research presented here includes single-cutter testing performed at varying operating parameters to analyze their effects on PDC wear rates and on temperature development under the cutter wearflat. Previous temperature modeling is verified against the single-cutter data, which show significant differences once wear surpasses the PDC layer. Based on these findings, a Differential Evolution Algorithm (DEA) optimization is used to calibrate a wear model based on PDC wear rates only. The wear model is applied to two separate bit runs to verify that it can predict real-time wear from operating parameters and rock properties. Using the detailed cutter geometry outlined in this study, wearflat and PDC temperatures are estimated for both bit runs. With the models outlined in this study, optimal operating ranges for weight on bit (WOB) and RPM can be identified to limit PDC bit wear. The ultimate goal of this study is to combine this research with other individual efforts from Oklahoma State University, the University of Oklahoma, and Sandia National Laboratories to develop a real-time drilling model. This research can be applied to any drilling application to increase efficiency by minimizing PDC drill bit wear.
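The calibration step mentioned above can be sketched in miniature. The abstract does not state the study's wear law or parameter bounds, so the power-law wear-rate model `wear_rate`, its coefficients, the bounds, and the synthetic single-cutter data below are all hypothetical placeholders; only the optimization technique, differential evolution, follows the abstract.

```python
import random

# Hypothetical wear-rate law (illustrative only, NOT the study's model):
# wear rate assumed to grow as a power law in weight on bit (WOB) and RPM.
def wear_rate(params, wob, rpm):
    c, a, b = params
    return c * (wob ** a) * (rpm ** b)

# Objective: sum of squared errors against measured (wob, rpm, wear) points.
def sse(params, data):
    return sum((wear_rate(params, w, r) - y) ** 2 for w, r, y in data)

# Minimal differential evolution (DE/rand/1/bin) with box constraints.
def differential_evolution(objective, bounds, data, pop_size=30, gens=200,
                           f=0.7, cr=0.9, seed=1):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [objective(ind, data) for ind in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Pick three distinct partners for the mutant vector.
            ia, ib, ic = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(dim)  # force at least one mutated coordinate
            trial = []
            for j in range(dim):
                if rng.random() < cr or j == jr:
                    v = pop[ia][j] + f * (pop[ib][j] - pop[ic][j])
                else:
                    v = pop[i][j]
                lo, hi = bounds[j]
                trial.append(min(max(v, lo), hi))  # clip to bounds
            tc = objective(trial, data)
            if tc <= cost[i]:  # greedy selection
                pop[i], cost[i] = trial, tc
    best = min(range(pop_size), key=lambda i: cost[i])
    return pop[best], cost[best]

# Synthetic "single-cutter" data from assumed true parameters.
true_params = (0.002, 1.2, 0.8)
data = [(w, r, wear_rate(true_params, w, r))
        for w in (5, 10, 20, 30, 40) for r in (60, 120, 180, 240)]
bounds = [(1e-4, 0.01), (0.5, 2.0), (0.5, 2.0)]  # (c, a, b)

best, best_cost = differential_evolution(sse, bounds, data)
```

In practice the objective would be evaluated against the measured single-cutter wear rates rather than synthetic data, and the calibrated parameters would then drive the real-time wear prediction on the bit runs.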