Description
Growing demands on accelerator performance, together with recent advances in computational sciences, provide a remarkable opportunity to rethink how CERN’s accelerator complex is operated. Moving beyond traditional approaches, greater automation and improved optimisation methods promise measurable benefits, including reduced costs and energy use, shorter setup times, consistent beam quality, and higher overall reliability. Delivering these gains requires, among other things, new algorithms ranging from classical optimisation to machine learning, as well as infrastructure that evolves to meet new requirements. Applications extend beyond beam quality optimisation to equipment-related tasks such as automated setup and recovery, fault prediction and diagnostics, and predictive maintenance. First use cases already support optimisation tasks in daily operation, while further developments are underway. This contribution will review the current status and outline the path toward wider operational use, with emphasis on the Efficient Particle Accelerators (EPA) project as the framework that links these activities into a coherent roadmap from studies to deployment and toward future facilities.