The KAUST Supercomputing Core Lab invites you to join the Distributed Hyperparameter Optimization Workshop on IBEX, a hands-on training session designed to help users efficiently scale machine learning experiments with Ray Tune and PyTorch in IBEX's high-performance computing environment.
This workshop focuses on running large-scale hyperparameter searches across multiple GPUs and compute nodes using Ray Tune integrated with SLURM, enabling systematic exploration of training configurations for deep learning models.
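To give a flavor of the workflow the workshop covers, below is a minimal sketch of a Ray Tune search over a toy PyTorch model, assuming a recent Ray 2.x installation on a machine with at least one GPU; the model, learning-rate range, sample count, and per-trial resource request are illustrative placeholders, not workshop material.

```python
# Minimal Ray Tune + PyTorch sketch (illustrative only, assumes Ray 2.x).
import torch
import torch.nn as nn
from ray import train, tune
from ray.tune.schedulers import ASHAScheduler


def train_model(config):
    # Toy model: a single linear layer fit to random data.
    model = nn.Linear(10, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=config["lr"])
    x, y = torch.randn(64, 10), torch.randn(64, 1)

    for _ in range(10):
        loss = nn.functional.mse_loss(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        # Report the metric that Ray Tune uses to compare trials.
        train.report({"loss": loss.item()})


if __name__ == "__main__":
    tuner = tune.Tuner(
        # Request resources per trial; Ray schedules trials across the cluster.
        tune.with_resources(train_model, {"cpu": 2, "gpu": 1}),
        param_space={"lr": tune.loguniform(1e-4, 1e-1)},
        tune_config=tune.TuneConfig(
            num_samples=16,  # number of sampled hyperparameter configurations
            scheduler=ASHAScheduler(metric="loss", mode="min"),
        ),
    )
    results = tuner.fit()
    print(results.get_best_result(metric="loss", mode="min").config)
```

On IBEX, the same search would typically be launched from a SLURM job that first starts a Ray cluster across the allocated nodes, which is the multi-node setup the workshop walks through in detail.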