Processing large data sets on Shaheen

As data sizes grow, the simulation workflow is increasingly dominated by data motion rather than by the compute-bound PDE solver itself. In recent weeks, we have come across many new users struggling to move large-scale data to and from Shaheen for pre- and post-processing. When the data reaches terabytes per time step, this mode of operation becomes very inefficient. Did you know that you can prepare Python-based scripts on your workstation and run the same script on Shaheen (after minor edits to the file paths) in batch mode to automatically generate pictures and movies? Most post-processing tools, such as ParaView, VisIt and Tecplot, offer this feature.
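As a sketch of what this batch-mode workflow can look like with ParaView, a SLURM job script along the following lines could drive `pvbatch` over a Python trace script recorded on your workstation (ParaView can record one via Tools > Start Trace). The module name, job parameters and the script name `render_frames.py` are illustrative assumptions, not Shaheen-specific settings; check the available modules and your project's job limits before adapting it:

```shell
#!/bin/bash
#SBATCH --job-name=pv-render
#SBATCH --nodes=1
#SBATCH --time=01:00:00
#SBATCH --output=pv-render.%j.out

# Load ParaView on the compute node (the exact module name and
# version vary by system; check with `module avail paraview`).
module load paraview

# Run the same Python script you prepared on your workstation,
# with the file paths edited to point at the data on Shaheen.
# pvbatch executes it offscreen, with no GUI, and writes the
# images/movie frames the script requests (e.g. via SaveScreenshot).
srun -n 1 pvbatch render_frames.py
```

Because the heavy data never leaves the system, only the rendered images or movies need to be transferred back to your workstation.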