Workshops
CARC's Research Facilitation & Applications team offers a number of workshops designed to introduce users to CARC systems, as well as workshops on using specific software and programming languages in a high-performance computing environment.
The workshops below are approximately two hours in length and are offered several times a year on a rotating basis. Stay up to date on upcoming workshops by checking the schedule below and subscribing to the monthly CARC newsletter.
Due to the ongoing COVID-19 pandemic, all workshops are currently being hosted via Zoom.
Recordings of the workshops can be accessed at Video Learning or on CARC's YouTube channel.
If you are interested in attending one of our workshops but you do not already have a CARC account, please submit a help ticket before registering and we can create an account for you.
Upcoming workshops
Research computing essentials
→ Introduction to Research Computing on CARC Discovery Cluster
An overview of CARC's services and high-performance computing clusters, including how to log in, manage and transfer data, load and build software, and run and monitor jobs.
→ CARC OnDemand: Scientific Computing on CARC from a Web Browser
This workshop is offered as a pre-recorded video only. Open OnDemand allows users to access CARC resources from within a web browser. This presentation covers how to manage files, get shell access, run jobs, and start interactive apps with a graphical interface.
→ Introduction to Linux
This workshop is offered as a pre-recorded video only. An introduction to Linux that covers the basic skills needed to be productive in a command-line environment on Unix-like systems such as CARC's high-performance computing clusters. Topics include an overview of the Linux operating system, how to run commands, navigate file systems, create and edit files, use pipes and filters, and develop shell scripts.
→ Installing and Using Software on CARC Systems
An overview of the software stacks available on CARC systems, using the Lmod software module system. Topics include how to find and load modules, manage your shell environment, build your own software, and create your own modules.
→ Running Jobs on CARC Systems
An overview of job submission and monitoring using the Slurm workload manager and job scheduler. Topics include cluster and node information, resource requests, job history and efficiency, job dependencies, and job arrays.
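For context, a minimal Slurm batch script of the kind covered in this workshop looks like the following; the partition name, module, and program are placeholders, not CARC defaults:

```bash
#!/bin/bash
#SBATCH --job-name=example       # name shown in squeue output
#SBATCH --partition=main         # placeholder partition name
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=4
#SBATCH --mem=8G
#SBATCH --time=01:00:00          # walltime limit (HH:MM:SS)

module purge
module load gcc                  # example module; available modules vary

./my_program                     # placeholder executable
```

The script is submitted with `sbatch`, and the monitoring topics above map to commands such as `squeue --me` (pending and running jobs) and `sacct -j <jobid>` (job history and resource usage).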
→ Using Virtual Machines on CARC's Artemis Cloud Platform
An introduction to our on-premises private cloud computing platform. Topics include logging in, creating and modifying a VM, an overview of VM templates, and an introduction to the available Firecracker microVMs. If possible, attendees should first take the workshop on the basics of cloud computing on AWS.
→ Scientific Computing Series
This two-part series introduces the concepts and tools that are essential for high-performance computing through a combination of lectures and hands-on practice.
- Overview of an HPC Cluster and Essential Linux Commands
We start by getting to know CARC's HPC ecosystem and basic Linux shell commands, then give a brief introduction to Python and write our first program with it. We then discuss the need for version control and set up a GitHub repository.
- Parallel Processing with MPI in Python
Part 2 covers the fundamental concepts involved in developing a parallel program using the Message Passing Interface (MPI) in Python via the mpi4py package. Essential concepts are introduced through hands-on tutorials.
Advanced topics in research computing
→ Software Containers with Singularity
An overview of software containers and using Singularity to create and run containers for high-performance computing tasks.
→ Running Deep Learning Applications on HPC Systems
This workshop is an advanced course on Python with specific emphasis on running deep learning applications on HPC systems. We cover how to submit deep learning jobs to the HPC cluster using Slurm job scripts and how to use Singularity containers to streamline the execution of your deep learning code.
→ Introduction to OpenMP GPU Offloading
A general overview of the OpenMP programming model. We will cover the basics of using OpenMP directives to offload compute work to NVIDIA GPUs. We will also go over managing data transfers, lifetimes, and reductions. This workshop is meant for USC CARC users who are already familiar with the basic ideas of GPU programming and want to learn about the core GPU offloading capabilities of OpenMP.
→ Introduction to Code Performance Profiling
Learn how to use application performance measurement tools to optimize code and improve application performance on HPC systems. We will use the Intel oneAPI HPC Toolkit to diagnose memory, CPU usage, and other application-level issues. This workshop takes a hands-on approach to analyzing application performance issues, working through a sample hybrid MPI+OpenMP code with MPI Performance Snapshot, Intel Trace Analyzer and Collector, and Intel VTune Profiler.
→ MPI Programming with Python
This workshop will cover basic Message Passing Interface (MPI) concepts and some mpi4py commands. We will go over coding examples in the hands-on portion.
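mpi4py itself requires an MPI library to be installed, so as a self-contained sketch of the point-to-point send/receive pattern this workshop teaches, here is the same idea using only the standard-library `multiprocessing` module; in mpi4py the corresponding calls would be `comm.send` and `comm.recv` on `MPI.COMM_WORLD`:

```python
from multiprocessing import Process, Pipe

def worker(conn, rank):
    # Each "rank" receives a message, transforms it, and sends a reply,
    # mirroring MPI's comm.recv()/comm.send() point-to-point pattern.
    data = conn.recv()
    conn.send({"rank": rank, "result": data * 2})
    conn.close()

if __name__ == "__main__":
    parent, child = Pipe()
    p = Process(target=worker, args=(child, 1))
    p.start()
    parent.send(21)          # analogous to comm.send(21, dest=1)
    reply = parent.recv()    # analogous to comm.recv(source=1)
    p.join()
    print(reply["result"])   # prints 42
```

The real MPI version runs as separate processes launched by `mpirun`, with each process branching on its rank rather than spawning workers itself.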
→ Building an HPC Cluster on the Cloud
In this workshop you will learn about the different components of an HPC cluster (login nodes, the scheduler, storage options) and how to create them using AWS ParallelCluster. Attendees spin up their own individual HPC cluster on AWS and measure its bandwidth and latency.
Programming
→ Introduction to Python
An introduction to the Python programming language. This workshop covers basic Python syntax, installing and importing packages, visualizing data, and writing scripts.
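To give a sense of the starting level, a short script in the spirit of this workshop, touching imports, functions, and basic syntax:

```python
import math  # importing a module from the standard library

def circle_area(radius):
    """Return the area of a circle with the given radius."""
    return math.pi * radius ** 2

radii = [1, 2, 3]
areas = [round(circle_area(r), 2) for r in radii]  # a list comprehension
print(areas)  # [3.14, 12.57, 28.27]
```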
→ Optimizing Python for HPC
Intermediate-to-advanced topics for improving Python performance in an HPC cluster environment. This workshop covers debugging, profiling, and parallel programming.
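As a taste of the profiling topic, a stdlib-only timing comparison with `timeit`, contrasting a pure-Python loop with the built-in `sum`, whose loop runs in C:

```python
import timeit

def loop_sum(n):
    """Sum 0..n-1 with an explicit Python-level loop."""
    total = 0
    for i in range(n):
        total += i
    return total

n = 100_000
t_loop = timeit.timeit(lambda: loop_sum(n), number=20)
t_builtin = timeit.timeit(lambda: sum(range(n)), number=20)
# The built-in version is usually several times faster at this size
print(f"loop: {t_loop:.4f}s  sum(): {t_builtin:.4f}s")
```

The same measure-before-optimizing habit scales up to `cProfile` for whole programs and to line- and parallel-level profilers on the cluster.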
→ Introduction to R
An introduction to the R programming environment and language for statistical computing and graphics. Topics include base R and packages, objects and functions, importing and exporting data, summarizing data, visualizing data, modeling data, control flow, iteration, and scripting.
→ R Programming for HPC
An intermediate-to-advanced workshop on HPC methods in R programming. Topics include profiling and benchmarking, vectorizing code, memory use, data I/O, and parallel programming. This workshop requires basic proficiency in R programming.
→ Introduction to Julia
An introduction to the Julia programming language for scientific and technical computing. Topics include base Julia and packages, data types and structures, functions, control flow, iteration, and scripting.
→ Julia Programming for HPC
An intermediate-to-advanced workshop on HPC methods in Julia programming. Topics include profiling and benchmarking, memory use, data I/O, and parallel programming. This workshop requires basic proficiency in Julia programming.
Research applications
→ Introduction to Cloud Computing on AWS
The goal of this workshop is to introduce the concept of cloud computing, focusing on the main elements of the HPC workflow. The practice materials are based on AWS products, but similar concepts apply to other major cloud service providers.
Basic concepts such as regions, zones, VPCs, and subnets are explained. Different machine types and storage options are briefly described, along with possible use cases for each.
→ Using Workflows for Job Automation
This workshop uses the workflow management system (WMS) Makeflow to automate job scheduling. We will define a sequence of tasks that automatically manages job dependencies, resubmits failed jobs, and deletes unneeded intermediate files. Other tools, such as Snakemake and Pegasus, will also be discussed.
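As a sketch of what a Makeflow specification looks like, here is a hypothetical two-rule pipeline (all file and script names are made up); each rule lists its outputs, a colon, its inputs, and then an indented command, in a Make-like syntax:

```make
# Makeflow runs a rule once its inputs exist, tracks the dependency
# chain between rules, and can resubmit rules that fail.
filtered.dat: raw.dat filter.py
	python filter.py raw.dat filtered.dat

summary.txt: filtered.dat summarize.py
	python summarize.py filtered.dat summary.txt
```

A specification like this can run locally with `makeflow pipeline.mf` or be dispatched to a cluster with a batch-system flag such as `makeflow -T slurm pipeline.mf`.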
→ Managing HPC Workflows with Pegasus WMS
Learn how to automate your data workflows to reduce time spent adjusting scripts, transferring data, and performing repetitive tasks. We will guide you through running workflows on our Discovery cluster and give you hands-on experience using Pegasus WMS for a genomics application.
→ Building Neural Networks for Deep Learning Applications
This workshop is an advanced course. We will cover the basics of artificial neural networks and deep learning pipelines. The hands-on session will teach you how to use Conda to install the popular PyTorch deep learning package and walk you through writing PyTorch code to build deep neural networks and solve a real-world classification problem.
→ Data Analysis and Visualization Using pandas and Matplotlib
This workshop is an intermediate course on Python with specific emphasis on data analysis and visualization. It covers pandas and Matplotlib, popular libraries for data processing, manipulation, and visualization.
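To give a flavor of the material, a minimal pandas sketch (the DataFrame below is made-up example data; in the workshop, Matplotlib enters through calls such as `DataFrame.plot`):

```python
import pandas as pd

# Hypothetical measurements; in practice these would come from pd.read_csv()
df = pd.DataFrame({
    "sample": ["a", "a", "b", "b"],
    "value":  [1.0, 3.0, 2.0, 4.0],
})

# Group rows by sample and compute the mean value per group
means = df.groupby("sample")["value"].mean()
print(means)  # a -> 2.0, b -> 3.0
```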
→ Computational Biology with CARC OnDemand
An introductory workshop with an overview of CARC's life sciences computing resources and tools. This workshop utilizes CARC's OnDemand platform to prepare and submit jobs. Topics focus on demonstrations of common use cases and simple troubleshooting without using the command line.
Semester-long series
→ Density Functional Theory Methods Using Quantum Espresso
This workshop series will benefit researchers who are interested in the application of theoretical methods and techniques for the study of the physics and chemistry of the solid state. These hands-on oriented workshops are targeted towards undergraduate, graduate, and post-doctoral students who wish to use Density Functional Theory (DFT) methods in their research. The aim is to teach the basics of ab initio atomistic materials simulation using the Quantum Espresso (QE) plane-wave pseudopotential software suite. The workshops will consist of lectures, demonstrations, and practical hands-on sessions using the Discovery HPC cluster.
→ CP2K: Atomistic Simulation
CP2K is a suite of modules for electronic structure and molecular dynamics simulations, particularly well suited to complex condensed phase systems and materials. This workshop introduces researchers and students in the field of molecular simulations to the most relevant computational tools implemented in the CP2K program package. It focuses on the methodologies available in CP2K and encourages modular, flexible, and problem-oriented thinking in using them. Standard methods as well as some more advanced features will be introduced through overviews of the background theory and through application examples, always tied to the specific implementation in this code. Recurring topics include the scaling of algorithms, the combination of different levels of theory and of sampling, and tools and strategies for analyzing results.
Topics covered include:
- Kohn-Sham density functional methods using the Gaussian and Plane Wave (GPW) and the Gaussian Augmented Plane Wave (GAPW) methods
- Ab initio molecular dynamics and enhanced sampling methods
- Advanced electronic structure methods and electronic properties
USC Libraries Data and Visualization Workshops
The USC Library for International and Public Affairs offers a variety of workshops, events, and other outreach programming focusing on data, visualization, tools, software, government information, and collections.
See their workshop schedule, descriptions, and registration here.
Research Computing
3434 South Grand Avenue
3rd Floor (CAL Building)
Los Angeles, CA 90089
carc-support@usc.edu
Mailing List
Sign up to receive information about upcoming system upgrades, events, and announcements.