New pages
From SHARCNETHelp
- 11:24, 7 January 2026 Webinar 2026 Running Gaussian16 and NBO7 effectively on Nibi and Fir: Performance Issues [559 bytes] Jemmyhu (Created page with "In this talk, we will review approaches for running Gaussian and NBO7 jobs on the new Alliance clusters, Fir and Nibi. To better understand performance on these systems, we co...")
- 14:52, 6 January 2026 Webinar 2026 Too Big to Train 2: PyTorch's Upgraded Interface for Fully Sharded Data Parallel [507 bytes] Syam (Created page with "In our last talk on Fully Sharded Data Parallel (FSDP), we offered insight into training large models using FSDP and strategies for customizing model training with FSDP for pe...")
- 13:43, 5 January 2026 Webinar 2026 Floating-point Numbers Aren't Mathematical Real Numbers [625 bytes] Syam (Created page with "When writing computer code, one might use floating-point values as if they were mathematical real numbers. While floating-point numbers are inexact and have a...")
- 11:34, 20 November 2025 Webinar 2025 Illuminating the Black Box: Understanding AI Models with Integrated Gradients [905 bytes] Syam (Created page with "Modern AI models, especially deep neural networks, have achieved remarkable success across vision, language, and decision-making tasks — but their inner workings often remai...")
- 14:28, 17 November 2025 Webinar 2025 Serial Farms: Package options and when to switch to farming [835 bytes] Syam (Created page with "TBA")
- 09:55, 3 November 2025 Webinar 2025 High-Performance Data Science with Modern C++: Ranx [707 bytes] Syam (Created page with "This is the second part in a series of talks about using modern C++ for high-performance data science. In the first talk of the series (https://youtu.be/YPQUIkSIFhw), we cover...")