<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>HPC Posts Archive | Puget Systems</title>
	<atom:link href="https://www.pugetsystems.com/all-hpc/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.pugetsystems.com/all-hpc/</link>
	<description>Workstations for creators.</description>
	<lastBuildDate>Tue, 17 Mar 2026 22:55:38 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>HOW TO: Install and Configure Puget Systems Docker App Packs</title>
		<link>https://www.pugetsystems.com/labs/hpc/how-to-install-and-configure-puget-systems-docker-app-packs/</link>
		<comments>https://www.pugetsystems.com/labs/hpc/how-to-install-and-configure-puget-systems-docker-app-packs/#respond</comments>
		<dc:creator><![CDATA[Dustin Moore]]></dc:creator>
		<pubDate>Tue, 17 Mar 2026 22:55:26 +0000</pubDate>
		<guid isPermaLink="false">https://www.pugetsystems.com/?post_type=hpc_post&#038;p=44485</guid>
		<description><![CDATA[<p>This guide will walk you step-by-step through installing and configuring our tailored AI environments on your Puget Systems workstation or server.</p>
<p>The post <a href="https://www.pugetsystems.com/labs/hpc/how-to-install-and-configure-puget-systems-docker-app-packs/">HOW TO: Install and Configure Puget Systems Docker App Packs</a> appeared first on <a href="https://www.pugetsystems.com">Puget Systems</a>.</p>
]]></description>
		<wfw:commentRss>https://www.pugetsystems.com/labs/hpc/how-to-install-and-configure-puget-systems-docker-app-packs/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
	</item>
	<item>
		<title>From Unboxing to Inference: Introducing the Puget Systems Docker App Packs</title>
		<link>https://www.pugetsystems.com/labs/hpc/introducing-the-puget-systems-docker-app-packs/</link>
		<comments>https://www.pugetsystems.com/labs/hpc/introducing-the-puget-systems-docker-app-packs/#respond</comments>
		<dc:creator><![CDATA[Dustin Moore]]></dc:creator>
		<pubDate>Tue, 17 Mar 2026 22:55:21 +0000</pubDate>
		<guid isPermaLink="false">https://www.pugetsystems.com/?post_type=hpc_post&#038;p=44477</guid>
		<description><![CDATA[<p>We look at AI the way our customers do, which is why we built the Puget Systems Docker App Packs: to help you get up and running with AI inference fast!</p>
<p>The post <a href="https://www.pugetsystems.com/labs/hpc/introducing-the-puget-systems-docker-app-packs/">From Unboxing to Inference: Introducing the Puget Systems Docker App Packs</a> appeared first on <a href="https://www.pugetsystems.com">Puget Systems</a>.</p>
]]></description>
		<wfw:commentRss>https://www.pugetsystems.com/labs/hpc/introducing-the-puget-systems-docker-app-packs/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
	</item>
	<item>
		<title>Standing Up AI Development Quickly for Supercomputing 2025</title>
		<link>https://www.pugetsystems.com/labs/hpc/standing-up-ai-development-quickly-for-supercomputing-2025/</link>
		<comments>https://www.pugetsystems.com/labs/hpc/standing-up-ai-development-quickly-for-supercomputing-2025/#respond</comments>
		<dc:creator><![CDATA[Dustin Moore]]></dc:creator>
		<pubDate>Fri, 12 Dec 2025 22:02:39 +0000</pubDate>
		<guid isPermaLink="false">https://www.pugetsystems.com/?post_type=hpc_post&#038;p=42432</guid>
		<description><![CDATA[<p>How I used "Vibe Coding" and 25 years of experience to tame a liquid-cooled supercomputer in two weeks.</p>
<p>The post <a href="https://www.pugetsystems.com/labs/hpc/standing-up-ai-development-quickly-for-supercomputing-2025/">Standing Up AI Development Quickly for Supercomputing 2025</a> appeared first on <a href="https://www.pugetsystems.com">Puget Systems</a>.</p>
]]></description>
		<wfw:commentRss>https://www.pugetsystems.com/labs/hpc/standing-up-ai-development-quickly-for-supercomputing-2025/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
	</item>
	<item>
		<title>Exploring Hybrid CPU/GPU LLM Inference</title>
		<link>https://www.pugetsystems.com/labs/hpc/exploring-hybrid-cpu-gpu-llm-inference/</link>
		<comments>https://www.pugetsystems.com/labs/hpc/exploring-hybrid-cpu-gpu-llm-inference/#respond</comments>
		<dc:creator><![CDATA[Jon Allman]]></dc:creator>
		<pubDate>Thu, 20 Mar 2025 20:41:02 +0000</pubDate>
		<guid isPermaLink="false">https://www.pugetsystems.com/?post_type=hpc_post&#038;p=35198</guid>
		<description><![CDATA[<p>A brief look into using a hybrid GPU/VRAM + CPU/RAM approach to LLM inference with the KTransformers inference library.</p>
<p>The post <a href="https://www.pugetsystems.com/labs/hpc/exploring-hybrid-cpu-gpu-llm-inference/">Exploring Hybrid CPU/GPU LLM Inference</a> appeared first on <a href="https://www.pugetsystems.com">Puget Systems</a>.</p>
]]></description>
		<wfw:commentRss>https://www.pugetsystems.com/labs/hpc/exploring-hybrid-cpu-gpu-llm-inference/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
	</item>
	<item>
		<title>What&#8217;s the deal with NPUs?</title>
		<link>https://www.pugetsystems.com/labs/hpc/whats-the-deal-with-npus/</link>
		<comments>https://www.pugetsystems.com/labs/hpc/whats-the-deal-with-npus/#respond</comments>
		<dc:creator><![CDATA[Jon Allman]]></dc:creator>
		<pubDate>Fri, 25 Oct 2024 19:55:03 +0000</pubDate>
		<guid isPermaLink="false">https://www.pugetsystems.com/?post_type=hpc_post&#038;p=31072</guid>
		<description><![CDATA[<p>An introduction to NPU hardware and its growing presence outside of mobile computing devices.</p>
<p>The post <a href="https://www.pugetsystems.com/labs/hpc/whats-the-deal-with-npus/">What&#8217;s the deal with NPUs?</a> appeared first on <a href="https://www.pugetsystems.com">Puget Systems</a>.</p>
]]></description>
		<wfw:commentRss>https://www.pugetsystems.com/labs/hpc/whats-the-deal-with-npus/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
	</item>
	<item>
		<title>Local alternatives to Cloud AI services</title>
		<link>https://www.pugetsystems.com/labs/hpc/local-alternatives-to-cloud-ai-services/</link>
		<comments>https://www.pugetsystems.com/labs/hpc/local-alternatives-to-cloud-ai-services/#respond</comments>
		<dc:creator><![CDATA[Jon Allman]]></dc:creator>
		<pubDate>Thu, 11 Apr 2024 20:07:33 +0000</pubDate>
		<guid isPermaLink="false">https://www.pugetsystems.com/?post_type=hpc_post&#038;p=26768</guid>
		<description><![CDATA[<p>Presenting local AI-powered software options for tasks such as image &#038; text generation, automatic speech recognition, and frame interpolation.</p>
<p>The post <a href="https://www.pugetsystems.com/labs/hpc/local-alternatives-to-cloud-ai-services/">Local alternatives to Cloud AI services</a> appeared first on <a href="https://www.pugetsystems.com">Puget Systems</a>.</p>
]]></description>
		<wfw:commentRss>https://www.pugetsystems.com/labs/hpc/local-alternatives-to-cloud-ai-services/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
	</item>
	<item>
		<title>AMD Zen4 Threadripper PRO vs Intel Xeon-w9 For Science and Engineering</title>
		<link>https://www.pugetsystems.com/labs/hpc/amd-zen4-threadripper-pro-vs-intel-xeon-w9-for-science-and-engineering/</link>
		<comments>https://www.pugetsystems.com/labs/hpc/amd-zen4-threadripper-pro-vs-intel-xeon-w9-for-science-and-engineering/#respond</comments>
		<dc:creator><![CDATA[Dr. Donald Kinghorn]]></dc:creator>
		<pubDate>Thu, 07 Mar 2024 19:56:57 +0000</pubDate>
		<guid isPermaLink="false">https://www.pugetsystems.com/?post_type=hpc_post&#038;p=25940</guid>
		<description><![CDATA[<p>The performance improvement with the new Zen4 TrPRO over the Zen3 TrPRO is very impressive!<br />
My first recommendation for a Scientific and Engineering workstation CPU would now be the AMD Zen4 architecture as either Zen4 Threadripper PRO or Zen4 EPYC for multi-socket systems.</p>
<p>The post <a href="https://www.pugetsystems.com/labs/hpc/amd-zen4-threadripper-pro-vs-intel-xeon-w9-for-science-and-engineering/">AMD Zen4 Threadripper PRO vs Intel Xeon-w9 For Science and Engineering</a> appeared first on <a href="https://www.pugetsystems.com">Puget Systems</a>.</p>
]]></description>
		<wfw:commentRss>https://www.pugetsystems.com/labs/hpc/amd-zen4-threadripper-pro-vs-intel-xeon-w9-for-science-and-engineering/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
	</item>
	<item>
		<title>Benchmarking with TensorRT-LLM</title>
		<link>https://www.pugetsystems.com/labs/hpc/benchmarking-with-tensorrt-llm/</link>
		<comments>https://www.pugetsystems.com/labs/hpc/benchmarking-with-tensorrt-llm/#respond</comments>
		<dc:creator><![CDATA[Jon Allman]]></dc:creator>
		<pubDate>Fri, 16 Feb 2024 18:06:51 +0000</pubDate>
		<guid isPermaLink="false">https://www.pugetsystems.com/?post_type=hpc_post&#038;p=23187</guid>
		<description><![CDATA[<p>Evaluating the speed of GeForce RTX 40-Series GPUs using NVIDIA's TensorRT-LLM tool for benchmarking GPU inference performance.</p>
<p>The post <a href="https://www.pugetsystems.com/labs/hpc/benchmarking-with-tensorrt-llm/">Benchmarking with TensorRT-LLM</a> appeared first on <a href="https://www.pugetsystems.com">Puget Systems</a>.</p>
]]></description>
		<wfw:commentRss>https://www.pugetsystems.com/labs/hpc/benchmarking-with-tensorrt-llm/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
	</item>
	<item>
		<title>Experiences with Multi-GPU Stable Diffusion Training</title>
		<link>https://www.pugetsystems.com/labs/hpc/multi-gpu-sd-training/</link>
		<comments>https://www.pugetsystems.com/labs/hpc/multi-gpu-sd-training/#respond</comments>
		<dc:creator><![CDATA[Jon Allman]]></dc:creator>
		<pubDate>Mon, 29 Jan 2024 23:22:19 +0000</pubDate>
		<guid isPermaLink="false">https://www.pugetsystems.com/?post_type=hpc_post&#038;p=22714</guid>
		<description><![CDATA[<p>Results and thoughts from testing a variety of Stable Diffusion training methods using multiple GPUs.</p>
<p>The post <a href="https://www.pugetsystems.com/labs/hpc/multi-gpu-sd-training/">Experiences with Multi-GPU Stable Diffusion Training</a> appeared first on <a href="https://www.pugetsystems.com">Puget Systems</a>.</p>
]]></description>
		<wfw:commentRss>https://www.pugetsystems.com/labs/hpc/multi-gpu-sd-training/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
	</item>
	<item>
		<title>LLM Server Setup Part 2 &#8212; Container Tools</title>
		<link>https://www.pugetsystems.com/labs/hpc/llm-server-setup-part-2-container-tools/</link>
		<comments>https://www.pugetsystems.com/labs/hpc/llm-server-setup-part-2-container-tools/#respond</comments>
		<dc:creator><![CDATA[Dr. Donald Kinghorn]]></dc:creator>
		<pubDate>Mon, 20 Nov 2023 22:15:10 +0000</pubDate>
		<guid isPermaLink="false">https://www.pugetsystems.com/?post_type=hpc_post&#038;p=21032</guid>
		<description><![CDATA[<p>This post is Part 2 in a series on how to configure a system for LLM deployment and development use. Part 2 covers installing and configuring container tools: Docker and NVIDIA Enroot.</p>
<p>The post <a href="https://www.pugetsystems.com/labs/hpc/llm-server-setup-part-2-container-tools/">LLM Server Setup Part 2 &#8212; Container Tools</a> appeared first on <a href="https://www.pugetsystems.com">Puget Systems</a>.</p>
]]></description>
		<wfw:commentRss>https://www.pugetsystems.com/labs/hpc/llm-server-setup-part-2-container-tools/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
	</item>
	</channel>
</rss>
