<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Pytorch-Mps on ICE-ICE-BEAR-BLOG</title><link>https://ice-ice-bear.github.io/tags/pytorch-mps/</link><description>Recent content in Pytorch-Mps on ICE-ICE-BEAR-BLOG</description><generator>Hugo -- gohugo.io</generator><language>en</language><lastBuildDate>Wed, 08 Apr 2026 00:00:00 +0900</lastBuildDate><atom:link href="https://ice-ice-bear.github.io/tags/pytorch-mps/index.xml" rel="self" type="application/rss+xml"/><item><title>Running SAM 2.1 on Mac — Apple Silicon GPU Acceleration and Meta SAM 3 Comparison</title><link>https://ice-ice-bear.github.io/posts/2026-04-08-sam2-mac/</link><pubDate>Wed, 08 Apr 2026 00:00:00 +0900</pubDate><guid>https://ice-ice-bear.github.io/posts/2026-04-08-sam2-mac/</guid><description>&lt;h2 id="overview"&gt;Overview
&lt;/h2&gt;&lt;p&gt;Meta&amp;rsquo;s Segment Anything Model (SAM) changed the game for image segmentation. SAM 2.1 can be run locally on your own machine, while the latest SAM 3 is available through Meta&amp;rsquo;s online playground. In this post, I run SAM 2.1 on an Apple Silicon Mac with MPS GPU acceleration and compare it with the SAM 3 online demo.&lt;/p&gt;
&lt;h2 id="sam-21-local-vs-sam-3-online--architecture-comparison"&gt;SAM 2.1 Local vs SAM 3 Online — Architecture Comparison
&lt;/h2&gt;&lt;pre class="mermaid" style="visibility:hidden"&gt;flowchart LR
 subgraph Local["SAM 2.1 Local"]
 A["User Input &amp;lt;br/&amp;gt; Points/Boxes"] --&gt; B["SAM 2.1 Tiny &amp;lt;br/&amp;gt; 74.5 MB Model"]
 B --&gt; C["PyTorch MPS &amp;lt;br/&amp;gt; Apple Silicon GPU"]
 C --&gt; D["Gradio Web UI &amp;lt;br/&amp;gt; localhost:7860"]
 end

 subgraph Cloud["SAM 3 Online"]
 E["User Input &amp;lt;br/&amp;gt; Text/Click"] --&gt; F["SAM 3 &amp;lt;br/&amp;gt; Meta Server"]
 F --&gt; G["Cloud GPU &amp;lt;br/&amp;gt; Inference"]
 G --&gt; H["Web Browser &amp;lt;br/&amp;gt; aidemos.meta.com"]
 end

 style Local fill:#e8f5e9,stroke:#2e7d32
 style Cloud fill:#e3f2fd,stroke:#1565c0&lt;/pre&gt;&lt;h2 id="sam-21-on-apple-silicon-mac"&gt;SAM 2.1 on Apple Silicon Mac
&lt;/h2&gt;&lt;p&gt;The &lt;a class="link" href="https://github.com/ice-ice-bear/sam2-mac-test" target="_blank" rel="noopener"
 &gt;ice-ice-bear/sam2-mac-test&lt;/a&gt; repository provides a ready-to-run SAM 2.1 setup for Apple Silicon Macs.&lt;/p&gt;
&lt;h3 id="key-features"&gt;Key Features
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;MPS GPU Acceleration&lt;/strong&gt;: Uses PyTorch&amp;rsquo;s Metal Performance Shaders backend to run inference on M1/M2/M3/M4 GPUs&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Multi-point Segmentation&lt;/strong&gt;: Place include/exclude points for fine-grained segmentation with undo/clear support&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Segment Everything Mode&lt;/strong&gt;: Segment all objects in an image at once&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Gradio Web UI&lt;/strong&gt;: Browser-based interface accessible right away&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;SAM 2.1 Tiny Model&lt;/strong&gt;: Lightweight 74.5 MB model, auto-downloaded on first run&lt;/li&gt;
&lt;/ul&gt;
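&lt;p&gt;The MPS selection boils down to a small piece of device-picking logic. A minimal sketch (the helper name &lt;code&gt;pick_device&lt;/code&gt; is mine, not from the repo; in practice the availability flags come from &lt;code&gt;torch.backends.mps.is_available()&lt;/code&gt; and &lt;code&gt;torch.cuda.is_available()&lt;/code&gt;):&lt;/p&gt;

```python
# Sketch of MPS-aware device selection (hypothetical helper; the repo
# may structure this differently).
def pick_device(mps_available: bool, cuda_available: bool = False) -> str:
    """Return the best PyTorch device string for inference."""
    if mps_available:
        return "mps"   # Apple Silicon GPU via Metal Performance Shaders
    if cuda_available:
        return "cuda"  # NVIDIA GPU (not applicable on a Mac)
    return "cpu"       # fallback

if __name__ == "__main__":
    try:
        import torch
        device = pick_device(torch.backends.mps.is_available(),
                             torch.cuda.is_available())
    except ImportError:
        device = pick_device(False)
    print(device)
```

&lt;p&gt;On any M1/M2/M3/M4 Mac with a current PyTorch build this resolves to &lt;code&gt;mps&lt;/code&gt;, so the model and inputs can simply be moved to that device.&lt;/p&gt;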
&lt;h3 id="quick-start"&gt;Quick Start
&lt;/h3&gt;&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bash" data-lang="bash"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;git clone https://github.com/ice-ice-bear/sam2-mac-test.git
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nb"&gt;cd&lt;/span&gt; sam2-mac-test
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;uv sync
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;uv run python app.py
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Open &lt;code&gt;http://127.0.0.1:7860&lt;/code&gt; in your browser to access the Gradio UI.&lt;/p&gt;
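&lt;p&gt;If you would rather skip the UI, the same point-prompt flow can be sketched directly against the Ultralytics SAM API. This is a sketch under assumptions: the &lt;code&gt;sam2.1_t.pt&lt;/code&gt; weights name and the &lt;code&gt;points&lt;/code&gt;/&lt;code&gt;labels&lt;/code&gt; keyword arguments come from the Ultralytics docs, and the helper names are mine, not from the repo. Include/exclude clicks map to labels 1 and 0 respectively:&lt;/p&gt;

```python
# Point-prompt segmentation sketch against the Ultralytics SAM API
# (assumed: "sam2.1_t.pt" weights name, points/labels keywords).
def to_prompt(include_pts, exclude_pts):
    """Convert include/exclude clicks to the points/labels format used
    by SAM-style predictors (label 1 = include, label 0 = exclude)."""
    points = list(include_pts) + list(exclude_pts)
    labels = [1] * len(include_pts) + [0] * len(exclude_pts)
    return points, labels

def segment(image_path, include_pts, exclude_pts=()):
    from ultralytics import SAM   # imported lazily; installed by `uv sync`
    model = SAM("sam2.1_t.pt")    # Tiny weights, auto-downloaded on first use
    points, labels = to_prompt(include_pts, exclude_pts)
    return model(image_path, points=points, labels=labels, device="mps")

if __name__ == "__main__":
    results = segment("photo.jpg", include_pts=[[320, 240]])
    print(results)
```

&lt;p&gt;The undo/clear support in the web UI amounts to re-running this call with one click removed from, or appended to, the prompt lists.&lt;/p&gt;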
&lt;h3 id="performance"&gt;Performance
&lt;/h3&gt;&lt;p&gt;Benchmarks on an M1 MacBook:&lt;/p&gt;
&lt;table&gt;
 &lt;thead&gt;
 &lt;tr&gt;
 &lt;th&gt;Task&lt;/th&gt;
 &lt;th&gt;Time&lt;/th&gt;
 &lt;/tr&gt;
 &lt;/thead&gt;
 &lt;tbody&gt;
 &lt;tr&gt;
 &lt;td&gt;Single point segmentation&lt;/td&gt;
 &lt;td&gt;~1.6s&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Multi-point update&lt;/td&gt;
 &lt;td&gt;~1.5s per update&lt;/td&gt;
 &lt;/tr&gt;
 &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;The Tiny model keeps memory usage low, and MPS acceleration provides significant speedup over CPU-only inference.&lt;/p&gt;
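&lt;p&gt;Timings like these can be reproduced with a small wall-clock harness (hypothetical helper, not from the repo). One caveat worth baking in: the first MPS call pays one-time model-load and kernel-compilation costs, so warm-up runs should be excluded from the average:&lt;/p&gt;

```python
import time

def benchmark(fn, warmup=1, runs=5):
    """Average wall-clock seconds per call of fn(), after warm-up runs.
    Warm-up matters on MPS: the first call includes one-time kernel
    compilation and model-load overhead."""
    for _ in range(warmup):
        fn()
    start = time.perf_counter()
    for _ in range(runs):
        fn()
    return (time.perf_counter() - start) / runs

if __name__ == "__main__":
    # Replace the lambda with a real segmentation call to measure it.
    avg = benchmark(lambda: sum(range(100_000)))
    print(f"~{avg:.4f}s per call")
```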
&lt;h3 id="tech-stack"&gt;Tech Stack
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;SAM 2.1&lt;/strong&gt;: Via the Ultralytics library&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;PyTorch MPS&lt;/strong&gt;: Apple Silicon GPU backend&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Gradio&lt;/strong&gt;: Web UI framework&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;uv&lt;/strong&gt;: Package manager&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="meta-sam-3-online-playground"&gt;Meta SAM 3 Online Playground
&lt;/h2&gt;&lt;p&gt;Meta offers the latest SAM 3 as an online demo at &lt;a class="link" href="https://aidemos.meta.com/segment-anything" target="_blank" rel="noopener"
 &gt;aidemos.meta.com/segment-anything&lt;/a&gt;.&lt;/p&gt;
&lt;h3 id="what-sets-sam-3-apart"&gt;What Sets SAM 3 Apart
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Text-prompt Segmentation&lt;/strong&gt;: Find objects using natural language — &amp;ldquo;find animal&amp;rdquo;, &amp;ldquo;find person&amp;rdquo;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;One-click Effects&lt;/strong&gt;: Apply blur, clone, desaturate, and more with a single click&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Motion Trails&lt;/strong&gt;: Add motion effects to segmented objects&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Contour Lines / Bounding Boxes&lt;/strong&gt;: Various visualization options&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Video Segmentation&lt;/strong&gt;: Track Anything feature for object tracking in video&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Community Templates&lt;/strong&gt;: Use effects created by other users&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="sam-21-local-vs-sam-3-online-comparison"&gt;SAM 2.1 Local vs SAM 3 Online Comparison
&lt;/h2&gt;&lt;table&gt;
 &lt;thead&gt;
 &lt;tr&gt;
 &lt;th&gt;Aspect&lt;/th&gt;
 &lt;th&gt;SAM 2.1 Local&lt;/th&gt;
 &lt;th&gt;SAM 3 Online&lt;/th&gt;
 &lt;/tr&gt;
 &lt;/thead&gt;
 &lt;tbody&gt;
 &lt;tr&gt;
 &lt;td&gt;Environment&lt;/td&gt;
 &lt;td&gt;Local Mac (Apple Silicon)&lt;/td&gt;
 &lt;td&gt;Meta cloud servers&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;GPU&lt;/td&gt;
 &lt;td&gt;MPS (M1/M2/M3/M4)&lt;/td&gt;
 &lt;td&gt;Cloud GPU&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Model Size&lt;/td&gt;
 &lt;td&gt;Tiny 74.5 MB&lt;/td&gt;
 &lt;td&gt;Full-size (undisclosed)&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Input Methods&lt;/td&gt;
 &lt;td&gt;Point click, box&lt;/td&gt;
 &lt;td&gt;Text, click, box&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Text Prompts&lt;/td&gt;
 &lt;td&gt;Not supported&lt;/td&gt;
 &lt;td&gt;Supported&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Post-processing Effects&lt;/td&gt;
 &lt;td&gt;None&lt;/td&gt;
 &lt;td&gt;Blur, clone, desaturate, etc.&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Video Support&lt;/td&gt;
 &lt;td&gt;Not supported&lt;/td&gt;
 &lt;td&gt;Supported&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Privacy&lt;/td&gt;
 &lt;td&gt;Data stays local&lt;/td&gt;
 &lt;td&gt;Uploaded to Meta servers&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Internet Required&lt;/td&gt;
 &lt;td&gt;Only for model download&lt;/td&gt;
 &lt;td&gt;Always&lt;/td&gt;
 &lt;/tr&gt;
 &lt;tr&gt;
 &lt;td&gt;Customization&lt;/td&gt;
 &lt;td&gt;Full code access&lt;/td&gt;
 &lt;td&gt;Limited&lt;/td&gt;
 &lt;/tr&gt;
 &lt;/tbody&gt;
&lt;/table&gt;
&lt;h2 id="which-one-should-you-choose"&gt;Which One Should You Choose?
&lt;/h2&gt;&lt;p&gt;&lt;strong&gt;SAM 2.1 local&lt;/strong&gt; is the right choice when:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;You don&amp;rsquo;t want sensitive images leaving your machine&lt;/li&gt;
&lt;li&gt;You need to integrate segmentation into an automated pipeline&lt;/li&gt;
&lt;li&gt;You want to modify or extend the model&lt;/li&gt;
&lt;li&gt;You need to work offline&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;SAM 3 online demo&lt;/strong&gt; is the right choice when:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;You want to find objects using text prompts&lt;/li&gt;
&lt;li&gt;You need quick access to effects like blur and cloning&lt;/li&gt;
&lt;li&gt;You need video segmentation&lt;/li&gt;
&lt;li&gt;You want to try it out without any installation&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="wrapping-up"&gt;Wrapping Up
&lt;/h2&gt;&lt;p&gt;Running SAM 2.1 locally is a practical option for Apple Silicon Mac users. The 74.5 MB Tiny model delivers usable segmentation results, and MPS acceleration makes good use of the GPU. The SAM 3 online demo takes it further with text prompts and a rich set of effects. Depending on your use case, combining local and cloud approaches gives you the best of both worlds.&lt;/p&gt;
&lt;h3 id="links"&gt;Links
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class="link" href="https://github.com/ice-ice-bear/sam2-mac-test" target="_blank" rel="noopener"
 &gt;ice-ice-bear/sam2-mac-test (GitHub)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://aidemos.meta.com/segment-anything" target="_blank" rel="noopener"
 &gt;Meta AI Demos — Segment Anything&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://docs.ultralytics.com/models/sam-2/" target="_blank" rel="noopener"
 &gt;Ultralytics SAM 2 Documentation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="link" href="https://pytorch.org/docs/stable/notes/mps.html" target="_blank" rel="noopener"
 &gt;PyTorch MPS Backend&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;</description></item></channel></rss>