{"id":1109357,"date":"2026-01-29T01:39:24","date_gmt":"2026-01-29T01:39:24","guid":{"rendered":"http:\/\/HNJPg65LGTqGXBaEp6HEsd"},"modified":"2026-01-29T01:39:24","modified_gmt":"2026-01-29T01:39:24","slug":"today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required","status":"publish","type":"post","link":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/","title":{"rendered":"Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs required"},"content":{"rendered":"<article>\n<p>Got the impression that a bazillion dollars&#8217; worth of GPUs are required to run a cutting-edge chatbot? Think again. Matthew Carrigan, an engineer at AI tools outfit HuggingFace, claims that you can run the <a data-analytics-id=\"inline-link\" href=\"https:\/\/www.pcgamer.com\/hardware\/graphics-cards\/chinas-deepseek-chatbot-reportedly-gets-much-more-done-with-fewer-gpus-but-nvidia-still-thinks-its-excellent-news\/\" target=\"_blank\">hot new DeepSeek R1 LLM<\/a> on just $6,000 of PC hardware. The kicker? You don&#8217;t even need a high-end GPU.<\/p>\n<p><a data-analytics-id=\"inline-link\" href=\"https:\/\/x.com\/carrigmat\/status\/1884244369907278106\" target=\"_blank\">Carrigan&#8217;s suggested build<\/a> involves a dual-socket AMD EPYC motherboard and a couple of compatible AMD chips to go with it. Apparently, the spec of the CPUs isn&#8217;t actually that critical. Instead, it&#8217;s all about the memory.<\/p>\n<div class=\"see-more see-more--clipped\">\n<blockquote class=\"twitter-tweet hawk-ignore\" data-lang=\"en\">\n<p lang=\"en\" dir=\"ltr\">Complete hardware + software setup for running Deepseek-R1 locally. The actual model, no distillations, and Q8 quantization for full quality. Total cost, $6,000. 
All download and part links below:<a href=\"https:\/\/twitter.com\/cantworkitout\/status\/1884244369907278106\">January 28, 2025<\/a><\/p>\n<\/blockquote>\n<div class=\"see-more__filter\"><\/div>\n<\/div>\n<p>&#8220;We are going to need 768GB (to fit the model) across 24 RAM channels (to get the bandwidth to run it fast enough). That means 24 x 32 GB DDR5-RDIMM modules,&#8221; Carrigan explains.<\/p>\n<p>Links are helpfully provided and the RAM alone comes to about $3,400. Then you&#8217;ll need a case, PSU, a mere 1 TB SSD, some heatsinks and fans.<\/p>\n<p>Indeed, Carrigan says this setup gets you the full DeepSeek R1 experience with no compromises. &#8220;The actual model, no distillations, and Q8 quantization for full quality,&#8221; he explains.<\/p>\n<p>From there, simply &#8220;throw&#8221; on Linux, install llama.cpp, download 700 GB of weights, input the command line string Carrigan provides and Bob&#8217;s your large language model running locally, as they say.<\/p>\n<p>Notably absent from all this is any mention of expensive Nvidia GPUs. So what gives? 
Well, Carrigan provides a video of the LLM running locally on this setup plus a rough performance metric.<\/p>\n<figure class=\"van-image-figure inline-layout\" data-bordeaux-image-check >\n<div class='image-full-width-wrapper'>\n<div class='image-widthsetter' style=\"max-width:1920px;\">\n<p class=\"vanilla-image-block\" style=\"padding-top:56.25%;\"><img loading=\"lazy\" decoding=\"async\" id=\"c3RwNWN8XeDGfgrBXGaR4f\" name=\"nvidia-hopper-architecture-h100-die.jpg\" alt=\"Nvidia Hopper GPU die\" src=\"https:\/\/arcader.org\/wp-content\/uploads\/2025\/01\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required.jpg\" mos=\"\" align=\"middle\" fullscreen=\"\" width=\"1920\" height=\"1080\" attribution=\"\" endorsement=\"\" class=\"\"><\/p>\n<\/div>\n<\/div><figcaption itemprop=\"caption description\" class=\" inline-layout\"><span class=\"caption-text\">Nvidia&#8217;s H100: You won&#8217;t be needing one of these. <\/span><span class=\"credit\" itemprop=\"copyrightHolder\">(Image credit: Nvidia)<\/span><\/figcaption><\/figure>\n<p>&#8220;The generation speed on this build is 6 to 8 tokens per second, depending on the specific CPU and RAM speed you get, or slightly less if you have a long chat history. The clip above is near-realtime, sped up slightly to fit video length limits,&#8221; he says.<\/p>\n<p>The video shows the model generating text at a reasonable pace. But that, of course, is for just one user. Open this setup out to multiple users and the per-user performance would, we assume, quickly become unusable.<\/p>\n<p>In other words, that&#8217;s $6,000 of hardware to support, in effect, a single user. So, this likely isn&#8217;t an approach that&#8217;s practical for setting up an AI business serving hundreds, thousands or even millions of users. 
For that kind of application, GPUs may well be more cost-effective, even with their painful unit price.<\/p>\n<p>Carrigan suggests a build relying on GPUs could run into six figures pretty quickly, albeit with better performance.<\/p>\n<div class=\"see-more see-more--clipped\">\n<blockquote class=\"twitter-tweet hawk-ignore\" data-lang=\"en\">\n<p lang=\"en\" dir=\"ltr\">And if you got this far: Yes, there&#8217;s no GPU in this build! If you want to host on GPU for faster generation speed, you can! You&#8217;ll just lose a lot of quality from quantization, or if you want Q8 you&#8217;ll need >700GB of GPU memory, which will probably cost $100k+<a href=\"https:\/\/twitter.com\/cantworkitout\/status\/1884247727758008642\">January 28, 2025<\/a><\/p>\n<\/blockquote>\n<div class=\"see-more__filter\"><\/div>\n<\/div>\n<div class=\"fancy-box\">\n<div class=\"fancy_box-title\">Your next upgrade<\/div>\n<div class=\"fancy_box_body\">\n<figure class=\"van-image-figure \" >\n<div class='image-full-width-wrapper'>\n<div class='image-widthsetter' >\n<p class=\"vanilla-image-block\" style=\"padding-top:56.25%;\"><img decoding=\"async\" id=\"tidxyoUY3P2N5A2jEhgSNK\" name=\"nvidia-rtx-4070-12.jpg\" caption=\"\" alt=\"Nvidia RTX 4070 and RTX 3080 Founders Edition graphics cards\" src=\"https:\/\/arcader.org\/wp-content\/uploads\/2025\/01\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required-1.jpg\" mos=\"\" link=\"\" align=\"\" fullscreen=\"\" width=\"\" height=\"\" attribution=\"\" endorsement=\"\" class=\"pinterest-pin-exclude\"><\/p>\n<\/div>\n<\/div><figcaption itemprop=\"caption description\" class=\"\"><span class=\"credit\" itemprop=\"copyrightHolder\">(Image credit: Future)<\/span><\/figcaption><\/figure>\n<p class=\"fancy-box__body-text\"><a data-analytics-id=\"inline-link\" href=\"https:\/\/www.pcgamer.com\/best-cpu-for-gaming\/\" target=\"_blank\"><strong>Best CPU for gaming<\/strong><\/a>: The 
top chips from Intel and AMD.<br \/><a data-analytics-id=\"inline-link\" href=\"https:\/\/www.pcgamer.com\/best-gaming-motherboards\/\" target=\"_blank\"><strong>Best gaming motherboard<\/strong><\/a>: The right boards.<br \/><a data-analytics-id=\"inline-link\" href=\"https:\/\/www.pcgamer.com\/the-best-graphics-cards\/\" target=\"_blank\"><strong>Best graphics card<\/strong><\/a>: Your perfect pixel-pusher awaits.<br \/><a data-analytics-id=\"inline-link\" href=\"https:\/\/www.pcgamer.com\/best-ssd-for-gaming\/\" target=\"_blank\"><strong>Best SSD for gaming<\/strong><\/a>: Get into the game ahead of the rest.<\/p>\n<\/div>\n<\/div>\n<p>But it is intriguing to learn that you don&#8217;t actually need a bazillion dollars&#8217; worth of GPUs to get a full-spec LLM running locally. Arguably, it also provides insight into the true scale of intelligence implied by the latest LLMs.<\/p>\n<p>An end user watching what can seem like consciousness streaming out of these bots might assume that it takes huge computation to generate an LLM&#8217;s output. But this setup is doing it on a couple of AMD CPUs.<\/p>\n<p>So, unless you think a couple of AMD CPUs are capable of consciousness, this hardware solution demonstrates the prosaic reality of even the very latest and most advanced LLMs. Maybe the AI apocalypse isn&#8217;t quite upon us after all.<\/p>\n<\/article>\n<p><a href=\"https:\/\/www.pcgamer.com\/hardware\/graphics-cards\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-usd6-000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/\">Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Got the impression that a bazillion dollars&#8217; worth of GPUs are required to run a cutting-edge chatbot? Think again. Matthew Carrigan, an engineer at AI tools outfit HuggingFace, claims that you can run the hot new DeepSeek R1 LLM on just $6,000 of PC hardware. The kicker? You don&#8217;t even need a high-end GPU. 
Carrigan&#8217;s suggested build involves a dual-socket AMD EPYC motherboard and a couple of compatible AMD chips to go with it. Apparently, the spec of the CPUs isn&#8217;t actually that critical. Instead, it&#8217;s all about the memory. Complete hardware + software setup for running Deepseek-R1 locally. The actual model, no distillations, and Q8 quantization for full quality. Total cost, $6,000. All download and part links below:January 28, 2025 &#8220;We are going to&hellip;<\/p>\n<p class=\"excerpt-more\"><a class=\"blog-excerpt button\" href=\"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/\">Read More&#8230;<\/a><\/p>\n","protected":false},"author":1,"featured_media":1109358,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[336],"tags":[791,66],"class_list":["post-1109357","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-pc-gamer","tag-graphics-cards","tag-hardware"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs required | Arcader News<\/title>\n<meta name=\"description\" content=\"Got the impression that a bazillion dollar&#039;s worth of GPUs are required to run a cutting-edge chatbot? Think again. 
Matthew Carrigan, an engineer at AI\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs required | Arcader News\" \/>\n<meta property=\"og:description\" content=\"Got the impression that a bazillion dollar&#039;s worth of GPUs are required to run a cutting-edge chatbot? Think again. Matthew Carrigan, an engineer at AI\" \/>\n<meta property=\"og:url\" content=\"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/\" \/>\n<meta property=\"og:site_name\" content=\"Arcade News\" \/>\n<meta property=\"article:published_time\" content=\"2026-01-29T01:39:24+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/arcader.org\/wp-content\/uploads\/2025\/01\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"480\" \/>\n\t<meta property=\"og:image:height\" content=\"270\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Arcade News\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Arcade News\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/\"},\"author\":{\"name\":\"Arcade News\",\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/#\\\/schema\\\/person\\\/8460f5e5076b52fb2369f2f7ce6f2839\"},\"headline\":\"Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs required\",\"datePublished\":\"2026-01-29T01:39:24+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/\"},\"wordCount\":690,\"commentCount\":0,\"image\":{\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/arcader.org\\\/wp-content\\\/uploads\\\/2025\\\/01\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required.jpg\",\"keywords\":[\"Graphics Cards\",\"hardware\"],\"articleSection\":[\"PC 
Gamer\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/\",\"url\":\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/\",\"name\":\"Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs required | Arcader News\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/arcader.org\\\/wp-content\\\/uploads\\\/2025\\\/01\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required.jpg\",\"datePublished\":\"2026-01-29T01:39:24+00:00\",\"author\":{\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/#\\\/schema\\\/person\\\/8460f5e5076b52fb2369f2f7ce6f2839\"},\"description\":\"Got the impression that a bazillion dollar's worth of GPUs are required to run a cutting-edge chatbot? Think again. 
Matthew Carrigan, an engineer at AI\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/#primaryimage\",\"url\":\"https:\\\/\\\/arcader.org\\\/wp-content\\\/uploads\\\/2025\\\/01\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required.jpg\",\"contentUrl\":\"https:\\\/\\\/arcader.org\\\/wp-content\\\/uploads\\\/2025\\\/01\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required.jpg\",\"width\":480,\"height\":270,\"caption\":\"Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs required\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/arcader.org\\\/news\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs 
required\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/#website\",\"url\":\"https:\\\/\\\/arcader.org\\\/news\\\/\",\"name\":\"Arcade News\",\"description\":\"Free Arcade News from the Best Online Sources\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/arcader.org\\\/news\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/arcader.org\\\/news\\\/#\\\/schema\\\/person\\\/8460f5e5076b52fb2369f2f7ce6f2839\",\"name\":\"Arcade News\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/3fea48a614d86edd987bc7bb25f4707c69546d4b1f78ad4aa20b26316bad1f9d?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/3fea48a614d86edd987bc7bb25f4707c69546d4b1f78ad4aa20b26316bad1f9d?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/3fea48a614d86edd987bc7bb25f4707c69546d4b1f78ad4aa20b26316bad1f9d?s=96&d=mm&r=g\",\"caption\":\"Arcade News\"},\"sameAs\":[\"https:\\\/\\\/cricketgames.tv\"],\"url\":\"https:\\\/\\\/arcader.org\\\/news\\\/author\\\/arcade-news\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs required | Arcader News","description":"Got the impression that a bazillion dollar's worth of GPUs are required to run a cutting-edge chatbot? Think again. 
Matthew Carrigan, an engineer at AI","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/","og_locale":"en_US","og_type":"article","og_title":"Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs required | Arcader News","og_description":"Got the impression that a bazillion dollar's worth of GPUs are required to run a cutting-edge chatbot? Think again. Matthew Carrigan, an engineer at AI","og_url":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/","og_site_name":"Arcade News","article_published_time":"2026-01-29T01:39:24+00:00","og_image":[{"width":480,"height":270,"url":"https:\/\/arcader.org\/wp-content\/uploads\/2025\/01\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required.jpg","type":"image\/jpeg"}],"author":"Arcade News","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Arcade News","Est. 
reading time":"3 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/#article","isPartOf":{"@id":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/"},"author":{"name":"Arcade News","@id":"https:\/\/arcader.org\/news\/#\/schema\/person\/8460f5e5076b52fb2369f2f7ce6f2839"},"headline":"Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs required","datePublished":"2026-01-29T01:39:24+00:00","mainEntityOfPage":{"@id":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/"},"wordCount":690,"commentCount":0,"image":{"@id":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/#primaryimage"},"thumbnailUrl":"https:\/\/arcader.org\/wp-content\/uploads\/2025\/01\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required.jpg","keywords":["Graphics Cards","hardware"],"articleSection":["PC 
Gamer"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/","url":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/","name":"Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs required | Arcader News","isPartOf":{"@id":"https:\/\/arcader.org\/news\/#website"},"primaryImageOfPage":{"@id":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/#primaryimage"},"image":{"@id":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/#primaryimage"},"thumbnailUrl":"https:\/\/arcader.org\/wp-content\/uploads\/2025\/01\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required.jpg","datePublished":"2026-01-29T01:39:24+00:00","author":{"@id":"https:\/\/arcader.org\/news\/#\/schema\/person\/8460f5e5076b52fb2369f2f7ce6f2839"},"description":"Got the impression that a bazillion dollar's worth of GPUs are required to run a cutting-edge chatbot? Think again. 
Matthew Carrigan, an engineer at AI","breadcrumb":{"@id":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/#primaryimage","url":"https:\/\/arcader.org\/wp-content\/uploads\/2025\/01\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required.jpg","contentUrl":"https:\/\/arcader.org\/wp-content\/uploads\/2025\/01\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required.jpg","width":480,"height":270,"caption":"Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs required"},{"@type":"BreadcrumbList","@id":"https:\/\/arcader.org\/news\/today-i-learned-i-can-run-my-very-own-deepseek-r1-chatbot-on-just-6000-of-pc-hardware-and-no-megabucks-nvidia-gpus-required\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/arcader.org\/news\/"},{"@type":"ListItem","position":2,"name":"Today I learned I can run my very own DeepSeek R1 chatbot on just $6,000 of PC hardware and no megabucks Nvidia GPUs required"}]},{"@type":"WebSite","@id":"https:\/\/arcader.org\/news\/#website","url":"https:\/\/arcader.org\/news\/","name":"Arcade News","description":"Free Arcade News from the Best Online 
Sources","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/arcader.org\/news\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/arcader.org\/news\/#\/schema\/person\/8460f5e5076b52fb2369f2f7ce6f2839","name":"Arcade News","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/3fea48a614d86edd987bc7bb25f4707c69546d4b1f78ad4aa20b26316bad1f9d?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/3fea48a614d86edd987bc7bb25f4707c69546d4b1f78ad4aa20b26316bad1f9d?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/3fea48a614d86edd987bc7bb25f4707c69546d4b1f78ad4aa20b26316bad1f9d?s=96&d=mm&r=g","caption":"Arcade News"},"sameAs":["https:\/\/cricketgames.tv"],"url":"https:\/\/arcader.org\/news\/author\/arcade-news\/"}]}},"_links":{"self":[{"href":"https:\/\/arcader.org\/news\/wp-json\/wp\/v2\/posts\/1109357","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/arcader.org\/news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/arcader.org\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/arcader.org\/news\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/arcader.org\/news\/wp-json\/wp\/v2\/comments?post=1109357"}],"version-history":[{"count":1,"href":"https:\/\/arcader.org\/news\/wp-json\/wp\/v2\/posts\/1109357\/revisions"}],"predecessor-version":[{"id":1478595,"href":"https:\/\/arcader.org\/news\/wp-json\/wp\/v2\/posts\/1109357\/revisions\/1478595"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/arcader.org\/news\/wp-json\/wp\/v2\/media\/1109358"}],"wp:attachment":[{"href":"https:\/\/arcader.org\/news\/wp-json\/wp\/v2\/media?parent=1109357"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/arcader.org\/news\/wp
-json\/wp\/v2\/categories?post=1109357"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/arcader.org\/news\/wp-json\/wp\/v2\/tags?post=1109357"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
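The memory and throughput figures the article quotes (768 GB of RAM across 24 channels, roughly 700 GB of Q8 weights, 6 to 8 tokens per second) can be sanity-checked with a back-of-envelope sketch. The per-channel bandwidth and the active-parameter count below are assumptions on my part, not figures from the article: DDR5-4800 RDIMMs and DeepSeek R1's mixture-of-experts design (~671B total parameters, with only ~37B read per generated token) are plausible but should be treated as illustrative.

```python
# Back-of-envelope check of the $6,000 DeepSeek R1 build described above.
# ASSUMPTIONS (not stated in the article): DDR5-4800 at ~38.4 GB/s per
# channel, and ~37B parameters active per token (R1 is a mixture-of-experts
# model, so only a fraction of its ~671B weights are read per token).

DIMM_COUNT = 24
DIMM_SIZE_GB = 32
CHANNEL_BW_GBPS = 38.4      # assumed: DDR5-4800 -> 4800 MT/s * 8 bytes
ACTIVE_PARAMS = 37e9        # assumed active parameters per generated token
BYTES_PER_PARAM = 1         # Q8 quantization ~ 1 byte per weight

# Capacity: must hold ~700 GB of Q8 weights.
total_ram_gb = DIMM_COUNT * DIMM_SIZE_GB            # 768 GB

# Bandwidth: token generation is memory-bound, because every token requires
# streaming the active weights out of RAM once.
peak_bw_gbps = DIMM_COUNT * CHANNEL_BW_GBPS         # ~921.6 GB/s aggregate
bytes_per_token_gb = ACTIVE_PARAMS * BYTES_PER_PARAM / 1e9
peak_tokens_per_sec = peak_bw_gbps / bytes_per_token_gb

print(total_ram_gb)                     # 768
print(round(peak_tokens_per_sec, 1))    # theoretical ceiling, ~25 tokens/s
```

The theoretical ceiling of roughly 25 tokens per second sits comfortably above the 6 to 8 tokens per second Carrigan reports, which is consistent with real-world losses from NUMA effects, imperfect channel utilization, and attention-cache traffic; the point of the sketch is only that the 24-channel requirement follows from bandwidth, not capacity alone.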