<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	
	>
<channel>
	<title>
	Comments on: A Full Hardware Guide to Deep Learning	</title>
	<atom:link href="https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/feed/" rel="self" type="application/rss+xml" />
	<link>https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/</link>
	<description>Making deep learning accessible.</description>
	<lastBuildDate>Mon, 25 Oct 2021 01:22:16 +0000</lastBuildDate>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.0.11</generator>
	<item>
		<title>
		By: TK		</title>
		<link>https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-97565</link>

		<dc:creator><![CDATA[TK]]></dc:creator>
		<pubDate>Mon, 25 Oct 2021 01:22:16 +0000</pubDate>
		<guid isPermaLink="false">https://timdettmers.wordpress.com/?p=121#comment-97565</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-92014&quot;&gt;zoey79&lt;/a&gt;.

I had a gaming laptop for deep learning. However, I think a desktop is still the better choice. Using a laptop for deep learning tends to overheat it, and the battery appears to degrade much faster.
Moreover, the largest GPU memory available in a laptop is 8 GB, and note that not all 8 GB can be allocated for deep learning, which may not be sufficient if you are training a very deep network or a dual network. A mobile GPU is also less efficient than a desktop GPU, and computing speed (CPU etc.) can also be slower than on a gaming desktop.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-92014">zoey79</a>.</p>
<p>I had a gaming laptop for deep learning. However, I think a desktop is still the better choice. Using a laptop for deep learning tends to overheat it, and the battery appears to degrade much faster.<br />
Moreover, the largest GPU memory available in a laptop is 8 GB, and note that not all 8 GB can be allocated for deep learning, which may not be sufficient if you are training a very deep network or a dual network. A mobile GPU is also less efficient than a desktop GPU, and computing speed (CPU etc.) can also be slower than on a gaming desktop.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Tim Dettmers		</title>
		<link>https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-97546</link>

		<dc:creator><![CDATA[Tim Dettmers]]></dc:creator>
		<pubDate>Sun, 24 Oct 2021 18:45:05 +0000</pubDate>
		<guid isPermaLink="false">https://timdettmers.wordpress.com/?p=121#comment-97546</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-95048&quot;&gt;Ehtesham&lt;/a&gt;.

Thanks for sharing this! This shows how difficult it can be to get the power requirements right.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-95048">Ehtesham</a>.</p>
<p>Thanks for sharing this! This shows how difficult it can be to get the power requirements right.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Tim Dettmers		</title>
		<link>https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-97540</link>

		<dc:creator><![CDATA[Tim Dettmers]]></dc:creator>
		<pubDate>Sun, 24 Oct 2021 18:26:39 +0000</pubDate>
		<guid isPermaLink="false">https://timdettmers.wordpress.com/?p=121#comment-97540</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-93379&quot;&gt;Jay&lt;/a&gt;.

6 GB is indeed a bit small – I would go for the 8 GB GPU.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-93379">Jay</a>.</p>
<p>6 GB is indeed a bit small &#8211; I would go for the 8 GB GPU.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Tim Dettmers		</title>
		<link>https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-97534</link>

		<dc:creator><![CDATA[Tim Dettmers]]></dc:creator>
		<pubDate>Sun, 24 Oct 2021 17:58:10 +0000</pubDate>
		<guid isPermaLink="false">https://timdettmers.wordpress.com/?p=121#comment-97534</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-92014&quot;&gt;zoey79&lt;/a&gt;.

Gaming laptops are excellent for deep learning. Make sure to get a beefy GPU!]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-92014">zoey79</a>.</p>
<p>Gaming laptops are excellent for deep learning. Make sure to get a beefy GPU!</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Ehtesham		</title>
		<link>https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-95048</link>

		<dc:creator><![CDATA[Ehtesham]]></dc:creator>
		<pubDate>Sat, 11 Sep 2021 12:01:53 +0000</pubDate>
		<guid isPermaLink="false">https://timdettmers.wordpress.com/?p=121#comment-95048</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-52570&quot;&gt;Tim Dettmers&lt;/a&gt;.

Well, I tried something on an HP ProLiant ML350 Gen8 for mining. (I wish I could post a picture here.) It is a 6-GPU setup: three 009S mini risers on one side of the motherboard and three on the other side. The six USB cables run through the back of the chassis, and the six GPUs are mounted on top of the server.

All six GPUs (GTX 1660 Super) are recognized by the Windows 10 system. Now the difficult part: the ProLiant ML350 Gen8 has two separate 400 W power supplies (800 W combined). On paper, a 1660 Super requires and draws 90 W, so total consumption in this scenario is 540 W. I therefore decided to add a 750 W power supply to meet my requirement.

I failed: the power supply shuts down within a minute, Windows crashes, and the system reboots. I tried the same exercise with HiveOS, with the same result.

Today I will try a 1500 W power supply alongside the already installed 800 W supply and see the results. I hope it works!]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-52570">Tim Dettmers</a>.</p>
<p>Well, I tried something on an HP ProLiant ML350 Gen8 for mining. (I wish I could post a picture here.) It is a 6-GPU setup: three 009S mini risers on one side of the motherboard and three on the other side. The six USB cables run through the back of the chassis, and the six GPUs are mounted on top of the server.</p>
<p>All six GPUs (GTX 1660 Super) are recognized by the Windows 10 system. Now the difficult part: the ProLiant ML350 Gen8 has two separate 400 W power supplies (800 W combined). On paper, a 1660 Super requires and draws 90 W, so total consumption in this scenario is 540 W. I therefore decided to add a 750 W power supply to meet my requirement.</p>
<p>I failed: the power supply shuts down within a minute, Windows crashes, and the system reboots. I tried the same exercise with HiveOS, with the same result.</p>
<p>Today I will try a 1500 W power supply alongside the already installed 800 W supply and see the results. I hope it works!</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Jay		</title>
		<link>https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-93379</link>

		<dc:creator><![CDATA[Jay]]></dc:creator>
		<pubDate>Fri, 30 Jul 2021 03:31:07 +0000</pubDate>
		<guid isPermaLink="false">https://timdettmers.wordpress.com/?p=121#comment-93379</guid>

					<description><![CDATA[Hey,

Thanks for that summary. You said that one should buy a GPU with at least 8 GB of RAM, but also that RTX GPU RAM is twice as effective as GTX RAM. That brings me to my question.

I have a choice between two laptops, identical except that one has a GeForce RTX 3060 with 6 GB and costs $1400, while the other has a GeForce RTX 3070 with 8 GB and costs $2000.

I know the RTX 3060 will be slower, but is 6 GB acceptable? You implied it would be the equivalent of a 12 GB GeForce GTX card in terms of RAM utilization.

Please advise, as I&#039;d really like to save the $600 difference between the two laptops.

Given that desktop add-in cards from the RTX 3000 series seem to start at $1000, it seems to me I should bide my time with a good entry-level laptop with an RTX GPU, at much fairer prices, until the video card price gouging is over.

Thanks!]]></description>
			<content:encoded><![CDATA[<p>Hey,</p>
<p>Thanks for that summary. You said that one should buy a GPU with at least 8 GB of RAM, but also that RTX GPU RAM is twice as effective as GTX RAM. That brings me to my question.</p>
<p>I have a choice between two laptops, identical except that one has a GeForce RTX 3060 with 6 GB and costs $1400, while the other has a GeForce RTX 3070 with 8 GB and costs $2000.</p>
<p>I know the RTX 3060 will be slower, but is 6 GB acceptable? You implied it would be the equivalent of a 12 GB GeForce GTX card in terms of RAM utilization.</p>
<p>Please advise, as I&#8217;d really like to save the $600 difference between the two laptops.</p>
<p>Given that desktop add-in cards from the RTX 3000 series seem to start at $1000, it seems to me I should bide my time with a good entry-level laptop with an RTX GPU, at much fairer prices, until the video card price gouging is over.</p>
<p>Thanks!</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: zoey79		</title>
		<link>https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-92014</link>

		<dc:creator><![CDATA[zoey79]]></dc:creator>
		<pubDate>Wed, 09 Jun 2021 16:16:43 +0000</pubDate>
		<guid isPermaLink="false">https://timdettmers.wordpress.com/?p=121#comment-92014</guid>

					<description><![CDATA[Wonderful article. However, I am about to buy a new laptop, so how do you feel about the idea of a gaming laptop for deep learning?]]></description>
			<content:encoded><![CDATA[<p>Wonderful article. However, I am about to buy a new laptop, so how do you feel about the idea of a gaming laptop for deep learning?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: marco		</title>
		<link>https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-88888</link>

		<dc:creator><![CDATA[marco]]></dc:creator>
		<pubDate>Thu, 01 Apr 2021 14:23:54 +0000</pubDate>
		<guid isPermaLink="false">https://timdettmers.wordpress.com/?p=121#comment-88888</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-84883&quot;&gt;Dmytro&lt;/a&gt;.

1. The versions of Python and TensorFlow should match (please verify the TensorFlow software requirements).

2. Precompiled CPU binaries of TensorFlow can include CPU-specific optimizations that may not be compatible with your processor.

Generally this happens when a binary built for a different CPU architecture is launched on an unsupported CPU.

TensorFlow 1.5 could probably run on your CPU, but the new version was either not compiled for it or not compatible with your Python version.

How to solve it:
If you can confirm it is the Python incompatibility issue, it can be solved quickly by installing the appropriate version of Python (I suppose 3.8, because I have the same TensorFlow 2.2 on my machine and it uses 3.8).
If the problem is at the binary/CPU compatibility level, you should either:
A: change the CPU, or
B: compile TensorFlow from source ON YOUR CPU.

I have already compiled TensorFlow on my CPU several times, and you can do it too, don&#039;t worry.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-84883">Dmytro</a>.</p>
<p>1. The versions of Python and TensorFlow should match (please verify the TensorFlow software requirements).</p>
<p>2. Precompiled CPU binaries of TensorFlow can include CPU-specific optimizations that may not be compatible with your processor.</p>
<p>Generally this happens when a binary built for a different CPU architecture is launched on an unsupported CPU.</p>
<p>TensorFlow 1.5 could probably run on your CPU, but the new version was either not compiled for it or not compatible with your Python version.</p>
<p>How to solve it:<br />
If you can confirm it is the Python incompatibility issue, it can be solved quickly by installing the appropriate version of Python (I suppose 3.8, because I have the same TensorFlow 2.2 on my machine and it uses 3.8).<br />
If the problem is at the binary/CPU compatibility level, you should either:<br />
A: change the CPU, or<br />
B: compile TensorFlow from source ON YOUR CPU.</p>
<p>I have already compiled TensorFlow on my CPU several times, and you can do it too, don&#8217;t worry.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: David		</title>
		<link>https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-87773</link>

		<dc:creator><![CDATA[David]]></dc:creator>
		<pubDate>Sat, 06 Mar 2021 17:17:19 +0000</pubDate>
		<guid isPermaLink="false">https://timdettmers.wordpress.com/?p=121#comment-87773</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-84275&quot;&gt;Audi&lt;/a&gt;.

Hey Audi,

I&#039;m currently struggling with a similar problem: either the Ryzen 9 3900X or the 5800X. Do you know which one is better for deep learning? Following the explanation Tim gave, I suppose the 12 cores of the 3900X outperform the 8 cores of the 5800X?]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-84275">Audi</a>.</p>
<p>Hey Audi,</p>
<p>I&#8217;m currently struggling with a similar problem: either the Ryzen 9 3900X or the 5800X. Do you know which one is better for deep learning? Following the explanation Tim gave, I suppose the 12 cores of the 3900X outperform the 8 cores of the 5800X?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Chaitanya		</title>
		<link>https://timdettmers.com/2018/12/16/deep-learning-hardware-guide/comment-page-1/#comment-85549</link>

		<dc:creator><![CDATA[Chaitanya]]></dc:creator>
		<pubDate>Tue, 02 Feb 2021 07:15:13 +0000</pubDate>
		<guid isPermaLink="false">https://timdettmers.wordpress.com/?p=121#comment-85549</guid>

					<description><![CDATA[Thank you Tim for the post, it was very helpful to understand the importance of hardware components in deep learning. 

I have been researching about the hardware requirements to begin a Deep learning project on my work station from couple of months, finally read your article that has answered lot of my questions. I did realize  the GPU on my machine will not be sufficient so wanted to get your thoughts on its replacement or adding a second one.   

Please suggest if I can add any Nvidia 20xx series GPU to below configuration. 

 - Dual CPU - Xeon E5 2670 - V2 10 cores each,  64GB RAM
 - Existing GPU - Nvidia Geforce 1050
 - power unit - 800 watts 
 - two PCi e gen 3 X 16 slots (with 4 other gen slots in between, currently one is in use for 1050)]]></description>
			<content:encoded><![CDATA[<p>Thank you Tim for the post, it was very helpful to understand the importance of hardware components in deep learning. </p>
<p>I have been researching about the hardware requirements to begin a Deep learning project on my work station from couple of months, finally read your article that has answered lot of my questions. I did realize  the GPU on my machine will not be sufficient so wanted to get your thoughts on its replacement or adding a second one.   </p>
<p>Please suggest if I can add any Nvidia 20xx series GPU to below configuration. </p>
<p> &#8211; Dual CPU &#8211; Xeon E5 2670 &#8211; V2 10 cores each,  64GB RAM<br />
 &#8211; Existing GPU &#8211; Nvidia Geforce 1050<br />
 &#8211; power unit &#8211; 800 watts<br />
 &#8211; two PCi e gen 3 X 16 slots (with 4 other gen slots in between, currently one is in use for 1050)</p>
]]></content:encoded>
		
			</item>
	</channel>
</rss>
