Talk:Winter 2009 SYA810 Block Device Benchmark Scripts

Here's something interesting I found while researching the topic, from the creator of the [http://www.textuality.com/bonnie/ bonnie] benchmark:

<blockquote>
"...note that the Bonnie results when your memory’s bigger than your test data are generally bogus, since well-designed Unix-lineage systems... try hard to buffer everything to avoid doing I/O. The only way to defeat this and actually test I/O rates is to completely flood the available buffer space. This is the right thing to do, because in many production applications, memory is maxed out anyhow, so the actual I/O rate (what Bonnie measures) becomes an important performance-limiting factor." [http://www.tbray.org/ongoing/When/200x/2004/11/16/Bonnie64]
</blockquote>
...and again, from the website:

<blockquote>
"It is important to use a file size that is several times the size of the available memory (RAM) - otherwise, the operating system will cache large parts of the file, and Bonnie will end up doing very little I/O. <b>At least four times</b> the size of the available memory is desirable." (emphasis added) [http://www.textuality.com/bonnie/advice.html]
</blockquote>
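The "at least four times RAM" rule is easy to automate in a benchmark script. A minimal sketch, Linux-specific since it reads MemTotal from /proc/meminfo; the scratch directory and the commented-out bonnie invocation are assumptions (check your bonnie build's usage for its exact flags):

```shell
#!/bin/sh
# Derive a test size at least four times physical RAM, per the advice
# quoted above. Linux-specific: parses MemTotal from /proc/meminfo.

ram_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
ram_mb=$((ram_kb / 1024))
test_mb=$((ram_mb * 4))

echo "RAM: ${ram_mb} MB -> test file size: ${test_mb} MB"
# Assumed invocation (verify flags against your bonnie's usage summary):
# bonnie -d /mnt/scratch -s "$test_mb"
```

This way the script adapts to whatever machine it runs on instead of hard-coding a size that may be smaller than RAM.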
  
 
In other words, in order to get meaningful results out of a hard drive performance test, your script has to produce <b>more hard drive I/O than the computer has RAM.</b> If I understand this correctly, this means that (for example) if your computer has 1GB of RAM, the script has to read/write at least 1GB before it starts to actually stress the hard drive. Otherwise, the data gets cached in RAM and never touches the hard drive, so you have effectively benchmarked the read/write speed of your RAM. :) Note that this I/O can come in the form of lots of little files totaling more than 1GB, or one giant file bigger than 1GB. In fact, it is probably best to do both. --[[User:Evets|scarter4]] 14:42, 22 January 2009 (UTC)
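Both patterns (one giant file, or many small files with the same total size) can be generated with dd. A small-scale sketch, where BLOCKS and the scratch directory are placeholder assumptions; BLOCKS is deliberately tiny here so the sketch runs anywhere, but a real benchmark needs it set higher than the machine's RAM in MB:

```shell
#!/bin/sh
# Generate test data two ways: one big file, and many small files
# adding up to the same total. BLOCKS=4 is an illustration-size
# placeholder; a real run needs BLOCKS greater than physical RAM in MB.
BLOCKS=4
TESTDIR=$(mktemp -d)

# (a) one file of BLOCKS megabytes
dd if=/dev/zero of="$TESTDIR/big.dat" bs=1M count="$BLOCKS" 2>/dev/null

# (b) BLOCKS separate 1 MB files, same total size
i=0
while [ "$i" -lt "$BLOCKS" ]; do
    dd if=/dev/zero of="$TESTDIR/small.$i" bs=1M count=1 2>/dev/null
    i=$((i + 1))
done

sync  # flush dirty pages so a timed read pass actually hits the disk
echo "big.dat: $(wc -c < "$TESTDIR/big.dat") bytes"
echo "small files: $(ls "$TESTDIR" | grep -c '^small\.')"
```

Timing each pattern separately would show whether the drive behaves differently under one long sequential stream versus many small-file operations, which is the point of doing both.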
 
