For years, the solution to IO bottlenecks has been pretty consistent: (1) add
spindles to decrease seek time and increase throughput, and (2) add as much
RAM as you can so your filesystems and applications can cache hot data and
avoid disk access entirely.
These brute-force attempts to gain performance are inherently flawed and
costly. The price of adding disks to an array adds up
quickly, to say nothing of the investment in additional JBODs when you run out
of slots in your array. And although the cost of consumer-grade memory has
fallen, relying upon RAM for caching in enterprise environments can get
expensive, quickly. Worse, once you run out of DIMM slots for all that RAM,
you’re left with no way to increase the size of your cache aside from
purchasing more servers and building a clustered environment.
Cheaper IOPS: SSD vs Spinning Rust
Whether ‘tis nobler in the mind to suffer the slings and arrows of your IT
manager or to take up cache against a sea of data. With apologies to Bill
Shakespeare, the IT Dog is discussing two different cache write policies:
write through and write back. Let me take a run at telling you what they are,
why they are different, and what you need to think about when deciding which
policy to use.
Cache Write Through
This is the easier to explain and understand of the two policies discussed
here. Cache write through is like having your cake and eating it too. Data is
written into cache …
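As a rough illustration of the two policies, here is a minimal Python sketch. This is not production cache code: the class names are invented, and plain dicts stand in for the real cache and backing disk.

```python
class WriteThroughCache:
    """Write through: a write goes to cache AND the backing store
    before it is acknowledged. Safe, but every write pays disk latency."""
    def __init__(self):
        self.cache = {}
        self.disk = {}

    def write(self, key, value):
        self.cache[key] = value
        self.disk[key] = value   # synchronous write to the backing store


class WriteBackCache:
    """Write back: a write is acknowledged once it hits cache; the
    backing store is updated later. Fast, but dirty data is at risk
    until it is flushed."""
    def __init__(self):
        self.cache = {}
        self.disk = {}
        self.dirty = set()       # keys not yet flushed to disk

    def write(self, key, value):
        self.cache[key] = value  # fast acknowledgment; disk copy is now stale
        self.dirty.add(key)

    def flush(self):
        for key in self.dirty:
            self.disk[key] = self.cache[key]
        self.dirty.clear()
```

The trade-off is visible in the sketch: with write back, the disk copy lags the cache copy until `flush()` runs, which is exactly the window where a power loss can cost you data.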
I wear many hats at VeloBit, one of which is to manage the production and
development lab environment for our engineering team. This means that I face
the same basic challenge that all sysadmins face: growing capacity
requirements and a set budget to work with. Luckily, I have an ace up my
sleeve: VeloBit HyperCache, which lets me do more with less. This, combined
with KVM-based virtualization, allows us to keep running without having to
deal with server sprawl in our labs.
How we use Virtualization at VeloBit
We make extensive use of virtualization for the IT infrastructure at VeloBit …
Solid State Disk (SSD) technology is one of the hottest IT areas in 2012. The
promise to eliminate storage IO bottlenecks, increase application
performance, enable server virtualization and data center growth, and
reduce IT costs has prompted 83% of all IT departments to consider SSD. But
those planning an SSD deployment face a variety of deployment choices.
Moreover, choosing poorly can mean large expense, disruption to existing IT
environments, and a complex new set of storage management tasks.
Join Brian Garrett, Vice President at the Enterprise Strategy Group …
Dr Dog Says “You Need More RAM In Your Cache Diet”
Well, I am not really a doctor, but I play one on TV. You may remember (if
you are old enough) that line from the 1986 Vicks Formula 44 TV commercial.
Anyway, while I may not be a real doctor, I can certainly tell you why you
need more RAM in your cache diet.
“I Think My Cache Diet Is Great”
If you were happily running your IT system but needed some additional
performance, chances are you turned to some kind of SSD caching solution to
improve performance at a lower cost point than adding more servers or
storage. You went on what I a... (more)