24 SSD RAID - Over 20TB of SSD Storage!

Linus Tech Tips · 2016-05-06 · 1,800 words · ~9 min read

Transcript

0:00 If you've been following me on social media, you've probably spent a fair bit
0:03 of your time lately feeling bad for me about all of the SSDs that I had to
0:08 mount in our new 24-drive solid-state storage server. No? All right, well then
0:13 you've probably at least been hoping that I'll make a video about it at some
0:16 point and talk about the performance, and that time is now. This is the all-new
0:22 Whonnock, the fastest beast machine in our
0:26 office.
0:35 The Corsair HX1200i power supply delivers 80 Plus Platinum efficiency for
0:40 quiet performance, and Corsair Link digital advanced monitoring and control.
0:45 Click now to learn more. So, our current storage server, Ruskin,
0:50 uses Seagate 3TB consumer drives in a
0:53 RAID 6 array to achieve respectable read and write performance and some fault
0:59 tolerance: the array can actually lose up to two drives before suffering
1:02 catastrophic data loss, assuming it's able to rebuild before more drives fail
1:07 or an unrecoverable error occurs. This is all fine and good, but the main problem
1:13 with it is that Ruskin was built for one
1:16 editor to work on 4K video files at max
1:19 speed, and we now have a whole room full of editors. So while Ruskin's 10-gigabit
1:25 network interface and sequential data speeds aren't really bottlenecked, its
1:30 mechanical drives are much more suited to a single-person workflow. So I
1:36 reached out to our good buddies at Kingston with a crazy idea: what if we
1:42 slipped free of the surly bonds of mechanical storage and danced the skies
1:47 on SSD-silvered wings? To which they kind
1:50 of went, "um, how much silver, Linus?" I told
1:53 them I wanted 24 1TB-class drives, and
1:57 doggone it, for some reason they said yes. I
2:01 think the most incredible thing about that story is how much the landscape has
2:05 changed in such a short amount of time. Two years ago I could have been the Pope
2:10 in Rome and any SSD maker would have laughed at me for wanting 20 terabytes
2:16 of redundant SSD storage in a single
2:19 server, but in 2015 Kingston's just like,
2:22 "yeah, we've got the enterprise-grade KC310. It's got an 8-channel Phison S10
2:28 controller, 960GB of capacity, ECC
2:32 flash protection for data integrity, power-loss protection, TRIM support
2:36 (although we'll be relying on idle garbage collection in RAID anyway), and
2:39 it's under 60 cents per gig." I mean, holy balls, I'm actually wearing the right
2:43 shirt for that. So let's talk upgrade process then. The first thing I needed
2:48 was way better RAID cards. Yes, cards, not
2:51 a single card. There are 24-port controllers; in fact the old server has
2:56 one, but since each individual SSD is
3:00 capable of 500-plus megabytes per second read and write speeds, if you hook 24 of
3:06 them up to a single card, with a theoretical total speed in the
3:10 neighborhood of 12 gigabytes per second, you're going to run into some pretty serious
3:14 bottlenecks all over the place. So, after removing the placeholder mechanical
3:18 drives from the system, laboriously mounting 24 SSDs on sleds, and connecting
3:24 the SFF-8087 connectors, each of which handles four drives, to their backplane
3:29 in my Norco RPC-4224 chassis, man, I love
3:32 these things. On Kingston's recommendation I picked up three LSI
3:37 9271-8i 8-port RAID cards, each in a PCI
3:42 Express 3.0 x8 slot. This is where the X99
3:47 platform really shows its value, because you're going to need enough PCI Express
3:51 lanes to handle all that storage bandwidth, something that consumer-grade
3:56 platforms simply cannot provide.
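As a rough sanity check on that lane math, here is a sketch using nominal PCIe 3.0 per-lane rates and the ~500 MB/s per-drive figure quoted above; these are back-of-envelope assumptions, not measured numbers:

```python
# Back-of-envelope PCIe bandwidth check for three 8-port RAID cards.
# Assumptions: ~985 MB/s usable per PCIe 3.0 lane (nominal, after
# encoding overhead) and ~500 MB/s sequential per SSD.

PCIE3_LANE_MB_S = 985       # approximate usable payload rate per lane
SSD_MB_S = 500              # per-drive sequential read/write
DRIVES_PER_CARD = 8
CARDS = 3
LANES_PER_CARD = 8          # each 9271-8i occupies a PCIe 3.0 x8 slot

per_card_demand = DRIVES_PER_CARD * SSD_MB_S        # MB/s the drives can push
per_card_supply = LANES_PER_CARD * PCIE3_LANE_MB_S  # MB/s the slot can carry
total_lanes = CARDS * LANES_PER_CARD

print(f"per card: drives ~{per_card_demand} MB/s vs slot ~{per_card_supply} MB/s")
print(f"CPU lanes consumed by storage alone: {total_lanes}")
```

Twenty-four lanes just for storage is why a 16-lane consumer platform wouldn't cut it, while the Haswell-E chips on X99 expose up to 40 CPU lanes.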
3:59 Something a lot of people commented on when I posted a picture of these
4:02 cards on Instagram was that these cards run really hot, and I had them installed
4:07 right next to each other. Don't worry, I'm using a 90mm fan mounted directly on top
4:12 of them for auxiliary cooling, and I'll be bolting that in before I install this
4:16 server in our fancy rack cabinet at the new office. So with all the drives
4:20 installed, the next step was getting firmware updates and drivers taken care
4:24 of for my controllers and configuring arrays. Naturally, the first thing I did
4:29 was throw the whole thing in RAID 0, for the lulz, to see how fast it would go.
4:34 There's a bit of a special process for this in this case, though: you need to
4:38 create a RAID 0 array of eight drives
4:41 on each of the controller cards, then use
4:44 software RAID to put them all together. So in my case that required the use of
4:49 Disk Management in Windows to set each RAID 0 as a dynamic drive, then stripe
4:55 the whole thing together, so it's kind of like RAID 0000 or something like that.
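The nesting described here, a hardware RAID 0 per card with a software stripe across the three resulting volumes, can be sketched with quick arithmetic; the per-drive figures are the nominal numbers quoted earlier, and the ceiling ignores controller overhead:

```python
# Nested striping: each RAID card presents one 8-drive hardware RAID 0
# volume, then Windows software-stripes the three volumes together.

DRIVE_GB = 960          # KC310 capacity
DRIVE_MB_S = 500        # nominal sequential speed per drive
DRIVES_PER_CARD = 8
CARDS = 3

volume_gb = DRIVES_PER_CARD * DRIVE_GB   # one card's RAID 0 volume
total_gb = CARDS * volume_gb             # the software stripe on top
ceiling_mb_s = CARDS * DRIVES_PER_CARD * DRIVE_MB_S

print(f"one hardware RAID 0 volume: {volume_gb} GB")
print(f"software stripe across three cards: {total_gb} GB")
print(f"theoretical sequential ceiling: {ceiling_mb_s} MB/s")
# Striping at both layers means no redundancy at all: any one of the 24
# drives failing takes the whole volume with it, hence this was only a test.
```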
5:00 The results were... well, if Shania were here,
5:04 I guess she'd say that don't impress me much. Read speeds were great: even for
5:09 512K transactions I'm looking at over 5
5:13 gigabytes per second. I mean, remember,
5:16 this is for video editing, so very little of what we deal with is going to be
5:19 smaller than half a meg. With 4K transfers that's more than two full
5:24 orders of magnitude faster than my old 10-hard-drive solution, but those
5:30 write speeds aren't enough to saturate
5:33 the planned 2x 10-gigabit teamed network connection this server is packing if
5:38 multiple users are writing large files to the array. Either way, RAID 0 wasn't
5:43 my final configuration, since I wanted some fault tolerance. So I figured, if I'm
5:47 going to troubleshoot this thing I might as well do it when it's set up properly,
5:51 so I threw my eight-drive arrays in RAID 5. That allows me to lose up to one
5:56 drive per array, and then I also have a spare drive on hand in the unlikely
6:00 event of a failure, which is lots for a server that'll be backed up nightly on
6:04 the network. Then I striped those RAID 5s together in software for what is
6:08 effectively RAID 50. A quick benchmark before the arrays
6:12 were finished initializing revealed worse numbers than RAID 0, although
6:17 that's pretty much a given, since any parity RAID puts much more load on the
6:21 controller card, especially for writes, than a striping RAID. But I really hadn't
6:25 expected them to be this bad, so I waited for the arrays to finish initializing,
6:30 and they got worse. So it was about that time that I
6:35 realized: maybe the write cache setting on solid state makes a bigger difference
6:39 than on mechanical. So even though I don't have battery backups for my cards
6:43 or a UPS for my server yet, I enabled
6:47 write-back cache, and there we go. There is
6:53 the drawback of an unexpected power loss causing potential data loss with write-
6:57 back caching enabled, but we're just going to have to get those batteries and
7:02 UPSes going, because with that setting on we are able to saturate the bananas out
7:07 of any connection we can make on the network to this server. When she's
7:12 handling large streaming reads and writes, this array can do in excess of 5
7:17 gigabytes per second. When she's handling extremely small transactions she can
7:22 still do just under 100 times the performance of Ruskin, and when she's
7:27 able to queue up those small transactions from many clients hitting
7:30 her at the same time, she can do well over 500 megabytes per second. I just
7:35 need to drop another $600 on battery units for the RAID cards and wait for
7:40 the network cards for my clients to show up, so that I can show you guys how the
7:45 network is going to handle all of this.
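The headline numbers check out with simple arithmetic. This is a sketch assuming the planned teamed link is 2x 10-gigabit and using the nominal link rate before protocol overhead; the 5 GB/s is the streaming figure quoted above:

```python
# Usable capacity of the final RAID 50: three 8-drive RAID 5 sets, each
# giving up one drive's worth of space to parity, striped together.

DRIVE_GB = 960
CARDS = 3
DRIVES_PER_SET = 8

usable_gb = CARDS * (DRIVES_PER_SET - 1) * DRIVE_GB
print(f"usable RAID 50 capacity: {usable_gb} GB")  # the 'over 20TB' in the title

# The planned teamed network link vs what the array can stream.
TEN_GIG_MB_S = 1250     # 10 Gb/s is roughly 1250 MB/s
team_mb_s = 2 * TEN_GIG_MB_S
array_mb_s = 5000       # 'in excess of 5 gigabytes per second' for streaming
print(f"teamed link ceiling ~{team_mb_s} MB/s vs array ~{array_mb_s} MB/s")
```

Even the teamed connection, not the RAID 50 array, is the limit for large transfers, which is the point of the networking follow-up teased here.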
7:49 Man, this server-grade stuff is expensive and very time-consuming, but it floats
7:53 my geeky boat to see numbers like this, where a PCI Express-based Predator SSD
8:00 is the bottleneck in a local file
8:03 transfer. Speaking of stuff that floats my geeky boat: iFixit. You probably know
8:07 iFixit from their teardowns of electronic devices and their fantastic
8:13 repair guides on their site that can save you tens, fifties, even hundreds of
8:18 dollars on repair costs. I've used them a number of times, on an iMac, on a phone,
8:23 and I'm sure there's something else, but I'm not thinking of it at the moment.
8:26 What you probably aren't aware of is that iFixit sells professional-
8:29 grade tools as well. So they've got their, uh, their iFixit 54-bit driver kit,
8:34 they've got all these little prying tools, they've got anti-static straps,
8:38 they've got their magnetic organizer, that I actually... I might have... Andy?
8:43 Yeah, I was using this the other day. That lets you write little labels, draw little
8:47 diagrams, and keep all your screws somewhere safe when you're working on a
8:50 project. They've got all kinds of fantastic stuff: whether you're trying to
8:53 take apart a Nintendo DS with a tri-wing bit, whether you're trying to take apart
8:57 McDonald's toys with a triangle bit, or you need to take apart something that
9:01 uses security Torx, all that stuff, they've got it. And what's cool is, when
9:06 you go on their guides, they actually list all of the tools that you need for
9:10 a particular guide. The one to probably start with, though, is the kind-of-
9:14 all-in-one Pro Tech Toolkit. I use mine all the time. It's 65 bucks, and if
9:20 you use ifixit.com/linus and then code LINUS05 at the
9:25 checkout, you save $10 off that or any purchase of $50 or more. So that's
9:30 ifixit.com/linus. Check it out: great tools, great guides, great
9:34 stuff. So that's pretty much it, guys. Thanks for watching. Like the video if
9:38 you liked it, dislike it if you thought it sucked, leave a comment, preferably at the link below to our forum, if you want
9:42 to discuss it. Also linked below: you can buy a cool t-shirt like this one, you can
9:46 give us a monthly contribution if you think what we're doing is important, you
9:50 can change your Amazon bookmark to one with our affiliate code so next time
9:53 you buy 24 SSDs we'll get a kickback from that. Um, and that's pretty much it.
9:57 Don't forget to subscribe and follow and all that good stuff. Thanks again for
10:00 watching.