Comments

videoITguy wrote on 3/19/2013, 8:37 PM
Several things about RAID - generally, motherboard-based RAID will be a huge trade-off on your system bus if you have a fairly complex configuration across many board slots. You can make it work for applications where nothing else on the system is competing for fast lane access across slots. It is the cheapest form of RAID.

Outboard RAID boxes can be great if they are sophisticated enough in their own right to handle the RAID chores. THIS config is NOT cheap.

A good middle ground is a lightweight hardware RAID card - these start at about $280 per card, and again the card needs to be in the right slot on the motherboard to work. SATA drives at 6Gb/s are preferred (this takes special cards and special cabling). If you go with mechanical hard drives instead of SSDs, you can get more drive space inexpensively. BUT not all hard drives are suited for RAID work, and the few that are sell for about 2x the price of a comparable simple drive.
TheRhino wrote on 3/19/2013, 8:46 PM
I'm a big fan of RAID because my workflow has me rendering (2) VEG projects in the background while I burn (2) Blu-ray ISOs and edit (1) VEG in the foreground. I also work with a lot of uncompressed HD & output to 220 Mbps DNxHD. A lot of folks on this forum will dump on RAID, but their projects do not require it.

My main editing workstation has a 12-drive 3Ware 9650SE hardware RAID (PCIe) with (4) 2TB hdds in RAID0, (6) 2TB hdds in RAID10, and (2) hdds in RAID1. All drives are in hotswap trays in a CM Stacker case which has (11) 5.25" bays.

The RAID0 has all source video, which is also copied to hotswap drives as a backup. The RAID10 has all edited footage that will be sent to clients. The RAID1 has all VEG files, BR & DVD ISOs, & other customer-related files, which are also backed up on an offsite server.

This system has been running daily for (3+) years. We have had a drive fail in each of the RAIDs, including the RAID0. For the RAID10 & RAID1 it was a simple matter of replacing the bad drive with a spare. The day of the failure we continued to edit and allowed the replacement drive to rebuild overnight. When the RAID0 failed, we inserted the hotswap drive with the backup media, continued to edit, and rebuilt the RAID0 with a new drive at the end of the day. We plugged in the (4) hotswap drives that had a copy of the source video and transferred the files to the new RAID0 overnight. All was good the very next day.

We get 350+ MBps read/writes which means editing uncompressed HD is buttery smooth...
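As a sanity check, a quick back-of-envelope calculation (a sketch, not from the post; the 1080p29.97 4:2:2 10-bit format is an assumption) shows why 350+ MBps is comfortable for this workflow:

```python
# Back-of-envelope check: does ~350 MB/s cover one uncompressed
# 10-bit 4:2:2 1080p29.97 read plus a 220 Mbit/s DNxHD write?
def uncompressed_rate(width, height, fps, bits_per_pixel):
    """Uncompressed video data rate in MB/s (1 MB = 10**6 bytes)."""
    return width * height * fps * bits_per_pixel / 8 / 1e6

hd_read = uncompressed_rate(1920, 1080, 29.97, 20)  # 4:2:2 at 10 bits/sample = 20 bits/pixel
dnxhd_write = 220 / 8                               # DNxHD 220 Mbit/s, converted to MB/s

print(f"uncompressed HD read: {hd_read:.0f} MB/s")
print(f"DNxHD 220 write:      {dnxhd_write:.1f} MB/s")
print(f"combined:             {hd_read + dnxhd_write:.0f} of 350 MB/s")
```

Reading one uncompressed 10-bit HD stream while writing DNxHD 220 comes to under 185 MB/s combined, leaving plenty of headroom for the parallel renders and burns described above.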

Workstation C with $600 USD of upgrades in April, 2021
--$360 11700K @ 5.0ghz
--$200 ASRock W480 Creator (onboard 10G net, TB3, etc.)
Borrowed from my 9900K until prices drop:
--32GB of G.Skill DDR4 3200 ($100 on Black Friday...)
Reused from same Tower Case that housed the Xeon:
--Used VEGA 56 GPU ($200 on eBay before mining craze...)
--Noctua Cooler, 750W PSU, OS SSD, LSI RAID Controller, SATAs, etc.

Performs VERY close to my overclocked 9900K (below), but at stock settings with no tweaking...

Workstation D with $1,350 USD of upgrades in April, 2019
--$500 9900K @ 5.0ghz
--$140 Corsair H150i liquid cooling with 360mm radiator (3 fans)
--$200 open box Asus Z390 WS (PLX chip manages 4/5 PCIe slots)
--$160 32GB of G.Skill DDR4 3000 (added another 32GB later...)
--$350 refurbished, but like-new Radeon Vega 64 LQ (liquid cooled)

Renders Vegas11 "Red Car Test" (AMD VCE) in 13s when clocked at 4.9 ghz
(note: BOTH onboard Intel & Vega64 show utilization during QSV & VCE renders...)

Source Video1 = 4TB RAID0--(2) 2TB M.2 on motherboard in RAID0
Source Video2 = 4TB RAID0--(2) 2TB M.2 (1) via U.2 adapter & (1) on separate PCIe card
Target Video1 = 32TB RAID0--(4) 8TB SATA hot-swap drives on PCIe RAID card with backups elsewhere

10G Network using used $30 Mellanox2 Adapters & Qnap QSW-M408-2C 10G Switch
Copy of Work Files, Source & Output Video, OS Images on QNAP 653b NAS with (6) 14TB WD RED
Blackmagic Decklink PCIe card for capturing from tape, etc.
(2) internal BR Burners connected via USB 3.0 to SATA adapters
Old Cooler Master CM Stacker ATX case with (13) 5.25" front drive-bays holds & cools everything.

Workstations A & B are the 2 remaining 6-core 4.0ghz Xeon 5660 or I7 980x on Asus P6T6 motherboards.

$999 Walmart Evoo 17 Laptop with I7-9750H 6-core CPU, RTX 2060, (2) M.2 bays & (1) SSD bay...

john_dennis wrote on 3/19/2013, 9:07 PM
No.
musicvid10 wrote on 3/19/2013, 9:50 PM
Why use RAID?
Even a modest SATA drive should be able to handle 120 MBps continuous throughput.
Former user wrote on 3/19/2013, 10:01 PM
We use RAID 0 where I work on our Final Cut Pro systems because we have to do a lot of 10-bit uncompressed high-def files. I end up doing some uncompressed work at home and wish I had a RAID, but my budget doesn't allow both a lot of storage and a RAID, so I have learned patience.

The raid we use at work is on a fiber line. But still Final Cut says it can't keep up sometimes. I would love to run Vegas on it and see what it does, but I can't since these are Macs and work computers.

I also have an internal RAID on a Digital Rapids StreamZ encoding computer. We capture uncompressed HD there to encode for theater projection. It has (4) SCSI drives in RAID 0.

Dave T2
videoITguy wrote on 3/19/2013, 10:27 PM
RAID throughput on a fast USB 3.0 connection (SATA innards) can approach 145 MBps, which is very, very fast. Trouble is, if you are going to spin off one stream of an uncompressed 10-bit file to your NLE, you need to be at 185 MBps to be smooth, which a single mechanical hard drive cannot sustain. You might get that off an SSD in a 2-lane slot running at system bus speed when connected that way....

In order to be smooth with a couple of streams, you really need RAID 0 running at 220 MBps, and for more than that, top-notch hardware should be running at 320 MBps to run really well.
farss wrote on 3/19/2013, 10:33 PM
I've used MOBO RAID 0 for 10bit 4:2:2.
Latest machine, no RAID.
I now have two NAS boxes, one still running RAID 5, the other RAID 6.

If I had the time I'd love to build a big ZFS system.

Bob.
musicvid10 wrote on 3/19/2013, 11:03 PM
If we're going to continue to discuss video demand vs. drive throughput (a red herring at best), let's please stop confusing Mbps (video throughput) with MBps (drive throughput).

8 Mb = 1 MB, so in my example above (a modest mechanical SATA), the conservative sustainable real-world read/write drive throughput (of 120 MBps) is almost 3 times the suggested video demand (of 350 Mbps), roughly 274% of the demand.
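As a sketch, the unit conversion and comparison in the paragraph above look like this:

```python
# Convert drive throughput (MB/s) and video demand (Mbit/s) to the
# same unit, then express the drive's throughput as a percentage of
# the demand (1 MB = 8 Mb).
def throughput_ratio_percent(drive_mb_per_s, video_mbit_per_s):
    drive_mbit_per_s = drive_mb_per_s * 8
    return drive_mbit_per_s / video_mbit_per_s * 100

# Modest SATA (120 MBps) vs. suggested video demand (350 Mbps):
print(f"{throughput_ratio_percent(120, 350):.0f}%")  # -> 274%
```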

It's never been shown to my satisfaction that RAID 0 offers any advantage in a normal editing environment, because the drive is never the data bottleneck. Anyone doing full-toot 4K is already going to have the system to handle it, up front.

With the mathematical failure rate of RAID 0 being 195% over that of a single drive, why use RAID?
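A sketch of the arithmetic behind that 195% figure (the 5% per-drive annual failure rate is an assumed example; for small rates the ratio barely depends on the exact value):

```python
# A RAID 0 stripe fails if ANY member drive fails, so for n
# independent drives each with failure probability p:
def raid0_failure_prob(p, n=2):
    return 1 - (1 - p) ** n

p = 0.05                   # assumed 5% annual failure rate per drive
r = raid0_failure_prob(p)  # two-drive stripe
print(f"single drive: {p:.1%}, RAID 0: {r:.2%}, ratio: {r / p:.0%}")
```

For p = 5%, a two-drive stripe fails with probability 1 - 0.95^2 = 9.75%, i.e. 195% of the single-drive rate, matching the figure quoted above.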
Chienworks wrote on 3/19/2013, 11:44 PM
For smaller systems i'll use RAID 1 for redundancy. I've never encountered a situation in the last 15 years where RAID 0 was worth the risk or even resulted in a noticeable speed benefit.

At work all our DB and file servers are RAID 6 for redundancy, speed, security, and live hot swap capability. But, as pointed out earlier, it's pricey! Our 16 bay outboard RAID boxes run about $4500 just for the eSata host box, and the WD 2TB black drives about $125 to $175 each, depending on what sales we find. Load it up with 6 drives and it's pushing $5400. We've got one with 15 drives that cost about $7000.*

When i get around to building my huge data storage box at home someday, it'll be RAID 1 using LVM to span huge volumes across multiple drives. It'll be able to take 8 3TB drives and present them to the host as a single 12TB volume. No hot-swap, but my home environment isn't so critical that i can't wait for an opportunity to shut it down for 2 minutes to replace a drive. We don't have that luxury with the work DBs.

*Although, that $7000 was dwarfed by the host box we got to go with it, with 24 8-core Xeon CPUs and 128GB RAM, which came in at about $41,000. Too bad i can't install Vegas on it!
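For reference, the usable-capacity arithmetic behind layouts like these can be sketched as follows (a simple model assuming identical drives and ignoring filesystem overhead):

```python
# Usable capacity in TB for common RAID levels with identical drives:
def usable_tb(level, drives, size_tb):
    if level == "raid0":
        return drives * size_tb        # striping, no redundancy
    if level == "raid1":
        return drives // 2 * size_tb   # mirrored pairs
    if level == "raid5":
        return (drives - 1) * size_tb  # one drive's worth of parity
    if level == "raid6":
        return (drives - 2) * size_tb  # two drives' worth of parity
    raise ValueError(f"unknown level: {level}")

print(usable_tb("raid1", 8, 3))   # 8 x 3TB mirrored -> 12
print(usable_tb("raid6", 16, 2))  # 16 x 2TB RAID 6  -> 28
```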
farss wrote on 3/20/2013, 12:45 AM
"With the mathematical failure rate of RAID 0 being 195% over that of a single drive, why use RAID?"

Over the years my experience has been that drives are very reliable. I've had more power supplies and MOBOs fail than HDDs. The one HDD I had fail was a case of infant mortality. The first IDE drives I bought are still working.

I've used RAID 0 either from a Promise RAID controller or from a MOBO RAID controller because that's what BMD recommend. I probably could have gotten by with a single drive, but why risk it?

Of course I'm only a small fish, there's one user here cutting multicam from something like 6 RED cameras and I recall he needed lot of drives in RAID 0 to get it to fly.

As for 4K, that's going to be a real challenge because Sony have upped the game to 14 bits per channel packed into 16 bits. 4K will mean 8 times the data of 1080p, maybe more.

Bob.
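A sketch of the "8 times" estimate above (assuming UHD 3840x2160 and an 8-bit 1080p baseline; both are assumptions, and chroma-subsampling changes are ignored):

```python
# 4K has 4x the pixels of 1080p; 14-bit samples padded into 16-bit
# words double the per-sample storage compared to an 8-bit source.
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160  # UHD; DCI 4K (4096 x 2160) is slightly larger
bits_1080p = 8           # 8 bits stored per sample
bits_4k = 16             # 14 bits padded to 16

scale = (pixels_4k * bits_4k) / (pixels_1080p * bits_1080p)
print(scale)  # -> 8.0
```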
Byron K wrote on 3/20/2013, 1:04 AM
The raid controller is the weakest link on a RAID system.

If you lose your RAID controller/card, you're stuck. I don't do RAID for just this reason. I just lost my P4HT motherboard, which provided a RAID 0+1 array, and now I cannot recover anything from that machine. If it was just a standard drive, or just a RAID1, I could at least remove the drives from the machine and recover.

Losing my ASUS P4 motherboard was the last time I'll RAID my workstations.

I've also lost an Adaptec RAID card and a Netcell RAID card, both of which left corrupted arrays that had to be rebuilt. The only card I've used and can recommend, which provided excellent, stable RAID arrays, is 3ware. I've built a few server systems with these and none of them have failed. I've since moved on to SSDs, where I load my video files to read from, and render to SATA. (;

I always back up original footage to an external SATA drive before editing. (;
ushere wrote on 3/20/2013, 6:08 AM
+1 jd.

last used raid about 8 years ago. however, just working with regular hdv / avchd / mxf, regular hd's are plenty fast enough. enough fx would bog down my cpu before the drives.

every day's edit backup - though (touch wood) i haven't had a drive fail in years (seagate)....
JohnnyRoy wrote on 3/20/2013, 8:42 PM
I use a 9TB CineRAID CR-H458 RAID 5 (4 x Western Digital Red WD30EFRX 3TB IntelliPower SATA 6.0Gb/s) connected via eSATA to store all of my captured video waiting for editing and all of my completed projects. I had a RAID 0 for editing on my previous build but decided not to have one in my new build; I miss the speed, though, and will probably go back to a 2TB (2x 1TB) RAID 0 for editing.

~jr
riredale wrote on 3/20/2013, 9:27 PM
No raid here, either. I work with just HDV, which any drive can handle without breaking a sweat, and my renders are CPU-bound anyway.

I, too, have found hard drives to be very reliable, though I do copy the most recent veg file at the end of the day to another drive, so in theory I could lose the work drive and rebuild the entire project from scratch by just re-importing all the raw files.