Hi Fellow Forum Members:
I joined the forum back in September of 2016 and I have been reading posts and "recommended" documentation ever since. It has given me time to reflect on my objectives and how I should go about getting there. I have gone back and forth between a Supermicro 24-bay chassis solution and something a little more conventional. Finally, after a lot of deliberation, I decided to build a new FreeNAS solution that is more compact and easier to manage in a home environment.
Objectives:
- Replace an older, hardware RAID-6 based storage array (nearing EOL) that is presently used to back up my primary QNAP storage system.
- Move the QNAP appliance into the secondary storage role, kept in sync using rsync.
- Increase storage capacity with this new FreeNAS solution to allow for data growth.
- SMB accessible, shared file system.
- Maximum of 5 concurrent users.
- Mixed file types: media (audio/video) libraries, images, various documents, and backups of personal devices.
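To make the rsync objective above concrete, here is a sketch of the sync job I have in mind. The hostname and dataset paths are made up for illustration; the flags are standard rsync options.

```python
# Sketch of the planned QNAP sync job (hypothetical paths and hostname --
# adjust to the real datasets). This just builds and prints the rsync
# command a scheduled task would run.
SRC = "/mnt/tank/share/"               # assumed FreeNAS dataset mountpoint
DEST = "backup@qnap:/share/backup/"    # assumed QNAP target, reached over SSH

cmd = [
    "rsync",
    "-a",         # archive mode: recurse, preserve perms/times/links
    "--delete",   # mirror deletions so primary and secondary stay in sync
    "--partial",  # keep partially transferred files so restarts can resume
    SRC,          # trailing slash: copy the contents, not the directory itself
    DEST,
]
print(" ".join(cmd))
```

Running this nightly from the FreeNAS box would keep the QNAP a mirror of the primary pool.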
Hopefully, all of my homework and reading and sometimes re-reading have paid off and it won't be too painful to correct any oversights I might have made.
Proposed build:
Case: Fractal Design 804
Power: Seasonic Platinum 760W, calculated from the forum guidelines.
Cooling: There are 3x R2 cooling fans included with the case I have chosen; I will likely need additional fans to optimize cooling, but I may wait and see what temps I get with the stock fans. I don't like buying equipment for what-if scenarios, but I am open to suggestions.
Motherboard: Supermicro X11SSL-CF-O
I selected the X11 because it will allow me to have up to 64GB of ECC unbuffered memory and should adequately support my stated goals.
CPU: E3-1240 v5 (likely overkill for the given tasks, but since I started out with the idea of an E5, I figured I was making a good compromise).
RAM: Crucial 32GB DDR4-2400 ECC UDIMM (CT7982583)
Taken directly from the Crucial website; this is the highest-density option, and I plan to go with either 32GB (one kit) or 64GB (two kits) for this project. The part number above is a 32GB paired kit.
Boot: 2x (mirrored) SanDisk SSD PLUS 120GB Solid State Drive (SDSSDA-120G-G26) [Newest Version]
As I am not a fan of USB fobs, I picked two reasonably priced (~$100 for the pair), well-rated SSDs to host the OS.
Storage: 10x WD Red (WD60EFRX)
I would normally go with an enterprise-class drive; the QNAP, for example, has 4TB Red Pro drives in it. However, I do like the lower wattage of the plain Reds, albeit with lesser warranty coverage. It's a toss-up, but I figured that, given the case I have chosen, if I can lessen the heat load without really hurting overall transfer speed, why not do it? Presently, I have a GbE network, with the ability to go to 10GbE if warranted.
I also thought about going with SAS-based drives on the built-in controller, but then I would have needed to add an expander to get past 8 drives, and that seemed cost-ineffective when I already have 8 SATA connections from the built-in SAS controller (via breakout cables) plus 6 additional SATA connectors on the chosen motherboard. Even with 10 data drives and 2 SSDs for the OS, I have 2 spare SATA ports if needed.
Storage Configuration Options:
For a ZFS vdev/zpool solution, I know that at 10 drives I am pressing the "width" barrier. With that in mind, I have outlined a few potential options. Calculated capacities are provided by the "ZFS Drive Size and Cost Comparison" spreadsheet created by @diedrichg.
Option 1: 1 zpool, 1 vdev, 10 drives in a RAIDZ2 configuration, with a projected capacity of ~37.79TB
- Better capacity, at the cost of reduced redundancy and longer recovery (resilver) times.
- Better redundancy, at the cost of available space.
- A compromise: better redundancy with only a minor reduction in available space.
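For a rough sense of where figures like the ~37.79TB above come from, here is a back-of-the-envelope RAIDZ capacity estimate. It is only a sketch: it subtracts parity drives and converts decimal TB to binary TiB, but ignores the swap, metadata, and allocation overhead that the spreadsheet also accounts for, so it lands higher than the spreadsheet's number.

```python
# Back-of-the-envelope RAIDZ capacity estimate (parity only; real usable
# space is lower once ZFS swap/metadata/allocation overhead is subtracted).

TB = 1000**4    # drives are sold in decimal terabytes
TiB = 1024**4   # capacities are usually reported in binary tebibytes

def raidz_usable_tib(drives: int, drive_tb: float, parity: int) -> float:
    """Raw data capacity of a single RAIDZ vdev, before ZFS overhead."""
    data_drives = drives - parity       # RAIDZ2 -> 2 parity drives, etc.
    return data_drives * drive_tb * TB / TiB

# Option 1: one 10-wide RAIDZ2 vdev of 6TB drives
print(round(raidz_usable_tib(10, 6, 2), 2))   # ~43.66 TiB before overhead
```

The same function makes it easy to compare the other layouts (e.g. RAIDZ3, or two 5-wide RAIDZ2 vdevs) before committing to one.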
Well, that about does it for now. I welcome all constructive criticism and advice, and I look forward to being part of this community of FreeNAS owners.