SSD selection - 8 mirror vdev, WD Red versus Seagate IronWolf

Joined Dec 29, 2014 · Messages 1,135
I am seriously debating replacing the 1TB SATA HDDs (aka spinning rust) in my TrueNAS-12.0-U8.1 unit with SATA SSDs. The main workload is a light ESXi NFS datastore. The two drive types I am considering are these:

Western Digital 1TB WD Red SA500 NAS 3D NAND Internal SSD - SATA III 6 Gb/s, 2.5"/7mm, Up to 560 MB/s - WDS100T1R0A ($110 on Amazon)
Seagate IronWolf 125 SSD 1TB NAS Internal Solid State Drive - 2.5 Inch SATA 6Gb/s speeds of up to 560MB/s with Rescue Service (ZA1000NM1A002) ($170 on Amazon)

This is mainly a lab system, but I enjoy experimenting with ways to make it faster, and I do apply some of what I learn at work. The current pool is 8 mirrored vdevs of Seagate Constellation 2 drives (ST91000640NS) with a spare drive and an Intel Optane SLOG. The storage network is 40G with a Chelsio T580 NIC. I don't mind buying the more expensive Seagate drives if there is a good reason to do it. If there isn't a good reason, the WD option would be around $1,200 cheaper.
 

HoneyBadger (actually does care)
Administrator · Moderator · iXsystems
Joined Feb 6, 2014 · Messages 5,112
The only material difference between the two is their rated endurance: the WD comes in at 600TBW [1], the Seagate at 1400TBW [2]. Both will ping off the bandwidth limit of the SATA3 bus, have similar claimed IOPS numbers, and carry a 5-year warranty. Given the cost difference between the two, my personal choice would be to just get the Reds now and replace them if/when needed down the line.
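
For a day-to-day feel, those TBW figures translate into drive writes per day. A rough conversion, assuming the full 5-year warranty window and the 1TB capacity:

Code:
# DWPD = TBW / (warranty days * capacity in TB)
echo "scale=3; 600 / (5 * 365 * 1)" | bc    # WD Red SA500 1TB:  ~0.33 drive writes/day
echo "scale=3; 1400 / (5 * 365 * 1)" | bc   # IronWolf 125 1TB:  ~0.77 drive writes/day

Unless the datastore is pushing sustained heavy writes, neither figure is likely to be the limiting factor within the warranty period.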

[1] https://documents.westerndigital.co...brief-western-digital-wd-red-ssd-04-00048.pdf

[2] https://www.seagate.com/content/dam...fs/ironwolf-125-ssd-DS2052-1-2007US-en_US.pdf
 
Joined Dec 29, 2014 · Messages 1,135
The only material difference between the two is their rated endurance: the WD comes in at 600TBW [1], the Seagate at 1400TBW [2].
Is there someplace I can pull the write rate from my system? I doubt it is very high since I normally only have 5ish VMs running. This is a snapshot of several of the drives in that pool. I don't know if I could get that data in text instead of from the dashboard.
[Attached screenshot: write statistics for several drives in the pool, from the Reporting dashboard]
 
Joined Dec 29, 2014 · Messages 1,135
D'oh! I should have known 'zpool iostat' would give me that info.
Code:
root@freenas2:/mnt/MIRROR-I/CIFS-I/elliot # zpool iostat MIRROR-I
              capacity     operations     bandwidth
pool        alloc   free   read  write   read  write
----------  -----  -----  -----  -----  -----  -----
MIRROR-I    3.66T  3.59T      8    102  6.56M  2.95M

Code:
root@freenas2:/mnt/MIRROR-I/CIFS-I/elliot # zpool iostat -v MIRROR-I
                                          capacity     operations     bandwidth
pool                                    alloc   free   read  write   read  write
--------------------------------------  -----  -----  -----  -----  -----  -----
MIRROR-I                                3.66T  3.59T      8    102  6.56M  2.95M
  mirror                                 446G   482G      1     12   798K   377K
    gptid/e39027e8-fe1e-11eb-88a4-5c838f806d36      -      -      0      6   399K   188K
    gptid/78fc8b5e-fe1f-11eb-88a4-5c838f806d36      -      -      0      6   399K   188K
  mirror                                 444G   484G      1     12   794K   375K
    gptid/37008e8c-fe1f-11eb-88a4-5c838f806d36      -      -      0      6   397K   188K
    gptid/842c0150-fe1f-11eb-88a4-5c838f806d36      -      -      0      6   397K   188K
  mirror                                 439G   489G      1     11   786K   348K
    gptid/2a574619-fe1f-11eb-88a4-5c838f806d36      -      -      0      5   393K   174K
    gptid/8b834d22-fe1f-11eb-88a4-5c838f806d36      -      -      0      5   393K   174K
  mirror                                 446G   482G      1      8   799K   276K
    gptid/ae46df27-fe1f-11eb-88a4-5c838f806d36      -      -      0      4   400K   138K
    gptid/baea87d8-fe1f-11eb-88a4-5c838f806d36      -      -      0      4   399K   138K
  mirror                                 436G   492G      1     10   780K   300K
    gptid/30495198-fe1f-11eb-88a4-5c838f806d36      -      -      0      5   390K   150K
    gptid/598a5fea-fe1f-11eb-88a4-5c838f806d36      -      -      0      5   390K   150K
  mirror                                 455G   473G      1      8   815K   239K
    gptid/608c2dfd-fe1f-11eb-88a4-5c838f806d36      -      -      0      4   408K   120K
    gptid/9e266479-fe1f-11eb-88a4-5c838f806d36      -      -      0      4   407K   120K
  mirror                                 544G   384G      1      8   975K   240K
    gptid/f4a90804-fe1e-11eb-88a4-5c838f806d36      -      -      0      4   488K   120K
    gptid/10949fd4-fe1f-11eb-88a4-5c838f806d36      -      -      0      4   488K   120K
  mirror                                 541G   387G      1      9   969K   297K
    gptid/e3d10d05-fe1e-11eb-88a4-5c838f806d36      -      -      0      4   485K   149K
    gptid/00d7c50e-fe1f-11eb-88a4-5c838f806d36      -      -      0      4   485K   149K
logs                                        -      -      -      -      -      -
  gptid/b9443779-fe1f-11eb-88a4-5c838f806d36  2.30M   260G      0     19      0   567K
--------------------------------------  -----  -----  -----  -----  -----  -----
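
Worth noting: without an interval argument, zpool iostat reports averages since the pool was imported, so the numbers above smooth over any bursts. Adding an interval and count prints live per-interval figures instead, e.g.:

Code:
# Live throughput, one line every 10 seconds, 6 samples:
zpool iostat -v MIRROR-I 10 6

Even taking the since-boot averages at face value, endurance looks like a non-issue here. A rough back-of-the-envelope using the busiest mirror above (~188K/s written per disk, and assuming that rate held continuously year-round):

Code:
# TB written per year at 188 KiB/s, then years to reach 600 TBW:
echo "scale=1; 188 * 1024 * 31536000 / 10^12" | bc   # ~6.0 TB/year
echo "scale=0; 600 / 6" | bc                         # ~100 years to 600 TBW

Write amplification and scrubs will add to that, but not by two orders of magnitude, so even the 600TBW Reds look comfortable for this workload.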
 