Anyone using LSI SAS 9400-16i

orddie

Contributor
Joined
Jun 11, 2016
Messages
104
Hiya,

Currently I have an LSI 2800-8i card running 8 of my 12 SSD drives, plus an Intel Optane 900P drive as a LOG device to help speed things up.

I personally do not like splitting drives between PCIe adapter card ports and motherboard SATA ports. I would rather have them all on one card.

I was looking at the LSI 9400-16i to put everything on one card with a 12Gb/s pipe to move the data (everything currently is 6Gb/s):

https://www.broadcom.com/products/storage/host-bus-adapters/sas-nvme-9400-16i

Thoughts?
Anyone else using the 9400-16i?
 

MikeyG

Patron
Joined
Dec 8, 2017
Messages
442
I have a 9305 with all SSDs on it. So far so good - only at 4 drives so far though.

Curious what the difference is. Are the 9400s faster? They seem to be geared specifically for NVMe drives, but I'm not quite understanding how.
 

orddie

Contributor
Joined
Jun 11, 2016
Messages
104
I have a 9305 with all SSDs on it. So far so good - only at 4 drives so far though.

Curious what the difference is. Are the 9400s faster? They seem to be geared specifically for NVMe drives, but I'm not quite understanding how.
Both cards are 12Gb/s on a PCIe x8 link, so not much of a difference. From what I was reading, I would NOT be able to do 16 NVMe drives (though why would I, due to saturation) - only 4.
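The 4-drive limit above follows from simple lane budgeting. A rough sketch (the per-drive lane widths are the usual NVMe attach options, not something stated in this thread):

```python
# Why a 16-lane tri-mode HBA tops out at 4 NVMe drives:
# each NVMe drive normally takes a full x4 attach.
hba_lanes = 16        # internal lanes on a 9400-16i (4 ports x 4 lanes)
lanes_x4 = 4          # full-width NVMe attach
lanes_x2 = 2          # reduced-width attach some cabling allows

print(hba_lanes // lanes_x4)  # -> 4 drives at x4
print(hba_lanes // lanes_x2)  # -> 8 drives at x2
```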
 

MikeyG

Patron
Joined
Dec 8, 2017
Messages
442
Interesting. Aside from bandwidth, what is the NVMe device limit? I imagine there are companies like Icy Dock coming out with NVMe bays.

Specs mention 24 devices:
Devices Supported SAS/SATA: 1024; NVMe: 24
 

orddie

Contributor
Joined
Jun 11, 2016
Messages
104
Interesting. Aside from bandwidth, what is the NVMe device limit? I imagine there are companies like Icy Dock coming out with NVMe bays.

Specs mention 24 devices:
Devices Supported SAS/SATA: 1024; NVMe: 24

I'm very interested in looking at the spec sheet of a single HBA that can handle 4x NVMe drives.

The SATA drives I have are rated for 500MB/s. I'm no math major, but all 12 of these slower drives would max out the card (12 drives at 500MB/s produce 48Gbit/s total).
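The arithmetic above can be checked quickly. A sketch using nominal figures (the ~985 MB/s usable-per-lane number for PCIe 3.x is an assumption about which PCIe generation the card negotiates):

```python
# Do 12 SATA SSDs at ~500 MB/s each saturate the HBA's host link?
drives = 12
mb_per_drive = 500                    # MB/s per SATA SSD (nominal)
total_mb = drives * mb_per_drive      # 6000 MB/s aggregate
total_gbit = total_mb * 8 / 1000      # 48.0 Gbit/s, as claimed above

pcie3_x8_mb = 985 * 8                 # ~7880 MB/s usable on a PCIe 3.x x8 link
print(total_gbit)                     # 48.0
print(total_mb < pcie3_x8_mb)         # True: the x8 host link still has headroom
```

So the drives can saturate individual 6Gb/s SAS/SATA lanes, but the card's x8 host link itself is not quite the bottleneck at these speeds.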
 

Chris Moore

Hall of Famer
Joined
May 2, 2015
Messages
10,080
What are you doing with that? Why the need for speed?
 
orddie

Contributor
Joined
Jun 11, 2016
Messages
104
What are you doing with that? Why the need for speed?

The SAN serves both servers and user desktops that are IO intensive - Exchange 2016 and SQL servers, for example. I'm trying to ensure we do not have a disk speed issue by throwing hardware at it.

I felt that using a 12Gb/s card, moving all the disks to that card, and pairing it with Optane would be a good start. At the very least, having all drives in the array on like hardware is, I feel, a better solution.
 

jcl123

Dabbler
Joined
Jul 6, 2019
Messages
23
I have one of these cards. It is a "tri-mode" adapter, as it can do SATA, SAS, and NVMe, all at the same time.
I am using it with an Intel expander so that I have 48Gb/s from just one of its four ports; this is supporting a bunch of SATA spinning disks.

I do not plan to use it with NVMe drives, but you can. It requires a special cable, and you can attach quite a few depending on whether you want 2 lanes per drive or 4 lanes per drive. I guess it all depends on what drives you are using and what you are doing with them. If your drives are NVMe but not the fastest ones, then in theory you could do quite a few, and it would be better than eating up actual PCIe slots. Conversely, if you don't care about transfer rate and just want a crazy amount of IOPS, that would work too. They do make a more expensive version that is a PCIe x16 card.

For me, I just bought it for "future proofing"; it supports all the latest features and OSes now, so it should last me quite a few years.

-JCL
 

Jacoub

Dabbler
Joined
Sep 4, 2020
Messages
14
LSI 9400-16i CAM Errors

Purchased this card 7 months ago to use in a "Supermicro 6048R-E1CR36N", which is a 36-bay SuperStorage server. The upgrade to firmware P15 in tri-mode went smoothly, although I'm not running any SAS or NVMe drives. Installed 30x 8TB SATA drives with TrueNAS 12 CORE.

Unfortunately, the system occasionally gives CAM errors (see below). After some troubleshooting, I found it is related to HDDs connected to the 12-bay backplane. I swapped cables between the 16i ports, replaced power supplies, replaced cables, and finally purchased a brand new 12-bay backplane, "BPN-SAS3-826EL". Unfortunately, none of the above solved the issue, to the extent that I gave up on the 12-bay and started using only the 24-bay. In between, I upgraded the firmware to P16. A few weeks ago I upgraded to P17 and tested the 12-bay, and these CAM errors seem to have disappeared. Accordingly, I decided to flash the SAS/SATA-only P17 firmware, and for the past 2 weeks everything seems OK. I don't want to jump to conclusions here, since I'm only using 2x 4TB SATA drives on the 12-bay (I gave up on it a long time ago). Planning to install 4x 8TB on the 12-bay this weekend and see how it goes.

The reason I can't say for sure what the real cause is: I purchased this server preowned, and it came equipped with an "Adaptec ASR-71605 Gen3 1GB 6Gb/s RAID" card. I didn't bother trying that card since it is SAS2 6Gb/s (although I'm running only SATA 6Gb/s, which I believe is more than enough), but I had to walk that path. So I purchased the 9400-16i, thinking a 16i would give higher throughput since each backplane would have its own 8 channels, and that maybe I would upgrade to NVMe at a certain point - and the fun started.

I would appreciate it if someone could comment on whether they have encountered similar issues.

Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): WRITE(10). CDB: 2a 00 28 7b 36 70 00 01 00 00
Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): CAM status: CCB request completed with an error
Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): Retrying command, 3 more tries remain
Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): WRITE(10). CDB: 2a 00 28 7b 45 20 00 00 58 00
Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): CAM status: CCB request completed with an error
Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): Retrying command, 3 more tries remain
Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): WRITE(10). CDB: 2a 00 28 7b 46 78 00 01 00 00
Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): CAM status: CCB request completed with an error
Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): Retrying command, 3 more tries remain
Sep 13 13:54:35 truenas (da17:mpr0:0:68:0): WRITE(10). CDB: 2a 00 28 7b 84 f0 00 00 b0 00
Sep 13 13:54:35 truenas (da17:mpr0:0:68:0): CAM status: SCSI Status Error
Sep 13 13:54:35 truenas (da17:mpr0:0:68:0): SCSI status: Check Condition
Sep 13 13:54:35 truenas (da17:mpr0:0:68:0): SCSI sense: UNIT ATTENTION asc:29,0 (Power on, reset, or bus device reset occurred)
Sep 13 13:54:35 truenas (da17:mpr0:0:68:0): Retrying command (per sense data)
Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): WRITE(10). CDB: 2a 00 28 7b 5f d8 00 00 58 00
Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): CAM status: SCSI Status Error
Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): SCSI status: Check Condition
Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): SCSI sense: UNIT ATTENTION asc:29,0 (Power on, reset, or bus device reset occurred)
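When chasing errors like these, it helps to tally them per device so you can map the affected `daN` devices back to backplane slots. A minimal sketch, assuming log lines in the format shown above (the sample lines are taken from the excerpt; the log path on a real system would typically be /var/log/messages):

```python
# Count CAM-status errors per (device, status) pair from mpr(4) log lines.
import re
from collections import Counter

def count_cam_errors(lines):
    counts = Counter()
    for line in lines:
        # e.g. "... (da16:mpr0:0:67:0): CAM status: CCB request completed with an error"
        m = re.search(r'\((da\d+):mpr\d+:.*?\): CAM status: (.+)', line)
        if m:
            counts[(m.group(1), m.group(2).strip())] += 1
    return counts

sample = [
    "Sep 13 13:54:35 truenas (da16:mpr0:0:67:0): CAM status: CCB request completed with an error",
    "Sep 13 13:54:35 truenas (da17:mpr0:0:68:0): CAM status: SCSI Status Error",
]
print(count_cam_errors(sample))
```

If one or two devices dominate the counts, that points at specific slots or cable runs rather than the HBA itself.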
 

Herr_Merlin

Patron
Joined
Oct 25, 2019
Messages
200
We are running these cards in one system. They work fast and well.
 

Jacoub

Dabbler
Joined
Sep 4, 2020
Messages
14
We are running these cards in one system. They work fast and well.
Thanks for the feedback. I would appreciate it if you could share your firmware version, which mode you are running (tri-mode or SAS/SATA), TrueNAS version, server brand/model, backplane models, and the drives connected to the card (SATA/SAS/NVMe). Although I bought a manufacturer-sealed, brand new card from eBay, maybe it is defective.

Thanks again
 

firesyde424

Contributor
Joined
Mar 5, 2019
Messages
155
We are currently running a 9400-16e without issue. It's connected to a 102-bay shelf of 12Gb/s SAS SSDs and a 60-bay shelf of 12Gb/s SAS SSDs. We have not had any issues whatsoever with it. I assume the 16i is the same card, just the internal variant.
 

Jacoub

Dabbler
Joined
Sep 4, 2020
Messages
14
We are currently running a 9400-16e without issue. It's connected to a 102-bay shelf of 12Gb/s SAS SSDs and a 60-bay shelf of 12Gb/s SAS SSDs. We have not had any issues whatsoever with it. I assume the 16i is the same card, just the internal variant.
The 9400-16e doesn't support NVMe mode (see "BC00-0459EN (broadcom.com)"), so the argument still stands: the 9400-16i started working fine only after updating to P17 and disabling tri-mode by installing the SAS/SATA firmware. But again, thanks for the information.
 