Hi there!
It's been a while. I've been running a hyperconverged cluster based on 2012 R2 and StarWind for the last five years.
(And it's still up!)
First of all: Thanks a lot for the great software!
Now I am building a new lab and I am looking for some design tips.
My old lab lacks the RAM and IOPS for the VM workload I'm putting on it.
The idea is to test S2D on Server 2019 as well as StarWind and compare their performance and efficiency.
The hardware I've put together for the new lab looks like the following:
- 1x ES-16-XG 10G switch (only one, yes)
- 1x 24-port D-Link Gigabit switch, used as an access switch for my clients
- 2x Dell R720, each with the following hardware:
--- 2x Intel Xeon E5-2650, each with 8 physical cores / 16 logical
--- 256 GB of RAM
--- 1x Intel X710 dual-port 10 Gbit NIC
--- LSI SAS HBA 9207-8i (I also have the original RAID cards from those machines in a drawer; I think they are H710 Minis)
--- 2x mSATA SSD in RAID 1 for the OS
--- 2x Samsung SSD 983 DCT 960 GB NVMe, sitting on PCIe via U.2-to-PCIe adapter cards
--- 4x DC S4600 240 GB SATA SSDs
--- 8x Seagate Constellation.2 1 TB SAS 6 Gb/s disks
In my current cluster I use tiered Storage Spaces for my iSCSI target volumes, because I like having one big pool with all the available capacity of the SSDs and HDDs and letting it manage placement automatically (based on the heat map: frequently used data on SSD, less frequently used data on HDD).
From what I have seen, this is possible with S2D on Server 2019 too; in addition there is a caching feature that would let me use the NVMe drives as a cache for the SSDs and HDDs.
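Just to illustrate what I mean by heat-map tiering (a toy Python sketch of the idea only, not Storage Spaces or StarWind internals; all names and numbers here are made up):

```python
def plan_tiers(heat: dict[str, int], ssd_slots: int) -> dict[str, str]:
    """Assign each extent to a tier: the hottest extents fill the
    limited SSD tier, everything else lands on HDD."""
    # Rank extents from hottest to coldest by access count.
    ranked = sorted(heat, key=heat.get, reverse=True)
    return {ext: ("SSD" if i < ssd_slots else "HDD")
            for i, ext in enumerate(ranked)}

# Hypothetical access counts per extent:
heat = {"sql.mdf/0": 250, "vm1.vhdx/0": 120, "vm1.vhdx/1": 3, "backup/0": 1}
placement = plan_tiers(heat, ssd_slots=2)
# The two hottest extents (sql.mdf/0, vm1.vhdx/0) go to SSD,
# the cold ones (vm1.vhdx/1, backup/0) to HDD.
```

The real tiering engine obviously does this on a schedule with real I/O heat data, but that's the placement behaviour I'm after.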
- Has something similar been added to StarWind in the meantime?
- What setup would you recommend, given the kind of storage/hardware I have, if I build a new cluster with StarWind? (E.g., build a RAID to store the iSCSI target volumes, or create another tiered Storage Space?)
- Would you recommend investing in some RDMA-capable NICs, like the ConnectX-3, with the hardware I have? (Is the possible performance gain worth the money? If it's only about bandwidth, I have a bunch of old Mellanox InfiniBand cards with 20 Gbps.)
I'm looking forward to hearing from you; let me know what you think.
Have a nice day!