We have identified the issue on server MUMMY: the RAID controller firmware is corrupted, preventing the controller from powering on and the server from booting and mounting its disks.
We attempted to force-reflash the RAID controller firmware, but it appears that the controller's ROM has failed, which is also the root cause of the firmware corruption.
A replacement RAID controller has been shipped and is on its way. Unfortunately, we did not have this specific controller in stock as a spare part.
We expect the replacement card to arrive at the datacenter tomorrow (Friday 25th March) before noon. As soon as we receive it, we will replace the failed controller and restore the server to operational status.
We apologize for the inconvenience caused to all affected clients and ask for your patience in this matter. We will issue compensation for the downtime as soon as the server is back online tomorrow.