I registered this domain back on Christmas Day, 2010.
I’ve always been a “VPS enthusiast,” as the technology is the perfect answer to the shortcomings of packed-like-sardines shared hosting and, back then, the “tremendous” cost of a dedicated server. At the time, dedicated servers ran $150–$200/mo minimum, out of reach for a lot of people’s budgets.
Back in 2012, some of the review/deal sites started to get competitive and individuals were attacking websites. I think my website was stress tested, but it was never knocked offline like some of the other sites. After another incident of data loss by a sponsored provider, I decided it was time to focus on my employment, so I started phasing out VPS reviews while continuing to renew this domain, knowing that at some point I would come back to them.
Right now, low hardware costs are matched by low operating costs: some providers sell dedicated servers for $10–$25 per month, with “the sweet spot” around $50 per month. It’s a buyer’s market, where you can find yourself hosted either on a VPS node or on your own hardware.
I find myself setting up more dedicated servers, but I still see the need for virtual private servers; for example, I host my cPanel DNS Only servers on OVH SSD VPS and on Ubiquity Hosting’s cloud platform. I think the two platforms are equal, with OVH appealing to those familiar with OVH’s services and Ubiquity Hosting appealing to US-based customers familiar with similar providers. Ubiquity’s pricing is attractive, and I haven’t had many issues with it so far. The attraction to the service was the $25 credit, though I had to add $10 to my account to verify the service and get it started.
Over the next few weeks you’ll see some added reviews here, from both known and lesser-known providers. I still can’t believe some VPS customers are downloading these benchmarking scripts and abusing a VPS node, desperately chasing the largest numbers in the belief that the performance observed today will be an indication of quality later on. What if the node is replaced? What if the node is already overloaded by other benchmarkers trying to get the best numbers?
I never understood benchmarking. It’s like looking at historical weather data for a vacation destination: sure, it’s the average weather observed, but in most cases it’s not going to be the actual weather. For example, in my area you would think October 4th would be “beach weather,” but it’s unseasonably cool right now, feeling like November. If you relied on historical data instead of the current forecast, you would run around in shorts, a tank top, and flip flops and be uncomfortable.
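For context, the scripts in question typically boil down to a handful of one-shot measurements. A common one is a sequential disk write via dd, which is exactly the kind of number that reflects the node's load at that moment rather than its long-term quality. A minimal sketch (not any specific script's exact invocation; file name and sizes are illustrative):

```shell
# One-shot sequential write test of the sort VPS benchmark scripts run.
# conv=fdatasync forces a flush before dd reports its MB/s figure,
# but the result still swings with whatever the node's neighbors are doing.
dd if=/dev/zero of=bench.tmp bs=64k count=256 conv=fdatasync 2>&1

# Clean up the scratch file.
rm -f bench.tmp
```

Run the same command an hour later, or after the provider migrates you to a different node, and the number can look completely different, which is the point of the weather analogy above.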
With some smaller providers, that outdated data could be improved days, weeks, or months later by deploying a new hardware node or making an adjustment. However, the old information is out there for the Google Machine to suck up and store, and it will keep being brought back up long afterward by customers inquiring about hardware performance. I would rather have my website be a source of factual information that is friendly to both provider and customer, rather than a constant “attack” on companies and providers who already have so much to deal with.