Use your own hardware or use other people's computers?

A colleague of mine recently asked me why anyone would own their own hardware in 2023, specifically because the cloud is "limitless" (pro tip: it's not :) ) and cheaper. Owning servers and using cloud computing are two popular options for businesses to store, manage, and process their data. Both have their pros and cons, and the choice between them depends on the specific needs and priorities of a company. In this article, we will walk through the costs to consider around redundancy, power, physical security, and amortization of hardware when owning servers versus using the cloud, and also discuss the advantages of using a Storage Area Network (SAN) in a cloud environment versus on-premises. Additionally, we will talk about the value of renting public IP addresses versus owning network infrastructure, purchasing a CIDR block, and maintaining an ARIN registration. Finally, we will put together a contrived and intentionally vague cost analysis of purchasing and maintaining tenth-generation Intel server hardware, network equipment, network-attached storage, and a public CIDR versus managing a cloud deployment in Amazon Web Services (AWS) in early 2023. Keep in mind that Google Cloud, Microsoft Azure, and my personal favorite Equinix Metal are all viable alternatives to Amazon Web Services, but AWS seems to be the product that most people in the cloud space regard as the gold standard.

The generic definitions in this article were generated by ChatGPT, mainly because I wanted to see if I could use it to supplement some of my writing.

Redundancy:


One of the key benefits of using cloud computing is that it offers built-in redundancy. This means that data is stored in multiple locations, so if one server fails, the data is still available from another location. This provides a high level of protection for your data and eliminates the need for businesses to invest in additional hardware or software to provide redundancy. On the other hand, owning servers requires businesses to invest in additional hardware and software to provide redundancy. This can be a significant cost, and there is always the risk of a hardware failure that could result in data loss.

Power:


Cloud computing providers typically have large data centers with multiple servers, which allows them to take advantage of economies of scale when it comes to power consumption. This means that they can provide their services at a lower cost than businesses that own their own servers, as they can spread the cost of power consumption over many customers. However, owning servers means that businesses are responsible for the cost of power consumption, which can be a significant cost, particularly if the servers are located in an area with high energy prices.
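To make the power line item concrete, here is a rough back-of-the-envelope sketch in Python. The wattage, PUE (cooling overhead multiplier), and electricity rate are placeholder assumptions for illustration, not measured figures.

```python
# Rough annual power cost for a single on-premises server (all figures are assumptions).
avg_draw_watts = 350      # assumed average draw for a loaded 1U server
pue = 1.6                 # assumed power usage effectiveness (cooling/overhead multiplier)
rate_per_kwh = 0.15       # assumed electricity rate in USD per kWh

hours_per_year = 24 * 365
kwh_per_year = avg_draw_watts / 1000 * hours_per_year * pue
annual_cost = kwh_per_year * rate_per_kwh

print(f"~{kwh_per_year:,.0f} kWh/yr -> ~${annual_cost:,.0f}/yr per server")
# With these assumptions: roughly 4,900 kWh and ~$735 per server per year.
```

Multiply that by your rack count, and by a higher rate if you are in an expensive energy market, and the power line item stops being a rounding error.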

Physical security of a data center:


Cloud computing providers typically have highly secure data centers that are designed to protect against unauthorized access and data breaches. This can include features such as biometric security, video surveillance, and multiple layers of physical security. On the other hand, owning servers means that businesses are responsible for the physical security of the servers, which can be a significant cost, particularly if the servers are located in a high-risk area.

Amortization and capital investment of hardware:


When using cloud computing, businesses do not need to invest in hardware, as they can rent the hardware they need from the cloud computing provider. This means that they can avoid the upfront cost of purchasing hardware, and can instead pay for the hardware over time as they use it. On the other hand, owning servers means that businesses need to invest in the hardware upfront, which can be a significant capital investment. However, owning servers can offer cost savings in the long term, particularly if the hardware does not need to be replaced frequently.

An example of where this can really drive cost savings is purchasing hardware for a specific use case. In 2016 I was part of the Udacity Deep Learning Nanodegree, and a large portion of the cost at the time was cloud compute and waiting. A 2017 article from Nick Condo, "Build a Deep Learning Rig for $800," does a cost analysis of running dedicated hardware on-premises. Owning hardware can pay for itself in the short term if your organization is not specifically targeting five nines of availability.
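Here is a hedged sketch of that break-even math. The rig price, GPU instance rate, and usage hours below are assumptions for illustration, not figures quoted from the article.

```python
# Break-even point for a purchased deep learning rig versus renting a cloud GPU instance.
# All numbers are illustrative assumptions.
rig_cost = 800.0                 # one-time hardware purchase (USD)
rig_power_cost_per_hour = 0.05   # assumed electricity cost while training (USD/hr)
cloud_gpu_per_hour = 0.90        # assumed on-demand GPU instance rate (USD/hr)
training_hours_per_month = 120   # assumed workload

own_cost = lambda months: rig_cost + rig_power_cost_per_hour * training_hours_per_month * months
rent_cost = lambda months: cloud_gpu_per_hour * training_hours_per_month * months

for months in range(1, 25):
    if own_cost(months) <= rent_cost(months):
        print(f"Owning breaks even after ~{months} month(s)")
        break
```

With these assumptions the rig pays for itself in well under a year of steady training, which is why a dedicated, single-purpose box can be the cheaper option despite the upfront spend.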

Storage Area Network (SAN) in a cloud environment versus on-premises:


A SAN is a high-performance network of storage devices that are designed to provide fast and reliable access to data. In a cloud environment, businesses can take advantage of a SAN provided by the cloud computing provider, which can offer several benefits, such as increased performance, reduced downtime, and improved data protection. On the other hand, owning a SAN on-premises requires businesses to invest in the hardware and software needed to set up and maintain the network, which can be a significant capital investment.

Benefits of having the latest generation hardware without a capital investment:


Using cloud computing, businesses can take advantage of the latest generation of hardware without having the upfront costs. By renting cloud hardware, businesses can choose from a range of hardware configurations, and can easily switch between different configurations as needed. This allows businesses to tailor their hardware resources to meet their specific needs, without being locked into a specific hardware configuration.

Cloud computing providers are often able to offer their customers access to the latest generation hardware, as they can purchase hardware in bulk and spread the cost over many customers. This means that businesses can take advantage of the latest technology without having to make a large upfront investment.
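As one example of how cheap that "hardware swap" is in the cloud, here is a minimal boto3 sketch of moving an EC2 instance to a newer instance family. The instance ID and target instance type are placeholders, and this assumes an EBS-backed instance that can be stopped and resized.

```python
# Minimal sketch: swapping an EC2 instance to a newer instance family with boto3.
# Instance ID and target type are placeholders; the instance must be EBS-backed.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
instance_id = "i-0123456789abcdef0"  # placeholder

# Stop the instance and wait until it is fully stopped.
ec2.stop_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])

# Move to a newer generation without buying anything.
ec2.modify_instance_attribute(InstanceId=instance_id,
                              InstanceType={"Value": "m6i.xlarge"})

ec2.start_instances(InstanceIds=[instance_id])
```

The on-premises equivalent of that API call is a purchase order, shipping time, and a maintenance window.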

Contrived Cost Analysis

A cost analysis of owning versus using cloud computing requires considering a variety of factors, including the size of the deployment, the amount of data being stored, the number of users accessing the system, and the specific requirements for processing power, storage, and network bandwidth.

For a generic and contrived scenario, let's assume a company needs to purchase and maintain tenth-generation server hardware, network equipment, network attached storage, and own a public CIDR for a small deployment with 500 GB of data and 100 users.

The cost of the tenth-generation server hardware, network equipment, and network-attached storage can be estimated at $50,000 on the low end. This does not include the ongoing expenses for power and cooling, maintenance, and upgrades, nor does it include hardware redundancy; those expenses can be estimated at around $10,000 per year (depending on egress bandwidth costs and other hidden charges). Additionally, owning a public CIDR can cost around $1,000 per year. Also note that the above does not factor in the human cost of hiring a specialist to run and maintain that hardware.

In comparison, a cloud deployment in AWS for the same deployment can be estimated to cost around $10,000 per year. This cost includes the cost of renting the hardware, as well as the cost of using AWS services such as data storage and network bandwidth. Additionally, AWS provides built-in redundancy, which eliminates the need for the company to invest in additional hardware or software to provide redundancy.

In this contrived scenario, it is clear that the upfront cost of owning tenth-generation server hardware, network equipment, and network-attached storage, and owning a public CIDR is significantly higher than the cost of using a cloud deployment in AWS. However, it is important to note that the actual costs will depend on the specific requirements of the business, and a more detailed analysis should be performed to determine the most cost-effective solution.
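Pulling the rough figures above into a quick cumulative comparison (these are only the intentionally vague estimates from this scenario, not real quotes, and staffing costs are still excluded):

```python
# Cumulative cost of the contrived scenario over time, using the rough figures above.
onprem_upfront = 50_000         # hardware, network gear, NAS (low-end estimate)
onprem_annual = 10_000 + 1_000  # power/cooling/maintenance/upgrades + public CIDR fees
cloud_annual = 10_000           # estimated AWS spend for the same small deployment

for year in range(1, 11):
    onprem_total = onprem_upfront + onprem_annual * year
    cloud_total = cloud_annual * year
    print(f"Year {year:2d}: on-prem ${onprem_total:,}  |  AWS ${cloud_total:,}")

# With these numbers the on-prem option never catches up: its annual spend ($11k)
# already exceeds the cloud estimate ($10k) before counting the $50k purchase.
# Change the workload or the estimates and the picture can flip, which is the point.
```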

In conclusion, while the upfront cost of owning tenth-generation server hardware, network equipment, network-attached storage, and a public CIDR can be substantial, a cloud deployment in AWS can offer a more cost-effective solution in terms of ongoing expenses, along with the added benefit of built-in redundancy.



But for me personally?

But as for me, it's not a cost question, it's a "how can I grow?" question. So I own all of my hardware :)