Ask a Server SSD Expert
Planning the right solution requires an understanding of your project's storage goals. Let Kingston's experts guide you.
For IT departments, evaluating storage products is an important process that can determine the entire shape of a company’s digital infrastructure. A poorly architected storage solution can substantially impair a department’s performance and lead to major outages or, in worst-case scenarios, permanent data loss. However, an intelligent decision, made with the right factors in mind, can provide an organisation with a scalable shared storage solution that meets the performance and reliability service-level objectives of the proposed system design.
Especially at larger scale, maintaining IT architecture can be like keeping an old car operational: expensive and resource-intensive if you don’t have the time or money to source a better alternative. IT sysadmins working with antiquated, inefficient hardware can struggle to keep pace with and support data transformation initiatives.
When designing a greenfield solution, it’s important to first understand the high-level architecture and system design of the proposed solution, along with the possible resource bottlenecks throughout the entire stack. This enables application and storage architects to select and design a suitable storage solution. Below, we highlight some key questions that storage architects should ask to make an informed decision.
When onboarding new applications, it’s important to understand the kind of data being stored to make an informed decision about whether to use block, file or object storage.
Block storage is the most common use case for DAS and SAN environments. In the case of DAS, an entire RAID volume or physical drive is presented to the OS as a raw, unformatted volume. In SAN environments, a LUN (comprising several physical drives) on the storage array is presented to the OS over a high-speed network and likewise appears as a raw, unformatted volume. The raw volume is divided into smaller extents or sectors that the operating system addresses as logical blocks, and the underlying storage subsystem maps those logical blocks to physical blocks on the specific drive(s). Block-level storage is fast, reliable and ideal for continuously changing data such as relational databases, online transaction processing (OLTP) databases, email servers or virtual desktop infrastructure, where high transaction throughput and low latency are requirements.
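To make the block abstraction concrete, here is a minimal Python sketch that reads one raw logical block from an unformatted volume. The device path, block size and block address are hypothetical (reading a raw device requires root privileges, and the logical block size varies by device):

```python
import os

DEVICE = "/dev/sdb"   # hypothetical raw volume from DAS or a SAN LUN
BLOCK_SIZE = 4096     # assumed logical block size; check with `blockdev --getbsz`
LBA = 2048            # logical block address to read

fd = os.open(DEVICE, os.O_RDONLY)
try:
    # The OS sees only numbered logical blocks; the storage subsystem
    # maps them to physical blocks on the underlying drive(s).
    os.lseek(fd, LBA * BLOCK_SIZE, os.SEEK_SET)
    data = os.read(fd, BLOCK_SIZE)
    print(f"Read {len(data)} bytes from LBA {LBA}")
finally:
    os.close(fd)
```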
Object storage stores data (and the associated metadata) in containers with unique identifiers, with no folders or subdirectories like those associated with file storage. It uses the concept of key-value stores, where each key points to a specific “value”, or piece of data, which is retrieved via APIs.
It is mainly used to handle large amounts of unstructured data, such as emails, backup images, video surveillance footage or IoT data used for machine learning and data analytics. Object storage handles very large amounts of data well and can scale as quickly as the application requires, but it is slow at data retrieval, making it inefficient for databases or high-performance computing. Examples of object storage include Amazon S3, Google Cloud Storage and Azure Blob Storage.
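As a rough illustration of the key-value model, this Python sketch stores and retrieves an object through the Amazon S3 API using the boto3 library. The bucket name, object key and local file are hypothetical, and credentials are assumed to be configured in the environment:

```python
import boto3

s3 = boto3.client("s3")

# Store data (the "value") under a unique identifier (the "key").
with open("footage.mp4", "rb") as f:
    s3.put_object(Bucket="my-backup-bucket",
                  Key="surveillance/cam01/2024-01-01.mp4", Body=f)

# Retrieve it later via the same key -- a flat namespace, no folders.
obj = s3.get_object(Bucket="my-backup-bucket",
                    Key="surveillance/cam01/2024-01-01.mp4")
data = obj["Body"].read()
```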
File storage stores data in files, organised in folders and subdirectories, and is shared over a network using SMB (Windows) or NFS (Linux). It’s good for centralising files such as videos, images or documents, but its scalability is limited as the amount of data continues to grow. It is not well suited to handling very large amounts of unstructured data or continuously changing data such as OLTP databases.
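By contrast, file storage is addressed by path. A minimal sketch, assuming a hypothetical NFS or SMB share already mounted at /mnt/shared:

```python
from pathlib import Path

share = Path("/mnt/shared")                     # hypothetical mounted network share
report_dir = share / "finance" / "2024" / "q1"  # folders and subdirectories
report_dir.mkdir(parents=True, exist_ok=True)

# Files are located by hierarchical path, not by key or block address.
(report_dir / "summary.txt").write_text("Quarterly summary goes here.\n")
print(sorted(p.name for p in report_dir.iterdir()))
```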
Successful enterprises therefore concern themselves with building high-performance computing (HPC) systems. They leverage local databases and data services to perform transactional computation, then enable native integration with cloud object stores to hold large amounts of unstructured data. This keeps throughput- and IOPS-intensive transactions on the local data centre’s fast block and file storage, coupled with slower cloud object storage for large amounts of unstructured data.
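One possible shape of this hybrid pattern, sketched with SQLite standing in for a local transactional database and Amazon S3 as the object tier (the bucket, file and table names are hypothetical):

```python
import sqlite3
import boto3

# Transactional records live on fast local block/file storage.
db = sqlite3.connect("orders.db")
db.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, doc_key TEXT)")

# Large unstructured payloads are offloaded to the slower, cheaper object tier.
s3 = boto3.client("s3")
doc_key = "invoices/order-1001.pdf"
s3.upload_file("order-1001.pdf", "my-archive-bucket", doc_key)

# Only a reference to the object is kept in the local database.
db.execute("INSERT INTO orders (doc_key) VALUES (?)", (doc_key,))
db.commit()
```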
Large-scale data processing requires a data storage solution based on the type of data that your enterprise needs to analyse. For example, to process and analyse unstructured on-prem or cloud-based data, companies need a file data platform for a hybrid storage infrastructure, one which can provide real-time analytics and insights.
A central pillar of evaluating storage products is testing and validating them. The benefits of testing are many: improved application performance, storage cost optimisation and risk mitigation are all outcomes that can be tested for with the right tools. That said, small or underfunded IT departments can find this hard to do, as DIY or shareware tools often cannot deliver the rigorous variety of testing necessary to replicate a company’s real-world production environment.
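As one example of what rigorous testing can look like, the sketch below drives a random-read workload with fio (a widely used open-source I/O benchmarking tool) and pulls IOPS and latency from its JSON output. The target device and job parameters are hypothetical and should be tuned to mirror your production workload:

```python
import json
import subprocess

result = subprocess.run(
    ["fio", "--name=randread", "--filename=/dev/sdb",  # hypothetical test device
     "--ioengine=libaio", "--rw=randread", "--bs=4k", "--iodepth=32",
     "--direct=1", "--runtime=60", "--time_based", "--output-format=json"],
    capture_output=True, text=True, check=True)

job = json.loads(result.stdout)["jobs"][0]
print(f"IOPS: {job['read']['iops']:.0f}")
print(f"Mean completion latency: {job['read']['clat_ns']['mean'] / 1000:.1f} µs")
```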
Testing can be used to answer any or all of the questions raised above.
If you are choosing a scalable enterprise data storage solution, it is vital to pay attention to how it works with your data and applications.
A great product can sadly be undermined by the lack of a support team to help an enterprise manage any issues it encounters during use. Conversely, a good product can be raised to new heights by the exceptional efforts of its technical support staff. It may be worth accounting for your existing professional relationship with your enterprise’s incumbent storage vendor when considering a change in your storage solution. Additionally, any service level agreements (SLAs), such as commitments to meet KPIs like latency, throughput or IOPS during specific workloads, should inform your choice. If your intended vendor has a strong reputation in the industry (for example, they consistently exceed industry-standard benchmarks), you can take them at their word when they advertise features such as high IOPS and throughput at acceptable latency for each platform.
Another element to keep in mind is the cost of any storage products you’re considering: not only the cost of acquisition, but also the cost of maintenance and the resulting total cost of ownership (TCO).
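A simple illustrative calculation (all figures are placeholders, not vendor pricing) shows how recurring costs can rival the purchase price over a system’s service life:

```python
capacity_tb = 100               # hypothetical usable capacity
acquisition = 40_000            # hypothetical purchase price, in dollars
annual_maintenance = 4_000      # hypothetical support contract per year
annual_power_cooling = 1_500    # hypothetical energy and cooling per year
years = 5

tco = acquisition + years * (annual_maintenance + annual_power_cooling)
print(f"{years}-year TCO: ${tco:,} (${tco / (capacity_tb * 1000):.2f}/GB)")
```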
#KingstonIsWithYou