In an era of data explosion, Seagate stressed that to gain valuable analytics and insights through machine learning, companies must build an infrastructure capable of collecting and storing as much data as possible.
Seagate cited five top data storage trends for 2020: ▲the growing importance of tiered security for data both at rest and in use ▲broader enterprise adoption of object storage ▲the mainstreaming of composable systems ▲tiering of data in large-capacity storage deployments according to frequency of use ▲increased data usability through Formative AI.
As the hyper-scale software ecosystem continues to grow, small-scale application development becomes possible even for companies and locations that lack the necessary network infrastructure. As a result, more cloud-based applications around the world are running in interconnected points of presence (PoPs) or co-location facilities. Encryption of data at rest has already become a prerequisite in most industries, and businesses will likely need to adopt encrypted disks as soon as possible to prepare for security threats that can arise at any time over external and internal channels.
With the explosion of useful data, companies are rapidly moving from traditional file systems to object storage. Object storage is becoming the standard for large-scale data by offering advantages that conventional file storage lacks, such as customizable metadata, excellent scalability, and freedom from hierarchical data structures, the company explained. In addition, unlike the file storage used in the past, object storage can exploit economies of scale, so many applications are already migrating from file storage to object storage.
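The contrast above can be sketched in a few lines of Python. This is a hypothetical toy (the `ObjectStore` class and its methods are illustrative, not any vendor's API): each object lives under a single flat key and carries its own metadata, so lookups query metadata instead of walking a directory tree.

```python
class ObjectStore:
    """Toy flat-namespace store: each object is addressed by one key and
    carries per-object metadata; there is no directory hierarchy."""

    def __init__(self):
        self._objects = {}  # key -> (data, metadata dict)

    def put(self, key, data, **metadata):
        # The key may contain slashes, but they have no structural meaning.
        self._objects[key] = (data, metadata)

    def get(self, key):
        return self._objects[key]

    def find_by_metadata(self, **criteria):
        # Metadata queries replace directory traversal.
        return [k for k, (_, meta) in self._objects.items()
                if all(meta.get(f) == v for f, v in criteria.items())]


store = ObjectStore()
store.put("2020/sensor-0042.json", b"{...}", source="sensor", year=2020)
store.put("2020/report.pdf", b"%PDF", source="finance", year=2020)
print(store.find_by_metadata(source="sensor"))  # ['2020/sensor-0042.json']
```

Because no hierarchy has to be kept consistent, a store like this can be sharded across many machines by key alone, which is one reason object storage scales more economically than file systems.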
Kubernetes, an open-source system that automates the deployment, scaling, and management of containerized applications, is seeing growing adoption, a sign that data storage trends are heading toward composable systems. Open source is seen as the future of application development because it mobilizes a large community to solve the problems faced by various industries and enables domain-specific solutions within an open architecture. Hardware development, in turn, is changing to suit these software and business environments.
Looking at SSDs and HDDs as storage in different tiers, the most efficient balance of cost and performance can be found through tiering, the company explained. It added that as new technologies such as storage-class memory are developed, an architecture that extracts maximum value from every storage tier becomes even more important. For economic reasons, the data-identification capability of data center software is also steadily improving, and storage tiering is progressing, with frequently used data stored on high-cost, high-performance media and less frequently used data on inexpensive, large-capacity media.
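The tiering policy described above reduces to a simple placement rule. Here is a minimal sketch, assuming a hypothetical access counter and a made-up "hot" threshold; real tiering software uses richer signals (recency, I/O patterns), but the cost-versus-performance trade-off is the same.

```python
def assign_tier(access_count, hot_threshold=100):
    """Place frequently accessed ("hot") data on fast SSD media and
    the rest on inexpensive, large-capacity HDD media.
    The threshold of 100 accesses is an illustrative assumption."""
    return "ssd" if access_count >= hot_threshold else "hdd"


# Hypothetical access counts collected by the data center software.
access_log = {"index.db": 5000, "archive-2019.tar": 3, "thumbnails.bin": 250}
placement = {name: assign_tier(count) for name, count in access_log.items()}
print(placement)
# {'index.db': 'ssd', 'archive-2019.tar': 'hdd', 'thumbnails.bin': 'ssd'}
```

Adding a tier such as storage-class memory just extends the same rule with another threshold, which is why the article frames tiering as an architecture question rather than a device question.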
Not only the total amount of data produced but also the amount of useful data is growing explosively, and advances in artificial intelligence and machine learning have made it possible to extract additional information from stored data. As the useful life of data lengthens, companies must develop the ability to store large volumes of data and uncover the important information within it.
This trend is accelerating thanks to Formative AI, which can adapt to the situation at hand; it relies on a flexible architecture capable of responding intelligently to change, which aligns with the trend toward tiering in data storage. It is difficult to predict which direction machine learning will evolve, so storing more data and adopting efficient storage methods will become ever more important, the company predicted.