scalability

Results 151 - 175 of 608
Published By: NetApp     Published Date: Sep 24, 2013
Storage system architectures are shifting from large scale-up designs to scale-out clustered storage. The need to increase storage and application availability, performance, and scalability while eliminating infrastructure and application downtime has driven this architectural shift. This paper examines the adoption and benefits of clustered storage among firms of different sizes and geographic locations. Access this paper now to discover how clustered storage offerings meet firms’ key requirements, with benefits including:
- Non-disruptive operations
- Secure multi-tenancy
- Scalability and availability
- And more
Tags : 
storage infrastructure, clustered storage, technology, scalability, application, storage solution
    
NetApp
Published By: NetApp     Published Date: Sep 24, 2013
Storage system architectures are moving away from monolithic scale-up approaches and adopting scale-out storage, which provides a powerful and flexible way to respond to the inevitable data growth and data management challenges in today’s environments. Extensive data growth demands higher levels of storage and application availability, performance, and scalability. Access this technical report for an overview of NetApp clustered Data ONTAP 8.2 and see how it incorporates industry-leading unified architecture, non-disruptive operations, proven storage efficiency, and seamless scalability.
Tags : 
storage infrastructure, clustered storage, technology, scalability, application, storage solution, non-disruptive operations
    
NetApp
Published By: NetApp     Published Date: Sep 24, 2013
As IT organizations implement advanced capabilities alongside traditional services such as server virtualization, storage systems become more complex, and the rapid growth of data to be managed only compounds that complexity. View this resource to learn the results of ESG Lab’s hands-on evaluation of NetApp storage systems, with a focus on the enterprise-class capabilities required to manage increasingly large and complex storage environments.
Tags : 
storage infrastructure, clustered storage, technology, scalability, application, storage solution, non-disruptive operations
    
NetApp
Published By: NetApp     Published Date: Sep 30, 2013
Today’s data centers are being asked to do more at less expense and with little or no disruption to company operations. They’re also expected to run 24x7, handle numerous new application deployments and manage explosive data growth. Data storage limitations can make it difficult to meet these stringent demands. Faced with these challenges, CIOs are discovering that the “rip and replace” disruptive migration method of improving storage capacity and I/O performance no longer works. Access this white paper to discover a new version of NetApp’s storage operating environment, and find out how this software update eliminates many of the problems associated with typical monolithic or legacy storage systems.
Tags : 
storage infrastructure, clustered storage, technology, scalability, application, storage solution
    
NetApp
Published By: NetApp     Published Date: Dec 09, 2014
Although the cost of flash storage solutions continues to fall, flash is still significantly more expensive to acquire than traditional hard drives on a per-gigabyte capacity basis. However, when cost per gigabyte is examined in terms of TCO, the picture changes. Looking past pure acquisition cost to account for “soft factors” such as prolonging the life of a data center, lower operating costs (for example, power and cooling), increased flexibility and scalability, and the service levels that a flash solution enables, flash solution costs become increasingly competitive with spinning media.
Tags : 
flash storage, netapp, tco, flash storage solutions, soft factors, flash solution
    
NetApp
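The TCO argument above can be illustrated with a quick back-of-the-envelope calculation. The figures below (acquisition cost per GB, annual power/cooling and operations costs, capacity, and service life) are purely hypothetical assumptions chosen for illustration, not NetApp or industry pricing:

```python
# Hypothetical TCO comparison: flash vs. spinning disk per usable GB.
# All figures are illustrative assumptions, not vendor pricing.

def tco_per_gb(acquisition_per_gb, annual_power_cooling, annual_ops,
               capacity_gb, years):
    """Total cost of ownership per GB over the system's service life."""
    total = (acquisition_per_gb * capacity_gb
             + (annual_power_cooling + annual_ops) * years)
    return total / capacity_gb

# Assumed: flash costs more to buy, but far less to power and operate.
flash = tco_per_gb(0.50, 2_000, 5_000, 100_000, 5)   # $0.85/GB over 5 years
disk = tco_per_gb(0.10, 10_000, 15_000, 100_000, 5)  # $1.35/GB over 5 years
print(f"flash TCO/GB: ${flash:.2f}, disk TCO/GB: ${disk:.2f}")
```

Under these assumed numbers the higher acquisition cost of flash is more than offset by lower operating costs over the service life, which is the paper's core claim.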
Published By: Dell EMC     Published Date: Aug 03, 2015
This infographic provides leading analyst insights on the all-flash array market and how innovation accelerators are driving the agile data center through high data growth and the need for increased scalability and performance.
Tags : 
innovation, industry, data center, data growth, scalability
    
Dell EMC
Published By: Rackspace     Published Date: Nov 11, 2019
The Customer
Spiraledge is a health and ecommerce company in the online retail, activity tracking and farm management spaces.
The Obstacles They Faced
The online retailer needed to improve scalability, performance and agility to reduce the risk of unpredictable traffic causing outages or bad customer experiences.
What Spiraledge Achieved with Rackspace and GCP
After completing a 13TB migration, Spiraledge’s new Google Cloud Platform is more responsive to traffic spikes and has increased key business results from R&D platform innovations.
Tags : 
    
Rackspace
Published By: Oracle HCM Cloud     Published Date: Aug 02, 2016
In the midst of industry consolidation, shrinking margins, and fierce competition for talent, health care payers are facing increasing pressure to deliver more cost efficient, high-quality patient care. Learn how to succeed in this dynamic healthcare market by integrating financial and HR systems to tackle immediate challenges and create scalability for the future.
Tags : 
healthcare payers, healthcare hcm, healthcare erp and hcm in the cloud
    
Oracle HCM Cloud
Published By: Spectrum Enterprise     Published Date: Oct 05, 2017
What does high growth mean to your business? Ask your business peers that question and there will be critical elements and key priorities in common: the need for speed and efficiency, a future-proof technology strategy, and high-performance network connectivity, just to name a few. Of course, reliability, scalability, and security will also come up as indispensable aspects of any high-growth solution. This guide gives you an overview of the steps you need to build a foundation for sustainable growth -- the kinds of investments, drivers, and differentiators that are involved.
Tags : 
    
Spectrum Enterprise
Published By: Applause Israel     Published Date: Sep 25, 2018
The goal of usability testing, simply put, is to make sure that a user can complete the tasks they are expected to complete. Usability testing doesn’t test whether the functions of the application, website or connected device work correctly, but rather whether a user intuitively understands how to perform these tasks, and how easy or difficult it was to do so. With usability testing, “close enough” won’t cut it. A product may have a superior architecture, a great set of features, good performance, scalability and a number of other positive attributes. However, all of this effort is wasted if the user experience is inadequate. An application, website or connected device that is not user-friendly is just as bad as a buggy version and can lead to diminished revenue, product abandonment or a total failure. An application with poor usability can also negatively affect a brand.
Tags : 
    
Applause Israel
Published By: IBM APAC     Published Date: May 14, 2019
The perceived x86 benefits of lower acquisition cost and standardization often come at the expense of performance, reliability, scalability and manageability. Moreover, many decisions are driven by the impression that x86-based systems will solve all computing challenges, when often that is not the case. This eBook looks at companies that chose to invest in IBM® Power Systems™ rather than continuing to run on or migrate to x86-based systems, and explains why.
Tags : 
    
IBM APAC
Published By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical hands-on guide to “day two” challenges of accelerating large-scale PostgreSQL deployments. With the ongoing shift toward open-source database solutions, it’s no surprise that PostgreSQL is the fastest-growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that approach is both misleading and incorrect when evaluating the potential return on investment of a database technology migration. After a PostgreSQL deployment is live, a variety of day-two scenarios require planning and strategizing. The third section of this eBook provides a detailed analysis of accelerating large-scale PostgreSQL deployments, covering:
- Backups and Availability: strategies, point-in-time recovery, availability and scalability
- Upgrades and DevOps: PostgreSQL upgrade process, application upgrades and CI/CD
Tags : 
    
Stratoscale
Published By: SES     Published Date: Oct 12, 2016
SES Plus comprises a series of differentiated solutions designed to offer enterprises more flexibility and more scalability to meet their connectivity needs.
Tags : 
connectivity, enterprise, ses plus, collaboration, business connectivity, roaming
    
SES
Published By: Pure Storage     Published Date: Oct 09, 2018
Man AHL is a pioneer in the field of systematic quantitative investing. Its entire business is based on creating and executing computer models to make investment decisions. The firm has adopted the Pure FlashBlade™ solution from Pure Storage to deliver the massive storage throughput and scalability required to meet its most demanding simulation applications.
Tags : 
    
Pure Storage
Published By: IBM     Published Date: Feb 22, 2016
To meet the business imperative for enterprise integration and stay competitive, companies must manage the increasing variety, volume and velocity of new data pouring into their systems from an ever-expanding number of sources.
Tags : 
ibm, data, performance, scalability, information integration, big data
    
IBM
Published By: IBM     Published Date: Jan 27, 2017
Every day, torrents of data inundate IT organizations and overwhelm the business managers who must sift through it all to glean insights that help them grow revenues and optimize profits. Yet, after investing hundreds of millions of dollars into new enterprise resource planning (ERP), customer relationship management (CRM), master data management systems (MDM), business intelligence (BI) data warehousing systems or big data environments, many companies are still plagued with disconnected, “dysfunctional” data—a massive, expensive sprawl of disparate silos and unconnected, redundant systems that fail to deliver the desired single view of the business.
Tags : 
    
IBM
Published By: IBM     Published Date: Jan 27, 2017
Companies today increasingly look for ways to house multiple disparate forms of data under the same roof, maintaining original integrity and attributes. Enter the Hadoop-based data lake. While a traditional on-premise data lake might address the immediate needs for scalability and flexibility, research suggests that it may fall short in supporting key aspects of the user experience. This Knowledge Brief investigates the impact of a data lake maintained in a cloud or hybrid infrastructure.
Tags : 
    
IBM
Published By: IBM     Published Date: Apr 18, 2017
Apache Hadoop technology is transforming the economics and dynamics of big data initiatives by supporting new processes and architectures that can help cut costs, increase revenue and create competitive advantage. An effective big data integration solution delivers simplicity, speed, scalability, functionality and governance to produce consumable data. To cut through the misinformation surrounding Hadoop and develop an adoption plan for your big data project, you must follow a best-practices approach that takes into account emerging technologies, scalability requirements, and current resources and skill levels.
Tags : 
data integration, data security, data optimization, data virtualization, database security, data migration, data assets, data delivery
    
IBM
Published By: IBM     Published Date: Jul 06, 2017
In order to exploit the diversity of data available and modernize their data architecture, many organizations explore a Hadoop-based data environment for its flexibility and scalability in managing big data. Download this white paper for an investigation into the impact of Hadoop on the data, people, and performance of today's companies.
Tags : 
hadoop, flexibility, scalability, data architecture
    
IBM
Published By: IBM     Published Date: Jul 06, 2017
Companies today increasingly look for ways to house multiple disparate forms of data under the same roof, maintaining original integrity and attributes. Enter the Hadoop-based data lake. While a traditional on-premise data lake might address the immediate needs for scalability and flexibility, research suggests that it may fall short in supporting key aspects of the user experience. This Knowledge Brief investigates the impact of a data lake maintained in a cloud or hybrid infrastructure.
Tags : 
data lake, user experience, knowledge brief, cloud infrastructure
    
IBM
Published By: IBM     Published Date: Jul 06, 2017
Known by its iconic yellow elephant, Apache Hadoop is purpose-built to help companies manage and extract insight from complex and diverse data environments. The scalability and flexibility of Hadoop might be appealing to the typical CIO but Aberdeen's research shows a variety of enticing business-friendly benefits.
Tags : 
data management, yellow elephant, business benefits, information management
    
IBM
Published By: IBM     Published Date: Sep 28, 2017
Here are the 6 reasons to change your database:
- Lower total cost of ownership
- Increased scalability and availability
- Flexibility for hybrid environments
- A platform for rapid reporting and analytics
- Support for new and emerging applications
- Greater simplicity
Download now to learn more!
Tags : 
scalability, hybrid environment, emerging applications, rapid reporting
    
IBM
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
As organizations develop next-generation applications for the digital era, many are using cognitive computing ushered in by IBM Watson® technology. Cognitive applications can learn and react to customer preferences, and then use that information to support capabilities such as confidence-weighted outcomes with data transparency, systematic learning and natural language processing. To make the most of these next-generation applications, you need a next-generation database. It must handle a massive volume of data while delivering high performance to support real-time analytics. At the same time, it must provide data availability for demanding applications, scalability for growth and flexibility for responding to changes.
Tags : 
database, applications, data availability, cognitive applications
    
Group M_IBM Q1'18
Published By: Oracle     Published Date: Jan 28, 2019
Oracle Engineered Systems are architected to work as a unified whole, so organizations can hit the ground running after deployment. Organizations choose how they want to consume the infrastructure: on-premises, in a public cloud, or in a public cloud located inside the customer’s data center and behind their firewall using Oracle’s “Cloud at Customer” offering. Oracle Exadata and Zero Data Loss Recovery Appliance (Recovery Appliance) offer an attractive alternative to do-it-yourself deployments. Together, they provide an architecture designed for scalability, simplified management, improved cost of ownership, reduced downtime, zero data loss, and an increased ability to keep software updated with security patches. Download this whitepaper to discover ten capabilities to consider for protecting your Oracle Database environments.
Tags : 
    
Oracle
Published By: IBM APAC     Published Date: Aug 25, 2017
Two-thirds of organizations that blend traditional and cloud infrastructures are already gaining advantage from their hybrid environments. However, leaders among them use hybrid cloud to power their digital transformation, going beyond cost reduction and productivity gains.
Tags : 
cost reduction, infrastructure, business process, workflow, scalability, resiliency
    
IBM APAC