data volume

Results 76 - 100 of 217
Published By: IBM APAC     Published Date: Jul 19, 2019
As businesses generate ever-larger data volumes to improve their competitiveness, their IT infrastructures are struggling to store and manage all of that data. To keep pace with this growth, organizations require a modern enterprise storage infrastructure that can scale to meet the demands of large data sets while reducing the cost and complexity of infrastructure management. This white paper examines IBM’s FlashSystem 9100 solution and the benefits it can offer businesses.
Tags : 
    
IBM APAC
Published By: Cohesity     Published Date: May 09, 2018
The growing importance—and complexity—of data protection means old approaches will no longer get the job done in an era of exploding data volumes and ever-changing business requirements. It’s time to reimagine and reengineer your IT infrastructure for a more efficient, affordable and manageable data protection framework.
Tags : 
    
Cohesity
Published By: Resource     Published Date: Dec 04, 2018
What’s a common characteristic of the best talent? They all have jobs. In today’s marketplace, to get the best talent you have to convince them your opportunity is better than what they currently have. The good news: it can be done - and you can win consistently with a deliberate outbound process. The bad news: it requires an intentional approach, which is challenging without the right tools and data. Your success depends on your ability to build a repeatable process to identify and recruit a steady volume of high-quality applicants. The way to accelerate and scale your outbound process is to benchmark and refine it regularly using data. Below we’ll walk through the steps to building a data-driven recruiting process, based on... you guessed it... data.
Tags : 
    
Resource
Published By: Amazon Web Services     Published Date: Oct 09, 2017
Today’s organizations are tasked with managing multiple data types coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that, in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. Data Lakes allow an organization to store all of its data, structured and unstructured, in one centralized repository.
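The store-everything-as-is idea behind a data lake can be sketched in a few lines. The `DataLake` class, keys and metadata fields below are illustrative assumptions for this listing, not AWS's actual API - a real data lake typically sits on an object store such as Amazon S3 with a separate catalog service:

```python
class DataLake:
    """Toy data lake: keep raw payloads untouched, catalog them with metadata.

    Schema-on-read: nothing is transformed at ingest time; interpreting the
    bytes is deferred until analysis.
    """
    def __init__(self):
        self.objects = {}   # key -> raw bytes, stored exactly as received
        self.catalog = []   # lightweight metadata entries for discovery

    def ingest(self, key: str, payload: bytes, source: str, fmt: str):
        self.objects[key] = payload
        self.catalog.append({"key": key, "source": source,
                             "format": fmt, "bytes": len(payload)})

lake = DataLake()
# Structured and unstructured data land in the same repository.
lake.ingest("logs/web-001", b'{"path": "/home", "ms": 12}', "webserver", "json")
lake.ingest("img/cam-17", b"\x89PNG...", "camera", "png")
print([entry["format"] for entry in lake.catalog])
```

Analysts would then browse the catalog to find data sets, pulling raw objects only when needed.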
Tags : 
cost effective, data storage, data collection, security, compliance, platform, big data, it resources
    
Amazon Web Services
Published By: Teradata     Published Date: May 02, 2017
A Great Use of the Cloud: Recent trends in information management see companies shifting their focus to, or entertaining the notion for the first time of, a cloud-based solution. In the past, the only clear choice for most organizations has been keeping data on-premises—oftentimes on an appliance-based platform. However, the costs of scale are gnawing away at the notion that this remains the best approach for all or some of a company’s analytical needs. This paper, written by McKnight Consulting analysts William McKnight and Jake Dolezal, describes two organizations with mature enterprise data warehouse capabilities that have pivoted components of their architecture to accommodate the cloud.
Tags : 
data projects, data volume, business data, cloud security, data storage, data management, cloud privacy, encryption, security integration
    
Teradata
Published By: Nice Systems     Published Date: Feb 26, 2019
NICE has made a significant investment into AI and ML techniques that are embedded into its core workforce management solution, NICE WFM. Recent advancements include learning models that find hidden patterns in the historical data used to generate forecasts for volume and work time. NICE WFM also has an AI tool that determines, from a series of more than 40 models, which single model will produce the best results for each work type being forecasted. NICE has also included machine learning in its scheduling processes, which are discussed at length in the white paper.
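The pick-the-best-model idea can be illustrated with a simple holdout comparison: fit each candidate on the older history, score it against the most recent periods, and keep the winner. The three forecasters below are generic stand-ins chosen for illustration, not NICE WFM's actual models:

```python
def mae(actual, predicted):
    """Mean absolute error between two equal-length series."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

# Candidate forecasters: each maps a training series to a forecast
# over the holdout horizon.
def naive(train, horizon):
    return [train[-1]] * horizon            # repeat the last observation

def mean_model(train, horizon):
    return [sum(train) / len(train)] * horizon  # repeat the historical mean

def drift(train, horizon):
    slope = (train[-1] - train[0]) / (len(train) - 1)
    return [train[-1] + slope * (h + 1) for h in range(horizon)]

def best_model(series, holdout=3):
    """Score every candidate on a holdout window; return the winner's name."""
    train, test = series[:-holdout], series[-holdout:]
    candidates = {"naive": naive, "mean": mean_model, "drift": drift}
    scores = {name: mae(test, f(train, holdout))
              for name, f in candidates.items()}
    return min(scores, key=scores.get), scores

volumes = [100, 110, 120, 130, 140, 150, 160, 170]  # steadily growing volume
winner, scores = best_model(volumes)
print(winner)  # drift wins on a trending series
```

A production system would repeat this per work type, so each queue gets whichever model forecasts it best.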
Tags : 
    
Nice Systems
Published By: IBM     Published Date: Jul 05, 2016
Today's data-driven organization is faced with magnified urgency around data volume, user needs and compressed decision time frames. To address these challenges while maintaining an effective analytical culture, many organizations are exploring cloud-based environments coupled with powerful business intelligence (BI) and analytical technology to accelerate decisions and enhance performance.
Tags : 
ibm, datamart on demand, analytics, cloud, hybrid cloud, business insight
    
IBM
Published By: IBM     Published Date: Jul 06, 2016
Today data volumes are exploding in every facet of our lives. Business leaders are eager to harness the power of big data, but before setting out into the big data world it is important to understand that, as opportunities increase, ensuring that source information is trustworthy and protected becomes exponentially more difficult. This paper provides a detailed review of the best practices clients should consider before embarking on their big data integration projects.
Tags : 
ibm, big data, trusted data, data management, data solutions
    
IBM
Published By: IBM     Published Date: Oct 13, 2016
Who's afraid of the big (data) bad wolf? Survive the big data storm by getting ahead of integration and governance functional requirements. Today data volumes are exploding in every facet of our lives. Business leaders are eager to harness the power of big data, but before setting out into the big data world it is important to understand that, as opportunities increase, ensuring that source information is trustworthy and protected becomes exponentially more difficult. This paper provides a detailed review of the best practices clients should consider before embarking on their big data integration projects.
Tags : 
ibm, big data, trusted data, data management, data solutions
    
IBM
Published By: IBM     Published Date: Jan 27, 2017
A big data integration platform that is flexible and scalable is needed to keep up with today’s ever-increasing big data volume.
Tags : 
    
IBM
Published By: IBM     Published Date: Apr 14, 2017
A big data integration platform that is flexible and scalable is needed to keep up with today’s ever-increasing big data volume. Download this infographic to find out how to build a strong foundation with big data integration.
Tags : 
big data, big data integration, scalable data
    
IBM
Published By: IBM     Published Date: Apr 14, 2017
Any organization wishing to process big data from newly identified data sources needs to first determine the characteristics of the data and then define the requirements that must be met to ingest, profile, clean, transform and integrate this data to ready it for analysis. Having done that, it may well be the case that existing tools do not cater for the data variety, data volume and data velocity that these new data sources bring. If this occurs, then new technology will clearly need to be considered to meet the needs of the business going forward.
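The "determine the characteristics of the data" step is essentially data profiling, which can be sketched as a per-field summary of fill rates and observed types. The `profile` function and sample records below are illustrative assumptions, not any vendor's tooling:

```python
from collections import Counter

def profile(records):
    """Profile a batch of records: per-field fill rate and observed types.

    A quick profile like this helps decide whether existing tools can
    handle a new source's variety before building the full pipeline.
    """
    total = len(records)
    fields = {}
    for rec in records:
        for key, value in rec.items():
            stats = fields.setdefault(key, {"count": 0, "types": Counter()})
            if value is not None:
                stats["count"] += 1
                stats["types"][type(value).__name__] += 1
    return {k: {"fill_rate": v["count"] / total, "types": dict(v["types"])}
            for k, v in fields.items()}

sample = [
    {"id": 1, "amount": 9.5, "note": "ok"},
    {"id": 2, "amount": None, "note": "late"},
    {"id": "3", "amount": 4.0},  # mixed types and missing fields are common
]
print(profile(sample))
```

Here the profile would immediately surface that `id` arrives as both int and string and that `amount` is only partially populated - exactly the findings that drive the cleaning and transformation requirements.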
Tags : 
data integration, big data, data sources, business needs, technological advancements, scaling data
    
IBM
Published By: IBM     Published Date: Jul 06, 2017
Effectively using and managing information has become critical to driving growth in areas such as pursuing new business opportunities, attracting and retaining customers, and streamlining operations. In the era of big data, you must accommodate a rapidly increasing volume, variety and velocity of data while extracting actionable business insight from that data, faster than ever before. These needs create a daunting array of workload challenges and place tremendous demands on your underlying IT infrastructure and database systems. In many cases, these systems are no longer up to the task—so it’s time to make a decision. Do you use more staff to keep up with the fixes, patches, add-ons and continual tuning required to make your existing systems meet performance goals, or move to a new database solution so you can assign your staff to new, innovative projects that move your business forward?
Tags : 
database, growth, big data, it infrastructure, information management
    
IBM
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
As organizations develop next-generation applications for the digital era, many are using cognitive computing ushered in by IBM Watson® technology. Cognitive applications can learn and react to customer preferences, and then use that information to support capabilities such as confidence-weighted outcomes with data transparency, systematic learning and natural language processing. To make the most of these next-generation applications, you need a next-generation database. It must handle a massive volume of data while delivering high performance to support real-time analytics. At the same time, it must provide data availability for demanding applications, scalability for growth and flexibility for responding to changes.
Tags : 
database, applications, data availability, cognitive applications
    
Group M_IBM Q1'18
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
Effectively using and managing information has become critical to driving growth in areas such as pursuing new business opportunities, attracting and retaining customers, and streamlining operations. In the era of big data, you must accommodate a rapidly increasing volume, variety and velocity of data while extracting actionable business insight from that data, faster than ever before. These needs create a daunting array of workload challenges and place tremendous demands on your underlying IT infrastructure and database systems. This e-book presents six reasons why you should consider a database change, including opinions from industry analysts and real-world customer experiences. Read on to learn more.
Tags : 
database, streamlining, it infrastructure, database systems
    
Group M_IBM Q1'18
Published By: Group M_IBM Q2'19     Published Date: Apr 03, 2019
Data is the lifeblood of business. And in the era of digital business, the organizations that utilize data most effectively are also the most successful. Whether structured, unstructured or semi-structured, rapidly increasing data quantities must be brought into organizations, stored and put to work to enable business strategies. Data integration tools play a critical role in extracting data from a variety of sources and making it available for enterprise applications, business intelligence (BI), machine learning (ML) and other purposes. Many organizations seek to enhance the value of data for line-of-business managers by enabling self-service access. This is increasingly important as large volumes of unstructured data from Internet-of-Things (IoT) devices are presenting organizations with opportunities for game-changing insights from big data analytics. A new survey of 369 IT professionals, from managers to directors and VPs of IT, by BizTechInsights on behalf of IBM reveals the challenges they face.
Tags : 
    
Group M_IBM Q2'19
Published By: Intel     Published Date: Apr 16, 2019
The data center is coming under immense pressure. The boom in connected devices means increasing volumes of data – all of which needs processing. One way for CSPs to accelerate customer workloads is by using FPGAs, which are easier to use than ever before. Download Intel's latest eGuide, ‘FPGA-as-a-Service: A Guide for CSPs' to discover:
• How to add FPGAs to the data center
• The structure of the Intel® Acceleration Stack for FPGAs
• Adding off-the-shelf accelerator functions
• How FPGAs can accelerate many cloud services, such as database as a service and analytics as a service
Tags : 
    
Intel
Published By: HP     Published Date: Feb 02, 2015
Ever-increasing data volumes, driven by constant growth in both structured and unstructured data coupled with the ever-decreasing cost of storage capacity on a per-GB basis, are continuing to put a strain on corporate backup abilities. While other backup and data optimization technologies offer some relief, deduplicating backup appliances have become the go-to solution. They provide a quick, largely non-disruptive plug-and-play solution that alleviates backup pain and reduces storage consumption by up to 20x, and they have become a proven frontrunner in the ongoing battle to improve the backup experience.
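The storage savings come from storing each unique chunk of data only once. The fixed-size chunking and in-memory store below are a toy illustration of that content-hash idea, not the appliance's actual algorithm (real appliances typically use variable-size chunking and persistent indexes):

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks for simplicity

def dedup_store(data: bytes, store: dict) -> list:
    """Split data into chunks and keep only one copy of each unique chunk.

    Returns the "recipe": the list of chunk hashes needed to reassemble
    the original data.
    """
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # a repeated chunk costs nothing extra
        recipe.append(digest)
    return recipe

def restore(recipe: list, store: dict) -> bytes:
    return b"".join(store[d] for d in recipe)

store = {}
backup = b"A" * 8192 + b"B" * 4096 + b"A" * 8192   # highly redundant "backup"
recipe = dedup_store(backup, store)
print(len(backup), sum(len(c) for c in store.values()))  # raw vs stored bytes
```

Because backup streams repeat heavily from run to run, the ratio of raw bytes to stored bytes is what vendors quote as the deduplication ratio.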
Tags : 
    
HP
Published By: Delphix     Published Date: May 28, 2015
"Security-conscious organizations face a gap between current requirements and capabilities as they relate to data masking. Data volumes are growing exponentially and the risk of data leaks continues to make news, yet many organizations rely on inefficient, legacy approaches to protecting sensitive data. In contrast, top performing companies are turning to virtual databases and service-based masking solutions to ensure that data management functions can keep up with software development.
Tags : 
    
Delphix
Published By: HP     Published Date: Feb 11, 2015
Ever-increasing data volumes, driven by constant growth in both structured and unstructured data coupled with the ever-decreasing cost of storage capacity on a per-GB basis, are continuing to put a strain on corporate backup abilities. While other backup and data optimization technologies offer some relief, deduplicating backup appliances have become the go-to solution. They provide a quick, largely non-disruptive plug-and-play solution that alleviates backup pain and reduces storage consumption by up to 20x, and they have become a proven frontrunner in the ongoing battle to improve the backup experience.
Tags : 
    
HP
Published By: MarkLogic     Published Date: Mar 13, 2015
Big Data has been in the spotlight recently, as businesses seek to leverage their untapped information resources and win big on the promise of big data. However, the problem with big data initiatives is that organizations try using existing information management practices and legacy relational database technologies, which often collapse under the sheer weight of the data. In this paper, MarkLogic explains how a new approach is needed to handle the volume, velocity, and variety of big data, because the current relational model that has been the status quo is not working. Learn about the NoSQL paradigm shift, and why NoSQL is gaining significant market traction because it solves the fundamental challenges of big data, achieving better performance, scalability, and flexibility. Learn how MarkLogic’s customers are reimagining their data to:
- Make the world more secure
- Provide access to valuable information
- Create new revenue streams
- Gain insights to increase market share
- Reduce b
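The core of the NoSQL document model is that records in one collection need not share a fixed schema, which is how it absorbs the "variety" of big data. The tiny in-memory store below is a generic sketch of that idea, not MarkLogic's engine (MarkLogic itself is a multi-model database queried via XQuery, JavaScript and search APIs):

```python
class DocumentStore:
    """Minimal in-memory document store: schemaless JSON-like documents,
    queried by field predicates rather than fixed relational columns."""

    def __init__(self):
        self._docs = {}
        self._next_id = 0

    def insert(self, doc: dict) -> int:
        self._next_id += 1
        self._docs[self._next_id] = doc
        return self._next_id

    def find(self, **predicates) -> list:
        """Return documents whose fields match every given predicate."""
        return [d for d in self._docs.values()
                if all(d.get(k) == v for k, v in predicates.items())]

ds = DocumentStore()
# Heterogeneous documents coexist without ALTER TABLE or a schema migration.
ds.insert({"type": "tweet", "user": "ada", "text": "hello"})
ds.insert({"type": "sensor", "device": 7, "temp_c": 21.5})
ds.insert({"type": "tweet", "user": "alan", "text": "hi"})
print(ds.find(type="tweet"))
```

In a relational system the sensor reading and the tweet would each force a schema change or a separate table; here new shapes of data are simply inserted as they arrive.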
Tags : 
enterprise, nosql, relational, databases, data storage, management system, application, scalable
    
MarkLogic
Published By: K2     Published Date: Apr 27, 2015
This paper describes how a business-app platform from K2® can help you provide simpler, more cost-effective ways to empower people by connecting them with data that is dispersed across different systems.
Tags : 
erp and legacy systems, data volume, business processes, cloud crm system, system complexity, business apps
    
K2
Published By: Waterline Data & Research Partners     Published Date: Nov 07, 2016
Business users want the power of analytics—but analytics can only be as good as the data. The biggest challenge nontechnical users are encountering is the same one that has been a steep challenge for data scientists: slow, difficult, and tedious data preparation. The increasing volume, variety, and velocity of data is putting pressure on organizations to rethink traditional methods of preparing data for reporting, analysis, and sharing. Download this white paper to find out how you can improve your data preparation for business analytics.
Tags : 
    
Waterline Data & Research Partners
Published By: Mimecast     Published Date: Oct 11, 2018
Information management is getting harder. Organizations face increasing data volumes, more stringent legal and regulatory record-keeping requirements, stricter privacy rules, increasing threat of breaches and decreasing employee productivity. Companies are also finding that their old-fashioned, legacy archive strategies are increasingly ineffective. This is driving many organizations to rethink their approach, developing more modern Information Governance strategies.
Tags : 
    
Mimecast
Published By: Mimecast     Published Date: Jul 15, 2019
The Mimecast Supervision solution enables compliance personnel to systematically review and discover targeted data among the volume of communications organizations face today. Integrated with the industry-leading Mimecast Cloud Archive, users can facilitate an auditable, managed supervision review process that is flexible enough to meet the needs of the business, while utilizing a scalable, immutable, SEC 17a-4-validated and tamper-proof archive with guaranteed 7-second-SLA search capabilities. To reduce the number of false positives when sampling data, targeted detection rules can focus on specific senders and recipients, accelerating the process. In addition, queues can be configured with an upper limit on the amount of email they are populated with. This helps limit the amount of email a reviewer must go through while still identifying risk. Today’s supervision demands require reviewers to be highly productive.
Tags : 
    
Mimecast