data types

Results 1 - 25 of 140
Published By: Acronis EMEA     Published Date: May 18, 2015
This book is a guide to modern data protection for IT professionals, covering everything from the simplest file backup scenarios to comprehensive protection strategies for businesses with complex IT environments.
Tags : 
backup, data protection, data security, protection strategies, recovery, cloud computing, virtualization, backup types, recovery tools, data backup
    
Acronis EMEA
Published By: Adobe     Published Date: Sep 23, 2019
Data is a company’s most valuable asset. Just look at Forbes’ list of the World’s Most Valuable Brands. No longer is a company’s worth evaluated by its tangible assets; data has changed all of that. Every business today relies on data. The ability to filter through volumes of data to capture true insights is critical to gaining a competitive advantage. Companies aspiring to deliver the best possible customer experiences must be able to unify different types of information, including behavioral, transactional, and operational data.
Tags : 
    
Adobe
Published By: AlienVault     Published Date: Oct 21, 2014
While vulnerability assessments are an essential part of understanding your risk profile, it's simply not realistic to expect to eliminate all vulnerabilities from your environment. So, when your scan produces a long list of vulnerabilities, how do you prioritize which ones to remediate first? By data criticality? CVSS score? Asset value? Patch availability? Without understanding the context of the vulnerable systems on your network, you may waste time checking things off the list without really improving security. Join AlienVault for this session to learn:
• The pros & cons of different types of vulnerability scans - passive, active, authenticated, unauthenticated
• Vulnerability scores and how to interpret them
• Best practices for prioritizing vulnerability remediation
• How threat intelligence can help you pinpoint the vulnerabilities that matter most
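To make the prioritization question above concrete, here is a minimal sketch, assuming a simple weighted scheme rather than AlienVault’s actual method, of ranking findings by CVSS score, asset value, and patch availability. All field names, weights, hosts, and CVE choices below are illustrative assumptions.

```python
# Hypothetical prioritization sketch: rank vulnerabilities by a weighted score
# combining CVSS base score, asset criticality, and patch availability.
from dataclasses import dataclass

@dataclass
class Finding:
    host: str
    cve: str
    cvss: float            # 0.0 - 10.0 base score
    asset_value: int       # 1 (low) - 5 (business critical), assigned by the org
    patch_available: bool

def priority(f: Finding) -> float:
    # Normalize CVSS to 0-1, weight by asset value, and nudge patchable
    # issues upward since they are usually the cheapest to remediate.
    score = (f.cvss / 10.0) * f.asset_value
    return score + (0.5 if f.patch_available else 0.0)

findings = [
    Finding("pos-terminal-01", "CVE-2014-0160", 7.5, 5, True),
    Finding("dev-sandbox-02", "CVE-2013-2094", 9.3, 1, True),
    Finding("db-server-03", "CVE-2014-6271", 9.8, 4, False),
]

for f in sorted(findings, key=priority, reverse=True):
    print(f"{f.host:16} {f.cve:15} priority={priority(f):.2f}")
```

The point of the sketch is only that raw CVSS alone is not the ranking: a medium-severity flaw on a business-critical system can outrank a critical flaw on a throwaway sandbox.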
Tags : 
vulnerability, management, risk, prioritize, profile, environment, data, asset value, network, authenticated, unauthenticated, remediation, best practices, intelligence, scores, attacks, policy violations, compromise, exfiltration, exploit
    
AlienVault
Published By: AlienVault     Published Date: Mar 30, 2016
Given that Point of Sale (POS) systems are used to transmit debit and credit card information in retail transactions, it's no wonder they are a desirable target for attackers. In this white paper, you'll learn about some of the common types of POS malware, how they work, and best practices for protecting cardholder data. Topics covered in this white paper include:
• Common types of POS malware and how they work
• How attackers exfiltrate data from POS systems once they gain access
• POS security techniques to protect payment card data
Download your copy today to learn how to effectively detect and respond to POS malware threats.
Tags : 
    
AlienVault
Published By: Altiscale     Published Date: May 28, 2015
Big Data technologies are maturing and quickly moving into the next phase - one that expands data use cases as Hadoop moves into more influential roles throughout IT infrastructures. In this just-released report, Gartner recognizes four Big Data vendors as "Cool Vendors." Gartner says these vendors can meaningfully and synergistically combine multiple types of functionality.
Tags : 
big data, vendors, functionality, cool vendors, consistent performance, it management, data management, gartner, hadoop
    
Altiscale
Published By: Amazon Web Services     Published Date: Oct 09, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. Data Lakes allow an organization to store all of their data, structured and unstructured, in one centralized repository.
Tags : 
cost effective, data storage, data collection, security, compliance, platform, big data, it resources
    
Amazon Web Services
Published By: Amazon Web Services     Published Date: Jul 25, 2018
What is a Data Lake? Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A Data Lake allows an organization to store all of their data, structured and unstructured, in one centralized repository. Since data can be stored as-is, there is no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand. Download to find out more now.
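As a rough illustration of the "store as-is, no predefined schema" idea above, the sketch below lands both structured (CSV) and unstructured (log) objects in a single Amazon S3 bucket using boto3. The bucket and key names are hypothetical, and AWS credentials are assumed to be configured in the environment.

```python
# Minimal data lake sketch: land heterogeneous data as-is in one S3 bucket.
# Assumes AWS credentials are configured (e.g., via environment variables)
# and that the bucket name below is replaced with a real bucket you own.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-lake-bucket"  # hypothetical bucket name

# Structured data: a CSV of transactions, stored exactly as produced.
csv_rows = "order_id,amount,currency\n1001,19.99,USD\n1002,5.49,USD\n"
s3.put_object(Bucket=BUCKET, Key="raw/transactions/2018-07-25.csv",
              Body=csv_rows.encode("utf-8"))

# Unstructured data: an application log, no schema required up front.
log_lines = ("2018-07-25T12:00:01Z INFO checkout started\n"
             "2018-07-25T12:00:03Z WARN retrying payment gateway\n")
s3.put_object(Bucket=BUCKET, Key="raw/app-logs/web-01/2018-07-25.log",
              Body=log_lines.encode("utf-8"))

# Because objects are stored as-is, the schema (and the questions you ask)
# can be decided later, when the data is catalogued and queried.
```

In practice a catalog and query layer would sit on top of such a bucket, but that is outside this sketch.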
Tags : 
    
Amazon Web Services
Published By: Amazon Web Services     Published Date: Sep 05, 2018
Today’s businesses generate staggering amounts of data, and learning to get the most value from that data is paramount to success. Just as Amazon Web Services (AWS) has transformed IT infrastructure into something that can be delivered on-demand, scalably, quickly, and cost-effectively, Amazon Redshift is doing the same for data warehousing and big data analytics. Amazon Redshift offers a massively parallel columnar data store that can be spun up in just a few minutes to deal with billions of rows of data at a cost of just a few cents an hour. Organizations choose Amazon Redshift for its affordability, flexibility, and powerful feature set:
• Enterprise-class relational database query and management system
• Supports client connections from many types of applications, including business intelligence (BI), reporting, and data analytics tools
• Executes analytic queries to retrieve, compare, and evaluate large amounts of data in multiple-stage operations
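Because Amazon Redshift speaks the PostgreSQL wire protocol, a standard PostgreSQL client library can issue analytic queries against a cluster. The sketch below uses psycopg2 with a hypothetical cluster endpoint, credentials, and "sales" table; it is an illustration, not AWS's reference client.

```python
# Hedged sketch: run an analytic query against a Redshift cluster using
# psycopg2 (Redshift is PostgreSQL-wire-compatible). The endpoint,
# credentials, and the "sales" table are placeholder assumptions.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # hypothetical
    port=5439,
    dbname="analytics",
    user="analyst",
    password="REPLACE_ME",
)

query = """
    SELECT region,
           DATE_TRUNC('month', sold_at) AS month,
           SUM(amount)                  AS revenue
    FROM sales
    GROUP BY region, DATE_TRUNC('month', sold_at)
    ORDER BY revenue DESC
    LIMIT 20;
"""

with conn, conn.cursor() as cur:
    cur.execute(query)          # aggregation runs on the cluster, not the client
    for region, month, revenue in cur.fetchall():
        print(region, month, revenue)

conn.close()
```

The aggregation and sorting happen inside the cluster's columnar engine; the client only receives the small result set.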
Tags : 
    
Amazon Web Services
Published By: APC     Published Date: Apr 08, 2010
Many of the mysteries of equipment failure, downtime, and software and data corruption are the result of a problematic supply of power. A related problem is that power disturbances are rarely described in a standard way. This white paper describes the most common types of power disturbances, what can cause them, what they can do to your critical equipment, and how to safeguard your equipment, using the IEEE standards for describing power quality problems.
Tags : 
apc, power, cooling, it wiring, heat removal, green computing, ieee, equipment failure
    
APC
Published By: Arcserve     Published Date: May 29, 2015
Find out what you need to know about data protection appliances in this video brief by the Enterprise Strategy Group’s Sr. Data Protection Analyst, Jason Buffington. In this short video he’ll analyze:
- Various types of data protection appliances
- Market trends regarding appliance adoption
- What data protection specialists expect from appliances
- How Arcserve has evolved to solve the issues of today’s complex organizations
Watch this video brief today!
Tags : 
data protection appliances, appliance adoption, arcserve, security
    
Arcserve
Published By: ASG Technologies     Published Date: Apr 25, 2017
Compliance and governance trends require you to use your data responsibly and to ensure it is trustable and traceable. Increasingly, your enterprise must know how and by whom its data is handled and what happens to it. Can your enterprise meet these new demands without slowing down the benefits you enjoy from new types and sources of data?
Tags : 
    
ASG Technologies
Published By: AWS     Published Date: Nov 02, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store massive amounts of data into a central location, so it’s readily available to be categorized, processed, analyzed, and consumed by diverse groups within an organization. Since data - structured and unstructured - can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
Tags : 
    
AWS
Published By: AWS     Published Date: Jun 20, 2018
Data and analytics have become an indispensable part of gaining and keeping a competitive edge. But many legacy data warehouses introduce a new challenge for organizations trying to manage large data sets: only a fraction of their data is ever made available for analysis. We call this the “dark data” problem: companies know there is value in the data they collected, but their existing data warehouse is too complex, too slow, and just too expensive to use. A modern data warehouse is designed to support rapid data growth and interactive analytics over a variety of relational, non-relational, and streaming data types, leveraging a single, easy-to-use interface. It provides a common architectural platform for leveraging new big data technologies alongside existing data warehouse methods, thereby enabling organizations to derive deeper business insights. Key elements of a modern data warehouse:
• Data ingestion: take advantage of relational, non-relational, and streaming data sources
• Federated querying: ability to run a query across heterogeneous sources of data
• Data consumption: support numerous types of analysis - ad-hoc exploration, predefined reporting/dashboards, predictive and advanced analytics
Tags : 
    
AWS
Published By: AWS     Published Date: Aug 20, 2018
A modern data warehouse is designed to support rapid data growth and interactive analytics over a variety of relational, non-relational, and streaming data types, leveraging a single, easy-to-use interface. It provides a common architectural platform for leveraging new big data technologies alongside existing data warehouse methods, thereby enabling organizations to derive deeper business insights. Key elements of a modern data warehouse:
• Data ingestion: take advantage of relational, non-relational, and streaming data sources
• Federated querying: ability to run a query across heterogeneous sources of data
• Data consumption: support numerous types of analysis - ad-hoc exploration, predefined reporting/dashboards, predictive and advanced analytics
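To make "federated querying" slightly more concrete, here is a toy, self-contained sketch (not a specific AWS feature) that answers one question across two heterogeneous sources: a relational table in SQLite and a CSV extract standing in for an external source. All names and data are invented for illustration.

```python
# Toy federated-query sketch: answer one question across two different
# sources (a relational table and a CSV extract). All data here is invented.
import csv
import io
import sqlite3

# Source 1: a relational table (stands in for the warehouse's local storage).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [(1, 120.0), (2, 35.5), (1, 80.0)])

# Source 2: a CSV extract (stands in for an external, non-relational source).
customers_csv = io.StringIO("customer_id,region\n1,EMEA\n2,APAC\n")
region_by_customer = {int(row["customer_id"]): row["region"]
                      for row in csv.DictReader(customers_csv)}

# "Federated" step: combine both sources to get revenue per region.
revenue_by_region = {}
for customer_id, amount in db.execute(
        "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"):
    region = region_by_customer.get(customer_id, "UNKNOWN")
    revenue_by_region[region] = revenue_by_region.get(region, 0.0) + amount

print(revenue_by_region)  # {'EMEA': 200.0, 'APAC': 35.5}
```

In a real modern data warehouse the query engine performs this kind of cross-source join for you through one interface; the sketch only shows the shape of the problem being solved.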
Tags : 
    
AWS
Published By: AWS     Published Date: Oct 26, 2018
Today’s organisations are tasked with analysing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organisations are finding that in order to deliver analytic insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store enormous amounts of data in a central location, so it’s readily available to be categorised, processed, analysed, and consumed by diverse groups within an organisation. Since data—structured and unstructured—can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
Tags : 
data, lake, amazon, web, services, aws
    
AWS
Published By: AWS     Published Date: Jul 10, 2019
Being able to monitor and respond to patient inquiries quickly and effectively is critical to creating a positive clinical experience and delivering successful products. But compiling and monitoring this data to address customer concerns in a timely way is a challenge when you have disparate sources and systems, global teams, and multiple patients. Read how a top 10 global pharmaceutical company worked with Slalom and AWS to design and implement a unified and globally distributed event and inquiry data reporting system. By combining three types of requests into one solution, the company has improved the customer experience and increased call center and data input operational efficiency by 50%. Learn how to:
• Increase access to relevant data to help inform future or ongoing clinical trials
• Adapt your existing system development processes to an agile approach
• Engage with Slalom and AWS throughout the lifecycle of a healthcare engagement
Tags : 
    
AWS
Published By: AWS - ROI DNA     Published Date: Jun 12, 2018
Traditional databases and data warehouses are evolving to capture new data types and spread their capabilities across a hybrid cloud architecture, allowing business users to get the same results regardless of where the data resides. The details of the underlying infrastructure become invisible. Self-managing data lakes automate provisioning and manage reliability, performance, and cost, enabling data access and experimentation.
Tags : 
    
AWS - ROI DNA
Published By: Blue Coat Systems     Published Date: Jul 18, 2013
IT leaders see security as a barrier to enabling employees. However, with new business assurance technology you can deliver continuity, agility, and governance. With Blue Coat you can deliver business continuity by protecting against threats and data loss, extend protection and policy to users in any location on any device, safely deploy and consume all types of applications, align IT infrastructure with business priorities to assure and accelerate the user experience across the extended enterprise, and make risk management tradeoffs and enforce compliance.
Tags : 
technology, bluecoat, infrastructure
    
Blue Coat Systems
Published By: BrightEdge     Published Date: Nov 13, 2014
BrightEdge tapped into its massive Data Cube repository to provide a comprehensive view into the channels that drive traffic and the types of content that perform best. BrightEdge created this report to help brands understand the actual performance of site content by channel and by industry.
Tags : 
brightedge, content marketing, data analytics, marketing data, marketing platform, market intelligence, business intelligence
    
BrightEdge
Published By: Brother     Published Date: Mar 08, 2018
Documents are an integral component to the successful operation of an organization. Whether in hardcopy or digital form, they enable the communication, transaction, and recording of business-critical information. To ensure documents are used effectively, organizations are encouraged to continually evaluate and improve surrounding workflows. This may involve automating elements of document creation, securing the transfer and storage of information, and/or simplifying the retrieval of records and the data contained within. These types of enhancements can save time, money, and frustration. This white paper will discuss top trends and requirements in the optimization of document-related business processes as well as general technology infrastructures for document management. It will also address how some office technology vendors have reacted to these trends to guide their design and development of products, solutions, and services.
Tags : 
documents, workflows, business process, document management
    
Brother
Published By: Canon     Published Date: Oct 09, 2019
The Notifiable Data Breaches (NDB) scheme came into effect on Feb 22nd, 2018, making it obligatory for every organisation covered by the Australian Privacy Act to notify the Australian government of certain security breaches. The 2019 Canon Security Report is a guide to understanding which organisations are affected by this policy, the types of security breaches that require notification, and what your organisation can do to help mitigate the risk of such breaches happening in the first place. Download this handy guide and protect your business from the business costs and legal ramifications of security breaches.
Tags : 
    
Canon
Published By: Carbonite     Published Date: Oct 05, 2017
Data protection often seems like a clash between competing interests: the need to protect data, against the need to protect access to data. The challenge lies in deploying the right protection across the different systems and types of data, since they each require different forms of protection. IT pros need confidence that the protection they deploy can:
• Ensure long-term survivability of historical data
• Deliver data securely to different waypoints
• Extend protection as environments change
Tags : 
data protection, data security, carbonite, data recovery, data resiliency, data backup
    
Carbonite
Published By: Carbonite     Published Date: Jan 04, 2018
Data protection often seems like a clash between competing interests: the need to protect data, against the need to protect access to data. The challenge lies in deploying the right protection across the different systems and types of data, since they each require different forms of protection.
Tags : 
    
Carbonite
Published By: Carbonite     Published Date: Oct 10, 2018
Data protection is a balancing act between the need to protect data and the need to protect access to data. The trick lies in deploying the right protection across the different systems and types of data, since they each require different forms of protection.
Tags : 
    
Carbonite
Published By: Cisco     Published Date: Dec 21, 2016
The data center infrastructure is central to the overall IT architecture. It is where most business-critical applications are hosted and various types of services are provided to the business. Proper planning of the data center infrastructure design is critical, and performance, resiliency, and scalability need to be carefully considered. Another important aspect of the data center design is the flexibility to quickly deploy and support new services. Designing a flexible architecture that can support new applications in a short time frame can result in a significant competitive advantage. The basic data center network design is based on a proven layered approach that has been tested and improved over the past several years in some of the largest data center implementations in the world. The layered approach is the foundation of a data center design that seeks to improve scalability, performance, flexibility, resiliency, and maintenance.
Tags : 
    
Cisco