workload data

Results 1 - 25 of 264
Published By: Intel     Published Date: Nov 14, 2019
Infrastructure considerations for IT leaders
By 2020, deep learning will have reached a fundamentally different stage of maturity. Deployment and adoption will no longer be confined to experimentation, becoming a core part of day-to-day business operations across most fields of research and industries. Why? Because advancements in the speed and accuracy of the hardware and software that underpin deep learning workloads have made it both viable and cost-effective. Much of this added value will be generated by deep learning inference – that is, using a model to infer something about data it has never seen before. Models can be deployed in the cloud or data center, but more and more we will see them on end devices like cameras and phones. Intel predicts that there will be a shift in the ratio between cycles of inference and training from 1:1 in the early days of deep learning, to well over 5:1 by 2020¹. Intel calls this the shift to ‘inference at scale’ and, with inference also taking up
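Deep learning inference, as described above, simply means applying an already-trained model to data it has never seen. Below is a minimal sketch, assuming a PyTorch model exported with TorchScript; the framework, file name, and input shape are illustrative assumptions, not part of the Intel paper.

```python
# A minimal sketch of deep learning inference: applying a trained model to
# data it has never seen. PyTorch/TorchScript, "model.pt", and the input
# shape are hypothetical choices for illustration.
import torch

model = torch.jit.load("model.pt")   # a previously trained, exported model
model.eval()                         # inference mode: no weight updates

with torch.no_grad():                # no gradients needed for inference
    new_sample = torch.rand(1, 3, 224, 224)     # e.g. one unseen image
    prediction = model(new_sample)
    print(prediction.argmax(dim=1))  # assuming a classification output
```

Because no gradients are computed, each inference cycle is far cheaper than a training cycle, which is why inference can come to dominate total compute once a model is in production.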
Published By: Intel     Published Date: Nov 14, 2019
Evaluating an existing HPC platform for efficient AI-driven workloads
The growth of artificial intelligence (AI) capabilities, data volumes and computing power in recent years means that AI is now a serious consideration for most organizations. When combined with high-performance computing (HPC) capabilities, its potential grows even stronger. However, converging the two approaches requires careful thought and planning. New AI initiatives must align with organizational strategy. Then AI workloads must integrate with your existing HPC infrastructure to achieve the best results in the most efficient and cost-effective way. This paper outlines the key considerations for organizations looking to bring AI into their HPC environment, and steps they can take to ensure the success of their first forays into HPC/AI convergence.
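As one concrete illustration of integrating an AI workload with existing HPC infrastructure, the sketch below bootstraps a distributed PyTorch training job from Slurm environment variables. Slurm, PyTorch, and the exported MASTER_ADDR/MASTER_PORT variables are assumptions for the example; the paper itself does not prescribe a scheduler or framework.

```python
# Sketch: initializing a PyTorch distributed job on an HPC cluster scheduled
# by Slurm. Assumes the batch script exports MASTER_ADDR and MASTER_PORT and
# launches one task per GPU (hypothetical setup).
import os
import torch
import torch.distributed as dist

def init_from_slurm():
    rank = int(os.environ["SLURM_PROCID"])        # global rank of this task
    world_size = int(os.environ["SLURM_NTASKS"])  # total number of tasks
    local_rank = int(os.environ.get("SLURM_LOCALID", 0))
    dist.init_process_group(backend="nccl", rank=rank, world_size=world_size)
    torch.cuda.set_device(local_rank)
    return rank, world_size, local_rank

if __name__ == "__main__":
    rank, world_size, local_rank = init_from_slurm()
    # ... build the model, wrap it in DistributedDataParallel, train ...
    dist.destroy_process_group()
```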
Published By: Intel     Published Date: Nov 14, 2019
You can migrate live VMs between Intel processor-based servers, but migration in a mixed CPU environment requires downtime and administrative hassle
A study commissioned by Intel Corp.
One of the greatest advantages of adopting a public, private, or hybrid cloud environment is being able to easily migrate the virtual machines that run your critical business applications—within the data center, across data centers, and between clouds. Routine hardware maintenance, data center expansion, server hardware upgrades, VM consolidation, and other events all require your IT staff to migrate VMs. For years, one powerful tool in your arsenal has been VMware vSphere® vMotion®, which can live migrate VMs from one host to another with zero downtime, provided the servers share the same underlying architecture. The EVC (Enhanced vMotion Compatibility) feature of vMotion makes it possible to live migrate virtual machines even between different generations of CPUs within a given architecture.
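For a sense of what such a migration looks like programmatically, here is a hedged sketch that triggers a host-change relocation (a vMotion) through the vSphere API using pyVmomi. The vCenter address, credentials, VM name, and target host are hypothetical, the study itself contains no code, and EVC must already be enabled on the cluster for cross-generation moves.

```python
# Sketch only: live-migrate a VM to another host via the vSphere API (pyVmomi).
# Names and credentials are made up; not Intel's or VMware's reference code.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()            # lab use only; validate certs in production
si = SmartConnect(host="vcenter.example.com",
                  user="administrator@vsphere.local",
                  pwd="example-password",
                  sslContext=ctx)
content = si.RetrieveContent()

def find_by_name(vim_type, name):
    """Walk the inventory and return the first managed object with this name."""
    view = content.viewManager.CreateContainerView(content.rootFolder, [vim_type], True)
    return next(obj for obj in view.view if obj.name == name)

vm = find_by_name(vim.VirtualMachine, "app-vm-01")                 # VM to migrate (hypothetical)
target_host = find_by_name(vim.HostSystem, "esxi-02.example.com")  # destination host (hypothetical)

# A RelocateSpec with only a target host performs a compute vMotion,
# leaving the VM's storage where it is.
spec = vim.vm.RelocateSpec(host=target_host)
task = vm.RelocateVM_Task(spec=spec)
print("vMotion task started:", task.info.key)

Disconnect(si)
```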
Published By: Infinidat EMEA     Published Date: Oct 10, 2019
Big data and analytics workloads bring new challenges for organizations. The data being captured comes from sources that did not exist ten years ago. Data from mobile phones, machine-generated data, and data from website interactions are all collected and analyzed. At a time of tight IT budgets, the situation is made worse by big data volumes that keep growing and lead to enormous storage problems. This white paper describes the problems that big data applications pose for storage systems, and how choosing the right storage infrastructure can optimize big data and analytics applications without breaking the budget.
Published By: Infinidat EMEA     Published Date: Oct 10, 2019
Big data and analytics workloads are the new frontier for enterprises. Data is being collected from sources that did not exist 10 years ago. Mobile phone data, machine-generated data, and website interaction data are all being collected and analyzed. In addition, with IT budgets increasingly under pressure, the footprint of big data keeps growing and poses major challenges for storage systems. This paper describes the problems that big data applications pose for storage, and how to choose the right infrastructure to optimize and consolidate big data and analytics applications without draining your finances.
Published By: Infinidat EMEA     Published Date: Oct 10, 2019
InfiniBox® enterprise storage delivers better performance than can be achieved with all-flash technology, along with multi-petabyte-scale high availability suited to mixed application workloads. Zero-impact snapshots and active/active replication dramatically improve business agility, while FIPS-certified data-at-rest encryption eliminates the need to securely erase decommissioned arrays. With InfiniBox, enterprise IT organizations and cloud service providers can exceed their service objectives while cutting the cost and complexity of their petabyte-scale operations.
Published By: AWS     Published Date: Oct 07, 2019
Imperva, an APN Security Competency Partner, can help protect your application workloads on AWS with the Imperva SaaS Web Application Security platform. The Imperva high-capacity network of globally distributed security services protects websites against all types of DDoS threats, including network-level Layer 3 and Layer 4 volumetric attacks—such as synchronized (SYN) floods and User Datagram Protocol (UDP) floods—and Layer 7 application-level attacks (including the OWASP Top 10 threats) that attempt to compromise application resources. Harnessing real data about current threats from a global customer base, both the Web Application Firewall (WAF) and DDoS protection incorporate an advanced client classification system that blocks malicious traffic without interfering with legitimate users. Enterprises can easily create custom security rules in the GUI to enforce their specific security policy. In addition, this versatile solution supports hybrid environments, allowing you to manage th
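To make the idea of client classification concrete, the sketch below shows a deliberately simplified, generic rate-based classifier that blocks clients exceeding a request threshold while leaving normal traffic alone. This is plain Python for illustration only; it is not Imperva's classification system or API, and the window and threshold values are made up.

```python
# Simplified illustration of rate-based client classification (not Imperva's engine).
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_REQUESTS_PER_WINDOW = 100          # hypothetical threshold

_recent = defaultdict(deque)           # client_ip -> recent request timestamps

def classify(client_ip, now=None):
    """Return 'block' for clients exceeding the rate threshold, 'allow' otherwise."""
    now = time.time() if now is None else now
    window = _recent[client_ip]
    window.append(now)
    while window and now - window[0] > WINDOW_SECONDS:   # drop stale timestamps
        window.popleft()
    return "block" if len(window) > MAX_REQUESTS_PER_WINDOW else "allow"

# Example: a burst of 150 requests in ~1.5 seconds trips the threshold partway
# through, while occasional legitimate requests would always be allowed.
decisions = [classify("203.0.113.7", now=i * 0.01) for i in range(150)]
print(decisions.count("allow"), decisions.count("block"))   # 100 allowed, 50 blocked
```

A production system would weigh many more signals (headers, reputation, behavior) than raw request rate; this sketch only illustrates the allow/block decision shape described in the abstract.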
Published By: Gigaom     Published Date: Sep 16, 2019
We’ve heard it before. A data warehouse is a place for formally structured, highly curated data, accommodating recurring business analyses, whereas data lakes are places for “raw” data, serving analytic workloads that are experimental in nature. Since both conventional and experimental analysis are important in this data-driven era, we’re left with separate repositories, siloed data, and bifurcated skill sets. Or are we? In fact, less structured data can go into your warehouse, and since today’s data warehouses can leverage the same distributed file systems and cloud storage layers that host data lakes, the very premise of the warehouse/lake distinction is rapidly diminishing. In reality, business drivers and business outcomes demand that we abandon the false dichotomy and unify our data, our governance, our analysis, and our technology teams. Want to get this right? Then join us for a free 1-hour webinar from GigaOm Research. The webinar features GigaOm analyst Andrew Brust and special guest, Dav
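The point about warehouses leveraging the same storage that hosts data lakes can be illustrated with a short sketch: running warehouse-style SQL directly over raw Parquet files in object storage. PySpark is used here purely as an example engine; the bucket path and column names are hypothetical, and the cluster is assumed to have its S3 connector configured.

```python
# Sketch: warehouse-style SQL over "raw" files sitting in a data lake.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lakehouse-sketch").getOrCreate()

# Raw data in the lake: Parquet files on object storage (path is made up;
# assumes the S3A connector is configured on the cluster).
events = spark.read.parquet("s3a://example-data-lake/clickstream/2019/")
events.createOrReplaceTempView("clickstream")

# The same data served to a conventional, recurring SQL analysis.
daily = spark.sql("""
    SELECT event_date, COUNT(*) AS events
    FROM clickstream
    GROUP BY event_date
    ORDER BY event_date
""")
daily.show()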
Published By: Group M_IBM Q3'19     Published Date: Sep 04, 2019
In the last few years we have seen a rapid evolution of data. The need to embrace the growing volume, velocity and variety of data from new technologies such as Artificial Intelligence (AI) and Internet of Things (IoT) has been accelerated. The ability to explore, store, and manage your data and therefore drive new levels of analytics and decision-making can make the difference between being an industry leader and being left behind by the competition. The solution you choose must be able to (see the sketch after this list):
• Harness exponential data growth as well as semistructured and unstructured data
• Aggregate disparate data across your organization, whether on-premises or in the cloud
• Support the analytics needs of your data scientists, line of business owners and developers
• Minimize difficulties in developing and deploying even the most advanced analytics workloads
• Provide the flexibility and elasticity of a cloud option but be housed in your data center for optimal security and compliance
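As a small illustration of the first two requirements, the sketch below combines structured CSV data with semistructured JSON in a single analysis. pandas, the file names, and the fields are illustrative assumptions, not part of the IBM solution described above.

```python
# Sketch: aggregating structured and semistructured data for one analysis.
import json
import pandas as pd

# Structured data: a conventional tabular extract (hypothetical file/columns).
orders = pd.read_csv("orders.csv")

# Semistructured data: nested JSON events from IoT devices (hypothetical file,
# assumed to contain a list of event objects).
with open("iot_events.json") as f:
    events = pd.json_normalize(json.load(f))   # flattens nested fields into columns

# Aggregate the disparate sources on a shared key and analyze them together.
combined = orders.merge(events, on="device_id", how="left")
print(combined.groupby("region")["reading.temperature"].mean())
```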
Published By: Group M_IBM Q3'19     Published Date: Aug 12, 2019
Welcome to Secure Hybrid Cloud For Dummies, IBM Limited Edition. The hybrid cloud is becoming the way enterprises are transforming their organizations to meet changing customer requirements. Businesses are discovering that in order to support the needs of customers, there is an imperative to leverage the highly secure IBM Z platform to support mission-critical workloads, such as transaction management applications. The Z platform has been transformed over the years. The combination of z/OS, LinuxONE, open APIs, and the inclusion of Kubernetes has made IBM Z a critical partner in the hybrid cloud world. Businesses can transform their IBM Z environments into a secure, private cloud. In addition, through IBM’s public cloud, businesses may take advantage of IBM Z’s security services to protect their data and applications.
Published By: Cohesity     Published Date: Aug 09, 2019
As organizations continue to look for ways to increase business agility, the need for a modern database architecture that can rapidly respond to the needs of business is more apparent than ever. While an RDBMS still serves as a lifeline for many organizations, the adoption of technologies such as NoSQL and Hadoop is enabling organizations to best address database performance and scalability requirements while also satisfying the goals of embracing hybrid cloud and becoming more data-driven. And with organizations relying so heavily on these new technologies to yield rapid insights that positively impact the business, the need to evaluate how those new technologies are managed and protected is essential. Hadoop and NoSQL workloads are now pervasive in production environments and require “production-class” data protection, yet few data protection solutions offer such capabilities today.
Published By: Cohesity     Published Date: Aug 09, 2019
In a context of mass data fragmentation on-premises and in the cloud, organizations now struggle with the compounded complexities brought about by modern workloads such as containers, NoSQL/NewSQL databases, and SaaS applications. These new workloads are turning traditional backup and recovery approaches on their head—in particular, in Microsoft Office 365 deployments for which new backup, recovery, and data management schemas must be deployed.
Published By: Cohesity     Published Date: Aug 09, 2019
IT organizations everywhere are undergoing significant transformation to keep pace with the needs of their businesses. They’re tasked with consolidating data centers and migrating both workloads and data to the cloud. The transition has been easier for some than others. As hybrid architectures increasingly become the norm, how are enterprises gaining complete visibility, simplifying management, and making use of all of their data—both on-premises and in the cloud? Five enterprises explain how they’ve replaced multiple products that created legacy data silos with Cohesity – a single, hyperconverged, software-defined platform with native Microsoft Azure integration for simplified secondary data and applications. For them, Cohesity and Azure together boost IT agility while lowering costs, solving critical secondary data challenges spanning long-term retention, storage tiering, test/dev, disaster recovery and cloud-native backup in a proven hybrid cloud architecture.
Published By: Cohesity     Published Date: Aug 09, 2019
Data for secondary workloads – backup, test/dev, disaster recovery, and archiving to name a few – has become siloed the same way application data has, leading to multiple point solutions to manage an increasing amount of data. This white paper looks at the evolution of these challenges and offers practical advice on ways to store, manage and move secondary data in hybrid cloud architectures while extracting the hidden value it can provide.
Published By: Dell EMC     Published Date: Aug 01, 2019
In the Principled Technologies datacenter, we tested the All-Flash Dell EMC SC5020 storage array and the HPE Nimble Storage AF5000 array to see how well they performed while handling two workloads at once. The Dell EMC array handled transactional database workloads and data mart imports better than the HPE solution without sacrificing performance. Download this whitepaper from Dell and Intel® to learn more.
Published By: Dell EMC     Published Date: Aug 01, 2019
Software might run the world, but software still runs on hardware. It’s a misperception that hardware has little value anymore. Every application, every workload, every data set runs on physical servers. Read “Hardware Does Matter: Global Server Brands are Perceived as Superior for Driving Digital Business,” a Frost & Sullivan survey of 500 IT decision makers on the value of global server brands vs. commodity servers. Look beyond commodity status to discover:
• Key server purchase criteria
• How top brands directly compare
• How to choose based on workload
Server brands vary significantly, and a commodity brand may not provide the outcomes you need, especially for new and next-generation applications. Download this analyst report from Dell EMC and Intel® to learn more.
Published By: AWS     Published Date: Jul 29, 2019
What you'll learn in this webinar:
Optimize your operations by taking advantage of the modern, scalable cloud infrastructure available on Amazon Web Services (AWS). Migrate your Oracle applications and databases to AWS and get all the benefits of the cloud. Migrating mission-critical Oracle databases and applications to the cloud is complex and you may feel locked into your platform. Amazon Aurora provides commercial-grade database performance and availability at a fraction of the cost. Apps Associates—an AWS Partner Network (APN) Partner and Oracle expert—can migrate enterprise workloads to the cloud, freeing customers to focus on higher-value initiatives. Watch this webinar to learn how to (see the sketch after this list):
• Run your entire Oracle database and application environment on the cloud
• Take advantage of lower IT costs on the cloud and reduce your Total Cost of Ownership (TCO)
• Leverage Amazon Aurora to help satisfy your company's cloud-first mandate, improve security, and reduce risk
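As a small, hedged sketch of what running your application environment against Aurora can look like after such a migration, the snippet below connects to an Amazon Aurora PostgreSQL-compatible cluster endpoint. The endpoint, credentials, and table are hypothetical, and psycopg2 is only one of several client libraries you might use.

```python
# Sketch: an application querying an Aurora PostgreSQL-compatible cluster.
import psycopg2

conn = psycopg2.connect(
    host="mydb-cluster.cluster-abc123.us-east-1.rds.amazonaws.com",  # hypothetical endpoint
    port=5432,
    dbname="erp",
    user="app_user",
    password="example-password",
    sslmode="require",               # encrypt the connection in transit
)

with conn, conn.cursor() as cur:
    cur.execute("SELECT order_id, total FROM orders WHERE status = %s", ("OPEN",))
    for order_id, total in cur.fetchmany(10):
        print(order_id, total)

conn.close()
```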
Published By: IBM APAC     Published Date: Jul 19, 2019
It’s important to understand the impact of AI workloads on data management and storage infrastructure. If you’re selecting infrastructure for AI workloads involving ML and deep learning, you need to account for the unique requirements of these emerging workloads, especially if you’re looking to use them to accelerate innovation and agility. This Gartner report highlights three main impacts that AI workloads have on data management and storage.
Published By: NetApp APAC     Published Date: Jul 04, 2019
This IDC study provides an evaluation of 10 vendors that sell all-flash arrays (AFAs) for dense mixed enterprise workload consolidation that includes at least some mission-critical applications. "All-flash arrays are dominating primary storage spend in the enterprise, driving over 80% of that revenue in 2017," said Eric Burgener, research director, Storage. "Today's leading AFAs offer all the performance, capacity scalability, enterprise-class functionality, and datacenter integration capabilities needed to support dense mixed enterprise workload consolidation. More and more IT shops are recognizing this and committing to 'all flash for primary storage' strategies."
Published By: Pure Storage     Published Date: Jul 03, 2019
For thousands of organizations, Splunk® has become mission-critical. But it’s still a very demanding workload. Pure Storage solutions dramatically improve Splunk Enterprise deployments by accelerating data ingest, indexing, search, and reporting capabilities – giving businesses the speed and intelligence to make faster, more informed decisions.
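For context on the kind of search workload being accelerated, here is a minimal sketch that runs a Splunk search through the REST API's export endpoint. The host, credentials, and index are hypothetical; the requests library simply stands in for whatever client your environment uses.

```python
# Sketch: running a Splunk search via the REST API export endpoint.
import requests

SPLUNK = "https://splunk.example.com:8089"   # hypothetical management endpoint

resp = requests.post(
    f"{SPLUNK}/services/search/jobs/export",
    auth=("admin", "example-password"),
    data={
        "search": "search index=web_logs status=500 earliest=-15m | stats count by host",
        "output_mode": "json",
    },
    verify=False,          # lab only; use proper certificates in production
    stream=True,
)

for line in resp.iter_lines():
    if line:
        print(line.decode("utf-8"))   # one JSON result object per line
```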
Published By: Rackspace     Published Date: May 28, 2019
Today, it isn’t a matter of if you’re taking SAP to the cloud – but a matter of when, and how you’re going to make it happen. That’s because by moving your SAP workloads to the cloud, you put them together with other data streams, advanced analytics and machine learning, creating a powerful combination to better engage customers, empower employees, optimize operations and transform products. This e-book introduces Rackspace as the managed cloud service provider to partner with for moving SAP workloads to Azure. Being certified in all the leading SAP technologies, including hosting services, HANA Operations and HANA Enterprise Cloud (HEC) – and having been awarded Microsoft Hosting Partner of the Year five times – Rackspace has the whole SAP on Azure solution covered, from planning to deployment and ongoing management. Check out the case studies of global companies, like Rockwell Automation, The Mosaic Company, Malaysia Airlines and Coats & Clark, to discover how they have benefited.
Published By: Alert Logic     Published Date: May 23, 2019
Organizations continue to adopt cloud computing at a rapid pace to benefit from increased efficiency, better scalability, and faster deployments. As more workloads are shifting to the cloud, cybersecurity professionals remain concerned about security of data, systems, and services in the cloud. To cope with new security challenges, security teams are forced to reassess their security posture and strategies as traditional security tools are often not suited for the challenges of dynamic, virtual and distributed cloud environments. This technology challenge is only exacerbated by the dramatic shortage of skilled cybersecurity professionals.
Published By: Infinidat EMEA     Published Date: May 14, 2019
Big Data and analytics workloads represent a new frontier for organizations. Data is being collected from sources that did not exist 10 years ago. Mobile phone data, machine-generated data, and website interaction data are all being collected and analyzed. In addition, as IT budgets are already under pressure, Big Data footprints are getting larger and posing a huge storage challenge. This paper provides information on the issues that Big Data applications pose for storage systems and how choosing the correct storage infrastructure can streamline and consolidate Big Data and analytics applications without breaking the bank.
Published By: Infinidat EMEA     Published Date: May 14, 2019
Infinidat has developed a storage platform that provides unique simplicity, efficiency, reliability, and extensibility that enhances the business value of large-scale OpenStack environments. The InfiniBox® platform is a pre-integrated solution that scales to multiple petabytes of effective capacity in a single 42U rack. The platform’s innovative combination of DRAM, flash, and capacity-optimized disk, delivers tuning-free, high performance for consolidated mixed workloads, including object/Swift, file/Manila, and block/Cinder. These factors combine to cut direct and indirect costs associated with large-scale OpenStack infrastructures, even versus “build-it-yourself” solutions. InfiniBox delivers seven nines (99.99999%) of availability without resorting to expensive replicas or slow erasure codes for data protection. Operations teams appreciate our delivery model designed to easily drop into workflows at all levels of the stack, including native Cinder integration, Ansible automation pl
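To illustrate what the Cinder integration mentioned above is ultimately consumed through, the sketch below provisions a block storage volume with the openstacksdk library. The cloud profile, volume type, and size are hypothetical, and this is not Infinidat's driver code.

```python
# Sketch: creating a Cinder block storage volume through openstacksdk.
import openstack

conn = openstack.connect(cloud="mycloud")       # credentials taken from clouds.yaml

volume = conn.block_storage.create_volume(
    name="analytics-scratch-01",
    size=500,                                   # GiB
    volume_type="infinidat-capacity",           # hypothetical backend volume type
)
conn.block_storage.wait_for_status(volume, status="available")
print(volume.id, volume.status)
```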