Webinar Replays

  • Sep 8 2016

    Migrating Your Big Data Infrastructure to Cloud

    GigaOM analyst William McKnight will be joined by experts from Qubole and WANdisco, who will explain the benefits of moving to the cloud and review the tools available for cloud migration and hybrid cloud deployments. Learn what's required to avoid the downtime and business disruption that often accompany cloud migration projects. Limited Time Offer - View Qubole and WANdisco's Special Quick Start Package: http://bit.ly/2cGupC6
  • Jul 21 2016

    Deploying Mission Critical Applications on Hadoop, On-premises and in the Cloud

    Global enterprises have quietly funneled enormous amounts of data into Hadoop over the last several years, and Hadoop has transformed the way organizations deal with big data. By making vast quantities of rich unstructured and semi-structured data quickly and cheaply accessible, Hadoop has opened up a host of analytic capabilities that were never possible before, driving business value. The challenges have revolved around operationalizing Hadoop to enterprise standards, and around leveraging cloud-based Hadoop-as-a-service (HaaS) options that offer a vast array of analytics applications and processing capacity that would be impossible to deploy and maintain in-house. This webcast will explain how solutions from IBM and WANdisco address these challenges by supporting:
    - Continuous availability with guaranteed data consistency across Hadoop clusters any distance apart, both on-premises and in the cloud.
    - Migration to cloud without downtime, and hybrid cloud for burst-out processing and offsite disaster recovery.
    - Flexibility to eliminate Hadoop distribution vendor lock-in and support migration to cloud without downtime or disruption.
    - IBM's BigInsights in the cloud, and Big SQL, which allows you to run standard ANSI-compliant SQL against your Hadoop data.
  • Jun 23 2016

    Build An Effective, Fast And Secure Data Engine With Hortonworks & WANdisco

    Data is coming from everywhere. The first challenge is simply getting hold of it and curating and conveying it in a secure and transparent manner. Hortonworks DataFlow is the tool that collects data at the edge, processes and secures data in motion, and brings data into your data-at-rest platform (HDP). Once you have your data collected in a valuable data lake, you need resilience, control over its location, and safety against failure. That’s where WANdisco Fusion and Hortonworks HDP come in. With WANdisco Fusion on HDP, an enterprise can build an effective, fast and secure data engine out of multiple Hadoop clusters, getting the most business value out of its HDP deployment with a reliable and high-performing Big Data service. Join Hortonworks & WANdisco for this webinar to learn how to make this a reality.
  • Jun 9 2016

    Bringing Hadoop into the Banking Mainstream

    Global banks have the most rigorous availability, performance and data security standards. Join 451 Research and WANdisco as we explore the cutting-edge techniques leading financial services firms are using to fully operationalize Hadoop to meet these standards and leap ahead of their competition. Register for this webinar and get the free white paper entitled "Bringing Hadoop into the Banking Mainstream".
  • Apr 21 2016

    Making Hybrid Cloud a Reality

    Solutions for seamlessly moving data between on-premises and cloud environments are virtually non-existent. This webinar explains how to achieve a true hybrid cloud deployment that supports on-demand burst-out processing, in which data moves in and out of the cloud as it changes, and enables the cloud to be used for offsite disaster recovery without downtime or data loss.
  • Feb 11 2016

    ETL and Big Data: Building Simpler Data Pipelines

    In the traditional world of EDW, ETL pipelines are a troublesome bottleneck when preparing data for use in the data warehouse. ETL pipelines are notoriously expensive and brittle, so as companies move to Hadoop they look forward to getting rid of the ETL infrastructure. But is it that simple? Some companies are finding that in order to move data between clusters for backup or aggregation purposes, they are building systems that look an awful lot like ETL.
  • Oct 22 2015

    No More DR Sites

    Disaster recovery sites are typically underutilized, with idle hardware and software that are only used in an emergency. Why let your valuable resources remain idle? In this webinar, you’ll learn how you can take full advantage of the resources in what would ordinarily be your DR site by using active-active replication to provide full utilization as well as complete failover with lower RPO (recovery point objective) and RTO (recovery time objective).
  • Oct 8 2015

    EMEA/APAC - Hadoop Migration and Upgrade without Downtime or Data Loss

    Migrating your Hadoop cluster between versions or distributions is difficult. It is a critical process that, if done incorrectly, can lead to data loss, system downtime, and disruption of business activities. In this webinar, learn how you can mitigate migration risk by developing a comprehensive migration strategy and leveraging tools like those from WANdisco to simplify and automate your migration.
  • Oct 6 2015

    Americas - Hadoop Migration and Upgrade without Downtime or Data Loss

    Migrating your Hadoop cluster between versions or distributions is difficult. It is a critical process that, if done incorrectly, can lead to data loss, system downtime, and disruption of business activities. In this webinar, learn how you can mitigate migration risk by developing a comprehensive migration strategy and leveraging tools like those from WANdisco to simplify and automate your migration.
  • Sep 24 2015

    Running Globally Distributed Hadoop Clusters

    Join us for this webinar to learn how active-active replication, available only through WANdisco Fusion, allows you to run a single Hadoop namespace across multiple clusters located around the world. In this webinar, you’ll see:
    * How slower, lower-bandwidth WAN connections can impact distributed Hadoop clusters, and how to mitigate that impact
    * How to use 100% of your data center resources for robust disaster recovery without leaving hardware idle
    * How to keep your Hadoop clusters running during necessary maintenance and upgrades
  • Sep 2 2015

    Enterprise Class Replication for Hadoop and Why You Need It

    To make your Hadoop deployment enterprise-class, robust data replication is required to support the business-critical functions that depend on Hadoop. Firms can no longer rely on the status quo of traditional, slow data backups onto underutilized hardware. Enterprise-class active-active replication delivers several benefits, including:
    - Maximum infrastructure resource utilization
    - Better performance across geographies
    - Faster disaster recovery
    In this webinar, learn how WANdisco Fusion enables true enterprise-class replication for Hadoop.
  • Aug 10 2015

    Big Data Solutions with Aptus Data Labs and WANdisco

    Ravindra Swamy, Chief Technologist, and Ankur Gupta, VP of Client Services, will present Aptus Data Labs’ data-to-decision (D2D) framework, Big Data/Hadoop services, and best practices. Paul Scott-Murphy, Vice President of Field Technical Services at WANdisco, will introduce WANdisco Fusion, the only active-active solution for total data protection and availability across Hadoop distributions and storage, and explain the advantages of implementing it in your business.
  • Jul 23 2015

    Global Hadoop: Storage and Compute Challenges

    Enterprise Hadoop applications require continuous operation in the face of complete data center failure. To address this, businesses have taken a multi-data center, global approach to Hadoop. In addition, organizations have begun to utilize multi-directional data sharing between clusters to get more out of their existing infrastructure. In this webinar, we'll examine storage and compute challenges in operating Hadoop over a WAN, and lay out the blueprint for an ideal solution.
  • Jun 18 2015

    Managing hybrid on-premise/cloud Hadoop deployments

    A growing number of Hadoop adopters are making use of both on-premises and cloud clusters. Production workloads run on-premises to provide the best performance and security, while cloud clusters are reserved for testing, development, and burst-out processing power. In this webinar we'll review best practices for managing a hybrid environment, including security, data transfer, and performance.
  • May 14 2015

    Big Data Storage: Options & Recommendations

    Hadoop clusters are often built around commodity storage, but architects now have a wide selection of Big Data storage choices. Hadoop clusters can use a mix of solid-state and spinning disk storage, while Hadoop compatibility layers and connectors can use enterprise storage systems or share storage between Hadoop and legacy applications. In this webinar, 451 Research Director Matt Aslett will review the storage options available to Hadoop architects and provide recommendations for each use case. WANdisco's Randy DeFauw will then present an active-active replication option that makes data available across multiple storage systems.
  • Mar 12 2015

    WANdisco Fusion: Spanning Hadoop distributions, versions, and storage

    Many Hadoop deployments comprise a mix of versions and distributions of Hadoop running in different clusters, and often include specialized storage and file systems. WANdisco Fusion extends the concept of a Hadoop Compatible File System (HCFS) to span these diverse environments, using active-active replication to unify portions of the Hadoop namespace between clusters. In this webinar, Jagane Sundar, WANdisco's CTO and formerly a Hadoop architect at Yahoo, will discuss this new product and its practical applications.
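
    To make the HCFS idea concrete: any storage layer that implements Hadoop's standard FileSystem API can be addressed by unmodified client code through a URI scheme, so a spanned, replicated namespace looks like ordinary Hadoop storage to applications. Below is a minimal Java sketch of that pattern; the fusion:/// scheme and the /repl path are illustrative assumptions for this sketch, not documented product behavior.

        import java.net.URI;
        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FSDataOutputStream;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class HcfsWriteSketch {
            public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();
                // The URI scheme selects the FileSystem implementation;
                // client code is the same for hdfs:// and for an HCFS
                // layer that replicates writes between clusters.
                // "fusion:///" is assumed here purely for illustration.
                FileSystem fs = FileSystem.get(URI.create("fusion:///"), conf);
                try (FSDataOutputStream out = fs.create(new Path("/repl/events/part-0001"))) {
                    out.writeBytes("written once, visible in every replicated cluster\n");
                }
            }
        }

    Because applications see only the FileSystem interface, the same code runs unchanged against plain HDFS or a namespace unified across clusters.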
  • Feb 12 2015

    Securing Sensitive Data in Hadoop: Challenges and New Approaches

    Security and compliance concerns are often some of the biggest barriers to wider rollout of Hadoop-based analytics projects. Almost 10 years after the introduction of Hadoop, data security is finally receiving a lot of attention from the Hadoop community, yet businesses are still reluctant to rely on it to support critical applications. In this webinar, Wikibon analyst Jeff Kelly will review the key security and compliance concerns of Big Data professionals, and discuss the current range of solutions and where they may fall short. Afterwards, WANdisco’s Randy DeFauw will present a novel approach to Big Data security.
  • Jan 21 2015

    ETL and Big Data: Building Simpler Data Pipelines

    In the traditional world of EDW, ETL pipelines are a troublesome bottleneck when preparing data for use in the data warehouse. ETL pipelines are notoriously expensive and brittle, so as companies move to Hadoop they look forward to getting rid of the ETL infrastructure. But is it that simple? Some companies are finding that in order to move data between clusters for backup or aggregation purposes, they are building systems that look an awful lot like ETL. In this webinar 451 Research will review the current state of the industry: how companies are changing their ETL systems as they move to Hadoop, what challenges they’re facing, and how to minimize complex data transfer processes in Hadoop data pipelines.
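
    The hand-rolled inter-cluster transfers described above can be sketched with nothing more than Hadoop's public FileSystem API; it is exactly this kind of job that tends to accumulate ETL-like scheduling, retry, and validation logic around it. A minimal Java illustration, with made-up NameNode addresses and paths:

        import java.net.URI;
        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.FileUtil;
        import org.apache.hadoop.fs.Path;

        public class InterClusterCopy {
            public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();
                // Hypothetical NameNode addresses and warehouse layout.
                FileSystem src = FileSystem.get(URI.create("hdfs://primary-nn:8020"), conf);
                FileSystem dst = FileSystem.get(URI.create("hdfs://backup-nn:8020"), conf);
                Path partition = new Path("/warehouse/sales/dt=2015-01-20");
                // Copy one day's partition for backup or aggregation. A
                // production pipeline adds scheduling, retries, and
                // consistency checks, which is where the ETL-like
                // complexity creeps back in.
                FileUtil.copy(src, partition, dst, partition, false /* deleteSource */, conf);
            }
        }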
  • Dec 9 2014

    Bringing Hadoop into the Banking Mainstream

    Financial services companies are deadly serious about money and ambitious in using new technology to support their operations. Financial firms know that time and bits of information can be critical to success, placing rigorous performance and availability standards on Hadoop. Data from around the world must be ingested rapidly with zero chance of interruption, and often must remain in its country of origin for analysis, with only aggregate results passed upstream. How do you manage these demands with traditional Hadoop technologies? Join WANdisco and guest speakers Leslie Owens and Jost Hoppermann of Forrester Research to explore the cutting-edge techniques, backed by rock-solid reliability, that let financial services firms leap ahead of the competition.
  • Nov 6 2014

    HBase Driver Handbook: Use Cases and Challenges

    HBase is the premier NoSQL database for critical business applications. This webinar will provide real-world examples of large-scale HBase use cases, address the major stumbling blocks that businesses face in HBase deployment, and show how WANdisco's Non-Stop technology addresses these issues.
  • Oct 23 2014

    Get More Value out of Multiple Hadoop Data Centers - Hosted by O'Reilly Media

    For businesses that depend on Hadoop for critical operations, secondary clusters are commonly used for backup, sitting idle while the primary cluster handles the workload. In this webcast, we'll examine how to get the most out of your multi-data center Hadoop investment. Topics include:
    • Understanding how secondary data centers are typically set up and what role they play
    • Maximizing hardware utilization by running jobs in multiple data centers
    • How to efficiently split tasks between clusters
  • Oct 21 2014

    Supporting Financial Services with a More Flexible Approach to Big Data

    In this webinar we'll look at three examples of using 'Big Data' to get a more comprehensive view of customer behavior and activity in the banking and insurance industries. Then we'll pull out the common threads from these examples, and see how a flexible next-generation Hadoop architecture lets you get a step up on improving your business performance. Join us to learn:
    - How to leverage data from across an entire global enterprise
    - How to analyze a wide variety of structured and unstructured data to get quick, meaningful answers to critical questions
    - What industry leaders have put in place
  • Oct 15 2014

    Apache Hadoop: Is one cluster enough? - Hosted by GigaOm

    In this Gigaom Research webinar, the panel will discuss how the multi-cluster approach can be implemented in real systems, and whether and how it can be made to work. The panel will also talk about best practices for implementing the approach in organizations.
    • Does Apache YARN make all tasks equal, or does dedicating clusters to specific workloads make more sense?
    • Is the data lake concept best for all, or is partitioning data between clusters right for some customers?
    • Can Hadoop inter-cluster replication of data work?
    • How do public and private cloud architectures impact the multi-cluster question?
    • Can multiple clusters be a vector of parallelism and elasticity?
  • Jan 28 2014

    Getting Hadoop through an IT Audit

    Join GigaOM Research and sponsor WANdisco for an analyst webinar on hardening Hadoop across global data centers and ensuring continuous availability, even during maintenance windows. In this webinar we'll cover:
    • What’s driving Hadoop adoption in the enterprise? (Use cases and examples)
    • How are companies hardening Hadoop to meet IT audits?
    • Trends in Hadoop infrastructure over the next 2-3 years