Webinar Replays

SEPTEMBER

  • Sep 8 2016

    Migrating Your Big Data Infrastructure to Cloud

    GigaOM analyst William McKnight will be joined by experts from Qubole and WANdisco, who will explain the benefits of moving to the cloud and review the tools available for cloud migration and hybrid cloud deployments. Learn what's required to avoid the downtime and business disruption that often accompany cloud migration projects.

    Limited Time Offer - View Qubole and WANdisco's Special Quick Start Package: http://bit.ly/2cGupC6

JULY

  • Jul 21 2016

    Deploying Mission Critical Applications on Hadoop, On-premises and in the Cloud

    Global enterprises have quietly funneled enormous amounts of data into Hadoop over the last several years, and Hadoop has transformed the way organizations deal with big data. By making vast quantities of rich unstructured and semi-structured data quickly and cheaply accessible, Hadoop has opened up a host of analytic capabilities to drive business value that were never possible before. The challenges have revolved around operationalizing Hadoop to enterprise standards, and around leveraging cloud-based Hadoop-as-a-service (HaaS) options that offer a vast array of analytics applications and processing capacity impossible to deploy and maintain in-house. This webcast will explain how solutions from IBM and WANdisco address these challenges by supporting:

    - Continuous availability with guaranteed data consistency across Hadoop clusters any distance apart, both on-premises and in the cloud.
    - Migration to the cloud without downtime or disruption, and hybrid cloud for burst-out processing and offsite disaster recovery.
    - Flexibility to eliminate Hadoop distribution vendor lock-in.
    - IBM's BigInsights in the cloud, and BigSQL, which allows you to run standard ANSI-compliant SQL against your Hadoop data (see the sketch below).
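
    As an illustration of the BigSQL point above, here is a minimal sketch of running an ANSI-compliant query against a Hadoop-backed table over ODBC from Python. The pyodbc calls are real library API, but the data source name (BIGSQL), the credentials, and the sales_orders table are illustrative assumptions, not details from the webinar.

        # Minimal sketch: querying Hadoop data through a SQL-on-Hadoop engine
        # such as BigSQL over ODBC. Assumptions: an ODBC driver is installed
        # and a data source named "BIGSQL" is configured; the sales_orders
        # table and the credentials are hypothetical.
        import pyodbc

        # Connect through the configured ODBC data source.
        conn = pyodbc.connect("DSN=BIGSQL;UID=bigsql;PWD=changeme")
        cur = conn.cursor()

        # Standard ANSI SQL runs unchanged against data stored in Hadoop.
        cur.execute("""
            SELECT region, SUM(amount) AS total
            FROM sales_orders
            WHERE order_date >= '2016-01-01'
            GROUP BY region
            ORDER BY total DESC
        """)

        for region, total in cur.fetchall():
            print(region, total)

        cur.close()
        conn.close()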

JUNE

  • Jun 23 2016

    Build An Effective, Fast And Secure Data Engine With Hortonworks & WANdisco

    Data is coming from everywhere. The first challenge is simply getting hold of it, then curating and conveying it in a secure and transparent manner. Hortonworks DataFlow (HDF) is the tool that collects data at the edge, processes and secures data in motion, and brings data into your data-at-rest platform (HDP). Once you have your data collected in a valuable data lake, you need resilience, control over its location, and safety against failure. That's where WANdisco Fusion and Hortonworks HDP come in. With WANdisco Fusion on HDP, an enterprise can build an effective, fast and secure data engine out of multiple Hadoop clusters, getting the most business value out of its HDP deployment with a reliable, high-performing big data service. Join Hortonworks and WANdisco for this webinar to learn how to make this a reality.
  • Jun 9 2016

    Bringing Hadoop into the Banking Mainstream

    Global banks have the most rigorous availability, performance and data security standards. Join 451 Research and WANdisco as we explore the cutting-edge techniques leading financial services firms are using to fully operationalize Hadoop to meet these standards and leap ahead of their competition. Register for this webinar and get the free white paper entitled "Bringing Hadoop into the Banking Mainstream"