Deploying a Hadoop Cluster

Udacity
Online

Free

Important information

  • Course
  • Online
  • When:
    Anytime (self-paced)
Description

Learn how to tackle big data problems with your own Hadoop clusters! In this course, you’ll deploy Hadoop clusters in the cloud and use them to gain insights from large datasets.

Venue(s)

Where and when

Start: Anytime
Location: Online

What will you learn in this course?

Hadoop
Amazon EC2
Ambari
MapReduce
Hadoop cluster

Topic area

Level: Intermediate

Duration: approx. 3 weeks (assumes 6 hrs/week; work at your own pace)


Why Take This Course?

Using massive datasets to guide decisions is becoming more and more important for modern businesses. Hadoop and MapReduce are fundamental tools for working with big data. By knowing how to deploy your own Hadoop clusters, you’ll be able to start exploring big data on your own.

Prerequisites and Requirements

This course is intended for students with some experience with Hadoop and MapReduce, Python, and bash commands.

You’ll need to be able to work with HDFS and write MapReduce programs. You can learn both in our Intro to Hadoop and MapReduce course.

The MapReduce programs in the course are written in Python. Java and other languages are possible, but we suggest Python, at the level of our Intro to Computer Science course.
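Hadoop Streaming, the mechanism that lets MapReduce programs be written in Python, simply pipes records through your mapper and reducer via stdin/stdout. As a rough local analogy (not the course's actual code), a POSIX pipeline can mimic the map → shuffle → reduce flow for a word count:

```shell
# Local stand-in for the MapReduce data flow (word count):
#   map:     emit one key per line (what a streaming mapper's stdout looks like)
#   shuffle: sort brings identical keys together
#   reduce:  count each run of identical keys
printf 'big data big cluster data big\n' \
  | tr ' ' '\n' \
  | sort \
  | uniq -c \
  | awk '{print $2 "\t" $1}'
# prints:
#   big     3
#   cluster 1
#   data    2
```

In a real job, Hadoop performs the sort/shuffle step across the cluster; only the mapper and reducer stages are your code.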

You’ll also be using remote cloud machines, so you’ll need to know these bash commands:

  • ssh
  • scp
  • cat
  • head/tail
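For orientation, here is how those commands are typically used; the hostname and key file below are placeholders for illustration only, and the file-inspection commands can be tried locally:

```shell
# Remote-access commands (placeholder host/key -- substitute your own):
# ssh -i mykey.pem ubuntu@your-ec2-host                 # log in to a cloud machine
# scp -i mykey.pem data.txt ubuntu@your-ec2-host:~/     # copy a file to the machine

# File-inspection commands, runnable locally:
printf 'line1\nline2\nline3\nline4\nline5\n' > sample.txt
cat sample.txt          # print the whole file
head -n 2 sample.txt    # first two lines
tail -n 2 sample.txt    # last two lines
rm sample.txt
```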

You’ll also need to be able to work in an editor such as vim or nano. You can learn about these in our Linux Command Line Basics course.

See the Technology Requirements for using Udacity.

Syllabus

Deploying a Hadoop cluster on Amazon EC2

Deploy a small Hadoop cluster on Amazon EC2 instances.

Deploy a Hadoop cluster with Ambari

Use Apache Ambari to automatically deploy a larger, more powerful Hadoop cluster.

On-demand Hadoop clusters

Use Amazon’s Elastic MapReduce (EMR) to deploy a Hadoop cluster on demand.

Project: Analyzing a big dataset with Hadoop and MapReduce

Use Hadoop and MapReduce to analyze a 150 GB dataset of Wikipedia page views.
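The public Wikipedia pageview dumps use one record per line in the form "project page_title requests bytes" (the exact dataset and layout used in the project may differ). A small sketch of the kind of aggregation such a project involves, collapsing the mapper (filter English pages, emit title and count) and reducer (sum counts per title) into a single local awk pass:

```shell
# Assumed record format: "project page_title requests bytes"
printf '%s\n' \
  'en Hadoop 12 3400' \
  'de Hadoop 5 900' \
  'en MapReduce 7 2100' \
  'en Hadoop 3 800' > pageviews.txt

# Mapper logic: keep English-project lines, emit (title, requests).
# Reducer logic: sum requests per title.
awk '$1 == "en" { sum[$2] += $3 } END { for (t in sum) print t "\t" sum[t] }' pageviews.txt | sort
# prints:
#   Hadoop      15
#   MapReduce   7
rm pageviews.txt
```

On the full 150 GB dataset, Hadoop runs the same mapper and reducer logic in parallel across the cluster instead of in one pass on one machine.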