Hadoop Ecosystem Essentials
Posted 2 years ago by Packt
Learn the skills needed to succeed as a data analyst
For data analysts, Hadoop is an extremely powerful tool for processing large amounts of data. It grew out of Google's MapReduce and GFS papers and is used at scale by successful companies such as Spotify.
On this four-week course, you’ll learn how to use Hadoop to its full potential to make it easier for you to store, analyse, and scale big data.
Through step-by-step guides and exercises, you’ll gain the knowledge and practical skills to take into your role in data analytics.
Understand how to manage your Hadoop cluster
You’ll understand how to manage clusters with Yet Another Resource Negotiator (YARN), Mesos, ZooKeeper, Oozie, Zeppelin, and Hue.
With this knowledge, you’ll be able to ensure high performance, workload management, security, and more.
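At its core, YARN manages a cluster by granting "containers" (slices of CPU and memory) to competing applications. A toy sketch of that negotiation, with a single scheduler and a fixed memory pool standing in for the real ResourceManager and NodeManagers (all class and application names here are hypothetical, for illustration only):

```python
# Toy sketch of YARN-style container negotiation (all names hypothetical).
# A real cluster does this via the ResourceManager and per-node NodeManagers;
# here a single Scheduler grants containers from one fixed memory pool.

from dataclasses import dataclass, field

@dataclass
class Container:
    app_id: str
    memory_mb: int

@dataclass
class Scheduler:
    total_memory_mb: int
    granted: list = field(default_factory=list)

    def available_mb(self) -> int:
        # Capacity not yet handed out as containers.
        return self.total_memory_mb - sum(c.memory_mb for c in self.granted)

    def request(self, app_id: str, memory_mb: int):
        """Grant a container if capacity allows, else refuse (return None)."""
        if memory_mb <= self.available_mb():
            container = Container(app_id, memory_mb)
            self.granted.append(container)
            return container
        return None

    def release(self, container: Container) -> None:
        # Freed capacity becomes available to the next request.
        self.granted.remove(container)

sched = Scheduler(total_memory_mb=8192)
c1 = sched.request("app-1", 4096)   # granted: 4096 MB remain
c2 = sched.request("app-2", 6144)   # refused: only 4096 MB left
print(c1 is not None, c2 is None, sched.available_mb())
```

The real scheduler is far more sophisticated (queues, fairness, locality), but the request/grant/release cycle above is the basic contract every YARN application follows.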
Learn how to analyse streams of data
Next, you’ll uncover the techniques for handling and analysing streams of data in real time using Kafka, Flume, Spark Streaming, Flink, and Storm.
This understanding will help you to react and respond quickly to any issues that may arise.
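One of the most common operations these engines perform is a windowed aggregation: grouping a live stream of events into fixed time windows and counting within each. A minimal pure-Python sketch of a tumbling-window count (the events and window size are invented for illustration; Spark Streaming and Flink provide this as a built-in, distributed operation):

```python
# Minimal sketch of a tumbling-window count, the basic aggregation that
# engines like Spark Streaming and Flink apply to live event streams.
# Events are (timestamp_seconds, key) pairs, made up for illustration.

from collections import Counter, defaultdict

def tumbling_window_counts(events, window_secs):
    """Group (timestamp, key) events into fixed-size windows and count keys."""
    windows = defaultdict(Counter)
    for ts, key in events:
        # Each event falls into exactly one non-overlapping window.
        window_start = (ts // window_secs) * window_secs
        windows[window_start][key] += 1
    return dict(windows)

events = [(0, "click"), (3, "click"), (7, "view"), (11, "click")]
print(tumbling_window_counts(events, window_secs=10))
# The 0-9s window holds two clicks and a view; the 10-19s window one click.
```

A real streaming job computes the same thing incrementally as events arrive, rather than over a finished list, which is exactly the shift in thinking this part of the course covers.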
Hone your data handling skills
Finally, you’ll learn how to design real-world systems using the Hadoop ecosystem to ensure you can use your skills in practice.
By the end of the course, you’ll have the knowledge to handle large amounts of data using Hadoop.
This course is designed for anyone who wants to hone their data handling skills using Hadoop.
You’ll be shown how to use a variety of open source utilities within the Hadoop environment. We assume you’ve already installed the Hadoop environment. If you haven’t, check out Introduction to Big Data Analytics with Hadoop.
- Practice using different query engines in the Hadoop ecosystem.
- Demonstrate using different resource negotiators to manage a Hadoop cluster.
- Describe real-time data streaming and where it is used.
- Practice analysing streams of data.
- Design a system to meet real-world business requirements.
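On the query-engine side, tools such as Hive, Impala, and Spark SQL let you run familiar SQL over data stored in HDFS. As a local stand-in, the same query shape runs against SQLite below; the table name and rows are invented for illustration, but the GROUP BY aggregation is written exactly as it would be in HiveQL:

```python
# Query engines like Hive expose SQL over files in HDFS; SQLite serves
# here as a local stand-in. Table and rows are invented for illustration.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plays (user_id TEXT, track TEXT, ms_played INTEGER)")
conn.executemany(
    "INSERT INTO plays VALUES (?, ?, ?)",
    [("u1", "song-a", 30000), ("u1", "song-b", 5000), ("u2", "song-a", 45000)],
)

# The kind of aggregation you would write almost identically in HiveQL:
rows = conn.execute(
    "SELECT track, COUNT(*) AS n, SUM(ms_played) AS total_ms "
    "FROM plays GROUP BY track ORDER BY total_ms DESC"
).fetchall()
print(rows)  # song-a was played twice for 75000 ms in total
```

The point of the course's query-engine exercises is that this SQL carries over: the engine changes how and where the query executes, not how you write it.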