Hortonworks Certified Developer (HDPCD Exam) – Training Program

Big Data Hadoop is one of the most game-changing technologies ever introduced, and being able to write analytics jobs is an impressive skill. There was a time when openings outnumbered people who knew Hadoop, but the scenario has changed in the last couple of years. Freshers and working professionals are completing Hadoop training, but because they don't master it, they find it difficult to break into the Hadoop stream.

Getting certified is one of the ways to stand out, and you too can become a certified Hadoop Developer under the banner of Hortonworks with our 25-hour comprehensive Hortonworks Certified Developer (HDPCD Exam) – Training Program. You leave your competition behind the moment you clear the certification, as one of the official distributors of Hadoop certifies you as a Developer.

Who should take our Certification Training Program?

  • Any candidate with knowledge of Pig, Hive, Sqoop and Flume can take this training.

  • Any Big Data working professional (with knowledge of Pig, Hive, Sqoop and Flume): with appraisals around the corner, being certified gives you a boost. Even when switching jobs, it strengthens your resume in addition to your experience.

  • If you are new to the field of Big Data and want to get certified, you can first complete our Big Data Hadoop Developer Training and then opt for this certification program.



Batch: Weekdays, 2nd November 2019
Location: Thane

Why join Asterix for Hortonworks Certified Developer Training Program?


We have the best trainers, and we believe a developer's best friend is the keyboard — hence our 100% Practical Approach policy.


We keep our batch sizes small for individual attention and ample hands-on practice time. Smaller batches help both the trainer and the trainees work thoroughly on the concepts.


Our portfolio projects make your resume stand out. We build projects based on real-world problems, so every batch works on substantial projects, and the experience helps trainees apply the concepts they know.


Our placement unit makes sure that you are comfortable presenting yourself to the companies we are tied up with. We give 100% Job Assistance.

Topics Included

  • Data Ingestion – Sqoop and Flume

  • Data Transformation – Pig

  • Data Analysis – Hive

Detailed Syllabus

Data Ingestion

  • Import data from a table in a relational database into HDFS (SQOOP-IMPORT)

  • Import the results of a query from a relational database into HDFS (FREE-FORM QUERY IMPORTS)

  • Import a table from a relational database into a new or existing Hive table (IMPORTING DATA INTO HIVE)

  • Insert or update data from HDFS into a table in a relational database (SQOOP-EXPORT)

  • Given a Flume configuration file, start a Flume agent (FLUME AGENT)

  • Given a configured sink and source, configure a Flume memory channel with a specified capacity (MEMORY CHANNEL)
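The ingestion tasks above can be sketched with the following commands. This is a minimal sketch, not exam material: the database host, credentials, table names, and agent name (`a1`) are all hypothetical placeholders to be replaced with your own.

```shell
# Import a table from a relational database into HDFS (sqoop-import).
# Connection string and table name are hypothetical.
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username hdpcd --password-file /user/hdpcd/.pwd \
  --table orders \
  --target-dir /data/orders

# Free-form query import: import the results of a query.
# $CONDITIONS is required by Sqoop for splitting the query across mappers.
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username hdpcd --password-file /user/hdpcd/.pwd \
  --query 'SELECT id, amount FROM orders WHERE $CONDITIONS' \
  --split-by id \
  --target-dir /data/order_amounts

# Import a table directly into a Hive table.
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username hdpcd --password-file /user/hdpcd/.pwd \
  --table orders \
  --hive-import --hive-table orders

# Export data from HDFS back into a relational table (sqoop-export).
sqoop export \
  --connect jdbc:mysql://dbhost/sales \
  --username hdpcd --password-file /user/hdpcd/.pwd \
  --table order_totals \
  --export-dir /data/order_totals

# Start a Flume agent from a configuration file (agent name "a1" assumed).
flume-ng agent --name a1 --conf conf --conf-file /etc/flume/a1.conf

# Inside a1.conf, a memory channel with a specified capacity looks like:
#   a1.channels.c1.type = memory
#   a1.channels.c1.capacity = 10000
#   a1.channels.c1.transactionCapacity = 100
```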

Data Transformation

  • Write and execute a Pig script

  • Load data into a Pig relation without a schema

  • Load data into a Pig relation with a schema

  • Load data from a Hive table into a Pig relation

  • Use Pig to transform data into a specified format

  • Transform data to match a given Hive schema

  • Group the data of one or more Pig relations

  • Use Pig to remove records with null values from a relation

  • Store the data from a Pig relation into a folder in HDFS

  • Store the data from a Pig relation into a Hive table

  • Sort the output of a Pig relation

  • Remove the duplicate tuples of a Pig relation

  • Specify the number of reduce tasks for a Pig MapReduce job

  • Join two datasets using Pig

  • Perform a replicated join using Pig

  • Run a Pig job using Tez

  • Within a Pig script, register a JAR file of User Defined Functions

  • Within a Pig script, define an alias for a User Defined Function

  • Within a Pig script, invoke a User Defined Function
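Several of the transformation tasks above can be combined into one short Pig script. This is a hedged sketch with hypothetical paths and schema, not exam material:

```pig
-- Load data into a relation with a schema (path and fields are hypothetical)
orders = LOAD '/data/orders' USING PigStorage(',')
         AS (id:int, customer:chararray, amount:double);

-- Remove records with null values from the relation
clean = FILTER orders BY amount IS NOT NULL;

-- Group the data and aggregate per group
by_cust = GROUP clean BY customer;
totals = FOREACH by_cust GENERATE group AS customer,
                                  SUM(clean.amount) AS total;

-- Remove duplicate tuples, then sort, specifying the number of reducers
uniq = DISTINCT totals;
sorted = ORDER uniq BY total DESC PARALLEL 4;

-- Store the relation into a folder in HDFS
STORE sorted INTO '/data/order_totals' USING PigStorage(',');
```

Running the script as `pig -x tez script.pig` executes it on Tez; loading from or storing to Hive tables is done with `HCatLoader()` and `HCatStorer()` via HCatalog.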

Data Analysis

  • Write and execute a Hive query

  • Define a Hive-managed table

  • Define a Hive external table

  • Define a partitioned Hive table

  • Define a bucketed Hive table

  • Define a Hive table from a select query

  • Define a Hive table that uses the ORC File format

  • Create a new ORC File table from the data in an existing non-ORC File Hive table

  • Specify the storage format of a Hive table

  • Specify the delimiter of a Hive table

  • Load data into a Hive table from a local directory

  • Load data into a Hive table from an HDFS directory

  • Load data into a Hive table as the result of a query

  • Load a compressed data file into a Hive table

  • Update a row in a Hive table

  • Delete a row from a Hive table

  • Insert a new row into a Hive table

  • Join two Hive tables

  • Run a Hive query using Tez

  • Run a Hive query using vectorization

  • Output the execution plan for a Hive query

  • Use a subquery within a Hive query
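A number of the analysis tasks above can be illustrated in a few HiveQL statements. Table names, columns, and paths below are hypothetical placeholders, not exam content:

```sql
-- Managed, partitioned table stored in the ORC File format
CREATE TABLE orders (id INT, amount DOUBLE)
PARTITIONED BY (order_date STRING)
STORED AS ORC;

-- External table over delimited files, specifying the delimiter
CREATE EXTERNAL TABLE orders_raw (id INT, amount DOUBLE, order_date STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/orders_raw';

-- Define a table from a select query: new ORC table from a non-ORC table
CREATE TABLE orders_orc STORED AS ORC AS
SELECT * FROM orders_raw;

-- Load data into a table from an HDFS directory
LOAD DATA INPATH '/data/incoming' INTO TABLE orders_raw;

-- Join two tables and use a subquery within the query
SELECT c.name, t.total
FROM customers c
JOIN (SELECT id, SUM(amount) AS total
      FROM orders_raw GROUP BY id) t
  ON c.id = t.id;

-- Run on Tez with vectorization, and output the execution plan
SET hive.execution.engine=tez;
SET hive.vectorized.execution.enabled=true;
EXPLAIN SELECT COUNT(*) FROM orders_raw;
```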

Detailed FAQs

Batches run throughout the day, and you can attend any batch at your convenience, but you must notify your coordinator before changing batches.

Yes, you will get a training and project completion certificate from Asterix Solution.

Yes, apart from training, Asterix Solution also runs a development unit, so our students work on parts of live projects.

Yes, you can take a break for a valid reason and later rejoin in any other batch.

You can schedule a revision session with your trainer, or sit in on any other batch covering the same topic, free of cost.

Yes, we allow fees to be paid in 2-3 installments through cash, cheque, or internet banking.

Student Reviews

Companies Where Our Students Placed