Click the Job Designer icon (fourth from the left in Hue's toolbar), then the Create Mapreduce Design button, and up comes a screen that lets you run a jar-based MapReduce job. Give the job a name and a description, specify the jar file, add properties (parameters) and their values, then click Save.
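Under the hood, a jar-based job design corresponds to what you'd type at the command line. As a rough sketch (the jar path, class name, and directories below are placeholders, not anything the VM ships with):

```shell
# Hypothetical example: run a jar-based MapReduce job by hand.
# -D passes a job property, just like the properties in the design screen.
hadoop jar /path/to/wordcount.jar WordCount \
  -D mapred.reduce.tasks=2 \
  input_dir output_dir
```

The property/value pairs you enter in the Job Designer end up as job configuration settings, equivalent to the `-D` flags here.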
You'll be automatically redirected to the Job Designs screen, where you can click the Submit button for the job design you just built. You can also click the Create Streaming Design button to design a job whose mapper and reducer code is written in a language other than Java.
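A streaming design wraps Hadoop Streaming, which pipes records through external programs as the mapper and reducer. A command-line sketch of the same idea (the streaming jar path and script names are assumptions for illustration; the actual path varies by CDH version):

```shell
# Hypothetical example: a streaming job with Python mapper/reducer scripts.
hadoop jar /usr/lib/hadoop-mapreduce/hadoop-streaming.jar \
  -input input_dir \
  -output output_dir \
  -mapper mapper.py \
  -reducer reducer.py \
  -file mapper.py \
  -file reducer.py
```

In the streaming design screen, you fill in the mapper and reducer fields instead of typing this command yourself.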
Click the Hue Shell Toolbar button to get to browser-based command line shells for Pig and HBase. Below the toolbar, you'll find one clickable link for each. The Pig shell (called "Grunt") is shown here. At this prompt you can enter Pig Latin commands interactively or run .pig scripts.
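For a taste of what you can type at the Grunt prompt, here is a classic word-count session in Pig Latin (the input file `input.txt` is a placeholder; substitute a file in your HDFS home directory):

```pig
grunt> lines  = LOAD 'input.txt' AS (line:chararray);
grunt> words  = FOREACH lines GENERATE FLATTEN(TOKENIZE(line)) AS word;
grunt> grpd   = GROUP words BY word;
grunt> counts = FOREACH grpd GENERATE group, COUNT(words);
grunt> DUMP counts;
```

Each statement builds a relation from the previous one; nothing actually executes until `DUMP` (or `STORE`) forces the pipeline to run.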
Click the HBase Shell link beneath Hue's toolbar and you'll come to HBase's command-line interface. You can read more about the available commands in the HBase shell's own documentation (or by typing `help` at the prompt).
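A quick sample session, assuming a throwaway table name (`test_table` and the column family `cf` are illustrative, not anything pre-created on the VM):

```text
hbase(main):001:0> create 'test_table', 'cf'
hbase(main):002:0> put 'test_table', 'row1', 'cf:greeting', 'hello'
hbase(main):003:0> get 'test_table', 'row1'
hbase(main):004:0> scan 'test_table'
hbase(main):005:0> disable 'test_table'
hbase(main):006:0> drop 'test_table'
```

Note that HBase requires you to `disable` a table before you can `drop` it.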
That's pretty much all there is to working with the CDH 4 VM! Of course, if you prefer the command line for everything, then you can go back to an X terminal window and do all your work from there.
No matter which way you work, you can now start to learn Hadoop, Pig, Hive and more. And you can build your own clusters later on, if you're into that sort of thing :-)