
ip address  => 101.63.244.183
public DNS  => ec2-54-242-216-129.compute-1.amazonaws.com
instance id => i-695b0116

http://docs.amazonwebservices.com/AWSEC2/latest/UserGuide/EC2_GetStarted.html#EC2_LaunchInstance_Linux
https://console.aws.amazon.com/ec2/home?region=us-east-1
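Logging in uses the key pair selected when the instance was launched; the .pem file name below is a placeholder, and ec2-user is the default account on the Amazon Linux AMI:

ssh -i my-keypair.pem ec2-user@ec2-54-242-216-129.compute-1.amazonaws.com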

http://apache.techartifact.com/mirror/hadoop/common/stable/hadoop-1.0.4.tar.gz
http://snap.stanford.edu/class/cs246-2011/hw_files/hadoop_install.pdf
http://arifn.web.id/blog/2010/07/29/running-hadoop-single-cluster.html
http://arifn.web.id/blog/2010/01/23/hadoop-in-netbeans.html
http://wiki.apache.org/hadoop/GettingStartedWithHadoop
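A minimal sketch of fetching and unpacking the release on the instance, using the mirror URL above; `which java` shows where the system Java binary lives, which is needed later for JAVA_HOME:

wget http://apache.techartifact.com/mirror/hadoop/common/stable/hadoop-1.0.4.tar.gz
tar xvfz hadoop-1.0.4.tar.gz
which java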

tar xvfz hadoop-1.0.4.tar.gz which java=> to obtain the path. hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/JsonObjectMapperParser.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/JsonObjectMapperWriter.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/LogRecordType.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/LoggedDiscreteCDF.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/LoggedJob.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/LoggedLocation.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/LoggedNetworkTopology.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/LoggedSingleRelativeRanking .java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/LoggedTask.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/LoggedTaskAttempt.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/MachineNode.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/MapAttempt20LineHistoryEven tEmitter.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/MapAttemptFinishedEvent.jav a hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/MapTaskAttemptInfo.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/Node.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/Outputter.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/Pair.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/ParsedConfigFile.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/ParsedHost.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/ParsedLine.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/PossiblyDecompressedInputSt ream.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/Pre21JobHistoryConstants.ja va hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/RackNode.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/RandomSeedGenerator.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/ReduceAttempt20LineHistoryE

ventEmitter.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/ReduceAttemptFinishedEvent. java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/ReduceTaskAttemptInfo.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/RewindableInputStream.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/SingleEventEmitter.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/Task20LineHistoryEventEmitt er.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TaskAttempt20LineEventEmitt er.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TaskAttemptFinishedEvent.ja va hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TaskAttemptInfo.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TaskAttemptStartedEvent.jav a hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TaskAttemptUnsuccessfulComp letionEvent.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TaskFailedEvent.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TaskFinishedEvent.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TaskInfo.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TaskStartedEvent.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TaskUpdatedEvent.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TopologyBuilder.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TraceBuilder.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/TreePath.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/Version20LogInterfaceUtils. java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/ZombieCluster.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/ZombieJob.java hadoop-1.0.4/src/tools/org/apache/hadoop/tools/rumen/ZombieJobProducer.java hadoop-1.0.4/src/webapps/datanode/browseBlock.jsp hadoop-1.0.4/src/webapps/datanode/browseDirectory.jsp hadoop-1.0.4/src/webapps/datanode/tail.jsp hadoop-1.0.4/src/webapps/hdfs/dfshealth.jsp hadoop-1.0.4/src/webapps/hdfs/dfsnodelist.jsp hadoop-1.0.4/src/webapps/hdfs/index.html hadoop-1.0.4/src/webapps/hdfs/nn_browsedfscontent.jsp hadoop-1.0.4/src/webapps/history/analysejobhistory.jsp hadoop-1.0.4/src/webapps/history/job_authorization_error.jsp hadoop-1.0.4/src/webapps/history/jobconf_history.jsp hadoop-1.0.4/src/webapps/history/jobdetailshistory.jsp hadoop-1.0.4/src/webapps/history/jobhistoryhome.jsp hadoop-1.0.4/src/webapps/history/jobtaskshistory.jsp hadoop-1.0.4/src/webapps/history/legacyjobhistory.jsp hadoop-1.0.4/src/webapps/history/loadhistory.jsp hadoop-1.0.4/src/webapps/history/taskdetailshistory.jsp hadoop-1.0.4/src/webapps/history/taskstatshistory.jsp hadoop-1.0.4/src/webapps/job/gethistory.jsp hadoop-1.0.4/src/webapps/job/index.html hadoop-1.0.4/src/webapps/job/job_authorization_error.jsp hadoop-1.0.4/src/webapps/job/jobblacklistedtrackers.jsp hadoop-1.0.4/src/webapps/job/jobconf.jsp hadoop-1.0.4/src/webapps/job/jobdetails.jsp hadoop-1.0.4/src/webapps/job/jobfailures.jsp hadoop-1.0.4/src/webapps/job/jobhistory.jsp hadoop-1.0.4/src/webapps/job/jobqueue_details.jsp hadoop-1.0.4/src/webapps/job/jobtasks.jsp hadoop-1.0.4/src/webapps/job/jobtracker.jsp hadoop-1.0.4/src/webapps/job/machines.jsp hadoop-1.0.4/src/webapps/job/taskdetails.jsp

hadoop-1.0.4/src/webapps/job/taskstats.jsp hadoop-1.0.4/src/webapps/static/hadoop-logo.jpg hadoop-1.0.4/src/webapps/static/hadoop.css hadoop-1.0.4/src/webapps/static/jobconf.xsl hadoop-1.0.4/src/webapps/static/jobtracker.js hadoop-1.0.4/src/webapps/static/sorttable.js hadoop-1.0.4/src/webapps/task/index.html hadoop-1.0.4/src/webapps/task/tasktracker.jsp hadoop-1.0.4/webapps/datanode/WEB-INF/web.xml hadoop-1.0.4/webapps/hdfs/WEB-INF/web.xml hadoop-1.0.4/webapps/hdfs/index.html hadoop-1.0.4/webapps/history/WEB-INF/web.xml hadoop-1.0.4/webapps/job/WEB-INF/web.xml hadoop-1.0.4/webapps/job/analysejobhistory.jsp hadoop-1.0.4/webapps/job/gethistory.jsp hadoop-1.0.4/webapps/job/index.html hadoop-1.0.4/webapps/job/job_authorization_error.jsp hadoop-1.0.4/webapps/job/jobblacklistedtrackers.jsp hadoop-1.0.4/webapps/job/jobconf.jsp hadoop-1.0.4/webapps/job/jobconf_history.jsp hadoop-1.0.4/webapps/job/jobdetails.jsp hadoop-1.0.4/webapps/job/jobdetailshistory.jsp hadoop-1.0.4/webapps/job/jobfailures.jsp hadoop-1.0.4/webapps/job/jobhistory.jsp hadoop-1.0.4/webapps/job/jobhistoryhome.jsp hadoop-1.0.4/webapps/job/jobqueue_details.jsp hadoop-1.0.4/webapps/job/jobtasks.jsp hadoop-1.0.4/webapps/job/jobtaskshistory.jsp hadoop-1.0.4/webapps/job/jobtracker.jsp hadoop-1.0.4/webapps/job/legacyjobhistory.jsp hadoop-1.0.4/webapps/job/loadhistory.jsp hadoop-1.0.4/webapps/job/machines.jsp hadoop-1.0.4/webapps/job/taskdetails.jsp hadoop-1.0.4/webapps/job/taskdetailshistory.jsp hadoop-1.0.4/webapps/job/taskstats.jsp hadoop-1.0.4/webapps/job/taskstatshistory.jsp hadoop-1.0.4/webapps/static/hadoop-logo.jpg hadoop-1.0.4/webapps/static/hadoop.css hadoop-1.0.4/webapps/static/jobconf.xsl hadoop-1.0.4/webapps/static/jobtracker.js hadoop-1.0.4/webapps/static/sorttable.js hadoop-1.0.4/webapps/task/WEB-INF/web.xml hadoop-1.0.4/webapps/task/index.html hadoop-1.0.4/src/contrib/ec2/bin/image/ hadoop-1.0.4/bin/hadoop hadoop-1.0.4/bin/hadoop-config.sh hadoop-1.0.4/bin/hadoop-daemon.sh hadoop-1.0.4/bin/hadoop-daemons.sh hadoop-1.0.4/bin/rcc hadoop-1.0.4/bin/slaves.sh hadoop-1.0.4/bin/start-all.sh hadoop-1.0.4/bin/start-balancer.sh hadoop-1.0.4/bin/start-dfs.sh hadoop-1.0.4/bin/start-jobhistoryserver.sh hadoop-1.0.4/bin/start-mapred.sh hadoop-1.0.4/bin/stop-all.sh hadoop-1.0.4/bin/stop-balancer.sh hadoop-1.0.4/bin/stop-dfs.sh hadoop-1.0.4/bin/stop-jobhistoryserver.sh hadoop-1.0.4/bin/stop-mapred.sh

hadoop-1.0.4/bin/task-controller hadoop-1.0.4/contrib/hdfsproxy/bin/hdfsproxy hadoop-1.0.4/contrib/hdfsproxy/bin/hdfsproxy-config.sh hadoop-1.0.4/contrib/hdfsproxy/bin/hdfsproxy-daemon.sh hadoop-1.0.4/contrib/hdfsproxy/bin/hdfsproxy-daemons.sh hadoop-1.0.4/contrib/hdfsproxy/bin/hdfsproxy-slaves.sh hadoop-1.0.4/contrib/hdfsproxy/bin/start-hdfsproxy.sh hadoop-1.0.4/contrib/hdfsproxy/bin/stop-hdfsproxy.sh hadoop-1.0.4/contrib/hod/bin/VERSION hadoop-1.0.4/contrib/hod/bin/checknodes hadoop-1.0.4/contrib/hod/bin/hod hadoop-1.0.4/contrib/hod/bin/hodcleanup hadoop-1.0.4/contrib/hod/bin/hodring hadoop-1.0.4/contrib/hod/bin/ringmaster hadoop-1.0.4/contrib/hod/bin/verify-account hadoop-1.0.4/contrib/vaidya/bin/vaidya.sh hadoop-1.0.4/libexec/hadoop-config.sh hadoop-1.0.4/libexec/jsvc.amd64 hadoop-1.0.4/sbin/hadoop-create-user.sh hadoop-1.0.4/sbin/hadoop-setup-applications.sh hadoop-1.0.4/sbin/hadoop-setup-conf.sh hadoop-1.0.4/sbin/hadoop-setup-hdfs.sh hadoop-1.0.4/sbin/hadoop-setup-single-node.sh hadoop-1.0.4/sbin/hadoop-validate-setup.sh hadoop-1.0.4/sbin/update-hadoop-env.sh hadoop-1.0.4/src/contrib/ec2/bin/cmd-hadoop-cluster hadoop-1.0.4/src/contrib/ec2/bin/create-hadoop-image hadoop-1.0.4/src/contrib/ec2/bin/delete-hadoop-cluster hadoop-1.0.4/src/contrib/ec2/bin/hadoop-ec2 hadoop-1.0.4/src/contrib/ec2/bin/hadoop-ec2-env.sh hadoop-1.0.4/src/contrib/ec2/bin/hadoop-ec2-init-remote.sh hadoop-1.0.4/src/contrib/ec2/bin/image/create-hadoop-image-remote hadoop-1.0.4/src/contrib/ec2/bin/image/ec2-run-user-data hadoop-1.0.4/src/contrib/ec2/bin/launch-hadoop-cluster hadoop-1.0.4/src/contrib/ec2/bin/launch-hadoop-master hadoop-1.0.4/src/contrib/ec2/bin/launch-hadoop-slaves hadoop-1.0.4/src/contrib/ec2/bin/list-hadoop-clusters hadoop-1.0.4/src/contrib/ec2/bin/terminate-hadoop-cluster [ec2-user@ip-10-212-79-90 ~]$ ls -lt total 61328 -rw-rw-r-- 1 ec2-user ec2-user 62793050 Oct 3 05:17 hadoop-1.0.4.tar.gz drwxr-xr-x 14 ec2-user ec2-user 4096 Oct 3 05:17 hadoop-1.0.4 [ec2-user@ip-10-212-79-90 ~]$ df -h Filesystem Size Used Avail Use% Mounted on /dev/xvda1 7.9G 1.2G 6.7G 15% / tmpfs 298M 0 298M 0% /dev/shm [ec2-user@ip-10-212-79-90 ~]$ which java /usr/bin/java [ec2-user@ip-10-212-79-90 ~]$ cd /usr/bin/java -bash: cd: /usr/bin/java: Not a directory [ec2-user@ip-10-212-79-90 ~]$ ls hadoop-1.0.4 hadoop-1.0.4.tar.gz [ec2-user@ip-10-212-79-90 ~]$ ls -a . .bash_logout .bashrc hadoop-1.0.4.tar.gz .. .bash_profile hadoop-1.0.4 .ssh [ec2-user@ip-10-212-79-90 ~]$ cd hadoop-1.0.4 [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ls bin hadoop-ant-1.0.4.jar ivy sbin build.xml hadoop-client-1.0.4.jar ivy.xml share c++ hadoop-core-1.0.4.jar lib src

CHANGES.txt hadoop-examples-1.0.4.jar libexec webapps conf hadoop-minicluster-1.0.4.jar LICENSE.txt contrib hadoop-test-1.0.4.jar NOTICE.txt docs hadoop-tools-1.0.4.jar README.txt
[ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ cd conf
[ec2-user@ip-10-212-79-90 conf]$ ls
capacity-scheduler.xml      hadoop-policy.xml      slaves
configuration.xsl           hdfs-site.xml          ssl-client.xml.example
core-site.xml               log4j.properties       ssl-server.xml.example
fair-scheduler.xml          mapred-queue-acls.xml  taskcontroller.cfg
hadoop-env.sh               mapred-site.xml
hadoop-metrics2.properties  masters
[ec2-user@ip-10-212-79-90 conf]$ vi hadoop-env.sh
# remote nodes.
# The java implementation to use. Required.
# export JAVA_HOME=/usr/lib/j2sdk1.5-sun
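The commented-out JAVA_HOME line above is the one to change. A minimal sketch of the edit, assuming the Oracle JRE downloaded further below ends up under /home/ec2-user/user/java/jre1.7.0_07 (any directory whose bin/java is the Java binary to use works); in conf/hadoop-env.sh:

# point at the Java install directory, not at the java binary itself
export JAVA_HOME=/home/ec2-user/user/java/jre1.7.0_07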



[ec2-user@ip-10-212-79-90 conf]$ cd ..
[ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ cd ..
[ec2-user@ip-10-212-79-90 ~]$ cd ..
[ec2-user@ip-10-212-79-90 home]$ ls
ec2-user
[ec2-user@ip-10-212-79-90 home]$ ls -a
.  ..  ec2-user
[ec2-user@ip-10-212-79-90 home]$ cd ec2-user
[ec2-user@ip-10-212-79-90 ~]$ ls -lt
total 61328
-rw-rw-r--  1 ec2-user ec2-user 62793050 Oct  3 05:17 hadoop-1.0.4.tar.gz
drwxr-xr-x 14 ec2-user ec2-user     4096 Oct  3 05:17 hadoop-1.0.4
[ec2-user@ip-10-212-79-90 ~]$ ls -lta
total 61356
drwx------  4 ec2-user ec2-user     4096 Nov 27 23:46 .
-rw-------  1 ec2-user ec2-user      691 Nov 27 23:46 .viminfo
drwx------  2 ec2-user ec2-user     4096 Nov 27 22:25 .ssh
drwxr-xr-x  3 root     root         4096 Oct  8 17:08 ..
-rw-rw-r--  1 ec2-user ec2-user 62793050 Oct  3 05:17 hadoop-1.0.4.tar.gz
drwxr-xr-x 14 ec2-user ec2-user     4096 Oct  3 05:17 hadoop-1.0.4
-rw-r--r--  1 ec2-user ec2-user       18 May 22  2012 .bash_logout
-rw-r--r--  1 ec2-user ec2-user      176 May 22  2012 .bash_profile
-rw-r--r--  1 ec2-user ec2-user      124 May 22  2012 .bashrc
[ec2-user@ip-10-212-79-90 ~]$ hostname -f
ip-10-212-79-90.ec2.internal
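hostname -f returns the internal EC2 name; if the public hostname is needed from inside the instance (for the web UIs later on), it can be read from the instance metadata service, e.g.:

curl http://169.254.169.254/latest/meta-data/public-hostname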

[ec2-user@ip-10-212-79-90 ~]$ wget http://download.oracle.com/otn-pub/java/jdk/7u7-b10/jre-7u7-linux-i586.tar. gz wget --no-cookies --header "Cookie: gpw_e24=http%3A%2F%2Fwww.oracle.com%2Ftechne twork%2Fjava%2Fjavase%2Fdownloads%2Fjdk-7u7-download-1501626.html;" http://downl oad.oracle.com/otn-pub/java/jdk/7u7-b10/jre-7u7-linux-i586.tar.gz jre-7u7-linux-i586.tar.gz?AuthParam=1354065481_58675abe509dcaf7d855b3e08601451c tar zxvf jre-7u7-linux-i586.tar.gz?AuthParam=1354065481_58675abe509dcaf7d855b3e0 8601451c ./user/java ./user/java/jre1.7.0_07/bin/java /home/ec2-user/user/java /home/ec2-user/user/java/jre1.7.0_07/bin/java /home/ec2-user/data/hdfstmp [ec2-user@ip-10-212-79-90 ~]$ find -name java ./user/java ./user/java/jre1.7.0_07/bin/java ./hadoop-1.0.4/src/benchmarks/gridmix2/src/java ./hadoop-1.0.4/src/contrib/streaming/src/java ./hadoop-1.0.4/src/contrib/vaidya/src/java ./hadoop-1.0.4/src/contrib/failmon/src/java ./hadoop-1.0.4/src/contrib/gridmix/src/java ./hadoop-1.0.4/src/contrib/eclipse-plugin/src/java ./hadoop-1.0.4/src/contrib/hdfsproxy/src/java ./hadoop-1.0.4/src/contrib/index/src/java ./hadoop-1.0.4/src/contrib/thriftfs/src/java ./hadoop-1.0.4/src/contrib/fairscheduler/src/java ./hadoop-1.0.4/src/contrib/capacity-scheduler/src/java ./hadoop-1.0.4/src/contrib/data_join/src/java ./hadoop-1.0.4/src/test/system/java [ec2-user@ip-10-212-79-90 ~]$ ^C [ec2-user@ip-10-212-79-90 ~]$ ^C [ec2-user@ip-10-212-79-90 ~]$ ls hadoop-1.0.4 hadoop-1.0.4.tar.gz jre-7u7-linux-i586.tar.gz?AuthParam=1354065481_58675abe509dcaf7d855b3e08601451c user [ec2-user@ip-10-212-79-90 ~]$ cd hadoop-1.0.4 [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ls bin hadoop-ant-1.0.4.jar ivy sbin build.xml hadoop-client-1.0.4.jar ivy.xml share c++ hadoop-core-1.0.4.jar lib src CHANGES.txt hadoop-examples-1.0.4.jar libexec webapps conf hadoop-minicluster-1.0.4.jar LICENSE.txt contrib hadoop-test-1.0.4.jar NOTICE.txt docs hadoop-tools-1.0.4.jar README.txt [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ cd cong -bash: cd: cong: No such file or directory [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ cd conf [ec2-user@ip-10-212-79-90 conf]$ ls capacity-scheduler.xml hadoop-policy.xml slaves configuration.xsl hdfs-site.xml ssl-client.xml.example

core-site.xml log4j.properties ssl-server.xml.example fair-scheduler.xml mapred-queue-acls.xml taskcontroller.cfg hadoop-env.sh mapred-site.xml hadoop-metrics2.properties masters [ec2-user@ip-10-212-79-90 conf]$ vi core-site.xml [ec2-user@ip-10-212-79-90 conf]$ vi hadoop-env.sh [ec2-user@ip-10-212-79-90 conf]$ sudo find /-name java find: `/-name': No such file or directory find: `java': No such file or directory [ec2-user@ip-10-212-79-90 conf]$ sudo find -name javva [ec2-user@ip-10-212-79-90 conf]$ sudo find -name java [ec2-user@ip-10-212-79-90 conf]$ vi hadoop-env.sh [ec2-user@ip-10-212-79-90 conf]$ vi hadoop-env.sh [ec2-user@ip-10-212-79-90 conf]$ vi hadoop-env.sh [ec2-user@ip-10-212-79-90 conf]$ cd /home [ec2-user@ip-10-212-79-90 home]$ ls ec2-user [ec2-user@ip-10-212-79-90 home]$ cd ec2-user [ec2-user@ip-10-212-79-90 ~]$ ls hadoop-1.0.4 hadoop-1.0.4.tar.gz jre-7u7-linux-i586.tar.gz?AuthParam=1354065481_58675abe509dcaf7d855b3e08601451c user [ec2-user@ip-10-212-79-90 ~]$ ssh-keygen -t rsa -P "" Generating public/private rsa key pair. Enter file in which to save the key (/home/ec2-user/.ssh/id_rsa): /home/ec2-user /.ssh/id_rsa Your identification has been saved in /home/ec2-user/.ssh/id_rsa. Your public key has been saved in /home/ec2-user/.ssh/id_rsa.pub. The key fingerprint is: 49:ae:7c:a5:9f:41:09:e5:1a:aa:3f:ce:b1:0b:7f:41 ec2-user@ip-10-212-79-90 The key's randomart image is: +--[ RSA 2048]----+ | . | | o | | + . | | +E= . | | ..S + | | o ..+ | | o + o.. | | =.+.. o | | .B+ o | +-----------------+ [ec2-user@ip-10-212-79-90 ~]$ cat .ssh/id_rsa.pub ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAwHXIYh/D2QmFyIzy9aUoIA41vXFq9AN5sE9V4R/ihtYP f2aKErgJF5rk+9XzqRupxoU+M3eLsdcPzQxSPD6qBhELRGOTilxSCyATv57x5qg7RXltnzEqq6n8NeVi 78r2AyEqjg94RN5xSDIEzsp8bBEua6SxoicfnHVfbW2Fwqdg6g3ZmLTQJDzd46UX6rQD9AR/fsc8+11j Nzc4fdX855nX6qWB2ttlZvrSCgqO9p6bHTqAeJFSVazaaGAXJZ9TVOLzqUT2mNMszDAivETtUwJ9O7ZX v1Dpy9GgYG63tcglW00TEF+JhpEtk8Qxve5z5luq2QUJf+jChW3M3fbOaQ== ec2-user@ip-10-21279-90 [ec2-user@ip-10-212-79-90 ~]$ ls hadoop-1.0.4 hadoop-1.0.4.tar.gz jre-7u7-linux-i586.tar.gz?AuthParam=1354065481_58675abe509dcaf7d855b3e08601451c user [ec2-user@ip-10-212-79-90 ~]$ cd hadoop-1.0.4 [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ls bin hadoop-ant-1.0.4.jar ivy sbin build.xml hadoop-client-1.0.4.jar ivy.xml share c++ hadoop-core-1.0.4.jar lib src

CHANGES.txt hadoop-examples-1.0.4.jar libexec webapps conf hadoop-minicluster-1.0.4.jar LICENSE.txt contrib hadoop-test-1.0.4.jar NOTICE.txt docs hadoop-tools-1.0.4.jar README.txt [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ cd conf [ec2-user@ip-10-212-79-90 conf]$ ls capacity-scheduler.xml hadoop-policy.xml slaves configuration.xsl hdfs-site.xml ssl-client.xml.example core-site.xml log4j.properties ssl-server.xml.example fair-scheduler.xml mapred-queue-acls.xml taskcontroller.cfg hadoop-env.sh mapred-site.xml hadoop-metrics2.properties masters [ec2-user@ip-10-212-79-90 conf]$ vi core-site.xml [ec2-user@ip-10-212-79-90 conf]$ vi core-site.xml [ec2-user@ip-10-212-79-90 conf]$ vi mapred-site.xml [ec2-user@ip-10-212-79-90 conf]$ vi mapred-site.xml [ec2-user@ip-10-212-79-90 conf]$ vi hdfs-site.xml [ec2-user@ip-10-212-79-90 conf]$ ./bin/hadoop namenode -format -bash: ./bin/hadoop: No such file or directory [ec2-user@ip-10-212-79-90 conf]$ cd .. [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ../bin/hadoop namenode -format -bash: ../bin/hadoop: No such file or directory [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ls bin hadoop-ant-1.0.4.jar ivy sbin build.xml hadoop-client-1.0.4.jar ivy.xml share c++ hadoop-core-1.0.4.jar lib src CHANGES.txt hadoop-examples-1.0.4.jar libexec webapps conf hadoop-minicluster-1.0.4.jar LICENSE.txt contrib hadoop-test-1.0.4.jar NOTICE.txt docs hadoop-tools-1.0.4.jar README.txt [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ bin -bash: bin: command not found [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ cd bin [ec2-user@ip-10-212-79-90 bin]$ ./bin/hadoop namenode -format -bash: ./bin/hadoop: No such file or directory [ec2-user@ip-10-212-79-90 bin]$ cd bin -bash: cd: bin: No such file or directory [ec2-user@ip-10-212-79-90 bin]$ ls hadoop start-all.sh stop-balancer.sh hadoop-config.sh start-balancer.sh stop-dfs.sh hadoop-daemon.sh start-dfs.sh stop-jobhistoryserver.sh hadoop-daemons.sh start-jobhistoryserver.sh stop-mapred.sh rcc start-mapred.sh task-controller slaves.sh stop-all.sh [ec2-user@ip-10-212-79-90 bin]$ cd .. [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ./bin/hadoop namenode -format 12/11/28 02:04:17 INFO namenode.NameNode: STARTUP_MSG: /************************************************************ STARTUP_MSG: Starting NameNode STARTUP_MSG: host = ip-10-212-79-90/10.212.79.90 STARTUP_MSG: args = [-format] STARTUP_MSG: version = 1.0.4 STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/b ranch-1.0 -r 1393290; compiled by 'hortonfo' on Wed Oct 3 05:13:58 UTC 2012 ************************************************************/ 12/11/28 02:04:18 INFO util.GSet: VM type = 32-bit 12/11/28 02:04:18 INFO util.GSet: 2% max memory = 19.33375 MB 12/11/28 02:04:18 INFO util.GSet: capacity = 2^22 = 4194304 entries 12/11/28 02:04:18 INFO util.GSet: recommended=4194304, actual=4194304 12/11/28 02:04:18 INFO namenode.FSNamesystem: fsOwner=ec2-user

12/11/28 02:04:18 INFO namenode.FSNamesystem: supergroup=supergroup 12/11/28 02:04:18 INFO namenode.FSNamesystem: isPermissionEnabled=true 12/11/28 02:04:18 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100 12/11/28 02:04:18 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessK eyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s) 12/11/28 02:04:18 INFO namenode.NameNode: Caching file names occuring more than 10 times 12/11/28 02:04:18 INFO common.Storage: Image file of size 114 saved in 0 seconds . 12/11/28 02:04:18 INFO common.Storage: Storage directory /home/ec2-user/data/hdf stmp/dfs/name has been successfully formatted. 12/11/28 02:04:18 INFO namenode.NameNode: SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down NameNode at ip-10-212-79-90/10.212.79.90 ************************************************************/ [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ .bin/start-all.sh -bash: .bin/start-all.sh: No such file or directory [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ls bin hadoop-ant-1.0.4.jar ivy sbin build.xml hadoop-client-1.0.4.jar ivy.xml share c++ hadoop-core-1.0.4.jar lib src CHANGES.txt hadoop-examples-1.0.4.jar libexec webapps conf hadoop-minicluster-1.0.4.jar LICENSE.txt contrib hadoop-test-1.0.4.jar NOTICE.txt docs hadoop-tools-1.0.4.jar README.txt [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ./bin/start-all.sh starting namenode, logging to /home/ec2-user/hadoop-1.0.4/libexec/../logs/hadoop -ec2-user-namenode-ip-10-212-79-90.out The authenticity of host 'localhost (127.0.0.1)' can't be established. RSA key fingerprint is af:88:d5:51:d2:33:e3:81:af:54:99:ce:be:0b:2a:12. Are you sure you want to continue connecting (yes/no)? yes localhost: Warning: Permanently added 'localhost' (RSA) to the list of known hos ts. localhost: Permission denied (publickey). localhost: Permission denied (publickey). starting jobtracker, logging to /home/ec2-user/hadoop-1.0.4/libexec/../logs/hado op-ec2-user-jobtracker-ip-10-212-79-90.out localhost: Permission denied (publickey). [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ sudo ./bin/start-all.sh starting namenode, logging to /home/ec2-user/hadoop-1.0.4/libexec/../logs/hadoop -root-namenode-ip-10-212-79-90.out The authenticity of host 'localhost (127.0.0.1)' can't be established. RSA key fingerprint is af:88:d5:51:d2:33:e3:81:af:54:99:ce:be:0b:2a:12. Are you sure you want to continue connecting (yes/no)? yes localhost: Warning: Permanently added 'localhost' (RSA) to the list of known hos ts. localhost: Permission denied (publickey). localhost: Permission denied (publickey). starting jobtracker, logging to /home/ec2-user/hadoop-1.0.4/libexec/../logs/hado op-root-jobtracker-ip-10-212-79-90.out localhost: Permission denied (publickey). [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ./bin/start-all.sh namenode running as process 1553. Stop it first. localhost: Permission denied (publickey). localhost: Permission denied (publickey). jobtracker running as process 1721. Stop it first. localhost: Permission denied (publickey). [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ jps -bash: jps: command not found [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ find -name jps
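For reference, a minimal pseudo-distributed configuration consistent with the vi sessions above and with the storage directory that shows up in the format log (/home/ec2-user/data/hdfstmp). The original edits are not reproduced in the log, so the host/port values below are assumed stock single-node settings, not necessarily the exact ones used:

conf/core-site.xml:
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/ec2-user/data/hdfstmp</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

conf/hdfs-site.xml:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

conf/mapred-site.xml:
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>

The repeated "localhost: Permission denied (publickey)" lines appear because the start scripts ssh to localhost to launch the DataNode, SecondaryNameNode and TaskTracker, and the key generated earlier is not yet authorized for the same account. A sketch of the fix (applied further below):

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys   # precaution; sshd rejects overly permissive key files
ssh localhost                      # should now connect without a password prompt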

[ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ cd .. [ec2-user@ip-10-212-79-90 ~]$ find -name jps [ec2-user@ip-10-212-79-90 ~]$ kill 1721 localhost: Permission denied (publickey). [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ jps -bash: jps: command not found [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ find -name jps [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ cd .. [ec2-user@ip-10-212-79-90 ~]$ find -name jps [ec2-user@ip-10-212-79-90 ~]$ kill 1721 [ec2-user@ip-10-212-79-90 ~]$ ./bin/start-all.sh -bash: ./bin/start-all.sh: No such file or directory [ec2-user@ip-10-212-79-90 ~]$ ls data hadoop-1.0.4 hadoop-1.0.4.tar.gz jre-7u7-linux-i586.tar.gz?AuthParam=1354065481_58675abe509dcaf7d855b3e08601451c user [ec2-user@ip-10-212-79-90 ~]$ cd data [ec2-user@ip-10-212-79-90 data]$ ls hdfstmp [ec2-user@ip-10-212-79-90 data]$ cd hdfstmp [ec2-user@ip-10-212-79-90 hdfstmp]$ ls dfs [ec2-user@ip-10-212-79-90 hdfstmp]$ cd dfs [ec2-user@ip-10-212-79-90 dfs]$ ls name [ec2-user@ip-10-212-79-90 dfs]$ cd name [ec2-user@ip-10-212-79-90 name]$ ls current image previous.checkpoint [ec2-user@ip-10-212-79-90 name]$ cd current [ec2-user@ip-10-212-79-90 current]$ ls edits fsimage fstime VERSION [ec2-user@ip-10-212-79-90 current]$ cd .. [ec2-user@ip-10-212-79-90 name]$ cd .. [ec2-user@ip-10-212-79-90 dfs]$ cd .. [ec2-user@ip-10-212-79-90 hdfstmp]$ cd .. [ec2-user@ip-10-212-79-90 data]$ cd .. [ec2-user@ip-10-212-79-90 ~]$ ls data hadoop-1.0.4 hadoop-1.0.4.tar.gz jre-7u7-linux-i586.tar.gz?AuthParam=1354065481_58675abe509dcaf7d855b3e08601451c user [ec2-user@ip-10-212-79-90 ~]$ cd hadoop-1.0.4 [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ls bin hadoop-ant-1.0.4.jar ivy README.txt build.xml hadoop-client-1.0.4.jar ivy.xml sbin c++ hadoop-core-1.0.4.jar lib share CHANGES.txt hadoop-examples-1.0.4.jar libexec src conf hadoop-minicluster-1.0.4.jar LICENSE.txt webapps contrib hadoop-test-1.0.4.jar logs docs hadoop-tools-1.0.4.jar NOTICE.txt [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ cd conf [ec2-user@ip-10-212-79-90 conf]$ ls capacity-scheduler.xml hadoop-policy.xml slaves configuration.xsl hdfs-site.xml ssl-client.xml.example core-site.xml log4j.properties ssl-server.xml.example fair-scheduler.xml mapred-queue-acls.xml taskcontroller.cfg

hadoop-env.sh mapred-site.xml hadoop-metrics2.properties masters [ec2-user@ip-10-212-79-90 conf]$ vi core-site.xml [ec2-user@ip-10-212-79-90 conf]$ vi hadoop-env.sh [ec2-user@ip-10-212-79-90 conf]$ vi hdfs-site.xml [ec2-user@ip-10-212-79-90 conf]$ vi mapred-site.xml [ec2-user@ip-10-212-79-90 conf]$ cd .. [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ls bin hadoop-ant-1.0.4.jar ivy README.txt build.xml hadoop-client-1.0.4.jar ivy.xml sbin c++ hadoop-core-1.0.4.jar lib share CHANGES.txt hadoop-examples-1.0.4.jar libexec src conf hadoop-minicluster-1.0.4.jar LICENSE.txt webapps contrib hadoop-test-1.0.4.jar logs docs hadoop-tools-1.0.4.jar NOTICE.txt [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ./bin/start-all.sh namenode running as process 1553. Stop it first. localhost: Permission denied (publickey). localhost: Permission denied (publickey). starting jobtracker, logging to /home/ec2-user/hadoop-1.0.4/libexec/../logs/hado op-ec2-user-jobtracker-ip-10-212-79-90.out localhost: Permission denied (publickey). [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ kill 1553 [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ./bin/start-all.sh starting namenode, logging to /home/ec2-user/hadoop-1.0.4/libexec/../logs/hadoop -ec2-user-namenode-ip-10-212-79-90.out localhost: Permission denied (publickey). localhost: Permission denied (publickey). jobtracker running as process 2504. Stop it first. localhost: Permission denied (publickey). [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ kill 2504 [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ./bin/start-all.sh namenode running as process 2647. Stop it first. localhost: Permission denied (publickey). localhost: Permission denied (publickey). starting jobtracker, logging to /home/ec2-user/hadoop-1.0.4/libexec/../logs/hado op-ec2-user-jobtracker-ip-10-212-79-90.out localhost: Permission denied (publickey). [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authori zed_keys [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ./bin/start-all.sh namenode running as process 2647. Stop it first. localhost: starting datanode, logging to /home/ec2-user/hadoop-1.0.4/libexec/../ logs/hadoop-ec2-user-datanode-ip-10-212-79-90.out localhost: starting secondarynamenode, logging to /home/ec2-user/hadoop-1.0.4/li bexec/../logs/hadoop-ec2-user-secondarynamenode-ip-10-212-79-90.out jobtracker running as process 3032. Stop it first. localhost: starting tasktracker, logging to /home/ec2-user/hadoop-1.0.4/libexec/ ../logs/hadoop-ec2-user-tasktracker-ip-10-212-79-90.out [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ kill 2647 [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ kill 3032 [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ./bin/start-all.sh starting namenode, logging to /home/ec2-user/hadoop-1.0.4/libexec/../logs/hadoop -ec2-user-namenode-ip-10-212-79-90.out localhost: datanode running as process 3250. Stop it first. localhost: secondarynamenode running as process 3359. Stop it first. jobtracker running as process 3032. Stop it first. localhost: tasktracker running as process 3506. Stop it first. [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ cd bin [ec2-user@ip-10-212-79-90 bin]$ ls

hadoop start-all.sh stop-balancer.sh hadoop-config.sh start-balancer.sh stop-dfs.sh hadoop-daemon.sh start-dfs.sh stop-jobhistoryserver.sh hadoop-daemons.sh start-jobhistoryserver.sh stop-mapred.sh rcc start-mapred.sh task-controller slaves.sh stop-all.sh [ec2-user@ip-10-212-79-90 bin]$ sh stop-all.sh no jobtracker to stop localhost: no tasktracker to stop stopping namenode localhost: stopping datanode localhost: stopping secondarynamenode [ec2-user@ip-10-212-79-90 bin]$ sh start-all.sh starting namenode, logging to /home/ec2-user/hadoop-1.0.4/libexec/../logs/hadoop -ec2-user-namenode-ip-10-212-79-90.out localhost: starting datanode, logging to /home/ec2-user/hadoop-1.0.4/libexec/../ logs/hadoop-ec2-user-datanode-ip-10-212-79-90.out localhost: starting secondarynamenode, logging to /home/ec2-user/hadoop-1.0.4/li bexec/../logs/hadoop-ec2-user-secondarynamenode-ip-10-212-79-90.out starting jobtracker, logging to /home/ec2-user/hadoop-1.0.4/libexec/../logs/hado op-ec2-user-jobtracker-ip-10-212-79-90.out localhost: starting tasktracker, logging to /home/ec2-user/hadoop-1.0.4/libexec/ ../logs/hadoop-ec2-user-tasktracker-ip-10-212-79-90.out [ec2-user@ip-10-212-79-90 bin]$ cd .. [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ls bin hadoop-ant-1.0.4.jar ivy README.txt build.xml hadoop-client-1.0.4.jar ivy.xml sbin c++ hadoop-core-1.0.4.jar lib share CHANGES.txt hadoop-examples-1.0.4.jar libexec src conf hadoop-minicluster-1.0.4.jar LICENSE.txt webapps contrib hadoop-test-1.0.4.jar logs docs hadoop-tools-1.0.4.jar NOTICE.txt [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ./bin/hadoop hadoop-examples-1.0.4.jar pi 10 10 Error: Could not find or load main class hadoop-examples-1.0.4.jar [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ ./bin/hadoop jar hadoop-examples-1.0.4. jar pi 10 10 Number of Maps = 10 Samples per Map = 10 Wrote input for Map #0 Wrote input for Map #1 Wrote input for Map #2 Wrote input for Map #3 Wrote input for Map #4 Wrote input for Map #5 Wrote input for Map #6 Wrote input for Map #7 Wrote input for Map #8 Wrote input for Map #9 Starting Job 12/11/28 02:22:17 INFO mapred.FileInputFormat: Total input paths to process : 10 12/11/28 02:22:18 INFO mapred.JobClient: Running job: job_201211280220_0001 12/11/28 02:22:19 INFO mapred.JobClient: map 0% reduce 0% 12/11/28 02:22:34 INFO mapred.JobClient: map 20% reduce 0% 12/11/28 02:22:43 INFO mapred.JobClient: map 40% reduce 0% 12/11/28 02:22:52 INFO mapred.JobClient: map 40% reduce 13% 12/11/28 02:22:55 INFO mapred.JobClient: map 60% reduce 13% 12/11/28 02:23:01 INFO mapred.JobClient: map 60% reduce 16% 12/11/28 02:23:05 INFO mapred.JobClient: map 60% reduce 20% 12/11/28 02:23:35 INFO mapred.JobClient: map 80% reduce 20%

12/11/28 02:23:41 INFO mapred.JobClient: map 100% reduce 20% 12/11/28 02:23:47 INFO mapred.JobClient: map 100% reduce 26% 12/11/28 02:23:53 INFO mapred.JobClient: map 100% reduce 100% 12/11/28 02:23:58 INFO mapred.JobClient: Job complete: job_201211280220_0001 12/11/28 02:23:58 INFO mapred.JobClient: Counters: 30 12/11/28 02:23:58 INFO mapred.JobClient: Job Counters 12/11/28 02:23:58 INFO mapred.JobClient: Launched reduce tasks=1 12/11/28 02:23:58 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=142465 12/11/28 02:23:58 INFO mapred.JobClient: Total time spent by all reduces wai ting after reserving slots (ms)=0 12/11/28 02:23:58 INFO mapred.JobClient: Total time spent by all maps waitin g after reserving slots (ms)=0 12/11/28 02:23:58 INFO mapred.JobClient: Launched map tasks=10 12/11/28 02:23:58 INFO mapred.JobClient: Data-local map tasks=10 12/11/28 02:23:58 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=78769 12/11/28 02:23:58 INFO mapred.JobClient: File Input Format Counters 12/11/28 02:23:58 INFO mapred.JobClient: Bytes Read=1180 12/11/28 02:23:58 INFO mapred.JobClient: File Output Format Counters 12/11/28 02:23:58 INFO mapred.JobClient: Bytes Written=97 12/11/28 02:23:58 INFO mapred.JobClient: FileSystemCounters 12/11/28 02:23:58 INFO mapred.JobClient: FILE_BYTES_READ=226 12/11/28 02:23:58 INFO mapred.JobClient: HDFS_BYTES_READ=2440 12/11/28 02:23:58 INFO mapred.JobClient: FILE_BYTES_WRITTEN=239780 12/11/28 02:23:58 INFO mapred.JobClient: HDFS_BYTES_WRITTEN=215 12/11/28 02:23:58 INFO mapred.JobClient: Map-Reduce Framework 12/11/28 02:23:58 INFO mapred.JobClient: Map output materialized bytes=280 12/11/28 02:23:58 INFO mapred.JobClient: Map input records=10 12/11/28 02:23:58 INFO mapred.JobClient: Reduce shuffle bytes=280 12/11/28 02:23:58 INFO mapred.JobClient: Spilled Records=40 12/11/28 02:23:58 INFO mapred.JobClient: Map output bytes=180 12/11/28 02:23:58 INFO mapred.JobClient: Total committed heap usage (bytes)= 1426944000 12/11/28 02:23:58 INFO mapred.JobClient: CPU time spent (ms)=16250 12/11/28 02:23:58 INFO mapred.JobClient: Map input bytes=240 12/11/28 02:23:58 INFO mapred.JobClient: SPLIT_RAW_BYTES=1260 12/11/28 02:23:58 INFO mapred.JobClient: Combine input records=0 12/11/28 02:23:58 INFO mapred.JobClient: Reduce input records=20 12/11/28 02:23:58 INFO mapred.JobClient: Reduce input groups=20 12/11/28 02:23:58 INFO mapred.JobClient: Combine output records=0 12/11/28 02:23:58 INFO mapred.JobClient: Physical memory (bytes) snapshot=14 03031552 12/11/28 02:23:58 INFO mapred.JobClient: Reduce output records=0 12/11/28 02:23:58 INFO mapred.JobClient: Virtual memory (bytes) snapshot=380 0526848 12/11/28 02:23:58 INFO mapred.JobClient: Map output records=20 Job Finished in 102.706 seconds Estimated value of Pi is 3.20000000000000000000 [ec2-user@ip-10-212-79-90 hadoop-1.0.4]$ http://download.oracle.com/otn-pub/java/jdk/7u9-b05/jdk-7u9-linux-x64.tar.gz wget --no-cookies --header "Cookie: gpw_e24=http%3A%2F%2Fwww.oracle.com%2Ftechne twork%2Fjava%2Fjavase%2Fdownloads%2Fjdk-7u7-download-1501626.html;" http://downl oad.oracle.com/otn-pub/java/jdk/7u9-b05/jdk-7u9-linux-x64.tar.gz % tar zxvf jdk-7u9-linux-x64.tar.gz?AuthParam=1354363662_4ee3c67f6b850babd5266e1 e5f1d25b4 jdk1.7.0_09

./user/java/jdk1.7.0_09/bin/jps
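jps ships with the JDK, not the JRE, which is why the earlier jre-7u7 download did not provide it. With the jdk1.7.0_09 unpacked above, the running daemons can be verified like this (web UI ports assumed to be the Hadoop 1.x defaults, reachable only if the EC2 security group opens them):

export PATH=$PATH:/home/ec2-user/user/java/jdk1.7.0_09/bin
jps   # expect NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker

# NameNode (HDFS) web UI:     http://ec2-54-242-216-129.compute-1.amazonaws.com:50070/
# JobTracker (MapReduce) UI:  http://ec2-54-242-216-129.compute-1.amazonaws.com:50030/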
