
Installing and Starting Hadoop on Windows


First and foremost, make sure the JDK is installed and its environment variables are configured.
Then download the release archive for the matching version from the official Hadoop website
and extract it locally.
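The environment variables can be set once from a cmd prompt with setx; the paths below are placeholders for wherever the JDK and Hadoop were actually extracted on your machine:

```shell
REM Example only -- adjust both paths to your actual install locations.
REM Avoid spaces in JAVA_HOME (e.g. not "Program Files"); Hadoop's
REM .cmd scripts are known to mishandle paths containing spaces.
setx JAVA_HOME "C:\Java\jdk1.8.0_291"
setx HADOOP_HOME "D:\hadoop-2.7.4"
REM Append the bin/sbin directories so hadoop/hdfs/yarn commands resolve.
setx PATH "%PATH%;%JAVA_HOME%\bin;%HADOOP_HOME%\bin;%HADOOP_HOME%\sbin"
```

Note that setx writes to the user environment for future cmd sessions; the window you ran it in keeps its old values, so open a fresh cmd window before continuing.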
In the local Hadoop directory, open the following configuration files (under etc\hadoop):
core-site.xml

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>    
</configuration>

hdfs-site.xml

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>    
        <name>dfs.namenode.name.dir</name>    
        <value>file:/hadoop/data/dfs/namenode</value>    
    </property>    
    <property>    
        <name>dfs.datanode.data.dir</name>    
        <value>file:/hadoop/data/dfs/datanode</value>  
    </property>
</configuration>
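The directories referenced by dfs.namenode.name.dir and dfs.datanode.data.dir should exist before the NameNode is formatted; Hadoop will usually create them itself on format/startup, but creating them up front surfaces path and permission problems early. A sketch, assuming the same paths as the config above:

```shell
REM Create the storage directories referenced in hdfs-site.xml.
REM A file:/hadoop/... URI resolves against the drive root, so these
REM paths assume Hadoop is run from the same drive (here D:).
mkdir D:\hadoop\data\dfs\namenode
mkdir D:\hadoop\data\dfs\datanode
```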

mapred-site.xml

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>

yarn-site.xml

<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
    </property>
</configuration>

Next, download winutils for the matching Hadoop version:
https://github.com/yzydtc/winutils
After downloading, copy the files into the bin directory of your Hadoop installation, replacing the existing ones.
Then open cmd, change into Hadoop's bin directory, and run the following command:
hadoop namenode -format

DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
21/07/29 18:17:56 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = TCNP3541/*******
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 2.7.4
STARTUP_MSG:   classpath = D:\hadoop-2.7.4\etc\hadoop;D:\hadoop-2.7.4\share\hadoop\common\lib\activation-1.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\apacheds-i18n-2.0.0-M15.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\apacheds-kerberos-codec-2.0.0-M15.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\api-asn1-api-1.0.0-M20.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\api-util-1.0.0-M20.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\asm-3.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\avro-1.7.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-beanutils-1.7.0.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-beanutils-core-1.8.0.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-cli-1.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-codec-1.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-collections-3.2.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-compress-1.4.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-configuration-1.6.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-digester-1.8.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-httpclient-3.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-io-2.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-lang-2.6.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-logging-1.1.3.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-math3-3.1.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-net-3.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-client-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-framework-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-recipes-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\gson-2.2.4.jar;D:\hadoop-2.7.4\share\ha

Then change into the sbin directory under the Hadoop directory and run the following command:
start-all.cmd
Four windows will pop up, one each for the NameNode, DataNode, ResourceManager, and NodeManager.

namenode:

DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
21/07/29 18:22:22 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = TCNP3541/*******
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.7.4
STARTUP_MSG:   classpath = D:\hadoop-2.7.4\etc\hadoop;D:\hadoop-2.7.4\share\hadoop\common\lib\activation-1.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\apacheds-i18n-2.0.0-M15.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\apacheds-kerberos-codec-2.0.0-M15.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\api-asn1-api-1.0.0-M20.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\api-util-1.0.0-M20.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\asm-3.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\avro-1.7.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-beanutils-1.7.0.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-beanutils-core-1.8.0.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-cli-1.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-codec-1.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-collections-3.2.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-compress-1.4.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-configuration-1.6.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-digester-1.8.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-httpclient-3.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-io-2.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-lang-2.6.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-logging-1.1.3.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-math3-3.1.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-net-3.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-client-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-framework-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-recipes-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\gson-2.2.4.jar;D:\hadoop-2.7.4\share\hadoop\co

datanode:

DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
21/07/29 18:22:22 INFO datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = TCNP3541/********
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.7.4
STARTUP_MSG:   classpath = D:\hadoop-2.7.4\etc\hadoop;D:\hadoop-2.7.4\share\hadoop\common\lib\activation-1.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\apacheds-i18n-2.0.0-M15.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\apacheds-kerberos-codec-2.0.0-M15.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\api-asn1-api-1.0.0-M20.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\api-util-1.0.0-M20.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\asm-3.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\avro-1.7.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-beanutils-1.7.0.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-beanutils-core-1.8.0.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-cli-1.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-codec-1.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-collections-3.2.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-compress-1.4.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-configuration-1.6.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-digester-1.8.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-httpclient-3.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-io-2.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-lang-2.6.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-logging-1.1.3.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-math3-3.1.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-net-3.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-client-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-framework-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-recipes-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\gson-2.2.4.jar;D:\hadoop-2.7.4\share\hadoop\co

nodemanager:

21/07/29 18:22:22 INFO nodemanager.NodeManager: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NodeManager
STARTUP_MSG:   host = TCNP3541/**********
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.7.4
STARTUP_MSG:   classpath = D:\hadoop-2.7.4\etc\hadoop;D:\hadoop-2.7.4\etc\hadoop;D:\hadoop-2.7.4\etc\hadoop;D:\hadoop-2.7.4\share\hadoop\common\lib\activation-1.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\apacheds-i18n-2.0.0-M15.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\apacheds-kerberos-codec-2.0.0-M15.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\api-asn1-api-1.0.0-M20.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\api-util-1.0.0-M20.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\asm-3.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\avro-1.7.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-beanutils-1.7.0.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-beanutils-core-1.8.0.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-cli-1.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-codec-1.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-collections-3.2.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-compress-1.4.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-configuration-1.6.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-digester-1.8.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-httpclient-3.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-io-2.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-lang-2.6.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-logging-1.1.3.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-math3-3.1.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-net-3.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-client-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-framework-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-recipes-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\gson-2.2.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\guava-11.0.2.jar;D:\hadoop-2.7.4\sh

resourcemanager:

21/07/29 18:22:22 INFO resourcemanager.ResourceManager: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting ResourceManager
STARTUP_MSG:   host = TCNP3541/*******
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 2.7.4
STARTUP_MSG:   classpath = D:\hadoop-2.7.4\etc\hadoop;D:\hadoop-2.7.4\etc\hadoop;D:\hadoop-2.7.4\etc\hadoop;D:\hadoop-2.7.4\share\hadoop\common\lib\activation-1.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\apacheds-i18n-2.0.0-M15.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\apacheds-kerberos-codec-2.0.0-M15.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\api-asn1-api-1.0.0-M20.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\api-util-1.0.0-M20.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\asm-3.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\avro-1.7.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-beanutils-1.7.0.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-beanutils-core-1.8.0.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-cli-1.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-codec-1.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-collections-3.2.2.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-compress-1.4.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-configuration-1.6.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-digester-1.8.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-httpclient-3.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-io-2.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-lang-2.6.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-logging-1.1.3.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-math3-3.1.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\commons-net-3.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-client-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-framework-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\curator-recipes-2.7.1.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\gson-2.2.4.jar;D:\hadoop-2.7.4\share\hadoop\common\lib\guava-11.0.2.jar;D:\had

Once all four windows are up without errors, startup has succeeded.
Open http://localhost:8088/cluster
to see the YARN ResourceManager web UI (on Hadoop 2.x, the HDFS NameNode UI is typically at http://localhost:50070).
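Besides the web UIs, the running daemons can be listed with jps (shipped with the JDK); an illustrative check:

```shell
REM List running JVM processes; with the cluster up, the output should
REM include NameNode, DataNode, ResourceManager and NodeManager entries.
jps
```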

Gotchas:

  • start-all.cmd must be run from the sbin directory, not bin
  • download the winutils build that matches your Hadoop version from GitHub