Spark/Hadoop Cluster
Revision as of 06:28, 29 January 2024
To allow the spark user to SSH to itself and to the workers, passwordless SSH must be enabled. Log in as the spark user and generate a key:
ssh-keygen -t rsa -P ""
Once the keypair has been generated, the private key will be in /home/spark/.ssh/id_rsa and the public key in /home/spark/.ssh/id_rsa.pub (by default). Append the public key to the authorized_keys file (to allow spark to ssh to itself):
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
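The steps above can be condensed into one script. This is a sketch assuming the default id_rsa path and an empty passphrase; the mkdir and chmod lines are additions, since sshd silently ignores keys when ~/.ssh or authorized_keys is too permissive:

```shell
# Ensure the .ssh directory exists before writing the key into it
mkdir -p ~/.ssh

# Generate an RSA keypair with an empty passphrase, skipping if one exists
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa

# Append the public key so spark can ssh to itself
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

# sshd rejects keys with loose permissions, so tighten them
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
```

Note that re-running the cat line appends a duplicate entry; that is harmless but untidy, and ssh-copy-id (below) avoids it.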
Alternatively, use ssh-copy-id, both for the local machine and for each worker:
ssh-copy-id -i ~/.ssh/id_rsa.pub spark@localhost
ssh-copy-id -i ~/.ssh/id_rsa.pub spark@spark2.lab.bpopp.net
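With more than one or two workers, the per-host commands are easier as a loop. A sketch, where the WORKERS list is a placeholder to replace with your own hostnames (ssh-copy-id will prompt once for each worker's password):

```shell
# Hypothetical worker list; substitute the hostnames of your cluster
WORKERS="spark2.lab.bpopp.net spark3.lab.bpopp.net"

# Push the master's public key to every worker so the spark user
# can ssh in without a password from then on
push_key() {
    for host in $WORKERS; do
        ssh-copy-id -i ~/.ssh/id_rsa.pub "spark@$host"
    done
}
```

Run push_key once from the master; afterwards, ssh spark@<worker> should log in without prompting.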