Getting error while connecting Hadoop with MinIO

Hi,
I set up MinIO and Iceberg together first using git:

After that I tried to set up Hadoop and integrate it with MinIO using the following steps:
```
sudo apt update
sudo apt install openjdk-8-jdk
java -version
sudo apt install ssh

ssh-keygen -t rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 640 ~/.ssh/authorized_keys
ssh (external IP)

wget dlcdn.apache.org/hadoop/common/hadoop-3.4.0/hadoop-3.4.0.tar.gz
tar -xvzf hadoop-3.4.0.tar.gz
mv hadoop-3.4.0 hadoop

nano ~/.bashrc

cd hadoop/
mkdir -p ~/hadoopdata/hdfs/namenode
mkdir -p ~/hadoopdata/hdfs/datanode
```

After that I set up core-site.xml and installed the packages mentioned in Modern Data Lake with MinIO : Part 2.

Then I started Hadoop:

```
hadoop/bin/hdfs namenode -format
hadoop/sbin/start-all.sh
```
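For reference, the `nano ~/.bashrc` step above is where the Hadoop environment variables normally go. A minimal sketch, assuming Hadoop was unpacked to `~/hadoop` (as in the `mv` step) and Java 8 sits at the default Ubuntu path:

```shell
# Append to ~/.bashrc, then reload with: source ~/.bashrc
# (the paths below are assumptions based on the steps above)
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_HOME="$HOME/hadoop"
export HADOOP_CONF_DIR="$HADOOP_HOME/etc/hadoop"
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"
```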
When I ran `jps`, it showed that NodeManager was running.
After that, I ran the following command to read from MinIO:

```
hadoop fs -ls s3a://mydemobucket/
```
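For anyone comparing configs: reading `s3a://` paths from MinIO needs the endpoint, credentials, and path-style access set in core-site.xml. A minimal sketch written via a heredoc; the endpoint and the `minioadmin` keys are placeholders for your own MinIO values, and the file is written to /tmp here only so it can be diffed against your real `$HADOOP_HOME/etc/hadoop/core-site.xml`:

```shell
# Sketch of the s3a-to-MinIO properties (placeholder endpoint/credentials);
# written to /tmp for illustration, merge into $HADOOP_HOME/etc/hadoop/core-site.xml
cat > /tmp/core-site.xml <<'EOF'
<configuration>
  <property><name>fs.s3a.endpoint</name><value>http://127.0.0.1:9000</value></property>
  <property><name>fs.s3a.access.key</name><value>minioadmin</value></property>
  <property><name>fs.s3a.secret.key</name><value>minioadmin</value></property>
  <property><name>fs.s3a.path.style.access</name><value>true</value></property>
  <property><name>fs.s3a.connection.ssl.enabled</name><value>false</value></property>
</configuration>
EOF
```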

But it throws the following error:

```
java.lang.NoClassDefFoundError: software/amazon/awssdk/transfer/s3/progress/TransferListener
```

Note: mydemobucket already exists in MinIO.

Hi @gakshat1107,

Something similar was discussed here: java - EMR serverless NoClassDefFoundError: software/amazon/awssdk/transfer/s3/progress/TransferListener - Stack Overflow

I suggest you check the dependency tree on this.
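One concrete way to act on that suggestion without digging through a Maven tree: inspect what is actually on Hadoop's classpath. On a real box you would run `hadoop classpath | tr ':' '\n' | grep -E 'hadoop-aws|bundle'`; the same filter is shown below on a made-up sample string. In Hadoop 3.4.0 the s3a connector needs both `hadoop-aws-3.4.0.jar` and the AWS SDK v2 `bundle-*.jar` (the jar that contains the missing `TransferListener` class):

```shell
# Split a colon-separated classpath into lines and keep the AWS-related jars.
# cp_example is a made-up sample; on a real install, use the output of: hadoop classpath
cp_example='/opt/hadoop/share/hadoop/common/*:/opt/hadoop/share/hadoop/tools/lib/hadoop-aws-3.4.0.jar:/opt/hadoop/share/hadoop/tools/lib/bundle-2.23.19.jar'
echo "$cp_example" | tr ':' '\n' | grep -E 'hadoop-aws|bundle'
# If the real classpath shows no matches, s3a cannot load its classes and you
# get exactly the NoClassDefFoundError above.
```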

Thanks, Bogdan

I hit something similar when working with Hadoop locally while trying to create tables on S3. It looks like you may be missing the AWS dependencies, as Bogdan alluded to. This is how I fixed it for myself:

  1. Edit this file: `vi /Users/username/hadoop-3.3.5/etc/hadoop/hadoop-env.sh`
  2. Add this line: `export HADOOP_OPTIONAL_TOOLS="hadoop-aws"`

This brought the AWS classes I needed into my environment.

Hope this helps.

Hi, it's resolved now. I had forgotten to set the Hadoop classpath.
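For anyone who lands here with the same error: "setting the Hadoop classpath" in this situation generally means making the optional tools jars (hadoop-aws plus the AWS SDK v2 bundle, which ships the missing `TransferListener` class) visible to Hadoop commands. A sketch, assuming `HADOOP_HOME` points at your Hadoop install:

```shell
# Put the optional tools jars (hadoop-aws-*.jar, bundle-*.jar) on the classpath
# for hadoop CLI commands; assumes HADOOP_HOME is already exported.
export HADOOP_CLASSPATH="$HADOOP_HOME/share/hadoop/tools/lib/*"
# Equivalent, persistent alternative in etc/hadoop/hadoop-env.sh:
#   export HADOOP_OPTIONAL_TOOLS="hadoop-aws"
```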