Access HDFS from Matlab
Ludwig Drees
on 10 Oct 2014
Hi
we have installed Hadoop on two Linux (Ubuntu) machines (2 Datanode / 1 Namenode). Now, we want to access the data from a third computer where our Matlab R2014b is installed on a Windows operating system.
We have two questions:
1. How should we specify the environment variables (HADOOP_PREFIX) on our Windows machine?
2. Do we need to install Hadoop on our Windows machine?
Thanks for your support.
2 Comments
Siddharth Sundar
on 13 Oct 2014
The error suggests that datastore hasn't been able to read the folder that contains the customer files. My suggestion for the first step is to check the permissions in HDFS. HDFS, the filesystem that is part of Hadoop, has POSIX-like permissions. This folder will be owned by user 'hadoop', and it is possible that permissions are set such that other users cannot access it.
What username are you running MATLAB as? If it is not 'hadoop', then do the following (in a Linux terminal window):
/home/hadoop/hadoop-1.2.1/bin/hadoop fs -ls /user/hadoop
If this fails, or if it returns something like:
drwx------ - hadoop supergroup ... /user/hadoop/airline
then you need to correct the permissions in your filesystem.
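If the permissions do turn out to be restrictive, one way to open them up is a recursive chmod, run as the 'hadoop' user (the installation path and data path below are examples and should match your cluster):

```
# Grant read/execute on everything under /user/hadoop to all users
/home/hadoop/hadoop-1.2.1/bin/hadoop fs -chmod -R 755 /user/hadoop
```

Mode 755 keeps write access restricted to the 'hadoop' user while letting other users (such as the one running MATLAB) list and read the files.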
Does this work for you?
Accepted Answer
Javensius Sembiring
on 21 Oct 2014
Hi Kalsi,
Thanks for your feedback. Ludwig and I are currently working on this project together. The problem was that the configuration in core-site.xml, which contains the NameNode address (fs.default.name), still referred to a local IP address, whereas MATLAB needs an address it can actually reach to locate the HDFS system. After changing fs.default.name to the public IP address, MATLAB is now able to connect to the HDFS storage system.
The MATLAB-Hadoop setup we are developing consists of three computers connected to each other over a private network. All three run Ubuntu with Hadoop installed. One of them has two network cards, one for the local connection and the other for the public connection.
The public network card is what the client computer uses to access the Hadoop cluster. The problem is that when we change fs.default.name (the NameNode address) to the public IP address, Hadoop cannot start the other two DataNodes, since those nodes refer to the NameNode's local IP address. I know this is not a MATLAB problem, but do you know how to configure it correctly?
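This is a common multi-homed NameNode issue. One approach, a sketch assuming Hadoop 1.x (the hostname and port are placeholders), is to refer to the NameNode by hostname rather than IP address in core-site.xml on every machine:

```xml
<!-- core-site.xml, identical on all nodes: use a hostname, not an IP -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://namenode-host:9000</value>
</property>
```

Each machine then resolves that hostname to the interface it can reach: on the DataNodes, map namenode-host to the private IP in /etc/hosts; on the Windows client, map it to the public IP in C:\Windows\System32\drivers\etc\hosts. That way the cluster and the client use the same fs.default.name value but different routes.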
Thanks in advance,
More Answers (2)
Aaditya Kalsi
on 15 Oct 2014
You do need to install Hadoop on your Windows machine and provide that installation path to MATLAB on the same machine through the HADOOP_PREFIX environment variable.
To specify the environment variable on your Windows machine try:
setenv('HADOOP_PREFIX', 'C:\path\to\hadoop_installation')
ds = datastore('hdfs://host/path/to/file.txt', ...)
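Putting the two steps together, a minimal sketch (the installation path, hostname, port, and file path are placeholders for your own values):

```matlab
% Point MATLAB at the local Hadoop installation on Windows
setenv('HADOOP_PREFIX', 'C:\hadoop-1.2.1');

% Create a datastore over a file in HDFS; host, port, and path
% below must match fs.default.name and the actual file location
ds = datastore('hdfs://namenode-host:9000/user/hadoop/airline/airline.csv');

% Preview a few rows to confirm the connection works
preview(ds)
```

If datastore errors here, the usual suspects are a wrong host/port, a path that does not exist in HDFS, or the permission problem discussed above.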
3 Comments
Aaditya Kalsi
on 16 Oct 2014
Could you provide the configuration details? Is the host and port correct and is the path known to exist?
It might also help to ensure that the server name and port are exactly the same as the fs.default.name in your Hadoop configuration file.
If you're not on the same network, you may have to fully qualify the hostname.
Hope this helps.