How can I access data from HDFS in Parquet format?
André Horn
17 Sep 2018
Commented: Hatem Helal
10 Apr 2019
We have a large dataset stored in Parquet files on a Hadoop file system and would like to use a MATLAB datastore to analyse it. Unfortunately, I couldn't find any reports that anybody has done this yet.
Does MathWorks provide a native way to access Parquet data? Perhaps one could use a fileDatastore or a custom MATLAB datastore? Is there a template for that?
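For illustration, once a Parquet read function is available (parquetread was later added in R2019a, as noted in the answers below), a fileDatastore wrapper might look like the following sketch; the HDFS URL is a placeholder, and remote-location support depends on the release and Hadoop configuration:
% Sketch only: wrap Parquet files in a fileDatastore with a custom read
% function. Assumes parquetread (R2019a+) and a reachable HDFS location;
% 'server', 'port' and '/dir' are placeholders.
fds = fileDatastore('hdfs://server:port/dir', ...
    'ReadFcn', @parquetread, ...
    'FileExtensions', '.parquet', ...
    'IncludeSubfolders', true);
t = read(fds);   % contents of one Parquet file, returned as a table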
0 comments
Accepted Answer
Hitesh Kumar Dasika
20 Dec 2018
MathWorks has added support for Parquet files. It is available at the following link.
5 comments
Knut Voigtlaender
16 Jan 2019
Indeed, I'm able to access Parquet files hosted on a remote Hadoop Linux cluster from MATLAB on a local Windows PC.
The following steps worked for me:
1. I set up a local Hadoop installation on Windows according to
https://github.com/MuhammadBilalYar/Hadoop-On-Window/wiki/Step-by-step-Hadoop-2.8.0-installation-on-Window-10
2. log4j.properties must be copied from \hadoop-X.X.X\etc\hadoop\ to \matlab-parquet-master\Software\MATLAB\lib\jar\
3. The HADOOP_HOME environment variable should then point to the local Hadoop home directory instead of to Winutils.exe (it can also be set from within MATLAB, see the sketch below)
4. The check for a Unix-style filename was removed in matlab-parquet-master\Software\MATLAB\+bigdata\+parquet\Reader.m
5. The OS check must be removed from \matlab-parquet-master\Software\MATLAB\functions\parquetDatastore.m, or, as proposed above, it is possible to use \matlab-parquet-master\Software\MATLAB\+bigdata\+parquet\ParquetDatastore.m directly
After these steps I could initialize a datastore via a remote Hadoop URL like
ds=bigdata.parquet.ParquetDatastore('hdfs://server:port/dir','IncludeSubfolders',true)
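Since it is an ordinary datastore object, it can then be consumed in the usual way. A minimal sketch, assuming ParquetDatastore implements the standard read/hasdata datastore interface and with placeholder paths:
% Sketch only: placeholder paths, and the read/hasdata interface is assumed.
setenv('HADOOP_HOME', 'C:\hadoop-2.8.0');     % local Hadoop home from step 3
ds = bigdata.parquet.ParquetDatastore('hdfs://server:port/dir', ...
    'IncludeSubfolders', true);
while hasdata(ds)
    t = read(ds);    % one Parquet file (or block) returned as a table
    % ... process t ...
end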
Hatem Helal
10 Apr 2019
R2019a adds support for working with Parquet files; see this answer and let us know if you have any further feedback.
More Answers (2)
Hatem Helal
10 Apr 2019
MATLAB R2019a adds support for reading and writing Apache Parquet files (doc). Here are the relevant release notes:
1. Import and export column-oriented data from Parquet files in MATLAB. Parquet is a columnar storage format that supports efficient compression and encoding schemes. To work with the Parquet file format, use these functions.
- parquetread — Read columnar data from a Parquet file.
- parquetwrite — Write columnar data to a Parquet file.
- parquetinfo — Get information about a Parquet file.
2. The write function now supports writing tall arrays to Parquet files. To write a tall array, set the FileType parameter to 'parquet', for example:
write('C:\myData',tX,'FileType','parquet')
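A quick local round trip with these functions might look like the sketch below; the file name and table contents are just placeholders:
% Sketch only: the table and file name are arbitrary examples.
T = array2table(rand(5,3));               % any MATLAB table
parquetwrite('example.parquet', T)        % write the table to a Parquet file
T2   = parquetread('example.parquet');    % read it back as a table
info = parquetinfo('example.parquet')     % file, schema and row-group details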
0 comments
Hitesh Kumar Dasika
24 Sep 2018
Currently, there is no support for Apache Arrow or Parquet files in MATLAB.
3 comments
Hitesh Kumar Dasika
24 Sep 2018
Thank you for your feedback. We have raised this concern with our developers, and they are actively looking at including this feature in future releases. Unfortunately, there is no workaround in this case for now. Sorry for the trouble.
Hatem Helal
10 Apr 2019
R2019a adds support for working with Parquet files; see this answer and let us know if you have any further feedback.