Exception while trying to access data

Description

Update
/usr/lib/python3.6/site-packages/MD2K_Cerebral_Cortex-2.2.2-py3.6.egg/cerebralcortex/core/data_manager/raw/stream_handler.py - write_filesystem_day_file - 776 - Error in writing data to FileSystem.
STREAM ID: 153b4dd6-6e25-31e6-9a26-547da88a430d
Owner ID: 136f8891-af6f-49c1-a69a-b4acd7116a3c
Files: /CC/mperf/136f8891-af6f-49c1-a69a-b4acd7116a3c/153b4dd6-6e25-31e6-9a26-547da88a430d/20171123.gz
Exception: a bytes-like object is required, not 'list'
This is happening in the filesystem storage backend; HDFS appears unaffected.
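The `a bytes-like object is required, not 'list'` exception means a list of byte chunks is being passed to a binary `write()` call, which only accepts a single bytes-like object. A minimal sketch of the failure mode and one possible fix (the function name and data shape here are hypothetical, not the actual `write_filesystem_day_file` code):

```python
import gzip
import os
import tempfile

def write_day_file(path, chunks):
    """Write serialized stream data to a day .gz file.

    If `chunks` arrives as a list of bytes objects (e.g. one per row
    batch), passing the list directly to write() raises:
        TypeError: a bytes-like object is required, not 'list'
    Joining the chunks into one bytes object first avoids that.
    """
    payload = b"".join(chunks) if isinstance(chunks, list) else chunks
    with gzip.open(path, "wb") as f:
        f.write(payload)

# Demo: a list of byte chunks now round-trips cleanly.
tmp = os.path.join(tempfile.mkdtemp(), "20171123.gz")
write_day_file(tmp, [b"row1\n", b"row2\n"])
with gzip.open(tmp, "rb") as f:
    data = f.read()
# data == b"row1\nrow2\n"
```

The actual fix belongs wherever the day file's payload is assembled before the filesystem write, so serialization happens once rather than per-chunk at write time.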

CC version 2.2.2 as of Apr 9 12:13 PM

hdfsOpenFile(/cerebralcortex/data/622bf725-2471-4392-8f82-fcc9115a3745/1342f3e3-969a-38a8-bc0a-c7aede3c78a9/20171211.gz): FileSystem#create((Lorg/apache/hadoop/fs/Path;ZISJ)Lorg/apache/hadoop/fs/FSDataOutputStream;) error:
java.io.IOException: Filesystem closed
at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:471)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1271)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1216)
at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:477)
at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:474)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:474)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:415)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1067)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1048)
Error in generating gz file.
hdfsGetPathInfo(/cerebralcortex/data/622bf725-2471-4392-8f82-fcc9115a3745/1342f3e3-969a-38a8-bc0a-c7aede3c78a9/20171211.gz): getFileInfo error:
java.io.IOException: Filesystem closed
at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:471)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1715)
at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1526)
at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1523)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1523)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1627)
HDFS: GetPathInfo failed
Traceback (most recent call last):
File "/cerebralcortex/code/eggs/MD2K_Cerebral_Cortex-2.0.0-py3.6.egg/cerebralcortex/core/data_manager/raw/stream_handler.py", line 301, in compress_store_pickle
with hdfs.open(gz_filename, "wb") as gzwrite:
File "pyarrow/io-hdfs.pxi", line 397, in pyarrow.lib.HadoopFileSystem.open (/arrow/python/build/temp.linux-x86_64-3.6/lib.cxx:65876)
File "pyarrow/error.pxi", line 79, in pyarrow.lib.check_status (/arrow/python/build/temp.linux-x86_64-3.6/lib.cxx:8345)
pyarrow.lib.ArrowIOError: Unable to open file /cerebralcortex/data/622bf725-2471-4392-8f82-fcc9115a3745/1342f3e3-969a-38a8-bc0a-c7aede3c78a9/20171211.gz
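The `java.io.IOException: Filesystem closed` in both stack traces suggests a shared HDFS client handle was closed (e.g. by another thread or an earlier shutdown) while `compress_store_pickle` still held a reference to it. One hedged mitigation sketch is a wrapper that drops the cached connection and reconnects once when an operation fails with "Filesystem closed"; the `connect` factory below stands in for however CerebralCortex obtains its pyarrow HDFS connection (the actual wiring may differ):

```python
import io

class ReconnectingFS:
    """Wrap a filesystem connect() factory; if an operation fails with
    'Filesystem closed', drop the cached client and reconnect once.

    `connect` is any zero-arg callable returning an object with an
    open(path, mode) method, e.g. a lambda around the HDFS connection
    (hypothetical usage, shown here with an in-memory stand-in).
    """
    def __init__(self, connect):
        self._connect = connect
        self._fs = None

    def open(self, path, mode):
        for attempt in (1, 2):
            if self._fs is None:
                self._fs = self._connect()
            try:
                return self._fs.open(path, mode)
            except IOError as e:
                if "closed" in str(e).lower() and attempt == 1:
                    self._fs = None  # force a fresh connection, retry once
                else:
                    raise

# Demo with a fake HDFS client whose first instance is already closed:
class FakeFS:
    instances = 0
    def __init__(self):
        FakeFS.instances += 1
        self.closed = FakeFS.instances == 1  # first client starts closed
    def open(self, path, mode):
        if self.closed:
            raise IOError("Filesystem closed")
        return io.BytesIO()

fs = ReconnectingFS(FakeFS)
handle = fs.open("/cerebralcortex/data/x/20171211.gz", "wb")
# the closed first client triggered one reconnect; the write handle is live
```

An alternative worth checking is whether some component calls close() on a Hadoop `FileSystem` instance that is cached and shared process-wide, which closes it for every other holder as well.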

Environment

None

Assignee

Nasir Ali

Reporter

Anandatirtha N

Labels

None

Priority

Highest