How do I read the text/non-text classification data from the Kaggle folder?

FileNotFoundError                         Traceback (most recent call last)
<ipython-input-13-381be183fe9d> in <module>
      4 images_train = read_all(trainPath , key_prefix='bgr_') # change the path
      5 for language in languages:
----> 6     images_train.update(read_all(trainPath+language, key_prefix=language+"_" ))
      7 print(len(images_train))
      8 

<ipython-input-4-7594fb431dc1> in read_all(folder_path, key_prefix)
      5     print("Reading:")
      6     images = {}
----> 7     files = os.listdir(folder_path)
      8     for i, file_name in tqdm_notebook(enumerate(files), total=len(files)):
      9         file_path = os.path.join(folder_path, file_name)

FileNotFoundError: [Errno 2] No such file or directory: '../input/padhai-text-non-text-classification-level-2/kaggle_level_2/ta'

help me

Hi @vimalkumarmdb,
Please check the location of all the files first. You can do so with:
!ls -l

Thanks @Ishvinder. I'm on level 3 now, and there are more difficulties. Can you help me with how to read the data in level 3?

Are you facing the same issues in level 3 as well?

I want to unzip the zip file inside the folder, but I'm unable to do so. Can you help me read this level's data into a data frame?

@Ishvinder, it's not like the previous one; this is a new problem:

I want to unzip the zip file inside the folder, but I'm unable to do so. Can you help me read this level's data into a data frame?

To unzip a file, you can use the following snippet:

!unzip file.zip

In fact, I don't think you need to unzip files on Kaggle.
You can directly access all the subdirectories.
Try hovering over the expand button for the directory in the data window on the right.

I'm very frustrated with this assignment. This is the first time I can't read the dataset. If you have code for this level, kindly share it with me.

You should be able to unzip the file to your current path by running:

unzip filename.zip

Usually, the zip archives can be accessed automatically; I am not sure why we have to extract them here. We will check this.