Handling large CSV

We have a 13.1 GB CSV file with 553 columns hosted on S3, and Dremio seems to have trouble loading it. When we select "Extract Field Names" while formatting it, we get this error:
"Unable to skip a line. End of input reached. Check if line delimiter is present in the file"

13.1 GB is a pretty large file to download from S3, and the larger the transfer, the more likely you are to hit network issues. I would recommend splitting the file into multiple smaller CSV files in a single directory (you can instruct Dremio to treat a whole folder as one dataset).
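As a sketch, the split can be done with standard Unix tools while keeping the header row in every chunk (file names and sizes here are illustrative; `big.csv` stands in for the real file, scaled down for the demo):

```shell
# Create a small stand-in for the real 13.1 GB file
printf 'a,b,c\n' > big.csv                          # header row
seq 1 100 | awk '{print $1","$1","$1}' >> big.csv   # 100 data rows

mkdir -p parts
head -n 1 big.csv > header.csv                      # save the header
tail -n +2 big.csv | split -l 25 - parts/chunk_     # 25 rows per chunk; for the real
                                                    # file, use e.g. `split -C 1000m`
                                                    # for ~1 GB line-aligned chunks
for f in parts/chunk_*; do
  cat header.csv "$f" > "$f.csv" && rm "$f"         # prepend the header to each chunk
done
ls parts                                            # chunk_aa.csv chunk_ab.csv ...
```

Upload the resulting directory to S3 and point Dremio at the folder rather than at a single file.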