Yes, I did that too, but I faced the same problem. I had been working on this for around two days and finally got to the root of the issue.
I cleaned up everything, made sure no other jobs were running in the background, and emptied the trash as well.
If I upload a CSV with 10 records it causes no issue, but with a CSV of more than 10 records, uploading through the Ambari HDFS Files view corrupts the file. When I uploaded the same file from the command line, it was all fine, no issues.
Basically, I first uploaded the CSV to /home/sparkFiles on the cluster node using WinSCP, and then moved it into HDFS with hdfs dfs -put; that way it worked fine.
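For reference, the workaround looks roughly like this. The hostname, username, filename, and HDFS target path below are placeholders for illustration, not values from my setup; only /home/sparkFiles and the hdfs dfs -put step come from what I actually did:

```shell
# 1. Copy the CSV from your workstation to the cluster node
#    (WinSCP on Windows; scp shown here as the equivalent):
scp data.csv user@cluster-node:/home/sparkFiles/

# 2. On the cluster node, push the file into HDFS directly,
#    bypassing the Ambari Files view entirely:
hdfs dfs -put /home/sparkFiles/data.csv /user/hadoop/data.csv

# 3. Sanity-check that the file landed intact (size, first rows):
hdfs dfs -ls /user/hadoop/data.csv
hdfs dfs -cat /user/hadoop/data.csv | head
```

This avoids whatever the Files view does to larger uploads, since the file only ever moves over scp and the HDFS client.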