Big Computing: Reading a large number of files into R

Sunday, June 8, 2014

Reading a large number of files into R

I know this is a fairly basic topic, but it is one that caused me problems lately. Normally I only have to read in one data file at a time or I read in a few tables separately.

If I am reading in a single file, I would do the following:

>read.table("file")

or if it is online:

>read.table("url")

If it is a csv file:

>read.csv("file")

Now the problem arose because I needed to read in 400 files from a directory, but the files were not numerically indexed. So to solve this problem I used the functions list.files and paste.

>names<-list.files("~/directory/")
>complete_names<-paste("~/directory/", names, sep="")
>monitors<-NULL
>for (i in seq_along(names)){
         monitors<-rbind(monitors, read.csv(complete_names[i]))
 }
It was slow, but it got the job done.
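Most of the slowness comes from growing monitors with rbind inside the loop, which re-copies the data frame on every pass. A common alternative (just a sketch, assuming the same ~/directory/ of csv files as above, not what my original run used) is to read all the files into a list and combine them once at the end:

># read each file into a list, then bind the pieces together in one call
>names<-list.files("~/directory/")
>complete_names<-file.path("~/directory", names)
>monitors<-do.call(rbind, lapply(complete_names, read.csv))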




