Wednesday 15 June 2011

R - OS X Mountain Lion and cputime limits -


I am collecting data through SQL queries, via R. I have a loop that pulls down a small chunk of a large table and dumps each segment to disk; it repeats for about an hour, until the whole table is stored in flat files in my data directory.

However, R keeps exceeding a cputime limit: it is killed with a signal 24 (SIGXCPU, "CPU time limit exceeded") error every time.

I am running Mountain Lion.

I have tried

  nice -19 R CMD BATCH myscript.R

and the OS continues to kill the process at odd intervals. I do not believe the script stalls on any particular operation; it simply takes some time to work through the loop.

The loop looks like this:

  for (i in 1:64) {
    foobyte <- NULL
    for (j in 0:7) {
      max_id <- 1000000
      rows   <- 1e5
      to   <- max_id - (rows * j) - (i * 7 * rows)
      from <- max_id - (rows * (j + 1)) - (i * 7 * rows)
      foobit  <- queryDB(paste("SELECT * FROM foobar WHERE id <= ", to,
                               " AND id > ", from, ";"))
      foobyte <- rbind(foobit, foobyte)
    }
    filename <- paste("/my/data/dir/foobyte", i, ".csv", sep = "")
    write.table(foobyte, filename)
  }

Edit: I will try firing up R from a shell script that calls ulimit in that terminal session only, and see how that works.
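A minimal sketch of such a wrapper script (the script name myscript.R and the 12,000-second value are assumptions; because ulimit is a shell builtin, the raised soft limit applies only to this session and the processes it launches):

```shell
#!/bin/sh
# Wrapper sketch: raise the per-process CPU-time soft limit for this
# shell session only, then launch the R batch job under it. The raise
# succeeds only if 12000 does not exceed the hard limit (ulimit -H -t).
ulimit -t 12000 2>/dev/null || echo "hard limit is below 12000s" >&2
echo "soft cputime limit: $(ulimit -t)"

# Run R non-interactively at low scheduling priority. The guard makes
# the sketch a harmless no-op on machines without R installed, and
# "|| true" keeps it from aborting if myscript.R is absent.
if command -v R >/dev/null 2>&1; then
    nice -19 R CMD BATCH myscript.R || true
fi
```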

No luck ... it appears I do not have permission to raise the limit, even with sudo. The hard limits can be inspected with

  ulimit -a -H
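The distinction that bites here is soft versus hard limits: a process may lower either limit, and may raise its soft limit up to the hard limit, but raising the hard limit requires root. Note also that `sudo ulimit` fails because ulimit is a shell builtin, not an executable sudo can run. A small sketch for inspecting both values:

```shell
#!/bin/sh
# Soft limit: the value actually enforced (SIGXCPU when exceeded).
# Hard limit: the ceiling up to which a non-root process may raise
# its own soft limit.
echo "soft cputime limit: $(ulimit -S -t)"
echo "hard cputime limit: $(ulimit -H -t)"
```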

Solved, by moving to a Debian virtual machine:

  ulimit -t 12000   # raises the cputime limit from 600 seconds to 12,000 seconds

If anybody has a Mountain Lion solution, please let us know.
