Friday 15 February 2013

performance - Should I get into the habit of removing unused variables in R?


Currently I'm working with relatively large data files, and my computer is not a supercomputer. I temporarily create several subsets and don't remove them from the workspace. Obviously they clutter the workspace with lots of variables, but is there any effect of many unused variables on the performance of R? (I.e., does the computer's memory fill up at some point?)
Should I get into the habit of removing unused variables as I write the code? Is it worth it?


I don't want to add extra lines to my code; I would rather keep a messy workspace than clean up after myself if there is no performance improvement.

Yes, unused objects will affect your performance, since R stores all of its objects in memory. Obviously, small objects will have a negligible impact, and you mostly only need to delete the really big ones (data frames with millions of rows, etc.), but having an uncluttered workspace won't hurt anything.
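For instance, here is a rough sketch (the object names are just placeholders, not from the question) of how you might check which objects are actually large and remove only those:

    ## Hypothetical example: one large temporary and one small object
    big_df    <- data.frame(x = rnorm(1e6), y = rnorm(1e6))
    small_vec <- 1:10

    ## List objects with their approximate memory footprint (bytes)
    sapply(ls(), function(nm) object.size(get(nm)))

    ## Remove only the large temporary, then ask R to reclaim the memory
    rm(big_df)
    gc()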

The only risk is removing something that you need later. Even when using a repo, as suggested, accidentally breaking your stuff is something you want to avoid.

One quick-and-dirty way of balancing these issues is to use local. When you do a computation that scatters temporary objects around, you can wrap it inside a call to local, which will effectively dispose of those objects for you afterwards. No more having to clean up lots of i, j, x, temp.var, and whatnot.

    local({
        x <- something
        for(i in seq_along(obj))
            temp <- some_unvectorised_function(obj[[i]], x)
        for(j in 1:temp)
            temp2 <- some_other_unvectorised_function(temp, j)
        # x, i, j, temp, temp2 only exist for the duration of local(...)
    })
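As a quick illustration of the effect (a minimal sketch, not part of the original answer): local returns the value of its last expression, while the temporaries vanish along with the local environment:

    y <- local({
        tmp <- 1:10        # temporary, only visible inside local()
        sum(tmp)           # value of the last expression is returned
    })
    y              # 55
    exists("tmp")  # FALSE: tmp was discarded with local()'s environment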
