Monday 15 February 2010

data.table - Recursive function in R to find unique rows of a list of data tables


I am working on a function that takes a list of data tables with identical column names and returns a single data table built by successive rbind, adding the unique rows from each data table, as shown below.
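To make the intended result concrete, here is a minimal sketch with two toy data tables (the names dt1/dt2 are only for illustration); rbindlist() plus unique() is used here just to show the expected output, not the recursive approach described below.

  library(data.table)

  dt1 <- data.table(id = c(1, 2, 2), val = c("a", "b", "b"))
  dt2 <- data.table(id = c(2, 3),    val = c("b", "c"))

  ## the result the function should produce: one table of unique rows across the list
  unique(rbindlist(list(dt1, dt2)))
  ##    id val
  ## 1:  1   a
  ## 2:  2   b
  ## 3:  3   c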

This function will be applied to "very" large data tables (tens of millions of rows), which is why I had to split them into several smaller data tables and recurse over a list of them. Depending on the length of the list of data tables (odd or even), at each step I take the unique rows of the data table at list index x and of the data table at index x - 1, rbind the two, assign the result to list index x - 1, and then remove list index x from the list.

I must be missing something silly, because although I can see the final unique data table when printing it (i.e. print(aList[[1]])), when I return(aList[[1]]) I get NULL. Please point out what I am missing... or suggest whether there is perhaps a more efficient way to do this.

Besides, instead of adding each data table to the list by value, can I add them by "reference"? I suspect that doing something like list(datatable1, datatable2, ...) will actually copy them?
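As a quick sanity check on the copying question, data.table provides an address() helper; in the minimal sketch below (toy tables and names chosen only for illustration), wrapping existing data tables in list() does not appear to duplicate the underlying data. Copies would only be made later, e.g. by code that modifies the tables under base R's copy-on-modify semantics.

  library(data.table)

  dt1 <- data.table(x = 1:3)
  dt2 <- data.table(x = 4:6)
  aList <- list(dt1, dt2)

  ## same memory address => the list element and dt1 are the same object, not a copy
  address(dt1) == address(aList[[1]])   # expected TRUE
  address(dt2) == address(aList[[2]])   # expected TRUE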

  ## Code to return unique rows
  returnUnique2 <- function(aList) {
    if (length(aList) == 1) {
      z <- unique(aList[[1]])
    } else if (length(aList) >= 2) {
      x <- length(aList)
      aList[[x - 1]] <- unique(rbind(unique(aList[[x - 1]]), unique(aList[[x]])))
      aList[[x]] <- NULL
      z <- returnUnique2(aList)
    }
    print(z)   ### The problem is here: if I change this to return(z), I get NULL (?)
  }

That is, if I change the print(z) above to read return(z), the function returns NULL.

Thanks in advance.

Please correct me if I have misunderstood what you are doing, but it seems that you have one large data table, need to split it up, run some function on each subset, rbind everything back together, and then run unique on it. Wouldn't the data.table way of doing this be something like:

  fn = function(d) {
    # do whatever to the subset and return the result
    # in this case, do nothing
    d
  }
  N = 10  # the number of chunks you want
  dt = dt[, fn(.SD), by = (seq_len(nrow(dt)) - 1) %/% (nrow(dt) / N)][, seq_len := NULL]
  dt = dt[!duplicated(dt)]
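For completeness, a small self-contained version of the same idea (a sketch only: the toy dt, the chunk count N = 3, and the explicit "chunk" column name are assumptions made here for illustration, not part of the original answer):

  library(data.table)

  dt <- data.table(id = rep(1:4, 3), val = "x")   # 12 rows containing duplicates
  fn <- function(d) unique(d)                     # per-chunk work; here just de-duplicate

  N  <- 3                                         # number of chunks
  dt <- dt[, fn(.SD), by = .(chunk = (seq_len(nrow(dt)) - 1) %/% ceiling(nrow(dt) / N))]
  dt[, chunk := NULL]
  dt <- dt[!duplicated(dt)]                       # unique rows across all chunks
  dt
  ##    id val
  ## 1:  1   x
  ## 2:  2   x
  ## 3:  3   x
  ## 4:  4   x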
