Tuesday, 15 July 2014

Python list uses too much memory and can't be freed -


I have an iterator / generator that yields 'events', where an event consists of a name, a timestamp and a value. I want to store them in NumPy arrays; this is done in _LoadTriples():

    def _LoadTriples(abortEvt, count=[1]):
        it = _YieldTriples()
        while True:
            if abortEvt.is_set():
                it.close()
                break
            t0 = time.time()
            # read the next chunk of events into a structured array
            self.allEvents.append(np.fromiter(it,
                                              dtype=[('sigNameIdx', 'i'),
                                                     ('time', 'f'),
                                                     ('value', 'f8')],
                                              count=count[-1]))
            dur = time.time() - t0
            # adapt the chunk size so one pass takes roughly 0.2 to 0.4 s
            if dur < 0.2:
                count.append(count[-1] * 2)
            elif dur > 0.4 and count[-1] != 1:
                count.append(count[-1] // 2)
            else:
                count.append(count[-1])

_YieldTriples is the generator, abortEvt is the event that tells me when the user wants to abort, and here I append NumPy arrays of triples (name, timestamp, value) to the list. self.allEvents is an empty list beforehand. It is a list because I have to be able to interrupt the iteration regularly, and numpy.fromiter cannot be interrupted while it runs. So roughly every 0.3 seconds I get a chance to stop the run.
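To make the chunked reading concrete, here is a small self-contained sketch; the generator body and the chunk size are invented, and it uses itertools.islice instead of fromiter's count argument so the last, shorter chunk needs no special handling:

    import numpy as np
    from itertools import islice

    def _YieldTriples():
        # stand-in event source: the real generator gets its events elsewhere
        for i in range(10):
            yield i, 0.1 * i, float(i * i)

    dtype = [('sigNameIdx', 'i'), ('time', 'f'), ('value', 'f8')]
    allEvents = []
    it = _YieldTriples()

    while True:
        # read at most 4 events per pass so the loop regains control regularly
        # (the question uses fromiter's count argument for the same purpose)
        chunk = np.fromiter(islice(it, 4), dtype=dtype)
        if chunk.size == 0:
            break            # generator exhausted
        allEvents.append(chunk)

    print([a.size for a in allEvents])   # [4, 4, 2]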

All this works well. But in one case it happens that Python uses 300 MB of memory for this list! Once the run is stopped, my list needs at most 10 MB, depending on when I aborted it, yet after some calls to self.allEvents.append(np.fromiter(...)) 300 MB are in use and I do not know why.
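To compare the list's actual payload with what the whole process holds, a small sketch (the report helper is made up; the resource module is Unix-only):

    import resource     # Unix only

    def report(allEvents):
        # bytes actually occupied by the array data inside the list
        payload = sum(a.nbytes for a in allEvents)
        print("array payload: %.1f MB" % (payload / 1e6))
        # peak resident set size of the whole process
        # (ru_maxrss is in kilobytes on Linux, in bytes on macOS)
        print("peak RSS:", resource.getrusage(resource.RUSAGE_SELF).ru_maxrss)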

Besides that, this memory is not freed until I close the whole program, even if I delete self.allEvents right after that call. There must be some reference somewhere that prevents it from being released. Is there a way to find out which objects hold a reference to my list?
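For reference, the gc module can list the objects that still refer to a given object; a minimal sketch (the who_holds helper is made up, only gc.get_referrers is the real API):

    import gc

    def who_holds(obj):
        # print every object the garbage collector knows of that still refers
        # to obj; frames and local namespaces show up here too and usually
        # have to be filtered out by hand
        for ref in gc.get_referrers(obj):
            print(type(ref), repr(ref)[:80])

    who_holds(self.allEvents)    # or whatever name the list is bound to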

One more thing worth mentioning: the function is called in a new threading.Thread, but the main thread waits for it ...
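Roughly how that setup might look; the names here and the place where the event gets set are assumptions, only threading.Thread and the waiting main thread are given above:

    import threading

    abortEvt = threading.Event()
    loader = threading.Thread(target=_LoadTriples, args=(abortEvt,))
    loader.start()

    # ... later, e.g. when the user presses "stop":
    abortEvt.set()
    loader.join()    # the main thread blocks here until the loader returns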

EDIT: I forgot to mention that no additional memory is allocated once the 300 MB have been reached. It looks as if this memory is kept reserved and reused for the further appends to the list.

You should try:

    def _LoadTriples(abortEvt, count=None):
        if not count:
            count = [1]
        ...

Problems with mutable default arguments can show up very quickly.
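To illustrate what the answer is getting at, a small stand-alone demonstration of the mutable default argument behaviour (the grow functions are made up, not the code from the question):

    def grow(count=[1]):
        # the default list is created once, when the function is defined,
        # and shared by every call that does not pass its own list
        count.append(count[-1] * 2)
        return count

    print(grow())        # [1, 2]
    print(grow())        # [1, 2, 4] -- the same list kept growing

    def grow_fixed(count=None):
        if count is None:
            count = [1]  # a fresh list on every call
        count.append(count[-1] * 2)
        return count

    print(grow_fixed())  # [1, 2]
    print(grow_fixed())  # [1, 2]

In _LoadTriples this means the default count list survives from one call to the next, so the adaptive chunk sizes of an earlier run carry over into the next one.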
