Sunday 15 June 2014

Python 2.7 - Write and read a list from file


This is a slightly awkward request, but I am looking for a way to write a list to a file and then read it back some other time.

I have no way of remaking the lists so that they are correctly formed/formatted, as in the example below.

The data in my lists is like the following:

  test data here this is one group
  test data here this is another group

If you don't need it to be human-readable/editable, the easiest solution is to just use pickle.

To write:
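
A minimal sketch of the write half, mirroring the pickle.load call below:

  import pickle

  # 'wb' because pickle writes binary data; pair it with 'rb' on the read side.
  with open(the_filename, 'wb') as f:
      pickle.dump(my_list, f)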

To read:

  with open(the_filename, 'rb') as f:
      my_list = pickle.load(f)
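
A quick round trip with throwaway data shows the pair in action (the filename here is just illustrative):

  import pickle

  my_list = ['test data here this is one group',
             'test data here this is another group']
  the_filename = 'groups.pkl'  # hypothetical path

  with open(the_filename, 'wb') as f:
      pickle.dump(my_list, f)
  with open(the_filename, 'rb') as f:
      assert pickle.load(f) == my_list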

If you do need it to be human-readable, we need more information.

If my_list is guaranteed to be a list of strings with no embedded newlines, just write them one per line:

  with open(the_filename, 'w') as f:
      for s in my_list:
          f.write(s + '\n')

  with open(the_filename, 'r') as f:
      my_list = [line.rstrip('\n') for line in f]


If they're Unicode strings rather than byte strings, you'll want to encode them. (Or, worse, if they're byte strings but not necessarily in the same encoding as your system default.)
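
A minimal illustration of that encode step in 2.x, assuming a string with one non-ASCII character:

  s = u'caf\xe9'                   # u'café' as a unicode string
  utf8_bytes = s.encode('utf-8')   # 'caf\xc3\xa9', a byte string safe to write
  # In 2.x, writing s unencoded to a text-mode file triggers an implicit
  # ascii encode, which raises UnicodeEncodeError on the \xe9.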

If they may have newlines, non-printable characters, etc., you can use escaping or quoting; Python has a variety of escaping codecs built into the stdlib.

Use unicode-escape to solve both of the above problems together:

  with open(the_filename, 'w') as f:
      for s in my_list:
          f.write((s + u'\n').encode('unicode-escape'))

  with open(the_filename, 'r') as f:
      my_list = [line.decode('unicode-escape').rstrip(u'\n') for line in f]
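
To see what the codec buys you, here is one string with an embedded newline, tab, and non-ASCII character run through it (the values shown are what 2.x produces):

  s = u'one line\nanother\t\xe9'
  escaped = s.encode('unicode-escape')   # 'one line\\nanother\\t\\xe9'
  assert escaped.decode('unicode-escape') == s
  assert '\n' not in escaped             # safe to store one entry per line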

You can also use the 3.x-style solution in 2.x, with either the codecs module or the io module:*

  import io

  with io.open(the_filename, 'w', encoding='unicode-escape') as f:
      f.writelines(line + u'\n' for line in my_list)

  with io.open(the_filename, 'r', encoding='unicode-escape') as f:
      my_list = [line.rstrip(u'\n') for line in f]

* TOOWTDI, so which one is the obvious way? It depends... The short version: if you have to work with Python versions before 2.6, use codecs; if not, use io.
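
For the pre-2.6 case, a sketch of the codecs variant; the calls mirror the io version above:

  import codecs

  with codecs.open(the_filename, 'w', encoding='unicode-escape') as f:
      f.writelines(line + u'\n' for line in my_list)

  with codecs.open(the_filename, 'r', encoding='unicode-escape') as f:
      my_list = [line.rstrip(u'\n') for line in f]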
