Why does my CSV file contain a blank line between every data row when I write it with DictWriter in Python?

I am using DictWriter to output data from a dictionary to a csv file. Why does the CSV file have a blank line between every data row? It isn't a big deal, but my dataset is large and won't fit in a single csv file because it has too many rows, and the "double spacing" doubles the number of rows in the file.

My code for writing the dictionary is:

headers=['id', 'year', 'activity', 'lineitem', 'datum']
output = csv.DictWriter(open('file3.csv','w'), delimiter=',', fieldnames=headers)
output.writerow(dict((fn,fn) for fn in headers))
for row in rows:
    output.writerow(row)

I just tested your snippet, and there is no double-spaced output here. The line endings are \r\n, so what I would check in your case is:

  1. that your editor reads the DOS-style line endings correctly
  2. that no \n appears in the values of your rows dict

(Note that even if a value contains \n, DictWriter automatically quotes it, as the sketch below shows.)
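A quick sketch (Python 3, writing to an in-memory buffer rather than a file) demonstrating that quoting behaviour:

import csv
import io

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=['a', 'b'], lineterminator='\n')
writer.writeheader()
writer.writerow({'a': 'one\ntwo', 'b': 3})  # value contains an embedded newline
print(buf.getvalue())
# a,b
# "one
# two",3    <- the value is quoted; the row is not split in two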

By default, the classes in the csv module use Windows-style line terminators (\r\n) rather than Unix-style (\n). Could this be what’s causing the apparent double line breaks?

If so, in Python 2 you can override it in the DictWriter constructor:

output = csv.DictWriter(open('file3.csv','w'), delimiter=',', lineterminator='\n', fieldnames=headers)
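Put together, a minimal Python 2 sketch of that workaround (the rows list here is hypothetical sample data, since the question doesn't show where rows comes from):

import csv

headers = ['id', 'year', 'activity', 'lineitem', 'datum']
# Hypothetical sample data; the question's real `rows` comes from elsewhere.
rows = [
    {'id': 1, 'year': 2010, 'activity': 'run', 'lineitem': 'shoes', 'datum': 42},
]

f = open('file3.csv', 'w')
output = csv.DictWriter(f, delimiter=',', lineterminator='\n', fieldnames=headers)
output.writerow(dict((fn, fn) for fn in headers))  # header row
for row in rows:
    output.writerow(row)
f.close()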

From the csv.writer documentation:

If csvfile is a file object, it should be opened with newline=''

In other words, when opening the file you pass newline='' as a parameter (this applies to Python 3's built-in open).
You can also use a with statement to close the file when you're done writing to it.
Tested example below:

from __future__ import with_statement  # not necessary in newer versions
import csv
headers = ['id', 'year', 'activity', 'lineitem', 'datum']
with open('file3.csv', 'w', newline='') as fou:
    output = csv.DictWriter(fou, delimiter=',', fieldnames=headers)
    output.writerow(dict((fn, fn) for fn in headers))  # write the header row
    output.writerows(rows)
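As a quick sanity check (a sketch assuming file3.csv is the file just written), counting blank lines in the output should give zero:

# Count blank lines in the output file; should be 0 when newline='' was used.
with open('file3.csv') as fin:
    blanks = sum(1 for line in fin if line.strip() == '')
print(blanks)  # 0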

Changing the 'w' (write) in this line:

output = csv.DictWriter(open('file3.csv','w'), delimiter=',', fieldnames=headers)

to 'wb' (write binary) fixed this problem for me:

output = csv.DictWriter(open('file3.csv','wb'), delimiter=',', fieldnames=headers)

On Windows, Python 2's text mode translates each \n into \r\n, so the \r\n row terminator the csv module writes comes out as \r\r\n; binary mode skips that translation. See the Python v2.7 documentation for open().
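To confirm the fix, a small sketch (assuming file3.csv is the file written above) that inspects the raw bytes:

# Read the file back in binary and look at the raw line endings.
with open('file3.csv', 'rb') as f:
    data = f.read()
print(repr(data[:120]))  # rows should end in '\r\n', not '\r\r\n'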

Credit to @dandrejvv for the solution in the comment on the original post above.