Writing an object to a file with Node.js

I've searched a lot on Stack Overflow and Google about this, but I can't seem to find an answer.

I'm scraping a given URL's page for social media links, and the function returns an object containing a list of URLs.

When I try to write this data to another file, it comes out as [object Object] instead of the expected ['https://twitter.com/#!/101Cookbooks', 'http://www.facebook.com/101cookbooks'], which is what I see when I console.log() the result.

Here's my sad attempt at reading and writing a file in Node, trying to read each line (a url) and feed it in via the function call request(line, gotHTML):

fs.readFileSync('./urls.txt').toString().split('\n').forEach(function (line) {
  console.log(line);
  var obj = request(line, gotHTML);
  console.log(obj);
  fs.writeFileSync('./data.json', obj, 'utf-8');
});

For reference, the gotHTML function:

function gotHTML(err, resp, html) {
  var social_ids = [];

  if (err) {
    return console.log(err);
  } else if (resp.statusCode === 200) {
    var parsedHTML = $.load(html);

    parsedHTML('a').map(function (i, link) {
      var href = $(link).attr('href');
      for (var j = 0; j < socialurls.length; j++) {
        if (socialurls[j].test(href) && social_ids.indexOf(href) < 0) {
          social_ids.push(href);
        }
      }
    });
  }

  return social_ids;
}

obj is an array in your example.

fs.writeFileSync(filename, data, [options]) requires either a String or a Buffer for the data parameter; see the docs.

Try writing the array in string format:

// writes https://twitter.com/#!/101Cookbooks,http://www.facebook.com/101cookbooks
fs.writeFileSync('./data.json', obj.join(','), 'utf-8');

Or:

// writes ['https://twitter.com/#!/101Cookbooks', 'http://www.facebook.com/101cookbooks']
var util = require('util');
fs.writeFileSync('./data.json', util.inspect(obj) , 'utf-8');

edit: The reason you see the array in your example is that Node's implementation of console.log doesn't just call toString; it calls util.format. See the console.js source.

If you're getting [object Object], then use JSON.stringify:

fs.writeFile('./data.json', JSON.stringify(obj) , 'utf-8');

It worked for me.

Building on what deb2fast said, I would also pass a couple of extra parameters to JSON.stringify() to get it to pretty-format:

fs.writeFileSync('./data.json', JSON.stringify(obj, null, 2) , 'utf-8');

The second param is an optional replacer function, which you don't need in this case, so null works.

The third param is the number of spaces to use for indentation. 2 and 4 seem to be popular choices.
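A quick sketch of the pretty-printed output, using a made-up object in place of the question's scraped data:

```javascript
const obj = { urls: ['https://example.com/a', 'https://example.com/b'] };

// Indent with 2 spaces; null means "no replacer function".
const pretty = JSON.stringify(obj, null, 2);
console.log(pretty);
// {
//   "urls": [
//     "https://example.com/a",
//     "https://example.com/b"
//   ]
// }
```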

In my experience, JSON.stringify is slightly faster than util.inspect. I had to save the result object of a DB2 query as a JSON file. The query returned an object of 92k rows, and the conversion took very long to complete with util.inspect, so I ran the following test, writing the same 1000-record object to a file with both methods.

  1. JSON.stringify

    fs.writeFile('./data.json', JSON.stringify(obj, null, 2));
    

Time: 3:57 (3 min 57 sec)

Result's format:

[
  {
    "PROB": "00001",
    "BO": "AXZ",
    "CNTRY": "649"
  },
  ...
]
  2. util.inspect

    var util = require('util');
    fs.writeFile('./data.json', util.inspect(obj, false, 2, false));
    

Time: 4:12 (4 min 12 sec)

Result's format:

[ { PROB: '00001',
    BO: 'AXZ',
    CNTRY: '649' },
  ...
]

Could you try doing JSON.stringify(obj)?

Like this:

var stringify = JSON.stringify(obj);
fs.writeFileSync('./data.json', stringify, 'utf-8');

Just in case anyone else stumbles across this: I use the fs-extra library in Node and write JavaScript objects to a file like this:

const fse = require('fs-extra');
fse.outputJsonSync('path/to/output/file.json', objectToWriteToFile);

Further to @Jim Schubert's and @deb2fast's answers:

To be able to write out large objects (on the order of ~100 MB or more), you'll need to use for...of as shown below and adapt it to your requirements.

const fsPromises = require('fs').promises;

const sampleData = { firstName: "John", lastName: "Doe", age: 50, eyeColor: "blue" };

const writeToFile = async () => {
  for (const dataObject of Object.keys(sampleData)) {
    console.log(sampleData[dataObject]);
    await fsPromises.appendFile("out.json", dataObject + ": " + JSON.stringify(sampleData[dataObject]));
  }
};

writeToFile();

Refer to https://stackoverflow.com/a/67699911/3152654 for a full reference on Node.js limits.