Knockout.js is incredibly slow with semi-large datasets

I'm just getting started with Knockout.js (I've always wanted to try it out, but now I finally have an excuse!). However, I'm running into some really horrible performance issues when binding a table to a relatively small set of data (around 400 rows or so).

In my model, I have the following code:

this.projects = ko.observableArray([]); // Bind to empty array at startup

this.loadData = function (data) // Called when AJAX method returns
{
    for (var i = 0; i < data.length; i++)
    {
        this.projects.push(new ResultRow(data[i])); // <-- Bottleneck!
    }
};

The problem is that the for loop above takes about 30 seconds with around 400 rows. However, if I change the code to:

this.loadData = function (data)
{
    var testArray = []; // <-- Plain ol' JavaScript array
    for (var i = 0; i < data.length; i++)
    {
        testArray.push(new ResultRow(data[i]));
    }
};

...then the for loop completes in the blink of an eye. In other words, the push method on Knockout's observableArray object is incredibly slow.

Here is my template:

<tbody data-bind="foreach: projects">
    <tr>
        <td data-bind="text: code"></td>
        <td><a data-bind="projlink: key, text: projname"></a></td>
        <td data-bind="text: request"></td>
        <td data-bind="text: stage"></td>
        <td data-bind="text: type"></td>
        <td data-bind="text: launch"></td>
        <td><a data-bind="mailto: ownerEmail, text: owner"></a></td>
    </tr>
</tbody>

My questions:

  1. Is this the correct way to bind my data (coming from an AJAX method) to an observable collection?
  2. I expect that push is doing some heavy recalculation every time it's called, such as rebuilding the bound DOM objects. Is there a way to defer that recalculation, or to push in all of my items at once?

I can add more code if needed, but I'm pretty sure this is what's relevant. For the most part I was just following the Knockout tutorials from their site.

UPDATE:

Per the advice below, I've updated my code:

this.loadData = function (data)
{
    var mappedData = $.map(data, function (item) { return new ResultRow(item); });
    this.projects(mappedData);
};

However, this.projects() still takes about 10 seconds for 400 rows. I admit I'm not sure how fast this would be without Knockout (just adding the rows through the DOM), but I have a feeling it would be a lot faster than 10 seconds.

UPDATE 2:

Per other advice below, I gave jQuery tmpl a shot (which Knockout natively supports), and that template engine renders about 400 rows in roughly 3 seconds. That seems like the best approach, short of a solution that would dynamically load more data as you scroll.


KnockoutJS has some great tutorials, particularly the one about loading and saving data.

In their case, they pull data using getJSON() which is extremely fast. From their example:

function TaskListViewModel() {
    // ... leave the existing code unchanged ...

    // Load initial state from server, convert it to Task instances, then populate self.tasks
    $.getJSON("/tasks", function(allData) {
        var mappedTasks = $.map(allData, function(item) { return new Task(item); });
        self.tasks(mappedTasks);
    });
}

Give KoGrid a look. It intelligently manages your row rendering so that it's more performant.

If you're trying to bind 400 rows to a table using the foreach binding, you're going to have trouble pushing that much through KO into the DOM.

KO does some very interesting things using the foreach binding, most of which are very good operations, but they do start to break down on perf as the size of your array grows.

I've been down the long dark road of trying to bind large data-sets to tables/grids, and you end up needing to break apart/page the data locally.

KoGrid does all of this. It's been built to only render the rows that the viewer can see on the page, and to virtualize the other rows until they are needed. I think you'll find its performance on 400 items to be much better than what you're experiencing.
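For reference, the binding looks roughly like the sketch below. The koGrid binding name and the data/columnDefs option names are assumptions recalled from the KoGrid docs, so double-check them against the project's README; the column fields reuse the property names from the question.

<!-- Rough KoGrid sketch; binding/option names are assumptions, not verified here -->
<div style="height: 400px" data-bind="koGrid: { data: projects,
                                                columnDefs: [{ field: 'code', displayName: 'Code' },
                                                             { field: 'projname', displayName: 'Project' }] }">
</div>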

As suggested in the comments.

Knockout has its own native template engine associated with the (foreach, with) bindings. It also supports other template engines, namely jquery.tmpl. Read here for more details. I haven't done any benchmarking with different engines, so I don't know if it will help. Reading your previous comment, in IE7 you may struggle to get the performance that you are after.

As an aside, KO supports any JS templating engine, provided someone has written an adapter for it. You may want to try others out there, as jquery.tmpl is due to be replaced by JsRender.
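For illustration, a jquery.tmpl-based version of the question's table might look roughly like this. This is only a sketch: it assumes jquery.tmpl is loaded before Knockout, reuses the projects/ResultRow names from the question, and shows only a few columns.

<!-- jquery.tmpl template; ${...} is jquery.tmpl syntax, not native KO -->
<script id="projectRowTmpl" type="text/html">
    <tr>
        <td>${ code }</td>
        <td>${ projname }</td>
        <td>${ stage }</td>
    </tr>
</script>

<tbody data-bind="template: { name: 'projectRowTmpl', foreach: projects }"></tbody>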

Use pagination with KO in addition to using $.map.

I had the same problem with a large dataset of 1400 records until I used paging with Knockout. Using $.map to load the records did make a huge difference, but the DOM render time was still hideous. Then I tried pagination, and that made my dataset lightning fast as well as more user friendly. A page size of 50 made the dataset much less overwhelming and reduced the number of DOM elements dramatically.

It's very easy to do with KO:

http://jsfiddle.net/rniemeyer/5Xr2X/
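The fiddle above shows the full approach; the core idea is just a computed that slices the source array by page. Here is a minimal sketch, assuming the projects array from the question (pageSize, pageIndex, and pagedProjects are illustrative names, not taken from the fiddle):

// Minimal paging sketch: the view binds to a slice of the full array
this.pageSize = 50;
this.pageIndex = ko.observable(0);
this.pagedProjects = ko.computed(function () {
    var start = this.pageIndex() * this.pageSize;
    return this.projects().slice(start, start + this.pageSize);
}, this);
// Then bind the table to the page instead of the full set:
// <tbody data-bind="foreach: pagedProjects"> ...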

A possible work-around, in combination with using jQuery.tmpl, is to push items one at a time to the observable array in an asynchronous manner, using setTimeout:

var self = this,
    remaining = data.length;

add(); // Start adding items

function add() {
    self.projects.push(data[data.length - remaining]);
    remaining -= 1;

    if (remaining > 0) {
        setTimeout(add, 10); // Schedule adding any remaining items
    }
}

This way, since you only add a single item at a time, the browser / Knockout.js can take its time manipulating the DOM accordingly, the browser isn't completely blocked for several seconds, and the user can scroll the list in the meantime.

Please see: Knockout.js Performance Gotcha #2 - Manipulating observableArrays

A better pattern is to get a reference to our underlying array, push to it, then call .valueHasMutated(). Now, our subscribers will only receive one notification indicating that the array has changed.
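Applied to the loadData function from the question, the pattern described in that quote would look something like this sketch (reusing the projects/ResultRow names from the question):

this.loadData = function (data) {
    var underlyingArray = this.projects(); // reference to the plain array inside the observableArray
    for (var i = 0; i < data.length; i++) {
        underlyingArray.push(new ResultRow(data[i])); // plain Array.push - no change notifications
    }
    this.projects.valueHasMutated(); // notify subscribers once, after all rows are in
};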

A solution to avoid locking up the browser when rendering a very large array is to 'throttle' the array such that only a few elements get added at a time, with a sleep in between. Here's a function which will do just that:

function throttledArray(getData) {
    var showingDataO = ko.observableArray(), // what the view binds to
        showingData = [],                    // items rendered so far
        sourceData = [];                     // the data set currently being shown
    ko.computed(function () {
        var data = getData();
        // If the data set changed substantially, re-render it in small batches.
        if ( Math.abs(sourceData.length - data.length) / sourceData.length > 0.5 ) {
            showingData = [];
            sourceData = data;
            (function load() {
                // Stop if a newer data set arrived or everything has been shown.
                if ( data == sourceData && showingData.length != data.length ) {
                    showingData = showingData.concat( data.slice(showingData.length, showingData.length + 20) );
                    showingDataO(showingData);
                    setTimeout(load, 500); // Yield to the browser between batches
                }
            })();
        } else {
            // Small change: swap the whole array in at once.
            showingDataO(showingData = sourceData = data);
        }
    });
    return showingDataO;
}

Depending on your use case, this could result in massive UX improvement, as the user might only see the first batch of rows before having to scroll.
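A hypothetical usage, assuming the view model from the question (allProjects is an illustrative name): keep the raw rows in a separate observableArray and bind the view to the throttled wrapper instead. Because getData is evaluated inside a computed, reading allProjects() there makes the wrapper update whenever the source array changes.

var allProjects = ko.observableArray([]);
this.projects = throttledArray(function () { return allProjects(); });

// Later, when the AJAX call returns:
allProjects($.map(data, function (item) { return new ResultRow(item); }));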

Taking advantage of push() accepting variable arguments gave the best performance in my case. 1300 rows were loading in 5973 ms (~6 sec.). With this optimization, the load time was down to 914 ms (<1 sec.).
That's an 84.7% improvement!

More info at Pushing items to an observableArray

this.projects = ko.observableArray([]); // Bind to empty array at startup

this.loadData = function (data) // Called when AJAX method returns
{
    var arrMappedData = ko.utils.arrayMap(data, function (item) {
        return new ResultRow(item);
    });
    // Take advantage of push accepting variable arguments
    this.projects.push.apply(this.projects, arrMappedData);
};

I've been experimenting with performance, and have two contributions that I hope might be useful.

My experiments focus on the DOM manipulation time. So before going into this, it is definitely worth following the points above about pushing into a JS array before creating an observable array, etc.

But if DOM manipulation time is still getting in your way, then this might help:


1: A pattern to wrap a loading spinner around the slow render, then hide it using afterRender

http://jsfiddle.net/HBYyL/1/

This isn't really a fix for the performance problem, but it shows that a delay is probably inevitable if you loop over thousands of items. It uses a pattern where you can ensure a loading spinner appears before the long KO operation and is hidden afterwards, so it improves the UX, at least.

Ensure you can load a spinner:

// Show the spinner immediately...
$("#spinner").show();

// ... by using a timeout around the operation that causes the slow render.
window.setTimeout(function() {
    ko.applyBindings(vm);
}, 1);

Hide the spinner:

<div data-bind="template: {afterRender: hide}">

which triggers:

hide = function() {
    $("#spinner").hide();
};

2: Using the html binding as a hack

I remembered an old technique from back when I was working on a set-top box with Opera, building UI using DOM manipulation. It was appallingly slow, so the solution was to store large chunks of HTML as strings and load the strings by setting the innerHTML property.

Something similar can be achieved by using the html binding and a computed that derives the HTML for the table as a big chunk of text, then applies it in one go. This does fix the performance problem, but the massive downside is that it severely limits what you can do with binding inside each table row.
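As a rough sketch of the idea (not the fiddle below), the computed could build the row markup as one string and be bound with <tbody data-bind="html: projectsHtml">. It assumes the projects array and property names from the question; ko.utils.unwrapObservable handles either plain values or observables.

this.projectsHtml = ko.computed(function () {
    var html = "";
    ko.utils.arrayForEach(this.projects(), function (p) {
        html += "<tr><td>" + ko.utils.unwrapObservable(p.code) + "</td>" +
                "<td>" + ko.utils.unwrapObservable(p.projname) + "</td></tr>";
    });
    return html; // set into the DOM in one go by the html binding
}, this);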

Here's a fiddle that shows this approach, together with a function that can be called from inside the table rows to delete an item in a vaguely-KO-like way. Obviously this isn't as good as proper KO, but if you really need blazing(ish) performance, this is a possible workaround.

http://jsfiddle.net/9ZF3g/5/

I also noticed that the Knockout.js template engine works more slowly in IE, so I replaced it with underscore.js; it works way faster.

I've been dealing with similarly huge volumes of incoming data, and for me valueHasMutated worked like a charm.

View Model :

this.projects([]); // (1) Start with an empty observableArray

var mutatedArray = this.projects(); // (2) Grab a reference to the underlying (plain) array

this.loadData = function (data) // Called when AJAX method returns
{
    ko.utils.arrayForEach(data, function (item) {
        mutatedArray.push(new ResultRow(item)); // (3) Push to the plain array - no notifications fired
    });
    this.projects.valueHasMutated(); // (4) Notify subscribers once, after all rows are pushed
};

After calling (4), the array data is loaded into the target observableArray, which is this.projects, automatically.

If you have time, have a look at this, and in case of any trouble just let me know.

The trick here: by doing it this way, any dependencies (computeds, subscriptions, etc.) are avoided at the push level, and we can make them execute in one go after calling (4).

If using IE, try closing the dev tools.

Having the developer tools open in IE significantly slows this operation down. I'm adding ~1000 elements to an array. With the dev tools open, this takes around 10 seconds and IE freezes while it is happening. When I close the dev tools, the operation is instant and I see no slowdown in IE.