Very small message size - one aspect of this is the very efficient variable-sized integer (varint) representation (see the sketch after this list).
Very fast decoding - it is a binary protocol, so there is no text to parse.
protobuf generates super-efficient C++ for encoding and decoding the messages -- hint: if you encode only varints or fixed-size items into it, it will encode and decode at deterministic speed.
It offers a VERY rich data model, efficiently encoding very complex data structures.
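To make the varint point concrete, here is a minimal standalone sketch of base-128 varint encoding, the integer wire format protobuf documents: 7 payload bits per byte, with the high bit set on every byte except the last. The function names are my own for illustration; this is not protobuf's actual generated code.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Encode an unsigned integer as a base-128 varint: emit the low 7 bits
// per byte, setting the continuation bit (0x80) on all but the last byte.
std::vector<uint8_t> encode_varint(uint64_t value) {
    std::vector<uint8_t> out;
    while (value >= 0x80) {
        out.push_back(static_cast<uint8_t>((value & 0x7F) | 0x80));
        value >>= 7;
    }
    out.push_back(static_cast<uint8_t>(value));
    return out;
}

// Decode a base-128 varint from a buffer; returns the number of bytes read.
size_t decode_varint(const uint8_t* buf, uint64_t* value) {
    uint64_t result = 0;
    size_t i = 0;
    int shift = 0;
    do {
        result |= static_cast<uint64_t>(buf[i] & 0x7F) << shift;
        shift += 7;
    } while (buf[i++] & 0x80);
    *value = result;
    return i;
}

int main() {
    // 300 encodes in two bytes (0xAC 0x02); one billion in five.
    for (uint64_t v : {300ULL, 1000000000ULL}) {
        std::vector<uint8_t> bytes = encode_varint(v);
        uint64_t decoded = 0;
        decode_varint(bytes.data(), &decoded);
        std::printf("%llu -> %zu byte(s), round-trips to %llu\n",
                    (unsigned long long)v, bytes.size(),
                    (unsigned long long)decoded);
    }
}
```

Note how small values cost only one byte on the wire, which is why varints keep messages compact, and why the hot loop is a handful of shifts and masks with no branching on text.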
JSON is just text, and it needs to be parsed. Hint: encoding a "billion" int into it takes quite a lot of characters: one billion (10^9) is 10 characters of decimal text, while in binary it fits in a 4-byte uint32_t. Now what about trying to encode a double? That would be FAR, FAR worse: round-tripping a double in text can take up to 17 significant digits, versus a fixed 8 bytes in binary.
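A quick standalone comparison of those sizes (illustrative only, not taken from protobuf or any benchmark):

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // One billion as decimal text vs. as a fixed-width binary integer.
    uint32_t billion = 1000000000u;          // fits: uint32_t max is ~4.29e9
    char text[32];
    int text_len = std::snprintf(text, sizeof text, "%u", billion);
    std::printf("int   text: %d chars, binary: %zu bytes\n",
                text_len, sizeof billion);   // text: 10 chars, binary: 4 bytes

    // A double is worse: round-tripping can need up to 17 significant
    // digits in text, but it is always exactly 8 bytes in binary.
    double d = 0.1;                          // not exactly representable
    int d_len = std::snprintf(text, sizeof text, "%.17g", d);
    std::printf("double text: %d chars, binary: %zu bytes\n",
                d_len, sizeof d);            // text: 19 chars, binary: 8 bytes
}
```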
In general, JSON has a slightly larger wire size and slightly worse serialization/deserialization performance, but wins in ubiquity and in the ability to interpret it easily without the source IDL. The last point is something that Apache Avro is trying to solve, and Avro beats both in terms of performance.
The Burning Monks benchmarks show the performance of serializing a simple POCO, whilst the comprehensive Northwind benchmarks show the combined results of serializing a row from every table of Microsoft's Northwind dataset.
Basically, Protocol Buffers (protobuf-net) is around 7x quicker than the fastest base class library serializer in .NET (the XML DataContractSerializer). It's also more compact than the competition, producing output 2.2x smaller than Microsoft's most compact serialization format (the JsonDataContractSerializer).
ServiceStack's text serializers come closest to matching the performance of the binary protobuf-net: its JSON serializer is only 2.58x slower than protobuf-net.