PHP Unserialize JSON
- The serialized string. If the variable being unserialized is an object, after successfully reconstructing the object PHP will automatically attempt to call the __wakeup() member function (if it exists). Note the unserialize_callback_func directive: it's possible to set a callback function which will be called if an undefined class should be instantiated during unserializing.
- JSON is an alternative that is not specific to PHP, and does not care about custom classes either. If you can access the PHP source where the serialization happens, change it to produce JSON instead. PHP offers json_encode for that, and JavaScript has JSON.parse to decode it.
- If you look at the documentation for json_decode, you'll see that you can pass a second parameter to make it decode these JS objects to associative arrays, so you don't need to use PHP's ugly serialize method. Storing your data in JSON is also more portable if you ever decide to use it somewhere that's not PHP, as most popular programming languages have a JSON parsing library.
Return Values. Returns the value encoded in JSON as an appropriate PHP type. Values true, false and null are returned as TRUE, FALSE and NULL respectively. NULL is returned if the JSON cannot be decoded or if the encoded data is deeper than the recursion limit.
I need to store a multi-dimensional associative array of data in a flat file for caching purposes. I might occasionally come across the need to convert it to JSON for use in my web app but the vast majority of the time I will be using the array directly in PHP.
Would it be more efficient to store the array as JSON or as a PHP serialized array in this text file? I've looked around and it seems that in the newest versions of PHP (5.3), json_decode is actually faster than unserialize.
I'm currently leaning towards storing the array as JSON, as I feel it's easier for a human to read if necessary, it can be used in both PHP and JavaScript with very little effort, and from what I've read, it might even be faster to decode (not sure about encoding, though).
Does anyone know of any pitfalls? Anyone have good benchmarks to show the performance benefits of either method?
Jeffrey Bosboom

19 Answers
Depends on your priorities.
If performance is your absolute driving characteristic, then by all means use the fastest one. Just make sure you have a full understanding of the differences before you make a choice:
- Unlike serialize(), you need to add an extra parameter to keep UTF-8 characters untouched: json_encode($array, JSON_UNESCAPED_UNICODE) (otherwise it converts UTF-8 characters to Unicode escape sequences).
- JSON will have no memory of what the object's original class was (they are always restored as instances of stdClass).
- You can't leverage __sleep() and __wakeup() with JSON.
- By default, only public properties are serialized with JSON (in PHP >= 5.4 you can implement JsonSerializable to change this behavior).
- JSON is more portable.
And there's probably a few other differences I can't think of at the moment.
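To make the first difference in the list concrete, here's a small sketch of what JSON_UNESCAPED_UNICODE changes (the example strings are my own):

```php
<?php
// Without JSON_UNESCAPED_UNICODE, non-ASCII characters become \uXXXX escapes.
$arr = ['city' => 'Zürich'];

echo json_encode($arr), "\n";
// {"city":"Z\u00fcrich"}

echo json_encode($arr, JSON_UNESCAPED_UNICODE), "\n";
// {"city":"Zürich"}
```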
A simple speed test to compare the two
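The original test script isn't included in this copy; a minimal sketch of such a comparison (the array shape and iteration counts are my own choices, not the answer's) might look like:

```php
<?php
// Build a moderately sized nested test array.
$data = [];
for ($i = 0; $i < 1000; $i++) {
    $data["key_$i"] = ['id' => $i, 'name' => "item $i", 'tags' => ['a', 'b', 'c']];
}

// Time json_encode vs serialize over repeated runs.
$start = microtime(true);
for ($i = 0; $i < 100; $i++) { $json = json_encode($data); }
$jsonTime = microtime(true) - $start;

$start = microtime(true);
for ($i = 0; $i < 100; $i++) { $ser = serialize($data); }
$serTime = microtime(true) - $start;

printf("json_encode: %.4fs (%d bytes)\n", $jsonTime, strlen($json));
printf("serialize:   %.4fs (%d bytes)\n", $serTime, strlen($ser));
```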
T.Todua

JSON is simpler and faster than PHP's serialization format and should be used unless:
- You're storing deeply nested arrays: json_decode(): 'This function will return false if the JSON encoded data is deeper than 127 elements.'
- You're storing objects that need to be unserialized as the correct class.
- You're interacting with old PHP versions that don't support json_decode.
I've written a blog post about this subject: 'Cache a large array: JSON, serialize or var_export?'. In this post it is shown that serialize is the best choice for small to large sized arrays. For very large arrays (> 70MB), JSON is the better choice.
You might also be interested in https://github.com/phadej/igbinary - which provides a different serialization 'engine' for PHP.
My random/arbitrary 'performance' figures, using PHP 5.3.5 on a 64-bit platform, show:
JSON :
- JSON encoded in 2.180496931076 seconds
- JSON decoded in 9.8368630409241 seconds
- serialized 'String' size : 13993
Native PHP :
- PHP serialized in 2.9125759601593 seconds
- PHP unserialized in 6.4348418712616 seconds
- serialized 'String' size : 20769
Igbinary :
- WIN igbinary serialized in 1.6099879741669 seconds
- WIN igbinary unserialized in 4.7737920284271 seconds
- WIN serialized 'String' Size : 4467
So, it's quicker to igbinary_serialize() and igbinary_unserialize(), and it uses less disk space.
I used the fillArray(0, 3) code as above, but made the array keys longer strings.
igbinary can store the same data types as PHP's native serialize can (so no problem with objects etc.), and you can tell PHP 5.3 to use it for session handling if you so wish.
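If you do want igbinary as the session serializer, the relevant php.ini settings (assuming the igbinary extension is installed) look roughly like this:

```ini
; Requires the igbinary extension to be loaded.
session.serialize_handler = igbinary
; Optional: trade a little CPU for smaller serialized strings.
igbinary.compact_strings = On
```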
See also http://ilia.ws/files/zendcon_2010_hidden_features.pdf - specifically slides 14/15/16
David Goodwin

I just tested serialize and JSON encode/decode, plus the size the stored string will take.
We can conclude that JSON encodes faster and results in a smaller string, but unserialize is faster at decoding the string.
If you are caching information that you will ultimately want to 'include' at a later point in time, you may want to try using var_export. That way you only take the hit in the 'serialize' and not in the 'unserialize'.
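A minimal sketch of that var_export approach (the file path and data here are illustrative): you pay the export cost once at write time, and reads are a plain include.

```php
<?php
// Data to cache; any var_export-able value works.
$data = ['host' => 'localhost', 'port' => 3306, 'tags' => ['a', 'b']];
$cacheFile = sys_get_temp_dir() . '/cache_example.php';

// "serialize" step: write the array as executable PHP, once.
file_put_contents($cacheFile, '<?php return ' . var_export($data, true) . ';');

// "unserialize" step: a plain include, no string parsing at all.
$restored = include $cacheFile;
var_dump($restored === $data); // bool(true)
```

On servers with an opcode cache, the included file is compiled once and served from memory afterwards, which is where this trick pays off.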
Jordan S. Jones

I augmented the test to include unserialization performance. Here are the numbers I got.
So JSON seems to be faster for encoding but slower for decoding. So it could depend upon your application and what you expect to do the most.
Really nice topic. After reading the few answers, I want to share my experiments on the subject.
I got a use case where some 'huge' table needs to be queried almost every time I talk to the database (don't ask why, it's just a fact). The database caching system isn't appropriate, as it won't cache the different requests, so I thought about PHP caching systems.
I tried APCu, but it didn't fit the needs; memory isn't reliable enough in this case. The next step was to cache into a file with serialization.
The table has 14355 entries with 18 columns; these are my tests and stats on reading the serialized cache:
JSON:
As you all said, the major inconvenience with json_encode/json_decode is that it transforms everything into stdClass instances (or objects). If you need to loop over it, transforming it to an array is what you'll probably do, and yes, that increases the transformation time.
average time: 780.2 ms; memory use: 41.5MB; cache file size: 3.8MB
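As an aside, the stdClass detour can be avoided by passing true as json_decode's second argument; a quick illustration with arbitrary sample data:

```php
<?php
$json = '{"id":7,"cols":{"a":1,"b":2}}';

$asObject = json_decode($json);        // nested stdClass instances
$asArray  = json_decode($json, true);  // plain nested arrays

var_dump($asObject->cols->a);    // int(1)
var_dump($asArray['cols']['b']); // int(2)
```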
Msgpack
@hutch mentions msgpack. Pretty website. Let's give it a try, shall we?
average time: 497 ms; memory use: 32MB; cache file size: 2.8MB
That's better, but it requires a new extension; compiling sometimes scares people off.
IgBinary
@GingerDog mentions igbinary. Note that I've set igbinary.compact_strings=Off because I care more about reading performance than file size.
average time: 411.4 ms; memory use: 36.75MB; cache file size: 3.3MB
Better than msgpack. Still, this one requires compiling too.
serialize/unserialize
average time: 477.2 ms; memory use: 36.25MB; cache file size: 5.9MB
Better performance than JSON: the bigger the array, the slower json_decode gets, but you already knew that.
These external extensions narrow down the file size and seem great on paper. Numbers don't lie*. What's the point of compiling an extension if you get almost the same results as you'd have with a standard PHP function?
We can also deduce that depending on your needs, you will choose something different than someone else:
- IgBinary is really nice and performs better than MsgPack
- Msgpack is better at compressing your data (note that I didn't try the igbinary compact_strings option).
- Don't want to compile? Use standards.
That's it, another serialization-method comparison to help you choose!
*Tested with PHPUnit 3.7.31, PHP 5.5.10 - only decoding, with a standard hard drive and an old dual-core CPU - average numbers over 10 runs of the same use case; your stats might be different
Seems like serialize is the one I'm going to use for 2 reasons:
Someone pointed out that unserialize is faster than json_decode, and a 'read' case sounds more probable than a 'write' case.
I've had trouble with json_encode when strings contain invalid UTF-8 characters. When that happens, the string ends up empty, causing loss of information.
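A sketch of that failure mode, plus one mitigation available since PHP 7.2 (the JSON_INVALID_UTF8_SUBSTITUTE flag); the sample string is my own:

```php
<?php
// "\xB1" on its own is not valid UTF-8.
$bad = "valid text \xB1 invalid byte";

var_dump(json_encode($bad));                      // bool(false) on PHP >= 5.5
var_dump(json_last_error() === JSON_ERROR_UTF8);  // bool(true)

// PHP 7.2+: replace invalid bytes with U+FFFD instead of failing outright.
$safe = json_encode($bad, JSON_INVALID_UTF8_SUBSTITUTE);
var_dump($safe !== false); // bool(true)
```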
I've tested this very thoroughly on a fairly complex, mildly nested multi-hash with all kinds of data in it (string, NULL, integers), and serialize/unserialize ended up much faster than json_encode/json_decode.
The only advantage JSON had in my tests was its smaller 'packed' size.
These are done under PHP 5.3.3, let me know if you want more details.
Here are the test results, then the code to produce them. I can't provide the test data since it'd reveal information that I can't let go out in the wild.
I made a small benchmark as well. My results were the same. But I need the decode performance. There I noticed, like a few people above said as well, that unserialize is faster than json_decode: unserialize takes roughly 60-70% of the json_decode time. So the conclusion is fairly simple: when you need performance in encoding, use json_encode; when you need performance in decoding, use unserialize. Because you cannot merge the two functions, you have to make a choice where you need more performance.
My benchmark in pseudo:
- Define array $arr with a few random keys and values
- for x < 100; x++; serialize and json_encode an array_rand of $arr
- for y < 1000; y++; json_decode the json encoded string - calc time
- for y < 1000; y++; unserialize the serialized string - calc time
- echo the result which was faster
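One possible concrete version of that pseudo-benchmark (the array shape and the per-run array_rand step are simplified; counts are my guesses at the author's setup):

```php
<?php
// Define an array with a few random keys and values.
$arr = [];
for ($i = 0; $i < 100; $i++) {
    $arr["key_$i"] = str_repeat('x', rand(5, 40));
}

$json = json_encode($arr);
$ser  = serialize($arr);

// Decode each representation 1000 times and compare elapsed time.
$start = microtime(true);
for ($y = 0; $y < 1000; $y++) { json_decode($json, true); }
$jsonTime = microtime(true) - $start;

$start = microtime(true);
for ($y = 0; $y < 1000; $y++) { unserialize($ser); }
$serTime = microtime(true) - $start;

echo $serTime < $jsonTime ? "unserialize was faster\n" : "json_decode was faster\n";
```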
On average: unserialize won 96 times versus 4 times for json_decode, with an average of roughly 1.5ms against 2.5ms.
Jelmer

Before you make your final decision, be aware that the JSON format is not safe for associative arrays - json_decode() will return them as objects instead.
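The example code and its output didn't survive in this copy; a reconstruction of the point being made (variable names are mine):

```php
<?php
// An associative array survives serialize()/unserialize() as an array,
// but comes back from a plain json_decode() as a stdClass object.
$assoc = ['first' => 1, 'second' => 2];

$fromSerialize = unserialize(serialize($assoc));
$fromJson      = json_decode(json_encode($assoc)); // no second argument

var_dump(is_array($fromSerialize)); // bool(true)
var_dump(is_object($fromJson));     // bool(true)
```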
too much php

Check out the results here (sorry for the hack of putting the PHP code in the JS code box):
RESULTS: serialize() and unserialize() are both significantly faster in PHP 5.4 on arrays of varying size.
I made a test script on real-world data for comparing json_encode vs serialize and json_decode vs unserialize. The test was run on the caching system of an in-production e-commerce site. It simply takes the data already in the cache, tests the times to encode/decode (or serialize/unserialize) all the data, and puts the results in an easy-to-read table.
I ran this on PHP 5.4 shared hosting server.
The results were very conclusive: for these large-to-small data sets, serialize and unserialize were the clear winners. In particular for my use case, json_decode and unserialize are the most important for the caching system. unserialize was almost a ubiquitous winner here; it was typically 2 to 4 times (sometimes 6 or 7 times) as fast as json_decode.
It is interesting to note the difference in results from @peter-bailey.
Here is the PHP code used to generate the results:
Pink Code

First, I changed the script to do some more benchmarking (and also do 1000 runs instead of just 1):
I used this build of PHP 7:
PHP 7.0.14 (cli) (built: Jan 18 2017 19:13:23) ( NTS ) Copyright (c) 1997-2016 The PHP Group Zend Engine v3.0.0, Copyright (c) 1998-2016 Zend Technologies with Zend OPcache v7.0.14, Copyright (c) 1999-2016, by Zend Technologies
And my results were:
serialize() (wins: 999) was roughly 10.98% faster than json_encode()
unserialize() (wins: 987) was roughly 33.26% faster than json_decode()
unserialize() (wins: 987) was roughly 48.35% faster than array json_decode()
So clearly, serialize/unserialize is the fastest method, while json_encode/decode is the most portable.
If you consider a scenario where you read/write serialized data 10x or more often than you send it to or receive it from a non-PHP system, you are STILL better off, in terms of time, using serialize/unserialize and calling json_encode or json_decode only at the boundary.
Div

Just an FYI: if you want to serialize your data to something easy to read and understand like JSON, but with more compression and higher performance, you should check out MessagePack.
Hutch

JSON is better if you want to back up data and restore it on a different machine or via FTP.

For example with serialize: if you store data on a Windows server, download it via FTP and restore it on a Linux one, it may no longer work due to character re-encoding. serialize stores the length of the strings, and in the Unicode > UTF-8 transcoding some 1-byte characters could become 2 bytes long, making the algorithm crash.
THX for this benchmark code. My results on the array I use for configuration are as follows:

JSON encoded in 0.0031511783599854 seconds
PHP serialized in 0.0037961006164551 seconds
json_encode() was roughly 20.47% faster than serialize()

JSON encoded in 0.0070841312408447 seconds
PHP serialized in 0.0035839080810547 seconds
unserialize() was roughly 97.66% faster than json_encode()

So - test it on your own data.
To sum up what people say here: json_decode/json_encode seems faster than serialize/unserialize, BUT if you var_dump, the type of the serialized object is changed. If for some reason you want to keep the type, go with serialize! (Try for example stdClass vs array.)
Comparing serialize/unserialize with json_encode/json_decode: as you can see, json_encode/decode converts everything to stdClass, so object info is lost. Decide based on your needs, especially if it is not only arrays.
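A short sketch of that type-loss point (the class name here is arbitrary):

```php
<?php
// serialize keeps the original class; json_encode flattens it to stdClass.
class Point { public $x = 1; public $y = 2; }
$p = new Point();

$viaSerialize = unserialize(serialize($p));
$viaJson      = json_decode(json_encode($p));

var_dump(get_class($viaSerialize)); // string(5) "Point"
var_dump(get_class($viaJson));      // string(8) "stdClass"
```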
I would suggest you use Super Cache, a file cache mechanism which doesn't use json_encode or serialize. It is simple to use and really fast compared to other PHP cache mechanisms.