Optimize creation of empty arrays in json_decode
Use the shared empty array from ZVAL_EMPTY_ARRAY
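A minimal sketch of the idea (this is not the actual diff; the helper name is made up, while `ZVAL_EMPTY_ARRAY()`, `zval_ptr_dtor()`, and `zend_hash_num_elements()` are existing Zend APIs):

```c
/* Sketch only, not the actual ext/json change: once the parser knows a decoded
 * array (or a "{}" decoded with $assoc=true) ended up empty, it can drop the
 * freshly allocated HashTable and point the zval at the shared, immutable
 * empty array instead of keeping a private zero-element table around. */
#include "php.h"

static void maybe_use_shared_empty_array(zval *zv) /* hypothetical helper */
{
	if (Z_TYPE_P(zv) == IS_ARRAY && zend_hash_num_elements(Z_ARRVAL_P(zv)) == 0) {
		zval_ptr_dtor(zv);    /* release the allocated, empty HashTable */
		ZVAL_EMPTY_ARRAY(zv); /* reuse the interned zend_empty_array, which is
		                         never refcounted or freed */
	}
}
```

Because `zend_empty_array` is immutable and never refcounted, a later write to such a value goes through the engine's usual copy-on-write separation, so userland behaviour should not change.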
For code that created 10 arrays of 100000 empty arrays each
(the results are the same for `{}` decoded with `$assoc=true`):
- This is a worst-case comparison, but I'd expect zero-length arrays to be fairly
common in real-world json_decode input.
- The parser implementation uses function pointers so that third-party
extension developers can reuse the JSON parser for their own
data structures, etc. (I think).
This PR is meant to let those third-party extensions continue working
without changes (see the sketch below).
Before this patch: In 0.126 seconds: added 97.99 MB
After this patch: In 0.096 seconds: added 41.99 MB
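As a rough illustration of that function-pointer extension point (the struct and callback names below are made up for the sketch, not the real fields of the `php_json_parser_methods` struct in ext/json/php_json_parser.h):

```c
/* Illustrative only: the parser dispatches through a table of function
 * pointers, so the default callbacks can adopt the shared-empty-array trick
 * internally without affecting extensions that install their own callbacks. */
#include "php.h"

typedef struct example_json_methods {                  /* hypothetical */
	void (*array_create)(zval *zv);                    /* hypothetical */
	void (*array_append)(zval *array, zval *element);  /* hypothetical */
} example_json_methods;

/* A third-party extension keeps supplying its own constructor... */
static void my_ext_array_create(zval *zv)
{
	array_init(zv); /* this behaviour is untouched by the PR */
}

static const example_json_methods my_ext_methods = { my_ext_array_create, NULL };

/* ...and the parser keeps dispatching through the table it was given. */
static void example_parse_empty_array(zval *zv, const example_json_methods *m)
{
	m->array_create(zv);
	/* "[]" has no elements, so array_append is never invoked here */
}
```

In other words, the optimization can live entirely in the default callbacks that `json_decode()` itself uses.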
```php
<?php
$json = '[' . str_repeat('[],', 100000) . "null]";
$start_memory = memory_get_usage();
$start_time = microtime(true);
$result = [];
for ($i = 0; $i < 10; $i++) {
$result[] = json_decode($json);
}
$end_memory = memory_get_usage();
$end_time = microtime(true);
// Before this patch: In 0.126 seconds: added 97.99 MB
// After this patch: In 0.096 seconds: added 41.99 MB
printf("In %.3f seconds: added %.2f MiB\n", $end_time - $start_time, ($end_memory - $start_memory)/
1000000);
// For objects ({} decoded as associative arrays via $assoc = true)
$json = '[' . str_repeat('{},', 100000) . "null]";
$start_memory = memory_get_usage();
$start_time = microtime(true);
for ($i = 0; $i < 10; $i++) {
$result[] = json_decode($json, true);
}
$end_memory = memory_get_usage();
$end_time = microtime(true);
// Before this patch: In 0.126 seconds: added 97.99 MB
// After this patch: In 0.096 seconds: added 41.99 MB
printf("In %.3f seconds: added %.2f MiB (objects decoded as arrays) \n", $end_time - $start_time, ($end_memory - $start_memory)/
1000000);
```
Closes GH-4861.