
Python's zlib decompresses data, but Pako (JavaScript zlib) fails

I'm trying to inflate some zlib-compressed data (from a Ren'Py Archive 3 file, for those wondering) with JavaScript, but I can't seem to reproduce the Python behavior in Node.js.

This Python script works:

import zlib

# Data written to a file from a different Python script, for demo purposes.
# This would be a value in memory in JS.
with open("py", "rb") as f:
    data = f.read()

# Works: wbits=0 auto-detects the window size from the zlib header
print(zlib.decompress(data, 0))

While this Node.js script:

const fs = require('fs');
const pako = require('pako');

const data = fs.readFileSync('py', 'binary');

// Doesn't work
console.log(
    pako.inflateRaw(data)
);

Throws this error:

C:\Users\gunne\Documents\Programming\node.js\rpa-extractor\node_modules\pako\lib\inflate.js:384
  if (inflator.err) throw inflator.msg || msg[inflator.err];
                    ^
invalid stored block lengths
(Use `node --trace-uncaught ...` to show where the exception was thrown)

As per the Python zlib.decompress documentation, a wbits argument (the second parameter) of 0 "automatically [determines] the window size from the zlib header" — something that pako's inflateRaw seemingly doesn't do.

Am I doing something incorrectly? How would I achieve the same output as in Python using Node.js?

Edit:

Here's the data (replace the readFileSync line with this):

const data = Buffer.from(
    // Output of Buffer.from(data).toString('hex');
    '789c6596596c545518c74bf9440a2d2d2d65ab802c022e33a52d2d2d8bcc9d61f3b05791cb52cae9f4f6cea577f699da2295cd04f97c1422122a8a24407880861009242ac6aa0891a8c428464d243184300f3e184c88e299f39de18cf8342fbffbfff6ff995d85bdf14133cd910505056d76b83a64f144aacde2296fd4b65be28508a7ff5e68b05fe27effba827df1c1dc1ca6c89a598400c28cc1c3fceccde3ad12798c9bc50249767457275d1e236808c29fa1f70df67184741ee766a9d2a96fb543d1a48a3714e1babf37c0beffd2945c11715931db7552c1500d71c31096941c31d8995ad21baef30a2ba418e146bf48fd5cf750899470b33c8744234e27f7589176424720dc74d1609e932b245aaa4bb07aac244165084b4fbf6bb09b5d147224374728289c0cf284455839c2de8b1f18ecdacf8455e8ccec5a424621a42be719acfc3c2195dc2cca359588d108a2e17e96ca389218a3890622c6221c9e7f2ec00eeedf2c89713a4c834a783cc2ce13537dacbbc096489516994dc413084d3f5ef4b31bc14e494cd0443d111311ba4f6d0db082f55b2431296fb829de46d0930813168ae19e9ce093d0e4bc92554153105ade9e6bb0c59f50c953b959f670fe49de134d38044e43f8f6adeb01f6d97c0af8143747a9804e2465253aac841509aa5e4f47283ad067b03fbca43a839b250a0e2678b093a899087dbf8bc1ad1f42d4d3ba4a359067107ab74df3b3998b68859fa5adb3d34e7556ab8d73e29e4378107cc760331c52f2e842e7d8847811cacedcf2b1d0aeb512a9a6fcb348b2b5d371dd5671606e0fc1b3104af0b0c1265551e36ab8393ad75d82ffb3f4b508638f9e30981ba5f0755c9e6d16ef49271c4ff68b2c199fdd12af47f869d97183d5849748b681b63f5752d272ada0bab939a2aa0151d5d942926de466456eb1e5a1c4a2e9dca53421ac1c74cc60b7ee05243b578f27e8469356ca237f089e8770a05f086faf22e1f9dc1cf7bf1c5489f4c502843d28be58aa1afcbc1ed51c2216227c650df8d8fd7d64103e3d82d96a0406c2d607c1005bec528e7ebdb31d9c3ad4120f20dc6d3d6ab075dd1469912e5a15128d591162178be52d150b94a9247689ceaa8e88a508f72fdc120b541794c4329d5593caea0581c4cff8d88e95cb24c23452a790e5085ddedd7eb67c804964051d4936a718b7ad0ed7516eba1261cacdf70c766c3225b44a27a4e6b41ae11bb7db60573e8d49628dbe8c64f8e165ac45d857290ef7aa4d3acd7926a2727a514c6468b38f9d9ade2e91977433ed88a3a2ad4318dc2eeef05286745ee60fdf13e9ed612b922672bd70dae2468315d61992341ff577d5d00d08bc5ff8fb6af5ee6cd4153611b10961fbed433e567b758d24363faaa46ebb05c1b74728352ba52ddab66dc7b65db5abad08de0fc53e0ca85ddd4a181954f6ba08e30856f9ed002bbad626b136fd8ed5ccf2682e88d028bc9b1dfa8e6cac9d9bc355d4445aad958550f19198e218974276e8eed7aaeedb087fb997fdecca025ae5906eac78bc3c5d512767850e4238e335589fbd4a92dbf4c0b34baf9ad1292eb25254f9c5af14d2a5b428a462c208a3ee7eee676523c80c23bac460c26a7752ea798922b041e77dacf15481e462f94f7ab6134a2f8e70f9c81b0136e437daa0848e59978b994428eedfef675bee442493d275d6b746782a99ee548d4d234cec6b08307e76a324bb88cc794a28da6525887c45bca1a5c24be6a90befd64eacde7ff182259407f6206cdc74d060ed196af476bd6f8d44bc2afe9964eef9d80f173a24b123ef5654c45e84d7c757f8d8b4af9b25f25a1e12256427c23f99948f5dba43f6b52bff1da4b408dc8d10bb5215609b466f90e01e9ef6fe0b4e4d2339'
    , 'hex'
).toString('binary');

Edit #2:

With Mark Adler's help, the solution for me was to downgrade to pako version 1.0.11, which handles this data via pako.inflate.

Answers (2):

Use inflate instead of inflateRaw.

Comments:
- I tried this, but I get an unknown compression method error.
- Post in your question a hex dump of data from const data = fs.readFileSync('py', 'binary');.
- Alright, it should be on the question now.
- That's a perfectly fine zlib stream. I tried feeding that data to pako.inflate() in a node.js playground (codesandbox.io), and it worked fine. No errors. Decompressed the data correctly.
- Weird. It seems to work on pako version 1.0.11, but not the latest one. Thank you!

pako since v2.x has dropped support for binary strings and plain Arrays. Compressed data must now be provided as a Uint8Array instead. (source)

Examples of this can be seen here and here.