Go to the first, previous, next, last section, table of contents.
- I understand that a new release of zlib has been made, which corrects a bug that in rare circumstances results in incorrect compression. The same bug probably exists in e2compr, so try to integrate this change into e2compr.
- Make the algorithms into kernel modules. This would reduce physical memory usage, and may have other advantages, e.g. adding a new algorithm without rebooting.
- Build e2decompress with Checker, ElectricFence or a similar memory-checking tool, and try to get it (and hence the kernel, which shares the compression algorithm source code) to handle erroneous data better (i.e. without crashing). This isn't all that important because we store a checksum on the compressed data, which should usually protect us from trying to decompress corrupted data.
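The checksum safeguard mentioned in the last item can be sketched in user space. This is only an illustration of the idea (using Python's zlib module, not the actual e2compr kernel code): a checksum is stored alongside the compressed data and verified before any decompression is attempted, so corrupted input never reaches the decompressor.

```python
import zlib

def compress_with_checksum(data: bytes) -> bytes:
    # Store a CRC-32 of the compressed stream in a 4-byte header,
    # analogous to e2compr keeping a checksum on the compressed data.
    comp = zlib.compress(data)
    return zlib.crc32(comp).to_bytes(4, "big") + comp

def decompress_checked(blob: bytes) -> bytes:
    stored = int.from_bytes(blob[:4], "big")
    comp = blob[4:]
    # Verify the checksum *before* handing the data to the decompressor,
    # so corrupted data is rejected rather than decompressed.
    if zlib.crc32(comp) != stored:
        raise ValueError("checksum mismatch: refusing to decompress")
    return zlib.decompress(comp)

blob = compress_with_checksum(b"hello, e2compr" * 100)
assert decompress_checked(blob) == b"hello, e2compr" * 100

# Flip one byte in the compressed payload: the check catches it.
corrupt = blob[:10] + bytes([blob[10] ^ 0xFF]) + blob[11:]
try:
    decompress_checked(corrupt)
except ValueError:
    pass  # corruption detected, no crash in the decompressor
```

As the item above notes, this usually protects against decompressing garbage, but it is no substitute for making the decompressor itself robust, since a checksum can only detect corruption, not malformed-but-checksummed input.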
Go to the first, previous, next, last section, table of contents.