Go to the first, previous, next, last section, table of contents.
Well, that’s all. The big, big advantage of this scheme is that there is almost nothing to change in the file system. We just have to hack the read and write routines a little, so that they compress or decompress the data whenever the block being handled belongs to a compressed cluster. This is just a matter of a few lines. [pjm: I think Antoine’s understating things here a little.]
See `linux/fs/ext2/Readme.e2compr' in the kernel source tree for a more detailed description and a map of the source code.
Go to the first, previous, next, last section, table of contents.