
/eternalarchive/ - The Eternal Archive Project

Fighting the TPP

Submit a list of your archived shit regularly in the meta thread

File: 1448262256746.png (4.44 MB, 2072x8927, 2072:8927, SqueezeTest.png)

6c6d7f No.263

Ok anons, we've got a board with willing anons and a whole load of archived shit.

Now we begin to share through i2p.

In order to reduce load on the i2p network and make torrenting more feasible, I suggest we select one compression program to use across all our torrents.

It would suck balls if we had to fumble between multiple compression programs in order to torrent properly, so let's just select ONE, popularize it, and stick with it.

To give you guys a glimpse of what extreme compression software is capable of, some fags managed to compress 395GB into 667MB.

Source: http://www.xtremesystems.org/forums/archive/index.php?t-242212.html

Ok, here's some resources to get us started comparing some compressors:

http://www.maximumcompression.com/data/summary_mf.php

http://www.squeezechart.com/index.html - Use the excel spreadsheet

http://compressionratings.com/ratings.html

What I've found interesting so far:

MCM 0.83 - Scored highest of the free (as in freedom) compressors in the squeeze chart tests

www.libbsc.com with www.github.com/vaibhav-y/bsc-gui - Very fast compression with NVIDIA GPU acceleration, nice compression rates, has easy to use GUI.

What we're looking for:

* Must be free (as in freedom)

* Very good compression rate

* GUI

* Good decoding times (decoding is generally faster than encoding)

* Result must be bitwise identical

* Shell integration? (optional, just ricing)

To me it seems libbsc ticks all these boxes (except shell integration), plus it has NVIDIA GPU acceleration (no other compressor does). Sorry AMD fags, maybe give in like a good goy next time?

No PAQ8 or WinRK, fags. That shit is too slow to be practical.

Data compression thread GO!

e01543 No.269

Just use xz
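
For anyone unfamiliar, typical invocations look like this (a minimal sketch; -9e is xz's slowest/strongest preset):

xz -9e big.tar                 # compresses in place, produces big.tar.xz
tar -cJf out.tar.xz somedir/   # tar a directory and run it through xz in one step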


6c6d7f No.274

>>269

XZ is mediocre at compressing anything that isn't text, just like other general compressors.

We're talking serious compression applied to MASSIVE files (~1TB); these need proper compression software or else i2p will suffer.

Soon bittorrent activity will increase greatly on i2p, and every GB counts.

ALSO

Precomp + FreeARC on ultra (based on data from squeezetest) looks great!
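
The pipeline would look roughly like this (flags from memory, so check each tool's docs before trusting them):

precomp big.dat                  # unpacks embedded compressed streams into big.pcf
arc a -mx archive.arc big.pcf    # FreeArc on maximum settings
arc x archive.arc                # extract
precomp -r big.pcf               # restores the bitwise-identical original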


93b8b5 No.276

File: 1448266506615.png (54.49 KB, 651x396, 217:132, 10gb.png)

http://mattmahoney.net/dc/zpaq.html

It's faster than rar and zip, and its output is around 80% the size of zip's. It's FOSS, and even though it lacks a GUI, it has encryption, rollback, and incremental updates.
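
Basic usage for anyone who wants to try it (a sketch; -m5 is the strongest/slowest method, -key turns on encryption):

zpaq add backup.zpaq somedir/ -m5 -key hunter2   # re-running add is incremental, only changes get stored
zpaq list backup.zpaq                            # show stored versions
zpaq extract backup.zpaq -until 3                # rollback: extract the archive as of update 3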


93b8b5 No.278

>>276

http://mattmahoney.net/dc/10gb.html

See here for more details.

http://mattmahoney.net/dc/text.html

http://mattmahoney.net/dc/silesia.html

As long as we don't use paq8, durilca or cmix, zpaq is the right choice.

About compression:

http://mattmahoney.net/dc/dce.html


6c6d7f No.279

>>276

>>278

I'm basing these comparisons off squeezechart

Precomp + FreeARC:

+ Very fast decomp and fast compress speeds

+ Has GUI

- Performed slightly worse at image compression

ZPAQ:

- No GUI

+ Slightly better image compression than Precomp+FreeARC

- Decompression takes as long as compression

MCM is a straight upgrade from ZPAQ, plus it has active support. This is a quick reply though, so I haven't checked if it has any quirks.

Good work anons, we're getting somewhere.


6c6d7f No.280

Ok, did a bit more searching, seems like MCM doesn't support encryption, rollback or incrementation.

Doesn't have any GUI implementations either.

Looks like it's back to ZPAQ and Precomp + FreeARC


108619 No.282

>>263

>some fags managed to compress 395GB into 667MB.

No, they managed to compress 3.95 GB into 667 MB. Still impressive, but more plausible.


6c6d7f No.283

>>282

Ah I hate it when they use commas instead of decimal points.


6ca5c8 No.291

>>279

Why not take the source code and build a GUI to go with it? It will be great!

Just ask /tech/!


e01543 No.293

>>291

First thing we'll need is a logo

t. /tech/


6ca5c8 No.294

>>293

https://github.com/thometal/zpaqgui

It's in java, so… /tech/ should do better than this


6ca5c8 No.295

>>294

http://encode.ru/threads/2015-ZPAQ-GUI

The author supports this tho…


360012 No.299

>What we're looking for:

>GUI

Why? I think it's worth picking a GUI-less program if it's better; using a command or script isn't that hard.

I think we should be searching for

>FOSS

>Good compression rate

>Result bitwise equal

>Good decoding time

In this order. Fuck GUIs.

>>293

>logo wars

>again

Pls no.

t. zirconium


c76b19 No.300

>>299

Commands/scripts aren't that hard, but having a good GUI makes it easier, saves everyone time, and extends the userbase. Time efficiency is vital to these operations.


e01543 No.301

>>300

Incorrect. Scripts are faster than your fingers can ever dream to be. They are far more efficient. How long would it take you to download and sort an entire youtube channel into playlist folders? Some anon in the IRC built a script to do that as fast as his Internet will let him.

Scripts/commands are faster and better than GUIs. GUIs are only good for lazy normalfags, people who are incapable of even getting onto i2p in the first place.
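
Probably something along these lines with youtube-dl (a guess at the approach, not the anon's actual script; CHANNEL is a placeholder):

youtube-dl -o "%(playlist)s/%(playlist_index)s - %(title)s.%(ext)s" "https://www.youtube.com/user/CHANNEL/playlists"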


c76b19 No.303

>>301

Yeah, no shit they're faster once you've made the initial time investment, but what if you haven't? The more people we have the more effective this is, and that means at least having a GUI so people can pitch in without the spin-up time of getting familiar with shit.


360012 No.316

>>303

I understand what you mean, but I don't think it should be our top priority. Like >>301 said, scripts are fast and easy; if you are even trying to get in on this, running one shouldn't be hard at all.

We should try to find something good and FOSS. If we need more people, we can just wrap it in a script or some GUI in Python.


2eaae0 No.325

lrzip is the best compression software I've found so far. It gets better as file size increases, and the more ram you throw at it. It gets the best results using the -U flag.

It uses various compression schemes (including zpaq) alongside each other in order to reduce the file size.

Here is the readme. It goes more in depth on how it works.

http://ck.kolivas.org/apps/lrzip/README

Here are the benchmarks.

http://ck.kolivas.org/apps/lrzip/README.benchmarks

Note that it will not archive directories, so you will need to tar them up first. Or you can use lrztar (You cannot use the -U flag with this).
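
Quick start for the lazy (a minimal sketch; see the README above for the rest):

lrzip -U big.tar      # unlimited compression window, produces big.tar.lrz
lrztar somedir/       # tar + lrzip a directory in one step (no -U here)
lrunzip big.tar.lrz   # decompress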


6c6d7f No.330

>>299

>>300

>>303

>>316

>extends the userbase

I think this pretty much sums up why we should support something with a GUI.

We may be able to use scripts to good effect, but ease of use is pretty much essential for obtaining any widespread usage.

Also normalfags can't even into batch processing with CLI.

>Scripts are faster than your fingers can ever dream to be

Not really; a GUI is just a front end to the exact same program a script would use.


a367da No.373

>>330

Normalfags can't i2p. We can always take the raw materials that normalfags send us and have dedicated people doing the compression and uploading to make sure it is done right. The people doing the compression would know how to use the command line. Problem solved. Idiot-proofing can come during the distribution process (later). Right now we need FOSS, fast, good compression v. quality loss, not idiot-proofing. We don't have time for them, we need to decide on a method fast and make it standard.


fd4484 No.406

>>325

Comparing this to ZPAQ, how good is it? Does it compress more densely? Or does it compress fast while being inefficient?


754ed1 No.408

nanozip and zpaq are the only options that ain't cmix, durilca or paq8…

mcm and zcm can be considered… but I have my doubts.


fc701f No.419

Question:

If I am for example uploading an entire series, should I compress the whole thing into one file, or compress each file separately so they can be downloaded individually?


fc701f No.423

>>325

I tried to use this and it didn't compress at all


000000 No.469

>>373

And anybody outside of a major metropolitan area, regardless of skill, since uplink demand is so high.


3a7d36 No.470

>>263

>some fags managed to compress 395GB into 667MB

That's 3.95 GB, not 395 GB.


2eaae0 No.514

>>406

Compresses almost as densely, but much, much faster. If you really want to use zpaq though, it has that built in with the -z flag.

I should also mention that it has archive recovery available (lziprecover), which works even better if you have multiple damaged copies to compare.


676c29 No.549

Whatever we decide on, can everyone agree on this?

- Must be fully free, not partially free

- Must work on Linux and Windows (and hopefully BSD). I know Windows is gross, but if it doesn't work on Windows a lot of people won't contribute.

- Must work from the shell (GUI is fine, but it has to be scriptable)

- Must generate checksums for data integrity (this is important; see the example below)
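
Checksums are a one-liner with coreutils, e.g. (a sketch):

sha256sum *.lrz > SHA256SUMS   # generate
sha256sum -c SHA256SUMS        # verify after download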


676c29 No.550

>>514

If someone with 32GB of memory compresses a file using all their memory to increase the compression ratio, will someone with only 1-2GB of memory be able to decompress it?


2eaae0 No.559

>>550

Yes, I have compressed files on my main machine (32GB) and successfully decompressed them with my laptop (3GB).


0f4236 No.585

>>549

This.

But also

- Must be open-source


e71b91 No.593

>>585

I think when we say free here, we all mean free as in freedom.


0f4236 No.594

>>593

I thought that the

>- Must be fully free, not partially free

might confuse some anons, with some proprietary programs being free* and all.


778935 No.652

Using one all-purpose program to compress everything is a shit idea. It will make migration to another much harder if the necessity arises. And more importantly, the compression rates for images and audio in OP's pic are subpar compared to actual, proper codecs.

What we need is to decide on the codecs we will use. If packing multiple files into one is needed, the good old zip will do just fine; the overhead is negligible. Same goes for encryption - just wrap the whole thing in an encrypted container.

The volume of data also matters, not just compression rates: the uncompressed text of a book is about the same size as a jpeg scan of its cover, which is as big as two seconds of high-quality audio.


7f1eec No.663

>>419

I think both.

One compressed collection and one with each file compressed individually, so there are options.
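
Something like this covers both (a sketch using lrzip, since that's the current favorite):

lrztar series/                           # one archive of the whole series
for f in series/*; do lrzip "$f"; done   # plus each episode individually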


1e4640 No.668

>>325

For those wanting to use lrzip on pig-disgusting Windows but feeling a little overwhelmed by it, feel free to use this little guide I made. It isn't perfect, but it should help you along:

>https://ghostbin.com/paste/rspq4

It's a bit bare but it's only intended to get you started. The rest will be up to you.


4ee354 No.672

>>668

Mind if you write a guide for ZPAQ as well?


2f4a39 No.689

A GUI would be lovely, but a bit of typing never hurt anyone.

>>668

Thank you for your tut. You helped this windoze pleb. I'm trying out -Uzp 1 -L 9 on a 1 gig folder to see how well it works. Got a standard you use that you could share?


13ccbc No.718

From what I'm reading, there is a limit to how small you can compress a file (obviously), and most video torrents are already compressed for distribution.

Does that mean I can further compress my library of movies/media for greater storage potential or are these torrented media files pretty much already as small as they're gonna get?


2eaae0 No.721

>>718

They're as compressed as they're going to get. Unless you're working with raw files, video, audio, and images are already compressed by their own encodings, designed specifically for that type of media. Some archivers like lrzip and zpaq recognize this and skip compressing them by default.


e1f354 No.723

>>668

There is lrztar (mentioned in >>325) that is supposed to take folders as input.


1e4640 No.730

>>672

You can download a Windows dll from the zpaq website. Doesn't get much easier than that…

>>689

I don't have a standard, really, but watch out using lrzip on files >= ~10x your physical memory. I am finding that with ~16GB of physical memory anything over 100GB is reallllly slow. Though mine could be an isolated incident.

If you mean do I have a "default mode" for compression/decompression operations…why, yes, I do!

>https://ghostbin.com/paste/oosrs

Frankly, I am a little confused by what you mean….


1e4640 No.748

>>730

I optimized my compression function somewhat, if anyone is interested:

>https://ghostbin.com/paste/uo32a


5f181b No.792

>>263

>www.libbsc.com with www.github.com/vaibhav-y/bsc-gui - Very fast compression with NVIDIA GPU acceleration, nice compression rates, has easy to use GUI.

Only works with single files. The GUI is great except it DOESN'T FUCKING WORK! I downloaded Visual Studio Community 2015 just to build it (why the fuck doesn't it have binaries included?) and it builds but the compress button does nothing. It might work with earlier versions of VS but fuck it, no support for directories is a deal breaker.


821968 No.829

>>263

Actually OP, that's 3.95 GB. Check the number of bytes he posted alongside the "3,95GB".

That's still a nice result.



