Initial captcha and mirror verification : DarkMatterMarket | Torhoo darknet markets
When I'm solving the initial captcha, I sometimes see 3 "blocks" of images that I have to arrange, and other times there are 5. It seems that when there are 3 blocks, the Dark Matter site I reach after logging in has an option at the top that lets me verify my mirror. When there are 5 blocks, the site I enter after logging in does not have that option. Is the latter a phishing site? Today I tried probably 10 different links from the Dread forum and all of them led me to the 5-block captcha, and once I get to the main site I don't have the option to verify my mirror.
One possibility is to create an encrypted 7Z archive, with a simple script, stored in a convenient place, that way no need to navigate to the specific directory each time (plus it should be much more compact).
notes:
* A slightly more elaborate script could automatically set the archive's name with the current time, I don't know how to do that on Linux.
* A long password made of simple words is better than a shorter password made of gibberish, see: xkcd.com/936/
* Other options could improve the compression ratio, for instance -mx=7 -md=128m to set the compression strength to "7" and the dictionary size to 128MB.
* -mhe enables header encryption (the filenames cannot be displayed until the correct password is entered).
* -mtc=on and -mta=on enable the storage of creation and access timestamps, respectively; by default only the modification timestamps are stored.
* -ms=off disables "solid" compression. By default 7-Zip compresses in "solid" mode, which can considerably improve the compression ratio when many files have similar contents, but it makes the archive very vulnerable to corruption: a single corrupted byte renders the whole contents located beyond that byte unreadable, whereas in "non-solid" mode only the file containing the corrupted byte is lost. So if the difference in compressed size is less than ~10% (with vs. without -ms=off), or if you generally favor robustness over compactness when it comes to backups, I would advise disabling "solid" compression. For "solid" compression to be optimal, the dictionary size should be set to a value slightly higher than the total size of the uncompressed files (for instance, if the total size is 180MB, use the next available value up, -md=192m), but if set too high the compression can fail because the available RAM is insufficient.
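Putting the notes above together, here is a minimal sketch of such a script, including a timestamped archive name via the `date` command (which answers the "current time" note). The directory paths are illustrative assumptions, not fixed by the original post; adjust them and the -m switches to taste.

```shell
#!/bin/sh
# Sketch of a backup script using the 7z options discussed above.
# BACKUP_DIR and SOURCE_DIR are assumed paths; change them to suit.
BACKUP_DIR="$HOME/backups"
SOURCE_DIR="$HOME/sensitive-files"

# Timestamp for the archive name, e.g. 2024-05-01_14-30-59
STAMP=$(date +%Y-%m-%d_%H-%M-%S)
ARCHIVE="$BACKUP_DIR/backup_$STAMP.7z"

mkdir -p "$BACKUP_DIR"

# Only run if 7z is installed (package p7zip-full on Debian/Ubuntu).
# -p          prompt for the password interactively
# -mhe=on     encrypt headers so filenames are hidden too
# -mx=7 -md=128m  compression strength 7, 128MB dictionary
# -ms=off     non-solid mode, for robustness against corruption
# -mtc=on -mta=on  store creation and access timestamps
if command -v 7z >/dev/null 2>&1; then
    7z a -p -mhe=on -mx=7 -md=128m -ms=off -mtc=on -mta=on \
        "$ARCHIVE" "$SOURCE_DIR"
fi
```

Since the script lives in one convenient place, you can invoke it from anywhere without navigating to the source directory first.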
What are the other sensitive files?
According to the reference, other sensitive files include the following data: