/hydrus/ - Hydrus Network

Bug reports, feature requests, and other discussion for the hydrus network.

New user? Start here ---> http://hydrusnetwork.github.io/hydrus/

Currently prioritising: simple IPFS plugin



 No.2103

windows

zip: https://github.com/hydrusnetwork/hydrus/releases/download/v194/Hydrus.Network.194.-.Windows.-.Extract.only.zip

exe: https://github.com/hydrusnetwork/hydrus/releases/download/v194/Hydrus.Network.194.-.Windows.-.Installer.exe

os x

app: https://github.com/hydrusnetwork/hydrus/releases/download/v194/Hydrus.Network.194.-.OS.X.-.App.dmg

tar.gz: https://github.com/hydrusnetwork/hydrus/releases/download/v194/Hydrus.Network.194.-.OS.X.-.Extract.only.tar.gz

linux

tar.gz: https://github.com/hydrusnetwork/hydrus/releases/download/v194/Hydrus.Network.194.-.Linux.-.Executable.tar.gz

source

tar.gz: https://github.com/hydrusnetwork/hydrus/archive/v194.tar.gz

I had a good week. I mostly worked on IPFS, and successfully got pin and unpin working, and otherwise did a bit of cleaning up and added a new way to add tags on import.

ipfs pin/unpin

Now that I know this all works, I will write up a help page for IPFS for next week, including what it even is and where to get it, so if you don't understand any of this, please hang on a little bit.

If you have an IPFS service added, right-clicking on thumbnails will give you the option to remote services->pin to ipfs. The workflow works just like hydrus repositories, so your 'to be pinned' files will gain a little icon and the pending menu will count them. Committing from the pending menu will upload them to your IPFS daemon when you are ready. Files that are pinned to IPFS will retain a small correct-colour IPFS icon. Unpinning works similarly.
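
For anyone curious what this does under the hood: the client just talks to your local IPFS daemon over its HTTP API. This is only a rough python sketch of that idea, not hydrus's actual code, and it assumes a default daemon listening on 127.0.0.1:5001:

import requests

IPFS_API = 'http://127.0.0.1:5001/api/v0'  # assumed default daemon API address

def pin_file(path):
    # 'add' sends the file to the local daemon, which stores and pins it and returns its multihash
    with open(path, 'rb') as f:
        response = requests.post(IPFS_API + '/add', files={'file': f})
    response.raise_for_status()
    return response.json()['Hash']

def unpin(multihash):
    # 'pin/rm' removes the pin; the daemon's garbage collector can then reclaim the blocks
    requests.post(IPFS_API + '/pin/rm', params={'arg': multihash}).raise_for_status()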

Furthermore, right-clicking on pinned files will let you copy their IPFS multihash, so you can share them wherever you like. Here are some I did this week:

QmP6BNvWfkNf74bY3q1ohtDZ9gAmss4LAjuFhqpDPQNm1S

QmXH3NYyqyAnNgf9gWppufJp3kCHwJcJeqk2w3d9zQQF9u

QmaQNZLHQg3tXT2R3gCSKPDJ4Bs9FjmWnHbmNQbVS157ge

You can try to download and import those through the client, or you can check them through the browser-compatible gateway the IPFS people run at:

https://ipfs.io/ipfs/[multihash]

I haven't set my dev machine's IPFS daemon to run all the time, but I presume the files will live on despite that, at least somewhat, precisely because of how IPFS works. I have read all three files through that gateway, so I think they get temporarily mirrored or something. Let me know if you can get them, and let me know how your own IPFS-hydrus stuff goes!
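
If you would rather test outside the client first, fetching a multihash through the public gateway is just an ordinary HTTP download. A quick python sketch using one of the multihashes above:

import requests

multihash = 'QmP6BNvWfkNf74bY3q1ohtDZ9gAmss4LAjuFhqpDPQNm1S'
response = requests.get('https://ipfs.io/ipfs/' + multihash, timeout=120)
response.raise_for_status()

# save under the multihash with no extension; the client works out filetype from content on import
with open(multihash, 'wb') as f:
    f.write(response.content)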

parse tags from neighbouring .txt files

For a while now, when you export files from the thumbnail right-click menu via share->export->files, you have been able to export the files' tags into neighbouring [exported_filename].txt files, as newline-separated lists. I have now added a checkbox to the import files 'path tagging' dialog (the one with all the regex stuff) that will try to do the reverse, loading any existing neighbouring .txt files and parsing them for newline-separated tags.

If you have been looking for a way to automate tag import from an unusual source with a non-python script, have a play with this!
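
For example, anything that can write a text file will do. Here is a minimal python sketch of the sidecar format; I am assuming the .txt name is the media filename with .txt appended, the same shape the export dialog produces, so check one of your own exports if yours looks different:

# hypothetical files and tags, just to show the newline-separated format
tags_by_file = {
    'C:/to_import/1234.jpg': ['blue sky', 'creator:someone', 'series:whatever'],
    'C:/to_import/5678.png': ['monochrome', 'sketch'],
}

for path, tags in tags_by_file.items():
    # one tag per line in a neighbouring .txt file next to the media file
    with open(path + '.txt', 'w') as f:
        f.write('\n'.join(tags))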

full list

- ipfs pins and unpins can now be queued up like file repository pending and petitioned, through the regular thumbnail right-click menu, which also reports some/all ipfs pinned selection status

- this ipfs action queue is similarly summarised and committed at the normal service 'pending' menu

- ipfs's 'pinned', 'to pin', and 'to unpin' statuses are displayed on thumbnails with ipfs-specific icons

- you can copy the focussed file's ipfs multihash or all the selected files' ipfs multihashes from the thumbnail menu's share->copy->ipfs multihash

- added a .txt tag parser to the 'path tagging' import dialog–it will parse the same sort of txt files the export dialog produces

- the client's new 'requests' network code is harmonised, generally improved, and now produces hydrus-compatible exceptions

- updated help re the local server and boorus now defaulting to off

- db can now remember service-specific filenames (e.g. ipfs multihashes)

- cleaned up some overly complicated and confused thumbnail menu code

- the pending menu now specifies what it is about to do more plainly

next week

I want to support IPFS directory sharing, so you'll be able to post 'Here's my 400-file blah collection: [single_multihash]' to a thread, and also be able to efficiently download such links using hydrus. It needs a bit more research on my end, thinking about what workflows would be best, and then experimenting with the IPFS API to figure out what is actually doable. That won't happen in one week, because as well as the help, I also want to add 'system:pinned to ipfs' search support and a better review services panel for IPFS.

I might also write some class outlines for the big client.db extraction.

 No.2106

So I finally started playing with ipfs and huh, it's cooler than I thought. Can't wait to share 400 files under one multihash through hydrus!

bug report though: for local files, my hydrus server no longer appears under

right click > remote services

So I cannot upload pictures to a hydrus server now.


 No.2107

>>2106

also ipfs isn't under system:file service so I cannot easily list what files I have pinned!


 No.2108

Might I suggest a feature where hydrus calculates the ipfs multihash of every file in the database? Or alternatively, uses the ipfs multihash in the database itself rather than whatever hash you use now?

I'm the guy who had a bunch of files disappear; the hashes are still in hydrus' database file, it's just the files that are missing. If they were ipfs hashes, you could implement a "try to recover missing files from ipfs" feature.


 No.2109

>>2103

Hey, hate to bother you with my request again, especially with how busy you seem to have been lately, but have you had any time to look into adding a source field?

I understand there are a lot more important things that should be done first, but it'd be nice to know it hasn't been forgotten.


 No.2110

can I ask for an option to have the ipfs multihash copied from the program with the generic gateway url prepended to it?

makes it easier on my side

also, a system tag to sort items that are pinned to an ipfs service; I assume system:file service is one that could be used

https://gateway.ipfs.io/ipfs/QmchWtr91Ht88MPDWvXUrUEX5jMsdxoK9T9oEjxTgM1pcT


 No.2111

File: 1456414460748.png (100.7 KB, 1366x768, 683:384, Hydrus.png)

Hooray for false positives -_-;

Just curious, how come hydrus doesn't have an internal update checker?


 No.2112

I was a bit skeptical about ipfs but it looks kinda cool. Makes it easy to share stuff, certainly.

Also, how far back in the roadmap are the subscription system improvements that were discussed?


 No.2113

Great work as usual! Looking forward to directory sharing.

For anyone wondering about IPFS, I found this article very useful for conceptually understanding how IPFS works.

https://medium.com/@ConsenSys/an-introduction-to-ipfs-9bba4860abd0


 No.2114

>>2111

>Just curious, how come hydrus doesn't have an internal update checker?

I believe it was because dev didn't want users to feel pushed into updating if they didn't want to. The only reason people would really need to update outside of new features would be a change in the networking, which doesn't happen often.

Currently we're on network version 17; I believe the last time it changed was when we changed hashes/how they're transferred.


 No.2115

>- added a .txt tag parser to the 'path tagging' import dialog–it will parse the same sort of txt files the export dialog produces

I could suck your digital dick right now.


 No.2116

Hi dev,

so, I've downloaded the PTR. Once it finished, the final db file was 4GB in size, and now everything is ridiculously unstable: sometimes it takes up to 3 minutes to initialize, and if I try to look around it usually freezes for a while or outright crashes. Is there a way to purge the repository from my db file?


 No.2120

File: 1456540228297.png (885.02 KB, 1100x551, 1100:551, e6747ae36716c89829961ddb32….png)

>>2106

Can confirm that regular file repos don't appear under remote services, but only when attempting to upload; removing files seems to work fine.

>>2116

Are you synced with hydrus_dev's actual tag repo, or did you import the tag archive into local tags? If the former, you can go to services->manage services->remote->tag repositories and remove the service there.


 No.2121

>>2111

Use an external package manager. No need to burden the dev with issues that already have solutions:

https://chocolatey.org/packages/hydrus-network


 No.2125

>update hydrus

>now refuses to open

client.exe pops up in task manager and then closes after 2-3 seconds

Should I not have used the installer version to update?


 No.2126

File: 1456555684466.png (33.75 KB, 757x448, 757:448, hydrus.png)

>>2125

Tried reinstalling; it told me a previous uninstall wasn't complete (or something to that effect) and I had to restart

>restart

>reinstall

works fine on initial loadup

>notice antivirus has blocked some hydrus files (pic related)

Set those to allow, restore those files, turn antivirus off just so it can't interfere further

>still fails to load

I do have a crash log though


 No.2128

File: 1456569205548.png (311.86 KB, 409x322, 409:322, 1402600448994.png)

>>2108

Chiming in

IPFS can hash a file without adding its content to the datastore, using the command

ipfs add -n *files*

I'm assuming there's a way to do the same in the API.
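
If the HTTP API mirrors the CLI flags like it usually does, something like this python sketch should do it against a local daemon (default 127.0.0.1:5001 API port; I haven't verified the exact parameter name):

import requests

def ipfs_only_hash(path):
    # ask the daemon to chunk and hash the file without storing it, like 'ipfs add -n'
    with open(path, 'rb') as f:
        response = requests.post('http://127.0.0.1:5001/api/v0/add',
                                 params={'only-hash': 'true'},
                                 files={'file': f})
    response.raise_for_status()
    return response.json()['Hash']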

>If they were ipfs hashes, you could implement a "try to recover missing files from ipfs" feature.

This self-healing nature is something that excites me about IPFS and its integration with hydrus.

I would also love to see something like that so that files could be gathered using IPFS even if you never had them.

For instance, if you add a tag repository, search for all images containing *tag*, and don't have a remote file repo that has them, it wouldn't matter as long as you had the IPFS hash: you'd just pull each file from whoever has it, hydrus user or not, no need for a specific file repo to pull the file from.

The thumbnails might be a problem, though I guess since you have the full image you could just generate them.

>>2110

This isn't really related to what you said, but these addons were very helpful for me in dealing with bare hashes.

From /tech/:

This script changes all bare ipfs hashes into clickable links to the ipfs gateway (on page loads)

https://greasyfork.org/en/scripts/14837-ipfs-hash-linker

These redirect all gateway links to your local daemon when it's on; they work well with the previous script.

https://addons.mozilla.org/en-US/firefox/addon/ipfs-gateway-redirect/

https://chrome.google.com/webstore/detail/ipfs-gateway-redirect/gifgeigleclkondjnmijdajabbhmoepo

>>2113

This one is also good imo

https://blog.neocities.org/its-time-for-the-permanent-web.html

>>2121

I really hope MS develops OneGet further and pushes developers to use it. Related thread >>1511


 No.2129

>>2110

>>2128

I should mention I'd like a custom prefix option as well, though; I didn't mean to imply one is not needed.

Allowing the user to set a custom one should cover all bases: that way you could copy links to your local gateway, a private one, the public one, one on a non-standard port, whatever. Having the public gateway option makes it easy to link files out to people who don't use IPFS.
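
Since the prefix would just be a string glued in front of the multihash, here is a trivial python sketch of what the setting boils down to (the local and private gateway addresses are made-up examples):

multihash = 'QmchWtr91Ht88MPDWvXUrUEX5jMsdxoK9T9oEjxTgM1pcT'

prefixes = {
    'bare multihash': '',
    'public gateway': 'https://gateway.ipfs.io/ipfs/',
    'local gateway': 'http://127.0.0.1:8080/ipfs/',        # default local gateway port, if you run a daemon
    'private gateway': 'https://ipfs.example.com/ipfs/',   # hypothetical
}

for name, prefix in prefixes.items():
    print(name + ' -> ' + prefix + multihash)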


 No.2132

>>2120

Thanks, I tried to do that before, actually, but I just thought the program had crashed or something. Then I opened the task manager and there it was, consuming 1.3GB of ram, half of the processor, and the hard drive at max speed. It's been 7 hours now and it has only gone through 1/4 of the total journal.

I guess I'll just have to leave the computer running for a day, since I believe the database journal gets scrapped if it doesn't finish.


 No.2133

Just updated and I noticed the installer didn't ask about the path like usual. Did I miss an option box or something, or was this intended?

It detected the right path and all, and I absolutely recognize that I'm gonna be in the minority here, but I'd like it if a tick box or something was added to let me specify where to install.

The reason being that I've toyed with the idea of having 2 separate hydruses installed. One for images in general, and one for doujinshi.

It's not a big deal at all, and I realize I can just download the zip file instead, but I also figure this wouldn't be a lot of work to add in.


 No.2135

File: 1456611190256.jpg (1.03 MB, 1818x2424, 3:4, 86605ec0e80acd30e2e84e2bd2….jpg)

>>2106

>>2120

Thank you for this report. Sorry for the inconvenience, it is fixed for next week!

>>2107

>>2110

>>2110

I hope to add ipfs services to 'system:file service' this week.

>>2108

I had thought this technically difficult, since IPFS hashes are not trivial to figure out, but if I can quickly ask the IPFS daemon for hashes for files as >>2128 suggests, then I might be able to figure out some sort of sync. Unfortunately for your situation, however, you need the file content (i.e. the file itself) to figure out its hash. There's no way to calculate an IPFS hash if you only have the hydrus hash.

>>2109

I'll write something simple for this week, maybe a right-click menu option for 'show all known urls for this file'. That information is remembered when you download, so I'll figure out some ugly access for you, and you can let me know how you would like it beautified.

>>2110

Adding a prefix is a great idea, thank you. I'll add something to manage services.

>>2111

Someone recently emailed me with a similar report with the same anti-virus software, maybe that was you. I don't know what's going on there, but I think those are old files anyway, as the latest version doesn't include them. If that warning keeps coming up, you can do a clean install (delete everything except the 'db' folder, which contains all your files and settings, and reinstall), and you shouldn't see that again.

There's no auto-update because I never got around to writing one! I'm not opposed, particularly, but it just isn't high on my list.

>>2112

I think that will be a big job, so it will be on the next 'what to work on next' poll. If that's what you care about most when that comes around, please vote!

>>2113

That's a great article, thanks. I am writing the ipfs help now, so I'll make sure to include it.


 No.2136

File: 1456613856694.jpg (68 KB, 602x960, 301:480, e1dcc095fde004c748767f3ea8….jpg)

>>2116

>>2120

>>2132

Unfortunately, you have hit the problem I want to fix with the client.db breakup. Having everything wrapped into one 4GB blob is too clunky for a bunch of reasons. Running regular vacuums tends to reduce a lot of the lag, but that in itself is a heavy operation that needs a bunch of disk space and can take twenty minutes.
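
For reference, a vacuum is just SQLite rebuilding the whole database file, which is why it needs the free space. A minimal python sketch of doing one by hand, assuming the client is closed, you have a backup, and there is room for a full temporary copy:

import sqlite3

db = sqlite3.connect('client.db')
db.isolation_level = None    # VACUUM cannot run inside a transaction
db.execute('VACUUM')         # rewrites the whole file, reclaiming free pages; slow on a big db
db.close()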

Deleting a service works as >>2120 says, but for now you'll just have to wait. Let me know if it crashes or something, and I can walk you through a manual process.

When everything is broken up, it'll be much quicker, because services will be in separate files in a subfolder, so the client will just have to delete the folder, and it'll be gone.

>>2125

>>2126

When you say crash log, do you mean the crash.log at hydrus's install_dir/crash.log? If so, please post its contents, or email it to me if there is private info in it.

>>2128

Thank you for this information. Doing hydrus-ipfs syncs and putting file repo/repo updates on ipfs is something I won't be working on in this initial phase, but it is something I would like to do, and it is nice to know that IPFS can generate hashes without me having to spam pin/unpin.

Although hydrus uses sha256 right now, and the key tables and network stuff all work off that, I think I'll ultimately move towards arbitrary hashtype, so repos can talk only in IPFS if the admin wants, or sha512 or whatever, and the local client can cross-reference that stuff as cleverly and invisibly as it can.

>>2133

I updated my dev machine recently, which included updating all my dev software, including InnoSetup, the installer I use. I noticed the installer felt different when I tested it, but I wasn't sure what it was and assumed it was just a Windows 10 graphical thing. I suppose the new version of Inno doesn't show that page if it detects an install already exists.

I had a look through the Inno docs and found a variable to force that page to display. Thank you for mentioning it. Let me know if it doesn't work for you.


 No.2143

>>2126

I use webroot too; try copying over everything except the db folder from the .zip. The issue seems to be that the restore file option doesn't really restore the files, or at least not to where they were in the first place. For whatever reason, this happens maybe every 6 or 7 versions for me.

>>2136

> I suppose the new version of Inno doesn't show that page if it detects an install already exists.

I'm using hydrus on an external drive, and I've migrated to the extract install because the installer hasn't been able to install to that directory. You probably can't do anything about how Inno works to that extent, but it might be something to let them know about.

Also, I noticed that my performance seems to have increased dramatically, so great work!


 No.2144

>>2136

>hydrus install_dir/crash.log:

Traceback (most recent call last):
  File "<string>", line 23, in <module>
  File "c:\python27\Lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 389, in load_module
  File "include\ClientController.py", line 3, in <module>
    import ClientDaemons
  File "c:\python27\Lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 389, in load_module
  File "include\ClientDaemons.py", line 10, in <module>
    import HydrusEncryption
  File "c:\python27\Lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 389, in load_module
  File "include\HydrusEncryption.py", line 2, in <module>
    import Crypto.Cipher.PKCS1_OAEP
  File "c:\python27\Lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 389, in load_module
  File "site-packages\Crypto\Cipher\PKCS1_OAEP.py", line 57, in <module>
  File "c:\python27\Lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 389, in load_module
  File "site-packages\Crypto\Signature\PKCS1_PSS.py", line 74, in <module>
ImportError: No module named strxor

>>2143

Thanks for the tip, I'll give it a shot, although the files do show up in the main hydrus directory. If all else fails I'll just try a clean install with webroot off and see how it goes.


 No.2145

>>2136

I am the guy that was purging the PTR. It worked fine, though it took almost a whole day. The journal reached 2.7GB (the original file was 4.3GB), and it left me with a 450MB client.db, which went down to 300MB after a vacuum. Now I can use it at least.

After this, I did a backup (review services>system wide operation>export to tag archive) and the archive.db is only 3MB, whereas the client.db is 300MB. Why is that? Is this really a backup of all the tags, hashes and mappings? I just wanted to bring it to your attention in case it's not the intended behavior.

Thanks for your continued effort :)


 No.2146

File: 1456689139058.jpg (117.19 KB, 540x736, 135:184, 98b8efd91b7f29037eba215d39….jpg)

>>2144

If you don't have Crypto.Util.strxor.pyd (7,680 bytes) in your main install directory, I expect your anti-virus has removed some other files. Doing a completely fresh install is probably the ticket.

>>2145

Tag archive creation only stores the hashes and tags it needs, whereas your client.db is remembering all the hashes and tags it ever saw in the public tag repo (all those mappings were deleted when you deleted the service, because they are service specific, but the hashes and tags master reference tables are shared, so they remain). I presume 2.7m hashes and 600k tags probably adds up to 290MB. It depends on how many files your db has, but perhaps you should only 'need' a 5-10MB db or so.
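
To make that concrete, clearing the leftovers would just mean deleting master rows that nothing references any more. This is purely an illustration in python: the table and column names are hypothetical stand-ins, not hydrus's real schema, so please don't run it against a live client.db:

import sqlite3

con = sqlite3.connect('client.db')
# delete hashes that no mapping and no local file record points at (hypothetical table names)
con.execute('''DELETE FROM hashes
               WHERE hash_id NOT IN ( SELECT hash_id FROM mappings )
                 AND hash_id NOT IN ( SELECT hash_id FROM files_info )''')
# and tags that no mapping uses any more
con.execute('DELETE FROM tags WHERE tag_id NOT IN ( SELECT tag_id FROM mappings )')
con.commit()
con.close()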

There isn't a way inside the program to delete those 'orphan' entries yet. I think you have several options:

- Wait until I revisit orphan clearing, at which point you can run it and your db will shrink.

- Create a fresh client and move your files and local mappings to it. This could be difficult if you have a lot of ratings or local siblings/parents, which can't currently be exported/imported. I can help you out if you want to do this.

- Just live with a slightly larger db (I advise this if you plan on ever trying to sync with my ptr again, as it'll be quicker to reprocess with all those hashes and tags).

- If you are enthusiastic, I can also try writing you some custom SQL that you can execute on your client's db directly that will clear out most of the orphans. I can't promise I can get it 100% correct, but if you like doing this sort of thing, I'm happy to give it a go.


 No.2162

>>2103

>>2115

I must be blind because I can't find this option anymore and I swear I found the regex import options before.


 No.2165

File: 1456771489470.jpg (202.6 KB, 759x1024, 759:1024, 851c87ccb7e681ca84c65d52af….jpg)

>>2162

Drag-and-drop some files onto the main client window to bring up the import files dialog. This has two green-texted buttons, import now and add tags based on filename. Hit the 'add tags' one, which gives you the more complicated dialog that lets you add filename-regex and other complicated tags. The new checkbox is on that dialog, called 'try to load tags from neighbouring .txt files'.

It tries to parse the .txt files' tags live, btw, so they should be listed in the 'tags' column as soon as you hit the checkbox. Let me know if the control doesn't lay out or work correctly for you.


 No.2169

>>2165

ah I see now. I'm the guy who made the booru parser so this is a good find. I guess the auto-importer via import folders doesn't support file name regex or txt files (yet)?


 No.2171

File: 1456819676146.jpg (221.05 KB, 1221x797, 1221:797, boorutagparser-server.jpg)


 No.2200

File: 1457208392633.jpg (577.61 KB, 1559x2053, 1559:2053, 058302a72c601b3a045ae3afbb….jpg)

>>2169

I will try to add .txt parsing to import folders in the next week or two. Regex is a more complicated thing to store, so that'll have to be put off until the next big rewrite of that code.


 No.2208

File: 1457350821440.webm (2.24 MB, 1406x900, 703:450, 16-03-07_22-39-00.webm)

>>2128

>These redirect all gateway links to your local daemon when it's on

lmao, don't even run the gateway on my local machine

hacked this up though, tedious but whatever



