How to organize your JAV collection?

I am using a Samba server on my NAS as well. Unfortunately, as my collection grows it's difficult to organize the library just by folders. I would like to have tags/categories to better sort my videos. It is possible with DLNA, but none of the DLNA servers I've tried supports multiple parts in the non-web mode.



You'll find the information here: https://github.com/jvlflame/Javinizer#getting-started

# Run a command to sort your JAV files using default settings
> Javinizer -Path "C:\JAV\Unsorted" -DestinationPath "C:\JAV\Sorted"
Oh, thank you so much. I've used this line before, but I still can never remember it.
 
So I kind of walked back on myself here. I recently had a fear that my BD50 discs with all my titles would suffer from disc rot at some point so...I used some Amazon points to buy three 2TB WD Blue HDDs to start copying the discs off to. Plus, I have another older 1TB Seagate and some others (2.5 inch) in case I need them. I sure hope one of these drives doesn't fail later...basically proving why I went away from them in the first place.

Anyone have a drive fail recently? If so, did you take an "it is what it is" attitude, or did you try to replace everything?
 
I recently (one to two months ago) got back around 80 discs that I gave to a friend 5+ years ago. They are all still in perfectly working order. While I like that Blu-rays can hold more, that also means more files can go bad on a single disc, so I stick with 4.7 GB DVDs. There's also the fact that DVD burners/readers are very cheap, versus shelling out nearly 100 dollars for a Blu-ray burner/reader. IIRC, disc rot is almost impossible to prove and has rarely, if ever, been directly observed; it seems to be mostly a myth, though I'm sure it's possible. I could be wrong, but I remember reading in a journal that deals in tech hypotheticals that even if it does exist, it is incredibly rare.
 
Two weeks ago my hard disk died. It was a hard disk for porn only.
I had created clips, cut them, and moved them into Kodi.
It's too much work to rebuild such a big database, so I gave up.
No more... I won't spend my time on that hard work again.
If I find something really fun, then I'll start cutting and building something from it,
but I don't do it for every movie I've watched anymore.
Waste of time.
 
Wow! Sorry to hear that, friend!
 
I've burned over 100 BluRays in the last 5 years. I've had 6 discs (that worked after initial burn, mind you) that did not work upon later access. All 6 of these discs came from the same spindle of DL BluRay discs, so there is some suspicion that I got a bad batch, but no proof either way.

When I say they 'don't work,' I mean to say that some portion of the contents of the disc are unretrievable. In each case I was able to see the entire list of contents, but only able to copy off a small number of the files (less than half in all cases.)

As a result of this, I've sworn off DL discs entirely. I replaced all the other discs in that batch, even though they seemed fine. I lost quite a few files, because I downsample everything to 480p in H.265 to lessen space requirements. Luckily, the stuff I drop to Blu-ray is stuff that, if deleted, wouldn't upset me terribly. I keep all the important stuff on HDDs, and make sure I've got more than one copy.
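For anyone curious, that kind of 480p/H.265 downsampling is a one-liner in ffmpeg. This is just a sketch: the filenames are placeholders and the CRF value is an assumption, not the poster's actual settings.

```
ffmpeg -i input.mp4 -vf scale=-2:480 -c:v libx265 -crf 28 -c:a copy output-480p.mp4
```

`scale=-2:480` keeps the aspect ratio while forcing an even width, and `-c:a copy` passes the audio through untouched.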
 
Wow, reading some of the posts makes me feel very grateful for the resources and ability to have a pretty extensive storage and backup infrastructure. I'd like to share how an IT network/sysadmin stores and (sort of) organizes their collection and general data (granted: almost 0 JAV, 90% idol, 10% mixed VR). I've been in IT for 15 years and managed several enterprise backup/DR systems, and I treat my home lab/network pretty similarly. I'm a big fan of Linux and open-source software, so I utilize them as much as possible. I also use ZFS for all my server file systems! ZFS has the following benefits: checksums with periodic scrubbing for data integrity (ECC RAM comes into play with this as well), snapshots and send/receive for backing up data, and on-the-fly compression, plus deduplication when doing a ZFS snapshot send. Overall, ZFS is just amazing, and well worth the time/effort to implement.

  • Server 1
    • OS - Proxmox VE (Debian based linux hypervisor)
    • Hardware
      • AMD Ryzen 7 2700 8c/16t on an Asrock Rack x470-2x10gb motherboard with IPMI
      • 64GB DDR4 ECC (error correcting very important for data integrity IMO)
      • 2x 240GB SSD ZFS RAID1 for the OS
      • 1x 1TB NVME SSD ZFS for VMs (Virtual Machines)
      • 6x 4TB NAS 7.2k drives in ZFS RAID10 (main storage for all data)
      • LSI-3008i SAS HBA
    • Services
      • Plex container
      • File shares via SAMBA
      • Wireguard container
      • Unifi Controller
      • Several VMs for different purposes (one of them dedicated to video encoding)
  • Server 2
    • OS - Proxmox Backup Server
    • Hardware
      • Junk old 4th gen i3 with 16gb ram and cheap mobo
      • 2x 240gb ssd in ZFS RAID1 for OS
      • 1x500GB SSD ZFS for VM Backup
      • 1x 8TB Toshiba X300 ZFS for backing up RAID10 array - will need to upgrade at some point
  • Desktop 1
    • OS - Windows 10 Pro
    • Hardware
      • Ryzen 7 5800x - 32GB RAM - RTX 3080
      • 1x 1TB NVME - main storage and also used as a scratch drive when doing IO heavy video editing
      • 1x 8TB Toshiba - secondary scratch drive, and 3rd tier of backing up RAID10 array
    • Services
      • Dropbox Pro 2TB for ultra-important documents and certain videos...
My main strategy for storage/backups is this

My main Proxmox server hosts a decently sized 12TB volume shared out with Samba (SMB on Windows), which I connect to via a standard mapped drive on Desktop 1. All my downloads go straight to this network share. This is protected against bit rot and other errors via the ECC memory on the server, as well as ZFS's own protections. It's also capable of losing up to 3 drives and still having all the data available (I have a drive failure right now that I'm working on replacing). That storage then gets ZFS snapshots sent to the 8TB drive on Server 2 (Proxmox Backup Server). Finally, I do a regular file-level backup using Total Commander on my desktop, which syncs the network share to my local 8TB HDD.

I've thought a lot about, and spent a lot of time configuring, this setup, but it should protect against bit rot, data corruption, and even ransomware as long as it's caught quickly enough. I'm not sure how many snapshots I can keep on the PBS server, but it's several weeks' worth at least. SSDs are getting cheap enough that I plan on upgrading the 8TB HDD to an 8TB SSD in my main workstation as well.
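The bit-rot protection mentioned above comes from ZFS checksumming every block and re-verifying during scrubs. As a toy illustration of the same idea (this is not ZFS, just a hedged Python sketch; the manifest layout is made up), you can record a hash per file and re-check it later:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in 1 MiB chunks so large videos don't load into RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root: Path) -> dict:
    """Record a checksum for every file under root."""
    return {str(p): sha256_of(p) for p in root.rglob("*") if p.is_file()}

def scrub(manifest: dict) -> list:
    """Return paths whose contents no longer match the manifest (or vanished)."""
    return [p for p, digest in manifest.items()
            if not Path(p).is_file() or sha256_of(Path(p)) != digest]
```

A real setup would persist the manifest somewhere safe and run the scrub on a schedule; ZFS does all of this transparently, per block rather than per file, and can also self-heal from a redundant copy.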
 
That's crazy! I've never even heard of ZFS before. Just curious: for your desktop, did you choose Win10 for games, or for other needs not satisfied by Linux and Wine?
 
As much as I'd love to switch to Linux for my desktop (especially ZFS on my desktop!), VR gaming, some specific software (like Topaz Labs VEAI), and working from home just don't work seamlessly enough. My end goal was actually to make all my Windows workstations VMs, but it didn't work out. Windows just tended to be the easiest, least disruptive solution.
 
Is there any program to download only the covers of the movies? Or any program to add the cover photos to folders, or to the default VLC icon, automatically?
 
It's called a scraper.
Automatic? If that works these days, I think you're lucky.
Websites don't like scrapers; they don't want people downloading their content without visiting the site. They run the website for money, and viewers are money,
so they use bots and lots of other things to prevent scraping.

You can search for scraper + Plex + Kodi and there will be something you might want to read.

There are people talking about these things in this thread too; what you want is in here on some page.
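To make the scraper discussion concrete: before a scraper ever hits a website, it first has to recognize the movie ID in the filename. Here's a hedged Python sketch of just that step; the regex and the normalized `ABC-123` shape are assumptions, since real release names vary a lot:

```python
import re

# Assumed convention: 2-5 letters, optional dash, 2-5 digits (e.g. "ABP-123").
ID_PATTERN = re.compile(r"([A-Za-z]{2,5})-?(\d{2,5})")

def extract_id(filename: str):
    """Return a normalized ID like 'ABP-123', or None if nothing matches."""
    m = ID_PATTERN.search(filename)
    if not m:
        return None
    # Uppercase the label; strip leading zeros, then left-pad to 3 digits.
    return f"{m.group(1).upper()}-{int(m.group(2)):03d}"
```

A scraper would then feed that ID into a site search; the anti-bot measures described above are exactly what breaks that second step.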
 
Sounds like you might need to set Plex or Jellyfin to use local nfo files (sorry, I don't use Plex)

When adding a new library in Jellyfin, I un-check all metadata scrapers and then check the "Metadata savers: Nfo" box like the attached image.
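For reference, here's roughly what a minimal Kodi-style movie .nfo looks like once you're relying on local metadata. The values below are placeholders, and the exact set of tags Jellyfin honors is worth double-checking against its documentation:

```xml
<movie>
  <title>Example Translated Title</title>
  <originaltitle>サンプルタイトル</originaltitle>
  <year>2021</year>
  <studio>Example Studio</studio>
  <genre>Example Genre</genre>
  <tag>my-custom-tag</tag>
  <actor>
    <name>Example Actress</name>
  </actor>
</movie>
```

Since it's plain XML sitting next to the video file, you can hand-edit tags and genres and just rescan the library.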

I've just managed to move all of my collection onto a NAS, and Javinizer/Jellyfin looks like an awesome combo.

It looks like I'd be best to install Docker versions of both. This is what I'm looking to do if possible, does this sound feasible?

  • Have a drop file location where new content gets scraped (even automatically in a watch location)
  • Output results into a locally stored NFO which can be modified manually if needed
  • Don't mind if scraping pulls covers but prefer ability to source covers manually, store locally and have NFO use local paths not URLs
  • Prefer to be able to preserve own folder substructure, ie. move output sorted files and NFOs into final tree and have Jellyfin use my structure
  • Bonus for having a rating/star system rather than just likes within Jellyfin
  • Bonus for storing Japanese titles, actress and studio names as well as translated
Anyone know if what I would prefer is doable?
 
Most of the options you need can be done in Javinizer.

I don't use Plex or Jellyfin;
I don't have money to spend playing with some of those options.
I tested all of this and ended up with Kodi.

As I said, Javinizer has almost every option you're looking for,
but you need to read the docs... it's a big program with a lot of options.
And if you want to port the .nfo files and use them in Plex, Jellyfin, or Kodi,
that works; I can't see any problem, because Javinizer does it all in the first place.
Plex, Jellyfin, and Kodi are just secondary tools.
 
I just thought I'd briefly mention that there's a new JAV browser and organizer for Windows called JavLuv. I wrote it after I became unhappy with the complexity of some of the current solutions out there. My goal was to create a simple, effective JAV browser/organizer that could be installed and launched like a typical app with no complex instructions to follow and no additional dependencies to install.

JavLuv features:
  • Browse collections in a traditional thumbnail / details view, and launch movies
  • Automatically scan movies, scrape metadata from a few well-known websites, and automatically download covers if needed
  • Sort collection by title, ID, actress, date, or user rating.
  • Search / filter collection by title, names, keywords, or IDs using a search bar
  • Automatic renaming and organizing of collection into nearly any folder and file structure
  • Built-in utility to automatically concatenate movies losslessly
  • Stores metadata in Kodi/Javinizer compatible .nfo files next to the movie

I'm typically pretty responsive to user suggestions that seem worthwhile or practical to implement. Many of the improvements and a huge number of bug fixes are thanks to suggestions and reports by early users over the past seven months. Feel free to give it a try and let me know what you think.

I'll definitely give it a shot, thanks TmpGuy. I'd be happy to give constructive feedback too.

I spent 5 hours fighting to get Javinizer to work from a Docker container on a Synology NAS and quit in frustration. If someone has experience with this particular setup, I'd gladly take the help. There's something wrong with the environment variables.

Honestly all I need is the ability to do metadata scraping to NFO files, maybe pulling actress thumbs and being able to implement custom tags/genres. I want everything local and don't want to rely on metadata being available long term on any store front.

I collect covers myself manually (I want the best out there, not necessarily the easiest to pull from DMM). My collection is well organized - I'm happy with file naming conventions and folder structure - it's the cataloging for a media server (probably Jellyfin) and remote playback I'm chiefly interested in.
 
I would suggest trying to learn the command line version of Javinizer instead of messing with the Docker GUI.
If you can't get Javinizer working, maybe look into jav-it - https://jav-it.itch.io/jav-it. Getting all the features requires that you pay via Patreon. Still not a GUI app, but we JAV hoarders can't be too choosy, you know?
 

Good call. My needs are few to be honest. I don't need the help with sorting, naming or cover art. I just need basic metadata and then I can probably tinker with the XML on Kodi standard .nfo files to get the tag customization I want.

Grew up on DOS so not averse to command line. Really should teach myself some Linux one of these days... Happy to support the devs on these projects also.

Thanks for the suggestions.
 
Javinizer's "Update" switch allows you to just write .nfo files into the directory without pulling images or renaming/moving files. I used it to great effect when migrating my hand-built system over to .nfo files in a bulk-scrape fashion years ago, and it worked great. From https://docs.jvlflame.net/using-javinizer/general-usage

Code:
-Update [<SwitchParameter>]
    Specifies to only create/update metadata files without moving any existing files

Note that jvlflame is a very responsive developer and has modified Javinizer to accommodate some of my requests; join the Discord if you have any questions and people can help you out (although it's not high traffic, so it may take a bit). https://discord.gg/Pds7xCpzpc
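Combining that switch with the sorting example earlier in the thread, an in-place metadata refresh would look something like this (the path is illustrative):

```
> Javinizer -Path "C:\JAV\Sorted" -Update
```

Since -Update skips moving and renaming, as I read the docs it should be safe to point at an already-organized tree.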