/gen/ - General Discussion

talk about whatever you like


(10.65 KB 742x104 2.png)
(204.38 KB 2379x1832 1.png)
Storage Anonymous 04/13/2026 (Mon) 21:40:16 Id:d22c59 No. 83472
Hey fellow gooners, tell me about your setup. How much of this content have you got stored? How do you sort it? Where have you mostly got it from? When did you start collecting? How do you manage de-duplicating and preserving the original quality file? I have a Proxmox host running Cockpit (45Drives) for a SMB share, Jellyfin and Tailscale for remote access.
Nigga has almost 8tb of porn with remote access.
>>83472 I keep them saved on random laptop hard drives I have lying around. I've lost maybe 2 stashes before, so I'm not really tied to them; if they last, they last, and if they go corrupt I'll just start again. At a certain point you just begin to refine down to the few models you really want, and leave the other stuff on sites to watch.
>>83472 Yo, I can give you Boberry's new weigh-in or whatever you're interested in. Although you have an ungodly amount of stuff lol. But any chance you would upload your Jodie Elizabeth stuff?
>>83472 Never imagined there were levels to this. I have vids that take up around 100 GB of space, and I tend to delete older stuff as it eventually gets boring. I usually find videos on www.leakedzone.com
>>83472 >How do you manage de-duplicating and preserving the original quality file? I'd really like to know more about this, because I'm more of a hoarder. >>83481 10TB is about one grocery haul.
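For the de-duplication question above, one common approach (a sketch, not necessarily what OP does) is an exact-match scan: group files by size first, then content-hash only the sizes that collide, so you never hash the bulk of a multi-TB store. A minimal Python version, assuming a single directory tree to scan:

```python
# Sketch: find byte-identical duplicate files under a directory tree.
# Grouping by size first avoids hashing files whose size is unique.
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Return a list of groups; each group is a list of paths whose
    contents are byte-for-byte identical."""
    by_size = defaultdict(list)
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            by_size[os.path.getsize(path)].append(path)

    duplicates = []
    for paths in by_size.values():
        if len(paths) < 2:
            continue  # unique size -> cannot have a duplicate
        by_hash = defaultdict(list)
        for path in paths:
            h = hashlib.sha256()
            with open(path, "rb") as f:
                # hash in 1 MiB chunks so huge videos don't blow up RAM
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            by_hash[h.hexdigest()].append(path)
        duplicates.extend(g for g in by_hash.values() if len(g) > 1)
    return duplicates
```

This only catches exact copies; re-encoded or resized versions of the same video hash differently and need perceptual hashing instead.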
>>83472 Re: deduplication. Have you tried the WizTree disk analyzer? VERY handy for me on my desktop; I don’t know how it would scale to such a large storage system, but given how fast and efficient it is on my 1 TB drive, it’s worth a go. I know StufferDB used their huge collection to train an AI and offered it to users to try, but to my knowledge that’s kind of a scattergun approach, like “just dump it all in and see what comes out”. Would you consider running all the data you’ve collected through an embedding model? You must have a lot of weigh-in content, as well as date/time-stamped video and image content - you could probably have a good shot at building something which, given a picture of a model, could estimate her height/weight to some degree of accuracy. Could be useful for gauging the accuracy of models’ self-reporting of their weight.
>>83472 I currently have 8TB of adult content featuring plus-size women, distributed across 396 MEGA accounts, cataloged by person. To help me with this, I developed a system that helps me locate, share, and manage these 396 accounts as if they were a single system, even though in practice they are not. Regarding duplicates (which occasionally appear), I use a basic filename and size check based on the person the file is associated with. It has proven efficient so far, and with ZERO storage costs.
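The filename-plus-size check described above could be sketched like this in Python, assuming each account's file listing has already been fetched as (filename, size) pairs (the input format here is hypothetical, not MEGA's actual API):

```python
# Sketch of a filename + size duplicate check across many accounts.
# Cheap and needs no downloads, but it misses renamed copies and can
# falsely flag different files that share a name and size.
def find_name_size_duplicates(listings):
    """listings: dict mapping account name -> iterable of (filename, size).
    Returns {(filename, size): [accounts...]} for keys seen more than once."""
    seen = {}
    for account, files in listings.items():
        for filename, size in files:
            seen.setdefault((filename, size), []).append(account)
    return {key: accounts for key, accounts in seen.items() if len(accounts) > 1}
```

Example: two accounts both holding clip1.mp4 at 100 bytes would be reported as one duplicate key with both account names.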
>>83600 If I had 8TB of data, the last place I'd keep it would be an online hosting site that can delete it all at will. Granted, 8TB of physical storage is a month's salary down there, but at least no company would be in possession of MY content.
