FileJoker Sponsorship

SamKook

Grand Wizard
Staff member
Super Moderator
Uploader
May 10, 2009
3,739
5,146
If we use a service to protect our links from being deleted almost instantly, is the filter clever enough to let us post anyway, even though the filejoker links aren't directly in the post?

For example, I use safelink.strike-up to protect my links with a password so bots can't get directly to the links and report them. I always have filejoker links available, but the address in the post is a safelink.strike-up one; once you click it and enter the password, it displays the filejoker links on their website.
It's a measure I've found necessary to keep my links alive for more than a few days.
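For anyone wondering what such a protector actually does under the hood: the whole trick is a password gate in front of the real links. Here's a minimal sketch of the idea as a tiny Flask app; the route, the store and the password are all made up for illustration, this is not safelink.strike-up's actual code.

Code:
# Minimal sketch of a password-gated link protector (hypothetical, for
# illustration only; not safelink.strike-up's real implementation).
from flask import Flask, request

app = Flask(__name__)

# token -> (password, hidden filejoker links); a real service would use a DB.
PROTECTED = {
    "abc123": ("mypassword", ["https://filejoker.net/xxxxxxxx"]),
}

@app.route("/link/<token>", methods=["GET", "POST"])
def reveal(token):
    password, links = PROTECTED.get(token, (None, []))
    if password is None:
        return "Unknown link", 404
    if request.method == "POST" and request.form.get("password") == password:
        # The real host links only appear after the password check.
        return "<br>".join(links)
    # A bot scraping the thread only ever reaches this form, never the links.
    return '<form method="post"><input name="password"><button>Show links</button></form>'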

Edit: Just saw yipman's and c00lzero's posts in the forum bug thread, so I guess it isn't clever enough.

A solution that would probably make it safe to use the links directly in posts would be to force people to hide them behind a specific tag and make the links visible only to registered members. I'm not sure it's as effective against bots, but I've seen a few forums use that method.
 

tchadoobo

Active Member
Sep 30, 2012
154
167
...
A solution that would probably make it safe to use the links directly in posts would be to force people to hide them behind a specific tag and make the links visible only to registered members. I'm not sure it's as effective against bots, but I've seen a few forums use that method.

If I am not mistaken, this forum is only available to registered members, and filejoker links there die within a few hours.
At least in this forum, bots shouldn't be able to harvest anything.
 

SamKook

Grand Wizard
Staff member
Super Moderator
Uploader
May 10, 2009
3,739
5,146
Ah yes, I forgot that was a member only section already. Bad idea then.
 
  • Like
Reactions: CoolKevin

Ceewan

Famished
Jul 23, 2008
9,151
17,033
Ah yes, I forgot that was a member only section already. Bad idea then.



No, you were right, I think. Links get reposted, and bots have memberships (they register and everything); I mean, if spammers can register, why not search bots? Of course, those links often get reported for other reasons by other non-friendly board lurkers. The DDL areas in that section should probably be exempt from the requirement to post direct FileJoker links, but chances are the staff won't deem that practical and posters will just have to make adjustments. The auto-detect is just too convenient and saves the staff a lot of hassle. A poster can still use a multi-host or safelink system and post only the direct FileJoker links from it, without posting the other links. More work for the poster, less work for the staff here, but it is fair across the board.
 

SamKook

Grand Wizard
Staff member
Super Moderator
Uploader
May 10, 2009
3,739
5,146
The problem if people post only the filejoker links without protecting them is that the links won't last very long, so it kind of defeats the purpose of having a host everybody can rely on if it only works for a few days while the other hosts last months.
Also, it would be less work for the poster, since they wouldn't have to take the extra step of protecting the links; you have to do the uploading and grabbing of the links either way (unless there's some automated website that does it somewhere; I built my own setup, so I haven't looked at other options).

The problem with having protected links in this situation is that if the system can find a way to detect them, then the bots will very likely be able to grab and flag the links too, so they won't be safe anymore.
Unless you set up some kind of whitelist of uploaders who would be exempt from the check, but that would be more work for the staff and kind of defeats the purpose of the check in the first place.
You also can't make an exception for every post that uses safelink or a similar service, or people will simply abuse it.

I haven't managed to think of an idea that would work both for keeping links up longer and for giving the staff less work.
The best idea I came up with would require them to build their own link-protecting system that people could use. That way they could build in some kind of direct access for checking the links, but it would be a lot of work to set up and would put a heavier load on the servers, which has been a problem many times.
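For what it's worth, the "direct access" part wouldn't need to be elaborate. If the forum ran the protector itself, the filejoker check could resolve a protected URL server-side before deciding whether a post passes. A minimal sketch of that idea, with every name and URL invented:

Code:
# Sketch: if the forum hosted its own link protector, the filejoker check
# could look inside it instead of trusting the visible URL.
# PROTECTED and the protector URL shape are hypothetical.
import re

PROTECTED = {"abc123": ["https://filejoker.net/xxxxxxxx"]}  # token -> real links

def satisfies_filejoker_rule(post_body):
    # Direct links obviously count.
    if "filejoker.net/" in post_body:
        return True
    # Protected links count too, because the forum can resolve its own tokens.
    for token in re.findall(r"https://protect\.example/(\w+)", post_body):
        if any("filejoker.net/" in url for url in PROTECTED.get(token, [])):
            return True
    return False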
 
  • Like
Reactions: CoolKevin

Ceewan

Famished
Jul 23, 2008
9,151
17,033
The rules only state that you must post FileJoker links. If FileJoker is the one that removes those links, then you would still be in compliance. Thus, those that use multi-hosts have no reason not to post their FileJoker links. As far as link-protecting just your FileJoker links, that would seem to be a problem... there is no workaround as far as I can tell. This is also why I think the downloadable candies should be exempt from this rule, as it is harder to keep links active in that section as it is. No skin off my back either way, I was just kibitzing here.
 
  • Like
Reactions: Casshern2

C00Lzero

Administrator
Conqueror
Aug 14, 2010
559
1,185
come on people, don't argue over things that atm we can't change.
i have given chompy the task of asking the devs if they can add an option for user-group-based activation. if they add it, we will create a new uploader user group (or something like that) whose members won't have to deal with this message and can then use link crypters in their posts. users will have to apply for the group, and they won't get access if they don't meet some criteria i have yet to think about.

atm it's not enabled on any forum, due to some bugs.

making links unavailable to guests would probably cost us a lot of reputation and users, since most of them don't want to register; they would just go to whatever blog they find on the web.
so unfortunately this is not an option.

as i have written in another post, we are trying to block out the abuser bots, but they use lots of IPs and sometimes tor as well. i can't block tor, since lots of people use it (that's also why we don't use cloudflare, even though it would probably give us some advantages; e.g. we wouldn't need 3 frontend servers doing an extreme amount of caching).
then again, some bots even run on private lines with dynamic addresses, which i don't really like to block, since other users can end up with the same ip.

Implementing automatic link protection isn't as easy as it looks, since you have to replace links on the fly during thread/post creation.
though we could set up a service that does link protection like save-link, running on another server, using captchas etc.; that wouldn't even be that much of a hassle, as long as it doesn't have to check availability ^^
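to illustrate, that on-the-fly replacement step would look roughly like this; the regex, the token store and the service url are all invented, it's just a sketch:

Code:
# Sketch of on-the-fly link protection at post-save time. The protector
# service and its storage are hypothetical; the point is the rewrite step.
import re
import uuid

STORE = {}  # token -> original URL, stand-in for the protector's database

def protect(url):
    token = uuid.uuid4().hex[:8]
    STORE[token] = url
    return "https://protect.example/" + token

FILEJOKER = re.compile(r"https?://(?:www\.)?filejoker\.net/\S+")

def on_post_save(body):
    # Swap every filejoker link for a protected one before the post becomes
    # visible, so scrapers never see the real URL.
    return FILEJOKER.sub(lambda m: protect(m.group(0)), body)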
 

SamKook

Grand Wizard
Staff member
Super Moderator
Uploader
May 10, 2009
3,739
5,146
I was seeing it as discussing the problem while waiting for a solution, more than arguing about it. I'm not mad about anything here, and the check got disabled so fast that I never even had time to see it in action.
That is good news and probably the easiest solution. The hard part will be figuring out the criteria so that most people can use it, since it's very beneficial to the forum, but abusers can't.

The rules only state that you must post FileJoker links. If FileJoker is the one that removes those links, then you would still be in compliance.
Yes, compliance is easy, but it wouldn't be the best thing for the downloaders (but that will change once the whitelist thing is activated).
 

Casshern2

Senior Member...I think
Mar 22, 2008
7,021
14,461
Are the bots literal in what they look for? What I mean is: do we know they're looking specifically for "https://filejoker.net/", or simply for any reference to "filejoker.net" in the forum posts? Perhaps they have a way to strip special characters so that even "https://fil3jok3r.n3t" would be found? A slight inconvenience, but doctoring the links a bit might keep them alive longer with no need for link-protecting services (see the sketch below). I've seen other forums, when they post passwords to pay sites, usually put http://members.somesite.com!/members/ - I'm assuming the exclamation point is there for a reason. Now, that is because they place actual hyperlinks, so another question is: do the bots PURELY look for hyperlinks in the post markup? Using flat text that users have to copy and paste into their browser and/or favorite download accelerator would leave nothing for bots to see, if that's the case.

Again, that's a bit of an inconvenience. And that, of course, would solve nothing if there are agents on the board doing all the reporting. :D

Anyway, just thinking out loud.
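To make the question concrete, here's a toy version of both kinds of matcher; the leet-speak table is pure guesswork about what a smarter bot might apply before matching:

Code:
# Toy illustration: a literal matcher misses doctored links, a normalizing
# one doesn't. The translation table is a guess at what a smarter bot might do.
import re

LITERAL = re.compile(r"https?://filejoker\.net/\S+")
LEET = str.maketrans({"3": "e", "1": "i", "0": "o", "4": "a"})

def literal_hits(text):
    return LITERAL.findall(text)

def normalized_hits(text):
    return LITERAL.findall(text.translate(LEET))

post = "grab it here: https://fil3jok3r.n3t/abcd1234"
print(literal_hits(post))     # [] -> a literal bot misses the doctored link
print(normalized_hits(post))  # found once the leet-speak is undone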
 

Ceewan

Famished
Jul 23, 2008
9,151
17,033
I was seeing it as discussing the problem while waiting for a solution more than arguing about it.


That is me too. Just a little friendly discourse to pass the time.


Are the bots literal in what they look for?

Copyright “bots” are automated systems that match content against a database of reference files of copyrighted material.

A swarm of tech companies are rushing in to provide technical solutions to enforce copyright in online sharing communities and video-streaming sites. Those players include Vobile, Attributor, Audible Magic, and Gracenote. And they’re thriving, despite the fact that U.S. copyright law, as modified by the 1998 Digital Millennium Copyright Act, doesn’t require sites that host user-created content to preemptively patrol for copyright violations.

Exactly how do these bots work?

I haven't come across any technical data on how it would apply here. In this case, bots likely search metadata for specific matches to keywords such as the DVD codes themselves. How they would know to scrape the links that accompany a post and then report them is beyond anything I have found, so I am not sure it is something they are capable of. Internet bots are like blind bats flying around eating insects: they are very specific in their targets and in their reactions to them.

The basic term for what they do is "web scraping" (also called web harvesting or web data extraction). It is closely related to what search engines do, which is called web indexing: indexing information on the web using a bot or web crawler.
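In its simplest form, the harvesting side really is that dumb: a scraper that pulls every filejoker link off a page is only a few lines. A sketch (the forum URL is a placeholder):

Code:
# Minimal link-harvesting sketch, the crudest form of web scraping.
# The thread URL is a placeholder.
import re
import urllib.request

def harvest(url):
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    # No parsing subtlety required: one regex over the raw HTML.
    return sorted(set(re.findall(r"https?://(?:www\.)?filejoker\.net/[A-Za-z0-9/_-]+", html)))

for link in harvest("https://forum.example/threads/12345/"):
    print(link)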


Technical measures to stop bots (from the Wiki which knows all):


Blocking an IP address. This will also block all browsing from that address.
Disabling any web service API that the website's system might expose.
Bots sometimes declare who they are (using user agent strings) and can be blocked on that basis (using robots.txt); 'googlebot' is an example. Some bots make no distinction between themselves and a human browser.
Bots can be blocked by excess traffic monitoring.
Bots can sometimes be blocked with tools to verify that it is a real person accessing the site, like a CAPTCHA. Bots are sometimes coded to explicitly break specific Captcha patterns.
Commercial anti-bot services: Companies offer anti-bot and anti-scraping services for websites. A few web application firewalls have limited bot detection capabilities as well.
Locating bots with a honeypot or other method to identify the IP addresses of automated crawlers.
Using CSS sprites to display such data as phone numbers or email addresses, at the cost of accessibility to screen reader users.
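Of those, "excess traffic monitoring" is the easiest to picture: a per-IP sliding window is enough for a first cut. A rough sketch, with the threshold and window size made up:

Code:
# Sketch of excess-traffic monitoring: count requests per IP over a sliding
# window. The limit and window size are invented for the example.
import time
from collections import defaultdict, deque

WINDOW = 60.0  # seconds
LIMIT = 100    # requests per window before we call it a bot

hits = defaultdict(deque)

def looks_like_a_bot(ip):
    now = time.time()
    q = hits[ip]
    q.append(now)
    while q and now - q[0] > WINDOW:
        q.popleft()  # drop requests that fell out of the window
    return len(q) > LIMIT  # humans rarely load 100 pages a minute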

Why these techniques don't work (from ShieldSquare anti-bot service):


Setting up robots.txt – Surprisingly, this technique is used against malicious bots! Why this wouldn't work is pretty straightforward: robots.txt is an agreement between websites and search engine bots to prevent search engine bots from accessing sensitive information. No malicious bot (or the scraper behind it) in its right mind would obey robots.txt. This is the most ineffective method of preventing scraping.

Filtering requests by user agent – The user agent string of a client is set by the client itself. One method is to read it from the HTTP header of a request; that way, a request can be filtered even before any content is served. We observed that very few bots (less than 10%) used a default user agent string that belonged to a scraping tool, or an empty one. Once their requests were filtered based on the user agent, it didn't take long for scrapers to realize this and change their user agent to that of a well-known browser. This method merely stops new bots written by inexperienced scrapers, and only for a few hours.
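Their point is easy to demonstrate: spoofing a user agent is a single header. A sketch (placeholder URL):

Code:
# Why user-agent filtering barely slows a scraper down: the client sets its
# own user agent. The URL is a placeholder.
import urllib.request

req = urllib.request.Request(
    "https://forum.example/threads/12345/",
    headers={"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"},
)
html = urllib.request.urlopen(req).read()
# Judged by user agent alone, this request is indistinguishable from an
# ordinary browser visit.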

Blacklisting the IP address – Turning to an IP blacklisting service is much easier than the hectic process of capturing more metrics from page requests and analyzing server logs. There are plenty of third-party services which maintain a database of blacklisted IPs. In our hunt for a suitable blacklisting service, we found that using a third-party DNSBL/RBL service was not effective, as these services blacklisted only email spambot servers and were not effective at preventing scraping bots. Less than 2% of scraping bots were detected for one of our customers when we did a trial run.

Throwing CAPTCHA – A very well-known practice to stop bots is to throw a CAPTCHA on pages with sensitive content. Although effective against bots, a CAPTCHA is shown to every client requesting the web page, irrespective of whether it is a human or a bot. This method often antagonizes users and hence reduces traffic to the website. Some more insights into Google's new No CAPTCHA reCAPTCHA can be found in our previous blog post.

Honey pot or honey trap – Honey pots are a brilliant trap mechanism for capturing new bots (scrapers who are not well versed in the structure of every page) on the website. But this approach poses a lesser-known threat of reducing the page rank on search engines. Here's why: search engine bots visit these links and might get trapped accidentally. Even if exceptions were made by disallowing a set of known user agents, the links to the traps might still be indexed by a search engine bot. These links are interpreted as dead, irrelevant or fake links by search engines, and with more such traps, the ranking of the website decreases considerably. Furthermore, filtering requests based on user agent can be exploited as discussed above. In short, honey pots are risky business and must be handled very carefully.
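For reference, a honeypot can be as small as this sketch: a link no human ever sees, plus a route that flags whoever follows it (all names invented, and note the search-engine caveat above):

Code:
# Honeypot sketch: a hidden link humans never click, plus a route that flags
# whoever requests it. Names are invented; per the caveat above, a real
# deployment should also disallow /trap in robots.txt so legitimate search
# engine crawlers don't wander in.
from flask import Flask, request

app = Flask(__name__)
flagged_ips = set()

@app.route("/")
def index():
    # display:none hides the trap from humans; naive crawlers that follow
    # every href will still hit it.
    return '<a href="/trap" style="display:none">do not follow</a>Welcome!'

@app.route("/trap")
def trap():
    flagged_ips.add(request.remote_addr)
    return "", 204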

Sorry to go on and on but it is a fascinating subject.........
 

aikonhey

resting
Jul 14, 2008
67,131
581,752
i sent a pm to them and got a reply, and i quote:

"Dear Customer,

If the files are constantly removed from time to time, then please avoid re-uploading such files.

Sincerely,
File Joker Team"

so i hope those files only have to be uploaded once or twice, with the poster then adding mirror or backup links. please make this an exception.

btw, filejoker has no copy feature.
 
  • Like
Reactions: Ceewan

SamKook

Grand Wizard
Staff member
Super Moderator
Uploader
May 10, 2009
3,739
5,146
For affiliate accounts, premium renewal - pay?

I contacted them when mine expired and they renewed it for 2 months.
 
  • Like
Reactions: rascom
Mar 28, 2008
740
312
i sent a pm to them and got a reply, and i quote:

"Dear Customer,

If the files are constantly removed from time to time, then please avoid re-uploading such files.

Sincerely,
File Joker Team"

so i hope those files only have to be uploaded once or twice, with the poster then adding mirror or backup links. please make this an exception.

btw, filejoker has no copy feature.
The section aikonhey posts his links in should be exempt from the filejoker rule.
 
  • Like
Reactions: aikonhey

SamKook

Grand Wizard
Staff member
Super Moderator
Uploader
May 10, 2009
3,739
5,146
The problem exists everywhere on the forum; it's just that they don't seem to protect their links against bots in that section when they post (from what I've seen, correct me if I'm wrong), so things get deleted very fast.
 

colweb

Jade Collector
Uploader
Sep 2, 2010
506
1,980
I did ask them about the affiliate program and what they considered extreme content (as that isn't allowed). Got this answer:

Hello!

Our partnership is based on a regular PPS plan. It means that when any user buys a premium account (using your link), you will receive your affiliate commission (65% of the sum). Or if your referred user pays for a rebill, you will receive 50% of the sum. It's very simple.

Extreme content means niches like scat, zoo, trans, cp, r***, etc.

Skype: filejoker.webmasters
Sincerely,
Max


It seems that I can't post any Jades anymore, at least not with filejoker links. For those of you who don't know them, Jades are mostly scat, r*** and masturbation vids.

Very nice to have a mandatory sponsor on a board that is all about JAV movies when the sponsor doesn't allow all of the JAV movies. It seems like the same situation as with keep2share, which didn't allow 'extreme' content at some point, but they at least came up with fboom, where we could post these kinds of movies.
 

zoolanimal

jav fisting connoisseur
Apr 29, 2009
3,367
5,491
I did ask them about the affiliate program and what they considered extreme content (as that isn't allowed). Got this answer:

Hello!

Our partnership is based on a regular PPS plan. It means that when any user buys a premium account (using your link), you will receive your affiliate commission (65% of the sum). Or if your referred user pays for a rebill, you will receive 50% of the sum. It's very simple.

Extreme content means niches like scat, zoo, trans, cp, r***, etc.

Skype: filejoker.webmasters
Sincerely,
Max


It seems that I can't post any Jades anymore, at least not with filejoker links. For those of you who don't know them, Jades are mostly scat, r*** and masturbation vids.

Very nice to have a mandatory sponsor on a board that is all about JAV movies when the sponsor doesn't allow all of the JAV movies. It seems like the same situation as with keep2share, which didn't allow 'extreme' content at some point, but they at least came up with fboom, where we could post these kinds of movies.

akiba-online,

i managed to obtain a one-year premium account through one of the resellers. like colweb, i am concerned about the "extreme content" exclusion, since for the most part that is what i post: everything from fisting, rope and restraint bondage, sm, bdsm, faux r***, needle torture, nose hooks, scat, piss, wet and messy, and similar video i am sure could be considered extreme. of course, i have never had a post pulled because of content.

might i suggest a solution? carve out an exception to the filejoker rule: ALLOW USE OF OTHER FILEHOSTS FOR EXTREME CONTENT. just a suggestion.

e.g., personally i would continue posting filejoker links [and asking people to get premium accounts] as long as i was assured my one-year premium account wouldn't be pulled because of extreme content, and that i would merely be required to remove the offending post.
 
  • Like
Reactions: Ceewan

Ceewan

Famished
Jul 23, 2008
9,151
17,033
Yep, FileJoker doesn't seem to like free downloaders. I tried 5 times to download one file for free; no dice, it failed every time (and failed quickly). Sorry guys, I don't buy premium accounts. I partake in filesharing; if I have to pay for it, then someone is selling it, and that isn't the same thing at all. So I am personally done with FileJoker. Hopefully some of you will post mirrors for po' folk like me, or I will just stick to torrents (no premium account required).
 

xbrit

Member
Sep 13, 2011
65
18
Still baffled about how to get a FileJoker premium account. I have no idea how they can stay in business given how difficult it is from the US.

My Visa simply doesn't work, I think Bank of America must have banned the payment processor.

I've tried half a dozen resellers. Most of them claim to take PayPal, but when you try it, it turns out they really don't.

The alternative payment schemes almost all seem to involve huge extra costs, so a $40 membership would end up costing around $150.

Bitcoin seemed promising, but it turns out actually buying bitcoin is excruciatingly difficult. I feel like I'm doing a drug deal with all the hoops. Still trying, sigh.
 
  • Like
Reactions: Ceewan