Tumblr was removed from Apple’s App Store over child pornography issues (www.theverge.com)
128 points by ikeboy | 2018-11-20 08:50:04 | 148 comments




Can't say I'm surprised. Tumblr has been so full of (non child) pornographic and outright weird material that it was due to attract this sort of folk as well. It really detracts from the platform for the rest of us, who just want to use it as a micro-blogging platform sitting between Twitter and Medium (where strangely, Tumblr has been horrible in finding its position).

Yahoo should either clamp down and meanwhile give the platform a solid update, or they'll probably shut it down sometime in the future. I can imagine the cost of running it must be pretty high.


I don't know about the user groups, but instead of outright shutting down because it couldn't be what it wanted to be, it should investigate and become what the users want it to be. If that's an amateur porn platform, so be it.

It can do that while prohibiting child pornography. In general, Tumblr is pretty bad at its product; maybe it's a Yahoo thing.


> ...if it is an amateur porn platform, so be it.

Copyright raises its ugly head here. Yahoo presumably can't legally encourage the outright IP theft that would entail.


I don't agree with this slippery slope of pornography somehow leading to cp. Tumblr is an image hosting site; any and all image hosts are used to distribute that kind of content.

I don't think it's a "slippery slope of pornography somehow leading to cp" but rather once people realize that there is no moderation and no rules and no one will shut them down, it turns into the Wild West. If criminals learn that there are no police officers anymore, they know they can get away with committing any crimes without punishment.

Since tumblr will host things other sites won't, eventually someone will push the envelope and if they don't get shut down it will get pushed further and further. It's not about "any porn -> child porn", it's a progressive furthering of what is/is not allowed, and once you learn everything is allowed and nothing is banned, child porn (and other illegal content) is an inevitability.


Agreed. I didn't mean to imply that any porn leads to child porn (that would be ridiculous); my point was about oversight, which Tumblr lacks.

> this sort of folk

You mean the teenagers (i.e. minors) posting pictures of themselves? You act as if Tumblr is attracting pedophiles, when it's really just failing to moderate the platform.

Failure to moderate the platform is exactly what's led to Tumblr becoming a haven for pedos.

I got downvoted for this comment, but wanted to mention after the fact that today tumblr just announced they'd restrict adult content on their platform...

There is crazy stuff sitting on Imgur, too.

Link 404s.

What's the point of linking a crawler? Is there any particular insight regarding the crazy stuff in the code of the parser?

> There is crazy stuff sitting on Imgur, too.

But none of it's accessible unless you have a direct URL, and even then the pornographic crazy stuff is taken down quickly. The Imgur front page is pretty sanitized.


The first few years of Imgur data sits behind five randomly generated alphanumeric characters. Quite easy to brute force...
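
Rough arithmetic on that claim, assuming IDs are drawn from case-sensitive letters plus digits (which may not match Imgur's actual scheme):

    import string

    # Size of a 5-character, case-sensitive alphanumeric ID space
    alphabet = string.ascii_letters + string.digits   # 26 + 26 + 10 = 62 symbols
    keyspace = len(alphabet) ** 5
    print(f"{keyspace:,} possible IDs")               # 916,132,832

    # Even at a modest 100 requests per second, exhaustive enumeration takes
    # months rather than years, so "security through obscure URLs" is thin.
    days = keyspace / 100 / 86_400
    print(f"~{days:.0f} days at 100 req/s")           # ~106 days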

That is very different from getting to it from browsing.

Wait until the review team discover Safari...

If you look at browser apps and search engine apps, they're rated 17+.

My thoughts exactly. You can even just go to tumblr's website, surely.

Tumblr, like any website with image content, is liable to have illegal images on it. Apple removing it, but not removing imgur, reddit, or any other site as an app, is playing favorites and making excuses.

It's difficult enough to have reliable, safe sites where artists can create and share legal artistic pornography, but with Tumblr being forced to mass-wipe things they're deleting both SFW and NSFW content, most likely based upon reports, which in Tumblr culture are typically fake.

There should be grounds for suing apple in mistreating the market in this manner, as there is real loss of profits involved, but I doubt it would work. Either way Tumblr is gonna lose a good amount of its userbase.


>imgur, reddit, or any other site as an app

Except for the fact that both reddit and imgur have quite heavy handed moderation in place.

Mature content (even when marked) is often removed from imgur. Reddit will ban entire subreddits if NSFW content goes anywhere near legal questions.


That's largely false. Imgur has tons of porn on it, you typically just don't see it on the front page. Imgur is the place to host albums of porn in a convenient way.

Tumblr wasn't allowing the cp. As the article says it was immediately deleted. In what way is this different to reddit or imgur? I doubt either has better image recognition, if any at all.


The iOS Imgur app hides all the porn though. You can’t search for it on the iOS app either.

The Tumblr app, on the other hand, was HOW you would view porn. If you tried to go a NSFW Tumblr in Safari it would ask you to log in, and then bounce you into the app.


The removal from the App Store is supposedly over cp, not its accessibility to NSFW content. Imgur does not magically block porn, because it doesn't know what images are or aren't. Reddit relies upon NSFW tags, and afaik they can still be enabled through website settings.

On the topic of pornography access, Apple is incredibly immature about it. They have a fetish of trying to remove pornography access, which should be unacceptable.


That must have changed recently, no? Because a few years ago all the "dangerous" search terms were blanked in the Tumblr app. Not even straight-up porn: I was looking for photos of some top models, and if there was any kind of topless photo in the portfolio of such a model (like a Pirelli calendar session, for example) - sorry, no luck, had to go to the browser.

That hasn’t been my experience. You can search for whatever you like.

Not to mention there are actual browser applications in the App store, such as Opera. Not to mention Safari itself. Are they saying there are filters in place in those apps for every single website on the internet? Do they do image analysis on every single image ever downloaded?

That said, the way this works is if people report Tumblr they will ban them. So, it is either luck of the draw, or the problem is worse on Tumblr.

And that said as well, there has been plenty of child porn on Twitter over the years. (Unfortunately I have run into this firsthand because I run Twicsy, a Twitter picture search engine that has indexed nearly 8 billion Twitter pictures)


Sorry to tangent into the tech side of your post; but do you mind talking about what you use to do the filtering? I'm in a similar space, scraping content for textual analysis alongside the images, but certainly run into the same problem. That being said, 8 billion images is a whole other level of scale to what I've touched thus far, and I'm curious how you've approached this problem, because that's CERTAINLY beyond a manual threshold.

If I encounter truly objectionable content I remove a network of users and all of their content with a single click. It sometimes removes users that are innocent, but better to be safe than sorry. I also have a form on the site where you can report content, so I get help from visitors to identify the bad stuff. (Twicsy at its height had over 9 million visitors per month)

I know you were being a bit hyperbolic, but that is actually an interesting point. I wonder how long it will be before web browsers start real-time fingerprinting images that are downloaded and checking against the database. It seems like an inevitable progression.

Depending on the fingerprinting method that could be easy to overcome, servers can send unique images in response to every query, eg crop differently, add icons, or add subtle colour casts. Overcoming hashing checks could be done by varying packing bits or metadata.

there are perceptual hash algorithms overcoming exactly these issues

https://en.m.wikipedia.org/wiki/Perceptual_hashing
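
For a sense of how these work, here is a minimal sketch of one of the simplest schemes, an "average hash" built with Pillow; it is illustrative only, not the specific algorithms covered in that article:

    from PIL import Image  # pip install Pillow

    def average_hash(path: str, hash_size: int = 8) -> int:
        """Minimal 'aHash': shrink, grayscale, threshold each pixel against the mean."""
        img = Image.open(path).convert("L").resize((hash_size, hash_size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p >= mean else 0)
        return bits  # a 64-bit fingerprint for the default 8x8 size

    def hamming(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    # Two images are "probably the same" if their hashes differ in only a few bits;
    # this survives re-encoding, small crops, and mild colour shifts far better than
    # a cryptographic hash, which changes completely if a single byte changes.
    # if hamming(average_hash("upload.jpg"), known_bad_hash) <= 5: flag_for_review()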


I wonder how resistant those are to distortion. The examples on the Insight page are pretty impressive, but fairly naive approaches. I feel like, if you knew the algorithm, you could probably manipulate your image data in a way to make it appear almost the same but come up with a very different hash. Similar to adversarial images in computer vision.

https://spectrum.ieee.org/cars-that-think/transportation/sen...


Or put it in an encrypted 7z archive. Or any number of workarounds where you're not sending image data so the filter can't detect you.

Sure, but I don't think Apple expects you to be able to filter this.

As a platform you can't really filter this, unless you blanket ban all archive files. For Tumblr this might be viable, but for a file hoster it wouldn't be.


Tumblr at least used to let you add arbitrary HTML to your blog, I remember putting base64 encoded gifs on mine that the browser would render. Unless they removed that (which IMO, would pretty dramatically change the general feel of the site) there's a real limit to what they can block.

Perceptual hashing is certainly resistant to some distortions, primarily affine transforms.

Unfortunately there's not any easy or sufficiently strong defense against someone gradient descent-ing your algorithm.


There is an entire YouTube cottage industry of posting full pirated movies with distortion. The distortion goes beyond simple things like color casts, so I'm guessing they are trying to escape some sort of perceptual hashing. Often the movies don't take the full frame and there are lens-flare-like overlays put on the images. The audio is also distorted.

I do not very well understand the algorithms used for this purpose, but I imagined that they are somewhat resistant to easy circumvention -- otherwise what use are they in trying to filter CP? The folks who are trafficking in such things would probably be very interested in workarounds if they are easy to accomplish.

Detecting porn is actually not that hard because you can look for skin color. With a bit of machine learning it’s totally doable and there are firewall solutions that can block porn content in real time.

I had friends at a startup that was doing this 10 years ago and they were doing a marvelous job. Don’t know what happened to it, blocking porn isn’t something that people really want to pay for imo.

But it’s totally doable and it doesn’t involve hashing / fingerprinting.
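
For illustration only, a minimal sketch of the kind of naive skin-tone heuristic being described, assuming a Pillow dependency; real products (including whatever that startup built) would use far more than this:

    from PIL import Image  # pip install Pillow

    def skin_pixel_ratio(path: str) -> float:
        """Crude heuristic: fraction of pixels inside a rough 'skin tone' RGB range."""
        img = Image.open(path).convert("RGB").resize((128, 128))
        pixels = list(img.getdata())
        skin = 0
        for r, g, b in pixels:
            # A classic (and very rough) rule of thumb for skin-like colours
            if r > 95 and g > 40 and b > 20 and r > g and r > b and max(r, g, b) - min(r, g, b) > 15:
                skin += 1
        return skin / len(pixels)

    # A naive filter might send anything above a tuned threshold for human review.
    # Beaches, portraits, and sand dunes are exactly the false positives this produces.
    # if skin_pixel_ratio("upload.jpg") > 0.4: queue_for_review()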


Detecting porn and child porn are very different. Also, detecting porn with no false positives and no false negatives is also much harder.

An image classifier of yesteryear had some problems with sand dunes: https://gizmodo.com/british-cops-want-to-use-ai-to-spot-porn...

> blocking porn isn’t something that people really want to pay for imo.

Sell it to libraries, or any other place that has publicly-available computers.

(My local library has a regular wifi network, and a second one that doesn't filter NSFW content.)


Are you suggesting that a censored network is "regular"-- at a library?

Depending on the location, if porn is available, homeless people will come in and masturbate. (I'm not sure whether that problem is endemic or occasional... and they'll come in to masturbate anyway, but porn is arguably a superstimulus and may affect the frequency of incidents.)

Just like 98% of all front doors are easily opened with brute force. Just because you can circumvent something doesn't make it useless.

For sure most of what people think of as security is presentation.

But the context here is hashing all images in order to catch illicit images, particularly child pornography.

Presumably people who distribute such images take a rather more practical approach to security.

So implementing a universal change to the internet needs to work against those using bank-vault level security rather than easily-picked, easily-booted down, easily bypassed "domestic front door" level of security.

If you implement something that will work in 99% of cases in the net then you'll let through billions of images from those who take a combatative approach to law enforcement. The result being no effect against the criminals you want to catch.

(Your local version of GCHQ will be able to reduce their database size though.)


There are algorithms developed by Microsoft and others which are specifically designed to normalize images as best they can before processing in order to reduce false negatives due to alterations like cropping, color shifting, watermarking, etc.

They then use perceptual hashing techniques to identify similar images. It's a game of cat and mouse but ultimately I think with ML, the mouse is going to lose.


Edge will be the first, of course. But it will be "opt in" at first hidden beneath a wealth of misdirection.

All the big software companies share databases and code that helps identify CP. I would be surprised if Tumblr wasn't using these resources, just as I would be surprised if reddit or imgur weren't. These resources make it relatively easy to do the bare minimum required here.

Didn't Tumblr say, though, that they did use those resources but the content in question wasn't in the image fingerprint database?

With Tumblr specifically it's user-generated content that's the problem. In high school, there were people who posted racy pics on there all the time.

This sort of action is very infrequent and I'd be shocked if there wasn't solid reasoning behind it. Tumblr has been passed around several times and is currently ultimately owned by Verizon. I wouldn't be surprised if Verizon is essentially ignoring Tumblr and they have ignored a child porn problem that is much worse than that on imgur, reddit, twitter, etc.

This, Tumblr simply doesn't do enough to guard against the problem. The simple steps they could take that would have avoided this are:

- Clearly word their TOS/AUP to forbid any depictions of minors in a sexual context. They are a private organization, it is well within their rights to exclude any content regardless of whether some might have fringe legal status.

- Assign even one single moderator to pro-actively hunt down obvious keywords, blogs, and content.

- Have an easy 'report post' and 'report blog' functionality. Ban false reporters accounts.


> There should be grounds for suing apple in mistreating the market in this manner

It's a private curated app store, they can do whatever they want with it.


I would bet there are civil precedents that grant some recourse, but none that would guarantee they'd have an app at all, win or lose.

We're coming upon a time where some private markets have too wide of effects to not be actionable.


I suspect Apple's defence would always be "we only have 15% of the smart phone market".

The key question in many antitrust cases is: what is the “relevant market”? Their worldwide share of sales is probably not the touchstone in a US antitrust case. Certainly plaintiffs would argue this, and point to their larger market share domestically.

So you would make the same argument for Facebook, YouTube too?

Or, as I have learned from HN, different rules apply to Apple vs rest of the tech companies?


> So you would make the same argument for Facebook, YouTube too?

Absolutely. They can do whatever they want with their platform. Facebook bans people and groups all the time based on its own whims; Google and YouTube as well.


The Romans built all the roads; if you dislike the Romans, just give up any traffic.

>Apple removing it, but not removing imgur, reddit, or any other site as an app, is playing favorites and making excuses.

That doesn't match what they are claiming though.

Apple requires such sites to have a filter to prevent child-porn and the claim here is that Tumblr was removed for not having one that works. If those other services have a filter that works (sufficiently, I guess) while Tumblr doesn't, then it's not playing favorites, it's just having a policy and enforcing it. To limit the dissemination of child-porn. It might be hard to sue for that.


Safari to be removed from iOS 13 due to child pornography issues because some websites don’t have a filter

Apple apps automatically get an exemption from all App Store rules.

I think the guideline applies to apps with user-generated content.

Now, I'm sure you can come up with an argument about why this still applies to Safari... but it's also easy to come up with an argument about why it does not.

Certainly, the relationship between the content and the app is much more tenuous for Safari -- where the content is everything published on the web by anyone -- and the Tumblr app, where the objectionable content is hosted by and for the Tumblr app.

In the end, you can argue Apple should have drawn the line about responsibility for content in a different place, but clearly a reasonable line can be drawn that allows Safari but doesn't allow Tumblr (with a broken child-porn filter).


Safari has a site filter, Tumblr doesn’t.

How effective filters are is another question, but Apple has long stated rules that apps must not provide access to “objectionable” content without some kind of access control which doesn’t seem to meaningfully exist on Tumblr.


Tumblr has NSFW filtering. You have to be logged in to see NSFW tumblrs, and have it active in your settings, iirc.

I’m aware, I use it regularly. I’m also saying it doesn’t work.

Safari doesn’t host content.

You can look at Backpage as a very similar example, and in fact, if Tumblr does have child pornography on it and is not actively taking it down, taking measures to prevent it, and reporting it, then it should be shut down just as Backpage was.

Tumblr is to child porn what LiveJournal was - the site's admins made a very deliberate choice in what they filter and don't filter (courting a user base that likes pornographic depictions of characters that appear to be children) and it's not surprising to see a reaction from more mainstream folk.

Wasn't Backpage predominantly an adult site?

Craigslist stopped adult ads after FOSTA passed https://www.rollingstone.com/culture/culture-news/craigslist... But BackPage, which was essentially a CL clone, carried on until the owners were arrested

https://www.azcentral.com/story/news/local/arizona-investiga...


I am not sure either way. I remember consensual adult sex workers complaining their job got more dangerous after Backpage was killed. (It allowed them to better vet customers; afterwards they went with whoever said yes at a truck stop - more danger.) Don't know much more.

From article: "In its updated statement, Tumblr said that while every image uploaded to the platform is “scanned against an industry database of child sexual abuse material” to filter out explicit images, a “routine audit” discovered content that was absent from the database"

Same thing as removing Twitter because a non filtered post showed up in someone's feed.


So basically, the same problem rooted in management at most tech-heavy companies: Tumblr likely focuses every single dollar spent on development towards pumping out useless new features in under half the time those features would require to finish properly, sacrificing quality for quantity – every damn time.

Some developer at Tumblr is now grumbling, "I fucking told you we needed to spend more time on the filter system, and that it wouldn't work properly when you forced me to ship it to prod after only 3 weeks instead of the 6 weeks I told you it would take". Someone there knows that the cron/API that updates that database is non-functional or untested, but nobody higher up the chain of command gives a shit.

Source: have dealt with this kind of management time and time again. In particular, "Agile" development has destroyed this industry. You tell management the task will take 6 weeks, so they assign 3 weeks to the ticket in Jira, and suddenly that means the feature has been promised as being deliverable in 3 weeks. If not, it's somehow your fault for not meeting deliverables.


There is no filter system in the world that will catch all cp. It doesn't exist because current systems rely upon known content. It's image recognition, and image recognition can be bypassed. Either by an image not in the database, or by modifying one that is.

Every website works off the same technology and database to detect child pornography (PhotoDNA). If it slipped through Tumblr, it would on any other site or app as well (including Apple's).

Most websites also have efficient reporting takedown procedures. Seems Tumblr's are a bit ropey.

The claimed issue was that the database was missing material. This could be as simple as Tumblr stopped updating their database and didn't notice, and so newer images weren't caught.
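
If that is what happened, it's the kind of failure a trivial staleness check would surface. A hypothetical sketch (the path and sync cadence are made up):

    import os
    import time

    HASH_DB = "/var/lib/filters/known_csam_hashes.bin"  # hypothetical local copy of the hash list
    MAX_AGE = 24 * 3600                                  # assume the sync job should run daily

    def hash_db_is_stale(path: str = HASH_DB, max_age: int = MAX_AGE) -> bool:
        """True if the local hash database hasn't been refreshed within the expected window."""
        return time.time() - os.path.getmtime(path) > max_age

    # Wiring this into monitoring means a silently broken sync job pages someone,
    # rather than quietly letting newer flagged images through.
    if hash_db_is_stale():
        print("WARNING: hash database has not been updated in over 24 hours")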

It's their platform, and they should be able to do anything with it.

That's in fact one of the reasons I don't use an iphone and don't develop iphone apps.


I disagree. From my experience, I was shown porn with what appeared to be underage actors, which I reported, without looking for it. The format of sites like Reddit, Imgur, etc. does not lend itself to discovering images you're not looking for, in my experience. Reddit, for example, is divided into subreddits that are moderated, as well as some being marked NSFW.

What I don’t understand is why multi-million and billion dollar companies can’t try harder. On many sites a simple search will find forbidden content that they somehow can’t find? Gimme a break.


> ...with Tumblr being forced to mass-wipe things they're deleting both SFW and NSFW content, most likely based upon reports, which in Tumblr culture are typically fake.

Isn't this Karma out to get them? What Tumblr has done to its users, Apple has done to Tumblr.

> There should be grounds for suing apple in mistreating the market in this manner

There should be grounds for suing Tumblr, Twitter, YouTube, and Facebook for presenting themselves as platforms for expression, then turning around and acting as censors.


I stumbled upon this page: http://www.differencebetween.net/miscellaneous/fashion-beaut...

By DDGing "legal artistic pornography". Note the comment section has this:

""" Nope. There is no difference between art & pornography in art: The distinction is an imaginary one .

I shall now debunk our authors premises, with historical context.

“1.Nudity in art has been an accepted subject since ancient times while pornography is an unacceptable expression which has been developed in later years.”

Nope. This is a common misconception, but images & texts pertaining to the act of sex & sexuality, what our author would argue is porn, has existed in art since mankind first found a way to scrawl rudimentary images having sex.

Later societies constantly pretend that earlier societies didn’t do it, from the erotic frescoes of Rome, to the carvings of India’s Khajuraho Temples, to the 35,000 years old carving from ancient Germany (about 10,000 older that the “Venus of Willendorf”).

“2.Nudity in art is natural and is meant to let the audience appreciate the human body while pornography is intentionally done to arouse sexual feelings in the audience.”

Then you’ve just declared any amount of ancient classical art pieces to be pornography, since many of them were designed to arouse.

” 3.Models of nude artworks pose naturally and do not convey any eroticism while pornographic models have expressions and poses that are erotic and sexually arousing.”

What does one mean pose naturally, any pose one can do with the human body is by definition natural. Such a distinction as you are trying to construct is a distinction without a distinction, because the poses one would use to arouse are as possible by humans as the pose of eating.

” 4.Nudity in art is accepted in society while nudity in pornography is seen as inappropriate and is banned in most societies.”

Which just goes to show the double standard. The subject matter is the subject matter, which is why this tactic has been used for centuries to try to get artwork banned. You can’t show kids this image because it’s porn…. Oh you mean Michelangelo’s David is porn? How is it porn? It has nudity in it.

“5.In some cases, what one person may view as art might be viewed as pornography by another depending upon how the presentation affects the audience.”

Which again shows that the difference between what makes up pornography & what makes up art is an artificial distinction, that only exists in the minds of people who imagine it exists.

Read more: Difference Between Nudity in Art and Pornography | Difference Between http://www.differencebetween.net/miscellaneous/fashion-beaut... """

My reading is that there is no such thing as artistic pornography.


Another interesting bit of data, ngram of "child pornography":

https://books.google.com/ngrams/graph?content=child+pornogra...

Or you can compare that with the unqualified term:

https://books.google.com/ngrams/graph?content=child+pornogra...

A detailed assessment of what is or isn't obscene:

https://books.google.com/books?id=ccQf_RJ4eXUC&pg=PA5&dq="ch...


Wait, I don't understand, I can differentiate between nakedness and sexual nakedness, even with sexual partners. The distinction would be hard to describe beyond "trying to be sexy while being naked," but it's there.

I do see a difference between pornography and simply art in which someone is nude.


https://en.wikipedia.org/wiki/I_know_it_when_I_see_it

Supreme Court Justice Potter Stewart: "I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description ["hard-core pornography"], and perhaps I could never succeed in intelligibly doing so. But I know it when I see it, and the motion picture involved in this case is not that."

There are things that we understand in practice but fail to objectively define. That's because pornography is a social idea that changes with circumstances. 'Artistic pornography' is when the pornography has artistic value. Artistic value is when the intention of the art is to depict more than the sex: typically portraying love, acceptance, insecurity. A medium which is meant to create emotion, a story that doesn't necessarily have to be told through words. That has much more artistic value.

Tumblr had quite a few artists that would depict these things very well.

But really when I said that I meant porn that's well made. Problem being that people lump all porn together, and it's just not the same things. Videos on the front page of Pornhub vs manga with some romantic sexual relationships for example.


Tumblr's reporting process for illegal images is terrible. Like, I reported someone posting sexually explicit images of themselves whilst claiming to be underage, and about two months later I got back a form email telling me that they were finally looking into it and might or might not do something about it.

First, "mistreating the market" isn't a thing -- a store is under no obligation to continue to sell a product they don't want. There may be an anti-trust claim to be made about walled gardens more generally, but there is no appetite among regulators to make it, and such a claim would be more properly directed toward Google on the basis of current antitrust thinking. Apple can give Tumblr preferential or worse terms for any reason or no reason at all.

Second, as it relates to enforcement. Your post shadows the fact that you disagree more broadly with removing child pornography because of collateral damage. To be totally frank I find that line of argumentation viscerally disgusting.

But it's also the wrong frame for the argument: you additionally claim that given Apple wants to crack down, they are cracking down the wrong way. I am not sure what the issue with selectivity is. Why is it a problem to apply higher standards to large, commercial applications who could easily be coerced to improve enforcement (as opposed to, like, Joe Random making some app that finds images online)? Why, when choosing among large commercial applications, should Apple not go after the worst offenders (in quantity of violations or in their current enforcement protocols) first?

This seems like an efficient use of enforcement resources on Apple's part. One target at a time, optimizing for those who are worst in complicity and most able to take action?


"First, "mistreating the market" isn't a thing"

It's clear what Apple and other app stores do. They are a market that makes plenty of money, and they mistreat it. There are losses of revenue over this mistreatment, and that should be actionable. Whether it is or isn't is a matter of law, but law doesn't cover what is and isn't right.

"Your post shadows the fact that you disagree more broadly with removing child pornography because of collateral damage. To be totally frank I find that line of argumentation viscerally disgusting."

That's a strawman. By that logic, if I say that taking down the internet because cp exists on it is bad, then I'm defending cp. I'm saying that Tumblr is going on a spree of mass deletion to appease Apple, not addressing the initial problem - the cp found on their site - in any targeted manner. I'm saying that all platforms have this, and targeting Tumblr for it in this improper way, by removing the app, is doing far more harm than good. If Apple cared about solving the problem of cp they'd get in touch with Tumblr and tell them effective methods they should implement that are fair relative to other apps, and only remove the Tumblr app if they didn't follow through. Instead, now those who were using Tumblr to share cp will take to a different site, instead of facing repercussions.

If anything, Apple could potentially be interfering with ongoing investigations of cp by their lack of cooperation. I think that form of illegal activity goes beyond just throwing things into chaos to set examples.


A straightforward way to move a lawsuit like this forward might be to sue to have apps similar to Tumblr taken down. Make Apple choose to either take down all of the apps or pay some large penalty based on damages that are easy to show - find the offending content in each app and cite the law(s) that were broken.

Basically prove to Apple that they risk the viability of their app store when they censor specific apps. With the full legal power of Apple at their own disposal, they might just find a clever way to keep this situation from ever happening again.


Tumblr is not being forced to do anything. What do you gain by using the app instead of the website?

There really needs to be a FOSS model for detecting this stuff. As someone who got in trouble with the police hosting user generated content I've essentially stopped doing it entirely, although I'd want to.

There are some services that you can funnel content through, but getting access to them is tricky and getting a hit adds some reporting and compliance requirements, not to mention I'd be piping all my users' images straight through Microsoft's servers.


You've got an interesting problem there: who hosts the training or comparison set for such a FOSS system?

Specialised police units.

There is: PhotoDNA. The name sounds neutral, by design, but don’t let it fool you: unless your boss explicitly tells you this is your project now, I’d recommend you stay far away.

Well, I’m not sure it’s FOSS or anything like that: it’s maintained by the police (a weak assortment of the services that handle that in different countries) and it has a key feature: it’s a bloom/hash filter of sorts, i.e. you don’t need to open the image to make it work (because even opening the file would be illegal). It recognises photos similar to those already flagged (which are absolutely not free, open or available in any way for obvious reasons). A typical implementation is to move the flagged image in a dedicated folder with very strict access-control, keep really reliable traceability and send details to your local dedicated police team. Even access to the filters is tightly controlled too, because the wrong kind of people could use it to sift through a lot of images.

Because it’s a fairly standard tool for any image hosting company, I’m surprised Tumblr has been shamed. Either they messed up their integration, or they ignored police requests. It’s odd.
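
A rough sketch of the flag-and-quarantine flow described above; the hash-matching call is a placeholder rather than PhotoDNA's actual API, and the paths and reporting hook are hypothetical:

    import hashlib
    import logging
    import os
    import shutil

    QUARANTINE_DIR = "/srv/quarantine"           # locked-down directory, strict ACLs (hypothetical)
    audit_log = logging.getLogger("csam-audit")  # would be an append-only audit trail in practice

    def matches_known_hashes(path: str) -> bool:
        """Placeholder for the real hash-matching service (e.g. a PhotoDNA integration)."""
        raise NotImplementedError

    def handle_upload(path: str, uploader_id: str) -> None:
        if not matches_known_hashes(path):
            return  # hand the file to the normal publishing pipeline
        # Read raw bytes only to name the quarantined copy; never decode or display the image.
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        dest = os.path.join(QUARANTINE_DIR, digest)
        shutil.move(path, dest)
        audit_log.warning("flagged upload %s from user %s quarantined to %s",
                          digest, uploader_id, dest)
        # notify_law_enforcement(digest, uploader_id)  # hypothetical reporting hook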


> even opening the file would be illegal

Not an expert, but I've worked with large companies that used humans to filter user content. Either they were breaking the law en masse deliberately or there is a safe harbor for doing manual filtering.

Again, I don't know the answer, but I'm very skeptical that it's always illegal to manually filter user content. Presumably swift deletion (and cache purging, if possible), reporting and record-keeping would be important. Although that does raise the issue of how you preserve evidence for police to take action if you delete it immediately... geez this is such a landmine.


There is an affirmative defence in most jurisdictions for reporting cp you come in contact with.

Not an expert either, but I believe that once it’s flagged as (likely to be) CP, you are not meant to open it. Reviewing content en masse is obviously OK -- but I would not recommend you do that without a PhotoDNA first.

Couple of corrections because I got several things wrong:

- credit where credit is due: the technology was developed by Microsoft;

- it was developed as a generic image recognition software, hence the neutral name;

- it is a hash, not a bloom filter; not sure why I remember it as a bloom; it doesn’t appear to be able to recognise new images;

- the technology is widely used, but it was donated by Microsoft to International Centre for Missing & Exploited Children (ICMEC) which isn’t loosely connected but, yes, has close ties with all local law enforcement on those questions.

It is available on Azure Marketplace, so widely available. The technology is still being expanded to work on videos and could be used to identify terrorism-related content. Because it’s meant to be resistant to alterations but not infer new content, it is not able to flag new content: that has to be manual. (Someone at the ICMEC has a horrifying job.) I would assume that it is generally implemented by flagging some images in uploaded albums and sending the whole album, or images from the same uploader, for manual verification. That would explain what I’ve heard about some images being potential.

To come back to the original story: it seems more likely that Tumblr’s problems are related to new images.


late reply, but I reached out to PhotoDNA at the time after getting the first user report.

They told me to fuck off, no individuals wanted. Their site just states you need to "go through a vetting process".

Apparently you de-facto need to be a corporation to allow user uploads without getting fucked now.


If you've been on Tumblr recently, you'll know that it's a pretty hands-off place. You get the general sense that Yahoo! fired everyone who worked at Tumblr and now it's just running on a server somewhere.

As a long time user I think this is largely a misrepresentation. It really feels same as it's ever been.

Yeah, but the Tumblr code and infrastructure was always a bit special. Few sites of this dimension had so many major bugs.

One commenter on the OP article pointed out an interesting aspect of it: it's widely required, and considered totally feasible, to preemptively filter pornography from user-uploaded stuff, and yet even giants like YouTube will fall under the burden of copyright filters demanded by the EU.

No wonder then that politicians usually introduce laws related to the Internet under the "save the children" slogan. It really does change the perception.


Hot dog or not hot dog?

An "is this child pornography" filter is a wildly different beast than "is this copyright infringement".

The crucial difference is something either is or is not child pornography, regardless of context. But whether something is copyright infringement depends on a whole variety of factors, and the exact same content uploaded by two different people might be a violation in one case and not in the other.


Of course, child porn is easier to identify. For a human eye.

Also, in case of Apple and their App Store it’s not only CP, it is all kind of NSFW content that has to be filtered out. And that can be nuanced.

It appears that for now the filter mechanism is simple: the content is scanned against a known database of bad stuff. So, a twist on Content ID.


I'm always surprised that these web 2.0 sites don't pay people to surf and flag content. It doesn't take much google-fu to find all sorts of stuff on the clearnet that shouldn't be there. I understand that removing offending stuff from Google search itself is very difficult but the walled garden sites like tumblr are much easier as most of the offending stuff is usually linked together by repost and/or likes. Find one page doing something it shouldn't and follow the likes and reposts down the rabbit hole flagging and removing as you go along.
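
A sketch of that "follow the rabbit hole" approach, written against a hypothetical moderation API (get_engagers, get_recent_posts, and is_flaggable are all made-up names standing in for whatever the platform exposes):

    from collections import deque

    def expand_from_flagged_post(post_id, get_engagers, get_recent_posts, is_flaggable,
                                 max_nodes=10_000):
        """Breadth-first walk out from one flagged post: follow likes/reblogs to other blogs,
        and queue every newly discovered flaggable post for review."""
        seen_posts, seen_blogs = {post_id}, set()
        to_review = []
        queue = deque([post_id])
        while queue and len(seen_posts) < max_nodes:
            pid = queue.popleft()
            for blog in get_engagers(pid):            # blogs that liked or reblogged this post
                if blog in seen_blogs:
                    continue
                seen_blogs.add(blog)
                for other in get_recent_posts(blog):
                    if other in seen_posts:
                        continue
                    seen_posts.add(other)
                    if is_flaggable(other):           # human reviewer or classifier verdict
                        to_review.append((blog, other))
                        queue.append(other)           # keep following the rabbit hole
        return to_review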

It doesn’t help that Tumblr had no way to flag posts on mobile (both mobile web and in-app). There was too much friction involved in sending a report which is unlike every other social network I use.

There are a lot of weird quirks with Tumblr that I don't understand like how some pages will only open up in the dashboard.

Those are marked as "private" if you view those blogs in an incognito window (not logged into Tumblr) it will tell you that the blog is private and you must log in to view the blog. That's entirely the user's decision to make their blog private to the public.

If true, this is probably why the app hasn't been restored to the store yet. Apple requires apps that show UGC to have a reporting mechanism. In fact, if your comment is accurate, I don't know why Tumblr was ever allowed on the App Store to begin with.

1: Sign up to webtwopointoh.com as a content-remover

2: Add loads of unwanted material to webtwopointoh.com

3: 'Find' said unwanted material, claim bonus

https://en.wikipedia.org/wiki/Cobra_effect


4: Go to jail for possessing "unwanted material".

Exactly. There's no way I'm going to download, much less upload, "unwanted material" anywhere. No thanks.

0: Have a collection of unwanted but not illegal material.

To add a whataboutism, what about Snapchat?

Two things:

1 ) I'm unfortunately not surprised. For my girlfriends and me, Tumblr was basically a place to express our lesbian-ness and collect 'tasteful' nudes.

It wouldn't shock me in the least to find teenagers doing a similar thing, which, of course, would be illegal.

While others are saying that Tumblr was sort of picked on unfairly here, I'd argue the percentage of NSFW content that's easily-accessible on Tumblr is, in my personal experience, much higher than that of many other social networks.

2 ) That folks are still using Tumblr is a surprise to me. It seems as though many of my friends and I had a mass exodus about 4-5 years ago.


> It seems as though many of my friends and I had a mass exodus about 4-5 years ago.

Seems common for platforms to wind down and for exoduses to occur, but I haven't heard of another tumblr-like. Where did people head off to?


Nowhere, to be honest.

We read books. We listen to records. We have actual human interaction. It's awesome.


>Where did people head off to?

twitter


Is there really a point to being this strict about child pornography? I'm still uncomfortable with the implications of certain configurations of numbers being illegal, and it's unlikely these laws/bans protect any children, unless there are people who exclusively rape children to distribute the resulting images and videos, which seems dubious.

I can understand individual sites taking down such images, but anything further seems like overreach.


The more restricted an item is, the more a few remaining loyalists will pay to obtain it. We've seen this play out a thousand times with drugs. I infer, then, that there's probably a small set of people making a mint by literally raping children to distribute the resulting images and videos.

And I still think enforcement of these laws should be absolutely draconian. This isn't a victimless crime: real lives are being absolutely destroyed. It's not like legalizing weed where suddenly you have a lot of people using it openly and responsibly. There is no acceptable degree of child rape that we can tolerate in society.

I'm 100% with you if we were talking about file sharing. I'm 100% opposed to you on this specific subject.


The problem is that the definition of child pornography is fairly broad, and the laws make little to no distinction between the various categories within it. Draconian enforcement is great for the hardened criminals profiting from child rape, but it also leads to locking up teenagers for possessing photos of themselves.

I think you're conflating "child pornography" with "child rape" here and that might not be entirely fair. Ignoring things like 17 year olds sexting, my view of CP is that it's evidence of a crime, but shouldn't be a crime, and certainly not at the level it is now. The biggest harm - the rape - has already occurred. Someone funding it is one thing - that's almost certainly conspiracy to commit that crime. But possession by someone completely unrelated to the act?

Just like videos of people being graphically murdered, for instance. I'd put 3guys1hammer up with anything else on the internet in terms of "should not exist", but that doesn't mean I think it should be illegal to possess/distribute/view.


> certain configurations of numbers being illegal

All matter and objects can be broken down into a configuration of numbers (of electrons, protons, etc.). Either possession of things can be illegal or not. To exempt "digital" things will be less and less a clear line as technology advances (3d printers and, presumably, eventually "replicators"). I don't know the answer, but "they're just numbers" doesn't help us solve these difficult questions.

A gun is just iron and various other metals arranged in particular configuration and yet we don't have a problem regulating metals in that manner. Bits are just bits until they're in a configuration that is illegal under current law.

Maybe the law should be different, but saying "it's just numbers" isn't very illuminating, imo.


No, having an image of [thing] is not the same as having [thing], ceci n'est pas une pipe. Replicators would change that, true.

Reductively, a gun is a pile of atoms, like everything else, but unlike child pornography, it has the capability of acting upon the material world.


Calling it a "certain configuration of numbers" is just willfully ignoring the context of it. To me it would be like calling anthrax a "certain configuration of atoms", it manages to both be literally true while also misrepresenting it.

Fun fact: Anthrax is naturally occurring, and not really that deadly if correctly diagnosed.

Your primary care doctor can treat you for Anthrax, and you'd likely be fine. The only difficulty in 2001 was that individuals were in contact with it that didn't realize, and mistook it for a common cold or flu.

Edit: The motivation of that attack was the fear, not the deadliness. In 2001 only 5 people died, and only 17 were infected. As we've seen, a single gunman can do a lot worse without requiring special training or equipment.


The issue is that it's creating a market for such content, and the production of the content inherently entails harm (sure we can split hairs over whether a 17 or 16 year old is mature enough to produce porn, but I think we all agree that there exists a threshold where that is no longer the case).

While I do think concern about over zealous enforcement is warranted, and opens up potential for abuse (e.g. embedding illegal content in other files to troll or extort people), I don't object to the principle behind banning it even if it is "certain configurations of numbers".


> it's unlikely these laws/bans protect any children, unless there are people who exclusively rape children to distribute the resulting images and videos, which seems dubious.

When there is a market for something, that will result in more of it. And, you can’t generally make child porn without children.

It's not a huge logical stretch to suggest that some content ought not to be legal, especially when the participants are unable to legally decide if they want to participate. Banning child porn isn't the slippery slope to full censorship it is often portrayed as. There should be some common sense involved. Banning child porn doesn't lead to banning political speech or even distasteful speech; societies should have some limits on speech, and I would suggest that child porn is well within reasonable limits and not necessarily leading us down a road of total censorship. Plenty of legal freaky stuff out there that it seems reasonable to exclude children from the mix.


Legal child pornography wouldn't mean legal child rape though, or even being able to legally purchase it. Could be something like what France does with prostitution, where only soliciting is criminalized, e.g. purchasing and production remain criminal, but not mere possession, which technically harms no one, even if it triggers strong disgust.

In fairness, the only point of this would be a more purist implementation of freedom of speech, but well, the status quo is rather pointless too.


> When there is a market for something, that will result in more of it.

So, we should ban not only production, but third-party possession or distribution of any photography/video depicting either results of crime or the actual commission of crime, because it creates a market for crime?


In the case of images of child sexual abuse we know that's true. We know people trade images to get access to websites, and we know people need to trade newly created images to get access to some websites.

We also know that the idea these images are out in the world is something that causes trauma to the victims.


So the fact that some CP sites have "keep out the feds by requiring OC" requirements is evidence that there would be a pay market for CP if it was decriminalized. I don't think I agree with this...

It's pretty easy to make the opposite argument, in fact. People want access to CP, either because they're sick, they're in the age range in question, whatever, and so they go to the only place that has it available - darknet sites that require OC to view. So they go out and create more to gain access, harming kids.

Right now, the way to gain access to CP is to create CP, which of course incentivizes creating more. This is arguably worse than the world in which it's available to all.


>Right now, the way to gain access to CP is to create CP,

This is false; there are plenty of free websites, and people who leak "pro" content on those websites. I'd like to see some statistics on paid versus unpaid content, and whether abuse is actually incentivized just to gain access to more. As far as I know, the research on this topic is conflicting.


Absolutely there is. Demand drives production. So even if you for some reason considered the viewing/possession of it as fine, it's driving those who go out and make the material. There's realistically probably not one way to end it forever, but curbing the demand side of it inevitably curbs production. To what degree, I'm not sure anyone knows...

The general theory is that the continued distribution of child pornography is itself re-victimizing the person or people depicted.

Is the OS responsible? The browser? The client application?

Ugh, I'm sure no one will really care about this comment but tumblr absolutely has a terrible child porn problem.

I basically stopped using tumblr for porn because of how much nude, as well as non nude pictures of kids I've reported to tumblr.

It's a big issue. It's also incredibly easy to find child porn on there and despite reporting it, it always seems to pop back up.

I get that people will post shit regardless but it really seems like they don't try to do anything but delete the single blog that has been reported.

This is a site that's based around people "liking" and "reblogging" content. It's insane to see a picture of some kid being exploited, see how many blogs have reblogged and/or liked the picture, and then see how tumblr doesn't even attempt to address these blogs that clearly are doing the same thing, clearly liking the same things, and clearly reblogging the same pictures as the reported blog.

I've reported blogs for harm to minors well over 20 times. It's a drag.


Yes, agree. I left for the same reason.

The irony is that the same mechanism that allows that content to propagate is also the perfect tool for eliminating it. Liked/reblogged the content? Banned.


I can't call myself too surprised considering that 'tumblr' and 'filtering' aren't so much strange bedfellows as perfect strangers.

I mean, hell, apps geared towards sexual encounters do a better job of filtering content. Grindr, Tinder, etc. Meanwhile, tumblr is pretty much unfettered access to whatever you fancy, not necessarily able to be bound by iOS' own filtering options such as Parental Controls in an elegant way.

Trouble is, if tumblr raised the app's minimum age to 17+ as with a lot of other apps, that's a great deal of the app's userbase that can no longer access the service. On the other hand, heavily filtering content now means a huge backlog of content to sort through, with plenty of legitimate, if highly erotic, material ripe for getting falsely flagged as inappropriate.

tumblr's in a "damned if you do, damned if you don't" situation, though one entirely of their own design.


So first Gab, now Tumblr. The only ones that remain, and very likely will always remain, untouched because they stand on higher moral ground: the holy and almighty lords and saviors, Twitter and Facebook.

One of these things is not like the other.

Please elaborate.
