A global decentralized encrypted datastore with anonymous publishing (github.com)
329 points by adulau | 2012-07-31 | 223 comments




... in 410 lines of code.

Well, he's already built Celluloid (for handling concurrency) and DCell (for handling the distributed part).

Just learned about those two projects, thanks.

Just learned that he's the maker of Celluloid/DCell. Both pretty awesome!

Also this is the second p2p system I've tried to develop: https://github.com/tarcieri/distribustream

It should be noted that most of the documentation is theoretical. There isn't really much implemented yet; you can't use it yet.

For what it's worth, most of my time has gone into working on Celluloid, which is the concurrent object framework I'm using to write the Cryptosphere, as well as taking the rather intensive http://crypto-class.org

Tony Arcieri, be prepared for airport searches and harassing interrogations.

edit:

Well, at least that's what the CryptoCat author got


Depends how he plays it. The author of CryptoCat and also Jacob Applebaum (who works on TOR) have been outspoken in the media about their gear and its intent (and in Nadim Kobeissi's case, he's not an American citizen).

I'm betting that Tony could get away with keeping things on the downlow and not getting harassed too badly. But this is an empirical question. We shall see.


Small correction: it's Tor, not TOR.

I thought that Jacob Applebaum gets harassed due to his connections to Wikileaks. The Tor Project was US government-spawned after all (wasn't it a Navy project?).

People get searched and interrogated at borders all the time. That doesn't happen only to crypto guys, not by a long shot.

So, you might as well give them a sensible reason to interrogate you :)


You should read the AMA of Nadim Kobeissi, Cryptocat author, on Reddit - http://reddit.com/r/IAmA/comments/x9q0a/im_nadim_kobeissi_cr...

Edit:

I wanted to post not the AMA, but this:

Cryptocat creator, Nadim Kobeissi, talks about his border interrogations - http://www.wired.com/threatlevel/2012/07/crypto-cat-encrypti...

> So, you might as well give them a sensible reason to interrogate you :)

They don't know who I am ;)


The project has enough legitimate uses that I don't see the need to hide my identity.

I'd think so too, however I'd be too paranoid to publish this. Actually I haven't published a similar thing I made myself not too long ago.

Fantastic idea. What happens though if a peer sharing data goes offline? Does that person's data disappear as well?

Or can you build up enough "creds" to keep your data in the cloud for some time after your node disappears?


> Or can you build up enough "creds" to keep your data in the cloud for some time after your node disappears?

Bingo, from what I understand. More to the point, the data will persist as long as users continue to transfer it.


How do I avoid unwittingly getting CP stored on my device?

You don't. Why?

Maybe he actually cares about the children?

This is equivalent to boycotting companies that sell cameras, hard disk manufacturers, computer vendors, ISPs, etc, to "think about the children".

There is a big difference; using this system you could directly facilitate the distribution of data that you do not want to be associated with.

Buying a camera doesn't directly help the creation or distribution of info you consider to be immoral or unethical. Obviously you may want to use a camera maker or ISP that actively discourages such behaviour.

I'm sure you would feel extremely guilty if someone used storage facilities that you controlled to spread CP. On the other hand, I've never felt guilty about buying a camera as I've never let anyone use my camera for such purposes.


But this system doesn't allow you to distinguish between files. You either use the whole thing, or don't use it at all.

In the camera example, you have the option of not letting people use your camera for nefarious purposes, which isn't the case here. It's more akin to supporting the camera manufacturer, who, in turn, makes things that might be used by people for crime.


> I'm sure you would feel extremely guilty if someone used storage facilities that you controlled to spread CP.

If I didn't know? Definitely not.


OK, interesting. I guess it's a personal thing. For me it's not just a question of whether I know or not. I also have to consider if there is anything I could reasonably do to prevent such things from happening. If there isn't, then I would consider whether I really need to use the system (for example, is this my only option because I live under an oppressive regime?).

I think this was the point of the GP asking, "How do I avoid unwittingly getting CP stored on my device?".


So you wouldn't work at a company like Dropbox or any other of those file storage systems? Or a VPS/cloud provider? Or an image hoster?

And what about physical things? If you're working a retail job, the customer buying a piece of rope may be planning on using it for tying people up. The guy renting a car may be planning to use it in a hit.

Ultimately, it's not possible to keep track of everything people do with the services we offer.


Even if you were wilfully ignorant?

Even the most ethical manufacturer uses raw products down the line that in some way have passed through unethical processes. Consumer choice and vocal dissent is only one part of the equation, the other is active support of prevention and awareness programs. Privacy for us means privacy for all, but you can still help: https://www.google.com/q=child+abuse+prevention

> I'm sure you would feel extremely guilty

Lots of people have no sense of guilt whatsoever. It's not a good thing, but with an enabling community saying "it's not your fault" it's awfully easy for people to pretend there was nothing they could do.


Or it's just that some people are rational about what it is worth feeling guilty about.

If he _actually_ cares about the children he'll do something about it, like acting against child abuse in his community, rather than caring if CP is somehow encrypted in data located on his computer.

Maybe he doesn't want to risk jail time. And that's not even supposing he doesn't want to aid the distribution of it.

You get "something" stored on your device. You don't know if it's a student's English paper, homemade pornography, nuclear launch codes... or CP. You just get a block of bytes that sits on your drive, and maybe, somebody somewhere has the key to unlock it.

Until that happens, you don't know, you don't need to know, and you can't know, that's the whole point.


Pretty sure this makes it illegal in the UK under our crazy encryption laws [1]. I definitely wouldn't want to be the one to test it.

[1] http://news.ycombinator.com/item?id=4234614


It might well be illegal under crazy laws, but not the one you link to. Beneath the vitriol a very important point was made: the prosecution has to prove beyond reasonable doubt that you have the key and that you're refusing to hand it over. The presence of software like this, i.e. traces of your participation in such a network, is reasonable doubt. And even with that proven, refusing to decrypt is what they get you for, not what they think you've got encrypted.

If you only have portions of alleged illegal files on your machine, you won't be able to decrypt them. As far as the blocks of data on your machine go, they're indecipherable rubbish.

That's it! The governments take our privacy and freedoms one by one, and we are afraid to fight it or even say NO...

They have won. The people are already under total control; nobody can take the power from the people ruling this world.


This is also the reason there is no public Tahoe-LAFS network. To run a node on a network like this you need to be brave, much braver than even operating a Tor exit node.

The long chain activity log may add some disincentive to upload CP in the first place, but getting the economics of that (e.g. ease of getting upload privileges) right sounds like a difficult balance.


As someone who ran the public-facing node with a web interface for the public grid for a while, I can assure you that it is less concerning than running a Tor exit. You will know the IP address of everyone who downloads and uploads from you via the web interface, and you won't know the data of those who only use your node for storage.

The only complaint I ever received was from Sony after those "Geohot revenge hacks", in which they claimed that a hacked user database from one of their services was copyrighted by them, while at the same time they denied that it actually was the real user database.

We did some searches for the URL of the web interface and discovered that it indeed was mostly used by "Anonymous" as a Pastebin alternative, at which point it was decided that we did not want to aid in the spread of personal information, which is why it was shut down.[1]

So you are right that there is no public grid because of unwanted files, but not because of child porn; rather, because of some script kiddies.

1: http://article.gmane.org/gmane.comp.file-systems.tahoe.devel...


Going by the description, if you know a file's plaintext, you know both its plaintext hash (decryption key) and its encrypted hash (storage key). The communication to request files uses the storage key. It would be rather easy to identify, by statistical means, people who request files of a common type, for example known CP images. Forwarding a few requests for CP or having it stored doesn't indicate anything; reading or writing a lot of known CP would be probable cause for a search. This is especially easy because the majority of CP is not new, and the FBI presumably has the whole damn lot on a hard disk somewhere. They can easily pre-compute hashes and just sit monitoring.

As a result, this network would be useless for CP-mongers; they'd get caught about as easily as using plain old FTP.

On the other hand, I do think the ability to pre-compute hashes is a flaw that massively reduces this network's usefulness to dissidents. It is quite effective as a "publish a manifesto" network with a secret writer and overt readers: the original write is of a never-before-seen file, so there is no hash to monitor. It's unsafe as a "store my pirated stuff" network, since writers of a well-known file can be tracked. And it would take very little statistical monitoring to reveal the interests of a reader.
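
To make that concrete, here is a minimal Python sketch of the convergent-encryption scheme as described (standard library only; the toy XOR keystream stands in for a real cipher, and the actual Cryptosphere construction may differ in details):

    import hashlib

    def keystream(key: bytes, n: int) -> bytes:
        # Toy SHA-256-in-counter-mode keystream; a stand-in for real AES.
        out = b""
        counter = 0
        while len(out) < n:
            out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return out[:n]

    def convergent_keys(plaintext: bytes):
        # Decryption key: hash of the plaintext. Anyone who already has
        # the file can derive it.
        dec_key = hashlib.sha256(plaintext).digest()
        # Deterministic encryption: identical plaintexts yield identical
        # ciphertexts, which is what enables deduplication.
        ct = bytes(a ^ b for a, b in zip(plaintext, keystream(dec_key, len(plaintext))))
        # Storage key: hash of the ciphertext. Peers request blocks by
        # this key, so a monitor can precompute it for any known file.
        storage_key = hashlib.sha256(ct).digest()
        return dec_key, storage_key

    # A monitor with a corpus of known files just precomputes storage
    # keys and watches request traffic for matches.
    known_file = b"contents of a widely shared file"
    _, watch_key = convergent_keys(known_file)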


It would be trivial for a user to layer some encryption on top of the service, preventing anyone on the network from doing any analysis of what they are transmitting.

Not if it has to be decrypted as soon as it reaches the forwarding nodes; the way to monitor it is to run black-hat nodes.

If I publish gibberish, all you know is that I have published gibberish.

index sites could have password-protected RAR files of AwesomeNewMovie-1080P-SceneHipsters.rar (like they already do if they're using file lockers)

wouldn't the hash be different for every RAR password used?

I guess the problem is that once any copyright-infringing hash is found, it is trivial to search the network and find everyone who has transferred it?

edit: although if the only writer is the uploader, would you be able to tell who read the infringing copy vs. who just has it because they're part of the network?


One could simply modify the known file in any way (say, appending garbage) to change the decryption key, ciphertext, and storage key.
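
Using the convergent_keys sketch from earlier in the thread, the garbage-appending trick is a one-liner:

    import os

    # Appending a single random byte to the known file changes the
    # plaintext hash, hence the ciphertext, hence the storage key.
    variant = known_file + os.urandom(1)
    assert convergent_keys(variant) != convergent_keys(known_file)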

However, in a direct peer-to-peer exchange the Cryptosphere does nothing to mask the transactions a particular peer is performing, as opposed to systems like Freenet and Tor which make an effort to obscure which host you're actually talking to by routing them through a chain of proxies. In this regard the anonymity guarantees of the Cryptosphere are no different from a system like BitTorrent, aside from the plausible deniability defense that comes from the fact all content is encrypted and peers automatically provide storage service to other peers.

Instead, the Cryptosphere favors system robustness over guarantees on anonymity. Participants in the system maintain a history of their activities in the form of a long-chain certificate. You can think of this being somewhat like the BitCoin block chain, where the longest version always wins, and its integrity can be cryptographically verified. Every peer maintains its own long chain certificate of all its activities, including services requested and services completed.

Rather than verifying the integrity of a long chain based on hashes, the Cryptosphere uses public key cryptography. Peers requesting services sign off on both the request and delivery of a service (e.g. storing and serving a particular chunk of a file). While in isolation the data points contained within a particular long chain certificate are meaningless, peers can collect several of these certificates and build a database of other peers in the system, using tools like collaborative filtering to make intelligent decisions about which other peers are worth interacting with.
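
A hypothetical sketch of such a signed long chain, using Ed25519 from the pyca/cryptography package (the real Cryptosphere format is not specified here):

    import hashlib
    import json
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    class LongChain:
        def __init__(self):
            self.key = Ed25519PrivateKey.generate()
            self.entries = []   # list of (body, signature) pairs

        def append(self, record: dict):
            # Each entry commits to the hash of the previous one, so the
            # longest verifiable chain wins, as with the Bitcoin block chain.
            prev = hashlib.sha256(self.entries[-1][0]).hexdigest() if self.entries else ""
            body = json.dumps({"prev": prev, "record": record}).encode()
            self.entries.append((body, self.key.sign(body)))

    chain = LongChain()
    chain.append({"op": "store", "block": "ab12cd"})   # service requested
    chain.append({"op": "serve", "block": "ab12cd"})   # service completed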

This is interesting. If I understand correctly, this means that when searching for a particular file you'd like to download, you can trace back the provenance of that file through the network of peers who are making it available (if their transfer histories are accessible).


> trace back the provenance of that file through a network of peers who are making it available

That doesn't sound very anonymous, does it?


I could misunderstand what the intent is.

If this is just a distributed block based storage without an index that folks can search, then it's more akin to a way for you to ask folks to hold on to encrypted data for you for a while (without them knowing what the encrypted data actually contains).

I think it's more of a matter of how secure the block chain is. Note this part of the opening blurb:

To ensure quality service and prevent abuse, the Cryptosphere uses an integrated cryptographically secure reputation system which provides a distributed web of trust.


Holding someone else's encrypted data without it being anonymous doesn't sound like a problem that needs to be solved. Also, the UK has crazy encryption laws which could open any user up to very serious incarceration. It makes stealing music off The Pirate Bay or Napster look like jaywalking.

> Instead, the Cryptosphere favors system robustness over guarantees on anonymity.

You trace back the "provenance of that file" through cryptographic signatures. You could make your own throwaway identity, use it to publish something, and through the continued propagation of that data through the network its publication would no longer require your activity.

It should be considered pseudonymous publishing.


But since your throwaway identity has no reputation on the network, it is much harder to find a node willing to carry your data.

Which would create a need for reputable publishers who would review submissions and publish them if they were valuable.

You can confirm any given peer is holding a particular portion of a file (sometimes called the "confirmation of file attack") however you can't prove they published it.

The peer "block chain" will only contain transfer metadata, not specifically which files are transferred. That information is only known by the peers involved in the exchange.

And that's where the anonymity of the system is definitely inferior to FreeNet: peers involved in any given exchange know exactly what was transferred and to what IP address.

I decided not to solve the anonymous transport problem because Tor, I2P etc are working quite diligently on that and it's a hard enough problem in and of itself. I think this has been a big stumbling block for FreeNet.


Could this be used for https://projectmeshnet.org and how?

cjdns doesn't need a bartering system for handling traffic. It uses Kademlia and some novel routing concepts, peering is assumed to be mutual because it's already explicit.

Does this not set things up for a hash collision? They do exist, after all... if this scales to global proportions, using SHA256 without verification for deduplication (if I read this correctly) is a risk.

256-bit collisions are too improbable. The distributed hash table that BitTorrent uses relies on a 160-bit keyspace (SHA1), and that is seen as sufficient.

Assuming that there are no weaknesses in SHA256, you'd have to calculate a set of 4.8x10^35 hashes to have a one in a million chance of seeing at least one collision in that set[1].

If you could calculate (and store!) a trillion trillion (10^24) hashes per second, that would take about 15000 years. Needless to say, nobody has ever found a SHA256 collision.

[1] http://en.wikipedia.org/wiki/Birthday_attack
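
The arithmetic is the standard birthday bound; with collision probability p over n random 256-bit values:

    n \approx \sqrt{2 \cdot 2^{256} \cdot p}
      = \sqrt{2 \cdot (1.16 \times 10^{77}) \cdot 10^{-6}}
      \approx 4.8 \times 10^{35}

    \frac{4.8 \times 10^{35}}{10^{24}\ \text{hashes/s}}
      \approx 4.8 \times 10^{11}\ \text{s}
      \approx 15{,}000\ \text{years}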


There are 115792089237316195423570985008687907853269984665640564039457584007913129639936 possible SHA256 hashes, which is a few orders of magnitude away from the number of atoms in the observable universe. The chances of an accidental hash collision are vanishingly small.

I had half a design prepped that allowed a similar thing using bitcoin wallets - in order to buy space on the system, you have to have a bitcoin in a wallet, and use the comment on the commit of the bitcoin to the wallet to sign the piece of data...

Unfortunately it will end up full of kidporn just like Freenet. This says a lot about the present state of humanity, and what it says isn't good. Giordano Bruno and the Renaissance alchemists would have killed for a system like this, but the most interesting thing we can think of to do with it is fill it full of pictures of child abuse so people can jack off to them.

It's nasty stuff, too. This is not "merely offensive." Kids are abducted and murdered to make this stuff. In some cases, the act of making it causes bodily harm. (Think of the mechanics of an adult having sex with a six year old.) In parts of the world children are more or less raised for the purpose. It's horrific. Picture someone getting off on photos of war atrocities. This is up there with that, but worse... imagine that the war was held expressly for the purpose of producing the images.

Edit: there is also, of course, rape porn depicting adults being abused horrifically. That isn't any better, and freenets are full of that sort of stuff too.

I'm not saying the technology is bad. I'm saying that it says something very depressing about the users of said technology. It also makes me dubious about running nodes on such networks, since I know that a lot of the material I'll be storing and forwarding is abuse-porn.


I want to solve this problem.

I've thought about solving this problem too.

One idea I've had is a content-type-restricted network that permits only text. That would allow utterly un-censorable communications: chat, planning revolutions, whatever, but wouldn't be useful for CP. (Unless you like ASCII-art CP.)

It could support ANSI. That would be neato. It would feel like the old BBS world. Wonder if anyone would still care if a name like ViSiON-X were stolen for it. :)


couldn't you just encode the images into text?

Yes, which is why you keep humans in the loop: you design a system in which the will of humans who want the system to be used only for text prevails and which is resistant to attack by humans with contradictory agendas.

Yeah. You'd have to be more clever than that. There's a lot of interesting work on meaning extraction and text analysis. I wonder if you could have some kind of information density threshold.

It would be possible to encrypt text in such a way that things could be said about its information density but not about its meaning, too, though that would permit steganography to be used. But it would raise the technical bar for using the system for this purpose so high that it would probably drive away all the chickenboners.

There's also a dumb way: length limits. That would force binary data to be divided up into a huge number of posts, making it an annoying medium for file trading. Plotting the Iranian revolution would not require >1 MB posts to a forum.
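
As a toy illustration of the information-density idea (character-level Shannon entropy as the metric; a real classifier would need far more than this):

    import base64
    import math
    import os
    from collections import Counter

    def char_entropy(text: str) -> float:
        # Shannon entropy of the sample, in bits per character.
        counts = Counter(text)
        n = len(text)
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    prose = "We hold these truths to be self-evident, that all men are created equal."
    blob = base64.b64encode(os.urandom(600)).decode()

    print(char_entropy(prose))  # English text: roughly 4 bits/char
    print(char_entropy(blob))   # base64 of random data: near log2(64) = 6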


That's true. But it's also completely possible to build a machine learning algorithm to separate "real text" (text meant for human-to-human communication) from text-encoded images, since they differ in significant ways. Granted, you could always try to make a text encoding format which resembles real text, but I'm fairly sure the machine learning algorithm could be constructed in such a way as to make its usage infeasible.

Usenet was text only, and it led to the widespread use of uuencoded images and videos.

But couldn't one always use some binary-to-ascii encoding to circumvent this?

You can still encode any data as base64.

Usenet shows this kind of thing in action. It's now used for the most part for illegal file trading.


This has 2 problems:

* You can base64 encode any file, so it'll look like text. Limitations on message size might solve that.

* Sometimes, photos and videos are important. Think of the Abu Ghraib torture pictures, or the Tiananmen Square Tank Man. Sometimes photos & videos are censored and should be shared with the world.


Yup, that's true.

Maybe binary content propagates differently. Text that meets certain criteria is replicated indiscriminately, but binary content is only replicated when a user votes on it.

Edit: you could apply game theory to this problem. Model the network as a graph and write an agent-based modeling rule set for... say... CP-propagators and non-CP-propagators. Run iterative simulations of different propagation rule sets and weightings/parameters. Now introduce bad actors in the form of, say, government agents trying to suppress political discourse. The difference is that average-joe will cooperate in pushing out CP but will "defect" in a game with the other kind of bad-actor. You're looking for rule-sets and parameters where the CP gets pushed to the margins of the network or excluded but where the other kind of bad-actor is also excluded.
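
A deliberately crude sketch of that kind of simulation (all parameters invented for illustration):

    import random

    def reach(flagged: bool, drop_rate: float = 0.9, hops: int = 25) -> int:
        # Count how many relay hops a message survives; cooperating relays
        # drop flagged content with probability drop_rate at each hop.
        survived = 0
        for _ in range(hops):
            if flagged and random.random() < drop_rate:
                break
            survived += 1
        return survived

    trials = 10_000
    print(sum(reach(True) for _ in range(trials)) / trials)   # flagged: pushed to the margins
    print(sum(reach(False) for _ in range(trials)) / trials)  # ordinary: full reach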


That sounds like an interesting idea. It might be helpful for spam-filtering too. Sybil attacks could be a potential problem.

Now you've really got me thinking...

Could Bayesian classification be implemented through a homomorphic cypher?

http://en.wikipedia.org/wiki/Homomorphic_encryption


Limitations of message size won't solve that because the messages will be broken into hundreds of pieces, just like they are on Usenet. In theory you could limit the total amount of content coming from an end point, but it's probably not feasible in an anonymous system.

But base64 encoding is pretty easy to identify, as are most imaginable encoding schemes. Simply disallow them.

If the encoding schemes become so obscure as to not be recognizable, then the problem is still effectively solved.


How about 256 words (or sets of words) representing each byte value? It's less dense by a factor of ~5, but it would work, be easy to decode, and be very difficult to identify, especially if you used sets of words. You could even cleverly generate it in a way that is grammatically correct.
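
A sketch of that encoding with a made-up placeholder wordlist (any fixed list of 256 distinct words would do):

    import secrets

    WORDS = [f"word{i:03d}" for i in range(256)]   # placeholder 256-word list
    INDEX = {w: i for i, w in enumerate(WORDS)}

    def encode(data: bytes) -> str:
        return " ".join(WORDS[b] for b in data)

    def decode(text: str) -> bytes:
        return bytes(INDEX[w] for w in text.split())

    payload = secrets.token_bytes(16)
    assert decode(encode(payload)) == payload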

You're playing cat and mouse then, and you'll never win. If base64 was banned, I'd base26 encode it (i.e. letters of the alphabet), then use the NATO phonetic alphabet "alpha bravo charlie". The message size would be massive, but the message would get through.

How about a reddit-like ranking system as a portal to the darknet? It would easily categorize content into niches that would be curated socially. It could even help further ostracize the CP crowd to a marginalized role even in the darknet. It would enable anyone to flag the abhorrent niches and focus on the actual benefit of enjoying free exchange of information.

It doesn't get rid of them, but that's something we wouldn't really be able to do anyway (they all existed prior to the internet), and it would make the darknet usable.


A reddit-like ranking system would merely ensure that only mundane, low-brow and easily-digested content would be popular.

Which is exactly what we need to get this technology to become popular, which is critical to making this kind of mesh useful - there's a quite literal network effect.

If we are discussing potential solutions, maybe a friend-of-a-friend network could somehow help thwart this issue. Even if CP etc. was traded in the hypothetical foaf network, it hopefully would remain in an isolated island of sorts. Anonymizing a foaf network could be a bit of a challenge though.

Seems to me the only possible solution is a network of trust. Start small and build outward over time. This was the approach Facebook took to solve the "real identity" problem on the Internet at scale.

I've thought about this many times. How can you create a system that preserves the anonymity of publishers in persecution situations but excludes the child pornographers? The difficulty comes down to the fact that there is no technical difference between the two classes of publisher, just a moral one. Suppose the persecuted wants to share pictures of children being abused by government forces?

One idea I had was for the system to be semi-anonymous. Publishers would form public groups and the publication of content comes from the group as a collective. The members of each group are known, but the specific originator of the content within the group is not. This is the Spartacus model of anonymity :)


Oh wow. There's a discussion about an anonymity network being used for horrible stuff, and people say "I want to solve the problem and make a network that can't be used for horrible stuff"?

I want to stop people from doing horrific shit to other people in the first place. Unfortunately, I have no idea where to start...


With a sufficient technical background, you could work for the FBI helping to monitor, decrypt, and track these a-holes.

That... is actually a very good idea!

I think you could make an argument for a 'war on paedos' (for example) being as useful as the war on drugs, crime, terror, or whatever.

The difference with this one is that you can conduct your 'business' entirely online, with almost complete anonymity. And what liberal solution is there that doesn't involve children or agitating the mob?

Technology is definitely the wrong thing to look at, I agree. I think, historically, it would be like blaming speakeasies for allowing people to drink illegal booze.


I think it's pretty easy to argue that. Child molesters cause way more suffering than the rest (well, depending on stats, but still), so I think that's a much better way to spend your resources.

What if you only connect to those you know? If you don't trust someone, don't connect to them. It causes other problems, but those are much more solvable.

Edit: If Google can use reputation to solve search, why can't reputation be used to solve this?


I think that all you can do is to have some system of classification and rating validated by a web of trust reputation system.

As I evaluate content on the network, I classify it and rate it. My identity associated with those things is established, though (possibly) anonymous. Over time, islands of trust within the web will form that can be used to help filter large amounts of content.

If I start a node, I can link to islands of trust to only allow verified content acceptable to my filters to pass through my resources.

It's not perfect. Some will attempt to game the system by building up trust and then attempting to sneak content through. Some will attempt to hide illicit content in innocuous content.


I think you would need to limit the scope from an entire decentralized, anonymous "network" to just a decentralized anonymous website or discussion forum.

Something like an anonymous decentralized HN or reddit with mods that have the ability to ban posts, topics, & users. It wouldn't be as "free" as Tor or Freenet, but with the right group of benevolent dictators it could be as free and useful for a certain niche topic like politics or news.


Probably not.

The non-repudiable tracing of exchanges makes it easy to trace consumers of a piece of data.

See their note:

> In this regard the anonymity guarantees of the Cryptosphere are no different from a system like BitTorrent, aside from the plausible deniability defense that comes from the fact all content is encrypted and peers automatically provide storage service to other peers.


...everyone knows child porn is bad. Its existence, though, does not negate the usefulness of a system. Not that cryptosphere is going to be the open government revolution, but your comment is a useless vomit of emotional blather.

Okay, I'll re-state it technically:

How do we design a system that is anonymous and un-censorable where users can opt out of being relays for certain types of data?

Hard problem. Trying to solve it would be interesting. Not trying to solve it would make you identical to FreeNet and Tor and all the other efforts in this area, and thus less interesting.

I agree that there is no 100% solution to this problem, since all data can be converted to any format. There is also no 100% solution to pollution in a city, for example, or public health, or usability of a GUI. But there are 90% solutions that could make the problem marginal rather than severe.

BTW, on my "emotional vomit:"

Tell me. If these networks are for real human beings to engage in open communication, what happens when one of these real human beings comes across... say... a picture of a little girl being cooked over an open fire like a pig. (I didn't see this, but I was discussing the Tor .onion network on Reddit and someone claimed they came across this. I believe them.) Do you really think that person is going to return to this network to discuss... say... politics or economics or their local election?

It is a problem. To strip away the "emotional vomit," let's call it a usability problem. How do we make a freenet that is usable for non-psychopaths?

Edit: what I'm really saying is this:

Freenets have been done. It's a solved problem. Add some PK crypto and some hashing and some onion routing and shake.

What isn't a solved problem is: make a darknet/freenet that your mom would feel comfortable using. Make one that your average person -- maybe one with kids and thus really turned off by CP -- would want to one-click install from the Mac app store and browse.

THAT would make a serious political impact. Now you'd have hordes of average people using an utterly uncensorable chat system that was also hard to data-mine and tie to identity.

Right now, most people are going to start browsing the offerings that already exist (Tor is pretty easy to set up) and see stuff like "world's largest archive of hard-core lolita!", close the app, delete it, and never return. That's why these networks are not very popular, and it severely limits their political impact.


The problem is thinking that there is a way. You can't. Data can easily be obfuscated. As long as bits are flowing through your machine from sources that you don't control, you could be a middleman to anything.

api stated it was a hard problem. Hard problems dealing with cryptography can take years and sometimes produce PhDs when they are cracked. It is easy to tell it is a hard problem; it is hard to say for sure it is not possible. If you have done the hard work of showing it impossible, please share and save others from repeating your work.

This isn't about cryptography; this is about what data can be pulled out of bits transferred through your computer. There is no requisite that they be encrypted; they can be obfuscated, hidden, or just not recognizable as illegal to you. There is no way for you to verify that a collection of bits, put on your computer by a third party, does not in fact represent something illegal. You would have to have access to every existing and theoretical encoding, encryption, and obfuscation technique, and use them in every theoretical combination to verify such a thing.

There is at least one free Ph.D thesis topic in this thread. I think solving this problem and creating a darknet that most people would want to use would be at least as impactful as the development of Bitcoin. People said that was impossible too.

The hard work has already been done and it is easy to demonstrate the problem and the impossibility of telling 'good data' from 'bad data', assuming the system is cryptographically secure, as by definition a cryptographically secure system resists analysis of content.

Consider the problem of one-time pads. If I have two messages of the same length, one made of 'good data' and the other consisting of 'bad data', and I encode them both with different one-time pads, then it is possible for the resulting ciphertext version of each message to be identical. Another way of putting this is that for any given ciphertext that has been properly encoded with a one-time pad, the only information available about the plaintext is the length of the message (assuming you already know that a one-time pad was used) and nothing else.
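
That property is easy to demonstrate in a few lines of Python: for any ciphertext, pads exist that "decrypt" it to either message, so the ciphertext alone carries no information beyond length:

    import os

    good = b"meeting notes, 7pm"
    bad = b"something illegal!"
    assert len(good) == len(bad)

    ct = os.urandom(len(good))  # any random ciphertext of the right length

    # Construct a pad for each candidate plaintext; both are valid pads.
    pad_good = bytes(c ^ p for c, p in zip(ct, good))
    pad_bad = bytes(c ^ p for c, p in zip(ct, bad))

    assert bytes(c ^ k for c, k in zip(ct, pad_good)) == good
    assert bytes(c ^ k for c, k in zip(ct, pad_bad)) == bad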


api: > How do we design a system that is anonymous and un-censorable where users can opt out of being relays for certain types of data?

So not necessarily following any of the specifications of the system in the article.

I read his specification to mean that users are anonymous: they can post data and it cannot be tracked to them. I do not see this necessarily requiring that the data be filtered in an encrypted state, only that it cannot be tracked back to a submitter who took reasonable precautions.


The aim of "un-censorable, where users can opt out of being relays for certain types of data" seems possibly paradoxical.

Well, if no one opted out of any data type then it would be just like some systems we have today. If most relays opted out of a data type x, then the result would probably be that data type x would be less anonymous than other data types. It would take fewer conspirators to subvert the system for data type x, similar to how anonymity from government Z is effectively lost if it controls m% of the nodes on Tor.

Though I do not study cryptography professionally, that would be my current guess.


It is. What «api» has been proposing is literally, "I want a no-censorship network which I can censor."

That it is paradoxical does not necessarily make it impossible, though. The goals are certainly contrary but I am not certain that they are contradictory.

If you think about community-based censorship, this could probably be arranged even in an anonymity community, as long as it had active-enough participation. A popular search engine like Google can have tremendous ability to censor others even on a network like Tor where people cannot easily be censored.

The chief problem «api» faces is that his/her aspirations are too individualistic and unimaginative. You could always put the to-be-censored material in an encrypted archive and distribute the link to the material with the password to it; this sometimes happens with BitTorrent (and then you'd have to click on ads to get the password, and it becomes a nightmare). Then nodes cannot inspect the content. So what are you going to do, limit content-types? This was done by Napster, where only MP3s would be shared, but a piece of software quickly came out called Wrapster which "wrapped" other files in MP3s. There exist JPEG steganography tools as well, both hiding files within the least-significant bits of the image data and hiding them in parts of the JPEG which do not get interpreted by a normal JPEG reader (e.g. appending a RAR archive to the end of the JPEG image).
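
For instance, the append-to-the-JPEG trick takes only a couple of lines (file names here are hypothetical):

    # Data after a JPEG's end-of-image marker (FF D9) is ignored by normal
    # viewers, so the payload simply rides along inside an innocuous image.
    with open("cover.jpg", "rb") as f:
        jpeg = f.read()
    with open("payload.rar", "rb") as f:
        payload = f.read()
    with open("stego.jpg", "wb") as f:
        f.write(jpeg + payload)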

I say "too individualistic" as well because any sort of relay net where the nodes themselves inspect the content that they trade is going to expose itself to a possibility of systematic censorship. "I know that you know what you were sending me" is a horrible way to start your cryptosystem.

Nonetheless, there might be hope for a sort of global data-store which the nodes collectively take responsibility for, which nodes collectively trade and where nodes can vote to "veto" certain indexed files. The idea would be that you can't take down the data store by taking down individual nodes, you can't prove which node "uploaded" a file, and you can't necessarily fault the nodes for failing to down-vote a file tracked by the community since hosting the file is a collective decision, not an individual one. It would have to use central aspects of the design of BitCoin alongside central aspects of anonymity networks, but I don't see why it would be impossible.


I see a lot of people saying that you cannot analyze the data flowing through a system - and yet, I have to wonder, how then do people find the data they are looking for? It seems to be a rather fundamental theorem that if you know a URL to a file, no, there is no way to stop you passing whatever you want. But if you only have search, then surely the search itself implies that you have a way to classify the data!

Personally, I think this is enough to stop the spread (if not the storage) of horror. In other words, someone might safely store their cache on my computer without my knowledge (heaven help me) but I refuse to store anything that is searchable as a horror.


I don't think this system is searchable. You'll be storing a bunch of encrypted blobs, each of which can be retrieved and decrypted by anyone who knows a short key. One of them might be a CP picture (if you store enough of them, statistically one of them will be a CP picture). One of them might be an index full of other short keys, pointing to a whole host of CP pictures spread across the mesh.

If you're fine with that, that's good enough; you can run this system, and political activists and perverts alike will be able to stick their blocks there, accessible to anyone to whom they can pass the relevant short keys. But many people will be uncomfortable with even this much.


Yeah, this system does not appear to be searchable from the quick glance I took, but api's desired system does not need to model this one.

api's requirements:

> How do we design a system that is anonymous and un-censorable where users can opt out of being relays for certain types of data?

As long as we are reasonable and take "un-censorable" to mean very difficult to censor, and "users can opt out of certain types of data" to mean highly limiting traffic of data type <x>, it seems like a hard problem until proven impossible.


>I don't think this system is searchable.

How then can it be used to share anything? I can see how it could be used as a secure, distributed backup (which itself is rather handy) but I'm not sure how it can be used to distribute data.


It's (AIUI) meant as a replacement for pastebin etc. - you host something on this mesh, and then you only have to spread a short hash key around. It also lets you do the wikileaks thing of publishing a bunch of encrypted data which you could later release the key to.

I'll bite.

In the "OMG, think of the children" case: a number of cases that went public (some even linked here) agreed that legally, child porn is "I recognize it when I see it" kind of subjective. I'm obviously talking about teenagers here and different moralities or a missing context (such as those "taken for fun" or "sent to a friend, privately and deliberately" cases).

Api might, from his subjective view, decide that this as-yet-never-encountered image is bad/evil/perverse. How would you ever create an algorithm for that, other than 'api, please press a button that says "fine by me" or "no way hell", right next to the image in question'?


It is not clear to me that a system could be created that would be fine-grained enough to take into account individual preferences without general AI. I can imagine a rough-grained system where none of the standards match perfectly but some get close. A user would have to pick a standard and live with the good and the bad that came with it. Not perfect, but more choice than what you have if you sign up to be a Tor relay or run Freenet now.

Forcing such data to be obfuscated is a step in the right direction. At least we won't accidentally happen on it.

Regarding the social implications and imperatives...

I think that groups like LulzSec provide a public service (see "Why the Joker and Not Batman is the Savior of [sic] Us All" http://thisorthat.com/blog/why-the-joker-and-not-batman-is-t... ) in that they show the importance and the need for everyone to be security conscious. I wish there were more groups like this out there raiding and dumping stuff periodically.

I wonder whether it'd be politically or legally feasible to have a law enforcement agency which just trolled around the internet and attempted to crack services that citizens depend upon.

The reason why this is relevant and important to darknets is that currently the only folks who use darknets are folks who have something to hide. That might be folks who are illegitimately persecuted by governments, or folks who are legitimate criminals. These two groups are functionally indistinguishable, even if their intents and causes are different. They both have data they're trying to hide and communicate, without exposing themselves to authorities.


> How do we design a system that is anonymous and un-censorable where users can opt out of being relays for certain types of data?

If you do really want true anonymity and un-censorability as guarantees of the system design, then no, I don't think users can decide what they don't want to store or transmit. For, if they can, then their governments can coerce them into making the same "choice." Any preference that can be set by a user, can also be forced upon said user by a system administrator, operating system vendor, etc.

My real question is, do we need cryptography and anonymity built in at a protocol level to have something that's useful for political activism? It seems to me that there are only two real "innovations" these networks bring over, say, pushing encrypted blobs to people over SFTP drops (these, by coincidence, are both factors I've only really seen on Freenet):

1. That you have the ability to "push" content into the network, such that it will then replicate and spread through the network as it is accessed, without the possibility of an audit trail leading back to the source peer (even though the original source may know which client uploaded it, each peer only knows which other peer they got it from, so all you need to ensure anonymity is an internet cafe);

2. That content cannot be removed from the network easily--as there can always be dark peers who have copies of your data block, who will come online later and repopulate the network even if it has been seemingly purged of a block (by, say, all involved homes and data-centers being raided by the feds)--and that this happens pretty much transparently to the people involved, since people are always joining, leaving, and re-joining the mesh/swarm/whatever-it-is.

Encryption need only happen on a layer above this system, where and when it's desired. Anonymity need only happen at the end-points: the users can just access the system over Tor if they don't have the requisite internet cafe/seven proxies handy.

As long as you're just passing cat pictures around, why not just throw them onto a simple, infinitely-sized, everyone-can-create-files-but-nobody-can-delete-them DHT-based "disk"? And if you're passing political activism around, just encrypt and sign it like you were going to send it over email, then drop it in the mesh and email the URN instead. (This is presuming a stable PKI key-publishing/querying infrastructure as well, of course.)

And if you want to make it convenient for end-users, just make a browser extension that can load those URNs through the mesh as if they were regular HTTP URLs, and does the decryption and signature-validation automatically--and have the mesh software install that browser extension--and then you'll have something.
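
A toy version of that "dumb" lower layer, as a content-addressed put/get store (the URN scheme here is made up for illustration):

    import hashlib

    class BlockStore:
        # Everyone can create blocks; nobody can delete or overwrite them.
        def __init__(self):
            self._blocks = {}

        def put(self, data: bytes) -> str:
            urn = "urn:sha256:" + hashlib.sha256(data).hexdigest()
            self._blocks.setdefault(urn, data)   # immutable once written
            return urn

        def get(self, urn: str) -> bytes:
            return self._blocks[urn]

    store = BlockStore()
    urn = store.put(b"cat picture bytes")
    assert store.get(urn) == b"cat picture bytes"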


>As long as you're just passing cat pictures around, why not just throw them onto a simple, infinitely-sized, everyone-can-create-files-but-nobody-can-delete-them DHT-based "disk"? And if you're passing political activism around, just encrypt and sign it like you were going to send it over email, then drop it in the mesh and email the URN instead. (This is presuming a stable PKI key-publishing/querying infrastructure as well, of course.)

The latter is basically equivalent to this system, and so your system would have exactly the same problems: the only way you could avoid being a relay for child porn is to refuse to relay any encrypted content, at which point your node is not helping the political activism. Allowing unencrypted content also leaves you much more open to traffic analysis (if only a small fraction of data is encrypted, it's much easier to find the nodes that are inserting the political activism data).


Right, I'm not claiming that my suggested alternative is able to avoid what I might provisionally call the AUE theorem†; I was just suggesting the alternative as a way to separate out the "dumb" everyone-shares-one-infinite-distributed-hard-disk block-transfer layer from the "smart" only-certain-people-can-see-certain-things layer. The lower layer is an infrastructure protocol, with about as much ethical impact as the Internet itself (to be specific, it could be compared to, say, BitTorrent). The upper layer is where ethical responsibility comes into play.

We can probably convince your mother to download an app from the App Store that integrates just with the lower layer--hey, it's just like Dropbox, but bigger! [Well, as long as anyone and everyone can read random samples of your data if they like...]--because the upper layer, with the encryption and signing, will siphon off all the stigma of not-so-above-board usage of the protocol and attach it to itself. It's no different technically, but it is very different socially.

The advantage of having one reviled app on a larger infrastructure is that that reviled app gets to "hide" its blocks among all the above-board usage of the infrastructure. Like another poster in the thread said, if you go onto Freenet or the Tor Directory, the links to CP sites are plain and obvious, because it's a large part of what's going on there. But if you could look at your own disk usage as a node in this network, I imagine the number of encrypted blocks as compared to, say, plain-old MPEG frames of TV shows, would be vanishingly small. (And it'd be relatively impossible to define which is which, either, since this infrastructure has no "index" or metadata; it merely is a big bucket of blocks named by their content hashes, of which most, not just the encrypted ones, are meaningless unless you have another block giving the order in which to string them together to make a file.)

† "Anonymous, Uncensorable, Ethical: pick two."--named after the CAP theorem of database design. Well, it would really be the AUE conjecture for now--but I'd love to see someone prove it either way; it seems like the sort of thing that is amenable to that.


> it'd be relatively impossible to define which is which, either, since this infrastructure has no "index" or metadata; it merely is a big bucket of blocks named by their content hashes, of which most, not just the encrypted ones, are meaningless unless you have another block giving the order in which to string them together to make a file

If you can't tell whether a given block is encrypted data or just part of an mpeg, how can you choose to store only unencrypted data? I suppose you could make an argument for building this system on top of a nonencrypting distributed data store, like bittorrent, for the sake of looking like that nonencrypted protocol to anyone intercepting the traffic. But there would have to be some metadata that let the encrypting protocol know where to find its stuff, and if the user who's downloading it can tell, so can anyone intercepting the unencrypted stream. Wouldn't you just end up with a situation where the upper layer is to the lower layer as freenet is to the internet?


> If you can't tell whether a given block is encrypted data or just part of an mpeg, how can you choose to store only unencrypted data?

I didn't say you could :) The point of this alternative is that it separates the stigma 1%-99% toward the upper layer, but puts the implementation 95% into the lower layer--and therefore we get a stable, un-censorable distributed storage network on the lower layer with the "abuses" of the upper layer (CP and political activism both) being an unavoidable free rider, but not something "visible" (in the sense of seeing CP sites listed in your index directory) to people only using the lower layer.

This situation, of course, also describes the Internet as it is today: protocols like HTTP and SMTP are used by everyone, and also by some unethical people who send their stuff over those same protocols in encrypted containers using anonymizing proxies.

The difference here is that the two big hurdles--of identity-diffusion over time after initial data seeding, and of guaranteeing data persistence as long as there continue to be consumers of the data becoming persistent-caching peers--are taken care of by the lower layer, allowing the upper layer to just handle transparent encryption in whichever way it sees fit.

(And thus can we also replace the upper layer if we come up with a better way to anonymously and securely get the right metadata into the right hands, without having to throw out the network effect of all the extant peers. They simply start transmitting-and-caching blocks representing the new kind of metadata exchanges along-side the blocks representing the old kind.)


I doubt this is easily possible, with Tor as designed.

Maybe you could establish a blacklist of CP sites, and that could be applied at the entry and exit nodes of the Tor network. This blacklist would have to be public and checked by many to ensure it didn't contain non-CP sites, so in effect it would be a public directory of CP, which is problematic already.

Then, those that run entry and exit nodes could voluntarily apply the blacklist. In this way the Tor community could have its own values, while still being independent of any authority.

But this assumes that CP will remain restricted to certain domains in the .onion system, or the traditional DNS system. Which of course they won't. Maybe there will be one .onion domain per picture. Maybe there will be a Flickr of .onion where it's not so easy to figure out who's doing what. Then you'd have to lean on that service to police its own members' content.

I can imagine various messy and imperfect ways to limit the amount of CP in the world, or at least make it harder to find, but we just don't have good legal models for dealing with true freedom of speech. And our institutions today would rather persist in the fantasy that they can completely control speech, than accept that their role might just be to advise the citizens on how to police themselves.


You could require receivers to calculate a difficult problem where the difficulty was based on the "hate" people had for the content, similar to what Zed Shaw tried to do a while ago with IRC (?).

Basically, each person that didn't like the content would spend some CPU time to up the difficulty of transferring it. After a while, it'd take someone who wanted the content so long that they would give up, in which case the content wouldn't be transferred any more.
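
A hashcash-style sketch of that mechanism, with invented parameters (each "hate" vote adds a bit of required difficulty):

    import hashlib
    import os

    def proof_of_work(content_id: bytes, difficulty_bits: int) -> bytes:
        # Grind nonces until the hash has difficulty_bits leading zero bits;
        # the expected work doubles with every extra bit.
        threshold = 1 << (256 - difficulty_bits)
        while True:
            nonce = os.urandom(8)
            digest = hashlib.sha256(content_id + nonce).digest()
            if int.from_bytes(digest, "big") < threshold:
                return nonce

    base_difficulty, hate_votes = 8, 12
    nonce = proof_of_work(b"block-ab12", base_difficulty + hate_votes)  # ~2^20 hashes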


I disagree. I found his post to be very informative. I like the idea of an encrypted anonymous network precisely because I have this romantic notion of people of the world sharing important information with each other that can't be stopped or even tracked by governments or big corporations.

If the reality is that my contribution of resources to a project mostly just benefits a bunch of child-porn creators doing horrible things to innocent human beings, then I have to admit that my romantic notion is naive and behave correspondingly.

Sometimes you just can't have nice things because the worst members of society will criminally abuse them.


I've seen some terrible things on 4chan posted by trolls or spammers. Sparing my morbid curiosity, how much worse does it get? Are any lines drawn?

Worse. Ask in Reddit /r/onions.

Why can't there be a text only system for people that don't want to look at images? You would think that would be possible.

What's to stop someone from storing a base 64 encoded image in a text field?

Well, since the client won't interpret it as an image it will just look like spam garbage.

Easily filtered spam garbage.


CP will always exist. Especially if you group everybody under 18 into that category (a 17-year-old is not a child, a six-year-old is) -- a 16-year-old can't take nude pictures of herself without becoming one.

That Freenet is currently filled with it is a good thing -- it indicates how much pressure society puts on the people who are into this stuff, and that no other group, such as the KKK, gun rights activists, or astronomers, has that much reason to fear going public with what they write.

And any such group will quickly fill freenet up with other content, as it will be much bigger than the child abuse rings.


That's an interesting line of thought, but then where's that other sort of content from more repressive societies, of which there are many?

Most groups that would be on such networks (other than child porn producers/consumers) would be mostly political in nature. As such, their use of said network would mostly be to distribute textual material (forums/blogs/manifestos/howtos/newsfeeds/ebooks/etc). It would be material that takes up less space but is actually more time-consuming to produce. For example, writing an ebook takes longer than rolling the video camera for a couple of hours, and produces a much smaller (space-wise) amount of content.

Not really. Astronomy generates nearly endless amounts of data (although astronomers are no longer likely targets of harassment), and the other groups would be posting plenty of video (mostly recordings of police stops, howtos, etc).

And while 16-year-old sexting would be considered CP, I highly doubt they would be uploading it to any such network, if only because they don't want others to see them.


Your comment about "quickly filling freenet up with other content" seems to assume that many more people are interested in astronomy, gun rights, or the KKK than are aroused by child pornography.

Is that true? How would you know? Is there statistical information out there about what fraction of the population is aroused by child pornography? Given your statement about naked 16-year-olds sexting falling into that category, it seems implausible.


I've taken some time to consider this issue myself. As other posters have noted, it's impossible to stop in a system where data is transmitted. I prefer living in a world with free communication (with a channel for criminals) to living in a world with restricted communication. The true crime here is child abuse, not the sharing of data. We don't say people aren't allowed to associate in person without a government minder because they might be swapping photos of child abuse.

>...they might be swapping photos of child abuse.

But in this case we know they are swapping pictures of child abuse, we just don't know who they are. The problem is the inverse.


I know something - and that's how little you know. For one, if you did know, you'd be admitting to a crime.

But moreover, even if you investigated Tor exit-node traffic, you can't know if any porn you might see was actually being traded or merely continually transferred by repressive regimes to have an excuse to ban anonymity-providing services.

You suspect a lot, and have reached some conclusions, but you don't know what it is, let alone who if anyone is doing it.


I see these kind of services in a similar light to safe deposit boxes at banks - where you also have near-complete anonymous storage. I doubt any bank manager is losing sleep because child pornographers could be using his service.

The biggest problem with this mindset is proportionality at the moment - bank safe deposit boxes have much more public awareness than these services, and have thus gained wider use (and acceptance).


What if you walked into the bank and there was a line of people with sticky pictures of dead kids in their hands waiting to enter the safe deposit box area. Would you come back?

Here's a thought...

Put on your foil hat for a second. Let's say you were a government and you wanted your citizens not to use systems like Tor and FreeNet. Wouldn't flooding them with extremely disturbing porn be a great way to make sure these systems weren't used by anyone except CP wankers? Wouldn't it be a great way to get people to go along with outlawing them?

This is a technical problem. When you think about it like that, it becomes obvious that this is a vulnerability in the security/crypto sense. I'd state the problem this way: these networks are trivially vulnerable to a particularly devastating social engineering spam attack that renders the network virtually unusable by most people. Call it a social DOS attack.

Edit: I believe I can state the problem succinctly:

Design a darknet/freenet network that is anonymous, uncensorable, and yet is not trivially vulnerable to social engineering DOS attack.

Hard problem. Hard equals interesting.


This is actually a really interesting point. However, it's clear that absolutely any method of communication is vulnerable to the very same "social DOS attack," especially if we're imagining it to be perpetrated by the government.

Personally, I've spent quite some time on the .onion network and never been ambushed by child porn. Don't you think your problem is solved on the .onion network the same way it is on the clearnet: with moderation on a website-by-website basis?


I certainly see links to it fairly prominently displayed on the Tor onion network, which I know would drive off at least 75% of users.

What I'm saying here is that there's an interesting unsolved problem and that this problem might be the thing that's blocking the adoption of these technologies.

It's also a critical mass problem. I don't think you could run that sort of attack against the Internet because it has over a billion users. Once the network reaches a certain mass, it becomes far less of a problem. The problem is that CP-wankers (and possibly attackers) instantly colonize darknets, rendering them quickly polluted before they have a chance to escape their nascent phase. Like I said in another post: you could apply game theory here.


That scenario is neither anonymous (someone specific pays for the deposit box), nor publishing (since the contents are not made available to the public).

I wasn't saying they were equivalent, just that they possess similar moral issues. Privacy is a key feature of both, though you get far further with anonymous networks than with safe deposit boxes.

The specific issue with these networks is proportion of use: they get co-opted so early by bad uses that they never get a chance to show their legitimate side.


Regardless of the plausible deniability of the contents, from a legal standpoint, if a court issues an order to divulge the contents and ownership of the box, the owner of the storage resources (banker) can comply, whereas the owner of the encrypted anonymous storage network node cannot.

I'm not going to make a blanket statement that this is a good or a bad thing, since it largely depends on what's in there (the technology itself is neutral); and some jurisdictions might define it as depraved indifference (minimally) or facilitation (maximally) should the contents be illegal.

Also, I can easily foresee that being unable to trace the provenance of data stored on one's node could put one in a difficult position to assert it isn't one's own, when possession is usually all that is needed for criminal liability.

The banker can cover the ownership case, and hence his backside.

As far as public/private goes, it seems this network, from what I read, maintains an opacity shield with regard to contents, but is peer-to-peer storage. So in that regard, neither the safe deposit box nor the storage network is "publicizing" anything, per se.


Couldn't this be solved by having such a decentralized system only be text based and not allow images/video etc?

Cough. Usenet.

So, since people can put kiddy porn on images, images shouldn't be allowed at all?

You would probably have to have some kind of flagging system built into the network, so that people could flag bits and pieces as childporn or whatever categories you provide. The client could be configured to refuse any data with user-selected flags.

Of course, it would depend on the network not being taken over by a majority of child porn distributors and viewers, and the flagging system would need to be resistant to spoofing and manipulation.
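
To illustrate the client side of that idea (a minimal Python sketch; the flag categories, threshold, and function names are all hypothetical, not part of any existing network):

    BLOCKED_CATEGORIES = {"cp", "gore"}   # categories this user opted out of
    FLAG_THRESHOLD = 5                    # distinct flaggers required

    def should_reject(block_flags):
        # block_flags maps category -> number of distinct nodes flagging it;
        # refuse any block that crosses the threshold in a blocked category.
        return any(block_flags.get(cat, 0) >= FLAG_THRESHOLD
                   for cat in BLOCKED_CATEGORIES)

    # A block flagged as "cp" by 7 distinct nodes is refused; spam is not.
    assert should_reject({"cp": 7})
    assert not should_reject({"spam": 12})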

Tribler is heading that way with anonymous metadata, iirc from their roadmap.


What kind of anonymous and untraceable system do you imagine where the files are visible to be blocked by other nodes? I've spotted a small bug...

This is the wrong attitude. There should be nothing in place to censor anything. You as a user should censor whatever you do not wish to see. I agree that certain things might be shocking in a truly free world, but that does not grant you the right to block anything from anybody -- just from yourself.

This is naive because it ignores the social dimension of technology. As I said in another reply: when most people see horrific hard-core CP or rape porn, they will close the program and delete it. End of story. They will also tell all their friends that darknets are only for pedophiles.

Then, when the government wants to ban darknets, nobody objects because everyone knows darknets are only for pedophiles because what non-pedophile would want to be immersed in kidporn?

This severely limits adoption and thus political impact. Average people are just not going to use it. Niche things do not change the world. Want to be really subversive? Build a darknet your mom would want to use.


My grandma finds politics very disgusting; discussing it would mean the same to her as CP does to you. She is from the ex-USSR. Does this mean we should ban politics on darknets too?

And it is your approach that is naive. You ignore the technology. You cannot technically limit information propagation in any kind of network; it will find a way. But if people come to think that the only reason to go on any kind of encrypted network is to fetch CP -- there we go -- society is primed for laws against cryptography (in the UK you can be imprisoned for several years for not handing over your crypto keys!).

Again -- we might want to work on technology recognizing nudity, or abuse, or whatever, to protect ourselves and our kids or our moms from this type of content, but this should be completely separate from the technology used to share the data! Just like there are parts of big cities which are not safe for white/black/Chinese/etc. guys. That doesn't mean we have to put a tall wall around that part of the city. Instead, whoever thinks it is not safe there simply does not go there.


Want something to be used? You have to think about user experience. You have to think like a marketer or a designer. What I see here is a lot of people evading the issue because it involves human beings. Technologies do not become popular until somebody includes the user in the design equation, and only popular technologies have large impacts.

Maybe this speaks pretty highly of the current level of free speech? Perhaps everyone, for the most part, is already able to say what they want, and the number one use for systems like this is these horrible things and not, say, a newspaper that speaks out against the local government. I don't mean to say that a system that provides perfect free anonymous speech wouldn't be useful, but for every person who has something to contribute to Wikileaks there are millions more who just want to get off to sick stuff.

And now replace the "Freenet" in your sentence with "Internet". The sentence is still valid. But would this be a reason to shutdown the Internet or to not run any Internet backbone routers?

You raise a good point. There is a difference, however; the Internet is actively monitored and ISPs respond to/work with the police to reduce this kind of content. Can the same be possible on something like Freenet?

What about PGP? And cryptography in general? You can't make an algorithm that won't encrypt CP. There is no way to stop a particular kind of content, and yes: technology can be used for bad things! You should already know this. Even if every packet were monitored and filtered and encrypted messages were illegal, steganography could be used to circumvent the censorship. So it's impossible to stop the distribution of CP; we must accept this.

The real problem here is child abuse, and people should focus on this. Sometimes I think that our society doesn't really care about the children: it's more that people just don't want to know, and that's why the common answer from the govt is "filter the Internet". IMO the best we can do is think about how we can find the child being molested in a given pic, not try to figure out some magical way to stop distribution, because if we keep trying this we're just wasting the(ir) time...


I can browse the Internet for a long time before I see an archive full of hard-core pre-teen porn unless I go looking for it. It is difficult to browse the Tor .onion network without seeing at least links to this. Sometimes users land on it by mistake. Try it.

Tor gives the impression of being a sea of kidporn with little islands of interesting stuff, especially when you factor in the cognitive fact that human beings give greater weight to negative stimuli.


Don't you think the "cognitive fact" that you give greater weight to negative stimuli would be causing you to overestimate the relative sizes of the kidporn sector of the .onion network?

Then why did you mention it at all?

I haven't cruised .onion sites in a long while, but I just tried and I didn't come across anything bad by accident. I don't even understand your assertion that you can end up on any .onion site by mistake, since the URLs are impenetrable.

Don't blame the tool. Never blame the tool. Knives can do horrific murders, but they are pretty useful for cooking as well. I'm surprised your comment is the top one here.

I'm not blaming the tool. I'm pointing out its inadequacy. I am saying there is an interesting and very hard problem here that severely limits the adoption of these technologies, and that solving it would be both interesting and important.

The darknet problem as defined by most techies is solved, yet almost nobody uses these systems. Why?


What inadequacy? To say that a tool designed to facilitate anonymous communication is inadequate because people are doing things on it that you don't like seems like a pretty strange position. The inadequacy of a darknet is its uncensorability?

Your network is trivially vulnerable to a social DOS attack that will instantly drive off 90% of its user-base.

In Freenet, at least, there is one significant inadequacy: you must commit your resources to supply others with material you strongly disagree with. The difference between having a network which others can use for trading CP and having a network that wants your bandwidth and disk space to serve CP crosses the line for many people.

I see -- I can understand how that might be a concern. I'm not sure it's really a valid moral issue, but I can certainly imagine it bothering people.

> Never blame the tool.

We can predict how a tool might be used by various users. We can also look at how similar tools have been used historically.

If, empirically, certain tools tend to be mis-used in familiar ways, it's just ignorant to say "don't blame the tool". It's a straw-man. When people "blame the tool", it's typically short-hand for arguing that the creators and distributors of the tool share some of the blame for its misuse, along with the abusers.

I.e., there's a long history of tool creators playing dumb/innocent about the predictable and likely abuses of the tools they create. There's an equally long history of these abuses, so any such arguments are made either to maintain cognitive dissonance or out of pure ignorance.


But it's a cost/benefit problem, no? People say this bearing in mind free speech. Sure, free speech makes it ok to say lots of terrible things, but the boon to human rights outweighs by far the evil made possible. People assume that this logic extends to technologies that facilitate free speech. Care to argue that it does not?

Many countries have decided that removing free speech rights from, e.g., child pornography is worth it. Why would that logic not extend to the technologies you mention also?

Because part of the purpose of these networks is to subvert government control. If you can censor one thing (e.g. child porn), then you can censor anything, which makes it useless if you're an anti-government radical in China, for example.

I'm thinking more of engineers who happily work on scramjets and ignore that extra mass budget and those empty payload bays that aren't for cameras.

Someone on here recommended an inspiring TED talk by the head? of Darpa. Rather than finding it inspiring, I found her naiveté or willful ignorance chilling. Somewhat off-topic.


> I'm surprised your comment is the top one here.

At least at the time of writing this, his comment is the only top-level comment. So it's not so much that it's the top-rated comment, but rather it's the only one that can be displayed in that position.

EDIT: Nevermind. I didn't notice that there are multiple pages of comments. Disregard.


Interesting. I think most people here on HN also believe in gun rights (as a fundamental right, not that they actually own a gun) and so the saying "Guns don't kill people, people kill people" probably goes over well here.

BUT don't most people here also wish that nuclear weapons could be un-invented? Isn't that the general feeling among those that participated in the development of that technology, that they wish they never did it?

What about the responsible disclosure of zero-days? Don't we agree that those are things we don't want floating around so that anybody can potentially use them before it can be patched?

Or what about an intentional backdoor into an encryption system? Is it only bad if people use it? Or is it bad in and of itself?

It is obviously more complicated than "never blame the tool". Some knives are designed for cooking, some knives are designed to inflict maximum damage and pain to human flesh.


I think most people here on HN also believe in gun rights

What makes you think that?

If I had to guess, I'd say ~30-40% of people on HN support the US concept of "gun rights". Support for "gun rights" is probably around 50% in the US[1], but 80-90% are against it outside the US. Given the large international audience HN has, I think that would move the average significantly.

(I agree with the rest of your comment, though)

[1] http://www.cbc.ca/news/world/story/2012/07/23/gun-control-po... says in the US 44% support the status quo and 11% support less strict laws.


> says in the US 44% support the status quo and 11% support less strict [gun] laws.

A solid majority don't know what the laws are so "support the status quo" is interesting.

They tend to believe that the laws are less restrictive than they actually are. When you quiz them about specific "proposals", which happen to be current law, you find that those proposals are significantly less popular than the status quo.

Two examples of this are wrt concealed carry and automatic weapons. On the former, very few people think that police should have complete and arbitrary discretion wrt CCW, yet they do in the jurisdictions with the majority of the population.


If what you say is correct, and assuming the US Citizens on HN reflect public opinion then that would reduce support on HN even further than I estimate above.

Agreed. I don't mind saying I'm wrong on that; I admit I have a typical American bias. I would say that HN is probably much more libertarian-leaning than the average US citizen but probably not enough to warrant my claim of a majority (of HN users). Can I claim my point is still valid though (re: guns as tools)? :-)

Well, like I said, I agree with the rest of your comment.

BUT - I think that technology is an amplifier - it makes things easier, quicker and more powerful than before.

Sometimes, building tools that amplify certain behaviours isn't neutral.


> assuming the US Citizens on HN reflect public opinion

They're not a very accurate reflection. For one, their demographics are very different.

> then that would reduce support on HN even further than I estimate above.

Reduce support for what? My claim suggests that the more folks know about US gun laws, the less they support current law and the more they support less strict laws, and I didn't even address the folks who want more strict gun laws. (When you ask them the same questions, many of them have the same reaction as "status quo" folk. They want "more", but they don't want things as strict as they already are.)

BTW - That's why the whole "assault weapon" campaign is political genius. The guns in question are "military" in the same sense that the cars that you can get at a Chevy dealer are race cars (that is, not at all). It plays on ignorance.

Then again, a large number of folks think that "tactical vest" means "bullet proof". (It means "lots of pockets"; think fishing vest, only black or camo fabric.)


What about a system that establishes a series of trusted agents, who get the data first and have to approve it before it propagates onto the wider publicly available network?

Anyone could become a trusted agent by publicly verifying their identity. Basically, you need to find at least one other person whose identity is public to "sponsor" your information.

If you are publishing information about the Chinese government, you find a U.S. sponsor, and vice versa.
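
A minimal sketch of the sponsorship check (Python with Ed25519 signatures from the `cryptography` package; the registry and function names are invented for illustration):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    PUBLIC_SPONSORS = {}  # name -> publicly verified Ed25519 public key

    def sponsor(agent_key, document):
        # A publicly identified agent signs the document to vouch for it.
        return agent_key.sign(document)

    def may_propagate(document, sponsor_name, signature):
        # Nodes relay a document only if a known public sponsor signed it.
        try:
            PUBLIC_SPONSORS[sponsor_name].verify(signature, document)
            return True
        except (KeyError, InvalidSignature):
            return False

    # Usage: register a sponsor, sign, then gate propagation on the check.
    key = Ed25519PrivateKey.generate()
    PUBLIC_SPONSORS["us_journalist"] = key.public_key()
    sig = sponsor(key, b"leaked documents")
    assert may_propagate(b"leaked documents", "us_journalist", sig)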


I think it actually says a lot more about how generally-free our society is.

Giordano Bruno was imprisoned and executed for disagreeing with the church, so he had much reason to hide his activities, even though they are what we would definitely consider "legitimate".

Child pornography producers and consumers are similarly persecuted, though clearly with much more sound reasons.

At least in western countries, there aren't a lot of instances of repressed communication that need to be conducted across a channel like this -- especially few legitimate ones. This is not to say that such a system isn't useful; just that I believe the fact they're so full of child pornography and the like is actually, in a roundabout way, an indicator of a healthy society.


Your comment is about child porn, not about TFA. Please stay on topic.

Precisely -- I opened up the comments to see what people thought of the quality of the implementation. Does this guy know about cryptography and robust decentralisation? What are the trade offs in the architecture...?

As others have said, guns don't kill people, other people (and Chuck Norris) do. Unfortunately, the right to privacy has this side-effect of allowing people to hide from others really nasty things. But it will always be like this: laws and rights protect both well and ill-behaving people. Giving those up would, however, harm all of us.

Couldn't it - or a similar algorithm - be used as a bittorrent alternative with superior anonymity? If yes, this might become a more common use case.

Couldn't you require N random contributors to approve each upload before it is published? With that in place, either the system is just a group of bad actors, or the good actors weed out the bad content.
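
Something like this sketch, perhaps (Python; the majority rule and hash-seeded reviewer assignment are my own guesses at how you'd make it hard to game):

    import random

    def assign_reviewers(peers, n, upload_hash):
        # Seed the selection with the upload's own hash so the uploader
        # can't cherry-pick friendly reviewers.
        return random.Random(upload_hash).sample(peers, n)

    def may_publish(reviewers, approvals):
        # Publish only when a majority of the assigned reviewers approve.
        return sum(r in approvals for r in reviewers) > len(reviewers) // 2

    peers = ["node%d" % i for i in range(100)]
    reviewers = assign_reviewers(peers, 5, b"sha256-of-upload")
    print(may_publish(reviewers, set(reviewers[:3])))  # True: 3 of 5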

Accidentally downvoted you; I thought people on HN interested in autonomous, decentralized systems not acquainted with Giordano Bruno should become so:

http://en.wikipedia.org/wiki/Giordano_Bruno

http://warburg.sas.ac.uk/index.php?id=446

"The Art Of Memory" by Frances Yates outlines the pathway of mnemonic and knowledge systems from alchemy to the Rennaissance, and is a comprehensive introduction to the life and work of Giordano Bruno, Ramon Llull and many other ancestors of our various technologies of memory.

http://www.amazon.com/The-Art-Memory-Frances-Yates/dp/022695...


Your argument is somewhat self-contradictory.

We know that lots of child porn exists. If you create additional scarcity by preventing the exchange of old/existing child porn (probably mostly 20-year-old pictures, etc.), then you increase the chances of new children being exploited.

The legal adult industry has seen most of the profit evaporate b/c of the large amount of free content. The only people making money are those selling ads on top of existing content, which is often not even owned by the companies hosting it. The original owners lack the resources to enforce copyright law b/c they have no income stream to use to do so.

So rather than relying upon the idealistic notion that all exploitation and exploitative pornography can actually be successfully eliminated, it might make more sense to take a more pragmatic view about the forces of supply and demand involved, with the goal of reducing additional harm.


Exactly. Anonymous payment, however, would be bad. And currently, with e.g. bitcoin scramblers, we are getting there.

I can see no harm in anonymous data exchange (except copyright infringement) in an open society. But if you can order whatever you want online anonymously -- that could result in some deplorable stuff. (That's why I think bitcoins will get shut down entirely by governments at some point.)


What bad scenarios do you see occurring with anonymous online payments?

I do agree that the US Government will shut down bitcoin fairly soon, for the same reasons it doesn't allow people to create any other kind of currency.

I consider governments to essentially be criminal gangs that have achieved enough power to be able to buy legitimacy. Money laundering allows 3rd party gangs to start to claim legitimacy too, so it must be stopped. This is not to discount the good things governments (and gangs) accomplish, just to call attention to the good and bad done by both. Governments generally have elaborate propaganda and disinformation arms as well, and use money laundering laws to attack the funding channels used by competitors (described derogatorily as gangs).


> What bad scenarios do you see occurring with anonymous online payments?

How about an assassination market[1]? Especially as far as present governments are concerned, this would be an enormous downside to the existence of an anonymous payment system.

Disclaimer: I am not declaring that assassination markets would necessarily be bad for humanity (I haven't thought about it enough), but on the surface it's definitely something that most people would consider to be a potential "bad" outcome of any anonymous payment system.

[1] http://en.wikipedia.org/wiki/Assassination_politics


I am a guy in my 20s who is attracted to children and I use freenet to download illegal pictures of them.

The most active forum on freenet for sharing these pictures is called "child-models-girls". Little girls pose for sets of pictures which are sold commercially and then shared for free on freenet/the open Internet. Mostly they are wearing skimpy clothing or swimsuits, or sometimes they are naked. 90% of the freenet store is probably pictures like this, which are little different to what you used to see in clothing catalogues.

There is real CP too, of course - pics and videos of girls/boys doing sexual things with themselves or with adults. I have no doubt the average HN reader would be shocked.

However.

> It's nasty stuff, too. This is not "merely offensive." Kids are abducted and murdered to make this stuff.

This is just not true. Only a tiny minority of people on freenet/similar-systems are looking for stuff where children are obviously being hurt. Most of us despise it. The truth is we like to see children happy because it makes us feel better about looking at these pictures.

> Think of the mechanics of an adult having sex with a six year old

The mechanics don't really work. That's why 99% of pics/vids involve touching/licking/rubbing/etc. only.

> In parts of the world children are more or less raised for the purpose. It's horrific.

This is wild conjecture.


I'm probably somewhere close to Dr. Kinsey on such issues: very libertarian and open-minded. I have no problem with pornography. I also don't necessarily judge people for having feelings. I appreciate the logical and honest post.

What I have to ask is whether the people in these photos, even the very "soft-core" ones, know that their pictures are being shared over the Internet and used like this. Did they give consent, and did they understand the full implications of that consent? Did they get paid? Did they sign a contract?

I think I'm opposed to this kind of thing because I'm a libertarian. I see it as exploitation and deceit and privacy violations levied against people who are too young to understand or that cannot defend themselves.

I've debated people on this topic once before, and it seems to me that there are a lot of cyber-libertarians that will go straight to the mat to defend privacy rights except here. Why don't eight year olds have privacy rights? What if your doctor photographed you, told you it was for medical purposes, and then posted the pics to a gay porn site? What if a TSA screener posted millimeter wave video of you (essentially naked) to YouTube? Isn't this a lot more invasive than Facebook selling your friend graph to a marketing company? Yet most people would find that very invasive and deceptive -- a violation of their rights -- if they hadn't given consent.

And no, the dark stuff is not wild conjecture. I understand that the majority of pedophiles wouldn't be into it, but it certainly does exist. Over at Reddit /r/onions I read a while ago about a forum that exists on the Tor darknet called "Violent Desires." There were lots of "I cannot un-see" kinds of comments, and many warnings about "do not go there... you do not want to know." Child trafficking is quite real as well. The world is absolutely filled with unbelievably dark stuff (in other areas too) that normal people sometimes have a hard time believing: torture ("extraordinary rendition"), blatant fraud to the tune of billions of dollars, human trafficking, slavery, off-the-books unethical human experimentation, and so on... My experience in other areas of life suggests to me that the reality is probably worse than I care to imagine.


An approach I've wondered about: what if we were to take only existing CP (previously seized by the government, let's say), for which the model is known, now of age, and consents to this (because they believe it's going to lead to fewer children being abused), and publish it as widely as possible. Continue to bring the full force of the law upon anyone producing new material. Obviously this wouldn't eliminate production entirely, but it seems like it should lower demand a great deal (because what difference does it make to the "user" if a picture was taken fifteen years ago) and at least make it impossible to produce CP for material profit.

My instinct is that this is, objectively speaking, morally good, but would still be unacceptable to the population at large.


Why is the grandparent comment "dead"? It is an honest and constructive post that contributes to the discussion. Moderating him out of the conversation sends a certain sort of message, and I don't think it's a good one.

And to you, api, this is a very tactful response. I'd like to see more respectful posts like this between lovenothate and you, lmm, and some others in this thread. But it appears that's not welcome on HN?


I've always been quite surprised by how everyone reacts to child pornography. Is it bad per se?

Adult pornography isn't bad unless there's a real rape. I don't see why this wouldn't apply to minors.

Is it bad to take a picture of your child in the bath? No. (There are plenty of pictures of baby-me in the tub at my parents' house.)

Is it bad to post it on the Internet? No.

Is it bad to wank to a picture of a child? No.

Is it bad to violate a child? Yes, as it is with adults.

In fact I think there's a giant stigma around paedophilia. Somehow, today it's okay to be gay (sexual deviation), but not okay to be a pedophile (another sexual deviation). Unless there's rape or abuse, these people should have the same respect as homosexuals.

"Did they give consent, and did they understand the full implications of that consent? Did they get paid? Did they sign a contract?"

Usually it is the responsibility of the parents to support the child and take some decisions for him. They could lose this if they do something to endanger the child, but taking pictures of him without clothes does not constitute a danger in my book. In fact, it's pretty much as harmless as it can be.

"If god had wanted us to run around naked, we would have been born that way"


Sadly, that's the price you pay for free speech. The only way to ensure that the world's whistleblowers can publish what they want without fear of reprisal is to ensure that the world's child pornographers can also do so.

Consider the possibility that these networks are seeded with that content by those immune to prosecution for doing so, who also have an interest in that kind of freedom being curtailed. This is conspiratorial, and of course I have no evidence, but I simply don't think rapey kidporn is as popular as it's portrayed.

Strange, how you say someone could wage a social denial-of-service attack against Freenet with a bit of kiddy-porn FUD...

None of the links I follow into the onion (coding mostly) have ever led to porn, let alone CP. Even if it's everywhere you look (and maybe you should consider why that is) it's nowhere I and most other people look.

Even if there were a non-insignificant amount of CP being distributed, there are really just two options: 1) there's a super-secure, secret group of pedophiles who cooperate to abduct and molest children and they continue to get away with it, or 2) various law enforcement groups use the same few pictures over and over in stings and honeypots.

No, it's a nigh-unto made-up problem, and to the tiny degree it may exist at all, it's exactly the same on the Internet as a whole, on Dropbox, via shortened URLs in twitter, etc.

To combat the perception that Tor is for CP, simply quit telling people about it constantly. And if you're told that's all Tor is, treat it exactly like you would someone who started at the sleaziest portal they could find and complained the Internet was full of porn.

As for you knowing what Freenet is full of, that's impossible unless you're claiming to have seeded it.


Maybe stick to a text-only format. Very little illegal text which isn't otherwise beneficial.

This looks like the perfect platform for feminist activists to name and shame rapists. Great!

Doesn't the media do a pretty good job of that already?

Is this thing using raw RSA with no padding? https://github.com/tarcieri/cryptosphere/blob/master/lib/cry...

Padding is added automatically by OpenSSL unless explicitly disabled. Also I will be switching to Curve25519 (for DH-style key exchange) and ECDSA (for signatures) quite soon. There will be no other use of pubkey crypto, so actually that entire file will be gone soon.

I quickly read over the description of how this works. Nodes have a crypto identity which is used to establish trust between nodes, and storing data from another node gives you bandwidth "credit" to download other stuff. Transactions are cryptographically signed, but presumably this is safe because the contents of the files are unknown, since they are encrypted. So how are they encrypted and then stored, and later retrieved? Content is hashed, and that hash is used as the AES-256 decryption key. Another hash is made of the encrypted data, and this is your lookup key. Only the ciphertext hash is needed to query a file, but it's useless to you unless you know the original data hash to serve as the decryption key.
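
Concretely, the scheme as described would look something like this (a Python sketch using the `cryptography` package; the CTR mode and IV derivation are my guesses, not necessarily what the Cryptosphere actually does):

    import hashlib
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def store(plaintext):
        # The plaintext hash doubles as the AES-256 key, so identical
        # files always encrypt identically ("convergent encryption").
        key = hashlib.sha256(plaintext).digest()
        # Deterministic IV, also derived from the plaintext (a guess).
        iv = hashlib.sha256(b"iv:" + plaintext).digest()[:16]
        enc = Cipher(algorithms.AES(key), modes.CTR(iv)).encryptor()
        ciphertext = iv + enc.update(plaintext) + enc.finalize()
        # The hash of the *ciphertext* is the public lookup key.
        return hashlib.sha256(ciphertext).hexdigest(), key, ciphertext

    def retrieve(ciphertext, key):
        iv, body = ciphertext[:16], ciphertext[16:]
        dec = Cipher(algorithms.AES(key), modes.CTR(iv)).decryptor()
        return dec.update(body) + dec.finalize()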

I see two problems with this:

1. If both hashes of a file of illegal content become publicly known, like, say, on a website, I don't see how you avoid liability for having it on your machine. It seems you can only avoid legal liability if someone stores stuff on your machine that is never intended to become publicly available. In any other case, the system has created a cryptographically provable trail between the data and your storage, which can be used to prosecute you.

2. The FBI can generate a SHA256 hash of every computer file of child pornography it has ever collected, and immediately be able to identify every node that contains this data. Presumably this gives them enough legal authority to shut down your node, regardless of whether you have plausible deniability about being aware of the contents.


I should mention: change one bit of a file and it will obscure the file from the "FBI test". But still, you can trivially generate the Cryptosphere lookup for any unencrypted file you have access to, and see if anybody has it.

Yes, this is known as the "confirmation of file attack" and there is no feasible way for the system to operate without it.

The confirmation of file attack is actually the degenerate case of the "learn the remaining information attack", in which the majority of the plaintext is known except for some low-entropy portion.

You can imagine a standard form letter that contains your credit card number. An attacker can then generate all possible permutations of that low entropy data and find matches where those are stored.

For more information see: https://tahoe-lafs.org/hacktahoelafs/drew_perttula.html
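
A toy version of the attack (Python; the form-letter format and the simplified lookup-key derivation are invented for illustration):

    import hashlib

    def lookup_key(plaintext):
        # Stand-in for the real pipeline (hash -> key -> encrypt -> hash
        # of ciphertext); any fully deterministic scheme behaves the same.
        return hashlib.sha256(plaintext).hexdigest()

    def learn_remaining_information(template, stored_keys):
        # Brute-force the low-entropy field -- here a 4-digit PIN in an
        # otherwise fully known form letter.
        for pin in range(10000):
            candidate = template.replace(b"{PIN}", b"%04d" % pin)
            if lookup_key(candidate) in stored_keys:
                return candidate
        return None

    letter = b"Dear customer, your PIN is {PIN}. Regards, The Bank"
    stored = {lookup_key(letter.replace(b"{PIN}", b"4271"))}
    print(learn_remaining_information(letter, stored))  # recovers the PIN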


Thanks, that is extremely informative.

But what does this limitation mean for the security of the Cryptosphere for its defined use cases? From the article: "If you want to store banned books or political pamphlets without attracting the attention of an oppressive government, or store pirated copies of music or movies without attracting the attention of copyright holders, then the confirmation-of-a-file attack is potentially a critical problem."

Doesn't this mean this system is DOA for its intended purposes?


No, I plan on employing the same system that Tahoe does: I will optionally incorporate a random convergence secret. This effectively disables the deduplication properties, but provides a defense against these two attacks. This convergence secret can be added to the end of every capability token, or optionally omitted (in which case I use zeroes). So you have two options: allow deduplication but be susceptible to the confirmation of file attack/learn the remaining information attack, or more security but with duplication.

Cryptographically, this feeds in as a salt/initialization vector to HKDF along with the entire plaintext. HKDF is then used to generate a key and IV for use with AES.
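
Roughly like this (a Python sketch using HKDF from the `cryptography` package; the output sizes and info label are placeholders, not the final design):

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def derive_key_iv(plaintext, convergence_secret=b"\x00" * 32):
        # All-zero secret: deduplication works, but so does the
        # confirmation-of-file attack. Random secret: attack blocked,
        # cross-user dedup lost.
        okm = HKDF(
            algorithm=hashes.SHA256(),
            length=48,  # 32-byte AES-256 key + 16-byte IV
            salt=convergence_secret,
            info=b"cryptosphere-sketch",
        ).derive(plaintext)
        return okm[:32], okm[32:]

    key, iv = derive_key_iv(b"file contents", os.urandom(32))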


I think this may be a good solution for feminist hackers who want to name and shame rapists or incidences of sexual assault.
