The masses need to pick a new site to invest their energies in, rather than scatter at random to lots of different sites! Any particular one out there that's better than the rest?
As someone who hasn't used usenet for downloading content, why is it that it seems that a few central indexing services are the way people access this medium? Wouldn't it be possible - and preferable - to distribute the index through p2p? That way, there would be no single point of failure.
The index would be insanely huge. A lot of these indexing sites are used for their API. Dedicated applications like SickBeard automatically compile lists of wanted files (movies, TV rips and music) and use them to search against the provider. The provider returns a file (NZB) that contains references to hundreds of encoded files hosted on newsgroups.
Here's a breakdown of one NZB, for a single 3.88GB file.
Inside are 74 RAR files, and 15 additional PAR files containing parity for recovering corrupted data.
Each of these 89 files is split into 37 parts of 1.4MB for ~3200 pieces total.
These pieces are then encoded with yENC (similar to base64) and then uploaded to the newsgroup.
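For the curious, the core yEnc transform is tiny. Here's a minimal Python sketch of just the byte mapping; real posts also wrap lines at ~128 characters and add =ybegin/=ypart/=yend headers, which are omitted here:

    def yenc_encode(data: bytes) -> bytes:
        # Core yEnc transform: add 42 to each byte modulo 256, escaping the
        # few characters that are unsafe inside an NNTP article body.
        critical = {0x00, 0x0A, 0x0D, 0x3D}  # NUL, LF, CR, '='
        out = bytearray()
        for b in data:
            c = (b + 42) % 256
            if c in critical:
                out.append(0x3D)        # escape marker '='
                c = (c + 64) % 256
            out.append(c)
        return bytes(out)

The overhead is only a couple of percent, which is why it displaced base64-style encodings for binaries.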
Every single one of these yEnc pieces is referenced in the NZB file, which ends up around 400KB by itself. Because of this, the NZB indexes are enormous: hundreds of gigabytes, probably terabytes in some cases.
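To see why a single NZB already weighs hundreds of kilobytes, here's a rough sketch that tallies what one references, assuming the standard NZB XML layout of <file> elements containing <segment> children with a bytes attribute:

    import xml.etree.ElementTree as ET

    NZB_NS = "{http://www.newzbin.com/DTD/2003/nzb}"  # standard NZB namespace

    def summarize_nzb(path):
        # Count the files and yEnc segments referenced by an NZB.
        root = ET.parse(path).getroot()
        files = root.findall(NZB_NS + "file")
        segments = root.findall(".//" + NZB_NS + "segment")
        total = sum(int(s.get("bytes", 0)) for s in segments)
        print(len(files), "files,", len(segments), "segments,",
              round(total / 2**30, 2), "GiB of encoded articles")

    # summarize_nzb("example.nzb")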
There's no need for everybody to serve the entire index. Set up a DHT and let interested parties exchange the NZBs. All problems in computer science can be solved by another level of indirection.
It's a reputational and data-quality issue as well. Completely decentralized content distribution doesn't usually work unless there is some kind of curation; there are just too many motivations for disruption. It's not an impossible problem, but it long ago ceased being mostly about the technical matter of shipping bits between nodes.
You could always have sites that only serve up indexes of NZB files (e.g. "NZB file => SHA-256"). Then you could use these to determine which NZB files to trust. I would assume that distributing a "NZB => Checksum" hash would be a lot smaller/easier than running an entire site on top of a terabyte of NZB files.
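A minimal sketch of that idea in Python, assuming the distributed index only publishes release-name to SHA-256 mappings and the NZB itself comes from an untrusted peer (the names here are hypothetical):

    import hashlib

    def sha256_of(path):
        # Hash the NZB so it can be checked against the trusted index.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def is_trusted(release, nzb_path, trusted_index):
        # trusted_index: dict of release name -> published SHA-256 hex digest
        return trusted_index.get(release) == sha256_of(nzb_path)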
DebTorrent was to be a set of BitTorrent extensions to allow the Debian archive to be syndicated as one ever-growing torrent. Seems similar to the use case here, and free from the problem we have now which is the huge indexes getting hit by takedown notices.
http://wiki.debian.org/DebTorrent
That's usually where the indexing sites get them from in the first place. The issue comes when you're a client that's not indexing continuously; downloading billions of headers to try and find a particular NZB is a mammoth task.
> downloading billions of headers to try and find a particular NZB is a mammoth task
This is true. Even ~10 years ago, downloading just the headers on some of the newsgroups shot my cache directory into the gigabytes (for a single newsgroup).
I'd give nzb.su a try. I started using them a few weeks ago as an auxiliary when I was having issues with nzbmatrix.
In my short few weeks of using it, the API seems better than nzbmatrix, but the community / comments on files is near nonexistent. If you're only looking for something to hook up to sickbeard / couchpotato though, it should do the job just fine.
I was getting a good amount of missing blocks on Astraweb only; I bought a backup news account from blocknews and haven't had any missing block issues since.
I'll not jump ship until my downloads are frequently disrupted at the block level, across multiple providers. Pretty difficult to achieve, given the architecture.
Indices come and go... Much of the Usenet crowd know how to set up an indexer if need be. Perhaps a hidden service is warranted?
I've been using Usenet and Newzbin/NZBmatrix for 4 or 5 years. New content has been getting taken down the past few months. Go look at the comment section of any new movie. It got progressively worse, to the point that I was starting to consider what to do next. I suppose I'm not surprised to see this this morning. But it was abrupt.
An example of someone writing a better implementation of a program and taking over a protocol. DirectConnect was a centralized client-server file sharing program[1], with a client and server (both written in VB). DC++ was a C++ implementation that started out as a 3rd-party implementation of the client, and then IIRC the server. The original client/server were incredibly buggy, so the DC++ client/server overtook the original (they may have even extended the protocol).
[1]: Basically, you connect to a central server (that anyone can run), and you can share files with others connected to the server, chat with them (a single central chat room), private message connected users, and search files shared by connected users. Note: some of this may be out of date as I'm mostly referring to the initial implementation.
The page up at Newzbin2.es (or at least the Google cached version, for those of us in the UK), which shut down recently, mentions that Bitcoin is too complicated for most people. I never used it so I can't say...
The fundamentals may be, but the implementation doesn't have to be. One could easily make do with just "send" and "receive" functionality. Blockchain has a very nice Android app.
My first experience was buying some coins through Mt. Gox, and immediately having my email leaked. It's something that seems to happen with alarming regularity in the Bitcoin world: people dropping 0-days left, right and centre to get their hands on virtual stashes.
I suspect I'm probably jaded from getting burnt trying Bitcoin, but I would still warn anyone against using Blockchain for anything more than a toy.
The alternative is to run your own client and download close to 3GB of data at a snail's pace. Neither option is at all attractive at the moment.
Hmm, from what I saw, Blockchain keeps everything encrypted at all times, so it looks pretty secure. I imagine that buying Bitcoins might be more problematic, though, true.
Inject into their code and wait until the big players log in and reveal their keys to the trusted site. Game over. The encryption only protects against an opportunistic database grab.
Well it's different, but the end result is the same.
Banks are massive and have pretty much everything insured. If they mess up, it's on them. Not so much with Bitcoin: if they mess up, their users lose out.
As additionally demonstrated by Mt. Gox, most Bitcoin services are run by complete novices. Basic SQL injection on a website dealing with currency is unforgivable.
Think of it like the Pirate Bay, but for newsgroups. Newsgroups are massive, with some providers archiving binary groups for up to 4 years (and growing). Sites like NZBMatrix index the content and provide NZB files which tell your client what to download, so that you don't have to constantly download the headers for a bunch of groups and search through them. Sites like NZBMatrix were also somewhat curated, to help keep passworded files out of the results.
Plus, the great thing about nzbmatrix was being able to use their API for searches. You could then have a local process download the actual files after making a request to nzbmatrix.
Running your own custom-tailored Usenet indexer seems pretty easy nowadays thanks to http://www.newznab.com/ - if you have a need for one in the first place, that is.
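For anyone who hasn't used one, here's roughly what hitting a Newznab-compatible API looks like; the indexer URL, API key and category number below are placeholders, and individual installs may differ:

    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    BASE = "https://indexer.example.com/api"   # hypothetical indexer
    API_KEY = "your-api-key"

    def search(query, cat="2040"):  # 2040 = Movies/HD in the usual Newznab scheme
        params = urllib.parse.urlencode({"t": "search", "q": query,
                                         "cat": cat, "apikey": API_KEY})
        with urllib.request.urlopen(BASE + "?" + params) as resp:
            rss = ET.parse(resp).getroot()
        # Each <item> in the returned RSS feed carries a title and a link to the NZB
        return [(i.findtext("title"), i.findtext("link")) for i in rss.iter("item")]

Tools like SickBeard and CouchPotato speak the same API, which is why these indexers are so easy to automate against.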
If you're looking for scene releases, just use a pre database like preDB.com. Find a release on preDB, copypasta the release name to binsearch, create the NZB, done.
Generally speaking it's very easy to find material just using binsearch. If you want a 1080p copy of a particular film, just search for the film name along with '1080p' and you're almost guaranteed to get relevant results on the first try.
This is awkward... When newzbin shut down, I was without a NZB provider for a week. Then 2 days ago I paid for my membership at NZBMatrix. And now it's gone?
Similar position.. but I was waiting a week or two before committing to actual money :)
Glad I did.
Going a bit old school at the mo, downloading headers from a.b.mm for example. Get some retention up myself, might take a while!
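If anyone else fancies the old-school route, this is roughly what pulling headers looks like with Python's nntplib (deprecated in recent Pythons and removed in 3.13; the server details are placeholders):

    import nntplib

    # Most providers want SSL on port 563.
    with nntplib.NNTP_SSL("news.example.com", user="me", password="secret") as srv:
        resp, count, first, last, name = srv.group("alt.binaries.multimedia")
        # Fetch overview ("header") records for the newest 1000 articles only;
        # a full group can run to hundreds of millions of records.
        resp, overviews = srv.over((last - 1000, last))
        for artnum, fields in overviews:
            print(artnum, fields.get("subject"))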
Even though the content is 99% illicit, could somebody explain why this is not a case for Safe Harbour? Is this principle enshrined in any law at all, or is it just some neutrality ideal that we assume others share?
The "safe harbor" includes more requirements than responding to take-down requests. In particular, the service provider must "not have actual knowledge that the material or an activity using the material on the system or network is infringing", and "in the absence of such actual knowledge, is not aware of facts or circumstances from which infringing activity is apparent".
In other words, though you don't have an active duty to police your site, if you do become aware, and don't take action, you're not protected. Similarly, if you set up something with the sole purpose of distributing copyrighted material, and try to hide behind a "gosh, we never knew that kind of thing was going on until you told us", you're probably not going to get a court to believe you.
>Once this notice is completed we are left with an impossible task of policing our indexing bots. Even then it won't stop there, there will be follow-up notices etc.
This seems to me like they're being told to actively police automatically-indexed content after this notice is processed?
I'm not sure how provable that 'knowledge of infringement' clause is.
"knowledge of infringement" basically boils down to "you know damn well what you're running here". It's not that every site operator has to actively police every single submission for possible infringement; it's that if even a ten-second glance at your service reveals obvious massive presence of infringing content, you don't get to pull the "well, gosh, we never knew" and claim safe harbor (this is the "is not aware of facts or circumstances from which infringing activity is apparent" clause).
I've never used torrents and I've always used newsgroups. Does that make me old? Anyway, I feel naked now: with Newzbin down and now this, I feel I'm going to have to end my fun or try torrents, which I'm worried about because of the legal issues. Are torrents "safe" nowadays, and what's the best approach (a seedbox?)
Torrents aren't really safe or unsafe; it's mostly about the quality of the trackers you use. There are some OK cultivated communities out there, but the worry over ratio and the constant leech on one's upload stream is a real bummer when you can just purchase blocks from a news server and not have to worry about it. A seedbox is way more expensive than a news account and usually comes with quotas which one must mind.
All in all, nzbs are a much more pleasant experience if you can afford the account (and pretty much any employed person can) and have a good indexing site (harder to come by, and now I don't have one either :( ).
I'm old too (using USENET since mid-90s). There are other indexers out there. Try http://www.nzbsearch.net/, http://binsearch.info - and if that one goes belly up, there will be others, or install your own private indexer (or write your own). I stay the hell away from torrents.
Would be nice if they would a) publish their code and b) upload their nzb database on Pirate Bay. Would make it much easier for someone to pick up the torch.
NZBs die pretty quickly, so a back catalog wouldn't be that useful. They even mention on the website that content is getting taken down faster than NZBs can be updated.
I've used NZBs that were 3+ years old and they've mostly worked fine. Occasionally blocks are missing, which of course is more likely the older an NZB gets, but most of the catalog is still together.
They're right that many providers are getting hit with notices, which causes them to remove a few chunks of each rar from their server, making it impossible to complete the download and too pervasive to repair with a typically-sized parity file. But this can be circumvented by using a lesser-known news server as a backup; you still use Astraweb or Giganews for 99% of the transfer, and your fallback picks up the pieces that have been DMCA'd out of AW/GN/another major carrier.
I would be happy to see a more intelligent splitting system than rars (one missing block in a couple of rars will often make it really difficult to extract the content you DO have), plus a more robust delivery system, like a torrent, to download the blocks you're missing. This is going to become increasingly important.
News servers are interesting because they really are a non-optimal method for this kind of transfer that has incidentally become a hotbed for filesharers, probably just because of the plausible deniability ("yes officer, of course we only intend our server to be used by those discussing photography..."). I too wonder if it's not time for a new protocol, something more direct than torrents (webseeds kinda work here, but not quite what I'm looking for) but less hacked-up than NZBs and ASCII-encoded binaries in split files.
It's time to stop skirting around the issue and try to put together a serious underground analog to direct-downloaded binaries, something hearty and immune, or as immune as possible, to interference and foul play like this.
You can bet your bottom dollar that the Federation Against Copyright Theft Limited will be going after all NZB providers now, by attacking index-listing sites like NZBMatrix.
They're taking down massive amounts of copyrighted movies and other copyrighted items, and with these changes in Usenet indexing it looks bad all round for those of us downloading. They can't get the server providers, so they will go after the next best thing: the indexing systems that list the content.
It was probably cheaper to force or induce the closure of index sites, which are run more out of love for the community than for profit and don't make nearly as much as the Usenet providers themselves.
These index sites, I'm sure, understood the legal costs and risks of continued operation once those options were presented to them. With the majority of them down and out of the way, the providers themselves will see a reduction in revenue and may not put up a fight when they come under legal pressure to close up shop.
A lot of chat about "we never use them and get by"; how about some feedback on alternatives!
I have only ever used Newzbin, then yesterday I paid for Matrix, and today I'm not sure where to go.
Why not? Admittedly, I didn't use nzbmatrix, but I do use nzbsearch and the UI is simple and easy to use; search is just OK, but it seems to get the job done.
Gotta say that I find it very questionable how many people dump on nzbmatrix or say that it was not necessary anyway.
Sure, if you only watch high quality stuff like Skyfall and listen to Lady Gaga, a site like nzbmatrix is probably useless to you. But for those who appreciate things that aren't that easy to come by, it is essential to have some sort of an index to even HEAR about them. My estimate is that at least half of the movies and games I got through nzbmatrix, I had not heard about anywhere else.
And as every usenet user knows, it's basically just not possible to "browse" the usenet itself.
Then there are software releases, where 95% are fake or bug-ridden; it saves a LOT of trial & error if people help each other out by posting only those that are for real.
And I found no other nzb site as useful when it comes to finding all new releases. And I have looked at them all because I did not appreciate nzbmatrix's own censorship.
You don't need a Usenet indexing site to learn about good indie movies or music. There are dozens of websites that review and discuss these things without releasing them for free against the authors' wishes.
This is a Usenet binary sharing site whose front page, in a Usenet "video review", was a table of movies, TV shows, and video games. In other words, it wasn't even possible for the owners to look at their own site and not know that it was being used almost exclusively as a tool for piracy.
Extremely common DMCA misconception: it's not enough to be "takedown compliant". That's not how the law works. You also can't ever operate with specific knowledge of infringing content; you are, in effect, required to "take down" any pirated content you find.
Has it actually been established that merely linking to copyrighted content is illegal in the United States? I'm not trying to troll, just genuinely curious.
My technical understanding of this could be incorrect, but it seems to me that there is a meaningful difference between a BitTorrent tracker actively coordinating copyright infringing downloads vs. a website like NZBMatrix hosting nothing but static links to another location on the Internet.
The NET act criminalizes any willful "copyright infringement" (the term used in the act) so long as it's done for financial gain. Contributory or vicarious liability for copyright infringement for linking is well established.
At any rate, services like these rely on the DMCA Safe Harbor provision. Regardless of whether the potential liability is civil or criminal, you can't have actual knowledge of infringing activity on your site and the ability to remove that activity from your site and claim safe harbor.
What about Google? They make a copy of almost every image on the internet with the full knowledge that they are nearly all covered under an absolute copyright license of "All Rights Reserved", including the right to store and rehost a copy of that content. How would Google's financial outlook change if they decide to follow the letter of the law and never use pieces of copyrighted works "for financial gain" without explicit permission from each rightsholder first?
It seems to me that the only difference between Google and NZBMatrix et al is that the media companies like one more than the other.
Just a heads up for whoever wants to do the same: the process it goes through to create releases is to download all of the headers for the binaries and their parts, then run them through a few regular expressions to determine what goes with what and how it should be categorized and displayed. The free community edition only ships with two regular expressions, and it wasn't creating releases for me. I donated to the project and got a lot of regexes plus some other stuff, and now it works as expected.
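For a sense of what those regexes are doing, here's one illustrative pattern (not Newznab's actual one) that groups article subjects of the common 'Release.Name [01/89] - "name.part01.rar" yEnc (1/37)' shape into releases:

    import re
    from collections import defaultdict

    SUBJECT_RE = re.compile(
        r'^(?P<release>.+?)\s*\[(?P<file>\d+)/(?P<files>\d+)\]'   # [01/89]
        r'.*?"(?P<name>[^"]+)"'                                   # "name.part01.rar"
        r'\s+yEnc\s+\((?P<part>\d+)/(?P<parts>\d+)\)')            # (1/37)

    def group_into_releases(subjects):
        # Collect article subjects into releases keyed by the release name.
        releases = defaultdict(list)
        for s in subjects:
            m = SUBJECT_RE.match(s)
            if m:
                releases[m.group("release")].append(m.group("name"))
        return releases

Real installs need many such patterns because posters use wildly different subject formats, which is presumably what the donated regex pack provides.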
This is a major blow to Usenet. But more importantly, this could be the start of something very, very bad for Usenet. It, and its users, of whom I am sure there are plenty on HN, have so far been able to fly under the anti-piracy radar of the big media companies. This clearly demonstrates that they know exactly what is going on, and may be on the warpath to shut down more indexes and search engines.
Nauseating. Piracy killed the real Usenet, and now pirates are upset that enforcement is killing the goofy little playground they built inside its corpse.
When I was 18, I ran a competitive (on the Freenix leaderboard) full-feed Usenet server for the ISP I worked at. Every ISP in the world could have provided full-feed Usenet access, but for the assholes who loaded the system down with ASCII-encoded binaries. Even while it was possible for an independent provider to offer Usenet to customers, it was still a total nightmare because of the ludicrous storage requirements for binaries, which ensured that only an ever-dwindling number of providers would take the time to offer it at all.
It's startling to me that there's a Usenet at all anymore, since it's essentially been reduced to a collaborative system for sharing pirated binaries.
So I am chuckling at the idea that some young guns who want to be "free" to do whatever they want in their network of computers will come up with a scheme where they use telephones to call up one computer after the next, and addressing will be free-form: a series of "hops" where you tell the computer what sequence of machines will have to be called in order to get your message from you to your destination. You could use a character like ! to separate the hops, making an address like 'bobs-machine!piratebay!alices-machine!alice' that delivers through three hops to Alice.
We'll also need something to deal with these high-latency links. A piece of hardware which generates the ACKs locally so our computers will send faster will sell like gangbusters! It'll really pep up the connections.
I was also running a news server for my school and an ISP around the same time (1994-1996). The alt.binaries hierarchy was a constant headache. The total data flow and storage requirements were already about 10X the rest of Usenet. We had to manually tweak refuse lists when the disk drives filled up.
I wasn't still around, but I expect the second wave of automated binary posting/retrieval tools in the late 90's probably created the surge that changed Usenet access into a separate paid service, no longer something that came with an ISP account like email.
However, it wasn't just warez and porn that killed Usenet. Spam took off there before email, and it wasn't until email spam became untenable that modern automated filtering tools were created.
Why did you have to include alt.binaries.? I thought the alt. hierarchy was basically optional, and it was the news server admin's prerogative to opt in and out of alt.* groups.
That's just one side of the coin, though. While the binaries hierarchy played a role in the "death" of Usenet, services like AOL played another major role. Companies tried to make the internet more comprehensible for the average user and invented forms of communication that they thought would be understandable by the masses and easier to sell (remember those pesky "You've got mail" AOL ads?).
Furthermore, I don't think it's true that Usenet is dead. There are still some active groups that rely on it. It's just that there's so much more these days: several social services competing with each other, and a lot of websites that let you discuss things and target specific niches.
Question for those with Usenet know-how: when downloading raw headers, I have observed a convention where the title is a random string of letters and numbers, and within the post are, for example, 50 rars and 10 pars. How are consumers 'decoding' the title so they know what they are getting? Does the uploader provide a translation? Or can the 'real title' be recovered programmatically? Many thanks.
There are lots of these and they are interesting because the first part suggests a date, 29/11/12, and the second part suggests a sequential number. The size is almost 6 gig.
Aaaaand I'm glad I didn't look that one up on the work PC.
I'm assuming you can get the filename info by looking at either the RAR headers or the first archive.
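If you do have the first archive locally, something like this works; a sketch assuming the third-party rarfile module and an unrar backend are installed:

    import rarfile  # third-party; needs an unrar/unar tool on the system

    def peek_names(rar_path):
        # The archived filenames usually reveal the real release name even
        # when the Usenet subject line is an obfuscated random string.
        with rarfile.RarFile(rar_path) as rf:
            return rf.namelist()

    # print(peek_names("aB3xQ9.part01.rar"))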
Such disappointing news compounds last week's troubles, when nzbs'r'us shut down over similar payment problems. Payment is a huge Achilles heel for these services. Blocking payments was used to hurt WikiLeaks, and Dutch Usenet providers have recently lost the ability to use PayPal for payments too. A popular Bitcoin-esque service can't happen soon enough.
There is NO such thing as piracy, there is only sharing of property between people. When I buy something I own it, no matter what those fools say, and when I own it, I copy it if I like!
Move the site to Tor and let the digital copyright enforcement cops try to figure that one out. All they can do is push it completely underground, and once the site is on Tor they have zero recourse or means to track down the operator.
The site is just text transmission, like torrents. Your connection to the news server would not have to be piped over Tor, just your access to the web interface where you download text files telling your news client what to download.
I guess in this era of Bitcoins and Tor hidden services, it's inevitable that a site using credit card funding and a public IP address for piracy is being shut down. Low-hanging fruit.