I’m sure you’ve read the old fable of “The Boy Who Cried Wolf.” Facebook has made voluntary decisions at the highest levels, reported over the course of the past decade, to shield certain people from its content rules that apply to the general public. So if it was a bug this time (and I have no reason to believe that it wasn’t), I’m sure you can understand people’s skepticism about it.
Moreover, I assume this bug was reported internally, probably pretty quickly. How long did it take to get fixed? If the fix wasn’t prioritized and corrected within, say, a day (along with a regression test to ensure it never happens again!), then that would be pretty damning of the company’s culture and priorities as well.
That's completely inaccurate. He responsibly disclosed the bug but Facebook security didn't have the right privacy settings turned on and was completely unresponsive to a huge security hole. So he did something non-malicious to get their attention and it was fixed in 5 minutes. See how that works?
I think people know that this was a mistake. But it shouldn't have happened. Facebook is a massive, wealthy company with many, many programmers. Bugs like this should not happen, considering that privacy is one of the biggest media problems Facebook has.
I know the timeline. As explained by the security person at Facebook, it was not a bug. They were mistaken. However, the fact is, he wasn't penalized for that incident. Rather, he was penalized for the incident with MZ's account.
So, my statement still stands. Unless you want to contend that we should assume we know better than Facebook Security and ignore what they say is and isn't a violation/bug?
It looks like a mistake, and FB is actively communicating that it's trying to restore the content, but everyone is still witch-hunting.
I thought people here were more sensitive to the fact that developers at all companies leave bugs and undesired behaviour behind them.
This may very well have been spec'd and developed 10 years ago, and now the reality has changed, but not the code. Because let's face it, it's not exactly an everyday use case.
It's an assumption that they even care or are looking into it. So far, the outcome is that they are blaming the guy who didn't follow "their rules" (which likely wouldn't have been clear to him due to a language barrier). This should hit mainstream media, though, because if that kind of bug exists, what else exists that Facebook doesn't even know about, that's being taken advantage of?
Facebook is wrong on this issue. OP made a good faith effort to report the problem. When this failed, he demonstrated the bug in a non-destructive way. He did not post maliciously, nor did he use the bug to obtain confidential information. When the channel set up by Facebook failed, he took the problem to the CEO. I will post this issue to various social media outlets until the OP is fairly compensated. Facebook's actions here are deplorable and discourage users' efforts to report bugs.
Except that a bug implies it was accidental. Everyone deserves the benefit of the doubt, but Facebook's history is exceptional in this regard.
Similar to the OP, I have ethical concerns with some aspects of what Facebook does, or at least obvious harmful side effects of what they do. But I think for me the reason they bother me more than Google or others often lumped into the same category is the consistent and, as far as I can tell, completely unabashed disingenuousness they display (e.g., the history of public statements made every time this has happened going back almost to their founding days).
And in all likelihood, someone else monitored the incoming reports, recognized that this was a high-priority issue, and fast-tracked it to the developer. On top of that, I don't know Facebook's process that well, but releasing something that quickly probably involved some coordination to get it out so fast. Maybe someone else verified it, maybe someone fast-tracked it through the approval / launch process, etc.
Even things like bugfixes are rarely entirely creditable to only developers.
Judging by the name of the endpoint, it probably wasn't a super-complicated fix anyway - just disable / blacklist the endpoint that was obviously a mistake / test.
I'm not familiar with the Facebook incident you are referring to, but it doesn't sound like something caused by a bug? The article is about bugs; i.e. unintentional tragedies. If what you're referring to is not the result of a bug then it's off-topic (though perhaps not unimportant).
Really? Just because FB's security team was dismissive of a real bug report due to a language barrier they could have overcome with the tiniest bit of due diligence?
Thank you for your honest response. Mistakes happen all the time.
The thing that concerned us most is that, as users of Facebook, we had the impression that all efforts to report the issue went unheard. There was literally no feedback whatsoever. I totally get that Facebook can't respond individually to error reports; it's just got too many users for that. Nonetheless, we felt really helpless in that situation.
I would never have imagined that anyone from Facebook would answer this HN post. So again: kudos for this statement, and thanks for resolving this issue.
When I first saw this guy's blog, just looking at the initial bug reports he mailed, I understood the nature of the bug he had discovered. It's strange that the people responsible for maintaining security at Facebook failed to grasp that. It could be that they get so many bug reports a day that they have gotten into the habit of making fast initial judgments (which is natural). Maybe they made an initial judgment about this guy and, being biased, failed at their job. If Facebook does not review the bug reports it dismisses, and if this guy had not posted on MZ's profile, this bug would probably still exist.
We have identified this bug and closed the loophole. We don’t have any evidence to suggest that it was ever exploited for malicious purposes.
Yeah, it's not like Facebook already solved the problem. It's fun thinking of them as utter incompetents who don't spend years executing very subtle redesigns without alienating their hundred million users, because they haven't been radically changing the site since it first launched.
I am making a case for the OP's comment that Facebook may have made a genuine mistake by introducing this bug - like they literally called out in their statement.
A bug is a bug. Whether it allows a hacker to sneak in to steal all your data or whether it allows a company to collect data it wasn't supposed to (as in this case Facebook specifically mentioned that it didn't turn off the feature though it intended to).
Interesting to see the difference in perspectives: Reddit is jumping on that quote about it being a bug and saying it's definitely intentional, while HN is taking a more conservative approach. Personally, I always err on the side of bug/incompetence rather than maliciousness in matters like this because even Facebook isn't brazen enough to pull something this obvious.