> “We can’t build it for the iPhone because it wouldn’t make it past the App Store approval process,” Mr. Tanjeloff said.
The app is, by definition, spyware. It secretly, continuously forwards all SMS messages to a third party, and “there is no visible icon or shortcut to access it.”
I'm the only one who knows the password and swipe pattern to unlock my phone, so I'm the only one who installs apps on it. Besides, every app is listed if you decide to check your installed app inventory.
An up-to-date iPhone is not trivial to jailbreak, especially if it isn't yours. Jailbreaking is a specific kind of rooting. It "ought" to be virtually impossible.
Then good luck hiding your trail, reassociating the Apple ID, hoping no apps broke due to the jailbreak (it's happened to me), keeping the device associated with the owner's iTunes, etc.
There are times when jailbreaking an up-to-date iPhone is not possible, but right now the latest firmware can be rooted in a few minutes with a very simple program. It does not affect your Apple ID, your iTunes association, or any other apps. It just gives you root access and installs the jailbroken app store, from which you can install OpenSSH, etc.
I'm not sure if there are any trojans available, but if there were, you could easily install them and then either delete the jailbroken app store or just hide it from SpringBoard.
This process could easily be done in 10-15 minutes as long as you have physical access to the phone.
" It secretly, continuously forwards all SMS messages to a third party, ..."
It's only secret if you're not the one who installed it.
I think of spyware as apps that claim to be for one thing but instead (or also) do some remote reporting that I, as the installer, would never approve of had I known. This, though, is called Secret SMS Replicator.
If I had a need to spy on my children (perhaps I was concerned over who they might be texting) this would be quite handy.
It is unethical to use this software on one’s children without letting them know. Not to mention hypocritical if one’s purpose is moral instruction: in giving them a phone, its implied purpose is communication with parties of their choosing, not with those parties plus mom/dad secretly listening in. Installing this software is deceitful.
Some consider communication technology almost like “soft cybernetics.” By that line of thinking, secretly reading every SMS is like tapping into your children’s minds. This may be desirable, but I would consider it a violation of their right to privacy and personhood, and (mild) psychological abuse.
But this whole discussion is on the usefulness of SMS forwarding, which is orthogonal to the security & privacy issues of the original article, and the implied endorsement of a quality App Store.
> It is unethical to use this software on one’s children without letting them know.
I disagree. If you had written "spouse" or "significant other" or really any other adult I'd agree with you, but I don't think you can make that absolute statement when it comes to (minor) children.
Letting the child know about the monitoring software would be the good and trusting thing to do, but "unethical" overstates the case significantly. I can imagine cases where it would reflect poor judgment or be inappropriate to snoop in this way, but I can readily imagine scenarios where it would be unethical NOT to secretly monitor a child's communication.
Part of a parent's job is to keep their children out of trouble until the child is old enough to do that for themselves. (Another part, as you suggest, is teaching the child how to do just that.) Absolute statements about privacy kind of go out the window when you are talking about a relationship and responsibility of that nature.
It is not hard to imagine a child (not every child, but certainly some, especially in the "tween" age range) who has the capacity and desire to hide some communication from his or her parents but lacks the judgment and life skills to stay safe from harm when doing so.
Suppose your child is engaged in inappropriate or damaging behavior such as bullying or being bullied, self-harm/cutting, drug abuse, illegal activities, or things even more sinister. If your child wants to hide that activity from you, you could tell your child about the monitoring software, but then they'll be careful to find another channel for communicating about it. Alternatively, you could not tell them about the software and gain some additional insight into their world, whether they want you to have it or not.
Look, it wouldn't be my first course of action, I wouldn't do it without provocation, and I would be very very careful how I made use of the information, but I wouldn't hesitate to violate my minor child's privacy if I thought it was important to protecting her well being. Monitoring text messages is nothing, I'd read her diary if necessary.
So moral absolutes apply to adults but not children? There is never a case when spying on an adult could save their life?
I'm not sure about your examples. If you suspect your child may be suicidal, talk to him or her. If you do NOT suspect bullying or cutting, how do you know that spying is justified? I think you're "begging the question."
That's a bit of a strawman, I'm pretty sure that's not what I said.
This much I know: an absolute right to privacy does not apply to children.
Like you I mean this in the ethical sense: there are (IMO) beyond doubt cases where it is ethical for a custodial parent to invade the privacy of a child, possibly without the child's knowledge or consent. But I suspect this is also a legal fact virtually everywhere, which suggests to me that this may not be a controversial perspective.
I'd be surprised to find and interested in understanding the perspective of a parent that disagrees.
This seems like a no-brainer to me: at some stage kids just aren't capable of managing their own lives. I would take action to stop my (hypothetical) 13-year-old child from cutting just as I would take action to stop my 3-year-old child from playing with knives. The actions would be different. Talking about it would certainly be a preferred option, probably the preferred option, in dealing with the 13-year-old. If it came to monitoring, telling the child that their communication will be monitored would be preferable to not telling them. But I certainly wouldn't just throw up my hands as say "c'est la vie" if those things didn't work and I certainly wouldn't hesitate to invade the child's privacy if I thought it would have a positive impact in the long run.
I did not and cannot make a universal statement about "moral absolutes", only about the "right to privacy".
Personally, I'm not sure whether an absolute right to privacy applies to adults. You make one argument against it (for their own protection). I disagree with that: I think that if you are a mentally competent adult, society eventually has to give you the flexibility to harm yourself if you choose to. But I can understand where that argument is coming from. There are other arguments against an absolute right to privacy that I think are more widely accepted, such as violating an adult's privacy for the protection of others, but that's a slippery slope to go down.
Also, re:
> If you do NOT suspect bullying or cutting, how do you know that spying is justified?
one of us must be misunderstanding the other. I'm in 100% agreement with you. If I did not suspect an issue that could be helped by "spying", I wouldn't be doing it.
Well, that’s certainly an eye-opening use case for that software. I understand that setting the iTunes preference “Encrypt iPhone backups” and/or not allowing your spouse to log in as you would stop this. And deleting SMSes before syncing would also shield them, I assume.
Bad enough if he'd said "make sure you trust the people who handle your phone", but he went full-bore, saying "make sure people trust you." He's preemptively blaming the victim — the person on whose phone his software runs.
The lede for the article soft-pedals the degree of scumbaggery involved in the sale and use of this product. It's software for the domestic-abuser demographic.
If you think you need something like this in your relationship, your partner deserves something better than you. If you think your partner betrays you, talk to him/her.
Not just your life partner: people could use it on their manager, a business partner, or someone they're interested in dating/stalking.
Given that text messaging is the dominant means of communication for many people, it really becomes crucial that you don't let your phone out of your sight.
Who decides whether it should be forbidden or not?
What if I'm traveling to a high crime area? I might want to install this on my phone, to make recovery simpler in case someone steals it and starts messaging their friend bragging about their great new phone, and to meet them at Taco Bell in five minutes so they can show it off :)
Google does. It seems pretty clear that it violates the Android Market Program policies, specifically the sections on impersonation, private information, and illegal activities: http://www.google.com/intl/en_us/mobile/android/market-polic... I would expect Google to remove the app from the Market and any Android phones it was installed on.
While your use of the program would be fine, the promotional video linked in the New York Times article makes it clear that the intended use is to spy on your significant other's cellphone. There are much easier ways to recover stolen phones than to install a program on it that will forward text messages to your non-stolen second phone, but good luck with that.
Actually, does it break any of the rules listed? It's very upfront about what it's doing: it sends your text messages to another phone number. The issue isn't the app per se, but that the person installing the application is not the owner.
For example, an app that allowed me to browse my SMS messages wouldn't be blocked, but clearly if someone stole my phone, they could then browse my SMS messages.
The app isn't deceptive to the person installing it.
And can Google remove the app if it wasn't installed from the app store? I can sideload it can't I?
I like that it is like the desktop: open enough to be abused. I prefer that to approval-process-based systems.
I like that the choice is available (the world would be a poorer one if it didn't have such systems for people who want them), but this makes me just that much more happy with my iPhone. For me.
Not that such a thing would be impossible with an iPhone, of course: someone could jailbreak it and install a similar app. But it'd certainly be harder and more time-consuming.
There's only so far that static analysis can go, however. There are a few key issues with static analysis in the context of Android:
1) You can generate and load new classes at runtime, making complete static analysis of code on the Dalvik side impossible.
2) You can run native code, allowing you to circumvent static verification in about a billion different ways. Note that they could take the Google Native Client approach and force binaries to be statically verifiable to a large extent, but then you can't JIT-compile code.
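Point 1 can be demonstrated even in plain Java, without any Android machinery. This is a minimal sketch (the class name `java.util.ArrayList` is a stand-in for code a real app might fetch at runtime, e.g. a dex file fed to a class loader): a static analyzer scanning the package cannot know which class this code will actually run.

```java
import java.lang.reflect.Method;

public class DynamicLoad {
    // Loads a class by name at runtime and invokes a method on it via
    // reflection. When the name comes from outside the package (network,
    // config, downloaded dex), static analysis cannot enumerate the
    // behavior that will actually execute.
    static Object loadAndAdd(String className) throws Exception {
        Class<?> cls = Class.forName(className);          // resolved only at runtime
        Object list = cls.getDeclaredConstructor().newInstance();
        Method add = cls.getMethod("add", Object.class);  // looked up by name
        add.invoke(list, "loaded at runtime");
        return list;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(loadAndAdd("java.util.ArrayList")); // [loaded at runtime]
    }
}
```

On Android the same trick is stronger still: a `DexClassLoader` can pull in whole classes that were never present in the APK that the Market's scanner saw.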
Static analysis, while cool, is not the solution to problems like this. The solution is fine-grained authorization of access, which Android is already halfway to providing. Honestly, the problem with authorizations isn't a technical one (especially when you enforce a pure managed-code system like Dalvik), but a user-experience one.
That said, some degree of runtime analysis to verify that an application is not trying to escape its bounds can be helpful.
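For concreteness, Android's fine-grained authorizations are declared up front in the app's manifest, and the install dialog lists every one of them — which is exactly where the user-experience problem lives. A sketch (the permission names are real Android ones; the package name is hypothetical):

```xml
<!-- AndroidManifest.xml: an app that wants to read incoming SMS and use
     the network must declare both permissions. The user sees this list at
     install time; whether they read and understand it is another matter. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.smslogger">
    <uses-permission android:name="android.permission.RECEIVE_SMS" />
    <uses-permission android:name="android.permission.INTERNET" />
</manifest>
```

An SMS-forwarding app like the one in the article would necessarily request both, so the information is there for an attentive installer — just not for the phone's actual owner, who never saw the dialog.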
That was a gut reaction, mostly countering stupid signature based "verification" of binaries, a la typical anti-virus software.
In terms of locking down an Android box, the least they could do is what you suggest: fine-grained control. Flash apps have this; there is a bit in the file that lets the applet choose between file-system access and network access. I thought that was clever. Another thing is to allow applications to send back only data that they bring in (e.g., for version verification), along with a generic set of data about device capabilities, location, etc.
I'm just awe struck that an app that you have to go seek out, download and install on your phone is classified as "malware".
If I download Firefox on your desktop and hide it, is it really fair to call it malware?
edit: Especially since it's not on the Market, so someone has to be messing with your phone quite a bit to go find the link, download it, enable non-Market sources, and install it. Again, I just don't feel it deserves to be lumped in as "malware". To me, malware is something a user gets by accident, not something that someone intentionally, physically installs.
I initially thought this was going to be some rogue app that looks like a game, fart machine or something innocuous but behind the scenes sends your text messages to the developer. That would definitely be unfortunate and is a risk. Although if a fart machine app asks for access to your text messages and that doesn't cause you alarm, that's a separate issue.
This is more of an issue of identity / authentication. The phone doesn't have any way of knowing that the "you" which installs the app isn't the "you" that is the primary user of the phone. The phone tries - it offers you a swipe code lock, you can even install other locking systems.
Imagine if the app was not described as a marital eavesdropping tool but rather as a way to help you keep a record of your text messages that you can view via your account from your computer. It doesn't install an icon because there is no on-phone UI and the icon would just clutter the apps menu, a convenient feature to be sure. Same app, different story. The Google Voice app has a very similar idea.
It is illegal to wiretap someone else's phone. It is illegal to open someone else's mail. It is most likely illegal to install this App on someone else's phone.
Just because you can commit wiretapping crimes with the application doesn't mean that the app is illegal and should be removed. It's not illegal for people to install this app on phones that they own, possibly for added security in case someone steals the phone and uses it to send text messages, or if you just want to log all of your text messages at one number.
It's like removing a torrent application just because you can illegally pirate music with it. I oppose Google's decision to remove the application from the Android Market, as the application is appropriately labeled and does exactly what it says. It is not malware or spyware.
Who's asserting it's illegal? You seem to be setting up a straw man. Google should remove it because its use is unethical, and its continued presence is an indictment of the market's legitimacy.
Isn't that the same as Cisco routers that secretly provide packet data to the NSA? Are we surprised? What about the AT&T closet that provides physical access to your backhaul?
The app already was pulled from the market.
The Market works (much like Wikipedia) through openness and a broad community that monitors it, without heavy censorship by one vendor.
There is a whole category of apps like this for the BlackBerry -- apps that you deliberately install on a phone to spy on someone else who is using it. It is categorically different from conventional spyware because it doesn't misrepresent itself or do anything malicious to the user who installs it. If users are infiltrating each other's phones and spying on each other, that's between them. You may find it sleazy, but the developer isn't spying on anyone.
> The app is, by definition, spyware. It secretly, continuously forwards all SMS messages to a third party, and “there is no visible icon or shortcut to access it.”
It’s like an ad for iPhone.