>> Seny Kamara: Okay. It's a pleasure to have Naveed Muhammad speaking again today. Muhammad is a student at the University of Illinois at Urbana-Champaign, and today he'll be speaking about Android security.
>> Naveed Muhammad: Thank you, Seny, for the introduction. Today I'll talk about our work on Android device driver customization and how the security of the phone is reduced during this customization process. This is joint work with my colleagues from Indiana University. Android has more than 80 percent of the global smartphone market share. There are more than a million apps on the Play Store. The good thing about Android is that it's open source, and all of our observations are based on this open-source nature. What we show in this paper is one bad thing that happens precisely because of this open-source nature. Because the Android code is open sourced, the manufacturers customize Android to support different hardware or to provide different functionalities. For example, they have their own default apps. They provide different functionalities so they can somehow build better phones than their competitors. They always need to modify drivers because, for example, if they want to have a camera from a different vendor, or any device from a different vendor, they need to provide their own drivers for it. For that reason they need to heavily customize both the kernel layer and the application layer. How this works is that Google first releases the Android source code to these vendors. Google releases its own phones with the leading-edge version, but it gives the other vendors some time before releasing it officially, so that they can also put the state-of-the-art Android in their phones. Typically, this time is something like six months. Vendors add new applications and drivers to the kernel layer, and there was a paper at CCS that showed that preloaded applications leak system capabilities and sensitive information to malicious apps. AOSP stands for Android Open Source Project. Since 2009 there have been 19 official Android releases, so vendors need to customize very frequently, and for every different phone they need to customize the version that was released
at that time and then test on that version. Before talking more, I will just introduce the Android architecture. Android is built on top of the Linux kernel, and on top of the Linux kernel there are libraries and the Android runtime, which is actually a modified form of the Dalvik virtual machine, and there are C libraries. On top of this layer is the application framework, which provides different services to the application layer, and then on top, at the application layer, developers can write their applications. Vendors do add applications at this layer, system applications, and they also heavily modify the Linux kernel. We observed that there are no changes at this level and minimal changes at the framework layer. As you know,
the device drivers work at the kernel layer, and drivers communicate with the apps through the framework layer. This customization process should not degrade security at either the framework layer or the kernel layer. Actually, if you are familiar with the Android permission model, what happens is that the permissions are checked at the application framework layer and then enforced at the kernel layer. So if an app can somehow bypass the application framework layer and talk directly to the kernel layer, for example by just writing C code, since the C code sits below the framework, then it can talk to the kernel without going through the application framework layer, and it can just bypass the permissions. If vendors modify something at the kernel layer, there is a need to make sure that it is consistent with the framework layer. The customizations that are done at the kernel layer change different things. We analyzed different phones to see what types of customizations have been done, and on the Samsung Galaxy Ace 3 what you can see is that 43 percent of the customizations in the kernel are done in the drivers folder. In this architecture there are also drivers for processors, for the GPU, and these other things, which we have not considered in our study because we cannot exploit them.
Twenty percent are other changes, done for other purposes. On the Samsung Galaxy SII you can see that there are even more driver changes and fewer remaining changes. This just means that at the kernel layer, they heavily customize the drivers. At the framework layer they do not customize much; it is pretty much the same. The Android security model is built on top of Linux user and process protection. Every app on Android is a user. It is assigned a UID, and then it is protected using Linux user isolation. Every app can only access the data in its own sandbox. It cannot access the data of any other app. If an application wants to access a system resource, for example the camera, Bluetooth, or NFC, it needs to ask for a permission at install time. When you install an app it will ask for permissions, and at that time the user agrees, and after that the app has these permissions. What happens is that at install time, when you give an app permission for some resource, the app is granted the permission and it is assigned to the corresponding Linux group. There will be a group, for example, for GPS, and the app will be assigned to that group.
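The group-based enforcement just described is ordinary Linux discretionary access control. As a rough sketch of the read check the kernel applies to a device node (the UID/GID values below are illustrative, not authoritative Android constants):

```python
import stat

def can_read(mode, file_uid, file_gid, proc_uid, proc_gids):
    """Mimic the Linux DAC read check the kernel applies to a device node."""
    if proc_uid == file_uid:
        return bool(mode & stat.S_IRUSR)   # owner read bit
    if file_gid in proc_gids:
        return bool(mode & stat.S_IRGRP)   # group read bit
    return bool(mode & stat.S_IROTH)       # world read bit

# An app granted the camera permission is added to the camera group.
# The GID and app UID here are illustrative values, not real constants.
CAMERA_GID = 1006
APP_UID = 10052

# AOSP-style node (mode 0o660, owned by root/camera): only the camera
# group can read it.
assert can_read(0o660, 0, CAMERA_GID, APP_UID, proc_gids=[CAMERA_GID])
assert not can_read(0o660, 0, CAMERA_GID, APP_UID, proc_gids=[])

# A customized node made world-readable (0o664) is open to any app,
# regardless of the permission it was granted at install time.
assert can_read(0o664, 0, CAMERA_GID, APP_UID, proc_gids=[])
```

The last case is exactly the flaw discussed later in the talk: once a vendor ships a device node with the world-readable bit set, the group membership that the permission system manages no longer matters.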
This is how they check whether the app has permission at the kernel layer. Resources on Android need to be protected both at the framework layer, because the framework layer checks whether an app has the permission to access the resource, and at the same time at the kernel layer, because the kernel is where the resource exists, where the device node, the device file, and the driver exist. These two need to be consistent with each other, and this is a very challenging task. If you modify anything in the kernel, making sure that it stays consistent is very challenging, because vendors have just six months to customize the OS before Google officially releases it. It's very difficult to do properly, and so they just don't care about security. As we observed in the study, the driver programmers don't have security expertise. They don't care. They just customize it and release the phones. There is no documentation provided by Google for this whole customization process. The vendors have to do it on their own. To study this whole problem systematically, we developed an approach and then we studied a large variety of the security hazards that are caused by driver customizations.
Just to make sure, because these terminologies are a little confusing: by phone we mean the mobile smartphone, any smartphone, and device actually means the hardware on the phone; for example, the camera is a device, GPS is a device. By device node or device file we mean the interface for a device driver. Every device in Linux has a corresponding file, so by device node we mean that file. For example, for the camera there will be a file like /dev/video0; every device has a corresponding file in Linux. Then we have other files, for example logs, and we call them device-related files. So how
do we measure the security of these customized phones? What we do is we compare the protection level on the customized phones to the corresponding version of the Android open-source code. For example, if a phone has Android version 4.0.3, then we compare the security of this version with the Android Open Source Project of the same version; that is, we use it as a reference. Here the assumption is -- sorry, this is an error on the slide; this should be AOSP. We are not saying that AOSP has the best security. What we are saying is that this customization process should not degrade the security of Android below what AOSP is providing. We checked whether the vendors degrade the security below what the original Android source code provides. The high-level idea is that first we need to automatically identify the Linux files, the device node files, that correspond to different devices. For example, the camera will have a device node in the Linux filesystem, and we need to identify which file corresponds to the camera. This is a nontrivial task, and I will explain why it is nontrivial.
After we have done that, we need to compare levels of protection. We just look at the permissions, for example whether the file is publicly readable or not, and we check the permission against the corresponding file in AOSP. If the customized file has a different permission, then we say that it can lead to a security hazard. It is possible that it does not, but it is also possible that it can. If we detect a permission change, it might not lead to a hazard, but in some cases it can. For example, on the Samsung Galaxy SII the camera device is publicly readable. Any app can read the camera device without any permission. I'll also show you the attacks we carried out by exploiting that. It does not need to be publicly readable; it should be readable only by root and by the apps in the camera group. This is how the camera flow works: if you want to take a picture, you go through this manager, and this manager will check the permission and then link you to the camera service. This is a virtual link. How it actually happens is that the app first talks to the binder device. The binder is just a device, just a driver, to which the app sends an inter-process call. Whenever an app wants to communicate with this service, it needs to send a message through this binder device. This is the virtual link, and this is how it actually works. Then the service talks to the camera device through this binder, and then it goes down like this. It can talk through the framework layer permissions, and now you can see that if there is another app that is a native app, which means you have written it in C, then you have bypassed the application framework layer. If you have written it in C you can access the libraries directly, bypass this whole permission system, and just talk to the camera. The whole thing is that the permission is being checked here, but it should also be guaranteed that this file is not publicly readable. If it is, then you can just go like this, bypassing the framework layer, and just read the camera file, just get access to the camera file. There are
many challenges to study this problem on a large scale. Correlating the devices to the corresponding device nodes is hard. For example, take an NFC device; NFC stands for near field communication. It is one device, but the name of this device node on the Google Nexus 4 is like this and on the Galaxy SII it is this, and it varies on every phone, and we don't have any idea what it is without knowing something about it. This was really hard to figure out, so we created a tool to do this mapping. The name can be arbitrary; it does not depend on anything. To study this problem in a systematic way we developed this tool that we call Addicted, for Android Device Customization Error Detector. It has two components. One is the device miner, which finds the device nodes for a device, and the other is a component that we call the risk identifier, which checks the corresponding device nodes against their counterparts in the Android source code. This is the complete design of Addicted. This is the smartphone, and the smartphone is running an application that is called the Test Runner, and then it has our dynamic analyzer that I'll explain later. What happens is that we have a test case, and a test case is just something like take a picture. If you want to find the device node for the camera, the test case can just be take a picture. For the reference test case we write the same complete application, but it will not take a picture, so the noise that comes out of it -- okay, let me explain this another way. What happens is that the Test Runner will run this test, like take a picture, and the dynamic analyzer will analyze all the system calls. The hope is that among the analyzed system calls, one system call will touch the device node file, and from there we will pick out that this file is related to this device. Now the problem is that there is a lot of noise. There are many processes running. How do we find out that this call corresponds to this device? For that we have the reference test case, which is the same app but it does not take a picture. Then we run the test and analyze it again, and here we just filter out the noise, and we do it repeatedly, many times, until the result becomes constant, so we are sure that the device node we are getting is there just because of the test case, take a picture, and is related to the device that we need. At that point we add it
to a database. But even at this point we just know that this device node in the database is related to the camera on this phone; it can have a completely different name on AOSP. So we also fingerprint the system calls for this device, and then we look for an identical fingerprint in AOSP for a file. If we find identical fingerprints, that means these two files are doing the same thing. For example, if two files have identical fingerprints and this one was the camera, then they both correspond to the camera. At that point we are sure that these two files correspond to the same device, and then we correlate the permissions and see what the permission level is in AOSP and what it is in the customized OS. After that we report whether it is a customization flaw or not. LCF stands for likely customization flaw, because it might not be a customization flaw; it might be a false positive, but we output all of them. Any questions? Okay. The device miner traces operations on Android to identify the related Linux device files. Static analysis for this task is very hard, because the code is written in both Java and, in the kernel, C, and also there is no static analysis tool available that works here. Dynamic analysis, if you are going to do it the usual way, requires extensive instrumentation of the operating system. What we did is we wrote our own
dynamic analysis tool. What it does is, when we run the test and the app wants to communicate with the device, it goes through this channel. When the request comes to this point, we know that the app wants to communicate with this service, so we instrumented the binder device, we added code here, and from the binder device we actually launch the tracer, which is strace. When strace is attached to a process, it records all the system calls generated by that process. We attach it to these system processes, and we also attach it to the application, because sometimes the application might access the kernel directly and not go through here, so we attach it there also. This is where we attach it, and then we record all of the system calls, and once we remove all of the noise and everything, we get a file descriptor for this file, and then we find how the traced file is related to this device. We do this with dynamic analysis, and for tracing we use strace. This is a test
case that we ran for the different devices. We ran it for camera, NFC, audio, radio, external storage, and GPS, and these are the permissions they needed. If you want to access these resources, you need these permissions. These others, like the frame buffer, you cannot access at all. The frame buffer actually stores your screen; whatever appears on your screen is stored in the frame buffer, so you cannot access it. It's very dangerous. But still there is a device node for it, and the same for these other devices. And then we run these types of test cases to test each device. To remove noise we do extensive differential analysis, so we run the test cases again and again and again, such that we get a constant intersection of all of these test cases.
This means that these nodes actually correspond to the device and are not there because of some noise. After that we have the risk identifier, which just compares with the reference. For example, this is one device node on a customized phone and this is the reference. The reference permission is -- sorry, this is not publicly accessible, it is 0, and this one is publicly accessible, so this is a customization flaw, which means that it allows any app without any permission to read this file, which we do not want. We did this on three different phones, the Galaxy SII, the Galaxy Grand, and the Galaxy Ace 3, and we found these customization flaws. On the Galaxy SII we can access the camera node; this is the front camera and this is the back camera, and both of them are publicly accessible. The same goes for the Galaxy Grand and its input touchscreen. This is another vulnerability that we discovered: whatever you touch on the screen is logged, and this should not be publicly readable, but it is. Then there is the frame buffer, which, as I showed you on the slide before, should not be accessible by any application; it stores everything that appears on the screen. On the Galaxy Ace 3 it is public. All of these
vulnerabilities allow very high-impact attacks, and we developed three different attacks based on these issues. Our adversary model for these attacks is that there is a malicious app running on the phone. We assume there is one app running on the phone whose intention is to attack, and it can be any app. For example, if Angry Birds wants to do this attack, it can. If anybody wants to do this attack, they can. All of the apps that we developed do not have root permissions, and they also don't have permission to use the device. For example, if I want to take a picture with these apps, they do not have the camera permission. We
first developed the touchscreen logger. Our tool Addicted found the /dev/input/event2 file; this is public on the Galaxy SII. Many of the phones have this problem, and the Galaxy SII is like the flagship phone of Samsung. This was the phone that gave them their break. Millions of people are using this phone, and it's not just limited to this one; the problem is relevant on many of the other phones. We exploited this one vulnerability and we can log the screen without any permission. If you can log the screen, then, since you know where the keyboard is on the phone, you just need to figure out that this x-y position corresponds to A and this position corresponds to B, so whatever you type, any app can steal this data. If you type your password, they can log it, again without any permissions. Then we have a camera attack on the Samsung Galaxy SII that allows any app to take pictures without any permissions. Your phone can take pictures and you would never know what's happening. The camera attack is nontrivial. Normally, if a camera app wants to take pictures, it goes through permission checking, it talks to the camera hardware abstraction layer, which is how you talk to the driver, then at the kernel layer it talks to the driver, and after that to the hardware. We first tried talking to the camera driver directly, because we have all of the permissions, it's publicly accessible, but this way it was hard to figure out the protocol, and the phone kept rebooting again and again, so we didn't know what to do. What we did instead is we wrote our own app and put the complete open-source driver library into our app as a native library, and then we just talk directly to the camera driver. The direct way is also possible, but it would have taken more time. Then we exploited one frame buffer issue, and now we can take screenshots. If you can take screenshots, that pretty much means you can steal whatever is showing up on your screen, which means everything: you type your password, you take pictures, anything. This is, again, because this file is publicly readable, and for this app we just need the SD card permission so we can store the screenshot. Now I can show you the attacks; we have the attack demos. This is the touchscreen logger attack. You can see that this app is running on the phone, and we will show you that it does not have any permissions. If the app had any permission it would show up here; it does not have any. Now we launch the app, and you can see this is actually our server, and you can see it is getting the location, the pressure, and everything that is hit on the screen. You get all of the coordinates on the screen. We have just blinded it because we needed to submit the paper; we didn't change it. This is our touchscreen logger attack. This
next is our screenshot attack. Again, we show that this app just has the SD card write permission, so it can write to the SD card, because we need to store the screenshot somewhere. Now you see the screen captures; at first there are no screen captures. This can happen in the background also. For the demo we wrote the app so we could show that we started it and everything, but everything could happen in the background without user interaction. This app is taking screenshots periodically. It has taken all these screenshots without any permission, and there is no permission for taking screenshots. Screenshots are very dangerous, and the app doesn't have any permission to take them. It just
goes on and shows this. We have this final attack, which is very interesting. Now it can take pictures. Again, it does not have any permission; it does not have any permission. This is our attack app. The video is not… there is just a coin showing that we are taking an image. We just started the app. Okay, I'm not sure if you can see it, but we put the coin there and then it took a picture of the coin. This is all without any permissions. These are very high-impact attacks.
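To give a sense of how little the screenshot attack needs, here is a minimal sketch of its core step, assuming a world-readable framebuffer node. The actual node path (something like /dev/graphics/fb0) and the screen geometry vary per device and are assumptions here.

```python
def grab_framebuffer(fb_path, width, height, bytes_per_pixel=4):
    """Read one raw frame from a (wrongly) world-readable framebuffer node.

    On an affected phone fb_path would be something like /dev/graphics/fb0;
    width, height, and bytes_per_pixel must match the panel and are
    illustrative assumptions here, not values from the talk."""
    frame_size = width * height * bytes_per_pixel
    with open(fb_path, "rb") as fb:
        # An ordinary read() is all it takes: no ioctl, no permission check
        # beyond the (broken) file mode on the device node.
        return fb.read(frame_size)
```

An app then only needs the SD card write permission to save the raw pixels, exactly as in the demo; converting them to a viewable image can happen off the phone.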
If you can take pictures without any permissions, that's very dangerous, and that's on flagship phones. This was just on three phones, so then we did a large-scale measurement study to find out how widespread this problem is. For this we collected approximately 2,500 images, and these correspond to 288 distinct models; some of them are different upgrades and different updates for every model, but we have 288 distinct models and 2,500 images. Samsung provided these images. They uploaded all of the images online, so we were able to download them from there; it had to be done through this other third-party website. You can see the distribution of Android versions. Most phones had Android 4.1.2 and some had 4.0.4; this is just the distribution of Android versions. We did a wide
analysis on all of these images, 2,500 images, and we found that 53 percent of these images contain these likely customization flaws, and they belong to a variety of phones. Forty percent have the video vulnerability, which means that on 40 percent of these you can take pictures without any permission. The other LCFs that we discovered might or might not be a problem; they need to be studied in detail. We discovered there are like 28 input issues, and there is just one node that corresponds to input. For video there are five different nodes that we discovered, and the frame buffer node is standard, just fb. There were like 1,290 images with these LCFs, and 952 images had the video issue. Some had the input issue and some had others. But you can see that these are widespread, and our attacks were possible because of this: our camera attack is possible just because of this, so out of 952 images there can be something like 100 models. There are millions of phones on which you can take pictures without permission. This is the distribution of how these are spread throughout the world.
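The risk identifier's comparison across all those images reduces to checking, node by node, whether an image grants permission bits beyond the same-version AOSP reference. A minimal sketch, with illustrative node names and modes:

```python
def find_lcfs(image_nodes, aosp_nodes):
    """Flag device nodes whose mode grants more access than the AOSP
    reference of the same Android version: likely customization flaws."""
    lcfs = []
    for path, mode in image_nodes.items():
        ref_mode = aosp_nodes.get(path)
        # Extra permission bits = bits set in the image but not in AOSP.
        if ref_mode is not None and (mode & ~ref_mode & 0o777):
            lcfs.append(path)
    return lcfs

# Illustrative reference and firmware-image modes (not measured values).
aosp = {"/dev/video0": 0o660, "/dev/graphics/fb0": 0o660}
image = {"/dev/video0": 0o666, "/dev/graphics/fb0": 0o660}

# Only the camera node grants extra (world read/write) access here.
assert find_lcfs(image, aosp) == ["/dev/video0"]
```

Anything flagged this way is only a likely customization flaw; as the talk notes, it still has to be examined to see whether it is actually exploitable.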
Each image has a code; if an image is for China, it has a code like CHN. From that we figured out that the Chinese phones have the most LCFs, and there are fewer in the U.S. Maybe this is the NSA; the NSA has a [indiscernible]. I'm not sure. But for some reason we find that the same device model distributed in China can have different LCFs than the same model in the U.S. Different device manufacturers and different carriers, like AT&T, China Mobile, and T-Mobile, also customize these OS images, and we found that from the code; [indiscernible] have the most. Model with LCF means that there can be customization flaws that we have not confirmed; confirmed means that we know they can cause security issues. You can see that they are widespread, fewer in the U.S. but more in other countries. This is how the different versions are distributed. Most of them have 4.0.3, so the Galaxy SII phone that we showed has this version, 4.0.3 or 4.0.4. Then we studied the upgrade process: whenever the Android system is upgraded to a new version, do they solve this problem or not? We found that they fixed at least one customization flaw in 53 models. They fixed all of the flaws in 6. Sometimes they fixed a flaw in one version and then it shows up again in the next version; we found two of those also. And they also
introduce new customization flaws. And these customization flaws are not always accidents; sometimes they know what they are doing, but they need it. For example, if they have a different type of camera and they need to interact with a different driver and they cannot, they just make it public. In conclusion, I would like to say that these vendors heavily customize the phones, and we took a first step to study the impact of these customizations on the security of these devices. These customizations do degrade the security, to a horrible level, and millions of phones are affected by these vulnerabilities. We are now working to write an app that will tell you whether your phone has these issues or not. We actually talked to Google, and when we reported this issue they forwarded us to Samsung, so now we are working with Samsung to fix these issues. I'm not sure if they have fixed it all, but we are working with them, and we told them about these issues with their phones for ethical purposes. In return, they sent us a [indiscernible] 3 phone. They have a very small bug [indiscernible]. That's all. [applause]
>> Seny Kamara: Do you have any questions?
>>: [indiscernible]
>> Naveed Muhammad: I think so. They might have thought that; they were at least, like, impressed, so they offered a job now at Samsung to do research to the first [indiscernible], like my colleague, now he, yeah, so they're really impressed.
>>: You said that there were 26 cases where the upgrades [indiscernible] [indiscernible]
distribution of that was? It would be curious to see [indiscernible] if the new ones just
happened to be [indiscernible] that would be really suspicious.
>> Naveed Muhammad: We do not claim that this is because of the region. It might be, but yeah. This can be a…
>>: [indiscernible]
>> Naveed Muhammad: Yes. If their intention enter into.
>> Seny Kamara: Thank you.