>> Desney Tan: Right. So let's go ahead and get started. It's my pleasure to introduce Wilson To, coming to us from UC Davis. Wilson's got an interesting background. He's finishing up his Ph.D. in integrative pathobiology, which I'll let him explain; it's fairly unusual for around here. But he has an impressive track record working in technology for healthcare, engaging in competitions like the Imagine Cup that you guys are well familiar with, doing the UCIT Ideas Competition and the Harvard Business Competition, and getting grants from various agencies, including the Gates Foundation, to do much of his work. So without further ado, all yours, Wilson.
>> Wilson To: Thank you. So first and foremost, I wanted to thank you all for coming out today to listen to my talk. Before I really begin, I wanted to give you a little bit of insight into my background; as Desney mentioned, I'm a bit of a nontraditional candidate. But my life can really be summarized in three pictures: medicine, business, and technology. Stereotypical images, right? My entire academic background has been in the healthcare sciences: my bachelor's was in biological sciences, my master's was in comparative pathology, and my Ph.D. has been in integrative pathobiology. So essentially I study disease processes across different species, whether it's a dog, horse, human, or piglet, for example. And so it lends itself to a really cool research program and graduate program, mainly because I'll be working on a flamingo one day and a human the next. It's really cool in that I can see how diseases express themselves across different species. On the business side of things, I've worked in a number of different start-ups, some just on my own and some along with colleagues, creating businesses around healthcare systems and healthcare technologies in general: how can we create technologies at the bench top and really translate them to the bedside for patients? And on the technology side, I'm not much of a coder, but I work at a very high level and understand technology and the potential impact of technology in different fields, particularly with regard to healthcare. So that's a little bit about me. A lot of the research that I've focused on has been in healthcare in general. When you look at healthcare and how it's developed over the past decade or even century, we see a lot of changes. Starting from around 1901 with the birth of modern medicine, going to the design of basic medical tools, to the revolution in imaging technologies, and ever since then there really hasn't been too much else going on. The latest real technology revolution, like I said, was in imaging technologies: the introduction of the PET scan, the CAT scan, MRIs, things like that. And it really leaves us with the question of what's next. Here we have the introduction of things like the EKG monitor or the stethoscope, and now we are on the cusp of mobile technologies. How can we utilize mobile technology in healthcare? And so I'll be describing a little bit about the healthcare space in general, primarily how it's focused in the vascular disease area, because that's one of my areas of focus. So vascular disease, as we all know, is one of the major problems in terms of global health.
It affects five out of the eight UN Millennium Development Goals, whether it's poverty or maternal disease or childhood deaths. It touches the lives of billions of individuals worldwide. Secondly, in some countries 40 percent of healthcare resources are used for treating these diseases. That's a substantial socioeconomic burden for these countries, and it's ridiculously high, as you can obviously see. And the thing is, costs are expected to substantially increase year after year due to these diseases, so much so that vascular diseases affect more than an eighth of the world's population. And so there's a huge need to address this problem, and we need to introduce new technologies in order to do that. One of the big ones that I wanted to focus on is diabetes. Pretty much everyone in this room knows someone, or is friends with someone, or friends of friends of someone, who has been affected by diabetes. When we look at the numbers themselves, it's pretty staggering: 25.8 million Americans were diagnosed with diabetes this year or last year. And that's a substantial increase from just a decade ago, when there were only 18 million individuals in the United States. Imagine how these numbers grow globally. And the directly related consequences of the disease really stem from cardiovascular complications, which result in 233,000 deaths per year. But when we look at the disease, it's not just a medical problem. It extends beyond the scope of medicine and really touches on socioeconomic problems. So when we look at the indirect and direct costs of just diabetes in itself, it's billions of dollars: $58 billion in indirect costs, $116 billion in direct costs. And these problems are only going to keep ballooning. And so we really have the need to introduce new technologies in this field to address not only the medical side but also the socioeconomic side of things. And so the problem with diabetes isn't that there aren't enough methods or tests for the disease, or that there are not effective treatments for it. In fact, they're actually pretty good, and in a lot of cases where patients are actually being treated, it works very well. The problem really lies in the fact that the onset of Type II diabetes may occur as much as nine to 12 years before clinical diagnosis. It's something that I've explained to some of the researchers here: the patient has developing disease at day zero, but for up to nine to 12 years it can be asymptomatic during that entire period. And throughout that course there's extensive damage that occurs in the body, not only in the eye but extending systemically to all the internal systems. And so when we look at the end-stage complications of diabetes, one of the ones that really stands out is diabetic retinopathy; patients are often diagnosed with diabetes only after they've visited an ophthalmologist because of vision impairment problems. They go in for blurry vision, and it turns out they've had diabetes for nine to 12 years. That's where the problem lies. How do we have an early means of diagnosing or screening for diseases in a noninvasive manner, such that we can apply it and make it accessible to patients everywhere? And so one of the projects that I've been working on is computer-assisted intravital microscopy. I'll explain more about it.
But what it entails is more or less a noninvasive, in vivo, real-time imaging platform that allows for objective assessment of the microcirculation in the bulbar conjunctiva. We're essentially looking at the blood vessels in the white part of the eye, and using that as a biomarker and a platform to do much more than just disease diagnostics: to really give an effective analysis of what's happening throughout the body. So using the system, we're actually able to do visualization of individual blood cells and blood vessels in the eye. As you can see here, the patient is sitting down at one of the older models of the system that we have, but we just have them rest on the chin rest and start taking video and pictures of the eye. As you can see, this is a very noninvasive test that allows the focus to be very clear, in that we're using a macro lens-based system to take pictures, in an easily accessible environment, of the sensitive microvessels in the bulbar conjunctiva, and in turn we're able to do an objective analysis to really show and quantify what's going on in the body. So just based on the changes that have been studied in the microcirculation so far, here's a list of 15 that we generally look for when we're doing analysis on the eye: things like abnormal vessel diameters, or ischemia within the eye, or microaneurysms. These are peer-reviewed, tested markers by which a lot of scientists and researchers in the field assess the microcirculation when they do this sort of analysis. And so when we think about what's actually going on in the body, a lot of it is dictated by something called wall shear rate, or wall shear stress. If you imagine the microcirculation, or the circulatory system in general, you can think of a plumbing system, for example: you have a series of pipes and water flowing through those pipes, and that can be applied to what we see in the blood circulatory system. Everything is dictated by how much stress the water is exerting onto the pipes, similar to what blood exerts onto the blood vessels. And because of changes in any of these factors, we're able to see changes in the microcirculation as the body adapts. And so when we look at it from a different profile, here's a side view of the parabolic blood flow profile of blood going through the system, and as you can see, it's only at the center of the lumen that we see the highest velocity. Across the sides, due to the parabolic nature of the flow, we actually see blood dragging across the blood vessel walls, which in turn causes additional stress on the system. And so when we look at the results of these changes, we can actually see the microcirculation adapting. So here we have the microcirculation of a normal, healthy individual. As you can see, it's an even, orderly distribution of arterioles, venules, and capillaries. These are microvessels, enlarged. If you look at someone with diabetes, you'll see extensive changes. If you think of diabetes as a whole, it's an increase in blood glucose in the body, which in turn increases the blood viscosity and changes the shear stress and shear rate acting on the blood vessels themselves.
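To make that flow picture concrete, here is the textbook Poiseuille idealization of laminar flow in a small vessel (a standard model, not a formula presented in the talk): the velocity profile is parabolic, peaking at the center of the lumen and vanishing at the wall, and the stress the moving blood exerts on the wall grows with viscosity and flow, and very steeply as the lumen narrows.

```latex
% Parabolic (Poiseuille) velocity profile and the resulting wall shear stress:
v(r) = v_{\max}\left(1 - \frac{r^{2}}{R^{2}}\right),
\qquad
\tau_{w} = \mu \left|\frac{dv}{dr}\right|_{r=R} = \frac{4\,\mu\,Q}{\pi R^{3}}
```

Here r is the distance from the vessel center, R the lumen radius, mu the blood viscosity, and Q the volumetric flow rate. Since the wall shear stress scales with viscosity and with 1/R^3, a modest rise in viscosity (as with elevated blood glucose) or a slight narrowing of the vessel raises it considerably, which is the adaptation pressure described next.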
And so when we look at the adaptations the body has to undergo in order to address those systemic, chronic changes, we see changes like this. So this is a patient who has had diabetes for about nine years, and as you can see, there's extensive tortuosity in the system, microaneurysms, and hemo scission [phonetic] leakage throughout. When we look at sickle cell disease, for example, we'll see similar changes. Vascular diseases, because of their nature, are actually easy to visualize just by looking at the conjunctival bed. And so let me show you a quick video of what we're actually able to see with the system. As you can see, we can actually see individual blood cells flowing through the system using this sort of technology and setup. Let's see if this works. Perfect. And so what we're doing in our case is essentially mapping the eye. A lot of the gold standards in diagnostic medicine involve looking at the retina, the back lining of the eye. What we want to do is image the surface vessels in the white part of the eye. And so by extracting frames from the video sequences that we have, we can recreate what the surface vessels look like and actually do analyses of these. What we want to do in terms of our next steps is figure out how to use computer vision to look at some of the different changes in the microvasculature, in order to have a very effective model for screening for different diseases. And so I'll describe a little bit about one of the studies we worked on, which is really about how we position the biomarkers we've discovered as a means of early detection of vascular disease. So from time zero we have the healthy patient, then the point of disease onset, then the point when symptoms actually emerge and the patient is clinically diagnosed. There's this huge asymptomatic period that I described earlier, and with it developing complications such as the retinopathy, neuropathy, and nephropathy that I described earlier. So what we want to do is shift that point of diagnosis and treatment over to the left, so that we're able to do it during that asymptomatic, or previously asymptomatic, period and really provide a means to detect diseases earlier. And so here are some of the images from the study. A is, again, one of the normal healthy controls that we have. And then B, C, and D, depending on how long the patients have had the disease, really show the extent and severity of the disease, and we can see just based on our numbers that patients with Type II diabetes have a higher severity index: on the list of the 15 markers I showed you earlier, they score significantly higher than the control subjects. And so when we look at the retina, for example, which is the gold standard by which a lot of diagnostic tests are done nowadays, we actually start seeing very similar results; however, they usually appear later on during the whole pathogenesis of the disease. So when we look at retinopathy levels, Type II diabetes patients are generally around twice as likely to show indications of disease. And this is the gold standard; this is the means by which ophthalmologists actually diagnose these diseases. So when we think about the bigger picture: when we look at healthy patients, we see a normal conjunctiva and a normal retinal fundus.
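Going back to the "mapping the eye" step mentioned a moment ago (extracting frames from the video and recreating the surface vessels), here is a minimal sketch of that idea using OpenCV's generic panorama stitcher. The file name, the sampling rate, and the use of the built-in stitcher are my assumptions for illustration, not details from the talk.

```python
import cv2

# Sample every Nth frame from a conjunctival video (hypothetical file name).
cap = cv2.VideoCapture("conjunctiva_clip.mp4")
frames, step, i = [], 15, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if i % step == 0:
        frames.append(frame)
    i += 1
cap.release()

# Stitch the sampled frames into one mosaic of the surface vessels.
# SCANS mode assumes mostly translational motion, which roughly matches
# sweeping a camera across the white of the eye.
stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
status, mosaic = stitcher.stitch(frames)
if status == cv2.Stitcher_OK:
    cv2.imwrite("conjunctiva_mosaic.png", mosaic)
else:
    print("Stitching failed with status", status)
```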
When you look at the asymptomatic period, the retinal fundus still looks pretty normal, but we start seeing changes in the conjunctival microcirculation. When symptoms emerge and we're actually able to see changes in the retina, those changes in the conjunctival microcirculation get worse, and when patients have full-blown proliferative diabetic retinopathy, that's when you see extensive damage in both the retinal and conjunctival beds. So when we compare the numbers, the SI, the severity index, versus the retinopathy levels: in healthy patients we see a pretty standard 0 on the retinopathy level and 1.33 on the severity index, and those numbers only get higher. It's only when symptoms emerge that we start seeing changes in the retinopathy levels, but by then we already see extensive changes in the conjunctiva. As you can see, there's an upward trend over the pathogenesis of the disease, from disease onset, to when patients actually develop clinical symptoms, to when we're actually able to see these diseases. And this sort of model isn't just applicable to diabetes; we can apply it to a lot of different diseases as well. Whether it's Alzheimer's disease or sickle cell disease or hypertension, we see similar changes. So when we look at the retinal fundus of a patient with hypertension, we see extensive tortuosity in the system, and we see a number of those changes reflected in the conjunctival bed as well. However, when we look a little closer, the conjunctival bed is actually a more accurate and more sensitive bed for predicting whether a patient has hypertension in general. And when we apply this to humans or animals, we see very similar changes. So it doesn't matter what sort of species we're looking at, and it also doesn't matter what age: we've applied this concept to children as young as a year and eight months, I believe, and to patients as old as 82. So this is a very flexible platform by which we're able to apply new technologies in healthcare, to introduce a noninvasive, real-time manner of diagnosing and screening for different diseases. A bit more on the contact lens side: this is an area that I've recently gotten into, where we're actually able to measure how contact lenses affect the microcirculation and the microcirculatory bed. So essentially we're measuring and answering the question of whether extraocular pressure from contact lenses affects the microcirculation, especially in cases where we're using hard contact lenses or corneal refractive therapy, where we're reshaping the corneal surface of the eye. And as predicted, a lot of this is reflected in the microcirculation as well. So when we look at A and B, for example, we see signs of tortuosity in healthy individuals, but those tortuous vessels can only be seen near the contact lens edge, not anywhere else among the surface vessels in the perilimbal region of the eye. And we can actually see, over here, a microaneurysm starting to form as the contact lens exerts force onto the eye and creates different occlusions.
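Before continuing with the contact lens findings, a brief aside on the severity index mentioned above: purely as an illustration of what such an index could look like mechanically, the sketch below assumes each of the 15 conjunctival abnormalities is graded 0 to 3 by an observer and the index is simply their average. The actual grading scheme and weighting used in the study are not specified in the talk.

```python
# Hypothetical grading: each conjunctival abnormality scored 0 (absent) to 3 (severe).
MARKERS = [
    "abnormal_vessel_diameter", "ischemia", "microaneurysm", "tortuosity",
    "vessel_distension", "hemorrhage",  # ...remaining markers from the list of 15
]

def severity_index(grades: dict) -> float:
    """Average the per-marker grades; missing markers count as 0 (illustrative only)."""
    return sum(grades.get(m, 0) for m in MARKERS) / len(MARKERS)

example = {"tortuosity": 2, "microaneurysm": 1}
print(round(severity_index(example), 2))
```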
And so we're actually able to validate whether contact lenses in general can become a little better in terms of fit, design, and materials, and whether they're safe and effective for patients, especially with the increased adoption of contact lenses worldwide. With CRT lenses we see something very similar: we're actually seeing occlusions happening throughout the microcirculatory system within the confines of the contact lens edge. So applying this model when we're looking at how contact lenses are designed, and whether we can predict whether a design will increase the risk of conjunctivitis or keratitis or things like that, we're actually able to see whether damage is happening in the microcirculatory system. And so that's essentially what computer-assisted intravital microscopy is and how we've applied it in our lab. A lot of the lessons we've learned have had a really profound impact on what we're able to do in terms of medicine. This is a noninvasive, real-time, in vivo tool with which we're able to screen patients, or potentially screen them, for different vascular diseases. So if you think about the microcirculatory beds, if we look at a patient today versus a year from now, we can actually use the patient's own microcirculation as a control.
>>: Just wanted to clarify: is it still images that are taken, not video?
>> Wilson To: Both. Yes. Sorry about that. So we're actually able to compare patients as their own controls and see how their blood vessels change over time. So if we have a patient with diabetes and we give them a treatment, we can revisit different sites and see whether those blood vessels are actually getting smaller, thinner, less tortuous, things like that. So it really provides a platform where we're able to effectively evaluate whether drugs are working.
>>: So the phenomenon you're observing is that blood pressure is high for a long period of time, and so that makes the vessels larger?
>> Wilson To: One factor is blood pressure. Another is viscosity, for example, and blood flow. We're also looking at biochemical markers. If you look at how blood vessels respond to different vasodilators in patients with vascular disease, they're essentially not responding. And so what happens is the blood vessel isn't able to dilate at all, so you have an increase in blood pressure. So it becomes this vicious cycle where it feeds on itself. That's why patients with vascular disease often have a lot of chronic problems that eventually lead to cardiovascular disease or heart attack, just for those reasons.
>>: Does the same effect happen elsewhere in the body and you're using the eye because it's particularly easy, or does it only happen in the eye because of its shape?
>> Wilson To: This can be a representation of everything in the body. If you think of the microcirculatory system in general, it goes from your arteries to your arterioles to your capillaries to venules and veins. This is a micro scale of what's happening, and it can be thought of as what's happening systemically in the body. The reason we're using the conjunctival bed in the bulbar conjunctiva is because it's an easily accessible area.
The other two areas you can probably look at are underneath the fingernail and underneath the tongue, because they also expose a lot of blood vessels.
>>: What's the discriminatory power that you have, looking at these blood vessels, to tell the difference between different conditions?
>> Wilson To: You know, a lot of the abnormalities between the different diseases actually overlap. But it varies: different diseases have very specific characteristics that can be seen. Sickle cell disease, for example, has signs you only see in that sort of disease. And we're using it more as a method for screening patients, to decide whether they should undergo the actual test, rather than having everyone do a blood draw and a blood panel and go through pretty much the entire works of a blood glucose test, for example. Cool. Awesome. So that's a little bit about computer-assisted intravital microscopy as a new way of looking at how technology can be applied in healthcare in general. If you imagine what we're able to do with this, it's pretty substantial. Imagine going to your doctor or your optometrist, for example, and being screened for different vascular diseases. More and more optometrists are being positioned as gateway physicians, where they're able to look at what sort of changes are happening in the retina and refer patients to go see their primary care physician. And the conjunctival bed offers another, earlier way of having patients get tested and screened in that sense. And so, as you can see, there's been extensive hardware evolution since I first started the project in my graduate program. Essentially it was more or less prototyping different contraptions and seeing what would work. We eventually moved on to platforms that are widely available on the mass commercial market, and to adapting the slit lamps seen in ophthalmology and optometry clinics, to see whether that would be an applicable route for implementing this sort of technology. And so another project that we've been working on is whether we can do this on a cell phone. This was the basis of one of the Imagine Cup projects we worked on in the past, where we were able to use a cell phone to take pictures and video of the eye, to see whether we're actually able to do onboard analysis of different disease trends and that sort of thing.
>>: Are you going to say what you had to do with the hardware to make this?
>> Wilson To: Definitely. The whole full effect. Not a problem. So this is a Windows Mobile 6.5 device. A little old; back in the day before Windows Phone was introduced, really. And we were using the phone with a lens attachment, an intravital lens essentially, where we're able to position it over the eye and use the flash on the device, with a green filter, to enhance the contrast of the blood vessels and image them in that manner. It does work.
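On the green-filter point: a common software analogue, offered here as an illustration rather than as the team's actual pipeline, is to work from the green channel of the image, where hemoglobin absorbs strongly and vessels appear darkest, and then boost local contrast, for example with CLAHE in OpenCV. The file names are hypothetical.

```python
import cv2

# Load a frame of the eye surface (hypothetical file name).
frame = cv2.imread("eye_frame.png")

# Blood absorbs green light strongly, so vessels appear darkest in the green channel.
green = frame[:, :, 1]

# Contrast-limited adaptive histogram equalization to bring out the vessel network.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(green)

# Invert so vessels are bright against a dark background, if that's easier to inspect.
vessels = cv2.bitwise_not(enhanced)
cv2.imwrite("vessels_enhanced.png", vessels)
```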
What we're actually trying to do is continue our development of this hardware evolution and introduce this in a slate, or tablet, kind of hardware form factor, where we combine this sort of technology and interface with that of slit lamps and really provide an easy-to-use, simple tool for optometrists and ophthalmologists to play around with and see whether this has clinical applications in the real world. And so that pretty much sums up what I've been doing in terms of my primary research during my graduate program. I'm going to go into a little bit about another project that I worked on as part of the Imagine Cup. I don't know how many of you are familiar with the Imagine Cup. A little bit? Awesome. For those who aren't familiar with it, the Imagine Cup is a worldwide technology competition sponsored by Microsoft wherein students are challenged to create new technologies to solve the world's toughest problems. Essentially, this is how I sum it up: the Imagine Cup is where extraordinary happens and change is possible. Chris is a good example of a competitor; Tristin is another one. And they've come up with really robust solutions to address poverty or healthcare challenges in the world. And so one of the projects that I worked on with Tristin here was a project called Life Lens. Essentially what we wanted to do was use the Windows Phone as a means to do light microscopy. When pathologists and laboratory technicians look at blood samples, for example, they use those really high-powered microscopes. How do we take that capability and carry it over to a smartphone kind of format? A lot of research has been done in this area already. Cell phone-based platforms for biomedical device development have been explored extensively by a number of different researchers across the world, where essentially a cell phone is used with a ball lens to look at different samples and really image them in that sort of manner. So these are the images that we're actually able to see using this sort of technique. This is an untouched photo of what's going on. If we increase the magnification and do different analyses on it, we're able to create a very powerful platform with which we can do a lot of healthcare analysis of bloodborne diseases. And so just by taking a blood smear, a little finger prick really, from patients, we're able to do a lot of visualization and analysis of the blood cells in the system (you can't really see it on that screen) and actually look for diseases like malaria. So this was the basis and foundation for a lot of the work that Tristin and I did about a year and a half ago, to really showcase the power of technology when it's applied to healthcare. Imagine equipping healthcare workers in third-world countries, with no access to hospitals or clinics nearby, with point-of-care devices where they're actually able to do diagnostics on the spot. And one of the cool features about doing this on a mobile device is that there are so many gizmos and gadgets attached to it that we can actually put the information together to perform more or less epidemiological studies.
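As a minimal sketch of the epidemiological idea that follows, assuming a simple hypothetical record format: each point-of-care result carries the phone's GPS fix, and positives are binned into coarse grid cells so clusters stand out. The coordinates, grid size, and alert threshold below are arbitrary illustrations, not details from the project.

```python
import math
from collections import Counter

# Hypothetical geotagged point-of-care results: (latitude, longitude, positive?)
results = [
    (-1.95, 30.06, True), (-1.96, 30.05, True),
    (-1.94, 30.07, False), (-2.60, 29.74, True),
]

def cell(lat, lon, size=0.1):
    """Snap a coordinate to a coarse grid cell (roughly 11 km per 0.1 degree)."""
    return (math.floor(lat / size), math.floor(lon / size))

positives = Counter(cell(lat, lon) for lat, lon, pos in results if pos)

# Flag any cell with at least an arbitrary number of positives.
ALERT_THRESHOLD = 2
for c, n in positives.items():
    if n >= ALERT_THRESHOLD:
        print(f"Possible cluster in grid cell {c}: {n} positive tests")
```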
So using our application and the Windows Phone, we're actually able to see where outbreaks are happening, kind of in real time, and provide this information to teams that do preventive medicine and preventive care, with things as simple as mosquito nets, for example. How do we deploy mosquito nets in areas that are bound to have an outbreak within the next week or so? By combining this information with things like weather patterns, or predictive models of where and in what direction disease outbreaks are spreading, we can actually provide a new method of preventive care for a lot of healthcare teams around the world. And so our project won third place worldwide this past year, or I guess two years ago. And we actually recently got a big chunk of the $3 million Microsoft Imagine Cup Grants to continue developing our project and implementing it in various test studies around the world. So we're using this as a means of providing a new sort of inspiration and product in the market, where we're actually helping save lives using common devices that we have in our pockets. One of the new projects we're working on is something called pathologic code, which is still in its infancy right now. But essentially what we're doing is protein folding on devices. How many of you are familiar with Foldit? Perfect. Awesome. So Foldit, for those who don't know, is essentially protein folding as a game, more or less. A lot of it is done on the computer right now, with players pitching their ideas of how best to fold a protein in order to make a specific kind of structure. And so we wanted to recreate that and connect it to mobile devices to explore that field. Like I said, this is still in its infancy right now. But software in itself isn't just a medium by which we can address healthcare with technology; gaming can be part of it as well. So that's one area of exploration that we're trying to develop right now. And so, really, to summarize what's going on in healthcare and technology in this space: mobile technology is untethering healthcare and enabling the practice of healthcare anywhere in the world. That's pretty much the foundation and belief my research is built upon. If you think about what's happening worldwide, people have more access to cell phones than they have to fresh, clean water. It's mind-blowing sometimes. So really, how do we introduce mobile technologies into the space of healthcare in order to provide new methods of diagnosing and preventing diseases, across the entire spectrum? One of the cool things about mobile phones is that they're pretty crazy nowadays. This is something that's really true; during one of my talks, someone actually tweeted this to me: your mobile phone has more computing power than all of NASA in 1969. NASA launched a man to the moon. Here we are launching birds into pig structures. And when you think about that, it kind of says something about the potential of technology in the mobile space for healthcare. It's something that we really wanted to be on the cutting edge of, in order to help facilitate a lot of the growth in that area and make sure it's pointed in the right direction. And so when we think about how we're planning our projects, we should really let our relationships guide our strategy.
We have a lot of people invested in healthcare and in the technology involved in healthcare. So it's not just research for research's sake; we're doing the research because it aligns with our passion for global health. We're equipping communities with new tools and techniques to help them manage their whole healthcare scenario; working with team members and shareholders and providing a means where it's not just a financial return that they're building towards, but a deep, human sense of return in how we're actually able to contribute to communities; and really working with our customers and partners to think about what the best approach is and how it's being used in the field. And so, to summarize, what we've been doing is really innovative screening techniques. I personally feel that in the area of pathology and medicine, it's the screening that's the problem. It's not the treatment. It's not the diagnostic process. It's really: how do we keep pushing the edge of when we can actually screen for these diseases, to provide that early intervention and early diagnosis? So: providing innovative screening tools to bridge connections in science and technology, to provide a real impact for a better tomorrow. And I'd like to thank, namely, one person in this room here, Tristin Gebeau [phonetic], who has been working on the project with me for the past couple of years; Dr. Anthony Chung, Peter Chen, Patricia Took, David Telander, UC Davis, and Jason [inaudible], Dove White [phonetic], and a few of my interns over at UC Davis as well. And more than anything, the Bill and Melinda Gates Foundation for providing the funding for my research and academic studies throughout my bachelor's, master's, and Ph.D. programs. Thank you for your time, and I'm open to questions.
[applause]
>>: How much expertise or precise positioning does it take to get the right picture? I mean, you could imagine we're all wearing glasses that are imaging our eyes all the time or something like that. But the setup you showed had their heads down and all still.
>> Wilson To: Definitely. With a lot of what we've been doing on the slit lamp side and on the device, the reason we're using a chin rest is because it provides stability and a lack of motion for the patients. And so it definitely helps to have a very calm patient. But at the same time, we wanted to integrate video stabilization algorithms into what we're looking at, in order to make sure that we're seeing the images as they are rather than blurs everywhere. But a lot of what we're doing is more or less: if we follow the protocol that we've developed, we're actually able to get enough information in two to three minutes. So it's pretty robust in that area.
>>: Hold still.
>> Wilson To: Basically, try to hold still. We've done it in patients as young as a year and eight months, and those guys don't sit so well. So we're actually able to extract a lot of information in just a short amount of time. Essentially, when we're doing analysis we're only looking at a few frames of the entire sequence; all we need is that little bit. Yes?
>>: If you've got tons of frames, two or three minutes' worth of frames, how do you zero in on the ones that are important?
>> Wilson To: So we actually have a program that sorts out what's a good frame and what's a bad frame, and we actually do analysis on all of them.
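As one plausible way a good-frame/bad-frame sorter could work (a generic sharpness heuristic, not necessarily the program the lab actually uses), frames can be ranked by the variance of the Laplacian, which drops sharply when an image is motion-blurred or out of focus. The file name is hypothetical.

```python
import cv2

def sharpness(frame) -> float:
    """Variance of the Laplacian: higher means sharper, lower means blurred."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

# Score every frame of a capture session and keep the sharpest few for analysis.
cap = cv2.VideoCapture("conjunctiva_clip.mp4")  # hypothetical file name
scored, idx = [], 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    scored.append((sharpness(frame), idx, frame))
    idx += 1
cap.release()

best = sorted(scored, key=lambda t: t[0], reverse=True)[:10]
print("Best frame indices:", [i for _, i, _ in best])
```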
We actually have different investigators do analyses on them to make sure we have a very accurate picture of what's going on in the body.
>>: Is that analysis offloaded to humans, or is it an automatic process?
>> Wilson To: Right now it's done by humans. But the plan is to automate the entire function using computer vision, which Tristin is helping develop, to really teach the computer what to look for and how to identify those different features and markers, and to push this technology to the point where you don't need training to do any of the analysis at all. But right now it's completely done by individual investigators in three separate locations, to verify that everyone's getting the same numbers.
>>: Can you say a little bit more about the cell phone setup, the lens, and how you actually capture the video on the cell phone?
>> Wilson To: Definitely. I have a paper I can send you if you want more detailed information about it. But essentially we're using an intravital lens that's attached to the back of a cell phone. Cell phones nowadays are pretty much a very self-sufficient, self-contained environment where we're able to do all sorts of things. And so by attaching a lens over the image sensor of the cell phone, we're able to see a lot of interesting things. I think the lens that we were using before was actually pretty big, like a 3- to 4-inch lens, that we were using to do a lot of intravital microscopy work with microscopes. We're adapting and miniaturizing it for a more point-of-care kind of approach. I can definitely send you some additional information on it. Yes?
>>: Have you investigated some of the more self-diagnosing technologies, like the blue field entoptic phenomenon?
>> Wilson To: I have not. But it's something that I can definitely look into. I'm a little curious about it, actually.
>>: For self-diagnosing scotomas in the retina.
>> Wilson To: I definitely haven't looked into it much. A lot of the research and work I've been doing has been more at the conjunctival level rather than the retinal level, mainly because a lot of vascular diseases are able to show themselves and manifest a little earlier in the microcirculation of the conjunctiva than in the retina. A lot of the work I've been doing has been in that field. Yes?
>>: Can I have you speculate on a technology wish list?
>> Wilson To: A technology wish list.
>>: There are some things I think you see in here, things like requiring stability for two or three minutes, that could possibly be solved by better technology.
>> Wilson To: Definitely.
>>: Better imaging or better extraction techniques. Are there other things you've seen in any of these projects where, if we gave you the magic ball and said you have three wishes on any technology you wanted, what would be at the top of your list? Or different sensors? Is the camera an acceptable sensor in general to use?
>> Wilson To: Probably the thing that would help this project the most is having a bigger image sensor. The more information we can get from these microcirculatory beds, the more data we can use to see what's going on. Even though we're not able to see too much into it, we're actually able to see a lot. So, did you have a question really quick?
>>: I'm sorry.
First of all, sorry that I came in late. This stuff is really up my alley, but I had a meeting that got cancelled. So were you using an RGB sensor or a [inaudible] sensor?
>> Wilson To: Those are RGB sensors, but we had a few filters set up on the cell phone itself. We're using kind of an anti-red filter, both on the flash system and on the actual video capture. We're doing everything in monochrome to enhance a lot of the images that we have.
>>: How would a larger sensor have helped? I mean, because of the structures that appear in the conjunctiva?
>> Wilson To: A lot of it is, if you look at it from this point of view: if we have a bigger sensor, we're able to look at the conjunctiva at a much closer level as well. So depending on what sort of lens elements we're using, combined with a larger image sensor, we can capture a lot more information, a lot more detail, in that sort of aspect. Going back to your question, the second wish would probably be access to resources to build different casings and models to really simplify things; rather than having a chin rest, for example, how do we build that into more of a platform, even if it's a forehead rest or something? Because I feel that a lot of what's happening in terms of developing the technology and the solution is really limited by what we have in the lab. And so it's more or less: how do we duct-tape stuff together and get things working? And as cool as that may sound, it provides a lot of challenges. So resources in that sort of area would definitely help. And then lastly, it's the sensors within mobile devices themselves. From the point when we were using this device to using the Windows Phone now, there have been substantial increases in our ability to do things a lot quicker and more efficiently. But then there's combining a lot of our projects together: we're looking at the microcirculation here, we're looking at blood here, and there are apps to look at what sort of diet people have, or what sort of lifestyle they have in terms of how much they're running or whatever it might be. How do we combine that information? How do we make it open enough that we can get a good picture of what an individual is like, in order to provide very personalized solutions and recommendations for that individual? So those would be the three.
>> Wilson To: Other questions? Yes?
>>: Could you potentially diagnose from a single frame?
>> Wilson To: Definitely not. Mainly because one frame, for example, is a very tiny piece of what's happening in the entire conjunctiva. So we want to make sure this isn't just an instance where you rubbed your eye in that area and there are microaneurysms or hemo [indiscernible] there because you were rubbing your eye or because you've had microtrauma in that location. When we look at the microcirculation as a whole, we try to look at the entire white part of the eye to get a good view of what's going on systemically. So... generally, no.
>>: Does the frame have the entire eye? Like, you need the whole picture but you don't need it over time, I think, is what --
>> Wilson To: No. Essentially, I don't know if you remember the slide that had red boxes of the different frames stitched together; that's essentially what we want to do.
How do we map out the entire eye so we can take a look at the image itself and compare it over time? If we look at that image over the course of three months or six months or a year, we can actually see how blood vessels are responding to different treatments and look at it from that approach.
>>: So there's no significant temporal profile that you're interested in, that movies or high-speed photography or [inaudible] or faster speeds could help you with?
>> Wilson To: It's definitely something people have worked on in the past. I know in some of the research before the different models that I showed you here, they actually used a lot of SLR-based systems to try to do a lot of high-speed photography. But it wasn't very powerful or robust in that sense, because what they did was take pictures, develop them, and measure from point to point to point, and it wasn't a very efficient way to do it. But it definitely has the potential to move into that field, if we're able to create new techniques and approaches using those sorts of systems. It's definitely doable.
>>: You'd be able to get a pulse profile, at least pulse velocity.
>> Wilson To: Definitely.
>>: And filtering, too.
>> Wilson To: Yes. That's why I was very interested in what you were talking about earlier. It was like, wow, that's perfect for what I do.
>>: In a single frame, what's the size of the smallest feature you can resolve with the camera?
>> Wilson To: So based on our conversion ratio, I think it's 12 microns we can look at. And those are some of the features involved in the microcirculation itself; we're essentially looking at blood cells. We can actually see the blood cells going through the system. We could look at it from a tinier perspective, but it's essentially around 12 microns.
>>: Ophthalmologists take tons of pictures and have tons of gadgets. How do yours differ from the types of pictures that they're normally taking?
>> Wilson To: Ophthalmologists look at the retina rather than the conjunctiva. So they're looking at the blood vessels in the back lining of the inside of the eye; that's where light essentially hits and you're able to process images. What we're looking at is the surface vessels. So it's a little easier to access and take pictures of, and it doesn't require any sort of injections of fluorescein or anything of that sort to see how fast blood is flowing through the system. When you look at the retina versus the conjunctiva, a lot of the signs that appear in the retina are generally advanced stages of disease already. So the conjunctiva serves as a bed that's more sensitive to the changes happening in the body; you're more likely to detect something in the conjunctiva than in the retina. That's why I talked a bit about the study comparing what's in the fundus versus what's in the conjunctiva, to show that timetable of pathogenesis when someone is developing a disease.
>>: So is this a new result? [inaudible]
>> Wilson To: I'm only three years into the program, but yeah, it's a relatively new technique. The whole idea of looking at the microcirculation has been around for about 15 years, and even longer when we look at some of the other aspects of the bioengineering and biochemical properties of the microcirculation. But this is a relatively new field.
>>: So what is the adoption path for something like this?
>> Wilson To: So with this specific solution, we're looking to partner with a lot of healthcare providers and insurance companies. Our group has worked with and talked with some folks on the executive team over at Walgreens; just imagine having access to this on every corner, wherever there's a drugstore nearby. And we're working with doctors and optometrists to see whether we can get this as part of your comprehensive examination. People go see their optometrist every year. People go see their doctors every year. What if we're able to provide as many places as possible for people to access this sort of technology, in order to get a good timetable and establish how well they're doing in terms of their internal vasculature?
>>: I guess I'd ask: what do you see as the main obstacle to that? Besides the politics and money of getting into the [inaudible] store, are there technology obstacles still there? I would guess the fact that you have to have humans interpret the images is a huge obstacle to making that dream a reality at this point.
>> Wilson To: No, I completely agree. And that's something that we're actually looking into: how do we introduce new computer vision algorithms to automate the entire process and specifically look for things? That's an area of research we're moving into a little more. So I definitely agree with you on that. Removing that whole human, subjective analysis will greatly improve the objective nature of what we're doing.
>>: I would say it's not unreasonable to have, like, a trained ophthalmologist skip through the video and maybe find some frames that are indicative of --
>>: A device for the doctor [inaudible].
>> Wilson To: Excuse me?
>>: The device directly sends the video to the doctor, a trained ophthalmologist, so you don't need [inaudible].
>> Wilson To: That's definitely a plausible scenario.
>>: Seems like the impact is much larger if you can have an [inaudible] doing it, an untrained person doing it versus a trained person.
>> Wilson To: Yes, it definitely opens up access and the ability to do instantaneous analysis. It's definitely the path that we want to move towards, but it's probably a couple of years from now.
>>: How hard in particular would it be to make it completely machine-based and not depend on [inaudible] these images at this resolution and this frame rate? How effective is it to --
>> Wilson To: It's definitely doable. It's something that we've already started to look into. And out of those 15 markers, we can probably look at three or four of them with pretty good confidence that the numbers are accurate. But we definitely want to take it a step at a time; we want to maintain accuracy while expanding to new areas on the list of 15. It's definitely doable, I feel, because we're looking at pictures made of pixels. How do we teach computers what sort of pixel patterns relate to what sort of feature? As long as all the features are very consistent in terms of what they look like, and that is the case for the microcirculation, because my body, your body, anyone's body adapts in the exact same way (different patterns, but the exact same way), we can definitely look at it from that sort of view. And I'm confident that we will continue moving forward in developing a new way of looking at and automating the analysis of those images. Great. Thank you.
[applause]
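As a concrete example of automating one of the fifteen conjunctival markers discussed in this Q&A, here is a minimal sketch of a classic tortuosity measure: the arc length of a traced vessel centerline divided by the straight-line distance between its endpoints. The centerline points would come from an upstream segmentation step that isn't shown, and the toy coordinates below are made up for illustration.

```python
import numpy as np

def tortuosity_index(centerline: np.ndarray) -> float:
    """Arc length / chord length for an (N, 2) array of centerline pixel coordinates.
    A perfectly straight vessel gives 1.0; higher values mean more tortuous."""
    arc = np.sum(np.linalg.norm(np.diff(centerline, axis=0), axis=1))
    chord = np.linalg.norm(centerline[-1] - centerline[0])
    return float(arc / chord)

# Toy example: a gently S-shaped vessel trace (coordinates are made up).
t = np.linspace(0, np.pi, 50)
vessel = np.column_stack([t * 40, 5 * np.sin(2 * t)])
print(f"Tortuosity index: {tortuosity_index(vessel):.3f}")  # near 1.0 when straight, larger when wavy
```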