Document 17860173

>> Asta Roseway: So I'm very excited to welcome Ingmar Riedel-Kruse,
Assistant Professor at Stanford University, where he runs the
Riedel-Kruse lab. He will talk about interactive biotechnology, about
different platforms for -- sorry -- biotic games, and also about how to
create low-cost do-it-yourself kits with smartphones. Thank you very
much.
>> Ingmar Riedel-Kruse: Thank you for having me. So as the picture
here indicates, one of the major themes my lab is interested in is
basically how we can make the microscopic world that we usually see
through a microscope actually interactive. We can look through a
microscope, but we can't touch and we can't actuate, and this is the
kind of thing we want to enable in very different ways and for many
different reasons, as I'll show you in my talk.
So let's start with these two pictures, which I guess are kind of
familiar to you, but just asking the audience, what's the difference
between those? Say some differences.
>>: Tactile.
>> Ingmar Riedel-Kruse: One is tactile. Okay, what else?
>>: One is an interface -- so, a more successful interface.
>>: One probably has labels. One doesn't.
>>: The users are different.
>> Ingmar Riedel-Kruse: They're different users, exactly. So
essentially there's like 50 years difference between these two. In
both cases we see computers and we see humans interacting with those.
As you can appreciate, these early computers were primarily about
number crunching; they weren't very powerful by today's standards, and
they were hard to operate. Nowadays, these devices basically provide a
human interactive experience that is kind of child's play, and that is
very powerful. So 50 years of technological development made lots of
things possible. If you now look at biotechnology, you can actually
see similarities between what's going on there right now and where
electronics was 50 years ago, and kind of try to project where we
should go in the biotech space. And just to -- okay. Now the movies
don't run. I don't know why.
What you see here is a microfluidic chip -- can we get the movies? I
tested it before; it didn't run, then it ran, now it doesn't run
again.
Okay, let me try to walk you through. So what you see here is
basically a microfluidic chip. It's like a network of little channels,
and it has valves in there so you can move fluids and little packages
around. It's used for diagnostics, for example: if you have a droplet
of blood and you want to do diagnostics to understand which kind of
disease markers or other things you have, you can imagine how these
particles are moved around on this chip and kind of processed.
And there are many parallels to these early computers. These chips can
be run in the lab by experts, but they're not readily available for
day-to-day use. This technology keeps getting better -- there's
something similar to Moore's Law for it. So can we basically get to a
more human interactive experience with these as well?
So the first thing I want to tell you about, what we started with, is
the idea of making games on these types of platforms. Even with the
first computers, one of the first things people did was make games, to
kind of show how you can interact with those machines.
So what are biotic games? We're interested in machines that have two
basic features. One is that they should really allow a human to
interact with biological processes at a small scale. The other is
that they should really have something to do with modern
biotechnology. What we don't mean is a pure simulation, or something
more macroscopic like cross-pollination, which we could call ancient
biotechnology.
What you see here is a single-celled organism called Paramecium --
relatively large, 250 microns -- in a very simple 2-D microfluidic
chip. These little speckles are the cells, and you see four
electrodes. If you apply an electric field, you can make those cells
swim along the field lines, a behavior called galvanotaxis. And here
you see kind of a joystick or game pad with which you actuate those
electric fields, and you can basically see in real time on a computer
screen how these cells move.
Now, if you put some virtual objects on top of it and integrate some
tracking mechanism, you can play games with that. The one you just
saw was about getting points by swimming through these dots. In this
case you try to get a virtual soccer ball into a goal. And you could
kind of envision saying this is a biology experiment, or you could say
this is a game. With this game engine, so to speak, you can create
many different games by implementing different skins on top of it and
making different rules.
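The game-engine loop just described -- a tracker reporting cell positions, a joystick steering the electric field, and virtual objects overlaid in software -- can be sketched roughly like this. All names and the electrode mapping are hypothetical illustrations, not the lab's actual code:

```python
import math

# Illustrative sketch of one frame of a biotic-game loop: a joystick
# deflection picks which electrode is energized (cells swim along the
# field lines), and a virtual goal region awards points for tracked
# cells inside it.

def field_from_joystick(jx, jy):
    """Map a joystick deflection (-1..1 per axis) to which of the four
    electrodes is energized."""
    if abs(jx) >= abs(jy):
        return "east" if jx > 0 else "west"
    return "north" if jy > 0 else "south"

def score_hits(cell_positions, goal_center, goal_radius):
    """Count tracked cells currently inside a virtual goal region."""
    gx, gy = goal_center
    return sum(
        1 for (x, y) in cell_positions
        if math.hypot(x - gx, y - gy) <= goal_radius
    )

# One frame: three tracked cells, joystick pushed mostly to the right.
cells = [(10.0, 12.0), (48.0, 51.0), (50.0, 49.0)]
direction = field_from_joystick(0.8, 0.1)
points = score_hits(cells, goal_center=(50.0, 50.0), goal_radius=5.0)
```

Different games amount to swapping the virtual-object layer (the "skin") and the scoring rule while the tracking and field-control loop stays the same.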
>>:
What's your goal with this game?
>> Ingmar Riedel-Kruse: So the first -- you mean what is the end game
goal, or --
>>: What is your goal, the research goal?
>> Ingmar Riedel-Kruse: The initial research goal was: could we make
games on this type of platform, and what would these games even look
like? Before video games came around, no one knew what a video game
was, and then we've seen this tremendous evolution based on the
technology that led there.
And so it's --
>>: Is this something about -- you can see how they move. Are you
trying to teach people how these particular --
>> Ingmar Riedel-Kruse: Yes, so there's lots of -- so the initial
thing is: let's make games, let's make an interactive experience and
show that it can be done. The secondary question is what this is good
for, and I'll come to things like informal biology education, which
may be one application. So here's the setup again. It's very simple:
an open chip, a webcam, and basically you project this onto a screen.
So this was the very early first thing that we built, and it kind of
laid out the paradigm of what a game could be like on that sort of
platform. Since then we've basically gone quite a bit further down
the road, and so the next system I want to tell you about uses a
different organism called Euglena. It's about 50 micrometers in size,
a little bit smaller. It's green and photosynthetic, has a nice
little red eyespot that it uses to sense its 3-D environment, and a
flagellum that propels it through the fluid.
For this particular project we wanted to set up a system where we can
play through a smartphone and where we also have a much more robust
and reliable game experience than what you saw in the previous one.
The education motivation behind this is that if you look at other
areas, especially mechatronics and robotics, and also computers with
video games -- the National Video Game Challenges -- I'm always kind
of envious, because there you have these beautiful toys that engage
kids and students to learn about these technologies, build something,
have fun, and intrinsically learn. And we really don't have that to
the same extent yet in biology.
And so here again are these cells I just spoke about, these Euglena
cells, swimming through the fluid. A little bit about their
biophysics, so to speak: they have this eyespot and a long flagellum,
and essentially they roll in a 3-D helical motion through the fluid,
thereby sampling 3-D space with the eyespot. There's a feedback loop
between the flagellum and this eyespot, and this allows them, for
example, to go toward the light if the light is very dim, because
they need light for photosynthesis, and to move away from the light
if it's strong, because they don't want to get sunburned. Here we
used the negative phototaxis. And in this phone setup, the kind of
games we implemented use a very similar setup as before; instead of
electrodes we have LEDs now, and light is much easier to control in
many respects.
One difference from the game I showed you before is that it's in
color, so we made some advancement there. What you may not
appreciate is that these cells live for a couple of days in this
chamber, so you almost get a plug-and-play experience: you turn the
thing on, you can play, and you turn it off again.
We also already integrated a number of things that are educational.
You see a scale bar here, and down here you see the velocity of
whichever cell is swimming through this area. Right now this cell is
tracked, and you see its real-time velocity; whichever cell is
tracked, you also see enlarged. The idea is that while you play with
these things, there are many subtle ways to convey to the learner or
player something that is relevant about biology, which you would not
get if you played a purely electronic version of it.
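The on-screen velocity readout comes down to a pixel-to-micron conversion at the camera frame rate. A minimal sketch, assuming an illustrative 0.5 µm/pixel calibration and 30 fps (both numbers are assumptions, not the system's actual values):

```python
# Illustrative sketch of the real-time speed readout: convert the
# tracked cell's per-frame pixel displacement to micrometers using a
# calibrated scale, then to micrometers per second using the frame rate.

MICRONS_PER_PIXEL = 0.5   # assumed calibration
FPS = 30                  # assumed camera frame rate

def instantaneous_speed(p_prev, p_curr):
    """Speed in micrometers/second between two consecutive frames,
    given pixel positions (x, y)."""
    dx = (p_curr[0] - p_prev[0]) * MICRONS_PER_PIXEL
    dy = (p_curr[1] - p_prev[1]) * MICRONS_PER_PIXEL
    dist = (dx * dx + dy * dy) ** 0.5   # micrometers per frame
    return dist * FPS

# A cell moving 4 pixels per frame -> 2 um/frame -> 60 um/s
speed = instantaneous_speed((100, 200), (104, 200))
```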
So just to summarize again: this is the phone setup. You can actually
build it yourself with a 3-D printer and simple electronics, and you
interact in essentially two different ways with the cells. One is to
tap on the screen with your finger and select the cell of interest;
the other is to use a joystick to apply a light stimulus and make the
cells move. As I'll show in the next few slides, there are three
modalities of learning or education in here: you can build the system
yourself, you can do inquiry by really doing experiments, or you can
just play.
So we tested this with a number of children at science fairs to see
what different age groups get out of it, for example from what they
draw. Here is an eight-year-old's drawing with eyes on it, which
these cells don't have, but clearly reflecting that the child
realizes this is something living. We also had some test questions,
like how big these cells are or how fast they are swimming, which you
could see from the game as I showed you before. What you generally
find is that around 11 years, the kids fully get that; below that,
kids enjoy it but may not fully understand what's going on.
And this actually aligns nicely with typical biology curricula:
middle school is when most of biology is taught. Here's basically the
whole setup to build something like this. You do some 3-D printing,
some very simple microfluidics made with simple tape and plastic, and
a very simple electronic circuit. The idea is that as you build, you
learn these concepts in an integrated way.
And here's another one for doing more serious experiments. For
example, instead of building the microscope, you can also put this
interrogation chamber directly on a school microscope instead of on
the phone, which makes the whole thing cheaper. And we made some apps
where we can, for example, track individual cells under the light
stimulus or measure a number of velocity traces, and you can test
hypotheses -- do the cells, for example, swim faster when the light
is on versus off? You can do the whole scientific method on a system
like that. We also thought about how to integrate programming.
Programming on Android is a little bit hard, so we resorted to
Scratch, the children's programming language, where we set up a
simple model, which is also shown here. This is the whole equation
that drives the motion animation of these cells, and you can
basically program games on top of that, or more serious experiments.
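As a rough illustration of the kind of motion model one could program in such an environment -- this is an assumed toy model, not the published Scratch model -- each simulated cell swims forward, wiggles randomly, and turns away from the light direction when the light is on (negative phototaxis):

```python
import math
import random

# Toy 2-D Euglena motion model (illustrative assumption, not the
# classroom tool): constant forward speed, random heading wiggle, and
# a turn toward the direction opposite the light when the light is on.

def step(x, y, heading, light_on, light_dir,
         speed=2.0, wiggle=0.2, turn_gain=0.3):
    if light_on:
        # Steer the heading away from the light direction.
        away = light_dir + math.pi
        diff = math.atan2(math.sin(away - heading),
                          math.cos(away - heading))
        heading += turn_gain * diff
    heading += random.uniform(-wiggle, wiggle)   # random wiggle
    return (x + speed * math.cos(heading),
            y + speed * math.sin(heading),
            heading)

random.seed(0)
x, y, h = 0.0, 0.0, 0.0
for _ in range(200):
    # Light shines from the +x direction the whole time.
    x, y, h = step(x, y, h, light_on=True, light_dir=0.0)
# After many steps the cell has drifted away from the light (x < 0).
```

A game or experiment layer would then sit on top of this loop, exactly as the talk describes for the Scratch model.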
So we're now in the process of engaging with teachers more directly
and seeing how this could flow into the school curriculum or
after-school programs, and hopefully in the long run it has a similar
impact as, for example, Lego Mindstorms. Now let me go to a second
project that uses the same organisms again.
>>: Don't teachers ask you about more serious research questions you
could be doing with these cells -- like, what are they good for?
What do they do?
>> Ingmar Riedel-Kruse: Yes, so we actually --
>>: Don't kids worry about that?
>> Ingmar Riedel-Kruse: I think questions like cells dying worry
adults more than children. So we had a number of teachers actually
come to our lab, and we showed them the system and interviewed them
on what they found valuable and what not.
So some of the things the teachers found really valuable were already
just having a screen that multiple people could look at, and just
pointing at a cell and tracing it.
Some teachers were less excited about the game aspect. Some teachers
feel games are good for learning; others feel they're not, right? So
the question now is really how to bring this to the classroom and how
to make it meaningful for the teachers. That's really the next step.
>>: [Indiscernible] or something useful as opposed to speed. Could
you teach them anything about what these cells actually do?
>> Ingmar Riedel-Kruse: You can teach them, for example, that they
respond to light. Already the concept that cells have an organelle
that sees light, so to speak, and can respond to it, and also have a
closed feedback loop so that they always go with the light -- that's
an important concept. Another thing --
>>: The serious part of me just wants to know more about these. Okay.
>> Ingmar Riedel-Kruse: Good. So in the second project, we basically
use the same cells but tried a different set of stimuli. Again it's
light but it's light from above. So we have much more programmability
as you will see now.
So this is a touch screen. What you see here are the cells swimming,
and whatever you draw on the touch screen is directly projected onto
these cells. We draw a circle in blue light; the cells avoid this
blue light, and you can essentially trap a cell here and another cell
there. This just loops. So watch out for this cell here: it's
swimming along, then it hits the light and turns around. This is
actually also a much more natural stimulus than what I showed you
before. There you didn't see the stimulus -- with the joystick, some
light comes from the side and it's not always directly recognizable.
So what's the setup behind that? It's essentially a normal projector
with slightly different optics to make the image small, and you have
a standard microscope with a camera to feed back into the touch
screen. It also has a regular eyepiece, so you can really see not
only the cells but also what you project.
And what this movie shows is that you can do it on a different length
scale by zooming out. Again you can see how the cells move away from
this blue light and start assembling in the other areas. So this is
what the machine looks like, and this is the view through the
eyepiece, where you see the cells and also what you draw. And this
becomes important later, as you will see.
Now you can do a number of things with it -- for example, simple
science activities. We can ask what colors they respond to, so users
can choose different colors, for example red or green, noticing that
whenever the cells encounter that sort of light they don't respond,
but when there's blue light, they do. That's a very simple example of
a scientific inquiry activity. And you can make games on it. For
example, there's this apple, and you somehow need to steer the motion
of the cell, trying to make it bounce in different directions, such
that it gets to the apple.
We then tested this system multiple times, for example in a tech
museum. The setup is essentially this big touch screen here, and next
to it the apparatus, which is open so people can look inside, and
also the eyepiece. People would come up, and we were basically just
watching from a distance what they would do. I want to walk you
through a few examples.
What usually happens is people come to the touch screen and start
drawing -- it's a very natural thing. After a while they realize
there's something swimming in the background, and that whatever is
swimming and moving is actually responding to the things they draw.
Here you see an example, an image sequence spanning two minutes,
where a person initially trapped one cell, then made a little maze
and tried to get the cell out of the maze.
So we can argue that we enabled a museum visitor to build a
microfluidic circuit within two minutes, without ever having done it
before, and to run a self-guided experiment. What's also very
interesting about having this eyepiece and the touch screen: if there
were multiple people, usually only one person can operate each one.
So one person would start looking through the eyepiece while the
other one was drawing, and of course they recognized that what one
person draws, the other sees in the eyepiece. People started talking
to each other and made up guessing games -- you draw, I guess what
you drew -- and so forth. That really led to prolonged engagement
with the setup, with people spending three minutes and more there,
which is really what you want in a museum. Again, this is something
we are really used to from physics -- hands-on, interactive museums
-- but there isn't that much of it in biology, apart from petting
zoos, for example.
Just to show you another example: this is in collaboration with the
Exploratorium, where we use the exact same system but with a Kinect
as the input device. The Kinect detects your body and projects it
into the microscope. What this leads to is that, first of all, you
kind of project the cells into your own big space -- you as a person
meet the cells there -- but at the same time, if you look through the
eyepiece, you see a little person dancing inside the microscope.
That's actually something that children especially respond to very
much. And here you see, with this frame, how you can also trap cells
and move cells around. So I think this really gets to a level where
we can have a two-way experience between a human and a microscopic
organism, both at our scale and at their scale, and we are now
exploring what the best ways are to implement these kinds of
activities and what it really means philosophically for us.
And the final thing I want to tell you about in this vein: what
you've seen so far was basically single cells. But if you take a
million of these Euglena cells, put them in a closed petri dish about
this size, and shine light from below, you get so-called
bioconvection patterns. The cells swim up and accumulate at the top
surface, but that makes the fluid there denser, and this dense fluid
sinks down and drags the cells along with it, and they swim up again.
You get very dynamic patterns. For example, if you start from a
homogeneous distribution, as the movie shows, and project a pattern
onto it, you first see something almost like a photograph, and then
it develops further into more and more complex patterns. This is a
nice complementary system to the previous one, because here you
almost don't see the individual cells, but you really get a
collective phenomenon that involves millions of cells. It's
phototactic behavior. This one is also easy to build, because it's
essentially a petri dish, and the cells live in there for a couple of
weeks.
Okay. So much for our museum interactive explorations; now I want to
tell you something about how we can make biology experiments
available over the Web. One of the main motivations is that there are
many students across the world who need to, or would like to, do
biology experiments, and depending on how well your school is
situated, or how dangerous the organism may be, there are lots of
experiments you cannot do. The question is: could we enable students
to do biology experiments from any sort of electronic device over the
Web?
And we can. We again use the paradigm I showed you before, with these
four LEDs, basically looking at the phototactic behavior of the
Euglena cells. This is essentially the website. It has one big live
view of the camera, where you see the cells move. You have this
joystick here, with which you control the direction of the light. And
you have an external camera view of the microscope, so if you watch
closely you see the different LEDs turn on and off, which is
important to give the remote user a sense of what they're actually
doing. If you follow closely now, for example, the joystick points
down here, so this LED is on, and you see how the cells move
downwards. If you look for a while, you realize not all cells are the
same: some respond more, some less, and they come in different sizes.
Generally, image and video data is very rich; beyond the experiment
"cells respond to light," there's more to be discovered and analyzed
in these types of movies.
So this is a simple paradigm of a cloud lab, an online lab, and it
works very well for us, because these organisms are very stable. We
can culture them long-term and really provide the end user a platform
that is almost as robust as a standard electronic one.
To assess the utility of this for education, I have a strong
collaboration with Paulo Blikstein, Professor of Education at
Stanford. We went to a San Francisco middle school, to a teacher who
teaches multiple classes. On different days, a number of students
used our system, not only doing experiments but also analyzing the
data and then doing a modeling activity around that.
So that's essentially what the class looks like. It's pretty tough to
fit into a 50-minute class: do an experiment, analyze it, and model
it. And then do pre- and post-tests, which you also need because
you're doing research. But essentially that's the layout. So first,
about the experiments. There are different ways students can do
experiments in the cloud: either every student does an experiment on
their own, or you do a whole-class experiment where one student is
basically doing it on the wall and all students are participating.
What happened is that schools have firewalls, bandwidth limitations,
all sorts of things, which eventually led us to gravitate toward the
model with one setup at the front of the class. But we envision that
in the long run, when schools are better equipped for these kinds of
things, every student can run their experiments in parallel at the
same time.
After the experiment, the students would look at their data,
basically scrolling through the movie and analyzing how many cells
responded to the light and whether they really align. We also made
some post-hoc image processing where, from one single image, you can
recognize in which direction the cells are swimming.
So, again, the main question centers on the cells being light
responsive: they move away from light. We also implemented a modeling
environment -- modeling is something that, in my mind, is still
lagging in middle school compared to other science activities.
And how does one even enable a student to model these kinds of
complex cells? What we resorted to is building a simulation, which
you see in this little box here. We implemented the full 3-D physics
of this cell, and the student was asked to find the parameter set --
three parameters -- for which this simulated cell would actually
match the swimming trace here in purple; this purple trace comes from
a single cell that has been tracked. The three parameters the student
had to align were the swimming speed; how fast it rolls around its
own axis -- remember it only has one eyespot, so the rolling velocity
really has an effect; and how strongly it responds to light. Students
were basically fiddling back and forth with these parameters, trying
to get a good fit. In the future we can think of many more
sophisticated ways to really teach and engage children in modeling.
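The fitting activity can be sketched as follows: a toy simulator with the three parameters named above (swimming speed, roll rate, light coupling) and a coarse grid search standing in for the student's slider-fiddling. The model itself is a simplified assumption for illustration, not the classroom tool's actual physics:

```python
import math

# Toy cell model: the eyespot samples light once per roll, so the turn
# applied each step oscillates with the rolling phase and scales with
# the light coupling. Deterministic, so fitting is reproducible.

def simulate(speed, roll_rate, light_gain, steps=50, dt=0.1):
    x, y, heading = 0.0, 0.0, 0.0
    trace = [(x, y)]
    for i in range(steps):
        heading += light_gain * math.sin(roll_rate * i * dt)
        x += speed * dt * math.cos(heading)
        y += speed * dt * math.sin(heading)
        trace.append((x, y))
    return trace

def trace_error(a, b):
    """Summed point-by-point distance between two traces."""
    return sum(math.hypot(ax - bx, ay - by)
               for (ax, ay), (bx, by) in zip(a, b))

# Pretend this is the tracked cell (generated with "true" parameters):
target = simulate(2.0, 3.0, 0.5)

# Coarse grid search over the three parameters, as a student might do
# by fiddling with sliders until the traces overlap:
best = min(
    ((s, r, g) for s in (1.0, 2.0, 3.0)
               for r in (2.0, 3.0, 4.0)
               for g in (0.25, 0.5, 0.75)),
    key=lambda p: trace_error(simulate(*p), target),
)
```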
What the post-test revealed, importantly, is that the children
learned the content. The other thing is that the students liked the
activity: they felt agency, and they felt their performance was
successful. We also got very good feedback from the teachers, who
said this was one of the better classes -- the children were much
more alert.
Now think about the pros and cons compared to normal hands-on biology
experiments. Certainly the logistics are much easier: the teacher
doesn't have to bring in all the microscopes and wrestle with the
organisms. On the other hand, you suddenly lose a certain tactility
that you have when a child turns the knobs on a microscope. I think
ultimately you need to do both in tandem. Certain things are really
done well online -- the student can focus on experiment design and
doesn't have to worry about manual skill -- but in other cases you
really want the hands-on version as well.
Okay. So what's the long-term vision, and how would this scale in the
future? What you see here is basically multiple of these microscope
setups. The image is not very good, but essentially there's a camera
here, the objective here, and then this little chamber. You see the
tubes: we keep organisms in larger culture chambers and from time to
time flush organisms into these small chambers, thereby having
systems that can run autonomously over weeks.
Then we built out a cloud system where we have multiple of these
machines and corresponding user management: how do you queue people
and give preferences, for example to a teacher versus a student; and
if a machine is broken for whatever reason, how do you direct people
to a better machine?
As with any sort of cloud system, you need to somehow maintain it and
monitor what's going on, and we have that. But there's a special
aspect here, due to the biology, that you don't see in a normal
electronic computation lab: the organisms themselves may for some
reason not respond well, or be dead, whatever. For example, you see
the middle machine doesn't have that many cells. So we implemented a
framework where each of these machines runs a simple experiment every
couple of hours and then, with object tracking, recognizes how many
cells there are and whether they are responding. If a machine is not
working properly, people are not routed to it, and we get an alert to
go fix it. So we really get to the point where the end user can log
on and gets routed to something that gives a good experience.
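A minimal sketch of that health-check-and-routing logic; the thresholds and names are assumptions for illustration, not the lab's actual values:

```python
# Each machine's periodic self-test yields two numbers: how many cells
# the tracker found, and how many responded to the light stimulus.
# Users are only routed to machines that pass both checks.

MIN_CELLS = 20                # assumed threshold
MIN_RESPONSE_FRACTION = 0.5   # assumed threshold

def machine_healthy(cell_count, responding_count):
    if cell_count < MIN_CELLS:
        return False
    return responding_count / cell_count >= MIN_RESPONSE_FRACTION

def route_user(machines):
    """machines: dict name -> (cell_count, responding_count).
    Return the healthy machine with the most responsive cells,
    or None if all machines need maintenance (raise an alert)."""
    healthy = {name: counts for name, counts in machines.items()
               if machine_healthy(*counts)}
    if not healthy:
        return None
    return max(healthy, key=lambda n: healthy[n][1])

# Example: the middle machine has too few cells, the third responds
# too weakly, so users go to the first.
status = {"scope-a": (120, 90), "scope-b": (8, 5), "scope-c": (60, 20)}
chosen = route_user(status)
```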
If we think about cost and scale-up: the whole technology behind it,
as you saw earlier in the talk, is a very simple microscope and a
very simple chip. You can run the whole machine with a webcam and a
Raspberry Pi; put together, a setup costs maybe $250. If you have
many of those and someone to maintain them, you can figure on average
maybe $250 per year to maintain a single machine. An experiment takes
essentially a minute. If you track all these numbers, what you come
up with is that one of these machines can in principle run half a
million experiments a year. That means, of course, running day and
night, but people can access it from all over the world. The
individual experiment then costs less than a cent. So you can think
of, for example, addressing the roughly five million students that
every year go through a given grade level -- basically supplying
these sorts of biology experiments at scale and at low cost. That's
something we're now thinking about: how do we scale this up and make
it happen?
And, of course, we're also concerned about generality. Instead of
light from the side, you can use stimuli like you saw before,
projecting from above, and you can use different organisms. What you
see here, for example: these are Euglena cells, and these are Volvox,
microorganisms that form a multicellular ball. If you shine light on
them, they actually move toward the light, while Euglena moves away
from it. And in a different organism, if you follow closely, you can
see how these cells generate hydrodynamic flow fields that move other
cells around. There are lots of things to be discovered and seen. One
of the tricky things, if you want to bring these things online in a
stable way, is again the maintenance question: which types of
organisms lend themselves to being robust and don't need too much
logistics on our end in terms of maintenance?
So finally I want to tell you about an earlier cloud lab project,
which is an interesting contrast, because it works on a different
length and time scale and is not as scalable as the first one, but,
on the other hand, may have more utility for students actually
building setups like these themselves. The system involves petri
dishes and the organism Physarum, a slime mold that has a chemotaxis
response. Focus on this one: it's a petri dish about this size. In
yellow you see the organism, and in red is indicated where a robot
basically puts food particles. The whole movie spans over a day --
it's a much slower experiment, and an image is taken about every ten
minutes. What you can see is that this yellow organism essentially
follows the trail of food. You always see it branch out a little
bit, but essentially it initially grows everywhere, then finds the
food and moves along the food trail. The experiment essentially works
like this: the student logs in every couple of hours, programs the
robot to place a few more segments of the food trail, and comes back
a few hours later to see how the organism has responded. The system
itself we built out of Lego: we built a pipetting robot and have a
flatbed scanner below that basically takes images of multiple of
these dishes. So here's the interface where you can
basically select your experiment and route it to a particular
machine. You see a petri dish, and then you program this food trail
essentially by just drawing a line and specifying beforehand the time
difference between the segments. And these are the robots that
execute the experiments. What's interesting now is that we have these
robotics classes in schools and after-school programs, so students
could also build robots like that, and we could get this whole
mechatronics education into the life sciences as well. So then after
the robot
has executed it, the images are placed in a database and you can come
back and see what you've done. So, again, this is the robot, and this
is basically the pipette, which is almost all made out of Lego --
pretty low cost. Here is the standard acquisition: three standard
machines, each having six of these dishes. It was important to us to
understand the technology: if you have multiple machines in parallel,
each with multiple experiments, how do you route and synchronize all
of that? That's actually what this schematic is showing: you have
multiple of these machines, each of them running multiple
experiments. On the other hand, you have the
user. And this setup is actually very different from the real-time
Euglena lab you saw before: there, a user checks out an experiment,
owns it for a minute, does the experiment, and then hands it off
again. Here, all these machines operate autonomously; the user only
puts input into a database, and the machines query what they need to
do and then do multiple things in parallel. So: two very different
modes of operation. What's important is that in the biotech space we
get more and more high-throughput equipment for pharma and research
and so forth, and you can imagine using this highly parallelized
high-throughput equipment to run many experiments in parallel. While
one scientist runs 10,000 experiments in one run, what about 10,000
students running one experiment each? And so I used
this setup in a graduate-level class at Stanford that I taught, where
the students essentially ran experiments over 10 weeks. You see
examples where students tested, for example, whether the organism can
distinguish between lots of food versus little food and how strongly
it follows the trail, and the students would analyze the data,
measure fractal growth and speed, and make some models.
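The database-mediated mode of operation described above -- users enqueue instructions, robots poll and execute them autonomously -- can be sketched like this. The names and schema are hypothetical illustrations, not the lab's actual system:

```python
# Minimal stand-in for the shared job database: users queue food-trail
# instructions for a specific robot; each robot polls for its own
# pending jobs (FIFO) and works through them autonomously. Contrast
# with the Euglena lab, where one user checks out a whole machine for
# a minute at a time.

class JobQueue:
    def __init__(self):
        self._jobs = []          # list of (robot_id, instruction)

    def submit(self, robot_id, instruction):
        """A user queues an instruction for a specific robot."""
        self._jobs.append((robot_id, instruction))

    def poll(self, robot_id):
        """A robot asks for its next pending instruction,
        or None if nothing is queued for it."""
        for i, (rid, instr) in enumerate(self._jobs):
            if rid == robot_id:
                del self._jobs[i]
                return instr
        return None

q = JobQueue()
q.submit("robot-1", "food segment at (3, 4)")
q.submit("robot-2", "food segment at (0, 1)")
q.submit("robot-1", "food segment at (5, 4)")

first = q.poll("robot-1")    # oldest robot-1 job comes out first
second = q.poll("robot-1")
```

Because the robots only ever read from the queue, many machines and many experiments can proceed in parallel without users and robots needing to be online at the same time.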
And here is another example, a somewhat more simplified robot, where
you can, for example, program food-color gradients in 24-well plates,
as you see here -- something that's very easy to do; if it's
colorful, you can think of ever more colorful mixes. What we also can
do is more complicated experiments where you use bacteria in which
you induce certain gene expressions that lead to colors. What this
can really show students is -- it gives them a feel for what an
industrial robot is actually doing and how robots can facilitate
experiments: they take the labor off the person and enable you to
think more about programming and experimental design rather than
physical labor and hands-on skill. So
kind of division where all this leads to these kind of cloud labs,
talked quite a bit about education and you heard about Moonks
[phonetic] and other things like those. Another project we're studying
is working with physicists, a small select group of people that we
allow give access to our cloud labs. They can run experiments based on
their own choosing and try to come up with research questions and make
models around that.
Another, more long-term vision is of course really high-throughput
equipment on these kinds of machines and infrastructures, both for
research and education. And we can also think about citizen science: you
may know of projects where tens of thousands of people help solve
biologically relevant problems. What if you enable regular people to
design their own experiments and thereby contribute to the scientific
endeavor?
So, to now step back a little bit, conceptually: what I talked about is
interactive biology. We want to enable people to do experiments in a
very convenient way, and what do we need for that? We always need, of
course, some biology; we need some sensor and some actuator. And if we
want to interface that with our standard electronic world, we'd like to
have a digital input and output. You can think of interactive biology
where you don't have that, but as you saw, if you want to build cloud
labs, for example, you need to interface with these digital
infrastructures that we're so heavily used to. And once you connect
something to the digital realm, you can make a close comparison to other
chips we know -- CPUs, GPUs, and so forth -- where you have a digital
input, something magically happens, and you get a digital output. The
same thing happens here, except there's a biological process in the
middle that is noisy and has other properties. For that reason we call
this sort of setup a biotic processing unit, a BPU. We can also ask:
what is a performance metric of such a BPU? Just as you know how fast a
CPU is clocked and how many operations it can do, what would that mean
for a system like this?
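As a toy illustration of the BPU abstraction -- digital input, noisy biological process in the middle, digital output -- one could model it like this. The stimulus names and the noise level are invented for illustration; they do not come from the talk.

```python
import random

# Illustrative model of a "biotic processing unit" (BPU): a digital-in /
# digital-out device whose middle stage is a noisy biological process.
# The noise level is a made-up parameter, not a measured one.

_rng = random.Random(0)  # fixed seed so the sketch is reproducible

def bpu_respond(stimulus, noise=0.2):
    """Digital input -> (noisy biological process) -> digital output.

    Unlike a CPU, the mapping is stochastic: with probability `noise`
    the cells produce a response unrelated to the stimulus.
    """
    intended = {"light_left": "swim_right", "light_right": "swim_left"}
    if _rng.random() < noise:
        return _rng.choice(list(intended.values()))  # noisy, arbitrary output
    return intended[stimulus]

responses = [bpu_respond("light_left") for _ in range(100)]
print(responses.count("swim_right"))  # mostly, but not always, the intended response
```

The point is that any performance metric for such a device has to account for this stochastic middle stage, which is what the mutual-information view below does.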
And you could, for example, look back at this early system with its
projector and camera and say: assume a certain projector and camera
resolution, say HD; a certain amount of information is flowing through
there, and we could use that as a measure. But that's not a good one,
because if you have no biology in between, all you get out is
meaningless noise. So that can't be the answer, it can't be a good
measure; we need to take the biology into account. The way to really
think about it is to ask: if you provide a stimulus, how much of a
response do you see? If you provide a stimulus and don't see any
response, you are essentially just watching a movie. So what's the
feedback you get? What captures this is actually mutual information.
From an information-theoretic perspective: from the response that you
see, how much can you learn about your stimulus? And the more there is,
presumably, the more you can interact, or the more people can interact
at the same time. And just to make this a little more concrete: if you
have two LEDs, it takes about ten seconds for a cell to show a visible
response -- one bit within ten seconds, since within ten seconds you can
say which of the lights has been on. If you have four LEDs, then you
already have two bits. If you now think about the projector system,
where we can show patterns from above onto multiple cells in certain
sub-areas, you can actually get to higher levels. On the other extreme,
if you think back to the Physarum system, where we took an image every
ten minutes and the growth happens on a timescale of about 20 minutes,
you get to a much lower bit rate. So if you ask yourself which system
you should build to enable a certain number of children or researchers
to do certain biology experiments, you can use these kinds of
quantitative measures to say: this is the power of investigation you
have.
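The mutual-information framing and the LED arithmetic above can be made concrete in a few lines. This is an illustrative sketch only: the stimulus/response labels are invented, and the response times are the rough figures quoted in the talk.

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Estimate I(stimulus; response) in bits from observed pairs."""
    n = len(pairs)
    p_sr = Counter(pairs)
    p_s = Counter(s for s, _ in pairs)
    p_r = Counter(r for _, r in pairs)
    return sum(
        (c / n) * log2((c / n) / ((p_s[s] / n) * (p_r[r] / n)))
        for (s, r), c in p_sr.items()
    )

def bpu_bit_rate(n_stimuli, response_time_s):
    """Back-of-the-envelope rate: log2(N) bits per response time."""
    return log2(n_stimuli) / response_time_s

# A responsive dish: the response always reveals which LED was on.
perfect = [("left_led", "swim_right"), ("right_led", "swim_left")] * 50
# A dead dish: same response regardless of stimulus --
# "you are essentially watching a movie."
movie = [("left_led", "drift"), ("right_led", "drift")] * 50

print(mutual_information(perfect))  # 1.0 bit per interaction
print(mutual_information(movie))    # 0.0 bits

# The numbers quoted in the talk: two LEDs, ~10 s visible response.
print(bpu_bit_rate(2, 10))          # 0.1 bit/s (one bit per ten seconds)
print(bpu_bit_rate(4, 10))          # 0.2 bit/s (two bits per ten seconds)
print(bpu_bit_rate(2, 20 * 60))     # Physarum-like 20-minute timescale: far lower
```

The "watching a movie" case shows up directly: when the response is independent of the stimulus, the mutual information is zero, no matter how high-resolution the camera is.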
Okay. So, two final slides I want to talk about. When we first had our
games out there, some people asked whether it is ethical to play with
living organisms. I don't know if the audience wants to respond to that
in any way.

>>: Do the paramecia know they're playing games?
>>: Ingmar Riedel-Kruse: No, because they're not sentient, right? But
this prompted us to go to the bioethics people at Stanford, grapple with
this issue, and basically come up with a number of possible objections
one could have. There is, of course, the question of animal welfare: do
we hurt something, do we cause pain? There's the general question of
respect for life: if you play with these organisms, is that somehow
disrespectful to single-celled organisms, whatever that means? We also
looked at online responses -- YouTube videos where people were
commenting -- and these are some of the characteristic comments we found
there. Some people say this is playing God and you should not do it.
Some people say this looks yucky. Some say it's a slippery slope: maybe
now it's single-celled organisms, but later you do it with mice, which
may not be okay, so don't go there. Some people say this is a trivial
pursuit of taxpayer money: should we spend money on this, couldn't you
do something better with your time? There's a question about public
safety: what if, say, a killer virus gets generated and released into
the public? Do we make games that are good games, that are joyful for
the player and not frustrating? And then there are questions that are a
little more far-fetched, but if you think about scientific discovery
games, if things are discovered, who owns the discovery, things like
that. So these are some of the major themes, and they relate to many
bioethical debates. So we sat down and said: let's make a few
recommendations, setting boundaries for what we want to do here, to err
on the safe side. The obvious one is: don't cause pain or harm -- to
non-sentient organisms, that is; these single cells are of similar
complexity to the yeast you kill when you put a pizza in the oven, and
you don't think twice about that. We should certainly explain to the
public why we are doing this, and that's why I give talks like this one.
We have a mission to engage people and to find new ways of teaching
about biology and biotechnology, especially since it's an area that
increasingly influences our lives. We should show respect for life --
and it's always a good question what exactly that means. If you think
about playing with your dog, throwing the stick and having the dog bring
it back, you have a positive relationship. Or you could make a game with
two dogs fighting, which is something we don't really like; we think
about how to incorporate that.

You should also have respect for the players: make games where, in the
end, you think there is some positive outcome for that person. So those
are some of our recommendations, and we published an ethics paper on
that.
And so I started my talk with the relationship between computer science
and biotechnology. Here's a graph which makes the point that there is a
roughly 50-year gap, if you want to say so, between the two. The early
foundations of computer science, like quantum mechanics, can maybe be
matched up with the discovery of the DNA structure. Then came what we
call the golden years, and eventually there was the transistor and
integrated circuits, which we now have equivalents of in biology, right?
You can basically fit a line through it and say 50 years, and we can
also ask when the first video games came up and what was done then. We
are right on that slope with what we are doing.
So that ends my talk. The main thing I want to highlight is that we are
interested in providing an interactive experience between the human and
the microscopic organism. We have built a number of such systems and
learned a lot -- there is a lot to be said about how you make stable and
robust technology -- and many of our applications are focused on
education, which I think is a rather straightforward one: bringing one
simple experiment to a million students is easier to achieve than
bringing a very complicated experiment to ten specialized scientists,
but the types of technologies should in the end relate to each other.
That concludes my talk, and thank you for your attention.
[applause]
>>: Can you comment more on the citizen science aspect? Two prominent
examples were [indiscernible] and also astronomy. What do you envision?
>>: Ingmar Riedel-Kruse: So the general idea is: could you not only have
people involved in data analysis and problem solving, but maybe even in
data collection and experimental design? The EteRNA project, which was
on one of my slides, by Rhiju Das at Stanford, is already doing this to
a significant extent: people design RNA structures and submit the
designs, and these designs are then synthesized and tested for whether
they fold properly. That's getting people more and more involved in that
part of the research as well. Another example is what we see a lot with
cell phones: people go out into the wild, take images of birds and other
things, and basically help with the data collection.
>>: What is the reason that they respond to different light frequencies?

>>: Ingmar Riedel-Kruse: The Euglena respond most to blue. First of all,
toward the UV, blue light is more energetic. But it's also a little bit
of cheating, because if you crank up the red light extremely high --
these projectors have some sort of spectrum -- they would also avoid
that. So we chose to do something that gives a very clear, distinct
experience to the user, where blue gives a very strong response and the
other colors do not. Again, it's a little bit of cheating, because you
should not mislead the user into thinking they don't respond to red
light at all.
>>: Does it have any effect, though?

>>: Ingmar Riedel-Kruse: What do you mean by "effect"? If you have very
strong light for a long time, it may cause UV damage, for example, to
the cell; it may heat up. It really depends on what exactly is going on.
>>: Are you thinking of extending this, like opening it up to different
stimuli than just lights?

>>: Ingmar Riedel-Kruse: So you saw the chemical stimulus with the
robot, and in the very first game that I showed we had this electrical
stimulus. We are trying more organisms and more stimuli, but light is a
very easy stimulus to handle, because it is very easy to switch on and
off, while these Euglena cells are otherwise not easy to handle. We
would really like to branch out, but the more complicated the stimulus
or the organism is, the more complicated the technology gets. Any more
questions? Thanks again.

[applause]