University of Arkansas – CSCE Department
CSCE 4013 Virtual Worlds – Final Report – Fall 2010
MY IMMORTAL AVATAR
Matt Hull, Chad Richards
Abstract
Humans are mortal—we age, die, and decay. The ultimate goal of the My Immortal Avatar
project is to create a digital representation of a person so that he or she can live on in avatar form
long after his or her mortal counterpart has passed. The design involves a three-pronged
approach: a database to store the vast information that makes up a person, a digital model of the
avatar to capture the person's physical appearance, and a user interface that both connects the
database to the model and provides interactions between users and the avatar. Combined, the three prongs go a long way toward representing a person in digital form.
1. Problem
Our team is attempting to solve the problem of mortality, digitally. Instead of tackling the
problem of extending human life, we are creating a way for the person to live on in a virtual
world long after they have passed from the real world. One important application of this would
be to keep ancestral records. Family members and descendants of the individual could interact with the avatar, for instance, to create a living history, a permanent biographical record of the individual.
2. Objective
The objective of our project is to create an avatar or chatbot that contains biographical
information of an individual and is modeled by a 3D digital representation. We decided to
implement four primary functions in the client: listing entries by keyword, listing entries by date, saying something random, and having a random one-way conversation.
3. Related Work
3.1 Context
Pervasive computing is the idea that almost any device, from clothing to tools to appliances to
cars to homes to the human body to your coffee mug, can be embedded with chips to connect the
device to an infinite network of other devices [1]. The goal of pervasive computing is to create
an environment where the connectivity of devices is embedded in such a way that the
connectivity is unobtrusive and always available [1]. The Everything is Alive (EiA) project
describes a pervasive world where everything can sense, act, think, feel, communicate, and
maybe even move and reproduce [2].
3D virtual worlds take the idea of pervasive computing into a virtual environment. The goal of
such 3D virtual worlds is to create a one-to-one correspondence with the real world. Objects in
the virtual world should act and respond exactly like the ones in the real world. Our project
relates to the concept of pervasive computing because it adds a person from the real world into
the virtual world as an avatar and makes the avatar realistic in look, sound, thought, and memory.
3.2 Key Technologies
Ubiquitous computing - "machines that fit the human environment instead of forcing humans to
enter theirs" [3]. My Immortal Avatar could be used in museums and art galleries where people
could talk to artists or historical figures. Picture frames comparable to the ones seen in the popular Harry Potter series could not only move but also interact with passing people, remembering their faces and earlier conversations with them.
Ontology - “a formal representation of knowledge as a set of concepts within a domain, and the
relationships between those concepts” [4]. One of the challenges of applying an ontology is gathering this shared set of concepts and ideas for a domain and relating the data. In My Immortal Avatar, these concepts and ideas would be collected into a single format that can be stored for many years. The concepts, ideas, and emotions of many people could thus be preserved long past the time of their passing and then accessed and related to other shared concepts, ideas, and emotions.
3.3 Related Work
Alicebot is the work most closely related to our project because it is an avatar chatbot. Unlike the My Immortal Avatar project, whose focus is storing biographical information, Alicebot focuses on communication between a user and an avatar. A.L.I.C.E. (Artificial Linguistic Internet Computer
Entity) is an award-winning free natural language artificial intelligence chat robot [5]. The
software used to create A.L.I.C.E. is available as free ("open source") Alicebot and AIML
software [5]. Dr. Richard S. Wallace formed the ALICE A. I. Foundation in 2001 to promote the
development and adoption of Artificial Intelligence Markup Language (AIML) and ALICE free
software [5]. The A.L.I.C.E. project includes hundreds of contributors from around the world
[5]. In 2000, Alicebot won the Loebner Prize for the most human-like responses.
3.4 Related EiA Projects
Our My Immortal Avatar project relates to these other EiA projects:
Ontology - Both our project and the ontology project seek to represent real world knowledge.
Stutterbots - The Stutterbots project turns ordinary chat into stutterchat to simulate someone who
stutters, so others who communicate with that person understand and learn how to do so. It
could relate to the My Immortal Avatar project because, much like My Immortal Avatar, it
attempts to simulate nuances of real people. In the future, the immortal avatar should contain the
ability to speak, and if the avatar does stutter, the Stutterbots project could prove very useful.
Mirror World - The Mirror World application strives to represent a real store digitally and
provide methods for controlling objects within the store. When a real world object is moved, it
should move in the virtual store. Much like the Mirror World attempts to represent a real object
in a virtual environment, the My Immortal Avatar project attempts to represent a real person in a
digital world.
4. Architecture
4.1 Use Cases
4.2 Tasks
1. Understand …
a. SQL Syntax (Chad Richards)
b. MySQL Workbench (Chad Richards)
c. MySQL Routines (Chad Richards)
d. C# Syntax (Chad Richards)
e. MySQL Connector for ADO.NET (Chad Richards)
f. Visual Studio .NET (Chad Richards)
g. Photoshop (Matt Hull)
h. Video/Sound Editing (Matt Hull)
i. Video/Sound file extraction (Matt Hull)
2. Design …
a. MySQL database (Chad Richards)
b. 3D model (Matt Hull)
c. Blender Test Model (Matt Hull)
d. C# Client (Chad Richards)
3. Implement …
a. Test avatar (Chad Richards)
b. Test 3D Model (Matt Hull)
4. Test …
a. Querying biographical information by date (Matt Hull, Chad Richards)
b. Querying biographical information by keyword (Matt Hull, Chad Richards)
c. Stating a random fact/biography entry about the avatar (Matt Hull, Chad Richards)
d. Producing a one-way conversation (Matt Hull, Chad Richards)
5. Report …
a. Results of queries (Chad Richards)
b. Potential errors and bugs (Chad Richards)
c. Future work (Chad Richards)
4.3 Architecture/Design
There were three primary components that had to be designed in the My Immortal Avatar
project. A database had to be designed to hold the information that makes up a person, which
could include anything from physical aspects—height, weight, hair color, etc.—to the
connections a person can have, whether that be coworker, boss, relative, or their pet. A 3D
representation of the avatar had to be created to make the avatar look like the person it
represents. Finally, a client that combines the features of the 3D avatar render and the database
had to be created.
Chad Richards was responsible for the database aspects of the project. Designing the database
required not only knowledge of a database scripting language but also insight into what
information to store about individuals and how to compact the information into manageable
tables. Humans have physical attributes like height and weight, mental attributes including
intelligence and emotional tendencies, and a whole biography of experiences. They have likes
and dislikes, interests, hobbies, and favorite quotes and stories. They have a multitude of social
connections including friends and family. All of these features must fit into a manageable database.
To accomplish this, Chad decided to implement several tables, but kept the fields generic. For
instance, the connections table contains the UID, Type, Subtype, and Strength fields. The UID is
used to ensure that the connection is unique. For example, the avatar could have two aunts
named Sally that live in the same area. A unique identifier helps to make sure the two aunts do
not get stored as the same person. The principle of keeping the tables generic can be seen in the
Type and Subtype fields. These can be used to describe a Type=Relative, Subtype=Aunt
connection. They can also be used to describe a Type=Work, Subtype=Coworker connection.
Keeping the tables generic in this way not only allows many types of information to be stored in
few tables, but also allows the client to have flexibility in the way it stores and queries
information.
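As an illustration, below is a minimal sketch of what such a generic connections table could look like in MySQL. The field names follow the description above; the data types, sizes, and comments are our own assumptions rather than the exact schema shipped in MIADB.sql.

    -- Hypothetical sketch of the generic connections table; types and sizes are assumptions.
    CREATE TABLE Connections (
        UID      INT UNSIGNED NOT NULL AUTO_INCREMENT,  -- keeps two "Aunt Sally"s distinct
        Type     VARCHAR(50)  NOT NULL,                 -- e.g. 'Relative', 'Work'
        Subtype  VARCHAR(50)  NOT NULL,                 -- e.g. 'Aunt', 'Coworker'
        Strength TINYINT      NOT NULL DEFAULT 1,       -- closeness of the connection (discussed below)
        PRIMARY KEY (UID)
    );  -- other fields, such as the connected person's name, are omitted from this sketch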
The last field in the connection table mentioned is the Strength field. This is a concept that Chad
decided to implement in nearly all the tables he created in order to provide further human
qualities to the avatar. Strength could be used in many ways. How much the avatar likes certain
interests could be determined by the strength field. Is this interest casual, or is it one of the
avatar's life passions? Another use for strength could be how well they know a certain
connection. Is it a casual acquaintance or one of their closest friends? One of countless other
uses could be for how much the avatar likes to tell a certain story about him or herself. If the
biography entry is a strong one, the avatar could feasibly tell this story much more often than
weaker entries.
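For example, a client could use the Strength field as a weight when choosing which biography entry the avatar tells. One possible way to do this in MySQL is sketched below, assuming a hypothetical Biography table with Entry and Strength columns rather than the project's exact schema.

    -- Hypothetical example: favor stronger biography entries when picking a random story.
    SELECT Entry
    FROM Biography
    ORDER BY RAND() * Strength DESC
    LIMIT 1;

Entries with a higher Strength would be told more often, while weaker entries would still surface occasionally.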
After designing and creating the database, Mr. Richards created stored routines for the client to
interact with the database. Stored routines are used for a variety of reasons, one of the most
important being to separate the client code from the SQL statements. This makes the client code
easier to read and prevents the need to recompile the client code if the SQL statements need to be
changed at a later time. As a further advantage, using stored routines also speeds up the
program. Over a hundred routines were scripted to provide maximum flexibility to the client,
giving the client many different choices in how to retrieve information from the database.
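To illustrate how such routines keep SQL out of the client code, a sketch of one possible keyword-lookup routine is shown below. The routine, table, and column names here are hypothetical and do not reproduce the project's actual scripts.

    -- Hypothetical stored routine for retrieving biography entries by keyword.
    DELIMITER //
    CREATE PROCEDURE GetEntriesByKeyword(IN p_keyword VARCHAR(100))
    BEGIN
        SELECT b.Entry
        FROM Biography b
        JOIN BiographyKeywords bk ON bk.BiographyUID = b.UID
        WHERE bk.Keyword = p_keyword;
    END //
    DELIMITER ;

The client then issues a single CALL statement (for example, CALL GetEntriesByKeyword('movies')) through the MySQL Connector, and the underlying SQL can change without recompiling the client.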
Matt Hull was responsible for designing the 3D avatar model. To design the 3D model, these
steps had to be taken. Photos of the person's face needed to be taken from the front, from one side, and from the top. These pictures were then placed into a 3D modeling program adjacent to each other, and planes were placed around the edges of the face with reference to the three images.
A 3D rendering of the head was then visible in the middle of all three images.
Chad Richards developed a simple client for displaying information from the database. Given
the availability of information and recent death of actor Leslie Nielsen, Chad decided to create an
avatar based on him and stored many of his well-known quotes in the database, associating keywords and dates with these entries. The client was capable of displaying entries by
keyword and by date. The client also contained a “Say Something about Yourself” feature which
would display a random entry, and a “Have a One-Way Conversation” feature that would select and display a biography entry and then an entry related to the first. The client also
featured an area for displaying a picture of Mr. Nielsen to put a face to the things the avatar is
saying.
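The “Have a One-Way Conversation” feature can be thought of as two queries issued from the client: pick a random biography entry, then pick a second entry that shares a keyword with the first. A sketch of this logic in MySQL is shown below, reusing the assumed Biography and BiographyKeywords names from the earlier examples; in the actual client this logic would presumably be wrapped in stored routines like those described above.

    -- Hypothetical sketch of the one-way conversation logic.
    -- Step 1: pick a random biography entry and remember it.
    SELECT UID, Entry INTO @firstUID, @firstEntry
    FROM Biography
    ORDER BY RAND()
    LIMIT 1;

    -- Step 2: pick another entry that shares at least one keyword with the first.
    SELECT b.UID, b.Entry
    FROM Biography b
    WHERE b.UID <> @firstUID
      AND EXISTS (SELECT 1
                  FROM BiographyKeywords bk1
                  JOIN BiographyKeywords bk2 ON bk2.Keyword = bk1.Keyword
                  WHERE bk1.BiographyUID = @firstUID
                    AND bk2.BiographyUID = b.UID)
    ORDER BY RAND()
    LIMIT 1;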
4.4 Testing
We tested our project by programming a Windows application that connected to the database and
selected information by keyword, by date, and at random.
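For instance, the by-date test reduces to a query of the following shape, again with assumed table and column names and an arbitrary placeholder date.

    -- Hypothetical by-date lookup exercised during testing.
    SELECT Entry
    FROM Biography
    WHERE EntryDate = '2000-01-01';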
5. Results and Analysis
We used MySQL Workbench, a very useful graphical user interface, for creating the back-end database (pictured above).
This is the client we created. It features an image of the avatar, Leslie Nielsen, a combo box for
selecting biography entries by keyword, a combo box for selecting biography entries by date, a
button to have a one-way conversation, and a button to have the avatar say something about
himself. On the right, the selected entries are displayed.
This is the format in which the model is built. Matt Hull used 3DS Max to do most of the test
model rendering and animation.
This shows the beginning of the process of rendering someone's face. Note that the picture being rendered is itself a 3D-rendered face.
6. Conclusions
6.1 Summary
Designing a database to store all the various aspects of human life was a daunting task. Keeping
the fields generic proved to be a significant advantage in manageability and flexibility. Different
types of clients, from Windows applications to web applications, could be created, and each could
implement features of the database in different ways. MySQL is quite portable, given its
availability on web hosts and the connectors that are available for it, such as the C# connector we
used. MySQL Workbench proved to be quite buggy in its current state, though. It is not too
buggy to be used for applications such as this, but the bugs are there, and they do cost time.
Rendering a person to look and act like a specific individual would be very tedious and difficult
and probably require a team of graphic design artists. To release commercially, the program
would need to be able to render a 3D model through scripting, which would certainly lose
aspects of realism that a team of artists could produce. Also, showing emotion about certain
events and lip synching during speech would take extensive knowledge of all the various facial
expressions humans undergo.
What we were able to produce was a back-end database that could store a tremendous amount of
information about a person. This will be a useful feature to any future researchers on the project.
We were also able to navigate the complexities of 3D rendering using 3DS Max to produce a
video scene of a working modeled character. Finally, we were able to produce a client that
represented Leslie Nielsen as an avatar, successfully connected to the database, and had features
that included having a one-way conversation, saying something about himself at random,
querying biography entries by keyword, and querying biography entries by date.
6.2 Potential Impact
This project is significant in many ways. There is a sort of ambiguity when it comes to our
ancestors and who they were. It would have a huge impact if families could have virtual family
trees which would allow them to interact with, learn from, and get to know family members who
died hundreds of years ago. It could also be used in a classroom, fully simulating famous
individuals like Abraham Lincoln or JFK. Celebrities could create avatars of themselves, and
fans could ask them questions or see their life history. In time, it could be used in the household
to have a living family history, it could be used as a learning tool for students, and it could be
used as a form of entertainment for fans wanting to know more about their favorite celebrities.
6.3 Future Work
We have only scratched the surface of the My Immortal Avatar project. The database provides a wealth of options of which a full-featured client could take advantage. A client could take
advantage of the strength option, for example, to have the avatar talk about things he or she likes
or dislikes. The avatar could also favor telling stories that are important to him or her.
After one-way chatting is thoroughly explored, the next big step would be to explore having
conversations with the user. The user should be able to say something to the avatar and have the
avatar answer with an intelligent response. Other projects, like the Alicebot, have researched
this area for many years but are still far from completing the goal of simulating intelligent
conversation. Research in this area, then, is still a highly important avenue.
In order to become a releasable product, 3D rendering must also be done on the fly instead of by
hand. Customers expect their face to be immediately connected to the avatar instead of having to
wait months to have their likeness accurately rendered by a modeling artist. This is no easy task.
Customers would have to upload photos the program asks for, and then, the rendering program
would have to be able to take those images and compose them into a 3D render. The 3D
rendering program would require extensive use of scripts instead of being able to artistically
mold the images into a perfect likeness, as 3D artists do.
The avatar should also contain the ability to speak, vocally, and have lip synching capabilities
that match up with the customer. Dragon NaturallySpeaking has made a great deal of progress in this area and has released an SDK for programming use, so future researchers would be best
served by pursuing that avenue first.
7. Biography
Mr. Chad Richards, Student – Richards is a senior Computer Engineering major in the
Computer Science and Computer Engineering Department at the University of Arkansas. He has
completed relevant courses in database management systems and algorithms. He has extensive
work experience in creating, maintaining, and accessing databases. He has been a professional
web designer for the past five years, with experience in various scripting and database languages,
including SQL and C#. In the My Immortal Avatar project, he was responsible for designing the
database schema, creating the database, and scripting the routines for connecting the user
interface to the database. He also became responsible for creating a client to connect to the
database.
Mr. Matt Hull, Student – Hull is a senior Computer Science major in the Computer Science
and Computer Engineering Department at the University of Arkansas. He has completed courses in 3D modeling, database management, programming paradigms, and virtual worlds. He also has professional experience in team collaboration, programming in SQL, and design in various formats. He was responsible for the 3D modeling and animations involved in the immortal avatar. He used Blender and 3DS Max to create the visual models as well as the animations. He used NCH WavePad and VideoPad to create the sound files used for the
avatar's voice.
Dr. Craig Thompson, Mentor – Thompson is a professor in the Computer Science and
Computer Engineering Department. He leads the Everything is Alive research project that is
currently focusing on how to simulate pervasive computing using 3D virtual worlds. See
http://vw.ddns.uark.edu.
8. References
[1] Pervasive Computing. (2010). Retrieved from
http://www.webopedia.com/TERM/P/pervasive_computing.html
[2] Thompson, C. (20 Jan 1999). Everything Is Alive. Retrieved from
http://www.objs.com/reports/9901-everything-is-alive.html
[3] Ubiquitous computing. (22 Nov 2010). Retrieved from
http://en.wikipedia.org/wiki/Ubiquitous_computing
[4] Ontology (information science). (12 Nov 2010). Retrieved from
http://en.wikipedia.org/wiki/Ontology_(information_science)
[5] ALICEBOT. (2010). Retrieved from http://alicebot.blogspot.com
Appendix A – Deliverables Manifest
Directory Structure:

My_Immortal_Avatar
◦ SlideShow.ppt (Slide Show of project)
◦ FINAL-REPORT--My_Immortal_Avatar--Richards,Hull.doc (contains report file)
◦ Chad Richards
▪ MIADB.sql (contains SQL for importing into MySQL Workbench)
▪ My_Immortal_Avatar.zip (contains Visual Studio Project for Windows client)
◦ Matt Hull
▪ MIA.rar (contains Blender video and 3DS Max files)