Team 19: Project Proposal and Feasibility Study      

Team 19: Kegan Leberman, Nathan Leduc, Chad Malinowski, Tom Van Noord

Project Proposal and Feasibility Study

December 11th, 2015

© 2015, Calvin College and Nathan Leduc, Chad Malinowski, Kegan Leberman, and Tom Van Noord.

Executive Summary

Project Brösel is an alternative interactive whiteboard technology for the classroom. The goal of the project is to make the smart classroom available to all students and teachers by producing a high-quality interactive whiteboard at an accessible price. The device the design team produces will be useful for teachers, students, presenters, and business people. The target market is primarily schools and businesses, as those are the markets that will find the device most useful and applicable.

To be competitive in this market, the team is focusing on several key areas. The goal is to use as much of the existing technology in the classroom as possible to reduce waste and to limit how much the customer needs to purchase. The system the team is producing is to be usable in any classroom that has a projector and a surface onto which to project. Because the system integrates into an existing classroom, the project does not involve redesigning expensive technologies, such as projectors and integrated whiteboards, that would inflate the cost. The product can therefore be sold at a lower price. Along with a lower selling price, the designed technology combines important features so that a classroom is better able to use technology effectively.

Project Brösel has four members: Kegan Leberman, Nathan Leduc, Chad Malinowski, and Tom Van Noord. All four are senior engineering students in the Electrical and Computer Concentration at Calvin College. In this document, the team summarizes the project's feasibility and the design criteria in detail.
The team determined that the best method of making the design work was to have two camera sensors on the plane of the board communicating wirelessly with the computer. The cameras would send image data to the computer to be processed into position data for the stylus. The stylus will contain a gyroscope and an accelerometer so that it can act as a ranged mouse, and it will also be detected and tracked by the cameras so that its motion can be output onto the screen. All of these peripheral devices will be connected to the computer, which will do all the heavy data processing.

Table of Contents

Table of Figures
Table of Tables
Abbreviations
1 Introduction
  1.1 Calvin College
    1.1.1 Calvin College Engineering Department
      1.1.1.1 Senior Design Course
  1.2 Team Information
    1.2.1 Team Name and Logo
    1.2.2 Team Members
      1.2.2.1 Nathan Leduc
      1.2.2.2 Chad Malinowski
      1.2.2.3 Kegan Leberman
      1.2.2.4 Tom Van Noord
  1.3 Report Overview
  1.4 Design Norms
    1.4.1 Stewardship
    1.4.2 Justice
    1.4.3 Trust
2 Project Requirements
  2.1 Project Statement
  2.2 Requirements
    2.2.1 Functional Requirements
    2.2.2 Electrical Requirements
    2.2.3 Software Requirements
    2.2.4 Communication Requirements
    2.2.5 Physical Requirements
3 System Design Alternatives
  3.1 Development Board
    3.1.1 Possible Solutions
      3.1.1.1 Arduino
      3.1.1.2 Raspberry Pi
      3.1.1.3 Altera Nios II Development Board
    3.1.2 Decision Criteria
    3.1.3 Final Decision and Decision Matrix
  3.2 Motion Tracking
    3.2.1 Possible Solutions
      3.2.1.1 Three Dimensional Position Tracking
      3.2.1.2 Determine Position by Infrared (IR) Sensors
      3.2.1.3 Determine Position by Cameras
      3.2.1.4 Determine Position by Radar Method
      3.2.1.5 Determine Position by GPS
    3.2.2 Decision Criteria
    3.2.3 Final Decision and Decision Matrix
  3.3 System Communication
    3.3.1 Possible Solutions
      3.3.1.1 Bluetooth
      3.3.1.2 WiFi
      3.3.1.3 ZigBee
    3.3.2 Decision Criteria
    3.3.3 Final Decision
  3.4 Software
    3.4.1 Possible Solutions
      3.4.1.1 Use Existing Programs
      3.4.1.2 Create a Program that Enables Writing
      3.4.1.3 Create a Program that Overlays and Enables Interaction
    3.4.2 Decision Criteria
4 Hardware Selection
  4.1 Sensors
  4.2 Stylus
  4.3 Overall System Block Diagram
5 Software Selection
  5.1 Overview
  5.2 Functionality
  5.3 Inputs
  5.4 Outputs
  5.5 Software Updates
6 Physical Design Selection
  6.1 System Enclosure
  6.2 Material Selection
7 Prototype and Deliverables
  7.1 Final Prototype
  7.2 Deliverables
    7.2.1 Fall Semester Deliverables
    7.2.2 Spring Semester Deliverables
8 Testing
  8.1 Sensor Testing
  8.2 A-to-D Converter Testing
  8.3 Image Processing Testing
  8.4 Future Testing
9 Business Plan
10 Project Management
  10.1 Team Organization
    10.1.1 Documentation of Organization
    10.1.2 Division of Work
    10.1.3 Milestones
  10.2 Schedule
    10.2.1 Schedule Management
    10.2.2 Critical Path
  10.3 Operational Budget
  10.4 Method of Approach
    10.4.1 Design Methodology
    10.4.2 Research Techniques
    10.4.3 Team Communication
11 Conclusion
  11.1 Project Feasibility
    11.1.1 Technical Aspect
    11.1.2 Cost Aspect
  11.2 Lessons Learned
  11.3 Acknowledgements
12 Appendices
  Appendix A: Meeting Minutes
  Appendix B: Business Plan

Table of Figures

Figure 1: Image that Led to the Name Project Brösel
Figure 2: Project Brösel Logo
Figure 3: Project Brösel Team Picture
Figure 4: Three Dimensional Concept of the Wireless Camera Assembly
Figure 5: Projected Views of the Wireless Camera Assembly Concept
Figure 6: Diagram of Stylus Design
Figure 7: System Block Diagram
Figure 8: IR LED and Sensor Boards
Figure 9: IR Sensor Output Voltage as a Function of Distance
Figure 10: The A-to-D Converter Sends the IR Sensor Data to the Raspberry Pi through a SPI Bus

Table of Tables

Table 1: Development Platform Decision Matrix

Abbreviations
● ABET: Accreditation Board for Engineering and Technology
● DPI: Dots Per Inch
● IR: Infrared
● I/O: Input/Output
● LED: Light-Emitting Diode
● DMM: Digital Multimeter
● PC: Personal Computer
● S: Infrared Sensor
● SCK: Serial Clock
● MOSI: Master Out Slave In
● MISO: Master In Slave Out
● SS1: Slave Select 1
● SS2: Slave Select 2
● GND: Ground

1 Introduction

Prior to the analysis of the system design, this section provides background information on the purpose of the project, introduces the team, and discusses the Christian values, the design norms, upon which the project was designed.

1.1 Calvin College

Calvin College, founded in 1876 by the Christian Reformed Church in North America, is a private liberal arts college located in Grand Rapids, Michigan that “equips students to think deeply, to act justly, and to live wholeheartedly as Christ’s agents of renewal in the world.”¹ With a student body of 3,990 students, the college provides programs in over 100 majors and minors; the top majors by student enrollment are education, business, engineering, and nursing.

1.1.1 Calvin College Engineering Department

Calvin College’s Engineering Department is accredited by the Accreditation Board for Engineering and Technology (ABET) and offers a Bachelor of Science in Engineering in four distinct concentrations: Chemical, Civil and Environmental, Mechanical, and Electrical and Computer. During the 2015-16 academic year, the engineering department consisted of 467 students, 60 of whom are graduating in May 2016, according to assistant registrar Melisa Hubka.

1.1.1.1 Senior Design Course

Engr 339 and 340, Senior Design, are the capstone courses of the engineering department and draw on everything students have learned during their four years at Calvin. The course focuses on applying those skills and that knowledge to the design of a project relevant to each student's concentration.
Teams are typically composed of three to five members and may include members from different concentrations if that would benefit the scope of the project.

1.2 Team Information

To understand the project better, it is important to first understand who is designing it, as well as why the team chose the name Project Brösel, which, at first glance, seems to have no obvious connection to the nature of the project.

1.2.1 Team Name and Logo

Project Brösel is the codename for the project; it is not the product name that would be used in marketing. To help secure the intellectual property of the design, the actual marketed name would not be chosen until the design is ready for mass production.

¹ http://www.calvin.edu/about/who-we-are/

Still, the story of how Team 19 chose the codename Project Brösel is rather humorous. Team 19 wants the design to be a low-cost alternative to the SMART Board® that also doubles as a gyro mouse. Knowing this, various names were thrown around, such as MotionMouse, SmartMouse, and WonderMouse. However, none of those names had the particular ring to them that made Team 19 happy. In a moment of desperation, Chad Malinowski typed “WonderMouse” into Google Images and stumbled upon the image seen in Figure 1. The team was so amused by the intricate combination of cuteness and ridiculousness in the image that, from that day on, the codename for the project was Project Brösel.

Figure 1: Image that led to the name Project Brösel

A meaning behind the name was later derived from the definition of Brösel, which means breadcrumbs in German. The name is fitting because the purpose of the device is to enable the presenter to better illustrate his or her ideas to the audience; it is as if the presenter is leaving a trail of breadcrumbs to their ideas for the audience to follow.

The logo for the project, seen in Figure 2, was designed by Jacob Meyer, a graphic design major at Calvin College graduating in 2016.

Figure 2: Project Brösel Logo

The equal sign conveys that the writing as drawn by the user is identical to what the computer records on the screen. It is composed of two identical calligraphy strokes, which are notoriously difficult to imitate convincingly. This conveys a sense of accurate replication, which is the end goal of the system designed by Project Brösel. Additionally, the equal sign conveys one of the main values the team holds: equality in education. Project Brösel wants to design a low-cost alternative to the SMART Board® so that low-income schools can have the same access to educational technology as more privileged, higher-income schools.

1.2.2 Team Members

As previously mentioned, senior design teams are composed of three to five senior engineering students. This section provides a short introductory biography of each of the four team members. Figure 3 shows a team picture.

Figure 3: Project Brösel Team Picture. From left to right: Kegan Leberman, Chad Malinowski, Nathan Leduc, and Tom Van Noord.

1.2.2.1 Nathan Leduc

Nathan is a senior at Calvin College planning to graduate in May 2016 with a Bachelor of Science in Engineering with a concentration in Electrical and Computer Engineering. He grew up as a missionary kid and is an American and Canadian dual citizen who was born in England and spent nine years growing up in France. He hopes to pursue a career in controls or in hardware development and hopes that, some day, his career takes him back overseas. Outside of engineering, Nathan enjoys rock climbing, spending time with his friends and girlfriend, and never refuses an opportunity to travel.

1.2.2.2 Chad Malinowski

Chad is a senior in the Electrical and Computer engineering program at Calvin College. He grew up in Ada, Michigan. In the summer of 2015, Chad worked as a research intern at Carnegie Mellon University, where his research focus was hardware security. His interests are in controls engineering and embedded systems engineering.

1.2.2.3 Kegan Leberman

Kegan Leberman is a senior at Calvin College pursuing a degree in Electrical and Computer Engineering and a minor in Mathematics. Kegan is from Manchester, New Hampshire. His hope is to graduate and move on to professional work in embedded systems and digital systems design.

1.2.2.4 Tom Van Noord

Tom Van Noord is a senior engineering student at Calvin College in the Electrical and Computer engineering concentration.
His hometown is Grand Rapids, Michigan, specifically the Ada/Lowell area. Tom has done work involving microprocessors and precision motion. He hopes to go on to work in a field involving robotics, automotive systems, medical devices, or electric vehicles.

1.3 Report Overview

This report details the project that Team 19 is undertaking. Section 1 introduces the project, clearly explaining its scope and ideas. Section 2 states the requirements for the project; these were devised by the team and will be met in the final design prototype, and they include all of the electrical, functional, software, communication, and physical requirements. Section 3 details the system design, including the motion tracking, system communication, software, and overall system design. Section 4 describes the motion-dependent devices, such as the stylus and the sensor bars. Section 5 details the software that will be designed for the project. Section 6 describes the physical package that will be designed to house the product. Section 7 makes clear the deliverables that will be provided at the end of the first and second semesters. Section 8 states the testing methods that the team has used and will use to determine whether the device performs as desired. Section 9 details the business plan and model on which the team is basing the product. Section 10 details how the team was organized and the roles that each team member played. Section 11 concludes the report by summarizing what has been accomplished and what will be accomplished in the future.

1.4 Design Norms

All four of the design team members are Christians and, as such, their Christian faith plays a crucial role in how they want to approach the project. As a way to provide guidelines for honoring God through their design, they have chosen three design norms, stewardship, justice, and trust, which capture the essence of what they want to accomplish through the project.

1.4.1 Stewardship
Stewardship is a key component of the design. By designing an alternative to the SMART Board® that uses motion tracking technology instead of the standard resistive touch boards, less hardware is needed to accomplish essentially the same task. Using fewer materials helps to preserve the earth's resources and thus results in wise stewardship.

1.4.2 Justice

Alternatives to the design can cost upwards of $10,000. By designing a system that costs around $100 or less, schools with smaller budgets, such as those found in inner-city neighborhoods, can benefit from the same technology as schools with larger budgets. This helps to level the playing field between schools and helps to ensure fair and just access to quality education.

1.4.3 Trust

A key component of the project is trust: the customers who use the design must trust that the system is reliable and will work when it is needed. If the system is unreliable, it will not be used consistently and the entire project will have been for naught.

2 Project Requirements

Before the design process can begin, it is essential to know what is being designed. The first step is to formulate project requirements, which establish the core structure of how the design process will proceed. These requirements cover a wide range of design areas, relating to the functional design of the project as well as the electrical, software, communication, and physical design.

2.1 Project Statement

Project Brösel's goal is to create a new computer interface called Brösel that will be used in business presentations and classrooms. The device will allow the user to control the computer from a distance and will provide a method of interfacing with what is shown on the screen. Essentially, the goal of the project is to retrofit the whiteboard with an easy-to-use device that can read the user's motions and output their writing on the projected screen or the computer screen.
There would be no need for markers; the user could simply draw or write with the device, and the output would be displayed on the screen in the form of digital ink. The device would consist of two sensors and a stylus. The stylus will use accelerometers and gyroscopes to determine the relative motion of the device from a distance and will use a two-dimensional sensor array to determine the precise motion of the device when interfacing with the screen. A button on the stylus will trigger the accelerometers, gyroscopes, and sensors in order to determine when the device is writing on the board. As well as providing these special interfaces, the device will also function as a simple computer mouse, providing multiple uses for the convenience of the customer. The team expects to market this system to teachers, professors, and anyone who frequently gives presentations.

2.2 Requirements

The new computer interface will have to meet the requirements discussed in this section. These requirements are essential to successfully creating a niche in the target market.

2.2.1 Functional Requirements

The basic function of the device shall be similar to that of a gyroscope mouse: the device will operate as a standard mouse, but it can perform standard mouse functions in three-dimensional space as well as in two-dimensional space. In addition, when the user moves the mouse onto the projection surface, the cursor of the host computer will match the location of the physical device. This will enable the device to draw or write on the screen when a particular button is triggered.

2.2.2 Electrical Requirements

The device shall be powered by a battery with enough life to last at least four hours before a recharge is required. In addition, the device will require external peripheral sensors, which will need a similar if not slightly longer battery life.
These peripheral sensors will be discussed later in this report.

2.2.3 Software Requirements

The software will be the backbone of the device. It shall use all the information gathered from the device and the sensor network to determine the proper mouse location on the board. In addition, the software will give the device useful functionality such as changing the digital ink color, erasing, and saving the writing on the board. The software is also required to run in the background of the computer without interfering with other programs and processes. The last software requirement is a variable dots-per-inch (DPI) setting, which allows the user to change the sensitivity of the pointing device in the gyro and standard mouse configurations, along with other user-specified settings that make the system more comfortable and easy to use.

2.2.4 Communication Requirements

The device will be wireless to give the user the mobility to move freely around the classroom or conference room. A 40- to 50-foot range from the computer will give the user plenty of room to move around during a lecture. However, to allow multiple devices to be used in adjacent rooms at the same time, the means of communication among the sensors, writing device, and host computer must be unique, so that two or more systems in close proximity will not interfere with each other or with other devices using a similar communication protocol.

2.2.5 Physical Requirements

The handheld device will require comfortable packaging so that the user can perform comfortable mouse and writing motions throughout an entire lecture. The entire system shall be portable and easy to set up and begin operating, so that the user can take the device with them between rooms.
The target takedown and setup times are less than one minute and five minutes, respectively, to allow teachers to take down the system, move between classrooms, and set it up again for their next lecture or presentation.

3 System Design Alternatives

This section discusses the design alternatives and the approach to selecting the best design that will meet the requirements. It is broken down into three parts to analyze the system design: motion tracking, system communication, and software. Further detail about the chosen design is given later in this document.

3.1 Development Board

In developing the initial prototype of the system, a development board was used to connect and test the peripheral sensors as well as the handheld device. This required choosing which development board to use. While a wide array of development boards exists on the market, the three boards considered were the Arduino, the Raspberry Pi, and the Altera Nios II Development Board. These were considered over other, less common development boards primarily due to familiarity and ease of access, as well as the availability of third-party software and software libraries.

3.1.1 Possible Solutions

The possible development boards are discussed below.

3.1.1.1 Arduino

The Arduino family is a group of open-source prototyping platforms best known for their ease of integrating software and hardware and for supporting rapid application development. The Arduino board is a microcontroller that executes written code as the firmware on the board interprets it. It is a great product for simply reading data from input and output (I/O) devices.

3.1.1.2 Raspberry Pi

In contrast to the Arduino, the Raspberry Pi is a full computer rather than a microcontroller, and it runs a modified version of the Linux operating system.
It is better suited than the Arduino to processing large amounts of data quickly and is able to network with other devices through an Ethernet port mounted on the board. A constant 5 V power supply is required to keep the Raspberry Pi powered.

3.1.1.3 Altera Nios II Development Board

The Altera Nios II Development Board is a more sophisticated option than either the Arduino or the Raspberry Pi. It allows the user to completely configure and program a processor and, as such, allows for much more customization. It is also capable of handling a large variety of I/O, ranging from audio codecs to an LCD module to an IrDA (Infrared Data Association) transceiver.

3.1.2 Decision Criteria

When the development board was chosen, it was assumed that the motion tracking data would be processed on a microcontroller and that only the final calculated motion tracking data would be sent to the computer. As such, processing power was the main emphasis in choosing a development board; if cameras were to be used to collect position data, a board with a large amount of processing power would be a must. Integrability with sensors was also taken into consideration, as the development board would have to connect to the sensors in both the motion tracking peripherals and the handheld writing device. While ease of use was considered in comparing the three boards, it was prioritized lower than processing power or integrability with sensors because the team had previous experience working with all three boards in their free time and coursework at Calvin College.

3.1.3 Final Decision

Weighing the decision criteria according to importance and ranking each development board against those criteria produced the decision matrix seen in Table 1 below.
Table 1: Development Platform Decision Matrix

                             Weight   Arduino   Raspberry Pi   Altera Nios II
Ease of Use                     2        9           7                4
Integrability with Sensors      6        9           5                6
Processing Power                8        4          10                8
Total                                  104         124              108

From this, it was decided that the Raspberry Pi would be used as the development board for prototyping. As previously mentioned, it had been assumed, at the time the decision to use the Raspberry Pi was made, that the motion tracking computations would take place on the microprocessor. Since then, however, the decision has been made to transfer the raw sensor data to the computer for analysis, because the host computer has a far more powerful processor. Still, the choice of the Raspberry Pi makes sense: as part of the testing process, it was necessary to connect the development board to the computer through an Ethernet cable, a capability the Raspberry Pi provides without the need to modify the board or purchase extensions for networking.

3.2 Motion Tracking

In the process of choosing the motion tracking technology for the final product, several criteria were kept in mind. It is essential that whatever method of motion tracking is implemented be capable of providing precise and consistent measurements so that an accurate position can be calculated.

3.2.1 Possible Solutions

Possible design alternatives for the motion tracking aspect of the project are presented below.

3.2.1.1 Determine Position from Acceleration

The initial design idea was to track the device's location throughout the room using accelerometers and gyroscopes. The accelerometers and gyroscopes would determine the acceleration of the device and the angle of that acceleration.
From the gathered data, the computer would perform a double integration of the accelerometer data in order to determine the position of the writing device [2]. However, further research indicates that, while this calculation is possible, the error grows rapidly over time [3]. The major source of error in this method is the inability to record the initial conditions of the device without error. When integrating acceleration, which yields the velocity of the device, the initial velocity must be known. In this application, the initial velocity term would be equal to the final velocity from the previous point. However, this value will contain error that is folded into each subsequent calculation, compounding as long as the device does not calibrate itself against an external point of reference. This error is then compounded again when integrating the velocity to get the position. The situation is much like walking blindfolded without ever removing the blindfold or getting position information from someone nearby: for a brief period, the walker can keep track of where they are in three-dimensional space, but the longer they walk without checking their surroundings, the further they drift from where they think they are. Thus, this position tracking method would be extremely inaccurate without an external point of reference.

3.2.1.2 Determine Position by Infrared (IR) Sensors

Four IR sensors could be attached to the edge of the projection space. The IR sensors would then detect the illumination of an IR LED attached to the writing device. Ideally, when the user moves the device to write on the board, the IR sensors would detect the movement of the IR LED.
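The dead-reckoning drift described in 3.2.1.1 can be demonstrated with a short simulation: a stationary device whose accelerometer reports a small constant bias (a hypothetical 0.01 m/s², chosen purely for illustration) accumulates position error quadratically when naively double-integrated.

```python
def dead_reckon(accel_samples, dt):
    """Double-integrate acceleration into position (naive dead reckoning)."""
    velocity, position = 0.0, 0.0
    positions = []
    for a in accel_samples:
        velocity += a * dt          # first integral: acceleration -> velocity
        position += velocity * dt   # second integral: velocity -> position
        positions.append(position)
    return positions

# A stationary stylus: true acceleration is zero, but the sensor reports
# a small constant bias (hypothetical 0.01 m/s^2).
dt = 0.01                  # 100 Hz sample rate
samples = [0.01] * 1000    # 10 seconds of biased readings
drift = dead_reckon(samples, dt)

# drift[99] is the error after 1 s, drift[-1] after 10 s; the error
# grows roughly as 0.5 * bias * t^2, so ten times the elapsed time
# yields about a hundred times the position error.
```

Even this tiny bias produces roughly half a meter of reported motion after ten seconds of standing still, which is why an external reference (such as cameras) is needed.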
The distance could then be triangulated among the four sensors based on the intensity of the IR light each detects. To make the IR sensor system work, the IR LED on the device would need to emit light of uniform intensity across the plane perpendicular to the board. This is because the intensity of the light read by the photodiode is affected not only by the distance between the LED and the sensor but also by the angle of the LED relative to the sensor. Without the LED emitting a uniform intensity of light, it would be impossible to calculate the position of the LED and writing device. This is a concern because writing styles differ greatly from user to user: both the angle at which the device is held and the hand used to write can change.

[2] https://hal.inria.fr/hal-00966200/file/tmp.pdf
[3] http://www.pnicorp.com/wp-content/uploads/Accurate-PositionTracking-Using-IMUs.pdf

3.2.1.3 Determine Position by Cameras

One or two cameras could be used to determine when the user is at the board and where the device is with respect to the board at predefined points in the projection area. The camera would record an image, and either an external board or the computer would process the image to determine the device's location. The difficulty in this method would be mounting the cameras so that they have a clear view of the board. A common obstacle to using a camera is line of sight: if something obstructs the line of sight from the camera to the writing device, the entire system is rendered useless and seems unresponsive to the user.

3.2.1.4 Determine Position by Radar

A radar-style method could also determine the position of the writing device by measuring the time that light or an ultrasound wave takes to travel between the sensor and the device. The travel time between the handheld device and the sensors can then be used to triangulate the device's position on the board. While this could prove to be a good and reliable method for the motion tracking requirement, it would require a synchronized clock between the sensors and the handheld device to track the travel time and, hence, the distance, which might be a difficult feat to accomplish. Another downside to ultrasonic triangulation is that this technology is limited to tracking only one object.
In order for this technology to detect multiple objects, the sensor would have to be programmed to time multiple echoes rather than stop listening after the first one returns. Another consideration, which is both a downside and an upside, is that the sensor will detect a wide range of materials. If a finger is to be used as the writing method, this is beneficial, as the ultrasonic sensor would detect the presence of the finger. The downside is that, when using the electronic writing device, the reading of the ultrasonic sensor could be thrown off by objects in the line of sight, such as the user's hand.

3.2.1.5 Determine Position by GPS

A final possible way to determine position is via GPS. To determine position using GPS, the system would need access to at least three of the GPS satellites in orbit. Using the signals from these satellites, it is possible to determine the position of an object on the surface of the earth; with access to a fourth satellite, it is possible to determine the elevation as well. The problem with this method is that there is no guarantee of a stable connection to four satellites, especially indoors. Another issue is the delay between transmission and the position calculation; this delay is too long to be effective for the device. The final obstacle is that the device would be moving very small distances from the point of view of the GPS: moving a fraction of an inch on the board may not even register as movement.

3.2.2 Decision Criteria

The chosen sensor array must be able to provide a reasonable amount of resolution across a projected screen in a classroom. The system will take the position of the stylus placed on the board and send the positional data back to the computer. This data must be accurate enough for the computer to correctly display the motion of the device on the board as a mark on the screen.
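For the camera-based alternative, the position calculation reduces to planar triangulation from two bearing angles measured by cameras at known points along the top of the board. A minimal sketch, with an assumed 2.0 m camera baseline and illustrative coordinates:

```python
import math

def triangulate(baseline, alpha, beta):
    """Locate a point in the board plane from two bearing angles.

    Camera A sits at (0, 0) and camera B at (baseline, 0); alpha and
    beta are the angles (radians) each camera measures between the
    baseline and its line of sight to the stylus tip.
    """
    ta, tb = math.tan(alpha), math.tan(beta)
    x = baseline * tb / (ta + tb)
    y = baseline * ta * tb / (ta + tb)
    return x, y

# Sanity check with a hypothetical 2.0 m camera separation: a stylus at
# (0.5, 0.5) implies alpha = atan2(0.5, 0.5) and beta = atan2(0.5, 1.5);
# triangulation should recover the original point.
alpha = math.atan2(0.5, 0.5)
beta = math.atan2(0.5, 1.5)
x, y = triangulate(2.0, alpha, beta)
```

The accuracy requirement above then translates into how finely each camera can resolve these angles across its field of view.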
3.2.3 Final Decision

Choosing which method to use for motion tracking was fairly easy after some initial research and testing. Using the second integral of acceleration to track position will not work due to the rapid growth of the error. The IR sensors, while initially promising, will not work either, due to the difficulty of achieving a wide enough range of distances over which the output voltage is usable; the results of these tests are presented later in this report. Another challenge would be obtaining a very sensitive sensor and ensuring that ambient light does not affect its readings while it remains sensitive to the smallest change in the irradiance of the LED. Radar and GPS tracking will not work either, due to the unreliable nature of these methods, their lack of accuracy over small changes in distance, and their limited ability to track multiple objects. The method that has been chosen is to use cameras as the tracking technology. Cameras provide multiple advantages that the other methods do not, such as tracking multiple objects, adjustable exposure time, and the ability to filter different colors of light. Two downsides to this technology are that cameras consume a large amount of power and produce the most data to process of any of the alternatives.

3.3 System Communication

Another crucial component of the design is the wireless communication between the sensors, the stylus, and the computer. To achieve this wireless communication as efficiently and effectively as possible, three communication methods were considered: Bluetooth®, WiFi, and ZigBee®.

3.3.1 Possible Solutions

The three system communication design alternatives are presented below.

3.3.1.1 Bluetooth®
Bluetooth® is a wireless communication standard that enables data communication over short distances. Bluetooth® is incorporated into a large percentage of today's technology, ranging from wireless headphones and speakers to wireless keyboards and mice to hands-free calling in cars. Bluetooth® would be easy to implement since it is so widely used and, as such, documentation to help design the system is readily available. The downside of using Bluetooth® is that not all computers, especially older ones, support it. This would require the use of a USB interface to provide Bluetooth® capabilities to the host computer.

3.3.1.2 WiFi

WiFi is a local area wireless networking technology that allows enabled devices to communicate across a distance. The communication often takes place on the 2.4 GHz band, but recent developments have allowed the 5 GHz band to be used as well. The advantage of using WiFi for the device's communication is that WiFi is a widely used technology and is thus well documented and fairly easy to implement. The disadvantage is that a WiFi network is required to make the devices work, and there may not be a wireless network readily available.

3.3.1.3 ZigBee®
ZigBee® is a low-cost, low-power communication standard. It uses a low duty cycle to achieve its low power consumption, and its communication range extends up to 300 feet. ZigBee® is mostly used for building network topologies, as it is great at creating wireless mesh networks. One downside to this technology is its relatively low data bandwidth.

3.3.2 Decision Criteria

The communication method chosen must be able to communicate over an extended distance, approximately forty to fifty feet. It must be easy for the user to set up and connect to all of the devices. The communication must also be fast enough to allow near real-time editing of the board.

3.3.3 Final Decision
After considering all of the options, Bluetooth® best fits the specifications outlined above. The downside to WiFi is that it relies on the customer's network; the system would have little control over its ability to transmit data wirelessly, which could lead to unreliability and violate the design norm of trust. By using Bluetooth®, all of the components required to achieve wireless communication are contained within the system. While ZigBee® could be a viable wireless communication method, its low data bandwidth might cause problems when sending image processing data over the network. Since Bluetooth® has a higher bandwidth, it was deemed most reasonable to use Bluetooth® technology for the wireless connectivity of the system.

3.4 Software

The software in the system enables the actual process of writing to the computer. It will take the data from the sensors and the stylus and process it in order to fulfill the design requirements and objectives. Three alternatives are considered for implementing the system's software. These solutions differ in nature from the design alternatives in the previous sections: they build on one another, so that selecting the third alternative also includes the basic premise of the first two.

3.4.1 Possible Solutions

The three software design alternatives are detailed below.

3.4.1.1 Use Existing Programs

While not ideal, the software could simply act as a mouse and interface with different programs in the same way that a mouse would. This limits the effectiveness of the device to 'writing' only in visual editing programs, such as Microsoft Paint, that rely solely on the mouse.

3.4.1.2 Create a Program that Enables Writing

A program that enables writing would essentially be an overlay on the existing screen. The overlay would be a transparent window so that the user could see everything being displayed. While in this program, however, buttons underneath could not be clicked, because the transparent backdrop runs in the foreground of the PC.

3.4.1.3 Create a Program that Overlays and Enables Interaction

Increasing the complexity of the program further would enable writing on the board as well as interaction with the programs in the background. This final iteration of the software would be ideal, as it gives the user the most freedom when interfacing with and using the device.
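The event routing implied by the third alternative can be sketched as a small state machine: the stylus's end-cap button toggles between write and mouse modes, and position updates are dispatched accordingly. The class and method names below are illustrative only, not a defined API.

```python
from enum import Enum

class Mode(Enum):
    MOUSE = "mouse"   # stylus acts as a gyro mouse / traditional mouse
    WRITE = "write"   # strokes are drawn onto the transparent overlay

class StylusRouter:
    """Routes stylus events either to the OS cursor or to the overlay."""

    def __init__(self):
        self.mode = Mode.MOUSE
        self.strokes = []      # ink captured while in WRITE mode
        self.cursor = (0, 0)   # last cursor position in MOUSE mode

    def toggle_mode(self):
        """Handle the end-cap pushbutton: flip between the two modes."""
        self.mode = Mode.WRITE if self.mode is Mode.MOUSE else Mode.MOUSE

    def position_update(self, x, y):
        if self.mode is Mode.WRITE:
            self.strokes.append((x, y))   # draw on the overlay
        else:
            self.cursor = (x, y)          # pass through to the OS mouse

router = StylusRouter()
router.position_update(10, 20)   # moves the cursor
router.toggle_mode()
router.position_update(11, 21)   # draws ink instead
```

A real implementation would additionally forward clicks to the windows beneath the overlay, which is exactly what distinguishes alternative 3.4.1.3 from 3.4.1.2.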
3.4.2 Decision Criteria

While a software package with a plethora of features is ideal, time constraints will dictate what can be accomplished in the coming semester. The goal is to create a software package that showcases the basic and most important functions of the system in order to demonstrate its abilities. If time permits, however, a more comprehensive software package will be developed to showcase advanced features of the system.

4 Hardware Selection

Any good electrical engineering project requires a good amount of hardware, and this project is no different. The two primary hardware components are the stylus and the sensors. The stylus is the handheld device the user holds to write on to the computer; it doubles as a gyro mouse and carries an IR LED that is tracked by the sensors. The sensors are image processing cameras which, working together, can triangulate the position of the stylus.

4.1 Sensors

The peripheral sensor system will consist of two wireless camera assemblies mounted in the top or bottom left and top right corners of the projection space. These assemblies will need to be positioned a small distance from the corner of the projected image in order to keep the projection space in view and in focus of the cameras. The length of this distance will be determined later through testing once the camera and assembly layout is determined and prototyped. The main components of each camera assembly will be one high resolution camera, a Bluetooth®
module to communicate with the host computer, one accelerometer used for device leveling, a rechargeable lithium-based battery, and a micro USB interface for both data communication and power. Between the two assemblies, a laser and photodiode will be required to align the two cameras. Other components required on each assembly will be various indicator LEDs and magnets for mounting the assemblies to the whiteboard or other mounting surfaces. These sensors will take in data via the high resolution camera and possibly other sensing devices to be added at a later date. This data will be an image of the space above the projection surface and will be transmitted to the host computer via the Bluetooth®
module for further processing. Other data transmitted over this connection will include the statuses of the sensor and host computer, along with commands sent to the camera requesting either a full image or a fraction of the image. To ensure that the sensors are placed correctly, the accelerometer and laser will be used: each assembly will have an onboard accelerometer to verify that the assembly is level, while the laser and photodiode will ensure that the two camera assemblies are at the same height. If the leveling is not done properly, the camera data analyzed to determine the position of the writing device could produce contradictory conclusions. The preliminary design of the sensor bars can be seen in Figures 4 and 5 below.

Figure 4: Three Dimensional Concept of the Wireless Camera Assembly

Figure 5: Projected Views of the Wireless Camera Assembly Concept

4.2 Stylus

The device that the user will use as a mouse and to write on the board will be a stylus. It will include four pushbuttons: two that function as the traditional left and right clicks of a mouse and two that are customizable. The stylus will also include a scroll wheel and another pushbutton on the end to toggle between the write and mouse functions. An LED will be placed on the end of the stylus to be detected by the cameras; the LED will pulse to lower power usage. An alternative to the LED is a retroreflective tip: placing the LED on the camera assemblies would then produce the same effect, as the cameras would detect the retroreflected light. Further testing will be conducted to determine which solution provides more accurate results. The device will also have an accelerometer and a gyroscope to control the mouse and board from a distance, as well as a wireless communication chip to send its data to the host computer for processing.
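The stylus's sensor readings could be framed in a compact binary packet before wireless transmission. The sketch below assumes a hypothetical 14-byte layout (raw int16 accelerometer and gyroscope counts, a button bitmask, and a scroll-wheel delta); the actual packet format has not been specified by the team.

```python
import struct

# Hypothetical on-air packet layout, little-endian:
# 6 x int16 (accel x/y/z, gyro x/y/z), uint8 button bitmask, int8 scroll delta.
PACKET_FMT = "<6hBb"

def encode_packet(accel, gyro, buttons, scroll):
    """Pack raw sensor counts and button state into a 14-byte frame."""
    return struct.pack(PACKET_FMT, *accel, *gyro, buttons, scroll)

def decode_packet(data):
    """Unpack a frame back into named fields on the host side."""
    fields = struct.unpack(PACKET_FMT, data)
    return {"accel": fields[0:3], "gyro": fields[3:6],
            "buttons": fields[6], "scroll": fields[7]}

pkt = encode_packet((12, -34, 980), (1, 0, -2), buttons=0b0001, scroll=-1)
decoded = decode_packet(pkt)
```

At 14 bytes per update, even a high report rate stays far below the bandwidth of the Bluetooth® link chosen in Section 3.3.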
Figure 6 below displays a basic stylus design.

Figure 6: Diagram of stylus design.

4.3 Overall System Block Diagram

The system includes four main hardware and software elements that will be designed by the team. There will be two camera sensors on the board where the screen is being projected. These camera sensors will see the stylus, or possibly other devices that touch the screen, and output the results just as a marker would on a whiteboard. The stylus will also have gyroscopes and accelerometers so that, from a distance, it can be used as a gyro mouse. Figure 7 below displays the overall system block diagram.

Figure 7: System Block Diagram

5 Software Selection

Project Brösel will require software to support the hardware. This section outlines the inputs, outputs, and functionality of the software, and also covers future software updates.

5.1 Overview

Ideally, the software will provide several functions. It will take the wireless signals from the sensors and the stylus, process and decode them, and provide the necessary functions to take full advantage of the system's capabilities. The main software method will proceed as follows: when the stylus is in use on the screen of the board, the computer will take the data returned from the sensor bars, construct the shape or shapes drawn by the stylus, and display them in the correct location. This will take place on top of the current processes running on the screen. The software will likely be based on mouse movement: when the interrupt that signifies writing is triggered, the mouse will move to the desired location and perform the desired click, drawing, or other function entered by the user. The idea is to keep the software and hardware as simple as possible while fulfilling the needed task; the more complicated the design and software, the less user friendly it becomes.

5.2 Functionality

Key to the operation of the software will be multiple OS support.
Schools are not standardized in the computer systems implemented in their classrooms, so the software must be usable on different computer systems. The program will essentially run in the background, allowing the other computer processes to run in the foreground. The user will then be able to write and manipulate the programs running on the computer using the stylus and screen. The software will take data from the wireless camera assemblies, process that data, and display the results on the projected image. It will also be easy for the user to set up and use. The installation of the software should be self-explanatory and honest, neither installing nor requiring bloatware or other unwanted software; this is in keeping with the design norm of trust.

5.3 Inputs

The inputs to the computer's software will be the data from the peripheral sensors and the data from the stylus. This data will be sent to the computer from each of the devices individually and processed by the computer as needed. The computer will take these inputs, perform the needed calculations, and output the information to the screen in a timely fashion.

5.4 Outputs

The computer's software needs to output the signals required for the wireless communication. This may include acknowledgment signals or start/stop signals. Additionally, the computer will indicate the presence of the paired devices and provide an indication of the strength of the wireless connection.

5.5 Software Updates

If time remains at the end of the semester after completing the basic system requirements, it would be possible to implement additional features in the software of the system. The first, and presumably most useful, software update would be support for multiple handheld devices or the use of a fingertip as a writing device. This would enable multiple people to write on the board simultaneously.
Supporting another user will not affect the hardware of the camera assemblies, as these devices simply capture and send images to the host computer. The component requiring the most modification to add a second user would be the image processing software. When only one user is writing on the board, the stylus will always be in view of both camera assemblies and the software will know that there is only one writing device. Once a second device is added, it is possible for one writing device to come between a camera and the other writing device. This creates a blind spot in which one camera sees a single writing device while the other camera sees two. The software will need to be written so that it can distinguish the two writing devices from each other, process both of their movements, and handle situations where one writing device cannot be seen by both cameras. Another feature which could be implemented would be communication between boards. For example, students in one classroom could see changes made in real time by students in another classroom. Alternatively, this feature could be implemented in classrooms that contain multiple interactive whiteboards. This could enable an engaging form of communication between individuals in distinct geographic locations. A third feature which could be implemented in software would be user gestures. These gestures would allow the user to change the screen and control different aspects of the device software through simple motions. This feature is not critical to the system's functionality, but it would be a nice inclusion in the project.

6 Physical Design Selection

The Physical Design section covers the packaging for the sensors and stylus. This packaging is important in order to provide the user with a comfortable, aesthetically pleasing device.
6.1 System Enclosure

The system will be composed of four parts: the computer, the projector, the stylus, and the sensors. The computer and projector will be purchased separately from the system, and the sensors and stylus will interface with that existing hardware. The mouse device will be enclosed in a polymer case shaped like a stylus. It will be comfortable to write with and will function both as a gyro mouse and as a traditional mouse. The sensors will be enclosed in a similar material and will have magnets on the side to allow them to be attached to the surface of a whiteboard. They will also include magnetic clips so that, if need be, the system can be attached to a projector screen.

6.2 Material Selection

The material of the physical design will be a lightweight plastic polymer. This way, the device will be light enough to carry and the sensors light enough to hang on the wall, while the enclosure protects the electrical components from damage. The magnets selected will be strong enough to hold the devices on the board but not so strong as to damage the electronics or the board to which they are attached.

7 Prototype and Deliverables

The Prototype and Deliverables section outlines how the team plans to demonstrate Project Brösel on senior design night. It also describes what the team is expected to deliver.

7.1 Final Prototype

By senior design night on May 7th, the final prototype shall solve the problem described in previous sections. It shall consist of the user device along with the additional sensors required. The prototype will connect to one of Calvin College's computers and projectors to demonstrate its functionality.

7.2 Deliverables

Described below are the items that the team will deliver at the end of the fall and spring semesters.

7.2.1 Fall Semester Deliverables

By the end of the fall semester, the method of motion tracking will be chosen.
Since this is the most crucial component of the entire system, the goal of the first semester was to decide which sensors will work for the project and to predict the amount of data and power required to implement each option. This choice will then be implemented as time permits. The first prototype should be designed by the end of January or early February, with product testing and any necessary revisions beginning soon after. Over the course of the past fall semester, research and various tests were carried out, and it was determined that cameras will be used as the sensors.

7.2.2 Spring Semester Deliverables

The Spring 2016 semester deliverable will consist of the second prototype as well as the computer software. Ideally, the second prototype will be finished by April, which would allow the team to debug the design and create a better product to show at senior design night. At the very least, the system will include the core requirements and features needed to demonstrate the fundamental purpose and primary functions of the system. These components include the two peripheral camera assemblies, the wireless receiver that transmits data between the sensors and the host computer, the stylus, and software that supports a single writing device. If time permits, additional functions will be on display, possibly including support for a second writing device and other software updates.

8 Testing

The Testing section describes the experiments conducted to evaluate the feasibility of the design alternatives: image processing tests, tests of the infrared sensor design alternative, and some testing of the analog-to-digital (A-to-D) converter. The section also covers future tests that will be conducted to ensure product functionality and safety.

8.1 Sensor Testing

The team conducted some tests on the feasibility of using IR sensors for the peripheral sensor system.
By connecting an IR LED to one power supply and the sensor to another, as seen in Figure 8, a test could be conducted to determine the relationship between the output voltage of the sensor (a function of the IR intensity reaching it) and the distance between the LED and the sensor.

Figure 8: IR LED and Sensor Boards

Preliminary tests were not promising: the relationship between output voltage and distance was not linear but, rather, an inverse-square relationship. The results from the test can be seen in Figure 9. An inverse-square relationship is not ideal for a peripheral sensor system because the output voltage of the sensor changes too rapidly for precise position calculations. From these results, it can be seen that the optimal range for the sensors would be between 14 and 23 inches, the region of the graph where the relationship between output voltage and distance is close to linear. While it would be possible to write software that uses the inverse-square result of this test for the triangulation process, that would add an extra degree of complexity to the code. A usable distance range of 14 to 23 inches, per the criterion described above, means that the optimal range is only 9 inches long, which would not work for a standard projection screen width of 100 inches.

Figure 9: IR Sensor Output Voltage as a Function of Distance

8.2 A-to-D Converter Testing

The IR sensors output an analog voltage signal, so, in order to process the data on the computer, the output voltage needs to be converted into a digital signal through an A-to-D converter. The A-to-D converter was connected to a Raspberry Pi and, through a program written to read the digital output, the functionality of the A-to-D converter was tested.
An initial obstacle in this phase of testing occurred when the digital output represented a voltage range from −2.5V to 2.5V instead of the expected 0-5V range. Soon after encountering this error, it was determined that infrared sensors would most likely not be the optimal approach for the motion tracking technology and, as such, the focus shifted away from solving it. The program code used for this test can be seen on the team's website. While the test results did not look particularly promising for the IR sensors, additional tests were conducted by Chad Malinowski and Nathan Leduc in conjunction with a lab project for their Engr 325 class, Computer Architecture and Digital Systems Design. This phase of testing combined the sensor testing with the A-to-D converter testing; the test setup can be seen in Figure 10 below. The result from this test, described in further detail on the team's website, confirmed the previous finding that the IR sensors were not going to work particularly well.

Figure 10: The A-to-D converter sends the IR sensor data to the Raspberry Pi through a SPI bus.

8.3 Image Processing Testing

As part of the lab project for Engr 325, Kegan Leberman and Tom Van Noord tested the feasibility of the image processing sensors, an alternative to the peripheral IR sensors. The image processing tests were encouraging: Leberman and Van Noord were able to use a USB webcam to determine the position of an infrared LED in a picture taken by the webcam. These results suggest that the team's choice to use cameras is feasible. The project indicated that several things still need to be improved, such as the processing speed of the images and image transmission from device to device, but the preliminary results have been encouraging.
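The LED-finding step from this test can be approximated by thresholding a frame and taking an intensity-weighted centroid of the bright pixels. The sketch below uses a tiny synthetic frame rather than real webcam data, and the threshold value is illustrative.

```python
def led_centroid(image, threshold):
    """Centroid of above-threshold pixels: a crude IR-LED locator.

    `image` is a row-major list of lists of brightness values, the kind
    of single-channel frame a webcam behind an IR filter might produce.
    """
    total = sx = sy = 0
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value >= threshold:
                total += value
                sx += x * value
                sy += y * value
    if total == 0:
        return None                    # no LED visible in this frame
    return (sx / total, sy / total)    # weighted sub-pixel position

# A tiny synthetic frame: dim background with a bright blob near (2, 1.5).
frame = [
    [10, 10, 10, 10],
    [10, 20, 250, 10],
    [10, 10, 240, 10],
]
pos = led_centroid(frame, threshold=200)
```

Because the centroid is intensity-weighted, the estimate lands between pixels, which is one way a camera system can exceed the resolution of its raw pixel grid.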
8.4 Future Testing

The coming interim period and spring semester will include further testing to ensure that the system will work. By the end of the semester, the team hopes to conduct a field test wherein teachers are given the prototype and asked to use it in their classrooms. Feedback from the teachers will give the team an idea of what works well and what areas of the design could use improvement.

9 Business Plan

See Appendix B for the business plan for Project Brösel.

10 Project Management

The Project Management section describes the process by which the team was organized into different roles to most efficiently design the system. In addition, the section explains the team's methodology behind the design process.

10.1 Team Organization

Members of the team include Nathan Leduc, Chad Malinowski, Kegan Leberman, and Tom Van Noord. Leduc is primarily involved in designing the hardware of the board. Malinowski, Leberman, and Van Noord are handling much of the software for the project. Additionally, Leberman and Van Noord worked on the mathematical models needed to complete the project. Malinowski managed the scheduling of the project while Van Noord kept record of the team's budget. Leberman created, maintains, and updates the team's website. All four members have been heavily involved in the testing phase of the project. The team's faculty advisor is Mr. Mark Michmerhuizen, Assistant Professor of Electrical and Computer Engineering at Calvin College. In addition to Mr. Michmerhuizen, instructors for the Senior Design course included Dr. Jeremy Van Antwerp, Mr. Ned Nielsen, and Mr. Robert Masselink. Team meetings took place every Monday and Thursday at 7pm, along with other meeting times as needed. Documents pertaining to the completion of the Senior Design project were kept in a shared file on Google Drive.
10.1.1 Documentation of Organization

Minutes were taken at each meeting by Leduc so as to keep track of what was completed at the previous meeting and to quickly pick up where the team left off. Additionally, a Gantt chart was produced to ensure that everything would be completed before the final completion date of May 7th. Documentation of the minutes can be seen in Appendix A, and the Gantt chart can be seen on the team's website.

10.1.2 Division of Work

The project, broadly speaking, is broken up into two large sections: hardware and software. While all four group members are willing to help out in whatever way is needed, Leberman and Van Noord are heading up the development of the software while Leduc and Malinowski are taking the lead on the hardware.

10.1.3 Milestones

The team reached two large milestones this semester: system communication and system design. Different types of system communication were researched during the semester, and Bluetooth® was determined to be the most feasible. In addition, the overall system design has been chosen to include cameras as the devices that track the motion of the stylus, along with the stylus device itself. Future milestones include a finished prototype using a Raspberry Pi which incorporates the sensors and stylus. The target deadline for this milestone is the beginning of the second semester, in order to give the team ample time to design its own board for the prototype. Another milestone is finishing the pre-alpha phase of the software application by the end of interim and then further developing the software into the alpha and beta stages during the second semester. All milestones will hopefully be finished two weeks prior to Senior Design Night.

10.2 Schedule

Chad is the schedule manager; he has updated and will continue to update the schedule. The schedule is updated after each group meeting, which means it is updated twice a week. Group meetings start with each member updating their project status and presenting any ideas or research findings. Group members are required to notify the group if they are going to miss a meeting. Each group member will work an estimated eight hours per week during the fall semester and sixteen hours per week during the spring semester.

10.2.1 Schedule Management

The schedule will be managed by evaluating the team's progress and adjusting accordingly. The schedule manager will review the schedule at each meeting in order to inform the team of any upcoming deadlines and to make any needed adjustments. The work breakdown structure for the project is shown by the Gantt chart found on the team's website. The Gantt chart outlines the tasks and milestones required for the project.

10.2.2 Critical Path

The majority of the time frames on the Gantt chart are soft deadlines, meaning that tasks have at least a week of buffer.
The hard deadlines for the project include developing and debugging the prototype by May 7th, 2016, Senior Design Night. To meet this hard deadline, the system communication and the overall system design must be completed by the end of the fall semester. At the beginning of interim, the team will split into two groups to tackle the hardware and software portions of Project Brösel. The hardware group is required to finish the Raspberry Pi prototype by February 1 so that there is enough time for the team to fabricate its own printed circuit boards for the sensors and stylus. The fabrication of the sensors and stylus needs to be completed by Senior Design Night. The software group is required to finish the pre-alpha phase of the software and start the alpha phase by the end of interim.

10.3 Operational Budget

Tom is the budget manager; he keeps track of how much is spent and what it is used to purchase. The budget is managed through a spreadsheet that is accessible to everyone on the team. Every time money is spent, the spreadsheet is updated so that the team can keep a close eye on it. The plan is to first spend money solely on things that are absolutely essential. Once the essentials have been acquired, the team can look at spending money on aesthetic or non-critical items such as the case or holder for the device. When budget issues arise, the team will decide what is absolutely necessary and cut items that are not. If the budget is still overdrawn after making cuts, the team will determine whether what remains on the list to purchase is truly necessary and, if so, request additional funds; otherwise, the team will continue to make additional cuts.

10.4 Method of Approach

The team originally met several times a week in order to better define what it wanted the project to be. After fully defining the project, the team moved into the research phase.
Research is taking place in phases. The preliminary research phase involved checking whether the team's concept was sound. In the next research phase, the team will look into methods to optimize its design. The team communicates through several different channels in order to make sure that every team member is on the same page.

10.4.1 Design Methodology

The team used several steps and processes to come up with the design for the device. The first step involved researching existing products similar to the end product the team had in mind. From these existing products, the team determined what functionality was necessary to make a viable product. In the next stage, the team determined the feasibility of the different ideas by systematically eliminating those that were infeasible and further researching those that were deemed feasible. If further research proved that an idea would not work, the idea was eliminated and the next one was picked up. Once the team determined that an idea would work, it was adopted into the design. The idea was then researched even further to gain a thorough understanding of the process or technology and how to effectively integrate it into the final design.

10.4.2 Research Techniques

Research was broken down into two steps: database searches and experiments. The first technique was to gather as much information as possible from online databases about related products or systems that might benefit the project; further time was then spent researching any potential leads. The Engineering Research Databases, patent databases, and IEEE were some of the main databases used to uncover useful information related to Project Brösel. No patents were found for systems that could aid in designing Brösel; however, the research led the team to explore the three design alternatives (three-dimensional positioning, IR sensors, and camera sensors) described in Section 3.
Experiments were then conducted once design alternatives were conceptualized. The experiments enabled the team to conduct research specifically related to Project Brösel in order to determine the project's feasibility. Each experiment was composed of cycles that tested one aspect of a design; the results were then evaluated and the test parameters altered if needed. This cycle was repeated until conclusive results were obtained.

10.4.3 Team Communication

The main methods of communication between group members were, and will continue to be, twice-weekly meetings and group text messages. Group meetings start with each team member given the opportunity to present any ideas, findings, or results to the other team members without interruption. Text messages are used for communication between group members when they are apart, in order to schedule group meetings and answer any questions group members may have.

11 Conclusion

Project Brösel is a new computer interface device that will enhance presentations. Through extensive analysis, the design team has deemed the project feasible while remaining within its financial goals.

11.1 Project Feasibility

The objective of the fall semester was to determine the feasibility of Project Brösel both technically and financially.

11.1.1 Technical Aspect

The design team is currently testing various aspects of the project. Most notably, it is beginning to investigate motion tracking through image processing. The project is feasible using the cameras, and the team is confident that the project is able to meet the design requirements.

11.1.2 Cost Aspect

Price is an important aspect of the design, as a low-cost system is one of the main goals of the project and is driven by the design norm of justice. The team's goal is to create a more affordable alternative to a smart whiteboard or smart computer interface.
In the business report (see Appendix B), the team calculated a rough estimate of what the product should cost based on several factors. The price was determined to be approximately $140. This number has a lot of wiggle room considering that most smart whiteboard alternatives cost at least a thousand dollars. To achieve the goal of the design, the team would like to be well below that thousand-dollar mark, which appears to be achievable. Consult Appendix B for detailed device price calculations.

11.2 Lessons Learned

One lesson learned in this project was that it is not possible to completely and accurately sense an object's surroundings using only internal sensors. Without external sensors or points of reference, the data from the internal sensors will compound error as the system continues to collect data and make calculations based on that data. The team also learned that it is important to thoroughly analyze, research, and understand a design idea or piece of technology before deciding to implement it fully in the final design.

11.3 Acknowledgements

We would like to thank the following people for the invaluable roles they played in the work of Project Brösel:
● Faculty advisor Prof. Mark Michmerhuizen of the Calvin College Engineering Department
● Industrial advisor Mr. Eric Walstra of Gentex Corporation
● Electronics Shop Technician Mr. Chuck Holwerda, for helping the team develop circuit boards needed for testing
● Lab Manager Mr. Bob DeKraker, for helping to purchase components needed for the project
● Jacob Meyer, for his role in creating the logo
● Assistant Registrar Melisa Hubka, for providing information related to the Calvin College Engineering Department Class of 2016

12 Appendices

Appendix A: Meeting Minutes

September 16, 2015
- Discussed possible senior design ideas. Brainstormed and spoke to Professors Kim, Michmerhuizen, and Brouwer.

September 21, 2015
- Decided to pursue "SMART Board without a smart board" technology. Began initial technical conversations.

September 24, 2015
- Met in Engineering Building conference room. Continued brainstorming as to how to go about the motion tracking technology.

September 28, 2015
- Met in Engineering Building conference room. Talked about calibration more in depth. Discussed the capability to record location so that calibration would not be necessary.

October 1, 2015
- Met in Engineering Building conference room. Spent time discussing the pitfalls of using an accelerometer to track position due to error. Mentioned the idea of "pinging" between the transceiver and the device for motion tracking.

October 5, 2015
- Decided to develop an Android app to test the accelerometer-to-position theory using the sensors on a phone. Aim to have the app done in 2 weeks, by oral presentations.
- Nathan and Chad will continue to research accelerometers and the ping architecture.

October 7, 2015
- Chad researched accelerometers.
- Nathan spent time updating the minutes and set up an Asana account.
- Kegan and Tom worked out the mathematics for the Android app to be developed for testing.

October 15, 2015
- Decided that Kegan is the webmaster.
- Divided up portions of the business plan among the team (with Ryan Beezhold).
- Chad added to the WBS and Gantt chart in Microsoft Project.

October 22, 2015
- Created oral presentation.

October 26, 2015
- Received parts ordered (3 IR LEDs, 10 IR sensors (5 each of two different models), and 3 A-to-D converters).

November 2, 2015
- Initial and very surface-level analysis of components ordered.
- Created poster.

November 6, 2015
- Initial testing of components ordered (whether or not the A-to-D converter worked).
- Spoke to Fridays at Calvin students about our project.

November 9, 2015
- Divided up into groups: Nathan worked on the PPFS, Chad worked on some research, Kegan and Tom worked on software.

November 14, 2015
- Worked on website design and the PPFS.

November 19, 2015
- Preliminary testing of IR sensors. Determined that the specific sensor would not work, as voltage did not decrease linearly with distance. Additionally, the voltage range did not meet desired values at proper distances. The team is strongly considering switching to image processing for the sensors.

November 23, 2015
- Worked on code to get the A-to-D (Analog-to-Digital) converter to work.
- Continued improving the website design.

November 30, 2015
- Decided that, for the remainder of the semester, all work will be focused on writing the PPFS and doing the Engr 325 lab projects.
- Chad and Nathan are continuing tests with the IR sensors.
- Kegan and Tom are doing initial prototyping with camera sensors.
- Will meet on Thursday in pairs to work on the 325 lab projects and will meet on Saturday to write the PPFS (with the business plan portion done by then).

Appendix B: Business Plan