Overview

This portfolio showcases three academic projects that represent the knowledge I gained from the Human Systems Engineering program at Arizona State University. I selected these projects because they demonstrate my ability to apply the scientific method and cognitive psychology principles in the real world as a user experience (UX) designer.

The first project in this portfolio is a research proposal paper on the state of the art of UX in global and enterprise UX teams: what makes a team do UX effectively, and how successful UX teams compare to effective science teams. The goal of the paper is to use the findings from the proposed survey to reveal ways UX teams can increase the effectiveness of their team processes. This paper demonstrates my ability to apply the scientific method in research planning, including the methods, materials, and procedures I would use to carry out the research.

The second project is an R tutorial on creating wordclouds as a graphic visualization technique for analyzing large amounts of textual data. When analyzing qualitative data such as open-ended survey responses, manual analysis can be incredibly exhausting and time-consuming. Wordclouds make it easier for UX researchers like me to identify themes and patterns in textual data by rendering more frequently occurring words in larger type. Learning and practicing R programming showcases my ability to apply statistical analysis to make sense of large amounts of data.

The last academic accomplishment highlighted in this portfolio is a heuristic evaluation report of an infotainment system.
I chose to include this project in my portfolio because learning to evaluate the user experience of a user interface with inspection methods like this has helped me reveal major usability problems in design projects at work without having to carry out user research, which can be expensive.

Increasing Effectiveness of UX Team Processes

I completed this research proposal for my Methods and Tools in Applied Cognitive Science course. The paper seeks to reveal the state of the art of UX teams and to see how successful UX teams compare to effective science teams. With that knowledge, the paper aims to learn how UX teams can increase the effectiveness of their team processes.

Significance

As technology continues to evolve and change, user experience plays a vital role in enterprise companies because users demand it, which is why highly effective UX teams have become essential in larger companies. Given all of the moving parts in cross-functional UX teams, the paper discusses the key challenges UX teams face when there is no effective strategy or process in place.

Related Materials

See Appendix A.

Summary

Many experts in the field agree that great products and experiences demand "highly collaborative multidisciplinary UX teams" with a mix of different skill sets (Lund, 2011). In multidisciplinary science teams, individuals specializing in different areas of expertise combine their knowledge as they work toward a common goal. This combining of knowledge across disciplines is highly collaborative and requires a high level of participation (Cooke & Hilton, 2015). Like science teams, multidisciplinary UX teams face high task interdependence as a key challenge, which is why an effective team process with a shared understanding of each member's role and the project goals is essential (Cooke & Hilton, 2015; Lund, 2011).
Other challenges that can hinder multidisciplinary UX teams from being effective, and can lead to unsuccessful UX, include the right people not being involved when important decisions are made; unclear team roles and responsibilities; design decisions backed by weak logic or unsupported by data; and designers and developers being unable to compromise (Kuusinen, 2015). To restate the research questions in this proposal, the paper seeks to learn (a) what is the state of the art of UX as far as teaming goes; (b) how can an effective and efficient team be made to do UX; and (c) how do effective science teams compare to effective UX teams in terms of team processes, collaboration techniques, strategies, and team methodologies? The survey proposed in this paper was designed to explore these unknowns and inform interdisciplinary UX teams on increasing the effectiveness of their team processes.

References

Cooke, N. J., & Hilton, M. L. (2015). Enhancing the effectiveness of team science. The National Academies Press.

Kuusinen, K. (2015). Overcoming challenges in agile user experience work: Cross-case analysis of two large software organizations. Proceedings of the Euromicro Conference on Software Engineering and Advanced Applications (pp. 454-458).

Lund, A. (2011). User experience management: Essential skills for leading effective UX teams. Morgan Kaufmann.

Rosenberg, D., & Kumar, J. (2011). Leading global UX teams. Interactions, 18(6), 36-39. doi:10.1145/2029976.2029986

R Practicum 11 - WordCloud Tutorial

I completed this R assignment for my HSE Data Analytics course. For the R project, I created a tutorial on how to make wordclouds in R using the 'tm' (text mining) library. In addition to following the R style guidelines to write clean, "pretty" code as described by Wickham (2015), I presented this project in R Markdown to make it more user-friendly and easier to read.
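To illustrate the technique the tutorial covers, a minimal sketch of building a wordcloud with the 'tm' and 'wordcloud' packages might look like the following. The sample responses and parameter choices here are my own placeholders for illustration, not material from the original assignment:

```r
# Minimal wordcloud sketch using the 'tm' and 'wordcloud' packages.
# The survey responses below are made-up placeholders.
library(tm)
library(wordcloud)

responses <- c(
  "The checkout flow was confusing and slow",
  "Search results load slowly and feel confusing",
  "I liked the search but the checkout was slow"
)

# Build a corpus and clean it: lowercase, strip punctuation and stopwords.
corpus <- VCorpus(VectorSource(responses))
corpus <- tm_map(corpus, content_transformer(tolower))
corpus <- tm_map(corpus, removePunctuation)
corpus <- tm_map(corpus, removeWords, stopwords("english"))
corpus <- tm_map(corpus, stripWhitespace)

# Tally word frequencies across all responses.
tdm   <- TermDocumentMatrix(corpus)
freqs <- sort(rowSums(as.matrix(tdm)), decreasing = TRUE)

# Draw the cloud: more frequent words render larger.
set.seed(42)  # fixed seed for a reproducible layout
wordcloud(words = names(freqs), freq = freqs, min.freq = 1,
          colors = brewer.pal(8, "Dark2"))
```

The same cleaned term-frequency table (`freqs`) can also be inspected directly, which is often a useful sanity check before drawing the cloud.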
Significance

Wordclouds are especially useful in UX research for analyzing large amounts of textual data, such as qualitative survey responses and user interview transcripts. Being able to quickly organize qualitative data in R to find key patterns and themes benefits me as a UX designer because it removes much of the manual work and saves me time analyzing and synthesizing data.

Related Materials

See Appendix B.

Summary

In this R assignment, I learned how to do text mining in R to produce a wordcloud, a visual representation that displays the most frequently used words in sizes proportional to their frequency (DePaolo & Wilkinson, 2014). R is a free, open-source coding language used for statistical computing, predictive analytics, and data visualization. It is a practical and useful tool for understanding data in a way that is easy to communicate to others (DePaolo & Wilkinson, 2014; Kabacoff, 2011). Text mining in R can be helpful in UX research for cleaning and organizing textual data, sentiment analysis, analyzing word frequencies to identify patterns and themes (e.g., wordclouds), and analyzing the relationships between words. As a UX researcher and designer, I find analyzing and synthesizing qualitative data from surveys, focus groups, user interviews, and usability testing sessions incredibly time-consuming and labor-intensive. One way text mining with wordclouds helps me make sense of data is as a screening tool: it surfaces the key concepts and themes up front to hone in on before carrying out a more detailed investigation, which has been a huge time-saver for me.

References

DePaolo, C., & Wilkinson, K. (2014). Get your head into the clouds: Using word clouds for analyzing qualitative assessment data. TechTrends, 58, 38-44.

Kabacoff, R. I. (2011). R in action: Data analysis and graphics with R. Shelter Island, NY: Manning Publications.

Wickham, H. (2015). Advanced R.
Boca Raton, FL: CRC Press, pp. 440-444.

Heuristic Evaluation of a Hyundai Infotainment System

For the third project in this portfolio, I chose to include a heuristic evaluation (HE) report that I completed for my Human Systems Engineering Methods course. As part of this class project, I conducted a heuristic evaluation to assess the usability and design of a Hyundai infotainment system, identifying usability problems in the radio component by judging the system against Nielsen's ten usability heuristics.

Significance

This project is significant because it taught me a practical and valuable skill that I have applied to every work project since. Heuristic evaluations are a quick and easy way to reveal key insights and usability issues in products and user interfaces early in the product lifecycle. Although they do not replace the need for user testing, they provide a baseline for designers and developers by exposing major usability issues up front that could be catastrophic to the user experience.

Related Materials

See Appendix C.

Summary

The usability assessment presented in this project focuses only on the radio component of the infotainment system, which I reflect in the research findings and recommendations. For the evaluation, I used Jakob Nielsen's ten usability principles, rules of thumb that both human factors experts and non-usability experts can use to take a methodical look at an interface, and I rated the usability problems I identified using Nielsen's severity scale. When Nielsen and Molich conducted their 1990 study of the heuristic evaluation method, people rarely conducted empirical usability research due to a general lack of interest, time, and knowledge of how to do it. In their research article, they describe a heuristic evaluation as an attempt to form an opinion about what is good and bad about an interface by judging it against a set of best-practice guidelines (Nielsen & Molich, 1990).
It is important to keep in mind that heuristic evaluations should be used as a supplement to empirical research methods, since they lack insights from real users engaging with an interface in a real context and therefore do not on their own provide enough user data to drive design decisions (Khajouei, Gohari, & Mirzaee, 2018; Nielsen & Molich, 1990). This shortcoming was evident in Nielsen and Molich's study, in which even the best-performing experiment found only half of the usability problems (Nielsen & Molich, 1990). In the field of UX, I frequently use heuristic evaluations to analyze existing products, see how well they work, and assess where they can be improved. The process involves walking through a website or application, documenting what works well and what requires improvement or a full redesign, and then writing recommendations for improvement. My knowledge of heuristic evaluations is valuable in my role as a UX designer because the method is a quick, low-cost way to spot major usability issues and improve the user experience.

References

Khajouei, R., Gohari, S. H., & Mirzaee, M. (2018). Comparison of two heuristic evaluation methods for evaluating the usability of health information systems. Journal of Biomedical Informatics, 80.

Nielsen, J., & Molich, R. (1990). Heuristic evaluation of user interfaces. Proceedings of the ACM CHI '90 Conference (Seattle, WA, April 1-5), 249-256.

Reflections

One of the main reasons I invested in going back to school for my master's degree in human systems engineering was to become a valuable resource to my colleagues as an expert in human factors and user experience. Looking back on my experience in graduate school, I accomplished what I set out to do: become an expert in the field of UX and apply the knowledge and skills I gained from the program to my career as a user experience designer. Before beginning the program, I was a self-taught UX designer with no professional field or lab training in the UX industry.
When I came on as a UX designer at Insight, it was a UI (user interface) team of one, and there was no UX process in place. After I joined the team, I saw an opportunity to create a scientific team process for UX, and I wanted to be the one to do it. But before I could be confident in my expertise to lead the team, I needed to grow my knowledge and skills in the field through formal training in human factors engineering. The three projects I chose to include in this portfolio demonstrate the knowledge and skills I gained in this program, which I apply in my everyday job to create user-driven design solutions. The Human Systems Engineering program at ASU has expanded my knowledge of user experience methodologies by teaching me how to apply the scientific method and cognitive psychology principles in my current role at Insight.