Usability & Reflection
How has the testing impacted your alignment of outcomes, activities, and assessment?
The usability testing process has had a significant impact on aligning my course outcomes, activities, and assessments. One key insight I gained was the importance of making the course accessible and manageable for participants. The length of the survey, along with the requirement to have a Google account, discouraged participation, which highlighted the need for me to streamline both my assessments and the tools used for accessing the course. As I rely heavily on Google Classroom, I need to ensure that activities and assessments are clear and easy to access for all learners, especially considering their varying technical abilities. This has prompted me to make the assessments shorter and more focused on achieving specific outcomes while remaining flexible with how participants can engage with the course materials.
​
Additionally, the feedback I received through the usability testing emphasized the need for a more structured approach to integrating content, such as incorporating explicit grammar instruction throughout the course. I have since adjusted my unit plans to better align with the course outcomes by adding grammar components and extending the course length to allow for deeper engagement. This refinement of activities and assessments ensures that students are not only able to meet the learning objectives but also apply their knowledge in practical and meaningful ways, such as through projects and writing assignments. Ultimately, the testing has helped me create a more cohesive and accessible learning experience that better aligns with my desired outcomes.
Survey
Reflect on who you were able to have conduct the usability testing. Were you able to get the right people? Why or why not? What can you do to improve this in the future?
After completing the blended learning course in Google Classroom, I conducted usability testing to gather feedback from Spanish teachers in my diocese. I posted a request for feedback in our existing WhatsApp group and offered to share the course materials I had developed. Unfortunately, none of the teachers from my diocese participated in the survey, and given their busy schedules, I did not expect anyone to create a video as an alternative. As a result, I relied heavily on the survey itself and on teachers from my previous school for feedback.
​
Although I reached out to other teacher colleagues for assistance, their responses were similar. Many were occupied with end-of-quarter preparations or had family commitments, leaving little time for additional work. Despite this, a few individuals managed to complete the survey, though not as many as I had hoped.
​
Fortunately, several of my peers from the 5318 course agreed to participate in the usability testing during our final class session. However, of the four who started the survey, only two completed it.
​
In terms of improving my ability to gather feedback from stakeholders in the future, I find it challenging due to my unique situation. I work at a private school where I am the only Spanish teacher for grades 2-8, which is also true for many schools in the diocese. The stakeholders I need to engage are spread throughout the Galveston-Houston Diocese and face similar difficulties, often being the sole Spanish teacher at their schools. They are responsible for multiple grade levels and navigate the challenges of teaching without substantial support, so adding extra work to their already large workload was not feasible.
​
Although participation in my survey was lower than expected, I was able to extract some useful insights from the responses. The assignment guidelines indicated that the usability test should take between 30 and 45 minutes. However, I created a comprehensive survey that, according to the platform I used, would take around 55 minutes to complete. While my work met the requirements of the assignment, I believe the length of the survey may have discouraged participants. My survey platform showed 13 views and 4 starts, but only 2 completions.
​
Given these disappointing results, I reassessed my approach and devised a new plan. I reached out to every teacher I had previously worked with, some of whom are personal friends, and earnestly requested their participation in the usability testing.
​
Moving forward, I recognize the need to design shorter, more accessible surveys and to build more consistent communication channels for future feedback efforts.
Survey Results
What impact did your platform (LMS, Google Docs, or other digital sharing) have on the testing and results?
The platform I used for usability testing, Google Classroom, had a significant positive impact on both the testing process and the results. The participants who reviewed my work noted that the program was easy to navigate and appreciated the ability to access a complete unit, which included worksheets, videos, and assessments, all in one centralized location. This feedback highlighted one of the key strengths of using Google Classroom: its ability to house various resources in an organized and accessible format.
​
Reflecting on this, I agree with the reviewers’ assessment. Prior to this experience, I had not considered creating comprehensive unit plans in my personal Google Classroom, but now I see the value in doing so. By building units for each topic I teach and storing them in Google Classroom, I can create a repository of materials that is both accessible and adaptable, regardless of where I work in the future. This course has shown me the importance of having a structured and fully integrated digital space for lesson planning, and Google Classroom has proven to be a highly effective platform for this purpose.
What were the lessons you learned from the usability testing feedback?
First, I realized that not everyone has a Google account, which posed a significant barrier to accessing the course in Google Classroom. Requiring a Google account inadvertently restricted access, making the course less accessible to those who may not have one or do not wish to create one. This was an oversight on my part, as I should have ensured that the course could be accessed by a wider audience, regardless of their preferred platform.

Second, I learned that the survey I designed was too lengthy, which likely deterred participants from completing it. Although the survey instructions requested a video review if the participant had time, it became clear that everyone I reached out to was already busy with their own professional and personal commitments. Given the limited time they had, completing a long survey, let alone creating a video review, was simply unrealistic.

In the future, I will focus on making the course more accessible and ensure that the surveys are shorter and more manageable, allowing participants to provide feedback without feeling overwhelmed.
What have you done to your design to address the usability issues revealed in the testing? What have you added or taken away?
To address the usability issues, I streamlined the instructions and asked participants only to complete the survey. I also realized that only people with Google accounts can access my Google Classroom; if I want this course to be accessible to everyone, I will need to find another platform to house it. However, since I currently use Google Classroom at my school and it remains an essential tool for my teaching, switching platforms is not feasible at this time. Moving forward, I will look into ways to make the course more accessible, perhaps by offering materials in multiple formats outside of Google Classroom, while still using the platform for my primary instruction.
How has this process improved your course and your learner's experience?
This process has significantly improved my course and enhanced the learner experience, albeit with some limitations. While I would have preferred a greater number of reviewers to better gauge the overall effectiveness of my course, I have conducted multiple self-reviews and identified several areas for improvement. This introspective evaluation has been instrumental in refining my course content and structure.
​
I recognize that this is an ongoing process; I plan to continuously add to and modify the course materials as I receive further feedback and reflect on my own teaching practices. My goal is to develop a more comprehensive course by creating my own handouts, rather than relying on resources from Teachers Pay Teachers or using worksheets from my former district. By tailoring materials specifically to the needs of my students, I aim to enhance their learning experience and ensure that the course is as effective and engaging as possible. Ultimately, this iterative approach will allow me to create a more robust and meaningful educational experience for my learners.
How will you address the infrastructure, system, and support needs and issues the learner may face?
Addressing the infrastructure, system, and support needs that learners may face is crucial in my role at a small private Catholic school, where the technological infrastructure is often lacking. Frequent difficulties with internet connectivity necessitate a blended approach to teaching. Consequently, I design my lessons to incorporate both digital and non-digital resources, ensuring that if the internet goes down, we can continue our work using worksheets and other activities that do not rely on technology.
​
Given that my learners range in age from 8 to 13 years old, their technical abilities vary significantly, with some students having limited computer skills while others are quite computer-savvy. To accommodate this diversity, I implement differentiated instruction tailored to the varying levels of technological proficiency. This includes providing clear, step-by-step guidance on logging into Google Classroom and navigating other websites used for assignments. By offering personalized support and resources, I aim to create an inclusive learning environment that addresses the challenges posed by our infrastructure while ensuring that all students can effectively engage with the course content.
Adjustments Made
To enhance participation in the usability testing, I made significant adjustments to the survey instructions to better engage potential reviewers. By clarifying the purpose of the survey and emphasizing the value of their feedback, I aimed to entice more educators to complete it. Additionally, I simplified the language and structure of the instructions, making it easier for participants to understand what was required of them and how their input would contribute to the course’s improvement.
​
Furthermore, I have continued to refine the course content by incorporating elements that were previously overlooked. A notable addition is the explicit grammar component, which I have integrated into each section of the course. This adjustment ensures that learners receive targeted instruction on grammatical concepts, thereby enhancing their overall language proficiency.
​
In addition to the grammar component, I am actively working on including projects and a writing component, which will encourage students to apply their knowledge in practical contexts and develop their writing skills. However, to effectively integrate these additional elements, I anticipate needing to extend the duration of the course from the original 9 weeks to 12 weeks or possibly more. This adjustment will provide ample time for students to engage deeply with the material, complete the new assignments, and demonstrate their understanding through various assessments.
​
Overall, these adjustments aim to create a more comprehensive and effective learning experience for my students, ensuring that they not only grasp the core content but also develop essential language skills in a supportive environment.