This 45-minute, web-based instructional unit teaches basic web typography concepts to non-designers, drawing on a range of e-learning principles. By the end of the course, students will be able to choose a typeface for the headline and tagline of a landing page based on the page's brand identity and features.
E-learning course design
Methods and Tools
Backward Design, Cognitive task analysis, Think-alouds, A/B test, Qualtrics
WHY WEB TYPOGRAPHY
Good typography tells a story and communicates messages clearly. Learning the basics of web typography will help non-designers who develop their own portfolios, blogs, or web design projects. My initial research shows that current online learning resources on Lynda.com and YouTube cover typography in a generic way and do not provide worked examples of choosing a typeface for a given web page.
INTRODUCE THE COURSE
The introductory web typography course is designed to give step-by-step demonstrations on how to choose a font, grounded in data gathered from expert cognitive task analysis and novice think-alouds. To manage cognitive load, I applied scaffolding, scoping, and staging as learning principles. For scaffolding, students are led from simple topics to more complex ones. For scoping, I gave students enough information to complete the task without overloading them. For staging, I created worked examples that help students integrate knowledge step by step. I varied the worked examples to expose students to different typography scenarios in which to apply their skills.
I adopted backward design as my approach to designing the course. Backward design starts with the competencies students should be able to demonstrate, continues with iteration between assessments and goals, and ends with creating instruction that addresses the key skills. My design process involved identifying learning goals (knowledge components) through expert cognitive task analysis (CTA), designing online assessments, testing them through think-alouds with novices, designing the instructional unit, and conducting an A/B test with 20 students.
IDENTIFYING LEARNING GOALS
Given a landing page and its brand identity, students will be able to choose a legible typeface for the headline and tagline that matches the brand design.
The learning goals for this e-learning unit were refined based on cognitive task analysis (CTA) results. In CTA, the approach is to start broad and progressively narrow, specifying the internal thinking processes involved. Theoretical CTAs based on readings and online courses helped me map out key decision points, and empirical CTAs conducted with two experts helped me narrow the scope and identify the critical aspects of instruction.
From the CTAs with two experts, I found that requiring students to summarize a brand's identity could be very demanding, since doing so takes years of experience. The experts also insisted that learners stay with one typeface until they achieve mastery. Most importantly, the think-alouds were enlightening in showing how strongly the shapes featured in the brand pieces influenced the final decision.
DESIGNING ONLINE ASSESSMENTS
My initial assessment task design was based on my first iteration of learning goals. I wrote nine questions, each aligned with a knowledge component, and piloted them with two novices and one expert. Participants were encouraged to think aloud so that the goals, strategies, and knowledge influencing their solutions could also be recorded.
The expert got all the questions right but needed clarification to make the right choice on the last question, which asked for the image with the right typeface on it. The novices did not get all the questions right. Some of the questions were too easy, and novices could guess the right answer by eliminating the wrong ones; other questions gave clues in their wording. Therefore, in my second round of assessment design I clarified the questions, increased the difficulty, and broke complex questions into several steps.
Based on findings from the learner tests, students appeared to have difficulty selecting the right font for a website's headline and tagline using the five steps; they were guessing which font was appropriate. My instructional design idea was to sequence the instruction from the basics of legibility and font classification toward the higher-level goal of choosing the right font for a given website. To teach this more complex task, I paired a worked example with problem-solving practice. Students are given a worked example showing step by step how to choose a font, and are then given another problem to practice on their own. This practice problem differs from the one in the instruction to add variety, so students apply the skill in a different context.
The instruction for my course unit is in a sequence of direct instruction, worked examples, and practice provided through asynchronous online learning, resulting in a combination of memory/fluency building, induction and sense-making. It takes approximately 5 – 10 minutes to walk through the direct instruction pieces; the embedded examples and practice opportunities may take approximately another 10 minutes.
My A/B test manipulated whether students received explanatory feedback. Students in the control group received no explanation on the formative assessment, while those in the treatment group received explanatory feedback. There were four module variations, with the pre-test and post-test counterbalanced. The tests were randomly distributed through Qualtrics to 20 people, split into groups of five. Most of the respondents were recruited online through Fiverr; some were classmates of mine with a strong interest in typography.
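A minimal sketch of this counterbalanced assignment, assuming 2 feedback conditions crossed with 2 test orders to produce the four module variations (participant IDs and variation labels are illustrative; the actual randomization was handled by Qualtrics):

```python
import random

def assign_counterbalanced(participants, seed=0):
    """Randomly deal participants into 4 counterbalanced cells:
    2 feedback conditions x 2 pre/post test orders."""
    variations = [(feedback, order)
                  for feedback in ("explanatory", "none")
                  for order in ("test-A-first", "test-B-first")]
    rng = random.Random(seed)
    shuffled = participants[:]
    rng.shuffle(shuffled)
    # Deal round-robin so every variation gets an equal share.
    groups = {v: [] for v in variations}
    for i, p in enumerate(shuffled):
        groups[variations[i % len(variations)]].append(p)
    return groups

# 20 hypothetical participants -> 5 per variation
groups = assign_counterbalanced([f"P{i:02d}" for i in range(20)])
```

Dealing round-robin after a single shuffle guarantees equal cell sizes, which simple independent randomization does not.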
The results show post-test gains in all four groups. However, the t-test yielded p = 0.07, which is above the 0.05 threshold for statistical significance. Multiple factors could affect this result, such as guessing, varied prior knowledge, and a small, biased sample. It is also worth noting that my classmates spent as long as one hour and twenty minutes on the unit, while some online respondents completed it in as little as 10 minutes, which may have affected the results.
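The comparison above can be sketched as a pooled two-sample t-test on pre-to-post gain scores. The gain scores below are made up for illustration; they are not the study's data:

```python
import math
from statistics import mean, variance

# Hypothetical gain scores (post-test minus pre-test), 10 per condition.
control_gains = [1, 0, 2, 1, 0, 1, 2, 0, 1, 1]    # no explanatory feedback
treatment_gains = [2, 0, 3, 1, 2, 1, 2, 2, 1, 1]  # explanatory feedback

def pooled_t_statistic(a, b):
    """Two-sample t statistic with a pooled variance estimate."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(b) - mean(a)) / math.sqrt(sp2 * (1 / na + 1 / nb))

t = pooled_t_statistic(control_gains, treatment_gains)
# With df = 18, |t| must exceed ~2.101 for two-sided significance at 0.05.
print(round(t, 2), abs(t) > 2.101)  # prints: 1.69 False
```

As with the actual result, the treatment mean is higher but the difference does not clear the significance threshold, which is plausible with only 10 participants per condition.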
The instructor from my interaction design class helped me check my instruction slides and assessment questions. The instructor validated my design but also commented that I was trying to simplify a very nuanced subject, and there were many variables involved in the final decision-making. Moving forward, the course could include other variables such as how the lighting of the background image affects legibility and font choice. Ideally, the course could enable students to choose and compare different fonts instead of only using multiple-choice questions to scaffold learning.
In addition, the instructional design should include more practice opportunities relative to direct instruction. When I examined the A/B test data, I found that most respondents skipped the questions requiring them to type into a text box, and there was no way to know whether they had carefully read the instruction slides embedded on the platform. For the next iteration, I will interweave instruction slides with practice questions to engage students, and add self-explanation prompts so learners identify the rationale underlying each step.