Content Design and Writing

Designlab

Creating content that's helped 2,000+ people launch their design careers

Role
Team
Learning Content Designer
Product Manager
Curriculum Strategist
Graphic Designer
Learning Content Designer (2)

Designlab is a UX/UI design education company based in the US. It empowers people transitioning their careers into UX and UI design through 12- to 30-week bootcamp-style design courses.

As a distance-learning education company, almost 50% of its students and 300+ mentors are based outside of the US, ranging from working professionals to university graduates. For 2 years, I was a key contributor to the content creation and design of Designlab's core products: its UX/UI Foundations and Academy courses.
Before this role, I was a UX instructor with Designlab for 1 year. My first-hand knowledge of how the courses were delivered to students shaped my perspective on creating high-quality, engaging content.
Joining the tight-knit Learning Content Experience team, I was a key decision-maker in every part of the planning, organization, research, and creation of course content. The process carried out for each of these products was similar in structure.

Summary

For 2 years, I worked on Designlab's core products: Academy and Foundations.
Contributing to large curriculum changes and updates, I carried out multiple planning and user research initiatives in collaboration with the Learning Experience Team.

I wrote over 100 content pieces, including content outlines, design projects and quizzes. I created graphic assets to support the design team with the final versions. On each content release, I worked closely with the community engagement and marketing teams on the outreach and update messaging.
Being in a fast-paced, sprint-focused startup, we made several tradeoffs to shorten release deadlines. During planning phases, 20-30% of initially planned changes were postponed into staggered releases for a more sustainable content creation process. Some of these tasks were backlogged as long as 6 months to prioritize more essential content and general maintenance work.

When I led the Academy Capstones project, a beta validation round was cancelled to meet the release deadline: instead of launching the new content to a small group of students and mentors, it was launched to everyone. I negotiated a compromise to send follow-up surveys to the new student cohort so they could share their feedback directly with the Learning Experience Team.

Research

The first stage of the content creation process started with comprehensive research. Whether content would be created as new, or repurposed with smaller edits, this phase was essential in setting the team's priorities for the next 3-6 months.
We had access to both quantitative analytics and qualitative feedback, although the qualitative data was narrower, since feedback was optional for students.

Research Tools

Key Questions

  • What do students/mentors like about the current course?
  • What are the biggest challenges students are facing?
  • What are the biggest challenges mentors are facing?
  • What are some top suggestions based on X amount of data?
  • What information are we missing?
  • Where and how can we obtain this information?

Organizing Data

Referencing the existing data available to me, I carried out additional research by auditing the current content to evaluate its relevance to new content and to support my writing.

Content Types

  • Current live lessons and projects
  • Video tutorials
  • Knowledge check quizzes and exercises
  • In-house instructional resources (project templates, reference sheets, etc.)
  • Third-party instructional resources (links, blogs, books, etc.)

Audited Items

  • Required/optional content and duplicates
  • Video tutorials
  • Estimated content read time (minutes)
  • Total content time (minutes)
  • All instructional resources

Example Audits

Below are several examples of audits I've completed.
Screenshot of Foundations content inventory
Screenshot of Foundations resources inventory
Through affinity mapping, the team and I uncovered content pieces that caused significant student pain points. From there, I continued with comparative analysis, follow-up surveys, and interviews with targeted groups of regular mentors and students. These studies aimed to gather the detailed reasons behind the blockers in the problem content, and to open a continuous feedback loop with participants for reviewing content updates.
Academy course feedback analysis

Qualitative Process

Implementing survey and interview studies was a collaborative process. I led the research plan, proposed participant groups and scheduled them, and prepared research incentives.

My survey and interview scripts were reviewed by my product manager, while participant outreach was handled by the Mentor or Student teams. Studies spanned 2 to 6 weeks, depending on the number of participants, their schedules, and follow-up sessions.

Once the study was completed, I organized the results into a spreadsheet, using ChatGPT to create summaries of the raw, anonymized feedback data.
A typical prompt would be:
"Summarize this feedback from students on the benefits and challenges associated with the three-tiered criteria grading assessment."
These summaries supported my long-form insight reports. In addition to insights, I provided recommendations and further research opportunities based on the results.
High-level chart of Designlab's research process: a research proposal, informed by existing quantitative and qualitative data, defines the research goals, participant groups, interview script, and study incentives. Participant outreach then leads to user interviews, and a final report summarizes the interview insights.

Research Studies

1: Academy Course Capstone Research

After students complete their fundamental design lessons in Academy, they are challenged with three web and app projects, called Capstones, whose project briefs teach them skills that are "UX portfolio-worthy." These Capstones are the final leg of the course and are required for graduation.

For many months, the team observed recurring feedback that these Capstone briefs were causing students significant delays.
Over 5 weeks, I led a research initiative to survey both mentors and students and run follow-up interviews, examining complaints that the briefs were delaying course completion.

From the survey results, I reached out to 12 students and 6 mentors interested in talking through their experiences with the current Capstones format. Of this outreach, 5 students and 4 mentors agreed to longer 1-on-1 interviews.
1-on-1s
5 students / 4 mentors
Over 2 weeks, I ran 1-on-1 interviews with each student, and one-hour group interviews with the mentors.

Research Insights + Review Loop

Content Reviews
Follow-ups with 3 mentors
Based on the insights, a key change was a significant restructuring of the briefs to add clarity and support around deliverable expectations.

Over the next 3 weeks, I made key brief changes and kept a regular review loop with 3 mentors.

2: Foundations Research

The main research goal was to examine the effects of reducing the grading criteria from three tiers to two. At the time of this study, 21% of students redid their projects despite already passing. Through qualitative feedback, the team discovered that the current grading criteria left students feeling demotivated that their work wasn't up to par with the third tier.
I carried out a survey to assess the simplification of grading criteria for all design projects in the Foundations course. If this change were rolled out, it would directly affect the quality of project submissions, and thus the grading methods for graduation projects.
Qualitative Survey (anonymous)
6 students /
30 mentors /
6 colleagues
As this decision would affect both current students and new students in future cohorts, the survey was sent to three groups: current mentors (n = 30), current students (n = 6), and colleagues (n = 6). All replies were anonymous to reduce bias and encourage candour.

Research Insights

Based on the insights, the team got the green light to implement simpler assessment criteria. Additionally, I advised that the new criteria be framed carefully to increase the clarity of project requirements.
  • Simpler assessment criteria
  • Additional support for affected users