May 2018 - Mar 2019
SketchyMedical is creating a mobile application for users to review content, take quizzes, and study flash cards on the go. I created hand-off material for our remote iOS engineer to facilitate implementation, and went beyond replicating core features by adding visual design and gamification.
Vision for stronger community and engagement
SketchyMedical offers time-lapse videos that illustrate a scene. These scenes, or sketches, are accompanied by narration that ties crucial medical concepts to symbols. This method is highly effective for committing information to memory and accelerating knowledge acquisition.
The majority of learning no longer happens in the classroom, but on bus rides, walks, and other places outside of lecture halls. To align ourselves with this shift, SketchyMedical is creating a mobile application for users to review content, take quizzes, and study flash cards on the go.
💡 Speak to unique capabilities and context of mobile engagement and follow Apple's Human Interface Guidelines
Now that the majority of Americans own smartphones, they're connected to the internet "on the go". This creates more opportunity for people to learn new things, no matter where they are.
The Association of American Medical Colleges (AAMC) administers the Medical School Year Two Questionnaire (Y2Q) each year to all active, second-year medical students.
I combined findings from 2014 to 2018 to illustrate interesting trends in medical education. For these graphs, I also grouped together participants who chose "Often" and "Most of the time" to create a clearer "yes" and "no" visualization.
I broke down SketchyMedical's core capabilities on the web before prioritizing features. On desktop, students can watch videos, review symbols mentioned in the videos, and create quizzes based on the videos.
Due to constraints defined by the PM, the iOS app would not stream videos at launch. With data showing that the majority of users watched videos on desktop, we decided to leave streaming out of the first version. Instead, the app would let users review symbols and offer full quiz-taking capabilities so that they could effectively test themselves on mobile.
Capabilities that I would take from desktop and implement into the mobile application:
Based on feedback gathered from users and research on flash card usage in medical school, we included two new features in the mobile application: flash cards and search. I hypothesized that students would use these features frequently due to the high demand for these tools.
To give an idea of the direction I would be taking, these were some of the screens I was given from the previous designer to work off of:
I went back and created visual flows for tasks in Review, Quiz, and Flash Cards. Within each flow, I created notes for interactions, behaviors, and requirements.
To communicate with our remote iOS developer, I used a combination of Zeplin and Airtable to maintain files in one place. I organized all assets in Airtable, linking to each file in Google Drive. I also used InVision to communicate flow.
Users should be able to preview the details of a quiz while going through Review, such as their last performance, the number of correct and incorrect questions, and which video topic the quiz covers.
Our doctors had created about 10 questions for each video topic. Students must be able to view each question stem, see its status and whether they've favorited it, and review the answers.
Each side of a flash card can show an image of the symbol and/or a text description of the symbol's definition. The text can be edited by the user.
We were cutting it close to our deadline and spreading ourselves thin, so I suggested to the project manager that we pull back features. We decided to remove Symbol Search so that we could pay more attention to Review, Quiz, and Flash Cards. Version 1 was the planned implementation, while version 2 filled in the gaps left when Symbol Search was removed from the app.
Working alongside our iOS developer, I accounted for each iOS device, following the Human Interface Guidelines and discussing design differences. I included designs for iPad, iPhone X, and iPhone 8.
After successfully implementing Review, Quiz, and Flash Cards, we will prioritize bugs with our remote iOS developer using Airtable. From there, we will gather participants for user testing and plan for the second phase: expanding learning mediums to audio and video for mobile use.
To measure success we will track:
The project owner suggested that all the features be included in the initial launch of the iOS app so that we could create an incentive for new users. I pointed out that aiming to ship all the features would increase development and QA time and cost. I outlined the benefits of launching the iOS app with core features as an MVP so that we could begin testing.
Working with a remote iOS developer, I further refined asset organization and standardization for communication and implementation. This helped us keep asset sizes and formats correct.
Our consistent Zoom meetings gave me the opportunity to explain flows and reinforce them with artifacts. These conversations helped me learn more about the various constraints in iOS development.