Innovative Teaching through Community Pharmacy Simulation (MyDispense)

Clark Kebodeaux, University of Kentucky College of Pharmacy
Marcus Ferrone, University of California, San Francisco, School of Pharmacy
Jill Fitzgerald, University of Connecticut School of Pharmacy
Lisa Holle, University of Connecticut School of Pharmacy
Tina Brock, Monash University, Faculty of Pharmacy and Pharmaceutical Sciences
Keith Sewell, Monash University, Faculty of Pharmacy and Pharmaceutical Sciences

In 2011, the Faculty of Pharmacy and Pharmaceutical Sciences at Monash University in Melbourne, Australia, introduced MyDispense, a web-based community pharmacy simulation program that promotes active, person-centered learning and gives students repeated opportunities to achieve established learning objectives related to dispensing skills critical to the medication-use process.1 This innovative, simulated learning environment allows students to practice the skills necessary to accurately and safely dispense medications in the community pharmacy setting. Although initially designed for students seeking licensure to practice pharmacy in Australia, MyDispense has been expanded to mimic pharmacy practice around the world. For example, 16 schools and colleges of pharmacy in the United States and 6 schools in the United Kingdom currently run an instance of MyDispense to support students in their curricula. To date, over 300,000 exercises have been completed globally. Many schools of pharmacy focus their efforts on students without prior pharmacy experience and use MyDispense to simulate the medication-use process in the community pharmacy setting.

The Australian MyDispense software has been successfully adapted internationally, with instances in Australia, the United States, and the United Kingdom. MyDispense is also used in other countries in Europe, Asia, and Africa, though some instances are still under development. The ability to create varied exercises using similar or repeated patient names, addresses, and multiple products gives faculty significant creativity in implementation while providing students multiple opportunities to make errors and learn the process.

These configuration options have led to successful adoption of MyDispense by schools of pharmacy worldwide. Despite different medications, legal standards, and even dispensing processes among countries, the MyDispense platform has proved viable for its end users. Its adaptability to contrasting practices of pharmacy on different continents demonstrates its potential rollout to any interested school of pharmacy and is a testament to its novel design and continual maintenance by Monash University. Monash University has long been committed to building communities that support pharmacy education worldwide, a value ultimately demonstrated by its offering of MyDispense free of charge to any university wishing to implement the software in its academic curriculum.

Individual school and collaborative research projects among several schools aim to understand the effect of this educational technology on educational outcomes.2 When implementing any new technology in the classroom, faculty face multiple challenges. Published research on implementation has highlighted the need for appropriate student orientation and tutorials. Successful implementation was noted when the tool was aggressively promoted and showcased, allowing students to see the value of longitudinal, curricular implementation.2

Faculty continue this model of collaboration by using PharmAcademy, a global, collaborative website developed by Monash University that allows the sharing of resources to facilitate adoption of this technology. Users of PharmAcademy can download, use, or adapt existing MyDispense exercises for their curriculum, as well as share their own.

Interested in adapting MyDispense for your university? Please contact for more information!


1) McDowell J, Styles K, Sewell K, et al. A Simulated Learning Environment for Teaching Medicine Dispensing Skills. Am J Pharm Educ. 2016;80(1):11.

2) Ferrone M, Kebodeaux C, Fitzgerald J, Holle L. Implementation of a virtual dispensing simulator to support US pharmacy education. Curr Pharm Teach Learn; published online March 23, 2017.


Thoughts on Multimedia Design Principles

Adam Pate, PharmD, BCPS
University of Louisiana at Monroe

Where did you learn to make a PowerPoint? Funny question, right? For most of us, it has been trial and error. Although active learning is becoming more popular, many faculty members still rely on PowerPoint and may not be comfortable with adopting active learning. Is there a way to transition to active learning and make PowerPoint relevant again, or dare we say even useful in teaching? A look at multimedia learning and multimedia design principles gives us hope.

Most faculty members have little to no training in multimedia design and cognitive load management; they simply make PowerPoints as best they can. Unfortunately, applying written-document principles (bullets and words) to a multimedia platform like PowerPoint often results in death by PowerPoint and classroom disengagement.

Contrast this with a PowerPoint that adheres to multimedia design principles (MMDP): current evidence suggests that applying MMDP can improve both retention and application of material.1 Additionally, both short- and long-term transfer and retention of material improved in medical students when PowerPoints adhered to multimedia design principles.2 Limited evidence in pharmacy education also suggests students are comfortable with this PowerPoint design, feel confident in their learning from it, and may perform better on exam material when lectured with a PowerPoint adhering to MMDP rather than a “traditional” PowerPoint.3 Looking at this evidence, one cannot help but wonder whether PowerPoint is not so bad; we have just been using it wrong.

Multimedia design applies not only to PowerPoint but to any multimedia platform. Once you start learning about multimedia design, you will quickly realize the extent of its application and understand why the Association of American Medical Colleges endorses it. This sounds great, but if you are thinking about using it, where do you start?

Richard Mayer is considered the father of multimedia design principles, so start by understanding his cognitive theory of multimedia learning, which provides the theory behind the practice.4 Next, read Mayer’s nine methods to reduce cognitive load and see which ones make sense to you.5 Then get started changing your multimedia. If it is a PowerPoint, it may be easiest to go slide by slide and consider the main goal you are trying to convey with each slide. Then write a newspaper-style headline at the top to orient your audience and find 2 to 4 pictures to illustrate what you are talking about. The idea is that your old bullet points become spoken words, and relevant pictures replace them on the slide. Be prepared for students to feel very uncomfortable without having “everything they need to know” in words on a slide. Help them through this by preparing them beforehand and sharing the science behind this change and how it should help them learn. This may increase acceptance and lets them know you are only trying to help. Admittedly, change like this takes a great deal of time and effort, but in the end it will be worth it.


  1. Issa N, Schuller M, Santacaterina S, et al. Applying multimedia design principles enhances learning in medical education. Med Educ. 2011;45:818-826.
  2. Issa N, Mayer R, Schuller M, et al. Teaching for understanding in medical classrooms using multimedia design principles. Med Educ. 2013;47:388-396.
  3. Pate A, Posey S. Effects of applying multimedia design principles in powerpoint lecture redesign. Curr Pharm Teach Learn. 2016;8(2):235-239. 
  4. Mayer RE. Multimedia learning. 2nd ed. New York: Cambridge University Press; 2009.
  5. Mayer RE, Moreno R. Nine ways to reduce cognitive load in multimedia learning. Educ Psychol. 2003;38(1):43-52.

Literature Evaluation via Online Group Quiz


Sharon K. Park, Pharm.D., BCPS, Notre Dame of Maryland University

Courses designed to teach how to appropriately evaluate biomedical literature are absolutely necessary in pharmacy education. However, one of the most challenging parts of teaching and assessing literature evaluation skills in a large classroom setting is that instructors have a hard time determining how much the students actually learned versus what they merely seem to know. The true level of students’ knowledge and skills does not become apparent until they are on their clinical rotations, attempting a journal club alone for the first time.

As with other educational approaches, an effective method of teaching this skill set is to increase the amount of time students spend actively verbalizing their thought processes. However, unless the course is tied to a skills lab, journal-club-like activities are time-consuming and resource-intensive to run in the didactic setting. One way to solve this problem is to have students talk to one another and share their knowledge in small groups, against the clock and in competition with other groups.

In a literature evaluation course with about 54 students, 13 groups were formed based on the students’ preferences. Each week for 5 consecutive weeks, the students were assigned a clinical trial to read and prepared for a group quiz consisting of 10 four-option, multiple-choice questions delivered via Quizizz. Several free online quiz platforms are available, including Kahoot!, Quizlet Live, and Quizalize, to name a few. Reviews comparing these platforms exist on the Internet, and each has its own advantageous and unique features. Instructors now have a variety of platforms to choose from, whether for a small survey or a full-scale assessment. An example screenshot of one of the quizzes, delivered via Quizizz, is shown here.

[Screenshot: example group quiz in Quizizz]

Student groups were asked to sit together and register with the quiz number to begin the quiz on one student’s computer. As they entered their answers, the Quizizz platform recorded their progress, including correct and incorrect responses, ranking, and speed. The time allowed can be controlled for each question based on its difficulty. Groups are also ranked on the time spent answering; a majority of the groups spent less than 10 minutes answering the 10 questions, even with the study articles available to them during the quiz.

Based on a short survey asking whether the students perceived the group quiz as helpful for improving their skills and whether it should continue to be offered, an overwhelming majority agreed that the activity was helpful and should continue in the future.

When using these online platforms for either group or individual assessment, it is important to ensure that the questions are simple enough to be answered within a reasonable amount of time. In addition, the amount of time given to each group and each quiz should be consistent to deliver a fair assessment; some of the platforms (including Quizizz) do not limit or cut off the activity when the allotted time expires. The number of students in each group should also be no more than three in order to increase meaningful participation among all group members.

Assessment Related Tools and Technologies Webinar

As educators, we must always remember that the objective of education is learning, not teaching, and that teaching and learning are not synonymous. Simply because information was taught does not guarantee that students learned it, or what they learned. So, how do we ensure that learning happened?

To ensure learning occurred, defining learning outcomes for what students are intended to learn is the first step. Learning outcomes are not simply a list of the topics to be covered in the course. They are statements of measurable and observable knowledge, attitudes, and skills learners will achieve. At each institution, learning outcomes are defined for the entire curriculum, for a course, and for each lecture or class session. These learning outcomes are followed by assessment of learning. There is no true way to know if students achieved the intended learning outcomes without assessment.

ACPE recognizes this in Standards 2016 with an entire section devoted to assessment and a standard pertaining to assessment of educational outcomes. In Standard 24, ACPE requires implementation of a plan to assess achievement of educational outcomes through systematic, valid, and reliable knowledge-based and performance-based formative and summative assessments. This is the assessment of the measurable and observable learning objectives for the curriculum, course, and lecture; it is not the same as course grades. ACPE specifies the inclusion of outcome data from assessments summarizing overall achievement of learning objectives by individual students and the aggregate of students, along with tools utilized to capture students’ reflections on personal/professional growth and development. Luckily, a growing number of educational technologies make this task easier.


Wallace Marsh, MBA, PhD

Electronic testing has several advantages over paper-and-pencil exams. First, it allows an instructor to ‘assess’ the assessment, determining the overall reliability and validity of the examination and of the individual questions within it. Second, where tagging of questions to content is permissible, it allows assessment of programmatic, curricular, and individual course objectives and partial mapping of a curriculum. There are many electronic testing products, such as ExamSoft®, Questionmark Perception®, and an assortment of exam/quiz features within learning management systems. Two of the main challenges with tagging question content are securing faculty participation and ensuring the reliability of what is tagged. Options to address these issues include faculty training, periodic reminders to tag content, providing a guidebook for tagging the content, and instituting a coding-reliability check process.


Margarita DiVall, PharmD, MEd

Rubrics improve the validity and reliability of student ratings during performance-based assessments or evaluation of writing and project work. Best practices for rubric development should be followed to ensure evaluation criteria are tied to the learning outcomes for a given activity. E-rubrics enable reporting not only of grades but of achievement of specific competencies, and allow faculty and students to easily identify areas of strength and opportunities for improvement. Multiple platforms exist for e-rubrics, including learning management systems (e.g., Blackboard), experiential management systems (e.g., E-Value), and ExamSoft. When selecting a platform, consideration should be given to the ability to map rubrics to learning outcomes of interest, reporting options, and the ability to aggregate data across multiple assignments in a course and across many courses in the curriculum. Additional considerations include grading options such as multiple graders, self-assessment, and peer assessment.

An example of e-rubric implementation in a skills lab course at Northeastern University using the ExamSoft platform was discussed, with emphasis on the importance of validating mapping, using assessment data to improve the curriculum, and closing the assessment loop.

Audience Response Software (i.e., clickers)

All presenters

Audience response software allows faculty to poll students during synchronous or asynchronous instruction and can be used as a formative assessment and active learning strategy. The literature has shown that polling technologies improve student engagement, motivation, and learning outcomes. Various products are available, offering web- and cloud-based interfaces for instructors and students. When selecting a specific product, consider ease of use, cost, the types of questions available, and integration with learning management systems. Presenters discussed their experience with several polling software products.

To watch the recorded webinar, follow this link!


Creating an Active Learning Space

Elaine L. Demps, PhD.                                                      

Director of Instructional Design and Support Services

Texas A&M University Irma Lerma Rangel College of Pharmacy  

The Texas A&M University Irma Lerma Rangel College of Pharmacy is 10 years young. Even so, from early on, we began yearning for a space that was more conducive to teaching small groups. That space would nicely complement our three large lecture halls and would be used for our weekly integrated pharmacotherapy rounds and recitations. This desire became a reality in 2013 at our second campus in College Station, where we planned to admit a first cohort of 30+ students in fall 2014.

We began the needs assessment by asking the faculty and students who were experiencing the integrated pharmacotherapy rounds and recitations in a large lecture hall at our first campus in Kingsville: What works? What doesn’t work? We also looked for exemplars and came across Indiana University’s collaborative learning studio, the SCALE-UP Project, and TILE. We even stumbled upon a then newly published special issue of New Directions for Teaching and Learning that was dedicated to active learning spaces.1 And we came across furniture makers who specialized in products that facilitated collaboration, e.g., Steelcase, Spectrum, and Teknion.

The pictures below show the before and after of the renovation. Each small-group station is a D-shaped table that seats seven students. We installed two 32” displays at the end of the table, four HDMI inputs for laptops, eight power plug outlets, seven data jacks, one camera, one gooseneck microphone controlled by four push-to-talk buttons, one ceiling speaker above each station, and seven headphone jacks with individual volume controls. Because our curriculum would be delivered to both campuses synchronously, each station had to be integrated into the videoconferencing system. But, by installing an audiovisual bridge, we planned to use the same equipment, e.g., the videoconference camera and microphone at a station, for web conferencing. That way, clinical faculty located anywhere in Texas could join a small group using Zoom or WebEx and facilitate that group’s work while the entire class was connected to our first campus by the Cisco TelePresence videoconferencing system. By the way, the headphone jacks were installed for that reason—to prevent cross talk when a small group was communicating with a distance facilitator.

The empty space in the center of the room was intentional. We needed this classroom to be dual purpose: to serve as a lecture-style room and as an active learning space. To serve that need, we purchased chairs equipped with work surface tablets that could be moved to the back when the chairs were placed at the stations or brought to the front when the chairs were placed in the center of the room.

We completed the renovation in fall 2015. During the 2015-16 academic year, we used it as an active learning space only. This year, now with three cohorts at our second campus, the room is also being used for lectures. So, what has been the feedback? As an active learning space, it indeed fulfills all the items on our initial wish list. The key point I took away from that special journal issue on active learning spaces was that spaces don’t dictate behavior but do influence it. I think that’s true. This room is indeed conducive to small-group teaching and learning. However, when configured for lectures, the room is less than optimal. The primary reason seems to be that the equipment and electronics are ideal for small groups located at the stations. For example, for safety reasons, we chose not to install power outlets in floor cores in the middle of the room, so students have no way to charge their laptops when the room is configured in lecture-style mode. Also, using the ceiling microphones, rather than the push-to-talk microphones at the stations, yields less effective communication.

We currently don’t have an active learning space at our first campus. I hope we get to add one. That would be awesome.


1. Baepler P, Brooks DC, Walker JD, editors. Active learning spaces. New Directions for Teaching and Learning. 2014 Spring; 137:1-98. doi: 10.1002/tl.20080.





An electronic immediate feedback assessment technique for use in Team-Based Learning

Gary D. Theilman, Pharm.D., University of Mississippi School of Pharmacy


A “Preventive Medicine and Public Health” course in the spring of the third professional year has been taught for the past 5 years using the Team-based Learning process.

An overview of the Team-based Learning process. The project was developed for use during the “Group Test” portion.



The students begin the session by completing a short (10-question) multiple-choice Readiness Assurance test on the pre-class reading. Immediately after taking the Individual Test, students break into teams and retake the same quiz as the Group Test. While they are still not allowed to use books or notes, they may now work together to reach a “consensus” answer to each question.

Traditionally, the students have used Immediate Feedback Assessment Technique (IF-AT) scratch-off cards during the Group Test. These cards are similar to instant lottery tickets: the test is set up so that the correct answer for each question corresponds to the location of a printed star under one of the scratch-off squares.


An IF-AT card showing an incorrect guess for question 2. If the star is not under the first box scratched off, the team makes a second choice, continuing until they discover the correct answer. Each incorrect choice costs them points. Naturally, the test must be designed so that the correct answer corresponds to the pre-printed location of the star. Vendors provide different versions of the scratch-off cards so that students can’t just memorize the correct answers from previous weeks.


By the end of the test, the students have found the correct answers to all the questions. The instructor collects the cards and assigns a test score based on how many boxes were scratched off.
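For the curious, this card-based scoring can be sketched in a few lines of Python. The declining-credit scheme below (full credit on the first scratch, halving with each additional scratch) is one common IF-AT convention, not necessarily the exact weights used in this course:

```python
def question_score(attempts: int, max_points: float = 4.0) -> float:
    """Score one question from the number of boxes scratched off.

    Assumes a common IF-AT scheme: full credit on the first scratch,
    with each additional scratch halving the remaining credit
    (4, 2, 1, 0.5 points for a four-option question).
    """
    if attempts < 1:
        raise ValueError("at least one box must be scratched")
    return max_points / (2 ** (attempts - 1))


def team_score(attempts_per_question) -> float:
    """Total score for a card: the sum of the per-question scores."""
    return sum(question_score(a) for a in attempts_per_question)


# A team that answered questions 1 and 3 on the first scratch
# and needed two scratches for question 2:
print(team_score([1, 2, 1]))  # 4 + 2 + 4 = 10.0
```

Grading a whole class is then just applying `team_score` to each collected card.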

We had used the IF-AT cards for several years, but we had reached the point where we had run out and needed to obtain more.

At that time, the minimum order for the cards was a pack of 500 (about $85). These 500 cards are all identical, with the stars in the same locations on each card. To prevent students from memorizing the locations of the stars from week to week, a different card layout must be used each week. This may necessitate purchasing 4-5 different packs of 500 cards and rotating which card set is used each week. Our class had only 10 teams; for the 9 weeks of team tests, we would use just 90 cards. However, we might need to purchase 2,000-2,500 cards in order to have 4-5 different layouts available.

Another problem is trying to force the test choices to conform to the pre-printed locations of the stars on the card. For example, consider a question with a logical order of answer choices:

What is the usual value of a year of life used for insurance company estimates?

A.  $25,000

B.  $50,000

C.  $75,000

D.  $100,000

One must either find an item number on the card where the star is printed under “B” or rearrange the choices to match where the star is printed. A similar situation occurs with questions where the correct answer is

D.  All of the above

One must find an item on the card where the star is under “D”.   If there are no more “D is correct” items on the preprinted card, the question must be modified.

The TBL model calls for the instructor to assess which questions gave the class trouble and to spend a few minutes reviewing the associated concepts (the Instructor Feedback stage). The simplest solution would be to collect the cards from the teams and glance over them quickly to see which questions had multiple scratches. This is not difficult with small classes, but one year we had a class with 42 teams. “Glancing quickly” through 42 cards to determine which questions gave students problems takes a bit more effort.

The objectives of this project were to develop a process that would

  • Allow the instructor to design the test without having to force the answers to conform with the pre-printed IF-AT cards.
  • Avoid the need to purchase hundreds of cards that might never be used.
  • Better assess the level of student difficulty with each question in order to tailor the “Instructor Feedback” portion to better meet student needs.

A web-based application was created that allowed students to electronically “scratch off” choices in the same way they would with a physical IF-AT card.

Before the class, the instructor would create a test using whatever questions and item order were desired. The correct answers to each question were entered into an online database.
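As a rough illustration, the answer store behind such an application can be as simple as one keyed table. The sketch below uses Python’s built-in sqlite3 module; the table and column names are hypothetical, since the article does not describe the actual schema:

```python
import sqlite3

# Hypothetical answer store: one row per (test, question) pair.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE answer_key (
                    test_id   TEXT,
                    question  INTEGER,
                    correct   TEXT,           -- e.g. 'B'
                    PRIMARY KEY (test_id, question))""")

# The instructor enters the correct answer for each question.
key = [("pm_week3", 1, "B"), ("pm_week3", 2, "D")]
conn.executemany("INSERT INTO answer_key VALUES (?, ?, ?)", key)


def check_scratch(test_id: str, question: int, choice: str) -> bool:
    """Return True (checkmark) if the scratched choice is correct."""
    row = conn.execute(
        "SELECT correct FROM answer_key WHERE test_id = ? AND question = ?",
        (test_id, question)).fetchone()
    return row is not None and row[0] == choice
```

A request handler for each “scratch” would then call `check_scratch` and return a checkmark or an X for the client page to animate.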









Another webpage was written to generate slips of paper with a shortened URL and a QR code.   These slips of paper were distributed to the student teams at the beginning of the team test.
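Generating the per-team slips amounts to minting one unguessable token per team. A minimal sketch in Python follows, assuming a placeholder base URL (the actual URL shortener and QR-code generator used are not described in the article):

```python
import secrets


def make_team_slips(n_teams: int, base: str = "https://example.edu/ifat"):
    """Generate one hard-to-guess short URL per team.

    The base URL is a placeholder for illustration. The random token
    is the part that would be encoded into each slip's QR code.
    """
    slips = []
    for team in range(1, n_teams + 1):
        token = secrets.token_urlsafe(6)   # short, hard-to-guess slug
        slips.append((team, f"{base}/{token}"))
    return slips
```

Each token would be both appended to the shortened URL and encoded into the slip’s QR code, so typing the URL or scanning the code opens the same team page.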

The students could either use the shortened URL or could scan the QR code to open a webpage that displayed the scratch-off card.


The page was designed to display correctly on laptop screens, tablets, or smartphones. Tapping (or clicking) a choice would display an animation of the choice “turning over” to reveal either an “X” or a checkmark on the other side. Swiping (or clicking) the edge of the screen would move from question to question.


Students discussing which answer to choose during the “team test”. The team-based learning model lends itself well to the use of lecture halls. Students are assigned to sit with their teammates; during the team portions of the class, the students merely turn in their seats to interact with each other. The animation of the choice “turning over” was set to be slow enough that there is a bit of suspense as students wait to see whether they got a check or an X.


Another team is accessing the scratch-off card using a cell phone.  The quiz questions are on the pink pages.
