Archives for January 2014

How to Make Virtual Training a Success

Six years ago, Sodexo’s talent acquisition leadership team sought to enhance the quality of onboarding for new recruiters. They recognized how influential onboarding can be to success, especially in a virtual environment. Since the program’s restructuring, new Sodexo recruiters participate in a two-week, all-virtual program that combines short, live online sessions with a dedicated team trainer, interspersed with hands-on practice and work with a mentor in the virtual classroom.

“I was able to propose to the leadership team a new approach to onboarding via virtual training that resulted in reduced new hire ramp-up time,” says Anne Scott, former training program developer for the team.

Successful virtual training is more about changing hearts and minds about learning than it is about implementing new technology. In many cases, it’s necessary to change your organization’s cultural mindset about learning to have virtual training success.

So what factors contribute to the success of a virtual training initiative? There are three critical steps: Define it, set the stage for success, and prepare people.

Define It

First, step back and define two things: What do you mean by virtual training, and what’s your vision of success for it?

Start with your vision of success by asking: What’s important to the organization? Is it saving time? Decreasing costs? Improving a particular business metric, such as reducing manufacturing errors or increasing sales?

Once you have identified an important goal, use it as your guidepost for virtual training success. That way, virtual training will receive more organizational support than it might otherwise.

For example, if this year’s target is to increase productivity, then focus on how virtual training will increase employee productivity by letting employees learn in shorter chunks without leaving their desks.

As you define success, be sure to talk with all of the stakeholders involved or interested in your virtual training program. These individuals range from designers to facilitators, to participants, to participants’ managers. And of course, remember to include your organization’s IT department in the conversation as well. Get them involved with the conversation early so that they will be more likely to support it.

Not only is it important to define success, it’s also essential to determine your organization’s exact definition of virtual training. Almost everyone has a different understanding of what virtual training actually is.

From a one-way presentation style webcast, to video conferencing with groups of participants huddled around a conference room table, to individuals learning and practicing a new skill—all of these could be considered a type of virtual training. By getting clear on what your organization means by virtual training, you can set appropriate expectations with everyone involved.

It is important to get clear on your definition because mismatched expectations almost always lead to disappointment. In virtual training, mismatched expectations can lead to presentation-only sessions and passive participants who don’t engage in learning.

Think of it this way: If you intend that virtual training be an interactive experience for learners, but the speaker thinks he will be giving an online lecture, then there are bound to be disappointed and disengaged learners. Or, if your participants think it’s a passive webinar but the facilitator expects engagement, the event will not go well.

Nip these common problems in the bud by defining what your organization means by virtual training, and communicating it to everyone involved. It’s part education and part communication, all adding up to clear expectations.

Set the Stage for Success

Next, it’s essential to set the stage for virtual training success. There’s more to it than putting a class on the calendar and sending out a meeting link. That’s like throwing spaghetti at a wall and hoping it sticks. Instead, it takes thoughtful, intentional planning for the change.

Most successful change management initiatives start small and then build momentum. Do the same with your virtual training—start small with a successful pilot and then grow. Identify and involve early adopters who are comfortable with technology and excited about virtual training.

Create a short, meaningful, interactive virtual training class that is highly relevant to the intended pilot audience. And find an eager facilitator who has enough time to fully prepare and the resources to support the session.

For example, when Cheryl Scanlan, founder of Way of Life Coaching, decided to expand the geographic reach of her coaching classes, she changed an eight-month in-person program to an all-virtual one. She set the stage for success by starting with a small pilot.

Her first group of participants knew they were part of the inaugural virtual program and were eager to try it out. Scanlan and her team spent several months planning and testing almost every aspect of the program. They then worked with this pilot group, tweaking things along the way and gathering feedback to improve upcoming sessions.

By using this technique, Scanlan was able to figure out what did and did not work, and laid a solid foundation for future virtual programs.

Another part of setting the stage is testing the technology to ensure everyone and everything is prepared. In Scanlan’s case, every pilot participant was asked to join a 30-minute “tech check” session to ensure their computers were set up and ready to go prior to the first virtual training session.

They also learned the web-based classroom platform features so that the first class could run smoothly without needing to stop and explain basics, such as how to chat or annotate on the whiteboard. Since all of the pilot participants were new to the virtual classroom, this small yet important time investment set the program up for success.

Test virtual training technology by:

1) getting participants comfortable with the virtual classroom platform, so that they can focus on the learning rather than the technology; and
2) ensuring participants have a good learning experience rather than becoming frustrated over not being able to connect and participate.

In addition, creating a positive participant learning environment has the potential to make or break future training sessions. Simply put, you want the pilot participants to become champions, and share and spread good news about their virtual training experience. Setting the stage properly creates momentum for virtual training success.

Prepare People

Although properly functioning technology makes virtual training work as it should, it’s the people involved who ultimately will determine whether virtual training becomes a sustainable success in your organization. Consider these individuals who will play a role in your organization’s culture shift:

Designers, who need to design interactive, relevant sessions that match your definition of virtual training.
Facilitators, who need to engage participants using expert online delivery techniques.
Participants, who need to change their mindset about live online learning at their desks.

For all of these roles, virtual training is a new way to learn that requires new skills and methods. To prepare each group to adapt to and accept the change, use the following five proven techniques.

1) Educate everyone on your definition of virtual training. Realize that it’s not just a one-time definition exercise, but instead requires continual reinforcement on what virtual training success looks like. This strategic emphasis will help you and everyone involved stay focused on the end goal.

2) Communicate frequently with stakeholders and keep them informed about progress. It’s important to share progress toward goals, as well as successes and challenges along the way, for transparency. Lack of information can contribute to lack of adoption, whereas intentional communication contributes to intentional success.

3) Enable designers and facilitators to become experts in your virtual classroom software platform. Give them time to learn it inside and out, so that they can make use of all available features and tools. That way, they’ll feel comfortable and equipped.

4) Keep initial virtual training designs both simple and interactive. Aim to create relevant programs that solve measurable business problems, with interactive engagement every four to five minutes. This will make your virtual training more interesting than whatever distractions are competing for attention during a session.

5) Equip everyone with needed resources. Keep the following tips in mind.

Facilitators should have technical support from a co-facilitator or producer. A best practice for virtual training is to allow the facilitator to focus on participants’ learning while someone else takes care of the technology.

Participants and facilitators alike need headsets for their audio connections, so that they can use their hands to type and click without having to uncomfortably cradle a handset on their shoulder for 60 to 90 minutes. (Note that speakerphones are not the best solution for virtual training due to poor audio quality and echoes.)

Participants need to connect using their own computers so that they can individually participate in planned activities, unless your virtual training definition and design explicitly allow for shared connections (such as a team sharing a conference room link).

Following these five techniques gives people confidence in your virtual training solutions and makes it easier for everyone to accept and embrace virtual training as a viable way to learn.

Overcoming Challenges

When your organization decides to implement virtual training, follow through with it even when challenges arise. It might be tempting to take a shortcut, such as skimping on design resources or not giving facilitators enough time to prepare. However, do what you say you’ll do by delivering on your promises.

For instance, if you defined and communicated that virtual training will be an interactive experience, then deliver it that way. That builds credibility and contributes to the long-term viability of your initiative.

As with any culture change, there are bound to be challenges along the way. You might have a virtual class that doesn’t go as planned or you might run into an unexpected technology issue. When these setbacks occur, don’t abandon the effort. Stay focused on the positive, and celebrate what is working well.

Collect even small success stories, share them with your stakeholders, and keep moving forward. Ultimately, virtual training will take hold in your organization as a viable and sustainable way to learn.

Learn from the success story of Andi Campbell, head of learning and development at LAZ Parking. She has built virtual training into her organization’s learning strategy. It’s not a big event or production, but instead is just the way learning happens in the organization. She defined it, set the expectations, and prepared everyone appropriately. “We’re learning, and we just happen to be doing it online,” says Campbell.

Defining Virtual Training

My definition of virtual training is: “a highly interactive synchronous online instructor-led training class, with defined learning objectives, with participants who are individually connected from geographically dispersed locations, using a web-based classroom platform.”

These virtual training classes are typically 60 to 90 minutes in length with 10 to 15 people per session and interactivity every four to five minutes. In comparison, a 60-minute presentation-style webcast might have thousands of participants with little interactivity, and a short marketing webinar might have 50 to 200 participants with some limited interactivity.

Technology Considerations

Even though culture shift is about hearts and minds, there are still some important technology items to consider for successful virtual training.

Meeting or training software product? Most common virtual platforms have several variations: a meeting product, a webinar product, or a training product. Based on your definition of virtual training, select the one most appropriate for your needs.

Teleconference or VoIP? Most common virtual platforms also have the option to use integrated teleconferencing or voice-over-IP (VoIP) for audio. Select what’s most appropriate for your needs depending on your participants’ bandwidth capabilities and how you want learners to interact via audio.

Mobile devices or computers? Most common virtual platforms now offer a mobile app for easy access. However, most of these apps have limited functionality, which means learners may not be able to fully participate in an interactive session. Based on your definition of virtual training, decide whether you will support participant use of mobile devices for training classes.

About the Author:

Cindy Huggett, CPLP, is an independent consultant, professional speaker, instructional designer, classroom facilitator, and author who specializes in workplace training and development. With more than 20 years of experience, Cindy has successfully designed curriculums, facilitated classes, and led training rollouts in almost every industry and every size organization. Cindy is the author of The Virtual Training Guidebook: How to Design, Deliver, and Implement Live Online Learning (2013) and Virtual Training Basics (2010).

Reprinted from T&D Magazine

5 Steps to Effective ‘Stay’ Interviews

In a “stay” interview, an employee meets one-on-one with a supervisor to discuss his or her satisfaction with the company. The idea is to learn about what is and isn’t working, so managers can adjust their efforts to retain staff. The goal is to catch problems before employees decide to take off.

In a recent survey by administrative staffing firm OfficeTeam, 27 percent of human resource professionals said they’d never even heard of the concept of a stay interview. What’s more, another 41 percent said they weren’t sure how useful they are — mostly because they hadn’t conducted them very often.

Nevertheless, a stay interview can prove effective as long as these five steps are followed:

Start off on the right foot: Since many employees may be unfamiliar with the concept of a stay interview, a clear explanation of the process — including a review of the goals and types of information that will be sought — is needed before managers begin.

This could prevent skeptical workers from wondering, “Why are you asking these questions? Is there a reason I shouldn’t stay with the organization?”

Ask the right questions: Avoid closed-ended questions that yield “yes” or “no” responses. They won’t provide useful feedback. To gain specific information, it’s better to ask questions such as: Which aspects of your job make you eager to get into the office each day? Which aspects cause a feeling of dread? Why have you chosen to stay at our company? What do you find most rewarding about your work?

If you could change one thing about our department or about the entire organization, what would it be? What skills or talents do you possess that aren’t being used in your job?

Most important, stay interviews should be conducted separately from performance reviews. Stay interviews are designed to gain insights into what motivates employees and keeps them invested in the firm. Performance reviews, on the other hand, are intended to give staff a candid assessment of their work.

Although the two meetings should be separate, they complement each other and give employees a chance to discuss their feedback more than once during a year.

Make it a positive experience: Managers can make the experience a positive one if they listen more than they talk. And when it is time to talk, a stay interview is not the venue for a supervisor to get defensive if he or she disagrees with an employee’s concerns or comments.

If staff feel like they’re engaging in a debate, they’re not likely to be candid with further responses. The best questions will attempt to elicit opinions on the work environment, company culture and advancement opportunities rather than on specific people.

Consider the participant list: Some firms prefer to conduct stay interviews with top employees only, since the goal is to retain their best and brightest, not their poor performers. However, careful thought should be given before limiting these meetings. If select individuals are singled out for stay interviews, other employees may wonder why managers don’t value their opinions or want to improve their job experiences.

Stay interviews should boost — not deflate — general morale.

Follow through: One of the most important steps in a stay interview is taking action afterward. There’s no point in meeting with employees to address their concerns if there’s no genuine intention to make changes as a result of those discussions. Leaders should let staff know what they hope to do to make improvements, including the anticipated timelines and plans.

One final consideration is timing. Managers shouldn’t wait until there’s a noticeable morale problem to launch stay interviews. Making them a routine part of company life will show that the organization is sincerely interested in boosting job satisfaction.

Robert Hosking is executive director of OfficeTeam, a staffing service specializing in the temporary placement of administrative and office support professionals. Reprinted from Talent Management Magazine

Want Discretionary Effort? 10 Things to Avoid in the New Year

It’s that time of year when we are inundated with what we should do to start the new year off right. I would be remiss if I didn’t offer advice of my own; the only difference is that these are things we should avoid all year round, and avoiding them is essential if you want to earn discretionary effort.

With every January comes promises made to change behaviors, break bad habits and start doing things differently. No matter how good our intentions, real change will not happen without considering consequences and behavior.

What we say and do and what happens as a result should be the focus if you want to earn discretionary performance from others. Often we say things off the cuff; things that are not meant to be discouraging, but in effect punish behavior and minimize the opportunity for others to want to give more.

If you want to be an effective leader, avoid these 10 statements: (These are actual comments that employees say supervisors or managers have said.)

  1. “You did a great job, BUT …”
  2. “That’s what you are paid for.”
  3. “We tried that and it didn’t work.”
  4. “We have enough ideas.”
  5. “I don’t just want ideas. I want good ideas.”
  6. “You are not paid to think.”
  7. “Just do what I tell you.”
  8. “I have a better idea.”
  9. “I don’t have time to talk about that.”
  10. “When you come in tomorrow, leave your brain in your car.”

While the reason to avoid some statements is more obvious than others, all have the potential to squash creativity and idea sharing, an important piece of the discretionary effort puzzle.

Instead, you should focus on statements that reinforce the behavior you want.

Consider instead statements that engage others to talk about how they have done something or why they think a new way is better. Just taking the time to listen is a positive reinforcer to most employees.

When you take time to listen and show that you appreciate what others offer, even if you don’t act on it, you will get more positive behavior and in the end more discretionary effort from those around you.

Reprinted from Talent Management Magazine

Moving Beyond MOOCs: Non-Traditional Product Education

The engineering education team’s staff meeting on May 2, 2012 began like any other: reports of new engineer orientation, computer science outreach efforts, and an updated mission statement: “To provide Google engineers and the world with relevant and timely technical content, learning resources, and tools.”

With five minutes remaining in the meeting, the director announced that she was recruiting team members who were willing to tackle an audacious goal: create an online course for ten million people in eight weeks. Many of us left the room that day with more questions than answers: Could we really create a course from scratch for that many people? In only eight weeks? What would we teach, and why? How would we know if we were successful?


Many elements of Google culture have contributed to our experiments with massive open online courses (MOOCs), including the company’s mission, desire to think big, commitment to our users, and ability to launch and iterate. A year and a half since that fateful staff meeting, we have served over 360,000 students by launching five courses for the general public, developed a handful of courses for our own engineers, and assisted numerous partners to launch courses using Google’s open-source Course Builder platform.

You might be wondering about Google’s interest in MOOCs. Our company mission statement is, “To organize the world’s information and make it universally accessible and useful.” Enabling educators to share their expertise with the world fits in this mission, as does expanding education to everyone.

We had enthusiasm and a vision, but what should we teach?

After some brainstorming, we decided to start with what we know—our own products. We have worked with teams to enhance the user experience of Google tools through education. If people know how to better use our products, they will likely use them more, which helps the company meet its business goals. Google has also focused resources on helping individual professors, small colleges, and non-profit organizations scale their education efforts.

At Google we hope to help MOOCs evolve from their current implementation by encouraging others to build interesting courses and share their discoveries about effective pedagogy. Several strategies that have worked for us include experimentation (hypothesizing, testing, evaluating, and iterating), student community, more activities, short videos, alternative evaluation methods, and paying attention to student goals.


Experimentation

At Google we are encouraged to experiment by hypothesizing, gathering feedback, launching, evaluating data, and iterating. An intrepid team of content experts, designers, and engineers worked together to develop our first course. After eight weeks of development we launched Power Searching with Google. The course consisted of 28 lessons (each containing a video, text transcript, and activity), three assessments, and certificates of completion.

The interface had some rough edges, and we spent late nights fixing bugs. However, by releasing an early version of the course with its imperfections, we were able to collect student interaction data that in turn guided future design decisions. We offered Power Searching a second time a few months later with clearer activity instructions, explicit links to the discussion forum, and new assessment questions. Based on the organic community interactions via the course forum, we also decreased the number of support staff answering questions.

Since the first course, we have experimented with numerous pedagogical elements including community, videos, activities, designing for student goals, and assessment strategies. These observations have helped inform the Course Builder technology.

Lessons learned: Release courses early, even if they are not perfect, and gather feedback to inform and improve course content. Given the size of many online courses, it’s impossible to predict every student experience.

Student community

Technology enables students to connect with hundreds or thousands of other students. This also presents a challenge for course developers: How do we ensure the same course is valuable for students of diverse backgrounds, varying technical savvy and experience with the topic, different locations around the world, and different ways to apply the concepts?

Despite conducting several usability studies with members of what we assumed to be our target audience, we discovered that our actual audience was more diverse than we had anticipated. For example, we found that challenge activities at the end of each module motivated a small number of people but were too difficult for other students.

We therefore offered these challenge activities as supplementary to the primary content. We also realized that we hadn’t considered how the Google search interface would appear differently in other regions. Students in Brazil, for example, saw slightly different search interfaces than peers in Japan and the US.

We had developed the course with US-centric examples, resulting in student confusion in one lesson. Because these students could ask questions and share their experiences via the course forum, they were able to help each other achieve the lesson’s goals.

We also found that students shared examples of how they could apply the course content. A librarian in Power Searching commented in the forum that she had used color filtering in image search to help one of her patrons search for a particular book with a green cover. Educators shared lesson plans for using Google Maps in their classrooms.

We could not have anticipated all of the ways students would interact with each other, but the course forum enabled students to differentiate the course for one another in ways we did not expect. Furthermore, students crowd-sourced solutions to overcome learning barriers.

Lessons learned: Students will have diverse interests, backgrounds, and needs. Enable students to personalize the course for their own needs and share experiences with each other. Giving the students freedom to share how they will apply the course concepts enables them to help each other.

Videos and Activities

Many MOOCs consist of videos with intermittent quizzes to maintain student interest and engagement. From our data, it’s not clear whether watching videos is the most effective way to learn the skills we taught; we have found that many students prefer clicking on a text lesson instead of, or in addition to, watching videos.

In fact, when we featured the video prominently on the page, with a small button linking to the text version of the lesson, students clicked on the video about 70 percent of the time and the text lesson 30 percent of the time.

In the Advanced Power Searching course, we presented video links next to text-version links of the same lessons. In this course, which also gave students opportunities to try search challenges before viewing lessons, we found that students clicked on the text and video lessons in equal numbers.

We have discovered that shorter, targeted videos seem to hold students’ interest better than longer videos. In our courses, videos shorter than five minutes have, on average, an 80 percent engagement rate (meaning that students watched an average of 80 percent of the video).
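As a rough sketch of the engagement-rate metric described above (the function and numbers here are invented for illustration, not drawn from the course data), the rate for a video is simply the average fraction of the video that viewers watched:

```python
def engagement_rate(watch_seconds, video_length):
    """Average fraction of a video watched across viewers.

    watch_seconds: seconds watched by each viewer (hypothetical data).
    video_length: total length of the video in seconds.
    """
    # Cap each viewer at the full video length, then average the fractions.
    total_watched = sum(min(w, video_length) for w in watch_seconds)
    return total_watched / (len(watch_seconds) * video_length)

# Example: three students watch 240, 300, and 180 seconds of a 5-minute
# (300-second) video, giving an 80 percent engagement rate.
rate = engagement_rate([240, 300, 180], 300)
print(round(rate, 2))  # 0.8
```

An 80 percent engagement rate on a five-minute video, as in this example, matches the figure the authors report for their shorter videos.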

Online education enables us to give students opportunities to apply skills and receive instant feedback. In our courses we couple instructional videos with activities where students practice the skills and receive guidance about how well they are mastering the content.

We have discovered that significantly more students complete activities than watch videos. One hypothesis is that students jump directly to activities, try them, and assess whether they need to review the relevant lessons. In fact, students who completed course activities had a higher course completion rate than students who did not do activities.

Lessons learned: Students appreciate control over their learning experiences; make it easy for students to choose activities, text lessons, or video lessons in their preferred order. We plan to use short videos for motivation, rationale, and authentic examples of the content.

Learning Goals

We asked students to select a goal when they registered for the Mapping with Google and Introduction to Web Accessibility courses. We provided a list of goals including “Meet all course requirements in order to earn a certificate of completion,” “Learn one or two new things about Google Maps [or web accessibility] without achieving a certificate of completion,” and “I’m curious about how this online course is taught.”

Surprisingly, we found that only 54 percent of Mapping registrants (and 56 percent of Web Accessibility registrants) intended to complete course requirements to earn a certificate. The vast majority of other registrants only wanted to learn one or two new things, either out of curiosity or for a work-related need.

Based on what we know about student progress in the Mapping with Google course, we inferred that 42 percent of active students did achieve the goals they set out to meet (compared to 13 percent of all registrants who completed the course).

Lessons learned: We should consider changing course designs to meet a variety of student goals. Instead of assuming that all students will interact with all course materials from A to Z, make it easier to search for small nuggets of content. Publish clear learning objectives that enable students to self-select whether they will get what they want out of the course. Lastly, consider publicizing multiple paths that students could take through the course.


Alternative Evaluation Methods

Although peer grading has become quite popular in MOOCs because it relieves professors of the burden of grading thousands of assignments, we believe that self-evaluation has greater benefit to the students. Self-grading helps build students’ metacognition, which they will use when applying the skills from the class.

For example, after the class we want students to stop and think about the qualities of an effective Google Map when creating a map. By having them evaluate their maps against these criteria, our hope is that they will continue to apply these skills after the class.

In Advanced Power Searching, students submitted two case studies that detailed how they solved complex challenges related to their lives in order to earn certificates of completion. Students provided great examples of how they used Google tools to research their family’s history, the origins of common objects, or trips they anticipate taking. In addition to listing their queries, they wrote details about how they knew websites were credible and what they learned along the way.

They graded their own assignments based on a rubric we provided. Similarly, in Mapping with Google, students created maps and evaluated them based on a checklist.

Teaching assistants (TAs) graded a random sample of student assignments. We found a modest yet statistically significant correlation between TAs’ grades and students’ grades, a low incidence of cheating (duplicate assignments), and an overall high quality of work. In fact, the majority of students graded themselves within six percentage points of how an expert grader would assess their work.

This is a positive result, since it suggests that self-graded project work in a MOOC can be a valuable assessment mechanism. Reading stories of how people used their new skills to plan vacations, find jobs, and research ordinary objects was one of the most inspiring aspects of this course for the TAs.
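The six-percentage-point agreement check described above can be sketched as follows (the function and scores are hypothetical illustrations, not the actual course data):

```python
def share_within_tolerance(self_grades, ta_grades, tolerance=6):
    """Fraction of assignments where the student's self-grade falls within
    `tolerance` percentage points of the TA's grade for the same work."""
    close = sum(
        1 for s, t in zip(self_grades, ta_grades) if abs(s - t) <= tolerance
    )
    return close / len(self_grades)

# Example scores (0-100 scale) for five sampled assignments: four of the
# five self-grades land within 6 points of the TA's grade.
share = share_within_tolerance([90, 85, 70, 60, 95], [88, 80, 75, 50, 93])
print(share)  # 0.8
```

A high share under this kind of check is what supports the authors’ conclusion that self-graded project work can serve as a valuable assessment mechanism at MOOC scale.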

Lessons learned: Continue to explore self-evaluation as an assessment mechanism. Test rubrics against a broad sample of submitted assignments. Provide additional guidance to students on evaluating their own work.

Areas of future exploration and reflection

We have launched five courses and iterated to improve each one. Future product education courses will involve many more experiments: as many hands-on activities as possible, opportunities for students to connect with each other, short videos, and opportunities for students to evaluate their own work.

We also anticipate continuing to experiment with motivation, community, and personalization. How do we inspire students to achieve their goals? How do we maximize the value of having tens of thousands of people working on the same content at roughly the same time? How do we help students collaborate with each other to further differentiate the content? How do we provide personalized learning experiences for all students?

Though much has changed since that staff meeting a year and a half ago, and over 45,000 students have completed our courses, we still have more questions than answers.

Reprinted from Learning Solutions Magazine

Safeway Employees Receive Peer Support Through Online Platform

Safeway launched an online and mobile platform that uses web communities and anonymous peer support to encourage individuals on their physical and mental health journeys. The social health platform offers employees and their families access to educational resources, health experts and peer support in condition-specific communities, including alcohol addiction, depression, heart health, cancer, stress reduction, diabetes, obesity, tobacco cessation and caregiver support.

Safeway launched the platform, built by OneHealth, as a pilot program in March 2013. It is moderated by professional health coaches and offers support from peers. Members can join any of the communities offered, which include behavioral health, chronic condition and moderated mental health support.

For Safeway, the OneHealth platform communities are grouped into either emotional and physical health or recovery. Sample communities for emotional and physical health include anxiety support, asthma support, autism support, diabetes support, stress support, pain management support, and veteran support. Recovery community examples are Alcoholics Anonymous, Al-Anon, and Gamblers Anonymous.

“Safeway recognized that we weren’t reaching everybody,” explains Dr. Kent Bradley, chief medical officer at Safeway, about why the company decided to implement this platform for its employees and their families. The grocery store chain wanted to find another method to engage workers beyond using print communication and their online web program.

The OneHealth approach provided the opportunity to leverage peer support through a mobile device, tablet or computer. The mobile site offers individuals a live feed of discussion in the communities they belong to, as well as instant feedback and emotional support wherever they need it. An advantage of the online communities is that they don’t require scheduling coordination and can conveniently host support group meetings and education sessions.

“Some individuals may feel comfortable working with peers face-to-face on games-based activities and challenges, but others may feel comfortable in the anonymity of the online platform,” says Bradley.

How it Works

Employees and family members who wish to participate simply register online. To protect anonymity, participants are advised not to use their real names as usernames. Within 24 hours of registering, the individual receives a friend invite from a OneHealth coach, who identifies himself or herself as such and helps guide the new participant into various discussion groups.

Participants can also choose to friend other peers in the community. Safeway has several hundred active users on OneHealth; 58% have joined at least one condition community, and 37% have joined more than one. In addition, 11% of members registered via their smartphones, and 43% access the site via their mobile phones.

OneHealth’s professional coaches monitor the site through its private, secure, HIPAA-compliant platform, watching feeds and searching for keywords that signal a cry for help so they can reach out to members when they detect an issue.

At the heart of OneHealth is a patent-pending, emoticon-based check-in linked to real-time intervention for high-risk members who need instant support from their peer network. Each time participants check into the website, they can indicate how they are feeling on an emotional index.

From 28 emotional icons, they select whether they are feeling happy, sad, anxious and so on. If they have friended peers or a coach, those supporters are notified if the participant says he or she is “craving,” for example. The friend can immediately respond to the individual privately or send a message through the community group.
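The check-in-and-alert flow described above can be sketched roughly as follows. This is entirely hypothetical; OneHealth’s actual implementation, data model and trigger states are not public, and every name below is invented.

```python
# Hypothetical sketch of an emoticon check-in that alerts a member's
# support network when a high-risk feeling is reported.

HIGH_RISK = {"craving", "anxious", "hopeless"}  # assumed trigger states

class Member:
    def __init__(self, username):
        self.username = username
        self.friends = []      # friended peers and coaches who receive alerts
        self.inbox = []        # (who, feeling) notifications received

    def check_in(self, feeling):
        """Record a feeling; notify friends only if it signals high risk."""
        if feeling in HIGH_RISK:
            for friend in self.friends:
                friend.inbox.append((self.username, feeling))

coach = Member("coach_dana")
alex = Member("quiet_runner")  # a pseudonym, not a real name, for anonymity
alex.friends.append(coach)

alex.check_in("happy")         # not high risk: no one is notified
alex.check_in("craving")       # high risk: the coach is alerted immediately
print(coach.inbox)             # [('quiet_runner', 'craving')]
```

The design point the article emphasizes is the real-time link: a check-in is not just a log entry but a trigger that routes the member straight to human support.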

“The underpinning of this program is that communities of interest will find an opportunity to connect via the online space. And those communities of interest are centered on a particular condition, such as weight management, anxiety, alcoholism, and pain management,” says Bradley.

Safeway has expanded the scope of these communities beyond health conditions to areas of interest, such as elder care support, which helps people deal with the anxiety and stress of supporting aging parents while working full-time. The company has also increased the number of condition communities based on participant feedback, adding programs centered on veteran support and post-traumatic stress, as well as cancer support.

A Safeway dietitian offers dietary classes on the platform that people can listen to while interacting with each other in real time. Other employers might use a nutritionist, behaviorist or counselor. Safeway’s dietitian had been helpful for employees at the company’s corporate campus, for example, so the company expanded that service virtually to the entire enterprise.

Organic adoption

During the initial adoption period, Safeway has allowed interest and participation to spread through word of mouth rather than through a top-down approach, in which an employer might, for example, insist that smoking cessation participants use the platform.

“We’ve chosen to make it more low key and have it grow organically,” explains Bradley, who adds, “For the individuals that get engaged, they get highly engaged.”

One anonymous user explains that “we need all the support we can get. Just knowing there are people out there I don’t even know who are wishing me well makes things easier.”

While Safeway hasn’t made participation mandatory, employers may consider using the platform as a requirement for wellness or health programs.

“Employers are increasingly thinking about ways of scaling their disease management programs and incentive designs to enhance engagement in those programs. One way to do that is to encourage people to join a support group online that maintains some sort of anonymity but proves employees have participated in a certain number of sessions to receive a certain incentive,” suggests Bradley.

The platform offers the ability to track individuals confidentially on the back end to determine whether they are attending the number of required sessions to receive an incentive. Employers could also use participation in the online support group as an alternative to certain outcomes-based programs that focus on improving a behavior or biometric factor.

Further, if an employee population has a condition that the employer wants to manage better, such as diabetes, the employer could make online participation one of the conditions for receiving free diabetic supplies as part of a value-based benefit design. Safeway has considered these approaches but is not yet pursuing them.

Reprinted from Employee Benefit News
