Archives for February 2012

A New Role for Trainers: Learning Content ‘Curator’

In an issue of Learning Solutions magazine, Rick Wilson threw down the gauntlet and admonished e-learning designers and developers, saying Learning Content Is Not Your Job Any More. Wilson states that until recently, “As learning professionals we fostered the belief that content prepared for learning environments stands apart from other content…and, we managed to get away with this concept about the significance of learning content because adult education bestowed a particular credence on the content’s worth [by labeling it a] ‘course.’ ”

However, in today’s world where rapid learning and multimedia development tools are inexpensive and readily available, and informal and social learning opportunities are being widely adopted, the notion that only a degreed instructional designer or learning technologist can develop effective online learning is quickly evaporating.

In a cost/benefit analysis, rapid and informal user-generated learning content is becoming recognized as “good enough” to meet many corporate learning needs, and opportunities for developing large-scale, professional e-learning initiatives are diminishing.

In an ironic twist, formal e-learning initiatives suddenly are facing the same rationalization and downsizing instructor-led training experienced in the early 2000s when Web-based training hit the scene and was predicted to replace traditional classroom learning within the decade.

Enter Content ‘Curation’

If the role of today’s e-learning professional is significantly less focused on developing new courses (content), then where exactly do we continue to provide value?

As Wilson puts it, “Your biggest new role and responsibility is harnessing and cultivating the content inputs and their uses. You become the ‘content curator,’ choosing how content sources make inputs, how the inputs of content mix and move into some cohesive collection of knowledge assets.”

In other words, we are tasked with providing the proper learning context (and filters) around all the informal assets our learners develop and publish. We are being called to actively participate in the next step in the evolution from the “sage on the stage” and beyond the “sage on the (online) page” to developing a new incarnation of interactive online learning that breaks through the traditional boundaries we’ve imposed on learning content.

Unlike aggregation (the automated gathering of links) or search, which relies on mathematical formulas, content curation calls on human editors to enhance the work of mechanical search by gathering, organizing, reviewing, and filtering content.

“Curation comes up when search stops working,” says author and NYU Professor Clay Shirky, as quoted by the king of content curation, Steven Rosenbaum. Rosenbaum detailed the concept of content curation and its role in the information age at a June 2011 TEDx event in Grand Rapids.

After seeing Rosenbaum’s presentation, it isn’t hard to make the leap to support Wilson’s premise that there is a role for human-powered information context and filtering in today’s learning organizations.

But it goes beyond getting the right content to the right people at the right time. As Shirky elaborates in the same post, “… it isn’t just about information seeking, it’s also about synchronizing a community.” Competent content curators—and the communities and portals they support—become sought out as trusted sources of information.

Content Curation in Corporate Learning

Imagine a time when we no longer push all learning content through an impersonal Learning Management System (LMS) to reluctant learners, but rather they actively come to a learning community we have built and nurtured, and pull exactly the learning they need at the time and place they need it.

Content curation as it is playing out on the Internet is a first step in that direction, but that really isn’t a whole lot better than the learning portals we tried to implement in the late ’90s and beyond, which quickly degraded into unruly mass data dumps.

As an example, check out the hyperlink “bag” I created at content curation site Bag the Web to house all the links referenced in this post. As a repository for a limited number of links around a specific topic, it works very well. However, a simple curation site falls short of providing context and workflow around all the assets required for a large, complex corporate curriculum.

Effective delivery of curated learning content will require new tools, strategies, and technologies that force us to think outside of the boundaries of the e-learning course and the corporate LMS and go far beyond the link-sharing tools used on the Web.

We would do well to continue to look to the distance learning models used for years by institutions of higher education, and then apply powerful new content management, workflow, and collaborative tools to bring our new corporate learning vision to life.

Learning Strategies Using Content Curation

I will explore a few examples of content curation in action in my next article, Content Curation Strategies for Corporate Learning. In the meantime, in the Context vs. Content debate, Context is indeed King—at least as it pertains to e-learning. However, learning content—formal or otherwise—is your Kingdom.

You, in your new role as Learning Content Curator, are charged with providing context over your content domain.

About the Author:

Chris Frederick Willis is CEO of Media 1, a consultancy specializing in integrating people, technology, and performance to drive Human Capital Improvement (HCI). Willis is passionate about melding the best practices of multiple disciplines and the power of SharePoint technology to support integrated learning and talent management—developing innovative solutions for onboarding, sales, and leadership.

Reprinted from Training Magazine 

Employers Eye Revamping Retirement Plans

Like many companies, Wise Alloys froze its defined benefit plan for union employees when the Great Recession hit.

Salaried employees didn’t have one, and the company’s benefits committee felt like it wasn’t doing enough to prepare workers for retirement with only a defined contribution plan.

“We wanted a product that looked like a defined benefit plan,” says Sandra Scarborough, plan administrator for the Muscle Shoals, Alabama-based aluminum can producer. “Our people are looking for a guaranteed monthly income” once they retire.

Last year, Wise decided to offer an investment option within its defined contribution plan called IncomeFlex, a guaranteed income product managed by Newark, New Jersey-based Prudential Financial Inc. IncomeFlex is a target-date fund that freezes its schedule 10 years before retirement, activating guaranteed income for participants.

When a participant is ready to retire, IncomeFlex guarantees a specific level of income over that person’s lifetime to hedge against stock market declines. If a participant leaves the company offering IncomeFlex, the participant can leave IncomeFlex assets if the plan sponsor allows it, or the participant can take the market value or roll the value into a Prudential Individual Retirement Account.

Scarborough says she doesn’t have an exact usage number but says participants have responded well to the new investment option.

“As a company, we felt we needed to do more, and this is a very popular option” for participants, Scarborough says.

With the number of defined benefit plans in continued decline, employers are becoming more concerned about workers having enough money to sustain their retirement. Defined contribution plans, when first introduced, were supposed to be a supplement to, not the main driver of, retirement savings.

Now that these plans are in the front seat, employers are looking at guaranteed, defined benefit-like investment options, most often referred to as “retirement income solutions,” to help employees build more secure savings to tap throughout retirement.

“I don’t think we’re going back to defined benefit plans,” says Martha Tejera, an actuary and project leader for Tejera & Associates in Bainbridge Island, Washington. “We need to make defined contribution plans more efficient in providing participants reliable retirement income.”

It seems employers with defined contribution plans agree. In a February study by consulting firm Aon Hewitt, only 4 percent of the 500 employers surveyed said they were very confident their workers would retire with enough assets—a 26 percentage-point drop from the previous year.

Helping employees retire with enough money is a top priority for nearly half, or 44 percent, of employers responding to the survey, called 2012 Hot Topics in Retirement. Many employers are expanding savings choices, including offering in-plan retirement income options, similar to what Wise Alloys offered its participants.

Today, almost all participants take a lump-sum distribution at retirement. Many go out into the market and purchase annuities, which are insurance contracts that guarantee lifetime income. In-plan income options are a defined benefit-like feature allowing participants to put money in an annuity investment before retirement.

Currently, 16 percent of respondents offer an in-plan retirement income solution, and 22 percent of respondents said they plan to adopt this kind of investment vehicle in 2012, Aon Hewitt’s survey revealed.

“There seems to be a growing sense of urgency in offering these solutions,” says Pam Hess, director of research for the Lincolnshire, Illinois-based consulting firm.

Data from Prudential also shows an uptick in retirement income solutions being offered by plan sponsors. In 2011, 267 of Prudential’s clients had a retirement income solution in their 401(k) plan investment lineup. That is a 58 percent increase from 2009, when Prudential started offering IncomeFlex.

“Three years ago, most people didn’t know what we were talking about,” says Sri Reddy, Prudential’s senior vice president for institutional income. “Plan sponsors are now more aware of this need.”

Meanwhile, providers are finding different ways to offer retirement income solutions. In October, Hartford Financial Services Group introduced the Hartford Lifetime Income, which allows 401(k) plan participants to purchase retirement income shares; each share’s price is determined by participant age and interest rate value at the time of purchase and will provide a minimum of $10 of guaranteed monthly income per share for life. So 50 shares would mean $500 per month.
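The share math above is simple enough to sketch in a few lines of code. This is purely illustrative (the function name and structure are mine, not Hartford’s):

```python
def guaranteed_monthly_income(shares: int, income_per_share: float = 10.0) -> float:
    """Monthly guaranteed income from retirement income shares.

    Each share provides a minimum of $10 of guaranteed monthly income
    for life, so income scales linearly with the number of shares held.
    """
    return shares * income_per_share

# 50 shares -> $500 of guaranteed monthly income
print(guaranteed_monthly_income(50))
```

The per-share purchase price varies with the buyer’s age and interest rates, but the payout side stays this simple, which is exactly the “certainty and simplicity” the product’s designer describes below.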

“People are wanting some kind of guaranteed income stream, but they want to keep it simple,” says Patricia Harris, Hartford’s actuary who designed the product. “It’s certainty and simplicity of design.”

For years, plan sponsors toyed with the idea of offering an in-plan solution but were hesitant because of fiduciary concerns, a long-term commitment with an investment company and other issues, Hess says.

But just as employees are realizing they need to be better savers for retirement, employers are becoming more aware that they need to provide some type of stability so workers can move out of the workforce at the right time, says Tejera.

“We are finally getting to the place where plan sponsors are saying we need defined contribution plans to do more to help us manage our workforce,” says Tejera, who recently wrote a brief for the Institutional Retirement Income Council on guaranteed income investments. “Employees don’t want to work past their productive lives, but if they can’t afford to retire, they are going to stay in their jobs.”

About the Author:

Patty Kujawa is a freelance writer based in Milwaukee. Reprinted from

New Learning Analytics for a New Workplace

Learning and development organizations historically have borrowed models for measuring learning from an increasingly archaic education system. The learning profession today is no different—it continues to focus on metrics that provide a binary assessment of learning in the form of pass-fail, complete-incomplete, and started-in-progress.

Today our analytics are becoming irrelevant and misleading as learning becomes more fluid. The traditional “push model” derived from a regulatory, compliance-driven industry is giving way to a learner-centric “pulling world” where mere training completion has little meaning.

We need to rethink learning analytics with a focus on value as opposed to learning as a key benchmark. Our analytics must be aligned with the business’s metrics, and we must demonstrate value through the synthesis of a variety of business systems.

What do our analytics aspire to measure?

We typically gather three types of measurements in today’s learning environment: learning, satisfaction, and impact. The tools used to obtain these metrics are largely limited to some form of binary assessment of student performance on tests designed against a set of learning outcomes. The resulting analytics show how individuals and groups of learners have scored. Other metrics include the amount of time spent on a course, the number of attempts taken on a test, the kinds of modules accessed, and a host of other peripheral data that inform the one all-important statistic: having learned versus not having learned.

The second type of data learning professionals tend to collect is a measurement of learner satisfaction typically obtained via a smile sheet. Often this tool is more about measuring the quality of course design than whether the course produced the desired outcomes. It also is the basis from which we inform our judgment about whether a course has been valuable for learners.

Once we have our base metrics of student performance, we then aspire to measure whether a specific initiative had the desired result on the business for which the program was designed (as suggested by the upper levels of the Kirkpatrick model). It is fair to say that most organizations do not even try to create the tools necessary for this final type of measurement, arguing correctly that there are far too many influential factors that can affect a business, many of which are too difficult to isolate for objective measurements.

Therefore, “impact on the business” may very well be a good indicator that what we want from the training has taken place.

So what’s broken?

In a blog post written last year titled “Fundamental Design of Learning Activities,” Aaron Silvers provides a vision for learning activities based on the notion of experiential design. At the heart of his post is the idea that a learning activity doesn’t create a universal experience for everybody, nor can a designer predict how people will experience the design. Two employees’ experiences of the same learning activity may be different, and so the resulting learning is based on the individual.

Consider the notion that learning never happens in the moment of the experience itself; instead, it only happens after the experience during an “aha moment.” We certainly never experience (see, hear, touch, taste, or feel) the learning itself, and we don’t feel the change of learning. We simply find that at some point after our experience, we have changed.

Learning 2.0 practitioners have been arguing for some time that the metrics previously used for formal learning are insufficient for capturing any data from informal initiatives such as online chats. Some would argue (and I used to be one) that there is no relevant data to suggest that informal learning has any effect on a business, and they would be right. However, the conclusion reached doesn’t mean that there is no value, only that the instruments we use to measure how training affects an organization are insufficient.

A learning culture that thrives based on the fluidity of content, in which learning is stimulated based on a series of experiences that are shared, and where “learning 2.0” flourishes, requires different criteria for success.

Additionally, in today’s corporate learning landscape, Google and its competitors have forever changed our expectations of how to acquire knowledge and skills. The immediacy and accessibility of content through on-demand platforms are seemingly trumping design.

At a recent conference held by the eLearning Guild, there was a significant amount of chatter about the “granularization of content” and the need to make content accessible on demand. There also is a continuous stream of chatter on blogs and discussion boards about the idea of “content curation.”

Advancing analytics

In May 2010, Dan Pontefract wrote the following in his blog post titled The Holy Trinity: Leadership Framework, Learning 2.0 & Enterprise 2.0:

“As I’ve written about previously, I believe that an organization needs not only an internal 2.0 Adoption Council, [but also] a cross-functional team (the Enterprise 2.0 Org Structure) to help ensure all the various pieces of a 2.0 world seamlessly come together, mitigating any confusion for the employee, partner, or customer.”

Here is where the questions about analytics become interesting. The nature of web 2.0 and social networking is to bridge the chasms between customers and sales, marketing and the customer’s voice, and training organizations and trained employees. The role of analytics within these spaces is to build tools for measuring the effects of decisions and actions in one facet with other facets of the environment.

Imagine the following simple example:

Company ABC releases a new product into the market. ABC has tapped into social networking for its marketing and training. ABC’s marketing team captures some metrics around its social networking, such as number of mentions and number of retweets, to gauge whether its customer base is showing interest in its activities.

ABC’s training organization also has different metrics from its learning management system, but does not capture any metrics about social networking in a training environment or look at marketing’s data to inform its own measurement strategy.

At the time of the product release, ABC ensured that all employees completed the training and passed the final test. Marketing’s metrics around its social networking strategies show a lot of chatter about the company. However, product sales are not happening.

How can analytics serve the organization holistically by capturing relevant metrics around both training and marketing efforts? How can analytics help the organization fix the issue of sales? What if among the training organization’s social networks, data were being captured around keywords, and the organization was able to detect chatter trends among salespeople discussing how clients think the product is overpriced?
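As a purely hypothetical sketch of that last idea, keyword-trend detection over internal chatter can be as simple as counting watched terms across posts (the sample posts, keywords, and threshold below are invented for illustration):

```python
from collections import Counter

def keyword_trends(posts, keywords, threshold=3):
    """Count how often watched keywords appear across posts and flag
    any that cross a simple frequency threshold.

    Uses naive substring matching; a real system would tokenize and
    normalize the text first.
    """
    counts = Counter()
    for post in posts:
        text = post.lower()
        for kw in keywords:
            if kw in text:
                counts[kw] += 1
    return {kw: n for kw, n in counts.items() if n >= threshold}

posts = [
    "Client said the product is overpriced again",
    "Demo went well but price came up",
    "Another prospect called it overpriced",
    "Pricing objections on today's call",
    "Overpriced feedback from the field",
]
print(keyword_trends(posts, ["overpriced", "price", "pricing"]))
```

Even this crude counter would surface “overpriced” as a trend worth escalating from the sales community to product and marketing.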

Measuring value

In that example, there is still a place for the analytics we gather via LMSs today. However, as stand-alone analytics, they provide little value other than leaving data for auditors to check off.

The problem with conventional learning analytics is twofold: First, we have set for ourselves an objective to measure learning when what we’re really interested in is performance. Second, in our attempt to measure learning, we’ve created binary models that do little except measure a student’s performance on a test at the moment that he takes the test. We assume, based on our instructional design models, that performance on a test is a reflection of learning, but given today’s models, it is easy to see why that may in fact not be the case.

To capture the essence of a new analytics model, consider the notion of value. The value of online content is measured today based on content’s viral nature and potential. Business and web analytics address this issue by building ways to measure viral content and providing information about where the content access and exit points exist.

It also is critical to understand that some content is best kept from going viral, so setting benchmarks for your content is a critical piece of getting the most out of analytics.

It is vital to adopt a holistic view of what we want to achieve with our analytics. In the end, our goal ultimately should be to measure the contribution from the deployment of content through a training channel (informal or formal, for example) on the business based on specific goals identified upfront. We must understand what value the deployment of training content has had in achieving the performance goals of the targeted users.

Let’s revisit company ABC. The company determines from its web analytics that consumers keep accessing its frequently asked questions (FAQ) webpage. ABC also recognizes from social media analytics that there are constant questions in discussion forums, blogs, and newsgroups about upgrading its software. ABC decides to ask its customer service staff if there are any trends in the questions people are asking, and if average call-handling times are within corporate standards.

As a result, ABC uncovers a trend related to the questions customers are asking and notices its average call-handling time recently has increased. ABC prepares to deploy training with the goal of decreasing average call-handling time in its customer service department and reducing the number of hits to the website’s FAQ page. The company selects its sales and customer service staff to receive the new training and publishes the FAQ information on the website’s product pages.

How is company ABC going to measure the success of its training? Consider analytics that include:

–An increase in the number of hits to the product pages
–A decrease in the number of hits to FAQ pages
–An increase in the internal social media chatter about the product
–An increase in the number of hits to performance support material provided by customer service personnel
–A decrease in the average call-handling time from internal analytics
–An increase in employees’ use of different media to access content (for example, the website, the LMS, and performance support staff).

Evolve and grow with it

If we as learning professionals can embrace the notion that measuring learning alone may be difficult, but that measuring whether content is valuable to our audience is attainable based on the success of business and web analytics, then we have great potential for growth.

If the purpose of training and development is to partner with the business, then what we are most concerned about is providing valuable learning content to our company that increases operational efficiency. If we can measure that value and show how it drives the bottom line, then we will see our learning metrics take a drastic shift.

If we can begin with the idea that some value of learning content is in its viral nature, then consider metrics such as:

–Did employees access business-critical content? If so, when and how was it consumed?
–Did employees share business-critical content with one another? How quickly did the content spread?
–Did employees access business-critical content multiple times? Which employees are accessing the same content frequently?
–Which business units accessed what content, and when?
–As new content emerges, is it being consumed? When?
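Those questions map naturally onto a simple event log: who accessed or shared which content, and when. A hedged sketch, assuming access events arrive as (employee, content, action, timestamp) tuples (the schema and sample data are hypothetical):

```python
from collections import defaultdict
from datetime import datetime

events = [
    ("alice", "safety-101", "view",  datetime(2012, 2, 1, 9, 0)),
    ("alice", "safety-101", "share", datetime(2012, 2, 1, 9, 30)),
    ("bob",   "safety-101", "view",  datetime(2012, 2, 1, 10, 0)),
    ("bob",   "safety-101", "view",  datetime(2012, 2, 2, 8, 0)),
    ("carol", "sales-tips", "view",  datetime(2012, 2, 3, 11, 0)),
]

def content_report(events):
    """Per content item: unique viewers, share count, and repeat viewers."""
    viewers = defaultdict(set)
    shares = defaultdict(int)
    view_counts = defaultdict(lambda: defaultdict(int))
    for who, content, action, when in events:
        if action == "view":
            viewers[content].add(who)
            view_counts[content][who] += 1
        elif action == "share":
            shares[content] += 1
    return {
        c: {
            "unique_viewers": len(viewers[c]),
            "shares": shares[c],
            "repeat_viewers": [w for w, n in view_counts[c].items() if n > 1],
        }
        for c in viewers
    }

print(content_report(events))
```

The same log answers the “which business units, and when” questions by joining the employee field against an org chart, which is precisely the kind of synthesis across business systems argued for above.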

Our world is changing—evolve and grow with it.

About the Author:

Reuben Tozman is chief learning officer and founder of Toronto-based edCetra Training, a learning company focused on the design and development of customized e-learning programs and the single sourcing of content.  Reprinted from T&D Magazine

How ‘Failure Resumes’ Boost Leadership Development

Traditionally, a resume is written to summarize the professional, academic or personal success of an individual as a means to get a job or be accepted into an educational program or service organization.

But Doug Lynch, vice dean of the graduate school of education at the University of Pennsylvania and the creator of its doctoral program for CLOs, has a different use for the staid document. One of the first things he has students who take his class in entrepreneurship do is create a “failure resume.”

Instead of students listing what colleges or schools they attended, their major and the grade point average they received, he insists that they list all of the schools to which they applied but didn’t get in. The same is done for employment or other professional work experiences; it’s not what you accomplished successfully in these roles, but what you didn’t accomplish — what you failed at.

The idea, Lynch said, is multi-pronged. First, it serves as an icebreaker among students in the course, which requires that they work in teams throughout the term. Second, the failure resume is used as a development tool that forces students to reframe their experiences in a way that highlights potential areas of need.

It’s used “as an intellectual exercise to simply reframe their life through failures and to see what surfaces,” Lynch said. “If you sort of went back and looked at all the jobs you had or all the jobs you were turned down [for]; of all the schools you went to but those [where] you didn’t get accepted; not the person that is the love of your life, but the people you missed out on…as an exercise it’s just an interesting one.”

“We think that learning how to look at things is a key facet of being effective as a learning leader,” he said. A person’s development can be viewed through his or her failures.

Sim B. Sitkin, professor of management and faculty director for the Center on Leadership and Ethics at Duke University’s Fuqua School of Business, who has written about learning through both success and what he terms “strategic failures,” said setting stretch goals and targeting potential “small losses” is a means for individuals to spur their professional development. Not doing so, he said, would be a misstep for anyone looking to advance their position — a failure to stretch limits, experiment with new ideas and obtain new leadership skills.

Lynch’s use of a failure resume isn’t entirely novel. A quick Internet search will bring to light examples of others who have suggested or practiced using it as a development exercise. But Lynch’s failure resume isn’t simply a written document: He said part of the exercise is having students stand up in front of the class and recite their failures in speech form. Innovation and public speaking are among the course’s other learning goals.

The exercise ends up “changing the timbre of the class,” Lynch said, because it breaks students out of their comfort zones and has them thinking in multiple dimensions — a valuable trait for a course seeking to teach innovation and leadership.

Leaders need to be concerned with being authentic, Lynch said, and speaking publicly in an open and honest context helps great leaders gain the all-important asset every leader must have: followers.

According to Lynch, the overall purpose of the failure resume is to instill a bit more tenacity in his students — to get them to break outside of the confines of past successes and in the mindset of experimentation and calculated failures.

“Little kids when they’re learning how to walk fall a thousand times,” Lynch said. “And they just get up and try again. But most of us [as] adults, when we fail three or four times at something, we just quit because we’re efficient. We want to be economical with our energy.”

About the Author:

Frank Kalman is an associate editor of Chief Learning Officer magazine. Reprinted from Chief Learning Officer

Designing Learning for When Things Go Wrong

A manager needs to respond to an angry customer. An employee can’t find the right command to use a new proprietary tool. A printer breaks down and someone has to get it working.

Bob Mosher and Conrad Gottfredson have defined “When things go wrong” as a key moment of need in their learning ecosystem. Problems are a part of everyday working life. They are bound to occur when learners encounter both common and new situations.

When learning professionals anticipate that things can and will go wrong, they will design performance support tools and facilitate resources to provide answers. Providing access to social media (and where necessary, showing or teaching how to use it) and positioning it appropriately leverages performance support, enables people to solve problems quickly, and increases the organization-wide knowledge database. This is a strategy that works for any approach to learning, whether formal (face-to-face, online, or blended) or informal.

Here are some tips for designing learning for when things go wrong.

1. Harvest typical problems. When things go wrong, the natural response is to seek help – from a manual, colleague, knowledgeable resource, or database. Learners should have easy access to problems that typically occur and their solutions, on the Web or mobile device.
Troubleshooting advice may be in a structured format, such as an FAQ or a troubleshooting document, or more informal, such as a community-generated discussion group.

Social media tools like wikis enable users to generate their own list of problems/solutions. This increases the range of issues addressed and the quality of the responses. Learners should be given incentive to write up their problems and solutions to share with their team. Take care to monitor your formal and informal problem-sharing repositories to ensure accuracy.

2. Provide SME support. When rolling out a new product, identify specific employees and provide them with advanced training in the tool. You can also give expert users responsibility following the product rollout for answering questions by phone or e-mail, facilitating a community of users, creating blogs to teach best practices, and/or facilitating FAQs.

Beyond a product rollout, a useful strategy is facilitating micro-blogging with specific tags to indicate a problem so that any issues can be immediately addressed.
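The tagging convention can be as lightweight as a hashtag that an SME queue watches for. A minimal sketch, assuming a `#problem` tag (the tag and sample feed are my own illustration, not from the source):

```python
import re

def problem_posts(posts, tag="#problem"):
    """Return posts flagged with the given problem tag, tag stripped,
    so they can be routed to an SME queue."""
    flagged = []
    for post in posts:
        if tag in post.lower():
            cleaned = re.sub(re.escape(tag), "", post, flags=re.IGNORECASE)
            flagged.append(cleaned.strip())
    return flagged

feed = [
    "Loving the new release!",
    "#problem export to PDF hangs on large files",
    "Anyone else seeing login timeouts? #Problem",
]
print(problem_posts(feed))
```

Anything the filter catches goes to the designated experts; everything else stays ordinary community chatter.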

3. Reduce problems with Job Aids (Planners and Sidekicks). The integration of planner and sidekick job aids into learning helps support performance and avoid common problems.

Planners allow learners to plan for a challenge, for example a checklist for a performance review, or, in the initial example, planning for difficult customer conversations. Sidekicks help learners when they need it, for example a spell check or a mobile, searchable database.

When designing and implementing job aids:

•Analyze the task thoroughly
•Interview star performers
•Find out when and how things may go wrong. Look at the evidence, which may be the error rate or specific feedback from customers
•Gather input from existing feedback on discussion groups
•Get feedback from SMEs on the job aid before introducing it to learners
•Introduce and practice using job aids during formal training
•Conduct periodic checks to ensure the job aid is relevant and useful

4. Coach for far-transfer tasks. You perform near-transfer tasks in the same manner each time. You must apply far-transfer tasks differently each time because there is no one correct answer. While job aids provide guidelines or best practices, it is also useful to have real-life input into how to address problems.

Consider identifying experts with specific expertise so that they can be approached in person or virtually if a problem occurs. Community discussion groups provide real value. Consider formalizing existing communities or arranging for SMEs to facilitate new communities so problems can be immediately solved. Furthermore, encourage SMEs to blog on complex topics to provide insight into successful strategies.

By considering the inevitability of things going wrong and building support into the learning strategy, learning professionals can reduce frustration and facilitate employee productivity. Furthermore, the process of solving problems can be a learning experience. When solutions are discovered and shared, learners can increase their knowledge and contribute to a company-wide knowledge base.

Reprinted from Learning Solutions magazine
