New term for me: Tesla

Posted by gminks in instructional design | 6 Comments

This BoingBoing post, written by Clay Shirky, introduced me to a new term. Tesla, or “time elapsed since labs attended,” comes from usability testing of software applications. Clay gives this definition for tesla:

a measure of how long it’s been since a company’s decision-makers (not help desk) last saw a real user dealing with their product or service

The goal of keeping a low tesla number is getting near real-time data on the problems users are having, which enables real-time continual improvement of the product.
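Measured concretely, a tesla number is just the elapsed time since decision-makers last watched a real user. Here is a minimal sketch in Python of what tracking it might look like; the course names, dates, and the 90-day threshold are all invented for illustration, not taken from the post:

```python
from datetime import date

def tesla_days(last_lab_visit, today=None):
    """Days elapsed since decision-makers last observed a real user."""
    today = today or date.today()
    return (today - last_lab_visit).days

# Hypothetical courses and the last date someone watched a learner use them.
courses = {
    "Intro to Widgets": date(2009, 1, 15),
    "Advanced Widgets": date(2008, 6, 1),
}

THRESHOLD_DAYS = 90  # an arbitrary benchmark, just for illustration

for name, last_visit in courses.items():
    n = tesla_days(last_visit, today=date(2009, 3, 1))
    status = "OVERDUE" if n > THRESHOLD_DAYS else "ok"
    print(f"{name}: tesla = {n} days ({status})")
```

A benchmark like Mollybob suggests below could be as simple as flagging any course whose tesla number exceeds an agreed threshold.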

So can this be applied to education? Going back to this semester’s study of the Dick and Carey instructional design model, we’re supposed to be collecting this sort of learner feedback at different points in development. One of the biggest criticisms of the model is the time it takes to collect this feedback; in fact, lack of time is why some of the feedback never gets collected at all.

As instructional designers, if we have a course that is out of the design process and into a state of “production,” should we be concerned with that tesla number? Or is this too much of a reach?

6 Responses to New term for me: Tesla

  1. Ken Allan says:

    Kia ora Gina

    Thanks for this. Tesla is not a new term to me, but that meaning of it is.

    Catchya later
    from Middle-earth

  2. gminks says:

    Hey Ken! That’s what I meant, a new way to talk about Tesla. The idea that Tesla could have been an Aspie totally fascinates me.

  3. Mollybob says:

    You have introduced a new term to me too. I thought you were talking about a brand of electric car at first. As a fellow designer I know it can be really hard to get meaningful feedback with everything else going on, but the difficulty doesn’t diminish the importance. I have been guilty of saying “yes, I’ll get to that feedback soon” and leaving it too long. I’ve never measured how often we gather feedback, so the thought of something like tesla hadn’t occurred to me, but without relevant real-time feedback we fly blind. How do we know our programs are working and relevant otherwise? Perhaps some tesla benchmarks for designers would be beneficial? Thanks for the thought-provoking idea – I’m going to share it with my colleagues.

  4. Ken Allan says:

    Tēnā koe Gina

    He wasn’t the only aspie prominent in history, as you probably know already 🙂

    Catchya

  5. MadKat97 says:

    I can understand the TESLA number as applied to decision makers, but … who plays that role in corporate training? Some organizations already have a “technical refresh” schedule for technical courses, corresponding to software release dates. In addition, the maturity of a product comes into play as well (mature in the sense that the customer-facing feature set hasn’t changed much over several releases [Office 2007 is the anti-example]). It’s a valuable concept, but I’m not sure how it applies to corporate education.

  6. Jeff Goldman says:

    Regarding, “As instructional designers, if we have a course that is out of the design process and into a state of “production”, should we be concerned with that Tesla number? Or is this too much of a reach?”

    We should absolutely be concerned with it. And the ADDIE model requires it. Some may think the last phase (Evaluation) is an ending point, but after evaluation one must circle back to the first phase of the model (Analysis).

    I for one place an evaluation plan in every course design plan (e-learning and classroom training) and identify exact post-production review points along with who will be reviewing them. In addition, I put a “last reviewed” date on the first page of each course, hopefully instilling confidence that the course is up to date and functioning properly.

    Thanks for this great post and introducing me to this term (TESLA).
