Let’s get a few points out of the way. I miss the tribal gathering of the L&D clan quite keenly this year. I miss bumping into people I have not seen for a while. I miss catching up with people I know will be there. I miss the day away from the desk and the inbox (how I yearn for that this February). I kind of miss touring the exhibition floor and seeing who has won the biggest-stand award and which business is dressed as superheroes. And, OH HOW I MISS the beer and chat at the end of the day. I even miss the DLR to ExCeL – yes, it’s that bad. (I do feel a little better after that keyboard rant, however.)
Having hosted a session at the event, I am now wondering how that compared to hosting a session in the room. What did I miss about that analogue experience? The set-up and briefing were so much better this week. I had time with the organisers and time with the chair, as usual. The major upgrade as a presenter, however, was the considered, expert care, attention and advice from James Booth in rehearsing, familiarising and guiding me along the Adobe Connect path. There is always a little time to check out the room, maybe shuffle some seating and so on, but it pales in comparison to the run-through I had and what I learned from James (and Anthony, to be fair). It was as close to a full rehearsal as you can get without actually rehearsing with an audience.
As a presenter/host/performer, I noticed that the quality of conversation was better as well (or at least I could detect it more easily). Being able to check the chat panels as we moved along enabled me to reflect on and respond to ideas and queries as we talked. This is more conversational and felt more natural than the managed Q&A and mic-passing of the break-out rooms. It also allowed attendees to keep chatting and asking each other questions without interrupting the session. This is a digital upgrade in my mind. All of it is written down and available in the platform too. This felt better.
Inevitably, as a performer, I wonder how it went. Did they really like me, my needy ego asks? This is much harder to tell in the Adobe Connect environment – reading the room is, literally, reading the comments. My hunch (with no comparative data) is that there was a good amount of chat and some good questions and contributions on the topic areas I had designed around. These lead me to believe we were along the right lines as a group. One indicator from the old world, as a speaker, is the number of people who approach you afterwards. So far, two kittens and one teenager making breakfast. All unimpressed.
I have wondered before about data sets for webinars and online conference sessions and not found much on offer. To gauge the success of the session, we might expect to see data on:
- Number of attendees
- Number of attendees as a % of registrants
- Staying power – % who stay to the end
- Leaving points – where people checked out
- Chat volumes
- Chat volumes as a % of attendees
- Active chatters – number making more than one contribution
- Chat topics/phrases/language used
- Clickers – those who click on resources and links
- And feedback ratings/scores, of course
Do conference organisers gather this kind of data? Would it help? It seems the digital event world should be investing in this kind of measurement – a rough sketch of how some of it might be pulled together from raw platform exports follows below.
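To make the wishlist a little more concrete, here is a rough, hypothetical sketch (in Python) of how a platform's raw exports might be turned into a few of these figures. The data shapes, field names and sample values are my own invention for illustration, not anything Adobe Connect or any other event platform actually provides.

```python
# Hypothetical sketch: turning assumed raw exports (registrants, join/leave
# times, chat messages) into the session metrics listed above.
from dataclasses import dataclass
from collections import Counter

@dataclass
class Attendee:
    name: str
    joined_min: int   # minutes after session start
    left_min: int     # minutes after session start

SESSION_LENGTH_MIN = 60

# Invented sample data standing in for a platform export
registrants = ["Ana", "Ben", "Cal", "Dee", "Eli", "Fay"]
attendees = [
    Attendee("Ana", 0, 60), Attendee("Ben", 2, 60),
    Attendee("Cal", 5, 35), Attendee("Dee", 0, 60),
]
chat = [("Ana", "Great point"), ("Ben", "How do you measure this?"),
        ("Ana", "Agreed"), ("Dee", "Link please?")]

# Attendance and staying power
attendance_rate = len(attendees) / len(registrants)
stayers = [a for a in attendees if a.left_min >= SESSION_LENGTH_MIN]
staying_power = len(stayers) / len(attendees)
leaving_points = sorted(a.left_min for a in attendees
                        if a.left_min < SESSION_LENGTH_MIN)

# Chat volumes and active chatters (more than one contribution)
messages_per_attendee = len(chat) / len(attendees)
msg_counts = Counter(name for name, _ in chat)
active_chatters = sum(1 for n in msg_counts.values() if n > 1)

print(f"Attendance: {attendance_rate:.0%} of registrants")
print(f"Staying power: {staying_power:.0%} stayed to the end")
print(f"Leaving points (min): {leaving_points}")
print(f"Chat volume: {len(chat)} messages "
      f"({messages_per_attendee:.1f} per attendee)")
print(f"Active chatters (2+ messages): {active_chatters}")
```

Chat topics, click-throughs and feedback scores would need richer exports than this, but even these basic counts would give a speaker something to read the room by.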
Thanks again to the Learning Technologies team for the invitation. I have plenty to consider, which is what I consistently take away from the event. I continue to highly recommend it.