The NEPUG Spring 2010 meeting was a big success. We had 19 people on-site (including one walk-in, and not counting the Progress people who helped out or dropped in from time to time, nor those who were watching the WebEx from another room) and an additional 26 remote attendees. That may be an all-time record for NEPUG! 25 people responded to the post-meeting survey -- a better than 50% response rate.
More or less random notes:
General
=======
The meeting topic was very popular and the speaker an excellent choice -- this probably helped to draw quite a few people, especially the remote attendees.
The all day format seems to work well for us. We have done this several times now and we seem to get better attendance than at the evening meetings.
We did not, however, manage time very well. More on that later.
Pre-event planning was a bit too ad-hoc and various communications snafus marred our performance (details scattered in notes). We need to develop a checklist and pay more attention to the details.
The "workshop" parts of the agenda were the major attraction for a lot of people but the most troublesome to execute. More on that later.
The mixed live & remote audience worked reasonably well. We did have occasional audio issues that could probably be mostly resolved by clipping a microphone on the presenter (when he turned away from the speakerphone, remote people could not hear). Needing to "repeat the question" was an issue as well (just like it always is in a large room). Local audience "chatter" was an occasional problem.
Announcements & Such
====================
We sent out a series of meeting announcements and reminders -- more than usual while trying to avoid being completely obnoxious.
PEG was the #1 source of referrals (30%). Our internal mailing list was #2 and word of mouth #3 (although if you add in "other", which seems to be mostly "detailed word of mouth", it becomes #1...).
The web based online communities (Progress Communities and ProgressTalk) were disappointing as referral sources.
Progress Corp sent out an "e-mail blast" that was, in theory, targeted at our geography, but only 3 people reported seeing it (I never saw it, and I am on lots of Progress mailing lists -- both electronic and paper...).
We might have done better if we had worked harder at "PUG Outreach". We had a couple of late inquiries from other PUGs.
Registration & EventBrite
=========================
EventBrite was a very easy service to use for handling the registration and payments. They made it dead simple.
One attendee was nervous about making an online payment with a company that was unknown to him (EventBrite).
Having a cost for the meeting and requiring registration did not appear to negatively impact attendance and it was very useful to know how many people would be attending.
We got our final registration Sunday night (the meeting started Monday morning...)
We should have had a "check in" process locally. We had a walk-in attendee but didn't realize it until we were well under way and he needed login credentials.
Every e-mailed meeting announcement resulted in a few more people registering.
The "ticket" was confusing to some people, especially remote attendees. I overlooked the option to suppress it within EventBrite. It might be handy as part of a check-in process for on-site people, but it was pointless for remote attendees.
EventBrite's fees probably should have been "baked in" to the ticket price rather than an add-on (that is an available option).
Nice features of EventBrite included the capability to tailor the data collected at registration (including custom data elements), the capability to print name tags, easy e-mails to all registered attendees and an exported attendee list.
EventBrite was very prompt about mailing our proceeds check.
The Facility
============
The directions that we provided for getting to the meeting were late and inadequate. We also neglected to instruct people where to park and door signage was poor.
The room was nice but too small, especially for an all-day event. We had 18 registered, 1 walk-in, our speaker and a couple of occasional Progress people. Peak occupancy was probably 22. It felt like 12 would have been the comfort limit.
The kitchen area right next to the room was great. Progress shared their coffee with us. It was very good.
Getting in and out of the building required someone to keep a watch on the lobby to let people back in.
We provided snacks and drinks from Costco. Lunch was pizza that we had delivered. This kept expenses down. There was some negative feedback regarding lunch on the survey, but since there were more responses to the question than there were local attendees, I suspect that was mostly remote people making jokes. (Everyone who expressed an opinion at the event itself seemed happy with the arrangements.)
Running network cables to each seat in the room was a mess. We probably should have just had everyone use the WiFi. (We were concerned about bandwidth on the WiFi but probably didn't need to be.)
Progress' internet connection was very fast. No problems there. Progress had no objections to our providing our own WiFi access point -- but that could be an issue at other locations. (The idea was for the speaker and the moderator to use the WiFi while everyone else was using the cabled connections. In reality, at least half a dozen people used the WiFi.)
Progress provided the facility and WebEx at no cost. That was a big plus and the comments above and below should not take away from that.
By comparison we might have been able to have the meeting at a hotel or other similar location. That would have had some pros & cons:
Cons: Cost. The Nashua Marriott Courtyard wanted $300 for a 30-person room, $33.50 per person to feed us (required for the $300 room price), $280 for a projector and a screen, and $200 for internet access. Total $1,237 (22 people). I suspect that we would have found other costs as we got deeper into it but probably would have stayed under $2k.
Pros: Bigger room with other, even larger, options easily available. Easy to find. Parking and building access much easier, etc.
Worries: Internet connection speeds, WiFi, WebEx, Conference Bridge.
The Cloud
=========
Using the Amazon Cloud for the workshops made them a technically viable option. Having to manage and configure local desktops plus laptops would not have been as easy or as consistent.
Progress was very helpful and supportive in configuring the Cloud images and providing licenses to Progress products for the meeting.
Having said that... we needed to make the image available to people in advance of the meeting so that they could test their connectivity and setups. There was a *lot* of nervousness about that. I had to field a lot of questions the week prior and I didn't have much in the way of answers. The answers changed a few times too, which led to some confusion and some unhappy survey results.
The major issue with providing early test access is that leaving the images running all weekend costs money -- although in the grand scheme of things it wouldn't have been much. Maybe we can find a way to have them shut down automatically after some period of inactivity?
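One way that might work (a hypothetical sketch, not something we ran): a small watchdog on each image that samples the machine's load periodically and powers it off after a sustained idle period, which stops the Amazon billing clock. The threshold, check interval, and idle limit below are all assumptions and would need tuning for the actual images.

```python
import os
import subprocess
import time

IDLE_THRESHOLD = 0.05   # assumed: a 1-minute load average below this counts as "idle"
CHECK_INTERVAL = 300    # assumed: seconds between samples
IDLE_LIMIT = 12         # assumed: consecutive idle samples before shutdown (12 * 5 min = 1 hour)

def is_idle(load_sample, threshold=IDLE_THRESHOLD):
    """A sample counts as idle when the load average is below the threshold."""
    return load_sample < threshold

def should_shutdown(samples, limit=IDLE_LIMIT):
    """True once the most recent `limit` samples are all idle."""
    recent = samples[-limit:]
    return len(recent) == limit and all(is_idle(s) for s in recent)

def watchdog():
    """Sample load forever; halt the machine after a sustained idle period."""
    samples = []
    while True:
        samples.append(os.getloadavg()[0])  # 1-minute load average (Unix only)
        if should_shutdown(samples):
            # Halting the instance stops it from accruing charges.
            subprocess.call(["shutdown", "-h", "now"])
            return
        time.sleep(CHECK_INTERVAL)
```

A Windows image would need a different idle signal and shutdown command, but the "N consecutive idle samples" logic would be the same.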
Total Cloud costs for 50 images running all day were estimated at around $125.
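For what it's worth, that estimate is consistent with simple back-of-the-envelope arithmetic (the $0.25/instance-hour rate here is an assumption for illustration, not necessarily Amazon's actual price for the instance type we used):

```python
def cloud_cost(instances, hours, rate_per_hour):
    """Total cost of running `instances` identical images for `hours` each."""
    return instances * hours * rate_per_hour

# 50 images for a 10-hour meeting day at an assumed $0.25/instance-hour:
print(cloud_cost(50, 10, 0.25))  # 125.0
```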
There were issues with starting so many instances of the common image. Amazon has some limits that we were not aware of until we tried to start 50 instances Monday morning. Roy Ellis has the details. Roy was encouraged to turn this experience into a White Paper since this particular usage of cloud computing is enough different from the others that Progress has published that it seems valuable to document it.
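The usual workaround for a per-request cap like that is to ask for the instances in smaller batches rather than all 50 at once. A generic sketch (the batch size and the `start_batch` callback are hypothetical; the real call would go through Amazon's API, and the actual limits are in Roy's notes):

```python
def start_in_batches(total, batch_size, start_batch):
    """Request `total` instances in chunks of at most `batch_size`,
    invoking `start_batch(n)` once per chunk. Returns the batch sizes used."""
    batches = []
    remaining = total
    while remaining > 0:
        n = min(batch_size, remaining)
        start_batch(n)      # e.g. one launch request per batch
        batches.append(n)
        remaining -= n
    return batches

# Starting 50 images with an assumed per-request cap of 20:
print(start_in_batches(50, 20, lambda n: None))  # [20, 20, 10]
```

A real version would also want a retry/back-off around each batch, since a batch can still fail against account-level quotas.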
We did not use the instructor's ability to "remote control" a desktop, except for one remote participant during a coffee break. For an "emergency scenario" it would have been too slow: Mike needed to log in to the VM as an Administrator, then find the remote desktop administration tools, then take over control of the session. That's a minimum of a minute or two to get onto someone else's desktop -- too slow. Maybe next time we should install something like TeamViewer on the VMs; it seems a much better fit for instantly taking control of a remote participant's desktop.
Cloud performance was excellent.
WebEx
=====
The WebEx setup worked well for sharing the presentations and demos with the online audience. There were a couple of minor adjustments that were made along the way and we probably should have had Marlys online right at the start.
Having someone dedicated to monitoring the WebEx chat for questions seemed to work well for bringing remote participants into the loop.
A couple of times Mike's screen and the WebEx were out of sync. Easily fixed but we had to keep an eye on it or he'd be talking about stuff that remote attendees couldn't see.
The biggest mistake we made was not recording the sessions. That turned into a major problem when it became clear that we were not going to finish on time and people began asking about recordings. Rob managed to get the recording turned on but we still missed the earlier stuff.
The Event Itself
================
We were probably too "wide" on topics and therefore too shallow on details. All the topics were interesting but we needed better focus. To Mike's credit, he did a very good job of weaving OO, OE Architect and the new GUI into a single stream of thought. That is a lot for anyone to present in a single day.
The workshops were harder to deliver than we expected. Too much time was lost on figuring out small details that could have been resolved by a "cheat sheet" or detailed offline instructions. In hindsight, we should have been more scripted when it came to the workshop routines. We should also have spent more time doing things. More doing and less lecture time.
The level of experience of different participants contributed to some people being impatient to move on and others being mired in the details. (This is probably a problem in all workshops.)
We did a poor job of sticking to the timetable. This was apparently more frustrating for remote attendees than for local ones. One suggestion: have someone sit in the front row next to the presenter as a "moderator" who keeps an eye on the clock and reminds people to stay quiet and focused, so that the presenter can concentrate on the workshop itself -- like a room monitor at Exchange conferences.
The Survey
==========
Survey Monkey made the post meeting survey easy. The feedback obtained was very valuable.
Prodding people to complete the survey worked to some degree. But the response rate tailed off quickly after the first couple of days. I closed the survey after a week.
Some sort of incentive might help to get more responses but 25 out of 45 is a pretty good rate. Perhaps we could make access to downloads contingent on completing the survey?
The survey gave us a lot of great feedback about what we did well, what we did poorly, how we can improve and what we might want to offer in the future.
Some questions should have been specific to local or remote attendees. That was a design flaw -- Survey Monkey does support that.