Push Your Event Marketing E-mails Ahead of the Pack

During close to 30 years of direct marketing work helping trade shows and conferences boost attendance and sell exhibitor space, clients have often asked us for guidance on event-industry response rates for e-mail campaigns. Now we can enhance the data pulled from our own proprietary research and experience with Eventbrite's new "2017 Event E-mail Benchmarking Report," which compares survey responses from over 340 event organizers across the U.S. and U.K., covering a range of event types and sizes.

Benchmarks to Emulate

If you're an event marketer with only a fuzzy notion of the basic response measure of click-to-open rate (CTOR) — unique clicks as a percentage of unique opens — you're not alone. The benchmark report found that 39% of respondents didn't know their average CTOR. That's a knowledge gap these event pros need to close if they hope to match even average e-mail results. The rest of the U.S. event organizers surveyed reported an average CTOR of 12%. That was higher than their U.K. counterparts' 9% average, but far behind the enviable 17% of U.S. organizers who reported a CTOR of 21% or higher! Festivals scored the best average e-mail CTOR (14%), while classes and workshops had the lowest (9%).
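For those catching up on the metric, the calculation itself is simple. Here is a quick sketch (the function name and sample inputs are our own illustration; the 12% figure matches the U.S. average reported in the survey):

```python
def click_to_open_rate(unique_clicks, unique_opens):
    """Click-to-open rate: unique clicks as a fraction of unique opens."""
    if unique_opens == 0:
        return 0.0  # guard against division by zero for unopened campaigns
    return unique_clicks / unique_opens

# A campaign with 1,000 unique opens and 120 unique clicks
# works out to the 12% U.S. average reported in the benchmark survey.
ctor = click_to_open_rate(120, 1000)
print(f"CTOR: {ctor:.0%}")  # → CTOR: 12%
```

Note that CTOR differs from click-through rate, which divides clicks by e-mails delivered rather than by opens.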

Copy & Design to Boost Click-to-Open Rates

Event marketers who want to improve CTOR can commit to a number of basic creative tactics. First, revisit layouts to make sure they direct recipients to a clear, compelling call-to-action. Copy should be relevant, personalized and written to avoid spam filters, from a subject line that entices opens through body copy that wins clicks. Mobile optimization is a must now that the majority of e-mails are opened on mobile devices. Note that the most effective e-mails today also include an engaging image: e-mail research has found that campaigns with imagery have a 42% higher CTOR than campaigns without. (And don't forget to comply with CAN-SPAM opt-out and privacy requirements, of course.)

Target, Test, Automate, Integrate

As data brokers, we must remind marketers that response depends even more on the quality of targeted, opt-in e-mail data (whether house or rental lists) and on professional software and database support for list segmentation, updating and permission management, as well as results tracking, testing and analysis. Indeed, however carefully crafted the e-mail creative, results measurement and analytics are essential to a direct marketing basic: testing creative, lists and targeting to find what works best. Automating event updates and confirmation/thank-you e-mails has also proven its value in maximizing click-through rates and conversions/registrations. Finally, e-mail gains the most reach as part of a consistently branded, multi-channel effort that leverages social media's e-mail list-building strategies, for example, as well as the proven marketing power of direct mail. (Ask us about our Digital2Direct marketing program, which matches postal and opt-in e-mail records to send targeted mail and e-mail to the same recipients.)

For more metrics from the new event e-mail benchmarking survey, get the free report at https://www.eventbrite.com/blog/academy/2017-event-email-benchmarking-report/

How Direct Mail Testing Factors Differ by Product Stage

Direct mail success is all about testing — lists, offer, creative, and, of course, the product/service itself. While there’s no single formula that applies to all our direct mail consulting clients, Malcolm Decker’s excellent article “How to Test Your Direct Mail” in Target Marketing magazine’s resource section offers some useful guidelines.

Testing for a New Product

Decker weights the various direct mail testing parameters differently at each point in a product's life cycle: testing a new product, honing the success of an existing product, and testing to revive a mature product. For example, his ideal new-product test mails 120,000 names, with the house list providing less than 20% of the names, and tests 15 different lists, three different prices/offers, and three different creative packages. Looking at the relative contributions of the testing factors, he notes that even the most well-researched new product can swing results by plus or minus 30%. Mailing lists, ranging from tightly targeted response lists to larger, broader and thus riskier lists, contribute another plus or minus 30%, based on Decker's experience. The price/offer can deliver another 30% up or down. And last, the creative factor for a new product can move the testing needle by plus or minus 10%. Decker assumes proper timing, since the difference between the peak season and the trough in demand can be a whopping 40% of response (check Who's Mailing What! archives and seasonality tables if unsure).
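To see how those swings could compound, here is a rough sketch (our own illustration, assuming the four factors act independently and multiplicatively on an assumed 1% baseline response rate, which the article does not specify):

```python
# Decker's new-product testing factors and their potential swing, per the article
factor_swings = {
    "product": 0.30,
    "lists": 0.30,
    "price_offer": 0.30,
    "creative": 0.10,
}

baseline_response = 0.01  # assumed 1% baseline response rate, for illustration only

best = baseline_response
worst = baseline_response
for swing in factor_swings.values():
    best *= 1 + swing   # every factor breaks in your favor
    worst *= 1 - swing  # every factor breaks against you

print(f"Best case:  {best:.2%}")   # roughly 2.42%
print(f"Worst case: {worst:.2%}")  # roughly 0.31%
```

Even under these simplifying assumptions, the spread between an everything-works test and an everything-fails test is nearly eightfold, which is why Decker spreads the budget across lists, offers and creative rather than betting on one variable.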

Honing Success and Maturity Challenges

Once marketers have a couple of years of mailing results to help determine price elasticity, list universe, creative preference, premium impacts and so on, Decker notes that the 30-30-30-10 relationship of start-up testing shifts. The product can't add much to response unless it is revised. The list universe is substantially explored, so new, more effective list contributions are scarcer; lists now improve results by perhaps just 10% up or down. New offer twists, on the other hand, can goose interest in a well-known product by plus or minus 40%, and creative changes in copy and design can help re-position and expand markets for a potential 50% swing either way.

Once a mature product's proven marketing choices face the challenges of competition or changing tastes and demographics, the key factors shift once more. Testing now may involve a restaged product with widened appeal, which can deliver a 20% shift in either direction. A restaged product can also open the known list universe to new lists and improved results from existing or marginal lists, for another 20% difference. A retooled product will require more price/offer testing that can shift results another 30% up or down. Finally, a new creative strategy can breathe life into response for a potential 30% gain (or dip).
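The stage-by-stage shifts above can be collected into one small table. The sketch below is our own summary of the figures cited from Decker's article (the zero for an unrevised product at the honing stage is our reading of "can't add much"; ties go to the first factor listed):

```python
# Potential response swing (plus-or-minus %) per testing factor, by product stage,
# summarizing the figures cited from Decker's article
stage_weights = {
    "start_up": {"product": 30, "lists": 30, "price_offer": 30, "creative": 10},
    "honing":   {"product": 0,  "lists": 10, "price_offer": 40, "creative": 50},
    "mature":   {"product": 20, "lists": 20, "price_offer": 30, "creative": 30},
}

for stage, weights in stage_weights.items():
    dominant = max(weights, key=weights.get)  # factor with the largest swing
    print(f"{stage}: focus testing on {dominant} (±{weights[dominant]}%)")
```

Read this way, the budget logic is clear: spread tests evenly at launch, shift spend toward offers and creative once the list universe is mined out, and bring the product itself back into the test plan at maturity.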

A Caveat on Formulas

Decker's exposition is a quick guide for allotting effort and resources in direct mail testing at each stage of a product's life cycle, but he warns that formulas are sometimes contradicted by market experience. As he notes, the strongest list among 15 may produce 20 times the revenue of the weakest! New creative can beat a proven control with a 100% bump in response. And no formula applies equally to all product types, from computers to cornflakes. Download the whole article at http://www.targetmarketingmag.com/resource/how-to-test-your-direct-mail/