A Wealth of Ways to Use Data for B2B Editorial Management

June 26, 2020

“Get Serious About Numbers!” is the title of the presentation I made this week at the annual virtual conference of The Alliance of Area Business Publishers (AABP). As I noted to my audience, now is a particularly good time to apply quantitative analysis to your editorial operations and performance. Of special concern should be how well you are prepared to do a quantitative match-up of your publication with current and potential competitors. “Variations of approaches listed in this report may already be in place at your firms,” I told AABP workshop attendees. “But it’s equally likely that even if you know such programs exist, you’ve not found time to get serious about putting numbers to work for you.”

This lack of seriousness describes the situation at most B2B publishers. My long-term efforts to convince editorial leadership to manage with hard data rather than vague adjectives have not been embraced enthusiastically. For example, editorial vulnerabilities spotted in my first e-news delivery study eight years ago persist today.

If more precise standards are now on your agenda, consider how you might handle three key tasks in the immediate future:

  • Determine how long it realistically takes to complete every task you might assign to a staff member.
  • For all staff members who conduct interviews, prepare a sample list of questions that the interviewee can only answer with a number.
  • When screening editorial job applicants, make use of a checklist of the key traits you are looking for in new hires. Score each item with 0 to 3 points and record the total for each applicant. Having the results at hand can be especially useful if you need a tie-breaker to decide between two candidates.
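The checklist tally in the last item above could be kept in a small script. A minimal Python sketch, with placeholder trait names (the actual checklist items would be your own, not these):

```python
# Hypothetical hiring checklist: score each trait 0-3 and total the points.
# Trait names are illustrative placeholders, not the author's actual list.
TRAITS = ["writing", "interviewing", "industry knowledge", "deadline discipline"]

def score_applicant(ratings):
    """ratings maps each trait to 0-3 points; returns the total score."""
    for trait, points in ratings.items():
        if trait not in TRAITS or not 0 <= points <= 3:
            raise ValueError(f"bad rating: {trait}={points}")
    return sum(ratings.values())

candidates = {
    "Applicant A": {"writing": 3, "interviewing": 2,
                    "industry knowledge": 2, "deadline discipline": 3},
    "Applicant B": {"writing": 2, "interviewing": 3,
                    "industry knowledge": 3, "deadline discipline": 2},
}
totals = {name: score_applicant(r) for name, r in candidates.items()}
```

A tie in the totals (as in this made-up pair, 10 points apiece) is exactly the case where the per-trait breakdown becomes the tie-breaker.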

Cover Maximum Quantitative Bases

There are plenty of ways to apply quantitative thinking to editorial activities. Among the most complicated is a 20-day evaluation system for measuring the time spent on different types of work. In my earlier consulting years, I used seven factors: original writing, editing the work of others, production time, travel, author recruitment, miscellaneous meetings, and online activity. Eventually, I spun off online activity for separate analysis due to its unique complexities. Today, with digital media, there are at least a dozen components worth quantifying.

The idea here is to have each editor being evaluated track how much time is spent monthly on each type of work. You may determine that a month consists of anywhere from 20 to 22 eight-hour days, depending on your situation. How often does the total exceed the monthly day limit? Which factors contribute most to the overage? Are shortcuts available that can bring the excess in line? Coming up with the answers may not be easy, but the results will help you optimize your staff’s activities and assignments.
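The monthly comparison described above amounts to a simple budget calculation. Here is a hypothetical sketch in Python using the seven original factors; the hours logged, the 21-day month, and the eight-hour day are all invented parameters you would set for your own shop:

```python
# Hypothetical monthly time log for one editor, using the article's seven
# original factors. The hours are invented for illustration only.
FACTORS = ["original writing", "editing others", "production", "travel",
           "author recruitment", "meetings", "online activity"]

def month_overage(hours_by_factor, workdays=21, hours_per_day=8):
    """Compare logged hours with the monthly budget; rank factors by hours."""
    budget = workdays * hours_per_day
    total = sum(hours_by_factor.values())
    overage = max(0, total - budget)
    # Largest time sinks first -- the place to look for shortcuts.
    ranked = sorted(hours_by_factor.items(), key=lambda kv: kv[1], reverse=True)
    return total, overage, ranked

log = {"original writing": 70, "editing others": 40, "production": 25,
       "travel": 10, "author recruitment": 8, "meetings": 12,
       "online activity": 15}
total, overage, ranked = month_overage(log)  # 180 hours against a 168-hour budget
```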

Now let’s consider some less demanding analytical options:

  • Graphics analysis is a good place to begin. For print publications, my system looks for 80% of editorial pages to carry four-color illustrations, no more than 10% of editorial pages to rely on all-type layouts, and at least 20% of pages to offer infographics.
  • Define and score important show-issue factors, which may include as many as 20 items.
  • Distribute self-scoring profiles to staff for training. One example I’ve given is a 10-factor self-assessment of complaint handling. Other profiles I’ve developed address editorial marketing, feature writing, meeting trade show challenges, becoming someone in your industry, and identifying editorial burnout.
  • Analyze editing basics. For openers, you can apply the Fog Index to measure the readability of finished articles. If Fog Index calculations are too time consuming, you may want to switch, as I recently did, to a less complicated approach based on an average sentence length not exceeding 25 words.
  • Use or modify this set of 25 qualitative factors to get a big-picture assessment of your publication’s performance.
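The simplified readability check mentioned in the list above (average sentence length capped at 25 words) is easy to automate. A rough Python sketch, using a naive sentence splitter that is good enough for a quick screening pass:

```python
import re

def avg_sentence_length(text):
    """Average words per sentence, splitting naively on ., ! and ?."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not sentences:
        return 0.0
    words = sum(len(s.split()) for s in sentences)
    return words / len(sentences)

def passes_25_word_test(text):
    """True if the piece meets the 25-words-per-sentence ceiling."""
    return avg_sentence_length(text) <= 25
```

Abbreviations such as "U.S." will fool a splitter this naive, so treat the result as a screening signal rather than a precise measurement.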

E-News Scoring Is a Worthy Effort

Let’s now consider probably the most significant example of numbers at work: e-news scoring. When it comes to e-news quality, most B2B websites have missed the mark in all eight of my annual studies. In the latest study, I set a minimum target score of 60% for content clearly reflecting enterprise effort. Only four of the 50 sites I analyzed met the target, and only four more earned scores above 40%.

Here is the list of standards I use in my analysis and the maximum scores possible for each: urgency (10), enterprise (20), quotes used (20), headlines (10), key news first (10), word count (10), average sentence length (10), and total links (10). You can shuffle standards and scores within the set of articles you analyze to focus more heavily on a different factor.
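A useful property of the weighting above is that the eight maximum scores sum to 100, so a site's total doubles as a percentage. A minimal Python sketch of the tally (the point awards shown are made-up examples, not data from the study):

```python
# The eight e-news standards and their maximum scores, as listed above.
MAX_POINTS = {"urgency": 10, "enterprise": 20, "quotes used": 20,
              "headlines": 10, "key news first": 10, "word count": 10,
              "average sentence length": 10, "total links": 10}
assert sum(MAX_POINTS.values()) == 100  # total doubles as a percentage

def site_score(awarded):
    """Sum awarded points after checking each stays within its maximum."""
    for standard, points in awarded.items():
        if not 0 <= points <= MAX_POINTS[standard]:
            raise ValueError(f"{standard}: {points} is out of range")
    return sum(awarded.values())

# Invented point awards for one hypothetical site:
example = {"urgency": 5, "enterprise": 15, "quotes used": 10, "headlines": 8,
           "key news first": 6, "word count": 7, "average sentence length": 9,
           "total links": 4}
```

Reweighting to stress a different factor just means editing `MAX_POINTS` so the values still sum to 100.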

More details on my approach to e-news scoring can be found in the following articles:

April Tweets: Eighth Annual E-News Study Finds Basic Editing Lessons Still Unlearned

8th Annual E-News Study Highlights: Improving Enterprise Reporting Is Key to Beating the Competition

Want to Calculate Your Enterprise Reporting Score? Begin by Defining Three Values

For an even deeper dive into the details of quantitative editorial analysis, you may wish to refer to my two books, Get Serious About Competitive Editorial Analysis and Get Serious About Editorial Management.

