Competitive Analysis 101: Eight-Factor Test of Editing Skill

May 2, 2019

A newly revised online news scoring system from Editorial Solutions Inc. will put your editing skills to the test. This competitive analysis system concentrates on the eight editorial factors most likely to trip up writers and editors of online publications. ESI will be applying this system in all online projects.

Of course, you are also welcome to try this system out on your own. If you do, here’s what you’ll be evaluating:

  1. Universality. As many articles as possible should be relevant to the broadest readership. You could argue that new product announcements fulfill that requirement. True, but this new system is not meant to be applied to new product sections. In fact, I’ll be creating a separate scoring option later for that purpose.
  2. Enterprise reporting. Scoring is based on the degree of enterprise reflected in online articles, rated as “high,” “low,” or “zero.” Since my site evaluations will continue to assess 10 articles on the day of review, my hope is that more than half fall into the “high” category.
  3. End-user quotes. Newsworthy quotes from vendors, associations, and other sources will earn partial scores. But end-user quotes remain elusive for most B2B sites. For example, in a site test I recently completed, only one of the 23 quotes was end-user sourced. This limited showing has been par for the course in my past e-news studies.
  4. Story-telling headlines. In many projects that include headline assessment, the headline writers’ efforts haven’t been all bad, but they could be better. The major shortfall is the absence of high-interest numbers in heads crying out for a quantitative flavor.
  5. News-first intros. Many site packages reviewed continue to fall into the “source first” trap. Especially annoying are those articles that begin with the name, title, and affiliation of the source. Sometimes this practice results in a batch of pointless prose before the real story begins. In my workshops, I always badger folks about arriving at a key news point within the first ten words of the article. In one scoring trial involving 20 articles, only seven reached the target. One article ran through more than 200 words before getting anywhere.
  6. Sentence flow analysis. I am a big Fog Index fan, but for those desiring to avoid the math mania involved, sentence flow analysis is my alternative. In a nutshell, if the average sentence length in an article exceeds 25 words, breaking up some long sentences is called for. In the 20-article test I referred to previously, 10 exceeded the 25-word target.
  7. Word count. My preference for a good word count target is 750-800 words.
  8. Link visibility. I follow a formula calling for an average of at least one link per article posted. Sounds easy enough, doesn’t it? But you’d be surprised. In fact, the 1.0 average is peanuts.
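The sentence flow check in factor 6 is easy to automate. Here is a minimal sketch in Python (my own illustration, not ESI's tooling) that uses a naive sentence splitter; real copy with abbreviations or decimals would need smarter tokenization:

```python
import re

def average_sentence_length(text: str) -> float:
    # Naive split on terminal punctuation; good enough for a rough flow check.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not sentences:
        return 0.0
    total_words = sum(len(s.split()) for s in sentences)
    return total_words / len(sentences)

def needs_breaking_up(text: str, limit: int = 25) -> bool:
    # Factor 6: flag articles averaging more than 25 words per sentence.
    return average_sentence_length(text) > limit
```

Run against the 20-article test mentioned above, a check like this would have flagged the 10 articles that exceeded the 25-word target.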

So there you have it. The maximum score possible using this system is 100, with 20 points for factor 1, 15 points each for factors 2 and 5, and 10 points for each of the remaining five factors. All sorts of scoring variations are possible if you wish to be more precise. Or contact Editorial Solutions Inc. to have us do the scoring for you!
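The weighting above can be sketched as a simple lookup. This is a hypothetical helper for readers scoring on their own, not ESI's actual scoring code; ratings here are assumed to be fractions of each factor's maximum:

```python
# Maximum points per factor (factor number -> weight); the weights total 100.
WEIGHTS = {1: 20, 2: 15, 3: 10, 4: 10, 5: 15, 6: 10, 7: 10, 8: 10}

def total_score(ratings: dict) -> float:
    # ratings maps factor number to a 0.0-1.0 fraction of that factor's points;
    # unrated factors score zero.
    return sum(WEIGHTS[f] * ratings.get(f, 0.0) for f in WEIGHTS)
```

A site earning full marks on every factor scores 100; one earning half marks across the board scores 50.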

Previously in this series

Measuring Editorial Enterprise

Is Your Content Really Better Than Theirs?

Upcoming: The next entry in this series will begin a review of a 25-factor approach to evaluating and scoring editorial quality.
