My ‘Too Long; Did Read’ review of the ‘CISO Engagement and Decision Drivers Study’ from CyberTheory is, by necessity, much more negative than I’m generally comfortable writing, but it truly deserves the treatment. Despite the title of the study, it barely talks about engagement with CISOs in any meaningful way. It might be useful to a Marketing team, but is almost impossible to decipher and misses its target. Due to the colors chosen for the plots and graphs of the report, any hope of the reader drawing intelligence from the study is quickly drowned in a sea of blue ink.
To be clear, I’ve worked with and known much of the team at the Cyentia Institute and respect them greatly. I’ve followed their work for years and know what they’re capable of. I wouldn’t spend the time needed to read the report from front to back and comment on it if I didn’t know, beyond a shadow of a doubt, they are capable of something much better than what’s shown in this report. Please look away, Wade and team!
Overall Impression – When I review a report, I’m examining three aspects: A) What data is the report drawing on, B) How was the data analyzed, and C) How was the data visualized. I’m also looking at how it was laid out and edited, but that’s generally a minor part of my analysis. The CISO Engagement report fails, or nearly fails, on every one of these measurements. The data and the visualizations are rendered useless by the color choice, and the analysis is window dressing with key words thrown into the mix. I’m looking for guidance on how to use the data, rather than generic SEO feedback.
Who should read this? Marketing and content creation teams might gain some insight from this report. It is primarily aimed at people trying to connect to CISOs, after all. I would suggest that marketing teams skip straight to page 22 (or is it 39?) and the section titled ‘Reaching Your Audience’. There are a significant number of ‘Marketing Takeaways’ that may contain nuggets of wisdom for their consumption.
Security professionals should avoid reading this report. We’re not the target. If you’re interested in seeing what an SEO-driven content team thinks will grab your attention, dive in. But you’d better like blue and cyan, because there are no other colors to choose from in the CISO Engagement study.
The Good – The conclusions and reflections on page 29 have some good, if basic, feedback for marketing teams. I especially like the realization that “…campaign strategies lack an understanding of the subject matter, which leads to misperceptions…”. I’ve worked with good marketing teams who get this and engage subject matter experts for guidance. This is something to keep in mind no matter what industry you work in.
The examination of the effect of current events on article traffic (pg 17) is also a decent reminder to use the news to create traffic. Because the survey is drawing from ISMG’s data, I think this section fails to understand the difference between a news organization capitalizing on a breaking story and a vendor writing blog posts about how their product is the silver bullet for today’s disasters. This section deserves some careful thought; being involved with or acknowledging current events is generally a good idea. But the report fails to highlight how easy it is for a vendor to be seen as an ambulance chaser instead.
The cover is a pretty design. Which is where the positive feedback ends.
The Bad – There’s a lot to unpack in the negative feedback I have. The data behind this report could have been displayed and analyzed very differently to make it useful. Instead, any intelligence appears to have been drowned in favor of … something. I can’t rightly tell what the real purpose of the report is.
First off, the title. This report isn’t about CISO engagement. The first real mention of catching the attention of a Chief Information Security Officer doesn’t enter the report until page 26. Prior to that, it refers to ‘C-level executives’, and occasionally uses the acronym, but it’s rare. There are 48 uses of ‘CISO’ in the text, with most (25) of those simply being in the title in the footer of each page. Subtle hint: if you want someone to read your report, actually make it about the topic you promised and put that content as close to the front as you can!
Page numbering is another problem with the report. While the front and back cover are presented in portrait mode, the rest of the report uses landscape mode. The way the pages are numbered, every internal page is marked by an even number, even though only a few actually make use of a two-column format. This isn’t a 54-page report, it’s a 31-page PDF that’s pretending it was supposed to be printed.
The table of contents promises a research methodology page, which the report fails to deliver. Yes, it does show a simplistic breakdown of different elements of the data. However, this is nothing close to telling us where the data came from or how it was manipulated to create the plots and graphs for the report. Even if this self-aggrandizing pulp were considered insight into the data, it belongs at the end of the report. The first page of content should be reserved for a summary, a set of major talking points, a TL;DR, etc. In other words, hit them with your hook as quickly as you can, not with a dead trout.
By far, my biggest beef with this report is the choice of color. This report lost all credibility with the very first figure, on page 6 of the PDF. In the world of data visualization, color is used as a method of conveying information to the reader. This report strips out the vast majority of the information color can convey, instead using shades of blue and cyan.
This goes far beyond my usual rant about using colorblind-friendly palettes! Quite frankly, figures 1 & 2 are completely unreadable and a waste of their space and my time. Figure 2’s river plot is a travesty, with colors being reused repeatedly, which makes it impossible to draw conclusions from the visualization. The ‘observations’ for the plot are on the next page and even say, “If you squint, you may be able to make out…”. No, even if you squint, they’re impossible to make out.
The blues look pretty, if slightly monochrome and repetitive. They also wash out any hope of making use of the visualizations. Which is made worse because there’s little evidence the plots are related to the content in the first place. Please research colorblind-friendly palettes for the next version of the report. David Nichols offers a good starting point in his Coloring for Colorblindness article.
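Fixing this doesn’t take much work. As one illustration (my own sketch, not anything from the report or from Nichols’ article), here is the well-known Okabe–Ito palette, eight colors chosen to stay distinguishable under the common forms of color vision deficiency; the `palette` helper is a hypothetical convenience function for pulling series colors in order:

```python
# The Okabe-Ito palette: eight colors that remain distinguishable
# under the most common forms of color vision deficiency.
OKABE_ITO = {
    "black":          "#000000",
    "orange":         "#E69F00",
    "sky blue":       "#56B4E9",
    "bluish green":   "#009E73",
    "yellow":         "#F0E442",
    "blue":           "#0072B2",
    "vermillion":     "#D55E00",
    "reddish purple": "#CC79A7",
}

def palette(n):
    """Return the first n Okabe-Ito hex codes for a plot's series colors."""
    if n > len(OKABE_ITO):
        # Past eight series, vary shape or line style instead of color.
        raise ValueError("Okabe-Ito defines only 8 distinct colors")
    return list(OKABE_ITO.values())[:n]

print(palette(3))
```

A charting library will happily accept these hex codes as a color cycle; the point is simply that a handful of deliberately chosen colors beats a wash of near-identical blues.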
The data itself is of questionable value to most readers. The parent organization of CyberTheory, ISMG, has well over a decade of data on the effectiveness of different formats. My concern is the data is more reflective of their support of the different formats than it is of general trends. Without a proper methodology segment, it would be easy to think this report was drawn from a wider pool of data.
Overall – If you want guidance on SEO or content generation, read the various ‘Marketing Takeaways’ sections. You shouldn’t have to squint to gain something from the visualizations, but that’s exactly what CyberTheory is asking you to do. In the report, in their own words, no less. Simpler plots, better color choices, and analysis that draws conclusions for the reader would make for a vastly better version of the report in the future. Complex visualizations are great for data nerds, but the majority of the people targeted by this report don’t have the time, the energy, or the background to read a river plot or a dot plot at a glance. They shouldn’t have to.
I give the ‘CISO Engagement and Decision Drivers Study’ a D-. It’s saved from receiving an F because of the Marketing Takeaways, but only by a slim margin. By the way, the Introduction promises “… the brands creating the most engagement from their Q1 content marketing efforts.” I don’t see that delivered upon anywhere in the report.