In 2014, 3,408 B2B companies were surveyed for the B2B Content Marketing Trends-North America report from the Content Marketing Institute. According to the report, respondents produced over 42,000 pieces of content per year to support content marketing efforts. Of this content, 37% (or 15,692 assets) represented long-form content such as eBooks, white papers, and case studies, a need the PDF format serves well. Considering the complexity of the PDF format, companies with publishing workflows depend on various test suites to make sure their particular work practices yield the desired results. Working on the support side of the PDF ecosystem, I’ve seen my share of malformed PDFs and all the problems that can stem from them. I’m thankful to all such suites, which weed out billions of flawed PDFs before they become a problem for my customers and, consequently, for me. However, in this article I want to caution against over-reliance on test suites and appeal to the common sense of publishers and content creators alike.
For those of you who may be unfamiliar with the PDF industry, I will try to outline the role that test suites play in PDF-centered workflows. According to the article “PDF in 2016: Broader, Deeper, Richer” from the PDF Association, Phil Ydens, Adobe Systems VP of Engineering, conservatively estimates that 2.5 trillion PDF documents are created each year. Even if we consider a portion of these documents to be personal, we are still left with a colossal number of files that find their way to digital publishers, print presses, and archives around the world. To keep their workflows feasible, companies need to establish standards both for the PDFs that are submitted and for the tools they use to optimize those files to the output standards they are trying to meet. This is where test suites come into play. While test suites differ, their main objective is to determine whether workflows are behaving as expected. To test your optimization tools, some suites will provide you with a package of PDFs that you can run through an already established workflow to see whether your process can pass the test (pun intended). Below I will examine a PDF provided by a test suite utility that I will call G-test for this article. I will not give the product’s actual name, since I believe the test suite does a great job of helping companies pick the right PDF tools and check whether their applications conform to the PDF standards, among other offerings I’m not familiar with. The objective of this article is to give people a better understanding of the test process and the results they see.
The G-test is a popular test suite used, among other things, to gauge how accurately a given PDF tool prepares files for printing. As such, some versions of it contain PDFs that are too complex for many PDF utilities currently on the market to process. As an example, I will use one of the PDFs I have come across more than once.
The file consists of 16 “squares” under which sit 16 “X”s. Both the “squares” and the “X”s are in an ICCBased color space, and the intent of the test is to demonstrate whether a given PDF tool can prepare the file for printing. As explained in the PDF itself, after pre-print processing the “X”s should be faintly visible or invisible; clearly visible “X”s constitute a failure. As I mentioned earlier, I’ve come across this file on multiple occasions, since it has caused a lot of distress to some of my customers. If you open the file with Adobe Reader, you will see the underlying “X”s without any alterations to the PDF, but one could argue Reader doesn’t convert PDF to PDF/X (Press-Ready PDF, as defined by Acrobat DC). Furthermore, when I used Acrobat 11 Pro or Acrobat DC to convert the file to PDF/X, the output still had some unwanted artifacts.
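As a quick way to tell whether a conversion actually produced a PDF/X file, you can look for the output intent the standard requires. The sketch below is my own illustration in Python (the article itself names no tooling): a conforming PDF/X file carries an /OutputIntents array whose subtype entry is /S /GTS_PDFX. This is only a crude byte-level triage; PDF objects are often stored compressed, so a definitive check needs a real PDF parser.

```python
import re

def declares_pdfx_intent(pdf_bytes: bytes) -> bool:
    """Crude triage: does the raw PDF mention a PDF/X output intent?

    A PDF/X file must carry an /OutputIntents entry whose subtype is
    /GTS_PDFX. Compressed object streams won't match this byte scan,
    so a negative result is not conclusive on its own.
    """
    return (b"/OutputIntents" in pdf_bytes
            and re.search(rb"/S\s*/GTS_PDFX", pdf_bytes) is not None)
```

Running this over a file that Acrobat claims to have converted is a cheap first sanity check before sending it through a full preflight tool.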
I proceeded to test with different PDF tools across various price ranges, but I couldn’t produce the desired output. I examined the PDF to see whether some of the objects had a color space different from what was declared, but everything appeared to be as stated. I found no syntax errors, nor any unexpected isolated groups or knockout settings. I consulted the PDF ISO standard to make sure the output the test was aiming for was in line with it.
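The same kind of byte-level triage can give a first impression of which color spaces a file declares before digging in with a full parser. The name list and approach below are my own illustration, not part of the G-test or any specific tool; compressed object and content streams won’t match, so treat the counts as a lower bound.

```python
import re
from collections import Counter

# Color space names worth checking when auditing a print-bound PDF.
CS_NAMES = (b"ICCBased", b"DeviceRGB", b"DeviceCMYK",
            b"DeviceGray", b"Separation", b"Indexed")

def color_space_counts(pdf_bytes: bytes) -> Counter:
    """Count occurrences of well-known color space names in raw PDF bytes."""
    counts = Counter()
    for name in CS_NAMES:
        counts[name.decode()] = len(re.findall(rb"/" + name + rb"\b", pdf_bytes))
    return counts
```

A file like the G-test sample should show ICCBased entries and little else; a surprise DeviceRGB in a press-bound file is the kind of mismatch worth investigating further.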
By now, you must have realized that this test file irritated me a lot more than it did any of the customers who reported it. To achieve the correct result, I had to change the target output color space profile. What made a lasting impression on me was that the original resulting PDF was Press-Ready according to many of the print shops in the area. In a sense, the G-test had done exactly what it was designed to do: it exposed a vulnerability in any workflow that involves PDFs identical in specification to the test file. I had simply been looking at the situation from the wrong perspective. Test suites play an important role when selecting a PDF tool for your business or hobby, but if a given workflow doesn’t pass with flying colors when you run it against a test suite, don’t rush to change things. Make a realistic estimate of whether you will actually use the PDF features that trigger the failure, and create detailed guidelines for the PDFs that your customers will create and submit to you for printing or publishing.