Everyone knows Yelp reviews are unscientific.
Now experts warn that customer surveys with a lot more at stake are being put together using similar methods.
Rankings published by a firm called Black Book Market Research are used by major corporations in the medical-information technology business to win lucrative contracts from hospitals and health-care systems. Black Book, led by industry veteran Douglas Brown, describes itself as “fiercely independent,” but it’s paid by the vendors it rates, lifts large portions of its reports from other suppliers and uses crowd-sourcing methodology that has more in common with Yelp and Amazon than with major market-research firms such as J.D. Power, according to experts.
A number of leading sellers of health-care information technology, including Allscripts Healthcare Solutions Inc., Cerner Corp. and McKesson Corp., have promoted Black Book rankings in their press releases, regulatory filings, earnings calls and sales pitches. Black Book’s findings have also made their way into congressional testimony and press reports on the coronavirus.
Until earlier this year, Black Book’s leadership web page was populated with photos and fake biographies of several people who never worked for the firm. Anil Raj, a managing director at an energy company in India, was surprised to learn that his photo was used to depict “Vijay Jain,” billed as Black Book’s director of finance.
“It most certainly is a photograph of me,” Raj said in an email. “It is disturbing to see this image is being used by a company I have no connection to.”
Brown said the photos and bios were the result of problems at an outsourcing firm in India and that Black Book has taken them down. Brown said he’s never met his billings-and-collections people, who work overseas. The apparently fictional “Vijay Jain” was listed as Black Book’s finance director for two years.
Black Book now lists Brown as its lone executive. Even so, Black Book said it polled an average of 840,000 industry professionals last year in over 300 ongoing market surveys. That level of production puts Black Book roughly on par with big, nationally known researchers like J.D. Power and Gartner Inc.
“Black Book manages to complete the crowd-sourced survey processes with an efficient number of full-time staff supplemented by several outsourced market-research support firms, part-time staff, contractors, and consultants,” Brown said in a written response to Bloomberg’s questions.
Misinformation, quack remedies and mixed messages have complicated the struggle against the coronavirus pandemic, and though Black Book’s reports touch on Covid-19 only indirectly, claims from a medical-research source that has fake employees and uses methods questioned by industry experts can contribute to the confusion.
Two professional researchers, who reviewed Black Book’s methodology at Bloomberg’s request, cautioned that Black Book’s crowd-sourcing is susceptible to producing skewed results. Black Book says its minimum sample size can be as few as five respondents per brand, compared with a typical industry minimum of 30, according to one of the researchers, who requested anonymity. As a result, Black Book’s polls appear to have more in common with “qualitative” Yelp or Amazon.com Inc. consumer ratings than with the surveys produced by a J.D. Power or Nielsen, the researcher said.
Brown said some crowd-sourcing techniques can yield accurate results.
But Sunghee Lee, an associate research scientist at the University of Michigan’s Survey Methodology Program, said Black Book’s lack of detail in its disclosures on methodology made it impossible to evaluate for scientific rigor.
“Overall, the description does not provide necessary information for readers to be able to tell whether their participant-recruitment method can stand against a scientific evaluation,” Lee said.
Black Book’s 2018 health-care cybersecurity report included language identical to that used by market researcher Grand View Research Inc. two years earlier. Grand View, which received no credit from Black Book for its content, didn’t respond to requests for comment.
Brown said that only about 10 pages of a 2019 report’s 350 pages were “researcher written.” The rest consisted of “downloads of survey results” from research-outsourcing firms that Black Book re-formats, he said.
Although much of its survey content is aggregated from other sources, Brown said Black Book generates over half its revenue selling it to hospitals, medical practices and health insurers. Another revenue stream involves vendors who pay for content, promotional rights and custom reports. Brown emphasized that his firm does not charge information-technology vendors to be ranked and that not all of them buy his products.
Four of the eight public companies to which Black Book has recently awarded top rankings did buy unrestricted licenses to share, edit and reformat its results, use Black Book’s seal in promotions and receive media follow-up at no extra charge, according to Brown and the company website. Rival research firms produce similar vendor rankings and news releases, he said.
A disclaimer on Black Book’s website says that its findings, conclusions and recommendations are gathered “from both primary and secondary sources, whose accuracy we are not always able to guarantee.” The company added that it “can accept no liability whatever for actions taken based on any information that may subsequently prove to be incorrect.” This warning is unusual enough in the industry that one of the experts reviewing Black Book’s methodology called it a red flag.
Allscripts CEO Paul Black has frequently referred to his firm’s top Black Book rankings. In a press release announcing the rating in a 2019 report, Black said that “Black Book’s extensive, comprehensive and unparalleled objective methodology makes this recognition a credible indicator of Allscripts’ expanding global footprint.”

Brown said his firm “does not recommend any vendor nor does Black Book have any mutually beneficial business relationships with any vendor.” But an email chain from 2017, reviewed by Bloomberg, shows the two firms apparently working together.
An Allscripts executive told Brown that his colleagues “were talking about the possibility of you creating a report similar to the one you did in Singapore for this large opportunity” involving a sale to a U.S. health-care provider.
Brown responded by asking Allscripts for details about the prospective client’s facilities. It “helps in finding MDRX top ranks faster,” Brown said, referring to Allscripts by its stock ticker. In a follow-up, Brown wrote that he was seven invoices behind in billing Allscripts and would get them out the following day.
In a written reply to questions, Allscripts said it has reviewed Black Book’s survey methods and deemed them statistically valid and free of vendor influence. Allscripts doesn’t pay Black Book to be included in its rankings but instead for access to in-depth data the market researcher collects while conducting its surveys.
Allscripts “has on occasion decided to publicize the findings, if we feel it’s in Allscripts’ best interest to do so,” said the statement. Allscripts has no influence over Black Book’s rankings and rejects “any suggestion that brings our integrity into question,” it said.
Cerner is an Allscripts rival that in 2017 was looking to win a lucrative electronic-recordkeeping contract from the U.S. Department of Veterans Affairs. Black Book compiled a report with data from the first quarter of that year that rated Cerner, Allscripts and three other vendors based on prescription-tracking, patient-portal experience and other criteria, according to slides reviewed by Bloomberg. Cerner ranked first in 30 of 33 categories. It’s not known what role, if any, the rankings played in what proved to be Cerner’s winning bid for the VA contract.
A few months later, in its 2017 mid-year health-records survey of mid-sized medical providers, Black Book placed Allscripts ahead of Cerner in almost every category. Brown said the rankings involved two separate reports and products, despite involving the same vendors. “It is not apples-for-apples comparisons,” he said.
“Cerner works with various third-party industry analyst firms to better understand our position in the marketplace,” said Angela Vogen, a Cerner spokesperson. The VA didn’t respond to requests for comment.
McKesson, a drug distributor that competes in health records, has also celebrated its top Black Book ranking in a press release. Last year, it touted its selection as No. 1 in cancer-care records, describing Black Book as an “unbiased industry-leading source for polling, surveys and market research.”
McKesson said in a written statement that “based on information provided from Black Book, it is our understanding that Black Book conducts its annual ranking surveys directly with our customers and is responsibly gathering market research and analyses for its studies. We have no influence on the rankings provided by Black Book and our only involvement is purchasing the report and licensing to share the rankings in communications.”
Along with generating kudos, Black Book’s results have raised questions. In 2016, the U.S. Government Accountability Office reported that most Affordable Care Act users were satisfied with the program. Its findings were based on polls of between 400 and 1,000 users each by Deloitte, the Kaiser Family Foundation and others, it said.
Black Book released a similar ACA poll five months later that it said involved 34,800 consumers. It found only 22% of respondents satisfied. A news report on users’ opinions of the ACA characterized Black Book’s findings as an “outlier.”
Read or Share this story: https://www.detroitnews.com/story/business/2020/07/01/pollster-fake-staff-gave-top-grades-firms-landing-deals/112049662/