Counting clicks

This month, EPMonthly ran an article about the cost of poorly designed EHRs on ED operations. The EPMonthly authors, Augustine and Holstein, asked some good questions and made some good points. But the data they used to ground their piece came from a peer-reviewed article that unfortunately leaves a lot to be desired.

We ran an editor's note at the end of the EPMonthly piece, succinctly stating my objections to the original peer-reviewed research. But since this "4000 clicks" study has gotten traction elsewhere, I felt compelled to make my detailed criticisms of the article publicly available:

The authors of the "4000 clicks" study chose to evaluate ED physicians' performance with McKesson Horizon. In KLAS surveys of ED physicians, McKesson scored at or near the bottom in many fields, including provider satisfaction, perceived workflow integration, and speed of charting. The authors assert that many other studies in which EHRs improved ED throughput metrics and reduced errors are biased, because those studies draw data from "the most innovative institutions" with presumable customizations. They also note that "software vendor" has been the leading factor in determining ED improvement and provider satisfaction. Given all that, I'd expect them to pick a system with average scores, implemented at a fairly typical ED. Yet the authors make no mention of any ED characteristics and chose one of the worst-ranked ED information systems. The authors justified their choice of McKesson...