This article was originally published in Legaltech News.
In-house legal organizations need to manage the quality of their legal work just as rigorously as they manage outside counsel spend. Unfortunately, few organizations do, partly due to the lack of process and technology for defining and then evaluating quality.
Leading organizations define quality in ways that are specific, measurable, attainable, relevant to organizational priorities, and time-bound (S.M.A.R.T.). Approaches range from something as simple as capturing a subjective net promoter score from a relevant audience at the conclusion of a legal matter to more elaborate systems that capture multiple data metrics feeding into different categories, such as file management, knowledge/skills, and service. A S.M.A.R.T. approach also includes written definitions of those categories and calls for organizations to reach out to “easy graders”—those who consistently provide 4- or 5-star ratings—to make sure those graders really understand the definitions provided.
This definition and subsequent measurement of quality truly do matter. Putting downward pressure on cost (as most legal ops organizations are already doing) without any system for detecting corresponding impact on quality creates risks so obvious they don’t need to be explained.
So, how are organizations setting themselves up to communicate quality expectations and realities?
1. They have automated systems for capturing impressions of quality.
A duct-tape quality rating system (think Excel or email) is going to be a high-maintenance headache with a low ROI that eventually gets abandoned because everybody hates it. High-performing organizations, in contrast, let technology do the heavy lifting.
For instance, one insurance carrier that I know uses a system that requires claims examiners to enter quality impressions at the close of a matter and sends them repeated email reminders if they fail to do so. Impressions actually get captured, and nobody has to waste time keeping track of who has or has not responded.
2. They recognize that excellence is in the eye of the beholder.
Even when law departments have a system for capturing impressions of quality, they are typically capturing impressions from inside counsel about how well outside counsel handled a matter. That’s fine, but they should also capture impressions from the people who really matter—the clients in the business unit the legal work is meant to serve.
As legal industry consultant Tim Corcoran recently wrote in a scathing article, the client mostly defines quality as “business velocity”—fixing the problem with minimal fuss and getting out of the way so the corporation can proceed with its plans. But gathering impressions of quality from in-house counsel, though valuable, is likely to reflect a definition of quality that is much more academic, centered on the extent to which outside counsel has demonstrated encyclopedic knowledge and technical artistry in their area of expertise. Trouble is—the client doesn’t care. They just want the thorn pulled out of their side and for you to disappear.
3. They investigate incidents of poor quality and take remedial measures.
Another large organization I interviewed, which collects quality impressions on many thousands of legal matters every year, makes a point of immediately investigating any matter that is closed with a low-quality rating. When a low rating occurs, the quality team reviews the output and speaks directly to the rater to understand what really happened; it does not wait until some arbitrary future review period to act on reports of poor quality. In some instances, the team determines that the low rating was more a reflection of the rater’s unrealistic expectations than of the quality of counsel’s work. In others, it may conclude that, while the matter went poorly, the root cause was not the performance of counsel but poor internal business processes that stymied counsel’s ability to deliver otherwise excellent work.
The bottom line is that ratings should not always be taken at face value. They should be evaluated critically to see whether the problem lies a little closer to home than appearances would suggest. Then, broken internal business processes—like the failure to give counsel timely access to witnesses or case documents, lack of feedback on counsel’s proposed case strategy, failure of internal stakeholders to respond to settlement offers, and other issues—can be uncovered and remedied.
4. They train firms to view quality ratings as real by having real conversations about quality—and real consequences when that quality is found lacking.
Leading corporate law departments know that not communicating or backing up quality ratings with consequences sends the wrong message. Law firms can be led to believe that those ratings are not “real” and can be ignored.
At a minimum, quality ratings should be shared with law firms so they can consider ways to improve their offerings and how they do business in the future. Firms that respond in this positive way telegraph sincere commitment to the relationship. Firms that don’t, don’t.
Quality ratings should be woven into quarterly business reviews, requests for hourly rate increases, panel management, and other programs that can affect the firms’ bottom lines. For instance, if panel firms do not achieve a certain threshold quality rating, they may be put on probation or removed from the panel altogether. Low ratings might also make firms ineligible for hourly rate increases. Some corporate law departments also withhold a certain percentage of what law firms invoice and release those funds only when certain conditions are met; this practice could be driven by a quality component. Others might pay bonuses when certain quality thresholds are crossed on certain types of matters.
Organizations that do not take a systematic approach to quality may be paying a huge price without even knowing it. By adopting some of the S.M.A.R.T. methods outlined above, you can make sure your organization is not one of them.