Data Revenue Cycle Management

How do you get all of the information you want out of a system? Some systems have great reporting, some have mediocre reporting, and some have terrible reporting. What we’ve found is that even the best systems almost never allow you to extract all the information you need in one report. Even when custom reporting lets you select your own fields, the process becomes incredibly complex because of the table structures within relational databases and the challenge of flattening them into a single file.

Navigating the Complexities of Data Extraction

Those are the problems you deal with when working with the best systems. Now, let’s talk about the not-so-great systems, which, unfortunately, are the majority and pose significant challenges. As an example, when trying to extract data, we go through a mapping process to figure out what information is needed. We ask questions like: What does the client want? What functions do they want to use the data for? Once we determine the fields needed to answer those questions or perform those functions, we map them within the system, identifying where each field actually lives, whether in reports or hidden in other areas of the system.

Real-World Example: A Data Extraction Nightmare

For a recent case, we needed to extract check number, EFT number, and check date. We encountered the following issues:

  • Payments Detail Report: It includes the check or EFT number but lacks any way to link it to other reports. There’s no visit number or even a probabilistic key, such as patient ID combined with the date of service, making it an island report that doesn’t connect to anything else.
  • Payment Application Report: This report doesn’t include a visit number but has a claim number, which we can cross-reference to a visit number or encounter ID. It also includes a patient ID, which can be combined with the date of service to create a probabilistic key and cross-reference back to other data.
  • Charge Entry Date: To find this, we needed a completely different report called a Daily Report. We used the information in this report to cross-reference with other data points and join them together.
  • Charges Report: This report contains a lot of the information we needed, though I’ll spare you the details.
  • Daily Account Activity Report: This report includes a list of records that changed that day. We used it as a lookup table to identify which records to pull, because the details we needed weren’t included in this report or in any other report.
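The cross-referencing described above can be sketched in a few lines of Python. This is a minimal illustration with made-up field names and values, not the system's real schema: the Payments Detail report carries the check/EFT number but no visit number, so we resolve the visit number through the claim number and then link the detail rows back using the probabilistic key (patient ID plus date of service).

```python
# Rows from the Payment Application report (hypothetical columns).
payment_application = [
    {"claim_number": "C-1001", "patient_id": "P-17", "date_of_service": "2024-03-02"},
    {"claim_number": "C-1002", "patient_id": "P-23", "date_of_service": "2024-03-05"},
]

# Claim number -> visit number, cross-referenced from another report.
claim_to_visit = {"C-1001": "V-551", "C-1002": "V-552"}

# Rows from the Payments Detail report: check/EFT info, but no visit
# or claim number -- only patient ID and date of service.
payments_detail = [
    {"patient_id": "P-17", "date_of_service": "2024-03-02", "check_number": "CHK-9"},
    {"patient_id": "P-23", "date_of_service": "2024-03-05", "eft_number": "EFT-4"},
]

def probabilistic_key(row):
    """Composite key: patient ID plus date of service."""
    return (row["patient_id"], row["date_of_service"])

# Index the island report's rows by the probabilistic key.
detail_by_key = {probabilistic_key(r): r for r in payments_detail}

# Join: resolve the visit number via the claim number, then attach
# whatever check/EFT fields the probabilistic key turns up.
joined = []
for row in payment_application:
    match = detail_by_key.get(probabilistic_key(row), {})
    joined.append({
        "visit_number": claim_to_visit[row["claim_number"]],
        **row,
        **{k: v for k, v in match.items() if k not in row},
    })
```

Note the weakness the word "probabilistic" implies: two visits for the same patient on the same date of service collide on this key, which is exactly why a real visit number would have been preferable.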

Creating Bots and Scraping Data

To get the information we needed, we had to create two bots. One bot went into the individual patient records from the Daily Account Activity Report, navigated through various screens, clicked on claims, and retrieved information like the insurance balance and patient balance. Then, another bot scraped the transactions to identify which payer the balance corresponded to, ensuring that each dollar balance was associated with the correct payer.

There’s no way to tell, just by looking at the insurance balance, whether it’s for a primary, secondary, or tertiary payer, or if the payer changed during the process. For example, if they billed one insurance company and later realized it was wrong and billed a different company, the balance still shows as primary. We had to map that correctly and ensure it displayed accurately.
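One way to express the payer-mapping logic above: since the balance label alone can't be trusted, replay the scraped transaction history in date order and treat the most recent billing event as the payer currently responsible. This is a hedged sketch with hypothetical transaction types and field names, not the system's actual data model.

```python
# Scraped transaction history for one claim (hypothetical fields).
# The account was billed to one payer, then rebilled to another
# after the first turned out to be wrong.
transactions = [
    {"date": "2024-03-01", "type": "billed",   "payer": "Acme Primary"},
    {"date": "2024-03-10", "type": "rebilled", "payer": "Other Insurance Co"},
    {"date": "2024-03-20", "type": "payment",  "payer": "Other Insurance Co"},
]

def current_payer(transactions):
    """Return the payer on the most recent billing-type event,
    regardless of what the balance screen labels it."""
    billing_events = [
        t for t in sorted(transactions, key=lambda t: t["date"])
        if t["type"] in ("billed", "rebilled")
    ]
    return billing_events[-1]["payer"] if billing_events else None
```

Here the balance still shows as "primary" in the system, but the transaction replay attributes it to the rebilled payer.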

The Complexity Grows

We also had to create another bot to extract all the 835s (ERAs) and gather payments information from non-835 sources, like paper payments, paper checks, and patient payments. None of these reports had a check or EFT date associated with them, which is crucial for bank reconciliation. So, instead of just six reports, we needed 14 reports to link together to get all the information we wanted.
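The bank-reconciliation gap described above comes down to this: every payment, whether it arrived via an 835/ERA or a non-835 source, needs a check/EFT date attached before daily totals can be matched against bank deposits. A minimal sketch, with assumed field names:

```python
from collections import defaultdict

# Payments extracted from 835s/ERAs (hypothetical fields).
era_payments = [
    {"source": "835", "check_or_eft": "EFT-4", "amount": 120.00,
     "check_eft_date": "2024-03-06"},
]

# Payments from non-835 sources: paper checks, patient payments.
manual_payments = [
    {"source": "paper",   "check_or_eft": "CHK-9",  "amount": 45.00,
     "check_eft_date": "2024-03-06"},
    {"source": "patient", "check_or_eft": "CHK-12", "amount": 20.00,
     "check_eft_date": "2024-03-07"},
]

# Group every payment by check/EFT date so each day's total can be
# reconciled against the corresponding bank deposit.
deposits = defaultdict(float)
for p in era_payments + manual_payments:
    deposits[p["check_eft_date"]] += p["amount"]
```

Without the check/EFT date on each record, the two daily totals simply can't be computed, which is why that missing field forced yet more reports into the pipeline.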

Productivity and Biller Performance

To track productivity and understand how well billers are performing, how many claims they’re working on, what’s being resolved, and so on, we had to create a separate report. This report had to include information about which user worked on a claim, updated notes, and so forth. The process is nuts—it’s exhausting.
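At its core, the productivity report described above is an aggregation over an activity log: claims touched and claims resolved, per user. A minimal sketch, with all field names assumed rather than taken from the system:

```python
from collections import Counter

# Hypothetical activity log assembled from the custom report:
# one row per user action on a claim.
activity_log = [
    {"user": "alice", "claim": "C-1001", "action": "note_added"},
    {"user": "alice", "claim": "C-1002", "action": "resolved"},
    {"user": "bob",   "claim": "C-1003", "action": "note_added"},
    {"user": "alice", "claim": "C-1001", "action": "resolved"},
]

# How many claim actions each biller performed.
claims_touched = Counter(e["user"] for e in activity_log)

# How many of those actions were resolutions.
claims_resolved = Counter(
    e["user"] for e in activity_log if e["action"] == "resolved"
)
```

The aggregation itself is trivial; the exhausting part, as the section above describes, is that the system offers no report exposing the underlying user/claim/action rows in the first place.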

The Moral of the Story

In the end, we needed 14 different reports or bots, running daily, to extract all the data, perform extensive transformations, and make the data usable. Why is all of this required? Because their API is terrible. And by terrible, I mean complete garbage—totally unusable. We’ll discuss the problems with APIs another time.

But the moral of the story is that you frequently can’t trust the API. This means you must do a massive amount of work to extract raw data from various locations within the system, clean it up, string it all together, and transform it effectively to get useful data.

Author

voyant
