The Daily Cardinal Est. 1892
Thursday, April 09, 2026

Behind two students’ plan to simplify the operating room with AI

University of Wisconsin-Madison juniors Ruffin Bryant and Noah Kalthoff turned a biomedical engineering class project into an agentic artificial intelligence tool for surgical charting.

Surgeons usually spend 10-15 minutes writing a report after performing surgery; the reports are then used to extract specific codes corresponding to surgery type — something professor and UW Health surgeon Tuo Peter Li called a tedious post-operative charting process.

When he pitched the issue as a potential project in BME 200 and 300, courses where students solve problems drawn from real-world clients, roommates Ruffin Bryant and Noah Kalthoff used his suggestion as a springboard. Their class project became the basis for student startup OpScribe-AI, an artificial intelligence surgical documentation tool.

Now, they plan to create their own dataset by hiring medical residents to perform surgeries on cadavers.

“We go to school in the morning, and then the entire rest of the day, we're running a company,” Bryant told The Daily Cardinal.

Details from surgical reports are used to extract ICD-10 codes — specific alphanumeric codes corresponding to surgery types, patient afflictions and procedures performed. Hospital-employed medical coders use surgeons’ reports to identify these ICD-10 codes and submit them to insurance companies, who determine the proper charge for the procedure. 

Both Bryant and Kalthoff were interested in using AI to assist with these reports, solving Li’s problem. They wrote a short essay to win their top choice, and got started right away.

“We were committed,” Kalthoff said. “This was kind of our baby from the beginning.”

Currently, surgeons must write a post-operative report after every surgery describing its events, including the medical need for the operation, the equipment used, complications, estimated blood loss and a comprehensive list of all procedures. The report is passed to a hospital’s medical coders, who parse through the report, forming a pricing estimate of the surgery to submit as a claim to the patient’s insurance.

“This is the last thing [surgeons] want to do at the end of the day,” Kalthoff said. “We’ve talked to multiple surgeons. They've all said the same thing: this is the worst part of their job. It is not why they went to medical school.”

The accuracy of an operative report is vital. In the case of a botched surgery or one with complications, the ICD-10 codes and resulting pricing estimates differ. A recent paper analyzed 158 prostatectomy cases, finding that 27.2% of surgeon-written operative reports had at least one “clinically significant discrepancy.” With the paper’s AI model, only 12.7% of reports had a significant error.

Kalthoff said these reports can be inaccurate because surgeons fall back on “cookie-cutter templates.” Surgeries are getting longer and more frequent amid a national shortage of professionals in the field, creating an unmanageable workload for many. Additionally, Kalthoff said insurance agencies want more codes to justify additional costs, making the reports themselves longer.

Inaccuracies often lead to over-billing or under-billing insurance companies, incidents for which doctors can be held legally liable for fraud. If a surgeon does not fully explain their procedure in the report, medical coders may not find the keywords they need to prepare the correct ICD-10 codes, causing under-billing. Alternatively, if a report is ambiguous, medical coders might upcode, or insert codes relating to procedures that did not occur, causing over-billing. 


In the best-case scenario, a medical coder will speak to a surgeon to correct potential inaccuracies, adding time to the process. In the worst-case scenario, “you can’t bill the insurance the right amount of money [and] the hospitals end up fronting the cost,” Kalthoff said.

OpScribe-AI can generate its own report, using images or video from the operating room. “It can do it in a live fashion or a drag and drop, where it's recorded and the surgeon drops it into our user interface,” Kalthoff said.

With OpScribe-AI generating the reports and extracting ICD-10 codes, Bryant and Kalthoff hope to reduce the entire process to five minutes: an AI-generated report plus a human review.

How does it work?

OpScribe-AI is a multi-system agentic AI running on a Qwen model, said Dharanjay Bhaskar, the team’s advisor and a RISE-AI professor in the biomedical engineering department. Qwen is a family of AI models developed by Chinese company Alibaba, comparable to OpenAI’s ChatGPT.

Using a Qwen model, Kalthoff and Bryant optimized separate AI agents tailored to specific surgeries: endoscopic and laparoscopic procedures — minimally invasive, often robot-assisted surgeries in which the surgeon manipulates a small camera-equipped device inside the body — as well as open reduction surgeries, where the surgeon operates directly through an open incision.

Bryant and Kalthoff said endoscopic and laparoscopic training footage was relatively easy to obtain for the model, but open reduction footage proved a challenge. To address this, they recruited a group of medical residents from UW Health, who will be paid for their time, to perform full-length surgeries on cadavers, creating their own dataset. The procedures will be recorded with head-mounted and overhead cameras, at angles similar to those of cameras embedded in operating room surgical lamps.

A large language model analyzes each phase of the surgery separately, retrieving surgical publications from the internet to gather medical context about the procedures performed. Then, using the publications and its existing datasets, the model transcribes each surgical phase into a text summary.

“We are relying on existing medical literature to give ourselves the correct terminology and medical context,” Bhaskar said.

Another AI summarization agent then assembles the separate summaries into a long transcript, from which ICD-10 codes and final operative reports are generated in hospital- or department-specific formats. Yet another AI agent reviews the operative reports, comparing them to the video to identify and correct inaccuracies.

Bryant and Kalthoff currently run their model on Bhaskar’s lab processors. The founders recently partnered with Amazon Web Services to host their datasets and train the model in a secure online environment that meets HIPAA patient privacy standards, so that surgeons can trial the product.

“We can't just take patient specific data and throw it into a model and then analyze it, because it's illegal,” Bryant said. “You need a protected, HIPAA-secure infrastructure… This Amazon partnership is the last hurdle we need to run real clinical pilots with a small niche customer base.”

Bryant said their next iteration includes “federated learning,” where OpScribe-AI will identify patterns in its report data — such as instruments used, anatomical findings and procedural steps — and sell these trends back to the hospital. He said the data could help hospitals identify niche inefficiencies, like tools surgeons keep on the prep table but rarely use, or a specific complication with geriatric knee surgeries.

How did they find the time?

Bryant and Kalthoff met in Dejope Residence Hall their freshman year, and are both biomedical engineering majors with business minors. Li’s medical charting problem was both of their top picks in BME 300, among project ideas aggregated from industry partners, hospitals and “anyone who might have an engineering-solvable issue” in the greater Madison community, Bryant said.

The medical charting team comprised three juniors and three sophomores. During the fall semester, Bryant and Kalthoff averaged two to three hours a day on the project outside of class. Bryant said he was “lowkey obsessed with the idea.”

After the two officially formed the startup, their workload increased to more than four hours a day, made more feasible because they now live across the hall from each other in the same house and are “attached at the hip, for the most part,” Kalthoff said, though hopefully not in a way that requires surgery any time soon.

“It makes it easy for communication, for sure: he’ll shout down the hallway, ‘Did you do this?’ or ‘Did you get this?’ or ‘What do you think about this?’” Bryant said.

To accelerate the creation of OpScribe-AI, the team relied on generative AI for coding assistance. 

“We definitely used the help of AI to build AI,” Bryant said. “In a space where everyone's moving at lightning speed to try and get the next thing out there, you're falling behind if you're not.”  

They’ve since been accepted into gener8tor, a competitive startup incubator, as one of five teams selected from around 1,100 applicants. Through the program, they attend weekly meetings at Waukesha’s Applied AI Laboratory with a group of experienced entrepreneurs the program calls a “mentor swarm.”

All six team members are contributing to an upcoming research paper comparing OpScribe-AI’s performance to similar medical transcription AI models.

“From an academic side, we're showing that our model is top-tier,” Bryant said. 
