What does "synthesizing impossibly large amounts of data" mean?
To go back to the example I used in this thread: this LLM took data from hundreds of studies and created an approximate summary of all that information. The doctor then looked at different combinations to develop a treatment plan for the patient. www.nytimes.com/2025/03/20/w...
The LLM can't replace a doctor. But as a tool it took work that would have taken months and squeezed it into a few days. The patient in question had a couple of weeks.
Mostly this says to me that we're really bad at indexing academic papers. Letting the LLM do it is cool, but it's hard to know what, if anything, is missed. Anecdotes make great stories, but this application seems of limited utility (to me).
Of course it's of limited utility. That's the point. A bespoke tool for a bespoke problem. And while I'm no doctor, I can't imagine that digging through hundreds of medical papers and test results in a time frame that's actually helpful to a patient is all that easy.
A very, very expensive bespoke solution to a problem. It's unclear whether that expense is justified.
Fortunately, just about every tool a doctor has is very, very expensive.
That is bullshit.
How much do you think your dentist's X-ray machine costs? A cheap set will run you over $30k. An MRI machine is north of half a million dollars. Specialized equipment tends to be costly.
Doctors don't own those MRI machines. Hospitals and medical systems do. And $30k is less than many new cars.