I am drowning in PDFs right now and my master's thesis is due in exactly 19 days, so I am honestly panicking a bit. I'm focusing on carbon sequestration modeling, and these papers are around 50 pages each, with so much dense technical jargon and math that my brain shuts off halfway through. I tried ChatPDF because everyone was talking about it, but it kept hallucinating the actual results. It told me the p-values were significant when they definitely weren't in the actual text, so now I don't trust it at all for the heavy lifting.
Then I looked into Humata, which seemed better, but the free version is very limited, and as a broke student in Seattle, spending $15 a month just to read papers feels like a lot if it isn't reliable. My thinking was that maybe there is a tool that actually understands the science better than a general LLM does. I saw some mentions of Elicit and SciSpace online, but Reddit threads say they are only good for finding papers, not for summarizing the specific methodology sections, which is what I really need. I need to know exactly how they set up their sensors and what the error margins were without spending four hours per paper.
I was thinking Claude might be better for long contexts, but then I either have to copy-paste everything or pay for the Pro version to upload files, and I don't know how well it handles tables. If anyone has actually used these tools for real academic work, not just summarizing a news article, please let me know what actually works for technical material. I am seriously running out of time, and I still have about 30 more sources to get through before I can even start my own data analysis...
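The only workaround I can think of for the copy-paste problem is scripting the extraction myself and pasting the output into whatever long-context model I end up using. Here is a rough sketch of what I mean, just an idea, assuming pypdf and a made-up file name, and I suspect tables will still come out as mangled flat text:

```python
# Rough sketch: dump the raw text of a PDF so it can be pasted into a
# long-context model. Requires pypdf (pip install pypdf). Note that
# extract_text() flattens tables, so their layout is usually lost.
from pypdf import PdfReader

def extract_text(path: str) -> str:
    reader = PdfReader(path)
    pages = []
    # Add a page marker before each page so the model can point back
    # to where a claim (e.g. a methods detail) actually appears.
    for i, page in enumerate(reader.pages, start=1):
        pages.append(f"--- page {i} ---\n" + (page.extract_text() or ""))
    return "\n\n".join(pages)

if __name__ == "__main__":
    # "paper.pdf" is just a placeholder file name for this example.
    print(extract_text("paper.pdf")[:2000])  # preview the first ~2000 characters
```

The page markers are only there so I could ask the model to cite the page it pulled a methods detail or error margin from, which might make its summaries easier to spot-check.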
I've tested these tools pretty extensively for technical synthesis and I've been satisfied with the accuracy.
Would love to know this too