Advice for channel using GPT-4-Turbo for scientific paper summaries?
  • I've engaged with the machine since GPT-3 Davinci. The accuracy of the models has improved tremendously, and they can now even perform complex math, such as calculations pertaining to relativity (they use Python to calculate). I'm using the latest and greatest model that was just released a little over a week ago; in my experience it's more than capable. My main limitation is math, since I'm not using the Assistants API, which gives access to the code interpreter, so I'll just avoid most pure mathematics papers and stick to the sciences.
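    For anyone curious about the code-interpreter limitation mentioned above: it comes down to which API surface you call. A minimal sketch, assuming the openai-python v1 client; the model name and instructions here are illustrative, not the commenter's actual setup:

```python
# Hedged sketch: enabling the code interpreter through the Assistants API.
# The plain chat completions endpoint does not expose this tool, which is
# why math-heavy papers get skipped without it.

# Tool spec the Assistants API expects for code interpreter access:
tools = [{"type": "code_interpreter"}]  # lets the model run Python for math

# The creation call would look roughly like this (commented out since it
# needs an API key and makes a network request):
# from openai import OpenAI
# client = OpenAI()
# assistant = client.beta.assistants.create(
#     model="gpt-4-turbo-preview",
#     instructions="Summarize scientific papers; use Python for any math.",
#     tools=tools,
# )
```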

    Yeah, np. This is the paper it's based on: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9233534/ . Please keep in mind that its target audience is not academic, so the language has been altered to reflect that (e.g. using "theory" not as we'd use it, as in the theory of gravity, but how most people use it, to mean speculation or a hypothesis)

    Introduction: Hey everyone! Today, let's embark on a fascinating journey back in time to the Hadean Earth, over 4 billion years ago. This was a time when our planet was quite different, covered in a sea of molten rock and constantly bombarded by meteorites. Yet, it was during this hostile period that the very pillars of life as we know it might have been set. We'll dive into a scientific study that suggests how some of Earth's oldest rocks might have been more than just barren stones; they may have been catalysts that set the stage for the development of RNA, which is crucial for life as we know it.

    The RNA World: RNA stands for ribonucleic acid. It's a molecule that plays a key role in genetics and the synthesis of proteins in all living organisms today. There's a hypothesis, known as the "RNA World" theory, which suggests that before DNA and proteins became fundamental to life, RNA might have done all the work. RNA could have been both the genetic material and the molecular machine in the earliest forms of life.

    The Big Question: But how did RNA come about in the first place? This has puzzled scientists for years because RNA is a complex molecule – not something easily formed in nature without a helping hand. To support the idea of the RNA World, we'd need evidence that RNA could form in conditions that existed on the early Earth.

    The Research Study: In comes this groundbreaking study! Scientists have discovered that when you mix certain chemicals called ribonucleoside triphosphates (the building blocks of RNA) with specific types of ancient volcanic rocks, known as mafic rock glasses, something amazing happens. These rocks act as a catalyst, turning the building blocks into RNA, without any need for biological processes.

    Why Rock Glasses?: These ancient rock glasses were formed during volcanic eruptions and meteorite impacts on the early Earth. They had the right chemical properties to help stitch together RNA building blocks. It's kind of like having a robotic assembly line, where rocks are doing the intricate work of piecing together RNA, piece by piece.

    The Experiments: Here's what the researchers did. They took these rock glasses, submerged them in water, mixed them with RNA building blocks, and let chemistry do its magic. Over time, they noted the formation of RNA chains, which could have been the precursors to early life.

    The Results: The RNA that formed was not perfect. It had a good amount of links known as 3'-5' linkages, which we see in modern RNA, but the researchers couldn't rule out other types or even a bit of branching. Despite this, the RNA chains were long enough to hold genetic information and potentially perform some simple reactions.

    Limitations: No scientific study is perfect, and it's important to talk about what this experiment doesn't tell us. For one, the RNA made during the study isn't exactly like the RNA in our cells today; some parts are put together differently. Also, the study was done in ideal lab conditions – and we can only guess if conditions on early Earth were the same. Another thing is, even if volcanic glass can help make RNA, we still need to understand where the RNA building blocks came from.

    Conclusion: The findings give us a valuable peek into how the seeds of life may have been sown on our ancient planet. These rock glasses might have been nature's first life-supporting lab bench, making RNA without any living assistance.

    So, there you have it – a discussion on a paper that bridges the gap between geology and biology and offers a tantalizing glimpse at how life might have begun in the primordial soup of early Earth. This evidence sheds light on our most distant origins and reminds us that life, even in its simplest form, always finds a way.

  • Advice for channel using GPT-4-Turbo for scientific paper summaries?
  • It's actually not. Abstracts are targeted at academics and researchers, and oftentimes preserve the complexity. Take for example the abstract of the paper this video's about:

    Reported here are experiments that show that ribonucleoside triphosphates are converted to polyribonucleic acid when incubated with rock glasses similar to those likely present 4.3–4.4 billion years ago on the Hadean Earth surface, where they were formed by impacts and volcanism. This polyribonucleic acid averages 100–300 nucleotides in length, with a substantial fraction of 3′,5′-dinucleotide linkages. Chemical analyses, including classical methods that were used to prove the structure of natural RNA, establish a polyribonucleic acid structure for these products. The polyribonucleic acid accumulated and was stable for months, with a synthesis rate of 2 × 10⁻³ pmoles of triphosphate polymerized each hour per gram of glass (25°C, pH 7.5). These results suggest that polyribonucleotides were available to Hadean environments if triphosphates were. As many proposals are emerging describing how triphosphates might have been made on the Hadean Earth, the process observed here offers an important missing step in models for the prebiotic synthesis of RNA.

    While it is less complex than the paper, it is nevertheless dense and jargon-laden. Your average person with a high-school education will either not understand it well or be absolutely turned off by its density. They're also just very unlikely to stumble across it.

    I could have the machine reword it, but the abstract's information is not comprehensive, which reduces quality. By having the entire paper in its context window, the LLM is less likely to hallucinate. Plus the added information helps it make better summaries based on all the paper's sections, most importantly the limitations section.

  • Advice for channel using GPT-4-Turbo for scientific paper summaries?
    m.youtube.com: From Rock to RNA: Unveiling Earth's First Life Engineers

    So I've been working on an implementation of GPT-4-Turbo that's designed to ingest entire papers into its context window and process them into summaries that would be understandable by someone with a high-school education (originally went for 8th grade max, but that led to rather patronizing results lol). The machine tells me what the content should be for a given paper, and I make it using a few tools like Premiere Pro and Photoshop. I've never made videos like this before though, so it's a bit rough.
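    A minimal sketch of what a pipeline like that could look like; this is an assumption based on the description above, not the actual implementation, and the function name, prompt wording, and model name are all illustrative:

```python
# Hypothetical sketch: put the entire paper in the context window and ask
# for a script pitched at a given reading level. GPT-4-Turbo's 128k-token
# context is what makes whole-paper ingestion feasible.

def build_summary_prompt(paper_text: str, reading_level: str = "high school") -> list[dict]:
    """Build chat messages that include the full paper so the model can
    draw on every section (especially Limitations) when summarizing."""
    system = (
        f"You are a science communicator. Rewrite the paper below as a "
        f"video script understandable at a {reading_level} reading level. "
        "Cover the study's limitations explicitly."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": paper_text},
    ]

# The actual call would go through the chat completions endpoint, e.g.:
# from openai import OpenAI
# client = OpenAI()
# resp = client.chat.completions.create(
#     model="gpt-4-turbo-preview",
#     messages=build_summary_prompt(paper_text),
# )
# script = resp.choices[0].message.content
```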

    I was hoping to use this tool to expand access to scientific papers by the general public. Those papers are hella dense and def need some translation.

    I've attached my first full attempt at using my tool. It's kinda rough, but I'd love to get some feedback on how to make it better (especially as it pertains to the script).

    “Hire me”
  • Sucks that your rick roll got taken from you. I understand how hard it must feel, so please know that I'm never gonna give you up, never gonna let you down, never gonna run around and desert you

  • What do you think human civilization will look like in 10 years?
  • That must mean that the cats will have transcended by then, if such advanced humans could still not understand them. Welp, guess my only option is still to blink slowly and pay them Katzen the respect they expect

  • What does kbin mean? Why is kbin.social called that?
  • If you're using ChatGPT, you're using an outdated model. GPT-4 is better than it and has been out for some time.

    ChatGPT is weaker and not something I'd use frequently. GPT-4 is much stronger and much more useful. And the next generation is coming soon, which will be better than GPT-4.

    It's not like fusion in its current state, since LLM AIs are already ready for use. The only task is making them more effective. To extend the fusion analogy, it would be as if we'd finally developed a reactor that generates more energy than it consumes, and now only sought to make it produce more power

  • What does kbin mean? Why is kbin.social called that?
  • ChatGPT won't be what takes your job, it's what comes after.

    I seriously don't understand the people who disregard it based on the state of ChatGPT. It will only get better over time. And by over time, I mean weeks to months; this tech is maturing faster and faster. The change from GPT-3 to ChatGPT to GPT-4 to GPT-4 with vision has all come in the blink of an eye.

    Human labor is on borrowed time

  • College professors are going back to paper exams and handwritten essays to fight students using ChatGPT
  • AI doesn't even necessitate a machine being capable of stringing the complex English language into a series of steps toward something pointless and unattainable. That in itself is remarkable, however naive it may be in believing your claim that a foldable phone can be inflated. You may be confusing AI with AGI, which is when the intelligence and reasoning level is at or slightly greater than a human's.

    The only real requirement for AI is that a machine take actions in an intelligent manner. Web search engines, dynamic traffic lights, and Chess bots all qualify as AI, despite none of them being able to tell you rubbish in proper English

  • College professors are going back to paper exams and handwritten essays to fight students using ChatGPT
  • It actually is artificial intelligence. What are you even arguing against man?

    Machine learning is a subset of AI, and neural networks are a subset of machine learning. Saying an LLM (based on neural networks for prediction) isn't AI because you don't like it is like saying rock and roll isn't music

  • Cost of a 128KB computer with floppies in 1985
  • Hey, I recognize you from this comment! You flipped that switch so many decades ago, ruining everything I had worked so hard for. I'll always remember.

    Those lost 50KB of work will forever be etched into my mind. Quite literally: the second I get my hands on a 30TB neurolink you bet your goddam ass I'm making a 50KB text file with your name on repeat, so that I'll always hear your name echo in my thoughts. "u/Kalkaline@programming.dev flipped my surge protector's switch", for x in range infinity

    WackyTabbacy42069 @reddthat.com
    Posts 31
    Comments 79