Rohan Paul

@rohanpaul_ai

I Build & Write AI stuff. → Join my LLM Newsletter - https://t.co/Jfj0r0we5f 💼 AI Engineer

Joined Jun 2014
9 Threads · 0 Views · 32.9K Followers · 18.2K Tweets

Threads

Sam Altman on job loss from AI. This is from a podcast after the release of OpenAI's new o1 model 📝. --- Full video from the "St. Louis Public Radio" YouTube channel (link in comment)...

A small, useful utility GitHub repo - DocAI: extract structured data from unstructured documents using Answer.AI's Byaldi, OpenAI's gpt-4o, and LangChain's structured output. https:/...

You can crawl entire websites with Claude 3.5 or GPT-4o with this open-source tool, Firecrawl. 💯 Turn entire websites into LLM-ready markdown or structured data. Scrape, crawl and...

Looks like exhaustive work here: a 103-page-long synthetic data generation paper. "Comprehensive Exploration of Synthetic Data Generation: A Survey" 👨‍🔧 Surveys 417 Synthetic...

Brilliant new paper, HUGE for LLMs' internalized knowledge 🔥 Out-of-Context Learning > In-Context Learning | Fine-tuning can teach new concepts better than ICL 📌 Finds a surprisin...

Another 'WOW' paper - Up to 20x improvement in inference throughput with Block Transformer compared to vanilla transformers at equivalent perplexity. 🤯 How❓ By MASSIVELY reducing...

This 76-page paper on Prompting Techniques has become quite popular. A nice read for your weekend. - "The Prompt Report: A Systematic Survey of Prompting Techniques": ✨ Explores...

A key to making your LLMs work better: just throw everything into the context window 💡 For many datasets, most of the time, long-context ICL (in-context learning) outperfo...
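The long-context ICL idea above amounts to packing as many labeled examples as fit into the prompt before the query. A minimal sketch (hypothetical helper name, plain string formatting, rough character budget standing in for a real token budget, no specific LLM API assumed):

```python
def build_many_shot_prompt(examples, query, max_chars=100_000):
    # Hypothetical helper: concatenate as many labeled (input, output)
    # pairs as fit under a rough character budget, then append the query.
    # With a long-context model, max_chars can be very large, so most or
    # all of the dataset becomes in-context demonstrations.
    shots, used = [], 0
    for x, y in examples:
        shot = f"Input: {x}\nOutput: {y}\n"
        if used + len(shot) > max_chars:
            break  # context budget exhausted
        shots.append(shot)
        used += len(shot)
    return "".join(shots) + f"Input: {query}\nOutput:"

prompt = build_many_shot_prompt([("2+2", "4"), ("3+5", "8")], "1+9")
```

In a real setup the budget would be counted in tokens with the model's tokenizer, but the packing logic is the same.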

Sliding Window Attention is such a brilliant idea 💡 And it was one of the secret sauces behind the legendary Mistral-7B, which enabled it to handle 100k+ token sequences with line...