Padlet to Notion
Over the past few months, I've been working on what seemed like a straightforward task: moving a collection of bookmarks from Padlet into a Notion database. What I didn't expect was that this project would become an accidental experiment in working with three different AI assistants—Claude, ChatGPT, and Notion's built-in AI—each bringing something different to the table.
This isn't a tutorial. It's more of a reflection on what it's actually like to use AI tools for a real, messy, multi-step project that doesn't fit neatly into a single conversation.
The Problem That Wasn't One Problem
The migration seemed simple enough at first. Export bookmarks from Padlet as a CSV, import into Notion, done. But anyone who's tried this knows the reality: when you import URLs via CSV, Notion treats them as plain text. You don't get those nice preview cards with thumbnail images that appear when you paste a link manually.
Those thumbnails—Open Graph images, technically—matter. I'm building this database for students to browse design and development resources. A wall of text links isn't inviting. Visual previews help people find what they're looking for.
So the real project became: how do I get hundreds of bookmarks into Notion with their preview images intact?
And then, once the bookmarks were in, I realised the tagging system I'd inherited from Padlet was a mess—geography mixed with topics mixed with format types, vague categories that didn't help anyone find anything. So the project expanded again.
ChatGPT: The Conversation That Hit a Wall
I started with ChatGPT, explaining the OG image problem. The response was clear and helpful—it explained exactly why Notion doesn't fetch metadata from CSV imports and laid out several workarounds: manual pasting (tedious), automation tools like Make.com, or a Python script to pre-fetch the image URLs.
When I asked ChatGPT to process my CSV directly and add the OG image URLs, we hit reality: ChatGPT's environment couldn't access the web to fetch those images. It offered to write me a script to run locally instead.
This was useful information, but it meant I'd need to shift to a different approach. ChatGPT had diagnosed the problem well but couldn't execute the solution within our conversation.
Claude: Iterating Through the Edge Cases
I brought the same problem to Claude, and this is where things got interesting. Claude could actually run code and attempt to fetch those OG images from the web.
The first script worked—mostly. But "mostly" is where real projects live. Some sites returned "no image found" even though images were clearly visible on the page. Others timed out. Some threw 403 Forbidden errors, blocking the automated requests entirely.
What followed was a genuine debugging collaboration. I'd report back what failed, and Claude would refine the script:
Version 2 fixed a dimension-parsing bug and added multiple fallback methods for finding images: checking Open Graph tags first, then Twitter cards, then structured data, then scanning for large images on the page.
Version 3 tackled specific site types. Framer websites, for instance, use non-standard image loading that the basic approach missed. The script learned to look for data-framer-image-url attributes and background images in CSS.
Version 4 addressed the 403 errors with anti-detection measures—rotating user agents, adding CloudScraper for sites with bot protection, trying mobile user agents as a fallback.
Each iteration felt like pair programming. I wasn't writing the code, but I was testing it against reality and reporting back what the real world threw at it. The script got progressively more robust because it was being shaped by actual failures, not hypothetical ones.
Notion AI: Restructuring the Mess
With the images sorted, I had a different problem: the tagging system was chaos. Legacy tags from Padlet included a jumble of topics ("UX research," "Typography"), formats ("Tutorial," "Tool"), geography ("Norwich," "UK"), and vague categories that didn't help anyone.
This is where Notion's built-in AI came in. Working directly within the database, I used it to help design a cleaner taxonomy:
A new "Topics" field with about 30 student-facing categories
A separate "Region" field for geography
A migration workflow to batch-process old records, mapping legacy tags to the new structure
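The migration logic itself is the kind of thing that fits in a few lines. A sketch of the approach, with a hypothetical mapping (the real taxonomy had around 30 topics; the tag names below are just the examples from this article):

```python
# Hypothetical legacy-to-new mapping; the real table was much larger.
TOPIC_MAP = {
    "UX research": "UX Research",
    "Typography": "Typography",
    "Tutorial": "Tutorials & Guides",
    "Tool": "Tools",
}
REGIONS = {"Norwich", "UK"}


def migrate_tags(legacy_tags: list[str]) -> dict[str, list[str]]:
    """Split a record's legacy tags into the new Topics and Region fields."""
    topics, regions, unmapped = [], [], []
    for tag in legacy_tags:
        if tag in REGIONS:
            regions.append(tag)
        elif tag in TOPIC_MAP:
            topics.append(TOPIC_MAP[tag])
        else:
            unmapped.append(tag)  # flag for manual review
    return {"Topics": topics, "Region": regions, "Unmapped": unmapped}
```

The "Unmapped" bucket was the important part: rather than forcing every old tag into the new structure, anything ambiguous got set aside for a human decision.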
The Notion AI's strength here was contextual—it was working within the actual database, suggesting how to structure the properties and helping think through the migration logic. It wasn't fetching external data or writing scripts; it was helping me organise what I already had.
What I Actually Learned
Different tools for different phases. ChatGPT was good for initial diagnosis and understanding the problem space. Claude was better for the iterative, code-heavy execution where I needed to run things and refine based on results. Notion AI made sense for the organisational work that happened inside Notion itself. None of them was "best"—they were suited to different parts of the project.
AI doesn't eliminate iteration; it accelerates it. The OG image script went through four major versions. Each one failed in new ways that the previous version hadn't anticipated. The AI didn't predict all these edge cases upfront—but it responded to them quickly when I reported back. The feedback loop was fast, and that's what mattered.
Real projects are messier than demos. Every AI demo shows a clean prompt and a perfect response. Real work involves sites that block automated requests, data structures that don't match what you expected, and problems that expand as you solve them. The useful skill isn't getting AI to do something impressive once—it's knowing how to keep the conversation going when things don't work.
The human stays in the loop. I wasn't just prompting and waiting. I was testing scripts, identifying which sites failed and why, making judgment calls about which tags mapped to which topics. The AI handled things I couldn't do efficiently myself (fetching hundreds of URLs, generating code I couldn't write from scratch), but the direction and the quality control stayed with me.
The Boring Truth
The project is ongoing. The database exists, the images mostly work, and the tagging system is cleaner. There are probably still edge cases I haven't found yet, and some bookmarks will need manual attention.
That's the unglamorous reality of using AI tools for actual work: they don't deliver perfect solutions in a single prompt. They're collaborators in an ongoing process—sometimes brilliant, sometimes frustrating, always requiring you to stay engaged.
But that's also what makes them useful. Not as magic answer machines, but as capable partners who can take on the parts of a problem you couldn't handle alone, while you stay in the driver's seat for everything else.
This article was written with the help of Claude, synthesising chat logs from conversations with Claude, ChatGPT, and Notion AI.