You've seen it in Slack, Discord, and Twitter—paste a link, and it magically transforms into a rich preview with title, description, and image. This is called link unfurling, and in this tutorial, you'll learn how to build one from scratch.
What We're Building
By the end of this tutorial, you'll have a production-ready link unfurler that:
- Extracts metadata from any URL in real time
- Displays beautiful, responsive previews
- Caches results for performance
- Handles errors gracefully
- Works with any website, including SPAs
Here's what the final result looks like:
┌──────────────────────────────────────────────────┐
│ ┌─────────┐                                      │
│ │  Image  │  Example Website                     │
│ │         │  This is the page description that   │
│ └─────────┘  appears in the unfurled preview...  │
│              example.com                         │
└──────────────────────────────────────────────────┘
Understanding Link Unfurling
Before we code, let's understand how link unfurling works:
1. User pastes a URL into a text input
2. URL detection identifies valid URLs in the text
3. Metadata extraction fetches and parses the target page
4. Preview rendering displays the extracted data beautifully
The challenging part is step 3, metadata extraction. You can't simply fetch an arbitrary page from the browser, because most sites don't send the CORS headers that would let your JavaScript read the response. You need a backend service.
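You can see this for yourself from the browser console; a rough illustration (example.com stands in for any third-party site):

// Running in the browser, this is blocked by CORS for most third-party pages:
try {
  await fetch('https://example.com'); // ❌ rejects with "TypeError: Failed to fetch"
} catch {
  // The response never reaches our JavaScript
}

// So the client calls our own same-origin API route instead,
// and the server performs the cross-origin fetch:
await fetch('/api/unfurl', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ url: 'https://example.com' }),
}); // ✅ same origin, no CORS involved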
Project Setup
Let's create a new Next.js project:
npx create-next-app@latest link-unfurler --typescript --tailwind --app
cd link-unfurler
Install additional dependencies:
npm install @tanstack/react-query zod lucide-react
We're using:
- React Query for data fetching and caching
- Zod for runtime type validation
- Lucide React for the icons in the preview components
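One more bit of setup: the API route in Step 2 reads an API key from process.env.KATSAU_API_KEY, so add it to a .env.local file now (the variable name is just this tutorial's convention):

# .env.local
KATSAU_API_KEY=your_api_key_here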
The Architecture
Our unfurler has three main components:
┌──────────────────────────────────────────────────────┐
│                       Frontend                       │
├──────────────────────────────────────────────────────┤
│  URLInput → URLDetector → MetadataFetcher → Preview  │
└──────────────────────────────────────────────────────┘
                           │
                           ▼
┌──────────────────────────────────────────────────────┐
│                      API Route                       │
├──────────────────────────────────────────────────────┤
│  /api/unfurl → Validate URL → Fetch Metadata → Cache │
└──────────────────────────────────────────────────────┘
                           │
                           ▼
┌──────────────────────────────────────────────────────┐
│                   Metadata Service                   │
├──────────────────────────────────────────────────────┤
│     Katsau API / Custom Scraper / Other Provider     │
└──────────────────────────────────────────────────────┘
Step 1: URL Detection
First, let's create a utility to detect URLs in text:
// lib/url-detector.ts
const URL_REGEX = /https?:\/\/(www\.)?[-a-zA-Z0-9@:%._\+~#=]{1,256}\.[a-zA-Z0-9()]{1,6}\b([-a-zA-Z0-9()@:%_\+.~#?&//=]*)/gi;
export function extractUrls(text: string): string[] {
const matches = text.match(URL_REGEX);
return matches ? [...new Set(matches)] : [];
}
export function isValidUrl(url: string): boolean {
try {
new URL(url);
return true;
} catch {
return false;
}
}
// Test
console.log(extractUrls('Check out https://github.com and https://google.com'));
// ['https://github.com', 'https://google.com']
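One caveat: the regex's trailing character class includes the dot, so a URL at the end of a sentence ("see https://github.com.") is captured with its trailing period. If that matters for your input, a small cleanup helper covers the common cases (exactly which characters to strip is a judgment call):

// lib/url-detector.ts (optional helper)
export function stripTrailingPunctuation(url: string): string {
  return url.replace(/[.,;:!?)\]]+$/, '');
}

console.log(stripTrailingPunctuation('https://github.com.'));
// 'https://github.com'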
Step 2: API Route for Metadata Fetching
Create an API route that fetches metadata. We'll use Katsau's API for reliable extraction:
// app/api/unfurl/route.ts
import { NextRequest, NextResponse } from 'next/server';
import { z } from 'zod';
const requestSchema = z.object({
url: z.string().url(),
});
interface UnfurlResponse {
success: boolean;
data?: {
url: string;
title: string;
description: string;
image: string | null;
favicon: string | null;
siteName: string | null;
};
error?: string;
}
// Simple in-memory cache (use Redis in production)
const cache = new Map<string, { data: UnfurlResponse; timestamp: number }>();
const CACHE_TTL = 1000 * 60 * 60; // 1 hour
export async function POST(request: NextRequest) {
try {
const body = await request.json();
const { url } = requestSchema.parse(body);
// Check cache
const cached = cache.get(url);
if (cached && Date.now() - cached.timestamp < CACHE_TTL) {
return NextResponse.json(cached.data);
}
// Fetch metadata from Katsau API
const response = await fetch(
`https://api.katsau.com/v1/extract?url=${encodeURIComponent(url)}`,
{
headers: {
'Authorization': `Bearer ${process.env.KATSAU_API_KEY}`,
},
}
);
if (!response.ok) {
throw new Error('Failed to fetch metadata');
}
const metadata = await response.json();
const result: UnfurlResponse = {
success: true,
data: {
url,
title: metadata.data.title || new URL(url).hostname,
description: metadata.data.description || '',
image: metadata.data.image || null,
favicon: metadata.data.favicon || null,
siteName: metadata.data.siteName || new URL(url).hostname,
},
};
// Cache the result
cache.set(url, { data: result, timestamp: Date.now() });
return NextResponse.json(result);
  } catch (error) {
    // Return 400 for invalid input, 500 for upstream or server failures
    if (error instanceof z.ZodError) {
      return NextResponse.json(
        { success: false, error: 'Invalid URL' },
        { status: 400 }
      );
    }
    console.error('Unfurl error:', error);
    return NextResponse.json(
      { success: false, error: 'Failed to unfurl URL' },
      { status: 500 }
    );
  }
}
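Before wiring up the frontend, give the route a quick smoke test. A minimal sketch, assuming the dev server is running on localhost:3000 and KATSAU_API_KEY is set:

// Paste into the browser console on http://localhost:3000
const res = await fetch('/api/unfurl', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ url: 'https://github.com' }),
});
console.log(await res.json());
// → { success: true, data: { url: ..., title: ..., description: ..., ... } }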
Step 3: React Query Setup
Set up React Query for efficient data fetching:
// lib/query-client.ts
import { QueryClient } from '@tanstack/react-query';
export const queryClient = new QueryClient({
defaultOptions: {
queries: {
staleTime: 1000 * 60 * 60, // 1 hour
gcTime: 1000 * 60 * 60 * 24, // 24 hours
retry: 2,
refetchOnWindowFocus: false,
},
},
});
Create the provider:
// components/providers.tsx
'use client';
import { QueryClientProvider } from '@tanstack/react-query';
import { queryClient } from '@/lib/query-client';
export function Providers({ children }: { children: React.ReactNode }) {
return (
<QueryClientProvider client={queryClient}>
{children}
</QueryClientProvider>
);
}
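For the provider to take effect, wrap the app with it in app/layout.tsx. A minimal sketch; keep whatever fonts and metadata create-next-app generated for you:

// app/layout.tsx
import { Providers } from '@/components/providers';
import './globals.css';

export default function RootLayout({
  children,
}: {
  children: React.ReactNode;
}) {
  return (
    <html lang="en">
      <body>
        <Providers>{children}</Providers>
      </body>
    </html>
  );
}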
Step 4: The Unfurl Hook
Create a custom hook for unfurling:
// hooks/use-unfurl.ts
'use client';
import { useQuery } from '@tanstack/react-query';
interface UnfurlData {
url: string;
title: string;
description: string;
image: string | null;
favicon: string | null;
siteName: string | null;
}
async function fetchUnfurl(url: string): Promise<UnfurlData> {
const response = await fetch('/api/unfurl', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ url }),
});
if (!response.ok) {
throw new Error('Failed to unfurl');
}
const data = await response.json();
if (!data.success) {
throw new Error(data.error);
}
return data.data;
}
export function useUnfurl(url: string | null) {
return useQuery({
queryKey: ['unfurl', url],
queryFn: () => fetchUnfurl(url!),
enabled: !!url,
});
}
Step 5: The Link Preview Component
Now the fun part—building the preview component:
// components/link-preview.tsx
'use client';
import { useUnfurl } from '@/hooks/use-unfurl';
import { ExternalLink } from 'lucide-react';
interface LinkPreviewProps {
url: string;
}
export function LinkPreview({ url }: LinkPreviewProps) {
const { data, isLoading, error } = useUnfurl(url);
if (isLoading) {
return <LinkPreviewSkeleton />;
}
if (error || !data) {
return <LinkPreviewFallback url={url} />;
}
return (
<a
href={url}
target="_blank"
rel="noopener noreferrer"
className="group block border border-gray-200 dark:border-gray-800 rounded-xl overflow-hidden hover:border-blue-500 transition-all duration-200 hover:shadow-lg"
>
<div className="flex flex-col sm:flex-row">
{/* Image */}
{data.image && (
<div className="sm:w-48 h-32 sm:h-auto flex-shrink-0">
<img
src={data.image}
alt={data.title}
className="w-full h-full object-cover"
onError={(e) => {
e.currentTarget.style.display = 'none';
}}
/>
</div>
)}
{/* Content */}
<div className="flex-1 p-4">
{/* Site info */}
<div className="flex items-center gap-2 mb-2">
{data.favicon && (
<img
src={data.favicon}
alt=""
className="w-4 h-4 rounded"
onError={(e) => {
e.currentTarget.style.display = 'none';
}}
/>
)}
<span className="text-xs text-gray-500 dark:text-gray-400 uppercase tracking-wide">
{data.siteName}
</span>
<ExternalLink className="w-3 h-3 text-gray-400 opacity-0 group-hover:opacity-100 transition-opacity" />
</div>
{/* Title */}
<h3 className="font-semibold text-gray-900 dark:text-gray-100 line-clamp-2 group-hover:text-blue-600 dark:group-hover:text-blue-400 transition-colors">
{data.title}
</h3>
{/* Description */}
{data.description && (
<p className="text-sm text-gray-600 dark:text-gray-300 mt-1 line-clamp-2">
{data.description}
</p>
)}
</div>
</div>
</a>
);
}
function LinkPreviewSkeleton() {
return (
<div className="border border-gray-200 dark:border-gray-800 rounded-xl overflow-hidden animate-pulse">
<div className="flex flex-col sm:flex-row">
<div className="sm:w-48 h-32 bg-gray-200 dark:bg-gray-800" />
<div className="flex-1 p-4 space-y-3">
<div className="flex items-center gap-2">
<div className="w-4 h-4 bg-gray-200 dark:bg-gray-800 rounded" />
<div className="w-20 h-3 bg-gray-200 dark:bg-gray-800 rounded" />
</div>
<div className="w-3/4 h-5 bg-gray-200 dark:bg-gray-800 rounded" />
<div className="w-full h-4 bg-gray-200 dark:bg-gray-800 rounded" />
</div>
</div>
</div>
);
}
function LinkPreviewFallback({ url }: { url: string }) {
const hostname = new URL(url).hostname;
return (
<a
href={url}
target="_blank"
rel="noopener noreferrer"
className="flex items-center gap-3 p-4 border border-gray-200 dark:border-gray-800 rounded-xl hover:bg-gray-50 dark:hover:bg-gray-900 transition-colors"
>
<div className="w-10 h-10 bg-gray-100 dark:bg-gray-800 rounded-lg flex items-center justify-center text-lg">
🔗
</div>
<div className="flex-1 min-w-0">
<p className="font-medium text-gray-900 dark:text-gray-100">{hostname}</p>
<p className="text-sm text-gray-500 truncate">{url}</p>
</div>
<ExternalLink className="w-4 h-4 text-gray-400 flex-shrink-0" />
</a>
);
}
Step 6: The Message Input Component
Create an input that detects URLs as you type:
// components/message-input.tsx
'use client';
import { useState, useCallback } from 'react';
import { extractUrls } from '@/lib/url-detector';
import { LinkPreview } from './link-preview';
import { Send } from 'lucide-react';
export function MessageInput() {
const [text, setText] = useState('');
const [detectedUrls, setDetectedUrls] = useState<string[]>([]);
const handleChange = useCallback((e: React.ChangeEvent<HTMLTextAreaElement>) => {
const value = e.target.value;
setText(value);
    // Detect URLs on every keystroke (we'll debounce this in the Advanced section below)
    const urls = extractUrls(value);
    setDetectedUrls(urls);
}, []);
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
// Handle message submission
console.log('Sending:', text);
setText('');
setDetectedUrls([]);
};
return (
<form onSubmit={handleSubmit} className="space-y-4">
{/* URL Previews */}
{detectedUrls.length > 0 && (
<div className="space-y-3">
{detectedUrls.map((url) => (
<LinkPreview key={url} url={url} />
))}
</div>
)}
{/* Input */}
<div className="relative">
<textarea
value={text}
onChange={handleChange}
placeholder="Type a message... Paste a URL to see the preview!"
className="w-full p-4 pr-12 border border-gray-200 dark:border-gray-800 rounded-xl resize-none focus:outline-none focus:ring-2 focus:ring-blue-500 dark:bg-gray-900"
rows={3}
/>
<button
type="submit"
disabled={!text.trim()}
className="absolute right-3 bottom-3 p-2 bg-blue-500 text-white rounded-lg disabled:opacity-50 disabled:cursor-not-allowed hover:bg-blue-600 transition-colors"
>
<Send className="w-4 h-4" />
</button>
</div>
</form>
);
}
Step 7: Putting It All Together
Update your main page:
// app/page.tsx
import { MessageInput } from '@/components/message-input';
export default function Home() {
return (
<main className="min-h-screen bg-gray-50 dark:bg-gray-950 py-12">
<div className="max-w-2xl mx-auto px-4">
<h1 className="text-3xl font-bold text-center mb-2">
Link Unfurler Demo
</h1>
<p className="text-gray-600 dark:text-gray-400 text-center mb-8">
Paste any URL below to see it unfurl like Slack!
</p>
<div className="bg-white dark:bg-gray-900 rounded-2xl shadow-xl p-6">
<MessageInput />
</div>
</div>
</main>
);
}
Advanced: Debouncing URL Detection
For better performance, debounce the URL detection:
// hooks/use-debounce.ts
import { useState, useEffect } from 'react';
export function useDebounce<T>(value: T, delay: number): T {
const [debouncedValue, setDebouncedValue] = useState<T>(value);
useEffect(() => {
const timer = setTimeout(() => setDebouncedValue(value), delay);
return () => clearTimeout(timer);
}, [value, delay]);
return debouncedValue;
}
Use it in the message input: import useEffect from React and useDebounce from the new hook, drop the extractUrls call from handleChange, and derive the previews from the debounced value instead:
const debouncedText = useDebounce(text, 300);
useEffect(() => {
const urls = extractUrls(debouncedText);
setDetectedUrls(urls);
}, [debouncedText]);
Production Considerations
1. Rate Limiting
Protect your API route from abuse. This example uses Upstash (npm install @upstash/ratelimit @upstash/redis; Redis.fromEnv() expects UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN in your environment):
import { Ratelimit } from '@upstash/ratelimit';
import { Redis } from '@upstash/redis';
const ratelimit = new Ratelimit({
redis: Redis.fromEnv(),
limiter: Ratelimit.slidingWindow(10, '1 m'), // 10 requests per minute
});
export async function POST(request: NextRequest) {
  // Newer Next.js versions no longer expose `request.ip`; read the forwarded header instead
  const ip = request.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ?? '127.0.0.1';
const { success } = await ratelimit.limit(ip);
if (!success) {
return NextResponse.json(
{ error: 'Too many requests' },
{ status: 429 }
);
}
// ... rest of the handler
}
2. Blocking Internal URLs
Prevent unfurling of internal or otherwise sensitive URLs. A denylist is shown below; if you can enumerate the domains you care about, a strict allowlist is even safer:
const BLOCKED_DOMAINS = [
  'localhost',
  '127.0.0.1',
  '0.0.0.0',
  // Add internal hostnames/domains; in production also consider blocking
  // private IP ranges (10.x, 172.16-31.x, 192.168.x, 169.254.x)
];

function isAllowedUrl(url: string): boolean {
  try {
    const { hostname } = new URL(url);
    // Match exact hostnames or subdomains, not substrings,
    // so unrelated hostnames aren't caught by accident
    return !BLOCKED_DOMAINS.some(
      (blocked) => hostname === blocked || hostname.endsWith(`.${blocked}`)
    );
  } catch {
    return false;
  }
}
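Then call it in the POST handler right after the Zod parse; a sketch:

// app/api/unfurl/route.ts (inside the POST handler, after requestSchema.parse)
if (!isAllowedUrl(url)) {
  return NextResponse.json(
    { success: false, error: 'URL not allowed' },
    { status: 400 }
  );
}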
3. Caching with Redis
Replace in-memory cache with Redis for production:
import { Redis } from '@upstash/redis';
const redis = Redis.fromEnv();
async function getCachedUnfurl(url: string): Promise<UnfurlResponse | null> {
  // @upstash/redis serializes and deserializes JSON automatically, so no JSON.parse here
  return await redis.get<UnfurlResponse>(`unfurl:${url}`);
}

async function setCachedUnfurl(url: string, data: UnfurlResponse) {
  await redis.set(`unfurl:${url}`, data, { ex: 3600 }); // expire after one hour
}
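These two helpers replace the Map lookups in the route handler; roughly:

// Inside the POST handler
const cached = await getCachedUnfurl(url);
if (cached) {
  return NextResponse.json(cached);
}

// ... fetch metadata and build `result` as before ...

await setCachedUnfurl(url, result);
return NextResponse.json(result);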
Why Use a Metadata API?
You might wonder: "Why not scrape the pages myself?"
Building your own scraper means handling:
- CORS issues requiring a backend
- JavaScript-rendered pages (SPAs, React sites)
- Rate limiting from target sites
- Different meta tag formats (OG, Twitter, Dublin Core; sketched below)
- Relative URL resolution
- Encoding issues
- Timeouts and retries
- Caching infrastructure
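To make that concrete, here's roughly what the meta-tag bullet alone looks like in a hand-rolled scraper, and only for the title field (a sketch using cheerio, which is not part of this project):

// Illustration only: the fallback chain a DIY scraper needs for titles alone
import * as cheerio from 'cheerio';

function extractTitle(html: string, pageUrl: string): string {
  const $ = cheerio.load(html);
  return (
    $('meta[property="og:title"]').attr('content') ||  // Open Graph
    $('meta[name="twitter:title"]').attr('content') || // Twitter Cards
    $('meta[name="DC.title"]').attr('content') ||      // Dublin Core
    $('title').text().trim() ||                        // plain <title>
    new URL(pageUrl).hostname                          // last resort
  );
}

Multiply that by description, image, favicon, and site name, then add relative-URL resolution, retries, and caching.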
A metadata API like Katsau handles all of this, letting you focus on the user experience.
Conclusion
You've built a production-ready link unfurler! Here's what we covered:
- URL detection with regex
- API route for metadata fetching
- React Query for caching
- Beautiful preview components
- Production considerations
The complete source code is available on GitHub. For reliable metadata extraction at scale, check out Katsau's API—it handles all the edge cases so you don't have to.
Building something cool with link unfurling? Get your free API key and start unfurling!
Try Katsau API
Extract metadata, generate link previews, and monitor URLs with our powerful API. Start free with 1,000 requests per month.