The Default Choice
If you've ever built a portfolio, the default advice is GitHub Pages. It's free, it connects to your repo, and it takes about five minutes to set up. For most people, that's where the decision ends.
But I wanted two things GitHub Pages makes awkward: real control over my DNS and an actual security layer in front of my site. Both led me to Cloudflare.
It's a CDN First
Cloudflare Pages deploys exactly like GitHub Pages. You push to main, it builds, it deploys.
The difference is what happens after the build. GitHub Pages serves your static files from GitHub's servers. Cloudflare distributes them across a global CDN. The latency difference isn't theoretical; it's measurable, especially outside the US.
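If you want to see it for yourself, a rough check is easy to script. This is only a sketch, assuming Node 18+ run as an ES module, and both URLs are placeholders for the same site deployed to GitHub Pages and Cloudflare Pages:

```ts
// Rough, unscientific latency check: time a full fetch of each deployment.
// Run it a few times from different networks for anything meaningful.
const targets = [
  "https://example.github.io/", // placeholder GitHub Pages URL
  "https://example.pages.dev/", // placeholder Cloudflare Pages URL
];

for (const url of targets) {
  const start = performance.now();
  const res = await fetch(url);
  await res.arrayBuffer(); // drain the body so the timing includes transfer
  const ms = performance.now() - start;
  console.log(`${url} -> ${res.status} in ${ms.toFixed(0)} ms`);
}
```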
DNS management is also cleaner. To put a custom domain on GitHub Pages, I add A or CNAME records at my DNS provider pointing to GitHub and then wait for propagation. With Cloudflare, my nameservers already point at Cloudflare, so it is my DNS provider, and record changes take effect in seconds.
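Everything in that DNS dashboard is also scriptable through Cloudflare's v4 API. Here's a sketch of adding a proxied CNAME record; the zone ID, token, and hostnames are placeholders, not values from this site:

```ts
// Sketch: create a proxied CNAME record via the Cloudflare v4 API.
// CF_ZONE_ID and CF_API_TOKEN are assumed environment variables.
const res = await fetch(
  `https://api.cloudflare.com/client/v4/zones/${process.env.CF_ZONE_ID}/dns_records`,
  {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.CF_API_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      type: "CNAME",
      name: "www",                  // subdomain to create
      content: "example.pages.dev", // placeholder Pages hostname
      proxied: true,                // serve it through Cloudflare's edge
    }),
  },
);

const data = (await res.json()) as { success: boolean; errors: unknown[] };
console.log(data.success ? "Record created" : data.errors);
```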
The Security Gap
This is the real reason I switched.
Cloudflare operates as a reverse proxy. When you visit this site, you aren't hitting my hosting directly. You're hitting a Cloudflare edge server, which fetches the content and serves it back to you. Because Cloudflare sits in the middle, it can filter the traffic.
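To make that concrete, here's a toy Cloudflare Worker. This isn't how Cloudflare's own protections are built; it's just a minimal illustration of the reverse-proxy idea: look at the request at the edge, and only forward the traffic you want to the origin.

```ts
// Illustrative only: a Worker that sits in front of an origin, drops obvious
// vulnerability probes, and passes everything else through.
const BLOCKED_PATHS = ["/wp-login.php", "/xmlrpc.php", "/.env"];

export default {
  async fetch(request: Request): Promise<Response> {
    const { pathname } = new URL(request.url);

    // Reject known probe paths before they ever reach the origin.
    if (BLOCKED_PATHS.some((p) => pathname.startsWith(p))) {
      return new Response("Forbidden", { status: 403 });
    }

    // Otherwise act as a pass-through proxy.
    return fetch(request);
  },
};
```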
DDoS protection is automatic. SSL/TLS just works. And bot management drops scrapers and vulnerability scanners before they ever load a page.
Let's talk about that last one.
Dealing with Bots
If you've recently clicked a checkbox to prove you aren't a robot and never had to solve an image puzzle, you've probably interacted with Cloudflare Turnstile. It's a less annoying alternative to Google's reCAPTCHA.
Websites need this because a large share of internet traffic isn't human. Bot crawlers are automated programs that visit websites at scale. Some are legitimate: search engine indexers, uptime monitors. Others are not.
- Scrapers pull email addresses to sell to spam lists.
- Credential stuffers throw leaked passwords at login screens.
- Vulnerability scanners probe for old WordPress exploits or open .env files.
Websites try to block the bad ones because unchecked bots waste server resources, distort analytics, and, in the worst cases, extract data or probe for exploits. The traditional answer was CAPTCHA: make the user prove they're human by solving a visual puzzle that bots struggle with.
Cloudflare Turnstile is a quieter replacement. Instead of interrupting the user, it runs its checks in the background, drawing on browser signals, behavioral patterns, and network analysis, and usually reaches a verdict without asking anyone to do anything. That small "Checking..." message before a form loads? That's often Turnstile.
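Under the hood the flow has two halves: the widget in the browser produces a short-lived token, and your backend confirms that token with Cloudflare's siteverify endpoint before trusting the submission. A minimal sketch of the server-side half, with the secret-key variable name as an assumption, looks roughly like this:

```ts
// Sketch: verify a Turnstile token server-side before accepting a form post.
// TURNSTILE_SECRET_KEY is an assumed environment variable holding the secret
// key from the Cloudflare dashboard.
async function verifyTurnstileToken(token: string, ip?: string): Promise<boolean> {
  const body = new URLSearchParams({
    secret: process.env.TURNSTILE_SECRET_KEY ?? "",
    response: token, // the token the widget added to the form submission
  });
  if (ip) body.set("remoteip", ip);

  const res = await fetch(
    "https://challenges.cloudflare.com/turnstile/v0/siteverify",
    { method: "POST", body },
  );
  const outcome = (await res.json()) as { success: boolean };
  return outcome.success; // false means drop or re-challenge the request
}
```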
My portfolio doesn't have a login screen to stuff. But scrapers mining my contact info? That happens to everyone. Bots hammering the site and inflating analytics? Also real.
Cloudflare's free tier handles this quietly. I have Bot Fight Mode enabled, so bad actors get challenged or dropped at the edge. Real humans just see the website load fast.
Why Not Shared Hosting?
You could rent a $5/month shared hosting plan with cPanel or a basic Nginx setup. That makes sense if you need a database or are running WordPress.
For a static site, though? You're paying money to manage infrastructure you don't need, and you're giving up the global CDN and automated security that Cloudflare gives you for free.
The Actual Reason
I'll be honest. The real motivation isn't just slightly faster loading times.
Configuring deployment pipelines, managing DNS, and understanding how a reverse proxy works: this is web infrastructure knowledge that actually matters. Using Cloudflare Pages isn't just about finding a place to put my HTML. It's practice for production environments.
GitHub Pages is fine. But Cloudflare gives you an enterprise-grade setup at no cost, and learning to operate it teaches you things shared hosting never will.