Build in Public
March 2026 · 12 min read

We Published 45 Blog Posts in 30 Days. Here's What Actually Happened.

Real Search Console data, real rankings, real mistakes. No theory — just what happened when we tried to build an entire content library in a month.

We launched syxoai.com in mid-February 2026. Four weeks later, we had 45 blog posts live, 32 trade pages published, product pages built, and a resource hub wired up. Roughly 100 URLs total on a domain that didn't exist 30 days earlier.

This is what the data actually shows. Not what we expected, not what we hoped — what happened. Every number in this post comes straight from Google Search Console.

Some of it's encouraging. Most of it's humbling. All of it's real.

Why We Published 45 Posts in 30 Days

Let's get this out of the way: publishing 45 blog posts in a month is probably not best practice. We know that. We did it anyway.

The theory was simple. Google ranks pages, not websites. More indexed pages means more chances for Google to test us in search results. More tests means faster feedback on what works. Instead of publishing 2 posts a week for 6 months and waiting to learn anything, we wanted to compress the learning cycle.

We also wanted to test whether AI-assisted content workflows could realistically shortcut the typical 6-12 month SEO timeline for a brand new site. Every marketing blog says "it takes time." We wanted to know exactly how much time and exactly what happens in those early weeks.

So we went all in. 45 blog posts. 32 trade pages. Product pages. Pillar pages. Everything published within 30 days.

Here's what happened.

The Content Breakdown

Not all 100 URLs were the same. The breakdown: 45 blog posts, 32 trade pages, plus product pages, pillar pages, and a resource hub.

Total: roughly 100 indexed URLs within 30 days of domain registration.

Every piece was written using AI workflows — Claude, ChatGPT, structured prompt chains. But every piece was also edited by hand. We weren't publishing raw AI output. We were using AI to draft, then spending 20-40 minutes per post on editing, restructuring, and adding our own examples.

That distinction matters for what happened next.

Week 1-2: The Honeymoon

Google indexed fast. We submitted every URL through Search Console the day it went live, and most pages appeared in the index within 48 hours. Some within hours.

Then something exciting happened. Several blog posts started showing up at genuinely competitive positions.

Position 7. On a site that was two weeks old. With zero backlinks. Against established domains.

We got excited. We started planning content calendars based on these positions holding. We talked about how our AI workflows had cracked the code.

We shouldn't have.

The numbers behind those positions were tiny. Total for the first two weeks: approximately 184 impressions and 4 clicks. Four. That's not traffic. That's a rounding error.

But the positions themselves felt real. And they were — temporarily.

Week 3-4: The Correction

Week three is when things got interesting. And by interesting, I mean painful.

Impressions actually went up. Way up. We jumped from 184 per week to 758 per week. More of our pages were appearing in search results. Google was clearly aware of us.

But the positions collapsed.

Pages that had been sitting at position 7-8 disappeared entirely. Not dropped to position 20 — gone. Vanished from the results we could track. Our average position across all queries went from around 26 to 70.

Clicks? Zero. Not one click in the entire third week.

What happened is well documented in SEO circles but rarely talked about with real numbers: Google gives new content a brief "honeymoon" period. It places your pages in competitive positions temporarily to test whether searchers engage with them. Click-through rate, dwell time, bounce rate — Google's measuring all of it.

If the signals are strong, you keep the position. If not, you get pulled back while Google reassesses.

We got pulled back. Hard.

The 758 impressions with zero clicks tells the whole story. Our pages were appearing in results, but at positions so low (50-100) that nobody was clicking. Google had given us a test, we hadn't passed, and now we were sitting in the penalty box while it figured out where we actually belonged.

The One Query That Actually Stuck

One query defied the pattern. While everything else collapsed, "keyword research in 90 minutes" held steady at position 4.8 with 94 impressions.

Position 4.8. On a one-month-old domain. No backlinks.

We spent a long time thinking about why this one survived when everything else didn't. Three reasons stood out:

It was ultra-specific. Not "keyword research" — that's Ahrefs territory, and we'd be delusional to compete. "Keyword research in 90 minutes" is a constrained query. It implies a method, not a topic. It's someone looking for a specific workflow, not a general education.

Big sites hadn't targeted this exact phrase. Ahrefs has a keyword research guide. Semrush has one. Moz has one. None of them had written specifically about doing keyword research in 90 minutes. That specificity gave us a gap to fill.

The intent match was tight. Someone searching "keyword research in 90 minutes" wants a step-by-step workflow with a time constraint. That's exactly what our post delivered. Not theory. Not a tool comparison. A timed workflow.

This became our north star. The pattern that works isn't about volume or even quality in the abstract — it's about finding queries where specificity gives you an edge that authority can't override.

What Worked and What Didn't — The Honest Breakdown

Four weeks of data is enough to see patterns, even if it's too early for conclusions. Here's what the numbers tell us.

What Worked

Specific workflow posts with constraints. Anything with a timeframe ("in 90 minutes"), a method ("step-by-step"), or a constraint ("without paid tools") performed measurably better than broad topic posts. These queries have lower search volume, but we could actually compete for them.

Branded pages stayed stable. Our homepage held steady at position 4 for branded queries. The brand facts page sat at position 3.5. These weren't affected by the honeymoon correction because they weren't competing — nobody else ranks for "Syxo." Obvious, but worth noting: brand equity is the one ranking you can't lose to bigger sites.

Fast indexing through Search Console. Every page we submitted was indexed within 48 hours. The few pages we forgot to submit took 7-10 days. If you're publishing and not submitting through Search Console, you're leaving a week of potential ranking time on the table.
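We submitted URLs manually through the Search Console UI. A complementary step, and the reason the few forgotten pages eventually got found at all, is keeping sitemap.xml current so Google discovers new URLs on its own. A minimal stdlib sketch (the URLs and dates here are hypothetical placeholders, not our actual slugs):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Render a minimal sitemap.xml from (loc, lastmod) pairs."""
    urlset = Element("urlset", xmlns=NS)
    for loc, lastmod in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
        SubElement(url, "lastmod").text = lastmod
    return tostring(urlset, encoding="unicode")

# Hypothetical pages -- swap in your real post slugs.
pages = [
    ("https://syxoai.com/blog/keyword-research-in-90-minutes", "2026-02-20"),
    ("https://syxoai.com/blog/ai-marketing-for-real-estate", "2026-03-01"),
]
xml = build_sitemap(pages)
```

Regenerate this on every publish and reference it from robots.txt; Search Console submission then covers the fast path, and the sitemap covers anything you forget.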

What Didn't Work

Broad topic posts competing with HubSpot and Semrush. "How to use ChatGPT for marketing" reached position 7 during the honeymoon. Then it vanished. Every major marketing blog has published this exact topic. We had no business competing there with a one-month-old domain and zero backlinks.

Trade pages against established agencies. Our 32 trade pages ("marketing for plumbers," "marketing for electricians") are sitting at average position 85. They're competing against actual SEO agencies who've been serving these industries for years, with client testimonials, case studies, and hundreds of backlinks. We're an education platform trying to rank for service queries. Wrong competitive set entirely.

Generic "how to" content. Any post where our title could have been written by any marketing blog performed poorly. "How to use AI for marketing" is a title that exists on 500 websites. We added nothing unique to that conversation.

The Surprise

One niche profession post — AI marketing for real estate agents — briefly appeared at position 3. Position 3, above established marketing sites.

Why? Because major marketing blogs write about "AI marketing" broadly. They don't write about AI marketing specifically for real estate agents. The niche intersection — AI + marketing + specific profession — was a gap nobody had filled.

It didn't hold (honeymoon again), but the signal was clear. Niche intersections where big sites haven't written specifically are the opportunity for new sites.

Find Your Biggest Marketing Gap

The free AI Marketing Systems Score tells you which of your 5 systems needs attention first.

Take the Free Quiz

The Trade Pages Experiment — Honest Assessment

Let's talk about the trade pages specifically, because this was our biggest strategic mistake.

The idea was straightforward: create 32 pages targeting local service businesses. "Marketing for plumbers." "Marketing for flooring companies." "Marketing for landscapers." Each page would rank for its niche and funnel service businesses to our products.

Here's the reality after 30 days: an average position of 85 across all 32 pages.

Position 85 means we're on page 9 of Google. Nobody scrolls to page 9. These impressions are phantom — Google is testing us in the index but nowhere near where humans actually look.

The competitive problem is structural, not fixable with better content. The sites ranking for "marketing for plumbers" are actual marketing agencies that serve plumbers. They have case studies from plumbing clients. They have testimonials from plumbers. They have years of content about the plumbing industry specifically.

We're an AI marketing education platform. We teach systems. We don't serve plumbers. Google knows that, and no amount of content optimization is going to overcome the fundamental mismatch between what we are and what these queries want.

These 32 pages represent roughly 40 hours of work — writing, editing, publishing, internal linking. That's 40 hours we could have spent on 8-10 deeply researched posts targeting queries we can actually win.

Lesson learned the expensive way.

What We'd Do Differently

Hindsight is useful only if you're specific about what you'd change. Here's our list:

Write 10 posts, not 45. Depth over breadth. Every one of our 45 posts was decent. None of them were exceptional. If we'd taken the same total effort and concentrated it into 10 posts, each one would have been 4x better. Better research, more original data, stronger examples, more comprehensive coverage. At month one, you don't need volume. You need a few pieces that are genuinely worth linking to.

Every title needs a constraint. Timeframe, budget, method, or audience. Not "keyword research" but "keyword research in 90 minutes." Not "AI marketing" but "AI marketing system for a one-person business." Not "content strategy" but "build a marketing system as a solopreneur." The constraint is what separates you from every big site that's already written the generic version.

Target problem keywords, not tool keywords. "How to get traffic to a new website" is a problem keyword. "How to use ChatGPT for marketing" is a tool keyword. The difference matters because problem keywords attract people looking for solutions (potential customers), while tool keywords attract people looking for tutorials (often other marketers). And tool keywords are dominated by the tool companies themselves.

Write moat content with original data earlier. This post you're reading right now — with real numbers, real mistakes, real Search Console data — is the kind of content that earns backlinks. Nobody else has our data. It can't be replicated by a bigger site because it's about our specific experience. We should have written this in week two, not week five.

Skip the trade pages entirely. 32 pages targeting an audience we don't serve, competing against businesses that do serve them. Complete misallocation. Those hours should have gone into 5-6 original research pieces or build-in-public posts.

Focus on backlink-earning content first, keyword-targeted content second. New sites have a chicken-and-egg problem. You need backlinks to rank, but you need rankings to get noticed enough to earn backlinks. The way to break the cycle is to create content that people link to because it's original — data, experiments, frameworks, tools — not because it ranks. Rankings come later, after the backlinks arrive.

The Pattern That Works

After staring at our Search Console data for weeks, one pattern is unmistakable.

Every post that held its position or showed upward movement followed the same structure in its title: [Specific task] + [Specific method or tool] + [Surprising constraint].

Not "AI SEO." That's a topic. "Keyword Research in 90 Minutes." That's a task with a method and a constraint.

Not "Content Marketing." That's a category. "Content Strategy for a One-Person Business." That's a task with an audience constraint.

Not "How to Rank on Google." That's a question with a million answers. "How We Went from Zero to Page 1 in 30 Days." That's a specific claim with a timeframe.
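The pattern is mechanical enough to sanity-check in code. Here's a toy checker for draft titles; the regex lists are our own rough guesses at constraint signals, not a validated taxonomy:

```python
import re

# Hypothetical signal patterns -- tune these for your own niche.
CONSTRAINTS = [
    r"\bin \d+ (minutes|hours|days)\b",                    # timeframe
    r"\bwithout [a-z ]+",                                  # exclusion
    r"\bfor (a |an )?[a-z-]+ (business|team|agents?)\b",   # audience
    r"\bstep[- ]by[- ]step\b",                             # method
]

def has_constraint(title: str) -> bool:
    """True if the title carries at least one specific constraint."""
    t = title.lower()
    return any(re.search(p, t) for p in CONSTRAINTS)

print(has_constraint("Keyword Research in 90 Minutes"))          # True
print(has_constraint("Content Strategy for a One-Person Business"))  # True
print(has_constraint("How to Use AI for Marketing"))             # False
```

If a draft title returns False, it's probably a topic, not a task, and a bigger site has already written it.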

The reason this works is competitive dynamics. Big sites own the broad queries. Ahrefs owns "keyword research." HubSpot owns "content marketing." Neil Patel owns "how to rank on Google." You're not going to outrank them with better content — they have 15 years of domain authority and thousands of backlinks.

But they haven't written about keyword research in 90 minutes. They haven't written about content strategy specifically for one-person businesses. They haven't published their actual Search Console data from a 30-day publishing experiment.

Specificity beats authority when you're a new site. Not always and not forever — but consistently enough to build traction while you earn the authority you need for bigger keywords later.

Where We Go from Here

We're not stopping. The data from month one is humbling but not discouraging. We learned more in 30 days of live testing than we could have learned in 6 months of reading SEO blogs.

Here's the plan for month two and beyond:

Earn backlinks through original data. This post is the start. We'll publish monthly data updates showing our Search Console numbers, what's working, what's not. Build-in-public content with real numbers is the highest-value backlink earner for a site our size. Other bloggers and newsletter writers link to original data because they can't create it themselves.

Write fewer, better posts. Target: 2-3 posts per week instead of 10-15. Each one targeting a validated problem keyword with a specific constraint in the title. Every post gets at least 2 hours of research and editing, not the 40 minutes we were spending during the sprint.

Build topical clusters. Instead of scattering across 45 topics, we're building deep clusters around three themes: content strategy for small teams, SEO for beginners using AI, and getting traffic without a budget. Ten posts per cluster, all interlinked, all reinforcing each other's authority.

Retire the trade pages. The 32 local service pages aren't helping. They're diluting our topical authority by making Google think we're a marketing agency rather than an education platform. We'll either redirect them to relevant blog posts or noindex them.

Target the keywords that stuck. "Keyword research in 90 minutes" worked. We'll find 20 more queries with the same profile: specific, constrained, uncontested by big sites, and aligned with what we actually teach.
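Finding those 20 queries is a filtering job over a Search Console queries export. A sketch of the filter we have in mind, with hypothetical rows and column names that mirror a GSC CSV but are our own assumption:

```python
import csv, io

# Hypothetical export data -- in practice, download the Queries
# report from Search Console as CSV.
EXPORT = """query,position,impressions
keyword research in 90 minutes,4.8,94
how to use chatgpt for marketing,70,120
ai marketing for real estate agents,9.5,40
marketing for plumbers,85,15
"""

def winnable(rows, max_pos=15.0, min_words=4):
    """Keep long-tail queries already testing within striking distance."""
    keep = []
    for r in rows:
        long_tail = len(r["query"].split()) >= min_words
        close = float(r["position"]) <= max_pos
        if long_tail and close:
            keep.append(r["query"])
    return keep

rows = csv.DictReader(io.StringIO(EXPORT))
print(winnable(rows))
```

The thresholds are judgment calls: position 15 roughly means "Google already considers you a page-1-or-2 candidate," and four-plus words is a crude proxy for the specificity that held up in our data.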

The 45-post sprint wasn't a waste. It was the most expensive market research we've ever done — and every number in our Search Console is a lesson we couldn't have learned any other way.

We'll be back next month with the updated data. Follow along if you want to see what month two looks like for a real site, with real numbers, making real mistakes in public.

Frequently Asked Questions

How many blog posts should a new website publish per month?

There's no magic number. Based on our data, 10 well-targeted posts per month outperform 45 broad ones. The key variable isn't volume — it's keyword specificity. Each post should target a query where you can realistically compete. For new sites, 2-4 high-quality posts per week targeting long-tail keywords with constraints (timeframe, method, budget) will build traction faster than flooding Google with dozens of generic articles.

Can AI-written content rank on Google?

Yes, but with caveats. Our AI-assisted posts reached positions 7-13 within the first two weeks for several keywords. Google doesn't penalize AI content — it penalizes low-quality content regardless of how it was made. The posts that held their rankings were the ones we edited heavily by hand, added original data to, and targeted specific queries the big sites hadn't covered. Pure AI output without editing and original insight won't sustain rankings.

How long does it take a new website to rank on Google?

Our data shows Google indexes new pages within days if you submit through Search Console. Initial rankings (positions 7-20) can appear within 1-2 weeks — but these are often temporary "honeymoon" rankings that Google uses to test your content. Expect a correction around weeks 3-4 where positions drop significantly. Sustainable rankings for a new site typically take 3-6 months for low-competition keywords, and 6-12 months for anything moderately competitive.

Is it better to publish many short posts or a few long ones?

Based on our experiment, fewer long posts win. Our 45-post blitz produced mostly thin results because we spread effort across too many topics. The posts that actually ranked were the longer, more specific ones with original angles. A 2,000-word post targeting a precise keyword with real examples will outperform five 500-word posts on generic topics every time — especially on a new site without domain authority.

Do you need backlinks to rank on Google?

For low-competition, long-tail keywords — not necessarily. Our post targeting "keyword research in 90 minutes" held position 4.8 with zero backlinks. But for anything moderately competitive, backlinks are the difference between page 3 and page 1. Our broad topic posts couldn't compete without them. The strategy that works for new sites: start with specific keywords you can rank for without backlinks, then create original-data content (like this post) that naturally earns them.
