Case study: How I tanked, and then saved, a $50k website
I acquired a website for $50k and lost 50% of the traffic. Here's how I fixed it and grew the site back to $6k/mo.
I email a lot of webmasters asking to buy their sites. It’s really been my secret weapon over the years: piggybacking on their existing authority to build out high-performing online businesses.
This is the story of one such purchase, from late 2021. The short version of the story is this:
I acquired an abandoned site and paid a 7-year multiple of $50k for it. Immediately following the migration to a new, and very much improved, version, the site lost 50% of its traffic. Ouch. :(
As you can see from the analytics chart, the story does have a happy ending, and the site now earns $6-7k a month passively. Programmatic SEO, ftw.
If you’re interested in acquiring and migrating sites, let this tale of misfortune help you to make better decisions than me.
Let’s jump in.
Table of Contents
- The benefits of persistent email outreach
- Due diligence and escrow process (or lack thereof)
- Finding the data sources and making them usable
- Technical SEO considerations (and my nearly fatal mistake)
- Creating the site and writing 29 (!) data templates
- What do you do when you lose 50% of your traffic?
- The long road to recovery
- What did I learn?
- What next for this site?
The benefits of persistent email outreach
There’s a fine line between being persistent and being annoying. I’m pretty sure I have crossed that line (and will continue to cross it), but it is an art form.
In the case of this website (no URL reveal on this just yet) it actually took 3 emails and 25 days before I even got a reply from the owner.
The site was making ~$600 a month from AdSense, but had previously made $1,300 a month. The niche centres on a specific type of US data that is updated regularly, but the site hadn’t been updated for years.
The owner wanted $50k, which was a 7x annual multiple. This is more than I am usually willing to pay but I was looking at it with the following in mind:
- The data hadn’t been updated in years, which is an opportunity to freshen up the content.
- The content itself was thin. I knew that with programmatic SEO I could create much more engaging and analytical long form content around the new data I planned to source.
- It was on AdSense and had enough traffic to move to another network, so I knew I’d get a big boost in earnings from that alone.
- Technically it was a nightmare, with duplicate content and poor on-page SEO = quick wins!
So, I agreed to the $50k price.
Due diligence and escrow process (or lack thereof)
I’m quite relaxed about DD and as long as I can verify the income and traffic and use Ahrefs or Semrush to check for naughty backlink profiles I’m usually happy to proceed.
The timeline on this went like so:
- Escrow transaction set up on October 10th
- Funds received in Escrow on October 14th
- Transfer of all assets (domain name, existing codebase, spreadsheets, SQL database) completed by October 21st
- Inspection period for me to confirm receipt of the assets ends on October 24th.
- Seller gets $$$.
During the handover period, I set up the exact same OG site with a new host, and left it while I got to work trying to figure out what a new version would look like.
This is where the chaos ensues…
Finding the data sources and making them usable
A few of the data sources were very, very simple, and could be downloaded as a CSV from the US government site.
For those, I quickly threw them into a MySQL database and added an index to speed up searches on the almost 10 million rows of information.
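That load-and-index step is straightforward. Here's a minimal sketch of it - I'm using Python with SQLite here so it runs self-contained (the real site used MySQL), and the table and column names are made up for illustration:

```python
import csv
import io
import sqlite3

# Hypothetical CSV snippet standing in for one of the government downloads.
raw_csv = """record_id,state,value
1,TX,120
2,CA,340
3,TX,98
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (record_id INTEGER, state TEXT, value INTEGER)")

# Load every CSV row into the table.
rows = list(csv.DictReader(io.StringIO(raw_csv)))
conn.executemany("INSERT INTO records VALUES (:record_id, :state, :value)", rows)

# The index is what keeps lookups fast once you're near 10 million rows.
conn.execute("CREATE INDEX idx_records_state ON records (state)")

texas_count = conn.execute(
    "SELECT COUNT(*) FROM records WHERE state = 'TX'"
).fetchone()[0]
print(texas_count)  # 2
```

The index matters far more than the import code: without it, every page load would scan millions of rows for each query.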
However, the primary topic of the website was the trickiest of all the datasets. I needed to get this right so that Google would still consider the site a topical source of authority. Honestly, the data was PAINFUL to work with.
Here is what I was working with:
- Data was found in multiple different locations and needed to be combined
- Most of the data was in the form of codes, which then map to tables with the real values
- Every single field or row had a random ID that did not explain what the value was
- There were almost 200 data points that I wanted to track and monitor
I hadn’t worked with this type of data before, so I found a Python developer to write me a set of scripts that would pull the data in from all of the locations, along with pretty column names that described what each piece of data was.
In total, there were 20 different data sources and 6 mapping files that were needed to get the data into a single CSV that could be imported to MySQL.
To get the data into the correct format, there were also 6 different scripts that needed to be run in order AND all of this was only possible after I had manually downloaded all of the raw files. Phew!
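The core of what those scripts did - decoding opaque codes via mapping tables and renaming random column IDs to descriptive names - can be sketched like this. All the file contents, codes, and column names below are hypothetical stand-ins:

```python
import csv
import io

# Hypothetical mapping file: opaque code -> real value (one of the 6 mapping files).
category_map_csv = """code,label
01,Category A
02,Category B
"""

# Hypothetical raw export where the header row is random column IDs
# and the values are codes.
raw_data_csv = """F001,F002
01,2021
02,2020
"""

# Random column IDs -> pretty, descriptive names.
column_names = {"F001": "category", "F002": "year"}

category_map = {
    row["code"]: row["label"]
    for row in csv.DictReader(io.StringIO(category_map_csv))
}

combined = []
for row in csv.DictReader(io.StringIO(raw_data_csv)):
    # Rename the opaque column IDs...
    renamed = {column_names[key]: value for key, value in row.items()}
    # ...then decode the coded value via the mapping table.
    renamed["category"] = category_map[renamed["category"]]
    combined.append(renamed)

print(combined[0])  # {'category': 'Category A', 'year': '2021'}
```

Multiply this by 20 data sources, 6 mapping files, and ~200 data points and you can see why it needed a dedicated set of scripts.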
Note: This approach worked but when it came to updating the data again in 2022 I had not only forgotten how to run the scripts, but also realised that the code wasn’t great and was still very time consuming to set up.
I’ve now rewritten this entirely myself and have a very automated way of getting the data using government API endpoints.
It took around 7-10 days before all of this work was completed and I had a database ready to go. The next step (which I worked on while the dev created these scripts) was to sort out my technical SEO needs so that I wouldn’t lose any search traffic when it came time to flip the switch. LOL.
Technical SEO considerations (and my nearly fatal mistake)
Whenever you do a migration, you need to make sure that you have a complete list of old => new URLs so that you can do 301 redirects as necessary.
Let me tell you. The old URLs on this site were horrendous. For example:
A large part of the programmatic SEO method I planned to use on this site meant that I needed proper URL structure and sections that could be programmatically mapped to the right content.
The structure of the existing site was so erratic, with content in the same sections having completely different URLs, that it was virtually impossible to keep the existing URLs as-is.
This was my biggest “mistake”, but I don't truly consider it to be one because I didn't really have any other choice.
In migrations after this, I have mostly been able to keep the URLs the same, and to say it was plain sailing in comparison would be an understatement.
I knew when I made the decision to change the URLs that there was a risk of the site losing a lot of traffic, but the long-term benefit of clean URLs, well-structured sections, and removing the nauseating hyphen and underscore combo, was worth it.
For a couple of dozen URLs, you could probably get away with completely changing the URLs and having no issue.
On this site, however, I completely changed over 110,000 URLs.
Once I’d decided to YOLO this $50k on a risky SEO strategy and had figured out the sections of the site I wanted to build, I needed to at least try and mitigate the risk by doing proper 301 redirects.
To do this, I turned to my trusty steed, Screaming Frog.
I crawled the entire OG site and exported the document to Excel so I could get to work. What followed was a painstaking week of mapping all of the URLs found in SF to the sections I’d build in the new site.
There were 10 sections of the site that were just so painful to try and work with and map to new URLs that I used regular expressions to just redirect entire directories to either the home page or another, relevant URL.
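The resulting redirect logic boiled down to two layers: an exact old-to-new lookup for the mapped URLs, and regex fallbacks for the directories that were redirected wholesale. Here's a rough sketch of that idea (in Python rather than the site's PHP, and with made-up URLs and patterns):

```python
import re

# Hypothetical exact old -> new mappings (the spreadsheet-mapping work).
exact_redirects = {
    "/old_page-one.html": "/section-one/page-one/",
    "/stats_2019-data.php": "/stats/2019/",
}

# Hypothetical regex fallbacks for directories too messy to map one by one.
pattern_redirects = [
    (re.compile(r"^/forum/.*"), "/"),                      # whole directory -> home page
    (re.compile(r"^/archive/(\d{4})/.*"), r"/stats/\1/"),  # keep the year in the target
]

def resolve(old_path):
    """Return the 301 target for an old URL, or None if it should 404."""
    if old_path in exact_redirects:
        return exact_redirects[old_path]
    for pattern, target in pattern_redirects:
        if pattern.match(old_path):
            return pattern.sub(target, old_path)
    return None

print(resolve("/old_page-one.html"))  # /section-one/page-one/
print(resolve("/archive/2019/junk"))  # /stats/2019/
print(resolve("/forum/topic-123"))    # /
```

Checking the exact map before the patterns means a hand-mapped URL always wins over a blanket directory rule.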
During this process, I also created title tag templates for each of the sections I’d build that would improve upon the unoptimised previous tags.
This was done with some very basic keyword research to see what terms people were actually typing into Google for these topics and trying to fit those naturally into titles and descriptions (and later in the content template as well).
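Per-section meta templates like these are simple to express in code. This is just an illustrative sketch - the section names, placeholders, and site name are all hypothetical:

```python
# Hypothetical title/description templates, one per section, with variables
# filled from the database for each programmatic page.
templates = {
    "stats": {
        "title": "{topic} Statistics {year} | {site_name}",
        "description": "Explore {topic} data for {year}, with trends and "
                       "comparisons to previous years.",
    },
}

def render_meta(section, **page_vars):
    """Fill a section's title/description templates with this page's values."""
    return {key: text.format(**page_vars) for key, text in templates[section].items()}

meta = render_meta("stats", topic="Widget", year=2021, site_name="Example.com")
print(meta["title"])  # Widget Statistics 2021 | Example.com
```

One template per section keeps 110,000+ pages consistent while still letting each page target its own long-tail terms.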
By this point, the developer had worked his magic and I was ready to create the actual site itself.
Creating the site and writing 29 (!) data templates
If all of the work getting to this point was tough, we’re about to ramp it up a notch.
Setting up the site was pretty straightforward with my PHP framework. I have since rebuilt this framework using code created by ChatGPT and it’s faster and more usable than ever.
I created all of the routes for each of the sections so I could serve the correct content when a user hit a specific URL.
I also set up all of the title tags and descriptions and wrote code to get the right data from the database.
After that, it was onto the actual meat and potatoes of the whole thing. The site content.
The existing site was pretty much 1-2 paragraphs of text and a big ass table of data. I knew that if I wanted this site to fly, I had to do a lot better than that.
Across the 29 templates, I wrote a total of nearly 20,000 words.
The templates all contain one or more of the following:
- Variable data which was very long-tail specific to the current page
- Data tables (sometimes with links to other pages)
- Lots of internal links to other sections or pages in the current section
- Comparisons of the data to previous years
- Custom sidebar with more information and links
- A tool for users of the data to link back to the page
Once I had the site working locally to a point I thought felt complete, I took the list of OG URLs from Screaming Frog, changed the domain to my local test URL and ran the crawl again.
99% of the URLs all redirected where they were supposed to, but it did pick up a few errors which I was able to fix and recrawl.
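Even without a crawler, the core of that pre-launch check is just a set difference: every URL from the old-site crawl should have a target in the redirect map. A tiny sketch, with hypothetical URLs:

```python
# Hypothetical URLs exported from the old-site crawl.
crawled_urls = [
    "/old_page-one.html",
    "/stats_2019-data.php",
    "/contact.html",
]

# Hypothetical old -> new redirect map built during the mapping work.
redirect_map = {
    "/old_page-one.html": "/section-one/page-one/",
    "/stats_2019-data.php": "/stats/2019/",
}

# Anything left in this list would 404 at launch.
unmapped = [url for url in crawled_urls if url not in redirect_map]
print(unmapped)  # ['/contact.html']
```

Running this kind of check before flipping the switch is much cheaper than discovering the gaps in Search Console afterwards.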
Finally, just 25 days after Escrow closed, the new site was ready to launch on November 8th:
Then it all went wrong…
When the site launched on November 8th, it had 8,189 unique visitors that day. By the end of November it had been crushed by 55%, to just 3,662.
What do you do when you lose 50% of your traffic?
Shit. On Tuesday, November 16th, 2021 I checked my traffic for the previous day. It was down 40% vs the previous Monday and, as the days went by, it increased to more than 50%.
I did what any self-respecting person would do in this situation. I immediately went to find out if Google had launched an algorithm update that I could blame. They had.
So now I was in a situation where I was unsure whether the massive, sweeping changes I had made were the cause, or if I'd been caught up in the Broad Core algorithm update.
I actually didn’t panic too much. I’d acquired this site with the funds from a 7-figure exit and, while I definitely wasn’t excited about the prospect of losing $50k, I didn’t need a return any time soon, so I could be patient.
However, as I mentioned before, this site was on AdSense. This meant that even though I had taken a crushing blow with the traffic loss, I was still above the threshold needed to move to AdThrive.
Switching to AdThrive meant that even though the traffic was severely down, it was actually still earning considerably more than it did before.
Now all I needed to do was get the traffic back…
The long road to recovery
I knew the likelihood was that the new version of the site would still have a lot of 404 errors, so one of the first things I did was set up a custom report in Google Analytics to track those:
Once a week, I would export this report and manually map any 404 errors to a new URL on the site. I did this dozens of times!
In addition, I checked for 404 errors in Google Search Console, and also mapped and imported those.
My 301 redirect database table ballooned to more than 130,000 rows!
There was no doubt in my mind that the new site was far superior to the previous version, and my working theory was still that Google just saw the sheer volume of change and decided to fully re-evaluate the entire site.
This meant that I needed to get as many pages of the new site crawled as soon as possible.
During the build process, I created sitemaps for all of the different sections, but I was following the Google spec of no more than 50,000 URLs per sitemap.
After a tip from Adam Gent, I followed the advice from Barry Adams and switched the sitemaps to a maximum of 10,000 URLs to try and encourage a more thorough level of indexing.
Another tip from Adam was to put together a huge sitemap index containing all of the original URLs so that Google could crawl them, follow the 301, and hopefully figure out that this new site was worth ranking highly.
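Splitting a big URL list into 10,000-URL sitemaps plus a sitemap index is mechanical. Here's a bare-bones sketch (the domain and file names are placeholders, and the XML is kept minimal - a production version would also write `<lastmod>` dates and escape URLs):

```python
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemaps(urls, chunk_size=10_000, base="https://example.com/sitemaps"):
    """Split URLs into <= chunk_size sitemap files plus one sitemap index."""
    files = {}
    for start in range(0, len(urls), chunk_size):
        name = f"sitemap-{start // chunk_size + 1}.xml"
        entries = "".join(
            f"<url><loc>{url}</loc></url>" for url in urls[start:start + chunk_size]
        )
        files[name] = f'<urlset xmlns="{SITEMAP_NS}">{entries}</urlset>'
    # The index file points Google at every individual sitemap.
    index_entries = "".join(
        f"<sitemap><loc>{base}/{name}</loc></sitemap>" for name in files
    )
    index_xml = f'<sitemapindex xmlns="{SITEMAP_NS}">{index_entries}</sitemapindex>'
    return index_xml, files

urls = [f"https://example.com/page-{n}/" for n in range(25_000)]
index_xml, sitemap_files = build_sitemaps(urls)
print(len(sitemap_files))  # 3 sitemaps, each with at most 10,000 URLs
```

The same function, pointed at the list of *old* URLs, produces the "crawl the 301s" sitemap index from Adam's second tip.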
When I could drag myself away from wallowing in misery and staring forlornly at my analytics, I took a look at the site in Ahrefs to analyse all of the ranking losses that caused the biggest drops in traffic.
Once I had a list to work with, I hired an expert writer with a lot of knowledge in this niche to create content for me on all of these topics.
The goal was to bolster the new data with content that was truly outstanding quality, written by somebody who knew the subject matter completely.
I continued to monitor 404 errors, working through lost rankings to try and make the site as useful as possible.
Then I waited.
For six months.
Until the May 2022 Broad Core algorithm update.
I saw signs of recovery around May 25th and 26th. Traffic kept increasing throughout June and July, peaking at over 20,000 daily visitors by the end of August.
The Google gods had delivered.
Whatever had happened to cause the site to drop so dramatically in the 2021 core update seemed to be completely reversed in 2022. Combined with the new data and content, this grew the site to levels never seen before (although traffic did drop back down after the peak).
Once the site had fully recovered, I was optimistic about further growth, and hired the expert writer again (along with another, very prolific writer with the same experience) to produce blog posts for the site.
The site currently has 240+ blog posts, 254,000 indexed pages, and traffic has leveled out to around 12-14,000 daily visitors, earning $6-7k a month.
What did I learn?
I’ve done a lot of migrations. Most of them turned out very favourably, but this site was an example of what can happen even if you seemingly do all of the technical things well.
Obviously, if I had the choice, I would never change the URL when migrating to a new design or layout.
I’ve acquired a few sites recently that do weird things with the URLs (like +plus-with-hyphens) and for the most part I’ve been able to keep them the same as the previous site.
Here’s what the traffic looks like for that particular site:
I’ve also learned, with both acquired and new sites, to try and keep the initial launch to as few pages as possible.
What that looks like will vary from site to site, but my rule of thumb is that if I need a sitemap index page to contain multiple 10,000 URL sitemaps, then it’s probably too much to launch in one go.
Better to start smaller, allow for indexing to happen, and then slowly add new sections to the site over time.
It’s also important to NOT PANIC when you get caught up in an algorithm update. Try to think analytically about what might have happened, and what you can do to fix it - but don’t rush into making changes.
And if you think you won’t ever get caught by an algorithm update, just wait a bit longer ;)
I try to remember that I am building these sites for real people first and foremost. What would they want to see on the page? Can I surface data or information that they can’t find elsewhere?
If everybody else in your niche is doing the same thing, don’t do the same thing. Try a different approach or angle.
The final thing I’ve learned is that building SEO-powered businesses requires extreme levels of patience.
I’ve sold many sites too early, only to see the traffic sky-rocket and the new owners generate five figures a month.
Six months is a long time to wait for a recovery, but that was almost a year ago now and seems like a distant memory. The road is long and winding, so stay the course.
What next for this site?
This site has been passively earning since September 2022. The only update I made was to add the latest data from the government in December. Adding another year to the database meant that all pages using that data were instantly changed as well.
No blog posts have been published yet in 2023 as I worked on getting other sites up to a similar level of earnings as this one.
The problem for me as a solo operator is that I have to split my time between projects, and taking them to the next level to protect against another algorithm update takes time and effort.
The list of potential options I have for this site include:
- Create an email newsletter. The main users of this site are very passionate hobbyists, and it’s only logical that the next step would be to create a newsletter. One of the writers is so quick and experienced in this niche that he could easily produce a weekly newsletter for me. The problem? He’s also incredibly good at writing for my other sites!
- Build and sell a digital product. There are online tools that people pay for, ebooks that could be written, and templates that could be built to sell to this audience. There’s also a very interesting print-on-demand option that I’ve considered - but requires a significant content investment to make it work.
- Continue to scale the content. On top of the authors, I hired a content agency to create affiliate product posts that are starting to perform well. The next step would be to look at product vs product content.
- Affiliate partnerships. I worked with an affiliate partner last year to add an interactive lead gen widget to certain pages, and earned around $1,000 a month from those. They didn’t continue, but there are lots of angles and direct deals that could be made.
- Integrate new data sources. I have a database of nearly 90 million rows (seriously) that some companies charge for access to. This could be converted into a free online tool that visitors can use to do research on their hobby.
- Acquire similar sites in the same niche. This is something I’ve tried to do and had a few interested replies but nothing concrete. If I could create a mini-network of related sites I could apply the techniques from my primary site to capture multiple page one listings. Also makes the network more compelling in the event of a sale. Which brings me to the final option…
- Sell the site for a ~5x return. The site is currently worth somewhere around $250k, which means if I sold it right now I’d make a 5x return on my original investment (keeping it simple and not including investment time + dollars or revenue). That’s a pretty good result because the buyer would have all of the above options available to grow it further.
I’m in no hurry to make a decision just yet. Selling can be a draining process and I’m not sure if it’s worth the effort right now.
In the meantime, I plan to start the newsletter ASAP and try to get as many signups as possible while I have plenty of search traffic to convert.
And that’s it. Use this info to make better decisions than me when acquiring and migrating websites!
If you have any questions or want me to expand on this case study, let me know on Twitter.
Link to this page
I can haz links? If you can, do me a solid and link to this article if you mention it in your blog post or newsletter.
<a href="https://ian.is/blog/abandoned-website-case-study/">Case study: How I tanked, and then saved, a $50k website</a>