
How I Built DOJ.me

You have probably heard of, or even visited, sites like downforeveryoneorjustme.com and other similar websites. It's a very simple business model: people search for websites that might be down, they are served a result along with some ads, the ads make money, and everybody is happy.

I am not just a developer but also an entrepreneur, so I have been thinking about starting such a site for a long time. Starting one from scratch, though, is a lot of headache: there are huge established websites, so SEO-wise, good luck beating them. They have one weakness, though: their domain names are loooong.

Enter DOJ.me

One day I was looking through GoDaddy auction listings and noticed doj.me, a three-letter domain. The SEO metrics looked interesting, so I took a closer look. The website had been used for exactly what I wanted to build myself. It had changed owners at least once, and at one point there was some spammy content on it too.

I saw big potential, and considering that the price was incredibly low, I placed a bid. I thought someone would jump in with a higher bid, but the auction ended and I was the winner.

The Strategy

Since SEO is what I know best, that's what I am going to focus on the most. The site is already ranking in the top 10 for some of the keywords, so I don't plan to push link building beyond what occurs naturally. I have added a copy-link button so that people can share it more easily.

Once the site is in a decent position for some of the keywords, it will grow naturally, since it's much nicer to share such a short domain name than the ones mentioned earlier.

The Potential

It is really hard to estimate how much a site like this can make. My research suggests anywhere from $1,000/mo to $30k/mo. That is 100% passive, so it's worth my time. And since I am a self-taught web developer, I thought this would be an ideal practice project.

The Tech

The tech stack is what I know best: Docker, Nginx, Node, MongoDB.

During my research I found out that these sites can generate a huge amount of traffic, and DDoS attacks are common. I had some experience with OVH in the past; they offer very good pricing on VPSes. Currently the site is running on a $25 VPS.

The MongoDB database is running on the same machine. I am not worried about data loss, since the data is not essential and can be rebuilt literally in a day.

The big selling point for OVH is that they provide unlimited DDoS protection on all their services. In addition, they offer a 1 Gbit/s connection. That should be more than enough to handle traffic spikes.

It might not be 100% uptime, but until it makes money, I don't care.


I am using Docker Swarm to deploy and scale services easily. Setting up Kubernetes on a VPS is a nightmare, so Swarm is good enough for now.
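For a rough idea, a Swarm deployment boils down to a stack file like the sketch below (the image names, ports, and replica counts are made up for illustration, not taken from the DOJ.me setup):

```yaml
# docker-stack.yml -- hypothetical sketch of a Swarm stack for a setup like this
version: "3.8"
services:
  web:
    image: doj/web:latest        # the container serving the website (assumed image name)
    ports:
      - "3000:3000"
    deploy:
      replicas: 2                # scaling is a one-line change under Swarm
  checker:
    image: doj/checker:latest    # the container doing the actual uptime checks
    deploy:
      replicas: 1
  mongo:
    image: mongo:6
    volumes:
      - mongo-data:/data/db
volumes:
  mongo-data:
```

Deploying is then a single command: `docker swarm init` once on the VPS, followed by `docker stack deploy -c docker-stack.yml doj`.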


Actually, I am using OpenResty with AutoSSL. It's a nice setup. Nginx is the point of hardening and security: this is where I have done some rate limiting and set security and caching headers.
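The kind of rate limiting and header hardening described above could look roughly like this in an Nginx config (zone names, limits, and upstream names are my own assumptions, not the actual DOJ.me config):

```nginx
# belongs at http level in the real config
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    listen 443 ssl;

    # Security headers
    add_header X-Frame-Options "SAMEORIGIN" always;
    add_header X-Content-Type-Options "nosniff" always;
    add_header Referrer-Policy "strict-origin-when-cross-origin" always;

    location / {
        # Per-IP rate limiting with a small burst allowance
        limit_req zone=perip burst=20 nodelay;
        proxy_pass http://web:3000;   # the Node container (assumed upstream name)
    }

    # Long-lived caching headers for static assets
    location /assets/ {
        add_header Cache-Control "public, max-age=31536000, immutable";
        proxy_pass http://web:3000;
    }
}
```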


The backend is split in two, with both containers running Node.js: one serves the website, the other does the actual checking.

The one serving the website runs Express. The other runs a ping utility and Axios with interceptors to measure performance.
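To illustrate the measuring side, here is a minimal sketch of the timing idea (the helper and the fake check below are my own names, not from the DOJ.me codebase; the real checker wires the same measurement into Axios interceptors):

```javascript
// Wrap any async check and report whether it succeeded and how long it took.
async function timedCheck(checkFn) {
  const start = process.hrtime.bigint();
  try {
    const result = await checkFn();
    const ms = Number(process.hrtime.bigint() - start) / 1e6;
    return { up: true, ms, result };
  } catch (err) {
    const ms = Number(process.hrtime.bigint() - start) / 1e6;
    return { up: false, ms, error: err.message };
  }
}

// Example with a fake "check" that resolves after ~50 ms
const fakePing = () => new Promise((resolve) => setTimeout(() => resolve(200), 50));

timedCheck(fakePing).then(({ up, ms }) => {
  console.log(`up=${up} ms=${Math.round(ms)}`);
});
```

With Axios, the same measurement is typically done by stamping a start time on the request config in a request interceptor and computing the delta in the response interceptor.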


Since the data is not that important in this case, I don't need transactions or any fancy RDBMS features. MongoDB is just perfect for this.

There is actually a second MongoDB instance that is automatically updated weekly: it pulls down a list of bad websites. It's important to have a blacklist so that DOJ does not link to questionable websites. The solution is not perfect, though: I just noticed that an adult website is featured on the homepage. That's bad for SEO and for the brand as well. I am planning to add a word-based blacklist in addition to the existing one.
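The planned word-based blacklist could be as simple as a substring match against the hostname. A sketch (the word list and function name below are made up for illustration):

```javascript
// Hypothetical word-based blacklist: block a domain if its hostname
// contains any flagged word, case-insensitively.
const BLOCKED_WORDS = ["casino", "xxx", "porn"];

function isBlockedDomain(hostname) {
  const name = hostname.toLowerCase();
  return BLOCKED_WORDS.some((word) => name.includes(word));
}

console.log(isBlockedDomain("example.com"));    // false
console.log(isBlockedDomain("best-casino.io")); // true
```

This would run alongside the existing list-based check, so a site gets filtered if either blacklist flags it.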


Redis is used for caching. I am using the Bitnami image, since it's very easy to configure as a caching layer.
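The caching itself follows the usual cache-aside pattern: serve a recent result if one exists, otherwise run the real check and store it with a TTL. A sketch with the store injected so it works with any get/set backend (a Map below for illustration, a Redis client in production; all names here are my own, not from the DOJ.me code):

```javascript
// Cache-aside: return a cached status if it hasn't expired, else run the
// real check, cache the result with a TTL, and return it.
async function cachedStatus(store, host, checkFn, ttlMs = 30_000) {
  const hit = store.get(host);
  if (hit && hit.expires > Date.now()) return hit.value; // cache hit
  const value = await checkFn(host);                     // miss: real check
  store.set(host, { value, expires: Date.now() + ttlMs });
  return value;
}

// Usage with an in-memory Map and a fake checker
const store = new Map();
let calls = 0;
const check = async () => { calls += 1; return "up"; };

cachedStatus(store, "example.com", check)
  .then(() => cachedStatus(store, "example.com", check))
  .then((status) => console.log(status, calls)); // up 1  (second call is a cache hit)
```

With Redis the Map is replaced by `GET`/`SET` calls and the TTL handling can be delegated to Redis key expiry.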

The website is very fast. Before I added Google Analytics, the whole website was ~40 KB. Analytics doubled that, but it's still good.

On the frontend I used vanilla JavaScript and the Parcel bundler.

There are still some small bugs present, but for the most part the website is ready. The server still needs some hardening.