David Naylor Launches Robots.txt File Builder

I was actually outside the Hilton discussing this project with David Naylor less than two weeks ago, and he has already delivered the first part of his idea.

Many people screw up their robots.txt file and deny the search engine spiders access to their sites. Dave thought it would be a great idea to create a central site, acting autonomously, where people can have their robots.txt file created and stored to ensure good interaction with the spiders.
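To illustrate the kind of mistake Dave is targeting, here is a minimal sketch (the directory name is just a placeholder): a single misplaced slash is the difference between hiding one folder and hiding the whole site.

# Wrong: this shuts every spider out of the entire site
User-agent: *
Disallow: /

# Right: only the /cgi-bin/ directory is off limits, everything else stays crawlable
User-agent: *
Disallow: /cgi-bin/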

His initial offering allows people to create the file and then copy and paste it into a page they can upload to their own site. Eventually Dave wants to host the files himself and make sure the spiders crawl them correctly. The site would be the central location for all spiders to get a correctly written file for any website.

The file can be adapted to the subtle differences between the spiders, but Dave felt the site would also be a way to get uniformity from the engines once they saw people using it in sizable numbers.
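As a rough sketch of what those per-spider adaptations look like (the paths below are placeholders; the user-agent tokens are the ones the major engines' crawlers identify themselves with), a single file can give each crawler its own rules:

# Google's crawler gets its own block
User-agent: Googlebot
Disallow: /private/

# Yahoo's crawler (Slurp) is kept out of an extra directory
User-agent: Slurp
Disallow: /private/
Disallow: /images/

# Every other spider falls back to the default rules
User-agent: *
Disallow: /private/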

The other ancillary benefit he discussed was the ability to determine load times for a given site and get the spiders to visit at low-traffic times so as not to overload the client’s site.
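There is already a rough, non-standard hook for part of this: the Crawl-delay directive, which some engines honor (though notably not Google). The value here is just an illustrative number:

# Ask a spider to wait 10 seconds between requests to ease server load
User-agent: Slurp
Crawl-delay: 10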

I am impressed by how quickly he got started on this. But then again, he did share it at SES NYC with other fast-to-market players… maybe he correctly guessed he had better move on it quickly before someone else did.

Great job so far Dave… now don’t forget the rest!
