One webmaster in John’s 09/10/2021 hangout asked about pages with similar content hurting site quality. Most of the pages in question are generated for specific postal codes, and as the webmaster explains, the content on them is not especially interesting for users.
The webmaster is facing indexing issues: around 90 percent of their pages are excluded from Google’s index. They wondered where they went wrong and why the pages they created are not being indexed.
John explained that sites that analyze and spit out a lot of data are not always useful for users, especially in niches that are already highly competitive and where these types of sites already exist. You have to work on the quality of each page and make sure you’re providing pages that are genuinely useful for people. It’s not enough to simply regurgitate data that everyone else is already publishing.
Create your own unique insights and add content that supplements the redundant data. Anything beyond simple regurgitation can improve your chances of getting indexed.
The main issue is quality: making sure your pages are high-quality, unique, and useful enough for the user that Google will want to index them.
This discussion occurs at approximately the 18:18 mark in the video: