You can add pages to this file to have them explicitly disregarded by search-engine crawlers. Robots.txt files follow the Robots Exclusion Protocol. This website will generate the file for you from the pages you enter for exclusion. It's a safe, uncomplicated tool to use: https://seotoolstube.com/
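As a rough illustration of what such a generated file looks like, here is a minimal robots.txt sketch (the paths are hypothetical examples, not output from the tool above):

```
# Applies to all crawlers
User-agent: *
# Hypothetical example paths to exclude from crawling
Disallow: /private/
Disallow: /checkout/
# Everything else remains crawlable
Allow: /
```

The file is placed at the root of the site (e.g. `/robots.txt`), and compliant crawlers read it before fetching other pages.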