Or even better, have contracts with the companies. Maybe that's unlikely in their case, but I think "scraping" is too often assumed to be "bad" in some way. The company I work for does a lot of web scraping, but we have contracts with our partners to scrape their websites. They may still have robots.txt files asking crawlers not to touch certain areas, but under those agreements we're allowed to bypass them.
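For anyone curious what that looks like in practice, here's a minimal sketch (the bot name and partner allowlist are made up, not our actual setup): check robots.txt with Python's standard urllib.robotparser, and only skip the check for domains where a contract is on file.

```python
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

# Hypothetical allowlist: domains whose contracts permit ignoring robots.txt.
CONTRACTED_DOMAINS = {"partner-example.com"}

def allowed_to_fetch(url: str, user_agent: str = "MyScraperBot") -> bool:
    """Contracted partners are always allowed; everyone else is
    checked against their robots.txt."""
    domain = urlparse(url).netloc
    if domain in CONTRACTED_DOMAINS:
        return True
    robots = RobotFileParser()
    robots.set_url(f"https://{domain}/robots.txt")
    robots.read()  # fetch and parse the site's robots.txt
    return robots.can_fetch(user_agent, url)

# A disallowed path on a contracted domain still returns True here.
print(allowed_to_fetch("https://partner-example.com/private/listings"))
```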