That is nice to see; so many companies don't stop to think about how their product might be abused. Love that you've given it some thought.
One thing I might add: limit how many requests per second your clients can make. Even if they can only scrape a small set of sites, they can still hammer a site into the ground. One of the things I think Google has been doing really well since their start is knowing when to back off a site. So either rate-limit your clients, or back off a site if you see responses slowing down (rough sketch of what I mean below).
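Something like this client-side throttle is what I have in mind. It's just a sketch using the requests library; the function name, thresholds, and per-host delay table are all made up for illustration, not from any real framework:

    # Sketch: per-host delay that doubles when responses slow down or we get
    # a 429, and eases back toward the minimum when the site looks healthy.
    import time
    from urllib.parse import urlparse
    import requests

    delays = {}  # host -> current delay in seconds (illustrative state)

    def polite_get(url, min_delay=1.0, slow_threshold=2.0, max_delay=30.0):
        host = urlparse(url).netloc
        delay = delays.get(host, min_delay)
        time.sleep(delay)                      # pause before hitting this host again
        start = time.monotonic()
        resp = requests.get(url, timeout=10)
        elapsed = time.monotonic() - start
        if elapsed > slow_threshold or resp.status_code == 429:
            delays[host] = min(delay * 2, max_delay)   # site is struggling: back off
        else:
            delays[host] = max(delay / 2, min_delay)   # site is fine: speed back up
        return resp

Even a crude feedback loop like that keeps you from flattening a small site, and it costs almost nothing to add.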
We just had a company hit one of our sites pretty heavily, and when we asked them to back off a bit, they just asked if we could perhaps rate-limit them. Apparently they don't know how to reduce their own traffic.