Regarding "doing all you can to prevent/discourage automated scraping" - various pieces of advice have been going around over the past several years about ways individual users can prevent this: locking their journal, turning off indexing, editing their journal style, turning on adult content warnings, etc. Can you give any information on whether any of those are worth doing above and beyond what you as a site are doing to prevent AI scraping? I'd love to feel more confident about which of the advice going around is in any way accurate, especially since some of those steps have major downsides too.
(On many websites, people are increasingly recommending "lock to logged-in users only" to prevent scraping - does DW have any plans to introduce that level of lock? Would it actually help with keeping out AI scrapers here?)