The Robots Exclusion Protocol, often called “robots.txt,” is used by websites to communicate with web crawlers and other web robots, or “bots.” It signals to a robot which areas of a website should not be processed or scanned. This presentation, given by seasoned legal technology professionals Stacey Brandenburg and Jeff Landis of ZwillGen, will help practitioners understand what to consider when they encounter robots.txt in the wild, and the possible implications of various decisions.

FISD is the global forum of choice for industry participants to discuss, understand, and facilitate the evolution of financial information for the key players in the value chain, including consumer firms, third-party groups, and data providers. It is a dynamic environment in which members identify the trends that will shape the industry and create educational opportunities and industry initiatives to address them. For more information about membership, please contact Tracey Shumpert at tshumpert@siia.net.
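For illustration (this sketch is not drawn from the presentation itself), a minimal robots.txt file placed at the root of a site might look like the following; the bot name and paths are hypothetical examples:

    User-agent: *
    Disallow: /private/
    Disallow: /search

    User-agent: ExampleBot
    Disallow: /

The first block asks all bots to stay out of the /private/ directory and the /search path, while the second asks a bot identifying itself as ExampleBot to avoid the site entirely. Note that compliance is voluntary: the protocol expresses a site's wishes but does not technically enforce them, which is part of what makes its legal significance worth discussing.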