Red Flags

There are many commonly used phrases that should prompt you to ask your vendors questions about their practices. Other terms appear less often in policies, but when you do see them they should immediately raise a red flag. When you find a red flag in a vendor’s privacy policy, make a note and be sure to ask the vendor for more details before entering into a contract.


Selling/sharing information

  • Any vendor should be able to explain the lifecycle of a user’s data. If you see a privacy policy that mentions sharing data with fourth parties, ask for specifics. While you might trust the security and privacy practices of the vendor you’re contracting with, do you know how these fourth parties handle user data? Any mention of selling user data should be a huge red flag. Libraries already pay to access a vendor’s platform; vendors should not also profit from users’ data.

    Example: “Google uses the data collected to track and monitor the use of our Service. This data is shared with other Google services. Google may use the collected data to contextualise and personalise the ads of its own advertising network. You can opt-out of having made your activity on the Service available to Google Analytics by installing the Google Analytics opt-out browser add-on.”

Storing/tracking location data

  • Libraries and vendors should always strive to collect the least amount of data required to offer a service. Using GPS coordinates to pinpoint a user’s exact location can make that person easy to identify.

    Example: “When you access or use the Service, we may access, collect, monitor and/or remotely store ‘location data’, which may include GPS coordinates (e.g. latitude and/or longitude) or similar information regarding the location of your device. Location data may convey to us information about how you browse and use the Service. Some features of the site, particularly location-based services, may not function properly if use or availability of location data is impaired or disabled.”
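To see why precision matters, here is a hypothetical sketch (all coordinates and names are illustrative, not from any real vendor) of how truncating GPS coordinates, a common data-minimization technique, limits what collected location data can reveal:

```python
# Hypothetical sketch: why raw GPS coordinates are identifying, and how
# reducing their precision limits what a vendor can learn from them.

def truncate_coords(lat: float, lon: float, decimals: int) -> tuple:
    """Round coordinates to a coarser grid (fewer decimals = less precise)."""
    return (round(lat, decimals), round(lon, decimals))

# Full-precision GPS (five or more decimal places) locates a device to
# roughly a metre -- often a specific home or library branch.
precise = (41.87811, -87.62980)

# Two decimal places is roughly a 1 km grid cell: enough for, say,
# regional service statistics, but not enough to identify an address.
coarse = truncate_coords(*precise, decimals=2)
print(coarse)  # (41.88, -87.63)
```

A policy that stores full-precision coordinates when a coarse region would serve the same feature is collecting more than the service requires.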



Third-party integrations for user authentication

  • Many people like the convenience of using their Facebook, Google, or Microsoft account to log in to services across the web. These user authentication portals sometimes embed third-party trackers that give the platform access to a wide range of personally identifiable information (PII).

    Example: “We may receive information about you from third parties. For example, the Service may use Facebook or Google for user authentication. You should always review and, if necessary, adjust your privacy settings on third-party services before linking or connecting them to the Service.”

Clear gifs/web beacons/tracking pixels

  • These are transparent images embedded in websites and emails. They are mostly used in conjunction with cookies to track user behavior across the web, and in emails they can notify the sender when a recipient has opened a message. Unlike cookies, web beacons cannot be denied or blocked. The most pervasive can even reveal specific location data.

    Example: “We use pixels to learn more about your interactions with email content or web content, such as whether you interacted with ads or posts. Pixels can also enable us and third parties to place cookies on your browser.”
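The mechanism behind an email tracking pixel can be sketched in a few lines. The domain, parameter names, and recipient ID below are all hypothetical, chosen only to show how the act of loading a one-pixel image tells the sender who opened a message:

```python
# Hypothetical sketch of an email tracking pixel. The image URL carries a
# unique token, so fetching the image identifies the recipient -- no cookie
# involved, which is why beacons cannot be blocked the way cookies can.
from urllib.parse import urlencode, urlparse, parse_qs

def pixel_tag(recipient_id: str, campaign: str) -> str:
    """Build the 1x1 transparent <img> a sender might embed in an email."""
    query = urlencode({"r": recipient_id, "c": campaign})
    return (f'<img src="https://tracker.example.com/open.gif?{query}" '
            'width="1" height="1" alt="">')

tag = pixel_tag("user-8675309", "spring-newsletter")
print(tag)

# When the email client fetches open.gif, the sender's server simply reads
# the query string back out of its own access logs:
src = tag.split('src="')[1].split('"')[0]
params = parse_qs(urlparse(src).query)
print(params["r"][0])  # user-8675309
```

Because the identifying token lives in the URL itself, blocking remote images is the only way for a recipient to avoid this kind of tracking.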

Email communication (signing people up for marketing emails)

  • Ideally, a user should be able to access a vendor’s product through the library without sharing an email address to create an account; a library card number and PIN should be sufficient. When collecting an email address is unavoidable, the vendor should use it sparingly and never push advertising messages to the user.

    Example: “We will contact you through email, mobile phone, notices posted on our websites or apps, and other ways through our Services, including text messages and push notifications.”

Disclosure of information

  • Vendors may get requests from law enforcement to disclose user data. This is part of the reason we want vendors to collect the least amount of information possible. It is reasonable to see a notice in a privacy policy that states that a user’s information may be shared with law enforcement, but the vendor’s ability to release users’ information should be limited in scope. Seek to add contractual language that requires a vendor to notify the library when a request to disclose information is made and to only release users’ information when compelled by law.

    Example: “Regardless of the choices you make regarding your information and to the extent permitted or required by applicable law, we may disclose information about you to third parties to: (i) enforce or apply this Privacy Policy or the Service Terms; (ii) comply with laws, subpoenas, warrants, court orders, legal processes or requests of government or law enforcement officials; (iii) protect our rights, reputation, safety or property, or that of our users or others; (iv) protect against legal liability; (v) establish or exercise our rights to defend against legal claims; or (vi) investigate, prevent or take action regarding known or suspected illegal activities; fraud; our rights, reputation, safety or property, or those of our users or others; violation of the Service Terms; or as otherwise required by law.”

Ownership of data

  • The details around ownership of data can usually be found in the vendor contract. What you’re looking for in a privacy policy is language that describes what happens to that data if a company is bought, sold, or transferred. A library should not be forced to share its user data with a new company until it has had the opportunity to enter into a new contract. Check whether the privacy policy clearly states that the library or its users retain ownership of the data they provide directly to the vendor. Library user data should never be allowed to become a business asset of the vendor.

    Example: “In the event that a division, a product or all of Company is bought, sold or otherwise transferred, or is in the process of a potential transaction, personal information will likely be shared for evaluation purposes and included among the transferred business assets, subject to client contractual requirements and applicable law.” 


Inability to secure data

  • While there is no 100% guarantee that user data can be secured, it is a red flag when a privacy policy uses soft language (e.g., may, try, might) or explicitly disclaims the vendor’s ability to secure user data. Such language is used to absolve the company of legal responsibility should a breach occur. Look for privacy policies that explain how the vendor secures data, not ones that admit the vendor is likely unable to do so.

    Example: “The security of your data is important to us but remember that no method of transmission over the Internet or method of electronic storage is 100% secure. While we strive to use commercially acceptable means to protect your Personal Data, we cannot guarantee its absolute security.” 


Exercise | Scavenger Hunt! 

Locate the privacy policy from at least one of your library vendors. Read through the policy and compare it with the red flag and commonly used phrases lists in this guide.

  • What vendor policy did you look at?
  • What red flags did you find?
  • What other red flags not listed did you discover?
  • What else did you find that you didn’t understand?
  • Take these red flags to your vendor (or the library worker who is responsible for vendor products) and ask for clarity.

Download Exercise