Safeguard Your Workplace Collaboration in Slack

Jamie Tsui
UIT Client Experience and Solutions

Slack offers a wide variety of integrations and apps that connect the platform to third-party services. While some of these can enhance your Slack experience and make collaboration more efficient, it’s essential to exercise caution to protect your workspace and data.

Always refer to university guidelines and data security best practices, particularly when

  • considering third-party bots or apps, especially those that use generative AI technologies, or
  • handling sensitive data.

💡Tip: Refer to Protecting Sensitive Data at Stanford for more information about data security.

PHI and other sensitive information

Due to legal limitations imposed by our agreement with Slack, protected health information (PHI) isn’t permitted on the Stanford Slack grid. This means no PHI should be shared on Slack via direct messages, group chats, channels, file uploads, or interactions with Slack apps and bots. In the coming months, new controls will be added to the Slack platform to prevent the sharing of files that contain PHI. More details on this change will be shared soon.

Remember that sensitive data, especially data containing PHI or personally identifiable information (PII), should not be shared with external large language models (LLMs) or other generative artificial intelligence (AI) services, including Slack apps or bots.

Those in the School of Medicine who want to use LLM or AI services with PHI or PII should continue to use Stanford Medicine’s Secure GPT site, a new resource built and hosted by Stanford Medicine’s Technology and Digital Solutions team. Secure GPT is powered by GPT-4 and provides a safe, secure environment for asking questions, summarizing text and files, and helping to solve a range of complex problems. Learn more about Secure GPT.

💡Tip: To view best practices around generative AI, including a list of AI tools being evaluated by UIT, visit the Responsible AI page.

A word about Jigso and other LLM and AI apps

Some members of our Slack community have received messages on Slack from the Jigso app, a chat-based assistant. Jigso isn’t sponsored or supported by the university. Like many other apps, it hasn’t yet been reviewed by the Information Security Office (ISO) or the Stanford Privacy Office.

You should always proceed cautiously before installing new third-party apps, add-ins, or bots, especially if they integrate with external LLM or AI services. Remember that even a widely advertised app or bot may not be approved for use by the university.

If you are a workspace owner, you have a higher level of responsibility for managing apps. By default, members can install apps without approval; however, you can enable controls to pre-approve or restrict certain apps for your workspace.

Review Slack’s security recommendations for guidance on understanding apps and permissions.

💡Tip: For questions about a university service and its privacy and security implications, contact ISO or refer to Stanford’s approved services.


DISCLAIMER: UIT Blog is accurate on the publication date. We do not update information in past blog entries. We do make every effort to keep our service information pages up-to-date. Please search our service pages at uit.stanford.edu/search.