Claude for Education
Claude for Education is an AI platform from Anthropic that enables Stanford users to access advanced AI tools for teaching, learning, research, and administrative work. It is designed to support critical thinking, responsible AI use, and academic integrity within a secure, university-managed environment.
At Stanford, Claude for Education includes Claude Chat, direct API access to Claude models, and more.
Features
- Anthropic Claude Models – Access advanced Claude AI models through the Claude website or Anthropic APIs to support writing, analysis, research, and problem-solving.
- Claude Code – A developer tool that helps users write, modify, test, and manage software using natural language. Ideal for software development and for automating multi-step processes.
- Claude Cowork – An AI-powered productivity assistant that connects to local files and applications to complete multi-step tasks from start to finish.
- Learning Mode – A feature that guides users through the reasoning process rather than providing direct answers, helping build critical thinking through structured questioning.
- Claude Skills – Produce consistent results for specialized tasks by defining how documents are generated, data is analyzed, and workflows are automated.
- Integrations and Tools – Connect Claude to other applications to interact with web content, communication channels, and everyday workflows.
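Direct API access means you can call Claude models from your own code via the Anthropic Messages API. The sketch below builds and sends a request using only the Python standard library; the model name is an illustrative assumption, and the API key is read from an environment variable you would set yourself. Per the data security guidance on this page, prompts should contain only non-sensitive information.

```python
# Minimal sketch of calling the Anthropic Messages API over HTTP.
# Assumptions: the model name is illustrative, and ANTHROPIC_API_KEY
# holds a key issued through your Claude for Education account.
import json
import os
import urllib.request

API_URL = "https://api.anthropic.com/v1/messages"

def build_request(prompt: str,
                  model: str = "claude-sonnet-4-20250514",
                  max_tokens: int = 1024):
    """Assemble the headers and JSON body for a Messages API call."""
    headers = {
        "x-api-key": os.environ.get("ANTHROPIC_API_KEY", ""),
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    }
    body = {
        "model": model,
        "max_tokens": max_tokens,
        # Prompts should contain only public or non-sensitive information.
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, body

def send(prompt: str) -> str:
    """Send the request and return the model's text reply."""
    headers, body = build_request(prompt)
    req = urllib.request.Request(
        API_URL, data=json.dumps(body).encode("utf-8"), headers=headers
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["content"][0]["text"]

if __name__ == "__main__":
    # Only attempt a network call when a key is actually configured.
    if os.environ.get("ANTHROPIC_API_KEY"):
        print(send("Summarize spaced repetition in two sentences."))
```

Anthropic also publishes official SDKs (for example, the `anthropic` Python package), which wrap this same request shape and are generally preferable for production use.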
Designed for
More information coming soon.
Data security
As generative AI services, Claude Chat, Claude Code, and Claude Cowork require a high level of caution. Their use is restricted to protect university and personal information.
- This service is not currently approved for Stanford data.
- Claude may not be used to transmit or store high-risk data, as defined by the Information Security Office.
Do not input high-risk data into Claude. Prompts and conversations should contain only publicly available or non-sensitive information. This is a critical security requirement. Prohibited data includes, but is not limited to:
- Confidential or proprietary university information
- Protected Health Information (PHI)
- Personally Identifiable Information (PII)
- Code or software intended for a patent application
- Any information you are not authorized to share publicly
Treat all conversations with Claude as public. Do not assume privacy for any information you provide.
- Please review Stanford’s guide to using agentic AI safely.
- Always follow responsible AI best practices.
- For more information, refer to Anthropic’s Privacy Center.
Rates
More information coming soon.
Get started
More information coming soon.
Get help
- For assistance with accessing your Claude for Education account, please submit a Help request.
- For help using Claude, visit Anthropic’s Help Center.
Learn more
More information coming soon.
See also
Explore University IT services and training to support AI innovation at Stanford.
