[Introduction]
To help organizations respond more quickly to external security threats, Microsoft Teams will allow users to report suspicious external users directly from Teams. These reports will surface in the Teams admin center, giving admins greater visibility into potentially risky interactions and enabling faster investigation and response. This enhancement builds on existing reporting capabilities and adds user-submitted signals as another layer of protection against phishing, impersonation, and social engineering attacks.
This message is associated with Microsoft 365 Roadmap ID 560547.
[When this will happen]
- Targeted Release: We will begin rolling out in early June 2026 and expect to complete by early June 2026.
- General Availability (Worldwide): We will begin rolling out in mid-June 2026 and expect to complete by late June 2026.
[How this affects your organization]
Who is affected
- All Microsoft 365 tenants using Microsoft Teams
- Users who interact with external users in Teams
- Admins managing Teams security and reporting in the Teams admin center
What will happen
- Users will be able to report suspicious external users directly in Teams, in addition to blocking them:
  - Reporting will be available during first-time external chat requests and from an external user's profile card.
  - Reporting will be supported across Teams experiences, including chats, meetings, channels, and search results.
- When a user is reported, the submission will appear in the Teams admin center under Protection reports > User reported security submission report.
- Admins will be able to review reported users and investigate potential phishing, impersonation, or other suspicious activity.
- Additional metadata will be available through report export to support investigation and response workflows.
- This feature will be enabled by default and will respect existing Teams messaging policies.
[What you can do to prepare]
No action will be required if you want to keep user reporting enabled.
Admins may choose to take the following actions:
- Review current Teams messaging policies and confirm that the Report a security concern setting remains enabled.
- Familiarize security and helpdesk teams with the User reported security submission report in the Teams admin center.
- Update internal security guidance or user education materials, if applicable.
- If a malicious external user is identified through these reports, block that user at the tenant level using External access settings to prevent further communication attempts.
[Compliance considerations]
| Question | Answer |
| --- | --- |
| Does the change store new customer data, and if so, where? | Yes. Reports submitted by users about suspicious external users will be stored and made available to admins in the Teams admin center for review and investigation, extending existing reporting workflows. |
| Does the change alter how existing customer data is processed, stored, or accessed? | Yes. User generated reports about external users are processed and made available to admins for review and investigation, extending existing reporting workflows. |
| Does the change provide a new way of communicating between users, tenants, or subscriptions? | Yes. Users will be able to submit reports about external users, which will be communicated to tenant admins through the Teams admin center reporting experience. |
| Does the change alter how admins can monitor, report on, or demonstrate compliance activities? | Yes. Admins will gain access to a new User reported security submission report in the Teams admin center, which will enhance monitoring and investigation capabilities related to external security threats. |
| Does the change include an admin control? | Yes. The feature will be governed by the existing Report a security concern setting in Teams messaging policies and can be enabled or disabled by admins. |