Last Updated:
This policy defines how BookGur reviews, moderates, and, when necessary, removes user-generated content or restricts user access to maintain a lawful, ethical, and safe platform. It operates in conjunction with BookGur's Terms and Conditions.
This policy applies to:
Content may be reviewed for any of the following reasons:
Users may request a review of a moderation decision by submitting a written appeal through their account interface or designated contact form. Appeals must include:
All appeals will be reviewed by a secondary moderator or compliance officer not involved in the initial decision. Final decisions are binding and not subject to further internal appeal.
BookGur reserves the right to immediately terminate accounts that demonstrate:
Termination may occur without prior notice in cases involving criminal activity or imminent harm.
For legal or investigative purposes, BookGur may preserve removed or restricted content in a secure, non-public archive. This data will not be used or displayed publicly and may be disclosed only under lawful process (e.g., subpoena, court order, or law enforcement request).
Once content is permanently removed for violation of this policy, it cannot be restored. Users are responsible for maintaining their own backups of any original material uploaded to the platform.
BookGur will comply with valid requests from courts, law enforcement, or regulatory bodies when such requests pertain to criminal investigations, intellectual property enforcement, or threats to safety. BookGur may, at its sole discretion, report illegal content or activity to authorities when necessary to prevent harm or crime.
BookGur may modify or expand this policy at any time to reflect changes in law, technology, or community standards. Updates take effect immediately upon publication. Continued use of the platform constitutes acceptance of the revised policy.
By using BookGur, you acknowledge and agree that BookGur reserves the right to moderate and remove content at its sole discretion, and that such actions are not subject to compensation, restoration, or external arbitration beyond what is outlined herein.
The following internal classification framework supports consistent moderation decisions:

| Level | Classification | Description |
|---|---|---|
| Level 0 | No Violation | Compliant content |
| Level 1 | Minor Violation | Technical/formatting issues, mild infractions |
| Level 2 | Moderate Violation | Policy breach without criminal implication |
| Level 3 | Major Violation | Serious breach: hate, plagiarism, impersonation |
| Level 4 | Critical Violation | Criminal: exploitation, threats, illegal content |

| Severity | Action | Account Status | Notification | Retention Period |
|---|---|---|---|---|
| Level 0 | None | Active | No notice | — |
| Level 1 | Corrective notice | Active | 7 days to revise | 30 days |
| Level 2 | Remove + warning | Flagged | Explanation | 90 days |
| Level 3 | Remove + suspend | Suspended | Formal notice | 1 year |
| Level 4 | Remove + ban | Terminated | If permitted | Indefinite |
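
For illustration only, the two tables above can be read as a single lookup from severity level to enforcement outcome. The following is a minimal Python sketch of that mapping; the names `Severity`, `Enforcement`, and `ENFORCEMENT_MATRIX` are hypothetical and do not describe BookGur's actual systems.

```python
from dataclasses import dataclass
from enum import IntEnum


class Severity(IntEnum):
    """Violation levels from the classification table above."""
    NO_VIOLATION = 0   # Compliant content
    MINOR = 1          # Technical/formatting issues, mild infractions
    MODERATE = 2       # Policy breach without criminal implication
    MAJOR = 3          # Serious breach: hate, plagiarism, impersonation
    CRITICAL = 4       # Criminal: exploitation, threats, illegal content


@dataclass(frozen=True)
class Enforcement:
    action: str          # Moderation action applied to the content
    account_status: str  # Resulting account status
    notification: str    # Notice sent to the user, if any
    retention: str       # How long removed material is archived


# Hypothetical lookup mirroring the enforcement matrix above.
ENFORCEMENT_MATRIX = {
    Severity.NO_VIOLATION: Enforcement("None", "Active", "No notice", "—"),
    Severity.MINOR: Enforcement("Corrective notice", "Active", "7 days to revise", "30 days"),
    Severity.MODERATE: Enforcement("Remove + warning", "Flagged", "Explanation", "90 days"),
    Severity.MAJOR: Enforcement("Remove + suspend", "Suspended", "Formal notice", "1 year"),
    Severity.CRITICAL: Enforcement("Remove + ban", "Terminated", "If permitted", "Indefinite"),
}
```

Under this sketch, a reviewer tool would resolve an outcome with `ENFORCEMENT_MATRIX[Severity.MODERATE]`, keeping individual decisions aligned with the published matrix.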
BookGur's detection systems may automatically flag or isolate content based on:
All automated flags require human verification before permanent action, except in emergency safety situations.
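
As a rough sketch of how that verification rule could be encoded (the `AutomatedFlag` fields and return values below are hypothetical assumptions, not a description of BookGur's detection systems):

```python
from dataclasses import dataclass


@dataclass
class AutomatedFlag:
    content_id: str
    suspected_severity: int   # 0-4, per the classification framework above
    emergency_safety: bool    # e.g. imminent harm or clearly illegal content


def route_flag(flag: AutomatedFlag) -> str:
    """Route an automatically generated flag.

    Automated flags lead to permanent action only after human verification,
    except in emergency safety situations, where content may be isolated
    immediately while that review takes place.
    """
    if flag.emergency_safety:
        # Isolate right away; a human still confirms any permanent action.
        return "isolate_immediately_then_human_review"
    # Default path: queue for a human moderator before any permanent action.
    return "human_review_queue"
```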
All retained material is stored in encrypted archives separate from live user data.
This matrix and the broader moderation policy are reviewed annually or upon major legal change. Updates may redefine severity categories or retention rules as technology and law evolve.
BookGur retains absolute discretion in determining violation levels, enforcement actions, and appeal outcomes. This matrix serves as a reference framework, not a limitation of authority. In all cases, BookGur's determination is final.