AUP Enforcement

Child Sexual Abuse Material (CSAM)

Child sexual exploitation and abuse have no place on Dropbox. This material violates our Terms of Service (https://www.dropbox.com/terms) and Acceptable Use Policy (https://www.dropbox.com/acceptable_use), and we will swiftly disable any account found with this content. Dropbox uses a variety of tools, including industry-standard automated detection technology and human review, to find potentially violating content and action it as appropriate. We also encourage users to report inappropriate content through our reporting tool (https://help.dropbox.com/account-settings/report-abuse) or by completing this form (https://www.dropbox.com/report_abuse). When we become aware of apparent CSAM, we disable the account and file a report with the National Center for Missing and Exploited Children (NCMEC), in accordance with applicable law.

Our team takes care in enforcing our policies. Users who believe we made a mistake may contact Dropbox support (https://www.dropbox.com/get_help/login-issue/disabled-account/logged-out-form) to request a review of that decision. Throughout this section, "actioned" means we enforced our policies by disabling access to the account and/or the piece of content.

Jan 2024 - Jun 2024
We submitted 26,894 CyberTip reports to NCMEC and disabled access to 26,244 distinct accounts and 133,542 individual pieces of violative content. We received 2,099 appeals from accounts disabled under this policy and reinstated access in 3.6% of those cases.

Jul 2023 - Dec 2023
We submitted 33,963 CyberTip reports to NCMEC and disabled access to 31,474 distinct accounts and 273,565 individual pieces of violative content. We received 2,256 appeals and reinstated access in 2.6% of those cases.

Jan 2023 - Jun 2023
We submitted 20,003 CyberTip reports to NCMEC and actioned 18,430 distinct accounts and 169,726 individual pieces of content.

Jul 2022 - Dec 2022
We submitted 19,214 CyberTip reports to NCMEC and actioned 17,829 distinct accounts and 151,787* individual pieces of content.

Jan 2022 - Jun 2022
We submitted 26,730 CyberTip reports to NCMEC and actioned 25,336 distinct accounts and 222,647* individual pieces of content.

Jul 2021 - Dec 2021
We submitted 24,115 CyberTip reports to NCMEC and actioned 22,799 distinct accounts and 261,769* individual pieces of content.

Jan 2021 - Jun 2021
We submitted 24,131 CyberTip reports to NCMEC and actioned 23,000 distinct accounts and 339,716* individual pieces of content.

*On January 31, 2024, we corrected the counts of individual pieces of content actioned in these periods. The originally reported figures were 975,499 (Jul 2022 - Dec 2022), 382,261 (Jan 2022 - Jun 2022), 425,847 (Jul 2021 - Dec 2021), and 438,175 (Jan 2021 - Jun 2021). Other metrics in those reports remain unchanged.
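The "industry-standard automated detection technology" mentioned above is typically built on hash matching: each file is fingerprinted and checked against lists of hashes of known violative material distributed by organizations such as NCMEC. Dropbox's actual pipeline is not public, so the sketch below is purely illustrative, using plain SHA-256 and a made-up in-memory hash set.

```python
import hashlib

# Illustrative only: a real system would use vetted hash lists (e.g. from
# NCMEC) and often perceptual rather than cryptographic hashes. This
# placeholder digest is fabricated for the example.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"placeholder-known-bad-file").hexdigest(),
}

def sha256_of(data: bytes) -> str:
    """Cryptographic fingerprint of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_hash(data: bytes) -> bool:
    """True if the file's hash appears on the known-bad list.

    In practice a match would be queued for human review before any
    enforcement action, mirroring the combination of automated
    detection and human review described in the report.
    """
    return sha256_of(data) in KNOWN_BAD_HASHES

assert not matches_known_hash(b"vacation-photos.zip")
assert matches_known_hash(b"placeholder-known-bad-file")
```

Exact cryptographic hashes only catch byte-identical copies; that is why production systems pair them with perceptual hashing and human review.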

Terrorism and Extreme Violence

Dropbox's Terms of Service (https://www.dropbox.com/terms) and Acceptable Use Policy (https://www.dropbox.com/acceptable_use) prohibit publishing, sharing, or storing content that contains or promotes terrorism or violent extremism, including terror or violent extremist propaganda. Dropbox relies on a combination of proactive and reactive tools to detect such content and enforce our policies: industry-standard hash-matching detection technology, a trusted flagger program, external reports from members of the public and our users, and manual review by highly trained analysts. We strongly encourage anyone who comes across terror or violent extremist content on Dropbox to report it through our reporting tool (https://help.dropbox.com/account-settings/report-abuse) or by completing this form (https://www.dropbox.com/report_abuse). When we find terror or violent extremist content that violates our policies, we disable access to it and take steps to prevent it from being further shared. When warranted, such as when an account appears to be used solely to disseminate terrorist or violent extremist propaganda, we may also disable the associated account.

When Dropbox takes no action on a report, it may be because the provided link was invalid, the content no longer existed, or the content did not violate our Acceptable Use Policy. Users who believe we made a mistake in actioning their accounts can request a review of that determination by contacting Dropbox support (https://www.dropbox.com/get_help/login-issue/disabled-account/logged-out-form).

Jan 2024 - Jun 2024
Dropbox disabled access to 1,249 pieces of terror or violent extremist content and disabled 484 accounts. We received 232 public reports of potential terror content and acted on every report. We received 0 appeals from users who claimed their content or accounts were disabled in error under this policy, and 0 removal orders issued pursuant to EU Regulation 2021/784 (addressing terror content online).

Jul 2023 - Dec 2023
Dropbox disabled access to 854 pieces of terror or violent extremist content and disabled 493 accounts. We received 286 public reports of potential terror content and took no action on 5 of them. We received 0 appeals and 0 removal orders under EU Regulation 2021/784.

Jan 2023 - Jun 2023
Dropbox disabled access to 786 pieces of terror or violent extremist content and disabled 565 accounts. We received 429 public reports of potential terror content and took no action on 2 of them. We received 0 appeals and 0 removal orders under EU Regulation 2021/784.

Jul 2022 - Dec 2022
Dropbox disabled access to 757 pieces of terror or violent extremist content and disabled 207 accounts. We received 291 public reports of potential terror content and took no action on 1 of them. We received 0 appeals and 0 removal orders under EU Regulation 2021/784.
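The per-period figures for public terror-content reports can be turned into a simple derived rate (the share of reports that led to enforcement action). The sketch below uses the numbers from this report; the data layout and function name are my own, not anything Dropbox publishes.

```python
# (total public reports, reports with no action taken), per period,
# taken from the figures in the report above.
reports = {
    "Jan 2024 - Jun 2024": (232, 0),
    "Jul 2023 - Dec 2023": (286, 5),
    "Jan 2023 - Jun 2023": (429, 2),
    "Jul 2022 - Dec 2022": (291, 1),
}

def action_rate(total: int, no_action: int) -> float:
    """Fraction of public reports that resulted in enforcement action."""
    return (total - no_action) / total

for period, (total, no_action) in reports.items():
    print(f"{period}: acted on {total - no_action}/{total} "
          f"({action_rate(total, no_action):.1%})")
```

For example, in Jul 2023 - Dec 2023 Dropbox acted on 281 of 286 reports, an action rate of roughly 98.3%.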