Why we exist

Blocjerk is a privacy-focused, user-owned adult entertainment app.

We're here to liberate the globally oppressed, unbanked or undocumented through the power of blockchain, AI and stakeholders. Beyond the problems we're setting out to solve, which are covered in the following pages, our core goals will never change:

  1. To provide a decentralised, permissionless and censorship-resistant protocol for mature content creators and consumers that requires no KYC or bank, while offering private payments, data analytics, wank2earn and other cutting-edge innovations through the full utilisation of rapidly advancing blockchain and AI technologies.

  2. To sustainably scale and actually have the resources to disrupt a giant market without a billion-dollar bankroll. We utilise our decentralised physical infrastructure network (DePIN) of shared computing power to host, transcode and deliver content or data while rewarding miners.

  3. To create a user-centric ecosystem that self-moderates and removes the middleman while tokenising ownership to empower creators and consumers and minimise the fees they incur.

  4. To build a complete suite of data analytics toolkits that help creators find the biggest spenders, help users find their ideal creators, and help businesses increase ROI by choosing the right partners.

  5. To combat the rise of illicit online content and protect children, victims of crime and the generally vulnerable via a more efficient decentralised moderation system that competes with the failing incumbent (https://www.bbc.com/news/uk-63249018).

Although we can't tackle the dark web, the point is that this is happening today on existing mainstream websites and apps, as the BBC report above reveals. Some stats that hit home:

  • The US Sentencing Commission's October 2021 report, which studied child sexual abuse image producers, found that OSEAC (online sexual exploitation and abuse of children) has increased by 422% over the last fifteen years (Federal Sentencing of Child Pornography Production Offenses, 2021).

  • In 2021, NCMEC received more than 29.3 million (up 35% over 2020) CyberTipline reports containing over 84.9 million images, videos, and other content related to suspected child sexual exploitation (up 29.8% over 2020). Over the past fifteen years, the number of reports of suspected child sexual abuse materials (CSAM) made to the CyberTipline has increased by 15,000% (CyberTipline 2021 Report, National Center for Missing and Exploited Children).

  • Transparency reports from seven of the biggest social networks were reviewed to find out how prevalent child abuse is on their platforms. Transparency reports typically include content removals, broken down into various categories; this review covers those related to child nudity, abuse, and sexual exploitation. Key findings:

  • TikTok saw removals nearly double between 2019 and 2020

  • YouTube saw a 169% surge in removals between 2018 and 2020

  • Facebook saw a modest 3% decrease between 2019 and 2020 (Source: Comparitech, February 24, 2021)

  • 2020 was a record-breaking year, with more than 21.7 million reports of suspected child sexual exploitation made to NCMEC's CyberTipline. This marks the highest number of reports ever received in one year - a 28% increase over 2019 - and a disturbing 97.5% increase in online enticement reports alone. (NCMEC, Rise in Online Enticement and Other Trends: NCMEC Releases 2020 Exploitation Stats, Feb. 24, 2021)

  • In 2020, IWF analysts processed 299,600 reports, including tip-offs from members of the public. This is up from 260,400 reports in 2019 - an increase of 15%. Of these reports, 153,350 were confirmed as containing images and/or videos of children being sexually abused, compared with 132,700 in 2019 - an increase of 16%. Every report contains between one and thousands of child sexual abuse images and videos, which equates to millions of images and videos in total. Of these, 68,000 reports were tagged as including "self-generated" child sexual abuse content - a 77% increase on 2019's total of 38,400 reports. (Internet Watch Foundation, Jan. 12, 2021)
