
The proliferation of social media fraud on platforms like TikTok has prompted an array of legal actions from government agencies, state attorneys general, and private citizens seeking remedies for deceptive practices. These legal responses address the multifaceted nature of fraud occurring on the platform, ranging from consumer protection violations to privacy breaches and financial crimes. As TikTok’s U.S. user base has expanded to roughly 170 million Americans, including vulnerable youth populations, the platform has become a breeding ground for fraudulent schemes that exploit users’ trust and personal information.
Recent enforcement actions against TikTok and its parent company ByteDance demonstrate the growing concern among regulators about the platform’s business practices and its role in facilitating fraud. The Federal Trade Commission (FTC) has taken significant steps to hold TikTok accountable for violations of children’s privacy laws, while state attorneys general have filed lawsuits alleging deceptive practices that harm consumers. These legal actions reflect a broader trend of increased scrutiny of social media platforms and their responsibility to protect users from fraudulent activities.
The legal framework governing social media liability continues to evolve as courts and legislators grapple with novel issues presented by digital platforms. Traditional legal concepts of fraud, misrepresentation, and consumer protection are being applied to the digital realm, often requiring adaptation to address the unique challenges posed by social media environments. Understanding these legal developments is crucial for both platform operators and users seeking to navigate the complex landscape of online fraud prevention and remediation.
Federal Regulatory Actions Against TikTok
The Federal Trade Commission has emerged as a leading enforcer against TikTok’s alleged fraudulent practices, particularly regarding children’s privacy. In August 2024, the Department of Justice, acting on a referral from the FTC, filed a significant lawsuit against TikTok and ByteDance alleging flagrant violations of the Children’s Online Privacy Protection Act (COPPA). The action followed a 2019 consent order that TikTok had allegedly violated, which regulators cite as evidence of a pattern of non-compliance with federal privacy requirements.
The FTC’s complaint contains serious allegations about TikTok’s business practices, including that the company knowingly allowed millions of children under 13 onto its platform without obtaining parental consent as required by law. According to the complaint, TikTok retained the accounts of users it knew to be underage unless narrow conditions were met, with human reviewers spending an average of only five to seven seconds on each account. This cursory review process allegedly failed to protect children’s privacy and allowed the company to collect personal data from underage users without proper consent.
Perhaps most troubling, the FTC alleged that TikTok built “back doors” into its platform that allowed children to bypass age verification systems. The company permitted account creation without age verification through third-party services like Google and Instagram, classifying these as “age unknown” accounts that grew to number in the millions. Even TikTok’s “Kids Mode” service, ostensibly designed to provide greater protection for younger users, allegedly collected and used children’s personal information in violation of COPPA, including sharing this data with third parties such as Facebook.
State Attorneys General Lawsuits
State attorneys general have taken increasingly aggressive action against TikTok, with a wave of lawsuits filed in October 2024. A bipartisan coalition of 14 attorneys general, led by New York Attorney General Letitia James and California Attorney General Rob Bonta, filed individual state lawsuits alleging various violations of state consumer protection laws. These coordinated legal actions reflect growing concern about TikTok’s impact on consumers, particularly young users.
The lawsuits focus on several key allegations, including that TikTok misrepresented the safety of its platform for young people despite evidence of negative impacts on mental health and body image. State attorneys general have also targeted TikTok’s promotion of dangerous “challenges” that allegedly led to injuries and deaths. The complaints further allege that TikTok deliberately engineered features to drive compulsive use, including the “For You” feed, push notifications, autoplay, endless scroll, and beauty filters.
North Carolina Attorney General Josh Stein’s lawsuit exemplifies this approach, alleging that TikTok knowingly created a product harmful to children while deceiving the public about its dangers. The complaint specifically charges that TikTok designed its app to addict young users with features including infinite scroll, autoplay, likes, filters, algorithmic recommendations, and alerts. It further alleges that TikTok misrepresented the app’s safety while its executives and employees internally acknowledged the platform’s addictive and harmful effects on children. The lawsuit seeks both injunctive relief to stop the alleged violations and monetary penalties.
Class Action Lawsuits and Settlements
Private litigation has played a significant role in addressing TikTok fraud, with class action lawsuits resulting in substantial settlements. In a landmark case, TikTok agreed to pay $92 million to settle allegations that it had unlawfully collected users’ biometric and personal data. This settlement, which received final approval in August 2022, resolved claims that TikTok violated federal law and Illinois’ Biometric Information Privacy Act (BIPA) by collecting users’ biometric information without proper notice or consent.
The class action complaint alleged that TikTok used “automated software, proprietary algorithms, AI, facial recognition, and other technologies” to profit commercially from users’ biometric data. It further claimed that the app “clandestinely vacuumed up and transferred” user information to servers in China without adequate disclosure. The settlement covered approximately 89 million U.S.-based TikTok users, including children as young as eight years old, and required TikTok to implement new privacy compliance measures.
Illinois users received additional compensation under the settlement due to the state’s unique Biometric Information Privacy Act, which allows consumers to seek monetary damages if their biometric information is wrongfully taken. The settlement required TikTok to initiate a new privacy compliance training program and take other steps to protect users’ privacy going forward. This case illustrates how private litigation can complement regulatory enforcement actions in addressing fraudulent practices on social media platforms.
Financial Fraud and Money Laundering Concerns
TikTok’s platform has raised significant concerns about its potential use for financial crimes and money laundering. In June 2024, Utah Attorney General Sean Reyes filed a lawsuit alleging that TikTok operates as an unlicensed money transmitter, facilitating illegal activities through its virtual currency system. The lawsuit specifically targeted TikTok’s LIVE feature, which allows viewers to purchase and send “gifts” with monetary value to other users.
According to the Utah complaint, “Because TikTok refuses to appropriately oversee virtual currency exchanges, every transaction that takes place on the platform avoids regulatory schemes designed to identify and stop sexual exploitation and other illicit activities, like money laundering, terrorism financing, drug sales, and illegal gambling.” This allegation highlights the intersection between financial regulation and consumer protection in addressing social media fraud.
The lawsuit referenced a Financial Crimes Enforcement Network (FinCEN) advisory that defines administrators of convertible virtual currencies as money transmitters subject to anti-money laundering requirements. These requirements include adopting compliance programs, identifying customers, assessing illicit finance risks, and screening transactions. TikTok’s alleged failure to comply with these regulations potentially creates an environment where fraudulent financial activities can flourish without proper oversight.
In February 2025, a Utah court denied TikTok’s motion to dismiss the state’s lawsuit, allowing the case to proceed. The court found that the state had sufficiently alleged that TikTok knowingly allowed underage children to join the platform, targeted them with gifting features, and misrepresented the platform’s safety while incentivizing sexual content—all while taking a 50% commission on transactions. This ruling suggests that courts may be increasingly willing to hold social media platforms accountable for financial fraud occurring on their services.
Check Fraud and Banking Scams
A particularly troubling trend emerged in September 2024 when a check fraud scheme went viral on TikTok, demonstrating how the platform can rapidly spread fraudulent activities. The scheme, widely if loosely described as “check-kiting,” involved JPMorgan Chase customers depositing bad checks at ATMs and immediately withdrawing the funds before the checks could be verified. Videos of the scheme spread quickly on TikTok, with users encouraging others to exploit what they falsely characterized as a “glitch” in Chase’s system.
The consequences for participants were severe. Many users who participated in the scheme later posted videos showing negative account balances, sometimes exceeding $38,000. Chase Bank addressed the incident promptly, stating that “depositing a fraudulent check and withdrawing the funds from your account is fraud, plain and simple.” The bank announced plans to freeze accounts involved in the fraud, share surveillance footage and other information with law enforcement, and pursue legal action against perpetrators.
This incident highlights several important legal issues related to social media fraud. First, it demonstrates how social media platforms can rapidly amplify fraudulent schemes, causing significant harm before platforms or authorities can respond effectively. Second, it shows how users may be misled about the legality of their actions, with some participants apparently believing they were exploiting a technical error rather than committing fraud. Finally, it illustrates the potential legal consequences for individuals who participate in fraudulent schemes promoted on social media, including criminal charges, financial liability, and long-term damage to their credit and banking relationships.
Phishing and Cryptocurrency Scams
TikTok has become a vector for sophisticated phishing scams that exploit the platform’s credibility to target users’ sensitive information. In September 2024, phishing defense company Cofense reported a new scam using TikTok URLs to redirect users to malicious sites targeting Microsoft 365 credentials. This technique bypasses user suspicion by capitalizing on the trust TikTok users have for the platform, highlighting the evolving nature of phishing campaigns that leverage social media.
The phishing campaign involved emails claiming to be Office 365 alerts urging users to follow a TikTok URL to prevent deletion of their emails. Once clicked, the link redirected through various sites before landing on a fake Microsoft login page. The scam employed sophisticated techniques to appear legitimate, including auto-filling users’ email addresses and providing seemingly authentic support options. This example demonstrates how fraudsters leverage TikTok’s reputation to conduct traditional phishing attacks with enhanced credibility.
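The defensive counterpart to this redirect technique is easy to illustrate. Below is a minimal, hypothetical sketch (it assumes the third-party `requests` library and uses a placeholder URL rather than anything from the Cofense report) that follows a link’s redirect chain and flags a login-style page served from a host outside Microsoft’s own sign-in domains.

```python
# Minimal sketch: follow a link's redirect chain and flag credential pages
# hosted outside Microsoft's own sign-in domains. Illustrative only; the
# URL in the example call is a placeholder, not one from the campaign.
import requests
from urllib.parse import urlparse

LEGITIMATE_LOGIN_HOSTS = {"login.microsoftonline.com", "login.live.com"}


def inspect_redirect_chain(url: str, timeout: float = 10.0) -> None:
    """Print each hop a link passes through and warn if the final page
    looks like a sign-in form on an unexpected host."""
    response = requests.get(url, timeout=timeout, allow_redirects=True)
    hops = [r.url for r in response.history] + [response.url]
    for i, hop in enumerate(hops):
        print(f"hop {i}: {hop}")

    final_host = urlparse(response.url).netloc.lower()
    body = response.text.lower()
    looks_like_login = "password" in body or "sign in" in body
    if looks_like_login and final_host not in LEGITIMATE_LOGIN_HOSTS:
        print(f"WARNING: login-style page served from unexpected host: {final_host}")


if __name__ == "__main__":
    inspect_redirect_chain("https://example.com/suspicious-link")  # placeholder URL
```

Real mail gateways apply far more sophisticated analysis, but the underlying idea, judging a link by where it ultimately lands rather than where it starts, is the same one this campaign exploited in reverse.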
Cryptocurrency fraud has also proliferated on TikTok, with the Better Business Bureau reporting numerous investment scams targeting platform users. These scams typically begin with direct messages from seemingly legitimate profiles offering cryptocurrency investment opportunities with promises of doubling or tripling initial investments within days. Victims are often directed to communicate off-platform and send money through digital wallet services or cryptocurrency transfers. The fraudsters then claim to “invest” the money, which disappears along with the scammer once payment is received.
European Regulatory Actions
European regulators have taken significant enforcement actions against TikTok for privacy violations that relate to fraudulent practices. In September 2023, the Irish Data Protection Commission (DPC) fined TikTok €345 million for breaching EU data law in its handling of children’s user accounts. The fine, at the time the largest levied against the platform, followed an investigation into TikTok’s data processing practices between July and December 2020.
The DPC found that TikTok failed to provide sufficient transparency information, implemented inappropriate platform settings for young users, employed “dark patterns” to manipulate user choices, and maintained ineffective age verification measures. For example, all user accounts defaulted to public settings, with interface designs making it more challenging to opt for privacy settings. The commission determined that TikTok had committed multiple breaches of the General Data Protection Regulation (GDPR), including violations of principles relating to transparency, data minimization, and security.
This European regulatory action complements U.S. enforcement efforts by addressing similar concerns about TikTok’s data practices, particularly regarding children. The DPC’s decision required TikTok to bring its data processing into compliance with EU regulations within a three-month timeline, in addition to paying the substantial fine. This case illustrates the global nature of legal responses to social media fraud and the increasing convergence of privacy regulation and consumer protection across jurisdictions.
Legal Theories and Liability Frameworks
The legal actions against TikTok employ various theories of liability that reflect the complex nature of social media fraud. Consumer protection laws form the primary basis for many claims, with state attorneys general alleging violations of statutes prohibiting deceptive and unfair business practices. These laws typically prohibit false or misleading representations about products or services, including misrepresentations about a platform’s safety features or data collection practices.
Privacy laws provide another avenue for legal action, particularly regarding the collection and use of personal information without adequate disclosure or consent. The Children’s Online Privacy Protection Act establishes specific requirements for online services directed to children under 13, including obtaining verifiable parental consent before collecting personal information. State laws like Illinois’ Biometric Information Privacy Act create additional protections for specific types of sensitive data, with private rights of action that enable individuals to seek remedies for violations.
Financial regulations also play an important role in addressing fraud on social media platforms. Money transmission laws require entities that transfer funds between parties to register with appropriate authorities and implement anti-money laundering controls. The application of these laws to virtual currency systems within social media platforms represents an emerging area of legal development, with potential implications for platform operators’ responsibilities regarding fraudulent financial transactions.
Platform Liability and Section 230
A critical legal issue in addressing social media fraud involves the scope of platform immunity under Section 230 of the Communications Decency Act. This federal law generally shields online platforms from liability for content posted by users, stating that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This provision has traditionally provided significant protection for social media companies against claims based on user-generated content.
However, recent legal developments suggest potential limitations to this immunity in cases involving platform design and business practices that facilitate fraud. The Utah court’s decision allowing claims against TikTok to proceed despite Section 230 arguments indicates that courts may distinguish between merely displaying user content and actively contributing to unlawful activity through platform design and policies. The court found that allegations about TikTok knowingly targeting children with certain features and misrepresenting safety measures went “beyond displaying unwelcome and actionable content.”
This evolving interpretation of Section 230 may have significant implications for platform liability regarding fraudulent activities. While platforms likely retain immunity for simply hosting user content that happens to be fraudulent, they may face increasing liability for design choices, algorithmic recommendations, or business practices that actively promote or facilitate fraud. This distinction could shape future legal strategies for addressing social media fraud, with potential focus on platform design and business models rather than specific instances of fraudulent content.
Remedies and Enforcement Mechanisms
Legal actions against social media fraud on TikTok seek various remedies to address harm and prevent future violations. Regulatory enforcement actions typically request civil penalties, which can be substantial. For example, the FTC’s complaint against TikTok noted that the FTC Act allows civil penalties up to $51,744 per violation per day, potentially resulting in massive financial liability given the scale of alleged violations.
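Purely as a hedged arithmetic illustration of how that per-violation, per-day maximum compounds (the violation and day counts below are assumptions chosen for the math, not figures from the complaint):

```python
# Hypothetical penalty-ceiling arithmetic; only the $51,744 statutory
# maximum comes from the text above, the other inputs are assumptions.
STATUTORY_MAX_PER_VIOLATION_PER_DAY = 51_744  # USD

assumed_violations = 1_000  # hypothetical count of concurrent violations
assumed_days = 30           # hypothetical duration of non-compliance

ceiling = STATUTORY_MAX_PER_VIOLATION_PER_DAY * assumed_violations * assumed_days
print(f"Theoretical exposure ceiling: ${ceiling:,}")  # $1,552,320,000
```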
Injunctive relief represents another important remedy, with regulators and private plaintiffs seeking court orders requiring changes to platform practices. These injunctions may mandate specific reforms to address fraudulent activities, such as implementing stronger age verification systems, enhancing disclosure of data collection practices, or modifying features that facilitate fraud. The effectiveness of such injunctions depends on courts’ willingness to impose detailed requirements and monitor compliance over time.
Private litigation, particularly class actions, can provide compensation to individuals harmed by fraudulent practices. The $92 million TikTok biometric privacy settlement illustrates how class actions can deliver monetary relief to large numbers of affected users while also requiring changes to business practices. However, the individual recovery in such cases is often modest, raising questions about whether such settlements adequately deter future misconduct or compensate for actual harm suffered.
Prevention Strategies and Platform Responsibilities
Beyond legal remedies for past violations, addressing social media fraud requires forward-looking prevention strategies. Platform design plays a crucial role in either facilitating or preventing fraudulent activities. Features that prioritize engagement and virality without adequate safeguards may inadvertently promote the spread of fraudulent content, as demonstrated by the rapid proliferation of the Chase Bank check fraud scheme on TikTok.
Content moderation represents another essential component of fraud prevention. Platforms must develop effective systems to identify and remove fraudulent content before it reaches large audiences. This task presents significant challenges given the volume and variety of content on social media platforms, the sophisticated techniques employed by fraudsters, and the tension between content removal and free expression values.
User education also serves as an important preventive measure. Financial literacy experts emphasize the need for users to understand basic principles such as “there is no such thing as free money” and to recognize warning signs of fraud. As one financial consultant noted regarding the Chase Bank incident, “It’s not free money, and ignorance is not an excuse for breaking the law.” Platforms can contribute to user education through clear warnings about potential scams and information about how to report suspicious activity.
International Dimensions and Jurisdictional Challenges
The global nature of social media platforms creates significant jurisdictional challenges for legal actions against fraud. TikTok operates across national boundaries, with its parent company ByteDance based in China while serving users worldwide. This international structure raises questions about which laws apply to the platform’s activities and which authorities have jurisdiction to enforce those laws.
Data transfer issues further complicate the legal landscape. Allegations that TikTok transfers user data to servers in China have raised concerns about both privacy and national security. These cross-border data flows may implicate various legal regimes, including data protection laws, international trade agreements, and national security regulations. The interaction of these different legal frameworks creates complexity for both regulators seeking to address fraud and platforms attempting to comply with sometimes conflicting requirements.
Enforcement cooperation between different national authorities becomes increasingly important in this context. The coordinated filing of lawsuits by multiple state attorneys general demonstrates one approach to addressing jurisdictional limitations, with different authorities working together while operating within their respective legal frameworks. International regulatory cooperation, though more challenging, may become increasingly necessary as social media fraud continues to transcend national boundaries.
Future Trends in Legal Responses to Social Media Fraud
The legal landscape surrounding social media fraud continues to evolve rapidly, with several emerging trends likely to shape future developments. First, increased regulatory focus on platform design and business models, rather than individual instances of fraudulent content, may lead to more systemic approaches to addressing fraud. This shift reflects growing recognition that certain platform features and incentive structures may inherently facilitate fraudulent activities.
Second, the convergence of consumer protection, privacy, and financial regulation in addressing social media fraud suggests a more integrated regulatory approach. As demonstrated by the various legal actions against TikTok, fraudulent practices often implicate multiple legal frameworks simultaneously. Future regulatory efforts may increasingly coordinate across these different domains to address the multifaceted nature of social media fraud.
Finally, the role of private litigation in complementing regulatory enforcement is likely to remain significant. Class actions and other private suits can address harms not fully captured by regulatory actions, while also providing compensation to affected individuals. The success of the TikTok biometric privacy settlement demonstrates the potential effectiveness of private litigation in addressing certain types of fraudulent practices, particularly where specific statutory frameworks create private rights of action.
Conclusion
The legal actions taken against social media fraud on TikTok reflect a complex and evolving response to the challenges posed by fraudulent activities on digital platforms. Federal regulators, state attorneys general, and private litigants have employed various legal theories to address different aspects of fraud, from privacy violations to consumer deception to financial crimes. These multifaceted approaches demonstrate both the seriousness with which authorities view social media fraud and the challenges of effectively regulating global digital platforms.
As TikTok and other social media platforms continue to evolve, legal responses to fraud will likely adapt accordingly. The tension between promoting innovation and protecting users from harm remains a central challenge for both regulators and courts. Finding the appropriate balance requires careful consideration of platform design, business incentives, user behavior, and the limitations of traditional legal frameworks when applied to novel digital contexts.
Ultimately, addressing social media fraud effectively requires collaboration between platforms, regulators, and users themselves. While legal actions provide important accountability mechanisms and remedies for past harms, preventing future fraud depends on creating platform environments that inherently discourage fraudulent activities while empowering users to recognize and avoid scams. The ongoing legal developments surrounding TikTok illustrate both the progress made in addressing social media fraud and the significant work that remains to be done.
Citations:
- FTC Lawsuit Against TikTok for Child Privacy Violations
- DC Attorney General Files Complaint Against TikTok
- TikTok Lawsuit Raises Money Laundering Concerns
- Utah Court Ruling Denies TikTok Motion
- Chase Bank Faces Viral TikTok Check Fraud
- Cofense Report on TikTok Phishing Scams
- TikTok Settlement Shapes U.S. Privacy Protections
- FTC Refers TikTok Complaint to DOJ
- NJ Attorney General Sues TikTok for Youth Harm
- TikTok Faces State AGs Legal Challenges
- TikTok Class Action Lawsuit Details
- TikTok Fined 345M by Irish Data Commission
- JPMorgan Chase Reports TikTok Glitch Fraud
- BBB Alert on TikTok Cryptocurrency Scams
- 92M TikTok Privacy Violation Settlement Approved
- Washington AG Sues TikTok for Mental Health Harm
- NC Attorney General Sues TikTok for Child Harm
- TikTok Free Money Trend Leads to Consequences
- Fegan Scott Case Against TikTok
- University of Maryland Article on TikTok Laws
- VA Attorney General Sues TikTok for Harmful Content
- DC AG Sues TikTok for Preying on Users
- TikTok Faces Lawsuit Over Retaliatory Termination
- Ohio University on TikTok Ban and Data Security
- CSBA Blog on TikTok Lawsuit Updates
- FTC Statement on TikTok DOJ Referral
- The Hilltop on TikTok Ban Discussions
- Arkansas AG Wins TikTok Lawsuit Approval
- SC Attorney General Sues TikTok for Violations
- VA AG Urges SCOTUS to Uphold TikTok Ban
- TikTok Community Guidelines Enforcement Report
- TikTok Pays 92M to Settle Data Theft Suit
- JPMorgan Sues Customers Over TikTok Glitch Trend
- CNN on Chase TikTok Trend and Fraud
- TikTok Trend Encourages Chase ATM Check Fraud
- Fox5 DC Update on TikTok Ban Status
- Cointelegraph on TikTok Crypto Scams
- USA Today on Chase Bank ATM Fraud Glitch
- The Verge on U.S. TikTok Ban News
- NY Post TikTok Scam Warning for Users
- KUTV on TikTok Free Money Fraud Trend
- NBC News on GOP Senators and TikTok Extension
- Kiplinger on IRS Skirting TikTok Ban
- Bloomberg Law on JPMorgan Suing TikTok Fraudsters
- TikTok Video by Moby Siddique on Scams
- Shopify Community on TikTok Ad Scams
- Reddit Discussion on TikTok Scam Legality
- BeBold Digital Blog on TikTok Scams
- TikTok Build in Public Video on Fraud
- YouTube Video on TikTok Fraud Issues
- TikTok Video by WebHive Digital on Scams
- Action Fraud on TikTok Scam Alerts
- TikTok Safety Guide on Avoiding Scams
- FTC Orders Social Media Platforms on Ad Surge
- Hunton on FTC Referral of TikTok Complaint
- NC DOJ Orders TikTok to Comply with Investigation
- TikTok Lawsuit Highlights Money Laundering Risks
- Arkansas AG Praises SCOTUS TikTok Divestment Ruling
- Utah Consumer Protection Releases TikTok Complaint Info
- NC AG Josh Stein Sues TikTok for Harm
- Cutter Law on Section 230 Social Media Lawsuits
- FTC Sues TikTok for Violating Child Privacy Law
- Utah Blog on TikTok Live Unsealed Redactions
- TikTok Data Privacy Settlement Information
- Clifford Law on TikTok Data Privacy Settlement
- ICO Fines TikTok 127M for Misusing Data
- Social Media Victims on TikTok Lawsuit
- Bloomberg on TikTok Fine for EU Data Breach
- Iowa AG Sues TikTok for Misleading Parents
- The Record on TikTok Fine in Italy
- Fortune on JPMorgan Suing Over Money Glitch
- Motley Rice on TikTok Social Media Lawsuits
- CNN on Supreme Court TikTok Ban Ruling
- Local12 on TikTok Check Fraud Trend
- Politico on Supreme Court Upholding TikTok Ban
- Tenable on TikTok Ad Scams and Moderation
- Vvitch Digital Blog on TikTok Rules
- CRS Reports on TikTok Ban Legislation
- SW Law on Supreme Court TikTok Restrictions
- Constitution Center on TikTok Ban and SCOTUS
- FreightWaves on Federal Court TikTok Ban
- Silicon Republic on TikTok 500M GDPR Fine