Is Facebook Shadow Banning You? 9+ Signs & Tips

The practice in question involves limiting the visibility of a user's content on the Facebook platform without notifying the user that their reach has been restricted. This can manifest as reduced appearances in news feeds, search results, or other areas where content is displayed. For instance, a post may not be shown to all of a user's followers, or their profile may be less likely to appear in search results when other users look for it.

The significance of such content suppression lies in its potential impact on free speech and the dissemination of information. The benefits claimed by those who implement these measures often center on mitigating the spread of misinformation, hate speech, and other harmful content. Historically, the use of such methods has been debated in relation to platforms' responsibility to both uphold freedom of expression and ensure a safe and informative environment for users.

This article will now delve into the methods reportedly used to implement these visibility restrictions, examine the potential consequences for both users and the platform itself, and explore the ongoing debate surrounding its justification and transparency.

1. Decreased Visibility

Decreased visibility serves as a primary indicator and consequence of content suppression on Facebook. It represents a measurable decline in the number of users exposed to a particular post or profile, often without explicit notification to the content creator. This phenomenon directly impacts the reach and influence of individuals and organizations using the platform.

  • Algorithmic Downranking

    Algorithmic downranking involves adjusting the parameters within Facebook's ranking algorithms to lower the priority of specific content. This may entail reducing the likelihood of a post appearing in users' news feeds, or demoting a profile in search results. For example, posts from a user flagged for potential violations of community standards, even if not definitively violating them, might be shown to a smaller proportion of their followers. The implication is a subtle erosion of the user's potential audience without direct censorship.

  • Suppressed Distribution to Followers

    Even with an established network of followers, content distribution can be selectively restricted. This means that while a user may have hundreds or thousands of followers, only a fraction may actually see their posts. This selective suppression can be implemented based on various factors, including perceived content quality, alignment with Facebook's preferences, or the user's past posting behavior. The consequence is a disconnect between the user's intended reach and the actual impact of their content.

  • Diminished Search Discoverability

    When a user's content or profile is subjected to reduced visibility, its discoverability through Facebook's search function can be significantly impacted. This means that other users searching for related topics or specific profiles may be less likely to find the affected content. For example, if a user consistently posts about a particular topic deemed controversial by Facebook, their profile may be pushed further down in search results for related keywords. This reduction in discoverability effectively isolates the user from potential new audiences.

  • Reduced Engagement Metrics

    Decreased visibility often translates into lower engagement metrics, such as fewer likes, comments, and shares. This decline in engagement can further reinforce the effect of content suppression, as Facebook's algorithms may interpret low engagement as a signal that the content is not relevant or valuable to users. This creates a self-reinforcing cycle, where reduced visibility leads to lower engagement, which in turn leads to further reductions in visibility. This ultimately hinders a user's ability to build a following or participate in meaningful discussions on the platform.
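
The self-reinforcing cycle described in the last point can be illustrated with a toy simulation. Everything here is an assumption for demonstration (the follower count, engagement rate, downranking factor, and the scoring rule itself); Facebook's actual feedback dynamics are not public.

```python
# Hypothetical sketch: how a fixed downranking penalty could compound
# through an engagement feedback loop. All numbers and the update rule
# are invented for illustration, not Facebook's actual algorithm.

def simulate_feedback_loop(followers=1000, base_rate=0.05,
                           downrank=0.5, rounds=4):
    """Each round, reach drives engagement, and weak engagement
    (relative to the follower count) shrinks the next round's reach."""
    reach = followers * downrank          # initial suppressed reach
    history = []
    for _ in range(rounds):
        engagement = reach * base_rate    # likes/comments proportional to reach
        # "Relevance" proxy: engagement relative to what a full-reach
        # post would earn; low relevance further suppresses reach.
        relevance = engagement / (followers * base_rate)
        reach = followers * downrank * relevance
        history.append(int(reach))
    return history

print(simulate_feedback_loop())  # reach halves every round: [250, 125, 62, 31]
```

Even with an initial penalty of only 50%, the modeled reach collapses within a few posts, which is the intuition behind calling the cycle self-reinforcing.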

In summary, reduced visibility is a critical component of content suppression. The multifaceted methods employed to achieve this outcome, from algorithmic adjustments to selective distribution and diminished search discoverability, underscore the subtle yet significant impact on content creators and their audiences. This manipulation of visibility is central to the ongoing debate surrounding the ethics and transparency of content moderation policies.

2. Algorithm Manipulation

Algorithm manipulation, in the context of Facebook, refers to the adjustment of parameters within the platform's ranking algorithms to alter the visibility and distribution of content. This manipulation is a critical component in the implementation of strategies to reduce the reach of particular users or content types, a process often termed content suppression. The connection lies in the cause-and-effect relationship: algorithm manipulation serves as the mechanism by which reduced visibility, a key indicator of this strategy, is achieved. For example, Facebook's algorithm prioritizes content deemed engaging. By subtly downranking content from a particular user, the algorithm can effectively limit its spread without outright removal. This is achieved by adjusting variables such as the frequency with which that user's posts appear in news feeds, or the priority given to their content in search results. The importance of this manipulation stems from its ability to control the flow of information on the platform without resorting to overt censorship.

Moreover, the algorithms, in their complexity, obscure the precise rules governing content visibility. This lack of transparency exacerbates the difficulty of identifying instances of visibility restriction. For instance, a user might experience a sudden and unexplained drop in engagement metrics (fewer likes, comments, and shares) without understanding the underlying cause. The decline might be attributed to changes in the algorithm, shifts in audience demographics, or the quality of the content itself. However, if the algorithm has been manipulated to specifically downrank the user's content, the decline is not organic but rather the result of deliberate intervention. Practical applications of understanding this connection lie in demanding greater transparency from Facebook regarding algorithm operation and content moderation policies. The ability to identify and analyze algorithm manipulation is crucial for protecting freedom of expression and ensuring a level playing field for all users.

In summary, algorithm manipulation is a fundamental aspect of content suppression strategies. It provides the technical means to reduce content visibility without overt censorship, raising significant concerns about transparency and accountability. Understanding the link between algorithm manipulation and its consequences is essential for navigating the complexities of content moderation and promoting a more equitable information environment on Facebook. The challenge remains in developing methods for detecting and quantifying algorithmic bias, and in advocating for greater user control over the algorithms that shape the online experience.
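
To make the idea of a quiet, per-user demotion concrete, here is a deliberately simplified toy ranking function. The fields, weights, and the `visibility_multiplier` table are all invented for this sketch; Facebook's real ranking system is proprietary and far more complex.

```python
# Illustrative sketch only: a toy feed-ranking score with a hypothetical
# per-user visibility multiplier. A multiplier of 1.0 is neutral; values
# below 1.0 quietly demote everything an author posts.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    engagement_signal: float   # predicted engagement, 0..1 (assumed)
    recency: float             # freshness weight, 0..1 (assumed)

visibility_multiplier = {"alice": 1.0, "bob": 0.3}  # invented demotion table

def rank_feed(posts):
    def score(p):
        # Demotion is invisible to the author: the post is still ranked,
        # just with a silently reduced score.
        return p.engagement_signal * p.recency * visibility_multiplier.get(p.author, 1.0)
    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    Post("alice", engagement_signal=0.4, recency=0.9),  # score 0.36
    Post("bob",   engagement_signal=0.8, recency=0.9),  # score 0.216 after demotion
])
print([p.author for p in feed])  # bob's stronger post still ranks below alice's
```

The point of the sketch is that nothing is removed: "bob" simply loses auctions he would otherwise win, which is exactly why such demotion is hard for users to detect.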

3. Content Suppression

Content suppression represents a core mechanism through which visibility restrictions are implemented on platforms such as Facebook, functioning as a key component of what is often referred to as covert content moderation. It involves limiting the dissemination of specific information or opinions without necessarily removing the content entirely, thereby affecting the reach and influence of users. This approach is intended to address perceived violations of community standards or to mitigate the spread of misinformation, but it also raises concerns about censorship and freedom of expression.

  • Algorithmic Filtering

    Algorithmic filtering involves using Facebook's algorithms to reduce the visibility of posts based on various factors, such as the presence of certain keywords, the source of the information, or the user's past behavior. For example, posts containing links to news articles deemed unreliable by third-party fact-checkers might be downranked in news feeds. The implication is that while the content remains accessible, its potential audience is significantly diminished, hindering the user's ability to engage with a wider community.

  • Decreased News Feed Visibility

    Content suppression often manifests as a reduction in the frequency with which a user's posts appear in the news feeds of their followers. This can occur through subtle adjustments to the algorithm that prioritize other types of content, or by directly limiting the reach of specific posts. For instance, if a user frequently posts about political topics, their content might be shown to a smaller proportion of their followers, effectively limiting their ability to participate in public discourse. The impact is a diminished ability to share information and engage in discussions with the user's established network.

  • Content Labeling and Warnings

    The addition of labels or warnings to content can serve as a form of suppression by discouraging users from engaging with or sharing the flagged material. This typically involves placing a notice on a post indicating that the information has been disputed by fact-checkers or that it may violate community standards. For example, posts containing potentially misleading information about vaccines might be labeled with a warning directing users to reliable sources of information. While the content remains visible, the added context and potential stigma can deter users from sharing or accepting the information at face value.

  • Search Result Demotion

    Content suppression can also extend to search results, where specific profiles or posts are pushed further down in the search rankings, making them less likely to be discovered by users searching for related topics. For example, a user who frequently shares content that has been flagged for misinformation might find their profile appearing lower in search results for relevant keywords. The consequence is a reduced ability to attract new followers or engage with people who are actively seeking information on the topics the user discusses.
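
A minimal sketch can tie the first facet above (algorithmic filtering) together. The flagged phrases, the penalty multipliers, and the scoring scheme are all assumptions made up for illustration; no real platform publishes such a list or such values.

```python
# Hypothetical sketch of keyword- and source-based filtering, as
# described above. Demotes rather than removes: the post keeps a score,
# just a smaller one, so it is distributed to fewer feeds.

FLAGGED_TERMS = {"miracle cure", "guaranteed win"}   # invented examples
FACT_CHECK_PENALTY = 0.5                             # assumed multiplier

def filter_score(text: str, base_score: float,
                 link_disputed: bool = False) -> float:
    """Return a visibility score; lower means less distribution."""
    score = base_score
    lowered = text.lower()
    # Demote posts containing flagged phrases.
    if any(term in lowered for term in FLAGGED_TERMS):
        score *= 0.25
    # Demote posts linking to articles disputed by fact-checkers.
    if link_disputed:
        score *= FACT_CHECK_PENALTY
    return score

print(filter_score("This miracle cure works!", base_score=1.0))                 # 0.25
print(filter_score("Local news update", base_score=1.0, link_disputed=True))    # 0.5
```

Note that both demoted posts remain "visible" in the sense the section describes: they are never deleted, only handed a fraction of their original distribution.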

These facets of content suppression on Facebook are intertwined. The use of algorithms to filter content, reduce news feed visibility, add labels and warnings, and demote search results collectively contributes to a practice that limits the reach and influence of users without explicit censorship. The connection lies in the deliberate manipulation of visibility, impacting both the distribution and reception of information on the platform. Understanding these mechanisms is crucial for evaluating the ethical implications of content moderation policies and for promoting greater transparency in the management of online discourse.

4. User Unawareness

User unawareness forms a critical, often defining, aspect of the phenomenon of restricted visibility. Its significance stems from the fact that the absence of notification regarding content limitations directly impacts a user's understanding of their online reach and influence. This lack of transparency inhibits a user's ability to adjust their content strategy, address perceived violations, or appeal decisions affecting their visibility. A practical example involves a user experiencing a sudden drop in engagement metrics without any corresponding notification or explanation from the platform. They may attribute the decline to changes in audience interest or general algorithmic shifts, remaining unaware that their content is being deliberately suppressed.

The practical significance of user unawareness manifests in several ways. First, it undermines the user's ability to participate effectively in online discourse, as their message may not reach the intended audience. Second, it erodes trust in the platform, as users may perceive a lack of transparency and fairness in content moderation practices. Third, it limits the user's capacity to learn from past posting behavior or comply with community standards, as they are deprived of the information needed to make informed decisions about their content strategy. For instance, if a user's posts are consistently being downranked due to the use of certain keywords, they would need to be aware of this in order to modify their posting habits.

In summary, user unawareness is not merely a side effect but a defining characteristic of restricted visibility practices. It exacerbates the potential for censorship and manipulation, hindering users' ability to understand and navigate the platform effectively. Addressing this concern requires greater transparency from platforms regarding content moderation policies and practices, along with mechanisms for users to understand and appeal decisions affecting their visibility. By acknowledging and addressing user unawareness, platforms can foster greater trust and accountability in the management of online discourse.

5. Free Speech Concerns

The practice of restricting content visibility on Facebook without informing the user raises significant free speech concerns. These concerns stem from the surreptitious nature of the action, which effectively silences or diminishes certain voices without providing an opportunity for rebuttal or appeal. When content is suppressed, even if not explicitly removed, the user's ability to express themselves and engage in public discourse is curtailed. The cause and effect are direct: content suppression policies, implemented without transparency, lead to a chilling effect on speech. The importance of free speech lies in its foundational role in democratic societies, allowing for the open exchange of ideas and the ability to hold power accountable. Its restriction, even through subtle means, can undermine these principles. A real-life example might involve a non-profit organization whose posts about social justice issues consistently reach fewer users than expected, hindering its ability to raise awareness and mobilize support. The practical significance of understanding this connection lies in recognizing the potential for such practices to disproportionately impact marginalized groups or dissenting voices.

Further analysis reveals that these concerns are compounded by the lack of due process. When a user's content is suppressed, they are often left unaware of the reasons behind it and are unable to challenge the platform's decision. This absence of procedural fairness contrasts sharply with the principles of free expression, which typically require a clear justification for restricting speech and an opportunity for the speaker to be heard. As a practical matter, demanding greater transparency from Facebook regarding content moderation policies could help address this concern. This might involve providing users with clear explanations when their content is suppressed, along with a mechanism for appealing those decisions. Without such measures, the platform risks becoming an arbiter of acceptable discourse, potentially silencing viewpoints that are unpopular but not necessarily harmful or unlawful.

In conclusion, content visibility restriction implicates fundamental free speech principles. The lack of transparency and due process in these practices poses a challenge to open discourse and can undermine the ability of individuals and organizations to express themselves effectively. Addressing these concerns requires a commitment to greater transparency, accountability, and procedural fairness. By recognizing the connection between free speech and content suppression, stakeholders can work toward creating a more equitable and democratic online environment. The challenge lies in balancing the need to mitigate harmful content with the imperative to protect free expression, ensuring that content moderation policies are not used to silence legitimate voices or stifle public debate.

6. Misinformation Control

Misinformation control is frequently cited as a justification for content visibility restriction on Facebook, including the practice of content suppression. The perceived cause is the rapid and widespread dissemination of false or misleading information, with the intended effect being a reduction in its reach and impact. Misinformation control is positioned as a vital component of platform integrity and public safety, aiming to prevent harm that can arise from the acceptance and proliferation of false narratives. For instance, during public health crises, platforms have suppressed content promoting unproven or disproven treatments, citing the potential for harm to individuals relying on such information. The practical significance of understanding this connection lies in recognizing that the stated goal of mitigating harm is often weighed against concerns about censorship and the potential for legitimate information to be caught in a broad net.

However, the link between misinformation control and visibility restriction is not without complexities. Defining what constitutes misinformation and determining the appropriate response is a subjective process, raising concerns about bias and the potential for viewpoint discrimination. Furthermore, the lack of transparency surrounding content suppression practices makes it difficult to assess their effectiveness and to ensure that they are not being used to silence dissenting voices or to suppress accurate but inconvenient information. For example, a news outlet reporting on a controversial issue might find its content suppressed despite adhering to journalistic standards, simply because the platform deems the information potentially misleading. The practical application lies in the need for greater clarity regarding the criteria used to identify misinformation and for independent oversight of content moderation practices.

In conclusion, misinformation control serves as a primary rationale for content suppression on Facebook, but this rationale is often intertwined with legitimate concerns about censorship and transparency. The challenge lies in striking a balance between the need to mitigate the spread of harmful information and the imperative to protect freedom of expression. By acknowledging the inherent complexities and promoting greater accountability, it is possible to foster a more informed and equitable online environment. Further research and public dialogue are needed to develop effective strategies for addressing misinformation without undermining fundamental rights and values.

7. Platform Accountability

Platform accountability, when considered in the context of content visibility restriction, specifically addresses the responsibility Facebook bears for its actions and their consequences. The core connection lies in the fact that content suppression practices, even when implemented with ostensibly benign intent, have the potential to disproportionately impact users and communities. This potential necessitates a framework of accountability to ensure fairness, transparency, and recourse. The importance of accountability as a component stems from the power Facebook wields in shaping the flow of information and influencing public discourse. A lack of accountability can lead to abuses of this power, silencing legitimate voices and undermining trust in the platform. For example, if a political advocacy group finds its content consistently suppressed during election cycles without explanation, the lack of accountability prevents it from challenging the decision or seeking redress, potentially influencing electoral outcomes. The practical significance of this understanding lies in recognizing that accountability is not merely a matter of policy, but a fundamental requirement for maintaining a fair and open online environment.

Further exploration reveals that platform accountability encompasses several key dimensions. One dimension is transparency, requiring Facebook to disclose its content moderation policies and the criteria used to make decisions about visibility. A second dimension is due process, providing users with clear explanations for content suppression actions and an opportunity to appeal those decisions. A third dimension is oversight, involving independent audits and reviews of Facebook's content moderation practices to ensure fairness and effectiveness. For example, independent researchers could analyze Facebook's algorithms to identify potential biases and assess their impact on different communities. The practical application of these dimensions lies in the development of robust mechanisms for holding Facebook accountable for its actions, including regulatory oversight, legal challenges, and public advocacy.

In conclusion, platform accountability is inextricably linked to the practice of restricting content visibility. The absence of accountability creates a risk of abuse and undermines the principles of free expression and open discourse. By promoting transparency, due process, and oversight, stakeholders can work toward ensuring that Facebook is held accountable for its actions and that its content moderation practices are fair, effective, and consistent with fundamental rights. The challenge lies in overcoming the inherent complexities of content moderation and in developing innovative approaches to accountability that are both practical and enforceable. This requires ongoing dialogue, collaboration, and a commitment to fostering a more equitable and responsible online environment.

8. Community Standards

Community standards serve as the codified rules defining acceptable conduct and content on Facebook. These standards delineate what is permissible and what is prohibited, encompassing a range of topics including hate speech, violence, misinformation, and harassment. Content suppression, including visibility restriction, is often implemented as a means of enforcing these standards, ostensibly to protect users from harmful or offensive material. A direct link exists: perceived violations of the community standards trigger the application of these suppression methods. The importance of community standards as a component lies in their role as the articulated justification for interventions that limit user reach and influence. For example, if a post is flagged for violating the policy against hate speech, it may be subject to reduced visibility. The practical significance of this connection is that community standards, and their interpretation, directly affect the scope of freedom of expression on the platform.

Further analysis reveals that the application of community standards is not always straightforward. Ambiguities in the standards themselves, coupled with variations in their enforcement, can lead to inconsistencies and perceptions of bias. Content that is considered acceptable by one user may be deemed to violate community standards by another. Moreover, the algorithms used to identify violations are not infallible and may erroneously flag legitimate content for suppression. For example, satirical or humorous posts may be misinterpreted as promoting violence or hate speech. Practical applications of this understanding include advocating for greater transparency in the development and enforcement of community standards, and for mechanisms that allow users to challenge decisions affecting their content visibility.

In conclusion, community standards are integrally linked to the practice of content visibility restriction. They provide the rationale for suppressing content deemed to violate platform rules. However, the subjective nature of these standards and the potential for algorithmic error underscore the need for transparency, due process, and accountability. The challenge lies in striking a balance between protecting users from harm and safeguarding freedom of expression. Continuous dialogue and a commitment to refinement are necessary to ensure that community standards are applied fairly and effectively.

9. Reach Limitation

Reach limitation is a direct consequence and defining characteristic of content visibility restriction, a practice often associated with alleged "shadow banning on Facebook." The deliberate reduction of a user's audience constitutes the tangible effect of content suppression efforts, where posts or profiles become less visible to followers and potential new audiences. The importance of reach limitation as a component of such tactics resides in its effectiveness: a limited reach significantly reduces the impact and dissemination of information, effectively diminishing a user's voice. A pertinent example is a user experiencing a sharp decline in post engagement (likes, comments, shares) without a corresponding change in content quality or posting frequency, suggesting an external factor impacting visibility. The practical significance of understanding this connection lies in recognizing that diminished reach serves as a key indicator of potential content visibility restriction, prompting further investigation into the possible causes.

Further analysis of reach limitation reveals the mechanisms employed to achieve this outcome. These include algorithmic downranking, where specific content is assigned a lower priority in news feeds; suppressed distribution to followers, preventing posts from reaching all intended recipients; and diminished search discoverability, making it more difficult for new users to find the content. For example, an activist group's posts may not appear in the news feeds of its followers, even those who actively engage with its content. Similarly, its profile may become harder to find through search. Practical applications of this understanding involve developing tools to monitor reach and detect anomalies, enabling users to identify potential instances of content visibility restriction. Furthermore, advocating for increased transparency from Facebook regarding algorithmic changes and content moderation policies could help mitigate the risk of unfair or arbitrary reach limitation.
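
A monitoring tool of the kind mentioned above can be sketched very simply: compare each post's engagement to a rolling baseline of recent posts and flag sharp, sustained drops. The window size, threshold, and sample numbers below are arbitrary assumptions; a drop flagged this way is only a prompt for further investigation, not proof of suppression.

```python
# Minimal sketch of reach-anomaly monitoring: flag posts whose
# engagement falls far below a rolling average of the preceding posts.
# Window, threshold, and sample data are invented for illustration.

def flag_reach_anomalies(engagements, window=4, threshold=0.5):
    """Return indices of posts whose engagement is below `threshold`
    times the average of the preceding `window` posts."""
    flagged = []
    for i in range(window, len(engagements)):
        baseline = sum(engagements[i - window:i]) / window
        if baseline > 0 and engagements[i] < threshold * baseline:
            flagged.append(i)
    return flagged

# Example: steady engagement, then a sudden unexplained drop.
likes_per_post = [120, 110, 130, 125, 118, 30, 25, 28]
print(flag_reach_anomalies(likes_per_post))  # [5, 6, 7]
```

In practice the input series would come from a page's own analytics export, and a persistent run of flagged posts (rather than a single dip) is the pattern worth documenting.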

In conclusion, reach limitation is a critical component of content visibility restriction, effectively reducing the impact and influence of affected users. Understanding the methods employed to limit reach, and recognizing the indicators of this practice, is crucial for safeguarding freedom of expression and promoting a more equitable online environment. The challenge lies in balancing the platform's need to manage content with users' rights to express themselves and engage with their audience. Addressing this requires ongoing monitoring, advocacy, and a commitment to transparency and accountability.

Frequently Asked Questions about Content Visibility Restriction on Facebook

The following questions address common concerns and misconceptions surrounding the practice of content visibility restriction on the Facebook platform, a phenomenon often referred to as "shadow banning on Facebook." The aim is to provide clarity and an informed understanding of this complex issue.

Question 1: What specifically constitutes "shadow banning on Facebook"?

The term describes the act of limiting the visibility of a user's content on Facebook without directly notifying the user of the restriction. This can manifest as reduced appearance of posts in news feeds, lower ranking in search results, or overall decreased engagement with the user's content.

Question 2: Is there concrete evidence that Facebook engages in content visibility restriction?

While Facebook does not publicly acknowledge a practice specifically labeled "shadow banning," there is evidence of algorithmic manipulation and content moderation methods that can result in reduced content visibility. Users often report experiencing inexplicable declines in engagement and reach, prompting suspicions of such practices.

Question 3: What reasons might Facebook have for implementing content visibility restriction?

Common justifications include combating misinformation, suppressing hate speech, and enforcing community standards. Facebook may also employ such methods to demote content deemed low-quality or irrelevant to users.

Question 4: What are the potential consequences of reduced content visibility on Facebook?

Consequences can include diminished reach and influence, stifled freedom of expression, and erosion of trust in the platform. Individuals and organizations may find their messages are not reaching the intended audience, limiting their ability to participate in public discourse.

Question 5: How can a Facebook user determine whether their content visibility has been restricted?

Detecting this practice can be challenging, as there is no direct notification. However, users may observe a consistent decline in engagement metrics (likes, comments, shares), decreased appearance of their posts in friends' news feeds, or difficulty being found in search results. Third-party tools can also provide estimates of reach and engagement, potentially highlighting anomalies.

Question 6: What recourse do users have if they believe their content visibility is being unfairly restricted on Facebook?

Options for recourse are limited. Users can review Facebook's Community Standards to ensure their content complies with the platform's policies. Contacting Facebook support may yield limited results, but can provide documentation of concerns. Public advocacy and raising awareness of the issue can also be effective strategies.

In summary, content visibility restriction on Facebook is a complex and often opaque practice. While justifications typically center on maintaining platform integrity and combating harmful content, concerns persist regarding transparency, fairness, and freedom of expression. Users are advised to remain informed about these issues and to advocate for greater accountability from the platform.

The following section will delve into actionable strategies for mitigating the impact of reduced content visibility on Facebook.

Mitigating the Effects of Content Visibility Restriction on Facebook

The following guidelines aim to provide actionable strategies for users who suspect their content visibility is being restricted on Facebook, a phenomenon often associated with surreptitious content moderation practices.

Tip 1: Analyze Engagement Metrics Regularly: Consistently monitor post engagement rates (likes, comments, shares) and overall reach. Document significant deviations from established patterns, as these may indicate a change in content visibility. Compare engagement across different content types to identify potential patterns.
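
The per-format comparison suggested in Tip 1 can be done with a few lines of code. The sample numbers below are invented; in practice they would come from an analytics export such as a page's insights data.

```python
# Simple sketch of comparing engagement-per-reach across content
# formats, as Tip 1 suggests. Sample data is invented for illustration.

from collections import defaultdict

# (content_type, likes + comments + shares, reach)
posts = [
    ("video", 240, 4000), ("video", 180, 3600),
    ("image", 90, 3000),  ("image", 60, 2000),
    ("text",  20, 2500),  ("text",  15, 2500),
]

def engagement_rate_by_type(posts):
    """Average engagement-per-reach for each content format."""
    totals = defaultdict(lambda: [0, 0])  # type -> [engagement, reach]
    for ctype, engagement, reach in posts:
        totals[ctype][0] += engagement
        totals[ctype][1] += reach
    return {t: round(e / r, 3) for t, (e, r) in totals.items()}

print(engagement_rate_by_type(posts))
# {'video': 0.055, 'image': 0.03, 'text': 0.007}
```

A format whose rate suddenly diverges from its own history, while the others hold steady, is exactly the kind of deviation worth documenting.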

Tip 2: Diversify Content Formats: Experiment with various content formats, including text updates, images, videos, live streams, and articles. A visibility restriction may disproportionately affect certain content types, and shifting to other formats can circumvent algorithmic biases.

Tip 3: Enhance Content Quality and Relevance: Focus on creating high-quality, engaging content that resonates with the target audience. Content deemed valuable by users is more likely to be prioritized by Facebook's algorithms. Conduct audience research to understand content preferences.

Tip 4: Increase Posting Frequency Strategically: Implement a consistent posting schedule to maintain audience engagement. Posting more frequently, without overwhelming followers, can increase the likelihood of reaching a wider audience. Analyze optimal posting times based on audience activity.

Tip 5: Engage Actively with the Audience: Respond promptly to comments and messages to foster a sense of community and increase audience interaction. Active engagement can signal to Facebook's algorithms that the content is valuable and worthy of increased visibility.

Tip 6: Cross-Promote Content on Other Platforms: Leverage other social media platforms and communication channels to drive traffic to Facebook content. This can bypass potential algorithmic restrictions and increase overall visibility.

Tip 7: Review Facebook's Community Standards: Ensure that all content strictly adheres to Facebook's Community Standards to avoid unintentional violations. Familiarize yourself with the specific guidelines to minimize the risk of content being flagged or suppressed.

Employing these strategies may not fully eliminate the effects of content visibility restriction, but it can improve a user's ability to reach their target audience and maintain a presence on the platform.

The subsequent sections will explore further aspects of content moderation and the ongoing debate surrounding transparency and platform accountability.

Conclusion

This exploration of "shadow banning on Facebook" has underscored the multifaceted nature of content visibility restriction. Decreased visibility, algorithm manipulation, content suppression, and user unawareness form the core elements of a complex issue. Free speech concerns, misinformation control efforts, and questions surrounding platform accountability further complicate the landscape. The lack of transparency in these practices demands ongoing scrutiny.

Continued discourse and independent investigation are crucial. The balance between mitigating harmful content and upholding freedom of expression requires constant vigilance, and the future of online discourse hinges on fostering a more transparent and accountable platform ecosystem. The effects of shadow banning on Facebook raise many questions and challenges yet to be overcome in the digital world.