Hello dear readers, today I want to talk about building and maintaining online communities. I’ll circle back to offline spaces in a separate issue. If you have thoughts, I will leave comments open. I may not get to your comments right away, so please keep that in mind.
First? Some background about social media that affects my perspective on this topic.
Without getting into the weeds, I think it's important to be mindful that social media can negatively impact our mental and physical health, and that safety and moderation help keep communities healthy. As content users, we’re not always able to enforce moderation due to the structure and privacy levels of online communities. In the absence of moderation, it’s my observation that we search for rules of engagement. No, I'm not talking about free speech. I am, however, referring to social contract theory, which states:
"Social contract theory, nearly as old as philosophy itself, is the view that persons’ moral and/or political obligations are dependent upon a contract or agreement among them to form the society in which they live."
A social contract helps determine boundaries to ensure community members engage in a safe, non-threatening way. Unfortunately, some online communities are not managed, or the rules are opaque. In the absence of a social contract, it’s been my experience that members engineer one.
Twitter and Facebook do not have social contracts even though individuals do. They both possess algorithms designed out of their companies’ self-interest: engagement. That engagement doesn’t have a moral or ethical value attached; it’s solely actions taken by users to favorite, bookmark, share, post, etc. Algorithms are programmed (by people, so don’t blame the algos) to favor posts with high engagement.
This is the reason why moderation tools are either ineffective or "late to the party". There's a ton of documentation about this, too. See (1), (2), and (3). These articles show that social platforms rank individuals depending upon their profile (or social rank) and geolocation. The consequence is that the health and safety of some individuals are prioritized over others. This disproportionately affects marginalized identities and rarely, if ever, takes into account other factors such as age or neurodiversity. See (4), (5), (6), (7), and (8).
I'd like to politely remind you the tools didn't create inequality—they exposed and amplified what already existed. If you haven’t experienced inequality, I’m happy for you. Truly. The Internet may seem like the Great Equalizer when we're online, because our experiences include words, pictures, and videos rather than people. Only, most people don’t recognize power imbalances when they’re the ones benefiting from them. Yes, it’s human nature to lean on our experiences and perspectives. Only, our perspectives online are affected by the streams of content directed at us for the purposes of engagement, because they can reinforce our biases. See (9), (10), and (11).
I'm not writing this newsletter to fix online media platforms or make you feel bad. I absolutely believe this background is important to remember when building, maintaining, and participating in online communities both public and private. Mind you, I haven’t even broached how power dynamics, marketing/PR, and money play into this discussion, either. If the context feels complicated, it’s because it is complicated.
Given that emphasis on engagement, how do we build better online communities that are safer and more welcoming? How do we maintain our spaces? Here are some things to consider for both public and private online spaces where groups and communities gather.
Acknowledge the Community’s Purpose
Who is the space for? Who's welcome? What does the community do? And the space where folks are active? There’s no one-size-fits-all with respect to purpose or goals. Even so: every other community-related conversation originates from how you define your space.
For myself, I wanted a private Discord because I no longer enjoyed being active on Twitter, Instagram, and Facebook. That said, I also wanted to acknowledge that I didn't want to form an exclusive space around me and my work. Of course that’s related, but forming a community about me wasn’t my goal. Mind you, I’m finding it’s much easier to shape the community’s core with a limited number of members to start. That may change, but if it does I’ve established a precedent, similar to other groups managed by active community organizers, that I will ask for feedback before making further decisions.
Forming a space around a person is difficult to navigate because of the power dynamics involved. You have “an” individual—and then there's everyone else. (Arguably, small groups like bands are treated the same way.) A sense of inequality underlies this structure and, if that isn’t acknowledged or managed, can lead to unhealthy, even harmful, experiences. Yes, I am saying fandom can be wonderful. It can also be harmful. Look, I am not saying communities shouldn’t be formed around celebrities or creators whose jobs often incorporate a fanbase (to varying degrees). I am saying that regardless of who’s building a community, power and privilege are something to be mindful of.
Forming a space around a topic is a little easier provided you understand not everyone will agree with the subject, be knowledgeable about it, or even join your space for its discourse. Sometimes, people are simply looking for a fun space to join where their friends are active online. Other times, people really want the content and just that.
Forming around a company or organization? Hoo. If you really want me to get into that, please comment. There are similarities but enough differences I’d need to devote a separate issue to that.
Recognize Roles
If you create any space, whether it’s on Discord, Twitter, Facebook, etc., roles are crucial to your understanding of the community structure and how different users are treated. Simply put, an organizer or moderator has more rights than the average user. Since a moderator must prioritize people's safety over engagement? Choose your mods wisely.
Roles, depending upon platform, can be complicated to set up and maintain. Some platforms don't offer role assignment, either. There are other ways to mark moderators or community participants when a tool wasn’t built for them, like changing a username, adding an icon, or using a hashtag. It just depends on whether your community needs moderators and how you present them.
Assess Moderation Tools
Moderation tools are crucial to building, maintaining, and engaging online communities to prioritize the health and safety of the people participating in them. These tools and their efficacy depend on the community's size, structure, and membership. It’s ultimately your call whether or not you have them, but it’s important to have this conversation from time to time even if you’re retreading the same ground.
I believe the rules should be clear and enforced similarly for all community members. At first, they may not be effective when they're put into practice, especially if you're not experienced with this type of community management. Sometimes, you have to learn by making mistakes, but even that comes with a certain amount of privilege.
Ultimately, moderation tools exist to help establish how members expect to be treated in your space and what content is allowed. Unfortunately, it isn’t true that all content is safe and all members will act in the best interests of the community. We are human beings with the capacity to reach the stars while fighting deadly wars. We've all experienced what happens when platforms prioritize engagement over keeping people safe to varying degrees and that does affect how we navigate our spaces.
When you build a community, remember that not everyone feels the same way about moderation as you do. Even if you don’t think moderation is necessary: others do.
Post Moderation Rules
Writing and posting moderation rules helps shape the social contract for your space. These rules can include the words "moderators may ban users for any reason." Moderation rules, in my experience, will evolve along with the community over time as more members accept and implement the social contract. The rules may also be different depending on where community members interact. For example, there's a world of difference between moderating a live speaking engagement and a private user's post.
Enforce Moderation Rules Similarly
Moderation rules are enforced not just for (or between) individuals, but for the community’s benefit. This means that a community founder, a moderator, and a user must all be subject to the same rules. Remember what I wrote about content online? How we’re consuming algorithmic content by ourselves and that reinforces our biases? Sometimes, moderators incur abuse because of calls they have to make for the benefit of the community. It is challenging to be mindful of community whenever action is taken. It’s also necessary.
Moderation rules brush up against the social contract, too, because ultimately a rule is provided to acknowledge that mistakes and harm are both possible. There are a lot of power dynamics, messaging, and nuances involved in moderation rules, and they vary from group to group. The important thing to keep in mind is that they may change. If the community is set up properly, members will understand and accept moderation policies, even if they’re not complete, because they trust that you're acting in good faith.
If the above was challenging for you to read, I’d like to suggest taking a two-week detox from social media. There’s an evolving conversation about the loss of nuance in social media discussion and how our interactions with content have been changing. In my experience, this pertains to the prioritization of engagement and how being overwhelmed with content impacts our usage. There aren’t a ton of studies I could find about this (other than an etail article about average usage). Tangentially related is how headlines and chapter lengths have changed to accommodate shorter attention spans (12), also called The Goldfish Effect (13). Yeah, it’s another one of those complicated issues. My point, though, is that I believe hyper-personalized content (e.g. Minority Report) focused on you to encourage your engagement (clicks, purchases, blue checks, etc.) can be so isolating you lose perspective of your community.
Acknowledge Accessibility and Usage
People go where they feel welcome. That word “welcome” means different things to different people. And that’s okay. Your space may not feel welcoming to everyone. That’s also okay. Communities aren’t filled with people you’re guaranteed to like or engage with, either.
Making people feel welcome can’t be guaranteed, but can be helped. Consider that a lack of moderation tools and rules can be an active deterrent. Sometimes, a lack of accessibility not only prevents others from joining a space, but from engaging with others, too. Ultimately, needs vary from person to person, but basic accessibility needs (like alt text when possible) and moderation make a world of difference.
It’s important for me to point out that people don’t use content/platforms in exactly the same way, either, and we don’t always share the same goals. Some people will want to join a community, read the content, and not post. Others will want to join because they’re just looking for gossip. Want active users? That word—active—isn’t something you can define for others. It’s up to your members. Someone might post once every couple of days and that's as much bandwidth as they have to spare. Or, they might post every day. My point? There's no "one way" to be part of a community.
Be Open to Feedback
Being open to feedback doesn’t mean: “I will take criticism to heart and automatically make changes to ensure your happiness.” It means that: “I will listen to what you have to say provided you are offering feedback in good faith.” You cannot make everyone happy no matter what you do or say. Sometimes, the people who are the most vocal are angry—but typically, they’re not the majority of community members.
I want to underline this because, over and over again, it’s been proven that “loud commenters” don’t reflect everyone. There is such a thing as a loud, vocal minority. See (14) and (15). The rest? The people who aren’t saying anything? That silent majority who doesn’t speak up for a variety of reasons? Those are the people you’re actually serving, because they’re the backbone of your community. They’re the people you need to consider, because when you lose their interest all you’re left with are vocal individuals who don’t acknowledge or care about acting in good faith. And, when that’s the case, you’ve lost your community even if the data tells you otherwise.
What about you? What do you think about online communities? Anything I missed?
Thoughts about Building and Maintaining Online Communities