Social media is somehow a part of our everyday lives, and we see different platforms that cater to varying types of communication and content. With that said, when we misuse those platforms or share too much, we potentially risk all that we care to protect. Join me as I discuss the ins and outs of the appropriate use of social media and some real-world observations on what we can do better.
[00:00:04] Welcome to MSP 1337, a show dedicated to cybersecurity challenges and solutions, a journey together, not alone. I'm your host, Chris Johnson.
[00:00:17] Welcome everybody to another episode of MSP 1337. I'm joined this week by Kelly Stone, a social media influencer.
[00:00:30] And many of you know her through the different roles we've seen her in over the last, well, we'll just say a couple of years,
[00:00:36] because we don't want to tell everybody how many years you've been doing this. Kelly, welcome to the show.
[00:00:40] Thank you for having me, and thank you for keeping my age obscure for my own personal privacy.
[00:00:46] You know, if we said it's been since the beginning of, you know, the MSP space, I think that keeps us pretty young, right? That's not been all that long.
[00:00:55] Forever young in fact.
[00:00:57] Yeah, and we won't talk about the things that we didn't have when we were in high school, because that would be a dead giveaway, yeah.
[00:01:05] So Kelly, I reached out to you on LinkedIn and said I want to tell a story about cybersecurity and social media.
[00:01:13] I think social media is on everybody's mind. We're all involved in it. We all probably get a little bit of the endorphins when someone gives us a thumbs up.
[00:01:22] Even if it's a thumbs down, it still makes a ding in, you know, the alerts.
[00:01:27] But you and I were talking offline, and one of the things we talked about is the things that we see people do
[00:01:33] that make one question whether or not they've thought about what they were posting before they posted it.
[00:01:39] You know everything from the certification that they just got or hey I was just promoted to fill in the blank job role.
[00:01:48] And I guess the loaded question right out of the gate is
[00:01:51] when is it appropriate, and how does appropriateness of social media work when we're balancing work, life, and social?
[00:01:59] How does that work? Where do you start?
[00:02:03] I think I could probably write a book on this topic and probably should.
[00:02:08] There's my next billion dollar idea.
[00:02:11] There you go.
[00:02:12] It's wild because I think for most of us social media is a bit of an inevitability.
[00:02:19] We've got to be on it to see pictures of our friends' kids, interact with our parents,
[00:02:25] stay connected with the business world at large.
[00:02:29] But once you're there it can be really difficult to decide what goes where at what time and to your previous question what is appropriate.
[00:02:38] I think about it in a couple of ways.
[00:02:40] The first thing that I think about is where am I posting this?
[00:02:44] What exists on Twitter is not the same as what exists on Instagram and certainly shouldn't be the same as what you post on LinkedIn.
[00:02:52] So I think venue is very important.
[00:02:54] And I think what goes hand in hand with venue is audience.
[00:02:58] Some of these sites have the ability to curate your audience in different ways.
[00:03:02] So for example on Facebook you can add people to an acquaintance list and they see different posts than your friends who see different posts from the public.
[00:03:12] So that's a way to segment your audience, much in the way that marketers like me segment your emails.
[00:03:19] So that's an option.
[00:03:20] LinkedIn obviously you connect with people and they see things that are different than people who are not connected with you.
[00:03:27] Twitter by default is public and Instagram has some various privacy filters so that's the other thing.
[00:03:34] I know back in the day people used to have things like a business and a personal Facebook account.
[00:03:40] I personally don't like that, because I feel like you're likely to make a mistake if you do that,
[00:03:46] switching between different profiles and logins.
[00:03:50] And if there's something that you were going to share on your personal account that isn't appropriate for your professional account I would suggest that perhaps that's something that doesn't meet the appropriateness threshold.
[00:04:02] You could have friends that end up in overlaps too that might repost something that now has bridged it whether you made a mistake or not.
[00:04:12] Exactly.
[00:04:12] And it's a very easy mistake to make, especially in the earlier days of X, which was then Twitter.
[00:04:22] It was very easy to make a mistake between a direct message and a public facing tweet.
[00:04:27] You can see politicians like Anthony Weiner who made this mistake and you can't go back on it.
[00:04:33] So there's a lot of different social media platforms.
[00:04:37] You mentioned some of the major ones and obviously we've seen the evolution of some of these platforms for good, bad or otherwise that have shifted the way in which we use them.
[00:04:48] I said I'm afraid of dropping Facebook because I might miss out on birthdays or something.
[00:04:55] I know my brother is very active on Facebook.
[00:04:57] That's kind of how I keep track of what he's doing.
[00:04:59] I live more in the LinkedIn space and I'm just curious about separating what kind of things we post being careful with those platforms.
[00:05:10] One of the things I've noticed, and I'm sure you've noticed as well, is there's been sort of a migration of those that used to post on Facebook or Insta.
[00:05:21] They've started really posting on LinkedIn, and you see a post that says something to the effect of, normally I don't post anything about my personal fill-in-the-blank, but... and it's on LinkedIn now.
[00:05:31] Right.
[00:05:32] We've also seen similar happen with Facebook.
[00:05:34] Facebook was largely tied to social interaction.
[00:05:39] I mean it came out of college social communication.
[00:05:44] Right.
[00:05:45] Twitter is the same thing right.
[00:05:46] How to get bits and bytes out in a very short burst; it was more about notifying you about something than it was about sharing my life story in 140 characters.
[00:05:58] How does that you know traverse today?
[00:06:01] In fact, I remember I was actually at South by Southwest when Twitter launched and they used it at the venue to tell us where everything was happening instead of doing like pull up your agenda.
[00:06:12] It would just get blasted out on a TV screen that had the Twitter feed going.
[00:06:16] So today, like, you know, what do we do? Because I think you're damned if you do, damned if you don't. How do you balance that? Because I think that's a tough one in and of itself.
[00:06:30] I think it is very hard to avoid social media.
[00:06:33] I have this deep-seated fantasy that one day when I retire, I'm going to be like that scene where Anne Hathaway throws her cell phone into the fountain in The Devil Wears Prada.
[00:06:43] I'm going to do that with my smartphone.
[00:06:44] I'm going to get a flip phone.
[00:06:45] I'm going to change my number and I am off of all of the social media.
[00:06:50] But until that point for my profession, for our industry, it can be really difficult to avoid.
[00:06:59] So I think what's interesting is, certainly in the last two years, we've seen the rise of the LinkedIn influencer, and I've ridden that wave too.
[00:07:09] I'm just as guilty as everybody else, but you're getting the same dopamine hit in different ways, and it's changing the conversation around these platforms too.
[00:07:19] So Twitter is a perfect example.
[00:07:21] When it first came out, it was difficult to have a true dialogue that was easy to follow.
[00:07:26] Sure.
[00:07:27] Now you can create threads on Twitter and you can have more of an open dialogue.
[00:07:32] And we've seen how that impacts public discourse as well as political discourse.
[00:07:39] You know, Twitter has been influential in many different political uprisings because of the ability to text your tweets via SMS.
[00:07:49] So that's changed the conversation too.
[00:07:51] I think the pivotal moment that changed Facebook, in my opinion, is when .edu was no longer the requirement to get in.
[00:08:00] So when everybody could get in, the conversation changed a lot.
[00:08:03] And I think that we'll continue to see that.
[00:08:06] Oh yeah, it was some time ago.
[00:08:08] It might have been slightly after it became Facebook from The Facebook.
[00:08:15] So these moments in time and these decisions change how the conversation happens there.
[00:08:20] And there's a lot of communications theory around that, but basically the venue does start to dictate the conversations that happen there.
[00:08:32] Would you like to see more of this or less of this?
[00:08:34] As you know, I use Flipboard a lot for curating news.
[00:08:37] And what I find is that if I'm not careful, I can end up in a biased news state, because I have said no to something that I know is slanted too far one way or the other, because it doesn't make sense.
[00:08:51] There are pieces missing, or it's not true at all.
[00:08:56] And depending on what I say to it may dictate what continues to flow into my feed.
[00:09:00] And I'm always like, well, is this kind of like ChatGPT: garbage in, garbage out?
[00:09:05] Do I need to go look at some specific news source and suddenly have the epiphany of like, oh, I've been in the wrong place this whole time.
[00:09:14] Certainly, I think that we're waking up to this.
[00:09:16] One of the challenges that I see is that media literacy is no longer taught anywhere much like financial literacy isn't taught anywhere.
[00:09:24] So we're not thinking about critically evaluating our news.
[00:09:28] And that's perpetuated in social media because, again, the algorithm rewards us.
[00:09:33] So we see things that we like.
[00:09:34] We get that great sensation of like, oh, I know what's going on.
[00:09:37] I understand the world.
[00:09:39] Yes, we think that we understand the world until we are faced with the harsh reality of perhaps an election not going the way that we were absolutely convinced that it would because all of our news sources said it did.
[00:09:53] So what's happening here is a communications theory that's played out a million different ways before.
[00:10:01] There's a great book by Neil Postman called Amusing Ourselves to Death, and he posits that we're careening towards one of two realities.
[00:10:10] Either the 1984 reality, where the things that scare us are the ones that eat us alive as a society.
[00:10:17] Exactly.
[00:10:18] Or we're heading towards Brave New World,
[00:10:21] where the things that we love kill us as a society.
[00:10:25] I would argue that we're heading toward the latter: the things that we love are killing us as a society, because we are creating a world unto ourselves that is self-sustaining, self-perpetuating, and no idea against it ever enters it.
[00:10:40] There's been a few movies that have come out that way that allude to the dystopian future because of our inability to discern fact from fiction.
[00:10:52] And it's interesting because there's a lot of young adult literature that plays with this concept, the Hunger Games idea, the way the Capitol lives versus how the outliers do.
[00:11:05] This idea that there is a single source of truth I think went away with the 24-hour news cycle and social media was born out of that as well.
[00:11:13] It reminds me of the Amazon series The Man in the High Castle.
[00:11:17] Yes, I think about that show often. For those who are listening who are unfamiliar, it posits a world in which we did not win World War II, but we didn't know it.
[00:11:30] Wait, we didn't know that we won?
[00:11:34] Right.
[00:11:36] Right.
[00:11:37] Uh-huh.
[00:11:38] Yeah, that's the part: in the show, those living in the US didn't know.
[00:11:42] One half of the US is in a state that believes we lost the war, and it creates this whole underground system of trying to get people out, and then they realize that they should never have been in that state.
[00:11:58] It's definitely dystopian in the way it plays out.
[00:12:06] Wow, we could go down a whole different trajectory talking about the things from fantasy literature and fiction that have turned into a large part of what's influencing current media and current news cycles.
[00:12:22] So as cybersecurity kind of drives largely what we talk about on this show, obviously we're focused very heavily on the privacy piece.
[00:12:33] What are you seeing as far as like, you know, we talked about this a little bit, things that people shouldn't post.
[00:12:38] I think it's okay to say, you know, new job or new promotion but is it me or are people getting more and more specific with what they put in those posts about the change?
[00:12:52] I think people are compelled to be more specific because I think that it takes more to get that serotonin hit of the thumbs up than it ever did before.
[00:13:03] You're going to have to be new and novel and different and more over the top to get those views and likes.
[00:13:09] I mean, look at what MrBeast does. Amazing content machine getting all the likes, all of the adjacent product placements.
[00:13:20] But his team tests a hundred thumbnails before anything ever goes live.
[00:13:26] So getting to that utopia of getting all of that, you have to do more, you have to say more, you have to be more evocative and that's the nature of the media beast too.
[00:13:38] You're saying that not everybody's going to get a YouTube trophy in the mail for their number of liked or watched videos?
[00:13:47] Nope. I do not think "you had a hundred million likes" goes on our tombstone.
[00:13:53] I think that we've got to think about it that way too: the half-life of these things is so short.
[00:14:00] So sure, it may feel good to get out those words, it may feel good to get that feedback but that's part of your personal brand once it's out there too.
[00:14:11] Wow. So if we were to talk about the ways in which people should be using social media, you talked about sort of separating the two but at the same time we're seeing the platforms merge and move together.
[00:14:25] I mean, once upon a time Instagram was its own thing and now it's part of the Facebook monopoly on social media.
[00:14:36] I shouldn't say monopoly, there's a lot of other independent platforms out there but at least today there's still some independent platforms.
[00:14:44] What kinds of things should we post? Going into sort of the appropriateness of what we post,
[00:14:51] where are the lines? You're definitely on the right side if, when you're in the middle, not sure whether it's, say, work or personal, you treat that blurring of the lines as a warning,
[00:15:02] versus looking at it through the lens of, okay, this is definitely work related, so what level of information is too much?
[00:15:10] Like, you know, vagueness and ambiguity don't really help anybody, but getting too specific can definitely have a damaging impact, because it exposes something that was unintentional.
[00:15:20] And similarly with the personal side, you know, you can go too far. Same idea: things we did in high school, pre-social media, well, what happened
[00:15:30] stayed there, because there was no camera.
[00:15:33] I guess that's the thing I'm seeing now, and I'd just love your feedback on it, because I think social media has the opportunity to be changed for the better as it continues to evolve.
[00:15:45] But if people don't know how to influence that change they just accept what it is.
[00:15:51] I think about it similarly to the sales funnel.
[00:15:55] The people that are at the top where I have the most exposure to the most amount of people,
[00:16:01] they're getting a surface level view of these things. They aren't getting the nitty-gritty details.
[00:16:06] When we go from one-to-many down to a more one-on-one level,
[00:16:12] yeah, I'll tell you all the dirty details.
[00:16:14] Sure.
[00:16:15] I will tell you what's actually going on. I will, you know, get into the mechanics of how we got here.
[00:16:21] I think those details are best served for people who actually want and need to know these things and care.
[00:16:29] I think that especially when it comes to career updates, personal updates,
[00:16:35] people are really looking for the highlights. They're looking for the broad strokes.
[00:16:40] I think if you get into too many details, not only are you going to lose people,
[00:16:44] you risk that exposure. Someone told me a couple of months ago, and I thought it was brilliant,
[00:16:51] on social media people want to see the scars. They don't want to see the scabs.
[00:16:56] So wait until something's in the rearview. Wait until you have that poignant lesson to share.
[00:17:02] Maybe don't post it while you're going through it, at least publicly.
[00:17:08] Almost like the only place to get a positive spin on the news cycle is when you talk about things
[00:17:13] that have strengthened you or made you better, not the, you know, I lost my left hand because,
[00:17:20] you know, fill in the blank. That's terrible. So along those lines,
[00:17:24] it's really interesting that you say that. It makes me think:
[00:17:27] if I'm going from the high level to the one-on-one, I would imagine you're also saying that
[00:17:32] you're probably not having the more intimate conversation and sharing the details still
[00:17:37] in social media. Typically, no. I'm sure somebody can
[00:17:45] whip through all of my stuff and find some of the more personal things that I've put out to
[00:17:49] the world, especially in days gone by when I was a blogger, I was a much more publicly
[00:17:55] available person, but the venue evolved. And I think we did too. And I think we grow
[00:18:01] out of stuff too. I think that social media as a concept has grown out of some of that
[00:18:07] oversharing too. I don't know that we're rewarded as much as we all feel the internal
[00:18:12] cringe of, oh man, if that was me, I wouldn't be writing it.
[00:18:16] Yeah. It makes me think about the news cycles that you were talking about,
[00:18:20] if we're posting highlights. So thinking about shows like Morning Joe or Good Morning America,
[00:18:25] we're getting a glimpse of the news and even then it's only pieces of the news that they
[00:18:31] have curated and want to share with us. Does that directly tie into what we're then seeing
[00:18:38] in social media? I would say so.
[00:18:41] Because another concept behind communications theory is that the media doesn't tell you
[00:18:47] what to think, but they do tell you what to think about. So they're curating this
[00:18:52] experience. They're curating a conversation around a very limited amount of topics. Therefore
[00:18:59] those become the topics of the day. You're not necessarily singing the same song,
[00:19:04] but you're singing from the same choir book. Sounds like the Jimmy Kimmel hashtag word of the
[00:19:10] day, where he sends out a hashtag anybody can use. And then of course it's suddenly the
[00:19:15] number one most-used hashtag for that social media cycle. And then of course it fades
[00:19:21] because it's no longer the relevant or important story anymore because it's been
[00:19:24] replaced by yet another hashtag.
[00:19:28] Exactly. It's a little bit like that. I'm sure you've seen the articles where people are doing big
[00:19:34] data crunching on things like all of the PhD proposals of the last year and seeing how many
[00:19:41] AI buzzwords have made it in there, to measure the penetration of ChatGPT. So
[00:19:47] "delve" is one of those words. People don't delve into things nearly as much as ChatGPT
[00:19:51] thinks that they do. ChatGPT delves into things that ChatGPT shouldn't delve into.
[00:19:57] Yes, that too. That's another podcast. Well, and it's funny you talk about,
[00:20:02] you know, ChatGPT as a research tool and as a way to ask, you know, have I thought,
[00:20:07] have I exhausted all of the categories for this topic? You brought up AI. So,
[00:20:13] you know, we're looking at some topics around generative AI and what are the risks. And so
[00:20:19] I was reviewing some content and I thought it was pretty comprehensive, but I'm like, you know,
[00:20:23] I should plug this question into ChatGPT and see what kind of answers come back.
[00:20:27] And I would say that of the five that were there, ChatGPT had captured those five,
[00:20:32] but it came back with another four, and the other four were substantive. It had come back
[00:20:40] with things like ethics and integrity that were missing from the list that I'd been given.
[00:20:47] And I'm like, okay, this is a really good example of value coming from it versus,
[00:20:53] you know, write this for me. And so I always tell people like, if you're going to have
[00:20:57] it write something for you, you need to share with it some context of like who you are,
[00:21:02] which can get dangerous because it's also going to go out there and search all the
[00:21:06] things that you have been and be careful because it may put all of that into what
[00:21:10] it writes for you. Exactly. And that's where we bring it back to cybersecurity and what that
[00:21:17] looks like and protecting your identity and all the things that make you, you.
[00:21:21] I had a brilliant boss years ago who maintained his most effective use as a leader is at the
[00:21:27] beginning and at the end of projects. And I would say ChatGPT is the same. It is most useful
[00:21:33] at the beginning: ideation, SEO, et cetera. And at the end: proofreading, looking for logical
[00:21:40] fallacies. It is not good at the human element that you and I bring to it, which is how we
[00:21:46] speak, how we present ideas, how we get to the story, how we introduce the human element
[00:21:51] to those projects as well. Yeah, so I submitted a topic to Black Hat.
[00:21:58] I don't know if I'll get accepted or not, but one of the things they had is a disclaimer
[00:22:02] that any submissions that are determined to have been generated
[00:22:09] by AI will immediately be dismissed. And I thought about that for a minute, and I'm
[00:22:13] like, but who doesn't brainstorm ideas with things like ChatGPT now at their fingertips?
[00:22:18] I'm not asking you to write my synopsis of what I'm going to present on. I'm just asking
[00:22:22] it to make sure that I've exhausted the scenarios that would go into what I have in
[00:22:28] my head. It's almost like a second consciousness. Once you give it enough information,
[00:22:34] it does regurgitate back to you what was in your head, but it gives you, like,
[00:22:38] all these different paths to go down that you had never thought about. And I'm like,
[00:22:42] I might've gotten there eventually, but it definitely got there a whole lot quicker than
[00:22:46] I anticipated. But then that means I'm also sharing with a tool, stuff that's in my head
[00:22:53] that could at some point in time in the future be, I think the word is used against me.
[00:23:00] And that's the terrifying double-edged sword of it: it is like a second consciousness,
[00:23:05] which, I agree, is a partner. It's a partner to humans. The dream of any automation since the
[00:23:13] invention of the wheel is that it frees up human intellect to do more interesting things.
[00:23:18] It's what we tell help desk technicians about the integration of AI and machine learning,
[00:23:22] but it's on us to then go do more interesting things that provide value.
[00:23:26] Right, right. Playing video games, you know, sucking at it. Yeah. So to kind of recap,
[00:23:34] I think there were sort of three, maybe four things that I think we've touched on throughout.
[00:23:39] One of them is making sure that we use the social media platforms appropriately. We're seeing
[00:23:45] a lot of not-so-appropriate use. If it's personal, keep it personal. Facebook is more of a
[00:23:51] personal-type platform than, say, LinkedIn. So that was one. The other one was, who is your
[00:24:00] audience? Like knowing who your audience is and being careful that you're providing content
[00:24:06] that's valid to the audience. You want to, you know, they're only going to consume your
[00:24:10] content for so long if it's not relevant. And then there goes your audience. Right.
[00:24:16] And then the third one is I think you kind of said this, you talked about the consistency
[00:24:22] of like how frequently you put stuff out there in order to kind of maintain that brand,
[00:24:28] that identity. I didn't, we didn't talk about this, but correct me if I'm wrong here.
[00:24:32] There's really not a rule on the frequency as long as the pattern is consistent. Is that fair?
[00:24:38] That's correct. And certain platforms have different quirks. YouTube will reward you for
[00:24:44] uploading at the same time, same schedule. But that said, if you upload a YouTube short,
[00:24:53] you may experience a halo effect of increased views on all of your content because the
[00:24:58] algorithm is incentivizing you to use that new communications tool. Newish, shall we say.
[00:25:03] Got it. It's the opportunity of having people click on the link enough times that it raises
[00:25:10] the algorithm, and now you're showing up higher. And we see this with music, right?
[00:25:16] You see the cracking down on rooms filled with iPads that are all playing
[00:25:22] the same song to get, you know, the number of playbacks up so that those royalties can be paid
[00:25:29] out. I need to find somebody to do that for me. Seems expensive, but maybe it pays out in
[00:25:35] the end if you can get up there on the millions of hits. I don't know. Maybe there's an AI use
[00:25:41] for that. Yeah. For those of you listening, please take the advice given just now with a
[00:25:48] grain of salt. So I think we've kind of covered the, you know, be careful with what you post.
[00:25:56] Really, I think, if I was to name takeaways, it's that oversharing can
[00:26:02] be extremely dangerous. And it's not just the impact to you. It's potentially the impact to
[00:26:08] friends, family. And of course, if you're doing a lot of social
[00:26:13] media in the workplace, you could have a significant impact on your organization as well.
[00:26:21] Kelly, anything else that you want to share with the audience, whether it's specific to
[00:26:25] cybersecurity or more of the broader, you know, big tip on appropriate use
[00:26:32] of social media? If I had to boil all of this down, it is being true to yourself.
[00:26:41] I know that there's this idea of bringing your whole self to work. I don't necessarily
[00:26:47] subscribe to that. I don't think anybody wants all of that, but I do think that I think about
[00:26:54] who I want to be in these rooms, in these places and being true to that concept. And
[00:26:59] that concept may only be a portion of myself. I really only share AI headshots if I can
[00:27:06] keep that going because I feel like that adds a little bit of shade in between myself and the
[00:27:14] public as well. So just figuring out what those rules look like for you personally
[00:27:20] can provide a lot of peace of mind. So sort of like, share the scars and not the scabs?
[00:27:27] Indeed, you've got it. All right. Well, for those of you listening,
[00:27:30] this has been an episode of MSP 1337. Thanks, and have a great week.

