Found

Leigh Honeywell, Tall Poppy

Episode Summary

Leigh Honeywell has spent her career trying to prevent bad things from happening to people on the internet. She has spent time at Slack, Heroku, and Microsoft, is well-versed in both the technical and human sides of online harassment, and has seen first-hand how it can escalate to hacking or worse. That's how she came up with Tall Poppy, a platform that helps organizations with a public-facing workforce, like media orgs, actively prevent this type of escalation. The Tall Poppy model turns what would be an unscalable business into a fast-growing startup.


Episode Transcription

Darrell Etherington  0:00  

Hey, this is Darrell Etherington, and welcome to Found. I'm here with Jordan Crook, who is my staunch defender and permanent ally, by verbal contract, which is legally binding in the state of nature. That sounds mysterious, but all mysteries will be revealed when you hear our conversation later on. But Jordan, this is the show that TechCrunch, and me and you, have created to just talk to founders. And we've done a lot of that. We've done more than that, maybe... no, that's not true, I've talked to a lot of founders, but not in this way. We've talked to founders in a very different way, a very personal way, where we get at a lot of stuff that's, like, really just practical, good advice for how to build a business, but also a lot of stuff that's just, like, empathy content for founders, where you just want to listen to it and go, okay, great, other people are feeling the way that I'm feeling.

 

Jordan Crook  0:54  

What's cool about it is, I think, most of our conversations with founders outside of this podcast are very, like, task-oriented, right? It's like, okay, I have to write a story, or maybe you're a good fit for one of our events, or whatever. And it has, like, an end goal, where you're trying to get something out of it to build this other thing. And here, the conversation is the goal. Yeah. So it goes kind of everywhere. And we've talked about how people feel, we've talked about what challenges them, what makes them feel good. We've talked about just, like, stuff that interests them that doesn't have to do with their product or building a company. I've loved it. Like, I like that there's no end goal other than just enjoying that conversation.

 

Darrell Etherington  1:35  

Yeah, it's definitely restorative to our souls, I think, is how I would put it. Sure. Yeah, as your defender, I agree. This week on Found, we speak to Leigh Honeywell, who is the CEO and founder of Tall Poppy, and Leigh is a fantastic person to talk to about, basically, a problem that is getting worse and worse over time. Hopefully, with the help of Leigh and her company, it'll turn around. But basically, she formed this company to deal with the increasing kind of harassment that is targeted at people with public personas that come about as a result of their job. So she works with companies to protect their employees from attacks on, kind of, like, their personal accounts and their personal online identities, which, you know, sometimes go hand in hand with doing jobs like ours, like reporting. So Jordan, what did you think about our chat with Leigh?

 

Jordan Crook  2:26  

I thought it was great. One, she's incredibly smart and insightful, so I feel like I learned a lot. Yes. But two, she comes at things with both a sense of humor and empathy. And I think it's very representative of Tall Poppy as a company; you can see how she's the one that built that product. As people who use it (we do use it at Verizon), it's, like, something that can be very obtuse and very, like, confusing and kind of stressful, and I feel like she really brings that down to this very calming level where you're learning something and you're having fun learning it, and you feel soothed by it. And that is how I would describe our conversation with Leigh.

 

Darrell Etherington  3:04  

Yeah. Like, she really thinks intently and intellectually about, like, why these things are happening, and not just how to fix them. So she really, you know, seems set up to create lasting, long-term solutions. I completely agree. Please enjoy our conversation with Leigh Honeywell. Welcome, Leigh. It's great to have you here. Thanks, it's great to be here. Yeah. And you come recommended by mine and Jordan's colleague, Zack Whittaker, who is our beloved security reporter, one of the best in the business, and a mutual connection. He has nothing but high praise for you.

 

Leigh Honeywell  3:42  

Oh, that's so kind. I've been a huge fan of Zack's security newsletter that he's been publishing for a while. It's a great way to catch up on, sort of, cybersecurity news that isn't Twitter.

 

Darrell Etherington  3:51  

Yeah. And he's not even paying me to do that. This is just, like, a free promo at the start of the

 

Unknown Speaker  3:56  

episode. So he didn't pay us either. At all. Free advertising. We're just, like, fans, but

 

Darrell Etherington  4:08  

also, I mean, we're thrilled to have you here because, you know, your story is very interesting as a founder, but then also like the company that you've built is focusing on a super, I mean, I was gonna say it's topical. But that's like, not fair to say because it's kind of always topical. Right? Like,

 

Unknown Speaker  4:24  

Yeah. There have been jerks on the internet a long time. Oh, wait, wait, hold on. For a few years, at least. Don't spoil the internet for Jordan. She's not aware that there are jerks on it yet. So

 

Unknown Speaker  4:40  

I live in a bubble.

 

Darrell Etherington  4:42  

But Leigh, if you want to give our listeners just kind of a taste of what you do and kind of, like, a high-level overview, that'd be awesome.

 

Leigh Honeywell  4:49  

Yeah, for sure. So my name is Leigh Honeywell. I'm the founder of Tall Poppy. We help organizations, companies, nonprofits, civil society groups, protect their staff, employees, community members from online harassment and abuse. We focus on two main things. There's this unfortunate thing, particularly in the States, where there are, like, no privacy laws, so your data ends up spread across all of these really creepy, unnecessary, what are called data broker websites. If you're living in the States and you've never googled, you know, first name, last name, city name, address, you're in for, like, an unpleasant surprise at just how much data is out there about you. So that's one of our focuses, this sort of personal data, personal privacy problem. And the other piece is account security. You know, anyone who's ever worked at a big organization has been through the, like, click-through training on how to keep the company safe from phishing. But there's this whole other set of attack surfaces, of vulnerabilities, that apply to people's personal accounts. And, you know, as an individual person who uses technology, who has online accounts with various, like, social media and banks, and we were talking about online shopping and online thrifting earlier, right, like, all of these different places have your username, your password, are connected to your various social accounts. And you shouldn't have to be an expert to protect that kind of thing. But particularly if you work in a role like journalism, or you're a public health person dealing with, like, anti-vaxxers, all of these different things that end up with you having a sort of role in the public conversation, you end up exposed and sort of personally vulnerable to this class of attacks targeting your personal digital infrastructure. And that's where we come in. There's this running joke that I ended up being this, like, one-woman helpline, like, oh, you know, so-and-so is going public with this, like, #MeToo situation, and the journalists would be like, oh, well, let's put them in touch with Leigh so that they can talk to her before they go public. That sort of thing, where it's this, like, you know, the friend that you call to get the guidance on how to stay safe. And when it's part of your job to be a person in public, there's this, like, duty of care that an organization or company has to protect you in some way. And that's where we come in. We provide that protection service both proactively and on an incident response basis, sort of when the caca hits the fan, right? You can swear on this podcast, any time. And their little Canadian eyes get all big. Yes, yeah, exactly.
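To make the phishing and personal account security point above concrete: a link that merely mentions a brand you trust is not the same as a link whose domain actually belongs to that brand. The snippet below is a minimal, hypothetical illustration in Python, not Tall Poppy's product; the allowlist and example URLs are invented for the sake of the sketch.

from urllib.parse import urlparse

# Hypothetical allowlist of domains this person actually uses.
TRUSTED_DOMAINS = {"paypal.com", "slack.com", "techcrunch.com"}

def looks_like_phish(url: str) -> bool:
    """Flag links whose hostname borrows a trusted name without being that domain."""
    host = (urlparse(url).hostname or "").lower()
    for trusted in TRUSTED_DOMAINS:
        if host == trusted or host.endswith("." + trusted):
            return False  # really is the trusted domain (or one of its subdomains)
        if trusted in host:
            return True   # trusted name embedded in some other domain: suspicious
    return False          # unknown domain: not a lookalike, just unfamiliar

print(looks_like_phish("https://paypal.com.account-verify.example/login"))  # True
print(looks_like_phish("https://www.paypal.com/signin"))                    # False

Real phishing defenses are far more involved than one hostname check, but this is the shape of the "is this link actually what it claims to be" habit the click-through trainings try to teach.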

 

Jordan Crook  7:27  

We use Tall Poppy, we have to disclose that, right? Yeah. As a customer of Tall Poppy. I remember hearing it many, many times. Absolutely. I'm

 

Unknown Speaker  7:37  

glad you guys said it. Because I'm not supposed to say it. But you're allowed to say it.

 

Jordan Crook  7:40  

We're allowed to say it, we say. I mean, I personally am not, but our corporate parent, or whatever it is. And we hear about it a lot. And they say Tall Poppy can help. Yeah. And it's great, actually,

 

Leigh Honeywell  7:51  

Yeah, no, it's been wonderful to get to work with your colleagues, and Verizon has some amazing, amazing security infrastructure and teammates. It's really been an absolute joy to get to work with your colleagues. They keep us on our toes. They do, like, phishing, phishing tests.

 

Jordan Crook  8:08  

Have you guys clicked on the fake phishing emails? No, we're good at it. TechCrunch is so good at it. But, like, a lot of real stuff, they're like, hey, we really need you to log in and deal with this expense report, and I'm like, phishing. I'm not going to touch it. Yeah, don't send me that expense report one more time. It's fishy.

 

Darrell Etherington  8:25  

So I mean, this serves as notice: nobody even try, because we got it.

 

Leigh Honeywell  8:30  

I'm just having this flashback. So I used to work at Slack, I was on the security team there, I was the third security person at Slack. And we had, like, a new person who joined, I think, HR or sales or something, like, a less technical role. And this poor person, it was, like, their first week on the job, and they sent out a, you know, Google Form or spreadsheet or something to everyone via email, or not everyone, but, like, a few dozen people. And all of a sudden all of these people flood the security channel: was this a phishing test? This is a phishing email. This is Slack, nobody's

 

Unknown Speaker  9:04  

sending us an email. Wow. It was totally not a phishing email, it was actually just a person, socializing. Oh, my god, that's a good reaction.

 

Leigh Honeywell  9:14  

Yeah, you want people that, that sort of, like... that was a big piece of my sort of cultural values at that job and subsequent jobs, and definitely in the work we do at Tall Poppy: you want the security team to be approachable, not to be, oh my gosh, you weren't using a password manager? What were you thinking? Right? There's this, like, feigned surprise, which is a term from this thing called the Recurse Center. Have either of you heard of it? No. Have you ever thought of, like, a writer's retreat, and what that would look like if instead of writing, you programmed? That's what the Recurse Center is. It's this, like, wonderful, wonderful community based in New York. And people go on, like, a one-week or six-week or three-month sort of sabbatical to, like, deepen their practice as a programmer, an engineer. And they have these core social values. And one of them is no backseat driving, like, don't take away the keyboard from people when you're trying to learn together. One of them is no subtle -isms, like, don't be racist, even in a subtle way; be welcoming and equitable to your colleagues, your peers, because everybody's a peer. And one of them is no feigned surprise: oh my gosh, you like vi versus Emacs? Like, that sort of, like, nerd anger,

 

Unknown Speaker  10:29  

right? Like, how do you not know that thing? Right? You should know that thing.

 

Darrell Etherington  10:33  

Right? That's common to fandoms too, right? I think even if you're not like, like a programmer who

 

Leigh Honeywell  10:37  

didn't watch episode whatever of so-and-so, like, you don't know this character, you don't know the name of that spaceship. Like, I'm a huge, huge science fiction nerd, and I am one of those people that has, like, singularly encyclopedic knowledge of certain fandoms, which will remain nameless. But yeah, that feigned surprise is so prevalent in the work of, sort of, computer security. And it is, like, my personal mission to stamp it out by just being, like, relentlessly kind and supportive in everything that we've done. Even with all the stuff, they raise my anxiety so much whenever we do those, like, meetings, or tutorials, or whatever they

 

Jordan Crook  11:17  

are, because they're like, everyone knows who you are, and they're gonna come to your house, and they're gonna, like, kill you, and all this stuff. And I'm like, oh, that's scary. But you should know rather than not know, right? But there's a lot of empathy, I feel like. Yeah, like, I feel like they're really, like, we're here to help you, we're going to, like, hook you up, like, everything's gonna be okay, like, we got you, and just, like, listening to what we have to say is gonna help you a lot. And it always, like, brings me back to, like, you know, my steady 190 BPM, you know? I'm, like, waving my Apple Watch around momentarily. Yeah.

 

Darrell Etherington  11:53  

Is it, like, a thing where, because when you're talking about the personal vector, right, like, okay, your personal life is actually a vector. Is that, like, a recent thing? It feels recent to me, but is it not a recent thing, and it's just that we've only recently come to think of it that way? Or was there a real before-time when people were like, well, work and personal are separate, let's keep those things that way.

 

Leigh Honeywell  12:15  

So here's my latest theory. We used to have this idea of celebrities versus, like, quote-unquote normal people. And with social media... like, I don't quite understand how I ended up with 26,000 Twitter followers. Like, I've just been posting for 12 or 13 years at this point. Fourteen years, oh my god, I've been on Twitter for 14 years. And somehow there's, like, 26,000 people and bots and fake accounts and spammers, and lovely, lovely people who have thought that something I said was funny or interesting and thus followed me. And to a tiny slice of people, I am now, like... I remember when I moved to San Francisco in 2014, I was walking down Townsend, which, I feel like TechCrunch used to have an office right on Townsend, okay, I was, like, walking literally two blocks from the old TechCrunch office. And this woman pulls over in a car and is like, hey Leigh, I love your Twitter. And it was really a little terrifying, but also hilarious, and, like, it wasn't threatening. It was just this, like, sweet person who recognized the pink hair off of Twitter. And it was literally, like, three weeks after I moved to San Francisco. Thankfully, it did not happen again in the five years I lived in San Francisco. But it was this moment of, like, oh, there's this weird kind of micro-fame that is now a thing. And I think that, like, basically every journalist experiences this; I feel like the threshold is, like, 10 or 20,000 Twitter followers, just in general. So weird internet micro-fame is the first part of the theory. The second part of the theory is there's this concept in psychology called a parasocial relationship. If you've ever felt sad when a celebrity died, you've experienced a parasocial relationship. You know, there's this celebrity who exists, and you have feelings about their existence; they do not know that you exist. It's a one-way relationship. But it's a real, like, psychological phenomenon. It's a real experience; you're having real emotions. Within that set of however many tens of thousands of people follow each of you on Twitter, there are people who have that, like, minor fandom of Jordan and minor fandom of Darrell, and most of them will be completely appropriate. It's the whole sort of LA thing of, like, if you see a celebrity in LA, you're not allowed to talk to them, I understand, because it's, like, the etiquette. So most people will follow the etiquette and not be, like, weird if they run into you on the streets of San Francisco or Toronto. But there's a certain percentage of people, a very small percentage, who will not have that sort of, like, appropriate reaction. And now that everybody is an internet micro-celebrity, more people are experiencing this weird failure of the parasocial relationship. And I think that's the thing that's new, right? Whereas there used to just be, like, the actually famous people, who were, like, the, you know, elected politicians and the, like, actually A-tier movie stars, and maybe people who wrote for the top newspapers, back when people actually got, like, physical newspapers, right? But you wouldn't have a picture in the newspaper; they wouldn't know who you are, right? But now, like, a zillion people have 20,000 Twitter followers, or a million Instagram followers, whatever.
So there's this web of these parasocial relationships, and the frequency at which people will have those inappropriate reactions to the parasocial dynamic... specifically, the inappropriate thing that happens is, somebody like Jordan, Jordan tweets these awesome things, and I follow Jordan, maybe Jordan replies to me sometimes, right? And then I'm like, oh, Jordan knows me. Yeah, Jordan doesn't know me from a hole in the wall. But people don't get that, they don't perceive that correctly. There's, like, a parsing error. And some of those inappropriate parsings are, like, the thing that we're dealing with here around online harassment. There's a whole other set of issues, which are more sort of culture war stuff, right? Which is, like, who's getting canceled, and I really don't like the term cancel culture, but that is, like, the way people talk about some of this stuff, like how we define accountability in the post-#MeToo era and all of this stuff. That is another piece of it. But I think a really big piece of it that doesn't get talked about enough is this increased number of vertices in the graph of parasocial relationships, to be kind of mathy about it.
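Her "more vertices in the graph" point works as back-of-the-envelope arithmetic: the number of one-way follower-to-celebrity relationships scales with how many micro-famous accounts exist, so even a tiny rate of mis-parsing produces a lot of bad interactions in absolute terms. The numbers below are purely hypothetical and only illustrate the shape of the calculation.

# Rough arithmetic behind the "more vertices in the parasocial graph" point.
# All numbers are hypothetical; only the shape of the calculation matters.
micro_celebrities = 1_000_000   # accounts past the ~10-20k follower threshold she mentions
followers_each = 20_000         # assumed average audience size for each of them
misparse_rate = 0.00001         # assumed tiny share of followers who react inappropriately

one_way_relationships = micro_celebrities * followers_each
bad_interactions = one_way_relationships * misparse_rate
print(f"{one_way_relationships:,} parasocial edges, ~{bad_interactions:,.0f} likely bad interactions")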

 

Darrell Etherington  16:40  

I mean, I guess if you were a person who was perhaps... if you look around and you see how other people are, like, kind of interacting with folks, and you're like, okay, I have this urge to perhaps do something that is antisocial, but everything else is sort of normalizing me to not do that, because there's a cost to that, right? Whereas online, you would have it reinforced, like, you would look around and see all that antisocial behavior is there for you to see. You can see people harassing someone on Twitter, and you go, oh, okay, I mean, that looks okay. It must

 

Leigh Honeywell  17:11  

be okay to, like, call this person a racial slur or whatever, right? It normalizes these, like, really odious behaviors. And I think there are a couple of different pieces in it. One of the pieces is a platform design question: what are the tools that are available? Like, if I reply to one of Darrell's tweets and call him some terrible word, what control does Darrell have over the appearance of that tweet, versus, like, if I comment on one of your Instagram posts with, like, a string of expletives, you can just delete that comment, right? There isn't this idea of, like, deleting a tweet from the replies; obviously, there's, like, blocking and muting, and there are these certain ways of controlling the sort of user experience. But there are so many of these really interesting questions of, like, what is the design of these platforms? How does it amplify or squelch antisocial behavior? And what are the tools that are available to both content creators, people who have a big megaphone, and also, like, people who are participating in these conversations? And I remember, like, circa 2011, we had this whole real-names conversation around Google Plus, and Google being like, everybody must use a real name, that is the thing, the name that is on their government-issued ID, which is definitely their only real name or real identity. Yeah, right. Like, we saw it with Facebook, too. And Facebook has, to a certain degree, stuck with that policy. And yet, the number of times that I've seen people post, like, horrifying stuff with the, like, comment systems that are logged in through Facebook, right? Oh, yeah. I think, at this point, I really hope that we can sort of put the idea to rest that real names are going to be the thing that saves us from online hate. Like, it's just not happening, right? So then there's the question of, like, what will save us? And, I mean, it's almost a cliche to make this reference, but there's this, like, seminal computer science paper, no silver bullets, that was literally written in, like, the '70s, about, like, the IBM 360. There's no silver bullet. There are no sort of shortcuts to doing content moderation. And whether it's human moderators, whether it's, you know, just sprinkle a little bit of AI on it, you can't actually do that; there's so much, like, subtlety and context to what is hate, what is threatening to people. And so what ends up being the, like, set of solutions, as an individual? How can I interact with this platform? How can I interact with technology more broadly? I sort of come back to, like, what can I control myself about how I interact with technology, my relationship to these sorts of interactions. It's so funny, I feel like, you know, I've been working on online harassment issues for coming up on, like, 13, 14 years, about as long as I've been on Twitter, ironically. And I'm usually pretty good at, like, the, you know, being funny without being... somebody walked up to me at a conference, like, Leigh, I love your Twitter, you're funny without being offensive. And I'm like, that is my brand. That's what I'm going for. I want

 

Jordan Crook  20:16  

to be funny. I wanted like, I want to do that too. How do you do that? Just start

 

Unknown Speaker  20:20  

Well, okay, I

 

Jordan Crook  20:21  

stopped trying to be funny. If we're being offensive.

 

Leigh Honeywell  20:24  

The trick is to punch up, and to be very precise. This is one of my, like, key rules in life: if I find myself like, oh, I know that word is, like, kind of deprecated because it's, like, kind of a racial slur, but it's one that people don't, like, recognize as a racial slur... what is the thing I'm actually trying to address in using this term that I know actually has, like, a racist origin to it or whatever? It's like, oh, I'm trying to express that I want to ensure transparency, or that this person robbed me, right? There are all of these sort of words that people are like, oh, that's, you know, no big deal, but it's actually, like, super offensive, right, under the hood. The answer is always to be more precise. And the same applies to, like, taking ableist language out of your sort of discourse. Like, if you want to say, that thing is crazy, you actually mean that thing is, like, unreasonable. It's always about the precision. And I think that that applies to humor, too: what is the thing that I'm trying to, like, light up here? And being precise about that thing is my, like, life hack of not being offensive. Which, um, it's funny, because sometimes in the sort of discourse... and we have this whole thing going on with, like, Basecamp right now, the tech company that's like, we don't have politics anymore.

 

Darrell Etherington  21:42  

Yes. And depending on when this airs, people could be like, what? But please go back and look it up, because it was ridiculous.

 

Leigh Honeywell  21:50  

It's, you know, for folks who followed the Coinbase thing. It's like Coinbase v2,

 

Darrell Etherington  21:54  

Right. Yes. And luckily, Brian Armstrong chimed in, which is great. It's just great to see him back. Yeah,

 

Leigh Honeywell  22:01  

I had a point and I lost it, deflated by this, this situation. Oh, yeah. So part of the Basecamp thing is, like, it's really hard to have these nuanced conversations in the workplace. Like, don't be racist. Apparently that's a hard conversation. And, you know, I have to sort of give people the credit of, like... I do literally have an undergraduate degree in what I call ism studies. It's a program at the University of Toronto called equity studies; I did a double major in computer science and equity studies. And sometimes, when people are like, oh, what did you do your degree in, I don't clarify that it's equity like equality, not equity like finance. Sometimes I just, like, let them think it. I'm not actually good with spreadsheets. But maybe I just like to think I'm good with spreadsheets. Self-deprecating is always a good way to go, as long as you're not, like, reinforcing stereotypes. How did this turn into, like, a masterclass teaching you how to be a stand-up?

 

Darrell Etherington  23:01  

If you're listening to Found, you're probably already super interested in startups and the overall startup ecosystem, so we've got a great deal for you: we're going to offer you 50% off either a one-year or a two-year subscription to Extra Crunch. Extra Crunch is TechCrunch's premium product offering, and when you go there, you'll get deep-dive interviews with some of the top founders in the industry. You'll get market maps on specific verticals in some of the most exciting areas of growth in startup land. You'll also get surveys of some of the top VCs in different areas, including different geographies. So you can subscribe to Extra Crunch at extracrunch.com, probably the easiest way, or if you're already on TechCrunch, follow the links for Extra Crunch and you'll get a prompt to subscribe. And then just enter that code, that's FOUND, the name of this podcast, during checkout, and you'll get 50% off on either a one-year or a two-year subscription.

 

Leigh Honeywell  23:57  

So the point of the, like, I sometimes manage to be funny on Twitter, is: I had a really, like, negative Twitter interaction recently, where somebody made a joke about death panels, and I was like, actually, you know, they were really doing triage in LA at one point during the pandemic. And it turned into this whole thing. And we ended up, like, having a conversation and, like, de-escalating things, but this person just got, like, really mad at me. And I was like, oh, she was actually just making a joke, and I missed the joke. I completely missed the joke. And it could have totally, like, become a thing. It was one of those moments where the conversation, like, could have blown up, and I could have become the main character of the internet. Oh, yes. You know, the goal of using Twitter is to never be the main character. That's right. Yeah, and today's main character, I guess this week the main character, is Basecamp. And the goal is to not be that. But it was just this reminder that, you know, some of this stuff is actually just difficult, the sort of nuance of online communication. And the failure modes of this difficulty of having nuanced conversations... unfortunately, sometimes it veers into this, like, culture war territory, sometimes it veers into the sort of personalization of these conversations. And sometimes it touches the sort of third rails of, like, oh, you've made this joke, and all of a sudden all the Nazis are mad at you, right? And now you have to deal with, like, that set of people and their particular set of, like, tactics, of, like, yelling at your employer and posting your home address. And there are these sort of playbooks; in many years of observing people getting in arguments on the internet, you sort of end up identifying these playbooks and patterns: this is how the alt-right gets mad at people on the internet, and this is how the anti-vaxxers... they love doxxing people, right? Right. They love photoshopping people's kids. It's awful. Oh, just, like, nasty. And yeah, unfortunately, that's our bread and butter, helping protect people against these different abusive patterns of conversation that sometimes start with the most, like, innocuous nonsense that gets, like,

 

Darrell Etherington  26:17  

blown up. Yeah. And it can be, like, a misinterpretation of the source, it can be... I mean, it can be anything, right. And we deal with it too, obviously, in the newsroom, right. And there have been times, and this isn't the right way to handle it, I don't think, but there have definitely been times where people have said, like, oh, I should cover this thing, and I'll be like, well, maybe you shouldn't cover that thing, because a community is going to specifically target and attack you worse than it would me, for instance. If it's, like, interchangeable, like, we both have time right now, maybe I should do it and just take the heat, right? Which is awful. Like, it's awful that we even have to do that. But it was also a product of, like, kind of, like, seeing this emerge and being unsure of how to deal with it. And, like, it's great that you and other people in the field are actually coming up with, like, real, viable playbooks that don't sideline people, that don't mean people have to, like, stay out of the conversation, which is awful,

 

Leigh Honeywell  27:12  

right? So much of, like, my goal as a founder, in building tools and building services, is around making sure that people who want to participate in public life and public discourse are able to. Because I think we see this... you know, I remember in the 2018 and 2020 election campaigns in the States, there were multiple reports of candidates dropping out because the threats that they were experiencing were so severe. And that is, like, so deeply toxic to the basic premise of democracy, that, like, you could literally be, like, threatened off of the electoral stage. And whether it's that, or the scenario that you identified of, like, choosing who gets to report on an issue, or folks self-selecting out of reporting on things... one of the ways you see that so starkly is if you have two bylines on an article, and it's, you know, a person who's underrepresented and a person who's not, and who does the heat get directed at for, you know, the same piece of writing. Although that can actually be, like, the person who is less marginalized ends up, like, tanking a little bit, tanking being the sort of, like, video game thing: oh, the person with really high HP, and they're gonna, like, go to bat to take the damage so that the other person can, like, sneak through the mission. That's, like, one of the sort of tactics in thinking about things like sharing bylines: how do you elevate one person, maybe, like, have there be support from a community of peers, to be able to, like, spread that damage around a little bit so that it's more survivable to everyone. Which sounds really sort of brutal to put it in those terms, but these are the ways we get through this stuff. These are the ways we make it more survivable. And, like, survival is the word for it, right? Like,

 

Jordan Crook  28:56  

And that makes me kind of sad. Like, it feels like so much of it is not, like, it's not proactive, right? Like, it just feels so, like, we're all just, like, hanging on. Like, I feel like I've written fewer opinion-based articles over time, and a lot of that has to do with my job changing and not, you know, getting to write as much as I would like to. But, like, you just learn after a few times, you know, if you put one out there that's a little bit of a crooked opinion, and you're just like, okay,

 

Leigh Honeywell  29:27  

Or did you make a mistake, like a factual error, and then the whole internet... like, we've had fact-checkers, and some publications don't have fact-checkers anymore for budget reasons, but also people just, like, make sincere mistakes. And it's not fake news to actually just, like, misread something or make a typo in a statistic, right? And then you see people getting fired over that kind of thing. And it's just like, come on, you know, nobody's perfect at doing these jobs. Yeah,

 

Darrell Etherington  29:58  

Yeah, but that's, like, the thing Jordan said, like, we've both seen so personally, right? Like, you were talking about how it sounds grim to describe it that way, but it's appropriate, because you see the effect it has on people, and it's very bad. Like, it's a terrible effect, and it has a tremendous impact on people's lives when they're on the receiving end of that kind of thing, right? I've never heard that analogy before; I think that's a really good one for, like, thinking about the video game squad, and then having, like, you know, the tank character and, like, the defender character, the support characters. Like, that makes a lot of sense for how you kind of, like, distribute roles in a team,

 

Leigh Honeywell  30:34  

I worked with some folks who were coming forward about a, like, serial abuser situation in a sort of esoteric programming community, and the post actually just came out today. And a thing that I thought was really, really powerful: there were these two women that came forward, but there was also an open letter from other people in the community who either specifically corroborated, like, particular pieces, like, this person was awful to this person at the conference, and I was there and I witnessed it, or just said, like, we believe these two women. People talk a lot about, like, allyhood and stuff, but, I mean, it was such a concrete show of support and expression of, like, you know, we believe you, we believe your experience, and we're not going to stand for this behavior in our community. I was a little choked up when I saw it go public. It was just such a powerful statement of, like, we stand behind you.

 

Darrell Etherington  31:31  

This has been great, talking about Tall Poppy and, like, kind of the problems you're addressing. But I do want to get to kind of how you got there to begin with, because it's obviously very needed, you know. Like, how did you decide to create this company and to focus on this problem yourself?

 

Leigh Honeywell  31:46  

So I'd been thinking for a number of years that there was a need in this space, sort of a gap in what was out there in terms of cybersecurity. We've had all this innovation in the last 10, 20 years around, sort of, the cybersecurity of businesses, and, you know, how we build more secure operating systems and all this. I mean, you can't see it on video, but, like, I'm holding up my iPhone: it costs literally, like, a million dollars to hack into this iPhone. And that's, you know, that's an incredible achievement. It costs a million dollars to break into fully patched Chrome, because these companies and these communities have invested all of this, like, money and infrastructure into securing the sort of, like, basic infrastructure of our day-to-day lives. Even think of, like, what happened with Zoom last spring, when the pandemic hit. I think of, like, the Al-Anon groups and stuff moving to Zoom, and Zoom having to deal with this, like, whole Zoom-bombing problem, and, like, you know, the steps that they took, and they did their, like, 90-day security push, and all these things, hired Alex Stamos, who's also ex-Yahoo, Verizon. And at the same time, like, these are all sort of systems-level and corporate, right? It's the email antivirus, or the how to secure your AWS, right? Whereas, you know, I think about, I have my password manager with 400 different accounts in it, and, like, it's literally my job to be a cybersecurity expert, so I've, you know, put some thought into how to secure all of those different systems. But the average person who gets an email claiming to be from FedEx, and they have to fill out this form, and it wasn't actually FedEx, right? Like, how do we, as individual humans who are just, like, using technology, how do we keep ourselves safe? And the sort of contextual connection to the problem of online harassment is that, like, I can't stop people from yelling at you on Twitter. That's, like, not actually in my control. But one of the most severe ways that the yelling escalates is by hacking into your online accounts. And when I say hacking, it's actually usually just the passwords that you used in the past. You know, whatever password you had on LinkedIn in 2013, like, miscreants definitely now have that password. So, you know, what are the places where some, you know, neo-Nazi that decides that he doesn't like Jordan, how is he going to target her, and how can we keep her safe? What are the things that we can do, at an individual level, to keep people safe? And, you know, things like password managers, two-factor authentication, these are the tools that we have, but they feel sort of opaque and inaccessible, I think, to a lot of people. And our focus is really, like, making them more understandable, helping motivate people. Because I think a lot of this comes down to people sort of freezing up and thinking there's nothing I can do about this, like, intractable problem, but it's actually, like, quite tractable. There are, like, very specific, concrete things that we can all do that make us safer. In the same way as, like, you've got to teach people how to wash their hands properly, apparently, like, and now my Apple Watch yells at me if I don't wash my hands for a full 20 seconds, which is a long time.
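The "whatever password you had on LinkedIn in 2013" problem is easy to check for yourself. As a minimal sketch (a generic illustration, not Tall Poppy's tooling), the public Pwned Passwords range API from Have I Been Pwned reports how often a password appears in known breach dumps without the password ever leaving your machine: only the first five characters of its SHA-1 hash are sent.

import hashlib
import urllib.request

def breach_count(password: str) -> int:
    """Return how many times this password shows up in known breach corpora."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    # k-anonymity: only the 5-character hash prefix is sent to the API.
    with urllib.request.urlopen(f"https://api.pwnedpasswords.com/range/{prefix}") as resp:
        body = resp.read().decode("utf-8")
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix:
            return int(count.strip())
    return 0

if __name__ == "__main__":
    # A password reused since a 2013-era breach will show a very large count here.
    print(breach_count("password123"))

A nonzero count is exactly the cue she's describing: retire that password everywhere, replace it with a generated one from a password manager, and turn on two-factor authentication.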

 

Unknown Speaker  34:52  

It's so annoying. When I'm, like, doing the dishes, it goes off. I'm

 

Jordan Crook  34:57  

like, yeah, rinsing stuff. Yeah, exactly. And you're like, oh man, not now.

 

Unknown Speaker  35:03  

I'm unpacking these, like I just moved. So I'm unpacking boxes and the crinkly papers

 

Jordan Crook  35:08  

Tell me to stand one more time. Like, I hate my Apple Watch. Like, I hate it, but I love it. Yeah, same. It's a very... such a

 

Leigh Honeywell  35:17  

relationship. Such a silly digression. But my Apple Watch is my replacement for Windows Phone, because I'm still, like, mourning the loss of Windows Phone. Because when I picked up my Windows Phone, it had my calendar on it, and I can't live without my

 

Darrell Etherington  35:32  

calendar. It's very beautiful. Windows Phone I really liked. I liked the tiles. Best in this

 

Jordan Crook  35:36  

world. Yeah. Sometimes I like to go back and watch old episodes of Scandal because they use Windows Phones.

 

Leigh Honeywell  35:42  

So did my absolute favorite show in the whole world, The Good Wife. Oh, yes. There were, like, several seasons where they were 100% Windows Phone. They really

 

Jordan Crook  35:51  

dug in on that. They really did. They were like, we're gonna get Windows Phone out there. I mean, in every show, you'd see it. It was a valiant effort. It was a good try. Yeah, this

 

is a good conversation. Wait, hold on, I want to go back to the Tall Poppy thing, because I feel like one of the genius bits of it is that the product is really focused on, like, personal and individual accounts, but you have gone through businesses, yeah, in order to do that. Because no one... like you said, most people say, like, oh, you know, I might get hacked, and that would be scary. Or really, it's not even I might get hacked, it's like, oh, something really terrible happened to me on the internet, I should probably do something about that. And by that point, it's kind of, like, too late, and it really sucks, and it feels overwhelming, and they're stressed. And, like, no one thinks proactively about this stuff. So you've essentially found a way in

 

Darrell Etherington  36:35  

But no one's also going to spend their own personal resources on it, right? Like, they could, but it's not like

 

Jordan Crook  36:40  

something bad's happened. Yes. And then you're like, oh, I'll spend a lot to like, never let that happen again, because that was literally terrible. Yeah,

 

Leigh Honeywell  36:46  

There's definitely this huge user education piece around, like, how do we get individual consumers, individual humans, to be thinking proactively about security? And I think the sort of medium-to-longer-term version of that is going to end up having to be some sort of, like, regulation, or some sort of government meddling and interference, because it is this, like, public health-level issue, whether it's account takeovers or password reuse, all of these things. But yeah, the sort of key insight that we had was, you know, if you're facing harassment in the course of your job, in the course of just, like, doing your job... whether you're a journalist, or, developer relations is one of the ones where we saw just a lot of it: it's literally your job to, like, go to conferences and tweet a lot about whatever tech thing your company does, and people were dealing with just, like, the creepiest stalkers and, like, bizarre harassment, and people getting fixated and stuff like that, and feeling so vulnerable. And that was our sort of insight: let's build something that companies can provide to their employees, rather than relying on sort of the individual, and actually sort of collectively protect a set of people for whom this is an increased risk. And often it ends up being sort of a media thing, newsrooms is a big one; sometimes it's other kinds of content production, like movie studios. Another surprising one that we've run into a couple of times in recent months is actually, like, health-related, whether it's the sort of anti-vax thing happening, and then the masks, the 5G truthers going after, like, really, extremely mundane companies that you would not expect, but that are, like, installing cell phone towers, right? Stuff like that, right? Where, like, all of a sudden there's this set of intersections of the sort of existence of your organization and people on the internet with too much time on their hands, who are stuck at home and are getting, like, real bad. Yeah, right. And being able to work to protect folks who are in that sort of convergence of circumstances has ended up being a really, like, powerful part of what we've been doing,

 

Darrell Etherington  39:02  

And you could see how people would take, like you said, like, all that time, and just, like, find new targets for their delusions, I guess, right? But, like, I can see it personally, because, and this is hardly professional, but, like, Jordan knows this very well about me: I buy too many things online, but, like, always, like, ridiculously expensive things that he does

 

Leigh Honeywell  39:22  

not need. The best thing about moving back to Canada, there's just a lot less stuff on Instagram that I want.

 

Unknown Speaker  39:29  

It's just like, I haven't bought yoga pants off Instagram since I moved.

 

Darrell Etherington  39:34  

But see, I could still find it even though there's less. But, like, I'm always looking for a different novel thing, right? And then if you added more time to it, right, you become even more... you're like, well, let's go deeper down this, and, like, find just, like, a totally random small shop, and normally for me that expresses itself as, like, great, I support a local merchant or whatever, right? Like, it's not a bad outcome necessarily for them. But, like, for people who have these, like, antisocial tendencies, that means that there's a whole host of new targets, like, no longer... even the micro-fame thing you're talking about doesn't even apply, or it's like, I'm gonna apply some twisted logic to find a new novel target for my kind of, like, urges or whatever, right?

 

Leigh Honeywell  40:13  

Yeah. And I think, like, so much of this comes down to control, too, right? If it's not the sort of micro-fame dynamic, it's, like, a person who's dealing with the reality of the past year's awfulness. Often folks are looking for, like, a thing that they can control. And there's a set of people for whom it's, like, I'm gonna make this person feel bad, I'm gonna, like, be mean to this person online, and that's gonna make me feel like a bigger person. It's a way of maintaining some kind of tiny amount of control over the universe, in the same way as we get people, like, yelling at Target employees because they don't want to wear a mask or whatever, right? Like, people are lashing out because they have so little control over their lives right now, because we're in a global pandemic, excuse my French. Like, as awful as it is, it's, like, an understandable sort of psychological failure mode: you're gonna, like, lash out because you feel like you don't have control. It's so exhausting to try to always, like, respond with compassion. And I have had a few friends on Facebook who've gone off the sort of deep end of anti-vax and stuff, and trying to... it's, like, trying to maintain a connection to someone who's in an abusive relationship without sort of just being the dumping ground of, like, my partner did this terrible thing to me again, and being like, oh, I'm sorry to hear that, let me comfort you. Versus, like, that sounds really terrible; I'm going to keep a connection, but I'm not gonna, like, enable by being sort of a sink for these bad feelings. And I think when folks get down that sort of radicalization path, whether it's anti-vax or the, like, QAnon stuff, the amount of labor required to de-radicalize someone who's gone down that path is, like, it's shocking how hard it is. Yeah. And so you end up having two things be really important. One, noticing the early signs and sort of inoculating people, the sort of, like, hey, you know, there's this misinformation going around; like, if someone tries to get you to read this thing, it's probably this agenda, like, what is the underlying agenda here? And then there's also the sort of speaking to the audience, right? Early on, someone I know, actually, from Toronto posted Plandemic on Facebook, and I was like, hey, this is really not cool. Not cool. Like, people are actually dying. It's not 'not fatal,' there's actually a ton of people dying. And there's that pushback, sort of, like, how do you maintain the lifeline, but also make sure that you're sort of pushing back to say, like, we can't normalize this, without wasting your own personal resources as well. Not, like, that it's a waste, but, like, we only have so much we can give to things, right? You know what I mean? Like, we only have so much energy. We know from the research that there's a very narrow set of circumstances under which de-radicalization ever works. Yeah. Right. So that's where, like, you know, this person in Toronto that posted Plandemic, like, we weren't tight, but we have a lot of social overlap, so I needed to just say, like, hey, this is not cool, almost for our sort of common friends more so than for her.

 

Darrell Etherington  43:20  

Yeah, I mean, I'm sure that everyone listening to this has had similar interactions. I personally had the experience where you're like, I don't know what to do. Like, there's nothing additional I can do here that could help.

 

Leigh Honeywell  43:31  

And it's so heart-rending, because I know who that person used to be. Yes, exactly. And it's like there's this alien speaking in the shape of your friend who you used to, like, go to raves with in, like, the 2000s. Yeah,

 

Darrell Etherington  43:44  

Darrell loves raves. I just love going to raves and I want to go back to the raves. I've never been,

 

the closest I've come is probably TechCrunch, like, parties or events or whatever. Which sometimes... oh, that's so sad. I just see you being, like, the tallest guy, but, like, not having it.

 

I did want to talk, Leigh, like, specifically about raising money for this. Was it something where you had to do a lot of convincing, or did you find people were like, oh, this is something that is definitely a problem and is definitely investable?

 

Jordan Crook  44:25  

You have a pretty sick resume. Right? Yes. Like

 

Leigh Honeywell  44:29  

that. I mean, that definitely, that definitely helped. I think the folks that are on our cap table, by and large are folks who you know, one believe in the existence of the problem. I think that's a key thing where there's a set of folks who are like, man, maybe it's actually just fine that people are jerks on the internet. Like that's just inherent to the internet. VCs probably

 

Jordan Crook  44:47  

go through that micro-fame, para... para... what was it again? Did I say parasite?

 

Leigh Honeywell  44:53  

Right, the parasocial relationships, yeah. I feel like VCs have, like, a particular version of it that they experience, where you're not just micro-famous to your 100,000 Twitter followers, but many of your 100,000 Twitter followers want something very specific from you, which is a

 

Jordan Crook  45:10  

different dynamic. They're not. It feels like no one's really getting all that mad at VCs on Twitter, except for journalists, which is why we're all blocked.

 

Unknown Speaker  45:20  

I feel like the man, this is the margins of the last couple minutes are not sufficient.

 

Leigh Honeywell  45:28  

Yeah, I mean, I've actually said this in a couple of, like, Clubhouse conversations and stuff: I think a lot of VCs have this expectation of a very particular kind of captive trade press, and that is the extent of their understanding of journalism, right? And when they are faced with, like, even mildly adversarial press, it's, like, very confronting. Yeah, like, who is this? Who does this person think they are, reporting on my portfolio company? Right, versus the sort of, like, recycling of PR pitches that various outlets over the years have been... were you gonna say it? Oh, and it's fine, TechCrunch has done it; TechCrunch is very much not alone in doing that, right. And a big part of the dynamic of the Valley media ecosystem is this, like, sort of hagiographic 'the founder' thing. Yep. And VCs are part of that, and founders are part of that. And when you don't have sort of challenges to that power dynamic, it can end up being pretty toxic, I think. And that's how you end up with, instead of, like, people wanting to engage in good faith, like, blocking journalists on Twitter or Clubhouse or whatever.

 

Darrell Etherington  46:45  

Yeah, at the slightest provocation. But I definitely think it's like, initially, it was kind of, like, a hobby, almost, to cover the Valley, right? Like, especially the startup-focused stuff. Like, you know, Arrington over on our side did it, and then I used to work for Om, who kind of, like, followed Arrington and did the same thing. And, like, then it became mainstream. And I think when that transition to mainstream happened is when the break happened, like, people were just like, oh, wait a minute, like,

 

Jordan Crook  47:13  

And all the tech companies grew up, too. Yeah. And then when they grew up, they also became part of, quote, like, the mainstream, with the good and bad that comes with it, right? Like, they weren't all little darlings anymore, they were the Facebooks and the Googles. But they still deeply self-identified with this, like, scrappy underdog thing.

 

Leigh Honeywell  47:32  

Yeah, right. Like, you know, 'don't be evil,' or the, you know, fishbowl office in Palo Alto at Facebook HQ, right? Like, all of these things. So it's an identity thing, rather than being, like, a rational part of the conversation.

 

Darrell Etherington  47:47  

It's totally rational. And I know so many billionaires, personally, who still have that mindset, and you're just like, that's insane, what are you talking about? There was one... this is, I'm not gonna name names, but I'm gonna describe the story in such detail that if someone would like to go find names, they're able to. I saw a hundred-millionaire, let's say, recently post on Twitter that they were being bullied by a large brand, because they were about to start a competing company, and that brand was, like, blocking their mutual suppliers. And they were like, go, go, underdogs. And I was like,

 

Leigh Honeywell  48:24  

Is it bullying if it's, like, a commercial transaction? Like, what? Words have meanings. Yeah,

 

Darrell Etherington  48:31  

Words have meanings. And also, the person in question could have bought this competitor, this giant company, 100 times over, and then just shut them down or put their name on the building. Like, it was totally absurd, and it was like, I don't understand how you can get away with this in your brain. But it's definitely a thing, like, this is a thing that happens frequently. I mean, they ingrain that identity and they don't abandon it, they don't evolve it, right? They're just like, yeah,

 

Jordan Crook  49:00  

I feel like we're making it hard for me to book people for Disrupt right now. I think we should circle back.

 

Unknown Speaker  49:06  

I'm getting excited. The semiotics of the Valley's relationship with the press... that's a whole thing. Anyway, so please come on the show.

 

Jordan Crook  49:17  

Yeah, VCs love to have you in the truck. Great. But obviously, not all VCs are like that, because people invested in you. Right, like and understood the problem. And

 

Darrell Etherington  49:27  

No, it would be unfair to even paint all hundred-millionaires and billionaires with that brush. They're so good. Yeah, of course.

 

Leigh Honeywell  49:34  

Yeah. And I mean, I think it actually speaks to the, like, very different motivations that people come in with, whether it's angel investing or seed stage or, like, quote-unquote VC. You know, there are a lot of folks for whom it is the sort of, like, they want to identify the next Uber, and they are not interested in the sort of moral and ethical nuances of what it means to disrupt, and, you know, who gets harmed in the process, and all of those things. Versus... I always do the, like, so-corny thing, and I'm like, what is the slightly gentler version of capitalism, where, like, maybe we're actually doing good here by trying to support people who are facing online abuse, often in the context of their work, and often in the context of being marginalized themselves. So it's complicated. It's definitely, definitely complicated.

 

Darrell Etherington  50:32  

All right, Jordan, so that was a heck of a conversation with Leigh. We were huge fans of that conversation. I was just, like, sitting back and trying to absorb... so much of it was, like, it was kind of like being back in university, in, like, a very good way, in the best possible way.

 

Jordan Crook  50:50  

Like your favorite class, for sure. And, like, I didn't feel like I had a lot to add, but I felt good about it. Like, I was like, that's a good thing in this situation, because I'm here to learn, and she's a great teacher. I do feel like my anxiety levels went up and down as we were talking. She's very soothing, but we were also talking about topics that were relatively stressful. But overall, I feel good. I'm glad we did it, and I feel better for it.

 

Darrell Etherington  51:16  

I think she does a really good job of sort of, like, also seeing it from the perspective of even people who are on the other side. It's amazing, the degree of empathy that she has, and I think that's what makes her so good at tackling this particular challenge. Yeah, I'm envious of it, to be honest. Yeah, I have no empathy, which is why we're the apathy twins over here. Why do you hate me? Why do you want to listen to this podcast? But seriously, do come back next week. And also, go on to iTunes and leave us a five-star rating to celebrate just how much you loved the antipathy that we enjoyed and shared, and the great conversations, the great energy that our guests bring to the show, which we lack. Do it for them, seriously. Found is hosted by myself, TechCrunch news editor Darrell Etherington, and TechCrunch managing editor Jordan Crook. We are produced and mixed by Yashad Kulkarni, and TechCrunch's audio products are managed by Henry Pickavet. Our guest this week was Leigh Honeywell, co-founder and CEO of Tall Poppy. You can find us on Apple Podcasts, Spotify, or wherever you get your podcasts, and on Twitter at twitter.com/found. Also, you can email us at found@techcrunch.com. Thanks for listening. We'll be back next week.