Trust as Infrastructure | Bryan Cantrill | Monktoberfest 2025



Software runs on trust — not just in the code, but in the people who write it. In this riveting Monktoberfest talk, Bryan explores how accountability, transparency, and human fallibility shape the systems we build. Drawing on decades of experience and on engineering history, he explores the critical role that trust plays in how and what we build.

Transcript

Excellent, thank you, everybody.

    [applause].

     >> You know, it’s kind of funny: Rachel jokes, and she said this in her Monktoberfest talk in 2022, that her only rider for Stephen was that she not follow me. You should try following you for a change!

    [applause]

     And I gotta say, the three presentations this morning have been exceptional. All of the presentations, as always, have been exceptional.

     Stephen, thank you so much for having us here.

     So I am going to tackle some themes that we’ve heard a little bit already, but before we do, I want to flash back to the last time I presented at Monktoberfest. It was 2023, and I have presented at Monktoberfest a lot, and I think I may have pioneered a first in my Monktoberfest 2023 talk: my talk so enraged part of the internet that I actually got a reaction video. My talk was an anti-doomerist talk, which, by the way, two years later I absolutely stand by. We have doom; it is of the AI-assisted variety, but not the AI-initiated variety. And this is a reaction to my entire talk. Look, that’s an hour and five minutes long, and my daughter, now 13, then 11 or 12 when this came out, was just gobsmacked that anyone would have such a low opinion of their own time that they would spend it on, like, a play-by-play commentary. She’s like, I really want to watch this. I’m like, it’s less exciting than you might think. So we sat down and watched it together, and she’s just very excited: I just want to know more about this person. But we don’t get too far into the talk before she’s like, Dad, he doesn’t seem to be listening to your talk. And it’s like, no, he’s definitely not. And another thing that was interesting about this, and I think germane to other talks we’ve seen: this is a prominent doomerist, and he’s fixated on IQ. In the first 50 minutes he must have said IQ a million times, and she says to me, what is IQ? And it’s pretty amazing, you don’t realize how far one can progress through life, and it’s like, oh, wow, you don’t know IQ, and unfortunately I had to be the one to explain it. I know how she’s going to take it — this is my daughter, by the way — not well. I know that the messenger is going to be, if not shot, certainly shot at. So I explain what this is, intelligence quotient and so on, and, like, how do you measure it, and, like, oh god, that doesn’t make any sense at all, are you sure you’re explaining this correctly? And here we go.
This is actually very predictable: I don’t understand why anyone would believe that; why are you putting a number next to intelligence? People are intelligent in many different ways. You’re wise beyond your years; I guess I’m not doing anything wrong, I’ll take that. I’ll also say, if she hadn’t spent so much time explaining how bored this person must be to make a reaction video, I was very tempted to make a reaction video to his reaction video, but I think that would have been even — but I will say this: he fast-forwarded through the technically dense parts of the talk. There’s actually some real technical density in this talk, where I explain why the act of engineering is actually more than just intelligence, why you need these other attributes, and I talked about some very concrete things that we have in developing a computer; it requires more than just intelligence, as it turns out. And he literally fast-forwards. And you’re like, bro, bro, that’s the whole talk. But moving along.

     Another kind of tradition is that Stephen gives away my first couple of slides in the intro, and this time he said it’s not a tweet, and it wasn’t. This is the DM I sent to Stephen. I said I’ve got a conference, which will remain nameless, and we went back and forth with the idea, and I submitted the title and the abstract, and as it turns out they decided they didn’t want the talk after all, and the organizers said, maybe this is more of a Monktoberfest talk!

    [laughter]

     And I still don’t know who is being insulted in that. I don’t know if you’re being insulted, I’m being insulted, or we’re being insulted together. The conference organizer may be insulting their own conference; maybe the conference organizer is praising the two of us. I actually don’t know. Maybe it is more of a Monktoberfest talk. And I think it is, certainly on hearing every talk of the last day and a half. I want to talk about trust. But first I want to talk about trust in the digital infrastructure we’ve built, because that’s what I’ve spent my career doing. Not all of you have, but many of you in the room have spent your careers building software infrastructure: the compilers, the databases, the operating systems, the distributed systems. And this is the stuff, by the way, that’s never done. Please never tell a young person that these problems are solved, because they’re not. I had a well-meaning person who was vastly senior to me when I was 22 years old try to tell me that I shouldn’t go into operating systems because they were done, and I would have taken the generational lottery ticket and cut it up if I had listened to that person’s advice. So this stuff is never done, and the software needs to be robust, secure, available, secure again. And we project that software into the future: when you deploy infrastructure software, you’re not evaluating it for today. You’re evaluating it for not just tomorrow, but next month and next year and years from now, potentially.

     You need to be able to project it into the future, project its competence into the future. That projection of competence into the future is trust. You need to trust the software that you’re going to deliver. And there are lots of technical elements of trust. Is the software correct? Does it actually do the right thing? Does the compiler generate the right output? Is it reliable? When the system begins to fail, how does the software cope with those failures? Of course, is it secure? How does it fare against an adversary? Can I understand, can I verify, how it works? So these are all technical attributes of software. But importantly — and there’s a lot to be said technically about these things — they are all actually rooted in the people that make these things.

     So we have, for example, at a computer company, a hardware root of trust. That hardware root of trust has private keys on it, keys that you can’t get to. It’s secure silicon. Those keys are ultimately rooted, though, with us. You’re trusting us, and the actual root of trust of every Oxide computer sold is actually sitting under armed guard in a safe deposit box at an unknown location. That is where we actually have the secret that we needed to generate. There’s a whole ceremony, literal technical term, yes, we wear robes, there’s a whole ceremony to actually do all of that, and we make that transparent, but you are trusting that we have done that correctly. So all of these things have that root in human trust, and wow, human trust is a lot messier. Because with a technical artifact, you can go look at it, you can study it. It’s not going to move. You can study it to your heart’s content. I loved the presentation from Ashley, the neuroscientist: it’s a totally synthetic system and we love to think of it as a biological system, but wow, do they have a much harder job than we do. We’ve got these entirely synthetic systems. They’re actually static. Humans are not this way. Humans are dynamic, we’re fickle, we’re capricious, we’ve got, like, new ideas that enter our heads. We’re very fallible. We’re vulnerable. Ironically, our trusting other people constitutes our vulnerability. How do we trust something as unpredictable as a human?

     Well, fortunately, we humans also exist in a society. Right? This is very important, because we have accountability.

     We can be held responsible for our actions, and that sense of social responsibility at some level is important to all of us, because there are social consequences for our actions.

     Now, those consequences vary widely and sometimes it feels like maybe there aren’t consequences for this person’s actions, they’re certainly acting this way, but we all, writ large, have consequences for our actions.

     And accountability means that you’ve got responsibility and ownership. Unfortunately what that responsibility is, that itself gets really, really murky.

     Accountability is not necessarily punishment, certainly. It’s not even justice.

     Accountability is being able to say this person did that thing. They made that decision, and we can understand why.

     We can disagree with it. We may completely disagree with the rationale, but we can understand why, and if we as a society say we disagree with this, there will be societal consequences. Those societal consequences may be that we don’t buy your product; those consequences may be that we put you on trial, determine the thing that we think you did, and then incarcerate you, take away aspects of your freedom. So we as a society decide what those things are. But again, there’s a fine line. You can get away with a crime, right? There’s not always justice. But at a minimum we deserve that rationalization. So, as an aside: computers. Not accountable. Can’t be held accountable. The history of this is really funny. This is from 1979: a computer can never be held accountable, therefore a computer must never make a management decision. The origin of this is borderline unknown. The person who tweeted it knows where it came from: it came from their father’s basement, in a stack of IBM materials, the rest of which were destroyed by a flood. Simon Willison has a great piece exploring this. What’s amusing is, if you actually look at this image, there’s very clearly something written on the back, and I said, this is great, I can get ChatGPT to decipher this. It deciphers it completely, and wrongly. But this is from 1979, and again, we don’t know the origin of it, but boy was it true in 1979, and it’s certainly more true now. In terms of computers and accountability in 2025, I don’t know how many people saw this, but there was a rather notable example. At this point you ask: is this satire? Is this The Onion? It actually appears to be real. A very prominent advocate of vibe coding, who was vibe coding his app and was just insufferable on the internet about how amazing it was, woke up to this: The system worked when you last logged in, but now the database appears empty. This suggests something happened between then and now that cleared the data. The thing that happened is that the autonomous AI destroyed the database.

     And he — I mean, this is just extraordinary, because this is how you know it has no accountability: it explains that it destroyed the database, and he says, like, why did you destroy the database? I told you, don’t destroy the database. In fact, you did this last time and I said you can’t make any changes. And it says, I understand that you’re not OK with me making database changes. I violated the user directive that says, in caps, no more changes without explicit permission and always show all proposed changes before implementing. I don’t know what you call this. It’s not — I guess it’s honest? We’re in this, like, other realm of mirror-neuron impairment, because it doesn’t have mirror neurons, and that’s why it’s mesmerizing. And he asks: how bad is this on a scale of 1 to 100? And it’s like, 95 out of 100, this is catastrophic, here’s why it’s 95, and then it does the typical AI thing of, I’m going to give you this entire rubric and break this down. Like, analyze my own actions? OK: computers don’t have accountability. They don’t have accountability. This isn’t a person. We anthropomorphize it, but it’s not a person. In fact, who did take accountability for this? The CEO of Replit. And again, that’s who we hold responsible. The CEO is responsible for the destruction of this database. It’s not AI. It’s not the computer program. It’s not the software. So, just as a reminder, this is really important. But then in terms of how we think about accountability: we want to hold one another accountable, and what does that mean? Accountability and trust have a really delicate balance. If you have no accountability, we rely on benevolence.
It was really interesting to talk to an attendee, who will remain nameless to protect his former employer. I was asking him, when the computer did a wrong thing, how did they think of it? And they kind of think of it as a natural disaster: like, God is mad. God sent a flood.

     And faith in benevolence is a little scary, especially when we see there’s no accountability and you don’t have the benevolence.

     But if we go to the other end and really overemphasize that accountability, we undermine trust. Stephen trusted us all to get a Covid test before we came in the room. The accountability was that we showed him a photo. It could have been easy, I mean, there are lots of ways to doctor that, and there are all sorts of things Stephen could have done to ascertain it, like making us take the Covid test in front of him. But that accountability needs to be held in balance with trust, and Stephen trusts us, and we trust ourselves.

     So this is a constant balance, and when you’re thinking of fostering trust, you also need to think about what are the systems of accountability and how do we keep that in balance and that’s because we are human beings and we want to be trusted. We don’t want Stephen up in our grill about how we did the Covid test and doing it in front of him. We want to be able to do that in our hotel room.

     But we want reciprocity. You know, treat other people the way you want to be treated. Feels like a super straightforward one, but I think we need a societal review on that golden rule. But that reciprocity is really important when it comes to accountability.

     But trust is larger than just an individual. Individual trust is really important, but we also have collectives, groups of people, and we can trust those things or not trust those things. Rachel spoke very eloquently about a company not loving you back, but we do and can trust an organization, or have that organization trust us back.

     And so we’re going to lump this into three tiers. You’ve got institutional trust: trust in the entities that bind all of us. Governments, and not just a government: you’ve got state and local governments, the federal government, and so on, the way that we govern ourselves globally. These are entities that bind all of us, that we have to trust in, and that we do trust in.

     Organizational trust: trust in entities that are effectively self-advancing. Like, we get it, you need to be profitable, but we can trust certain elements of that organization, say. Or you’re a foundation: we get it, you need to feed yourself, but we can trust elements of that organization, or not. And then community trust, which I think is really important. We’ll expand on that one in a little bit.

     And accountability and reciprocity really vary across all of these. Actually, as a quick aside, you can view the rise of open source as an evolution in trust. So I know all times are bleak, and these times are particularly bleak, but there are dimensions in which, honestly, the 1990s were really, really bleak. For a software engineer coming up in the 1990s, it felt really suffocating, because we lived in a monopolistic world, and we had a company, Microsoft — I get that for you Millennials this doesn’t land. It really hit home when we were going to hire someone, at a previous gig, this was like 2014, 2015, and they were looking at offers from us and from Microsoft and Facebook. And he took the offer from Facebook. And as I told my younger Millennial colleagues, at least he never went to Microsoft — Microsoft is fine. He went to Facebook, and that’s disgusting; I had higher expectations for him and I’m very mad at him. And it was kind of an early indication that Microsoft’s transgressions have been, I guess, forgotten. But in the early 1990s there were a lot of transgressions, and there was no accountability, and it felt like this was the end. And then open source erupted, and it erupted because people demanded that accountability. They wanted to be able to trust entities, and they were able to replace organizational trust in an organization that was not accountable, Microsoft, with community trust. And then you can build organizations on top of that community, but it completely changed the way we think of trust with respect to software.

     So these things can change. They can change, and they can change in a revolutionary fashion.

     Let’s look at institutional trust in America. You may think, God, it feels like institutional trust in America is in decline. You’re right. Institutional trust in America is in decline, and there’s one thing that more or less all people in America agree on, which is that we don’t trust our government, and that’s really unfortunate. That also makes things really unstable, because people who really disagree about what the concrete problem is both agree that we can’t trust our institutions, and that is what is thrashing us all over the place.

     I’m not going to say more other than that institutional trust is at an all-time low, I think it’s fair to say, or at least a low that rivals all-time lows. My mother graduated from college in 1968 and loves to remind me of what 1968 looked like, rightly. There was a draft; we were fighting a foreign war; she knew people who died overseas in a war they didn’t believe in. So institutional trust has been at lows before. But we are at a real low.

     Don’t know how to fix that one, yeah, elections matter, but moving right along.

     Organizational trust, which is to say trust from within an organization, or the degree to which an organization is trusted outside of itself, first of all varies widely: there are trustworthy organizations and there are organizations that you trust less. It always gutted me a little bit that if you ask Americans which brands they trust the most, it’s the ones that are most profitable. Those are the ones you should kind of trust the least, actually. Even in the bad old days of the ’90s, Microsoft was the most trusted brand. The way we trust organizations, though, is beyond just their own profitability, right? There are different ways we can trust organizations. They do vary widely, and it is much easier — this is true of all trust, but especially organizational trust — it is much easier to lose trust than to gain it. Kudos to Microsoft: it really had lost trust with a lot of people, but it’s kind of gained it back. There are other companies that have done it with a reset. In my 2023 Monktoberfest talk I maybe hit Uber kind of hard. It’s maybe built some of the trust back. I don’t trust it. I still use Lyft.

     I think it’s fair to say, especially given Rachel’s talk, I think we can say that organizational trust is fraying in 2025, to put it mildly. And I am really worried about our degraded institutional trust spilling into our organizational trust and degrading our organizational trust. Often for good reason.

     Like, often there’s a reason that that’s degrading.

     Community trust, on the other hand, is where we can find a real kind of sanctuary. I couldn’t find anything on this — I’m sure folks have studied what is happening to community trust, and I’m sure there’s a degree to which loneliness is growing and so on — but it feels like, broadly, community trust is in much better shape than organizational trust or institutional trust, and there are a bunch of reasons for this. I loved the talk on roller derby — I think it was Amanda who gave it, right? And what was the appeal of roller derby? It was a community that shared my values. And although I still need to watch a lot more roller derby — and I have done this before, I have tried to explain sports to people that aren’t actually watching it — it was really interesting to hear about how that community was so important to her. And the great thing about communities is we opt into their values, right? Communities have this solidity.

     We’ve got a community here at Monktoberfest. If there were wild misbehavior in our Monktoberfest community — thankfully there really hasn’t been, but if there were to be — there would be a lot of transparency about what happened. That serves as accountability, and that’s really important. And communities are fluid: if someone really had a problem, they wouldn’t be invited back. We would trim our community, or expand our community. So I think this makes it easier for communities to get this kind of bedrock of trust. Now, I will say I’m very concerned about communities being buffeted by the forces of organizational trust. Which is really a way of saying I’ve got to bring up relicensing at least once in this talk. It’s been really unfortunate to watch company after company, born in open source and born with these kinds of community ideals, sacrifice the trust that they have earned by relicensing their software. And I think it’s unfortunate because, as each one has done this, it normalizes it for the next one, and they figure, well, OK, I can actually be even shadier. Now you go back to those first relicensings, you go back to, like, the Confluent relicensing, and you’re like, this isn’t so bad: there was no spyware or malware being injected in the free version. There was no revenue cliff.

     This is the real problem: this kind of erosion of trust. Now, I also believe in the strength of communities to resist that, and communities need to resist it, and need to continue to, because we should not let declining organizational trust infect our communities.

     What is to me most important when you have a group of people working on a shared problem, whether that’s in an organization, which is what I’m going to focus on because it’s what I do, or a community, is that you need mutual trust. And I hope you’ve read — if you haven’t read The Soul of a New Machine, you’re in for a treat. The Soul of a New Machine still stands; it’s so readable and rereadable as a saga. This is our literature in software systems, computer systems; it won a Pulitzer Prize in 1982. And Tom West is complicated, so just put an asterisk on there: Tom West is a complicated person, and you’re getting Kidder’s retelling of it. Kidder knew West.

     When I read it as a 22-year-old, I took away a lot of the technical details. When I reread it 20 years later, there was all this organizational stuff that I had just missed as a 22-year-old. I love that there’s a civil war that erupts at a Howard Johnson’s, which is called the great shootout at HoJo’s; that’s a piece of corporate lore.

     >> West once said he would bind his team with mutual trust — my emphasis. He wouldn’t break the work down into little pieces and make the tasks small, easy, and dull. That is a one-paragraph guide to management and leadership: binding a team with mutual trust. I would like to believe that that gets us to the noninvasive management that Ashley referred to. I loved the taxonomy of noninvasive, semi-invasive, and invasive. Binding a team with mutual trust: this is not just everything. It’s the only thing.

     Because when we do this, when we trust one another, our greatest things happen. Especially in knowledge work. People are creative, they try new ideas. They want to live up to the trust of that group of people.

     And, you know, here’s kind of an interesting thought experiment — and I think it’s helpful especially when young people are concerned about the state we live in now, and I mean rightfully so. If you could go back to any time and place, where would you go? It’s like, I would go to Xerox PARC in the ’70s. And great, that’s a great answer. But Xerox PARC was also operating at a really dark time: the early ’70s, the mid ’70s, also sucked. In any of these times you can find these pockets that were extraordinary.

     And you can ask yourself, when have I felt the most gratified? I’d be very surprised if mutual trust is not why you were most gratified. And this is the thing that’s most vexing: it’s like, not only was I happiest, we did our best work. Before we had the reorg and went from noninvasive to invasive management, we did our best work, when we had mutual trust. But how do you foster that? Does that mean everyone is just nice to one another? No, there’s more nuance than that, because you’ve got to figure out what accountability looks like when you’re fostering mutual trust. This feels reductive to say, but I think it’s really important: team formation is really, really important. I know everyone knows this, or they say they know this, but they don’t act like they know it sometimes. I get kind of frustrated: people will praise team formation, but then not really act accordingly. Stephen and I are both baseball fans, or at least Stephen was as of about the fourth inning last night; he may not be a baseball fan any longer, but he’ll be a baseball fan next year. A big part of team formation is recruitment. How do you find the right people to bring to your team? I think it’s critically important to find people who share your values. You should of course find people that have integrity; that should be a constraint. But your values and your principles are actually different, and you should find people that share your values, your idea of what’s relatively important. This is essential to being able to foster that mutual trust, to know that, like, I think that person is going to make a decision that I trust, because we share values. We have a shared perspective on how problems should be solved. Use those values as a lens for team formation: I think they need to be formalized, and I think you need to use them as a lens for formation, and it’s kind of amazing to me that people don’t see that.
And I think it’s incumbent on us to seek out organizations and communities that share our values.

     I think another very important aspect of that is finding people that share your intrinsic motivation. Find people that are motivated by the same problems you are motivated by. If you’ve got shared intrinsic motivation and shared values, you haven’t solved mutual trust, but you’ve put wind at your back. You’ve got a really nice tailwind to get you to mutual trust.

     Artifact creation is a part of our jobs, and that’s not just code: there are lots of artifacts around software that aren’t code. And I am a staunch believer in our own prose, in whatever domain you’re in — the prose that explains the why.

     This is really important. It holds you accountable to your team, but, actually way more importantly, it holds you accountable to yourself. I’m sure this has happened to you, where you go to write something down, and as you write it down, you begin to take yourself apart. You’re like, wait a minute, that doesn’t make sense, what are you doing? Or, this part I don’t understand completely; I need to go understand this. So that kind of self-accountability is really, really important, and writing at Oxide is such a huge component of the way we conduct ourselves that we use it in our hiring process, and this is amazingly unusual. And I don’t mean make-work. I mean, like, ask someone in writing: when have you been happiest in your career, and why? When have you been unhappiest in your career, and why? Why do you want to work here? Those are three questions that I think anybody can ask, and actually, if you only use those three questions, the LLM use does become really obvious. You’d be amazed how many people are using an LLM. It’s like, why are you using an LLM to say why you want to work here? Isn’t it entering your mind that if you are using the LLM to generate the text about why you want to work here, maybe you don’t want to work here? And it guts me to say that yes, the em dash is a tell — it kills me to say it, because goddammit, I was using the em dash …

    [applause] That goddamn thing trained on my writing and your writing, and to the LLMs’ credit, they are very good at detecting when an em dash is used in a way that you or I would not use it: very even pacing and so on. But it does sadden me. And also the “not just this, but that” construct: I now literally just can’t hear it in conversation anymore. This is maybe where ChatGPT is actually inducing brain damage on me — but, like, a bank shot of brain damage: it’s people using it that’s actually damaging my brain. But when people answer, without an LLM, when have you been happiest in your career — and not what makes you happy; that’s an easy question to answer. I don’t want to know what makes you happy: supportive environment, yeah, yeah, yeah, I get that. When have you been unhappiest, and why? I will tell you, a whole lot of people answer that question, and you will not be surprised that the answers all basically rhyme with each other: they are in environments that don’t have mutual trust. They’ve got engineers that are trying to get the right thing to happen and management that is really undermining them as individual contributors.

     So I think that’s really important. It’s still shocking to me how much this is not used. Sometimes people are like, do you mind if I take the Oxide hiring process?

     >> Please, please, I would be —

     I think transparency is actually important for having a team hold itself accountable, and I think we at Oxide have found some very valuable transparency measures. It’s very easy to open it up to an entire team: everybody can look at someone who wants to work at Oxide, and we can get everyone’s opinion. Unlike in an interview — it is amazing to me how interviews go Rashomon in a heartbeat. If you haven’t seen Rashomon, it’s a terrific film from the ’50s, where four different people relay an event and they all recall it very differently. In an interview, if you are in the room having a conversation with someone, it’s very easy to hear what you want to hear. When you force somebody to write something down, you’ve got a ground truth that you can actually discuss. That’s really, really important and valuable. Which, by the way, does not mean it’s consensus-driven. To open up a process and give it transparency isn’t to say, let’s turn it into a Montessori school. No, no, not like that. We can decide what we want to do with that information.

     Another thing we do, and I’d be curious to hear if more people are doing this, because this is so unbelievably important to us: we record every meeting, and I’ve linked to the RFD that describes why we record every meeting. We’re a child of the pandemic, so we got lucky. My view is we got lucky: the pandemic guided our company in a very important way. It made us remote, and we got good at being remote, and now we record every meeting. An RTO mandate doesn’t make any sense to me — I don’t understand why you would do that — because we already record every meeting, and recording every meeting is mesmerizing. There is so much value to it. I’ll give you a couple of concrete examples. One: engineers have a fear. They have a fear that those people in that meeting are deciding on the wrong thing.

     And if I don’t go to that meeting, who will be the control rods? My purpose in going to that meeting is to do my work while I’m in the meeting, and then, if I feel the temperature get up, the control rods need to go in and I need to stop whatever madness is about to happen. Many, many, many engineers feel this way. And they feel like, I need to go to that meeting. Why am I not in that meeting? What’s going on in that meeting? Well, nothing’s going on in that meeting. But when you record everything, you can always go to the meeting after the fact. Don’t worry: if they come out and the broken thing happens, good news, you can go and watch the recording, you can get full context, and you can go and write down why we shouldn’t be doing that.

     Second thing: I learned a lot about myself. I’ve watched many meetings, many at 2X. Yes, I also can’t watch myself at 2X, so that should be of some relief to many of you; I really am insufferable even at 1.5X. I do recommend going to my older talks and playing them slowed down: my pacing sounds like normal pacing, but I sound drunk out of my absolute mind. Much drunker than I can actually get. So go and entertain yourself, and especially if you get me in a particularly energetic talk, where I’m really on one, then 0.5X is for you. Listen to yourself at meetings. One of the things that I learned about myself that was actually very surprising: there are meetings that I was in where I was listening and I had something to say. But I’m not a monster; I’m not going to be like, stop the meeting, I have something to say. I’m going to do what we all do, which is wait until there’s an opening. And one of the things that I’ve learned about myself, and I would imagine it’s true of all of us: keeping your little morsel warm, the little thing that you want to say, polishing it in your head, maybe rehearsing it a little bit, that act deprives you of listening. You’re not actually listening, and you think you are; you hand on heart think you are. You’re not a jerk.

     But you aren’t listening. And the number of times I’ve listened to myself in a meeting, and I listen to what I say, and even other people who were in the meeting would relisten to it, and they wouldn’t know what I knew.

     Because they wouldn’t realize that I actually wasn’t listening. I didn’t hear that point. You think I did, because I gave you the social cues that I was listening. I wasn’t. I totally didn’t hear that point.

     Interesting thing No. 3: we don’t have a ton of interpersonal conflict at Oxide. Not that I’m summoning interpersonal conflict from the gods, and if I am: gods, please punish Stephen and not me. But if you’ve got interpersonal conflict in a recorded meeting, it’s really interesting. We had a conversation where two people drew very different inferences about what had happened. In particular, an engineer had a very heavy sigh. And look, if you’re an engineer, you know this: a sigh is like a shot fired. The right kind of sigh is like, oh, it’s on now, pal. And you’ve got to think, isn’t that just a sigh? But didn’t you hear it, though? It’s like, ohhhh, godddd. Sighs run the gamut from a declaration of war to the merely exasperated. So there was a sigh, and one engineer viewed it as an absolute casus belli, and the other engineer said, I was just sighing because the world is so complicated; I wasn’t sighing at them. How did they draw that inference? Well, if you listen to it, it kind of sounds like it was directed at them. But you know what? This can really inform your apology. And I mean, you haven’t done anything wrong, per se, but you can say: I’m sorry you drew that inference, it’s definitely not what I intended, and rewatching it, yeah, I see how you thought that. Absolutely not what I intended. And I don’t understand why we don’t do this for everything. This seems very easy and obvious, that we record all this stuff. But we don’t.

     I would also be remiss in not mentioning that compensation is transparent at Oxide, and you can complain that the salary we pay people is either too low or too high, because it’s definitely one of those. But I will tell you this, and I know this is not a fit for everyone; people are like, I love it, you’re crazy, we can’t do that. I would say transparent compensation is easier than uniform compensation, but just so you know, there is nothing that has generated our esprit de corps like that uniform compensation, in terms of valuing everybody in an organization. You’ll find this unsurprising: because the compensation is transparent and uniform, people feel free to talk about compensation, why not? And early on at Oxide, before we had talked about it (we were uniform, we just hadn’t talked about it at all), folks were talking about the pay cut they took to come to Oxide, and somebody said, I didn’t take a pay cut, I actually got a pay increase. Another person said, I also got a pay increase. I don’t have to tell you that it was the women who got a pay increase. You’re just like, ughhhh, because we’re talking about really sharp, terrific folks, and I’m like, damn it, we underpay you, don’t you understand that? You should absolutely be taking a pay cut to come to Oxide as far as I’m concerned. But it was eye-opening. The thing I can absolutely assure you of is that we will have transparent compensation. The uniform compensation has been wild, and you can read the update on that. It has been wild; it has really fostered institutional trust. By the way, if you’re saying, is that really what you’re paying support people? Have you never worked with terrific support? Do you realize how much you’re telling on yourself right now? It’s like, what if we only hired the best support people you’ve ever worked with? That would be a great idea. And OK, that’s what we’re doing.
OK, thank you, can you delete your Hacker News comment right now?

     But mutual trust: there’s kind of this idea that mutual trust implies total autonomy, and I think autonomy is obviously essential for our craft. I don’t want to be micromanaged. I don’t want invasive management. I want that autonomy, that trust, so we need to have that autonomy, but we can’t have total chaos. And if you have mutual trust, I have great news for you: clarity is really, really effective. People trust one another, and when they trust one another, you can say, by the way, that’s the objective right now, and everyone’s like, great, what do I need to do? Clarity is really, really, really effective. And so I think the role of organizational leadership is to achieve directed autonomy. You want to make sure that you’re fostering trust and providing clarity, and if you do those two things, you get an environment where you’ve got effective mutual trust. Trust is infrastructure. Trust underlies everything, and we need to think of it deliberately as infrastructure. We spend a lot of time thinking about the infrastructure we build; we should spend as much time thinking about the trust that underpins it. And there are things that are really constructive that you can do to foster that trust. It needs work. It needs maintenance.

     It needs repair.

     And I’ll reference it on the next slide, but if you’re wondering, how do I repair trust? That seems really hard. You’re going to go watch Rachel’s 2022 Monktoberfest talk on introspection gaps. I’ve sent that to as many people as I can think of and will continue to do so. It’s an extraordinary talk on how you deal with fractures in trust. That would be the fifth title. I know you had four titles; maybe that would be the fifth one. But it really deals with how you deal with fractures. Yes, we are at a low point in institutional trust. Organizational trust is kind of on the ropes. And by the way, if you’re in an organization and you’re contemplating an RTO mandate, please don’t, because if you care at all about the trust of your workforce, the second you have someone come into an office to be on a Zoom, you’ve lost it. Forever. You can never do that. You can do that zero times. Someone will be like, hey, you know, I missed my kid’s soccer game because I was in here on a Zoom. The Zoom ended before the soccer game; I sat in traffic for an hour and 20 minutes, and I missed my kid’s soccer game. I don’t trust you. I don’t trust you. And spare me the bullshit about how I was more productive in the office. I wasn’t. I was on a Zoom meeting. Don’t do this. If you do this, you will destroy the trust in your organization and you won’t get it back. And they might not tell you. They’ll tell me, when they’re applying to Oxide, when they’ve been unhappiest and why.

    [applause]

     And I would say in just kind of echoing the sentiment that Rachel had, too, that it can feel like, especially with the level of institutional mistrust that we have, like, there’s nothing we can do.

     There is a scale at which each of us has total agency. It may not be in your organization; for many of us it is in our organization, but there is a scale at which we have total agency. And it’s incumbent on each one of us to foster that mutual trust. That is all of our job. You cannot wait for someone else to foster mutual trust at the scale at which you have agency. That’s on you, and it’s on me, and it’s on all of us. And that’s the agency you’ve got. And that feels great; that’s the nice thing about it. Then you’re like the folks in roller derby, and you can kind of be like, yeah, definitely societal collapse, I get you, but roller derby was great last night. That’s important, and it’s incumbent on all of us. And with that, thank you very much.

    [cheering and applause]

    

     And if I can just plug: I’ve got some additional stuff to go watch and listen to, but I would plug again Rachel’s talk from 2022, and, if it’s not too gauche, I can plug my own talk from 2017. I’m extremely grateful to Stephen, because Stephen and James asked me to give it (they were replying to a tweet in which I was being trolled), and it’s the most important talk I’ve ever given in my life. Because writing that talk down, talk about forced accountability, forced me to think about what I actually want in an organization. And it wasn’t the organization I was in, but it was an organization I started two years later.

     >> Thank you …
