WEBVTT

00:00.000 --> 00:10.520
Howdy folks, hey, so I'm Preston Dostor, I'm a board member of the Nivenly Foundation.

00:10.520 --> 00:13.200
You only have to listen to me for like 30 more seconds.

00:13.200 --> 00:15.760
I'm just here to introduce Quintessence.

00:15.760 --> 00:21.000
I've had the pleasure of working with her for the last four or so years on both the

00:21.000 --> 00:26.280
Nivenly Foundation and how we're working to help the open source community grow and

00:26.280 --> 00:30.840
Hachyderm, you'll have probably heard of Hachyderm, we're a Mastodon community

00:30.840 --> 00:37.880
for queer folks and tech folks and I think right now we're like the 13th or 14th largest

00:37.880 --> 00:43.800
instance but you know in that time we've encountered a lot of things on the social web, a lot

00:43.800 --> 00:48.880
of good things, a lot of maybe not so good things, and Quintessence has had the opportunity

00:48.880 --> 00:54.360
to do a lot of great research into new and emerging threats to our open source communities

00:54.360 --> 00:58.680
and with that, I'll just let you take it away and talk about the data.

00:58.680 --> 01:06.800
Hi, good afternoon, I am going to be talking about data, Preston is correct, but before we

01:06.800 --> 01:10.440
get started I just want to do a quick check-in with everybody because the event is a bit

01:10.440 --> 01:16.440
chaotic and just remind everyone to please have your water, your snacks, your meds, and

01:16.440 --> 01:19.920
your rest strategies on your person at all times.

01:19.920 --> 01:25.120
This is a very large and distributed ecosystem and what an excellent time to practice what

01:25.120 --> 01:31.280
that means in real life and with that I'll just jump right in and get started.

01:31.280 --> 01:36.280
So before I get into the data I will also be doing a quick level set on open source

01:36.280 --> 01:41.040
as an ecosystem as not everyone has entered open source at the same time or been here

01:41.040 --> 01:45.720
the same number of years and I want to make sure that we understand a little bit of the

01:45.760 --> 01:49.720
impetus for some of the things I'll be talking about later.

01:49.720 --> 01:56.760
And so when we talk about the launch of open source it was launched quite a while ago.

01:56.760 --> 02:02.960
She says with some nervousness a few decades ago and the entire landscape of what technology

02:02.960 --> 02:06.240
overall looked like at that time was very different.

02:06.240 --> 02:12.520
So we see in the early 1990s home computer ownership was incredibly rare and internet

02:12.600 --> 02:14.600
access was incredibly rare.

02:14.600 --> 02:19.760
Tools were incredibly kludgy and it was just kind of hard to get things done.

02:19.760 --> 02:24.400
This doesn't mean that technology didn't find a way to make that successful but it does mean

02:24.400 --> 02:30.080
that the communication was difficult in different ways than it's difficult now and things

02:30.080 --> 02:35.680
were slower and just didn't move at what we would consider to be a modern pace.

02:35.680 --> 02:40.440
We also understand that because it was a new movement it wasn't running anything that we

02:40.440 --> 02:43.320
would probably call critical, at least not by today's standards.

02:43.320 --> 02:47.160
It wasn't running your healthcare, it wasn't running your national infrastructure.

02:47.160 --> 02:51.800
It was starting to run web servers and people were not only getting grounded in the

02:51.800 --> 02:56.520
gist of the movement but also the implications of what it might mean.

02:56.520 --> 03:02.960
When we moved forward about two decades later so 1990s to the 2010s this had already started

03:02.960 --> 03:09.320
to shift pretty dramatically so we went from having minimal home compute to some home

03:09.360 --> 03:14.360
compute and instead of having internet access kind of nowhere most homes did start to have

03:14.360 --> 03:19.160
it and when we talk about whether or not they were shared devices and people doing lab

03:19.160 --> 03:24.560
access, yes there was lab access but again there was also home access and this is also

03:24.560 --> 03:29.240
when social media and social tools as a consequence started to both emerge and start

03:29.240 --> 03:33.080
to mature over the course of a decade.

03:33.080 --> 03:38.200
This also means that by this time open source went from being a new movement to being

03:38.200 --> 03:42.640
about a 20 to 30 year movement depending on your origin timestamp.

03:42.640 --> 03:49.440
This also means that it is running critical infrastructure by the 2010s.

03:49.440 --> 03:53.280
You're looking at things like it's running your healthcare and the other examples, government

03:53.280 --> 03:54.280
infrastructure etc.

03:54.280 --> 04:00.080
This is where we start to see it really be important to everyone.

04:00.080 --> 04:02.760
The tooling barriers also start to disappear.

04:02.760 --> 04:09.600
Tools became easier to use for everyone not just technologists and this means that what

04:09.600 --> 04:14.960
the risk-benefit analysis for any type of threat looks like also shifted because you did not have

04:14.960 --> 04:20.360
to be a persistent, educated actor to have an amplified effect and I will get

04:20.360 --> 04:22.160
into that in a minute.

04:22.160 --> 04:27.400
But of course, moving from the 2010s to around the 2020s and now, we have not only everyone

04:27.400 --> 04:31.600
has internet access we have people with multiple devices all with internet access that's

04:31.600 --> 04:36.520
kind of blanketing the globe, it's no longer a single home wire so to speak, and this means

04:36.520 --> 04:40.600
that you don't have things like the lab set up or even a single home shared computer

04:40.600 --> 04:45.320
as people used to have you have all these chronically online devices where people can

04:45.320 --> 04:50.400
never truly turn things off and this also has the effect of things being always online

04:50.400 --> 04:55.720
including discussions in open source, or activity, or code, or public lists, etc.

04:55.960 --> 05:04.720
At this point a lot of things are still true: mature ecosystem is still true, the tooling

05:04.720 --> 05:10.800
barriers are even lower and the value of open source is incredibly high so over a time

05:10.800 --> 05:15.080
when I was looking at the supply-side and demand-side valuation of open source, we have

05:15.080 --> 05:19.800
the supply side was being calculated as early as the early 2000s at a few billion dollars

05:19.800 --> 05:27.000
and the supply side peaked and kind of leveled out at around 4 to 5 billion estimated

05:27.000 --> 05:32.920
over the 2010s, but the 2020s was when we had the first demand-side valuation

05:32.920 --> 05:41.400
of open source, which was between eight and nine trillion dollars, so this is a massive, massive

05:41.400 --> 05:47.480
industry that is being supported mostly by volunteer labor and in fact when we look at

05:47.480 --> 05:51.960
the rapid scale we can see just how many code bases and not just the code bases but

05:51.960 --> 06:01.560
their components are being used that are open source, 97%, with 911 components per application

06:01.560 --> 06:05.640
this is not something I think anyone would have been able to explain to people

06:05.640 --> 06:12.400
in the 1990s could be a consequence of a mature ecosystem at this pace and when we talk about

06:12.480 --> 06:17.560
how people are contributing to open source most technologists went from contributing to open

06:17.560 --> 06:22.440
source if they saw a project they fancied in the early aughts; now everybody is contributing

06:22.440 --> 06:29.280
to open source as an entry into the tech industry as an entry into open source specifically

06:29.280 --> 06:33.440
and this means that open source has gone from a nascent movement to valuable, accessible,

06:33.440 --> 06:40.840
and critical everywhere and this is very important to understand because I feel like we all

06:40.840 --> 06:47.000
feel like open source might be going through it right now and some of you are laughing yeah

06:47.000 --> 06:51.240
it feels like it's going through it and so because I'm a data driven human being I thought to

06:51.240 --> 06:56.680
myself I wonder what would happen if we try to substantiate the feeling the vibe because you

06:56.680 --> 07:03.160
can't give a conference presentation on vibes, to see what the data seems to look like, and so I grabbed

07:03.160 --> 07:08.600
a pool of about 30 organizations in open source; these are open source foundations of a variety

07:08.680 --> 07:16.680
of sizes and the idea was to take a decent cross section of the open source ecosystem and I want

07:16.680 --> 07:23.880
to say up front a few important constraints and boundaries. One, I'm only sharing aggregate data;

07:23.880 --> 07:28.680
I will not be talking about individual projects or foundations not even if you ask me after

07:29.880 --> 07:33.960
Also, I'm going to be talking about what I call a crisis event, which is scoped to mean a

07:33.960 --> 07:37.880
collapse or a near collapse event this is something that causes the death of a project or its

07:37.880 --> 07:42.920
foundation or a permanent schism in a project or its foundation, anything that is kind of a

07:42.920 --> 07:48.280
death knell to that ecosystem. And then all crises that I've researched, even though I won't be

07:48.280 --> 07:53.640
talking about the individual ones are in the public domain so even if you do manage to think I

07:53.640 --> 07:58.280
think I know what this is about, you may, and you can probably Google it, and I still won't answer you.

07:59.160 --> 08:03.720
but because I'm not telling you who the foundations are I am going to tell you the shape of what the

08:03.720 --> 08:10.280
foundations are. So it's roughly a 50-50 split of for-profit and nonprofit foundations with a slight

08:10.280 --> 08:16.360
skew into the nonprofit, and that shouldn't surprise anyone, and I also balanced them across t-shirt sizing

08:16.360 --> 08:21.480
between so-called extra small and extra large the idea being that if you're a smaller foundation

08:21.480 --> 08:26.440
with maybe one to two staff or maybe only an executive director and a board or something like that

08:26.440 --> 08:30.280
with a project to your name you would probably be on the small side, and then an extra

08:30.280 --> 08:36.200
large would be something with dozens or hundreds of projects as you ramp up from large to extra

08:36.200 --> 08:43.560
large. Level-setting across the founding ages was a little harder; there was a long tail

08:43.560 --> 08:48.680
for the first 25-ish years or so of open source for reasons right and then there was a surge in

08:48.680 --> 08:56.440
founding between roughly the years 2005 and 2015, which is in the lower portion there of that

08:56.440 --> 09:03.000
graph but there is still a decent spread of organizational ages in this data set as much as was possible
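The sampling frame described above, roughly 30 foundations balanced across legal form and "t-shirt" size, could be sketched as follows. This is a minimal illustration with entirely synthetic foundations and field names I chose for the example; it is not the speaker's actual dataset.

```python
from collections import Counter
from dataclasses import dataclass

# Illustrative sketch of the cross-section: ~50/50 for-profit vs nonprofit
# with a slight nonprofit skew, balanced across sizes XS..XL.
# All entries below are synthetic placeholders, not real foundations.

@dataclass(frozen=True)
class Foundation:
    name: str        # anonymized label, e.g. "org-01"
    nonprofit: bool
    size: str        # one of "XS", "S", "M", "L", "XL"
    founded: int     # year founded

def sampling_summary(frame):
    """Summarize the cross-section by legal form and t-shirt size."""
    return {
        "nonprofit": sum(f.nonprofit for f in frame),
        "for_profit": sum(not f.nonprofit for f in frame),
        "by_size": Counter(f.size for f in frame),
    }

# A tiny synthetic frame showing the described slight nonprofit skew.
frame = [
    Foundation("org-01", True,  "XS", 1999),
    Foundation("org-02", True,  "S",  2006),
    Foundation("org-03", True,  "M",  2011),
    Foundation("org-04", True,  "XL", 2004),
    Foundation("org-05", False, "S",  2009),
    Foundation("org-06", False, "L",  2014),
    Foundation("org-07", False, "XL", 2001),
]

summary = sampling_summary(frame)
```

The point of fixing the frame first is methodological: the foundations are chosen before any crises are counted, which the talk returns to below.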

09:05.000 --> 09:11.800
and now for the actual data so in order to map this out I plotted it in a histogram so this is just

09:11.800 --> 09:17.480
the net number of collapse events in the 30 foundations that I cross sectioned for open source
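The histogram step itself is simple tallying: events attributed to the pre-selected foundations, counted per foundation. A sketch with synthetic event records (the org labels and years are invented for illustration):

```python
from collections import Counter

# Tally net collapse / near-collapse events per foundation across the
# cross-section. Foundations were picked first, events attributed second.
# All records here are synthetic placeholders, not real projects.

events = [  # (anonymized foundation, year) pairs
    ("org-01", 2014), ("org-01", 2019),
    ("org-02", 2016),
    ("org-03", 2014),
    ("org-04", 2021),
]

per_foundation = Counter(org for org, _ in events)

def text_histogram(counts):
    """Render counts as simple text bars, sorted by label."""
    return [f"{org} {'#' * n}" for org, n in sorted(counts.items())]

bars = text_histogram(per_foundation)

# In this tiny sample, every foundation with a record had at least one event,
# mirroring the talk's observation that basically everyone has been through one.
all_had_one = all(n >= 1 for n in per_foundation.values())
```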

09:18.520 --> 09:24.120
and just to be really clear I didn't map crisis events and see which foundations have them

09:24.120 --> 09:29.880
I picked the foundations first, and that will be relevant to understand because, while there

09:29.880 --> 09:34.440
may be bias in the system and academic rigor and all of that the order of operations was

09:34.440 --> 09:40.440
understanding foundations first and crises second. The other thing to understand is that they do seem

09:40.440 --> 09:46.920
to have distinct types there seem to be distinct events that cause a collapse or near collapse event

09:46.920 --> 09:52.280
you can see one of them that's easy to codify as a licensing crisis so a licensing crisis would be

09:52.280 --> 09:58.120
a corporation, for example, choosing to change the licensing of the open source project

09:58.120 --> 10:02.120
and that may be causing some conflict with a community who felt like they gave their intellectual

10:02.120 --> 10:07.480
property under one understanding and having the understanding change but it is not the only type

10:07.480 --> 10:14.360
in fact the other four types are all human crises: codes of conduct, licensing (which we just covered), governance,

10:14.360 --> 10:20.840
moderation, and leadership. So of the five primary crisis types that seem to emerge by category,

10:20.840 --> 10:29.320
four out of five of them are about people and also it's worth mentioning that basically every

10:29.320 --> 10:37.320
organization in the data set had a crisis event at least once, and many had multiple, so this also

10:37.320 --> 10:43.240
means that regardless of size and age and et cetera there was a near collapse event for that

10:43.240 --> 10:52.040
ecosystem for basically everybody there's also something that we can understand when we see a

10:52.040 --> 10:57.240
pivot in the data like this it means that the older foundations had more time before they had to

10:57.240 --> 11:03.320
deal with an existential crisis than the younger foundations an older foundation basically had a whole

11:03.320 --> 11:09.400
founders worth of career to stabilize mature expand get resources et cetera before it

11:09.400 --> 11:14.840
had to deal with an existential event whereas the younger foundations barely have their bylaws

11:15.640 --> 11:23.880
ink dry and they are also having a crisis event and if we compare by year founded we can see that

11:23.960 --> 11:35.240
this is true consistently, basically, for the past couple of decades. The other thing I wanted

11:35.240 --> 11:40.200
to take a look at is, we can see that despite there not being many crisis events

11:41.320 --> 11:47.560
(capped at the year 2000, mostly due to my ability to independently verify events) we don't see many

11:48.280 --> 11:55.160
collapse events right for a good few decades pre 2000 included until 2014
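The timeline view behind that claim is again just counting, this time per year, and then looking for the first surge. A sketch with placeholder years (the event list is synthetic; only the 2000 verifiability cap and the 2014 surge shape come from the talk):

```python
from collections import Counter

# Count verified collapse events per year (capped at 2000, matching the
# talk's verifiability cutoff) and locate the first surge year.
# The event years below are synthetic placeholders.

event_years = [2003, 2008, 2014, 2014, 2014, 2016, 2017, 2017, 2021]

per_year = Counter(y for y in event_years if y >= 2000)

def first_surge_year(counts, threshold=3):
    """Earliest year whose event count reaches the surge threshold."""
    years = sorted(y for y, c in counts.items() if c >= threshold)
    return years[0] if years else None

surge = first_surge_year(per_year)
```

The threshold is arbitrary here; the real question is only whether some year stands out sharply against the decades before it.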

11:55.960 --> 12:00.600
so we can see after 2014 and during 2014 we have a surge of events relative to the

12:00.600 --> 12:07.160
prior years and then continued surges after the fact and since it's relevant to the talk I know

12:07.160 --> 12:13.320
that people do like to talk about Rust and actix in particular, often due to moderation

12:13.320 --> 12:17.000
or steering committee or the things that people like to talk about, but the reason I bring them

12:17.000 --> 12:22.360
up at all is, well, they are very similar, they're very large, they're very interesting, but they are not unique

12:22.920 --> 12:27.240
they are not unique even for the calendar year that people like to talk about in fact in combination

12:27.240 --> 12:36.680
they are two of nine events of that particular year and we can tell that communities know that

12:36.680 --> 12:41.400
this is happening, whether they're articulating it or not, because they start to respond by changing

12:41.400 --> 12:46.760
things in their ecosystem to adapt to this reality and this means we're going to look at the

12:46.760 --> 12:52.680
2010s a little more deeply as a decade of relevance, and first I want us to understand a little

12:52.680 --> 12:58.120
bit about actor versus non-actor harms and to understand actor harms we're going to talk about

12:58.120 --> 13:04.600
what everybody talks about for a quick minute. So we have xz Utils, and when we look at xz Utils,

13:04.600 --> 13:10.120
the main thing to understand about xz Utils is that it was a sustained campaign over multiple years

13:10.680 --> 13:16.120
that weaponized the contributor's burnout and mental health, so if we want to talk about it in terms

13:16.120 --> 13:21.960
we can say that the human state was the exploited vulnerability this is not a code vulnerability this

13:21.960 --> 13:27.640
is a human vulnerability and bottleneck in a way because they were able to infiltrate the project

13:27.640 --> 13:33.320
and get to the only person that mattered to them to have their objectives met, whatever they were.

13:34.520 --> 13:39.480
and this is an interesting discussion that I will be diving into after I explain this because it

13:39.480 --> 13:45.000
touches on how we talk about trust and safety and security and human harms and so forth, but this is a

13:45.000 --> 13:50.520
very health oriented a very mental health oriented thing that we need to be aware of and it's also

13:50.520 --> 13:57.000
relevant when we talk about non-actor harms non-actor harms are the amplified effects of what

13:57.000 --> 14:03.400
happens when you have always online incredibly interconnected spaces so this is not an actor harm

14:03.400 --> 14:07.960
but sometimes you can't tell the difference, right? You have a PR that blows up because the

14:07.960 --> 14:14.760
community is mad it hasn't been merged yet. And are people mad, or is someone fueling that

14:14.760 --> 14:22.600
fire to make them more mad and they do look very similar on the surface and neither requires

14:22.600 --> 14:27.480
technical sophistication and the reason I want to bring this up and I mentioned it in the beginning

14:27.480 --> 14:34.040
a bit long form is the tooling barriers to entries have dropped very low you do not have to have a lot

14:34.120 --> 14:38.600
of complexity in your tools or a lot of sophisticated understanding or even significant research

14:38.600 --> 14:47.000
and diligence to have the impact of a dedicated actor in the 2020s right anyone can spin up a

14:47.000 --> 14:52.920
chat army even if they don't know what they are doing or what to call it because the tooling

14:52.920 --> 14:59.720
allows them to do this, and for all of these reasons I want us to pursue, separately, deeper

14:59.800 --> 15:05.560
analysis and academic rigor and expanding the data set because I think that we should take a look

15:05.560 --> 15:12.120
at some of the causes and factors and amplifications that are happening therein. And so to quickly

15:12.120 --> 15:17.320
recap we have about a 40 year old ecosystem that grew from a small movement to a very large

15:17.320 --> 15:23.960
set of ecosystems. We were participating in these earlier communities with the understanding

15:23.960 --> 15:29.560
that they were small and that that smallness had an artificial gate of a type: the tooling was hard

15:29.560 --> 15:33.400
and so people really were only joining if they wanted to be there for the most part and if they

15:33.400 --> 15:39.880
could get past the tooling right and while all of that has improved this means that everything is

15:39.880 --> 15:46.120
easier and some things are dual use because modern tools have the effect that anyone can do

15:46.120 --> 15:50.600
kind of anything we understand how to talk to each other we understand how to connect to each other

15:50.600 --> 15:57.880
we can submit PRs without actually knowing how to write code, as of right now, with increasing relevance.

15:58.680 --> 16:05.480
Right, foundations are experiencing crisis events earlier and earlier, and this is broadly spread

16:05.480 --> 16:11.400
across their age their size or their perceived importance right so it doesn't matter if you

16:11.400 --> 16:15.640
feel important or not it doesn't matter if you feel big enough or not or mature enough or not

16:15.640 --> 16:21.720
you're kind of going to be going through it by definition the volunteer teams as such don't know

16:22.360 --> 16:27.720
kind of what they're walking into sometimes because in a lot of moderation circles you have people

16:27.720 --> 16:34.440
who treat their ecosystems, the chats, the Discords, the Zulips, and so forth, as if they are more

16:34.440 --> 16:42.200
closed than they are just because of the cultural momentum of that mindset and when they encounter

16:42.200 --> 16:47.320
things that they don't understand they may or may not have the language or tools to tell leadership

16:47.320 --> 16:51.160
what is happening or why it's happening or why they're taking the actions they feel they should

16:51.160 --> 16:58.200
be taking and these communication breakdowns can and have become very public and all of this

16:58.200 --> 17:02.760
leads to people having more difficult early detection because if you don't know how to articulate

17:02.760 --> 17:08.280
what you're looking for you very likely also don't know how to map it out how to look for a

17:08.280 --> 17:14.120
signal misalignment, what signals even should be aligned or misaligned, for you to do continued research

17:14.120 --> 17:20.360
or for you to do over-time research, because again you're not usually distinguishing in the moment

17:20.360 --> 17:26.040
if somebody is being authentic with you or not you're looking at behavioral patterns over time

17:26.040 --> 17:30.840
which means you need to have enough data, and granular enough data, to be able to do that.
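The "write it down consistently, analyze over time" idea can be sketched as an append-only observation log with a fixed schema, so patterns can be queried later even if you didn't know in advance which signals mattered. The field names and example entries below are my own illustrative choices, not any project's real tooling:

```python
from dataclasses import dataclass
from datetime import date

# Append-only log of moderation observations with a consistent schema.
# Recording consistently now is what makes over-time pattern analysis
# possible later. All names and entries are hypothetical examples.

@dataclass
class Observation:
    when: date
    actor: str        # pseudonymous handle
    venue: str        # e.g. "chat", "issue-tracker", "pr-review"
    behavior: str     # free-text tag, e.g. "pressure-to-merge"

log: list[Observation] = []

def record(when, actor, venue, behavior):
    log.append(Observation(when, actor, venue, behavior))

def pattern_over_time(actor, behavior):
    """All dated occurrences of one behavior by one actor, oldest first."""
    return sorted(o.when for o in log
                  if o.actor == actor and o.behavior == behavior)

record(date(2024, 1, 5),  "acct-7", "chat", "pressure-to-merge")
record(date(2024, 2, 19), "acct-7", "pr-review", "pressure-to-merge")
record(date(2024, 2, 20), "acct-9", "chat", "off-topic")

timeline = pattern_over_time("acct-7", "pressure-to-merge")
```

Behavioral patterns, not single incidents, are what distinguish an authentic disagreement from something fueled, and that only works if the granular records exist.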

17:32.840 --> 17:36.360
and all of this feeds into how do we keep our open source communities open

17:37.400 --> 17:42.680
because if we understand that all these chats when you're having these amplified effects it can

17:42.680 --> 17:47.720
start to make people feel like we need to shut it down somehow or close it down or do something

17:47.800 --> 17:53.000
very reactive in order to protect ourselves and while it is true that we need to change our mindset

17:53.000 --> 18:00.040
somewhat, we can protect it without being quite so drastic, just by improving our understanding of the

18:00.040 --> 18:06.680
situation and acknowledging a changed reality that we find ourselves in one of these is to

18:06.680 --> 18:11.640
not just acknowledge but also analyze and address what that means, so we build over time what we think

18:11.640 --> 18:17.400
we need to see; if we encounter an event that we did not have adequate data to do a post

18:17.400 --> 18:24.280
mortem on, even if it's internal only, we know we need to increase

18:24.280 --> 18:31.720
the type of data that we are gathering and apply that future forward so that we can analyze over time

18:31.720 --> 18:37.320
and the only way that we can do that is if we start to address things differently, again over time,

18:37.400 --> 18:43.800
so that we stop engaging with potentially inauthentic conflicts as if they are authentic every time

18:45.240 --> 18:49.400
one way that we can look at this is to look at our contributor ladders these are a concept

18:49.400 --> 18:54.360
that are pretty familiar I believe in open source communities in general most people associate them with

18:54.360 --> 18:58.520
code though you get your first privilege you get your first ability to commit you can merge

18:58.520 --> 19:04.280
PRs or you can't and need to escalate when you don't, but contributor ladders are also useful for

19:04.360 --> 19:11.320
other reasons: we can apply organizational threat awareness to the contributor ladder. As you grow in maturity

19:11.320 --> 19:16.920
and understanding of a project and or its foundation again depending on size and scope you should

19:16.920 --> 19:22.920
have an organizational threat model and be able to apply it more and more and more as you become more

19:22.920 --> 19:29.480
senior effectively what I'm proposing is as you grow in the technical admin and you move up the

19:29.480 --> 19:33.960
ladder you grow on a threat awareness and the defense of that threat you're not just aware that

19:33.960 --> 19:38.840
something could be happening you're aware of what to do about it and every time you move up the

19:38.840 --> 19:44.200
ladder that means that people will start coming to you level appropriate to help resolve things that

19:44.200 --> 19:50.120
they don't understand and don't need to know how to articulate. We also need to start writing

19:50.120 --> 19:55.480
things down better even if we don't know how to gather the signals yet and we need to write it

19:55.480 --> 20:01.480
down as consistently as possible so that if we realize that we don't have the granularity the

20:01.480 --> 20:06.680
data we need we can search through it and try again we understand where the gaps are that we've

20:06.680 --> 20:14.680
missed and we try again. We understand that organizational memory churn is a risk, and organizations

20:14.680 --> 20:20.200
that don't have documentation, and I don't just mean onboarding documentation. If you have a

20:20.200 --> 20:26.680
moderator or moderator team that has a lot of churn because of burnout somehow the people who step in

20:26.680 --> 20:31.560
need to acquire this threat awareness and the social awareness of the ecosystem they've just walked

20:31.560 --> 20:38.600
into because organizational amnesia is its own kind of risk if people don't know what to do and

20:38.600 --> 20:43.320
nothing's written down and they don't know how to frame it every time there's a turnover that

20:43.400 --> 20:50.520
situation will repeat and all of this also points to the fact that anything can be weaponized

20:50.520 --> 20:57.160
in social spaces and this is especially true if the cost to the attacker is low and the burden on

20:57.160 --> 21:02.520
the target is high we see this with fraudulent requests like fraudulent GDPR requests and so forth

21:02.520 --> 21:08.520
where your burden to respond as a product organization is high by definition but it is very low

21:08.520 --> 21:17.320
energy to just spam those out right and this all matters because having these pivots also means

21:17.320 --> 21:22.680
that we can have our moderation teams know how to distinguish and as part of that discernment

21:22.680 --> 21:27.960
learn to help protect the mental health of their communities as a strategy for protecting those

21:27.960 --> 21:34.360
communities this is also true for mentors when moderators and mentors and more senior people in the

21:34.360 --> 21:39.800
ecosystem introduce responsible friction into the system they are protecting the burnout level of

21:39.800 --> 21:45.080
their contributor pools in a healthy way and making it harder to move too quickly through the

21:45.080 --> 21:52.120
ecosystem we can help this by normalizing conflict in our ecosystem because if we're so conflict

21:52.120 --> 21:58.040
averse, we don't know how to have a disagreement without it blowing up, and that blow-up is also an

21:58.040 --> 22:03.240
attack vector, so by having a healthier and more grounded community we both have a more thriving

22:03.240 --> 22:10.120
ecosystem and can defend better and inoculate better against that type of exploitation. It's also

22:10.120 --> 22:15.960
important that moderator teams understand that moderation is about boundaries not punishment a moderation

22:15.960 --> 22:22.200
team can stall out if they think about moderation as punishment because if they do they need to know

22:22.200 --> 22:27.480
if the person means it, if they intended it, if they're a threat or not, or an actor or not, and you're

22:27.480 --> 22:32.040
not going to know that you're just going to need to know if you have a boundary that's been violated

22:32.040 --> 22:38.280
that you need to defend in some way and really quickly I would like to remind everyone to thank

22:38.280 --> 22:43.800
your moderators and all of your teams and this isn't just for online social media though please

22:43.800 --> 22:49.000
also thank your Fediverse and so forth moderators as well but also your contributor ecosystems

22:49.000 --> 22:53.400
for GitHub, because all those spaces, all those social code spaces, have moderators.

22:54.040 --> 22:59.640
Now, back into it: one of the questions moderators ask themselves is, is this behavior compatible with

22:59.640 --> 23:05.400
community health that is how you know you need to take an action of some kind and that is how you

23:05.400 --> 23:09.880
avoid the stall out of trying to figure out if something that will only be revealed over a month or

23:09.880 --> 23:15.800
a year is a different shape of problem than you think there's also a parallel to responsible

23:15.800 --> 23:21.000
friction and moderation in code bases you don't want someone to take on 50 projects at once and

23:21.000 --> 23:24.920
contribute hundreds of PRs at once just because they're really excited and burn themselves out,

23:24.920 --> 23:30.600
but in moderation spaces you also need to help steward this healthy balanced reality and it's

23:30.600 --> 23:35.560
the moderators who act more like social workers for the community, and they need to be equipped

23:36.360 --> 23:41.960
to be able to respond to these types of spaces it's not just fellow technologists who are tech

23:41.960 --> 23:46.840
oriented who should be moderators it's also people who understand the socialization between people

23:47.000 --> 23:51.160
and how to help them remain connected who would need to be stewards of community in this way
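The boundary-first framing above ("boundaries, not punishment", "is this behavior compatible with community health?") can be sketched as a check against explicitly written boundaries rather than a judgment of intent. The boundary names and matching rules here are hypothetical examples, not any community's real policy:

```python
# Check observed behavior against written community boundaries instead of
# trying to resolve intent ("do they mean it? are they an actor?").
# Boundary names and the matching rules are illustrative placeholders.

BOUNDARIES = {
    "no-harassment": lambda b: "harassment" in b,
    "no-pressure-campaigns": lambda b: "coordinated-pressure" in b,
}

def violated_boundaries(behaviors):
    """Return the boundaries a set of observed behavior tags violates.

    Empty result: no action needed. Non-empty: defend the boundary,
    without needing to know whether the person intended harm."""
    return sorted(name for name, violated in BOUNDARIES.items()
                  if any(violated(b) for b in behaviors))

violations = violated_boundaries(
    ["coordinated-pressure-on-maintainer", "asking-questions"])
```

The design choice is the point: a moderation team acting on boundary violations never stalls out waiting for certainty about intent, which may only be revealed months later, if ever.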

23:53.160 --> 23:58.520
and so before I just go to the last slide here moderation teams you need to approach your

23:58.520 --> 24:03.000
situations without judgment which is really hard especially when you're not sure what's going on

24:03.000 --> 24:07.640
or someone's on their second third or fourth incident again depending on your code of conduct

24:07.640 --> 24:12.840
and how that's outlined and it can be very hard to not feel like you want to judge a situation

24:12.920 --> 24:17.960
or an individual in that situation but in order to be able to do this in order to frame

24:17.960 --> 24:22.520
moderation as boundary protection and not punishment, you do need to make sure you're proceeding

24:22.520 --> 24:30.520
without judgment. And with that all said, I just wanted to make a nice recap slide. We need to

24:30.520 --> 24:35.640
have organizational threat awareness this is true for foundations or for projects that are

24:35.640 --> 24:39.720
sufficiently large that they have their own ecosystem around them and we need to center

24:39.800 --> 24:45.080
the health and development of our maintainers, and this is both for their benefit and for community

24:45.080 --> 24:50.600
and organization benefit. We need to normalize conflict so that people can disagree without it

24:50.600 --> 24:55.880
becoming amplified and it doesn't become its own exploit and we need to understand and address

24:55.880 --> 25:01.240
the nature of protective measures that may themselves also become weaponized this doesn't mean we

25:01.240 --> 25:05.400
can't do anything, it just means that we need to be aware so we can proceed thoughtfully.

25:06.280 --> 25:10.280
we also have to have standardized documentation for social risk and not just code

25:11.160 --> 25:17.080
it's not only about the code if we understand that vulnerabilities can be human for example again

25:17.080 --> 25:21.400
xz Utils: we know that the human was the exploit, not the code, although eventually it was used to

25:21.400 --> 25:27.400
introduce a code exploit, right, and this means that everybody needs to work together.

25:27.400 --> 25:31.160
The security specialists are the ones who understand and can do risk analysis, threat

25:31.160 --> 25:37.080
modeling, and usability testing, and build the guidance that everybody else can use and act on.

25:38.680 --> 25:41.960
and since we're running a little late on time I'm just going to go right to the point

25:42.760 --> 25:47.480
so as I look through all the data and I have been trying to pivot and help people understand

25:47.480 --> 25:53.880
the guidance I've spun up a working group because when you notice that there's a problem we don't

25:53.880 --> 25:58.440
solve it on our own, we work with others, and in this case we have a working group about

25:58.440 --> 26:03.320
securing open source communities and the idea is to have some researchers in the space and

26:03.320 --> 26:09.640
social workers in the space and then a bunch of foundations or leadership or moderators in the space

26:09.640 --> 26:14.120
talking together about the problems, and therefore the patterns in those problems, so that we

26:14.120 --> 26:19.480
can solve them together and build the guidance that does not currently exist phase one of two is

26:19.480 --> 26:24.760
going to be the communication standards this is how we can get everybody used to talking to each other

26:24.760 --> 26:29.640
in a way that they don't compromise information just by sharing that information for collective

26:29.640 --> 26:35.000
benefit, and then we are going to use that skill to do the threat modeling and usability

26:35.000 --> 26:41.080
testing of codes of conduct governance models and so forth to produce guidance on what to do next.

26:44.600 --> 26:50.200
We do have a session right after this talk we're going to be in building F somewhere that I can

26:50.520 --> 26:55.160
find; the existing group will be there, and if you want to ask me questions you can also find us there,

26:56.280 --> 27:01.480
and with that I thank you all for your time; feel free to come to our birds of a feather tomorrow morning.

