So about midway through last year, we decided that we were going to apply a lot more machine learning, a lot more deep learning to the problem,
and try to be a lot more proactive around where abuse is happening, so that we can take the burden off the victim completely. And we've made some progress recently.
About 38 percent of abusive tweets are now proactively identified by machine learning algorithms so that people don't actually have to report them.
But those that are identified are still reviewed by humans, so we do not take down content or accounts without a human actually reviewing it.
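To make that flag-then-review workflow concrete, here is a minimal sketch, assuming a toy scoring function and an in-memory review queue; it is not Twitter's actual pipeline, and every name in it (toy_abuse_score, ReviewQueue, triage, the 0.2 threshold) is a hypothetical stand-in for the idea described above.

# Purely illustrative: the model only flags candidate tweets; nothing is
# acted on until a human moderator reviews what is in the queue.
from dataclasses import dataclass, field
from typing import List


def toy_abuse_score(tweet: str) -> float:
    # Hypothetical scorer; a real system would use a trained text classifier.
    abusive_terms = {"idiot", "loser"}  # placeholder vocabulary, not a real lexicon
    words = tweet.lower().split()
    return sum(w in abusive_terms for w in words) / max(len(words), 1)


@dataclass
class ReviewQueue:
    # Model-flagged tweets wait here until a human confirms or rejects them.
    pending: List[str] = field(default_factory=list)


def triage(tweets: List[str], queue: ReviewQueue, threshold: float = 0.2) -> None:
    # Proactive identification: flag likely abuse without waiting for a report.
    for tweet in tweets:
        if toy_abuse_score(tweet) >= threshold:
            queue.pending.append(tweet)


if __name__ == "__main__":
    queue = ReviewQueue()
    triage(["have a great day", "you absolute idiot"], queue)
    print(f"{len(queue.pending)} tweet(s) awaiting human review")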
But that was from zero percent just a year ago. So that meant, at that zero percent, every single person who received abuse had to actually report it,
which was a lot of work for them, a lot of work for us and just ultimately unfair.
The other thing that we're doing is making sure that we, as a company, have representation of all the communities that we're trying to serve.
We can't build a successful business unless we have a diversity of perspectives inside our walls, people who actually feel these issues every single day.
And that's not just with the team that's doing the work, it's also within our leadership as well.
So we need to continue to build empathy for what people are experiencing and give them better tools to act on it,