Hello all,

We know how important it is as a team to keep improving our moderation, and since late August the development team has been building the tools the moderation team needs to improve its effectiveness. Moderation hasn't been a topic we have often spoken about publicly in the past, but we think this is a good time to improve our transparency and share some extra insight into our moderation & live operations.

One of the key elements of any competitive multiplayer game is moderation, but this is not an easy challenge for any company, including Hypixel. Being part of the Minecraft platform presents an extra, unique challenge: it prevents us from utilizing some of the solutions that owning the accounts and the client would offer. Although this doesn't stop us from striving to improve the community and our moderation efforts, it does mean we often have to spend a lot of extra resources to do so.

As a team, we have always agreed to strive for a balance between strictness and forgiveness. We have a lot of young players in the Hypixel community who are often influenced by others or don't understand that what they are doing is wrong. We strongly believe we should educate everyone by providing a fair warning or a second chance, though we recognize this has not been a popular decision in the eyes of some of our community.

So what have we been doing to improve the fairness of competition for players? A lot of our focus has been on prevention and detection systems such as Watchdog. From Watchdog's release in January 2016 to the time of writing this post, it has issued over 2,629,859 actions against players. I honestly wouldn't like to imagine what things would be like without Watchdog. However, we understand that this is not an all-in-one solution, as cheat makers will always look for new methods to exploit in the Minecraft client, which means we are in a continuous tug of war with them. To help us with this issue, we have been working on broade