In a major move, the company may be looking at ways to let users participate in combating abusive or offensive content. Leaker Mukul Sharma says the micro-blogging giant is developing a feature that urges users to review replies that may contain offensive or abusive language, along with an option to remove such content. Of course, this does not mean every user will be given free rein over the Twitterverse, able to review and delete or edit content at will. Instead, the actions will apply only to replies to a user's own tweets. In other words, users will become moderators of the reply chains that follow their tweets. The feature is not available yet; it is currently being tested.
— Mukul Sharma (@stufflistings) June 20, 2022

It is thought the tool will alert users when a reply to one of their tweets contains offensive, bullying, or otherwise inappropriate language. Users will also be able to respond to Twitter if they believe the flagged content is not actually offensive.
Editing
Twitter is also known to be testing an edit button, which would give tweet authors the ability to fix errors or change their content. Yes, a giant platform that relies solely on the power of the written word still does not have an edit button. Either way, I will chalk these two upcoming features up as small wins for Twitter.

Tip of the day

Windows lets you use Cortana to translate sentences, words, or phrases, with the results read back to you automatically. This makes it particularly useful for group scenarios, but you can also type if you're unsure about pronunciation. Cortana translation supports an impressive 40 languages and uses machine learning to deliver natural results in many cases. Check our full guide to learn how to use Cortana for quick translations.