Facebook Ethical Responsibility on Social Media Crime

[Student's Name]

[Instructor's Name]

[Course]

[Date]

Facebook's Responsibility in Rescuing a Crime Victim

I do not think that Facebook has a legal duty to rescue a crime victim. Facebook provides a platform for social networking on which anybody can post anything at any time, including footage of crimes that are in progress or have already happened. Facebook can only become aware of such content if it is reported or if someone is constantly monitoring posts. Even then, Facebook is not in a position to help the victim unless it has a fast and reliable way of reporting the act to the relevant authorities, such as the police. If somebody streams a video while killing another person, the victim may already be dead by the time Facebook receives the information and passes it on.

The only way Facebook can help is to strengthen its monitoring of posted videos and images and to maintain reliable access to legal authorities so that a live crime can be reported quickly. This is its ethical duty to a crime victim. It should also move quickly to take down videos and images that depict inhumane acts and permanently disable the accounts of such users. In summary, Facebook cannot be held legally responsible. Its workers, like any Facebook user or anyone else who comes across a crime, can only alert the police so that the relevant authorities can take over the matter.

Ways in Which Social Media Platforms Can Be More Proactive and Thorough in Reviewing the Type of Content That Appears on Their Sites

Introduction

Social sites are becoming increasingly popular, and this has given rise to ethical issues such as cyber-bullying that call for investigation (Bauer, 2014). The sites are open for anyone to create an account regardless of who they are or where they come from, and people's intentions in joining social media vary. The many instances of abuse of social media call for evident change in the way companies operate and communicate about sustainability; the responsibility should not be left to Facebook managers alone (Coombs, 2012). Users are also so numerous that millions of posts are registered every second. Social media companies can, however, try several tactics to reduce harmful posts.

Photo-matching technologies

This is a technology that social website owners can adopt to stop the circulation of content that has already been detected and rejected on one social website. It helps prevent the same content from appearing on every other social site, where people would comment on it and circulate it further. Once the content has been taken down from one site, it is unlikely to be seen online again unless someone finds a way to repost it. Such technology can be very useful if it is also applied to viral video content. A simple sketch of this idea appears below.
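As an illustration only, the following minimal Python sketch shows how an upload could be checked against fingerprints of content that has already been removed. The blocklist and the exact SHA-256 digest are assumptions made for the example; real photo-matching systems rely on perceptual hashes that survive resizing and re-encoding rather than exact file digests.

import hashlib

# Hypothetical blocklist of fingerprints of content already taken down.
# A production system would use perceptual hashes that tolerate small
# edits; an exact SHA-256 digest is used here only for illustration.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest identifying this exact file."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_block_upload(image_bytes: bytes) -> bool:
    """Reject an upload if it matches previously removed content."""
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES

# A re-uploaded copy of a removed image would be caught before it spreads.
if should_block_upload(b"...image bytes..."):
    print("Upload rejected: matches previously removed content.")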

Suicide prevention tools

These are tools that can be incorporated to help posters decide whether or not to post the content. They would also provide contact persons or links through which the content can be reported for rescue purposes. Police and other rescue hotlines can be included to make reporting fast and reliable. Since Facebook cannot directly punish human rights offenders, it can only provide an easy means of reaching the legal authorities to make a report. Such reporting should be as fast as possible, and taking down such content should happen within a very short period after posting to limit circulation and comments.

Improved algorithms to detect undesired posts

Just as spam emails are identified, site developers can identify certain keywords that mark a posted message as harmful and report it or raise an alert to block the content. An automated service can even reject the content at upload, prevent anyone from commenting on it, or block it from being shared to stop it from circulating. A rough sketch of such screening follows.
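The short Python sketch below illustrates the kind of keyword-based screening described above, in the same spirit as spam filtering. The keyword list and the thresholds are assumptions made purely for the example and do not reflect any platform's actual rules.

# Illustrative keyword screen for newly posted text.
# The keyword list and thresholds are assumptions for this example only.
FLAGGED_KEYWORDS = {"kill", "attack", "bomb"}

def screen_post(text: str) -> str:
    """Return an action for a post: 'block', 'review', or 'allow'."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    hits = words & FLAGGED_KEYWORDS
    if len(hits) >= 2:
        return "block"    # stop the upload and prevent sharing
    if hits:
        return "review"   # alert a human moderator before it circulates
    return "allow"

print(screen_post("Watch me attack and kill him"))  # -> block
print(screen_post("Great match last night"))        # -> allow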

Measures That Facebook and Other Social Media Platforms Should Put in Place to Help Prevent Acts of Violence from Being Broadcast

Automation through artificial intelligence

Such automated services can detect each piece of content as it is posted and automatically take it down. Facebook already has automated systems for detecting and removing child pornography to prevent it from going viral. The same technology can be adopted by other social media platforms and applied to any harmful content to curb its spread, as sketched below.
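The following minimal sketch, again in Python, shows how an automated take-down decision might combine a model score with escalation to human reviewers. The classify() stub and the 0.9 and 0.6 thresholds are hypothetical placeholders; a real system would use a trained content classifier.

# Sketch of an automated take-down decision driven by a model score.
def classify(content: bytes) -> float:
    """Stub: estimated probability that the content is violating (placeholder)."""
    return 0.95

def moderate(content: bytes) -> str:
    score = classify(content)
    if score >= 0.9:
        return "remove"    # take the content down automatically
    if score >= 0.6:
        return "escalate"  # queue for the round-the-clock review team
    return "publish"

print(moderate(b"...uploaded video bytes..."))  # -> remove

Borderline scores are escalated rather than removed outright, which is where the review team described in the next point comes in.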

Having a team at all hours to scrutinize the posted content and report where necessary

Having such a team can help Facebook respond quickly to any suspicious content and act on it accordingly. Over the past years, Facebook has been criticized for taking many hours to remove such content, which has allowed the same content to circulate from one platform to another.

Facebook Ethics Officer and Oversight Committee

Research shows that Facebook does not have an ethics officer. A great deal of abusive content has been posted on the platform, and an ethics officer should have found a way to reduce it. An ethics officer should be able to decide which content should be left up and which should be removed. Even though algorithms can be used to filter content, human judgment is still needed to set the rules that define desired and undesired posts. An ethics officer should be able to help the oversight committee make Facebook ethically responsible.

Changes Facebook Should Adopt to Encourage Ethical Use of Their Platforms

One way to bring sanity to Facebook use is to impose sanctions on anyone posting abusive content. Through reliable monitoring of posted content, Facebook should be able to trace the origin of a message and, in collaboration with the government and the legal authorities, ensure that the responsible individuals face the law. Facebook should pay particular attention to hateful content. Nobody should have the right to infringe on another person's privacy or attack others simply because of their differences. Attacks can take the form of sexual abuse or of targeting a person's race, religion, disability, or illness.

Facebook should also be able to tell whether an account is genuine. Hackers have manipulated genuine accounts and used them to spread hateful content. Such accounts should be permanently disabled, and site security should be improved to prevent user accounts from being hacked. Facebook should also provide guidelines to users on how to protect their accounts from misuse by hackers. Most users are not aware of the risks involved in using online platforms; they post content without knowing that someone else can manipulate it from within their own accounts. Facebook should regularly post educational content on securing accounts, and it should be able to detect suspected account compromises and warn the affected users.

Conclusion

Facebook is a social networking platform. It gives people a chance to interact regardless of who and where they are. It cannot be held legally responsible for any crime posted and circulated over social media. However, it has an ethical duty to report any crime brought to its attention to the relevant authorities so that legal action can be taken against the responsible persons. This means that Facebook should collaborate closely with the government, the police, human rights institutions and any other relevant institution to which it can report cases of online abuse. To make such a report, it must first identify the post, and it should therefore have the right structures in place to point out such content.

Facebook should hire an ethics officer to assist the oversight committee in deciding which content is acceptable and which is not. The ethics officer should be able to put rules in place or draft a code of conduct that Facebook users must adhere to, and every post would be scrutinized against those rules. Facebook should take ownership of its platform and provide regulations for anyone who wishes to join and use it. Online abuse, cyber-bullying and circulated crime videos can easily damage society.

References

Bauer, T. (2014). The responsibilities of social networking companies: Applying political CSR theory to Google, Facebook and Twitter. In R. Tench, W. Sun, & B. Jones (Eds.), Communicating corporate social responsibility: Perspectives and practice (Critical Studies on Corporate Responsibility, Governance and Sustainability, Vol. 6). Emerald Group Publishing Limited.

Coombs, W. T. (2012). Crisis communication: Planning, managing, and responding (3rd ed.). Sage.
