Over the last couple of days there have been a lot of reports about "Facebook Depression". For those who have not been listening, studies have shown that Facebook use can lead to depression, either because people watch other people's lives and think how boring theirs is, or simply because they slump when they don't get enough likes on a post. Well, there are really no surprises there. I have been seeing this trend for a long time and have written about it before. It's obviously not limited to Facebook and applies to social media in general. People I know like to tell me how many likes they got on a post, but they rarely mention the posts that didn't get many.
To some degree it's compounded by the fact that narcissism is growing at epidemic proportions, but laying blame becomes a chicken-and-egg discussion. I can see that sites like FB will have problems in the future, as I am starting to see people of all ages leaving FB and other social media apps. Unfortunately, a lot of people have become addicted to the validation and attention they get through this medium, and of course that's dangerous. Without self-awareness, the problem will continue to grow.
The danger for FB and other social media sites is what this can mean for them in the future. We all know that companies like FB are not altruistic; they are in business for money. It's a commercial enterprise. They may have started with the best of intentions, but the reality is that they want to make money. This is why FB and others are always trying to make more and more information available and to reinvent themselves, because in the end their value is often calculated, in part, on their membership numbers.
We have all heard about people who have used FB to hunt down and kill others, people committing suicide because of depression brought on by FB, or the escalation of bullying. Unless these sites start making fundamental changes and introducing much more visible support functions, they are heading towards future lawsuits by families and others. They won't be able to hide behind the excuse of being public forums forever. They have a corporate responsibility to start taking action; after all, the use of their product is contributing to these outcomes, and I can't see them getting away with it indefinitely unless they dramatically increase the support they offer their users. Algorithms already track people's moods and will only get more precise, so these social media companies will be liable precisely because they have the ability to do so much more positive work.
Only time will tell, but I can see a day coming when online social media companies will be held to account. I'm not saying individuals shouldn't take responsibility, just that not everyone is capable of recognising the danger signs, and at some level these websites will have to answer for that.