2014/06/29 02:14:07
Subject: Do not adjust your screen. We are in control: Facebook experiments on users
Shas'ui with Bonding Knife
http://www.slate.com/articles/health_and_science/science/2014/06/facebook_unethical_experiment_it_made_news_feeds_happier_or_sadder_to_manipulate.html
I figured it was just a matter of time before some company like FB or especially Google started experiments like this.
They control the type, frequency, and "tone" of the information that you see/find.
A real-world warning of just how much potential influence an information company can have on people... whether we realize it, admit it, or deny it.
They control the vertical.
They control the horizontal.
I destroy my enemies when I make them my friends.
Three!! Three successful trades! Ah ah ah!
2014/06/29 04:22:56
Decrepit Dakkanaut
You agree that you can be used for research when you sign up.
Remember: anytime you use a free product it means that you ARE the product.
2014/06/29 04:25:15
Terminator with Assault Cannon
Awesome. Kind of makes me wish I had Facebook.
*Insert witty and/or interesting statement here*
2014/06/29 04:34:30
Secret Force Behind the Rise of the Tau
USA
d-usa wrote:You agree that you can be used for research when you sign up.
Remember: anytime you use a free product it means that you ARE the product.
When I read through the article, all I could think was that someone obviously wasn't paying attention in science class when the Hawthorne Effect was being covered. Scientists don't tell people they're being observed for Science (TM) all the time. Given that Facebook already manipulates your feeds based on your own preferences, that they manipulate them for other reasons doesn't really strike me as something to get up in arms about.
The only part of the study I have issue with is the claim they manipulated what you saw from friends, which does to me seem more invasive than is necessary for a study. Facebook does have a normal news feed that's just popular news that could have been tweaked to achieve the same purpose.
2014/06/29 04:43:11
Huge Hierodule
Huh. I wonder if this is related to the fact my feed tends to just show me one person at a time.
Q: What do you call a Dinosaur Handpuppet?
A: A Maniraptor
2014/06/29 10:10:20
[MOD]
Anti-piracy Officer
Somewhere in south-central England.
d-usa wrote:You agree that you can be used for research when you sign up.
Remember: anytime you use a free product it means that you ARE the product.
The article says that consent must be "informed" (this is a general principle of justice, really) and the "consent" given by signing up to the Facebook T&C does not count as "informed consent" as applied in social science research.
2014/06/29 10:16:41
Hallowed Canoness
d-usa wrote:Remember: anytime you use a free product it means that you ARE the product.
Well, except for stuff like (the huge majority of) FLOSS (Free/Libre/Open Source) software.
"Our fantasy settings are grim and dark, but that is not a reflection of who we are or how we feel the real world should be. [...] We will continue to diversify the cast of characters we portray [...] so everyone can find representation and heroes they can relate to. [...] If [you don't feel the same way], you will not be missed"
https://twitter.com/WarComTeam/status/1268665798467432449/photo/1
2014/06/29 10:30:12
Secret Force Behind the Rise of the Tau
USA
The article is wrong. EDIT: To clarify, the laws requiring informed consent are targeted at medical research/procedures, but social science has traditionally been excluded from them.
These kinds of behavior studies happen frequently, even off the internet. The part that sets this one apart is that they were manipulating news feeds, which is more invasive than most studies I'm familiar with, but those feeds are already manipulated by a multitude of programs for numerous reasons, so what's the argument? "Facebook fed me manipulated data"? Welcome to the internet. Every ad and news feed you see is being manipulated by someone and targeted at you. The only difference here is that someone is watching publicly viewable information to see how you respond, if you respond at all.
No one is going to go to court and get a judgment that it was wrong for them to feel 'sadder.' I might as well sue DakkaDakka OT for its constant stream of "bad stuff in the world today" threads. I'll have about the same success.
2014/06/29 11:10:55
[SWAP SHOP MOD]
Killer Klaivex
LordofHats wrote:The article is wrong. EDIT: To clarify, the laws requiring informed consent are targeted at medical research/procedures, but social science has traditionally been excluded from them. [...]
Would you hold the same view if it turned out that a higher proportion of the targeted group for 'depression' topped themselves than would statistically have otherwise done so?
2014/06/29 11:22:48
Decrepit Dakkanaut
Ketara wrote:Would you hold the same view if it turned out that a higher proportion of the targeted group for 'depression' topped themselves than would statistically have otherwise done so?
Did they?
2014/06/29 11:26:50
[MOD]
Anti-piracy Officer
Somewhere in south-central England.
This reminds me of the "blipverts" in Max Headroom.
2014/06/29 11:27:04
Decrepit Dakkanaut
I feel better knowing that people might have been made so happy that they left the house and followed their dreams, and are now rich because they went and applied for that job they weren't sure about.
2014/06/29 11:34:05
Secret Force Behind the Rise of the Tau
USA
Ketara wrote:Would you hold the same view if it turned out that a higher proportion of the targeted group for 'depression' topped themselves than would statistically have otherwise done so?
I'd be pretty heartless not to care, but honestly. You think anyone, lacking a prior condition, is going to top themselves because news got a little more negative? Social studies like this are usually put through an ethics committee at several stages, so it's not like they're being reckless (probably). There's really no way for this kind of study to be conducted with consent, as knowledge of the study invalidates the results, a long-standing ethical question for people in this kind of work (as historians, we can slightly relate, as it's the same problem that can plague oral history).
The ethical question here isn't even damage to the subjects, in my eyes, as I see no way for this to really cause damage, but rather the way the researchers chose to interfere. If it was just a generic news feed, I'd probably be 100% fine with it, but manipulating news feeds connected to friends lists strikes me as overly invasive for the needs of the study.
2014/06/29 12:06:12
[SWAP SHOP MOD]
Killer Klaivex
Soladrin wrote:Did they?
Ketara wrote:Would you hold the same view if it turned out
LordofHats wrote:I'd be pretty heartless not to care, but honestly. You think anyone, lacking a prior condition, is going to top themselves because news got a little more negative?
Why the qualification 'lacking a prior condition'? Do you think Facebook has access to people's medical records to ensure that people with prior conditions are filtered out?
Suicides are usually caused by a moment of absolute blackness/despair whilst in a more general depressive cycle, and as a result are impulse-driven. It would not be impossible to conceive of a scenario in which a highly depressed person, feeling very alone in the world, received only deliberately edited negative information for a period of time, and was pushed to those impulsive moments of self-harm and despair slightly more often as a result. In fact, I'd say it's even probable that would be the case.
The result would naturally be a slightly higher rate of suicides. That wouldn't be instantly evident within a small sample group, but I'd wager it would be a directly applicable result of a study in which you deliberately make people more unhappy.
2014/06/29 12:15:34
Secret Force Behind the Rise of the Tau
USA
Because I don't see how this could drive a normal person to top themselves. Negative news feeds happen as a regular course of daily life, hence my reference to DakkaDakka OT. We have negative strings of threads here all the time. All these guys have done is ensure there is a negative news feed at a time when they are able to observe the effects. The study never clarifies the kind of news. It could range from "another robbery at a corner store" to "omg Facebook is controlling you mindz everything you know is a lie!" for all we know.
I kind of dislike Slate in general, because the quality of the writing in many of its articles is kind of shoddy. There's too little information here to form an informed opinion on this particular case. It's another knee-jerk reaction piece.
Ketara wrote:Do you think Facebook has access to people's medical records to ensure that people with prior conditions are filtered out?
Facebook isn't conducting the study, merely facilitating it (I doubt this is the first time either, as researchers have long held an interest in using social media for research). It looks like Cornell and the University of San Fran are the ones running the show. The article gives us no idea how the people being observed are chosen, so we have no way to evaluate the methodology of the study on that front. Like I said before, the common procedure for such studies is to be reviewed by an ethics committee before and during. Attributing recklessness to the researchers when we have only hypothetical possibilities of what could maybe happen isn't really going to get us anywhere.
2014/06/29 12:23:47
[SWAP SHOP MOD]
Killer Klaivex
LordofHats wrote:Because I don't see how this could drive a normal person to top themselves. Negative news feeds happen as a regular course of daily life, hence my reference to DakkaDakka OT. [...]
Often, people with high levels of depression also suffer from high levels of anxiety, resulting in a disinclination to leave the house much (and the obvious result that they use the internet far more). With Facebook membership and usage at the heights that it currently is amongst people within a certain age bracket, the overall potential for a deliberately created 'negative news trend' to have an impact amongst sufferers of depression is clearly far greater than one deliberately inserted within say, a national newspaper.
LordofHats wrote:Facebook isn't conducting the study, merely facilitating it (I doubt this is the first time either, as researchers have long held an interest in using social media for research). It looks like Cornell and the University of San Fran are the ones running the show. The article gives us no idea how the people being observed are chosen, so we have no way to evaluate the methodology of the study on that front.
Sure. But you know as well as I do that researchers from any university also do not hold access to people's private medical records, generally speaking. Certainly not those of random people on Facebook. And it would have to be a random sampling of people on Facebook, or their study would be meaningless.
LordofHats wrote:Like I said before, the common procedure for such studies is to be reviewed by an ethics committee before and during. Attributing recklessness to the researchers when we have only hypothetical possibilities of what could maybe happen isn't really going to get us anywhere.
I didn't attribute recklessness. I merely hypothesized as to a likely consequence of the study.
2014/06/29 12:42:29
Secret Force Behind the Rise of the Tau
USA
Ketara wrote: the overall potential for a deliberately created 'negative news trend'
Then why only pick on this study? Do you have a news feed? Congratulations, someone somewhere wrote code that is manipulating that news feed.
Especially since we live in a world where the media is constantly criticized for being too negative, I don't see how this is really going to mess anyone up in a way that normal news already hasn't, unless they're choosing some really dark news. I'm not really sure we can draw a line in the sand that says party A can manipulate data and record the results but party B can't, on the grounds that party B was purposely trying to make people sad while party A only made them sad as a matter of course. That's kind of an unenforceable standard.
Ketara wrote:Sure. But you know as well as I do that researchers from any university also do not hold access to people's private medical records, generally speaking. Certainly not those of random people on Facebook. And it would have to be a random sampling of people on Facebook, or their study would be meaningless.
For all we know they do. It's not uncommon for a university to use its own student population for studies of this nature, and at the very least it is likely to know whether a student has a condition that might be influenced by research. A random sampling doesn't necessarily mean completely random. We are given no information about how the people being studied are chosen.
Rereading the article quickly, it does say that the study was approved by a board, so at least that happened. Caplan, the 'expert' offering the opinion that the conduct here is unethical, says later in the article that he doesn't know the legal standard needed for the study. So really, we have a guy saying "this is unethical imo, but I'm not really sure what the law says."
Also, rereading the article, I think I can definitely say this: if Facebook is going to use people for research, it should be stated clearly. This is a general EULA problem, as the same thing is in the EULAs for Origin and Steam. We could probably avoid these knee-jerk reaction articles and the outrage if companies explained more upfront exactly how their users might be used in research studies, and better defined the kinds of things they will involve themselves in.
Ketara wrote:I didn't attribute recklessness. I merely hypothesized as to a likely consequence of the study.
Yeah... and the nuclear plant down the road could melt down tomorrow, but I have no information to suggest that anyone there isn't following all the safety procedures, making all my hypothesizing about how it might melt down tomorrow useless as anything more than a mental exercise in how much I know about nuclear reactors.
2014/06/29 12:54:55
Shas'ui with Bonding Knife
Ok, you don't care for Slate, does Forbes tickle your fancy?
http://www.forbes.com/sites/kashmirhill/2014/06/28/facebook-manipulated-689003-users-emotions-for-science/
Regardless of the source or the "quality" of the writing, the message and intent are the same across the spectrum:
To warn and educate about the purposeful manipulation of information that you see from social media.
I would say that our members seem to be a bit more intelligent and intellectual than the normal population. I think part of that comes from the cerebral nature of the wargames we enjoy.
This probably isn't all that big of a shock to most of us here.
But to the average Jane or Joe out there, finding out that FB has been purposely feeding them negative stuff that they might not otherwise have seen, or seen as much of, could be quite shocking.
"Normal" people put their faith in companies like FB or Google or CNN or Fox all the time. They believe that what they are being told or shown is genuine (i.e. not manipulated). We, on the other hand, know better.
Hopefully news of this will spread and wake people up to not just FB manipulation, but media manipulation in toto.
- - - - -
I too had the same thought about the potential culpability of FB should the suicide rate of its users in the 700k study suddenly increase.
Perhaps a severely depressed person missed out on one positive message, or, vice versa, was shown an extremely depressing article... it is entirely possible that said news feed update could be the catalyst for that person to take their life.
Or, as I said, that one happy post could keep them going for another day.
Having worked in a psych facility, I have seen firsthand just how fragile the human psyche can be, and how one thing (a word, a picture, a TV ad, etc.) can set off a cascade of behavior.
2014/06/29 13:00:25
[SWAP SHOP MOD]
Killer Klaivex
LordofHats wrote: Ketara wrote: the overall potential for a deliberately created 'negative news trend'
Then why only pick on this study? Do you have a news feed? Congratulations, someone somewhere wrote code that is manipulating that news feed.
Why? Because this one deliberately promotes a negative newsfeed designed to make people unhappy. When I see another study designed to limit the flow of information to people to negative items only, I will make the same comment again. I'm not entirely sure what you're implying here.
LordofHats wrote:Especially since we live in a world where the media is constantly criticized for being too negative, I don't see how this is really going to mess anyone up in a way that normal news already hasn't, unless they're choosing some really dark news.
Sure. Page 1 has 'Death in Syria', and Page 4 'Bacon gives you cancer'. But Page 8 usually has 'Funny cat does funny thing' and Page 23 'Jordan just got married again'. Newspapers have a random selection based on what they believe readers will find to be of interest, and things such as comedy also feature prominently within that. In this particular case, though, the variety is deliberately being filtered to promote negative items as exclusively as possible.
LordofHats wrote:I'm not really sure we can draw a line in the sand that says party A can manipulate data and record the results but party B can't, because party B was purposely trying to make people sad while party A only made them sad as a matter of course. That's kind of an unenforceable standard.
Why?
Party A is manipulating data generally. It could be promoting nice things, funny things, or a mix of things. People are unlikely to have their odds of topping themselves increased from laughing too much, wouldn't you say? Sure, they might go, 'Oh, Granny used to like this show', but that sort of thing is unlikely to manifest as a physical trend.
Party B alternatively, is deliberately attempting to make people unhappy.
LordofHats wrote:For all we know they do. It's not uncommon for a university to use its own student population for studies of this nature, and at the very least it is likely to know whether a student has a condition that might be influenced by research.
Very, very unlikely, to the point of improbable. I only left the possibility of them having medical records even remotely open simply because occasionally universities do drug trials and suchlike which require them to have access to medical records. Facebook explicitly stated that people opted into this study through their user agreement, which means that the odds of the selection criteria being 'people we already have access to the medical records of' are minute at best, and wildly implausible at worst.
LordofHats wrote:A random sampling doesn't necessarily mean completely random. We are given no information about how the people being studied are chosen.
Sure. But I reiterate, the odds of them having been able to check the medical records of the entire sample for those prior conditions you referenced are exceedingly small.
LordofHats wrote:Yeah... and the nuclear plant down the road could melt down tomorrow, but I have no information to suggest that anyone there isn't following all the safety procedures, making all my hypothesizing about how it might melt down tomorrow useless as anything more than a mental exercise in how much I know about nuclear reactors.
I never commented on supposed recklessness. I also never commented on a lack of oversight. I hypothesized solely about likely outcomes of this type of study. You appear to be confusing that with me commenting on the ethics of undertaking the study in the first place. Which I haven't done.
2014/06/29 13:05:39
Secret Force Behind the Rise of the Tau
USA
Yeah, but the Forbes article contains way more information. Lead with that one next time.
Good tidbits:
such as the fact that it can keep track of the status updates we never actually post.
If there was a week in January 2012 where you were only seeing photos of dead dogs or incredibly cute babies
“*Probably* nobody was driven to suicide,” tweeted one professor linking to the study, adding a “#jokingnotjoking” hashtag.
Even if he's just mocking the idea that the study would cause suicides, because he's a professional who is very good at what he does, finds the idea absurd, and is absolutely correct that it is, that's not really the attitude I'm looking for...
The article also specifies that the researchers were Facebook employees. I suppose on the bright side of things, Facebook will now be manipulating everyone's news feeds to be happier, but it definitely looks now like they just picked people with no real filter other than maybe how many friends they had and how active their feeds were.
To warn and educate about the purposeful manipulation of information that you see from social media.
Honestly, the boat's already sailed in my opinion. We're now in put-up-or-shut-up mode, because there's no escaping it anymore.
Automatically Appended Next Post:
Say someone frequently reads negative news. The code that picks their news feed picks up on this and starts feeding them more and more negative news. The algorithm isn't specifically written to feed them negative news, but the practical difference is basically zero as it relates to the subject. They're being fed a constant stream of negative news, so really, why should we care about one that's written from the start to feed negative news when hundreds of thousands (probably millions) of people are already being fed the exact same thing by other feeds as a matter of course?
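[The feedback loop described above can be sketched in a few lines of entirely hypothetical Python. None of this is Facebook's actual code or anything from the study; the word lists and function names are made up for illustration. The point is that a ranker scored purely on past engagement drifts toward negativity for a user who clicks negative stories, without ever being told to prefer negative news:]

```python
def sentiment(item):
    """Toy sentiment score: -1 for headlines with negative keywords, +1 otherwise."""
    negative_words = {"death", "crash", "war", "robbery", "worsens"}
    return -1 if set(item.lower().split()) & negative_words else 1

def rank_feed(candidates, click_history):
    """Rank candidate stories by similarity to what the user previously clicked."""
    # Average sentiment of the user's past clicks is the learned "taste".
    taste = (sum(sentiment(c) for c in click_history) / len(click_history)
             if click_history else 0.0)
    # Closest-to-taste first: a user with a negative click history
    # gets negative stories at the top of the feed.
    return sorted(candidates, key=lambda item: abs(sentiment(item) - taste))

history = ["Robbery at corner store", "Train crash kills twelve"]
feed = rank_feed(
    ["Funny cat does funny thing", "War worsens abroad", "Local team wins"],
    history,
)
print(feed[0])  # → War worsens abroad
```

[Each click on a negative story pushes the learned taste further negative, which surfaces more negative stories, which generates more negative clicks: the loop the post describes, with no explicit "feed them negative news" rule anywhere in the code.]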
Ketara wrote:Very, very unlikely, to the point of improbable. I only left the possibility of them having medical records even remotely open simply because occasionally universities do drug trials and suchlike which require them to have access to medical records.
I don't know how it is in the UK, but in the US many universities have a clinic on campus that requests medical records from students. I've actually never bothered to look to see if they use those records for anything other than record keeping.
Ketara wrote:I never commented on supposed recklessness. I also never commented on a lack of oversight. I hypothesized solely about likely outcomes of this type of study. You appear to be confusing that with me commenting on the ethics of undertaking the study in the first place. Which I haven't done.
We have ethics and ethics boards precisely because the question you asked is something they consider (and obviously they are aware of it). To not consider that you might kill someone in the course of your research is reckless, and presumably the board that approved the study would have asked the exact same question. Any researcher studying human behavior should be considering the possibility; otherwise they're committing nonfeasance. This is the whole reason we legalized ethics in the first place.
2014/06/29 13:47:30
[SWAP SHOP MOD]
Killer Klaivex
LordofHats wrote:Say someone frequently reads negative news. The code that picks their news feed picks up on this and starts feeding them more and more negative news. [...]
You are correct in that the outcome would be the same in both scenarios. This does not indicate that the difference between the two scenarios is not worthy of commentary though. For example, in one case, a dog kills a man. In another, a man dies of old age. In both scenarios, the outcome is the same (i.e. a dead person), but you wouldn't cease commenting on the former just because the latter one occurs on a daily basis.
LordofHats wrote:I don't know how it is in the UK, but in the US many universities have a clinic on campus that requests medical records from students. I've actually never bothered to look to see if they use those records for anything other than record keeping.
We have the same, but those records are restricted to medical professionals who you have granted access to. Nobody has automatic access to your records that you do not explicitly grant access to. My own parents wouldn't be able to access mine unless I explicitly told the NHS to allow them to in the specific instance in which they were asking.
LordofHats wrote:We have ethics and ethics boards precisely because the question you asked is something they consider (and obviously they are aware of it). To not consider that you might kill someone in the course of your research is reckless, and presumably the board that approved the study would have asked the exact same question. Any researcher studying human behavior should be considering the possibility; otherwise they're committing nonfeasance. This is the whole reason we legalized ethics in the first place.
Sure. But all of that lies far outside the scope of my hypothesis of likely outcomes of the study. Whether they considered it or not has nothing to do with my analysis of potential for a slight increase in suicides in the scenario provided by the article.
2014/06/29 13:57:14
[MOD]
Anti-piracy Officer
Somewhere in south-central England.
"Ethical" and "Legal" are not the same thing.
2014/06/29 13:57:35
Avatar of the Bloody-Handed God
Inside your mind, corrupting the pathways
Hmmm... facebook... Vault-tec... is there no one you can trust these days?
And psychological studies need ethical approval and consent the same as any other trial.
And not just "you signed up to our service and clicked through the 800 page user agreement", but "you agree to go into this trial" kind of agreement...
|
|
|
 |
 |
![[Post New]](/s/i/i.gif) 2014/06/29 14:03:09
Subject: Do not adjust your screen. We are in control: Facebook experiments on users
|
 |
Secret Force Behind the Rise of the Tau
USA
|
I'm not really talking about how a person who is sad in one scenario is the same as another sad person in some other unrelated scenario. Rather, say one man is killed by a dog trained to kill a man, and another is killed by a dog trained to kill a man, but it's okay, he wants to be killed. Either we have an issue with dogs trained to kill people, or we don't. Likewise, either we're okay with algorithms manipulating what we see and feeding us mountains of negative news, or we're not... Choosing to take issue with one of those dogs on the grounds that they're trained to kill people, but not the other, is rather arbitrary as a rule.
Ketara wrote:Whether they considered it or not has nothing to do with my analysis of potential for a slight increase in suicides in the scenario provided by the article.
Then what does this discussion really have to do with the study at all? If researchers have accounted for the possibility and a separate review board has ruled that the danger is acceptable, then why should we really be talking about something that hasn't happened and that the researchers themselves considered and deemed within the bounds of acceptability?
Automatically Appended Next Post:
No, but as it pertains to research studies (mostly medical research and practice), the West started legalizing key aspects of ethics following the Nuremberg Trials (look up the Nuremberg Code). It's why we can sue researchers in tort law.
|
This message was edited 3 times. Last update was at 2014/06/29 14:07:34
|
|
 |
 |
![[Post New]](/s/i/i.gif) 2014/06/29 14:05:10
Subject: Do not adjust your screen. We are in control: Facebook experiments on users
|
 |
Decrepit Dakkanaut
|
My guess is that in the UK, quite a few people actually have access to your records that you don't even know about. The standard process in such cases is usually to make sure that people don't know they are "your records" by removing all your personal information from them.
They wouldn't have "d-usa's" medical records. They would have the medical records of person 1275953 who is a [insert my demographics] with the following health information.
|
|
 |
 |
![[Post New]](/s/i/i.gif) 2014/06/29 14:11:27
Subject: Do not adjust your screen. We are in control: Facebook experiments on users
|
 |
[SWAP SHOP MOD]
Killer Klaivex
|
LordofHats wrote:I'm not really talking about how a person who is sad in one scenario is the same as another sad person in some other unrelated scenario. Rather, say one man is killed by a dog trained to kill a man, and another is killed by a dog trained to kill a man, but it's okay, he wants to be killed. Either we have an issue with dogs trained to kill people, or we don't. Likewise, either we're okay with algorithms manipulating what we see and feeding us mountains of negative news, or we're not... Choosing to take issue with one of those dogs on the grounds that they're trained to kill people, but not the other, is rather arbitrary as a rule.
Who 'took issue'? It certainly wasn't me. I think you may be confusing me with somebody else.
Then what does this discussion really have to do with the study at all?
Errr......someone posted news of a study undertaken between Facebook and a University group. I commented as to the potential/probable ramifications of the aforementioned study. Which is somewhat the point of a discussion thread?
If researchers have accounted for the possibility and a separate review board has ruled that the danger is acceptable, then why should we really be talking about something that hasn't happened and that the researchers themselves considered and deemed within the bounds of acceptability?
Because I mulled over the likely consequences of the aforementioned study and posted my thoughts for discussion. That is somewhat the purpose of being here.
Although thoroughness compels me to point out that you have no proof that my hypothesis has not occurred, or indeed even that the researchers necessarily considered it.
|
|
|
 |
 |
![[Post New]](/s/i/i.gif) 2014/06/29 14:13:18
Subject: Do not adjust your screen. We are in control: Facebook experiments on users
|
 |
Major
Middle Earth
|
SilverMK2 wrote:Hmmm... facebook... Vault-tec... is there no one you can trust these days?
And psychological studies need ethical approval and consent the same as any other trial.
And not just "you signed up to our service and clicked through the 800 page user agreement", but "you agree to go into this trial" kind of agreement...
The study was undertaken with several universities so presumably it was passed by at least one ethics board, or at least that's how things work here in New Zealand.
|
We're watching you... scum. |
|
 |
 |
![[Post New]](/s/i/i.gif) 2014/06/29 14:14:41
Subject: Do not adjust your screen. We are in control: Facebook experiments on users
|
 |
Secret Force Behind the Rise of the Tau
USA
|
Ketara wrote:Who 'took issue'? It certainly wasn't me. I think you may be confusing me with somebody else.
The Straw Man, I guess.  He must have slipped out while I wasn't looking.
Although thoroughness compels me to point out that you have no proof that my hypothesis has not occurred, or indeed even that the researchers necessarily considered it.
I tend to operate on the "this is how things are supposed to be done and how a responsible human being would do them, and lacking any evidence to the contrary, I assume that this is the way things are being done" principle in any given situation.  I've never been a fan of skepticism for the sake of skepticism. My parents are like that and I grew up hating it.
|
This message was edited 1 time. Last update was at 2014/06/29 14:19:57
|
|
 |
 |
|