
1293 points rmason | 7 comments
1. b_tterc_p No.19323150
> How the study was conducted: A total of 1,500 persons were interviewed to explore Americans’ use of digital platforms and new media. From January 3rd through February 4th, 2019, telephone interviews were conducted with respondents age 12 and older who were selected via Random Digit Dial (RDD) sampling through both landline phones and mobile phones. The survey was offered in both Spanish and English. Data was weighted to national 12+ U.S. population estimates.

I wouldn’t put much faith in this estimate. While Facebook’s own figure is probably an overestimate of the people actually engaged with its platform, this survey doesn’t seem very useful to me.

replies(2): >>19323453 #>>19323615 #
2. ryana No.19323453
I am always sad to see responses like this. Statistics is a very well-defined mathematical discipline, and any good research firm will use weighting techniques to adjust for demographic-based likelihood of response. The results they get from this are very accurate.

If you have concerns about Edison's methodology or its application of standard survey weighting, then I think that could be a fruitful conversation. But implying that 1,500 responses can't be predictive for a country of 350 million is woefully misinformed.
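As a rough sketch of the weighting idea (the demographic cells and target shares below are purely illustrative, not Edison's actual ones): each respondent is weighted by the ratio of their cell's population share to its share of the raw sample, so under-represented groups count for more.

    # Minimal post-stratification weighting sketch.
    # Cells and shares are made-up illustrations, not Edison's real targets.
    population_share = {"12-34": 0.35, "35-54": 0.33, "55+": 0.32}   # assumed population targets
    sample_share     = {"12-34": 0.25, "35-54": 0.35, "55+": 0.40}   # assumed raw sample mix

    # Weight = population share / sample share for the respondent's cell.
    weights = {cell: population_share[cell] / sample_share[cell] for cell in population_share}

    def weighted_estimate(responses):
        """responses: list of (cell, uses_facebook: bool) pairs."""
        total = sum(weights[cell] for cell, _ in responses)
        yes = sum(weights[cell] for cell, uses in responses if uses)
        return yes / total

    print(weights)  # {'12-34': 1.4, '35-54': ~0.94, '55+': 0.8}
    print(weighted_estimate([("12-34", True), ("35-54", False), ("55+", True)]))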

replies(4): >>19323671 #>>19323869 #>>19324184 #>>19324400 #
3. throwawaymath No.19323615
Is there a specific reason you doubt the estimate, or is your problem that the sample size is small? Small sample sizes don’t imply incorrect conclusions.

EDIT: To whoever has downvoted this, I politely (but urgently) recommend you read up on statistical significance. The idea that a small sample size implies a study’s findings are unreliable is one of the most widely held misconceptions in modern statistics.
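For a rough sense of scale, a back-of-the-envelope calculation (treating it as a simple random sample and ignoring the design effect from weighting, which would widen the interval somewhat) puts the worst-case 95% margin of error for n = 1,500 at about ±2.5 points:

    # Worst-case 95% margin of error for a proportion, assuming a simple random sample.
    from math import sqrt

    n = 1500
    p = 0.5                              # p = 0.5 maximizes the standard error
    moe = 1.96 * sqrt(p * (1 - p) / n)
    print(f"±{moe:.1%}")                 # ≈ ±2.5 percentage points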

4. kevin_thibedeau No.19323671
Phone surveys were accurate when robocalls and cellphones didn't exist. Now you're only sampling the people who aren't discerning enough to reject unknown numbers.
5. underwater No.19323869
Having concerns about their methodology would require that they'd actually shared it. The closest they get is a hand-wavy answer about the discrepancies between Facebook's data and their own:

> We’re saying, “Do you currently use Facebook?” Facebook is probably measuring it on, “Do you ever open the app, or do you ever use it on any level?”

That answer doesn’t even make sense. Given that Edison have gone to the press to promote their report and this particular number, you’d expect them to have a good answer for the discrepancies. They should definitely know what the Facebook numbers represent, especially given Facebook publicly disclose their definition of an active user in SEC filings.

6. b_tterc_p No.19324184
Fair. I commonly do social statistics work myself and have to deal with worse. My gripe was more that it's a random-digit-dialing survey (which I think would be full of bias for a purpose like this) rather than the actual usage statistics that Facebook provides. Also, sampling is simply a hard thing to do. And their definition of leaving is pretty poor.

Also, if we want to get nitpicky: while there is a significant drop between 2017 and 2018, there is no significant drop between 2018 and 2019 (62% -> 61%, p-value of .57), despite the headline being 'Facebook Usage Continues to Drop' :)
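For what it's worth, that quoted p-value is roughly what a standard two-proportion z-test gives if you assume ~1,500 respondents in each wave (the 2019 sample size; the earlier wave's n isn't stated here):

    # Two-proportion z-test for 62% (2018) vs 61% (2019); per-wave sample sizes are assumed.
    from math import sqrt
    from statistics import NormalDist

    n1 = n2 = 1500                   # assumed respondents per wave
    p1, p2 = 0.62, 0.61              # reported Facebook-usage shares

    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    print(f"z = {z:.2f}, two-sided p = {p_value:.2f}")   # z ≈ 0.56, p ≈ 0.57: not significant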

7. dmurray No.19324400
Also, any methodology problems are somewhat mitigated by the fact that they ran the same survey with the same methodology in 2017 and compared the results. You'd expect better accuracy from asking people then and now "do you use Facebook?" than from asking them only now "do you use Facebook, and did you use it two years ago?"