I read this article about rape culture: http://www.usnews.com/opinion/blogs/economic-intelligence/2013/10/24/statistics-dont-back-up-claims-about-rape-culture. It essentially claims that statistics don't back up claims of a rape culture. I'm not going to argue about what the statistics are; here are some you can look at if you're a bitter man who has deluded himself into believing false accusations are made all the time: http://www.rainn.org/statistics?gclid=COT6ku-_m7wCFcY7MgodFWAAtA. The point of this post is to remind people to read critically and not to take articles like the first one seriously.

The article argues that "Statistics surrounding sexual assault are notoriously unreliable and inconsistent." The citation for that claim simply leads to a professor who indicts the "one in four" statistic and argues that the lack of cohesion in studies' definitions of rape makes them unreliable; the rest of the cited piece goes on attacking that one-in-four figure specifically. The article's author uses this single citation as the basis for the much broader claim quoted above. However, the source only disputes one piece of statistical evidence, and a lack of cohesive definitions doesn't mean all scholarship is "unreliable," nor does it necessarily imply inconsistency.

First, "unreliable" implies the studies themselves were poorly done. Aside from the one-in-four study, which was indicted for its "expansive" definition of sexual assault, no other examples are provided. That lack of evidence lets us call the claim out for what it is: a gross overgeneralization.

Now for the claim of "inconsistency." It is true, according to the cited work, that multiple definitions are in use. But inconsistent definitions don't necessarily mean inconsistent results: it could be that the most conservative and the most liberal definitions still turn up similar numbers. And even if the results did diverge, that wouldn't be a compelling reason to disregard all studies, only a reason to consider each study individually.

Finally, the laughable moment of the article is when it cites statistics from an organization indicted in its own citation. The Bureau of Justice Statistics figures it leans on are called unreliable by that very same source, which states:
Feminist activists and others have plausibly argued that the relatively low figures of the FBI and the Bureau of Justice Statistics are not trustworthy. The FBI survey is based on the number of cases reported to the police, but rape is among the most underreported of crimes. The Bureau of Justice Statistics National Crime Survey is based on interviews with 100,000 randomly selected women. It, too, is said to be flawed because the women were never directly questioned about rape. Rape was discussed only if the woman happened to bring it up in the course of answering more general questions about criminal victimization. The Justice Department has changed its method of questioning to meet this criticism, so we will know in a year or two whether this has a significant effect on its numbers. - http://www.leaderu.com/real/ri9502/sommers.html

Honestly, the author of the article is an idiot: making wild claims from very little evidence and then contradicting herself with her own evidence. Perhaps if she had read her own sources she wouldn't be so easily exposed as a fraud. The point is, whenever you see a headline, be critical. And don't let a sea of citations and studies blunt your ability to analyze the points being made. This is the worst kind of cherry-picking, and the author should be ashamed.