A new 50-state HUGE poll shows exactly why H Clinton holds the advantage over tRump

  • Posted by a hidden member.

    Sep 08, 2016 3:44 AM GMT
    https://www.washingtonpost.com/politics/a-new-50-state-poll-shows-exactly-why-clinton-holds-the-advantage-over-trump/2016/09/05/13458832-7152-11e6-9705-23e51a2f424d_story.html

    With nine weeks until Election Day, Donald Trump is within striking distance in the Upper Midwest, but Hillary Clinton’s strength in many battlegrounds and some traditional Republican strongholds gives her a big electoral college advantage, according to a 50-state Washington Post-SurveyMonkey poll.

    The survey of all 50 states is the largest sample ever undertaken by The Post, which joined with SurveyMonkey and its online polling resources to produce the results. The state-by-state numbers are based on responses from more than 74,000 registered voters during the period of Aug. 9 to Sept. 1. The individual state samples vary in size from about 550 to more than 5,000, allowing greater opportunities than typical surveys to look at different groups within the population and compare them from state to state.

    The massive survey highlights a critical weakness in Trump’s candidacy: an unprecedented deficit for a Republican among college-educated white voters, especially women. White college graduates have been loyal Republican voters in recent elections, but Trump is behind Clinton with this group across much of the country, including in some solidly red states....

    ...Trump is struggling in places Republicans have won consistently and that he must hold to have any hope of winning. These states include Arizona and Georgia, as well as Texas — the biggest surprise in the 50-state results. The Texas results, which are based on a sample of more than 5,000 people, show a dead heat, with Clinton ahead by one percentage point.

    Clinton also leads by fewer than four points in Colorado and Florida and is tied with Trump in North Carolina. In Colorado, other polls have shown a larger Clinton lead. In Mississippi, Trump’s lead is just two points...

    In a two-way competition between the major-party candidates, Clinton leads by four points or more in 20 states plus the District of Columbia. Together they add up to 244 electoral votes, 26 shy of the 270 needed to win.

    Trump leads by at least four points in 20 states as well, but those add up to just 126 electoral votes. In the 10 remaining states, which hold 168 electoral votes, neither candidate has a lead of four percentage points or better.

    ...A series of four-way ballot tests that include Libertarian Party nominee Gary Johnson and Green Party nominee Jill Stein project a somewhat narrower Clinton advantage, with more states showing margins of fewer than four points between the two major-party candidates....

    In the Post-SurveyMonkey poll, Clinton is winning 90 percent or more of the Democratic vote in 32 states, while Trump is at or above that level in just 13....

  • Posted by a hidden member.

    Sep 08, 2016 4:03 AM GMT
    In fact, America should only have 49 stars on its flag, as Hawaii is an occupied country.
  • Posted by a hidden member.

    Sep 08, 2016 4:13 AM GMT
    Aunty_Jack said: In fact, America should only have 49 stars on its flag, as Hawaii is an occupied country.


    https://en.wikipedia.org/wiki/Hawaii_Admission_Act

    The Admission Act, formally An Act to Provide for the Admission of the State of Hawaii into the Union (Pub.L. 86–3, 73 Stat. 4, enacted March 18, 1959) is a statute enacted by the United States Congress and signed into law by President Dwight D. Eisenhower which dissolved the Territory of Hawaii and established the State of Hawaii as the 50th state to be admitted into the Union.[1] Statehood became effective on August 21, 1959.[2] Hawaii remains the most recent state to join the United States.

    Prior to 1959, Hawaii was a territory of the United States. In 1946, the United Nations listed Hawaii as a non-self-governing territory under the administration of the United States (Resolution 55(I) of 1946-12-14). Also listed as non-self-governing territories under the jurisdiction of the United States were Alaska Territory, American Samoa, Guam, Puerto Rico, and the Virgin Islands.

    Out of a total population of 600,000 in the islands and 155,000 registered voters, 140,000 votes were cast, the highest turnout ever in Hawaii. The vote showed approval rates of at least 93% by voters on all major islands (see adjacent figure for details). Of the approximately 140,000 votes cast, fewer than 8,000 rejected the Admission Act of 1959.
  • Posted by a hidden member.

    Sep 09, 2016 12:52 AM GMT
    God_Said said: In fact, America should only have 49 stars on its flag, as Hawaii is an occupied country.


    Liberals, it's time you gave Manhattan back to the Indians so that you can move to Mayor DeBlasio's Socialist Communist Utopia of Cuba.
  • Posted by a hidden member.

    Sep 09, 2016 12:55 AM GMT
    That survey is worse than useless. It wasn't a random sample and has no margin of error. It was a self-selected sample generated through the Internet. Since Trump voters skew to the older and less educated, obviously fewer of them are online. Also someone who isn't that into politics isn't going to bother clicking on an online survey.
    And for the record, I think a Trump presidency would be disastrous. I just don't think you can look at this poll and think it means HRC is going to win.
  • Posted by a hidden member.

    Sep 09, 2016 1:19 AM GMT
    Clinton is leading, and eventually will win, because she's running against someone who's irretrievably broken.
  • Posted by a hidden member.

    Sep 09, 2016 3:49 AM GMT
    Wyndahoi said: That survey is worse than useless. It wasn't a random sample and has no margin of error. It was a self-selected sample generated through the Internet. Since Trump voters skew to the older and less educated, obviously fewer of them are online. Also someone who isn't that into politics isn't going to bother clicking on an online survey.
    And for the record, I think a Trump presidency would be disastrous. I just don't think you can look at this poll and think it means HRC is going to win.


    Is there a statistician in the house? I'm not a stats guy, though I do enjoy them. Hadn't read the specifics until your mention, though I'd seen that this was a SurveyMonkey so figured internet-based. With regard to margins of error: again, I'm not a stats guy and so don't quite follow that, but I'd imagine the error matters more given smaller samplings? That we'd look more at the margin of error in a sampling of 1,000 than we might in a sampling of 5,000? Yes? Well, what was this, nearly 75,000? That's huge.

    Here's their methodology: http://apps.washingtonpost.com/g/page/politics/washington-post-surveymonkey-50-state-poll/2086/ which looks very above-board to me.

    To your charge of Trump support being under-represented because "the older and less educated obviously fewer of them are online", wow. What century are you living in? My father is in his late 80s and he was on Facebook as soon as it became available. I just recently got my first smartphone because my cellphone finally died. And if the dumber don't access the internet, then why are there so many GOP memes being posted? By that alone your argument doesn't hold.

    Even if it was, as you say, "self-selected", by the time ya get to 75,000 participants I'd think most wrinkles would iron out. Hard for me to imagine that many people not pulling from many different demographics. I don't know the math, maybe some statistician can explain that (rough numbers below), but I'd think by methodology this is more valid than you've just characterized it.
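    For what it's worth, here are rough numbers for that 1,000 vs. 5,000 vs. 75,000 question. This is only a back-of-the-envelope sketch in Python that assumes a simple random sample, which an opt-in online panel is not, so treat the figures as illustrative:

        # Approximate 95% margin of error for a simple random sample.
        # Illustrative only: an opt-in online panel does not satisfy this assumption.
        import math

        def margin_of_error(n, p=0.5, z=1.96):
            """Half-width of the 95% confidence interval, in percentage points."""
            return 100 * z * math.sqrt(p * (1 - p) / n)

        for n in (1000, 5000, 75000):
            print(f"n = {n}: +/- {margin_of_error(n):.1f} points")
        # n = 1000: +/- 3.1 points
        # n = 5000: +/- 1.4 points
        # n = 75000: +/- 0.4 points

    The formula only captures sampling error, though. It says nothing about the bias of a self-selected panel, which is the objection in the post quoted above.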
  • Posted by a hidden member.

    Sep 10, 2016 7:18 PM GMT
    theantijock said
    Wyndahoi said: That survey is worse than useless. It wasn't a random sample and has no margin of error. It was a self-selected sample generated through the Internet. Since Trump voters skew to the older and less educated, obviously fewer of them are online. Also someone who isn't that into politics isn't going to bother clicking on an online survey.
    And for the record, I think a Trump presidency would be disastrous. I just don't think you can look at this poll and think it means HRC is going to win.


    Is there a statistician in the house? I'm not a stats guy, though I do enjoy them. Hadn't read the specifics until your mention, though I'd seen that this was a SurveyMonkey so figured internet-based. With regard to margins of error: again, I'm not a stats guy and so don't quite follow that, but I'd imagine the error matters more given smaller samplings? That we'd look more at the margin of error in a sampling of 1,000 than we might in a sampling of 5,000? Yes? Well, what was this, nearly 75,000? That's huge.

    Here's their methodology: http://apps.washingtonpost.com/g/page/politics/washington-post-surveymonkey-50-state-poll/2086/ which looks very above-board to me.

    To your charge of Trump support being under-represented because "the older and less educated obviously fewer of them are online", wow. What century are you living in? My father is in his late 80s and he was on Facebook as soon as it became available. I just recently got my first smartphone because my cellphone finally died. And if the dumber don't access the internet, then why are there so many GOP memes being posted? By that alone your argument doesn't hold.

    Even if it was, as you say, "self-selected", by the time ya get to 75,000 participants I'd think most wrinkles would iron out. Hard for me to imagine that many people not pulling from many different demographics. I don't know the math, maybe some statistician can explain that, but I'd think by methodology this is more valid than you've just characterized it.


    Well, let me tell you a few things about statistics...
    For starters, a margin of error ALWAYS matters. Depending on how your numbers were generated, there are different ways of reporting your margin of error. That's what statistics is: using math to predict the likelihood that your numbers accurately reflect reality. They surveyed 75,000 people, so we know what percentage of that group say they will vote for Hillary. But we have no idea if those 75,000 people are a representative sample of who will vote in November. So in this survey there AREN'T any actual statistics. There's just some basic arithmetic. It can't be relied on to predict what will happen in November.
    And your anecdotes about your 80-year-old father being online are great. My 93-year-old gramma is on Facebook as well. But the plural of anecdote doesn't equal data. There are plenty of older people online. But as a group they are LESS likely to be connected than youth.
    Basically, when conducting a poll you want your group to be a perfect sampling of America: the same ethnic makeup, economic status, religious affiliation and age as the population as a whole. Now, this isn't always feasible. Sometimes you only have 11% Latinos in your group instead of 13% like the population as a whole. So you give some extra weight to the Latinos you do have in your poll (sketched in code below). But the more you have to make these adjustments, the less sure you are that the answers you got reflect the bigger group, which increases your margin of error.

    TL;DR?
    I actually know a statistics. This "poll" doesn't have any actual statistics because it's complete crap and meaningless.
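    In code, that reweighting step looks roughly like this. It is a minimal sketch built around the hypothetical 11%-vs-13% Latino example above; the group shares and support figures are invented for illustration:

        # Post-stratification sketch: upweight groups that are underrepresented in
        # the sample relative to the population. All numbers are invented.
        population_share = {"latino": 0.13, "other": 0.87}  # target mix (e.g., Census)
        sample_share = {"latino": 0.11, "other": 0.89}      # mix the poll actually got

        weights = {g: population_share[g] / sample_share[g] for g in population_share}
        # -> latino weight ~1.18, other weight ~0.98

        # Hypothetical raw candidate support within each group:
        support = {"latino": 0.65, "other": 0.48}

        unweighted = sum(sample_share[g] * support[g] for g in support)
        weighted = sum(sample_share[g] * weights[g] * support[g] for g in support)
        print(f"unweighted: {unweighted:.3f}, weighted: {weighted:.3f}")
        # unweighted: 0.499, weighted: 0.502

    The more weight a small group has to carry, the more each of those respondents drives the result, which is the sense in which heavy adjustment inflates the effective margin of error.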
  • Posted by a hidden member.

    Sep 11, 2016 3:27 AM GMT
    Wyndahoi said: Well, let me tell you a few things about statistics...

    (blabla something pompous blabla: snipped)

    TL;DR?
    I actually know a statistics. This "poll" doesn't have any actual statistics because it's complete crap and meaningless.


    You actually know a statistics?

    Wow, impressive! Let's see what Nate has to say.

    https://en.wikipedia.org/wiki/Nate_Silver

    Nathaniel Read "Nate" Silver (born January 13, 1978) is an American statistician and writer who analyzes baseball (see sabermetrics) and elections (see psephology). He is currently the editor-in-chief of ESPN's FiveThirtyEight blog and a Special Correspondent for ABC News. Silver first gained public recognition for developing PECOTA,[3] a system for forecasting the performance and career development of Major League Baseball players, which he sold to and then managed for Baseball Prospectus from 2003 to 2009.[4]

    After Silver successfully called the outcomes in 49 of the 50 states in the 2008 U.S. Presidential election, he was named one of The World's 100 Most Influential People by Time in 2009.


    Here, see if this isn't too long for you to read:

    http://fivethirtyeight.com/features/is-a-50-state-poll-as-good-as-50-state-polls/

    UPDATE (Sept. 10, 2:15 p.m.): After conversations with SurveyMonkey, we’re convinced that the critique you’ll read below doesn’t apply well to their recent 50-state poll, which instead is more analogous to 50 separate surveys in each state. Here’s what we mean by that. First, SurveyMonkey weighted each state separately, using only data from that state. And second, they took several measures to verify their respondents’ location, such as asking for their ZIP code in addition to the state in which they’re registered to vote. If a survey passes these two tests, we’ll treat it as we would a regular state poll. If not, we’ll still include it in our averages but with a lower weight, as described below.

    It sounds like a riddle of sorts: Is one giant poll of all 50 states the same thing as 50 small polls, one for each state, added together?

    If this seems like an odd question, it’s because it hadn’t really come up before this year. Sure, technically speaking, any national poll is composed of interviews from all 50 states. For instance, we’d expect a 1,000-person national poll to include about 100 respondents from California, 30 from Virginia, and 5 from Idaho, assuming that the number of people interviewed in each state was roughly proportional to turnout in 2012. But pollsters almost never report those state-by-state breakouts in the same way they do other sorts of demographic splits. That’s probably for good reason: The margins of error on those subsamples would be astronomical for all but the most populous states.

    But what if instead of using a sample size of 1,000, your poll interviewed 50,000 people? Now you’d have around 5,000 respondents from California and 1,500 from Virginia — more than enough to go around. Even your Idaho sample size — about 250 people — is semi-respectable.


    Several online pollsters are now doing this, interviewing tens of thousands of people nationally per week or over the course of several weeks, as part of their national polling. And they’re increasingly reporting their results on a state-by-state basis. SurveyMonkey, Ipsos and Morning Consult have all released 50-state surveys, projecting the outcome in each state along with the overall Electoral College result. Google Consumer Surveys, which interviews around 20,000 people per week, has a crosstab showing their state-by-state results.
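    Putting rough numbers on the subsample margins described above, and again treating each subsample as a simple random sample (an approximation that online panels do not strictly satisfy):

        # Approximate 95% margins of error for the state subsamples mentioned above,
        # treating each as a simple random sample (an approximation).
        import math

        def moe(n, p=0.5, z=1.96):
            return 100 * z * math.sqrt(p * (1 - p) / n)

        # Subsamples of a 1,000-person national poll:
        for state, n in [("California", 100), ("Virginia", 30), ("Idaho", 5)]:
            print(f"{state} (n={n}): +/- {moe(n):.1f} points")
        # California (n=100): +/- 9.8 points
        # Virginia (n=30): +/- 17.9 points
        # Idaho (n=5): +/- 43.8 points

        # Subsamples of a 50,000-person national poll:
        for state, n in [("California", 5000), ("Virginia", 1500), ("Idaho", 250)]:
            print(f"{state} (n={n}): +/- {moe(n):.1f} points")
        # California (n=5000): +/- 1.4 points
        # Virginia (n=1500): +/- 2.5 points
        # Idaho (n=250): +/- 6.2 points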
  • Posted by a hidden member.

    Sep 11, 2016 3:28 AM GMT

    FiveThirtyEight has been using the state-by-state results from SurveyMonkey and Ipsos in its forecasts, and we’re in the midst of incorporating the data from Morning Consult and Google. (This has already attracted a fair amount of attention; Donald Trump’s campaign erroneously attributed Ipsos polls of Ohio and Iowa to FiveThirtyEight.) Which brings me back to my earlier question: Is a 500-person subsample of Colorado voters from a 20,000-person national poll the same thing as a 500-person poll that was dedicated to Colorado, specifically?

    After thinking and researching my way through the problem, my answer is that these polls aren’t quite the same. The Colorado-specific poll is likely to provide a more reliable estimate of what’s going on in that particular state. And it deserves a higher weight in our model as a result.

    One reason to give the 50-state technique a lower weight is that it hasn’t really been empirically tested. There have been cases in the past where pollsters commissioned simultaneous polls of all 50 states — surveying 600 voters in each state, for example — but for reasons I’ll explain in a moment, that’s potentially different from commissioning a huge national poll and reporting the results of state-by-state subsamples.

    One potential source of error has to do with demographic weighting. Polls of all kinds engage in extensive demographic weighting because people aren’t equally likely to respond to polls. Typically, for instance, white voters are more likely to respond to telephone polls than black voters. Pollsters attempt to counteract this by giving extra weight to the black voters they reach until the demographics of their poll matches that of Census data or other reliable sources.

    But establishing these weights is not easy because voters are not monolithic within these demographic groups. White voters in Oregon are much more likely to vote Democratic than white voters in Mississippi, for instance. If you’re taking a poll just of Oregon or Mississippi, you’ll optimize your demographic weights to match the makeup of those states specifically. But if you’re conducting a national poll that includes interviews from Oregon and Mississippi along with the other 48 states, you might not pay as much attention to how the results shake out in individual states. Perhaps you’ll overestimate the Democratic vote in Mississippi, where whites are especially conservative, and underestimate it in Oregon — but those differences will likely cancel out in the national result.

    Another potential problem is misidentifying the state a poll respondent votes in. With online polls, the problem is that IP addresses aren’t 100 percent reliable — for instance, a website would think I’m in Connecticut right now because that’s where ESPN’s internet connection is based, even though I’m writing from the FiveThirtyEight office in New York City. Someone filling out an online survey at their office in Washington, D.C., might actually live in Virginia or Maryland. With telephone polls, the issue is that people carry their mobile phone numbers around when they move from state to state, making it harder to identify a voter’s residence based on her phone number alone.

    Pollsters spend a lot of time thinking about problems like these when they’re conducting surveys of a particular state, and they can employ some good workarounds (for instance, asking the voter where they’re registered to vote). But in a national poll, the pollster doesn’t need to be as precise. If you misidentify me as a Connecticut voter when I’m really registered in New York, that won’t affect the topline margin in the national poll, even though it could skew the Connecticut and New York results.

    We’ve noticed, anecdotally, that the 50-state polls sometimes produce weird results in cases like these, in states that are either demographically idiosyncratic (such as Mississippi) or in small states (such as New Hampshire) where the sample is potentially contaminated by voters from another state. But this is FiveThirtyEight, and we’re not big on anecdotes. So we looked at the best predecessor for the 50-state polls that we could find: results from the 2012 Cooperative Congressional Election Study (CCES), a project conducted jointly by the online pollster YouGov and a consortium of universities. In 2012, the study surveyed around 50,000 voters, asking them about their presidential vote along with a long battery of demographic and political questions.

    This is an incredibly useful dataset that we use all the time at FiveThirtyEight. But what if you use the CCES to estimate the presidential vote in each state? To be clear, this is not a use the authors of the CCES necessarily intended or would recommend. But it’s the closest approximation I could think of for what Ipsos or SurveyMonkey are doing with their 50-state surveys.
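    To make the Oregon/Mississippi weighting point concrete, here is a minimal sketch with invented numbers (the white-voter shares below are hypothetical, not real figures) showing why a single national correction can leave the national mix looking right while individual states stay skewed:

        # Invented numbers, for illustration only. Two equally sized state samples in
        # which white respondents are overrepresented in Oregon and underrepresented
        # in Mississippi.
        census_white_share = {"Oregon": 0.80, "Mississippi": 0.60}  # hypothetical targets
        sample_white_share = {"Oregon": 0.90, "Mississippi": 0.50}  # hypothetical sample

        # Weighting each state separately, as in 50 dedicated state polls:
        state_weights = {s: census_white_share[s] / sample_white_share[s]
                         for s in census_white_share}
        print(state_weights)  # -> Oregon weight ~0.89, Mississippi weight 1.2

        # Weighting nationally: the pooled white share already matches the pooled
        # target (0.70 vs 0.70), so the single national correction is ~1.0 and
        # neither state's skew gets fixed.
        national_weight = ((sum(census_white_share.values()) / 2)
                           / (sum(sample_white_share.values()) / 2))
        print(round(national_weight, 3))  # -> 1.0

    With only the national weight, Oregon's white respondents stay overrepresented and Mississippi's underrepresented, skewing each state's estimate even though the pooled demographic mix, and hence the national topline, comes out fine. That is the distinction behind the update at the top of the article: SurveyMonkey weighted each state separately.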