This ran in our local paper today, and I call BS.
AP didn't describe the cross-section of the people surveyed. Reputable surveys report how many people were surveyed and how the survey pool was selected. Good surveys report things like "1,063 registered voters were contacted by telephone and asked the following questions," or some such sourcing. Then they go on to list the actual wording of the questions, with breakdowns of the responses by percentage.
The article doesn't mention any of this. What was the size of AP's survey pool? Was it their coworkers and friends they met at the company holiday party? Seven neighbors at a barbecue? Three random people they met in the elevator on the way to work in the morning? What was the wording of the questions that were asked? Polling experts will tell you that the wording of the survey can influence, and even greatly bias, the answers of those surveyed.
As an example, if your survey size is 3 and you ask whether drones should be illegal, and 2 of the 3 say "yes," then -- BOOM! -- you magically have an overwhelming 67% majority of those surveyed advocating a drone ban.
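To put numbers on why tiny samples are meaningless, here is a back-of-envelope sketch of the standard margin-of-error formula for a surveyed proportion. The sample sizes and percentages below are just the hypothetical figures from the discussion above, not anything AP disclosed:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p
    measured from n respondents (simple random sample assumed)."""
    return z * math.sqrt(p * (1 - p) / n)

# The "2 of 3 say yes" survey: 67% comes with a massive error bar.
print(f"n = 3:    67% +/- {margin_of_error(2/3, 3):.0%}")

# A properly sized poll (the hypothetical 1,063 respondents from
# the example above) pins down a 39% result much more tightly.
print(f"n = 1063: 39% +/- {margin_of_error(0.39, 1063):.0%}")
```

With 3 respondents the margin of error exceeds 50 points, so the "overwhelming 67%" is statistically indistinguishable from a coin flip; with 1,063 respondents it drops to about 3 points. This is exactly why reputable polls report their sample size.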
Here's a quote from the article:
"But — Amazon take note — only 1 in 4 thinks using drones to deliver small packages is a good idea. Thirty-nine percent were opposed, and 34 percent were neutral on that question. Nearly the same share opposed using drones to take photographs or videos at weddings and other private events. A third opposed allowing farmers to use drones to spray crops, while another third supported it. Only 23 percent said they favored the recreational use of small drones."
After I read it, all I could think was that the group of people they surveyed equate drones only with war, guns, and missiles; know nothing about wedding photography; know virtually nothing about where their food comes from or how hard it is to run a profitable farming operation; and know nothing about RC aircraft or commercial aviation at all.
To me, the way the article reads is more of a reflection on the biases of the reporters writing the story rather than on the results of the mysterious survey they conducted.
This is, pure and simple, the sloppiest of sloppy journalism. In fact, it's a stretch to even call it journalism. How about some more details about your survey, Associated Press? Please?