User Survey Analytics FAQ
Where is the data coming from? When was it generated?
The OpenStack User Survey, which is answered by the community on a semiannual basis, provides the raw data for this analytics tool. The survey has evolved substantially since its inception, so to allow comparison of similar questions, this analytics tool includes data from the sixth (October 2015), seventh (April 2016), and eighth (October 2016) User Surveys. You can see results from the October 2015 survey by selecting that data set from the top-center dropdown, just below the page title. Results from 2016 are presented in aggregate.
Who does this survey represent? Is it a market study?
We ask the entire OpenStack community to participate in the User Survey, and over the past year approximately 2,500 people have offered their perspectives. About one-quarter of that number have registered deployments. This survey is purely voluntary and confidential. It is not representative of the market overall, and we do not attempt, within the scope of this survey, to show market penetration or adoption. Rather, we use the User Survey to gather insights from real users about how OpenStack works for them, their deployment decisions, and their cloud sizes; these insights in turn provide substantial feedback to development teams.
Why doesn't the data I see here match the latest User Survey report?
The survey report is a snapshot, reflecting a defined period of time. For example, the latest (October 2016) User Survey includes only answers from June-September 2016. The analytics tool, on the other hand, is a dynamic data set that is constantly updating as new survey answers come in. The reports provide deeper context, comment analysis and stronger data validation (as outlying answers are manually verified and might be omitted if they are found to be erroneous), whereas the analytics tool offers real-time data and lets users set their own analytics filters.
What does "beta" mean for this analytics tool?
We are opening up this tool to the community as early as possible to encourage feedback and help us iterate on it to make it as useful as possible. The beta label reflects the fact that while we've made every attempt to validate the analytics, you should draw conclusions with caution.
Whom should I contact with feedback?
Please email [email protected] to report a problem (screenshots are always helpful) or make a comment. Your feedback is critical to helping us evolve this community tool.
Why won't data display for a certain filter? How does this tool protect respondent privacy?
We assure users that their answers will be kept confidential and will only be shared in aggregate. In some cases, when a user sets a filter that returns a small quantity of results, the system suppresses results to ensure user privacy. Try removing one filter or filtering based on a larger group to return a sufficient quantity of results.
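The suppression behavior described above can be sketched as a simple threshold check. This is an illustrative sketch only: the function name, data shape, and the `MIN_RESPONDENTS` cutoff are assumptions, not the tool's actual implementation or published threshold.

```python
# Hypothetical sketch of threshold-based result suppression.
MIN_RESPONDENTS = 10  # assumed cutoff; illustrative only

def filtered_results(responses, predicate):
    """Return responses matching the filter, or None if too few to show safely."""
    matches = [r for r in responses if predicate(r)]
    if len(matches) < MIN_RESPONDENTS:
        return None  # suppress small result sets to protect respondent privacy
    return matches

# A filter matching only 3 respondents is suppressed; a broader filter is not.
responses = [{"org_size": s} for s in ["small"] * 3 + ["large"] * 20]
print(filtered_results(responses, lambda r: r["org_size"] == "small"))  # None
```

This is why removing a filter, or filtering on a larger group, can make previously hidden results appear: the match count crosses back above the cutoff.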
I answered a question on the user survey, but I don't see it on this tool. Why?
We focused on showing those questions that are quantitative, not qualitative, in this analysis tool. Questions requesting a specific comment, such as, "What do you like about OpenStack?" are not shown here, but comment analysis will be done by the User Survey team and presented in the full reports. The most recent full report was April 2016 and the next full report will be April 2017.
Can I do significance testing on the data?
Not yet. Significance testing is based on evaluating the number of answers for each data set and determining whether the variance within a defined time frame is statistically significant. For the User Survey reports, an independent data scientist conducts significance testing on all major charts to help us determine which charts we should highlight based on substantial changes.
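As one illustration of the kind of test involved, comparing the share of respondents who gave a particular answer in two survey waves can be done with a two-proportion z-test. The counts below are hypothetical, and this is not necessarily the method the User Survey team's data scientist uses.

```python
# Illustrative two-proportion z-test between two survey waves.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for the difference between two sample proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)  # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical example: 40% of 500 respondents vs. 46% of 600 in the next wave.
z = two_proportion_z(200, 500, 276, 600)
print(round(z, 2))  # |z| > 1.96 would suggest significance at the 5% level
```

The key dependency on sample size is visible in the standard-error term: with the small, filtered samples this tool can return, differences must be large before they are statistically meaningful.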
Can I use this data in my commercial marketing efforts?
We do not recommend this, because the analytics tool is in beta and the data it provides has not been independently verified. When sharing data with your customers, we recommend using only charts from the User Survey reports, because those charts go through many rounds of review, including by the User Committee, before they are presented to the community. The analytics tool is intended to be directional for the community.