Date: June 9 (Saturday), 2018

Location: the West Wing of Mannheim Palace (Mannheim Business School Study and Conference Center)

Connect@IPSDS is a half-day event dedicated to facilitating networking and information exchange in the area of survey and data science across various sectors (academia, industry, government, and non-profits).

Connect@IPSDS is of interest to:

  • Students/young professionals - join us to learn about the latest survey and data science applications

  • Organizations/companies looking for experts in survey and data science - represent your organization/company during our event

  • Researchers and practitioners - network with a diverse community of data experts and join the discussion about survey and data science applications across a broad set of domains


13:00 Registration

14:00 IPSDS Talks & Panel Discussion:

  • Curtiss Cobb (Survey Scientist and Manager, Demography and Survey Science Group, Facebook) Surveys, “Big Data,” and Machine Learning: Bringing Methods Together to Solve Difficult Problems at Scale

The hype in research communities around “big data” and machine learning has led many to speculate that “big data” may eliminate the need for survey research in the near future, often citing Facebook and its “big data” resources as an example of why surveys are no longer needed for understanding demographic trends or public opinion. At Facebook, we view “big data” methods and survey research as complementary rather than competing tools, and we often bring both together to solve some of our most difficult problems. This talk will share two examples: (1) how survey research is leveraged to evaluate and improve machine learning model input and output; and (2) how machine learning models are helping Facebook overcome declining response rates in surveys.

Curtiss Cobb leads the Demography and Survey Science Team at Facebook, a quantitatively focused research team that works across Facebook and its family of apps to identify and share best practices and methodological innovations in demographic and survey research. His team oversees the collection of millions of survey responses a day from around the world using mobile, web, face-to-face, and other methods. Prior to Facebook, Curtiss was Senior Director of Survey Methodology at GfK and consulted on survey studies for clients such as the Associated Press, Pew Research Center, CDC, and the U.S. State Department, as well as numerous academic studies. Curtiss received his B.A. from the University of Southern California and an M.A. in Quantitative Methods for Social Sciences from Columbia University. He holds an M.A. and Ph.D. in Sociology from Stanford University.

  • Peter Lugtig (Associate Professor, Department for Methodology and Statistics, Utrecht University) Surveys and Sensors

Surveys and organic data (big data collected not for the purpose of research) complement each other when the goal is to understand how people think, interact, and behave. Whereas the key quality of survey data lies in the fact that it can be used to study attitudes in a small sample that is representative of the general population, the key quality of big data is that it can help us understand behavior with a large and detailed set of data. Survey data alone can rarely be used to study behavior accurately, whereas using big data in isolation leads to problems with inference and with understanding what people think or believe. How, then, can survey and big data be combined? One could start with an organic dataset (e.g., Twitter users or a client database) and then ask users to complete a survey. Or one could do the reverse: start with a survey and then try to enrich the data with organic data. In this talk I will outline why, in my view, the latter approach is more promising. I will show examples from recent projects I was involved in, in which surveys were complemented with sensor measurements collected through smartphones. The sensor measures concentrated on obtaining people's locations over a number of days to investigate mobility and time use. I will discuss how the combination of survey and sensor data works in practice, and briefly talk about nonresponse, willingness, and ethical concerns in collecting sensor data through smartphones.

Peter works as an associate professor at the Department for Methodology and Statistics at Utrecht University, the Netherlands. He is involved in coordinating a new programme in Applied Data Science and teaches courses in survey methodology. He has published widely on longitudinal and mobile surveys, and has more recently focused on complementing survey data with sensor data collected through mobile phones.

  • Anna-Lena Disterheft (Product Owner, Statistics Team, Civey) New Modes of Online Sampling: Growing and Maintaining an Online Access Panel via a Network of Media Publishers

Self-selection bias in classical online panels is the main point of criticism regarding the quality of online polling. Online surveys are mostly just put on the web, and respondents usually become aware of them by chance or through some form of advertising before they can even choose whether to participate. We use a widget-based recommendation system that distributes polls across a network of currently more than 15,000 publisher websites, engaging users in a wide variety of topics. By using different websites and polls as entry points, we are able to attract a much more diverse user base, thus mitigating self-selection bias. Users are incentivized to participate by being shown real-time weighted, representative results each time they answer a poll. Once users have answered the entry poll, we ask them to provide their demographics and, based on our recommendation system, randomly suggest a list of polls from various topics. We keep respondents engaged through a simple, optimized user experience, by giving them a sense of participation and the possibility to see actual high-quality results. In this way, we were able to build up Germany’s largest online panel, with 500,000 signed-up users in just over one year, around 36,000 of whom actively answer an average of 15 polls every day. Even though minor biases remain at the panel level, we are able to reach many participants in demographic groups that are typically strongly underrepresented or even absent in online panels. Combined with appropriate post-stratification methods (Park et al., 2004), we believe this can lead to high-quality results at low cost. We have already been able to demonstrate the quality of our results with pre-election polls for several state and national German elections.
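The basic idea behind post-stratification weighting can be sketched in a few lines. The following is a minimal, hypothetical illustration: the demographic cells, population shares, and toy sample below are invented for the example, and the method cited in the abstract (Park et al., 2004) is a more elaborate model-based extension of this basic reweighting step.

```python
# Minimal sketch of post-stratification weighting: each respondent's
# weight is the population share of their demographic cell divided by
# that cell's share of the sample.

# Assumed population shares for two illustrative cells (made-up numbers).
population_share = {"18-39": 0.40, "40+": 0.60}

# A toy sample of 10 respondents: (cell, answered_yes).
sample = [
    ("18-39", 1), ("18-39", 1), ("18-39", 0),
    ("18-39", 1), ("18-39", 0), ("18-39", 1),   # young respondents overrepresented
    ("40+", 0), ("40+", 1), ("40+", 0), ("40+", 0),
]

n = len(sample)
sample_share = {cell: sum(1 for c, _ in sample if c == cell) / n
                for cell in population_share}

# Post-stratification weight per cell: population share / sample share.
weights = {cell: population_share[cell] / sample_share[cell]
           for cell in population_share}

# Weighted estimate of the "yes" proportion (vs. the raw 0.5 unweighted).
weighted_yes = sum(weights[c] * y for c, y in sample) / n
```

Because younger respondents are overrepresented in the toy sample, their answers are down-weighted and the weighted estimate shifts toward the older cell's responses.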

Anna-Lena is a Product Owner at Civey, a tech company based in Germany. Civey is a new generation of public opinion research company, pioneering high-quality representative surveys in real time.

17:00 Coffee and Cake

Cost:  Free, refreshments provided
Registration: booked out; the waiting list is open.