2019 Montréal symposium accepted
Our symposium “Methods and topics in Internet-based and Social Media research” at the Annual Meeting of the Society for Computers in Psychology (SCiP) has been accepted! See us in Montréal, November 14, 2019.
Nadja Younes and Ulf-Dietrich Reips’ recently published article “Guideline for Improving the Reliability of Google Ngram Studies: Evidence from Religious Terms” (PLoS ONE, 14(3): e0213554, https://doi.org/10.1371/journal.pone.0213554) has been featured in a press release.
Item Pool Visualization (IPV) is an illustration system that locates items and item pools (scales) from multiple psychological instruments according to their commonality and distinguishability along the dimensions of nested radar charts. An IPV illustration represents each item pool by its own circle, and the circles do not overlap. IPV visualizes a comparison of different structural equation models estimated from the same data, combining the advantages of general and correlated factor models when evaluating psychological instruments. Further, in contrast to other visualization methods, IPV provides an empirically driven categorization of psychological constructs and their subconcepts (facets), which helps professionals compare psychometric constructs and questionnaires and select tests.
Dantlgraber, M., Stieger, S., & Reips, U.-D. (in press). Introducing Item Pool Visualization (IPV): A method for investigation of concepts in self-reports and psychometric tests. Methodological Innovations.
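To give a feel for the kind of layout IPV produces, here is a loose, hypothetical Python/matplotlib sketch that places items from several item pools on a shared radar (polar) layout. The pool names, item labels, and center distances are invented for illustration and are not data or code from the article; the published IPV method itself is defined in Dantlgraber, Stieger, and Reips (in press):

# Loose, hypothetical sketch of a radar-chart layout in the spirit of IPV.
# All pools, items, and distances below are invented for illustration only.
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical item pools (scales) with per-item distances from the pool center,
# e.g. derived from factor loadings when comparing structural equation models.
pools = {
    "Scale A": {"a1": 0.30, "a2": 0.45, "a3": 0.25},
    "Scale B": {"b1": 0.55, "b2": 0.40, "b3": 0.60, "b4": 0.35},
}

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
n_pools = len(pools)

for i, (pool, items) in enumerate(pools.items()):
    # Give each pool its own angular sector so the pools stay visually separate.
    start, stop = 2 * np.pi * i / n_pools, 2 * np.pi * (i + 1) / n_pools
    angles = np.linspace(start, stop, len(items), endpoint=False)
    radii = list(items.values())
    ax.scatter(angles, radii, label=pool)
    for angle, radius, label in zip(angles, radii, items.keys()):
        ax.annotate(label, (angle, radius), fontsize=8)

ax.set_rmax(1.0)
ax.legend(loc="upper right")
plt.show()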
Ulf-Dietrich Reips presented in the Linguistic Modeling and its Interfaces lecture series at the University of Tübingen on July 19:
“Internet-based research methods: Challenges and solutions”
ABSTRACT: In this presentation I will provide an overview of guidelines and techniques, methods, and tools for Internet-based experimentation and big data and social media research, including solutions to many of the methodological challenges in data collection via the Internet. The challenges discussed here include issues in design, security, selection of and access to social media platforms, recruitment, multiple submissions, measurement scales (e.g., visual analogue scales), response time measurement, dropout, and data quality. Some widely used tools and practices like Amazon Mechanical Turk, the cognitive reflection test, and forced responding are critically reviewed. Among other methods that have been developed in Web methodology and Internet science, I will explain the one-item-one-screen (OIOS) design, the seriousness check, option bars, the multiple site entry technique, and our recently published guidelines for conducting Google Ngram studies (Younes & Reips, 2019, PLoS One). Picked from our iScience Server at http://iscience.eu, I will demonstrate a new version of our Web experiment generator WEXTOR (https://wextor.eu) as a tool that automatically implements optimal solutions. Another example is Social Lab, our “Open Source Facebook” available at http://sociallab.es, which can be explored to learn about privacy at http://en.sociallab.es. The presentation will conclude with a discussion of failed visions, problematic trends, and opportunities in evolving scenarios of using social media and online data in scientific work. Publications are available from http://www.researchgate.net/profile/Ulf-Dietrich_Reips and http://tinyurl.com/reipspub.
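As a small illustration of one of the methods named in the abstract, the following Python sketch shows the basic idea behind the multiple site entry technique: the study is announced through several recruitment sources, each with its own tagged entry link, so that samples can later be compared across sources (e.g., for self-selection effects or data quality). The base URL, the query parameter name, and the source labels are hypothetical and not part of any published tool:

# Hypothetical sketch of the multiple site entry technique: tag each recruitment
# source with its own entry link so submissions can later be compared by source.
from urllib.parse import urlencode

BASE_URL = "https://example.org/study/index.html"  # hypothetical study address

def entry_link(source: str) -> str:
    """Build a recruitment link that records the entry source as a query parameter."""
    return f"{BASE_URL}?{urlencode({'source': source})}"

# One link per recruitment channel (labels are invented for illustration).
for source in ["mailing_list", "online_forum", "social_media"]:
    print(f"{source:>13}: {entry_link(source)}")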