2019 Montréal symposium accepted
Our symposium “Methods and topics in Internet-based and Social Media research” at the Annual Meeting of the Society for Computers in Psychology (SCiP) has been accepted! See us in Montréal, November 14, 2019.
Nadja Younes and Ulf-Dietrich Reips' recently published article “Guideline for Improving the Reliability of Google Ngram Studies: Evidence from Religious Terms” (PLoS One, 14(3): e0213554, doi.org/10.1371/journal.pone.0213554) was featured in a press release.
IPV denotes a representation that locates items and item pools (scales) of different psychological instruments in nested radar charts according to their similarities and differences. Applying IPV produces figures in which different item pools are represented by distinct, non-overlapping circles. IPV illustrates a comparison of different structural equation models estimated on identical data, thereby combining the strengths of single-factor models and correlated-factor models. Furthermore, IPV enables an empirically guided categorization of psychological constructs and their subordinate concepts (facets), which helps in comparing psychometric constructs or in selecting questionnaires and tests for applied purposes.
Dantlgraber, M., Stieger, S., & Reips, U.-D. (in press). Introducing Item Pool Visualization (IPV): A method for investigation of concepts in self-reports and psychometric tests. Methodological Innovations.
Ulf-Dietrich Reips presented in the Linguistic Modeling and its Interfaces lecture series at the University of Tübingen on July 19:
“Internet-based research methods: Challenges and solutions”
ABSTRACT: In this presentation I will provide an overview of guidelines and techniques, methods, and tools for Internet-based experimentation and big data and social media research, including solutions to many of the methodological challenges in data collection via the Internet. The challenges discussed here include issues in design, security, selection of and access to social media platforms, recruitment, multiple submissions, measurement scales (e.g., visual analogue scales), response time measurement, dropout, and data quality. Some widely used tools and practices, such as Amazon Mechanical Turk, the cognitive reflection test, and forced responding, are critically reviewed. Among other methods that have been developed in Web methodology and Internet science, I will explain the one-item-one-screen (OIOS) design, the seriousness check, option bars, the multiple site entry technique, and our recently published guidelines for conducting Google Ngram studies (Younes & Reips, 2019, PLoS One). Picked from our iScience Server at http://iscience.eu, I will demonstrate a new version of our Web experiment generator WEXTOR (https://wextor.eu) as a tool that automatically implements optimal solutions. Another example is Social Lab, our “Open Source Facebook” available at http://sociallab.es, which can be played at http://en.sociallab.es to learn about privacy. The presentation will conclude with a discussion of failed visions, problematic trends, and opportunities in evolving scenarios of using social media and online data in scientific work. Publications are available from http://www.researchgate.net/profile/Ulf-Dietrich_Reips and http://tinyurl.com/reipspub.