Shravan Vasishth

Professor, Dept. of Linguistics, University of Potsdam, 14476 Potsdam, Germany
Speaker, Language Cluster, Cognitive Science
Phone: +49-(0)331-977-2950 | Fax: - 2087 | Email: vasishth at
GPG public key, Orcid ID, google scholar, github, bitbucket, statistics blog, vasishth lab blog

Bayesian Linear Mixed Models using Stan: A tutorial for psychologists, linguists, and cognitive scientists

by Tanner Sorensen, Sven Hohenstein, Shravan Vasishth, Quantitative Methods for Psychology, 2016. Vol 12, No. 3, pages 175--200.
The published paper is available here.
You can download all the source code from here; alternatively, if you are not familiar with GitHub, you can download this zip archive. See the doc directory for the paper, and run the code embedded in the Rnw file.

The reader may also benefit from this more introductory review of Bayesian methods intended for linguists and psychologists: Statistical methods for linguistic research: Foundational Ideas - Part II. Part I covers frequentist methods (it is in press with Language and Linguistics Compass) and is available here.
The RePsychLing package, by Bates, Kliegl, Vasishth, and Baayen, contains examples of linear mixed models fit with Stan. See, for example, the R Markdown file KBStan.Rmd in the vignettes directory; the code there is the most efficient of the examples shown, as the computations are done in matrix form.
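To give a flavor of what the tutorial covers, here is a minimal sketch of the kind of varying-intercepts model it builds up, written in Stan. The data set and all variable names (subj, x, rt, and so on) are illustrative assumptions, not taken from the paper; the tutorial itself develops richer models with crossed subject and item effects and correlated varying slopes.

```stan
// Illustrative varying-intercepts model: log reading time as a function
// of a sum-coded condition predictor, with by-subject intercept
// adjustments. Names and structure are assumptions for exposition only.
data {
  int<lower=1> N;                  // number of observations
  int<lower=1> J;                  // number of subjects
  int<lower=1, upper=J> subj[N];   // subject index for each observation
  vector[N] x;                     // condition predictor (e.g., -1/+1)
  vector[N] rt;                    // log-transformed reading times
}
parameters {
  real beta0;                      // grand-mean intercept
  real beta1;                      // effect of condition
  vector[J] u;                     // by-subject intercept adjustments
  real<lower=0> sigma_u;           // between-subject standard deviation
  real<lower=0> sigma_e;           // residual standard deviation
}
model {
  u ~ normal(0, sigma_u);          // hierarchical prior on adjustments
  rt ~ normal(beta0 + beta1 * x + u[subj], sigma_e);
}
```

A model like this would be fit from R with, e.g., rstan's stan() function, passing the data as a named list; the tutorial walks through that workflow step by step.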

Papers we have published that use Bayesian methods for data analysis

Lena A. Jäger, Felix Engelmann, and Shravan Vasishth. Similarity-based interference in sentence comprehension: Literature review and Bayesian meta-analysis. Journal of Memory and Language, 94:316-339, 2017. [ DOI | code | pdf ]
Gerrit Kentner and Shravan Vasishth. Prosodic focus marking in silent reading: Effects of discourse context and rhythm. Frontiers in Psychology, 7(319), 2016. [ DOI | pdf | http ]
Pavel Logačev and Shravan Vasishth. Understanding underspecification: A comparison of two computational implementations. Quarterly Journal of Experimental Psychology, 69(5):996-1012, 2016. [ DOI | code | pdf ]
Bruno Nicenboim, Pavel Logačev, Carolina Gattei, and Shravan Vasishth. When high-capacity readers slow down and low-capacity readers speed up: Working memory differences in unbounded dependencies. Frontiers in Psychology, 7(280), 2016. Special Issue on Encoding and Navigating Linguistic Representations in Memory. [ DOI | code | pdf ]
Bruno Nicenboim and Shravan Vasishth. Statistical methods for linguistic research: Foundational Ideas - Part II. Language and Linguistics Compass, 10:591-613, 2016. [ code | pdf ]
Dario Paape and Shravan Vasishth. Local coherence and preemptive digging-in effects in German. Language and Speech, 59:387-403, 2016. [ pdf ]
Molood Sadat Safavi, Samar Husain, and Shravan Vasishth. Dependency resolution difficulty increases with distance in Persian separable complex predicates: Implications for expectation and memory-based accounts. Frontiers in Psychology, 7, 2016. [ DOI | code | pdf ]
Stefan L. Frank, Thijs Trompenaars, and Shravan Vasishth. Cross-linguistic differences in processing double-embedded relative clauses: Working-memory constraints or language statistics? Cognitive Science, 2015. [ code | pdf ]
Samar Husain, Shravan Vasishth, and Narayanan Srinivasan. Integration and prediction difficulty in Hindi sentence comprehension: Evidence from an eye-tracking corpus. Journal of Eye Movement Research, 8(2):1-12, 2015. [ pdf ]
Philip Hofmeister and Shravan Vasishth. Distinctiveness and encoding effects in online sentence comprehension. Frontiers in Psychology, 5:1-13, 2014. Article 1237. [ DOI | code | pdf ]
Samar Husain, Shravan Vasishth, and Narayanan Srinivasan. Strong Expectations Cancel Locality Effects: Evidence from Hindi. PLoS ONE, 9(7):1-14, 2014. [ code | pdf ]
Shravan Vasishth, Zhong Chen, Qiang Li, and Gueilan Guo. Processing Chinese Relative Clauses: Evidence for the Subject-Relative Advantage. PLoS ONE, 8(10):1-14, 2013. [ code | pdf ]

Please send comments and corrections regarding this page to : vasishth SQUIGGLE ling dot uni hyphen potsdam dot de.