Research

I am primarily interested in the Philosophy of Neuroscience, but my research ranges in scope from questions in general philosophy of science (How does data analysis generate evidence?) to the epistemology of neuroscience (What can the evidence of neuroimaging tell us about the structure of cognition?). My approach to philosophical enquiry involves collaborating with scientists who use the techniques, methods and models my philosophical questions are directed at understanding. This allows my work to be calibrated by the details of, and challenges faced in, the day-to-day practice of science, and to have an immediate and positive impact on those practices.

In addition to my research in the philosophy of science, I actively work on the philosophy of teaching and learning. I am an active member of the American Association of Philosophy Teachers, and recently became a member of the editorial board of Teaching Philosophy. I have presented workshops on teaching and pedagogy, and actively seek new ideas, strategies and discussions on teaching and course design within and for philosophy. My current work on teaching includes the development of a method for assessing and promoting student participation that is inclusive and aligned with typical learning outcomes for philosophy courses, investigations of the role of games in teaching philosophy, and the development of graduate courses and training programs that train graduate students to be effective instructors and prepare them for interdisciplinary research projects.

Current Projects

Transparency and Reproducibility in Cognitive Neuroscience

Significant progress in the last decade of neuroimaging research has come in the form of new technologies and tools for sharing, organizing, and analyzing data and theories. These technologies have gone hand in hand with two parallel movements in cognitive neuroscience. The first is the growing pressure to make research practices reproducible and transparent, with the aim of improving the standards of evidence and the quality of research. The second is the realization that newly developed analysis techniques, which could be used to investigate hypotheses and theories previously beyond the reach of available evidence, require large collections of data to produce meaningful results.

The specific philosophical questions to be addressed are: How are new technologies for sharing, organizing, and analyzing data and theories changing the norms of evidence in neuroimaging research? How are these technologies bringing research communities, theories, data and analysis techniques together in novel ways? What role do different data analysis techniques play in using data as evidence for claims about phenomena the data were not produced to investigate? How does the use of new technologies change the way cognitive scientists interpret the data they work with?

I am pursuing these questions by collaborating on a variety of projects that aim to improve reproducibility and transparency in cognitive neuroscience. These include OpenNeuro and NeuroVault, repositories that facilitate data sharing; the Brain Imaging Data Structure (BIDS), a standard for organizing brain imaging data; and fMRIPrep, an easy-to-use pipeline that automates the pre-processing stage of analysis for functional imaging data.
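
To give a concrete sense of what BIDS standardizes, here is a minimal sketch in Python, using only the standard library, that walks a BIDS-formatted dataset and lists each subject's functional runs. The dataset path is a hypothetical placeholder, and a real project would more likely rely on a dedicated tool such as pybids rather than a hand-rolled walk like this.

```python
from pathlib import Path

def summarize_bids_dataset(root):
    """List each subject's functional runs in a BIDS-formatted dataset."""
    root = Path(root)
    for sub_dir in sorted(root.glob("sub-*")):
        # In BIDS, functional runs live under <subject>/func/ and follow the
        # sub-<label>_task-<label>_bold.nii.gz naming convention.
        runs = sorted(p.name for p in sub_dir.glob("func/*_bold.nii.gz"))
        print(f"{sub_dir.name}: {len(runs)} functional run(s)")
        for run in runs:
            print(f"  {run}")

# Hypothetical local path to a BIDS dataset (e.g. one downloaded from OpenNeuro).
summarize_bids_dataset("/data/ds000001")
```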

Epistemology of Neuroimaging

Philosophers have been skeptical of the use of neuroimaging technology to investigate the relationship between cognitive function and the brain (van Orden and Paap 1997; Klein 2010; Roskies 2010; Aktunc 2014). I have argued that, when the full range of data analysis techniques used with neuroimaging data is taken into consideration, the evidence generated by neuroimaging data is stronger than philosophers have appreciated (Wright 2017). This argument both provides grounds for optimism with respect to the use of neuroimaging in cognitive neuroscience, and offers general insight into the role and importance of multiple data analysis techniques in the generation of scientific knowledge.

The next stage of this project is to develop an account of evidence in neuroimaging that is sensitive to the supporting role data manipulations and analysis play in the interpretation of neuroimaging data. I have begun this task in recent work that argues that data analysis results do not produce evidence, but instead are better thought of as lenses that researchers construct to bring specific patterns within data into focus (Wright forthcoming).

Data Sharing and Automated Database Curation

The NeuroSynth Database is a novel tool used in cognitive neuroscience to perform meta-analyses of neuroimaging data. What’s novel about it is that data are entered into the database and curated automatically by an algorithm that searches neuroscience journals, extracts the relevant data points, and labels the data. The automated procedure is important because it overcomes practical limitations to database development in cognitive neuroscience (Yarkoni et al 2009). However, automated curation raises interesting epistemic problems, since curation is normally managed by a team of highly trained professional curators with field-relevant expertise (for example, see the IEDB curation manual). Two questions that I am working on are: What difference does automation make with respect to the epistemically responsible use of the database? How does automated curation bear on philosophical accounts of the role databases play in science?
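
To illustrate the kind of automated curation at issue, the sketch below shows a deliberately simplified version of term-based labeling in Python: an article is tagged with a term whenever that term’s relative frequency in the article text crosses a threshold, and the activation coordinates extracted from all articles tagged with a term are then pooled for meta-analysis. The function names, threshold, and data layout are my own illustrative assumptions, not the actual NeuroSynth implementation.

```python
from collections import Counter

def label_article(text, terms, threshold=0.001):
    """Tag an article with every term whose relative frequency exceeds the threshold."""
    words = text.lower().split()
    counts = Counter(words)
    total = max(len(words), 1)
    return {t for t in terms if counts[t] / total > threshold}

def pool_coordinates(articles, term):
    """Collect reported activation coordinates from all articles tagged with a term."""
    pooled = []
    for article in articles:
        if term in article["labels"]:
            pooled.extend(article["coordinates"])  # (x, y, z) coordinates in standard space
    return pooled

# Illustrative input: article text plus the coordinate table extracted from it.
articles = [
    {"text": "Working memory load modulated dorsolateral prefrontal activity.",
     "coordinates": [(-42, 28, 30), (44, 26, 28)]},
]
terms = ["memory", "emotion"]
for article in articles:
    article["labels"] = label_article(article["text"], terms)
print(pool_coordinates(articles, "memory"))
```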

Teaching Philosophy Through Games

“Gamification” has become yet another buzzword in teaching and pedagogy. The idea is that people have fun playing games, and people are engaged when they have fun. Classroom engagement, then, could be improved by using games to make students have fun. The problem is that games in the classroom are a learning activity, and so they need to be aligned with the learning outcomes of the course or they are not actually supporting learning. I argue that ‘fun’ isn’t the primary benefit of using games in the classroom. Games provide an opportunity to give students an experience, and that experience can be tailored by the rules and components of the game. Creating an authentic game experience, then, is about creating an experience that is analogous to the lesson, debate or theory being taught. I propose that well-designed game experiences can create first-hand analogies which can serve as the subject of discussion instead of (or in addition to) traditional, abstract thought experiments.