Accepting students

I am excited to continue growing my lab and will be accepting a student to start next year in the Counseling Psychology PhD program. We have a ton of projects going on, and it's a great time to jump in. If you are interested in help-seeking behaviors, we have several cool projects in the middle of data collection. If military populations are more your interest, we've got those as well. And if you are interested in assessment, we have even more!

If you are interested in the lab, I'd LOVE to hear from you! Send me an email with information about your experiences and interests, along with any questions you have about how you might fit. I'm happy to give you more details about my mentoring style, the lab, TTU, and the Counseling program.


As you consider becoming a Red Raider, here is some actual footage of our lab environment:

[GIF: our lab environment]

Undergrad lab member in the spotlight

I wanted to take a minute to brag about one of our lab members. Liz was an awesome addition to the lab this semester, and I'm glad she joined us. Clearly she stands out to the TTU department as well!

New paper: Male help-seeking – depression and stigma

I'm thrilled to have the new paper with Dr. Brian Cole accepted at Psychology of Men & Masculinities. We took a look at help-seeking behaviors in men with depressive symptoms to get a better idea of what coping and help-seeking look like, and then at how stigma and gender identity conflict predict those behaviors.

[Figure: Model 1 (Cole dissertation SEM)]

Above is a slightly simplified figure from the paper, but the takeaway is clear to me. Help-seeking behaviors vary, and so do their relationships to classic barriers to mental healthcare like self-stigma. If we want to improve our understanding of therapy engagement, we can't act like everyone copes the same way or assume that every barrier influences each of those coping methods in the same way.
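For anyone curious what that kind of question looks like analytically, here is a minimal, purely illustrative sketch. The data file, variable names, and modeling choice are all hypothetical and are not the actual model or code from the paper; the point is simply that each help-seeking outcome gets its own model so the barrier coefficients are free to differ.

```python
# Purely illustrative sketch: hypothetical columns and file, not the paper's model.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("help_seeking_sample.csv")  # hypothetical dataset

# One model per help-seeking behavior, so each barrier's effect can differ by outcome.
outcomes = ["formal_help", "informal_help", "self_reliance"]  # hypothetical outcomes
for outcome in outcomes:
    fit = smf.ols(f"{outcome} ~ self_stigma + gender_identity_conflict", data=df).fit()
    print(outcome, fit.params.round(2).to_dict())
```

If self-stigma carries very different weights across those outcomes, that is exactly the "one size does not fit all" point above.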

My comments on the PTSD guidelines

Division 29 (Psychotherapy) did some great work putting together a special issue critiquing the APA guidelines on the treatment of PTSD. If you haven't already, check it out here. I read through the articles yesterday and had a ton of reactions; there was some really great work in there. I tweeted a bunch of those reactions out (link below), but I have one big takeaway:

If we continue to do research on therapy efficacy, we can't ignore the large body of literature supporting the working alliance as a central mechanism of change. RCTs that know of a causal mechanism but do not control for its effects are not performing good science. Working alliance CAN and MUST be measured for us to make any claims about 'what treatment is best'.

Alex Williams (the author of a fantastic recent article on the state of RCTs and evidence-based care) commented that researchers haven't done so because it 'didn't answer the question they wanted to ask', but blindfolds and intentional avoidance don't make for good science.

New Paper: MMPI-2-RF and Service Era differences

A new paper examining how era of military service influences veterans' scores on the MMPI-2-RF was just accepted at the Journal of Clinical Psychology in Medical Settings. We used a national sample of assessments conducted in PTSD Clinical Team (PCT) settings around the country and, after controlling for gender because of its differing rates across service eras, contrasted the mean scores and clinical elevation rates for veterans with Vietnam and Gulf War service histories. In contrast to the MMPI-2, which demonstrated that these service experiences resulted in distinctive symptom presentations, the MMPI-2-RF provides a more generalizable and consistent interpretive basis, with largely negligible differences between eras.

Click here to access a pre-print version of the paper.
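For a rough sense of the style of comparison described above (emphatically not the published analysis), here is a minimal sketch that assumes a hypothetical data file and column names, with RC1 standing in for any MMPI-2-RF scale and a T-score of 65 or higher as the conventional clinical elevation cutoff.

```python
# Illustrative sketch only: hypothetical file/columns, not the published analysis.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pct_mmpi2rf_sample.csv")  # hypothetical PCT assessment sample

# Compare mean scale scores across service era while adjusting for gender (ANCOVA-style).
era_model = smf.ols("RC1 ~ C(service_era) + C(gender)", data=df).fit()
print(era_model.summary())

# Clinical elevation rates (T-score >= 65) by service era and gender.
df["RC1_elevated"] = df["RC1"] >= 65
print(df.groupby(["service_era", "gender"])["RC1_elevated"].mean())
```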

Upcoming APA Presentation by Brittney Golden

Following up on her AACN presentation on the sensitivity of the MMPI-2-RF over-reporting scales in a military concussion clinic (a project that just got a revise-and-resubmit as well!), Brittney is presenting at APA on potential test bias: the differential performance observed across the over-reporting scales based on moderators previously identified meta-analytically.

[Poster: APAposter_MMPIbias_Final]

This isn't quite the final version just yet, but I couldn't wait to share! Long story short: the same moderators we see impacting performance (score and rate of invalidity) in meta-analysis show up when we look for them in individual samples. White, college-educated men with no mental health diagnoses endorse less pathology and invalidate scores less frequently within samples drawn from the same referral source.

She is doing an awesome job on this project. Check it out.

Determining authorship order (counseling publication project)

Dr. Currin and I finished our analysis of publication trends among academic counseling psychology faculty today, and now I'm finishing up the discussion section. The irony of writing a paper completely collaboratively, and of not taking authorship too seriously in a publish-or-perish academic world, really struck a chord with both of us. So… we decided our authorship order entirely by a competitive (but 100% not serious) game of Zombie Dice.

We even had a judge (whose office we used so that we had 'neutral territory'); a big thanks to Dr. Lindsay Greenlee for her officiating. In case you can't tell from Joe's deflated posture, I won (this was the best impersonation of Gaston I could muster).

[Photo]


Upcoming presentation of research

A new collaborator of mine (Dr. Pat Armistead-Jehle) is presenting some work that Brittney Golden and I have been doing with him on the MMPI-2-RF validity scales and their detection of failed performance validity testing at the upcoming American Academy of Clinical Neuropsychology (AACN) conference. Brittney did a fantastic job as first author on this poster.


Want to know more? Download our poster [click here]

Golden, B., Ingram, P. B., & Armistead-Jehle, P. J. (2019, June). Evaluating the effectiveness of the MMPI-2-RF over-reporting scales in a military neuropsychology clinic. Poster accepted for presentation at the 2019 American Academy of Clinical Neuropsychology Conference, Chicago, IL.

In-class debate on mechanisms of change

This is one of my favorite days of teaching Theories and Interventions. At the end of the semester, after spending the preceding months working through sections on specific theories/mechanisms of change (e.g., dynamic, cognitive, behavioral), I introduce students to the debate between common and specific factors. After a few days of lecture, I break them into debate teams to address some of the 'big picture' questions that come out of this debate.

I ask things like: (a) do ESTs provide the BEST care for our clients? (b) if you were designing a psychology training program and selecting course focus, should specific-factor EST exposure/training be emphasized, or should common factors/micro-skills be the focus to make 'better clinicians'? and (c) how directive can (and should) manualized treatments be?

I was super proud of how well the 1st year counseling students did – they killed the debate (with teams switching positions halfway through).


Projects examining training and competence

In a follow-up to the pilot study on personality assessment in Training and Education in Professional Psychology, the lab has launched several related data collection projects focused on training. Our national study on assessment competency is live and recruiting (n ~ 400 as of today), and we hope to conclude data collection over the summer and start our analyses. It's a great time to promote evidence-based assessment and evidence-based assessment training! The new project expands on many of the themes from our pilot study across all aspects of assessment (personality, cognitive, neuropsychological, and miscellaneous instruments), as well as examining supervision trends, supervision satisfaction, and instrument selection heuristics.

We (Ingram and Schmidt) are also launching an up-to-date study on APPIC and intern selection.  It’s high time that we determine what “fit” is and the types of criteria used in internship sites (not just across the board, as has been done previously; see Ginkel, Davis, & Michael, 2010; Rodolfa et al., 1999) but at specific types of sites (e.g., VA, Medical School, Counseling Center, etc.). Intern applicants have specific interests so it is worth providing them specific information to enable intentional course/application planning. We anticipate data collection starting within the month.