Person:
López López, José Antonio

Name
López López, José Antonio
Department
Universidad de Murcia. Departamento de Psicología Básica y Metodología

Search Results

  • Publication
    Open Access
    Reliability Generalization of the School Attitude Assessment Survey‑Revised: A Meta‑Analytic Structural Equation Modeling Approach
    (Sage Publications, 2025-06-09) López López, José Antonio; López Nicolás, Rubén; Sandoval Lentisco, Alejandro; Sánchez Meca, Julio; Veas Iniesta, Alejandro; Psicología Evolutiva y de la Educación; Facultad de Psicología y Logopedia
    The School Attitude Assessment Survey-Revised (SAAS-R) is a popular scale for assessing attitudinal and motivational aspects of students’ academic achievement. However, evidence on key psychometric properties of the SAAS-R, such as reliability, remains limited. We conducted a reliability generalization study of the SAAS-R using meta-analytic structural equation modeling (MASEM). We included studies that reported an application of the SAAS-R and provided correlation coefficients between the SAAS-R subscales. We searched ERIC, PsycINFO, Academic Search Premier, Supplemental Index, and Web of Science from database inception to July 2023. Analyses were based on 18 independent matrices from 13 studies examining 8,712 participants. Our main results, based on a one-stage, correlation-based MASEM approach and using omega total as the reliability measure, yielded an overall reliability estimate of 0.795 (95% CI 0.778–0.811). This suggests that the SAAS-R offers good score reliability for research and practice purposes. Applications using adapted versions of the scale obtained, on average, higher score reliabilities than those using the original version. We discuss the implications of these results, which need to be interpreted with caution given the substantial reporting limitations of the primary studies included.
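    As background on the reliability measure mentioned above (a standard formulation, not taken from the abstract): for a single-factor model with standardized loadings λᵢ and error variances ψᵢ over k items, McDonald's omega total is

```latex
\omega_t \;=\; \frac{\left(\sum_{i=1}^{k}\lambda_i\right)^{2}}
                    {\left(\sum_{i=1}^{k}\lambda_i\right)^{2} \;+\; \sum_{i=1}^{k}\psi_i}
```

    Values around 0.80, such as the pooled estimate of 0.795 reported above, are conventionally interpreted as good score reliability.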
  • Publication
    Open Access
    Reproducibility of Published Meta-Analyses on Clinical-Psychological Interventions
    (SAGE Publications, 2024-02-05) López Nicolás, Rubén; Lakens, Daniel; López López, José Antonio; Rubio Aparicio, María; Sandoval Lentisco, Alejandro; López-Ibáñez, Carmen; Blázquez-Rincón, Desirée; Sánchez Meca, Julio; Psicología Básica y Metodología; Facultad de Psicología y Logopedia
    Meta-analysis is one of the most useful research approaches, the relevance of which relies on its credibility. Reproducibility of scientific results could be considered as the minimal threshold of this credibility. We assessed the reproducibility of a sample of meta-analyses published between 2000 and 2020. From a random sample of 100 articles reporting results of meta-analyses of interventions in clinical psychology, 217 meta-analyses were selected. We first tried to retrieve the original data by recovering a data file, recoding the data from document files, or requesting it from original authors. Second, through a multistage workflow, we tried to reproduce the main results of each meta-analysis. The original data were retrieved for 67% (146/217) of meta-analyses. Although this rate showed an improvement over the years, in only 5% of these cases was it possible to retrieve a data file ready for reuse. Of these 146, 52 showed a discrepancy larger than 5% in the main results in the first stage. For 10 meta-analyses, this discrepancy was solved after fixing a coding error of our data-retrieval process, and for 15 of them, it was considered approximately reproduced in a qualitative assessment. In the remaining meta-analyses (18%, 27/146), different issues were identified in an in-depth review, such as reporting inconsistencies, lack of data, or transcription errors. Nevertheless, the numerical discrepancies were mostly minor and had little or no impact on the conclusions. Overall, one of the biggest threats to the reproducibility of meta-analysis is related to data availability and current data-sharing practices in meta-analysis.
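    The reproduction workflow described above can be sketched in miniature: pool retrieved effect sizes and flag results whose reproduced estimate deviates from the reported one by more than 5%. This is a minimal sketch with hypothetical data, using a fixed-effect inverse-variance model rather than whatever model each original meta-analysis actually used.

```python
import math

def pool_fixed_effect(effects, variances):
    """Inverse-variance (fixed-effect) pooled estimate and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

def relative_discrepancy(reported, reproduced):
    """Relative difference used to flag results deviating by more than 5%."""
    return abs(reported - reproduced) / abs(reported)

# Hypothetical primary-study data: effect sizes and sampling variances.
effects = [0.42, 0.35, 0.58, 0.29]
variances = [0.02, 0.03, 0.015, 0.025]

pooled, se = pool_fixed_effect(effects, variances)

# Hypothetical originally reported pooled effect of 0.41:
# the reproduced value differs by more than 5%, so it would be flagged.
flagged = relative_discrepancy(0.41, pooled) > 0.05
```

    In the study itself, flagged discrepancies were then triaged further (coding errors, qualitative assessment, in-depth review), so a flag is the start of the workflow rather than a verdict.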
  • Publication
    Open Access
    Transparency in Cognitive Training Meta-analyses: A Meta-review
    (Springer Nature, 2024-04-19) López Nicolás, Rubén; Sandoval Lentisco, Alejandro; Tortajada Gomariz, Miriam; López López, José Antonio; Sánchez Meca, Julio; Psicología Básica y Metodología; Facultad de Psicología y Logopedia
    Meta-analyses often involve flexibility regarding their inclusion criteria, outcomes of interest, statistical analyses, and assessments of the primary studies. For this reason, all information that could impact the results needs to be reported transparently. In this meta-review, we aimed to assess the transparency of meta-analyses examining the benefits of cognitive training, given the ongoing controversy in this field. Ninety-seven meta-analytic reviews were included, covering a wide range of populations with different clinical conditions and ages. Regarding reporting, most reviews detailed the study search, the screening procedure, and data collection. However, authors usually failed to report other aspects, such as the specific meta-analytic parameters, the formula used to compute the effect sizes, or the primary-study data from which the effect sizes were computed. Although some of these practices have improved over the years, others have remained the same. Moreover, examining the eligibility criteria of the reviews revealed great heterogeneity in aspects such as the training duration, age cut-offs, or study designs considered. Preregistered meta-analyses often specified poorly in their protocols how they would handle the multiplicity of data or assess publication bias, and some contained undisclosed deviations in their eligibility criteria or outcomes of interest. The findings reported here, although they do not call the benefits of cognitive training into question, highlight important aspects that future reviews must consider.
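    As an illustration of the kind of effect-size formula the reviews often left unreported (a standard formulation, not drawn from the article): the standardized mean difference between a training group and a control group, with Hedges' small-sample correction, is

```latex
d = \frac{\bar{x}_1 - \bar{x}_2}{s_p}, \qquad
s_p = \sqrt{\frac{(n_1 - 1)\,s_1^{2} + (n_2 - 1)\,s_2^{2}}{n_1 + n_2 - 2}}, \qquad
g \approx d \left( 1 - \frac{3}{4(n_1 + n_2) - 9} \right)
```

    Because several such formulas and corrections are in common use, omitting which one was applied makes a meta-analysis hard to reproduce exactly.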