Quantitative trait loci analysis using the false discovery rate

Yoav Benjamini, Daniel Yekutieli*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


False discovery rate control has become an essential tool in any study that faces a very large multiplicity problem. False discovery rate-controlling procedures have also been found to be very effective in QTL analysis, ensuring reproducible results with few falsely discovered linkages and offering increased power to discover QTL, although their acceptance has been slower than in microarray analysis, for example. This is partly because the methodological aspects of applying the false discovery rate to QTL mapping are not well developed. Our aim in this work is to lay a solid foundation for the use of the false discovery rate in QTL mapping. We review the false discovery rate criterion, the appropriate interpretation of the FDR, and alternative formulations of the FDR that have appeared in the statistical and genetics literature. We discuss important features of the FDR approach, some stemming from new developments in FDR theory and methodology, that make it especially useful in linkage analysis. We review false discovery rate-controlling procedures - the BH, the resampling procedure, and the adaptive two-stage procedure - and discuss the validity of these procedures in single- and multiple-trait QTL mapping. Finally we argue that the control of the false discovery rate has an important role in suggesting, indicating the significance of, and confirming QTL, and we present guidelines for its use.
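As a point of reference for the BH procedure named in the abstract, the following is a minimal generic sketch of the Benjamini-Hochberg step-up procedure, not the authors' QTL-specific methodology; the p-values in the usage example are hypothetical.

```python
def benjamini_hochberg(pvals, q=0.05):
    """Return indices of hypotheses rejected at FDR level q (BH step-up)."""
    m = len(pvals)
    # Sort p-values ascending, remembering original positions.
    order = sorted(range(m), key=lambda i: pvals[i])
    # Find the largest rank k (1-based) with p_(k) <= (k/m) * q.
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if pvals[idx] <= rank / m * q:
            k_max = rank
    # Reject the hypotheses with the k_max smallest p-values.
    return sorted(order[:k_max])

# Hypothetical linkage-test p-values, for illustration only.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
rejected = benjamini_hochberg(pvals, q=0.05)  # indices of discovered linkages
```

Note the step-up character of the search: a p-value may exceed its own threshold yet still be rejected if some larger p-value falls below its threshold, which is what distinguishes BH from per-test (Bonferroni-style) cutoffs.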

Original language: English
Pages (from-to): 783-790
Number of pages: 8
Issue number: 2
State: Published - Oct 2005


