The knockoff filter (Ann. Statist. 43 (2015) 2055–2085) is a flexible framework for multiple testing in supervised learning models, based on introducing synthetic predictor variables to control the false discovery rate (FDR). Using the conditional calibration framework (Ann. Statist. 50 (2022) 3091–3118), we introduce the calibrated knockoff procedure, a method that uniformly improves the power of any fixed-X or model-X knockoff procedure. We show theoretically and empirically that the improvement is especially notable in two contexts where knockoff methods can be nearly powerless: when the rejection set is small, and when the structure of the design matrix in fixed-X knockoffs prevents us from constructing good knockoff variables. In these contexts, calibrated knockoffs even outperform competing FDR-controlling methods like the (dependence-adjusted) Benjamini–Hochberg procedure in many scenarios.
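For reference, and as background rather than a statement of the new calibrated procedure, the knockoff filter cited above computes a feature statistic \(W_j\) for each predictor from the augmented design containing both the original and the synthetic (knockoff) variables, with large positive \(W_j\) indicating evidence against the null hypothesis for variable \(j\). The knockoff+ selection rule then rejects every \(j\) with \(W_j \ge \hat\tau\), where
\[
\hat\tau \;=\; \min\Bigl\{\, t > 0 \;:\; \frac{1 + \#\{j : W_j \le -t\}}{\max\bigl(\#\{j : W_j \ge t\},\, 1\bigr)} \;\le\; q \,\Bigr\},
\]
a data-dependent threshold that controls the FDR at the target level \(q\).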