We consider inference in models defined by approximate moment conditions. We show that near-optimal confidence intervals (CIs) can be formed by taking a generalized method of moments (GMM) estimator, and adding and subtracting the standard error times a critical value that takes into account the potential bias from misspecification of the moment conditions. In order to optimize performance under potential misspecification, the weighting matrix for this GMM estimator takes into account this potential bias, and therefore differs from the one that is optimal under correct specification. To formally show the near-optimality of these CIs, we develop asymptotic efficiency bounds for inference in the locally misspecified GMM setting. These bounds may be of independent interest, due to their implications for the possibility of using moment selection procedures when conducting inference in moment condition models. We apply our methods in an empirical application to automobile demand, and show that adjusting the weighting matrix can shrink the CIs by a factor of 3 or more.
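As a minimal illustration of the bias-aware critical values the abstract describes, the sketch below assumes the estimator has a Gaussian limit and that the worst-case standardized bias (bias divided by the standard error) is bounded by some known constant t. Under these assumptions, a CI of the form estimate ± cv·se maintains coverage if cv is taken to be the (1 − α) quantile of |N(t, 1)|, since the worst case over biases |b| ≤ t·se occurs at the endpoints. This is a sketch under stated assumptions, not the paper's implementation; the function name `bias_aware_cv` is ours.

```python
from scipy.stats import foldnorm

def bias_aware_cv(t, alpha=0.05):
    """Critical value cv_alpha(t): the (1 - alpha) quantile of |N(t, 1)|.

    For Z ~ N(0, 1) and any |b| <= t, coverage P(|Z + b| <= cv) is minimized
    at |b| = t, so this quantile of the folded normal with location t
    guarantees coverage 1 - alpha for the interval estimate +/- cv * se.
    """
    return foldnorm.ppf(1 - alpha, c=t)

# With no potential misspecification (t = 0), this reduces to the usual
# two-sided z critical value.
print(round(bias_aware_cv(0.0), 2))  # -> 1.96
# With worst-case bias equal to one standard error, the interval must widen.
print(bias_aware_cv(1.0) > bias_aware_cv(0.0))  # -> True
```

Note that the resulting critical value grows roughly linearly in t for large t (approximately t plus the one-sided normal quantile), which is why controlling the bias through the choice of weighting matrix, as the abstract emphasizes, can shrink the resulting CIs substantially.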