IEEE Transactions on Information Theory, 2019
In this work, we characterize the class of divergences whose construction bypasses nonparametric smoothing, leading to minimum distance estimation that avoids the issues, such as the curse of dimensionality and smoothness requirements, exhibited by the usual kernel density estimators.
with Ayanendranath Basu [Paper link]
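For a concrete, deliberately generic picture of what bypassing smoothing buys: when a divergence compares distribution functions rather than densities, the minimum distance estimator can be computed directly from the empirical distribution, with no kernel density estimate involved. The divergence D and the model family below are placeholders, not the specific class characterized in the paper.

```latex
% Illustrative template only: D and the model family {F_theta} are placeholders,
% not the divergence class characterized in the paper.
\[
  \hat{\theta}_n \;=\; \operatorname*{arg\,min}_{\theta \in \Theta}\, D\bigl(F_n, F_\theta\bigr),
  \qquad
  F_n(x) \;=\; \frac{1}{n}\sum_{i=1}^{n} \mathbf{1}\{X_i \le x\}.
\]
```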
Conference on Learning Theory, 2020
We determine the minimax rate for estimating the profile of a finite population in the small-sample regime. Our approach characterizes both the upper and lower bounds via an infinite-dimensional optimization problem based on Wolfowitz's minimum distance estimators.
with Yury Polyanskiy and Yihong Wu [Paper link]
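As a quick illustration of the object being estimated: the profile (or fingerprint) of a sample records, for each multiplicity j, how many distinct values appear exactly j times. The helper below only sketches this standard definition; the paper's estimand is the profile of the underlying population, and this plug-in empirical profile is not the estimator analyzed there.

```python
from collections import Counter

def empirical_profile(sample):
    """Map multiplicity j -> number of distinct values seen exactly j times."""
    counts = Counter(sample)                 # value -> how often it appears
    return dict(Counter(counts.values()))    # multiplicity -> number of such values

# 'a' appears twice, 'b' and 'c' once each, so the profile is {2: 1, 1: 2}.
print(empirical_profile(["a", "b", "a", "c"]))
```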
Conference on Neural Information Processing Systems (NeurIPS), 2021
Analyzing a prediction problem for first-order Markov chains, we establish the optimal minimax rate as a function of the state-space size and the sample size. We also investigate how, for reversible chains, a spectral gap makes it possible to achieve the parametric rate of estimation.
with Yanjun Han and Yihong Wu [Paper link]
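To make the prediction problem concrete, here is a minimal baseline, an add-constant smoother of my own choosing rather than the rate-optimal estimator from the paper: from a single trajectory, count the transitions out of the current state and smooth those counts to predict the distribution of the next state.

```python
import numpy as np

def predict_next_state_dist(trajectory, k, alpha=0.5):
    """Add-constant estimate of the next-state distribution; illustrative baseline only.
    `trajectory` lists states in {0, ..., k-1}; `alpha` is an assumed smoothing constant."""
    counts = np.zeros((k, k))
    for s, t in zip(trajectory[:-1], trajectory[1:]):
        counts[s, t] += 1                  # observed transition s -> t
    row = counts[trajectory[-1]] + alpha   # smoothed row of the current state
    return row / row.sum()

# Example on a short binary chain ending in state 1.
print(predict_next_state_dist([0, 1, 1, 0, 1], k=2))
```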
We study the problem of support recovery in linear models using the non-convex stochastic gates (STG) penalty. We discuss both theoretical guarantees and practical applications.
with Henry Li, Yutaro Yamada and Ofir Lindenbaum [Paper link]
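For context, a stochastic gate attaches to each feature a clipped Gaussian variable in [0, 1], and the expected number of open gates serves as a differentiable surrogate for the L0 penalty. The snippet below follows that general recipe; the noise scale, the clipping, and how the pieces enter the training loss are my assumptions here, not the exact construction analyzed in the paper.

```python
import numpy as np
from scipy.stats import norm

def stg_gates_and_penalty(mu, sigma=0.5, rng=None):
    """Sample stochastic gates and the expected-L0 surrogate penalty (sketch only).
    mu: learnable gate means, one per feature; sigma: assumed noise scale."""
    rng = np.random.default_rng() if rng is None else rng
    eps = rng.standard_normal(mu.shape)
    gates = np.clip(mu + sigma * eps, 0.0, 1.0)   # hard-clipped Gaussian gates in [0, 1]
    penalty = norm.cdf(mu / sigma).sum()          # E[number of gates that are > 0]
    return gates, penalty

# The gates multiply the features elementwise (X * gates), and the penalty is
# added to the regression loss, pushing gates on irrelevant features toward 0.
gates, penalty = stg_gates_and_penalty(np.zeros(5))
print(gates, penalty)
```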
In preparation, 2021
with Yury Polyanskiy and Yihong Wu
I presented my paper, "Optimal prediction of Markov chains with and without spectral gaps," at NeurIPS 2021. Here is a link to the poster.