IEEE Transactions on Information Theory, 2019
In this work we characterize the class of divergences whose construction bypasses non-parametric smoothing, leading to minimum distance estimation free of issues such as dimensionality and smoothness that the usual kernel density estimators exhibit.
with Ayanendranath Basu [Paper link]
Conference on Learning Theory, 2020
We determine the minimax rate of estimating the profile of a finite population in a certain small-sample regime. Our method characterizes both the upper and lower bounds via an infinite-dimensional optimization problem based on Wolfowitz's minimum distance estimators.
with Yury Polyanskiy and Yihong Wu [Paper link]
Analyzing prediction problems on first-order Markov chains, we establish the optimal rate as a function of the state-space size and the sample size. We also analyze the effect of spectral gaps for reversible chains, where the parametric rate of estimation becomes achievable.
with Yanjun Han and Yihong Wu [Paper link]
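To make the prediction problem concrete: given one sample path of the chain, a natural plug-in predictor estimates the transition row of the current state from empirical counts. The add-constant smoothing below is an illustrative baseline, not the estimator analyzed in the paper, and the function name and `alpha=1.0` default are my own choices.

```python
import numpy as np

def predict_next_state_probs(path, k, alpha=1.0):
    """Add-alpha smoothed estimate of the next-state distribution.

    path  : sequence of observed states in {0, ..., k-1}
    k     : number of states
    alpha : smoothing constant (illustrative default)
    """
    # Count observed one-step transitions along the single sample path.
    counts = np.zeros((k, k))
    for s, t in zip(path[:-1], path[1:]):
        counts[s, t] += 1
    # Smooth the row of the current (last observed) state and normalize.
    row = counts[path[-1]] + alpha
    return row / row.sum()
```

For example, on the path `[0, 1, 0, 1, 0]` with `k=2` the predictor returns a distribution favoring state 1, reflecting the two observed 0-to-1 transitions.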
We study the problem of support-set recovery in linear models using the non-convex penalties of stochastic gates (STG). Both theoretical and applied aspects are discussed.
with Henry Li, Yutaro Yamada and Ofir Lindenbaum [Paper link]
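The stochastic-gate relaxation underlying this line of work can be sketched in a few lines: each feature gets a gate obtained by clipping a Gaussian perturbation to [0, 1], and the penalty is the expected number of open gates. This is a generic sketch of the clipped-Gaussian construction, not the paper's code; the names `stg_gate`/`stg_penalty` and the value `sigma=0.5` are illustrative.

```python
import numpy as np
from math import erf, sqrt

def stg_gate(mu, sigma=0.5, rng=None):
    """Sample relaxed Bernoulli gates: clip a Gaussian perturbation of mu to [0, 1]."""
    rng = np.random.default_rng() if rng is None else rng
    return np.clip(mu + sigma * rng.standard_normal(np.shape(mu)), 0.0, 1.0)

def stg_penalty(mu, sigma=0.5):
    """Expected number of open gates: sum_i P(mu_i + sigma * eps > 0) = sum_i Phi(mu_i / sigma)."""
    return sum(0.5 * (1.0 + erf(m / (sigma * sqrt(2.0)))) for m in np.atleast_1d(mu))
```

In a regression sketch one would multiply the weights elementwise by `stg_gate(mu)` and add `lam * stg_penalty(mu)` to the loss, so that gradient descent on `mu` drives irrelevant gates toward zero.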
In preparation, 2021
with Yury Polyanskiy and Yihong Wu
I presented my paper, "Optimal prediction of Markov chains with and without spectral gaps," at NeurIPS 2021. Here is a link to the poster.