A characterization of all single-integral, non-kernel divergence estimators

IEEE Transactions on Information Theory, 2019

In this work we characterize the class of divergences whose construction bypasses non-parametric smoothing, leading to minimum distance estimation free of issues, such as the curse of dimensionality and smoothness assumptions, that usual kernel density estimators exhibit.

with Ayanendranath Basu [Paper link]
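As a toy illustration of minimum distance estimation that needs no density smoothing (this is not the estimator from the paper), one can fit a location parameter by minimizing the Cramér–von Mises distance between the model CDF and the empirical CDF; no kernel density estimate, and hence no bandwidth choice, is involved:

```python
import numpy as np
from scipy import optimize, stats

# Simulated data from N(2, 1); the goal is to recover the location 2.0.
rng = np.random.default_rng(0)
data = np.sort(rng.normal(loc=2.0, scale=1.0, size=500))
n = len(data)
ecdf = (np.arange(1, n + 1) - 0.5) / n  # empirical CDF at the order statistics

def cvm_distance(theta):
    """Cramér–von Mises-type distance between the N(theta, 1) model CDF
    and the empirical CDF, evaluated at the data points."""
    return np.sum((stats.norm.cdf(data, loc=theta, scale=1.0) - ecdf) ** 2)

res = optimize.minimize_scalar(cvm_distance, bounds=(-10.0, 10.0), method="bounded")
theta_hat = res.x  # minimum distance estimate of the location
print(theta_hat)
```

Because the distance is computed from CDFs rather than densities, the estimator sidesteps the smoothing issues mentioned above while retaining the robustness typical of minimum distance methods.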

Extrapolating the profile of a finite population

Conference on Learning Theory, 2020

We determine the minimax rate of estimating the profile of a finite population in a certain small-sample regime. Our method characterizes both the upper and lower bounds via an infinite-dimensional optimization problem based on Wolfowitz's minimum distance estimators.

with Yury Polyanskiy and Yihong Wu [Paper link]
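For concreteness, the profile of a sample records, for each multiplicity j, how many distinct symbols appear exactly j times. A minimal sketch of computing it (an illustration of the object being estimated, not the paper's estimator):

```python
from collections import Counter

def profile(sample):
    """Profile of a sample: maps each multiplicity j to the number of
    distinct symbols appearing exactly j times."""
    multiplicities = Counter(sample)        # symbol -> number of occurrences
    return Counter(multiplicities.values()) # multiplicity -> number of symbols

# "a" occurs twice, "b" once, "c" three times:
# one distinct symbol each at multiplicities 1, 2, and 3.
print(profile(["a", "a", "b", "c", "c", "c"]))
```

The profile discards symbol labels and keeps only the multiplicity pattern, which is exactly the sufficient statistic for symmetric functionals of the population.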