## Project 1: The Entropy-Influence Conjecture

The entropy-influence conjecture was posed by Ehud Friedgut and Gil Kalai in 1996.

For a boolean function $f:\{-1,1\}^{n}\rightarrow\{-1,1\}$, its total influence is $I(f) := \sum_{i\in[n]}\Pr_{x}[f(x) \neq f(x\oplus e_{i})]$, where $x\oplus e_{i}$ denotes $x$ with its $i$-th coordinate flipped.
The (Fourier) entropy of $f$ is defined by $\mathcal{H}(f):= -\sum_{S\subseteq[n]}\hat{f}(S)^{2}\log(\hat{f}(S)^{2})$; by Parseval, $\{\hat{f}(S)^{2}\}_{S}$ is a probability distribution, and $\mathcal{H}(f)$ is its Shannon entropy. The entropy-influence conjecture asks whether $\mathcal{H}(f) = O(I(f))$ for every boolean function $f$.
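To make the definitions concrete, here is a small brute-force sketch (the helper names `fourier`, `influence`, and `spectral_entropy` are my own, and I use $\log_2$) that computes $I(f)$ and $\mathcal{H}(f)$ by enumerating the cube. For $\mathrm{MAJ}_3$ the spectrum puts squared mass $1/4$ on each of $\{1\},\{2\},\{3\},\{1,2,3\}$, so $\mathcal{H}(f)=2$ and $I(f)=3/2$.

```python
import math
from itertools import product

def fourier(f, n):
    """All Fourier coefficients hat{f}(S) = E_x[f(x) * prod_{i in S} x_i]."""
    cube = list(product([-1, 1], repeat=n))
    coeffs = {}
    for bits in product([0, 1], repeat=n):
        S = [i for i in range(n) if bits[i]]
        coeffs[tuple(S)] = sum(f(x) * math.prod(x[i] for i in S) for x in cube) / len(cube)
    return coeffs

def influence(f, n):
    """Total influence I(f) = sum_i Pr_x[f(x) != f(x with coordinate i flipped)]."""
    cube = list(product([-1, 1], repeat=n))
    return sum(
        sum(f(x) != f(x[:i] + (-x[i],) + x[i + 1:]) for x in cube) / len(cube)
        for i in range(n)
    )

def spectral_entropy(f, n):
    """H(f) = -sum_S hat{f}(S)^2 * log2(hat{f}(S)^2), entropy of the Fourier spectrum."""
    return -sum(c * c * math.log2(c * c) for c in fourier(f, n).values() if c != 0)

# Majority on 3 bits: I(f) = 1.5 and H(f) = 2 (four coefficients of squared mass 1/4).
maj3 = lambda x: 1 if sum(x) > 0 else -1
print(influence(maj3, 3), spectral_entropy(maj3, 3))  # -> 1.5 2.0
```

Exhaustive enumeration takes $2^{n}$ (or $4^{n}$ for all coefficients) time, so this is only for sanity-checking the conjecture on small examples.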

To my knowledge, the best known general result is due to Gopalan, Servedio, Tal and Wigderson [1], who proved that $\mathcal{H}(f) = O(\log (s_{f}) \cdot I(f))$, where $s_{f}$ is the sensitivity of $f$. By proving a robust version of [1], Lovett, Tal and Zhang [2] showed that $s_{f}$ can be replaced by the robust sensitivity. In particular, this gives $\mathcal{H}(f) = O(w\cdot \log w)$ for any width-$w$ DNF $f$.

The entropy-influence conjecture is known to hold for several classes of boolean functions, but it remains open in general. Even the much weaker bound $\mathcal{H}(f) = 2^{O(I(f))}$ is non-trivial to prove.

An easier question is the min-entropy-influence conjecture, which asks whether $\mathcal{H}_{\infty}(f) = O(I(f))$, where $\mathcal{H}_{\infty}(f) := -\log\max_{S}\hat{f}(S)^{2}$ is the min-entropy of the Fourier distribution. By Friedgut's junta theorem, one can prove $\mathcal{H}_{\infty}(f) = O(I(f)^{2})$. It is interesting to ask whether the same bound holds for the Shannon entropy, i.e., can we prove that $\mathcal{H}(f) = O(I(f)^{2})$?
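The min-entropy only depends on the single largest squared Fourier coefficient, so it is even easier to compute by brute force. A minimal sketch (the name `min_entropy` is my own; base-2 logarithm assumed): for $\mathrm{MAJ}_3$ the largest squared coefficient is $1/4$, giving $\mathcal{H}_{\infty}(f)=2$.

```python
import math
from itertools import product

def min_entropy(f, n):
    """H_inf(f) = -log2 max_S hat{f}(S)^2, computed by brute-force enumeration."""
    cube = list(product([-1, 1], repeat=n))
    best = 0.0
    for bits in product([0, 1], repeat=n):
        S = [i for i in range(n) if bits[i]]
        # hat{f}(S) = E_x[f(x) * chi_S(x)] with chi_S(x) = prod_{i in S} x_i
        c = sum(f(x) * math.prod(x[i] for i in S) for x in cube) / len(cube)
        best = max(best, c * c)
    return -math.log2(best)

maj3 = lambda x: 1 if sum(x) > 0 else -1
print(min_entropy(maj3, 3))  # -> 2.0 (largest squared coefficient is 1/4)
```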

[1] Gopalan, P., Servedio, R., Tal, A. and Wigderson, A., 2016. Degree and sensitivity: tails of two distributions. arXiv preprint arXiv:1604.07432.

[2] Lovett, S., Tal, A. and Zhang, J., 2018. The robust sensitivity of boolean functions. In Proceedings of the Twenty-Ninth Annual ACM-SIAM Symposium on Discrete Algorithms (pp. 1822-1833). Society for Industrial and Applied Mathematics.