Title: INFORMATION THEORY AND PRIVACY RELATED QUESTIONS IN COMMUNICATION COMPLEXITY
Currently accessible only within the Technion network
Abstract: We study the relation between the communication complexity of a task, i.e. the number of bits that the communicating parties must send in order to solve it, and its internal and external information complexities, measures of how much information the parties must reveal to each other, or to an external observer, about their inputs. We compare this relation to the tight relation between the communication needed for one-way message transmission and the entropy function, and show in which cases the two are analogous and in which they are not. We provide the first example of a computational task for which the internal and external information complexities are exponentially far apart, even though the output of the computation itself does not reveal much information. We discuss the characteristics of a communication model we define, in which at the end of a computation the output must be broadcast to the public. We discuss the use of private randomness as a method of hiding information from the other parties while performing a computational task, and we also provide a full proof of a long-known theorem stating that, for every Boolean function, the communication complexity is at least logarithmic in the input size if shared randomness is not allowed.
Copyright: The above paper is copyrighted by the Technion, the author(s), or others. Please contact the author(s) for more information.
Remark: Any link to this technical report should point to this page (http://www.cs.technion.ac.il/users/wwwb/cgi-bin/tr-info.cgi/2016/MSC/MSC-2016-11), rather than directly to the URLs of the PDF files, as the latter may change without notice.