Wednesday, 13.1.2016, 11:30
Traditionally, network measurement takes a small-data approach. Data is expensive, must be gathered unobtrusively, validated carefully, and used to address sharply defined problems if we are to obtain reliable answers. The undeniable existence of Big Data in and around telephone and data networks (which merged years ago) and the availability of tools for learning from unstructured masses of data are changing this. A recent EU tender has called for a "crowdsourcing" approach to characterizing Internet performance at its edge in all European countries, with protocols for gathering the data needed from many sources, archiving it for wider use, and providing privacy guidelines, while making all of this web-visible to all citizens.
The types of data and measurement systems involved in traditional passive and active network measurements have evolved as new means of observation emerge. The forces driving this evolution are much cheaper hardware for measurement at the edge, which makes ubiquitous measurement possible, and new customers for this information. Operators and researchers are now joined by end users and regulators, who all want to know whether they are getting the service and performance that they are paying for, and if not…?
Really big data sets are now available from telephone call records (CDRs and MCDRs) from every conversation, from cellphone sensors that monitor location and signal quality, and from smartphones, which additionally log all the applications that users choose to run and the associated data traffic volumes. I will show examples of the information that can be gleaned from a million or more smartphones. Besides monitoring all sorts of carrier and customer uses of new technologies, it is possible to make the Internet's interior visible from its edge. Thus we can assess network neutrality and pinpoint the sources of performance issues.
Scott Kirkpatrick is Professor of Engineering and Computer Science at the Hebrew University, Jerusalem, Israel. His research in recent years has focused on understanding the growth of living engineered organisms such as the Internet, and especially on the technical and human aspects of the huge space at the Internet's edges.
He was a researcher and manager in physics and computer science at the IBM TJ Watson Research Center, leading projects in new technologies for personal computing; there he led a team that prototyped and then shipped IBM's first ThinkPad, a pen-driven tablet computer. In 2000 he moved to his present position. As a physicist, his work on percolation and spin glasses led to simulated annealing, an optimization framework widely employed in automated computer design.