Nick Duffield (Rutgers University)
Wednesday, 30.4.2014, 11:30
Massive graph datasets are used operationally by providers of internet, social network and search services. Sampling can reduce storage requirements as well as query execution times, while prolonging the useful life of the data for baselining and retrospective analysis. Here, sampling must mediate between the characteristics of the data, the available resources, and the accuracy needs of queries. Inference methods can be used to fuse datasets which individually provide only an incomplete view of the system under study. In this talk we describe some successes in applying these ideas to massive Internet measurements and some potential new applications to inverse problems in urban informatics, and to bioinformatics.
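The storage-reduction role of sampling mentioned above can be illustrated with a minimal stream sampler. This is a generic sketch using classic reservoir sampling (Algorithm R) over an edge stream; the function name, the toy edge data, and the choice of reservoir sampling are assumptions for illustration, not the specific estimators from the talk.

```python
import random

def reservoir_sample(stream, k, seed=None):
    """Maintain a uniform random sample of k items from a stream of
    unknown length using one pass and O(k) memory (Algorithm R).
    Illustrative only; not the talk's own graph-query estimators."""
    rng = random.Random(seed)
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            sample.append(item)
        else:
            # Replace a random slot with probability k / (i + 1),
            # which keeps every item equally likely to survive.
            j = rng.randint(0, i)
            if j < k:
                sample[j] = item
    return sample

# Toy "graph" as a stream of 100,000 edges; only k of them are stored.
edges = [(u, u + 1) for u in range(100_000)]
sample = reservoir_sample(edges, k=10, seed=42)
```

The appeal for massive graph data is that memory is bounded by the sample size, not the stream length, so the retained sample can later serve retrospective queries long after the raw stream is gone.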
Nick Duffield joined Rutgers University as a Research Professor in 2013. Previously, he worked at AT&T Labs Research, where he was a Distinguished Member of Technical Staff and an AT&T Fellow, and held faculty positions in Europe. He works on network and data science, particularly the acquisition, analysis and applications of operational network data. He was formerly chair of the IETF Working Group on Packet Sampling, and an associate editor of the IEEE/ACM Transactions on Networking. He is an IEEE Fellow and was a co-recipient of the ACM Sigmetrics Test of Time Award in both 2012 and 2013 for work in network tomography. He was recently TPC Co-Chair of IFIP Performance 2013, a keynote speaker at the 25th International Teletraffic Congress in Shanghai, China, and an invited speaker at the workshop on Big Data in the Mathematical Sciences in Warwick, UK.