Jun 30, 2018
by Jonathan Scarlett; Vincent Y. F. Tan

This paper studies the second-order asymptotics of the discrete memoryless multiple-access channel with degraded message sets. For a fixed average error probability $\epsilon\in(0,1)$ and an arbitrary point on the boundary of the capacity region, we characterize the speed of convergence of rate pairs that converge to that point for codes that have asymptotic error probability no larger than $\epsilon$, thus complementing an analogous result given previously for the Gaussian setting.

Topics: Mathematics, Computing Research Repository, Information Theory

Source: http://arxiv.org/abs/1408.1119

Jun 29, 2018
by Ilija Bogunovic; Jonathan Scarlett; Volkan Cevher

We consider the sequential Bayesian optimization problem with bandit feedback, adopting a formulation that allows for the reward function to vary with time. We model the reward function using a Gaussian process whose evolution obeys a simple Markov model. We introduce two natural extensions of the classical Gaussian process upper confidence bound (GP-UCB) algorithm. The first, R-GP-UCB, resets GP-UCB at regular intervals. The second, TV-GP-UCB, instead forgets about old data in a smooth...
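The resetting idea behind R-GP-UCB can be illustrated with a toy sketch. The following is a minimal, illustrative implementation (not the authors' code): a standard GP-UCB loop over a 1-D grid whose observation buffer is emptied every `reset_every` rounds. The RBF kernel, `beta`, and all parameter values here are assumptions chosen for the example.

```python
import numpy as np

def rbf(x, y, ls=0.2):
    # Squared-exponential kernel with unit signal variance.
    return np.exp(-0.5 * ((x - y) / ls) ** 2)

def gp_posterior(X, y, xq, noise=1e-2, ls=0.2):
    # Standard GP regression posterior mean and variance at the query grid xq.
    K = rbf(X[:, None], X[None, :], ls) + noise * np.eye(len(X))
    Ks = rbf(xq[:, None], X[None, :], ls)
    mu = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mu, np.maximum(var, 1e-12)

def r_gp_ucb(f, grid, T=30, reset_every=10, beta=2.0):
    # GP-UCB whose data is discarded every `reset_every` rounds, mimicking
    # the periodic-reset idea behind R-GP-UCB for time-varying rewards.
    X, y, best = [], [], -np.inf
    for t in range(T):
        if t % reset_every == 0:
            X, y = [], []                      # reset: forget old data
        if not X:
            xq = grid[np.random.randint(len(grid))]
        else:
            mu, var = gp_posterior(np.array(X), np.array(y), grid)
            xq = grid[np.argmax(mu + beta * np.sqrt(var))]
        X.append(xq)
        y.append(f(xq))
        best = max(best, y[-1])
    return best
```

TV-GP-UCB would instead down-weight old observations smoothly rather than discarding them in blocks.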

Topics: Machine Learning, Learning, Computing Research Repository, Statistics

Source: http://arxiv.org/abs/1601.06650

Jun 30, 2018
by Jonathan Scarlett; Alfonso Martinez; Albert Guillén i Fàbregas

This paper presents a saddlepoint approximation of the random-coding union bound of Polyanskiy et al. for i.i.d. random coding over discrete memoryless channels. The approximation is single-letter, and can thus be computed efficiently. Moreover, it is shown to be asymptotically tight for both fixed and varying rates, unifying existing achievability results in the regimes of error exponents, second-order coding rates, and moderate deviations. For fixed rates, novel exact-asymptotics expressions...

Topics: Mathematics, Computing Research Repository, Information Theory

Source: http://arxiv.org/abs/1402.3941

Jun 29, 2018
by Jonathan Scarlett; Alfonso Martinez; Albert Guillén i Fàbregas

This paper studies channel coding for the discrete memoryless multiple-access channel with a given (possibly suboptimal) decoding rule. A multi-letter successive decoding rule depending on an arbitrary non-negative decoding metric is considered, and achievable rate regions and error exponents are derived both for the standard MAC (independent codebooks), and for the cognitive MAC (one user knows both messages) with superposition coding. In the cognitive case, the rate region and error exponent...

Topics: Information Theory, Computing Research Repository, Mathematics

Source: http://arxiv.org/abs/1612.00211

Jun 29, 2018
by Jonathan Scarlett; Volkan Cevher

In this paper, we consider the problem of estimating the underlying graph associated with an Ising model given a number of independent and identically distributed samples. We adopt an \emph{approximate recovery} criterion that allows for a number of missed edges or incorrectly-included edges, in contrast with the widely-studied exact recovery problem. Our main results provide information-theoretic lower bounds on the sample complexity for graph classes imposing constraints on the number of...

Topics: Machine Learning, Mathematics, Information Theory, Learning, Statistics, Computing Research...

Source: http://arxiv.org/abs/1602.03647

Jun 29, 2018
by Jonathan Scarlett; Volkan Cevher

We consider the group testing problem, in which one seeks to identify a subset of defective items within a larger set of items based on a number of noisy tests. While matching achievability and converse bounds are known in several cases of interest for i.i.d.~measurement matrices, less is known regarding converse bounds for arbitrary measurement matrices. We address this by presenting two converse bounds for arbitrary matrices and general noise models. First, we provide a strong converse bound...

Topics: Information Theory, Computing Research Repository, Mathematics

Source: http://arxiv.org/abs/1602.00875

Sep 23, 2013
by Jonathan Scarlett; Jamie Evans; Subhrakanti Dey

We consider a single antenna narrowband multiple access channel in which users send training sequences to the base station and scheduling is performed based on minimum mean square error (MMSE) channel estimates. In such a system, there is an inherent tradeoff between training overhead and the amount of multiuser diversity achieved. We analyze a block fading channel with independent Rayleigh distributed channel gains, where the parameters to be optimized are the number of users considered for...

Source: http://arxiv.org/abs/1105.3531v4

Jun 30, 2018
by Yen-Huan Li; Jonathan Scarlett; Pradeep Ravikumar; Volkan Cevher

We consider the model selection consistency or sparsistency of a broad set of $\ell_1$-regularized $M$-estimators for linear and non-linear statistical models in a unified fashion. For this purpose, we propose the local structured smoothness condition (LSSC) on the loss function. We provide a general result giving deterministic sufficient conditions for sparsistency in terms of the regularization parameter, ambient dimension, sparsity level, and number of measurements. We show that several...
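One concrete member of this estimator family is the lasso. As a hedged sketch (not from the paper), the following implements proximal gradient descent (ISTA) for the $\ell_1$-regularized least-squares loss and checks which coordinates survive thresholding; the problem sizes, regularization level, and threshold are illustrative assumptions.

```python
import numpy as np

def ista_lasso(A, b, lam, steps=500):
    # Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    # one member of the l1-regularized M-estimator family discussed above.
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        z = x - A.T @ (A @ x - b) / L        # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(1)
n, p = 60, 20
A = rng.normal(size=(n, p)) / np.sqrt(n)     # columns with roughly unit norm
x_true = np.zeros(p)
x_true[:3] = [2.0, -1.5, 1.0]                # sparse ground truth
b = A @ x_true + 0.01 * rng.normal(size=n)
x_hat = ista_lasso(A, b, lam=0.05)
support = set(np.flatnonzero(np.abs(x_hat) > 0.1).tolist())
```

Sparsistency, as studied above, asks when `support` matches the support of `x_true` as the dimensions and sparsity scale.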

Topics: Mathematics, Statistics Theory, Statistics

Source: http://arxiv.org/abs/1410.7605

Jun 29, 2018
by Matthew Aldridge; Oliver Johnson; Jonathan Scarlett

We consider nonadaptive group testing where each item is placed in a constant number of tests. The tests are chosen uniformly at random with replacement, so the testing matrix has (almost) constant column weights. We show that performance is improved compared to Bernoulli designs, where each item is placed in each test independently with a fixed probability. In particular, we show that the rate of the practical COMP detection algorithm is increased by 31% in all sparsity regimes. In dense...
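As a concrete illustration of the objects compared above, here is a minimal sketch (not the paper's code) of the COMP decoder together with a near-constant-column-weight design in which each item joins `L` tests chosen uniformly with replacement; all names and parameter values are illustrative.

```python
import random

def constant_column_design(n, T, L, rng):
    # Near-constant column weight design: each of the n items joins L of the
    # T tests, chosen uniformly at random with replacement.
    tests = [set() for _ in range(T)]
    for item in range(n):
        for _ in range(L):
            tests[rng.randrange(T)].add(item)
    return tests

def comp_decode(tests, outcomes, n):
    # COMP: every item appearing in a negative test is marked non-defective;
    # all remaining items are declared defective. In the noiseless setting
    # this never misses a true defective.
    defective = set(range(n))
    for members, positive in zip(tests, outcomes):
        if not positive:
            defective -= members
    return defective
```

A Bernoulli design would instead include each item in each test independently with a fixed probability; the abstract's claim is that the constant-column-weight choice improves the achievable rate of COMP.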

Topics: Statistics Theory, Statistics, Information Theory, Computing Research Repository, Mathematics

Source: http://arxiv.org/abs/1602.03471

Jun 28, 2018
by Luca Baldassarre; Yen-Huan Li; Jonathan Scarlett; Baran Gözcü; Ilija Bogunovic; Volkan Cevher

The problem of recovering a structured signal $\mathbf{x} \in \mathbb{C}^p$ from a set of dimensionality-reduced linear measurements $\mathbf{b} = \mathbf {A}\mathbf {x}$ arises in a variety of applications, such as medical imaging, spectroscopy, Fourier optics, and computerized tomography. Due to computational and storage complexity or physical constraints imposed by the problem, the measurement matrix $\mathbf{A} \in \mathbb{C}^{n \times p}$ is often of the form $\mathbf{A} =...

Topics: Information Theory, Statistics, Machine Learning, Mathematics, Learning, Computing Research...

Source: http://arxiv.org/abs/1510.06188

Jun 29, 2018
by Oliver Johnson; Matthew Aldridge; Jonathan Scarlett

We consider the nonadaptive group testing problem in the case that each item appears in a constant number of tests, chosen uniformly at random with replacement, so that the testing matrix has (almost) constant column weights. We analyse the performance of simple and practical algorithms in a range of sparsity regimes, showing that the performance is consistently improved in comparison with more standard Bernoulli designs. In particular, using a constant-column weight design, the DD algorithm is...

Topics: Probability, Information Theory, Computing Research Repository, Mathematics

Source: http://arxiv.org/abs/1612.07122

Jun 28, 2018
by Jonathan Scarlett; Anelia Somekh-Baruch; Alfonso Martinez; Albert Guillén i Fàbregas

This paper studies the mismatched decoding problem for binary-input discrete memoryless channels. An example is provided for which an achievable rate based on superposition coding exceeds the LM rate (Hui, 1983; Csisz\'ar-K\"orner, 1981), thus providing a counter-example to a previously reported converse result (Balakirsky, 1995). Both numerical evaluations and theoretical results are used in establishing this claim.

Topics: Information Theory, Computing Research Repository, Mathematics

Source: http://arxiv.org/abs/1508.02374

Jun 29, 2018
by Jonathan Scarlett; Volkan Cevher

We consider the problem of estimating the underlying graph associated with a Markov random field, with the added twist that the decoding algorithm can iteratively choose which subsets of nodes to sample based on the previous samples, resulting in an active learning setting. Considering both Ising and Gaussian models, we provide algorithm-independent lower bounds for high-probability recovery within the class of degree-bounded graphs. Our main results are minimax lower bounds for the active...

Topics: Machine Learning, Mathematics, Information Theory, Learning, Statistics, Computing Research...

Source: http://arxiv.org/abs/1607.02413

Jun 26, 2018
by Jonathan Scarlett; Volkan Cevher

The support recovery problem consists of determining a sparse subset of a set of variables that is relevant in generating a set of observations, and arises in a diverse range of settings such as compressive sensing, subset selection in regression, and group testing. In this paper, we take a unified approach to support recovery problems, considering general probabilistic models relating a sparse data vector to an observation vector. We study the information-theoretic limits of both exact and...

Topics: Statistics Theory, Mathematics, Information Theory, Computing Research Repository, Machine...

Source: http://arxiv.org/abs/1501.07440

Jun 28, 2018
by Jonathan Scarlett; Vincent Y. F. Tan; Giuseppe Durisi

We study the second-order asymptotics of information transmission using random Gaussian codebooks and nearest neighbor (NN) decoding over a power-limited stationary memoryless additive non-Gaussian noise channel. We show that the dispersion term depends on the non-Gaussian noise only through its second and fourth moments, thus complementing the capacity result (Lapidoth, 1996), which depends only on the second moment. Furthermore, we characterize the second-order asymptotics of point-to-point...
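The coding setup above can be sketched in a few lines: draw a random Gaussian codebook, transmit one codeword through additive non-Gaussian noise, and decode by minimum Euclidean distance. This is an illustrative toy (a single short block with uniform noise and made-up parameters), not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n, M, P = 128, 16, 1.0                      # blocklength, codebook size, power
codebook = rng.normal(0.0, np.sqrt(P), size=(M, n))   # random Gaussian codebook

msg = 3
noise = rng.uniform(-0.3, 0.3, size=n)      # a non-Gaussian (uniform) noise example
y = codebook[msg] + noise

# Nearest-neighbor decoding: choose the codeword closest to y in Euclidean
# distance; this rule is optimal for Gaussian noise and mismatched otherwise,
# which is exactly the regime studied above.
decoded = int(np.argmin(np.linalg.norm(codebook - y, axis=1)))
```

The abstract's point is that the second-order (dispersion) behavior of this scheme depends on the noise only through its second and fourth moments.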

Topics: Information Theory, Computing Research Repository, Mathematics

Source: http://arxiv.org/abs/1512.06618

Sep 23, 2013
by Jonathan Scarlett; Alfonso Martinez; Albert Guillén i Fàbregas

This paper presents an achievable second-order rate region for the discrete memoryless multiple-access channel. The result is obtained using a random-coding ensemble in which each user's codebook contains codewords of a fixed composition. The improvement of the second-order rate region over existing ones is demonstrated both analytically and numerically. Finally, an achievable second-order rate region for the Gaussian multiple-access channel is derived via an increasingly fine quantization of...

Source: http://arxiv.org/abs/1303.6167v3

Jun 29, 2018
by Jonathan Scarlett; Volkan Cevher

In this paper, we study the information-theoretic limits of community detection in the symmetric two-community stochastic block model, with intra-community and inter-community edge probabilities $\frac{a}{n}$ and $\frac{b}{n}$ respectively. We consider the sparse setting, in which $a$ and $b$ do not scale with $n$, and provide upper and lower bounds on the proportion of community labels recovered on average. We provide a numerical example for which the bounds are near-matching for moderate...
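The generative model above is easy to state in code. The following is a minimal sampler for the symmetric two-community stochastic block model with intra- and inter-community edge probabilities $a/n$ and $b/n$; the community assignment convention and all constants are illustrative assumptions.

```python
import random

def sample_sbm(n, a, b, rng):
    # Symmetric two-community SBM: the first half of the nodes form
    # community 0 and the rest community 1; each pair is connected with
    # probability a/n within a community and b/n across communities.
    labels = [0] * (n // 2) + [1] * (n - n // 2)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            p = a / n if labels[i] == labels[j] else b / n
            if rng.random() < p:
                edges.add((i, j))
    return labels, edges
```

In the sparse setting studied above, $a$ and $b$ stay fixed as $n$ grows, so the expected degree is constant and only partial label recovery is possible.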

Topics: Machine Learning, Mathematics, Information Theory, Statistics, Computing Research Repository,...

Source: http://arxiv.org/abs/1602.00877

Sep 23, 2013
by Jonathan Scarlett; Alfonso Martinez; Albert Guillén i Fàbregas

This paper considers the problem of channel coding with a given (possibly suboptimal) decoding rule. Finite-length upper and lower bounds on the random-coding error probability for a general codeword distribution are given. These bounds are applied to three random-coding ensembles: i.i.d., constant-composition, and cost-constrained. Ensemble-tight error exponents are presented for each ensemble, and achievable second-order coding rates are given. Connections are drawn between the ensembles...

Source: http://arxiv.org/abs/1303.6166v1

Jun 29, 2018
by Ilija Bogunovic; Jonathan Scarlett; Andreas Krause; Volkan Cevher

We present a new algorithm, truncated variance reduction (TruVaR), that treats Bayesian optimization (BO) and level-set estimation (LSE) with Gaussian processes in a unified fashion. The algorithm greedily shrinks a sum of truncated variances within a set of potential maximizers (BO) or unclassified points (LSE), which is updated based on confidence bounds. TruVaR is effective in several important settings that are typically non-trivial to incorporate into myopic algorithms, including pointwise...

Topics: Machine Learning, Mathematics, Information Theory, Learning, Statistics, Computing Research...

Source: http://arxiv.org/abs/1610.07379

Jun 30, 2018
by Volkan Cevher; Michael Kapralov; Jonathan Scarlett; Amir Zandieh

The problem of approximately computing the $k$ dominant Fourier coefficients of a vector $X$ quickly, and using few samples in time domain, is known as the Sparse Fourier Transform (sparse FFT) problem. A long line of work on the sparse FFT has resulted in algorithms with $O(k\log n\log (n/k))$ runtime [Hassanieh et al., STOC'12] and $O(k\log n)$ sample complexity [Indyk et al., FOCS'14]. These results are proved using non-adaptive algorithms, and the latter $O(k\log n)$ sample complexity...
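The problem statement above has a simple naive baseline that clarifies what the sparse FFT algorithms improve on: compute the full FFT in $O(n\log n)$ time and keep the $k$ largest-magnitude coefficients. The sketch below (signal and frequencies chosen for illustration) is that reference solution, not any of the cited sublinear algorithms.

```python
import numpy as np

def top_k_fourier(x, k):
    # Naive reference for the sparse FFT problem: full FFT, then the indices
    # of the k largest-magnitude coefficients. The algorithms cited above
    # return the same dominant coefficients with sublinear runtime and
    # sample complexity.
    X = np.fft.fft(x)
    idx = np.argsort(-np.abs(X))[:k]
    return set(idx.tolist())

n = 256
t = np.arange(n)
# A signal whose spectrum is exactly 2-sparse, at frequencies 10 and 40.
x = 3.0 * np.exp(2j * np.pi * 10 * t / n) + np.exp(2j * np.pi * 40 * t / n)
support = top_k_fourier(x, 2)
```

The adaptivity question raised in the abstract is whether algorithms that choose their time-domain samples based on earlier samples can beat the $O(k\log n)$ non-adaptive sample complexity.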

Topics: Data Structures and Algorithms, Computing Research Repository

Source: http://arxiv.org/abs/1702.01286