Jan 6, 2020, by Oliver Johnson (texts)
Other Scenes, an American counterculture newspaper. From the collection of Letterform Archive.
Topic: Counterculture

Sep 22, 2013, by Peter Harremoës; Oliver Johnson; Ioannis Kontoyiannis (texts)
Rényi's "thinning" operation on a discrete random variable is a natural discrete analog of the scaling operation for continuous random variables. The properties of thinning are investigated in an information-theoretic context, especially in connection with information-theoretic inequalities related to Poisson approximation results. The classical Binomial-to-Poisson convergence (sometimes referred to as the "law of small numbers") is seen to be a special case of a thinning...
Source: http://arxiv.org/abs/0906.0690v1
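The thinning operation described in this abstract has a direct simulation: replace X by the number of successes in X independent Bernoulli(a) trials, so that E[T_a X] = a E[X]. A minimal sketch (the function name `thin` and all parameter choices here are illustrative, not taken from the paper):

```python
import random

def thin(x, a, rng):
    """Renyi thinning T_a: keep each of the x 'units' independently w.p. a."""
    return sum(1 for _ in range(x) if rng.random() < a)

rng = random.Random(0)
a = 0.3
# Thinning X ~ Binomial(20, 0.5) gives T_a X ~ Binomial(20, a * 0.5).
samples = [thin(sum(rng.random() < 0.5 for _ in range(20)), a, rng)
           for _ in range(10000)]
mean = sum(samples) / len(samples)
# The sample mean should be close to a * E[X] = 0.3 * 10 = 3.
print(abs(mean - 3.0) < 0.2)
```

Repeated thinning of a sum of many small contributions is what drives the "law of small numbers" convergence to the Poisson distribution.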

Jul 18, 2012, by Schoolcraft, Oliver Johnson (texts)

Jan 23, 2009, by Oliver Johnson (texts)
Book digitized by Google from the library of the University of Michigan and uploaded to the Internet Archive by user tpb.
Topics: abolitionists, thayer, slavery, garrison, garrisonian, republican, emigrant, slave,...
Source: http://books.google.com/books?id=RfJYAAAAMAAJ&oe=UTF-8
32 p. ; 21 cm.
Topics: Anti-slavery movement, Abolition, Black Americans, Emancipation, Law, Justice, Rare books and...
55 pages : 23 cm
Topics: Golf courses Equipment and supplies Catalogs, Turfgrasses Seeds Catalogs, Nursery stock Illinois...

Jun 28, 2018, by Thomas Kealy; Oliver Johnson; Robert Piechocki (texts)
We consider the problem of reconstructing wideband frequency spectra from distributed, compressive measurements. The measurements are made by a network of nodes, each independently mixing the ambient spectra with low frequency, random signals. The reconstruction takes place via local transmissions between nodes, each performing simple statistical operations such as ridge regression and shrinkage.
Topics: Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1506.07436

Jun 29, 2018, by Peter Harremoës; Oliver Johnson; Ioannis Kontoyiannis (texts)
In this paper we establish lower bounds on the information divergence of a distribution on the integers from a Poisson distribution. These lower bounds are tight, and in the cases where a rate of convergence in the Law of Thin Numbers can be computed, the rate is determined by the lower bounds proved in this paper. General techniques for obtaining lower bounds in terms of moments are developed. The results about lower bounds in the Law of Thin Numbers are used to derive similar results for the Central...
Topics: Probability, Mathematics
Source: http://arxiv.org/abs/1601.04255
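The information divergence from a Poisson distribution that this abstract bounds can be computed directly for simple cases. A sketch (the function name `kl_from_poisson` and the choice of a Binomial example are illustrative assumptions, not from the paper) showing that a Binomial with small success probability is already close, in divergence, to the Poisson with matched mean:

```python
from math import exp, log, comb, factorial

def kl_from_poisson(pmf, lam, support):
    """D(P || Po(lam)), summed over a finite support carrying all of P."""
    total = 0.0
    for k in support:
        p = pmf(k)
        if p > 0:
            q = exp(-lam) * lam ** k / factorial(k)   # Poisson(lam) mass at k
            total += p * log(p / q)
    return total

n, p = 50, 0.04                       # Binomial(n, p) with small p
d = kl_from_poisson(lambda k: comb(n, k) * p ** k * (1 - p) ** (n - k),
                    n * p, range(n + 1))
print(0.0 < d < 0.1)  # True: small but strictly positive divergence
```

The divergence is positive because the Binomial is not exactly Poisson, which is the kind of gap that the paper's lower bounds quantify.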

Jun 30, 2018, by Vaia Kalokidou; Oliver Johnson; Robert Piechocki (texts)
Heterogeneous networks have a key role in the design of future mobile communication networks, since the employment of small cells around a macrocell enhances the network's efficiency and decreases complexity and power demand. Moreover, research on Blind Interference Alignment (BIA) has shown that optimal Degrees of Freedom (DoF) can be achieved in certain network architectures, with no requirement of Channel State Information (CSI) at the transmitters. Our contribution is a generalised model of...
Topics: Mathematics, Computing Research Repository, Information Theory
Source: http://arxiv.org/abs/1407.2391
Cover title
Topics: Golf courses Equipment and supplies Catalogs, Turfgrasses Seeds Catalogs, Nursery stock Illinois...

Sep 22, 2013, by Oliver Johnson; Ioannis Kontoyiannis; Mokshay Madiman (texts)
Motivated, in part, by the desire to develop an information-theoretic foundation for compound Poisson approximation limit theorems (analogous to the corresponding developments for the central limit theorem and for simple Poisson approximation), this work examines sufficient conditions under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. We show that the natural analog of the Poisson maximum entropy property...
Source: http://arxiv.org/abs/0805.4112v1

Jun 26, 2018, by Vaia Kalokidou; Oliver Johnson; Robert Piechocki (texts)
Future mobile communication networks will require enhanced network efficiency and reduced system overhead, owing to their user density and the high-data-rate applications running on mobile devices. Research on Blind Interference Alignment (BIA) and Topological Interference Management (TIM) has shown that optimal Degrees of Freedom (DoF) can be achieved, in the absence of Channel State Information (CSI) at the transmitters, reducing the network's overhead. Moreover, the recently emerged...
Topics: Information Theory, Mathematics, Computing Research Repository
Source: http://arxiv.org/abs/1501.07723

Sep 22, 2013, by Oliver Johnson (texts)
We prove that the Poisson distribution maximises entropy in the class of ultra-log-concave distributions, extending a result of Harremoës. The proof uses ideas concerning log-concavity, and a semigroup action involving adding Poisson variables and thinning. We go on to show that the entropy is a concave function along this semigroup.
Source: http://arxiv.org/abs/math/0603647v2

Sep 19, 2013, by Oliver Johnson (texts)
The Poincaré constant R(Y) of a random variable Y relates the L2 norm of a function g and its derivative g'. Since R(Y) - Var(Y) is positive, with equality if and only if Y is normal, it can be seen as a distance from the normal distribution. In this paper we establish a best possible rate of convergence of this distance in the Central Limit Theorem. Furthermore, we show that R(Y) is finite for discrete mixtures of normals, allowing us to add rates to the proof of the Central Limit Theorem in...
Source: http://arxiv.org/abs/math/0206227v1

Sep 19, 2013, by Dino Sejdinovic; Oliver Johnson (texts)
An information theoretic perspective on group testing problems has recently been proposed by Atia and Saligrama, in order to characterise the optimal number of tests. Their results hold in the noiseless case, where only false positives occur, and where only false negatives occur. We extend their results to a model containing both false positives and false negatives, developing simple information theoretic bounds on the number of tests required. Based on these bounds, we obtain an improved order...
Source: http://arxiv.org/abs/1010.2441v1

Nov 30, 2009, by Schoolcraft, Oliver Johnson (texts)
Book digitized by Google from the library of the University of California and uploaded to the Internet Archive by user tpb.
Source: http://books.google.com/books?id=7zZIAAAAIAAJ&oe=UTF-8

Apr 9, 2012, by Oliver Johnson (texts)

Sep 18, 2013, by Oliver Johnson; Christina Goldschmidt (texts)
We extend Hoggar's theorem that the sum of two independent discrete-valued log-concave random variables is itself log-concave. We introduce conditions under which the result still holds for dependent variables. We argue that these conditions are natural by giving some applications. Firstly, we use our main theorem to give simple proofs of the log-concavity of the Stirling numbers of the second kind and of the Eulerian numbers. Secondly, we prove results concerning the log-concavity of the sum...
Source: http://arxiv.org/abs/math/0502548v2
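The log-concavity of the Stirling numbers of the second kind, which this abstract mentions as an application, is easy to check numerically. A sketch (the row-building function is an illustrative implementation, not the paper's method) using the standard recurrence S(n,k) = k S(n-1,k) + S(n-1,k-1):

```python
def stirling2_row(n):
    """Row [S(n,0), ..., S(n,n)] of Stirling numbers of the second kind."""
    row = [1]                                   # S(0,0) = 1
    for _ in range(n):
        new = [0] * (len(row) + 1)
        for k, v in enumerate(row):
            new[k] += k * v                     # k * S(n-1, k)
            new[k + 1] += v                     # S(n-1, k-1)
        row = new
    return row

row = stirling2_row(10)
inner = row[1:-1]                               # positive entries k = 1..n-1
logconcave = all(inner[i] ** 2 >= inner[i - 1] * inner[i + 1]
                 for i in range(1, len(inner) - 1))
print(row[2], logconcave)                       # 511 True
```

Here S(10,2) = 2^9 - 1 = 511, and the squared-middle-term check confirms log-concavity of the row, in line with the theorem.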

Sep 23, 2013, by Oliver Johnson (texts)
We adapt arguments concerning entropy-theoretic convergence from the independent case to the case of FKG random variables. FKG systems are chosen since their dependence structure is controlled through covariance alone, though in the sequel we use many of the same arguments for weakly dependent random variables. As in previous work of Barron and Johnson, we consider random variables perturbed by small normals, since the FKG property gives us control of the resulting densities. We need to impose...
Source: http://arxiv.org/abs/math/0109156v1

Sep 23, 2013, by Yaming Yu; Oliver Johnson (texts)
Building on the recent work of Johnson (2007) and Yu (2008), we prove that entropy is a concave function with respect to the thinning operation T_a. That is, if X and Y are independent random variables on Z_+ with ultra-log-concave probability mass functions, then H(T_a X + T_{1-a} Y) >= a H(X) + (1-a) H(Y), 0 <= a <= 1.
Source: http://arxiv.org/abs/0904.1446v1

Sep 22, 2013, by Oliver Johnson (texts)
Consider a sphere of radius sqrt(n) in n dimensions, and consider X, a random variable uniformly distributed on its surface. Poincaré's Observation states that for large n, the distribution of the first k coordinates of X is close in total variation distance to the standard normal N(0, I_k). In this paper, we consider a larger family of manifolds, and X taking a more general distribution on the surfaces. We establish a bound in the stronger Kullback-Leibler sense of relative entropy, and...
Source: http://arxiv.org/abs/math/0201273v1

Sep 17, 2013, by Oliver Johnson; Ioannis Kontoyiannis; Mokshay Madiman (texts)
Sufficient conditions are developed, under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. Recently, one of the authors [O. Johnson, Stoch. Proc. Appl., 2007] used a semigroup approach to show that the Poisson has maximal entropy among all ultra-log-concave distributions with fixed mean. We show via a non-trivial extension of this semigroup approach that the natural analog of the Poisson maximum...
Source: http://arxiv.org/abs/0912.0581v2

Jun 28, 2018, by Oliver Johnson (texts)
We describe five types of results concerning information and concentration of discrete random variables, and relationships between them, motivated by their counterparts in the continuous case. The results we consider are information theoretic approaches to Poisson approximation, the maximum entropy property of the Poisson distribution, discrete concentration (Poincaré and logarithmic Sobolev) inequalities, monotonicity of entropy and concavity of entropy in the Shepp-Olkin regime.
Topics: Information Theory, Probability, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1510.05390

Sep 18, 2013, by Oliver Johnson (texts)
We analyse the properties of the Principal Fitted Components (PFC) algorithm proposed by Cook. We derive theoretical properties of the resulting estimators, including sufficient conditions under which they are $\sqrt{n}$-consistent, and explain some of the simulation results given in Cook's paper. We use techniques from random matrix theory and perturbation theory. We argue that, under Cook's model at least, the PFC algorithm should outperform the Principal Components algorithm.
Source: http://arxiv.org/abs/0806.4120v2

Jun 28, 2018, by Oliver Johnson (texts)
We consider probability mass functions $V$ supported on the positive integers using arguments introduced by Caputo, Dai Pra and Posta, based on a Bakry-Émery condition for a Markov birth and death operator with invariant measure $V$. Under this condition, we prove a modified logarithmic Sobolev inequality, generalizing and strengthening results of Wu, Bobkov and Ledoux, and Caputo, Dai Pra and Posta. We show how this inequality implies results including concentration of measure and...
Topics: Information Theory, Computing Research Repository, Mathematics, Probability
Source: http://arxiv.org/abs/1507.06268

Sep 19, 2013, by Oliver Johnson; Christophe Vignat (texts)
We consider the Student-t and Student-r distributions, which maximise Rényi entropy under a covariance condition. We show that they have information-theoretic properties which mirror those of the Gaussian distributions, which maximise Shannon entropy under the same condition. We introduce a convolution which preserves the Rényi maximising family, and show that the Rényi maximisers are the case of equality in a version of the Entropy Power Inequality. Further, we show that the Rényi maximisers...
Source: http://arxiv.org/abs/math/0507400v2

Aug 26, 2008, by Oliver Johnson, Vermont Anti-Slavery Society (texts)
Book digitized by Google from the library of Harvard University and uploaded to the Internet Archive by user tpb.
Topics: slaves, slavery, slave, emancipation, principles, slaveholders, masters, colored, labor, liberty,...
Source: http://books.google.com/books?id=2Q7Snwr2MvYC&oe=UTF-8

Jul 20, 2013, by Oliver Johnson; Richard Samworth (texts)
We give a new proof of the classical Central Limit Theorem, in the Mallows ($L^r$-Wasserstein) distance. Our proof is elementary in the sense that it does not require complex analysis, but rather makes use of a simple subadditive inequality related to this metric. The key is to analyse the case where equality holds. We provide some results concerning rates of convergence. We also consider convergence to stable distributions, and obtain a bound on the rate of such convergence.
Source: http://arxiv.org/abs/math/0406218v2
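In one dimension, the Mallows (L^2-Wasserstein) distance between two equal-size samples is computed by sorting both samples and coupling them in order. A sketch (the function name `mallows2` and the uniform-sum example are illustrative assumptions, not the paper's construction) that checks a translation property and illustrates the CLT convergence the abstract describes:

```python
import random
from math import sqrt

def mallows2(xs, ys):
    """Empirical Mallows (L2-Wasserstein) distance between equal-size samples:
    in one dimension the optimal coupling simply sorts both samples."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(sorted(xs), sorted(ys)))
                / len(xs))

# Sanity check: shifting a sample by 1 moves it exactly distance 1.
base = [0.5, 2.0, -1.3, 0.1]
print(round(mallows2(base, [x + 1 for x in base]), 9))  # 1.0

# CLT illustration: standardized sums of uniforms against a Gaussian sample.
rng = random.Random(1)
def std_sum(n):                     # Uniform(0,1) sum: mean n/2, variance n/12
    return (sum(rng.random() for _ in range(n)) - n / 2) / sqrt(n / 12)

d = mallows2([std_sum(30) for _ in range(5000)],
             [rng.gauss(0, 1) for _ in range(5000)])
print(d < 0.2)  # small: the standardized sums are already close to normal
```

The residual distance `d` mixes the true distributional gap with sampling noise; the paper's subadditive inequality controls how the former shrinks as the number of summands grows.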
Topics: Middlebury College, Letters, Correspondence, Abolitionism, Anti-slavery, Johnson, Oliver, Robinson,...

Jun 29, 2018, by Oliver Johnson; Matthew Aldridge; Jonathan Scarlett (texts)
We consider the nonadaptive group testing problem in the case that each item appears in a constant number of tests, chosen uniformly at random with replacement, so that the testing matrix has (almost) constant column weights. We analyse the performance of simple and practical algorithms in a range of sparsity regimes, showing that the performance is consistently improved in comparison with more standard Bernoulli designs. In particular, using a constant-column weight design, the DD algorithm is...
Topics: Probability, Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1612.07122

Sep 21, 2013, by Oliver Johnson; Dino Sejdinovic; James Cruise; Ayalvadi Ganesh; Robert Piechocki (texts)
Given the output of a data source taking values in a finite alphabet, we wish to detect change-points, that is times when the statistical properties of the source change. Motivated by ideas of match lengths in information theory, we introduce a novel non-parametric estimator which we call CRECHE (CRossings Enumeration CHange Estimator). We present simulation evidence that this estimator performs well, both for simulated sources and for real data formed by concatenating text sources. For...
Source: http://arxiv.org/abs/1106.5714v2

Sep 23, 2013, by Oliver Johnson (texts)
We adapt arguments concerning information-theoretic convergence in the Central Limit Theorem to the case of dependent random variables under Rosenblatt mixing conditions. The key is to work with random variables perturbed by the addition of a normal random variable, giving us good control of the joint density and the mixing coefficient. We strengthen results of Takano and of Carlen and Soffer to provide entropy-theoretic, not merely weak, convergence.
Source: http://arxiv.org/abs/0810.0593v1

Aug 18, 2021, by Oliver Johnson (data)
Source: https://osf.io/xy5kh/

Jun 30, 2018, by Tom Kealy; Oliver Johnson; Robert Piechocki (texts)
We consider the group testing problem, in the case where the items are defective independently but with non-constant probability. We introduce and analyse an algorithm to solve this problem by grouping items together appropriately. We give conditions under which the algorithm performs essentially optimally in the sense of information-theoretic capacity. We use concentration of measure results to bound the probability that this algorithm requires many more tests than the expected number. This...
Topics: Mathematics, Probability, Computing Research Repository, Information Theory
Source: http://arxiv.org/abs/1409.8653

Sep 22, 2013, by Oliver Johnson (texts)
We provide a condition under which a version of Shannon's Entropy Power Inequality will hold for dependent variables. We provide information inequalities extending those found in the independent case.
Source: http://arxiv.org/abs/math/0111021v1

Sep 22, 2013, by Oliver Johnson; Andrew Barron (texts)
We give conditions for an O(1/n) rate of convergence of Fisher information and relative entropy in the Central Limit Theorem. We use the theory of projections in L2 spaces and Poincaré inequalities to provide a better understanding of the decrease in Fisher information implied by results of Barron and Brown. We show that if the standardized Fisher information ever becomes finite, then it converges to zero.
Source: http://arxiv.org/abs/math/0111020v2

Jul 20, 2013, by Richard Samworth; Oliver Johnson (texts)
We study the rate of convergence of the Mallows distance between the empirical distribution of a sample and the underlying population. The surprising feature of our results is that the convergence rate is slower in the discrete case than in the absolutely continuous setting. We show how the hazard function plays a significant role in these calculations. As an application, we recall that the quantity studied provides an upper bound on the distance between the bootstrap distribution of a sample...
Source: http://arxiv.org/abs/math/0406603v1

Sep 19, 2013, by Oliver Johnson; Yaming Yu (texts)
We consider the entropy of sums of independent discrete random variables, in analogy with Shannon's Entropy Power Inequality, where equality holds for normals. In our case, infinite divisibility suggests that equality should hold for Poisson variables. We show that some natural analogues of the Entropy Power Inequality do not in fact hold, but propose an alternative formulation which does always hold. The key to many proofs of Shannon's Entropy Power Inequality is the behaviour of entropy on...
Source: http://arxiv.org/abs/0909.0641v2

Sep 22, 2013, by A. D. Barbour; Oliver Johnson; Ioannis Kontoyiannis; Mokshay Madiman (texts)
An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Let $P_{S_n}$ be the distribution of a sum $S_n = \sum_{i=1}^n Y_i$ of independent integer-valued random variables $Y_i$. Nonasymptotic bounds are derived for the distance between $P_{S_n}$ and an appropriately chosen compound Poisson law. In the case where all $Y_i$ have the same conditional distribution given $\{Y_i \neq 0\}$, a...
Source: http://arxiv.org/abs/1004.3692v1

Jun 29, 2018, by Vaia Kalokidou; Oliver Johnson; Robert Piechocki (texts)
Future multi-tier communication networks will require enhanced network capacity and reduced overhead. In the absence of Channel State Information (CSI) at the transmitters, Blind Interference Alignment (BIA) and Topological Interference Management (TIM) can achieve optimal Degrees of Freedom (DoF), minimising the network's overhead. In addition, Non-Orthogonal Multiple Access (NOMA) can increase the sum rate of the network, compared to the orthogonal radio access techniques currently adopted by 4G...
Topics: Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1601.08132

Sep 22, 2013, by Oliver Johnson; Matthew Aldridge; Robert Piechocki (texts)
Ergodic interference alignment, as introduced by Nazer et al. (NGJV), is a technique that allows high-rate communication in n-user interference networks with fast fading. It works by splitting communication across a pair of fading matrices. However, it comes with the overhead of a long time delay until matchable matrices occur: the delay is q^(n^2) for field size q. In this paper, we outline two new families of schemes, called JAP and JAP-B, that reduce the expected delay, sometimes at the cost of...
Source: http://arxiv.org/abs/1004.0208v3

Jun 29, 2018, by Matthew Aldridge; Oliver Johnson; Jonathan Scarlett (texts)
We consider nonadaptive group testing where each item is placed in a constant number of tests. The tests are chosen uniformly at random with replacement, so the testing matrix has (almost) constant column weights. We show that performance is improved compared to Bernoulli designs, where each item is placed in each test independently with a fixed probability. In particular, we show that the rate of the practical COMP detection algorithm is increased by 31% in all sparsity regimes. In dense...
Topics: Statistics Theory, Statistics, Information Theory, Computing Research Repository, Mathematics
Source: http://arxiv.org/abs/1602.03471
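The COMP detection algorithm named in this abstract has a very short description: any item that appears in a negative test is declared non-defective, and everything else is declared defective. A sketch of COMP with the constant-column-weight design the abstract describes (all names and parameter values here are illustrative assumptions, not the paper's):

```python
import random

def comp_decode(tests_of_item, outcomes):
    """COMP: an item appearing in any negative test is non-defective;
    declare all remaining items defective (no false negatives, noiseless case)."""
    return {i for i, tests in enumerate(tests_of_item)
            if all(outcomes[t] for t in tests)}

rng = random.Random(42)
n, T, L = 500, 120, 8          # items, tests, tests per item (column weight)
defective = set(rng.sample(range(n), 10))

# Each item joins L tests chosen uniformly at random with replacement.
tests_of_item = [[rng.randrange(T) for _ in range(L)] for _ in range(n)]
outcomes = [False] * T          # a test is positive iff it holds a defective
for i in defective:
    for t in tests_of_item[i]:
        outcomes[t] = True

estimate = comp_decode(tests_of_item, outcomes)
print(defective <= estimate)   # True: COMP never misses a defective item
```

COMP can over-report (non-defective items whose tests all happen to be positive), which is why the rate comparisons in the abstract concern how fast such false positives vanish under different designs.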
Topics: Middlebury College, Letters, Correspondence, Abolitionism, Anti-slavery, Johnson, Oliver, Robinson,...

Jul 20, 2013, by Erwan Hillion; Oliver Johnson; Yaming Yu (texts)
We consider probability measures supported on a finite discrete interval $[0,n]$. We introduce a new finite-difference operator $\nabla_n$, defined as a linear combination of left and right finite differences. We show that this operator $\nabla_n$ plays a key role in a new Poincaré (spectral gap) inequality with respect to binomial weights, with the orthogonal Krawtchouk polynomials acting as eigenfunctions of the relevant operator. We briefly discuss the relationship of this operator to the...
Source: http://arxiv.org/abs/1107.0127v1

Sep 20, 2013, by Oliver Johnson (texts)
Define the non-overlapping return time of a random process to be the number of blocks that we wait before a particular block reappears. We prove a Central Limit Theorem based on these return times. This result has applications to entropy estimation, and to the problem of determining whether digits have come from an independent equidistributed sequence. In the case of an equidistributed sequence, we use an argument based on negative association to prove convergence under weaker conditions.
Source: http://arxiv.org/abs/math/0506165v1
Caption title
Topics: Nursery stock Prices Illinois Chicago, Field crops Seeds Catalogs, Grain Seeds Catalogs, Grasses...

Sep 21, 2013, by Leonardo Baldassini; Oliver Johnson; Matthew Aldridge (texts)
We define capacity for group testing problems and deduce bounds for the capacity of a variety of noisy models, based on the capacity of equivalent noisy communication channels. For noiseless adaptive group testing we prove an information-theoretic lower bound which tightens a bound of Chan et al. This can be combined with a performance analysis of a version of Hwang's adaptive group testing algorithm, in order to deduce the capacity of noiseless and erasure group testing models.
Source: http://arxiv.org/abs/1301.7023v2

Sep 23, 2013, by Fraser Daly; Oliver Johnson (texts)
We give bounds on the Poincaré (inverse spectral gap) constant of a non-negative, integer-valued random variable W, under negative dependence assumptions such as ultra-log-concavity and total negative dependence. We show that the bounds obtained compare well to others in the literature. Examples treated include some occupancy and urn models, a random graph model and small spacings on the circumference of a circle. Applications to Poisson convergence theorems are considered.
Source: http://arxiv.org/abs/0801.2112v2
Topics: Middlebury College, Letters, Correspondence, Abolitionism, Anti-slavery, Johnson, Oliver, Robinson,...

Sep 23, 2013, by Oliver Johnson; Yuri Suhov (texts)
A model of a quantum information source is proposed, based on the Gibbs ensemble of ideal (free) particles (bosons or fermions). We identify the (thermodynamic) von Neumann entropy as the information rate and establish the classical Lempel--Ziv universal coding algorithm in Grassberger's form for such a source. This generalises the Schumacher theorem to the case of non-IID qubits.
Source: http://arxiv.org/abs/math-ph/0109023v2