Published: 13 May 2018

Ahlswede / Althöfer / Deppe / Tamm

Combinatorial Methods and Models

Rudolf Ahlswede’s Lectures on Information Theory 4

Deliverable in approx. 10 days as a special reprint, non-returnable

117.69 €

incl. VAT

Softcover reprint of the original 1st ed. 2018. Book. xviii, 385 pp. 11 b/w illustrations, bibliographies. Softcover

Springer. ISBN 978-3-319-85073-3

Format (W x L): 15.5 x 23.5 cm

Weight: 6823 g

In English


The fourth volume of Rudolf Ahlswede's lectures on Information Theory is focused on Combinatorics. Ahlswede was originally motivated to study combinatorial aspects of Information Theory via zero-error codes: in this case the structure of the coding problems usually changes drastically from probabilistic to combinatorial. The best example is Shannon's zero-error capacity, where independent sets in graphs have to be examined. The extension to multiple access channels leads to the Zarankiewicz problem.

A code can be regarded combinatorially as a hypergraph, and many coding theorems can be obtained by appropriate colourings or coverings of the underlying hypergraphs. Several such colouring and covering techniques and their applications are introduced in this book. Furthermore, codes produced by permutations and one of Ahlswede's favourite research fields -- extremal problems in Combinatorics -- are presented.

Whereas the first part of the book concentrates on combinatorial methods in order to analyse classical codes, such as prefix codes or codes in the Hamming metric, the second is devoted to combinatorial models in Information Theory. Here the code concept itself already relies on a rather combinatorial structure, as in several concrete models of multiple access channels or more refined distortions. An analytical tool that comes into play, especially in the analysis of perfect codes, is the use of orthogonal polynomials.

Classical information processing concerns the main tasks of gaining knowledge and the storage, transmission and hiding of data. The first task is the prime goal of Statistics. For the transmission and hiding of data, Shannon developed an impressive mathematical theory called Information Theory, which he based on probabilistic models. The theory largely involves the concept of codes with small error probabilities in spite of noise in the transmission, which is modeled by channels.
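To illustrate the combinatorial flavour mentioned above (this sketch is not taken from the book), consider Shannon's classic zero-error example: for a channel whose confusability graph is the pentagon C5, a zero-error code is exactly an independent set of vertices, and coding over two channel uses corresponds to an independent set in the strong product C5 ⊠ C5. The function and variable names below are illustrative, not the book's notation.

```python
from itertools import combinations

def c5_adjacent(u, v):
    """Vertices of the 5-cycle C5 are 0..4; u and v are confusable iff adjacent."""
    return (u - v) % 5 in (1, 4)

def alpha_c5():
    """Independence number of C5 by brute force over all vertex subsets."""
    best = 0
    for k in range(1, 6):
        for s in combinations(range(5), k):
            if all(not c5_adjacent(u, v) for u, v in combinations(s, 2)):
                best = max(best, k)
    return best

def strong_adjacent(p, q):
    """Adjacency in the strong product C5 ⊠ C5 (two channel uses):
    distinct pairs are confusable iff each coordinate is equal or adjacent."""
    (a, b), (c, d) = p, q
    if p == q:
        return False
    return (a == c or c5_adjacent(a, c)) and (b == d or c5_adjacent(b, d))

# One channel use: at most 2 messages can be sent with zero error.
print(alpha_c5())  # 2

# Two channel uses: Shannon's construction {(i, 2i mod 5)} gives 5 pairwise
# non-confusable codewords, i.e. an independent set of size 5 in C5 ⊠ C5.
codewords = [(i, (2 * i) % 5) for i in range(5)]
print(all(not strong_adjacent(p, q) for p, q in combinations(codewords, 2)))  # True
```

Since two channel uses carry 5 messages while one carries only 2, the zero-error capacity of C5 is at least (1/2) log2 5; Lovász later proved this bound is tight. This is the purely combinatorial structure -- independent sets rather than error probabilities -- that the first lectures exploit.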
The lectures presented in this work are suitable for graduate students in Mathematics, and also for those working in Theoretical Computer Science, Physics, and Electrical Engineering with a background in basic Mathematics. The lectures can be used as the basis for courses or to supplement courses in many ways. Ph.D. students will also find research problems, often with conjectures, that offer potential subjects for a thesis. More advanced researchers may find questions which form the basis of entire research programs.

