Du D.-Z., Ko K.-I. Theory of Computational Complexity

  • File format: djvu
  • Size: 5.67 MB
Publisher: John Wiley, 2000, 506 pp.
Computational complexity theory has been a central area of theoretical computer science since its early development in the mid-1960s. Its rapid development over the following three decades has not only established it as a rich, exciting theory but has also shown its strong influence on many other related areas in computer science, mathematics, and operations research. We may roughly divide its development into three periods. The works from the mid-1960s to the early 1970s laid a solid foundation for the theory of feasible computation. The Turing machine-based complexity theory and the axiomatic complexity theory established computational complexity as an independent mathematical discipline. The identification of polynomial-time computability with feasible computability and the discovery of the NP-complete problems consolidated the P versus NP question as the central issue of complexity theory.
From the early 1970s to the mid-1980s, research in computational complexity expanded at an exponential rate, both in depth and in breadth. Various new computational models were proposed as alternatives to the traditional deterministic models. Finer complexity hierarchies and complexity classes were identified from these new models, and more accurate classifications were obtained for the complexity of practical algorithmic problems. Parallel computational models, such as alternating Turing machines and parallel random access machines, together with the NC hierarchy, provide a tool for classifying the complexity of feasible problems. Probabilistic Turing machines are a model for the complexity theory of distribution-independent randomized algorithms. Interactive proof systems, an extension of probabilistic Turing machines, and communication complexity study the complexity aspects of distributed or interactive computing. The study of one-way functions led to a breakthrough in cryptography. A theory of average-case completeness, based on the notion of distribution-faithful reductions, aims at establishing the foundation for distribution-dependent average-case complexity. Boolean and threshold circuits are models for nonuniform complexity in which algebraic and topological methods have found interesting applications. Oracle Turing machines are the model for the theory of relativization, in which combinatorial techniques meet recursion-theoretic techniques. Program-size complexity (or Kolmogorov complexity) formalizes the notion of descriptive complexity and has strong connections with computational complexity. Although the central questions remain open, these developments demonstrate that computational complexity is a rich discipline with both a deep mathematical theory and a diverse range of applications.
Beginning in the mid-1980s, we have seen a number of deep, surprising results obtained with diverse, sophisticated proof techniques. In addition, seemingly independent subareas have found interesting connections. Exponential lower bounds for monotone circuits and constant-depth circuits were found using probabilistic and algebraic methods. The connection between constant-depth circuits and relativization led to the relativized separation of the polynomial-time hierarchy. The technique of nondeterministic iterative counting was used to collapse the nondeterministic space hierarchy. The study of probabilistic reductions gave the surprising result on the power of the counting class #P relative to the polynomial-time hierarchy. The arithmetization of Boolean computation in interactive proof systems collapses the class of polynomial space to the class of sets with interactive proof systems. Further development of this research, together with techniques from coding theory, has led to strong negative results on combinatorial approximation.
As outlined above, complexity theory has grown fast both in breadth and in depth. With so many new computational models, new proof techniques, and applications in different areas, it is simply not possible to cover all important topics of the theory in a single book. The goal of this book is therefore not to provide a comprehensive treatment of complexity theory. Instead, we select only some of the fundamental areas that we believe represent the most important recent advances in complexity theory, in particular on the P versus NP problem, and present a complete treatment of these subjects. The presentation follows the approach of traditional mathematics textbooks. With a small number of exceptions, all theorems are presented with rigorous mathematical proofs.
Part I Uniform Complexity
Models of Computation and Complexity Classes
NP-Completeness
The Polynomial-Time Hierarchy and Polynomial Space
Structure of NP
Part II Nonuniform Complexity
Decision Trees
Circuit Complexity
Polynomial-Time Isomorphism
Part III Probabilistic Complexity
Probabilistic Machines and Complexity Classes
Complexity of Counting
Interactive Proof Systems
Probabilistically Checkable Proofs and NP-Hard Optimization Problems