Parallel Computation Models and Systems, Dataflow, Codelets, and Beyond
Friday 9/30, 8:30am to 9:30am
LCPC/CnC Workshops @ Hilton Garden Inn Rochester/University
Wednesday 9/28, 1pm to 2pm
LCPC/CnC Workshops @ Hilton Garden Inn Rochester/University
This keynote is free and open to the public but requires registration
The writing of efficient parallel programs has always been difficult, and the task is now compounded by the increasing complexity of architectures. We suggest that these difficulties need to be addressed as early as the algorithm design level. Resources such as books of relevant efficient algorithms are currently lacking.
In sequential computing programmers have enjoyed efficiently universal algorithms, which have permitted them to write programs independent of machines. We suggest that for parallel or multi-core computers this will no longer be generally possible. Algorithms that run efficiently will need to be aware of the resource parameters of the machines on which they run. The main promise is that of portable algorithms, those that contain efficient designs for all reasonable ranges of the basic resource parameters and input sizes.
Such portable algorithms need to be designed just once, but, once designed, they can be compiled to run efficiently on any machine. In this way the intellectual effort that goes into parallel algorithms design becomes reusable. To permit such portable algorithms some standard bridging model is needed – a common understanding between hardware and algorithm designers of what the costs of a computation are. We shall describe the Multi-BSP model as a candidate for this role. We show that for several basic problems, namely matrix multiplication, fast Fourier transform, and sorting, portable algorithms do exist that are optimal in a defined sense, for all combinations of input size and parameter values.
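As a small illustration of what resource-parameter-aware design means (a sketch of the general idea only, not the Multi-BSP algorithms of the talk; the function name and parameters are my own), consider a matrix multiplication written once, with a single tile-size parameter that would be derived from each machine's memory-hierarchy sizes:

```ruby
# Sketch of parameter-aware design: the algorithm is written once;
# only `tile`, a machine-dependent resource parameter, changes
# from machine to machine.
def blocked_matmul(a, b, n, tile)
  c = Array.new(n) { Array.new(n, 0) }
  (0...n).step(tile) do |ii|
    (0...n).step(tile) do |jj|
      (0...n).step(tile) do |kk|
        # Multiply one tile-sized block, accumulating into c.
        (ii...[ii + tile, n].min).each do |i|
          (jj...[jj + tile, n].min).each do |j|
            sum = c[i][j]
            (kk...[kk + tile, n].min).each { |k| sum += a[i][k] * b[k][j] }
            c[i][j] = sum
          end
        end
      end
    end
  end
  c
end

# The same call is tuned per machine only through the tile parameter.
identity = [[1, 0], [0, 1]]
p blocked_matmul(identity, [[2, 3], [4, 5]], 2, 1)  # => [[2, 3], [4, 5]]
```

The point is that the intellectual design effort lives in the tiling structure, which is reused everywhere; a compiler or autotuner would substitute the right parameter value for each target machine.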
Leslie Valiant was educated at King’s College, Cambridge; Imperial College, London; and at Warwick University where he received his Ph.D. in computer science in 1974. He is currently T. Jefferson Coolidge Professor of Computer Science and Applied Mathematics in the School of Engineering and Applied Sciences at Harvard University, where he has taught since 1982. Before coming to Harvard he had taught at Carnegie Mellon University, Leeds University, and the University of Edinburgh.
His work has ranged over several areas of theoretical computer science, particularly complexity theory, computational learning, and parallel computation. He also has interests in computational neuroscience, evolution and artificial intelligence.
He received the Nevanlinna Prize at the International Congress of Mathematicians in 1986, the Knuth Award in 1997, the European Association for Theoretical Computer Science EATCS Award in 2008, and the 2010 A. M. Turing Award. He is a Fellow of the Royal Society (London) and a member of the National Academy of Sciences (USA).
Colored plots and figures may be printed in grayscale, where the colors can be difficult to tell apart.
We can print a color table in grayscale and choose the most distinguishable colors from the printed table. Here is an example:
for three colors, we can use FF6600, 66CC66 and 0033CC.
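To see why these three values work, one can estimate each color's printed gray level with the standard luma formula (a rough sketch; an actual printer's conversion may differ):

```ruby
# Estimate how a hex color renders in grayscale using the ITU-R BT.601
# luma formula: Y = 0.299*R + 0.587*G + 0.114*B (0 = black, 255 = white).
def grayscale_level(hex)
  r, g, b = hex.scan(/../).map { |c| c.to_i(16) }
  (0.299 * r + 0.587 * g + 0.114 * b).round
end

%w[FF6600 66CC66 0033CC].each do |color|
  puts "##{color} -> gray level #{grayscale_level(color)}"
end
```

Under this formula the three suggested colors map to gray levels of roughly 136, 162, and 53, which is why they stay reasonably distinguishable on a grayscale printout.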
Prerequisites: CSC 252 and CSC 254 are required for CSC 453 and recommended for others. Familiarity with a dynamically-typed programming language such as Python.
Crosslisted: TCS 453
This course studies dynamically-typed programming languages and modular software development. Topics include principles and practice of modular design, functional and object-oriented programming techniques, software engineering concepts, software correctness and reliability, programming tools, and design examples. Ruby is used as the main instruction language. The lessons complement those in traditional compilers and programming languages courses, which focus mainly on statically-typed languages and individual algorithms rather than system design. A significant portion of the coursework is a group project.
Teaching Staff: Prof. Chen Ding, firstname.lastname@example.org; John Jacobs, email@example.com
Preparation (before first class):
“No Silver Bullet — Essence and Accidents of Software Engineering” is a classic paper on software engineering written by Turing Award winner Fred Brooks in 1986. Read the paper (available here if accessed inside the UR network) especially pages 3 to 5 on the “essential difficulties” of software development.
“A former member of the SD10 Panel on Computing in Support of Battle Management explains why he believes the ‘star wars’ effort will not achieve its stated goals.” Read the paper (available here if accessed inside the UR network), pages 2 to 4, the section titled “Why software is unreliable.” Which of the “essential difficulties” was Parnas discussing?
More background on this debate, detailed rationales, and an illuminating discussion of the ethical issues can be found in another article by Parnas: “SDI: A Violation of Professional Responsibility”. The article does not seem to have a free version online, but you can read it by borrowing the book “Software Fundamentals” (it is included as Chapter 27) from the textbook reserve for CSC 253/453 at the Carlson Library. The loan period is two hours.
Further material will be distributed through the Blackboard web site for students who have registered. Contact the instructor if you have problems accessing the site.
After the post on computational locality, it is interesting to examine our understanding of physical locality. There is a fascinating storyline in the history of physics.
Einstein is one of the greatest physicists. However, he was always troubled by quantum mechanics, whose results, unlike those of Newtonian mechanics, are inherently probabilistic rather than deterministic. His famous quote “God does not play dice with the universe” was a direct criticism.
The discovery of two-particle quantum entanglement gave him a chance to formulate his objection precisely. In 1935, Einstein, Boris Podolsky, and Nathan Rosen published the so-called EPR paradox. They laid out four premises of a physics theory, which [Greenberger et al. 1990] restate in EPR’s words.
For a two-particle entanglement, the measurement of particle 1 can cause no real change in particle 2. Yet until the measurement, the states of particles 1 and 2 have no definite values according to quantum mechanics. Therefore, as a theory, quantum mechanics is incomplete.
These objections went unresolved (and largely ignored) by physicists for the next fifty years. A series of studies on the Bell inequality were carried out, but the experimental setups were open to criticism.
In the late 1980s, researchers at the University of Rochester, especially Leonard Mandel, Professor of Physics and Optics, and his students, built on their studies of the quantum properties of light and invented a technique called parametric down-conversion to produce entangled photons.
Daniel Greenberger, while visiting Anton Zeilinger in Vienna, found that a perfect correlation among three particles was enough to disprove EPR. Michael Horne, back in Boston, discovered Mandel’s two-particle interference design based on a “folded diamond.”
The paper by Greenberger et al. was published in 1990. It shows that for entanglement of three or more particles, the four EPR premises derive contradictory results: no theory that meets these four premises can explain the perfect correlation in such entanglements. In the proof by Greenberger et al. (Section III, slightly more than one page), “the manipulation of an additional degree of freedom is essential to the exhibition of a contradiction.”
The contradiction shows that the locality and reality assumptions cannot both hold. The combination of the two is now called “local realism.”
Scully et al. wrote that “In an ingenious experiment with [Zhe-Yu] Ou, Leonard utilized polarization entanglement for photon pairs produced in down conversion to achieve a violation of Bell’s inequality (i.e., local realism).”
I first heard about the story from the book by Aczel and taught the 1990 paper in the opening lecture of a graduate seminar in Spring 2005 on Modeling and Improving Large-Scale Program Behavior.
Greenberger, Daniel M., et al. “Bell’s theorem without inequalities.” Am. J. Phys. 58.12 (1990): 1131–1143.
Robert J. Scully, Marlan O. Scully, and H. Jeff Kimble. “Leonard Mandel.” Biographical Memoirs: Volume 87, The National Academies Press (2005).
Zhe-Yu Ou and Leonard Mandel. “Violation of Bell’s inequality and classical probability in a two-photon correlation experiment.” Phys. Rev. Lett. 61 (1988): 50.
Amir D. Aczel. Entanglement: The Greatest Mystery in Physics. Random House, Inc. (2002).
Here are three sources people can cite for the definition of locality by Peter Denning:
“The tendency for programs to cluster references to subsets of address space for extended periods is called the principle of locality” on page 143 in Peter J. Denning and Craig H. Martell. 2015. Great Principles of Computing. MIT Press
Locality is “the observed tendency of programs to cluster references to small subsets of their pages for extended intervals.” on page 23 in Peter J. Denning. 2005. The locality principle. Communications of the ACM 48, 7 (2005), 19–24
Locality is “a pattern of computational behavior in which memory accesses are confined to locality sets for lengthy periods of time.” in an email from Denning, June 26, 2016.
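The “clustering for extended periods” in these definitions can be made concrete with a tiny working-set measurement (a toy sketch in the spirit of Denning’s working-set idea, not his formal model; the trace and function name are invented for illustration):

```ruby
# Count the distinct addresses referenced in each fixed-length window
# of a reference trace. Small counts over long windows are exactly
# the clustering that the definitions above describe.
def working_set_sizes(trace, window)
  trace.each_slice(window).map { |w| w.uniq.size }
end

# A toy trace with strong locality: during each interval it touches
# only a small subset (two) of its four addresses.
trace = [1, 2, 1, 2, 1, 2, 7, 8, 7, 8, 7, 8]
p working_set_sizes(trace, 6)  # => [2, 2]
```

A trace that scattered its references uniformly over the whole address space would instead show window counts close to the window length, i.e., no locality.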
Given the domain name of this site, it is fitting to have the precise definition of the term here.
Locality has been discussed by others.
In their influential paper titled Dynamic Storage Allocation: A Survey and Critical Review [IWMM 1995], Wilson and others wrote “Locality is very poorly understood, however; aside from making a few important general comments, we leave most issues of locality for future research.”