The Science of Information: From Language to Black Holes
Episodes
- S1 E1 - The Transformability of Information (December 10, 2015, 33 min). What is information? Explore the surprising answer of American mathematician Claude Shannon, who concluded that information is the ability to distinguish reliably among possible alternatives. Consider why this idea was so revolutionary, and see how it led to the concept of the bit - the basic unit of information.
- S1 E2 - Computation and Logic Gates (December 10, 2015, 31 min). Accompany the young Claude Shannon to the Massachusetts Institute of Technology, where in 1937 he submitted a master's thesis proving that Boolean algebra could be used to simplify the unwieldy analog computing devices of the day. Drawing on Shannon's ideas, learn how to design a simple electronic circuit that performs basic mathematical calculations (see the logic-gate sketch after this list).
- S1 E3 - Measuring Information (December 10, 2015, 31 min). How is information measured and how is it encoded most efficiently? Get acquainted with a subtle but powerful quantity that is vital to the science of information: entropy. Measuring information in terms of entropy sheds light on everything from password security to efficient binary codes to how to design a good guessing game.
- S1 E4 - Entropy and the Average Surprise (December 10, 2015, 31 min). Intuition says we measure information by looking at the length of a message. But Shannon's information theory starts with something more fundamental: how surprising is the message? Through illuminating examples, discover that entropy provides a measure of the average surprise (see the entropy sketch after this list).
- S1 E5 - Data Compression and Prefix-Free Codes (December 10, 2015, 31 min). Probe the link between entropy and coding. In the process, encounter Shannon's first fundamental theorem, which specifies how far information can be squeezed in a binary code, serving as the basis for data compression. See how this works with a text such as Conan Doyle's The Return of Sherlock Holmes (a prefix-free coding sketch follows this list).
- S1 E6 - Encoding Images and Sounds (December 10, 2015, 30 min). Learn how some data can be compressed beyond the minimum amount of information required by the entropy of the source. Typically used for images, music, and video, these techniques drastically reduce the size of a file without significant loss of quality. See how this works in the MP3, JPEG, and MPEG formats.
- S1 E7 - Noise and Channel Capacity (December 10, 2015, 31 min). One of the key issues in information theory is noise: the message received may not convey everything about the message sent. Discover Shannon's second fundamental theorem, which proves that error correction is possible and can be built into a message with only a modest slowdown in transmission rate (see the channel-capacity sketch after this list).
- S1 E8 - Error-Correcting Codes (December 10, 2015, 32 min). Dig into different techniques for error correction. Start with a game called word golf, which demonstrates the perils of mistaking one letter for another and how to guard against it. Then graduate to approaches used for correcting errors in computer operating systems, CDs, and data transmissions from the Voyager spacecraft (a Hamming-code sketch follows this list).
- S1 E9 - Signals and Bandwidth (December 10, 2015, 31 min). Twelve billion miles from Earth, the Voyager spacecraft is sending back data with just a 20-watt transmitter. Make sense of this amazing feat by delving into the details of the Nyquist-Shannon sampling theorem, signal-to-noise ratio, and bandwidth - concepts that apply to many types of communication (see the Shannon-Hartley sketch after this list).
- S1 E10 - Cryptography and Key Entropy (December 10, 2015, 29 min). The science of information is also the science of secrets. Investigate the history of cryptography, starting with the simple cipher used by Julius Caesar. See how entropy is a useful measure of the security of an encryption key, and follow the deciphering strategies that cracked early codes (a Caesar-cipher sketch follows this list).
- S1 E11 - Cryptanalysis and Unraveling the Enigma (December 10, 2015, 31 min). Unravel the analysis that broke the super-secure Enigma code system used by the Germans during World War II. Led by British mathematician Alan Turing, the code breakers had to repeat their feat every day throughout the war. Also examine Claude Shannon's revolutionary views on the nature of secrecy.
- S1 E12 - Unbreakable Codes and Public Keys (December 10, 2015, 31 min). The one-time pad may be unbreakable in principle, but consider the common mistakes that make this code system vulnerable. Focus on the Venona project that deciphered Soviet intelligence messages encrypted with one-time pads (a one-time-pad sketch follows this list). Close with the mathematics behind public key cryptography, which makes modern transactions secure - for now.
- S1 E13 - What Genetic Information Can Do (December 10, 2015, 31 min). Learn how DNA and RNA serve as the digital medium for genetic information. Also see how shared features of different life forms allow us to trace our origins back to an organism known as LUCA - the last universal common ancestor - which lived 3.5 to 4 billion years ago.
- S1 E14 - Life’s Origins and DNA Computing (December 10, 2015, 31 min). DNA, RNA, and the protein molecules they assemble are so interdependent that it's hard to picture how life got started in the first place. Survey a selection of intriguing theories, including the view that genetic information in living cells results from eons of natural computation.
- S1 E15 - Neural Codes in the Brain (December 10, 2015, 31 min). Study the workings of our innermost information system: the brain. Take both top-down and bottom-up approaches, focusing on the world of perception, experience, and external behavior on the one hand versus the intricacies of neuron activity on the other. Then estimate the total information capacity of the brain.
- S1 E16 - Entropy and Microstate Information (December 10, 2015, 30 min). Return to the concept of entropy, tracing its origin to thermodynamics, the branch of science dealing with heat. Discover that here the laws of nature and information meet. Understand the influential second law of thermodynamics, and conduct a famous thought experiment called Maxwell's demon.
- S1 E17 - Erasure Cost and Reversible Computing (December 10, 2015, 31 min). Maxwell's demon has startling implications for the push toward ever-faster computers. Probe the connection between the second law of thermodynamics and the erasure of information, which turns out to be a practical barrier to computer processing speed (see the Landauer-limit sketch after this list). Learn how computer scientists deal with the demon.
- S1 E18 - Horse Races and Stock Markets (December 10, 2015, 32 min). One of Claude Shannon's colleagues at Bell Labs was the brilliant scientist and brash Texan John Kelly. Explore Kelly's insight that information is the advantage we have in betting on possible alternatives. Apply his celebrated log-optimal strategy to horse racing and stock trading (a Kelly-criterion sketch follows this list).
- S1 E19 - Turing Machines and Algorithmic Information (December 10, 2015, 31 min). Contrast Shannon's code- and communication-based approach to information with a new, algorithmic way of thinking about the problem in terms of descriptions and computations. See how this idea relates to Alan Turing's theoretical universal computing machine, which underlies the operation of all digital computers.
- S1 E20 - Uncomputable Functions and Incompleteness (December 10, 2015, 31 min). Algorithmic information is plagued by a strange impossibility that shakes the very foundations of logic and mathematics. Investigate this drama in four acts, starting with a famous conundrum called the Berry Paradox and including Turing's surprising proof that no single computer program can determine whether other programs will ever halt.
- S1 E21 - Qubits and Quantum Information (December 10, 2015, 30 min). Enter the quantum realm to see how this revolutionary branch of physics is transforming the science of information. Begin with the double-slit experiment, which pinpoints the bizarre behavior that makes quantum information so different. Work your way toward a concept that seems positively magical: the quantum computer.
- S1 E22 - Quantum Cryptography via Entanglement (December 10, 2015, 32 min). Learn how a feature of the quantum world called entanglement is the key to an unbreakable code. Review the counterintuitive rules of entanglement. Then play a game based on The Newlywed Game that illustrates the monogamy of entanglement. This is the principle underlying quantum cryptography.
- S1 E23 - It from Bit: Physics from Information (December 10, 2015, 31 min). Physicist John A. Wheeler's phrase "It from bit" makes a profound point about the connection between reality and information. Follow this idea into a black hole to investigate the status of information in a place of unlimited density. Also explore the information content of the entire universe!
- S1 E24 - The Meaning of Information (December 10, 2015, 33 min). Survey the phenomenon of information from prehistory to the projected far future, focusing on the special problem of anti-cryptography - designing an understandable message for future humans or alien civilizations. Close by revisiting Shannon's original definition of information and asking, "What does the theory of information leave out?"
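Companion code sketches

For Episode 2, a minimal Python sketch of the idea behind Shannon's thesis: Boolean gates composed into a circuit that does arithmetic. The gate set and the ripple-carry adder below are standard textbook illustrations chosen here, not circuits taken from the lecture.

```python
# Boolean gates as tiny functions, then a one-bit full adder and a 4-bit ripple-carry adder.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out)."""
    partial = XOR(a, b)
    sum_bit = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return sum_bit, carry_out

def add_4bit(x, y):
    """Add two 4-bit numbers using nothing but the gate functions above."""
    result, carry = 0, 0
    for i in range(4):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result | (carry << 4)

print(add_4bit(0b1011, 0b0110))  # 11 + 6 = 17
```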
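For Episodes 3 and 4, a sketch of Shannon entropy as the average surprise, H = -Σ p·log2(p); the example distributions are made up for illustration.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the expected surprise -log2(p) over a distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally surprising per flip; a heavily biased coin much less so.
print(entropy([0.5, 0.5]))   # 1.0 bit
print(entropy([0.9, 0.1]))   # ~0.469 bits
print(entropy([0.25] * 4))   # 2.0 bits: four equally likely alternatives
```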
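For Episode 5, a compact Huffman-coding sketch, one standard way to build a prefix-free code whose average length approaches the entropy bound. The sample text and implementation details are illustrative assumptions, not the course's own example.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a prefix-free (Huffman) code for the symbols in `text`."""
    heap = [(count, i, {sym: ""}) for i, (sym, count) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        c1, _, code1 = heapq.heappop(heap)
        c2, _, code2 = heapq.heappop(heap)
        # Prepend 0 to one subtree's codewords and 1 to the other's.
        merged = {s: "0" + c for s, c in code1.items()}
        merged.update({s: "1" + c for s, c in code2.items()})
        heapq.heappush(heap, (c1 + c2, i, merged))
        i += 1
    return heap[0][2]

text = "the return of sherlock holmes"
code = huffman_code(text)
encoded = "".join(code[ch] for ch in text)
print(f"{len(text) * 8} bits as 8-bit ASCII -> {len(encoded)} bits with a prefix-free code")
```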
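For Episode 7, a one-formula sketch of a textbook special case: the capacity of a binary symmetric channel with flip probability p is C = 1 - H(p) bits per use. This is a standard result used here for illustration, not a calculation quoted from the lecture.

```python
import math

def binary_entropy(p):
    """H(p) in bits for a single biased bit."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity (bits per channel use) of a binary symmetric channel with flip probability p."""
    return 1 - binary_entropy(p)

for p in (0.0, 0.01, 0.1, 0.5):
    print(f"flip probability {p}: capacity {bsc_capacity(p):.3f} bits/use")
```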
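For Episode 8, a sketch of the Hamming(7,4) code, a classic single-error-correcting code and one concrete instance of the techniques the episode surveys; the bit layout follows the usual textbook convention.

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into 7 bits with 3 parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]      # positions 1..7

def hamming74_correct(c):
    """Locate and flip a single corrupted bit using the parity-check syndrome."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]            # checks positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]            # checks positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]            # checks positions 4, 5, 6, 7
    error_pos = s1 + 2 * s2 + 4 * s3          # 0 means no error detected
    if error_pos:
        c[error_pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]           # recovered data bits

word = [1, 0, 1, 1]
received = hamming74_encode(word)
received[4] ^= 1                              # flip one bit in transit
print(hamming74_correct(received) == word)    # True
```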
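For Episode 9, a sketch of the Shannon-Hartley formula C = B·log2(1 + S/N), which links bandwidth and signal-to-noise ratio to the maximum data rate. The bandwidth and SNR values below are placeholders for illustration, not Voyager's actual link parameters.

```python
import math

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """Maximum reliable data rate (bits/s) for a channel of given bandwidth and SNR."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative numbers only: a narrow channel carrying a very weak received signal.
bandwidth_hz = 10_000                 # assumed 10 kHz of bandwidth
snr_db = 3.0                          # assumed: signal only about twice the noise power
snr_linear = 10 ** (snr_db / 10)
print(f"{shannon_hartley_capacity(bandwidth_hz, snr_linear):.0f} bits/s at most")
```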
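For Episode 10, a Caesar-cipher sketch together with the entropy of its key: with only 25 useful shifts, the key carries fewer than 5 bits, which is why the cipher falls instantly to brute force.

```python
import math
import string

def caesar(text, shift):
    """Shift each letter of `text` forward by `shift` places in the alphabet."""
    out = []
    for ch in text.upper():
        if ch in string.ascii_uppercase:
            out.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
        else:
            out.append(ch)
    return "".join(out)

ciphertext = caesar("ATTACK AT DAWN", 3)
print(ciphertext)              # DWWDFN DW GDZQ
print(caesar(ciphertext, -3))  # ATTACK AT DAWN

# Key entropy: 25 equally likely nontrivial shifts give under 5 bits of secrecy.
print(math.log2(25))           # ~4.64 bits
```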
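For Episode 12, a minimal one-time pad sketch under the usual assumptions: the pad is truly random, at least as long as the message, and never reused (reuse is the mistake that Venona exploited).

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the pad; the same operation encrypts and decrypts."""
    assert len(key) >= len(data), "the pad must be at least as long as the message"
    return bytes(m ^ k for m, k in zip(data, key))

message = b"MEET AT THE SAFE HOUSE"
pad = secrets.token_bytes(len(message))   # fresh random pad, used exactly once

ciphertext = otp_xor(message, pad)
print(ciphertext.hex())                   # looks like pure noise
print(otp_xor(ciphertext, pad))           # b'MEET AT THE SAFE HOUSE'
```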
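For Episode 17, a back-of-the-envelope sketch of Landauer's bound, the minimum energy cost of kT·ln 2 to erase one bit; the erasure rate used below is an arbitrary illustrative figure.

```python
import math

K_BOLTZMANN = 1.380649e-23    # J/K

def landauer_limit_joules(temperature_kelvin):
    """Minimum energy dissipated to erase one bit of information at temperature T."""
    return K_BOLTZMANN * temperature_kelvin * math.log(2)

per_bit = landauer_limit_joules(300)         # room temperature
print(f"{per_bit:.2e} J per erased bit")     # ~2.87e-21 J

# Illustrative only: even 10^15 erasures per second costs mere microwatts at this limit,
# so real chips dissipate many orders of magnitude more than the theoretical floor.
print(f"{per_bit * 1e15:.2e} W for 10^15 erasures/s")
```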
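For Episode 18, a sketch of the Kelly (log-optimal) fraction for a single repeated bet, f* = p - (1 - p)/b for win probability p and net odds b; the probabilities and odds are invented for illustration. Betting more than f* does not buy faster long-run growth, as the second simulated line shows.

```python
import math

def kelly_fraction(p_win, net_odds):
    """Log-optimal fraction of the bankroll to wager on a bet paying `net_odds` to 1."""
    return p_win - (1 - p_win) / net_odds

def long_run_bankroll(p_win, net_odds, fraction, rounds=1000, bankroll=1.0):
    """Apply the expected log-growth per round for `rounds` rounds."""
    growth = p_win * math.log(1 + fraction * net_odds) + (1 - p_win) * math.log(1 - fraction)
    return bankroll * math.exp(rounds * growth)

f_star = kelly_fraction(0.6, 1.0)        # 60% chance to win an even-money bet
print(f"Kelly fraction: {f_star:.2f}")   # 0.20
print(f"bet 0.20 of bankroll: {long_run_bankroll(0.6, 1.0, 0.20):.3g}")
print(f"overbet 0.50 of bankroll: {long_run_bankroll(0.6, 1.0, 0.50):.3g}")  # shrinks toward zero
```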