Have you ever actually read Claude Shannon's 1948 paper which built the internet
Date: January 17th, 2026 8:13 AM
Author: ,.,..,.,..,.,.,.,..,.,.,,..,..,.,,..,.,,.
this is an extremely good paper so far. i'm not ready to discuss the whole thing yet, but i'm into it a bit, and i should lay down some very preliminary ideas to begin to ground this whole thing for non-experts.
shannon's job here is to formalize the problem of communication. not some sort of habermas generality about psycho-emotive human communication, but the actual components of a 'message' in theory. what does it even mean to 'communicate' anything at all?
shannon's approach is to decompose a 'message' into its barest constituent units, crediting earlier theorists (nyquist and hartley) for the idea of measuring information logarithmically, and borrowing the word 'bit' from tukey. for example, when he talks about television transmissions, he describes them in terms of XY grid coordinates (the individual dots appearing on a screen) at any given moment, with each coordinate's value changing over time.
there is a unit of time in which each XY coordinate shifts value. in his era, that was set by the standard refresh rate in hertz: the rate at which a cathode ray tube scanned its fields, roughly 60 interlaced fields or 30 full frames per second.
in this type of 'message,' we now have a specific set of data (dots per image and image per unit of time) which must now be transmitted between a sender and a receiver. this data can be described in terms of bits to express the overall size of the transmission. easy enough, no?
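to make that bookkeeping concrete, here is a minimal python sketch of the kind of count shannon is doing. the grid size, brightness levels, and frame rate below are made-up illustrative assumptions, not figures from the paper; the point is just that log base 2 of the number of possible values per dot gives bits per dot, and the rest is multiplication.

```python
# Toy back-of-the-envelope: how many bits does a crude TV-like "message" need?
# All numbers below are illustrative assumptions, not figures from Shannon's paper.

import math

DOTS_PER_FRAME = 500 * 500   # assume a hypothetical 500 x 500 grid of dots
LEVELS_PER_DOT = 32          # assume each dot can take one of 32 brightness levels
FRAMES_PER_SECOND = 30       # roughly the analog TV frame rate mentioned above

bits_per_dot = math.log2(LEVELS_PER_DOT)   # log2 of the number of possible values
bits_per_frame = DOTS_PER_FRAME * bits_per_dot
bits_per_second = bits_per_frame * FRAMES_PER_SECOND

print(f"bits per dot:    {bits_per_dot:.0f}")
print(f"bits per frame:  {bits_per_frame:,.0f}")
print(f"bits per second: {bits_per_second:,.0f}")
```

with these made-up numbers you get 5 bits per dot, 1.25 million bits per frame, and about 37.5 million bits per second to push down the wire.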
NO!
because we have INTERFERENCE, which shannon calls noise. noise is any unwanted signal added to the data that defines our original message during transmission. it can come from any source (weather, hackers, animals chewing through wires, etc.). in terms of our analog TV example, noise shows up visually as STATIC. static is the shit we don't want.
shannon's task from this point forward is to establish methods by which to deal with interference, and to reconstruct - as best he can - the original and purest form of the 'message' prior to its transmission.
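here is a toy illustration of what 'dealing with interference' can look like. this is NOT shannon's construction (his coding theorems are far more sophisticated); it's just a hypothetical binary symmetric channel plus the most naive possible fix, a repetition code, to show that you can trade extra transmitted bits for resistance to noise.

```python
# Toy noisy channel: each bit gets flipped with probability p (a "binary symmetric channel").
# The repetition code shown here is NOT Shannon's method, just the simplest possible
# illustration of trading extra transmitted bits for resistance to noise.

import random

def transmit(bits, flip_prob=0.1, rng=random):
    """Send bits through a channel that flips each one with probability flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in bits]

def encode_repetition(bits, n=3):
    """Repeat every bit n times before sending."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(received, n=3):
    """Majority vote over each group of n received bits."""
    decoded = []
    for i in range(0, len(received), n):
        group = received[i:i + n]
        decoded.append(1 if sum(group) > len(group) / 2 else 0)
    return decoded

message = [random.randint(0, 1) for _ in range(1000)]

# raw transmission: roughly 10% of bits arrive wrong
raw = transmit(message)
raw_errors = sum(a != b for a, b in zip(message, raw))

# repetition-coded transmission: costs 3x the bits, but fewer residual errors
coded = transmit(encode_repetition(message))
decoded = decode_repetition(coded)
coded_errors = sum(a != b for a, b in zip(message, decoded))

print(f"errors without coding: {raw_errors} / {len(message)}")
print(f"errors with 3x repetition: {coded_errors} / {len(message)}")
```

even this crude scheme cuts the error rate substantially (from about 10% of bits to roughly 3%) at the cost of tripling the bandwidth, and that kind of trade-off between redundancy and reliability is exactly what shannon wants to reason about rigorously.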
to be continued, probably.
(http://www.autoadmit.com/thread.php?thread_id=5707716&forum_id=2Reputation#49596052)
Date: January 19th, 2026 12:02 AM
Author: ,.,..,.,..,.,.,.,..,.,.,,..,..,.,,..,.,,.
a few more preliminary concepts which will be important here:
-logarithms: if you forgot math, the log is the inverse of taking an exponent. logs are important in information theory because the number of possible messages grows exponentially with the length of the transmission, so a sensible measure of information, one that adds up when you concatenate messages, ends up being a log (base 2 gives you bits). a toy calculation appears in the sketch after this list.
-the concept of a CHANNEL via which symbols/messages are transmitted. the channel is the vector by which our transmission factors are limited and often defined. shannon formalizes the channel in various ways; it is not a vague term for him.
-the markov/markoff chain and stochastic processes. the upshot here is that symbols within systems such as latin letters or TV pixel values are not statistically independent. they can be described in terms of probabilities (letter frequency, for example) and in terms of what has come prior (which also applies to words and larger units rather than just individual symbols). the sketch after this list estimates these probabilities from a short sample text.
-'states' as an abstraction for more particular entities such as letters or telegraphy dots/dashes. shannon is trying to move these things into a more abstracted form. if you shift from letter A to letter B, a formalized description would be that you have moved from a system state 1 (in this case, A) to a system state 2. as noted above, the probability of a particular state-shift is one of the major problems under consideration, because this is not going to be random if a coherent message is being transmitted rather than pure noise or static.
-ergodicity. a simple example would be letter frequencies in english text. if you send enough coherent english, your letter frequencies will converge to the same values as anyone else's, even though your texts are unrelated in terms of content. this is what it means for english to be an ergodic source: the statistics of one sufficiently long sample match the statistics of the whole ensemble of possible messages.
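to make a couple of these bullets concrete, here is a small python sketch. the sample text is arbitrary filler, and the bigram table is just a toy first-order markov estimate rather than anything lifted from the paper; the entropy formula H = -sum(p * log2 p), however, is the measure shannon actually introduces.

```python
# Toy illustration of letter frequencies, Shannon's entropy formula, and a
# crude first-order Markov (bigram) estimate. The sample text is arbitrary
# filler; with enough genuine English text the frequencies would converge
# toward the same values regardless of content (the ergodicity point above).

import math
from collections import Counter, defaultdict

sample = (
    "the theory of communication treats the message as a sequence of symbols "
    "chosen from a finite set with definite statistical structure"
)
letters = [c for c in sample if c.isalpha()]

# letter frequencies
counts = Counter(letters)
total = len(letters)
probs = {ch: n / total for ch, n in counts.items()}

# Shannon entropy in bits per symbol: H = -sum p * log2(p)
entropy = -sum(p * math.log2(p) for p in probs.values())
print(f"entropy of this sample: {entropy:.2f} bits per letter")
print(f"compare: log2(26) = {math.log2(26):.2f} bits if all letters were equally likely")

# crude first-order Markov estimate: P(next letter | current letter),
# ignoring word boundaries for simplicity
transitions = defaultdict(Counter)
for a, b in zip(letters, letters[1:]):
    transitions[a][b] += 1

follows_t = transitions["t"]
t_total = sum(follows_t.values())
print("most likely letters after 't':",
      [(ch, round(n / t_total, 2)) for ch, n in follows_t.most_common(3)])
```

the entropy of any real english sample comes out well below log2(26), because the letters are nowhere near equally likely, and the conditional probabilities (what tends to follow 't') are exactly the kind of state-to-state statistics the markov framing is meant to capture.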
all of this will be significant as the paper moves along.
(http://www.autoadmit.com/thread.php?thread_id=5707716&forum_id=2Reputation#49600223)