TM challenges but does not violate the Shannon–Hartley theorem.
Informally referred to as Shannon’s power-efficiency limit, this revered tenet of information theory is a mathematical model that describes an absolute limit to the amount of error-free data that can be sent over a specific bandwidth in the presence of noise.
In doing so, it establishes a theoretical barrier beyond which a signal cannot be sent without errors. This limit has been a foundation of communications and information theory since MIT professor Claude Shannon produced his mathematical proof of the theorem in 1948.
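The limit described above is given by the Shannon–Hartley formula, C = B log2(1 + S/N), where C is the maximum error-free data rate, B the bandwidth, and S/N the signal-to-noise ratio. A minimal sketch of the calculation (the example channel parameters are illustrative, not drawn from the original text):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Maximum error-free data rate in bits/s for an AWGN channel,
    per Shannon-Hartley: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 1 MHz channel at 30 dB SNR (S/N = 1000, linear)
capacity = shannon_capacity(1e6, 1000)
print(f"{capacity:.0f} bits/s")  # roughly 9.97 Mbit/s
```

Note that the limit scales only logarithmically with signal power: raising the SNR to 40 dB adds only about 3.3 Mbit/s, which is why spectrum re-use, rather than more power, is the lever TM targets.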
New digital signal processing technology combined with the development of TM’s inflection-based data encoding makes it possible to credibly challenge Shannon’s Limit.
This is because Shannon made certain assumptions regarding the character of electromagnetic waves when he created his theorem, some of them based on statistical probability and on Harry Nyquist’s earlier work involving Fourier transform analysis.
Digital communications science was still in its theoretical infancy at the time that Shannon formulated his theorem.
TM relies on new science and technology to transparently overlay a signal on an existing signal while remaining within its licensed bandwidth. This allows the simultaneous re-use of spectrum and increases the capacity for information that can be transmitted. As a result, TM can appear to violate Shannon’s limit while remaining true to his theorem.