
junk_scribd.txt
Google search: "sheet Einojuhani Rautavaara - Etudes"
Page 7 of about 17,300 results
Rautavaara's Riverboat - Good-Music-Guide.com
www.good-music-guide.com ... The Music Room Composer Discussion
May 1, 2007 - 20 posts - 7 authors
Rautavaara's Riverboat. ... 2007, 11:03:53 AM . Any composer named Einojuhani
deserves a separate thread . . . . Logged ... His Etudes and Icons are also amazing,
and his Piano Sonatas 1 and 2 are wonderful. Narcissus is also ... Anyone know where
I could get some of his piano sheet music? Logged ...
Download link Youtube: Einojuhani Rautavaara - Etudes (1969)
igetlinkyoutube.com/watch?v=nvZ1dzZry1w
Composer: Einojuhani Rautavaara (October 9, 1928 – July 27, 2016) Pianist: Laura ...
Download youtube to mp3: Einojuhani Rautavaara - Etudes (1969) ..... to mp3: Hamelin
plays Gershwin - Songbook (18 Songs) Audio + Sheet Music.
99.5 | New Releases - WGBH
www.wgbh.org/995/newandnotablecds.cfm
Visit Augustin Hadelich's site for more information, and to download sheet music for
cadenzas ... I have most savored by pianist Mutsuko Uchida features the etudes by
Claude Debussy. ... The Helsinki Philharmonic and Einojuhani Rautavaara
Einojuhani Rautavaara Etudes 1969.mp3 Play online
mp3top.online/play/einojuhani-rautavaara-etudes-1969/nvZ1dzZry1w.html
Einojuhani Rautavaara. Einojuhani Rautavaara - Piano Concerto No 1 (1969).mp3 ...
Hamelin plays Gershwin - Songbook (18 Songs) Audio + Sheet Music.mp3.
Buy Sheet Music VIOLIN - FIDDLE - INSTRUCTIONAL : STUDIES ...
m.buy-scores.com/boutique-search-engine-uk.php?search=&CATEGORIE...
Etude Methodique De La Double Corde Volume 2. Details. Details ... Piano solo [Sheet
music] ABRSM Publishing .... By Einojuhani Rautavaara. For Violin.
Schulhoff - 5 Etudes de Jazz Video Download MP4 3GP FLV - YiFlix ...
www.yiflix.com Music
Mar 24, 2013 - Hamelin plays Gershwin - Songbook (18 Songs) Audio + Sheet Music, 28 Jun 12,
20:02 ... Einojuhani Rautavaara - Etudes (1969), 19 Apr 15, 12: ...
Einojuhani Rautavaara - Etudes (1969)|phim hot nhat
phimhotnhat.net/.../video-einojuhani-rautavaara-etudes-1969.nvZ1...
Composer: Einojuhani Rautavaara (October 9, 1928 – July 27, 2016). Pianist: Laura Mikkola.
00:03 Etude I – Thirds; 03:21 Etude II – Sevenths; 04:26 Etude III ...
[PDF]Download pdf file - Modern Accordion Perspectives
www.modernaccordionperspectives.com/Publications_files/MAP2.pdf
Etude II (2009). (Gesualdi). Juan-José Mosalini ... Three Etudes (2000). (Olczak).
Younghi Pagh-Paan ... Einojuhani Rautavaara (Finland). Fiddlers (1952-1991).
rautavaara fire sermon pdf - Findeen.com
www.findeen.co.uk Search Directory
... "The Fire Sermon" sheet music - piano sheet music by Einojuhani Rautavaara: ...
2 The Fire Sermon: Rautavaara: 15: original: pdf: 4 years: 6 Etudes for Piano: ...
John Luther Adams - Nunataks (Solitary Peaks) for Piano (2007 ...
1tvprograma.ru/prosmotr/MnJzM0tuN3lFU2s/
... grandeur, the sudden rise to meet each peak (there are ten) and the slow descent
to the vast ice sheet afterwards. ... Einojuhani Rautavaara - Etudes (1969).
Google search: "sheet Einojuhani Rautavaara - Etudes"
Page 2 of about 17,300 results
Einojuhani Rautavaara, Laura Mikkola - Rautavaara: Works for Piano ...
https://www.amazon.com/Rautavaara-Works-Piano-Sonatas-Etudes/.../B00000JMYG
Rating: 4.4 - 3 reviews
Einojuhani Rautavaara, Laura Mikkola - Rautavaara: Works for Piano - Piano Sonatas
No. 1 & 2; Icons; Etudes - Amazon.com Music.
Einojuhani Rautavaara - Scribd
https://www.scribd.com/document/336022644/Einojuhani-Rautavaara
Einojuhani Rautavaara ... The etudes were composed in 1969, ... reintroduce a
sonorous, broad piano style using ... Each étude focuses on a particular interval.
Einojuhani Rautavaara - Piano Solo Sheet Music from Presto Classical
www.prestoclassical.co.uk/sm/category1%7CPiano+Solo~composer%7C8865-b
Browse Sheet Music - Composer: Einojuhani Rautavaara, Piano Solo. ... Einojuhani
Rautavaara's Music For Upright Piano. ... Rautavaara, E: Etudes op. 42.
25 Etudes Melodiques, Op.45 (Heller, Stephen) - IMSLP/Petrucci ...
imslp.org/wiki/25_Etudes_Melodiques,_Op.45_(Heller,_Stephen)
25 Etudes Melodiques, Op.45 (Heller, Stephen) ... Sheet Music. Piano Scores (8);
Parts (0); Arrangements and Transcriptions (0); Other (0) ...
Einojuhani Rautavaara - Classical Archives
www.classicalarchives.com Composers
Einojuhani Rautavaara (composer 1928-) - Play streams in full or download MP3 from
Classical Archives (classicalarchives.com), the largest and best organized ...
Einojuhani Rautavaara: Music For Upright Piano - Piano Instrumental ...
www.musicroom.com ... Piano Solo Post-1900 Instrumental Work
Einojuhani Rautavaara's Music For Upright Piano. ... Media: Sheet Music ... In 1965,
when Einojuhani Rautavaara was thirty-seven years old, he was awarded the
prestigious Sibelius Prize, ... Chopin: Complete Preludes And Etudes 12.95.
Music Finland Core | Einojuhani Rautavaara
https://core.musicfinland.fi/composers/einojuhani-rautavaara
Einojuhani Rautavaara was one of Finland's internationally most successful
composers. He made ... Etydit-Etudes, 1969, 8, 00:00, Fennica Gehrman. Fanfaari ...
Rautavaara, Einojuhani - free listen online, download mp3, download ...
classical-music-online.net/en/composer/Rautavaara/1697
Rautavaara, Einojuhani - free listen online, download mp3, download sheet ... Sonata
1 `Christ and the Fisherman`. Sonata 2 `Sermon of Fire`. Etudes.
Boosey and Hawkes Piano Anthology, The ( Pia | J.W. Pepper Sheet ...
https://www.jwpepper.com/Boosey-and-Hawkes-Piano...The/10289615.item
Piano Sheet Music. ... RAG by ELENA KATS-CHERNIN; FANTASIA by BENJAMIN LEES; ETUDE
IN A by BOHUSLAV MARTINU ... BOHUSLAV MARTINU; PASSIONALE by EINOJUHANI RAUTAVAARA;
SONG AND DANCE by NED ROREM ...
Einojuhani Rautavaara - Ondine Records
https://www.ondine.net/?cid=4.2&oid=622
Einojuhani Rautavaara (born 9 October 1928) is internationally one of the best known
and most frequently performed Finnish composers. ... Sibelius selected Rautavaara
who spent two years studying with Vincent Persichetti ... Etudes, Op. 42
Einojuhani Rautavaara was the leading Finnish composer of his generation
* His late style combined modernism with mystical romanticism
* Series of orchestral works inspired by metaphysical and religious subjects
* Immensely popular recordings on the Ondine label, including the best-selling Symphony No. 7 (Angel of Light) (1995)
* Operas on creative and historic themes, including Vincent (1986-87) and Rasputin (2001-03)
* Widely performed choral works, including Vigilia (1971-72, rev. 1996)
* Works written for leading orchestras on both sides of the Atlantic
read the 00 Kernel Debug Guide
https://samsclass.info/126/proj/p12-kernel-debug-win10.htm
- Installing Debugging Tools for Windows
01 WIN SDK
-- Use the Windows SDK, select only the components you want to install
-- in this case, "debugging tools for windows"
- set up local kernel mode debug:
bcdedit /debug on
bcdedit /dbgsettings local
(need to reset)

02 LiveKD -
https://technet.microsoft.com/en-us/sysinternals/bb897415.aspx
If you install the tools to their default directory of \Program
Files\Microsoft\Debugging Tools for Windows, you can run LiveKD from any directory;
otherwise you should copy LiveKD to the directory in which the tools are installed
170414 - installed on drive d: - copy livekd64.exe to
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64

when prompted for sym dir, enter:


D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
srv*D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols*http://msdl.microsoft.com/download/symbols

side issue : symbol store


https://www.howtogeek.com/236195/how-to-find-out-which-build-and-version-of-windows-10-you-have/
alt-I -> about
command line:
cmd> systeminfo
cmd> ver
cmd> winver
cmd> wmic os get buildnumber,caption,CSDVersion /format:csv
cmd> wmic os get /value

other sites:
https://blogs.technet.microsoft.com/markrussinovich/2005/08/17/unkillable-processes/
https://www.windows-commandline.com/find-windows-os-version-from-command/
http://stackoverflow.com/questions/30159714/i-want-to-know-what-wmic-os-get-name-version-csdversion-command-returns-for-w

Debugger reference->debugger commands->Kernel-mode extension commands


https://msdn.microsoft.com/en-us/library/windows/hardware/ff564717(v=vs.85).aspx
!process

https://msdn.microsoft.com/en-us/library/windows/hardware/ff563812(v=vs.85).aspx
!irp

process explorer: configure symbols


Configure Symbols: on Windows NT and higher, if you want Process Explorer to resolve
addresses for thread start addresses in the threads tab of the process properties
dialog and the thread stack window then configure symbols by first downloading the
Debugging Tools for Windows package from Microsoft's web site and installing it in
its default directory. Open the Configure Symbols dialog and specify the path to the
dbghelp.dll that's in the Debugging Tools directory and have the symbol engine
download symbols on demand from Microsoft to a directory on your disk by entering a
symbol server string for the symbol path. For example, to have symbols download to
the c:\symbols directory you would enter this string:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll
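Side note: the symbol path strings above follow the pattern srv*<cache dir>*<server URL>. A small Python sketch of a helper that builds such a string (the function name is illustrative, not part of any tool):

# Build a Microsoft symbol-server path of the form used above:
#   srv*<local cache dir>*<upstream server URL>
def symbol_path(cache_dir: str,
                server: str = "http://msdl.microsoft.com/download/symbols") -> str:
    return f"srv*{cache_dir}*{server}"

print(symbol_path(r"c:\symbols"))
# srv*c:\symbols*http://msdl.microsoft.com/download/symbols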

https://blogs.msdn.microsoft.com/usbcoreblog/2009/10/06/why-doesnt-my-driver-unload/

running as SYSTEM:
https://forum.sysinternals.com/system-process-using-all-cpu_topic12233.html

cd C:\Users\dan\Desktop\kernel debug\02 LiveKD\PSTools


psexec -s -i -d "C:\Users\dan\Desktop\kernel debug\02
LiveKD\ProcessExplorer\procexp64.exe
psexec -s -i -d "C:\Users\dan\Downloads\processhacker-2.39-bin\x64\ProcessHacker.exe"
to configure symbols in procexp64 or processhacker:
srv*c:\symbols*http://msdl.microsoft.com/download/symbols
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\dbghelp.dll

cd D:\Program Files (x86)\Windows Kits\10\Debuggers\x64


d:
livekd64 -vsym
y
D:\Program Files (x86)\Windows Kits\10\Debuggers\x64\livekd64_Symbols
!process 0 1 notmyfault64.exe
!process 0 7 notmyfault64.exe
!irp ffff980e687e8310

Example 18. Primes enclosed in rectangles: <0 2 1>, <1 0 3 2 ...>, <131 402>


In essence, the algorithm looks at a string of intervals derived from the successive
values in some musical dimension in a piece of music. The string might be a series of
pitch intervals, time intervals (delays), dynamic changes, and so forth. More than one
string can be selected for analysis. Then the algorithm combines the values of each
dimension's successive intervals according to a user-specified average which assigns a
relative "weight" to each of the dimensions. Example 20 illustrates the principle of the
Tenney/Polansky algorithm: for four successive dimension values labeled A through D
forming three successive unordered intervals labeled X, Y, and Z, if the middle interval
is greater than the other two intervals, the string of values is segmented in half; the
value C starts a new segment or phrase. In its simplest form, using only one musical
dimension, the algorithm works by going through the dimension's list of undirected
intervals in threes looking for maximum values and segmenting accordingly. This results
in a series of successive segments (or phrases). We can then average the values in each
of the output segments to get a series of new higher-order values. We input these into
the algorithm to produce a second-order segmentation, and so forth, until the music is
parsed into a single segment. To illustrate the Tenney/Polansky algorithm, we perform it
on the Schoenberg piece. Example 21a shows the results using one dimension, pitch alone.
The first pass segments the pitches into segments of three to six pitches; that is, the
segmentation is determined by the sequence of the sizes of successive unordered pitch
intervals. The segmental boundaries
are shown by vertical lines. The results are quite reasonable. For instance, the four
pitches <E, F#, G, F> in phrase 2 are segmented out of the rest of the measure since
they fall in a lower register from the others. Phrases 4 and 5 seem segmented correctly;
the first is divided into two segments, the second into one. And in general, the
boundaries of these first-level segments never contradict our more intuitively derived
phrase structure. The second pass works on the averages of the values in each level-1
segment. These averages are simply the center pitch of the bandwidth (measured in
semitones) of each level-1 segment. The intervals between the series of bandwidths
forming level 2 are the input to the second pass of the algorithm. The resulting
second-level segmentation divides the piece in half in the middle of the third phrase,
which contradicts our six-phrase structure. That the second-pass parsing is at variance
with our phrase structure is not an embarrassment, for we are taking pitch intervals as
the only criterion for segmentation. Let us examine Example 21b with the algorithm's
taking only time spans between notes as input. Here the unit of time is a thirty-second
note. Once again the first level basically conforms to our ideas of the phrase
structure, with two exceptions. Likewise, the second pass partitions the stream of
durations so that it has an exception inherited from level 1; the last phrase is divided
in half, with its first part serving as a conclusion to the second-level segment that
starts at phrase 4. Finally, Example 21c shows the algorithm's output using both
duration and pitch. The initial values of the previous examples are simply added
together. This time the results get
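The first pass described above is just a local-maximum scan over successive undirected intervals. Here is a minimal Python sketch of the single-dimension case, assuming equal weighting (the user-weighted multi-dimensional average is omitted); the function names and sample pitches are illustrative, not from the source:

# First-pass Tenney/Polansky-style segmentation over one dimension.
# For successive values A B C D with undirected intervals
# X = |B - A|, Y = |C - B|, Z = |D - C|, a new segment starts at C
# whenever the middle interval Y exceeds both X and Z.

def first_pass(values):
    """Split a list of numeric values at local-maximum intervals."""
    intervals = [abs(b - a) for a, b in zip(values, values[1:])]
    starts = [0]
    for i in range(1, len(intervals) - 1):
        x, y, z = intervals[i - 1], intervals[i], intervals[i + 1]
        if y > x and y > z:
            starts.append(i + 1)   # values[i + 1] is "C"
    return [values[s:e] for s, e in zip(starts, starts[1:] + [len(values)])]

def next_level(segments):
    """Average each segment to get higher-order values for the next pass."""
    return [sum(seg) / len(seg) for seg in segments]

pitches = [64, 66, 67, 65, 58, 59, 61, 72, 70]   # illustrative MIDI numbers
level1 = first_pass(pitches)     # [[64, 66, 67, 65], [58, 59, 61], [72, 70]]
level2 = first_pass(next_level(level1))
print(level1, level2)

Feeding the averages from next_level back into first_pass yields the second-order segmentation described above, repeated until a single segment remains.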

(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt,
Rinehart and Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language.
Cambridge, MA: MIT Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W.
W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton:
Princeton Univ. Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of
the association between two random variables. One could comment that Mutual Information
"is not concerned" whether the association is linear or not, while Covariance may be
zero and the variables may still be stochastically dependent. On the other hand,
Covariance can be calculated directly from a data sample without the need to actually
know the probability distributions involved (since it is an expression involving moments
of the distribution), while Mutual Information requires knowledge of the distributions,
whose estimation, if unknown, is a much more delicate and uncertain work compared to the
estimation of Covariance.
Set Theory Primer

1. Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from finite set theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
In single-voice writing there are "rules" for the way a melody should progress. In the
composition of a cantus firmus in modal counterpoint, for example, a leap is limited to
certain intervals and must be followed either by a step in the opposite direction or by
another leap, provided the two successive leaps outline one of a few permissible
three-note sonorities. In multi-voice contexts, the leading of a voice is determined
even further. As I compose, for instance, I ask: Will the next note I write down form a
consonance with the other voices? If not, is the dissonance correctly prepared and
resolved? What scale degrees and harmonies are involved? (And the answers to such
questions will of course depend on whether the note is in the bass, soprano, or an inner
voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable
the listener to parse the ongoing musical fabric into meaningful units. They help me to
determine "by ear" whether the next note is in the same voice, or jumps to another in an
arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have
sought some extension or generalization of tonal voice-leading for non-tonal music.
Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply
linear concepts such as Schenkerian prolongation to music that appears to have little to
do with tonality or even pitch concentricity.1 Joseph N. Straus and others have however
called such work into question.2 Other theorists have obviated voice-leading as a
criterion for distinguishing linear aspects of pitch structure. For example, in my own
theory of compositional design, ensembles of (un-interpreted) pc segments, often called
lynes, are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby separating
it out from

In earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding technical
vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus
often appeared to the uninitiated as the sterile application of arcane, mathematical
concepts to inaudible and uninteresting musical relationships. This situation has
created understandable frustration among musicians, and the frustration has grown as
discussions of twentieth-century music in the professional theoretical literature have
come to be expressed almost entirely in this unfamiliar language. Where did this theory
come from and how has it managed to become so dominant? Set theory emerged in response
to the motivic and contextual nature of post-tonal music. Tonal music uses only a small
number of referential sonorities (triads and seventh chords); post-tonal music presents
an extraordinary variety of musical configurations. Tonal music shares a common practice
of harmony and voice leading; post-tonal music is more highly self-referential: each
work defines anew its basic shapes and modes of progression. In tonal music, motivic
relationships are constrained by the norms of tonal syntax; in post-tonal music, motives
become independent and function as primary structural determinants. In this situation, a
new music theory was needed, free of traditional

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
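
As the excerpt notes, pitches are quantified as frequencies by comparison with reference tones. A minimal sketch of that mapping, assuming 12-tone equal temperament and A4 = 440 Hz (both conventions, not facts from the text):

import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_note(freq_hz, a4=440.0):
    """Map a frequency to the nearest equal-tempered note (A4 = 440 Hz)."""
    midi = round(69 + 12 * math.log2(freq_hz / a4))
    name = NOTE_NAMES[midi % 12]
    octave = midi // 12 - 1          # MIDI convention: C4 = 60
    return f"{name}{octave}"

print(freq_to_note(261.63))  # ~ middle C -> "C4"
print(freq_to_note(466.16))  # -> "A#4"
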
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a stimulus.
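
Temporal theories key on periodicity (phase-locked firing). An analogous signal-level illustration, not a model of the auditory system, is to estimate a pitch from a waveform's periodicity via its autocorrelation peak:

import numpy as np

def autocorr_pitch(frame, sr, fmin=50.0, fmax=1000.0):
    """Estimate a fundamental frequency from the strongest autocorrelation lag."""
    sig = frame - frame.mean()
    ac = np.correlate(sig, sig, mode="full")[len(sig) - 1:]  # lags >= 0
    lo, hi = int(sr / fmax), int(sr / fmin)                  # plausible lag range
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

sr = 44100
frame = np.sin(2 * np.pi * 220.0 * np.arange(4096) / sr)     # 220 Hz test tone
print(autocorr_pitch(frame, sr))                             # ~220 Hz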

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the Kullback-Leibler (KL) divergence between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other, more complicated relationships.
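
A quick numerical check of this point (assuming numpy, scipy, and scikit-learn are available): for Y = X^2 on a symmetric interval, the Pearson correlation is near zero while an estimated mutual information is clearly positive.

import numpy as np
from scipy.stats import pearsonr
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 5000)
y = x ** 2                               # strongly dependent, but not monotonic

r, _ = pearsonr(x, y)
mi = mutual_info_regression(x.reshape(-1, 1), y, random_state=0)[0]
print(f"Pearson r ~ {r:.3f}")            # near 0: the linear measure misses it
print(f"estimated MI ~ {mi:.3f} nats")   # clearly positive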

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series ordinal patterns MIOP [25]. The former is a linear measure and the two latter are non-linear ones.
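
To make the construction concrete, a sketch of the first SSM above: take the absolute Pearson cross correlation between every pair of oscillator time series and threshold it into a network adjacency matrix. The data and threshold here are invented for illustration; the paper's actual pipeline is more involved.

import numpy as np

def cc_network(series, threshold=0.5):
    """Build a functional network from the |Pearson CC| SSM between rows."""
    ssm = np.abs(np.corrcoef(series))    # pairwise |cross correlation|, (n, n)
    adj = (ssm >= threshold).astype(int) # keep links above the threshold
    np.fill_diagonal(adj, 0)             # no self-links
    return ssm, adj

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 500)
osc = np.vstack([np.sin(2 * np.pi * t + rng.uniform(0, 0.3)),
                 np.sin(2 * np.pi * t + rng.uniform(0, 0.3)),
                 rng.normal(size=t.size)])   # two similar oscillators + noise
ssm, adj = cc_network(osc)
print(adj)   # expect a link between the two similar oscillators only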

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that Mutual Information "is not concerned" whether the association is linear or not, while Covariance may be zero and the variables may still be stochastically dependent. On the other hand, Covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a much more delicate and uncertain work compared to the estimation of Covariance.
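
That asymmetry is easy to see numerically: the sample covariance is a single moment formula, while a naive plug-in MI estimate depends on how you estimate the densities. In the histogram sketch below (numpy only; the bin counts are an arbitrary choice) the MI estimate drifts with the binning, while covariance needs no such choice.

import numpy as np

def hist_mi(x, y, bins):
    """Naive plug-in MI estimate (nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                          # joint probabilities
    px, py = pxy.sum(axis=1), pxy.sum(axis=0) # marginals
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(2)
x = rng.normal(size=2000)
y = 0.6 * x + 0.8 * rng.normal(size=2000)

print(np.cov(x, y)[0, 1])    # direct moment estimate, no tuning needed
for b in (5, 20, 80):        # the MI estimate moves with the binning choice
    print(b, hist_mi(x, y, b))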


://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
Page 31
junk_scribd.txt
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


Page 32
junk_scribd.txt
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.
Page 33
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance
Page 34
junk_scribd.txt

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
Page 35
junk_scribd.txt
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
Page 36
junk_scribd.txt
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
Page 37
junk_scribd.txt
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Page 38
junk_scribd.txt
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsScreen reader users, click here to turn off Google Instant...


Google

sheet Einojuhani Rautavaara - Etudes

AllVideosImagesShoppingMapsMore
SettingsTools
Page 7 of about 17,300 results (0.63 seconds)
Search Results
Rautavaara's Riverboat - Good-Music-Guide.com
www.good-music-guide.com ... The Music Room Composer Discussion
May 1, 2007 - 20 posts - 7 authors
Rautavaara's Riverboat. ... 2007, 11:03:53 AM . Any composer named Einojuhani
deserves a separate thread . . . . Logged ... His Etudes and Icons are also amazing,
and his Piano Sonatas 1 and 2 are wonderful. Narcissus is also ... Anyone know where
I could get some of his piano sheet music? Logged ...
Download link Youtube: Einojuhani Rautavaara - Etudes (1969)
igetlinkyoutube.com/watch?v=nvZ1dzZry1w
Composer: Einojuhani Rautavaara (October 9, 1928 July 27, 2016) Pianist: Laura ...
Download youtube to mp3: Einojuhani Rautavaara - Etudes (1969) ..... to mp3: Hamelin
plays Gershwin - Songbook (18 Songs) Audio + Sheet Music.
99.5 | New Releases - WGBH
www.wgbh.org/995/newandnotablecds.cfm
Visit Augustin Hadelich's site for more information, and to download sheet music for
cadenzas ... I have most savored by pianist Mutsuko Uchida features the etudes by
Claude Debussy. ... The Helsinki Philharmonic and Einojuhani Rautavaara
Einojuhani Rautavaara Etudes 1969.mp3 Play online
mp3top.online/play/einojuhani-rautavaara-etudes-1969/nvZ1dzZry1w.html
Einojuhani Rautavaara. Einojuhani Rautavaara - Piano Concerto No 1 (1969).mp3 ...
Hamelin plays Gershwin - Songbook (18 Songs) Audio + Sheet Music.mp3.
Buy Sheet Music VIOLIN - FIDDLE - INSTRUCTIONAL : STUDIES ...
m.buy-scores.com/boutique-search-engine-uk.php?search=&CATEGORIE...
Etude Methodique De La Double Corde Volume 2. Details. Details ... Piano solo [Sheet
music] ABRSM Publishing .... By Einojuhani Rautavaara. For Violin.
Schulhoff - 5 Etudes de Jazz Video Download MP4 3GP FLV - YiFlix ...
www.yiflix.com Music
Mar 24, 2013 - Hamelin plays Gershwin - Songbook (18 Songs) Audio + Sheet Music 28
Jun 1220:02 ... Einojuhani Rautavaara - Etudes (1969) 19 Apr 1512: ...
Page 39
junk_scribd.txt
Einojuhani Rautavaara - Etudes (1969)|phim hot nhat
phimhotnhat.net/.../video-einojuhani-rautavaara-etudes-1969.nvZ1...
Translate this page
Composer: Einojuhani Rautavaara (October 9, 1928 July 27, 2016)Pianist: Laura
Mikkola00:03 Etude I - Thirds03:21 Etude II - Sevenths04:26 Etude III ...
[PDF]Download pdf file - Modern Accordion Perspectives
www.modernaccordionperspectives.com/Publications_files/MAP2.pdf
Etude II (2009). (Gesualdi). Juan-Jos Mosalini ... Three Etudes (2000). (Olczak).
Younghi Pagh-Paan ... Einojuhani Rautavaara (Finland). Fiddlers (1952-1991).
rautavaara fire sermon pdf - Findeen.com
www.findeen.co.uk Search Directory
... "The Fire Sermon" sheet music - piano sheet music by Einojuhani Rautavaara: ...
2 The Fire Sermon: Rautavaara: 15: original: pdf: 4 years: 6 Etudes for Piano: ...
John Luther Adams - Nunataks (Solitary Peaks) for Piano (2007 ...
1tvprograma.ru/prosmotr/MnJzM0tuN3lFU2s/
Translate this page
... grandeur, the sudden rise to meet each peak (there are ten) and the slow descent
to the vast ice sheet afterwards. ... Einojuhani Rautavaara - Etudes (1969).
Previous
2
3
4
5
6
7
8
9
10
11
Next
Screen reader users, click here to turn off Google Instant...
Google

sheet Einojuhani Rautavaara - Etudes

AllVideosImagesShoppingMapsMore
SettingsTools
Page 7 of about 17,300 results (0.63 seconds)
Search Results
Rautavaara's Riverboat - Good-Music-Guide.com
www.good-music-guide.com ... The Music Room Composer Discussion
May 1, 2007 - 20 posts - 7 authors
Rautavaara's Riverboat. ... 2007, 11:03:53 AM . Any composer named Einojuhani
deserves a separate thread . . . . Logged ... His Etudes and Icons are also amazing,
and his Piano Sonatas 1 and 2 are wonderful. Narcissus is also ... Anyone know where
I could get some of his piano sheet music? Logged ...
Page 40
junk_scribd.txt
Download link Youtube: Einojuhani Rautavaara - Etudes (1969)
igetlinkyoutube.com/watch?v=nvZ1dzZry1w
Composer: Einojuhani Rautavaara (October 9, 1928 July 27, 2016) Pianist: Laura ...
Download youtube to mp3: Einojuhani Rautavaara - Etudes (1969) ..... to mp3: Hamelin
plays Gershwin - Songbook (18 Songs) Audio + Sheet Music.
99.5 | New Releases - WGBH
www.wgbh.org/995/newandnotablecds.cfm
Visit Augustin Hadelich's site for more information, and to download sheet music for
cadenzas ... I have most savored by pianist Mutsuko Uchida features the etudes by
Claude Debussy. ... The Helsinki Philharmonic and Einojuhani Rautavaara
Einojuhani Rautavaara Etudes 1969.mp3 Play online
mp3top.online/play/einojuhani-rautavaara-etudes-1969/nvZ1dzZry1w.html
Einojuhani Rautavaara. Einojuhani Rautavaara - Piano Concerto No 1 (1969).mp3 ...
Hamelin plays Gershwin - Songbook (18 Songs) Audio + Sheet Music.mp3.
Buy Sheet Music VIOLIN - FIDDLE - INSTRUCTIONAL : STUDIES ...
m.buy-scores.com/boutique-search-engine-uk.php?search=&CATEGORIE...
Etude Methodique De La Double Corde Volume 2. Details. Details ... Piano solo [Sheet
music] ABRSM Publishing .... By Einojuhani Rautavaara. For Violin.
Schulhoff - 5 Etudes de Jazz Video Download MP4 3GP FLV - YiFlix ...
www.yiflix.com Music
Mar 24, 2013 - Hamelin plays Gershwin - Songbook (18 Songs) Audio + Sheet Music 28
Jun 1220:02 ... Einojuhani Rautavaara - Etudes (1969) 19 Apr 1512: ...
Einojuhani Rautavaara - Etudes (1969)|phim hot nhat
phimhotnhat.net/.../video-einojuhani-rautavaara-etudes-1969.nvZ1...
Translate this page
Composer: Einojuhani Rautavaara (October 9, 1928 July 27, 2016)Pianist: Laura
Mikkola00:03 Etude I - Thirds03:21 Etude II - Sevenths04:26 Etude III ...
[PDF]Download pdf file - Modern Accordion Perspectives
www.modernaccordionperspectives.com/Publications_files/MAP2.pdf
Etude II (2009). (Gesualdi). Juan-Jos Mosalini ... Three Etudes (2000). (Olczak).
Younghi Pagh-Paan ... Einojuhani Rautavaara (Finland). Fiddlers (1952-1991).
rautavaara fire sermon pdf - Findeen.com
www.findeen.co.uk Search Directory
... "The Fire Sermon" sheet music - piano sheet music by Einojuhani Rautavaara: ...
2 The Fire Sermon: Rautavaara: 15: original: pdf: 4 years: 6 Etudes for Piano: ...
John Luther Adams - Nunataks (Solitary Peaks) for Piano (2007 ...
1tvprograma.ru/prosmotr/MnJzM0tuN3lFU2s/
Translate this page
... grandeur, the sudden rise to meet each peak (there are ten) and the slow descent
to the vast ice sheet afterwards. ... Einojuhani Rautavaara - Etudes (1969).
Previous
2
3
4
5
6
7
8
Page 41
junk_scribd.txt
9
10
11
Next
Screen reader users, click here to turn off Google Instant...
Google

sheet Einojuhani Rautavaara - Etudes

AllVideosImagesShoppingMapsMore
SettingsTools
Page 7 of about 17,300 results (0.63 seconds)
Search Results
Rautavaara's Riverboat - Good-Music-Guide.com
www.good-music-guide.com ... The Music Room Composer Discussion
May 1, 2007 - 20 posts - 7 authors

In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a few
permissible three-note sonorities. In multi-voice contexts, the leading of a voice
is determined even further. As I compose, for instance, I ask: Will the next note I
write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved? (And
the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary,
for their own sake; they enable the listener to parse the ongoing musical fabric
into meaningful units. They help me to determine "by ear" whether the next note is
in the same voice, or jumps to another in an arpeggiation, or is ornamental or not,
and so forth. Many composers and analysts have sought some extension or
generalization of tonal voice-leading for non-tonal music. Analysts such as Felix
Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such
as Schenkerian prolongation to music that appears to have little to do with tonality
or even pitch concentricity.1 Joseph N. Straus and others have, however, called such
work into question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from ...
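The leap rule described in this passage is mechanical enough to express as code. A
minimal sketch, assuming semitone intervals, treating any motion larger than a major
second as a leap, and omitting the permissible-three-note-sonority clause (whose
details vary by counterpoint treatise):

    def leap_followed_correctly(melody):
        """Check that every leap is followed by a step in the opposite
        direction or by another leap (sonority clause not modeled).

        melody: a sequence of MIDI note numbers.
        """
        for a, b, c in zip(melody, melody[1:], melody[2:]):
            if abs(b - a) <= 2:                # step or repetition: no constraint
                continue
            step_back = abs(c - b) <= 2 and (c - b) * (b - a) < 0
            another_leap = abs(c - b) > 2
            if not (step_back or another_leap):
                return False
        return True

    print(leap_followed_correctly([60, 65, 64, 62]))  # True: leap up, step down
    print(leap_followed_correctly([60, 65, 67, 69]))  # False: leap up, step up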

In earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane,
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from, and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of
post-tonal music. Tonal music uses only a small number of referential sonorities
(triads and seventh chords); post-tonal music presents an extraordinary variety of
musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew
its basic shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a
new music theory was needed, free of traditional ...
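The "interval vector" mentioned above is easy to compute: for every unordered pair
of pitch classes in a set, take the interval class (the smaller of the two distances
around the mod-12 clock) and tally the counts for classes 1 through 6. A minimal
sketch:

    from itertools import combinations

    def interval_vector(pcs):
        """Interval-class vector of a pitch-class set (integers mod 12)."""
        vec = [0] * 6
        for a, b in combinations({p % 12 for p in pcs}, 2):
            d = abs(a - b) % 12
            ic = min(d, 12 - d)        # interval class, always in 1..6
            vec[ic - 1] += 1
        return vec

    # The hexachord {0,1,2,5,6,9}, catalogued as Forte's 6-Z44, gives
    # [3, 1, 3, 4, 3, 1].
    print(interval_vector([0, 1, 2, 5, 6, 9]))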

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.
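The earlier claim that pitches are quantified as frequencies in cycles per second,
by comparison with pure tones, suggests a simple demonstration: estimate the
dominant frequency of a sampled waveform from its spectrum. A minimal sketch using
an FFT peak (real pitch trackers must be far more careful about harmonics, noise,
and frequency resolution):

    import numpy as np

    fs = 44_100                              # sample rate in Hz
    t = np.arange(fs) / fs                   # one second of sample times
    wave = np.sin(2 * np.pi * 440.0 * t)     # a 440 Hz pure tone (A4)

    spectrum = np.abs(np.fft.rfft(wave))
    freqs = np.fft.rfftfreq(wave.size, d=1 / fs)
    print(f"estimated frequency: {freqs[spectrum.argmax()]:.1f} Hz")  # ~440.0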

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships.
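Written out, the "KL distance between the joint density and the product of the
individual densities" is (discrete case shown for concreteness):

    \[
    I(X;Y) \;=\; D_{\mathrm{KL}}\!\bigl(p(x,y)\,\|\,p(x)\,p(y)\bigr)
           \;=\; \sum_{x}\sum_{y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}
    \]

I(X;Y) is zero exactly when X and Y are independent, which is why it detects the
non-monotonic dependencies that a zero correlation can miss.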

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearson's coefficient) CC, the mutual information
MI, and the mutual information of the time series ordinal patterns MIOP [25]. The
former is a linear measure and the latter two are non-linear ones.
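A minimal sketch of the ordinal-pattern symbolization behind MIOP, in the
Bandt-Pompe style: each window of d consecutive samples is replaced by the
permutation that sorts it, and MIOP is then the mutual information between the two
symbol sequences. The embedding dimension, unit delay, and plug-in estimator here
are illustrative assumptions, not the paper's exact settings:

    import numpy as np
    from itertools import permutations

    def ordinal_patterns(x, d=3):
        """Symbolize x: one integer per length-d window (Bandt-Pompe)."""
        lookup = {p: i for i, p in enumerate(permutations(range(d)))}
        return np.array([lookup[tuple(np.argsort(x[i:i + d]))]
                         for i in range(len(x) - d + 1)])

    def mi_symbols(a, b):
        """Plug-in mutual information (bits) between two symbol sequences."""
        k = int(max(a.max(), b.max())) + 1
        joint = np.zeros((k, k))
        for s, t in zip(a, b):
            joint[s, t] += 1
        joint /= joint.sum()
        px, py = joint.sum(1), joint.sum(0)
        nz = joint > 0
        return float((joint[nz] * np.log2(joint[nz] / np.outer(px, py)[nz])).sum())

    # miop = mi_symbols(ordinal_patterns(x), ordinal_patterns(y))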

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different
aspects of the association between two random variables. One could comment that
Mutual Information "is not concerned" whether the association is linear or not,
while Covariance may be zero and the variables may still be stochastically
dependent. On the other hand, Covariance can be calculated directly from a data
sample without the need to actually know the probability distributions involved
(since it is an expression involving moments of the distribution), while Mutual
Information requires knowledge of the distributions, whose estimation, if unknown,
is a much more delicate and uncertain work compared to the estimation of Covariance.
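These two points of contrast can be seen in a few lines of code. A self-contained
sketch comparing Pearson correlation with a binned plug-in MI estimate on a purely
non-monotonic relationship (the bin count, noise level, and sample size are
arbitrary illustrative choices):

    import numpy as np

    def mutual_information(x, y, bins=32):
        """Plug-in MI estimate (bits) from a 2-D histogram of the samples."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        joint /= joint.sum()
        px = joint.sum(axis=1, keepdims=True)
        py = joint.sum(axis=0, keepdims=True)
        nz = joint > 0
        return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, 50_000)
    y = x**2 + 0.05 * rng.normal(size=x.size)   # dependent, but not linearly

    print(f"Pearson r = {np.corrcoef(x, y)[0, 1]:+.3f}")      # close to 0
    print(f"MI        = {mutual_information(x, y):.3f} bits")  # clearly > 0

The histogram estimator also illustrates the quote's last point: the MI value
depends on how the densities are estimated (here, on the bin count), whereas the
sample correlation needs no density estimate at all.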

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 47
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
Page 48
junk_scribd.txt
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
Page 49
junk_scribd.txt
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
Page 50
junk_scribd.txt
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.
Page 51
junk_scribd.txt

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
Page 52
junk_scribd.txt
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
Page 53
junk_scribd.txt
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
Page 54
junk_scribd.txt
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
Page 55
junk_scribd.txt
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
Page 56
junk_scribd.txt
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 57
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 58
junk_scribd.txt

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 59
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
Page 60
junk_scribd.txt
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
Page 61
junk_scribd.txt
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
Page 62
junk_scribd.txt
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Page 63
junk_scribd.txt
Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a
few permissible three-note sonorities. In multi-voice contexts, the leading of a
voice is determined even further. As I compose, for instance, I ask: Will the next
note I write down form a consonance with the other voices? If not, is the
dissonance correctly prepared and resolved? What scale degrees and harmonies are
involved? (And the answers to such questions will of course depend on whether the
note is in the bass, soprano, or an inner voice.) But these voice-leading rules are
not arbitrary, for their own sake; they enable the listener to parse the ongoing
musical fabric into meaningful units. They help me to determine "by ear" whether
the next note is in the same voice, or jumps to another in an arpeggiation, or is
ornamental or not, and so forth.

Many composers and analysts have sought some extension or generalization of tonal
voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and
Edward Laufer have attempted to apply linear concepts such as Schenkerian
prolongation to music that appears to have little to do with tonality or even pitch
concentricity.1 Joseph N. Straus and others have, however, called such work into
question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from ...
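The leap rule described at the start of this passage lends itself to a toy melodic
checker. The sketch below uses a deliberately simplified and hypothetical inventory
of permissible leaps, stated in semitones, and encodes only the "step back or leap
on" clause, not the three-note-sonority proviso:

    # Hypothetical leap inventory: m3, M3, P4, P5, P8 (in semitones).
    ALLOWED_LEAPS = {3, 4, 5, 7, 12}

    def is_step(interval):
        """A step is motion by one or two semitones."""
        return 0 < abs(interval) <= 2

    def leap_followed_correctly(leap, following):
        """After a leap, require a step in the opposite direction or
        another permissible leap."""
        step_back = is_step(following) and leap * following < 0
        another_leap = abs(following) in ALLOWED_LEAPS
        return step_back or another_leap

    def melody_ok(pitches):
        """Check every leap in a melody given as MIDI note numbers."""
        motions = [b - a for a, b in zip(pitches, pitches[1:])]
        return all(
            leap_followed_correctly(cur, nxt)
            for cur, nxt in zip(motions, motions[1:])
            if not is_step(cur)
        )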

... in its earlier days, set theory has had an air of the secret society about it,
with admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane,
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from, and how has it managed to become so
dominant?

Set theory emerged in response to the motivic and contextual nature of post-tonal
music. Tonal music uses only a small number of referential sonorities (triads and
seventh chords); post-tonal music presents an extraordinary variety of musical
configurations. Tonal music shares a common practice of harmony and voice leading;
post-tonal music is more highly self-referential: each work defines anew its basic
shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a
new music theory was needed, free of traditional ...
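Of the two pieces of "forbidding vocabulary" mentioned above, the interval vector
at least is easy to compute: for a pitch-class set, it counts how many unordered
pairs lie each interval class (1 through 6) apart. A short sketch, using the
prime form of 6-Z44 as input:

    from itertools import combinations

    def interval_vector(pcs):
        """Interval-class vector: counts of interval classes 1..6 over all
        unordered pairs of pitch classes."""
        vec = [0] * 6
        for a, b in combinations(pcs, 2):
            ic = min((a - b) % 12, (b - a) % 12)   # fold intervals to ic 1..6
            vec[ic - 1] += 1
        return vec

    print(interval_vector([0, 1, 2, 5, 6, 9]))   # 6-Z44 -> [3, 1, 3, 4, 3, 1]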

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
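The "quantified as frequencies ... by comparing sounds with pure tones" step has a
standard concrete form: in twelve-tone equal temperament with the conventional
A4 = 440 Hz reference (an assumption, not something fixed by the passage), a
frequency maps to a possibly fractional MIDI note number, which rounds to the
nearest named pitch:

    import math

    NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def frequency_to_pitch(freq_hz, a4_hz=440.0):
        """Nearest equal-tempered pitch for a frequency, plus the error in cents."""
        midi = 69 + 12 * math.log2(freq_hz / a4_hz)   # A4 is MIDI note 69
        nearest = round(midi)
        cents = 100 * (midi - nearest)                # deviation from that pitch
        name = NOTE_NAMES[nearest % 12] + str(nearest // 12 - 1)
        return name, cents

    print(frequency_to_pitch(261.63))   # ('C4', ~0.0)  -- middle C
    print(frequency_to_pitch(446.0))    # ('A4', ~+23)  -- 23 cents sharp of A4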
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action potentials
to frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
Page 66
junk_scribd.txt
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
Page 67
junk_scribd.txt
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
Page 68
junk_scribd.txt
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 69
junk_scribd.txt
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 70
junk_scribd.txt

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
Page 71
junk_scribd.txt
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Page 72
junk_scribd.txt
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
Page 73
junk_scribd.txt
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

Page 74
junk_scribd.txt
arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted
nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
Page 75
junk_scribd.txt
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
Page 76
junk_scribd.txt
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Page 77
junk_scribd.txt
Mutual information is more general and measures the reduction of uncertainty in Y
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
Page 78
junk_scribd.txt
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
Page 79
junk_scribd.txt
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 80
junk_scribd.txt
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 81
junk_scribd.txt
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 82
junk_scribd.txt
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Page 83
junk_scribd.txt
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
Page 84
junk_scribd.txt
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

Page 85
junk_scribd.txt
A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y after observing X. It is the KL distance between the joint density and the product of the individual densities. So MI can measure non-monotonic relationships and other more complicated relationships.
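
The contrast is easy to demonstrate. In the sketch below (assuming NumPy, SciPy, and scikit-learn; mutual_info_regression is just one convenient MI estimator), Y = X^2 yields near-zero Pearson and Spearman coefficients but clearly positive mutual information:

    import numpy as np
    from scipy.stats import pearsonr, spearmanr
    from sklearn.feature_selection import mutual_info_regression

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 5000)
    y = x ** 2  # non-monotonic but fully deterministic dependence

    print(pearsonr(x, y)[0])   # ~0: no linear relationship
    print(spearmanr(x, y)[0])  # ~0: no monotonic relationship
    print(mutual_info_regression(x.reshape(-1, 1), y)[0])  # > 0: strong dependence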

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of the dynamics of the oscillators through the computation of a statistical similarity measure (SSM). In this work we used three SSMs, namely the absolute value of the cross correlation (also known as Pearson's coefficient) CC, the mutual information MI, and the mutual information of the time series ordinal patterns MIOP [25]. The former is a linear measure and the two latter are non-linear ones.
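
A minimal sketch of this construction for the linear SSM only (assuming NumPy; the coupling strength and threshold below are hypothetical values, and the two non-linear SSMs are omitted): compute the absolute pairwise cross-correlation matrix and threshold it into an adjacency matrix.

    import numpy as np

    def cc_network(series, threshold=0.5):
        """series: (n_oscillators, n_samples). Returns the |CC| SSM and adjacency."""
        ssm = np.abs(np.corrcoef(series))  # pairwise absolute cross-correlation
        np.fill_diagonal(ssm, 0.0)         # ignore self-links
        adjacency = (ssm >= threshold).astype(int)
        return ssm, adjacency

    rng = np.random.default_rng(1)
    data = rng.standard_normal((5, 1000))
    data[1] = 0.8 * data[0] + 0.2 * rng.standard_normal(1000)  # couple nodes 0 and 1
    ssm, adj = cc_network(data)
    print(adj)  # only the coupled pair is linked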

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlation
So the two are not antagonistic; they are complementary, describing different aspects of the association between two random variables. One could comment that Mutual Information "is not concerned" with whether the association is linear or not, while Covariance may be zero and the variables may still be stochastically dependent. On the other hand, Covariance can be calculated directly from a data sample without the need to actually know the probability distributions involved (since it is an expression involving moments of the distribution), while Mutual Information requires knowledge of the distributions, whose estimation, if unknown, is a much more delicate and uncertain task compared to the estimation of Covariance.
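
That asymmetry can be seen directly in code. In the sketch below (assuming NumPy; the bin count is an arbitrary choice, and histogram MI estimates are sensitive to it), covariance is a plain sample moment, while MI first has to estimate the joint distribution:

    import numpy as np

    def sample_covariance(x, y):
        # A moment of the sample: no distributional model needed.
        return np.mean((x - x.mean()) * (y - y.mean()))

    def histogram_mi(x, y, bins=30):
        # MI needs the distributions, here crudely estimated by a 2-D histogram.
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        mask = pxy > 0
        return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

    rng = np.random.default_rng(2)
    x = rng.uniform(-1, 1, 20000)
    y = x ** 2
    print(sample_covariance(x, y))  # ~0 despite full dependence
    print(histogram_mi(x, y))       # clearly positive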

Einojuhani Rautavaara was the leading Finnish composer of his generation.
* His late style combined modernism with mystical romanticism.
* Series of orchestral works inspired by metaphysical and religious subjects.
* Immensely popular recordings on the Ondine label, including the best-selling Symphony No. 7 (Angel of Light) (1995).
* Operas on creative and historic themes, including Vincent (1986-87) and Rasputin (2001-03).
* Widely performed choral works, including Vigilia (1971-72, rev. 1996).
* Works written for leading orchestras on both sides of the Atlantic.


In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (un-interpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from ...
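
The leap rule quoted above is mechanical enough to check by program. A toy sketch over MIDI note numbers; the sets of allowed leap sizes and permissible outlined sonorities are simplified assumptions, not a full counterpoint rulebook:

    def leap_rule_violations(melody,
                             allowed_leaps=frozenset({3, 4, 5, 7, 8, 12}),
                             allowed_outlines=frozenset({(3, 4), (4, 3), (3, 5),
                                                         (5, 3), (4, 5), (5, 4)})):
        """melody: list of MIDI note numbers. Flags indices where the rule fails."""
        violations = []
        for i in range(len(melody) - 2):
            first = melody[i + 1] - melody[i]       # signed interval in semitones
            second = melody[i + 2] - melody[i + 1]
            if abs(first) <= 2:
                continue                   # steps are not constrained here
            if abs(first) not in allowed_leaps:
                violations.append(i)       # leap spans a forbidden interval
            elif abs(second) <= 2 and first * second < 0:
                continue                   # followed by a step in the opposite direction
            elif abs(second) > 2 and (abs(first), abs(second)) in allowed_outlines:
                continue                   # two leaps outlining a permitted sonority
            else:
                violations.append(i)
        return violations

    print(leap_rule_violations([60, 65, 64, 62, 69, 76]))  # flags the 62-69-76 double leap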

In its earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional ...
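
The "interval vector" mentioned here is straightforward to compute: count the interval classes (1 through 6) between every unordered pair of pitch classes in a set. A short editorial sketch with two standard examples:

    from itertools import combinations

    def interval_class_vector(pcset):
        """Count interval classes 1..6 over all unordered pairs of pcs."""
        vector = [0] * 6
        for a, b in combinations(sorted(set(pcset)), 2):
            interval = (b - a) % 12
            ic = min(interval, 12 - interval)  # fold into interval class 1..6
            vector[ic - 1] += 1
        return vector

    print(interval_class_vector({0, 4, 7}))     # major triad -> [0, 0, 1, 1, 1, 0]
    print(interval_class_vector({0, 1, 4, 6}))  # all-interval tetrachord -> [1, 1, 1, 1, 1, 1]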

Bob's Atonal Theory Primer

1. Pitch and pitch-class (pc)

(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers, mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from finite set theory:

(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
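
The primer's definitions map directly onto Python's built-in set type. A small editorial sketch; the numbers in the comments refer to the definitions above:

    AGGREGATE = frozenset(range(12))  # U: the set of all twelve pcs

    def pcs_from_pitches(pitches):
        """(4): pitches (integers, octave-sensitive) map to pcs mod 12."""
        return {p % 12 for p in pitches}

    pset = [60, 64, 67, 72]          # a pset: C4, E4, G4, C5
    pcset = pcs_from_pitches(pset)   # {0, 4, 7}: octave duplications merge

    a = {0, 4, 7}
    b = {7, 11, 2}
    print(a | b)                     # union (9)
    print(a & b)                     # intersection (10): {7}
    print(AGGREGATE - a)             # complement A' (12)
    print(a & (AGGREGATE - a))       # (11): empty set, so A and A' are disjoint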

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Page 109
junk_scribd.txt
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
Page 110
junk_scribd.txt
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
Page 111
junk_scribd.txt
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
Page 112
junk_scribd.txt
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 113
junk_scribd.txt

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 114
junk_scribd.txt

In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a few
permissible three-note sonorities. In multi-voice contexts, the leading of a voice is
determined even further. As I compose, for instance, I ask: Will the next note I
write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved? (And
the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary,
for their own sake; they enable the listener to parse the ongoing musical fabric into
meaningful units. They help me to determine "by ear" whether the next note is in the
same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so
forth. Many composers and analysts have sought some extension or generalization of
tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis,
and Edward Laufer have attempted to apply linear concepts such as Schenkerian
prolongation to music that appears to have little to do with tonality or even pitch
concentricity.1 Joseph N. Straus and others have, however, called such work into
question.2 Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (uninterpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.3 For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from
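
The leap rule quoted above is mechanical enough to check by program. A hedged sketch
for a melody given as MIDI note numbers follows; the set of permitted leap sizes is
an illustrative assumption, and the further requirement that successive leaps
outline a permissible three-note sonority is omitted here.

PERMITTED_LEAPS = {3, 4, 5, 7, 8, 9, 12}  # m3, M3, P4, P5, m6, M6, P8 in semitones

def leaps_resolved(melody):
    # True if every leap is followed by a step back or by another leap.
    intervals = [b - a for a, b in zip(melody, melody[1:])]
    for prev, nxt in zip(intervals, intervals[1:]):
        if abs(prev) <= 2:                 # a step imposes no constraint
            continue
        if abs(prev) not in PERMITTED_LEAPS:
            return False                   # leap to a forbidden interval
        step_back = abs(nxt) <= 2 and (prev > 0) != (nxt > 0)
        another_leap = abs(nxt) > 2
        if not (step_back or another_leap):
            return False
    return True

print(leaps_resolved([60, 64, 62, 61, 60]))  # leap of a third, then steps: True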

From its earlier days, set theory has had an air of the secret society about it,
with admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane,
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of
post-tonal music. Tonal music uses only a small number of referential sonorities
(triads and seventh chords); post-tonal music presents an extraordinary variety of
musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew
its basic shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a
new music theory was needed, free of traditional
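
For readers puzzled by the jargon: an "interval vector" is simply a count of the
interval classes (1 through 6) among all pairs of pitch classes in a set. A short
sketch of the standard computation (mine, not from the quoted text):

from itertools import combinations

def interval_vector(pcs):
    # Count interval classes 1..6 over all pairs of distinct pitch classes.
    vector = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        ic = min((b - a) % 12, (a - b) % 12)
        vector[ic - 1] += 1
    return vector

print(interval_vector([0, 1, 2, 5, 6, 9]))  # Forte's 6-Z44 -> [3, 1, 3, 4, 3, 1]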

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured
to obtain a frequency. It takes a sentient mind to map the internal quality of
pitch. However, pitches are usually associated with, and thus quantified as,
frequencies in cycles per second, or hertz, by comparing sounds with pure tones,
which have periodic, sinusoidal waveforms. Complex and aperiodic sound waves can
often be assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of
pitch. In general, pitch perception theories can be divided into place coding
and temporal coding. Place theory holds that the perception of pitch is
determined by the place of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be
in effect for the perception of high frequencies, since neurons have an upper
limit on how fast they can phase-lock their action potentials.[6] However, a
purely place-based theory cannot account for the accuracy of pitch perception in
the low and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action
potentials to frequencies in a stimulus.
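
As a loose computational analogue of the temporal-coding idea (not a model of
the auditory nerve), one can estimate a pitch-like fundamental from a signal's
periodicity with a plain autocorrelation. Everything below, the test signal
included, is illustrative:

import numpy as np

def estimate_f0(signal, sr, fmin=50.0, fmax=1000.0):
    """Pick the autocorrelation peak inside a plausible pitch range."""
    sig = signal - signal.mean()
    ac = np.correlate(sig, sig, mode="full")[sig.size - 1:]   # lags >= 0
    lo, hi = int(sr / fmax), int(sr / fmin)                   # lag window
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

sr = 44100
t = np.arange(0, 0.05, 1 / sr)
# Two partials sharing one periodicity: the estimate tracks the common period.
tone = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)
print(estimate_f0(tone, sr))   # ~220 Hz, not either partial alone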


In single-voice writing there are "rules" for the way a melody should progress.
In the composition of a cantus firmus in modal counterpoint, for example, a leap
is limited to certain intervals and must be followed either by a step in the
opposite direction or by another leap, provided the two successive leaps outline
one of a few permissible three-note sonorities. In multi-voice contexts, the
leading of a voice is determined even further. As I compose, for instance, I
ask: Will the next note I write down form a consonance with the other voices? If
not, is the dissonance correctly prepared and resolved? What scale degrees and
harmonies are involved? (And the answers to such questions will of course depend
on whether the note is in the bass, soprano, or an inner voice.) But these
voice-leading rules are not arbitrary, for their own sake; they enable the
listener to parse the ongoing musical fabric into meaningful units. They help me
to determine "by ear" whether the next note is in the same voice, or jumps to
another in an arpeggiation, or is ornamental or not, and so forth. Many
composers and analysts have sought some extension or generalization of tonal
voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis,
and Edward Laufer have attempted to apply linear concepts such as Schenkerian
prolongation to music that appears to have little to do with tonality or even
pitch concentricity.[1] Joseph N. Straus and others have, however, called such
work into question.[2] Other theorists have obviated voice-leading as a
criterion for distinguishing linear aspects of pitch structure. For example, in
my own theory of compositional design, ensembles of (un-interpreted) pc
segments, often called lynes, are realized in pitch, time, and other musical
dimensions, using some means of musical articulation to maintain an association
between the components of a given lyne.[3] For instance, a lyne might be
associated with a register, an instrument, a dynamic level, a mode of
articulation, or any combination of these, thereby separating it out from
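
The leap rule described above is mechanical enough to sketch in code. The
permitted leap sizes and "permissible three-note sonorities" below are
illustrative placeholders (the text does not enumerate them), so this is a toy
checker, not a statement of counterpoint practice:

ALLOWED_LEAPS = {3, 4, 5, 7, 8, 12}   # semitones: 3rds, 4ths, 5th, m6, octave
TRIAD_OUTLINES = {(3, 4), (4, 3), (3, 5), (5, 3), (4, 5), (5, 4)}  # stacked 3rds/4ths

def leap_rule_ok(melody):
    """melody: list of MIDI note numbers; check each leap's continuation."""
    for i in range(1, len(melody) - 1):
        prev = melody[i] - melody[i - 1]
        nxt = melody[i + 1] - melody[i]
        if abs(prev) <= 2:                     # a step, nothing to check
            continue
        if abs(prev) not in ALLOWED_LEAPS:     # forbidden leap size
            return False
        step_back = abs(nxt) <= 2 and nxt * prev < 0   # step, opposite direction
        second_leap = (abs(prev), abs(nxt)) in TRIAD_OUTLINES and nxt != 0
        if not (step_back or second_leap):
            return False
    return True

print(leap_rule_ok([60, 64, 62]))   # leap of a 3rd, then a step back -> True
print(leap_rule_ok([60, 64, 67]))   # two leaps outlining a triad     -> True
print(leap_rule_ok([60, 66, 68]))   # tritone leap                    -> False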

Einojuhani Rautavaara was the leading Finnish composer of his generation.

* His late style combined modernism with mystical romanticism
* Series of orchestral works inspired by metaphysical and religious subjects
* Immensely popular recordings on the Ondine label, including the best-selling Symphony No. 7 (Angel of Light) (1995)
* Operas on creative and historic themes, including Vincent (1986-87) and Rasputin (2001-03)
* Widely performed choral works, including Vigilia (1971-72, rev. 1996)
* Works written for leading orchestras on both sides of the Atlantic

From its earlier days, set theory has had an air of the secret society about it,
with admission granted only to those who possess the magic password, a
forbidding technical vocabulary bristling with expressions like "6-Z44" and
"interval vector." It has thus often appeared to the uninitiated as the sterile
application of arcane, mathematical concepts to inaudible and uninteresting
musical relationships. This situation has created understandable frustration
among musicians, and the frustration has grown as discussions of
twentieth-century music in the professional theoretical literature have come to
be expressed almost entirely in this unfamiliar language. Where did this theory
come from and how has it managed to become so dominant? Set theory emerged in
response to the motivic and contextual nature of post-tonal music. Tonal music
uses only a small number of referential sonorities (triads and seventh chords);
post-tonal music presents an extraordinary variety of musical configurations.
Tonal music shares a common practice of harmony and voice leading; post-tonal
music is more highly self-referential: each work defines anew its basic shapes
and modes of progression. In tonal music, motivic relationships are constrained
by the norms of tonal syntax; in post-tonal music, motives become independent
and function as primary structural determinants. In this situation, a new music
theory was needed, free of traditional
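
Of the two "passwords" just quoted, the interval vector is easy to demystify in
code: it simply counts, for a set of pitch classes, how many pairs fall into
each of the six interval classes. A minimal sketch, assuming [0, 1, 2, 5, 6, 9]
as the usual prime form of 6-Z44:

from itertools import combinations

def interval_vector(pcs):
    """Count interval classes 1..6 over all pairs of pitch classes."""
    vec = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        ic = min((b - a) % 12, (a - b) % 12)   # interval class of the pair
        vec[ic - 1] += 1
    return vec

print(interval_vector([0, 1, 2, 5, 6, 9]))   # -> [3, 1, 3, 4, 3, 1]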

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.
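
Pitch's relation to frequency can at least be made concrete for the
equal-tempered scale: with A4 = 440 Hz as reference, a frequency maps to a
(possibly fractional) scale position. A small sketch of that mapping (the
helper name is mine):

import math

def nearest_equal_tempered_pitch(freq_hz, a4=440.0):
    """Nearest 12-TET MIDI note for a frequency, plus the offset in cents."""
    midi = 69 + 12 * math.log2(freq_hz / a4)   # fractional scale position
    nearest = round(midi)
    return nearest, 100 * (midi - nearest)

print(nearest_equal_tempered_pitch(261.63))   # ~(60, 0 cents): middle C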

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
Page 151
junk_scribd.txt
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In
the composition of a cantus firmus in modal counterpoint, for example, a leap is
limited to certain intervals and must be followed either by a step in the opposite
direction or by another leap, provided the two successive leaps outline one of a few
permissible three-note sonorities. In multi-voice contexts, the leading of a voice is
determined even further. As I compose, for instance, I ask: Will the next note I
write down form a consonance with the other voices? If not, is the dissonance
correctly prepared and resolved? What scale degrees and harmonies are involved? (And
the answers to such questions will of course depend on whether the note is in the
bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary,
for their own sake; they enable the listener to parse the ongoing musical fabric into
meaningful units. They help me to determine "by ear" whether the next note is in the
same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so
forth. Many composers and analysts have sought some extension or generalization of
tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis,
and Edward Laufer have attempted to apply linear concepts such as Schenkerian
prolongation to music that appears to have little to do with tonality or even pitch
concentricity.[1] Joseph N. Straus and others have, however, called such work into
question.[2] Other theorists have obviated voice-leading as a criterion for
distinguishing linear aspects of pitch structure. For example, in my own theory of
compositional design, ensembles of (un-interpreted) pc segments, often called lynes,
are realized in pitch, time, and other musical dimensions, using some means of
musical articulation to maintain an association between the components of a given
lyne.[3] For instance, a lyne might be associated with a register, an instrument, a
dynamic level, a mode of articulation, or any combination of these, thereby
separating it out from
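
The leap rule quoted above is mechanical enough to sketch in code. A hypothetical
checker, assuming melodies given as MIDI note numbers and treating any interval
larger than two semitones as a leap; the permissible-sonority condition on double
leaps is left out:

```python
def leap_rule_violations(melody):
    """Return indices where a leap is not followed by a step in the
    opposite direction or by another leap."""
    bad = []
    for i in range(len(melody) - 2):
        first = melody[i + 1] - melody[i]
        nxt = melody[i + 2] - melody[i + 1]
        if abs(first) > 2:                       # a leap...
            steps_back = abs(nxt) <= 2 and nxt * first < 0
            another_leap = abs(nxt) > 2          # (would still need a sonority check)
            if not (steps_back or another_leap):
                bad.append(i)
    return bad

print(leap_rule_violations([60, 65, 64, 62, 60]))  # leap up, step back down: []
print(leap_rule_violations([60, 65, 67, 69, 67]))  # leap then step onward: [0]
```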
Forte is well known for his book The Structure of Atonal Music (1973), which traces
many of its roots to an article of a decade earlier: "A Theory of Set-Complexes for
Music" (1964).[6] In these works, he "applied set-theoretic principles to the
analysis of unordered collections of pitch classes, called pitch-class sets (pc
sets). [...] The basic goal of Forte's theory was to define the various
relationships that existed among the relevant sets of a work, so that contextual
coherence could be demonstrated." Although the methodology derived from Forte's work
"has had its detractors ... textbooks on post-tonal analysis now routinely teach it
(to varying degrees)."[7]

Forte published analyses of the works of Webern and Berg and wrote about Schenkerian
analysis and music of the Great American Songbook. A complete, annotated
bibliography of his publications appears in the previously cited article, Berry,
"The Twin Legacies of a Scholar-Teacher." Excluding items only edited by Forte, it
lists ten books, sixty-three articles, and thirty-six other types of publications,
from 1955 through early 2009.

Forte was also the editor of the Journal of Music Theory during an important period
in its development, from volume 4/2 (1960) through 11/1 (1967). His involvement with
the journal, including many biographical details, is addressed in David Carson
Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music
Theory 50/1 (2006): 7-23.

Honors and Awards

He was honored by two Festschriften (homage volumes). The first, in
commemoration of his seventieth birthday, was published in 1997 and edited by his
former students James M. Baker, David W. Beach, and Jonathan W. Bernard (FA12, FA6,
and FA11, according to Berry's list). It was titled Music Theory in Concept and
Practice (a title derived from Forte's 1962 undergraduate textbook, Tonal Harmony in
Concept and Practice). The second was serialized in five installments of Gamut: The
Journal of the Music Theory Society of the Mid-Atlantic, between 2009 and 2013. It
was edited by Forte's former student David Carson Berry (FA72) and was titled A
Music-Theoretical Matrix: Essays in Honor of Allen Forte (a title derived from
Forte's 1961 monograph, A Compositional Matrix). It included twenty-two articles by
Forte's former doctoral advisees, and three special features: a previously
unpublished article by Forte, on Gershwin songs; a collection of tributes and
reminiscences from forty-two of his former advisees; and an annotated register of
his publications and advisees.

Personal life

Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita
professor of piano at Boise State University.

Bibliography (Books only)

(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia
Univ. Teachers College.
In earlier days, set theory has had an air of the secret society about it, with
admission granted only to those who possess the magic password, a forbidding
technical vocabulary bristling with expressions like "6-Z44" and "interval vector."
It has thus often appeared to the uninitiated as the sterile application of arcane,
mathematical concepts to inaudible and uninteresting musical relationships. This
situation has created understandable frustration among musicians, and the
frustration has grown as discussions of twentieth-century music in the professional
theoretical literature have come to be expressed almost entirely in this unfamiliar
language. Where did this theory come from and how has it managed to become so
dominant? Set theory emerged in response to the motivic and contextual nature of
post-tonal music. Tonal music uses only a small number of referential sonorities
(triads and seventh chords); post-tonal music presents an extraordinary variety of
musical configurations. Tonal music shares a common practice of harmony and voice
leading; post-tonal music is more highly self-referential: each work defines anew
its basic shapes and modes of progression. In tonal music, motivic relationships are
constrained by the norms of tonal syntax; in post-tonal music, motives become
independent and function as primary structural determinants. In this situation, a
new music theory was needed, free of traditional
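
The "interval vector" mentioned above is straightforward to compute: count the
interval classes (1 through 6) between all unordered pairs of a pitch-class set. A
small sketch, with 6-Z44 as a test case; its prime form is taken here as
[0,1,2,5,6,9], which should be verified against a Forte table:

```python
from itertools import combinations

def interval_class_vector(pcs):
    """Interval-class vector of a pitch-class set: counts of interval
    classes 1-6 over all unordered pairs."""
    icv = [0] * 6
    for a, b in combinations(sorted(set(pcs)), 2):
        interval = (b - a) % 12
        ic = min(interval, 12 - interval)   # fold into interval class 1..6
        icv[ic - 1] += 1
    return icv

print(interval_class_vector([0, 1, 2, 5, 6, 9]))  # [3, 1, 3, 4, 3, 1], i.e. <313431>
```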

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
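
The quantification described here, comparing a sound against equal-tempered
pure-tone references, amounts to rounding on a log-frequency scale. A sketch
assuming A4 = 440 Hz and 12-tone equal temperament:

```python
import math

A4 = 440.0  # reference pure tone, Hz

def nearest_pitch(freq_hz):
    """Quantize a frequency to the nearest 12-TET pitch (name + MIDI number)."""
    midi = round(69 + 12 * math.log2(freq_hz / A4))
    names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    return names[midi % 12] + str(midi // 12 - 1), midi

print(nearest_pitch(261.63))  # ('C4', 60)
print(nearest_pitch(445.0))   # ('A4', 69) - close to, but not exactly, 440 Hz
```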
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of
action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
Page 155
junk_scribd.txt
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

Page 156
junk_scribd.txt
https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

Page 157
junk_scribd.txt
https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

Page 158
junk_scribd.txt
https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Page 159
junk_scribd.txt
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
Page 160
junk_scribd.txt
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
Page 161
junk_scribd.txt
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


Page 162
junk_scribd.txt
after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

httpsnsingle-voicewritingthere are "rules" for thewayamelodyshouldprogress.In


thecompositionof a cantus firmusinmodalcounterpoint,forexample,aleapislimitedto
certainintervalsandmust be followed
eitherbyastepintheoppositedirectionorbyanotherleap,providedthe two
successiveleapsoutline oneof afewpermissiblethree-note
sonorities.Inmulti-voicecontexts,theleadingofavoiceisdeterminedevenfurther.As
Icompose,forinstance,I ask: Will the nextnote Iwritedownform a consonancewith
theothervoices?Ifnot,is the dissonancecorrectly preparedand resolved?What
scaledegreesand harmonies are involved?(Andtheanswers to suchquestionswill of
coursedependonwhetherthe note isin
thebass,soprano,oraninnervoice.)Butthesevoice-leadingrules are notarbitrary,fortheir
ownsake;theyenablethelistenertoparsetheongoingmusicalfabricintomeaningfulunits.They
helpmetodetermine"byear"whether the next note isin thesamevoice,orjumpstoanother in
anarpeggiation,or is ornamental
ornot,andsoforth.Manycomposersandanalystshavesoughtsome extensionorgeneralizationof
tonalvoice-leadingfor non-tonalmusic.Analystssuch
asFelixSalzer,RoyTravis,andEdwardLauferhaveattemptedtoapplylinearconceptssuch
asSchenkerianprolongationto music thatappearstohave littleto do withtonalityor
evenpitchconcentricity.lJosephN.Straus andothers have however called such work
intoquestion.2Othertheorists have obviatedvoice-leadingasacriterion for
Page 163
junk_scribd.txt
dis-tinguishinginearaspectsofpitchstructure. Forexample,inmyowntheoryofcompositional
design,ensembles of(un-interpreted) pcsegments,often calledlynes,are realized
inpitch,time,and other musicaldimensions,usingsome meansof musicalarticulation o
maintainanassociation between
thecomponentsofagivenlyne.3Forinstance,alynemightbeassociatedwitharegister,aninstrum
ent,adynamiclevel,amodeofarticulation,oranycombination ofthese,therebyseparatingtout
from

arlierdays,settheoryhas had an air ofthesecretsocietyaboutit,withadmissiongranted


nlyto thosewhopossessthemagic password,for-biddingtechnicalvocabulary
bristlingwithexpressionslike "6-Z44"and"intervalvector."t has thusoftenppearedto the
uninitiated s thesterileapplicationofarcane,mathematicalonceptsto inaudible
anduninterestingmusicalrelationships.Thissituation has created
understandablerustrationamongmusicians,nd the
frustrationasgrownasdiscussionsoftwentieth-centurymusicintheprofessionaltheoreticali
teraturehave come tobeexpressedalmostentirelynthis unfamiliaranguage.Wheredid
thistheoryomefrom nd how has itmanagedto become sodominant?ettheorymergednresponseto
the motivicand contextualna-tureofpost-tonalmusic.Tonalmusic usesonlyasmall number f
referentialsonoritiestriadsandseventhchords);post-tonalmusicpresentsnextraor-dinarya
rietyf
musicalconfigurations.onalmusicsharescommonracticeofharmonyndvoiceleading;post-tonal
music is morehighlyelf-referen-tialeachwork definesanewits basicshapesand
modesofprogression.ntonalmusic,motivicrelationshipsre constrainedythenormsof
tonalsyn-tax;inpost-tonalmusic,motivesbecomeindependentnd function
sprimarystructural eterminants.nthissituation,newmusictheorywasneeded,freeof
traditionalo

://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.
Screen reader users, click here to turn off Google Instant...
Google

sheet Einojuhani Rautavaara - Etudes

AllVideosImagesShoppingMapsMore
SettingsTools
Page 164
junk_scribd.txt
Page 7 of about 17,300 results (0.63 seconds)
Search Results
Rautavaara's Riverboat - Good-Music-Guide.com
www.good-music-guide.com ... The Music Room Composer Discussion
May 1, 2007 - 20 posts - 7 authors
Rautavaara's Riverboat. ... 2007, 11:03:53 AM . Any composer named Einojuhani
deserves a separate thread . . . . Logged ... His Etudes and Icons are also amazing,
and his Piano Sonatas 1 and 2 are wonderful. Narcissus is also ... Anyone know where
I could get some of his piano sheet music? Logged ...
Download link Youtube: Einojuhani Rautavaara - Etudes (1969)
igetlinkyoutube.com/watch?v=nvZ1dzZry1w
Composer: Einojuhani Rautavaara (October 9, 1928 July 27, 2016) Pianist: Laura ...
Download youtube to mp3: Einojuhani Rautavaara - Etudes (1969) ..... to mp3: Hamelin
plays Gershwin - Songbook (18 Songs) Audio + Sheet Music.
99.5 | New Releases - WGBH
www.wgbh.org/995/newandnotablecds.cfm
Visit Augustin Hadelich's site for more information, and to download sheet music for
cadenzas ... I have most savored by pianist Mutsuko Uchida features the etudes by
Claude Debussy. ... The Helsinki Philharmonic and Einojuhani Rautavaara
Einojuhani Rautavaara Etudes 1969.mp3 Play online
mp3top.online/play/einojuhani-rautavaara-etudes-1969/nvZ1dzZry1w.html
Einojuhani Rautavaara. Einojuhani Rautavaara - Piano Concerto No 1 (1969).mp3 ...
Hamelin plays Gershwin - Songbook (18 Songs) Audio + Sheet Music.mp3.
Buy Sheet Music VIOLIN - FIDDLE - INSTRUCTIONAL : STUDIES ...
m.buy-scores.com/boutique-search-engine-uk.php?search=&CATEGORIE...
Etude Methodique De La Double Corde Volume 2. Details. Details ... Piano solo [Sheet
music] ABRSM Publishing .... By Einojuhani Rautavaara. For Violin.
Schulhoff - 5 Etudes de Jazz Video Download MP4 3GP FLV - YiFlix ...
www.yiflix.com Music
Mar 24, 2013 - Hamelin plays Gershwin - Songbook (18 Songs) Audio + Sheet Music 28
Jun 1220:02 ... Einojuhani Rautavaara - Etudes (1969) 19 Apr 1512: ...
Einojuhani Rautavaara - Etudes (1969)|phim hot nhat
phimhotnhat.net/.../video-einojuhani-rautavaara-etudes-1969.nvZ1...
Translate this page
Composer: Einojuhani Rautavaara (October 9, 1928 July 27, 2016)Pianist: Laura
Mikkola00:03 Etude I - Thirds03:21 Etude II - Sevenths04:26 Etude III ...
[PDF]Download pdf file - Modern Accordion Perspectives
www.modernaccordionperspectives.com/Publications_files/MAP2.pdf
Etude II (2009). (Gesualdi). Juan-Jos Mosalini ... Three Etudes (2000). (Olczak).
Younghi Pagh-Paan ... Einojuhani Rautavaara (Finland). Fiddlers (1952-1991).
rautavaara fire sermon pdf - Findeen.com
www.findeen.co.uk Search Directory
... "The Fire Sermon" sheet music - piano sheet music by Einojuhani Rautavaara: ...
2 The Fire Sermon: Rautavaara: 15: original: pdf: 4 years: 6 Etudes for Piano: ...
John Luther Adams - Nunataks (Solitary Peaks) for Piano (2007 ...
1tvprograma.ru/prosmotr/MnJzM0tuN3lFU2s/
Translate this page
... grandeur, the sudden rise to meet each peak (there are ten) and the slow descent
Page 165
junk_scribd.txt
to the vast ice sheet afterwards. ... Einojuhani Rautavaara - Etudes (1969).
Previous
2
3
4
5
6
7
8
9
10
11
Next
etween-correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
Page 166
junk_scribd.txt
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
Page 167
junk_scribd.txt
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
Page 168
junk_scribd.txt
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https
https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

https waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
Page 169
junk_scribd.txt
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of


action potentials, mostly the phase-locking and mode-locking of action potentials to
frequencies in a stimulus.

https://stats.stackexchange.com/questions/1052/what-is-the-major-difference-between-
correlation-and-mutual-information
Correlation measures the linear relationship (Pearson's correlation) or monotonic
relationship (Spearman's correlation) between two variables, X and Y.

Mutual information is more general and measures the reduction of uncertainty in Y


after observing X. It is the KL distance between the joint density and the product
of the individual densities. So MI can measure non-monotonic relationships and other
more complicated relationships\

https://www.nature.com/articles/srep10829
The construction of the functional network is based on evaluating the similarity of
the dynamics of the oscillators through the computation of a statistical similarity
measure (SSM). In this work we used three SSMs, namely the absolute value of the
cross correlation (also known as Pearsons coefficient) CC, the mutual information
MI and the mutual information of the time series ordinal patterns MIOP25. The former
is a linear measure and the two latter are non-linear ones.

https://stats.stackexchange.com/questions/81659/mutual-information-versus-correlatio
n
So the two are not antagonisticthey are complementary, describing different aspects
of the association between two random variables. One could comment that Mutual
Information "is not concerned" whether the association is linear or not, while
Covariance may be zero and the variables may still be stochastically dependent. On
the other hand, Covariance can be calculated directly from a data sample without the
need to actually know the probability distributions involved (since it is an
expression involving moments of the distribution), while Mutual Information requires
Page 170
junk_scribd.txt
knowledge of the distributions, whose estimation, if unknown, is a much more
delicate and uncertain work compared to the estimation of Covariance

In single-voice writing there are "rules" for the way a melody should progress. In the composition of a cantus firmus in modal counterpoint, for example, a leap is limited to certain intervals and must be followed either by a step in the opposite direction or by another leap, provided the two successive leaps outline one of a few permissible three-note sonorities. In multi-voice contexts, the leading of a voice is determined even further. As I compose, for instance, I ask: Will the next note I write down form a consonance with the other voices? If not, is the dissonance correctly prepared and resolved? What scale degrees and harmonies are involved? (And the answers to such questions will of course depend on whether the note is in the bass, soprano, or an inner voice.) But these voice-leading rules are not arbitrary, for their own sake; they enable the listener to parse the ongoing musical fabric into meaningful units. They help me to determine "by ear" whether the next note is in the same voice, or jumps to another in an arpeggiation, or is ornamental or not, and so forth. Many composers and analysts have sought some extension or generalization of tonal voice-leading for non-tonal music. Analysts such as Felix Salzer, Roy Travis, and Edward Laufer have attempted to apply linear concepts such as Schenkerian prolongation to music that appears to have little to do with tonality or even pitch concentricity.1 Joseph N. Straus and others have, however, called such work into question.2 Other theorists have obviated voice-leading as a criterion for distinguishing linear aspects of pitch structure. For example, in my own theory of compositional design, ensembles of (uninterpreted) pc segments, often called lynes, are realized in pitch, time, and other musical dimensions, using some means of musical articulation to maintain an association between the components of a given lyne.3 For instance, a lyne might be associated with a register, an instrument, a dynamic level, a mode of articulation, or any combination of these, thereby separating it out from
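
The cantus-firmus rule quoted above is mechanical enough to check in code. A minimal sketch (the semitone encoding, the step/leap threshold, and the omission of the three-note-sonority proviso are all simplifying assumptions):

def leap_resolved(melody):
    """Check one rule: a leap (here, more than 2 semitones) must be followed
    either by a step in the opposite direction or by another leap."""
    for a, b, c in zip(melody, melody[1:], melody[2:]):
        first, second = b - a, c - b
        if abs(first) > 2:                       # the interval is a leap
            step_back = abs(second) <= 2 and first * second < 0
            another_leap = abs(second) > 2       # sonority check omitted
            if not (step_back or another_leap):
                return False
    return True

print(leap_resolved([60, 67, 65, 64]))   # leap up, step back down: True
print(leap_resolved([60, 67, 69, 71]))   # leap up, step onward: False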

…in its earlier days, set theory has had an air of the secret society about it, with admission granted only to those who possess the magic password, a forbidding technical vocabulary bristling with expressions like "6-Z44" and "interval vector." It has thus often appeared to the uninitiated as the sterile application of arcane, mathematical concepts to inaudible and uninteresting musical relationships. This situation has created understandable frustration among musicians, and the frustration has grown as discussions of twentieth-century music in the professional theoretical literature have come to be expressed almost entirely in this unfamiliar language. Where did this theory come from and how has it managed to become so dominant? Set theory emerged in response to the motivic and contextual nature of post-tonal music. Tonal music uses only a small number of referential sonorities (triads and seventh chords); post-tonal music presents an extraordinary variety of musical configurations. Tonal music shares a common practice of harmony and voice leading; post-tonal music is more highly self-referential: each work defines anew its basic shapes and modes of progression. In tonal music, motivic relationships are constrained by the norms of tonal syntax; in post-tonal music, motives become independent and function as primary structural determinants. In this situation, a new music theory was needed, free of traditional
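
The "forbidding" terms the passage mentions are less forbidding in code. An interval vector (more precisely, the interval-class vector) just counts, for every pair of pcs in a set, the interval class 1-6 between them; the sketch below applies it to the prime form usually given for the set-class 6-Z44 named above:

from itertools import combinations

def interval_class_vector(pcset):
    """Count interval classes 1..6 over all unordered pairs of a pcset."""
    vector = [0] * 6
    for a, b in combinations(sorted(set(pcset)), 2):
        ic = min((b - a) % 12, (a - b) % 12)   # interval class of the pair
        vector[ic - 1] += 1
    return vector

print(interval_class_vector([0, 1, 2, 5, 6, 9]))   # -> [3, 1, 3, 4, 3, 1]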

https://en.wikipedia.org/wiki/Pitch_(music)
Pitch is a perceptual property of sounds that allows their ordering on a
frequency-related scale
...
Pitch is an auditory sensation in which a listener assigns musical tones to relative
positions on a musical scale based primarily on their perception of the frequency of
vibration.[6] Pitch is closely related to frequency, but the two are not equivalent.
Frequency is an objective, scientific attribute that can be measured. Pitch is each
person's subjective perception of a sound wave, which cannot be directly measured.
However, this does not necessarily mean that most people won't agree on which notes
are higher and lower.

Sound waves themselves do not have pitch, but their oscillations can be measured to
obtain a frequency. It takes a sentient mind to map the internal quality of pitch.
However, pitches are usually associated with, and thus quantified as frequencies in
cycles per second, or hertz, by comparing sounds with pure tones, which have
periodic, sinusoidal waveforms. Complex and aperiodic sound waves can often be
assigned a pitch by this method.
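
That comparison with pure tones amounts to a logarithmic mapping from frequency to a note number. A one-line version (12-tone equal temperament with A4 = 440 Hz as reference, and the MIDI convention A4 = 69; both are assumptions the text itself does not fix):

import math

def nearest_note(freq_hz, a4=440.0):
    """Map a frequency to the nearest equal-tempered note number (MIDI
    convention) and the remaining deviation in cents."""
    exact = 69 + 12 * math.log2(freq_hz / a4)
    nearest = round(exact)
    return nearest, 100 * (exact - nearest)

print(nearest_note(261.63))   # ~(60, 0): middle C
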
...
Theories of pitch perception try to explain how the physical sound and specific
physiology of the auditory system work together to yield the experience of pitch. In
general, pitch perception theories can be divided into place coding and temporal
coding. Place theory holds that the perception of pitch is determined by the place
of maximum excitation on the basilar membrane.

A place code, taking advantage of the tonotopy in the auditory system, must be in
effect for the perception of high frequencies, since neurons have an upper limit on
how fast they can phase-lock their action potentials.[6] However, a purely
place-based theory cannot account for the accuracy of pitch perception in the low
and middle frequency ranges.

Temporal theories offer an alternative that appeals to the temporal structure of action potentials, mostly the phase-locking and mode-locking of action potentials to frequencies in a stimulus.
Forte is well known for his book The Structure of Atonal Music (1973), which traces
many of its roots to an article of a decade earlier: "A Theory of Set-Complexes for
Music" (1964).[6] In these works, he "applied set-theoretic principles to the
analysis of unordered collections of pitch classes, called pitch-class sets (pc
sets). [...] The basic goal of Forte's theory was to define the various
relationships that existed among the relevant sets of a work, so that contextual
coherence could be demonstrated." Although the methodology derived from Forte's work
"has had its detractors ... textbooks on post-tonal analysis now routinely teach it
(to varying degrees)."[7]

Forte published analyses of the works of Webern and Berg and wrote about Schenkerian
analysis and music of the Great American Songbook. A complete, annotated
bibliography of his publications appears in the previously cited article, Berry,
"The Twin Legacies of a Scholar-Teacher." Excluding items only edited by Forte, it
lists ten books, sixty-three articles, and thirty-six other types of publications, from 1955 through early 2009.

Forte was also the editor of the Journal of Music Theory during an important period
in its development, from volume 4/2 (1960) through 11/1 (1967). His involvement with
the journal, including many biographical details, is addressed in David Carson
Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music
Theory 50/1 (2006): 7-23.

Honors and Awards


He has been honored by two Festschriften (homage volumes). The first, in
commemoration of his seventieth birthday, was published in 1997 and edited by his
former students James M. Baker, David W. Beach, and Jonathan W. Bernard (FA12, FA6,
and FA11, according to Berry's list). It was titled Music Theory in Concept and
Practice (a title derived from Forte's 1962 undergraduate textbook, Tonal Harmony in
Concept and Practice). The second was serialized in five installments of Gamut: The
Journal of the Music Theory Society of the Mid-Atlantic, between 2009 and 2013. It
was edited by Forte's former student David Carson Berry (FA72) and was titled A
Music-Theoretical Matrix: Essays in Honor of Allen Forte (a title derived from
Forte's 1961 monograph, A Compositional Matrix). It included twenty-two articles by
Forte's former doctoral advisees, and three special features: a previously
unpublished article by Forte, on Gershwin songs; a collection of tributes and
reminiscences from forty-two of his former advisees; and an annotated register of
his publications and advisees.

Personal life
Forte was married to the French-born pianist Madeleine (Hsu) Forte, emerita
professor of piano at Boise State University.

Bibliography (Books only)


(1955) Contemporary Tone-Structures. New York: Bureau of Publications, Columbia
Univ. Teachers College.
(1961) The Compositional Matrix. Baldwin, NY: Music Teachers National Assoc.
(1962) Tonal Harmony in Concept and Practice (3rd ed., 1979). New York: Holt,
Rinehart and Winston.
(1967) SNOBOL3 Primer: An Introduction to the Computer Programming Language.
Cambridge, MA: MIT Press.
(1973) The Structure of Atonal Music. New Haven: Yale Univ. Press.
(1978) The Harmonic Organization of The Rite of Spring. New Haven: Yale Univ. Press.
(1982) Introduction to Schenkerian Analysis (with Steven E. Gilbert). New York: W.
W. Norton.
(1995) The American Popular Ballad of the Golden Era: 1924-1950. Princeton:
Princeton Univ. Press.
(1998) The Atonal Music of Anton Webern. New Haven: Yale Univ. Press.
(2001) Listening to Classic American Popular Songs. New Haven: Yale Univ. Press.
See also
Forte number
References
[1] "In memoriam Allen Forte, music theorist." news.yale.edu, October 17, 2014.
[2] http://news.yale.edu/2014/10/17/memoriam-allen-forte-music-theorist
[3] Allen Forte, "Secrets of Melody: Line and Design in the Songs of Cole Porter," Musical Quarterly 77/4 (1993), unnumbered note on 644-645.
[4] David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 8.
[5] David Carson Berry, "Journal of Music Theory under Allen Forte's Editorship," Journal of Music Theory 50/1 (2006), 9-10; and Berry, "Our Festschrift for Allen: An Introduction and Conclusion," in A Music-Theoretical Matrix: Essays in Honor of Allen Forte (Part V), ed. David Carson Berry, Gamut 6/2 (2013), 3.
[6] Allen Forte, "A Theory of Set-Complexes for Music," Journal of Music Theory 8/2 (1964): 136-183.
[7] David Carson Berry, "Theory," sect. 5.iv (Pitch-class set theory), in The Grove Dictionary of American Music, 2nd edition, ed. Charles Hiroshi Garrett (New York: Oxford University Press, 2013), 8:175-176.
Set Theory Primer

Pitch and pitch-class (pc)
(1) Pitch space: a linear series of pitches (semitones) from low to high, modeled by integers.
(2) Sets of pitches (called psets) are selections from the set of pitches; they are unordered in time.
(3) Pc space: a circle of pitch-classes (no lower or higher relations), modeled by integers mod 12 (see below).
(4) Pcs are related to the pitches by taking the latter mod 12. Pitches related by any number of octaves map to the same pitch-class.
(5) Sets of pcs (called pcsets) are selections from the set of pcs; they are unordered in time (and pitch).
(6) Pcsets must be realized (or represented or articulated) by pitches. To realize a pcset in music, it must be ordered in pitch and in time. Every musical articulation of a pcset produces a contour. Many different psets may represent one pcset. Pcsets may model melodies, harmonies, mixed textures, etc.

Definitions from Finite Set Theory
(6) The set of all the pcs is called the aggregate and is denoted by the letter U; the set of no pcs is called the empty or null set, and is denoted by the sign ∅.
(7) Membership: if a is a member (or element) of the set B, we write a ∈ B.
(8) Inclusion: if A and B are sets and A is contained in B, we write A ⊆ B.
(9) The union of two sets A and B (written A ∪ B) is the content of both of them.
(10) The intersection of two sets A and B is their common elements (written A ∩ B).
(11) Two sets are disjoint if their intersection is ∅.
(12) B is the complement of A if B contains all elements of U not in A. We show the complement of A by A′. NB: A ∩ A′ = ∅ (A and A′ are disjoint).
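
These definitions map directly onto a general-purpose set type; a minimal sketch in Python (the pc numbering 0-11 follows the mod-12 definition above; the example chords are arbitrary):

AGGREGATE = set(range(12))            # U, the aggregate of all twelve pcs

def pcset(pitches):
    """Send pitches (integers in semitones) to pitch classes mod 12."""
    return {p % 12 for p in pitches}

A = pcset([60, 64, 67, 76])           # 76 and 64 are octave-related: one pc
B = pcset([62, 65, 69])

print(A)                              # {0, 4, 7}
print(A | B)                          # union A ∪ B
print(A & B)                          # intersection A ∩ B (empty: disjoint)
print(AGGREGATE - A)                  # complement A'
print((A & (AGGREGATE - A)) == set()) # A ∩ A' = ∅, as in (12)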
