J Harmonic Entropy

Harmonic entropy is a measure of the uncertainty in pitch perception, and it provides a physical correlate of tonalness, one aspect of the psychoacoustic concept of dissonance. This appendix shows in detail how to calculate harmonic entropy and continues the discussion in Sect. 5.3.3.

Harmonic entropy was introduced by Erlich [W: 9] as a refinement of a model by van Eck [B: 125]. It is based on Terhardt's [B: 196] theory of harmony, and it follows in the tradition of Rameau's fundamental bass [B: 145]. It provides a way to measure the uncertainty of the fit of a harmonic template to a complex sound spectrum. Because a major component of tonalness is the closeness of the partials of a complex sound to a harmonic series, high tonalness corresponds to low entropy and low tonalness corresponds to high entropy.

In the simplest case, consider two harmonic tones. If the tones are to be understood as approximate harmonic overtones of some common root, they must form a simple-integer ratio with one another. One way to model this uses the Farey series F_n of order n, which lists all ratios of integers up to n. For example, F_5 is

  0/1, 1/5, 1/4, 1/3, 2/5, 1/2, 3/5, 2/3, 3/4, 4/5, 1/1.
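These ratios can be enumerated mechanically. As a concrete illustration (a sketch, not part of the original text; the function name is ours), Python's `fractions` module makes the construction compact:

```python
from fractions import Fraction

def farey(n):
    """Farey series of order n: all reduced fractions a/b with
    0 <= a/b <= 1 and denominator b <= n, in ascending order.
    (Fraction reduces a/b automatically; the set removes duplicates.)"""
    return sorted({Fraction(a, b) for b in range(1, n + 1)
                   for a in range(b + 1)})
```

For example, `farey(5)` returns the eleven ratios from 0/1 up to 1/1.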
A useful property of the Farey series is that the distance between successive terms is larger when the ratios are simpler. Let the jth element of the series be f_j. Then the region over which f_j dominates extends from the mediant(1) below to the mediant above, that is, from mediant(f_{j-1}, f_j) to mediant(f_j, f_{j+1}). Designate this region r_j. Figure J.1 plots the length of r_j vs. f_j for F_5, the Farey series of order 5. Observe that complex ratios cluster together and that the simple ratios tend to separate. Thus, simple ratios like 1/2, 2/3, and 3/4 have wide regions with large r_j, and complex ratios tend to have small regions with small r_j.

For any interval i, a Gaussian distribution (a bell curve) is used to associate a probability p_j(i) with the ratio f_j. The probability that interval i is perceived as a mistuning of the jth member of the Farey series is

  p_j(i) = 1/(σ√(2π)) ∫_{r_j} exp( −(x − i)² / (2σ²) ) dx.

(1) Recall that the mediant of two ratios a/b and c/d is the fraction (a+c)/(b+d).
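In this (pre-logarithmic) form of the model, each p_j(i) is simply the area of the bell curve over the region r_j. A minimal Python sketch (names and the choice σ = 0.01 are illustrative assumptions, not from the text) evaluates these areas with the error function:

```python
import math
from fractions import Fraction

def mediant(p, q):
    """Mediant of two fractions a/b and c/d: (a + c) / (b + d)."""
    return Fraction(p.numerator + q.numerator, p.denominator + q.denominator)

def gaussian_cdf(x, mean, sigma):
    """Cumulative distribution of a Gaussian, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mean) / (sigma * math.sqrt(2.0))))

def probabilities(i, series, sigma=0.01):
    """p_j(i): area of the Gaussian centered at i over each region r_j.
    Region boundaries are the mediants of successive Farey fractions;
    the outermost regions are left open toward +/- infinity so that
    the probabilities sum to one."""
    bounds = ([float("-inf")]
              + [float(mediant(a, b)) for a, b in zip(series, series[1:])]
              + [float("inf")])
    return [gaussian_cdf(hi, i, sigma) - gaussian_cdf(lo, i, sigma)
            for lo, hi in zip(bounds, bounds[1:])]

# F_5 as the candidate ratios; the interval i = 0.5 sits exactly on 1/2.
f5 = sorted({Fraction(a, b) for b in range(1, 6) for a in range(b + 1)})
probs = probabilities(0.5, f5)
```

Because i = 0.5 lies well inside the wide region belonging to 1/2 (bounded by the mediants 3/7 and 4/7), nearly all of the probability mass falls on that one ratio.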

[Figure J.1: mediant distance plotted against the Farey series ratios 0:1, 1:5, 1:4, 1:3, 1:2, 2:3, 3:4, 4:5, and 1:1 over the interval from 0 to 1.]

Fig. J.1. The mediant distances between entries (the lengths of the r_j) are plotted as a function of the small-integer ratios f_j drawn from the Farey series of order 5. The simplest ratios dominate.

Thus, the probability p_j(i) is high when i is close to f_j and low when i is far from f_j. This is depicted in Fig. J.2, where the probabilities that i is perceived as f_{j+1}, f_{j+2}, and f_{j+3} are shown as the three regions under the bell curve. Erlich refines this model to incorporate the log of the intervals and mediants, which is sensible because pitch perception is itself (roughly) logarithmic. The harmonic entropy (HE) of i is then defined (parallel to the definition of entropy used in information theory) as

 @





 g



!

‘’ “  

 !







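The sum is straightforward to evaluate once the p_j(i) are in hand. The following sketch uses the linear (pre-logarithmic) form of the model described above; the names are ours, and σ = 0.03 is chosen only so that the contrast is visible at order 5:

```python
import math
from fractions import Fraction

def mediant(p, q):
    """Mediant of two fractions a/b and c/d: (a + c) / (b + d)."""
    return Fraction(p.numerator + q.numerator, p.denominator + q.denominator)

def probabilities(i, series, sigma):
    """Areas of the Gaussian centered at i over the mediant-bounded regions."""
    def cdf(x):
        return 0.5 * (1.0 + math.erf((x - i) / (sigma * math.sqrt(2.0))))
    bounds = ([float("-inf")]
              + [float(mediant(a, b)) for a, b in zip(series, series[1:])]
              + [float("inf")])
    return [cdf(hi) - cdf(lo) for lo, hi in zip(bounds, bounds[1:])]

def harmonic_entropy(i, series, sigma=0.03):
    """HE(i) = -sum_j p_j(i) log p_j(i), skipping zero-probability terms."""
    return -sum(p * math.log(p)
                for p in probabilities(i, series, sigma) if p > 0.0)

f5 = sorted({Fraction(a, b) for b in range(1, 6) for a in range(b + 1)})
low = harmonic_entropy(0.5, f5)    # on the simple ratio 1/2
high = harmonic_entropy(0.43, f5)  # near the region boundary at 3/7
```

An interval sitting on 1/2 yields one dominant probability and low entropy; an interval near a region boundary splits its mass between neighboring ratios and yields higher entropy.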
When the interval i lies near a simple-integer ratio f_j, there will be one large probability and many small ones; harmonic entropy is low. When the interval i is distant from any simple-integer ratio, many complex ratios contribute many nonzero probabilities; harmonic entropy is high. A plot of harmonic entropy over an octave of intervals i (labeled in cents) appears in Fig. 5.5 on p. 89; it was computed using a Farey series of high order and a fixed standard deviation σ. Clearly, intervals that are close to simple ratios are distinguished by having low entropy, and more complex intervals have high harmonic entropy.

Generalizations of the harmonic entropy measure to consider more than two sounds at a time are currently under investigation; one possibility involves Voronoi cells. Harmonic series triads with simple ratios are associated with large Voronoi cells, whereas triads with complex ratios are associated with small cells. This nicely parallels the dyadic case. Recall the example (from p. 96 and sound examples [S: 40]–[S: 42]) comparing the clusters 4:5:6:7 and 1/7:1/6:1/5:1/4. In such cases, the harmonic entropy model tends to agree better with listeners' perceptions of the dissonance of these chords than does the sensory dissonance approach. Paul Erlich comments that the study of harmonic entropy is a "public work in progress" at [W: 9].


[Figure J.2: a Gaussian bell curve centered at the interval i spans the ratios f_j through f_{j+4}; the areas over the regions r_{j+1}, r_{j+2}, and r_{j+3} give the probabilities p_{j+1}(i), p_{j+2}(i), and p_{j+3}(i) that i is perceived as the corresponding ratio.]

Fig. J.2. Each region r_{j+1} extends from the mediant between f_j and f_{j+1} to the mediant between f_{j+1} and f_{j+2}. The interval i specifies the mean of the Gaussian curve, and the probabilities p_j(i) are defined as the disjoint areas between the axis and the curve.
