
LSAT vs First Year test



A law school has the following data: average LSAT score = 650, SD = 80; average first-year score = 65, SD = 8; r = 0.4.

It turns out that a student who scored in the 90th percentile on the LSAT would be expected to be only in the 69th percentile on the first-year test.
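This forward prediction can be checked numerically. The sketch below (in Python rather than the Maple used in this file) uses the exact normal quantile instead of rounded table values, so the answer comes out near 69.6 rather than exactly 69:

```python
from statistics import NormalDist

Z = NormalDist()   # standard normal curve
r = 0.4            # correlation between LSAT and first-year scores

# 90th percentile on the LSAT, in standard units
z_lsat = Z.inv_cdf(0.90)   # about 1.28

# the regression estimate shrinks the score toward the mean by the factor r
z_fy = r * z_lsat          # about 0.51

# convert back to a percentile of the first-year test
pct = Z.cdf(z_fy) * 100    # about 69.6, i.e. roughly the 69th percentile
```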


A good estimate of the LSAT percentile of a student in the 69th percentile of the first-year test is closest to: a. 16th  b. 42nd  c. 58th  d. 69th  e. 90th


> aveLSAT := 650: sdLSAT := 80: aveFY := 65: sdFY := 8: r := 0.4:

If someone is in the 69th percentile of the FY test, it means that 69 percent of the students scored below this person. Since the FY test scores follow the normal curve, we can use the normal table to find the 69th percentile in standard units. We need to find the z (let's call it z69) that leaves 69% of the area under the normal curve to its left. Since the table only gives areas symmetric about zero, we enter the table with the area between -z69 and z69, i.e. 100% - 2*31% = 38%. The closest z is,

> z69 := 0.50:
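The symmetric-table step can be mimicked directly; the sketch below (Python, assumed here in place of the Maple commands above) confirms that z = 0.50 leaves about 38% of the area between -z and z, and hence about 69% to its left:

```python
from statistics import NormalDist

Z = NormalDist()   # standard normal curve

# area under the normal curve between -0.50 and +0.50 standard units
symmetric_area = Z.cdf(0.50) - Z.cdf(-0.50)        # about 0.3829, i.e. 38%

# the two tails share the remaining area equally, so the area
# to the left of z = 0.50 is 31% + 38% = 69%
left_area = (1 - symmetric_area) / 2 + symmetric_area   # about 0.69
```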

The value z69 = 0.50 tells us that the student's score is 0.5 SDs above the average of the first-year test scores. Thus, the regression line predicts (0.50)*r = 0.2 standard units for his/her LSAT score. Using the normal table again, we transform 0.2 standard units into a percentile as follows: enter the table with z = 0.2 and read off an area of about 16% under the normal curve from -0.2 to 0.2. It follows that to the left of z = 0.2 there is 42% + 16% = 58% of the total area under the normal curve. So the correct answer is: c. 58th.
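The whole reverse calculation can be sketched in a few lines (again in Python, as an illustration of the Maple computation in this file; the conversion of 0.2 standard units back to an actual LSAT score is an extra step not in the original):

```python
from statistics import NormalDist

Z = NormalDist()   # standard normal curve
r = 0.4

z69 = 0.50                  # 69th percentile of FY scores, in standard units
z_lsat = r * z69            # regression prediction: 0.20 standard units
pct = Z.cdf(z_lsat) * 100   # about 57.9, closest to the 58th percentile

# the same prediction expressed as an actual LSAT score
ave_lsat, sd_lsat = 650, 80
predicted_lsat = ave_lsat + z_lsat * sd_lsat   # 650 + 0.2*80 = 666
```

Note the symmetry with the first part of the problem: going in either direction, the regression estimate multiplies the standard-unit score by r, which is why a 90th-percentile score maps to roughly the 69th and a 69th-percentile score maps to roughly the 58th.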


Carlos Rodriguez <>
Last modified: Wed Mar 4 10:11:48 EST 1998