I was setting up my target downrange with a measuring tape today, sighting in my rifle for the first time ever, and hit a snag, so to speak. I'll be sighting in at 20 yards using a Leapers 3-12x AO scope on a Remi NPSS. When measuring the distance to where my target will be, I couldn't figure out whether I should measure from the muzzle to the target or from the point where my eye would be to the target. I've read a lot on optics and sighting in, and what I know is that with optics alone, the distance is measured from your eye to the target. But with a bullet thrown into the mix, I haven't seen this addressed conclusively anywhere.

I'm not trying to make this an exact science or anything, just trying to get my facts straight. The difference either way is only half a yard, so wind will probably be a bigger problem than getting this exactly right, but the question is really eating at me.
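For anyone curious, here's a quick back-of-the-envelope sketch of how little that half yard matters. It uses a simple drag-free, flat-fire trajectory and assumes roughly 800 fps muzzle velocity and a 1.5 in scope height over the bore (illustrative numbers, not measured from my rifle):

```python
G = 32.17           # gravity, ft/s^2
V = 800.0           # assumed muzzle velocity, ft/s
SIGHT_H = 1.5 / 12  # assumed scope height over bore, ft

def poi_offset(zero_ft, range_ft, v=V, h=SIGHT_H):
    """Bullet height relative to the line of sight at range_ft,
    for a rifle zeroed at zero_ft (vacuum, flat-fire approximation)."""
    # Launch angle chosen so the path crosses the sight line at zero_ft.
    tan_theta = (h + G * zero_ft**2 / (2 * v**2)) / zero_ft
    drop = G * range_ft**2 / (2 * v**2)
    return -h + range_ft * tan_theta - drop  # ft

# Worst case: I zero at 19.5 yd (58.5 ft) thinking it's 20 yd (60 ft),
# i.e. I measured the "wrong" way and the range is off by half a yard.
shift_ft = poi_offset(zero_ft=58.5, range_ft=60.0)
print(f"POI shift from a 0.5 yd ranging error: {shift_ft * 12:.3f} in")
```

With those numbers it comes out to roughly a hundredth of an inch at 20 yards, so wind really will be the bigger problem either way.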