I made a test tape with a 315 Hz tone at 0 dB on my Nakamichi CR-5. According to the Sony service manual for the DDII, I am supposed to adjust the pots so that the voltage at the test points is 32 mV (my previous value was about 36 mV), so I did that. However, it actually sounded worse after the adjustment: the midrange was muffled, and turning Dolby off made things much better. That reminded me of my D3, which had the opposite problem with a bloated midrange. The level pots in my D3 had been set quite high to bring the deck's meter up to 0 dB (this might not be correct, since different manufacturers can use different meter standards). I speculated that lowering the pot level was causing the dull midrange (a Dolby tracking error), so I raised the left channel output to its maximum (55 mV), and indeed the midrange came back. Switching Dolby on and off now makes a relatively small difference compared to before.
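For a sense of scale, here is a quick sketch (plain Python, not anything from the service manual) that converts those test-point voltages into dB relative to the 32 mV spec. By this math the left channel now sits roughly 4.7 dB above Sony's calibration point, which is the kind of level offset that would make Dolby mistrack in the other direction.

```python
import math

def level_db(v_mv, ref_mv=32.0):
    """Level of a measured test-point voltage relative to the 32 mV spec, in dB."""
    return 20 * math.log10(v_mv / ref_mv)

# Values from the text: 36 mV before adjustment, 55/51 mV after raising the pots
for label, mv in [("before adjustment", 36.0),
                  ("left at max", 55.0),
                  ("right, balanced", 51.0)]:
    print(f"{label}: {mv:.0f} mV -> {level_db(mv):+.1f} dB vs. spec")
```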
So what is going on? According to the forum post (https://www.tapeheads.net/threads/p-4-l300-flux-density.8179/), the Sony reference tape (P-4-L300) reads 0 dB on a Nakamichi (Dragon) as well, and its level should be 200 nWb/m (the same as the common standard). Maybe my CR-5 uses a different level? I won't know without the test tape (it should be close, since a 3 kHz 0 dB test tape from an eBay seller also reads close to 0 dB on my CR-5).
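If the two decks really did disagree about what 0 dB means, the offset would work the same way, since fluxivity ratios convert to dB just like voltages. A purely hypothetical illustration (250 nWb/m here is just an example figure, not a measured value for the CR-5):

```python
import math

REFERENCE_LEVEL = 200.0  # nWb/m, the level the forum post cites for the P-4-L300

def flux_offset_db(flux_nwb_m):
    """Offset of a deck's 0 dB point from 200 nWb/m, in dB."""
    return 20 * math.log10(flux_nwb_m / REFERENCE_LEVEL)

# If a deck's meter 0 dB corresponded to, say, 250 nWb/m instead:
print(f"{flux_offset_db(250.0):+.1f} dB")  # about +1.9 dB
```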
Anyway, I don't know how I should set my DDII. Maybe it will stay at 55 mV for now (right channel set at about 51 mV for balance, since the volume knob's channel matching is not perfect). I'll see how I like it.
Update: it is now set even higher.