george wrote: Wed Jun 03, 2020 8:49 pm
This is a mere observation, without knowledge or real understanding of why this happens, if this 12 μs pulse indeed appears at all and is not instead a bug in your software/hardware.
More reading needed, maybe?
http://mirrors.apple2.org.za/ground.ica ... ii.nibbles
The bit time is nominally 4 microseconds.
Interval between pulses:

data     nominal   fast wr / slow rd   slow wr / fast rd
-----    -------   -----------------   -----------------
11        4 us          4.7 us              3.4 us
101       8 us          9.4 us              6.8 us
1001     12 us         14.1 us             10.2 us
10001    16 us         18.8 us             13.6 us
1001 -> 2 zeros -> 12 us.
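To make the arithmetic concrete, here is a minimal Python sketch (my own helper, assuming a 4 us bit cell and a flux pulse on every '1' bit) that reproduces the nominal column of the table above:

BIT_CELL_US = 4

def pulse_intervals_us(bits):
    """Nominal intervals (in us) between successive flux pulses."""
    pulses = [i for i, b in enumerate(bits) if b == "1"]
    return [(b - a) * BIT_CELL_US for a, b in zip(pulses, pulses[1:])]

print(pulse_intervals_us("11"))     # [4]
print(pulse_intervals_us("101"))    # [8]
print(pulse_intervals_us("1001"))   # [12] <- two zeros between pulses
print(pulse_intervals_us("10001"))  # [16]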
Maybe the 5-and-3 assumption of an 8 us maximum interval between pulses is wrong, since I have not implemented that encoding yet.
But there is no doubt at all about the 6-and-2 encoding: the intervals between pulses are 4, 8 and 12 us (4 us per bit).
This is observed in the dump, and the emulation also works quite well on the real machine (without the HDDD):
https://www.youtube.com/watch?v=MaR9noqb0JE
And the explanation is quite simple: 4 us per bit.
1-bit interval: 1 * 4 us = 4 us
2-bit interval: 2 * 4 us = 8 us
3-bit interval: 3 * 4 us = 12 us
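Decoding simply inverts this: round each measured interval to the nearest whole number of 4 us cells. A naive Python sketch (the 6 us and 10 us midpoint thresholds are my assumption; a real decoder would track drive speed), which classifies every fast/slow read value in the table above correctly:

def interval_to_bits(interval_us):
    """Classify a flux interval as 1, 2 or 3 bit cells (nominal
    4/8/12 us) and emit the zero cells followed by the '1' pulse."""
    if interval_us < 6.0:
        cells = 1
    elif interval_us < 10.0:
        cells = 2
    else:
        cells = 3
    return "0" * (cells - 1) + "1"

for us in (3.4, 4.7, 6.8, 9.4, 10.2, 14.1):  # fast and slow reads from the table
    print(us, "->", interval_to_bits(us))
# 3.4 / 4.7 -> "1",  6.8 / 9.4 -> "01",  10.2 / 14.1 -> "001"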
I don't know what more you need in order to understand.
If this simple explanation isn't enough, then, as you said yourself, take an oscilloscope and look at it, or ask others dealing with the Apple GCR format:
https://www.cbmstuff.com/forum/showthre ... 336#pid336
https://www.bigmessowires.com/floppy-emu/
To come back to the purpose of the HDDD A2:
Normal PC/Shugart drives do not support this 12 us interval: their AGC (automatic gain control) filter is not tuned the same way, and false pulses can be generated during 12 us intervals because the AGC gain rises too high. The HDDD A2 adds an FM clock to reduce this maximum interval, making HD PC drives compatible with the DD Apple II GCR format.
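To illustrate the principle only (this is my sketch of the idea, not the actual HDDD A2 logic; I am assuming the usual FM convention of a clock pulse at each bit-cell boundary with the data pulse mid-cell), adding an FM clock caps the silence between pulses at one bit cell:

BIT_CELL_US = 4

def max_gap_us(bits, fm_clock):
    """Longest gap (in us) between flux pulses for a GCR bit string.
    With fm_clock=True, a clock pulse is added at every cell boundary."""
    pulses = []
    for i, b in enumerate(bits):
        t = i * BIT_CELL_US
        if fm_clock:
            pulses.append(t)      # FM clock pulse at the cell boundary
        if b == "1":
            pulses.append(t + 2)  # data pulse in the middle of the cell
    pulses = sorted(set(pulses))
    return max(b - a for a, b in zip(pulses, pulses[1:]))

print(max_gap_us("1001", fm_clock=False))  # 12 -> drives the PC AGC gain too high
print(max_gap_us("1001", fm_clock=True))   # 4  -> safely short for a HD PC drive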