I use an FT232H as a USB<->I2C interface with pyftdi 0.55.0.
My I2C slave is an FPGA with an embedded CPU. It uses clock stretching, so I connected AD0, AD1, AD2 and AD7 as documented.
I added a small resistor in series with SCL to see which device drives SCL. In the software I use i2c.configure(device, clockstretching=True).
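For reference, the setup looks roughly like this (a sketch; the FTDI URL and the slave address 0x50 are placeholders, and it of course needs the FT232H attached):

```python
from pyftdi.i2c import I2cController

i2c = I2cController()
# clockstretching=True makes pyftdi sense SCL on AD7, which is why
# AD0, AD1, AD2 and AD7 have to be wired as documented.
i2c.configure('ftdi://ftdi:232h/1', frequency=400_000, clockstretching=True)
slave = i2c.get_port(0x50)  # placeholder slave address
data = slave.read(1)        # single-byte read that shows the problem
```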
When I do an I2C read, I see the expected signals on the scope (see image). Both the FT232H and the FPGA pull SCL low right after the ACK (lowest electrical level).
The first cursor (X1) marks the moment the FT232H releases SCL; the second cursor (X2) marks the moment the FPGA releases SCL. The FT232H correctly waits until the FPGA releases the bus before it continues clocking.
So far so good. The data pattern on the bus is 0x99 (also shown by the scope's I2C decoding).
However, the data read by the software is 0x19.
It appears that the FT232H (and/or the driver) uses the SDA state of the previous LSB (which stays on the bus until just before the FPGA releases SCL) as the MSB, instead of the actual value while SCL is high.
As a test, I extended the SDA setup time from 100 ns (which meets the 400 kHz I2C spec) to 2.5 µs, with no effect.
My theory is that the MSB is clocked in on the internal SCL rising edge, i.e. at (X1), instead of at the actual SCL rising edge at (X2).
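The observed 0x99 -> 0x19 corruption is consistent with this theory. A minimal numerical model (assuming the stale SDA level at X1 is low, left over from the preceding ACK):

```python
def early_sampled_byte(actual_bits, stale_sda):
    """Byte as read if the first (most significant) bit is sampled
    while SDA still carries the stale pre-stretch level."""
    bits = [stale_sda] + list(actual_bits[1:])
    value = 0
    for b in bits:
        value = (value << 1) | b
    return value

# 0x99 on the bus, MSB first: 1 0 0 1 1 0 0 1
actual = [1, 0, 0, 1, 1, 0, 0, 1]
# SDA is still low when the internal clock rises at X1
print(hex(early_sampled_byte(actual, 0)))  # 0x19
```

Only the stretched MSB is affected; the remaining seven bits are sampled after the FPGA has released SCL, so they come through correctly.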
When I reduce the bus clock rate to e.g. 10 kHz, the correct data is read. This is not a usable workaround, however, as the clock-stretch time can vary (depending on internal CPU load).
Any suggestions? I see no simple workaround, except doing 100% bit-banging (which is way too slow).
About a decade ago I came to the exact same conclusion: the FT232H does not honor adaptive clocking when clocking data in, only when clocking data out. To be fair, they never promised that, but a lot of users are fooled by the fact that it works most of the time, until the first bit after the stretching happens to be a 1, which I guess is not that common. One way out of this mess is to avoid what they call fast mode in their lib. USB latency then provides gaps of about 1 ms between bytes. Ugly, but works most of the time.
Back then the FT260 did not exist yet, so I came up with a somewhat hacky solution, shown in the schematic below (it uses a UM232H). This obviously requires custom code support, but it works a lot better than adaptive clocking. In my implementation it only kicks in between bytes, so if the stretching happens inside a byte you're out of luck. I have yet to encounter that, though, so watching each bit appears to be unnecessary.
Later I abandoned the FT232H altogether; it has a lot more problems when it comes to I2C, and they probably should never have positioned it for that.