Wrong touch coordinates detection


      Hi,

      I'm developing a user interface on a 320x240 display with a 4-wire analog touch interface. Everything is working well, except that touch detection seems too coarse.
      Our problem is that, even after calibrating the touchscreen with a 9-point method, many touches are still detected at the wrong position: for example, with reference to the attached image, the finger is pressed on the '+' button (45x45 pixels), but the library detects a touch on the slider, at least 20 pixels above the button itself.

      Calibration is done at run time: once the user has touched all the required points, their x and y values are read through GUI_TOUCH_GetState() and used to build a 2-dimensional array. At the end, the logical points and their corresponding ADC values are passed to GUI_TOUCH_CalcCoefficients() so that emWin can compute the calibration coefficients; finally, calibration is enabled with GUI_TOUCH_EnableCalibration(1), so that the raw analog values passed to GUI_PID_StoreState() are converted to calibrated screen coordinates automatically.
      This is how we implemented calibration.
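      In short, the flow looks roughly like the sketch below. The reference-point coordinates and the _DrawCalibMarker()/_WaitForRelease() helpers are simplified placeholders rather than our exact code, and the GUI_TOUCH_CalcCoefficients() parameters are written down as we understand them from the emWin manual:

      /* Simplified 9-point calibration sketch for a 320x240 display.         */
      #include "GUI.h"

      #define NUM_CALIB_POINTS  9
      #define XSIZE             320
      #define YSIZE             240

      /* Logical reference points (pixels) and the raw ADC samples read back  */
      static int _aRefX[NUM_CALIB_POINTS] = { 20, 160, 300,  20, 160, 300,  20, 160, 300 };
      static int _aRefY[NUM_CALIB_POINTS] = { 20,  20,  20, 120, 120, 120, 220, 220, 220 };
      static int _aSampleX[NUM_CALIB_POINTS];
      static int _aSampleY[NUM_CALIB_POINTS];

      /* Placeholder: draw a marker at the reference point to be touched      */
      static void _DrawCalibMarker(int x, int y) {
        GUI_Clear();
        GUI_FillCircle(x, y, 5);
      }

      /* Placeholder: wait until the panel is released again                  */
      static void _WaitForRelease(void) {
        GUI_PID_STATE State;
        do {
          GUI_TOUCH_GetState(&State);
          GUI_Delay(10);
        } while (State.Pressed);
      }

      static void _ExecCalibration(void) {
        GUI_PID_STATE State;
        int i;

        GUI_TOUCH_EnableCalibration(0);            /* report raw values while sampling */
        for (i = 0; i < NUM_CALIB_POINTS; i++) {
          _DrawCalibMarker(_aRefX[i], _aRefY[i]);
          do {                                     /* poll until the point is pressed  */
            GUI_TOUCH_GetState(&State);
            GUI_Delay(10);
          } while (State.Pressed == 0);
          _aSampleX[i] = State.x;                   /* store the raw ADC sample         */
          _aSampleY[i] = State.y;
          _WaitForRelease();
        }
        /* Let emWin calculate the calibration coefficients from the 9 pairs  */
        GUI_TOUCH_CalcCoefficients(NUM_CALIB_POINTS,
                                   _aRefX, _aRefY,
                                   _aSampleX, _aSampleY,
                                   XSIZE, YSIZE);
        /* From now on, raw values passed to GUI_PID_StoreState() are transformed */
        GUI_TOUCH_EnableCalibration(1);
      }

      /* Called from the touch driver: feed the raw ADC sample to emWin; with */
      /* calibration enabled, emWin converts it to screen coordinates itself. */
      static void _FeedRawSample(int xAdc, int yAdc, int Pressed) {
        GUI_PID_STATE State;
        State.x       = xAdc;
        State.y       = yAdc;
        State.Pressed = Pressed;
        State.Layer   = 0;
        GUI_PID_StoreState(&State);
      }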


      Using a smaller object, such as a stylus, to touch the screen instead of a finger gives slightly better results, but wrong point detections still occur.
      Any suggestions on how to solve this?


      Many thanks
      Andrea
      Images
      • Immagine.jpg (975.07 kB, 2,605×1,525)