Greetings,
I am trying to understand how to use emWin's various *_SetOrientation APIs. GUI_SetOrientation() behaves as expected, but GUI_TOUCH_SetOrientation() does not appear to do anything meaningful. Our display is mounted upside down in the product (the viewing angle and connections were better this way), so we need to not only permanently rotate the graphics but also reconfigure how emWin interprets the touch coordinates.
The manual also appears to contradict itself. The description paragraph leads me to believe I should not have to mirror my X and Y coordinates in the hardware layer, yet the disclaimer below it says the function has no effect on GUI_TOUCH_StoreState* or anything else. What in the world is this function even good for, then? Am I to understand that the only way to get the touch coordinates flipped is to mirror them myself in the driver layer, as in the sketch after the code below? I don't see how that could even be feasible in dynamic applications where the screen must be able to rotate.
See the code below (XSIZE_PHYS = 800, YSIZE_PHYS = 480). What do I need to do so that I don't have to modify the X and Y values I pass into GUI_TOUCH_StoreState()? Shouldn't GUI_TOUCH interpret those values based on the orientation passed into it?
I'm using emWin 6.32b from NXP's emWin downloads for the LPC43XX series microcontrollers. Thank you in advance for the help.
Edit: I had posted the wrong source code originally; it has been updated.
C Source Code
void LCD_X_Config(void)
{
#if (NUM_BUFFERS > 1)
  GUI_MULTIBUF_Config(NUM_BUFFERS);
#endif
  // Linear driver in 16bpp mode with fixed 12-bit (M444) color conversion
  GUI_DEVICE_CreateAndLink(GUIDRV_LIN_16, GUICC_M444_12, 0, 0);
  LCD_SetDevFunc(0, LCD_DEVFUNC_COPYBUFFER, (void (*)(void))BSP_LCD_CopyBuffer);
  LCD_SetSizeEx (0, XSIZE_PHYS, YSIZE_PHYS);
  LCD_SetVSizeEx(0, XSIZE_PHYS, YSIZE_PHYS);
  LCD_SetVRAMAddrEx(0, BSP_LCD_GetActiveBufferAddr());
  GUI_SetOrientation(GUI_ROTATION_180);       // rotates the display as anticipated
  GUI_TOUCH_SetOrientation(GUI_ROTATION_180); // does nothing
}
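For reference, the only workaround I can see is to mirror the raw coordinates in the touch driver itself before handing them to emWin. Below is a minimal sketch of what I mean; MyTouch_ReadRaw() is a hypothetical placeholder for our controller-specific read, and the (-1, -1) call follows the emWin convention for reporting a released touch.
C Source Code
// Hypothetical driver-layer workaround for the 180-degree mounting.
// MyTouch_ReadRaw() is a placeholder that is assumed to return 1 when
// a new sample is available and to report raw panel coordinates in
// [0, XSIZE_PHYS) x [0, YSIZE_PHYS).
void MyTouch_Exec(void)
{
  int xRaw, yRaw, Pressed;

  if (MyTouch_ReadRaw(&xRaw, &yRaw, &Pressed) == 0) {
    return; // no new sample from the controller
  }
  if (Pressed) {
    // Mirror both axes by hand, since GUI_TOUCH_SetOrientation()
    // apparently does not transform the values given to StoreState.
    GUI_TOUCH_StoreState(XSIZE_PHYS - 1 - xRaw, YSIZE_PHYS - 1 - yRaw);
  } else {
    GUI_TOUCH_StoreState(-1, -1); // emWin convention for 'no touch'
  }
}
For a screen that rotates at runtime, this would have to grow into a switch over the current GUI_ROTATION_* value, which is exactly the duplication I was hoping GUI_TOUCH_SetOrientation() would spare me.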