13 Touch

1 Touch Introduction

1.1 Touch Screen Introduction

Touch screens have been around for a long time. The earliest were resistive touch screens, which supported only single-point touch and were widely used in the era of learning machines and feature phones. On January 9, 2007, Apple released the revolutionary first-generation iPhone (the iPhone 2G). It used a multi-point capacitive touch screen, while phones at the time almost all used resistive screens. The superior responsiveness and feel of capacitive touch screens instantly won over consumers and triggered a major shift in phone touch technology; phones released afterwards also adopted multi-point capacitive touch screens.

Comparison between capacitive touch screen and resistive touch screen:

  • Multi-touch support: The biggest advantage of capacitive touch screens is multi-touch support (later resistive screens also supported multi-touch, but it was too late)
  • Touch sensitivity: Capacitive screens only need light finger touch, while resistive screens require some pressure from the finger
  • Calibration requirement: Capacitive screens do not require calibration, making them more convenient to use

Today, multi-point capacitive touch screens are widely used in phones, tablets, advertising kiosks, and so on. If you want to build human-machine interaction devices, multi-point capacitive touch screens are basically unavoidable. In this chapter we will therefore learn how to use a multi-point touch screen and how to obtain multi-point touch values. We will not study the physical principles of capacitive screens: we are not designing capacitive panels, only using them, so we only need to know how to drive the screen and how to read out the multi-point touch coordinate values.

Touch screen composition structure:

A screen is actually composed of a display panel + touch panel. The display panel is at the bottom, and the touch panel is on top. Packaging them together creates a screen with touch functionality. Capacitive touch screens also require a driver IC. The driver IC generally provides an I2C interface to the main controller, through which the main controller can read the touch coordinate data from the driver IC.

Note: The M4-R1 development board is equipped with a set of I2C touch interfaces. Currently adapted drivers include gt911, FT5X06, FT5406, etc. Unlike the drivers from the Linux kernel mentioned earlier, this driver is under the HDF framework. Users can view the supported touch ICs in the /drivers/hdf_core/framework/model/input/driver/touchscreen/ path.

1.2 Introduction to Linux input Subsystem

The input subsystem is the framework the Linux kernel provides for managing input devices such as keys, keyboards, mice, and touch screens. Different input devices carry different kinds of data: keys and keyboards report key codes, while mice and touch screens report coordinates, so the application layer handles each kind differently.

Input subsystem architecture:

  • Input driver layer: Responsible for specific hardware device driver implementation
  • Input core layer: Provides unified interface and management mechanism
  • Input event handling layer: Processes and distributes input events

Finally, it provides accessible device nodes for user space. The input subsystem framework is shown in the figure:

Input Subsystem Framework

For application development, we only need to care about the data sent from kernel space to user space.
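As a sketch of what user space actually receives: each read from a device node under /dev/input returns one or more fixed-size `struct input_event` records (timestamp, type, code, value). The following Python snippet is a minimal, illustrative parser for such records. It assumes the 64-bit Linux layout (two 64-bit longs for the timeval, then 16-bit type and code and a 32-bit value); on 32-bit systems the timestamp fields are smaller, so the format string would differ.

```python
import struct

# struct input_event on 64-bit Linux:
#   struct timeval time;   /* two signed 64-bit longs: sec, usec */
#   __u16 type; __u16 code; __s32 value;
EVENT_FORMAT = "qqHHi"
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)  # 24 bytes on 64-bit

def parse_event(buf):
    """Unpack one raw input_event record into a dict."""
    sec, usec, etype, code, value = struct.unpack(EVENT_FORMAT, buf)
    return {"sec": sec, "usec": usec, "type": etype, "code": code, "value": value}

# Demonstrate with a synthetic EV_ABS / ABS_MT_POSITION_X event (x = 0x1f4 = 500)
raw = struct.pack(EVENT_FORMAT, 1700000000, 123456, 0x0003, 0x0035, 0x1F4)
ev = parse_event(raw)
print(ev["type"], ev["code"], ev["value"])
```

A real reader would open /dev/input/eventX in binary mode and pull EVENT_SIZE bytes at a time in a loop; the parsing itself is exactly what is shown here.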

2 I2C Touch Interface on the Board

3 Touch Screen Usage - Command Line Method

3.1 Device Tree Parsing

Tips

The device tree files discussed below are located at out/kernel/src_tmp/linux-5.10/arch/arm64/boot/dts/rockchip/. This directory only exists after the kernel source code has been compiled.

Below is a simple analysis of the touch controller node mounted on the I2C1 bus.

Warning

There are two touch controllers mounted on the SoC's I2C1 bus. The following uses the node for the Goodix GT911 touch IC fitted to the test screen as an example.

First, the basic definition layer (rk3568.dtsi):

i2c1: i2c@fe5a0000 {
    compatible = "rockchip,rk3399-i2c";
    reg = <0x0 0xfe5a0000 0x0 0x1000>;
    clocks = <&cru CLK_I2C1>, <&cru PCLK_I2C1>;
    clock-names = "i2c", "pclk";
    interrupts = <GIC_SPI 47 IRQ_TYPE_LEVEL_HIGH>;
    pinctrl-names = "default";
    pinctrl-0 = <&i2c1_xfer>;
    #address-cells = <1>;
    #size-cells = <0>;
    status = "disabled";
};

Rockchip's base device tree sources do not describe specific I2C touch controllers; they only describe the general-purpose I2C controller nodes. The reason is simple: keeping board-specific devices out of the SoC-level files prevents the device tree from becoming long and unwieldy, which is exactly the layered design device trees are meant to enable. Below is a brief analysis of the i2c1 node.

  • compatible: Specifies compatibility, supports RK3399 I2C controller
  • reg: Register address range (0xfe5a0000-0xfe5a0fff)
  • interrupts: Interrupt number 47, high-level triggered
  • clocks: I2C function clock (CLK_I2C1) and APB clock (PCLK_I2C1)
  • pinctrl-0: Uses i2c1_xfer pin group by default
  • status: Default disabled state

Next is the pin configuration layer (rk3568-pinctrl.dtsi):

i2c1_xfer: i2c1-xfer {
    rockchip,pins =
        /* i2c1_scl */
        <0 RK_PB3 1 &pcfg_pull_none_smt>,
        /* i2c1_sda */
        <0 RK_PB4 1 &pcfg_pull_none_smt>;
};
/* ... */

touch_gpio: touch-gpio {
    rockchip,pins =
        /* Interrupt pin */
        <0 RK_PB5 RK_FUNC_GPIO &pcfg_pull_up>,
        /* Reset pin */
        <0 RK_PB6 RK_FUNC_GPIO &pcfg_pull_none>;
};

The above two nodes are the pin configuration nodes for the I2C1 bus and touch chip respectively. They correspond to I2C1's SCL and SDA pins, as well as the touch chip's interrupt pin and reset pin.

  • i2c1_xfer: I2C1 bus pins, using GPIO0_B3 as SCL, GPIO0_B4 as SDA
  • touch_gpio: Touch chip control pins, GPIO0_B5 as interrupt pin (pull-up), GPIO0_B6 as reset pin

Finally, let's look at the board-level configuration layer (rk3568-toybrick.dtsi):

&i2c1 {
    status = "okay";

    gt9xx: gt9xx@5d {
        compatible = "goodix,gt9xx";
        status = "okay";
        reg = <0x5d>;
        reset-gpio = <&gpio0 RK_PB6 GPIO_ACTIVE_HIGH>;
        touch-gpio = <&gpio0 RK_PB5 IRQ_TYPE_LEVEL_LOW>;
        max-x = <7200>;
        max-y = <1280>;
        tp-size = <911>;
        pinctrl-names = "default";
        pinctrl-0 = <&touch_gpio>;
        power-supply = <&vcc3v3_lcd0_n>;
    };
};

This node is used to set related parameters of the touch screen, such as maximum coordinates, number of touch points, etc. The specific content is as follows:

  • &i2c1: References the i2c1 node from the basic definition
  • status = "okay": Enables the I2C1 controller and gt9xx touch chip
  • reg = <0x5d>: I2C slave device address of gt911 chip is 0x5d
  • reset-gpio: Reset pin uses GPIO0_B6, high level active
  • touch-gpio: Interrupt pin uses GPIO0_B5, low level triggered
  • max-x/max-y: Touch screen resolution 7200x1280
  • tp-size = <911>: Specifies touch chip model as gt911
  • power-supply: Power supply comes from vcc3v3_lcd0_n
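Since max-x and max-y define the range of raw coordinates the driver reports, application code sometimes has to rescale raw values to the panel's pixel resolution. A minimal sketch, assuming hypothetical display dimensions (the 720x1280 display size used here is an illustration only, not taken from the board's datasheet):

```python
def scale_touch(raw_x, raw_y, max_x=7200, max_y=1280, disp_w=720, disp_h=1280):
    """Map a raw touch coordinate (0..max-1) onto display pixels.

    max_x/max_y mirror the device tree properties above; disp_w/disp_h
    are assumed display dimensions for illustration.
    """
    x = raw_x * (disp_w - 1) // (max_x - 1)
    y = raw_y * (disp_h - 1) // (max_y - 1)
    return x, y

print(scale_touch(7199, 1279))  # far corner of the raw range
```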

3.2 Application Layer Testing Method for Touch Related Devices

The touch screen is an input subsystem device. The input subsystem is a unified driver framework that Linux provides for input devices; keys, keyboards, touch screens, and mice are all driven in a similar way under it. Devices driven by the input subsystem report to the kernel through a unified data structure containing the event timestamp, type, code, and the specific key value or coordinate. The kernel passes these events to user space through the file interfaces under the /dev/input directory.

getevent debugging tool:

The input subsystem device can use the getevent command to obtain events reported to the system:

  • getevent is a debugging tool under Android/Linux system
  • Used to monitor and display raw input events generated by the kernel input subsystem
  • Can see the bottom-level, unprocessed hardware input signals

The getevent command is built into the board image and can be used to check whether the touch screen is working properly.

In the /dev/input directory, use the command:

getevent

This lists all input sub-devices and monitors the events every device reports.

Event format parsing:

The returned event format: device: type code value

Event type table:

| Type Code | Event Type | Description |
|---|---|---|
| 0000 | EV_SYN | Synchronization event |
| 0001 | EV_KEY | Key event |
| 0003 | EV_ABS | Absolute coordinate event (touch screen) |

Event code table:

| Code | Name | Description |
|---|---|---|
| 0035 | ABS_MT_POSITION_X | X coordinate |
| 0036 | ABS_MT_POSITION_Y | Y coordinate |
| 0039 | ABS_MT_TRACKING_ID | Touch point ID |
| 0000 | SYN_REPORT | Report synchronization |
| 0002 | SYN_MT_REPORT | Multi-touch report |

Event value description:

| Value | Meaning |
|---|---|
| Coordinate value | Hexadecimal coordinate |
| 00000000 | New touch point start |
| ffffffff | Touch point end |
| 00000001 | Key press |
| 00000000 | Key release |
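As a sketch, the tables above can be turned into a small lookup that renders raw (type, code, value) triples roughly the way `getevent -l` displays them. The symbolic names come from the tables; the exact output format of the real tool may differ slightly.

```python
# Lookup tables mirroring the ones above
EV_TYPES = {0x0000: "EV_SYN", 0x0001: "EV_KEY", 0x0003: "EV_ABS"}
ABS_CODES = {0x0035: "ABS_MT_POSITION_X", 0x0036: "ABS_MT_POSITION_Y",
             0x0039: "ABS_MT_TRACKING_ID"}
SYN_CODES = {0x0000: "SYN_REPORT", 0x0002: "SYN_MT_REPORT"}

def describe(etype, code, value):
    """Render one event approximately the way `getevent -l` would."""
    tname = EV_TYPES.get(etype, "%04x" % etype)
    if tname == "EV_ABS":
        cname = ABS_CODES.get(code, "%04x" % code)
    elif tname == "EV_SYN":
        cname = SYN_CODES.get(code, "%04x" % code)
    else:
        cname = "%04x" % code
    # value is shown as an unsigned 32-bit hex number, so -1 prints as ffffffff
    return "%s %s %08x" % (tname, cname, value & 0xFFFFFFFF)

print(describe(0x0003, 0x0035, 0x1F4))  # X coordinate 500
print(describe(0x0003, 0x0039, -1))     # tracking ID -1: touch lifted
print(describe(0x0000, 0x0000, 0))      # end-of-frame synchronization
```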

Monitoring specific devices:

If you want to monitor specific sub-devices and their reported event types, you can use the command:

getevent -l /dev/input/event*

It gives real-time feedback on the hexadecimal coordinates (x, y), the touch point IDs, and the synchronization events. The event types and event codes are explained below.

Linux input event types:

| Event Type | Function | Typical Application |
|---|---|---|
| EV_KEY | Key event | Power key, volume key |
| EV_ABS | Absolute coordinate event | Touch screen, game joystick |
| EV_REL | Relative coordinate event | Mouse movement, scroll wheel |
| EV_SYN | Synchronization event | End-of-data-frame flag |
| EV_MSC | Miscellaneous event | Other event types |
| EV_SW | Switch event | Lid open/close, headphone plug/unplug |

Common touch screen event codes:

| Code Name | Function | Description |
|---|---|---|
| ABS_MT_TRACKING_ID | Touch point ID | Positive = new touch, ffffffff = touch end |
| ABS_MT_POSITION_X | X coordinate | Touch point horizontal coordinate |
| ABS_MT_POSITION_Y | Y coordinate | Touch point vertical coordinate |
| ABS_MT_PRESSURE | Pressure value | Touch pressure magnitude |
| ABS_MT_TOUCH_MAJOR | Touch area | Contact area size |
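Putting the codes together: a touch frame is a run of EV_ABS events terminated by a SYN_REPORT. The following is a simplified sketch of how an application could fold such a stream into a table of active touch points, keyed by tracking ID. It models only the codes listed above; real multi-touch drivers also use ABS_MT_SLOT (protocol B), which this sketch deliberately ignores.

```python
ABS_MT_POSITION_X, ABS_MT_POSITION_Y = 0x35, 0x36
ABS_MT_TRACKING_ID, SYN_REPORT = 0x39, 0x00
EV_SYN, EV_ABS = 0x00, 0x03

def track_touches(events):
    """Consume (type, code, value) tuples; return the final set of
    active touch points as {tracking_id: (x, y)}."""
    points, current = {}, None
    for etype, code, value in events:
        if etype == EV_ABS:
            if code == ABS_MT_TRACKING_ID:
                if value in (-1, 0xFFFFFFFF):
                    # ffffffff: the current touch point has been lifted
                    if current is not None:
                        points.pop(current, None)
                    current = None
                else:
                    current = value
                    points.setdefault(current, (None, None))
            elif code == ABS_MT_POSITION_X and current is not None:
                points[current] = (value, points[current][1])
            elif code == ABS_MT_POSITION_Y and current is not None:
                points[current] = (points[current][0], value)
        elif etype == EV_SYN and code == SYN_REPORT:
            pass  # end of one frame; a real app would act on `points` here
    return points

# One finger down at (500, 300) and lifted; a second finger stays at (100, 200)
seq = [(EV_ABS, ABS_MT_TRACKING_ID, 1), (EV_ABS, ABS_MT_POSITION_X, 500),
       (EV_ABS, ABS_MT_POSITION_Y, 300), (EV_SYN, SYN_REPORT, 0),
       (EV_ABS, ABS_MT_TRACKING_ID, 0xFFFFFFFF), (EV_SYN, SYN_REPORT, 0),
       (EV_ABS, ABS_MT_TRACKING_ID, 2), (EV_ABS, ABS_MT_POSITION_X, 100),
       (EV_ABS, ABS_MT_POSITION_Y, 200), (EV_SYN, SYN_REPORT, 0)]
print(track_touches(seq))
```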

3.3 Specific Function Demonstration

3.3.1 Viewing Input Devices

First, enter the /dev/input/ directory; you can see several input event nodes:

Input Device List

3.3.2 Identifying Touch Screen Device

Use the command getevent to view the input device corresponding to input events:

getevent Device Recognition

Obviously, the device named "touchscreen" is our touch screen.

3.3.3 Monitoring Touch Events

With the screen on, tap the screen, and the corresponding touch events appear in the terminal:

Raw Touch Events

3.3.4 More Readable Event Display

We test again with the more readable command getevent -l /dev/input/event*. After tapping the screen, the terminal returns the following information:

Readable Touch Events