AI & Technology

PDPC Issues Draft Personal Data Protection Guidelines for AI — How Organizations Using AI Must Adapt

The PDPC (สคส.) has released Thailand's first draft personal data protection guidelines for AI, designating organizations that use AI as data controllers and requiring them to manage data in strict compliance with the PDPA.

3 Mar 2026 · 5 min read · Mondaq
PDPA · AI Governance · สคส. · Data Protection

PDPA and AI — Now Formally Converging

In February 2026, Thailand’s Personal Data Protection Committee Office (PDPC) released the draft Personal Data Protection Guidelines for the Development and Use of Artificial Intelligence for public consultation across sectors.

It may sound technical, but in reality it directly affects every organization currently using AI — whether for customer service chatbots, customer data analytics systems, or even HR tools that use AI to screen applications.

Key Issues Organizations Need to Understand

1. Organizations that deploy AI are Data Controllers

The draft guidelines clearly state that organizations using AI with personal data are considered data controllers, meaning they bear the full set of responsibilities required under the PDPA, including consent, data minimization, and transparency.

This applies not only to tech companies building AI, but also to any organization that adopts and uses AI in practice.

2. Vendors that use data to train AI may be classified as Data Controllers

This is a critical issue for organizations using external AI services. If a vendor uses your customer data to train its model without a clearly defined agreement, that vendor may be reclassified as a data controller. This is something organizations should review in their contracts immediately.

3. What the guidelines cover

The draft focuses on four main areas:

  1. Consent — Clear consent must be obtained before using data in AI, not merely through general terms in an agreement
  2. Data Minimization — AI should use only the data that is genuinely necessary, rather than ingesting all available data for processing
  3. Transparency — Users must understand what decisions AI is making, how those decisions are made, and on what basis
  4. Data Subject Rights — Data subjects retain their rights to object, correct, or delete data used in AI systems
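The consent and data minimization requirements above can be pictured as a gate that sits in front of any AI system: verify a documented consent basis for the specific purpose, then strip the record down to only the fields the model genuinely needs. The sketch below is purely illustrative — the class, field names, and allow-list are assumptions, not anything defined in the PDPC draft.

```python
from dataclasses import dataclass, field

# Hypothetical pre-processing gate enforcing consent and data minimization
# before a record reaches an AI system. All names are illustrative.

@dataclass
class ConsentRecord:
    subject_id: str
    purposes: set = field(default_factory=set)  # purposes the subject consented to

# Fields deemed genuinely necessary for this particular AI use case
ALLOWED_FIELDS = {"age_band", "purchase_history"}

def prepare_for_ai(record: dict, consent: ConsentRecord, purpose: str) -> dict:
    """Return a minimized copy of `record`, or raise if consent is missing."""
    if purpose not in consent.purposes:
        # Consent must cover the specific AI purpose, not just general terms
        raise PermissionError(f"No consent for purpose: {purpose}")
    # Data minimization: keep only the fields the model actually needs
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
```

In this framing, transparency and data subject rights would hang off the same gate: every call to `prepare_for_ai` can be logged against the subject ID, giving an auditable record of what data entered which system and under which consent.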

Not Yet Legally Enforceable, but the Direction Is Clear

The draft does not yet have legal force, but the direction of future enforcement is already very clear. The PDPC is building a framework that can be used to assess whether the AI systems organizations use are compliant with the PDPA.

A key point of caution is that the PDPC’s Eagle Eye Crawler is already operating proactively, which means these guidelines may begin to serve as evaluation criteria even before they are formally enacted.

What Should Organizations Do Now?

If your organization uses AI anywhere in its operations, whether extensively or in limited ways, it should start with the following:

  • Conduct an AI Data Flow Audit — Identify where personal data flows through AI systems and what outputs it generates
  • Review Consent Coverage — Confirm whether existing consent covers AI-related use cases
  • Reassess Vendor Contracts — Check whether your AI vendor agreements clearly prohibit using your data to train models
  • Prepare Transparency Notices — Ensure users or customers receive clear explanations of how AI uses their data
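The first item in the checklist, an AI data flow audit, amounts to building an inventory of which personal-data fields feed which AI systems and flagging the ones with no documented consent basis. A minimal sketch, with made-up system names and fields (nothing here comes from the PDPC draft):

```python
# Hypothetical AI data-flow inventory: map each AI system to the
# personal-data fields it receives, then flag fields that lack a
# recorded consent basis. All names are illustrative.

ai_systems = {
    "support_chatbot": ["name", "email", "chat_history"],
    "hr_screener": ["cv_text", "education", "age"],
}

# Fields for which the organization has documented consent coverage
consent_covered = {"name", "email", "chat_history", "cv_text", "education"}

def find_gaps(systems: dict, covered: set) -> dict:
    """Return, per system, the fields lacking a documented consent basis."""
    return {
        system: [f for f in fields if f not in covered]
        for system, fields in systems.items()
        if any(f not in covered for f in fields)
    }
```

Even a simple table like this makes the remaining checklist items concrete: the gaps it surfaces tell you exactly which consents to collect, which vendor contracts to revisit, and which transparency notices to draft first.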

Enersys’ PrivacyHub helps manage the consent lifecycle from the outset, creates a data inventory that identifies which data flows into which AI systems, and builds an audit trail that demonstrates to the PDPC that the organization handles data responsibly — both before and after these guidelines become formally effective.


References: Mondaq | IAPP | Personal Data Protection Committee Office
