Thailand’s AI Law — It’s Really Coming
AI regulation in Thailand has been discussed since 2022–2023, but it is now taking much clearer shape. Thailand’s draft Artificial Intelligence Act is currently under consideration, and based on the direction signaled by relevant authorities, this draft is unlikely to be shelved.
Why Does Thailand Need a Separate AI Law if the PDPA Already Exists?
Many people may wonder: if the PDPA is already in place, why is another AI law necessary? The simple answer is that the PDPA protects personal data, while AI involves issues that go far beyond that.
Consider that AI can:
- Make decisions on credit or insurance eligibility
- Automatically screen job applications
- Analyze customer behavior for dynamic pricing
- Generate content that appears human-written (like this article... just kidding)
These issues go beyond personal data. They concern fairness, transparency, and accountability in the use of AI.
The EU AI Act — A Model Thailand Is Also Watching
Thailand’s draft AI law appears to draw inspiration from several elements of the EU AI Act, which will become fully applicable to high-risk AI systems on August 2, 2026, with penalties of up to €35 million or 7% of global annual turnover, whichever is higher.
What is particularly notable is that the EU AI Act uses a risk-based approach, classifying AI systems by level of risk:
- Unacceptable risk — such as social scoring or emotion recognition in the workplace → prohibited outright
- High risk — such as AI used in recruitment, credit, or insurance → subject to strict compliance requirements
- Limited risk — such as chatbots → users must be informed that they are interacting with AI
- Minimal risk — such as spam filters → no special obligations
Thailand is likely to adopt a similar approach, while adapting it to the local context.
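To make the risk-based approach concrete, the four tiers above can be sketched as a simple lookup. This is purely illustrative: the use-case names and tier assignments below are simplified assumptions for this article, and real classification under the EU AI Act (or Thailand's eventual law) requires legal analysis of the actual statutory annexes.

```python
from enum import Enum

class RiskTier(Enum):
    """The four EU AI Act risk tiers and their broad consequences."""
    UNACCEPTABLE = "prohibited outright"
    HIGH = "strict compliance requirements"
    LIMITED = "transparency obligations"
    MINIMAL = "no special obligations"

# Illustrative mapping of the example use cases from this article.
EXAMPLE_TIERS = {
    "social scoring": RiskTier.UNACCEPTABLE,
    "workplace emotion recognition": RiskTier.UNACCEPTABLE,
    "recruitment screening": RiskTier.HIGH,
    "credit scoring": RiskTier.HIGH,
    "customer chatbot": RiskTier.LIMITED,
    "spam filter": RiskTier.MINIMAL,
}

def classify(use_case: str) -> RiskTier:
    """Look up the illustrative tier for a known example use case."""
    return EXAMPLE_TIERS[use_case]
```

The point of the sketch is the structure, not the answers: once an organization maps each AI system to a tier, the obligations attached to that tier follow mechanically.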
3 Things Thai Organizations Should Start Preparing Now
1. Build an AI Inventory
Do you know what AI your organization is already using? Many companies use AI without fully realizing it—through subscribed SaaS platforms or tools employees adopt on their own (Shadow AI). Creating an inventory is the most important first step.
2. Assess Risk
Once you know what AI systems are in use, the next step is to assess how each one may affect people. Does it make decisions about individuals? Does it use sensitive data? Is there a risk of bias?
3. Establish an AI Governance Framework
There is no need to wait for the law to be enacted. Putting an internal framework in place now—defining how AI may be used, who is responsible, and what the review process should look like—is something organizations can do immediately. It will also make legal readiness much easier once the law arrives.
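The inventory and risk-assessment steps above can be sketched as a minimal record per AI system plus a screening rule. The field names, the screening questions, and the flagging logic are illustrative assumptions for this article, not requirements taken from the draft law.

```python
from dataclasses import dataclass

@dataclass
class AISystem:
    """One row in an organization's AI inventory (illustrative fields)."""
    name: str
    source: str                       # internal build, subscribed SaaS, or shadow AI
    purpose: str
    decides_about_individuals: bool   # e.g. hiring, credit, insurance decisions
    uses_sensitive_data: bool         # health, biometrics, etc. (PDPA-relevant)
    bias_risk: bool                   # could outputs disadvantage certain groups?

def needs_priority_review(system: AISystem) -> bool:
    """Flag a system if it answers 'yes' to any screening question."""
    return (system.decides_about_individuals
            or system.uses_sensitive_data
            or system.bias_risk)

# Hypothetical inventory entries for demonstration.
inventory = [
    AISystem("ResumeScreener", "SaaS", "shortlist job applicants",
             decides_about_individuals=True, uses_sensitive_data=False,
             bias_risk=True),
    AISystem("SpamFilter", "internal", "filter inbound email",
             decides_about_individuals=False, uses_sensitive_data=False,
             bias_risk=False),
]

flagged = [s.name for s in inventory if needs_priority_review(s)]
```

Even a spreadsheet version of this record is enough to start: what matters is that every AI system in use, including shadow AI, appears in the inventory and is run through the same screening questions.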
PDPA and AI — Still Closely Connected
Even if a separate AI law is introduced, personal data used in AI will always remain subject to the PDPA. Thailand’s Personal Data Protection Committee is also continuing to issue additional guidance on AI governance.
That is why getting PDPA compliance in order now is so important. When the AI law arrives, organizations with a strong PDPA foundation will be able to adapt much more quickly.
Enersys’ PrivacyHub helps organizations manage end-to-end PDPA compliance through a Zero-PII Architecture that does not store personal data at all. Combined with Genesis AI, it can help analyze compliance gaps and provide automated recommendations—preparing your organization for both PDPA obligations and the AI law that is on the horizon.
AI regulation is coming — is your organization ready for both Privacy and AI compliance? Start by getting your PDPA program in order with PrivacyHub and consult the Genesis AI Platform to prepare for both in parallel.
If you need specialized advice on AI Governance and PDPA Compliance for your organization, contact the Enersys team at enersys.co.th/th/contact-us