
EU AI Act 2026: What French SMBs Must Do Before August

10 min read March 23, 2026 by Ludovic

If you run a small or medium-sized business in France and use any AI tools — even a basic chatbot or a resume screening system — the EU AI Act applies to you. And the deadline is closing in: August 2, 2026.

The bad news: penalties can reach 35 million euros or 7% of global annual turnover for the most serious violations. The good news: for the vast majority of SMBs, compliance is entirely achievable, as long as you don't wait until the last minute.

This article is not a law lecture. It's a practical checklist for business owners who want to understand what they actually need to do in the next few months.

The Timeline: Where Are We Now?

The EU AI Act entered into force on August 1, 2024. It's being implemented in phases:

  • February 2025: Ban on AI practices deemed unacceptable risk (cognitive manipulation, social scoring, real-time biometric identification in public spaces)
  • August 2025: Obligations for general-purpose AI models (GPAI), plus the governance and penalty provisions
  • August 2, 2026: General application: transparency obligations for limited-risk systems and the full set of obligations for high-risk AI systems listed in Annex III take effect

Important note: the European Commission proposed a "Digital Omnibus" package in late 2025 that could postpone certain high-risk obligations for Annex III systems until December 2027. But it's not confirmed. Don't count on it. Plan for August 2026.

The 4 Risk Levels: Where Does Your Business Fall?

The AI Act classifies AI systems into four categories. Your first task is figuring out which category your tools fall into.

Unacceptable Risk (Banned)

These systems have been prohibited since February 2025:

  • Behavioral manipulation targeting vulnerabilities (age, disability)
  • Social scoring systems
  • Biometric categorization based on sensitive characteristics (sexual orientation, political opinions)
  • Untargeted scraping of facial images for recognition databases

For your SMB: unless you operate in a very specific domain, you're probably not affected here. But double-check: emotion recognition in the workplace is prohibited, and marketing tools that exploit users' vulnerabilities can also cross the line.

High Risk (Regulated)

This is the heaviest category in terms of obligations. It covers AI systems used in:

  • Recruitment and HR: resume screening, candidate evaluation, promotion decisions
  • Access to essential services: credit scoring, insurance assessment
  • Education: automated grading, student guidance and placement decisions
  • Critical infrastructure: water, electricity, and transport management
  • Law enforcement and justice

For your SMB: if you use an AI tool to screen resumes or evaluate candidates, you're potentially in this category. This is the most common high-risk scenario for SMBs.

Limited Risk (Transparency Required)

These systems primarily require transparency obligations:

  • Chatbots: users must know they're talking to an AI
  • Deepfakes and generated content: must be labeled as AI-generated
  • Emotion recognition systems: affected individuals must be informed

For your SMB: if you have a chatbot on your website or use AI to generate marketing content, you're here. The obligations are light but real.

Minimal Risk (No Constraints)

The majority of AI uses in SMBs: spam filters, basic product recommendations, productivity tools (AI spell-check, text summarization, etc.).

No specific obligations, though best practices are still recommended.

The 7 Concrete Steps to Compliance

Step 1: Inventory Your AI Tools (April 2026)

First, you need to know what you're using. Catalog ALL tools incorporating AI in your business:

  • Recruitment tools (Indeed, LinkedIn Recruiter, ATS solutions)
  • Chatbots and virtual assistants
  • Marketing and CRM tools with built-in AI
  • Accounting or financial solutions with AI
  • Productivity tools (Copilot, ChatGPT, Claude, etc.)

Deliverable: a spreadsheet with the tool name, its use case, the vendor, and the data processed.

Estimated cost: 0 euros (internal work) to 2,000–5,000 euros if you hire a consultant.
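
If it helps to go beyond a plain spreadsheet, here is a minimal sketch of what one inventory entry could look like in code; the field names and example values are purely illustrative, not a prescribed schema:

```typescript
// Minimal sketch of an AI tool inventory entry; field names are illustrative only.
interface AiToolRecord {
  name: string;             // e.g. "Website chatbot"
  vendor: string;           // who supplies the tool
  useCase: string;          // what you actually use it for
  dataProcessed: string[];  // categories of data the tool touches
  riskCategory?: "unacceptable" | "high" | "limited" | "minimal"; // filled in at Step 2
}

const inventory: AiToolRecord[] = [
  {
    name: "Resume screening module in our ATS",
    vendor: "Example ATS vendor (placeholder)",
    useCase: "Pre-filtering job applications",
    dataProcessed: ["CVs", "cover letters", "contact details"],
    // riskCategory is deliberately left empty until Step 2
  },
];
```

The risk category stays empty at this stage on purpose: the inventory and the classification are two separate passes.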

Step 2: Classify the Risk Level (April–May 2026)

For each tool identified, determine its risk category. The European Commission provides an AI Act Compliance Checker specifically designed to help SMBs and startups.

Pro tip: most of your SaaS software vendors should be communicating about their tools' classification by now. Ask them. If they can't answer, that's a red flag.

Step 3: For Limited-Risk Systems — Implement Transparency (May 2026)

If you have chatbots or generate content with AI:

  • Add a clear notice: "This content is generated by artificial intelligence" or "You are chatting with an AI assistant"
  • Update your terms of service and privacy policy
  • Inform your users in a visible, understandable way

Estimated cost: 500–2,000 euros (legal + technical updates).
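
For a website chatbot, the notice can be as simple as a short line shown before the first message. Here is a minimal sketch, assuming a hypothetical container element ID; the wording and styling are yours to adapt:

```typescript
// Minimal sketch: display an AI disclosure before the chatbot's first message.
// The element ID and the notice text are illustrative assumptions.
function showAiDisclosure(containerId: string): void {
  const container = document.getElementById(containerId);
  if (!container) return;

  const notice = document.createElement("p");
  notice.className = "ai-disclosure";
  notice.textContent =
    "You are chatting with an AI assistant. A human colleague can take over on request.";
  container.prepend(notice);
}

// Call once, before the chat widget renders.
showAiDisclosure("chat-widget");
```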

Step 4: For High-Risk Systems — Enhanced Obligations (May–July 2026)

If you're affected (notably recruitment), the obligations are heavier:

  • Risk management system: document the risks associated with the system
  • Data governance: ensure training data is relevant, representative, and free from excessive bias
  • Technical documentation: your vendor must provide it — ask for it
  • Record-keeping: maintain usage logs for at least 6 months
  • Human oversight: a human must be able to intervene in the system's decisions
  • Accuracy and robustness: the system must function reliably

Estimated cost: 5,000–15,000 euros for comprehensive legal and technical support.
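
To make the record-keeping and human oversight points less abstract, here is a minimal sketch of what a usage log entry for an AI-assisted screening decision could contain. The structure is an illustration: the regulation requires logs and human oversight, but does not impose this exact shape.

```typescript
// Minimal sketch of a usage log entry for an AI-assisted screening decision.
// Field names are illustrative; the AI Act requires logs, not this exact shape.
interface ScreeningLogEntry {
  timestamp: string;                    // ISO date of the decision
  toolName: string;                     // which AI system produced the suggestion
  candidateRef: string;                 // internal reference, not raw personal data
  aiSuggestion: "advance" | "reject" | "review";
  humanDecision: "advance" | "reject";  // a person makes the final call
  reviewer: string;                     // who exercised human oversight
  overrodeAi: boolean;                  // true when the human disagreed with the system
}

const entry: ScreeningLogEntry = {
  timestamp: new Date().toISOString(),
  toolName: "ATS resume screening module",
  candidateRef: "2026-0042",
  aiSuggestion: "reject",
  humanDecision: "advance",
  reviewer: "HR manager",
  overrodeAi: true,
};
```

Keep entries like this for at least the six-month retention period mentioned above, and make sure the designated reviewer actually has the authority and the time to override the system.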

Step 5: Designate an AI Responsible Person (May 2026)

You don't need to hire someone new. Appoint an internal point of contact (often the DPO if you have one, otherwise the quality manager or the business owner themselves) who will handle AI-related questions.

Estimated cost: 0 euros (internal reorganization).

Step 6: Train Your Teams (June–July 2026)

Article 4 of the AI Act imposes an AI literacy obligation for all staff who use or supervise AI systems, and it has applied since February 2025. This is frequently underestimated.

Concretely, your employees need to understand:

  • What the AI tool they use actually does
  • Its limitations
  • Associated risks
  • How to report problems

Estimated cost: 1,000–3,000 euros for a half-day to full-day group training session.

Step 7: Document and Archive (July 2026)

Build a compliance file that includes:

  • Your AI tools inventory
  • Risk classification for each
  • Measures taken per category
  • Proof of staff training
  • Transparency policies implemented

This file is your shield in case of an audit.

How Much Does This Actually Cost?

Let's be honest about the numbers for a typical 50-person SMB:

Item                           Low End        High End
Inventory and classification   0 euros        5,000 euros
Limited-risk compliance        500 euros      2,000 euros
High-risk compliance           5,000 euros    15,000 euros
Staff training                 1,000 euros    3,000 euros
Legal support                  2,000 euros    8,000 euros
Total                          8,500 euros    33,000 euros

If you only use minimal and limited-risk tools (the most common case), expect to spend 3,000 to 8,000 euros all-in. That's a reasonable investment, especially compared to potential fines.

Good news: SMBs benefit from specific relief provisions in the regulation. And each EU member state must establish at least one "AI regulatory sandbox" by August 2026 to help businesses test their systems in a safe environment.

What Your Vendors Must Do (And What You Should Verify)

A crucial point many SMBs overlook: if you use a high-risk AI system, you're considered a "deployer" under the regulation. You have obligations even though you didn't build the tool.

But your vendor has even heavier obligations as a "provider." Ask them these questions:

  1. Is your system classified under the AI Act? Which category?
  2. Can you provide the required technical documentation?
  3. Does (or will) the system carry the CE marking?
  4. How do you handle transparency obligations?
  5. What human oversight mechanisms are in place?

If your vendor can't answer these questions, it might be time to look for a new one.

My Advice: Don't Panic, But Don't Sleep On It Either

The AI Act isn't designed to prevent SMBs from using AI. It's designed to put guardrails in place. For 80% of SMBs using common AI tools (chatbots, productivity tools, marketing automation), the obligations boil down to transparency and training.

The most common mistakes I see:

  1. Ignoring the topic thinking "it's only for big companies" — wrong
  2. Panicking and stopping everything — unnecessary and costly
  3. Blindly trusting vendors who say "we've got it covered" without documentation
  4. Waiting until the last minute — now is the time to start

Need a clear assessment of where you stand with the AI Act?

I offer a pragmatic AI compliance audit designed for SMBs: no unnecessary legal jargon, just a concrete action plan with priorities and cost estimates.

Get in touch to discuss it