What does the EU AI Act mean for start-ups and SMEs?
The European Union (EU) is known for its pioneering role in technology regulation. With the advent of artificial intelligence (AI), the EU has once again taken the lead with the AI Act. This legislation entered into force on August 1, 2024, with its obligations applying in phases over the following years, and aims to guide the development and use of AI with a focus on safety and fundamental rights. But what does this law mean for start-ups and SMEs active in the AI sector?
In this blog post, we discuss the impact of the EU AI Act on this important group of companies. We analyze the provisions most relevant to them, identify potential benefits and drawbacks, and examine the supporting measures available.
Risk Classification under the AI Act
The AI Act adopts a risk-based approach and classifies AI systems into four categories:
Unacceptable risk
AI systems that pose a threat to individuals and conflict with the EU's fundamental rights and values. Examples include social scoring systems and AI systems that manipulate people.
High risk
AI systems used in critical sectors that can pose a significant risk to health, safety, or fundamental rights. Examples include AI in medical devices, AI in self-driving cars, and AI in recruitment processes.
Limited risk
AI systems that can cause confusion or deception for users. Examples include chatbots and deepfakes.
Minimal risk
AI systems that pose minimal or no risk. Examples include spam filters and AI-enabled video games.
It is crucial for start-ups and SMEs to understand these classifications, as obligations under the AI Act vary depending on the risk level of their AI systems.
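To make this triage concrete, the sketch below shows one way a start-up might record the risk tier of each of its AI systems and look up the obligations it then needs to plan for. The tier names follow the Act, but the obligation summaries and the helper function are illustrative assumptions, not legal definitions.

```python
from enum import Enum

class RiskTier(Enum):
    """The four risk tiers of the EU AI Act (labels only, for internal tracking)."""
    UNACCEPTABLE = "unacceptable"  # prohibited practices, e.g. social scoring
    HIGH = "high"                  # e.g. AI in medical devices or recruitment
    LIMITED = "limited"            # transparency duties, e.g. chatbots, deepfakes
    MINIMAL = "minimal"            # e.g. spam filters, AI-enabled games

# Rough, non-exhaustive summary of what each tier implies for a provider.
# This mapping is an assumption for illustration; always check the Act itself.
OBLIGATIONS = {
    RiskTier.UNACCEPTABLE: ["may not be placed on the EU market"],
    RiskTier.HIGH: [
        "technical documentation (simplified form available for SMEs)",
        "conformity assessment",
        "registration in the EU database",
    ],
    RiskTier.LIMITED: ["inform users they are interacting with AI / label AI-generated content"],
    RiskTier.MINIMAL: ["no specific obligations under the Act"],
}

def obligations_for(tier: RiskTier) -> list[str]:
    """Return the illustrative obligation summary for a given risk tier."""
    return OBLIGATIONS[tier]

if __name__ == "__main__":
    # Example: a CV-screening tool would typically fall in the high-risk tier.
    for item in obligations_for(RiskTier.HIGH):
        print("-", item)
```

A mapping like this is no substitute for legal analysis, but it forces a team to decide, system by system, which tier they believe applies and which obligations follow from that choice.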
Specific Provisions for Start-ups and SMEs
While the EU AI Act primarily aims to regulate AI systems based on risk level, the legislation recognizes the important role start-ups and SMEs play in driving innovation and economic growth. To prevent these companies from being disproportionately affected by compliance costs, the EU has included several specific provisions to ease the burden.
A key aspect is the simplified form of technical documentation that start-ups and SMEs may use. This reduces administrative burdens and makes it easier for smaller companies to meet the requirements. Notified bodies must accept this simplified documentation when carrying out conformity assessments.
Obligations for Providers and Users of High-Risk AI Systems
Providers of high-risk AI systems must draw up technical documentation that meets the requirements of Annex IV of the AI Act. SMEs and start-ups may instead use equivalent documentation in a simplified form, provided it is accepted by the competent national authority.
Certain users (deployers) of high-risk AI systems, in particular public bodies and private entities providing public services, must conduct a fundamental rights impact assessment (FRIA). However, the AI Act recognizes that conducting an FRIA can be a heavy burden for SMEs and start-ups, and therefore they are exempt from the requirement to consult with stakeholders during this process.
Exceptions
The AI Act provides several exceptions to the regulations. A significant exception applies to AI systems released under a "free and open-source license". These systems fall outside most of the Act's obligations unless they are placed on the market as high-risk AI systems (such as medical devices or AI used in law enforcement), fall under the prohibited practices, or are subject to transparency obligations.
However, this exception does not apply to "general-purpose AI models" (GPAI) with "systemic risks". Providers of GPAI models with high impact potential must comply with the regulations, regardless of whether they are open source.
General Obligations
In addition to the specific provisions above, providers of high-risk AI systems must register their systems in the EU database before placing them on the market or putting them into service.
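A small provider of a high-risk system might keep a simple tracker for the obligations discussed in this section, as in the sketch below. The field names and the specific items are assumptions chosen for illustration, not an official checklist from the Act.

```python
from dataclasses import dataclass

@dataclass
class HighRiskComplianceChecklist:
    """Illustrative tracker for the high-risk obligations discussed above."""
    system_name: str
    technical_documentation_done: bool = False  # Annex IV, or the simplified SME form
    conformity_assessment_done: bool = False
    registered_in_eu_database: bool = False     # required before placing on the market
    fria_done: bool = False                     # relevant for certain deployers

    def outstanding(self) -> list[str]:
        """Return the checklist items that are still open."""
        items = {
            "technical documentation": self.technical_documentation_done,
            "conformity assessment": self.conformity_assessment_done,
            "EU database registration": self.registered_in_eu_database,
            "fundamental rights impact assessment": self.fria_done,
        }
        return [name for name, done in items.items() if not done]

# Example usage: a start-up that has finished its documentation but nothing else.
checklist = HighRiskComplianceChecklist(
    system_name="cv-screening-tool",
    technical_documentation_done=True,
)
print(checklist.outstanding())
```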
Benefits of the EU AI Act for Start-ups and SMEs
The EU AI Act offers several potential benefits for start-ups and SMEs:
Competitive advantage: By complying with regulations, SMEs can differentiate themselves by promoting ethical AI practices and attracting customers who prioritize responsible AI use and data protection. This can give them an edge over competitors who do not meet the strict requirements of the AI Act.
Market credibility and trust: Compliance with the AI Act can enhance credibility, attract partnerships and investment opportunities, and boost customer confidence in data security and ethical AI implementation. In a market where trust is increasingly important, this can be a decisive factor for success.
Long-term stability: The regulatory framework aims to create a stable environment for AI development, giving SMEs a clear legal framework for long-term planning and business stability. This provides certainty and encourages investment in AI innovation.
Alignment with global standards: Compliance with the EU AI Act can align SMEs with emerging global standards for AI regulation, facilitating expansion into international markets with similar regulations. The EU AI Act has the potential to become a global standard for AI regulation, and by complying with requirements now, SMEs can position themselves as leaders in responsible AI.
Disadvantages and Challenges
Along with the benefits, the EU AI Act also brings several disadvantages and challenges for start-ups and SMEs:
Financial Challenges
Increased development costs: Meeting the requirements of the AI Act can raise costs for AI system developers, which can put SMEs and start-ups at a considerable disadvantage compared to large companies with more financial resources. For start-ups with limited funding, this can be a major barrier to developing AI solutions.
Competitive Challenges
Impact on competition: Some SMEs worry that reporting requirements and transparency obligations could put EU companies at a competitive disadvantage compared to companies in regions with less stringent AI regulation. Stricter requirements can also lead to longer approval timelines and higher prices for consumers.
Extended time-to-market: Complying with EU AI Act requirements can increase administrative burdens and delay the launch of new AI products. This can be particularly problematic for start-ups that need to innovate quickly to compete.
Impact on investments: The increased regulatory pressure may deter investors and hinder the growth of AI startups in Europe. Investors might choose start-ups in regions with less stringent regulations, reducing European start-ups' access to capital.
Implementation Challenges
Potential delays: There is a risk of delays in implementing the AI Act, creating legal uncertainty around AI for SMEs and affecting their ability to navigate and comply with new regulations. This uncertainty can lead to hesitation among SMEs to invest in AI.
Compliance cost concerns: Organizations fear that the proposed self-regulation could shift compliance responsibility to SMEs, resulting in high compliance costs and potential barriers to AI adoption. This could lead to a situation where only large companies can afford AI.
Copyright Considerations
The AI Act also addresses copyright in relation to the data used to train AI systems. Providers of general-purpose AI models must put in place a policy to comply with EU copyright law and publish a sufficiently detailed summary of the content used for training. This transparency is particularly important for SMEs developing AI systems, as it helps them manage intellectual property risks and avoid potential legal disputes.
Supporting Measures
The EU and member states recognize the challenges faced by start-ups and SMEs and have implemented various supporting measures:
Priority access to AI sandboxes: SMEs receive priority access to AI regulatory sandboxes. These are controlled environments, supervised by the competent national authorities, where they can develop, test, and refine their AI systems before bringing them to market, while demonstrating compliance with the regulations.
Training and advice: The AI Act provides guidelines, training sessions, and dedicated communication channels to support SMEs in compliance. Member states must organize specific awareness and training activities tailored to SME needs, provide advice, and answer questions. National competent authorities can also provide guidance and advice to SMEs, taking into account EU-level guidelines.
Financial support: The European Commission has launched a program to help start-ups and SMEs develop AI systems that comply with EU requirements. This includes making supercomputers available for training AI models.
Reduced conformity assessment costs: Notified bodies must reduce their fees for conformity assessment procedures in proportion to the size and needs of SMEs. This lowers the financial threshold for SMEs to comply with AI Act requirements.
Involvement in standardization: Member states must facilitate SME participation in the AI standards development process. This ensures SME voices are heard in shaping the future of AI regulation.
Consideration of SME interests in sanctions: Member states must consider SME interests when determining sanctions under the AI Act. This ensures sanctions are proportionate to SME size and resources.
An Important Step Towards Responsible AI in Europe
The EU AI Act represents a significant step towards responsible AI in Europe. For start-ups and SMEs active in the AI sector, the legislation brings both opportunities and challenges. On one hand, the AI Act offers the opportunity to stand out with ethical AI practices, build customer trust, and benefit from a stable regulatory framework. On the other hand, start-ups and SMEs must prepare for compliance costs, competitive pressure, and potential delays associated with the new regulations.
Fortunately, numerous supporting measures are available to help start-ups and SMEs implement the AI Act. By utilizing these measures, such as AI sandboxes, training programs, and financial support, these companies can leverage the benefits of the AI Act and strengthen their position in the rapidly growing AI market.
Test Your Knowledge 🎯
Now that you have an overview of the impact of the EU AI Act on start-ups and SMEs, are you ready to test your knowledge?