
Test Editor - AI Features

Shiplight AI's Test Editor leverages artificial intelligence to streamline test creation, maintenance, and execution. This guide covers all AI-powered capabilities.

Table of Contents

  1. AI Mode Toggle
  2. Natural Language Test Writing
  3. AI-Powered Test Generation
  4. Test Generation Status
  5. Natural Language Patterns
  6. AI-Powered Assertions
  7. Converting Between AI and Fast Modes
  8. AI Best Practices
  9. AI Limitations and Considerations
  10. Advanced AI Features
  11. Troubleshooting AI Features
  12. Next Steps

AI Mode Toggle

Each action step (except Code) features an AI toggle switch that enables dynamic, natural language test authoring.

Enabling AI Mode

Look for the toggle switch on each action:

  • AI On (✨ icon) - Dynamic mode enabled, natural language processing active
  • AI Off (standard icon) - Fast mode enabled, uses cached actions

Understanding the Difference

Fast Mode (Performance-Optimized)

  • Uses cached, pre-generated Playwright actions
  • Relies on fixed element selectors determined at test creation time
  • Executes quickly without re-evaluation
  • Best for stable applications with consistent DOM structure
  • May fail if element IDs or locators change dynamically

AI Mode / Dynamic Mode (Flexibility-Optimized)

  • Evaluates the action description against the current browser state
  • Dynamically identifies the best element to interact with
  • Adapts to changing element IDs, classes, and DOM structure
  • Handles modern web applications with dynamic content
  • Trades some execution speed for reliability and adaptability
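The contrast between the two modes can be sketched in Python. This is illustrative only: the class names and the word-overlap heuristic are invented for this example, and Shiplight AI's real element matching is model-driven and far richer.

```python
class FastModeAction:
    """Replays a selector that was cached when the test was created."""

    def __init__(self, description, cached_selector):
        self.description = description
        self.cached_selector = cached_selector  # fixed at creation time

    def resolve(self, page_elements):
        # Fails outright if the cached selector no longer matches anything.
        if self.cached_selector not in page_elements:
            raise LookupError(f"selector {self.cached_selector!r} not found")
        return self.cached_selector


class AIModeAction:
    """Re-evaluates the description against the current page state."""

    def __init__(self, description):
        self.description = description

    def resolve(self, page_elements):
        # Pick the element whose label shares the most words with the
        # description -- a toy stand-in for dynamic element detection.
        words = set(self.description.lower().split())
        return max(page_elements,
                   key=lambda el: len(words & set(el.lower().split())))


# The DOM changed: the cached id is gone, but an equivalent button exists.
elements = ["button: submit order", "link: help center"]
ai = AIModeAction("Click the Submit Order button")
print(ai.resolve(elements))  # → button: submit order

fast = FastModeAction("Click the Submit Order button", "#submit-btn")
try:
    fast.resolve(elements)
except LookupError as exc:
    print("Fast Mode failed:", exc)
```

The same page change that breaks the cached selector is absorbed by the dynamic lookup, which is the trade-off the bullet lists above describe.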

When to Use Each Mode

Use Fast Mode when:

  • Your application has stable, consistent element IDs
  • Performance is critical
  • DOM structure rarely changes
  • Running high-frequency regression tests
  • Debugging specific selector issues

Use AI Mode (Dynamic Mode) when:

  • Testing modern SPAs with dynamic content
  • Element IDs/classes change between sessions
  • DOM structure varies based on user state
  • Building tests for evolving applications
  • Prioritizing test stability over speed

Natural Language Test Writing

Action Descriptions

Write test steps in plain English. The AI understands common testing terminology and converts it to executable code.

Examples of AI-powered descriptions:

Navigation:

Navigate to the login page
Go to the user dashboard
Open the settings menu

Interactions:

Click the submit button
Enter "john@example.com" in the email field
Select "Premium" from the subscription dropdown
Upload the invoice.pdf file

Assertions:

Verify the success message appears
Check that the total price is $99.99
Ensure the error message is not displayed
Confirm we are on the dashboard page

Context-Aware Intelligence

The AI understands:

  • Page context - Current URL and page state
  • Previous actions - What happened in earlier steps
  • Test goal - The overall objective of the test
  • Application patterns - Common UI patterns and workflows

AI-Powered Test Generation

From Test Goals

When creating a new test, provide a goal and the AI generates the complete test flow:

  1. Enter test goal: "Verify user can complete checkout with credit card"
  2. AI generates:
    • Navigation to product page
    • Adding items to cart
    • Proceeding to checkout
    • Filling payment information
    • Verifying order confirmation

Group Expansion

Use AI to expand high-level groups into detailed actions:

High-level group:

Group: Complete user registration

AI expands to:

- Click "Sign Up" button
- Fill in first name field
- Fill in last name field
- Enter email address
- Create password
- Confirm password
- Accept terms and conditions
- Click "Create Account"
- Verify welcome message

Test Generation Status

When AI is generating test content, the test status shows "Generating".

Generation Process

  1. Initialization - AI analyzes the test goal
  2. Planning - Creates high-level test structure
  3. Implementation - Generates detailed actions
  4. Validation - Ensures test completeness
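The four phases above can be mirrored as a simple pipeline. Every function body below is a placeholder invented for illustration; it is not the real generator.

```python
# Placeholder pipeline mirroring the four generation phases.

def analyze(goal):
    return {"goal": goal}                        # 1. Initialization

def plan_structure(context):
    return ["navigate", "act", "verify"]         # 2. Planning

def implement(plan):
    return [f"step: {phase}" for phase in plan]  # 3. Implementation

def validate(actions):
    return len(actions) > 0                      # 4. Validation

def generate_test(goal):
    actions = implement(plan_structure(analyze(goal)))
    if not validate(actions):
        raise ValueError("generation produced an empty test")
    return actions

print(generate_test("Verify user can complete checkout with credit card"))
# → ['step: navigate', 'step: act', 'step: verify']
```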

Live Generation View

During generation, you can see:

  • Real-time progress updates
  • AI decision-making process
  • Preview of generated steps
  • Network activity and browser state

Natural Language Patterns

When using AI Mode, the system interprets your natural language descriptions to determine appropriate actions. Here are some common patterns and what they typically accomplish:

Common Description Patterns

Interactions:

  • "Click the submit button" → Clicking actions
  • "Type email@example.com" → Text input actions
  • "Select premium plan" → Dropdown selections
  • "Check the terms checkbox" → Checkbox interactions

Navigation:

  • "Go to the dashboard" → Page navigation
  • "Navigate to settings" → URL changes
  • "Go back to previous page" → Browser navigation

Waiting and Timing:

  • "Wait for the page to load" → Loading states
  • "Wait until results appear" → Element visibility
  • "Pause for animation to complete" → Timing delays

File Operations:

  • "Upload invoice.pdf" → File uploads
  • "Attach the document" → File attachments

Verifications:

  • "Verify success message appears" → Assertions
  • "Check that total is $99" → Value validations
  • "Ensure error is not shown" → Negative assertions

These are just examples; the AI understands many variations and phrasings. Write naturally, and the AI will interpret your intent.
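As a rough mental model, the pattern families above behave like a keyword classifier. The sketch below uses a regular-expression rule table purely for illustration; the real interpreter is model-driven, not rule-based.

```python
import re

# Keyword-to-action table invented for illustration only.
PATTERNS = [
    (r"\b(click|press|tap)\b", "click"),
    (r"\b(type|enter|fill)\b", "input"),
    (r"\b(select|choose)\b", "select"),
    (r"\b(go|navigate|open)\b", "navigate"),
    (r"\b(wait|pause)\b", "wait"),
    (r"\b(upload|attach)\b", "upload"),
    (r"\b(verify|check|ensure|confirm)\b", "assert"),
]

def classify(description):
    text = description.lower()
    for pattern, action in PATTERNS:
        if re.search(pattern, text):
            return action
    return "unknown"

print(classify("Click the submit button"))         # → click
print(classify("Wait until results appear"))       # → wait
print(classify("Verify success message appears"))  # → assert
```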

AI-Powered Assertions

Smart Verification

AI assertions understand various verification patterns:

Content Verification:

Verify the page shows "Welcome back, John"
Check that the price is correctly calculated
Ensure no error messages are displayed

State Verification:

Confirm the submit button is enabled
Verify the modal dialog is visible
Check that the checkbox is selected

Navigation Verification:

Ensure we're on the success page
Verify the URL contains "/dashboard"
Check that we've been redirected to login
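The three verification families can be modeled as checks against a snapshot of page state. The page dictionary and helper names below are invented for this sketch; none of this is Shiplight AI's actual API.

```python
# Invented page snapshot illustrating the three verification families.
page = {
    "url": "https://app.example.com/dashboard",
    "text": "Welcome back, John",
    "elements": {"submit": {"enabled": True}, "modal": {"visible": True}},
}

def verify_content(page, expected):
    """Content verification: expected text appears on the page."""
    return expected in page["text"]

def verify_state(page, name, prop):
    """State verification: an element property holds (enabled, visible...)."""
    return page["elements"][name][prop]

def verify_navigation(page, fragment):
    """Navigation verification: the URL contains a fragment."""
    return fragment in page["url"]

assert verify_content(page, "Welcome back, John")
assert verify_state(page, "submit", "enabled")
assert verify_navigation(page, "/dashboard")
print("all verifications passed")
```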

Converting Between AI and Fast Modes

AI to Fast Mode Conversion

When switching from AI to Fast mode:

  1. The AI-generated action is preserved
  2. The action becomes a cached, fixed selector action
  3. Parameters are extracted and made editable

Fast Mode to AI Conversion

When switching from Fast mode to AI mode:

  1. The action description is preserved
  2. The action will now be evaluated dynamically
  3. AI interprets the description on next execution

Preserving Intent

The system maintains test intent during conversions:

  • Descriptions remain unchanged
  • Test logic is preserved
  • Only the execution method changes
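Conceptually, the conversion only swaps the execution method around an unchanged description. The action records and field names below are hypothetical, invented to illustrate the round trip:

```python
# Hypothetical action records; only the execution method changes.

def to_fast_mode(action):
    # Cache the selector resolved on the last AI run; keep the description.
    return {"mode": "fast",
            "description": action["description"],
            "selector": action.get("last_resolved_selector")}

def to_ai_mode(action):
    # Drop the cached selector; the description is re-interpreted at run time.
    return {"mode": "ai", "description": action["description"]}

ai_action = {"mode": "ai",
             "description": "Click the login button",
             "last_resolved_selector": "#login-btn"}

fast_action = to_fast_mode(ai_action)
round_trip = to_ai_mode(fast_action)
assert round_trip["description"] == ai_action["description"]  # intent preserved
```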

AI Best Practices

Writing Effective Descriptions

Be Specific About Intent:

Good: "Click the Submit Order button"
Poor: "Click button"

Avoid Element Identifiers:

Good: "Click the login button"
Poor: "Click #login-btn or button.submit-class"

Both Business and Technical Language Work:

Business: "Complete the checkout process with express shipping"
Technical: "Submit the form and select express shipping option"
Both are equally valid; use whichever feels natural.

Let AI Find the Elements:

Good: "Enter the email address"
Poor: "Enter text in input[type='email'] field"

When to Use AI Mode

Ideal for:

  • Rapid test creation
  • Non-technical team members
  • Exploratory testing
  • High-level test design
  • Complex workflows

Consider Manual Mode for:

  • Precise selector targeting
  • Performance-critical tests
  • Debugging specific issues
  • Custom JavaScript logic
  • Integration with CI/CD

Optimizing AI Performance

Provide Context:

  • Set clear test goals
  • Use descriptive test names
  • Add comments for complex logic

Leverage Learning:

  • AI improves with usage
  • Consistent terminology helps
  • Feedback improves accuracy

Structure Tests Well:

  • Bundle related actions into groups
  • Use clear section breaks
  • Maintain logical flow

AI Limitations and Considerations

Current Limitations

  • Cannot generate complex custom JavaScript
  • May need manual refinement for edge cases
  • Requires clear, unambiguous descriptions
  • Limited to web-based interactions

Fallback Strategies

When AI struggles:

  1. Switch to manual mode for specific steps
  2. Use Code blocks for complex logic
  3. Break complex actions into simpler steps
  4. Provide more specific descriptions

Performance Considerations

  • AI processing adds overhead compared to Fast Mode
  • Generation time varies with complexity
  • Each AI action requires real-time evaluation

Advanced AI Features

Auto-Healing

Shiplight AI automatically recovers from failures by switching to AI mode when Fast Mode actions fail. This happens in two scenarios:

1. Test Editor Auto-Healing

When debugging in the Test Editor:

  • If a Fast Mode action fails, the editor automatically switches to AI Mode
  • The AI Mode toggle visibly changes in the UI
  • The test flow is marked as "modified"
  • The action retries with AI's dynamic element detection
  • You can then:
    • Save the change to keep the action in AI Mode
    • Revert to return to Fast Mode
    • Modify the action description to fix the Fast Mode execution

2. Cloud Execution Auto-Healing

When tests run in the cloud:

  • If a Fast Mode action fails, it automatically retries in AI Mode
  • If AI Mode succeeds, the test continues execution
  • The test itself is NOT modified - the action remains in Fast Mode for future runs
  • If AI Mode also fails, the test stops and reports the failure
  • This provides resilience without changing your test configuration
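The cloud behavior amounts to a one-shot retry in AI Mode that never mutates the stored action. In the sketch below, the run_fast and run_ai callables are stand-ins invented for illustration, not the real executors:

```python
def run_with_auto_healing(action, run_fast, run_ai):
    try:
        return run_fast(action), "fast"
    except Exception:
        # One retry in AI Mode; the stored action is NOT modified.
        # If this also raises, the failure propagates and the test stops.
        return run_ai(action), "ai (healed)"

def flaky_fast(action):
    raise LookupError("cached selector no longer matches")

def dynamic_ai(action):
    return f"resolved {action!r} dynamically"

result, mode = run_with_auto_healing("Click the submit button",
                                     flaky_fast, dynamic_ai)
print(mode)  # → ai (healed)
```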

Intelligent Test Maintenance

AI automatically adapts tests when:

  • UI elements change location
  • Text content updates
  • Page structure evolves
  • Workflows are modified

Smart Wait Strategies

AI determines optimal wait conditions:

  • Waits for specific elements
  • Detects loading states
  • Handles animations
  • Manages network delays
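Under the hood, a wait strategy is essentially a polling loop over a condition. A minimal sketch, with invented timings; the real strategies also track loading states, animations, and network delays:

```python
import time

def wait_for(condition, timeout=5.0, interval=0.05):
    """Poll a condition until it holds or the timeout elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval)
    return False

# Simulate an element that "appears" after a short delay.
appears_at = time.monotonic() + 0.2
assert wait_for(lambda: time.monotonic() >= appears_at)
print("element appeared within the timeout")
```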

Context Awareness

During a single test session, AI maintains context:

  • Understands previous actions in the current test
  • Tracks the current page state
  • Aware of elements already interacted with
  • Maintains variable values within the session

Troubleshooting AI Features

Common Issues

AI Not Understanding Description:

  • Make description more specific
  • Use standard testing terminology
  • Break into smaller steps
  • Switch to manual mode if needed

Incorrect Action Generated:

  • Review and refine description
  • Check for ambiguous language
  • Provide more context
  • Use manual mode for precision

Generation Taking Too Long:

  • Simplify test goal
  • Break into smaller tests
  • Check network connectivity
  • Review system resources

Next Steps
