25.3 Release
    Query Performance Testing
    Installation and Configuration

    Introduction

    With the Query Performance Tester, the search performance and expected search hits can be tested automatically and reproducibly. The most important features are:

    • Testing with custom search apps
    • Creation of test plans (what is searched for, what are the expected results)
    • Parameterization of test runs (user, number of parallel searches, number of iterations)
    • Monitoring a test run
    • Display of detailed statistics of a test run

    Configuration

    Open the tab “Indices” in the menu “Configuration” of the Mindbreeze Management Center. Add a new Service in the “Services” section by clicking on the “Add Service” button. Select the service type “QueryPerformanceTesterService” from the “Service” list.

    Base Configuration

    • Service Bind Port: The port on which the service listens for requests.

    Setup

    Launching a Test Job

    You can find the user interface of the Query Performance Tester Service in the Mindbreeze Management Center in the menu "Reporting", submenu "Query Performance Tests".

    If you are using CentOS 6, you can add the URL to “Query Performance Tests” via resources.json. By default, this feature is only visible on CentOS 7. The URL looks like this:

    :8443/index/<YOUR SERVICE BIND PORT>/admin/index.html

    You can find the Service Bind Port under “Services” – Settings in the Configuration.

    Documentation on how to add a menu entry can be found here: Management Center Menu

    Adding a Search App

    In the “Search Apps” section, click “Add Search App”.

    Set the name of the search app. If JavaScript code that transforms the request before it is sent is available for the given app, paste it into the text field or upload it from a file using the “Upload Search App” button. For a simple use case, the field can be left empty.
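    The exact transformer interface is product-specific and not documented here; as a purely illustrative sketch (the function name and request fields are assumptions, not the documented contract), such JavaScript code might look like this:

```javascript
// Hypothetical sketch only: assumes the tester passes the outgoing
// search request object to a function that may modify it before sending.
function transformRequest(request) {
  // Example transformation: cap the number of results per query so all
  // generated test queries run with identical settings.
  request.count = 10;
  return request;
}
```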

    Uploading a Test Plan

    In the “Test Plan” section of the management interface, click “Add Test Plan”. In the dialog, you can specify a name for the test plan and add the query terms that should be tested.

    If more advanced features are needed, such as key and property expectations, the test plan source can be edited and refined manually using the “View Source Code” button.

    After the test plan is complete, save it; it will then be available for launching a test job.

    Starting a Test

    After a test plan has been created, you can start a test job by clicking the “Start Test” button next to it in the “Test Plans” list.

    For starting a test job, the following parameters are required:

    • Endpoint: The URL of the Mindbreeze Client Service search API endpoint.
    • Users: Number of parallel searches.
    • Iterations: Number of iterations over the given test plan.
    • User group: The list of users associated with the searches; if no group is available, one can be configured via “Manage User Groups”.
    • Job name: The name of the job.
    • Search App: The search app selected for performing the test.
    • Description: A text description of the current test job.

    Setting up a “User Group”

    Click “Manage User Groups” in the test start dialog and set a name for the user group in the “Add User Group” section.

    Add users to the created group and finally click the “Add User Group” button.

    When finished, click “Cancel” to return to the original dialog.

    The created user group should be available for selection in the “User group” dropdown list.

    If a user group is selected for a test job run, the requests are sent with the HTTP header “X-Username” set to the user names included in the user group. For the Client Service to recognize this header, Request Header Session Authentication must be configured.
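    A sketch of how such a per-user request can be assembled (the request body shape shown here is an assumption for illustration, not the exact wire format of the search API):

```javascript
// Sketch: assembling a search request that carries the X-Username
// header for one simulated user. The body shape is an assumption;
// X-Username is only honored by the Client Service when Request
// Header Session Authentication is configured.
function buildSearchRequest(endpoint, userName, queryTerm) {
  return {
    url: endpoint,
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Username": userName, // identity the test run impersonates
    },
    body: JSON.stringify({ query: { unparsed: queryTerm } }),
  };
}
```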

    If the number of users set for the test run exceeds the user group size, the user names wrap around: the remaining slots are filled again starting from the first users of the user group.
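    This wrap-around can be sketched as a simple modulo assignment:

```javascript
// Wrap-around assignment of user names to concurrent search slots:
// when userCount exceeds the group size, names repeat from the start.
function assignUsers(userGroup, userCount) {
  const assigned = [];
  for (let i = 0; i < userCount; i++) {
    assigned.push(userGroup[i % userGroup.length]);
  }
  return assigned;
}

// assignUsers(["User 1", "User 2", "User 3"], 5)
// → ["User 1", "User 2", "User 3", "User 1", "User 2"]
```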

    Add Environments

    Using the “Manage Environments” dialog, you can create and save predefined test environments. Environments define a set of base settings that can be applied when starting a test by selecting the defined “Environment” in the “Start test” dialog (see section “Starting a Test”).

    The following parameters are required:

    • Environment Name: Name of the environment.
    • Endpoint: Mindbreeze Client Service search API endpoint, for example https://myserver.myorganization.com/api/v2/search.
    • User Count: Number of parallel search threads.
    • Iteration Count: Number of test iterations, i.e. how many times the defined set of query terms is executed.

    Monitoring a Test Run

    Start the test run as described in the previous section. Real-time monitoring of the current test is available under “Execution (Jobs)”:

    A real-time graph shows the number of queries executed per second and the average request duration. Different colors in the graph mark different iterations.

    Test Reports

    After the job has finished, it is listed in the Job History. Clicking “Details” shows a report of the executed job.

    The first page shows a general report and, if applicable, a list of errors that occurred during the test job execution.

    The following execution parameters are displayed:

    • Search App: The name of the search app.
    • Testplan: The name of the test plan.
    • Duration: The duration of the test plan execution (job) in seconds.
    • Average Requests Per Second: The average number of requests executed per second during the job.
    • Max. Requests Per Second: The maximum number of requests executed per second during the job.
    • Min. Requests Per Second: The minimum number of requests executed per second during the job.
    • Job Description: The job description.
    • Endpoint: The URL of the Mindbreeze Client Service search API endpoint.
    • Start Date: The date when the job started.
    • Average Duration of Search Requests: The average duration of a search request in milliseconds during the job.
    • Requests Per Second: The average requests per second during the job.
    • Iteration Count: The number of search iterations.
    • Status: The status of the job (running, finished, canceled, queuing).
    • Usercount: The number of concurrently searching users.

    The “Request Report” view shows a detailed report of each executed query. The data shown in the table are the minimum, average and maximum values of the chosen request parameter. Using the paging controls, you can navigate through all test iterations.

    The following request parameters can be shown:

    • Duration: Duration of the request in milliseconds.
    • Count: Count of delivered results for the given query.
    • Estimated Count: Estimated count of available results for the given query.
    • Answer Count: Count of delivered NLQA answers for the given query.
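    The minimum/average/maximum aggregation shown in the Request Report can be sketched as a generic reduction over the measured values (an illustration, not the product’s actual implementation):

```javascript
// Generic min/avg/max aggregation over measured values (e.g. request
// durations in milliseconds), as displayed per request parameter in
// the Request Report table.
function summarize(values) {
  const min = Math.min(...values);
  const max = Math.max(...values);
  const avg = values.reduce((sum, v) => sum + v, 0) / values.length;
  return { min, avg, max };
}

// summarize([120, 180, 300]) → { min: 120, avg: 200, max: 300 }
```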

    Example of a Test Plan Execution (Job)

    The following example describes how a job is executed with the following settings:

    Settings

    1. Testplan (Query Terms):
      1. Active
      2. testing
      3. python
      4. mindbreeze
      5. “mindbreeze search”
      6. mindbreeze AND search
    2. Users: 5
    3. Iterations: 3
    4. User Group
      1. User 1
      2. User 2
      3. User 3

    Execution

    Since 5 users are configured, 5 users will search concurrently. Because only 3 different users are configured in the “User Group”, the 5 search users will be:

    1. User 1
    2. User 2
    3. User 3
    4. User 1
    5. User 2

    These users start searching simultaneously, beginning with the first configured query term and continuing with the remaining search terms (“Active”, “testing”, …, “mindbreeze AND search”). When all users have completed their searches, the first iteration is finished.

    Because 3 iterations are configured, this process will be repeated two more times.
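    The total number of search requests issued by this example job follows directly from the settings above:

```javascript
// Every user runs every query term in every iteration,
// so the example job issues 5 × 6 × 3 = 90 search requests in total.
const users = 5;
const queryTerms = 6;
const iterations = 3;
const totalRequests = users * queryTerms * iterations; // 90
```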

    During this process, the test run can be monitored (see section “Monitoring a Test Run”). When all iterations have been completed, the report can be viewed (see section “Test Reports”).
