Copyright © Mindbreeze GmbH, A-4020 Linz, 2024.
All rights reserved. All hardware and software names used are brand names and/or trademarks of their respective manufacturers.
These documents are strictly confidential. The submission and presentation of these documents do not confer any rights to our software, our services and service outcomes, or any other protected rights. The dissemination, publication, or reproduction hereof is prohibited.
For ease of readability, gender-specific differentiation has been dispensed with. Corresponding terms and definitions apply to both sexes within the meaning and intent of the principle of equal treatment.
Mindbreeze customers have the option of linking Mindbreeze InSpire AI Chat with Large Language Models (LLMs) that are operated either locally or in a Mindbreeze SaaS environment. With the Mindbreeze InSpire 24.1 Release Version 24.1.1.338, Mindbreeze InSpire AI Chat can now also be linked with LLMs from the cloud providers OpenAI, Microsoft Azure OpenAI and Aleph Alpha. This makes it possible to use these extensively trained models in combination with Mindbreeze InSpire AI Chat.
Linking a Large Language Model from OpenAI, Microsoft Azure OpenAI or Aleph Alpha with Mindbreeze InSpire AI Chat is easy to configure. Using the Retrieval Augmented Generation Administration (RAG Administration), administrators can link OpenAI and Aleph Alpha models such as GPT-3.5 Turbo and GPT-4 with Mindbreeze InSpire. Integrating an OpenAI LLM requires either a valid API key or provisioning of the OpenAI LLM via Microsoft Azure.
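The following minimal Python sketch is not part of the Mindbreeze RAG Administration itself; it only illustrates the kind of chat-completion request that such an OpenAI API key authorizes. The model name and prompt are examples.

```python
# Minimal sketch: exercising an OpenAI API key with a chat-completion request.
# This illustrates the API access behind the configuration; it is not part of
# the Mindbreeze RAG Administration. Model and prompt are examples.
import os
import requests

API_KEY = os.environ["OPENAI_API_KEY"]  # the valid API key mentioned above

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-4",  # or "gpt-3.5-turbo"
        "messages": [{"role": "user", "content": "Reply with OK if this key works."}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```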
Mindbreeze InSpire is now also available as a Google Cloud image in the Google Cloud Marketplace. Mindbreeze customers can choose between the contract tiers Mindbreeze InSpire 1M and Mindbreeze InSpire 10M. Mindbreeze InSpire can thus also be operated on Google Cloud as an alternative to On-Premises, SaaS, AWS and Microsoft Azure. In addition to a Google Cloud Marketplace subscription, a Mindbreeze license is required per instance.
By using Large Language Models, users receive answers to their questions in complete sentences, enabling more direct and faster access to information. With the Mindbreeze InSpire 24.1 release, the LLM also takes the layout of the document into account when generating an answer. As shown in the screenshot, sentence recognition is more precise and answers are easier to understand.
This is made possible by optimised sentence segmentation. In addition to punctuation, the LLM now also considers certain layout information. As a result, the recognition of individual sentences is more reliable and the answers generated are more understandable.
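Purely as an illustration of the idea, and not of the Mindbreeze implementation, the sketch below segments text using punctuation plus simple layout cues such as blank lines and bullet markers; all rules shown are assumptions for demonstration.

```python
# Illustrative sketch of layout-aware sentence segmentation (not the
# Mindbreeze implementation). Besides punctuation, blank lines and bullet
# markers are treated as sentence boundaries.
import re

def segment(text: str) -> list[str]:
    # Treat paragraph breaks (blank lines) and bullet markers as hard boundaries.
    blocks = re.split(r"\n\s*\n|\n(?=[-*•]\s)", text)
    sentences = []
    for block in blocks:
        block = " ".join(block.split())  # collapse soft line breaks inside a block
        # Split on sentence-final punctuation followed by whitespace.
        sentences += [s for s in re.split(r"(?<=[.!?])\s+", block) if s]
    return sentences

sample = "Heading without punctuation\n\n- First bullet\n- Second bullet\n\nA normal sentence. Another one."
print(segment(sample))
```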
Users of Atlassian Jira have the option of creating property fields in "Issues". Mindbreeze InSpire can index these fields and make the metadata available as filters and search restrictions. With the Mindbreeze InSpire 24.1 release, custom fields can also be indexed. Users can now narrow down the search results in the Search Client more precisely with this metadata.
For example, a user has created a custom field for a business unit with the value "Sales" in Atlassian Jira. This metadata is now available through Mindbreeze InSpire and can be used to filter the search results by the value "Sales". In addition, indexing can be controlled by specifying which metadata from an issue is to be indexed.
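As a hedged illustration of where such a value lives in Jira, the sketch below reads a custom field from an issue via the Jira REST API. The base URL, issue key, credentials and custom field id are placeholders; custom field ids differ per Jira instance.

```python
# Sketch: reading a "business unit" custom field from a Jira issue via the
# Jira REST API. Base URL, issue key, credentials and the custom field id
# (customfield_10042) are placeholders.
import requests

JIRA_BASE = "https://jira.example.com"   # placeholder
ISSUE_KEY = "SALES-123"                  # placeholder
AUTH = ("api-user", "api-token")         # placeholder credentials

issue = requests.get(
    f"{JIRA_BASE}/rest/api/2/issue/{ISSUE_KEY}",
    auth=AUTH,
    timeout=30,
).json()

# The custom field value ("Sales" in the example above) becomes metadata that
# Mindbreeze InSpire can expose as a filter in the Search Client.
business_unit = issue["fields"].get("customfield_10042")
print(business_unit)
```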
The addition of custom translations in the Mindbreeze InSpire Insight App Client is supported even more extensively with the Mindbreeze InSpire 24.1 release. Administrators now have the option to add translations according to the ISO 639 standard in up to 184 languages. This makes it possible to customise the Mindbreeze InSpire Insight App Client even more precisely to their own requirements.
Link to the documentation for adding custom translations
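The exact file format is described in the documentation linked above. Purely as an illustration, a translation mapping keyed by ISO 639 language codes could look like the following sketch; the keys and labels are hypothetical examples, not the actual Insight App Client format.

```python
# Hypothetical example of custom translations keyed by ISO 639 language codes.
# The actual format used by the Insight App Client is described in the linked
# documentation; keys and labels here are illustrative only.
custom_translations = {
    "en": {"search.placeholder": "Search the knowledge base"},
    "de": {"search.placeholder": "Wissensdatenbank durchsuchen"},
    "fr": {"search.placeholder": "Rechercher dans la base de connaissances"},
}

def translate(key: str, language: str, fallback: str = "en") -> str:
    # Fall back to English if no translation exists for the requested language.
    return custom_translations.get(language, {}).get(
        key, custom_translations[fallback][key]
    )

print(translate("search.placeholder", "de"))
```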
Depending on the use case, it may be necessary for users to configure multiple Hierarchical CSV Enricher services, for example to generate multiple metadata values. With the Mindbreeze InSpire 24.1 release, this configuration is easier to perform: users can now cover multiple use cases with a single Hierarchical CSV Enricher service. This enables faster inversion, which in turn speeds up index creation.
The simplified configuration is made possible by dynamic metadata. Users can thus control in the CSV itself which metadata is generated for each use case. As a result, a single Hierarchical CSV Enricher configuration is sufficient for several use cases.
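The CSV layout and column names below are assumptions and do not reflect the actual Hierarchical CSV Enricher file format; the sketch only conveys the idea that the metadata name is taken from the CSV rather than from a separate service configuration.

```python
# Illustrative sketch of the "dynamic metadata" idea: the CSV itself names the
# metadata to generate, so one enricher configuration can serve several use
# cases. Column names and layout are assumptions, not the actual
# Hierarchical CSV Enricher format.
import csv
import io

CSV_DATA = """lookup_key,metadata_name,metadata_value
/sales/emea,business_unit,Sales
/sales/emea,region,EMEA
/hr/recruiting,business_unit,Human Resources
"""

def enrich(document_path: str) -> dict:
    metadata = {}
    for row in csv.DictReader(io.StringIO(CSV_DATA)):
        if document_path.startswith(row["lookup_key"]):
            # The metadata name comes from the CSV, not from the service config.
            metadata[row["metadata_name"]] = row["metadata_value"]
    return metadata

print(enrich("/sales/emea/forecast.pdf"))  # {'business_unit': 'Sales', 'region': 'EMEA'}
```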
Mindbreeze places particular importance on security and is constantly expanding and improving it. With the Mindbreeze InSpire 24.1 release, Mindbreeze is updating all container images to a distribution that is binary-compatible with RHEL 8 (AlmaLinux 8). This update enables Mindbreeze to provide customers with security updates more quickly and to react faster to security issues. In addition, Mindbreeze customers benefit from the improved performance of Mindbreeze InSpire.
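As a small aside, the base distribution of any container image can be inspected as sketched below; the image name is a placeholder, not an official Mindbreeze image reference.

```python
# Sketch: checking which distribution a container image is based on.
# The image name is a placeholder; substitute the image actually in use.
import subprocess

IMAGE = "example.registry/mindbreeze/inspire:24.1"  # placeholder

result = subprocess.run(
    ["podman", "run", "--rm", IMAGE, "cat", "/etc/os-release"],
    capture_output=True,
    text=True,
    check=True,
)
# On an AlmaLinux 8 based image the output contains, for example,
# ID="almalinux" and VERSION_ID="8.x".
print(result.stdout)
```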
Fix for: Kerberos keytabs without AES keys did not work.