This guide introduces the `ContextService` API, a versatile abstraction for managing application context in the GraphDB Workbench application. Each view in the application is expected to implement a concrete version of the `ContextService` API, tailored to its specific requirements.

The `ContextService` API facilitates state management by providing a single place to update context properties, read their current values, and subscribe to their changes. The API is implemented as an abstract class, requiring developers to define specific fields and methods for their application's needs.
ContextService
The `ContextService` class is generic and requires a type parameter `TFields` that defines the fields the service can handle. Each field corresponds to a property of the service and is managed via the context map.

Key methods include:
- `updateContextProperty`: Updates the value of a property.
- `getContextPropertyValue`: Retrieves the current value of a property.
- `subscribe`: Registers a callback to be notified of property value changes.
SnakeToPascalCase

Converts `SNAKE_CASE` field names to PascalCase for method naming.

DeriveContextServiceContract

Generates update methods for each field. For example, a field `SELECTED_REPOSITORY` generates an `updateSelectedRepository` method.
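These two utilities can be sketched with TypeScript template literal types. The following is a minimal illustration of the idea, not the actual implementation from the api package:

```typescript
// Sketch: map a SNAKE_CASE field name to PascalCase using template literal types.
type SnakeToPascalCase<S extends string> =
  S extends `${infer Head}_${infer Tail}`
    ? `${Capitalize<Lowercase<Head>>}${SnakeToPascalCase<Tail>}`
    : Capitalize<Lowercase<S>>;

// Sketch: derive an `update<FieldName>` method for every field key, typed with
// the field's parameter type looked up by the same key.
type DeriveContextServiceContract<TFields, TParams> = {
  [K in keyof TFields & string as `update${SnakeToPascalCase<K>}`]:
    (value: K extends keyof TParams ? TParams[K] : never) => void;
};

// With these, the key 'SELECTED_REPOSITORY' yields a required
// `updateSelectedRepository(value: Repository): void` member.
```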
RepositoryContextService
The `RepositoryContextService` class manages repository-related application context for views in the GraphDB Workbench. It implements the abstract `ContextService` and provides methods for updating and subscribing to repository-related data.

readonly SELECTED_REPOSITORY = 'selectedRepository';
readonly REPOSITORY_LIST = 'repositoryList';

These fields define the context properties managed by the service.
Updating Context

- `updateSelectedRepository(repository: Repository | undefined): void` - Updates the selected repository.
- `updateRepositoryList(repositories: RepositoryList): void` - Updates the list of repositories.

Subscribing to Changes

- `onSelectedRepositoryChanged(callbackFunction: ValueChangeCallback<Repository | undefined>): () => void` - Subscribes to changes in the selected repository.
- `onRepositoriesChanged(callbackFunction: ValueChangeCallback<RepositoryList | undefined>): () => void` - Subscribes to changes in the repository list.
Implementing a New ContextService
1. Define Context Fields and Parameters
Define the fields and their corresponding parameter types:
type RepositoryContextFields = {
  readonly SELECTED_REPOSITORY: string;
  readonly REPOSITORY_LIST: string;
};

type RepositoryContextFieldParams = {
  readonly SELECTED_REPOSITORY: Repository;
  readonly REPOSITORY_LIST: RepositoryList;
};
2. Extend ContextService

Implement a concrete class that extends `ContextService`:
export class RepositoryContextService extends ContextService<RepositoryContextFields> implements DeriveContextServiceContract<RepositoryContextFields, RepositoryContextFieldParams> {
  readonly SELECTED_REPOSITORY = 'selectedRepository';
  readonly REPOSITORY_LIST = 'repositoryList';

  updateSelectedRepository(repository: Repository | undefined): void {
    this.updateContextProperty(this.SELECTED_REPOSITORY, repository);
  }

  onSelectedRepositoryChanged(callbackFunction: ValueChangeCallback<Repository | undefined>): () => void {
    return this.subscribe(this.SELECTED_REPOSITORY, callbackFunction);
  }

  updateRepositoryList(repositories: RepositoryList): void {
    this.updateContextProperty(this.REPOSITORY_LIST, repositories);
  }

  onRepositoriesChanged(callbackFunction: ValueChangeCallback<RepositoryList | undefined>): () => void {
    return this.subscribe(this.REPOSITORY_LIST, callbackFunction);
  }
}
3. Using the Service

Import the service using the `ServiceProvider` API:

Warning: Everything in the api package must be imported using the alias `@ontotext/workbench-api` and not by relative or absolute paths. The reason for this is that the api module is a separate package managed as a microservice, which is loaded using import maps where the alias is defined.
import { ServiceProvider, RepositoryContextService } from '@ontotext/workbench-api';
// Get the service instance
const repositoryContextService = ServiceProvider.get(RepositoryContextService);
Update Context Values:
const repository: Repository = { id: 1, name: 'Repo1' };
repositoryContextService.updateSelectedRepository(repository);
const repositoryList: RepositoryList = [{ id: 1, name: 'Repo1' }, { id: 2, name: 'Repo2' }];
repositoryContextService.updateRepositoryList(repositoryList);
4. Subscribe to Changes
const unsubscribeSelectedRepository = repositoryContextService.onSelectedRepositoryChanged((newRepository) => {
  console.log('Selected repository changed:', newRepository);
});

const unsubscribeRepositoryList = repositoryContextService.onRepositoriesChanged((newList) => {
  console.log('Repository list changed:', newList);
});
// To unsubscribe:
unsubscribeSelectedRepository();
unsubscribeRepositoryList();
The `ContextService` API provides a simple yet powerful mechanism for managing context in GraphDB Workbench views. By extending `ContextService`, developers can create view-specific services that streamline state management and improve code maintainability.
This guide explains the Persistence API and its local storage implementation in the GraphDB Workbench application. It includes details about the key interfaces, abstract classes, and practical examples for implementing persistent storage using local storage.

Persistence API

The Persistence API provides a generic interface for interacting with a storage system. It supports storing, retrieving, and removing data via a `Storage` interface-compatible implementation (e.g., localStorage or sessionStorage).

Persistence Interface

The `Persistence` interface defines the structure for storage-related services.
Methods:
- `getStorage(): Storage` - Returns the underlying storage implementation.
- `get(key: string): StorageData` - Retrieves the value associated with the provided key.
- `set(key: string, value: string): void` - Stores the given value under the provided key.
- `remove(key: string): void` - Deletes the value associated with the provided key.
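For reference, the contract can be sketched as a TypeScript interface. `StorageData` is assumed here to be a nullable string alias; the actual type lives in the api package:

```typescript
// A minimal sketch of the Persistence contract described above.
// Assumption: StorageData has localStorage semantics (string or null).
type StorageData = string | null;

interface Persistence {
  getStorage(): Storage;                   // the underlying storage implementation
  get(key: string): StorageData;           // value for the given key, if any
  set(key: string, value: string): void;   // store a value under the key
  remove(key: string): void;               // delete the value for the key
}
```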
The `LocalStorageService` abstract class implements the `Persistence` interface using the localStorage API. This implementation serves as a base class for specialized storage services.

1. Namespace Support: Each concrete service defines a `NAMESPACE` to scope its keys. Keys are prefixed with the global namespace (`StorageKey.GLOBAL_NAMESPACE`) and the service-specific namespace.

2. Storage Methods:
- `get(key: string): StorageData` - Fetches a value from localStorage.
- `storeValue(key: string, value: string): void` - Saves a value to localStorage.
- `remove(key: string): void` - Removes a value from localStorage.

3. Key Management: The `getPrefixedKey(key: string): string` method ensures that all keys are prefixed correctly for consistency and collision avoidance.

Example:

export class LanguageStorageService extends LocalStorageService {
  readonly NAMESPACE = 'i18n';

  set(key: string, value: string) {
    this.storeValue(key, value);
  }
}
In this example, `LanguageStorageService` manages language-related properties in localStorage, scoped under the i18n namespace.
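A hypothetical usage sketch follows; how the service instance is obtained and the exact stored key format (determined by `getPrefixedKey`) are assumptions here:

```typescript
import {ServiceProvider, LanguageStorageService} from '@ontotext/workbench-api';

// Assumption: the service is obtained via ServiceProvider, following the
// pattern shown for other services in this guide.
const languageStorage = ServiceProvider.get(LanguageStorageService);

// Stored under a prefixed key such as '<global-namespace>.i18n.selectedLanguage'.
languageStorage.set('selectedLanguage', 'en');

const lang = languageStorage.get('selectedLanguage'); // 'en'
languageStorage.remove('selectedLanguage');
```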
The `LocalStorageSubscriptionHandlerService` listens to storage change events and updates the application context accordingly. It works in conjunction with `ContextService` implementations to resolve and update the context properties.

When a `StorageEvent` is received, the service parses the key to extract the namespace and property name, then resolves the responsible context service via the `resolveHandler(namespace, propertyName)` method.

handleStorageChange(event: StorageEvent): void {
  const withoutGlobalPrefix = event.key?.substring(StorageKey.GLOBAL_NAMESPACE.length + 1);
  let namespace = '';
  let contextPropertyKey = '';
  if (withoutGlobalPrefix) {
    namespace = withoutGlobalPrefix.substring(0, withoutGlobalPrefix.indexOf('.'));
    contextPropertyKey = withoutGlobalPrefix.substring(namespace.length + 1);
  }
  const handler = this.resolveHandler(namespace, contextPropertyKey);
  if (handler) {
    handler.updateContextProperty(contextPropertyKey, event.newValue);
  }
}
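Walking through the parsing with a hypothetical key (the actual global namespace value lives in `StorageKey.GLOBAL_NAMESPACE`):

```typescript
// Hypothetical walk-through for event.key = 'gdb.i18n.selectedLanguage',
// assuming StorageKey.GLOBAL_NAMESPACE = 'gdb':
// withoutGlobalPrefix -> 'i18n.selectedLanguage'
// namespace           -> 'i18n'
// contextPropertyKey  -> 'selectedLanguage'
// => the handler registered for ('i18n', 'selectedLanguage') receives event.newValue.
```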
Extend `LocalStorageService` to define domain-specific storage services, specifying the `NAMESPACE` and implementing additional functionality if needed.

The Persistence API and its local storage implementation provide a robust framework for managing persistent data in the GraphDB Workbench application. By adhering to the namespace conventions and leveraging the `LocalStorageService` as a base class, developers can efficiently implement and maintain storage-related functionality.
The font kit used in the project is our own custom PRO set. Below are the steps for manually updating and managing the Font Awesome icon set:

If there is an update to the custom PRO font kit: copy the updated kit into src/js/lib/awesome_me/css. Make sure you copy all necessary CSS, fonts, and any other related files.

If any icons or configurations within the PRO set are changed: update the files in src/js/lib/awesome_me/css accordingly.
Since v1.2, the GraphDB Workbench features a plugin system which allows components to be plugged in without introducing coupling between new and existing components. The new system allows extending or replacing existing components and introducing new single or compositional components. All this can be achieved without any changes in the rest of the system.
Currently the new system is integrated in the workbench main components registration. These are the components which implement the main workbench views (extension point `route`) and their respective main menu entries (extension point `main.menu`). In future versions more extension points might be introduced.
The plugin system consists of four components.

The first is the `PluginRegistry`, a service whose role is to maintain a runtime registry of all plugins registered to a given extension point. It has the following interface:

PluginRegistry {
  add(extensionPoint:String, pluginDefinition:[Object|Array]),
  get(extensionPoint:String):Array,
  clear(extensionPoint:String),
  listModules()
}
The second component is the plugin definition, which a developer can provide for each extension point where new behavior or customization is needed. When a given component needs to be extended or customized, the developer declares an extension point, which is the contract to which plugins can be registered. Every plugin should conform to that contract. Plugin definitions are JavaScript files with the mandatory name `plugin.js`.
Example of a plugin definition:
// src/js/angular/autocomplete/plugin.js
PluginRegistry.add('route', {
  'url': '/autocomplete',
  'module': 'graphdb.framework.autocomplete',
  'path': 'autocomplete/app',
  'chunk': 'autocomplete',
  'controller': 'AutocompleteCtrl',
  'templateUrl': 'pages/autocomplete.html',
  'title': 'Autocomplete index',
  'helpInfo': 'The Autocomplete index is used for automatic ...',
  'documentationUrl': 'autocomplete-index.html'
});
Plugins for multiple extension points can be registered in a single `plugin.js` definition file. The above `plugin.js` definition can be extended like this:
PluginRegistry.add('route', {
  'url': '/autocomplete',
  'module': 'graphdb.framework.autocomplete',
  'path': 'autocomplete/app',
  'chunk': 'autocomplete',
  'controller': 'AutocompleteCtrl',
  'templateUrl': 'pages/autocomplete.html',
  'title': 'Autocomplete index',
  'helpInfo': 'The Autocomplete index is used for automatic completion of URIs in the SPARQL editor and the View resource page. Use this view to enable or disable the index and check its status.',
  'documentationUrl': 'autocomplete-index.html'
});

PluginRegistry.add('main.menu', {
  'items': [
    {label: 'Setup', href: '#', order: 5, role: 'IS_AUTHENTICATED_FULLY', icon: "icon-settings"},
    {label: 'Autocomplete', href: 'autocomplete', order: 40, parent: 'Setup', role: "IS_AUTHENTICATED_FULLY"}
  ]
});
Plugin definitions can have the following optional attributes: `order`, `priority`, `disabled`.

`order` can be used when the extension point contract requires plugins to be loaded in a particular order.
// src/module_1/plugin.js
PluginRegistry.add('extension.point', {
  'label': 'plugin-1',
  'order': 10
});

// src/module_2/plugin.js
PluginRegistry.add('extension.point', {
  'label': 'plugin-2',
  'order': 20
});

// src/main.js
const plugins = PluginRegistry.get('extension.point');
// plugins[0] -> `plugin-1`
// plugins[1] -> `plugin-2`
If order is not provided, plugins are loaded in the order of their registration, which is the order in which the plugin.js files are processed during the application bundling.
The `priority` attribute can be used when an ordered plugin should be overridden by another plugin. If not provided, the priority for every plugin defaults to 0. Having two plugins with the same order but different priority results in the lower-priority plugin being overridden by the other. If there are two plugins with equal order and priority, an error is thrown to warn the developer about the issue.
// src/module_1/plugin.js
PluginRegistry.add('extension.point', {
  'label': 'plugin-1',
  'order': 10
});

// src/module_2/plugin.js
PluginRegistry.add('extension.point', {
  'label': 'plugin-2',
  'order': 10,
  'priority': 10
});

// src/main.js
const plugins = PluginRegistry.get('extension.point');
// plugins[0] -> `plugin-2`
A plugin definition can also have the `disabled: true` attribute, which means that the plugin won't be loaded. All plugins are considered enabled by default. If a plugin is disabled, its definition is not validated or processed - it's just skipped.
// src/module_1/plugin.js
PluginRegistry.add('extension.point', {
  'label': 'plugin-1',
  'disabled': true
});

// src/module_2/plugin.js
PluginRegistry.add('extension.point', {
  'label': 'plugin-2',
});

// src/main.js
const plugins = PluginRegistry.get('extension.point');
// plugins[0] -> `plugin-2`
The third component of the plugin system is the `plugins.js` file, which is autogenerated and composed from all `plugin.js` files during the workbench build. Usually developers don't need to touch this file.
The last part are the extension points which the developer implements in the application code. An extension point is the place where plugins are loaded and eventually executed.
Below is an example of an extension point in the workbench which allows registering modules that can be accessed by navigating to different urls (routes).
// src/app.js
// 1. Get all registered extensions for the "route" extension point.
let routes = PluginRegistry.get('route');
// 2. Loop through all extensions
routes.forEach(function (route) {
  // 3. Register every extension with the $routeProvider
  $routeProvider.when(route.url, {
    controller: route.controller,
    templateUrl: route.templateUrl,
    title: route.title,
    helpInfo: route.helpInfo,
    documentationUrl: route.documentationUrl,
    reloadOnSearch: route.reloadOnSearch !== undefined ? route.reloadOnSearch : true
  });
});
The `plugins.js` file is programmatically generated during the bundling. Both the `PluginRegistry` and `plugins.js` are defined in the main application html template. This allows the plugins to be registered at runtime in the registry, immediately after the web application is loaded, by issuing calls to the `PluginRegistry.add(extensionPoint, pluginDefinition)` method.

## Color themes
## Color themes
The workbench allows custom color themes to be provided by developers. Themes are implemented as plugins and can be registered to the `themes` extension point. Any custom theme must be placed in the project's src folder in a file named `plugin.js`. An example theme plugin can be seen below:
```javascript
PluginRegistry.add('themes', {
  // The name of the theme. Must contain only lowercase letters, hyphen, underscore. This is the differentiator
  // property for all registered themes.
  'name': 'default-theme',
  // The theme label or a key for a label from the i18n resource bundle.
  'label': 'security.workbench.settings.theme.default-theme',
  // CSS variables, "foo: bar" becomes "--foo: bar"
  'variables': {
    // Primary color, like a main brand color. This is in HSL format composed of the three values below
    'primary-color-hue': '13.4',
    'primary-color-saturation': '87.9%',
    'primary-color-lightness': '33%',
    // Secondary color, like a contrast main brand color. This is in HSL format composed of the three values below
    'secondary-color-hue': '207.3',
    'secondary-color-saturation': '100%',
    'secondary-color-lightness': '19.4%',
    // Tertiary color, like a complementary color. This is in HSL format composed of the three values below
    'tertiary-color-hue': '174.6',
    'tertiary-color-saturation': '97.7%',
    'tertiary-color-lightness': '33.5%',
    // A color used for the font/svg icons when placed on a primary color background.
    'icon-on-primary-color': 'rgba(255, 255, 255, 0.8)',
    'gray-color': '#97999C',
    'gray-color-dark': '#575757',
    // Colors for the toastr notifications, the tag-xxx and the text-xxx classes in any of their four states
    // (i.e. dark colored things)
    'color-danger-dark': 'hsl(353, 78%, 36%)',
    'color-success-dark': 'hsl(var(--tertiary-color-hue), var(--tertiary-color-saturation), calc(var(--tertiary-color-lightness)*0.5))',
    'color-warning-dark': 'var(--primary-color-dark)',
    'color-info-dark': 'var(--secondary-color-light)',
    // Colors for the alert boxes (i.e. light colored things).
    // Success and info are the same color since we don't use success much if at all
    'color-danger-light': '#a4142433',
    'color-success-light': 'hsla(var(--tertiary-color-hsl), 0.15)',
    'color-warning-light': 'hsla(var(--primary-color-hsl), 0.07)',
    'color-info-light': 'hsla(var(--tertiary-color-hsl), 0.15)',
    'color-help-light': 'hsla(var(--secondary-color-hsl), 0.1)',
    // Colors for the logo - logo proper, text in logo, logo background
    'logo-color': 'var(--primary-color-light)',
    'logo-text-color': 'white',
    'logo-background-color': 'var(--secondary-color-dark)'
  },
  // Dark theme
  'dark': {
    'variables': {
      // Dark definition variables that affect things at a global scale
      'body-filter': 'invert(95%) hue-rotate(180deg)',
      'html-background': '#0d0d0d',
      'media-filter': 'invert(100%) hue-rotate(180deg)',
      'alert-filter': 'contrast(2)',
      'checkbox-filter': 'invert(100%) hue-rotate(180deg)',
      'toast-filter': 'invert(95%) hue-rotate(180deg) contrast(1.2)',
      // Slightly different colors that work better in dark mode
      'primary-color-lightness': '60%',
      'secondary-color-saturation': '70%',
      'color-warning-light': 'hsla(var(--primary-color-hsl), 0.15)',
      'logo-color': 'var(--primary-color-dark)'
    },
    // CSS properties, "foo: bar" stays "foo: bar"
    'properties': {
      // Notify browser that we support dark theme, makes checkboxes look better
      'color-scheme': 'light dark'
    }
  }
});
```

The plugin definition is compiled to a stylesheet and embedded in the HTML document:
```css
:root {
  --primary-color-hue: 13.4;
  --primary-color-saturation: 87.9%;
  --primary-color-lightness: 33%;
  --primary-color-hsl: var(--primary-color-hue), var(--primary-color-saturation), var(--primary-color-lightness);
  --primary-color: hsl(var(--primary-color-hsl));
  --primary-color-light: hsl(var(--primary-color-hue), var(--primary-color-saturation), calc(var(--primary-color-lightness)*1.2));
  --primary-color-dark: hsl(var(--primary-color-hue), var(--primary-color-saturation), calc(var(--primary-color-lightness)*0.8));
  --secondary-color-hue: 207.3;
  --secondary-color-saturation: 100%;
  --secondary-color-lightness: 19.4%;
  --secondary-color-hsl: var(--secondary-color-hue), var(--secondary-color-saturation), var(--secondary-color-lightness);
  --secondary-color: hsl(var(--secondary-color-hsl));
  --secondary-color-light: hsl(var(--secondary-color-hue), var(--secondary-color-saturation), calc(var(--secondary-color-lightness)*1.2));
  --secondary-color-dark: hsl(var(--secondary-color-hue), var(--secondary-color-saturation), calc(var(--secondary-color-lightness)*0.8));
  ...
}

:root.dark {
  --body-filter: invert(95%) hue-rotate(180deg);
  --html-background: #0d0d0d;
  --media-filter: invert(100%) hue-rotate(180deg);
  --alert-filter: contrast(2);
  --checkbox-filter: invert(100%) hue-rotate(180deg);
  --toast-filter: invert(95%) hue-rotate(180deg) contrast(1.2);
  ...
  color-scheme: light dark;
}
```
All available registered themes are loaded using the PluginRegistry and displayed in a combobox on the "My settings" page. The theme selected and saved on the "My settings" page is loaded and applied by default when the workbench is opened. If no theme is selected and saved in the local storage, the default-theme is applied.

All properties in the definition are mandatory. Definitions with missing properties are rejected, and an error is reported in the browser log. Theme plugin validation happens when definitions are loaded through the PluginRegistry by the "My settings" page controller. The theme selector menu is in a widget on that page, and it lists all available registered theme plugins for the user to select from. There are two prebuilt themes in the workbench. The default-theme is carefully crafted so that the colors used in the theme comply with the WCAG AAA color contrast standard. The second provided theme uses the Ontotext brand colors and does not comply with the WCAG color standard.
The workbench application consists of many resources: script files, stylesheets, images, fonts. In a non-optimized build this would lead to hundreds of HTTP requests. Nowadays web applications are usually optimized by bundling their resources and minifying them. The reason for this is to have significantly fewer HTTP requests, which leads to faster initial load and less traffic.

Bundling of the workbench is done with webpack. It's configured with three config files; the common config contains shared configuration for the other two.

There are two ways to build the application: one for production and one for development. See the Build and Dev server sections above.
The code for production is built in the /dest folder. The bundling covers the following tasks:

- The application code has its entry point in /app.js and is emitted in /dest in a bundle file named bundle.[hash].js. The hash is generated and applied in order to allow cache busting. Every change in the application code forces a new hash to be generated, thus forcing browsers to request the new bundle version. If the application code contains dynamic imports, webpack emits a new bundle. This is the case for the clustermanagement module, which is only imported if the workbench is loaded in enterprise mode.
- Third-party code is emitted as vendor.[hash].js and is bundled by webpack. Here go third-party libraries like jquery and angularjs, as well as third-party library stylesheets.
- Certain resources are imported in a separate main.js, which is bundled by webpack and emitted as main.[hash].js. This is needed because importing them in app.js breaks the bundle.
- The bundles are injected in index.html at the end of the body tag.
- Stylesheets are processed with less-loader, converted to JavaScript in the bundles with css-loader, and finally injected at the end of the head tag in index.html. Not injecting them during build time would lead to unwanted showing of the un-styled application until webpack injects them at runtime.
- bootstrap and angular depend on jquery to be loaded and present globally. That's why it is mandatory for jquery to be properly pre-loaded and exposed in the global scope. This is done using the expose-loader. It ensures that when requested, jquery will be available for the libraries.
- Page templates are copied to the /dist folder. Resources referenced from within those templates are also directly copied to allow proper loading. The copying is done using the CopyPlugin.
- During template.html processing, referenced images are automatically copied in the /dist folder. The file-loader is used for the purpose.
- Other referenced resources are emitted in the /dist folder using the url-loader.
- The /dist directory is cleaned up before every build to prevent accumulating bundle files with different hashes in their names.

The system can intercept HTTP requests and responses made by the HttpService. This allows us to perform certain actions on requests before they are sent, or after a response has been received.
There are two chains of interceptors: a request interceptor chain and a response interceptor chain. Once a request is triggered, it will first be processed by the request interceptor chain (InterceptorService#preProcess). The finished request object will then be used to form the actual HTTP request to the backend in the HttpService. The resulting response will be processed by the response interceptor chain before being finally returned to the caller.

Currently, there is no explicit mechanism for cancelling a request/response, but throwing an error or rejecting the promise will stop either chain.
An example is the AuthRequestInterceptor, which is used to add auth headers to each request before it is sent to the backend.
Interceptors are executed based on their priority. Each interceptor has a `priority` property, which defaults to 0. A higher number indicates earlier execution: the interceptor with the highest priority will be executed first, followed by the lower numbers.

Request interceptors should provide `HttpRequest` as the generic type, whereas response interceptors should provide `Response`.
Example request interceptor:
import {HttpInterceptor} from './http-interceptor';
import {HttpRequest} from './http-request';
class MyRequestInterceptor extends HttpInterceptor<HttpRequest> {
  shouldProcess(request: HttpRequest) {
    // implementation
  }

  process(request: HttpRequest) {
    // implementation
  }
}
Example response interceptor:
import {HttpInterceptor} from './http-interceptor';
class MyResponseInterceptor extends HttpInterceptor<Response> {
  shouldProcess(response: Response) {
    // implementation
  }

  process(response: Response) {
    // implementation
  }
}
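Priority can be illustrated with a sketch like the following; whether `priority` can be overridden as a plain class field is an assumption here, and the method bodies are placeholders:

```typescript
import {HttpInterceptor} from './http-interceptor';
import {HttpRequest} from './http-request';

// Hypothetical interceptor that should run before default-priority ones.
class MyEarlyRequestInterceptor extends HttpInterceptor<HttpRequest> {
  priority = 10; // higher than the default of 0, so it executes first

  shouldProcess(request: HttpRequest) {
    // implementation
  }

  process(request: HttpRequest) {
    // implementation
  }
}
```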
Interceptors are registered with the InterceptorService. Example:
import {ServiceProvider} from './service.provider';
const interceptorService = ServiceProvider.get(InterceptorService);
interceptorService.registerRequestInterceptors(new ModelList([new MyRequestInterceptor()]));
interceptorService.registerResponseInterceptors(new ModelList([new MyResponseInterceptor()]));
The system ensures that translations from different modules are merged into a single bundle per language, using the merge-i18n-plugin.js. After they are merged, all .json files from src/assets/i18n directories are transferred to the webpack output directory. That includes language-config.json, which contains the default language and the available languages for the application. The configuration file is read upon starting the application, inside ontotext-root-config.js#getLanguageConfig. The default language is loaded and the app starts listening for ontotext-root-config.js#onLanguageChange events. Once the language changes, the respective bundle is loaded and emitted via language-context.service.ts#updateLanguageBundle. Modules listen for bundle changes from language-context.service.ts#onLanguageBundleChanged and apply translation logic independently from each other.
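A minimal sketch of a module reacting to bundle changes; the exported service name and callback shape are assumed from the file references above:

```typescript
import {ServiceProvider, LanguageContextService} from '@ontotext/workbench-api';

// Hypothetical consumer: re-apply this module's translations whenever a new
// language bundle is emitted via updateLanguageBundle.
const languageContext = ServiceProvider.get(LanguageContextService);

const unsubscribe = languageContext.onLanguageBundleChanged((bundle) => {
  // Apply translation logic for this module, e.g. re-render visible labels.
  console.log('Language bundle changed:', bundle);
});

// Later, when the module is destroyed:
unsubscribe();
```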
Every module in the application must follow the convention of placing translation files under the src/assets/i18n directory. For example:
packages/
  module1/
    src/assets/i18n/en.json
    src/assets/i18n/fr.json
  module2/
    src/assets/i18n/en.json
    src/assets/i18n/fr.json
Translation files should be JSON objects where the keys are the translation identifiers, and the values are the translated strings. For example:
src/assets/i18n/en.json:
{
  "greeting": "Hello",
  "farewell": {
    "label": "Goodbye"
  }
}
src/assets/i18n/fr.json:
{
  "greeting": "Bonjour",
  "farewell": {
    "label": "Au revoir"
  }
}
The merge-i18n-plugin.js aggregates these translation files across all modules and merges them into a single bundle for each language. For example, packages/module1/src/assets/i18n/en.json and packages/module2/src/assets/i18n/en.json will be combined into a single en.json. The output will look like this:

dist/${outputDirectory}/en.json
The plugin resolves conflicts by throwing an error if multiple files define the same key for the same language.
For example:
module1/src/assets/i18n/en.json:

{
  "some-prop": "Hello",
  "menu.logo.link.title": {
    "label": "Goodbye"
  }
}

module2/src/assets/i18n/en.json:

{
  "another-prop": "Bonjour",
  "menu.logo.link.title": {
    "another-label": "Different Goodbye"
  }
}
This will result in an error similar to this one:
Processing file: en.json
Error: Conflict detected for key 'menu.logo.link.title' in language 'en' in file: packages/workbench/src/assets/i18n/en.json
- Automatic Directory Traversal: Scans all modules for src/assets/i18n folders.
- Conflict Detection: Throws an error if there are duplicate keys in the same language.
- JSON Merging: Combines all translations into one file per language.
- Asset Emission: Writes the merged bundles to the specified output directory in the webpack dist folder.
- startDirectory: The base directory where the plugin begins searching for modules.
- outputDirectory: The directory inside the webpack output folder where the merged translation bundles will be written.
const { MergeI18nPlugin } = require('./plugins/MergeI18nPlugin');

module.exports = {
  // Other Webpack configurations...
  plugins: [
    new MergeI18nPlugin({
      startDirectory: 'packages',
      outputDirectory: 'assets/i18n',
    }),
  ],
};
To add a new language:
1. Add the language to the availableLanguages array in language-config.json.
2. Add the translation file to a src/assets/i18n folder, where the translations are located. The key in availableLanguages should be the name of the new translation file, e.g. ${key}.json. Avoid conflicting keys in your new bundle, as they will cause an error.
3. Check the ${webpack.outputFolder}/${mergeI18nPlugin.outputDirectory} (currently dist/assets/i18n) folder to verify that all translations are included and correctly merged.

To validate translations locally:
npm run validate
If the project structure differs or the script is placed elsewhere, provide the project root manually:
SCRIPT_ROOT=/absolute/path/to/repo-root npm run validate
The script will generate a file: translation-report.json

If issues are found, the script will exit with code 1.
On Jenkins, the script runs automatically in the Validate stage. If issues are found, look for the translation-report.json file under "Build Artifacts".
The graphdb-workbench project uses a Jenkins pipeline to automate installation, testing, building, validation, and SonarQube analysis. The pipeline is configured to execute the following steps:
Note: If new static folders are created in the dist folder to be published (or old ones are renamed), they must be added to the BE Spring Security configuration. Failure to do so will prevent the server from serving these resources, causing the Workbench to malfunction.
- Node.js: 18.9.0 (configured tool); uses the node:20-alpine image in most stages
- SonarQube: SONAR_ENVIRONMENT=SonarCloud
- Agents: aws-large nodes
- Install: npm run install:ci
- Build: npm run build in the same Dockerized environment
- Lint: npm run lint
- Validate: npm run validate; archives translation-report.json regardless of outcome
- Sonar scan: regular scan on master
- Test: npm run test in packages/shared-components (uses docker-compose.yaml)
- E2E tests: run on the master branch, with fixtures from e2e-tests/fixtures and docker-compose-test.yaml supplied via configFileProvider
Code coverage is run in a separate Jenkins job, which is triggered manually on demand, typically for feature branches.
Find the graphdb-workbench-coverage job in Jenkins.
Start it using the Build with Parameters option, specifying the Git branch you want to analyze.
After the job completes, the final HTML report with the results is available as a build artifact in Jenkins.
This Jenkins pipeline facilitates the release process for the graphdb-workbench project. It automates versioning, building, and publishing to npm, ensuring a smooth release workflow.

- Agents: aws-small
- Docker image: node:20-bullseye
- Parameters:
  - GIT_BRANCH: Git branch to release from (default: master)
  - RELEASE_VERSION: Required version to release
  - NOTIFY_SLACK: Whether to send a Slack notification
  - SLACK_CHANNEL: Slack channel to notify (if enabled)
- Steps:
  - npm run install:ci
  - npm run build
  - e2e-tests/
  - .npmrc auth token, verified with npm whoami
  - git commit -a -m 'Release ${RELEASE_VERSION}'
  - git tag -a v${RELEASE_VERSION} -m 'Release v${RELEASE_VERSION}'
- Cleanup: cleans the .npmrc and removes any local tokens; the .npmrc is mounted into the container and cleaned afterward

If you've ever filled in a browser-native Basic Auth popup in Chrome for the workbench login, Chrome will silently remember and auto-send those credentials in every subsequent request to that domain - even across sessions and after logout.
You will see an Authorization header automatically added to requests like:

Authorization: Basic <base64-encoded-user:password>

This can be confirmed in the Network tab of DevTools, particularly in requests like:

GET /rest/security/authenticated-user
To remove the cached Basic Auth credentials, go to chrome://settings/clearBrowserData and clear the saved sign-in data (passwords and other sign-in data).

Note: Simply logging out will not remove Basic Auth credentials remembered by Chrome - they are not managed by the code.