diff --git a/docs/api/sitemap.xml.gz b/docs/api/sitemap.xml.gz index 018479562..5284ae8a6 100644 Binary files a/docs/api/sitemap.xml.gz and b/docs/api/sitemap.xml.gz differ diff --git a/docs/blogs/sitemap.xml.gz b/docs/blogs/sitemap.xml.gz index 452308af3..7415a34d6 100644 Binary files a/docs/blogs/sitemap.xml.gz and b/docs/blogs/sitemap.xml.gz differ diff --git a/docs/help/search/search_index.json b/docs/help/search/search_index.json index b044ed92f..445984e4d 100644 --- a/docs/help/search/search_index.json +++ b/docs/help/search/search_index.json @@ -1 +1 @@ -{"config":{"indexing":"full","lang":["en"],"min_search_length":3,"prebuild_index":false,"separator":"[\\s\\-]+"},"docs":[{"location":"","text":"Welcome! How can we help? Build your first service with Eclipse Dirigible. Get Started Explore different setup options. Setup Understand the nuts and bolts. Architecture More helpful information Read about our vision and opinions on cloud development and Eclipse Dirigible. Blogs Read essential definitions. Concepts Review major features. Features Learn more about Enterprise JavaScript API availability, versions, and status. API Find out what, why, and how. FAQ Learn how to contribute. Community","title":"Welcome"},{"location":"#welcome-how-can-we-help","text":"Build your first service with Eclipse Dirigible. Get Started Explore different setup options. Setup Understand the nuts and bolts. Architecture","title":"Welcome! How can we help?"},{"location":"#more-helpful-information","text":"Read about our vision and opinions on cloud development and Eclipse Dirigible. Blogs Read essential definitions. Concepts Review major features. Features Learn more about Enterprise JavaScript API availability, versions, and status. API Find out what, why, and how. FAQ Learn how to contribute. Community","title":"More helpful information"},{"location":"community/","text":"Community Welcome to the community page for contributors! 
Here you will find resources to help you create even better documentation for Dirigible. Contributor Guide Eclipse Dirigible is an open source project, which means that you can propose contributions by sending pull requests through GitHub . Before you get started, here are some prerequisites that you need to complete: Legal considerations Please read the Eclipse Foundation policy on accepting contributions via Git . Please read the Code of Conduct . Your contribution cannot be accepted unless you have an Eclipse Contributor Agreement in place. Here is the checklist for contributions to be acceptable: Create an account at Eclipse Add your GitHub user name in your account settings Log into the projects portal , look for \"Eclipse Contributor Agreement\" , and agree to the terms. Ensure that you sign off your Git commits with the same email address as your Eclipse Foundation profile. For more information see the Contributor Guide . Style Guide In this section we have outlined the text styling options and the elements they should be used for. If everyone follows it, we will have visually consistent documentation . Bold How it looks as text: Bold Text How it looks in markdown: **Bold Text** Use it for: UI elements Navigation paths Monospace How it looks as text: Monospace Text How it looks in markdown: `Monospace Text` Use it for: File names and extensions Terms File paths Monospace/Bold How it looks as text: Monospace/Bold Text How it looks in markdown: **`Monospace/Bold Text`** Use it for: User input Headings How it looks: Use Heading 1 for the titles Heading 2 is for main topics Continue with Heading 3 and 4 where needed Structure your topic with no more than 3 heading levels (heading 2, 3, and 4) Blogs We'd welcome any contribution to our Blogs site as long as it conforms with our Legal considerations outlined above. 
Below we've provided more details about the organization of the Blogs site and the frontmatter that needs to be added so new blogs have the same look and feel as all the others. Add Your Blog to the Right Folder All blogs are organized in folders by year, month, and day of publishing. Hence, a blog written on November 19, 2020 is placed in the directory docs/2020/11/19/ : This also helps arrange the blogs by year of publishing. When publishing, add your blog to the right folder depending on the date. You can also create folders if needed. Include Markdown Frontmatter A big part of any blog's layout is controlled by its .md file frontmatter. This is metadata about the .md file and is denoted by the triple dashes at the start and end of the block. Here's an example with the title of this Community page: --- title : Community --- Welcome to the community page for contributors! Here you will find resources to help you create even better documentation for Dirigible. ## Contributor Guide Set the title of the blog in the frontmatter: --- title : --- When the title is set in the frontmatter, use Heading 2 level ( ## This is a heading 2 ) as the highest heading level in your blog. Otherwise, the first Heading 1 you use will overwrite the title from the frontmatter and cause formatting issues. Set the author: --- title : author : --- Set your GitHub user: --- title : author : author_gh_user : --- Set the reading time and publishing date: --- title : author : author_gh_user : read_time : publish_date : --- Providing all the metadata in the frontmatter as described will include: the title at the beginning of the page your GitHub avatar, your name, and a link to your GitHub profile in the author section the reading time and publishing date in the details section Here's an example from one of our recent blogs: Happy Blogging! 
Join the Discussion Reach out to other contributors and join in the discussion around Dirigible here .","title":"Community"},{"location":"community/#community","text":"Welcome to the community page for contributors! Here you will find resources to help you create even better documentation for Dirigible.","title":"Community"},{"location":"community/#contributor-guide","text":"Eclipse Dirigible is an open source project, which means that you can propose contributions by sending pull requests through GitHub . Before you get started, here are some prerequisites that you need to complete:","title":"Contributor Guide"},{"location":"community/#legal-considerations","text":"Please read the Eclipse Foundation policy on accepting contributions via Git . Please read the Code of Conduct . Your contribution cannot be accepted unless you have an Eclipse Contributor Agreement in place. Here is the checklist for contributions to be acceptable: Create an account at Eclipse Add your GitHub user name in your account settings Log into the projects portal , look for \"Eclipse Contributor Agreement\" , and agree to the terms. Ensure that you sign off your Git commits with the same email address as your Eclipse Foundation profile. For more information see the Contributor Guide .","title":"Legal considerations"},{"location":"community/#style-guide","text":"In this section we have outlined the text styling options and the elements they should be used for. If everyone follows it, we will have visually consistent documentation . 
Bold How it looks as text: Bold Text How it looks in markdown: **Bold Text** Use it for: UI elements Navigation paths Monospace How it looks as text: Monospace Text How it looks in markdown: `Monospace Text` Use it for: File names and extensions Terms File paths Monospace/Bold How it looks as text: Monospace/Bold Text How it looks in markdown: **`Monospace/Bold Text`** Use it for: User input Headings How it looks: Use Heading 1 for the titles Heading 2 is for main topics Continue with Heading 3 and 4 where needed Structure your topic with no more than 3 heading levels (heading 2, 3, and 4)","title":"Style Guide"},{"location":"community/#blogs","text":"We'd welcome any contribution to our Blogs site as long as it conforms with our Legal considerations outlined above. Below we've provided more details about the organization of the Blogs site and the frontmatter that needs to be added so new blogs have the same look and feel as all the others.","title":"Blogs"},{"location":"community/#add-your-blog-to-the-right-folder","text":"All blogs are organized in folders by year, month, and day of publishing. Hence, a blog written on November 19, 2020 is placed in the directory docs/2020/11/19/ : This also helps arrange the blogs by year of publishing. When publishing, add your blog to the right folder depending on the date. You can also create folders if needed.","title":"Add Your Blog to the Right Folder"},{"location":"community/#include-markdown-frontmatter","text":"A big part of any blog's layout is controlled by its .md file frontmatter. This is metadata about the .md file and is denoted by the triple dashes at the start and end of the block. Here's an example with the title of this Community page: --- title : Community --- Welcome to the community page for contributors! Here you will find resources to help you create even better documentation for Dirigible. 
## Contributor Guide Set the title of the blog in the frontmatter: --- title : --- When the title is set in the frontmatter, use Heading 2 level ( ## This is a heading 2 ) as the highest heading level in your blog. Otherwise, the first Heading 1 you use will overwrite the title from the frontmatter and cause formatting issues. Set the author: --- title : author : --- Set your GitHub user: --- title : author : author_gh_user : --- Set the reading time and publishing date: --- title : author : author_gh_user : read_time : publish_date : --- Providing all the metadata in the frontmatter as described will include: the title at the beginning of the page your GitHub avatar, your name, and a link to your GitHub profile in the author section the reading time and publishing date in the details section Here's an example from one of our recent blogs: Happy Blogging!","title":"Include Markdown Frontmatter"},{"location":"community/#join-the-discussion","text":"Reach out to other contributors and join in the discussion around Dirigible here .","title":"Join the Discussion"},{"location":"developer-resources/cheat-sheet/","text":"Cheat Sheet Clean Up Database Go to the Database perspective. Switch to the local datasource type and select the SystemDB . 
Execute the following queries: Delete Data Drop Tables DELETE FROM DIRIGIBLE_BPM ; DELETE FROM DIRIGIBLE_DATA_STRUCTURES ; DELETE FROM DIRIGIBLE_EXTENSIONS ; DELETE FROM DIRIGIBLE_EXTENSION_POINTS ; DELETE FROM DIRIGIBLE_IDENTITY ; DELETE FROM DIRIGIBLE_JOBS ; DELETE FROM DIRIGIBLE_LISTENERS ; DELETE FROM DIRIGIBLE_MIGRATIONS ; DELETE FROM DIRIGIBLE_ODATA ; DELETE FROM DIRIGIBLE_ODATA_CONTAINER ; DELETE FROM DIRIGIBLE_ODATA_MAPPING ; DELETE FROM DIRIGIBLE_ODATA_SCHEMA ; DELETE FROM DIRIGIBLE_ODATA_HANDLER ; DELETE FROM DIRIGIBLE_PUBLISH_LOGS ; DELETE FROM DIRIGIBLE_PUBLISH_REQUESTS ; DELETE FROM DIRIGIBLE_SECURITY_ACCESS ; DELETE FROM DIRIGIBLE_SECURITY_ROLES ; DELETE FROM DIRIGIBLE_SYNCHRONIZER_STATE ; DELETE FROM DIRIGIBLE_SYNCHRONIZER_STATE_ARTEFACTS ; DELETE FROM DIRIGIBLE_SYNCHRONIZER_STATE_LOG ; DELETE FROM DIRIGIBLE_WEBSOCKETS ; DELETE FROM QUARTZ_BLOB_TRIGGERS ; DELETE FROM QUARTZ_CALENDARS ; DELETE FROM QUARTZ_CRON_TRIGGERS ; DELETE FROM QUARTZ_FIRED_TRIGGERS ; DELETE FROM QUARTZ_LOCKS ; DELETE FROM QUARTZ_PAUSED_TRIGGER_GRPS ; DELETE FROM QUARTZ_SCHEDULER_STATE ; DELETE FROM QUARTZ_SIMPLE_TRIGGERS ; DELETE FROM QUARTZ_SIMPROP_TRIGGERS ; DELETE FROM QUARTZ_TRIGGERS ; DELETE FROM QUARTZ_JOB_DETAILS ; DROP TABLE DIRIGIBLE_BPM ; DROP TABLE DIRIGIBLE_DATA_STRUCTURES ; DROP TABLE DIRIGIBLE_EXTENSIONS ; DROP TABLE DIRIGIBLE_EXTENSION_POINTS ; DROP TABLE DIRIGIBLE_IDENTITY ; DROP TABLE DIRIGIBLE_JOBS ; DROP TABLE DIRIGIBLE_LISTENERS ; DROP TABLE DIRIGIBLE_MIGRATIONS ; DROP TABLE DIRIGIBLE_ODATA ; DROP TABLE DIRIGIBLE_ODATA_CONTAINER ; DROP TABLE DIRIGIBLE_ODATA_MAPPING ; DROP TABLE DIRIGIBLE_ODATA_SCHEMA ; DROP TABLE DIRIGIBLE_ODATA_HANDLER ; DROP TABLE DIRIGIBLE_PUBLISH_LOGS ; DROP TABLE DIRIGIBLE_PUBLISH_REQUESTS ; DROP TABLE DIRIGIBLE_SECURITY_ACCESS ; DROP TABLE DIRIGIBLE_SECURITY_ROLES ; DROP TABLE DIRIGIBLE_SYNCHRONIZER_STATE ; DROP TABLE DIRIGIBLE_SYNCHRONIZER_STATE_ARTEFACTS ; DROP TABLE DIRIGIBLE_SYNCHRONIZER_STATE_LOG ; DROP TABLE DIRIGIBLE_WEBSOCKETS ; 
DROP TABLE QUARTZ_BLOB_TRIGGERS ; DROP TABLE QUARTZ_CALENDARS ; DROP TABLE QUARTZ_CRON_TRIGGERS ; DROP TABLE QUARTZ_FIRED_TRIGGERS ; DROP TABLE QUARTZ_LOCKS ; DROP TABLE QUARTZ_PAUSED_TRIGGER_GRPS ; DROP TABLE QUARTZ_SCHEDULER_STATE ; DROP TABLE QUARTZ_SIMPLE_TRIGGERS ; DROP TABLE QUARTZ_SIMPROP_TRIGGERS ; DROP TABLE QUARTZ_TRIGGERS ; DROP TABLE QUARTZ_JOB_DETAILS ;","title":"Cheat Sheet"},{"location":"developer-resources/cheat-sheet/#cheat-sheet","text":"","title":"Cheat Sheet"},{"location":"developer-resources/cheat-sheet/#clean-up-database","text":"Go to the Database perspective. Switch to the local datasource type and select the SystemDB . Execute the following queries: Delete Data Drop Tables DELETE FROM DIRIGIBLE_BPM ; DELETE FROM DIRIGIBLE_DATA_STRUCTURES ; DELETE FROM DIRIGIBLE_EXTENSIONS ; DELETE FROM DIRIGIBLE_EXTENSION_POINTS ; DELETE FROM DIRIGIBLE_IDENTITY ; DELETE FROM DIRIGIBLE_JOBS ; DELETE FROM DIRIGIBLE_LISTENERS ; DELETE FROM DIRIGIBLE_MIGRATIONS ; DELETE FROM DIRIGIBLE_ODATA ; DELETE FROM DIRIGIBLE_ODATA_CONTAINER ; DELETE FROM DIRIGIBLE_ODATA_MAPPING ; DELETE FROM DIRIGIBLE_ODATA_SCHEMA ; DELETE FROM DIRIGIBLE_ODATA_HANDLER ; DELETE FROM DIRIGIBLE_PUBLISH_LOGS ; DELETE FROM DIRIGIBLE_PUBLISH_REQUESTS ; DELETE FROM DIRIGIBLE_SECURITY_ACCESS ; DELETE FROM DIRIGIBLE_SECURITY_ROLES ; DELETE FROM DIRIGIBLE_SYNCHRONIZER_STATE ; DELETE FROM DIRIGIBLE_SYNCHRONIZER_STATE_ARTEFACTS ; DELETE FROM DIRIGIBLE_SYNCHRONIZER_STATE_LOG ; DELETE FROM DIRIGIBLE_WEBSOCKETS ; DELETE FROM QUARTZ_BLOB_TRIGGERS ; DELETE FROM QUARTZ_CALENDARS ; DELETE FROM QUARTZ_CRON_TRIGGERS ; DELETE FROM QUARTZ_FIRED_TRIGGERS ; DELETE FROM QUARTZ_LOCKS ; DELETE FROM QUARTZ_PAUSED_TRIGGER_GRPS ; DELETE FROM QUARTZ_SCHEDULER_STATE ; DELETE FROM QUARTZ_SIMPLE_TRIGGERS ; DELETE FROM QUARTZ_SIMPROP_TRIGGERS ; DELETE FROM QUARTZ_TRIGGERS ; DELETE FROM QUARTZ_JOB_DETAILS ; DROP TABLE DIRIGIBLE_BPM ; DROP TABLE DIRIGIBLE_DATA_STRUCTURES ; DROP TABLE DIRIGIBLE_EXTENSIONS ; DROP TABLE 
DIRIGIBLE_EXTENSION_POINTS ; DROP TABLE DIRIGIBLE_IDENTITY ; DROP TABLE DIRIGIBLE_JOBS ; DROP TABLE DIRIGIBLE_LISTENERS ; DROP TABLE DIRIGIBLE_MIGRATIONS ; DROP TABLE DIRIGIBLE_ODATA ; DROP TABLE DIRIGIBLE_ODATA_CONTAINER ; DROP TABLE DIRIGIBLE_ODATA_MAPPING ; DROP TABLE DIRIGIBLE_ODATA_SCHEMA ; DROP TABLE DIRIGIBLE_ODATA_HANDLER ; DROP TABLE DIRIGIBLE_PUBLISH_LOGS ; DROP TABLE DIRIGIBLE_PUBLISH_REQUESTS ; DROP TABLE DIRIGIBLE_SECURITY_ACCESS ; DROP TABLE DIRIGIBLE_SECURITY_ROLES ; DROP TABLE DIRIGIBLE_SYNCHRONIZER_STATE ; DROP TABLE DIRIGIBLE_SYNCHRONIZER_STATE_ARTEFACTS ; DROP TABLE DIRIGIBLE_SYNCHRONIZER_STATE_LOG ; DROP TABLE DIRIGIBLE_WEBSOCKETS ; DROP TABLE QUARTZ_BLOB_TRIGGERS ; DROP TABLE QUARTZ_CALENDARS ; DROP TABLE QUARTZ_CRON_TRIGGERS ; DROP TABLE QUARTZ_FIRED_TRIGGERS ; DROP TABLE QUARTZ_LOCKS ; DROP TABLE QUARTZ_PAUSED_TRIGGER_GRPS ; DROP TABLE QUARTZ_SCHEDULER_STATE ; DROP TABLE QUARTZ_SIMPLE_TRIGGERS ; DROP TABLE QUARTZ_SIMPROP_TRIGGERS ; DROP TABLE QUARTZ_TRIGGERS ; DROP TABLE QUARTZ_JOB_DETAILS ;","title":"Clean Up Database"},{"location":"developer-resources/java-remote-debugging/","text":"Java Remote Debugging Debugging To connect for remote Java debugging of Eclipse Dirigible, follow these steps: Start the Tomcat server in JPDA (debug) mode: Run Tomcat in JPDA mode on macOS on Linux on Windows Docker Image ./catalina.sh jpda run ./catalina.sh jpda run catalina.bat jpda run Run the Docker image with Java Debugging Options as described here . Eclipse IDE IntelliJ IDEA Create a new Debug Configuration : New Remote Java Application configuration: Note Double-click on the Remote Java Application to create a new configuration. Update the host and port properties, if needed. Press the Debug button to start a new remote debug session. Create a new Debug Configuration from the Edit Configurations... 
option: Add a new Remote JVM Debug configuration using the + button and double-click on Remote JVM Debug : Use the configuration provided in the screenshot below, update the host and port properties if needed: Press the Debug button to start a new remote debug session.","title":"Java Remote Debugging"},{"location":"developer-resources/java-remote-debugging/#java-remote-debugging","text":"","title":"Java Remote Debugging"},{"location":"developer-resources/java-remote-debugging/#debugging","text":"To connect for remote Java debugging of Eclipse Dirigible, follow these steps: Start the Tomcat server in JPDA (debug) mode: Run Tomcat in JPDA mode on macOS on Linux on Windows Docker Image ./catalina.sh jpda run ./catalina.sh jpda run catalina.bat jpda run Run the Docker image with Java Debugging Options as described here . Eclipse IDE IntelliJ IDEA Create a new Debug Configuration : New Remote Java Application configuration: Note Double-click on the Remote Java Application to create a new configuration. Update the host and port properties, if needed. Press the Debug button to start a new remote debug session. Create a new Debug Configuration from the Edit Configurations... option: Add a new Remote JVM Debug configuration using the + button and double-click on Remote JVM Debug : Use the configuration provided in the screenshot below, update the host and port properties if needed: Press the Debug button to start a new remote debug session.","title":"Debugging"},{"location":"developer-resources/keyboard-shortcuts/","text":"Keyboard Shortcuts Keyboard shortcuts represent combinations of two or more keyboard buttons that, when pressed at the same time, yield actions that can also be achieved by clicking a button on the UI. 
Keyboard Combination Action Ctrl + S / Cmd + S Save Alt + W / Option + W Close active editor Alt + Shift + W / Option + Shift + W Close all opened editors Ctrl + Shift + F / Cmd + Shift + F Open Search view Command Palette The command palette gives you access to the most common operations in Eclipse Dirigible along with their keyboard shortcuts. You can access the command palette by pressing F1 on your keyboard.","title":"Keyboard Shortcuts"},{"location":"developer-resources/keyboard-shortcuts/#keyboard-shortcuts","text":"Keyboard shortcuts represent combinations of two or more keyboard buttons that, when pressed at the same time, yield actions that can also be achieved by clicking a button on the UI. Keyboard Combination Action Ctrl + S / Cmd + S Save Alt + W / Option + W Close active editor Alt + Shift + W / Option + Shift + W Close all opened editors Ctrl + Shift + F / Cmd + Shift + F Open Search view","title":"Keyboard Shortcuts"},{"location":"developer-resources/keyboard-shortcuts/#command-pallette","text":"The command palette gives you access to the most common operations in Eclipse Dirigible along with their keyboard shortcuts. You can access the command palette by pressing F1 on your keyboard.","title":"Command Palette"},{"location":"development/","text":"Getting Started Overview This guide explains how to set up an Eclipse Dirigible instance and how to use it to build your very first Hello World service. The references section below points to the documentation with more technical details for the different aspects of the platform and its components and capabilities. Setup Trial Environment In case you are using the shared https://trial.dirigible.io environment, you can skip this section. Get the binary In case you want to use a prebuilt package, you can get the one built for your environment from the downloads section. To build Eclipse Dirigible from sources yourself, follow the instructions in the README . 
Choose the environment You can choose one of the setup options available to get an Eclipse Dirigible instance depending on your target environment. A shared trial instance is also available and can be accessed from here: https://trial.dirigible.io Environment Variables There are many configuration options , so you can connect to different databases, use different platforms, choose a specific set of plugins, and much more. Access the instance In case of a local setup on your machine, you can access Eclipse Dirigible at the following location: http://localhost:8080 Default Credentials The default username is admin and the default password is admin . The credentials can be updated, as described in the configuration options . Hello World Application Create a Hello World service Once you have a running Eclipse Dirigible instance, you can start with your project: Right-click inside the Projects view. From the menu select the New Project option. Enter hello-world for the name of the project and click the Create button. Right-click on the hello-world project in the Projects view and choose TypeScript or JavaScript ECMA6 service from the New dropdown: TypeScript JavaScript ECMA6 Select the New \u2192 TypeScript Service option: Enter service.ts for the name of the TypeScript Service : Double-click on the service.ts to open the file in the editor on the right. Info The file already contains a Hello World service implementation. Unless specified otherwise, the service can be executed by performing any of the following HTTP methods: GET , POST , PUT , DELETE and PATCH . Right-click on the hello-world project and choose the Publish option from the menu: With the service.ts selected in the Projects view, check the result of the execution of the server-side TypeScript Service in the Preview view: Note The TypeScript Service is published and available at the http://localhost:8080/services/ts/hello-world/service.ts URL. 
It can be accessed in a separate browser tab, consumed by a third-party application or API tools like Postman or cURL . Select the New \u2192 JavaScript ESM Service option: Enter service.mjs for the name of the JavaScript ESM Service : Double-click on the service.mjs to open the file in the editor on the right. Info The file already contains a Hello World service implementation. Unless specified otherwise, the service can be executed by performing any of the following HTTP methods: GET , POST , PUT , DELETE and PATCH . Right-click on the hello-world project and choose the Publish option from the menu: With the service.mjs selected in the Projects view, check the result of the execution of the server-side JavaScript ESM Service in the Preview view: Note The JavaScript ESM Service is published and available at the http://localhost:8080/services/js/hello-world/service.mjs URL. It can be accessed in a separate browser tab, consumed by a third-party application or API tools like Postman or cURL . Update the Hello World service Go to line 3 in the editor and change the Hello World! message to Hello Eclipse Dirigible! . TypeScript JavaScript ECMA6 import { response } from \"sdk/http\" ; response . println ( \"Hello Eclipse Dirigible!\" ); import { response } from \"sdk/http\" ; response . println ( \"Hello Eclipse Dirigible!\" ); Save the file: Ctrl + S for Windows, Cmd + S for macOS The output in the Preview view changes immediately. Note This is due to the default configuration of auto-publish on save . You can find more about this dynamic behavior in Dynamic Applications . References So far we have seen how easy it is to create and modify a Hello World RESTful service, but Eclipse Dirigible's capabilities go way beyond that. References You can explore the Tutorials section for more scenarios. If you would like to build complex services, you can go to the API section to find more JavaScript APIs that Eclipse Dirigible provides out-of-the-box. 
If you are curious what you can do with Eclipse Dirigible apart from writing server-side JavaScript services, you can have a look at the features section. In case you are interested in Modeling and Generation with the Low-Code/No-Code tooling of Eclipse Dirigible, you can read about Entity Data Models and Generation .","title":"Getting Started"},{"location":"development/#getting-started","text":"","title":"Getting Started"},{"location":"development/#overview","text":"This guide explains how to set up an Eclipse Dirigible instance and how to use it to build your very first Hello World service. The references section below points to the documentation with more technical details for the different aspects of the platform and its components and capabilities.","title":"Overview"},{"location":"development/#setup","text":"Trial Environment In case you are using the shared https://trial.dirigible.io environment, you can skip this section.","title":"Setup"},{"location":"development/#get-the-binary","text":"In case you want to use a prebuilt package, you can get the one built for your environment from the downloads section. To build Eclipse Dirigible from sources yourself, follow the instructions in the README .","title":"Get the binary"},{"location":"development/#choose-the-environment","text":"You can choose one of the setup options available to get an Eclipse Dirigible instance depending on your target environment. 
A shared trial instance is also available and can be accessed from here: https://trial.dirigible.io Environment Variables There are many configuration options , so you can connect to different databases, use different platforms, choose a specific set of plugins, and much more.","title":"Choose the environment"},{"location":"development/#access-the-instance","text":"In case of a local setup on your machine, you can access Eclipse Dirigible at the following location: http://localhost:8080 Default Credentials The default username is admin and the default password is admin . The credentials can be updated, as described in the configuration options .","title":"Access the instance"},{"location":"development/#hello-world-application","text":"","title":"Hello World Application"},{"location":"development/#create-a-hello-world-service","text":"Once you have a running Eclipse Dirigible instance, you can start with your project: Right-click inside the Projects view. From the menu select the New Project option. Enter hello-world for the name of the project and click the Create button. Right-click on the hello-world project in the Projects view and choose TypeScript or JavaScript ECMA6 service from the New dropdown: TypeScript JavaScript ECMA6 Select the New \u2192 TypeScript Service option: Enter service.ts for the name of the TypeScript Service : Double-click on the service.ts to open the file in the editor on the right. Info The file already contains a Hello World service implementation. Unless specified otherwise, the service can be executed by performing any of the following HTTP methods: GET , POST , PUT , DELETE and PATCH . Right-click on the hello-world project and choose the Publish option from the menu: With the service.ts selected in the Projects view, check the result of the execution of the server-side TypeScript Service in the Preview view: Note The TypeScript Service is published and available at the http://localhost:8080/services/ts/hello-world/service.ts URL. 
It can be accessed in a separate browser tab, consumed by a third-party application or API tools like Postman or cURL . Select the New \u2192 JavaScript ESM Service option: Enter service.mjs for the name of the JavaScript ESM Service : Double-click on the service.mjs to open the file in the editor on the right. Info The file already contains a Hello World service implementation. Unless specified otherwise, the service can be executed by performing any of the following HTTP methods: GET , POST , PUT , DELETE and PATCH . Right-click on the hello-world project and choose the Publish option from the menu: With the service.mjs selected in the Projects view, check the result of the execution of the server-side JavaScript ESM Service in the Preview view: Note The JavaScript ESM Service is published and available at the http://localhost:8080/services/js/hello-world/service.mjs URL. It can be accessed in a separate browser tab, consumed by a third-party application or API tools like Postman or cURL .","title":"Create a Hello World service"},{"location":"development/#update-the-hello-world-service","text":"Go to line 3 in the editor and change the Hello World! message to Hello Eclipse Dirigible! . TypeScript JavaScript ECMA6 import { response } from \"sdk/http\" ; response . println ( \"Hello Eclipse Dirigible!\" ); import { response } from \"sdk/http\" ; response . println ( \"Hello Eclipse Dirigible!\" ); Save the file: Ctrl + S for Windows, Cmd + S for macOS The output in the Preview view changes immediately. Note This is due to the default configuration of auto-publish on save . You can find more about this dynamic behavior in Dynamic Applications .","title":"Update the Hello World service"},{"location":"development/#references","text":"So far we have seen how easy it is to create and modify a Hello World RESTful service, but Eclipse Dirigible's capabilities go way beyond that. References You can explore the Tutorials section for more scenarios. 
If you would like to build complex services, you can go to the API section to find more JavaScript APIs that Eclipse Dirigible provides out-of-the-box. If you are curious what you can do with Eclipse Dirigible apart from writing server-side JavaScript services, you can have a look at the features section. In case you are interested in Modeling and Generation with the Low-Code/No-Code tooling of Eclipse Dirigible, you can read about Entity Data Models and Generation .","title":"References"},{"location":"development/devops/","text":"Development & Operations Providing modern business applications in the Cloud nowadays requires a tight relationship between development and operations activities. By promoting in-system development for full-stack applications, Dirigible needs to cover both phases with the necessary tools and backend frameworks. Development The front-facing Web IDE component is a collection of plugins for project management, source code editing, modeling, SCM integration, database management, and many more. Workbench Git Database Debugger Documents Search Import Preview Editor - Monaco BPMN Modeler Database Schema Modeler Entity Data Modeler Operations The functionality for import and export of projects or workspaces, as well as cloning of a whole Dirigible instance, monitoring, document management, etc., is also integrated in the Web IDE component. Operations Database Repository Terminal Snapshot Logs Console","title":"Development & Operations"},{"location":"development/devops/#development-operations","text":"Providing modern business applications in the Cloud nowadays requires a tight relationship between development and operations activities. 
By promoting in-system development for full-stack applications, Dirigible needs to cover both phases with the necessary tools and backend frameworks.","title":"Development & Operations"},{"location":"development/devops/#development","text":"The front-facing Web IDE component is a collection of plugins for project management, source code editing, modeling, SCM integration, database management, and many more. Workbench Git Database Debugger Documents Search Import Preview Editor - Monaco BPMN Modeler Database Schema Modeler Entity Data Modeler","title":"Development"},{"location":"development/devops/#operations","text":"The functionality for import and export of projects or workspaces, as well as cloning of a whole Dirigible instance, monitoring, document management, etc., is also integrated in the Web IDE component. Operations Database Repository Terminal Snapshot Logs Console","title":"Operations"},{"location":"development/artifacts/","text":"Artifacts Overview File extensions Database *.table - a JSON based database table descriptor file. The data structures synchroniser automatically reads all the available *.table files in the repository (including the classpath resources) and creates the underlying database tables in the default database. The definition also supports dependencies, which enables the synchroniser to perform a topological sort before starting the creation of the database artefacts. *.view - a JSON based database view descriptor file. The synchroniser reads and creates the database views as defined in the model. *.csvim - a JSON based descriptor file pointing to a CSV file to be imported into the configured database table. Security *.access - security constraints file. It defines the access permissions for the given endpoints. *.role - roles definition file. Flows *.listener - listener definition describing the link between the message queue or topic and the corresponding handler. 
*.job - job definition describing the period in which the scheduled handler will be executed. Scripting *.js - a JavaScript file intended to be executed either server-side by the supported engine (GraalJS) or client-side by the browser's built-in engine. *.command - a Shell Command service. *.md - a Markdown Wiki file. ES6 and TypeScript Starting from version 8.x of Eclipse Dirigible, it is also possible to use *.mjs (ES6 modules) and *.ts (TypeScript) for the development of server-side services. Modeling *.dsm - an internal XML based format file containing a database schema model diagram. *.schema - a JSON descriptor for a database schema layout produced by the Database Schema Modeler. *.edm - an internal XML based format file containing an entity data model diagram. *.model - a JSON descriptor for an entity data model produced by the Entity Data Modeler. *.bpmn - a BPMN 2.0 XML file containing a definition of a business process.","title":"Artifacts Overview"},{"location":"development/artifacts/#artifacts-overview","text":"","title":"Artifacts Overview"},{"location":"development/artifacts/#file-extensions","text":"","title":"File extensions"},{"location":"development/artifacts/#database","text":"*.table - a JSON based database table descriptor file. The data structures synchroniser automatically reads all the available *.table files in the repository (including the classpath resources) and creates the underlying database tables in the default database. The definition also supports dependencies, which allows the synchroniser to perform a topological sort before creating the database artefacts. *.view - a JSON based database view descriptor file. The synchroniser reads and creates the database views as defined in the model.
*.csvim - a JSON based descriptor file pointing to a CSV file to be imported into the configured database table.","title":"Database"},{"location":"development/artifacts/#security","text":"*.access - security constraints file. It defines the access permissions for the given endpoints. *.role - roles definition file.","title":"Security"},{"location":"development/artifacts/#flows","text":"*.listener - listener definition describing the link between the message queue or topic and the corresponding handler. *.job - job definition describing the period in which the scheduled handler will be executed.","title":"Flows"},{"location":"development/artifacts/#scripting","text":"*.js - a JavaScript file intended to be executed either server-side by the supported engine (GraalJS) or client-side by the browser's built-in engine. *.command - a Shell Command service. *.md - a Markdown Wiki file. ES6 and TypeScript Starting from version 8.x of Eclipse Dirigible, it is also possible to use *.mjs (ES6 modules) and *.ts (TypeScript) for the development of server-side services.","title":"Scripting"},{"location":"development/artifacts/#modeling","text":"*.dsm - an internal XML based format file containing a database schema model diagram. *.schema - a JSON descriptor for a database schema layout produced by the Database Schema Modeler. *.edm - an internal XML based format file containing an entity data model diagram. *.model - a JSON descriptor for an entity data model produced by the Entity Data Modeler. *.bpmn - a BPMN 2.0 XML file containing a definition of a business process.","title":"Modeling"},{"location":"development/artifacts/data-files/","text":"Data Files Delimiter Separated Values *.dsv data files are used for importing test data during development or for defining static content, e.g., nomenclatures. The data file name has to be the same as the target table name.
The delimiter is the | character, and the order of the data fields should be the same as the natural order in the target table. Be careful when using static data in tables. Entity Services (generated by the templates) use a sequence algorithm for identity columns starting from 1. The automatic re-initialization of static content from the data file can be achieved when you create a *.dsv file in your project. To make this more flexible, the following semantic file types are introduced: REPLACE (*.replace) - the rows in the database table always correspond to the lines in the data file. Processing such a file means first deleting all the records in the database table and then inserting the rows from the file. This is the behavior of the initial format - DSV (*.dsv). The processing is triggered on restart of the App/Web Server or on publishing of the project containing these files. APPEND (*.append) - the rows from these files are imported only once into the corresponding database tables. If the tables already contain records, the insert is skipped. After the initial import, the corresponding sequence is set to the max ID of the table, so that the table can be used afterwards as persistence storage, e.g., for the standard CRUD JavaScript Services. DELETE (*.delete) - if the file contains * as the only line, the whole table is cleaned up. Otherwise, only the listed records are deleted by ID (first column = ID = primary key) . UPDATE (*.update) - the records in the database table are updated with the corresponding lines in the data files. The first column is the ID (primary key), used as the selection parameter for the update clause. The existing records in the table are not deleted in advance, as in the REPLACE case. If no record exists with the given ID , it is inserted. Samples Data Structures and Data Files samples can be found here: Database Table (*.table) . Database View (*.view) . Data Replace (*.replace) . Data Append (*.append) . Data Delete (*.delete) .
Data Update (*.update) .","title":"Data Files"},{"location":"development/artifacts/data-files/#data-files","text":"Delimiter Separated Values *.dsv data files are used for importing test data during development or for defining static content, e.g., nomenclatures. The data file name has to be the same as the target table name. The delimiter is the | character, and the order of the data fields should be the same as the natural order in the target table. Be careful when using static data in tables. Entity Services (generated by the templates) use a sequence algorithm for identity columns starting from 1. The automatic re-initialization of static content from the data file can be achieved when you create a *.dsv file in your project. To make this more flexible, the following semantic file types are introduced: REPLACE (*.replace) - the rows in the database table always correspond to the lines in the data file. Processing such a file means first deleting all the records in the database table and then inserting the rows from the file. This is the behavior of the initial format - DSV (*.dsv). The processing is triggered on restart of the App/Web Server or on publishing of the project containing these files. APPEND (*.append) - the rows from these files are imported only once into the corresponding database tables. If the tables already contain records, the insert is skipped. After the initial import, the corresponding sequence is set to the max ID of the table, so that the table can be used afterwards as persistence storage, e.g., for the standard CRUD JavaScript Services. DELETE (*.delete) - if the file contains * as the only line, the whole table is cleaned up. Otherwise, only the listed records are deleted by ID (first column = ID = primary key) . UPDATE (*.update) - the records in the database table are updated with the corresponding lines in the data files. The first column is the ID (primary key), used as the selection parameter for the update clause.
The existing records in the table are not deleted in advance, as in the REPLACE case. If no record exists with the given ID , it is inserted. Samples Data Structures and Data Files samples can be found here: Database Table (*.table) . Database View (*.view) . Data Replace (*.replace) . Data Append (*.append) . Data Delete (*.delete) . Data Update (*.update) .","title":"Data Files"},{"location":"development/artifacts/database-table/","text":"Table Model Table Model is a JSON formatted *.table descriptor. It represents the layout of the database table, which will be created during the activation process. The data structures synchroniser automatically reads all the available *.table files in the repository (including the classpath resources) and creates the underlying database tables in the default database. The definition also supports dependencies, which allows the synchroniser to perform a topological sort before creating the database artifacts. Example descriptor: { \"tableName\" : \"TEST001\" , \"columns\" : [ { \"name\" : \"ID\" , \"type\" : \"INTEGER\" , \"length\" : \"0\" , \"notNull\" : \"true\" , \"primaryKey\" : \"true\" , \"defaultValue\" : \"\" }, { \"name\" : \"NAME\" , \"type\" : \"VARCHAR\" , \"length\" : \"20\" , \"notNull\" : \"false\" , \"primaryKey\" : \"false\" , \"defaultValue\" : \"\" }, { \"name\" : \"DATEOFBIRTH\" , \"type\" : \"DATE\" , \"length\" : \"0\" , \"notNull\" : \"false\" , \"primaryKey\" : \"false\" , \"defaultValue\" : \"\" }, { \"name\" : \"SALARY\" , \"type\" : \"DOUBLE\" , \"length\" : \"0\" , \"notNull\" : \"false\" , \"primaryKey\" : \"false\" , \"defaultValue\" : \"\" } ] } The supported database types are: VARCHAR - for text-based fields up to 2K characters long CHAR - for text-based fields with a fixed length of up to 255 characters INTEGER - 32 bit BIGINT - 64 bit SMALLINT - 16 bit REAL - 7 digits of mantissa DOUBLE - 15 digits of mantissa DATE - represents a date consisting of day, month, and year
TIME - represents a time consisting of hours, minutes, and seconds TIMESTAMP - represents DATE, TIME, a nanosecond field, and a time zone BLOB - a binary object, such as an image, audio, etc. The activation of the table descriptor is the process of creating a database table in the target database. The activator constructs a CREATE TABLE SQL statement considering the dialect of the target database system. If a particular table name already exists, the activator checks whether there is a compatible change, such as adding new columns, and constructs an ALTER TABLE SQL statement. If the change is incompatible, the activator returns an error that has to be solved manually through the SQL console. Data Structures Creation of table model (JSON formatted *.table descriptor) and actual creation of the corresponding database table during publishing. Creation of view model (JSON formatted *.view descriptor) and actual creation of the corresponding database view during publishing. Creation of delimiter separated values ( *.append , *.update , *.delete , *.replace ) data files and populating the corresponding database table during publishing. Automatic altering of existing tables from the models on compatible changes (new columns added). Modeling of the database schema ( *.dsm and *.schema ) files and creation of the tables, views, and constraints during publishing. Scripting Services Support of JavaScript language by using GraalVM JS as runtime execution engine ( *.js ). Support of strictly defined enterprise API for JavaScript to be used by the business application developers. Web Content Support of client-side Web related artifacts, such as HTML, CSS, JS, pictures, etc. Wiki Content Support of Markdown format for Wiki pages. Integration Services Support of listeners for messages from the built-in message bus ( *.listener ). Support of scheduled jobs as triggers for backend services invocation ( *.job ). 
Support of business processes defined in BPMN 2.0 and executed by the underlying BPM process engine ( *.bpmn ). Support of shell commands execution ( *.command ). Support of OData 2.0 ( *.odata ). Support of websockets ( *.websocket ). Mobile Applications Support of native mobile application development via Tabris.js . Extension Definitions Creation of extension points (JSON formatted descriptor - *.extensionpoint ). Creation of extensions by a given extension point (JSON formatted descriptor - *.extension ). Tooling Workbench perspective for full support of project management (New, Cut, Copy, Paste, Delete, Refresh, Import, Export, etc.) Database perspective for RDBMS management including SQL Console Enhanced code editor with highlight support for JavaScript, HTML, JSON, XML, etc. Preview view for easy testing of changes in Web, Wiki, and Scripting Services Configurable Logs view , which provides server-side logs and traces Lots of template-based wizards for creating new content and services Import and export of project content Documents perspective for import of binary files for external documents and pictures Repository perspective for low-level repository content management Debugger perspective for debugging backend JavaScript services Terminal perspective with the corresponding main view for execution of shell commands on the target instance's OS Modeling Modeling of database schema ( *.dsm and *.schema ) files with Database Schema Modeler Modeling of entity data model ( *.edm and *.model ) files with Entity Data Modeler Modeling of BPMN process ( *.bpmn ) files with BPMN Modeler Modeling of Web form layout ( *.form ) files with Form Designer Security Role-based access management for Web services as well as the document repository Security constraints model (JSON formatted *.access ) support Several predefined roles, which can be used out-of-the-box (Everyone, Administrator, Manager, PowerUser, User, ReadWrite, ReadOnly) Registry Publishing support - exposing 
the artifacts from the user's workspace publicly Auto-publishing support for better usability User interface for browsing and searching within the published content Separate lists of endpoints and viewers per type of services - JavaScript, Web, wiki, etc. Separate browse user interface for Web and wiki content","title":"Database Table"},{"location":"development/artifacts/database-table/#table-model","text":"Table Model is a JSON formatted *.table descriptor. It represents the layout of the database table, which will be created during the activation process. The data structures synchroniser automatically reads all the available *.table files in the repository (including the classpath resources) and creates the underlying database tables in the default database. The definition also supports dependencies, which allows the synchroniser to perform a topological sort before creating the database artifacts. Example descriptor: { \"tableName\" : \"TEST001\" , \"columns\" : [ { \"name\" : \"ID\" , \"type\" : \"INTEGER\" , \"length\" : \"0\" , \"notNull\" : \"true\" , \"primaryKey\" : \"true\" , \"defaultValue\" : \"\" }, { \"name\" : \"NAME\" , \"type\" : \"VARCHAR\" , \"length\" : \"20\" , \"notNull\" : \"false\" , \"primaryKey\" : \"false\" , \"defaultValue\" : \"\" }, { \"name\" : \"DATEOFBIRTH\" , \"type\" : \"DATE\" , \"length\" : \"0\" , \"notNull\" : \"false\" , \"primaryKey\" : \"false\" , \"defaultValue\" : \"\" }, { \"name\" : \"SALARY\" , \"type\" : \"DOUBLE\" , \"length\" : \"0\" , \"notNull\" : \"false\" , \"primaryKey\" : \"false\" , \"defaultValue\" : \"\" } ] } The supported database types are: VARCHAR - for text-based fields up to 2K characters long CHAR - for text-based fields with a fixed length of up to 255 characters INTEGER - 32 bit BIGINT - 64 bit SMALLINT - 16 bit REAL - 7 digits of mantissa DOUBLE - 15 digits of mantissa DATE - represents a date consisting of day, month, and year TIME - represents a time consisting of hours,
minutes, and seconds TIMESTAMP - represents DATE, TIME, a nanosecond field, and a time zone BLOB - a binary object, such as an image, audio, etc. The activation of the table descriptor is the process of creating a database table in the target database. The activator constructs a CREATE TABLE SQL statement considering the dialect of the target database system. If a particular table name already exists, the activator checks whether there is a compatible change, such as adding new columns, and constructs an ALTER TABLE SQL statement. If the change is incompatible, the activator returns an error that has to be solved manually through the SQL console.","title":"Table Model"},{"location":"development/artifacts/database-table/#data-structures","text":"Creation of table model (JSON formatted *.table descriptor) and actual creation of the corresponding database table during publishing. Creation of view model (JSON formatted *.view descriptor) and actual creation of the corresponding database view during publishing. Creation of delimiter separated values ( *.append , *.update , *.delete , *.replace ) data files and populating the corresponding database table during publishing. Automatic altering of existing tables from the models on compatible changes (new columns added). Modeling of the database schema ( *.dsm and *.schema ) files and creation of the tables, views, and constraints during publishing.","title":"Data Structures"},{"location":"development/artifacts/database-table/#scripting-services","text":"Support of JavaScript language by using GraalVM JS as runtime execution engine ( *.js ). 
Support of strictly defined enterprise API for JavaScript to be used by the business application developers.","title":"Scripting Services"},{"location":"development/artifacts/database-table/#web-content","text":"Support of client-side Web related artifacts, such as HTML, CSS, JS, pictures, etc.","title":"Web Content"},{"location":"development/artifacts/database-table/#wiki-content","text":"Support of Markdown format for Wiki pages.","title":"Wiki Content"},{"location":"development/artifacts/database-table/#integration-services","text":"Support of listeners for messages from the built-in message bus ( *.listener ). Support of scheduled jobs as triggers for backend services invocation ( *.job ). Support of business processes defined in BPMN 2.0 and executed by the underlying BPM process engine ( *.bpmn ). Support of shell commands execution ( *.command ). Support of OData 2.0 ( *.odata ). Support of websockets ( *.websocket ).","title":"Integration Services"},{"location":"development/artifacts/database-table/#mobile-applications","text":"Support of native mobile application development via Tabris.js .","title":"Mobile Applications"},{"location":"development/artifacts/database-table/#extension-definitions","text":"Creation of extension points (JSON formatted descriptor - *.extensionpoint ). Creation of extensions by a given extension point (JSON formatted descriptor - *.extension ).","title":"Extension Definitions"},{"location":"development/artifacts/database-table/#tooling","text":"Workbench perspective for full support of project management (New, Cut, Copy, Paste, Delete, Refresh, Import, Export, etc.) Database perspective for RDBMS management including SQL Console Enhanced code editor with highlight support for JavaScript, HTML, JSON, XML, etc. 
Preview view for easy testing of changes in Web, Wiki, and Scripting Services Configurable Logs view , which provides server-side logs and traces Lots of template-based wizards for creating new content and services Import and export of project content Documents perspective for import of binary files for external documents and pictures Repository perspective for low-level repository content management Debugger perspective for debugging backend JavaScript services Terminal perspective with the corresponding main view for execution of shell commands on the target instance's OS","title":"Tooling"},{"location":"development/artifacts/database-table/#modeling","text":"Modeling of database schema ( *.dsm and *.schema ) files with Database Schema Modeler Modeling of entity data model ( *.edm and *.model ) files with Entity Data Modeler Modeling of BPMN process ( *.bpmn ) files with BPMN Modeler Modeling of Web form layout ( *.form ) files with Form Designer","title":"Modeling"},{"location":"development/artifacts/database-table/#security","text":"Role-based access management for Web services as well as the document repository Security constraints model (JSON formatted *.access ) support Several predefined roles, which can be used out-of-the-box (Everyone, Administrator, Manager, PowerUser, User, ReadWrite, ReadOnly)","title":"Security"},{"location":"development/artifacts/database-table/#registry","text":"Publishing support - exposing the artifacts from the user's workspace publicly Auto-publishing support for better usability User interface for browsing and searching within the published content Separate lists of endpoints and viewers per type of services - JavaScript, Web, wiki, etc. Separate browse user interface for Web and wiki content","title":"Registry"},{"location":"development/concepts/","text":"Concepts Overview Dynamic Applications There are several must-know concepts that are implied in the cloud toolkit and have to be understood before getting started. 
Some of them are closely related to the nature and behavior of dynamic applications; others follow best practices from service-oriented architecture that are also reflected in cloud applications. Repository First comes the concept of a repository . It is the place where the application's content is stored - such as a database for the Eclipse Dirigible instance. Workspace Next is the concept of a workspace that is very similar to the well-known workspace from desktop IDEs (e.g., Eclipse). The workspace can hold one or more projects. One user can have multiple workspaces, but can work in only one at a time. Registry Registry and the related publishing processes are taken from the SOA (UDDI) and recent API management trends to bring some of their strengths, such as discoverability, reusability, loose coupling, relevance, etc. Generation To boost the development productivity at the very initial phase, we introduced template-based generation of application artifacts via wizards. Entity Services The new Web 2.0 paradigm and the leveraged REST architectural style changed the way services should behave and be described. Although there is a push for bilateral contracts only and free-form description of services, we decided to introduce a more sophisticated kind of service for special purposes - entity services . Modeling This is the visual definition of database schema models, entity data models, and BPMN processes. In Eclipse Dirigible, modeling is enabled by several editors and modelers . REST framework Along with the low-level HTTP request, response, and session handling, Eclipse Dirigible provides a higher-level framework for building REST services. More information on how to use this framework can be found here . Web Content This is the client-side application code transported via the container web channel. More information can be found here . Mobile Apps Mobile application support in Eclipse Dirigible is achieved via Tabris.js .
Extensions Extensibility is an important requirement for business applications built to follow custom processes in Line of Business (LoB) areas. In the cloud toolkit, a generic description of the extension points and extensions is provided without explicitly defining the contract. This is a simple but powerful way to define extensions .","title":"Concepts Overview"},{"location":"development/concepts/#concepts-overview","text":"","title":"Concepts Overview"},{"location":"development/concepts/#dynamic-applications","text":"There are several must-know concepts that are implied in the cloud toolkit and have to be understood before getting started. Some of them are closely related to the nature and behavior of dynamic applications; others follow best practices from service-oriented architecture that are also reflected in cloud applications.","title":"Dynamic Applications"},{"location":"development/concepts/#repository","text":"First comes the concept of a repository . It is the place where the application's content is stored - such as a database for the Eclipse Dirigible instance.","title":"Repository"},{"location":"development/concepts/#workspace","text":"Next is the concept of a workspace that is very similar to the well-known workspace from desktop IDEs (e.g., Eclipse). The workspace can hold one or more projects.
One user can have multiple workspaces, but can work in only one at a time.","title":"Workspace"},{"location":"development/concepts/#registry","text":"Registry and the related publishing processes are taken from the SOA (UDDI) and recent API management trends to bring some of their strengths, such as discoverability, reusability, loose coupling, relevance, etc.","title":"Registry"},{"location":"development/concepts/#generation","text":"To boost the development productivity at the very initial phase, we introduced template-based generation of application artifacts via wizards.","title":"Generation"},{"location":"development/concepts/#entity-services","text":"The new Web 2.0 paradigm and the leveraged REST architectural style changed the way services should behave and be described. Although there is a push for bilateral contracts only and free-form description of services, we decided to introduce a more sophisticated kind of service for special purposes - entity services .","title":"Entity Services"},{"location":"development/concepts/#modeling","text":"This is the visual definition of database schema models, entity data models, and BPMN processes. In Eclipse Dirigible, modeling is enabled by several editors and modelers .","title":"Modeling"},{"location":"development/concepts/#rest-framework","text":"Along with the low-level HTTP request, response, and session handling, Eclipse Dirigible provides a higher-level framework for building REST services. More information on how to use this framework can be found here .","title":"REST framework"},{"location":"development/concepts/#web-content","text":"This is the client-side application code transported via the container web channel.
More information can be found here .","title":"Web Content"},{"location":"development/concepts/#mobile-apps","text":"Mobile application support in Eclipse Dirigible is achieved via Tabris.js .","title":"Mobile Apps"},{"location":"development/concepts/#extensions","text":"Extensibility is an important requirement for business applications built to follow custom processes in Line of Business (LoB) areas. In the cloud toolkit, a generic description of the extension points and extensions is provided without explicitly defining the contract. This is a simple but powerful way to define extensions .","title":"Extensions"},{"location":"development/concepts/dynamic-applications/","text":"Dynamic Applications We introduced the term \"dynamic applications\" as one that narrows the scope of the target applications that can be created using Eclipse Dirigible. The overall process of building dynamic applications relies on well-known and proven principles: In-system development - known from microcontrollers to business software systems. A major benefit is working on a live system where all changes you make take effect immediately, hence the impact and side effects can be realized in the early stages of the development process. Content-centric - known from networking as well as from development processes. In the context of dynamic applications, all the artifacts are text-based models or executable scripts stored in a generic repository (along with the related binaries, such as images). This makes the life-cycle management of the application itself and the transport between the landscapes (Dev/Test/Prod) straightforward. As a result, you can set up the whole system just by pulling the content from a remote source code repository such as git . Scripting languages - programming languages written for a special runtime environment that can interpret (rather than compile) the execution of tasks.
Today's dynamic languages, as well as their smooth integration into Web servers, make the rise of in-system development in the cloud possible. Shortest turn-around time - the driving principle for our tooling, because instant access and instant value are some of the most important requirements for the developers. In general, components of a dynamic application can be separated into the following categories: Data structures - The artifacts representing the domain model of the application. In our case, we have chosen the well-accepted JSON format for describing a normalized entity model. There is no intermediate adaptation layer, hence all entities directly represent the database artifacts - tables and views. Entity services - Once we have the domain model entities, the next step is to expose them as Web services. Following the modern Web patterns, we provide the scripting capabilities so you can create your RESTful services in JavaScript, Ruby, and Groovy. Scripting services - During the development, you can use a rich set of APIs that give you access to the database and HTTP layer, utilities, and to the direct Java APIs underneath. Support for creating unit tests is important and is, therefore, integrated as an atomic part of the scripting support itself - you can use the same language for the tests as the one for the services themselves. User interface - The Web 2.0 paradigm, as well as the HTML5 specification, brings the Web UI to another level. There are already many cool client-side AJAX frameworks that you can use depending on the nature of your application. Integration services - Following the principle of atomicity, one dynamic application should be as self-contained as possible. Unfortunately, in the real world there are always some external services that have to be integrated in your application - for data transfer, triggering external processes, lookup in external sources, etc.
For this purpose, we provide capabilities for creating simple routing services and dynamic EIPs. Documentation - The documentation is an integral part of your application. The target format for describing services and for overall development documentation is already well accepted - wiki .","title":"Dynamic Applications"},{"location":"development/concepts/dynamic-applications/#dynamic-applications","text":"We introduced the term \"dynamic applications\" as one that narrows the scope of the target applications that can be created using Eclipse Dirigible. The overall process of building dynamic applications relies on well-known and proven principles: In-system development - known from microcontrollers to business software systems. A major benefit is working on a live system where all changes you make take effect immediately, hence the impact and side effects can be realized in the early stages of the development process. Content-centric - known from networking as well as from development processes. In the context of dynamic applications, all the artifacts are text-based models or executable scripts stored in a generic repository (along with the related binaries, such as images). This makes the life-cycle management of the application itself and the transport between the landscapes (Dev/Test/Prod) straightforward. As a result, you can set up the whole system just by pulling the content from a remote source code repository such as git . Scripting languages - programming languages written for a special runtime environment that can interpret (rather than compile) the execution of tasks. Today's dynamic languages, as well as their smooth integration into Web servers, make the rise of in-system development in the cloud possible. Shortest turn-around time - the driving principle for our tooling, because instant access and instant value are some of the most important requirements for the developers.
In general, components of a dynamic application can be separated into the following categories: Data structures - The artifacts representing the domain model of the application. In our case, we have chosen the well-accepted JSON format for describing a normalized entity model. There is no intermediate adaptation layer, hence all entities directly represent the database artifacts - tables and views. Entity services - Once we have the domain model entities, the next step is to expose them as Web services. Following the modern Web patterns, we provide the scripting capabilities so you can create your RESTful services in JavaScript, Ruby, and Groovy. Scripting services - During the development, you can use a rich set of APIs that give you access to the database and HTTP layer, utilities, and to the direct Java APIs underneath. Support for creating unit tests is important and is, therefore, integrated as an atomic part of the scripting support itself - you can use the same language for the tests as the one for the services themselves. User interface - The Web 2.0 paradigm, as well as the HTML5 specification, brings the Web UI to another level. There are already many cool client-side AJAX frameworks that you can use depending on the nature of your application. Integration services - Following the principle of atomicity, one dynamic application should be as self-contained as possible. Unfortunately, in the real world there are always some external services that have to be integrated in your application - for data transfer, triggering external processes, lookup in external sources, etc. For this purpose, we provide capabilities for creating simple routing services and dynamic EIPs. Documentation - The documentation is an integral part of your application.
The target format for describing services and for overall development documentation is already well accepted - wiki .","title":"Dynamic Applications"},{"location":"development/concepts/entity-service/","text":"Entity Service In general, the entity service is a fully capable RESTful service as defined by the REST architectural style for performance, scalability, simplicity, and so on. It exposes the CRUD operations of a given domain model object. Underneath it, the database store is connected as a data transfer layer. The domain object management is the service pattern that is used most often when following the RESTful paradigm on business software components. In Eclipse Dirigible, the standard functionality of Web services is enhanced but without breaking the REST principles. This is useful for generic utilities and user interface generation. Standard functionality: GET method If the requested path points directly to the service endpoint (no additional parameters), it lists all the entities of this type (in this collection). If the request contains an id parameter, the service returns only the requested entity. POST method - creates an entity, getting the fields from the request body (JSON formatted) and an auto-generated ID. PUT method - updates the entity, getting the ID from the request body (JSON formatted). DELETE method - deletes the entity by the provided ID parameter, which is mandatory. Enhancements to the standard functionality of GET with the following parameters: count - returns the size of the entities collection. metadata - returns the simplified descriptor of the entity in JSON (see below). sort - indicates the order of the entities. desc - indicates the reverse order; used with the above parameter. limit - used for paging, returns a limited result set. offset - used for paging, the result set starts from the offset value. 
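The semantics of these enhanced GET parameters can be sketched in plain JavaScript. The books array and the handleGet helper below are hypothetical illustrations only, not part of the Dirigible API:

```javascript
// Illustrative sketch only - not Dirigible's implementation. It shows the
// semantics of count, sort, desc, limit and offset over an in-memory array.
var books = [
    { book_id: 1, book_title: "Dune" },
    { book_id: 2, book_title: "Neuromancer" },
    { book_id: 3, book_title: "Foundation" }
];

function handleGet(entities, params) {
    // count short-circuits and returns only the collection size
    if (params.count) {
        return entities.length;
    }
    var result = entities.slice();
    // sort orders by the given property; desc reverses that order
    if (params.sort) {
        result.sort(function (a, b) {
            return a[params.sort] > b[params.sort] ? 1 : -1;
        });
        if (params.desc) {
            result.reverse();
        }
    }
    // limit and offset implement paging over the (possibly sorted) result
    var offset = params.offset || 0;
    var limit = params.limit || result.length;
    return result.slice(offset, offset + limit);
}
```

For instance, a ?count query would map to params.count, while ?limit=10&offset=20 would map to params.limit and params.offset.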
Example metadata for an entity: { \"name\" : \"books\" , \"type\" : \"object\" , \"properties\" : [ { \"name\" : \"book_id\" , \"type\" : \"integer\" , \"key\" : \"true\" , \"required\" : \"true\" }, { \"name\" : \"book_isbn\" , \"type\" : \"string\" }, { \"name\" : \"book_title\" , \"type\" : \"string\" }, { \"name\" : \"book_author\" , \"type\" : \"string\" }, { \"name\" : \"book_editor\" , \"type\" : \"string\" }, { \"name\" : \"book_publisher\" , \"type\" : \"string\" }, { \"name\" : \"book_format\" , \"type\" : \"string\" }, { \"name\" : \"book_publication_date\" , \"type\" : \"date\" }, { \"name\" : \"book_price\" , \"type\" : \"double\" } ] } All these features of entity services are applied during the generation process. As an input, the template uses a database table and an entity service name that are entered in the Entity Data Modeler . Just select the *.entity artifact in the Workspace view. Choose Generate \u2192 User Interface for Entity Service . Limitations for a table to be entity-service compliant: There should be only one primary key column, which will be used as the entity's identity . Only the database column types supported by default for generation can be used (simple types only; clob and blob are not supported). Generic query methods are not generated because: They would cover only very simple cases with reasonable performance. For complex queries, introducing an additional layer results in worse performance in comparison to an SQL script. 
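To illustrate how such a metadata descriptor could be consumed, here is a hypothetical validator (validate is not a Dirigible API) that checks an entity object against the descriptor's required flags:

```javascript
// Hypothetical helper - checks an entity against a metadata descriptor
// like the "books" one above. Only the "required" flag is enforced here.
var metadata = {
    name: "books",
    type: "object",
    properties: [
        { name: "book_id", type: "integer", key: "true", required: "true" },
        { name: "book_title", type: "string" }
    ]
};

function validate(entity, descriptor) {
    var errors = [];
    descriptor.properties.forEach(function (property) {
        // note: the descriptor encodes booleans as strings ("true")
        if (property.required === "true" && entity[property.name] === undefined) {
            errors.push(property.name + " is required");
        }
    });
    return errors;
}
```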
Entity services are generated in JavaScript, hence they can be accessed right after generation and publishing at: <protocol>://<host>:<port>/services/v4/js/<project>/<service> Here's an example: https://example.com/services/v4/js/bookstore/books.js Or just select them in the Workspace view and check the result in the Preview view.","title":"Entity Service"},{"location":"development/concepts/entity-service/#entity-service","text":"In general, the entity service is a fully capable RESTful service as defined by the REST architectural style for performance, scalability, simplicity, and so on. It exposes the CRUD operations of a given domain model object. Underneath it, the database store is connected as a data transfer layer. The domain object management is the service pattern that is used most often when following the RESTful paradigm on business software components. In Eclipse Dirigible, the standard functionality of Web services is enhanced but without breaking the REST principles. This is useful for generic utilities and user interface generation. Standard functionality: GET method If the requested path points directly to the service endpoint (no additional parameters), it lists all the entities of this type (in this collection). If the request contains an id parameter, the service returns only the requested entity. POST method - creates an entity, getting the fields from the request body (JSON formatted) and an auto-generated ID. PUT method - updates the entity, getting the ID from the request body (JSON formatted). DELETE method - deletes the entity by the provided ID parameter, which is mandatory. Enhancements to the standard functionality of GET with the following parameters: count - returns the size of the entities collection. metadata - returns the simplified descriptor of the entity in JSON (see below). sort - indicates the order of the entities. desc - indicates the reverse order; used with the above parameter. limit - used for paging, returns a limited result set. 
offset - used for paging, the result set starts from the offset value. Example metadata for an entity: { \"name\" : \"books\" , \"type\" : \"object\" , \"properties\" : [ { \"name\" : \"book_id\" , \"type\" : \"integer\" , \"key\" : \"true\" , \"required\" : \"true\" }, { \"name\" : \"book_isbn\" , \"type\" : \"string\" }, { \"name\" : \"book_title\" , \"type\" : \"string\" }, { \"name\" : \"book_author\" , \"type\" : \"string\" }, { \"name\" : \"book_editor\" , \"type\" : \"string\" }, { \"name\" : \"book_publisher\" , \"type\" : \"string\" }, { \"name\" : \"book_format\" , \"type\" : \"string\" }, { \"name\" : \"book_publication_date\" , \"type\" : \"date\" }, { \"name\" : \"book_price\" , \"type\" : \"double\" } ] } All these features of entity services are applied during the generation process. As an input, the template uses a database table and an entity service name that are entered in the Entity Data Modeler . Just select the *.entity artifact in the Workspace view. Choose Generate \u2192 User Interface for Entity Service . Limitations for a table to be entity-service compliant: There should be only one primary key column, which will be used as the entity's identity . Only the database column types supported by default for generation can be used (simple types only; clob and blob are not supported). Generic query methods are not generated because: They would cover only very simple cases with reasonable performance. For complex queries, introducing an additional layer results in worse performance in comparison to an SQL script. 
Entity services are generated in JavaScript, hence they can be accessed right after generation and publishing at: <protocol>://<host>:<port>/services/v4/js/<project>/<service> Here's an example: https://example.com/services/v4/js/bookstore/books.js Or just select them in the Workspace view and check the result in the Preview view.","title":"Entity Service"},{"location":"development/concepts/extensions/","text":"Extension Definitions Extensibility is an important requirement for business applications built to follow custom processes in Line of Business (LoB) areas. In the cloud toolkit, a generic description of the extension points and extensions is provided without explicitly defining the contract. This is a simple but powerful way to define extensions. Extension Points An extension point is the place in the core module that is expected to be enhanced by custom modules. It is a simple JSON formatted *.extensionpoint file and can be placed anywhere in your project. { \"extension-point\" : \"/project1/extensionPoint1\" , \"description\" : \"Description for Extension Point 1\" } Extensions An extension is the plug-in in a custom module that extends the core functionality. It is a simple JSON formatted *.extension file and can be placed anywhere in your project. { \"extension\" : \"/project1/extension1\" , \"extension-point\" : \"/project1/extensionPoint1\" , \"description\" : \"Description for Extension 1\" } Note The 'extension' parameter above should point to a valid JavaScript module. For a full example you can look at sample-ide-perspective . Calling Extensions Within the core module, you can iterate over the defined extensions and call their functions: let extensions = extensionManager . getExtensions ( \"/project1/extensionPoint1\" ); for ( let i = 0 ; i < extensions . length ; i ++ ) { let extension = require ( extensions [ i ]); response . println ( extension . 
enhanceProcess ()); } In the code above, the extension is a JavaScript module ( extension1.js ) within the same project, and it exposes an enhanceProcess() function.","title":"Extension Definitions"},{"location":"development/concepts/extensions/#extension-definitions","text":"Extensibility is an important requirement for business applications built to follow custom processes in Line of Business (LoB) areas. In the cloud toolkit, a generic description of the extension points and extensions is provided without explicitly defining the contract. This is a simple but powerful way to define extensions.","title":"Extension Definitions"},{"location":"development/concepts/extensions/#extension-points","text":"An extension point is the place in the core module that is expected to be enhanced by custom modules. It is a simple JSON formatted *.extensionpoint file and can be placed anywhere in your project. { \"extension-point\" : \"/project1/extensionPoint1\" , \"description\" : \"Description for Extension Point 1\" }","title":"Extension Points"},{"location":"development/concepts/extensions/#extensions","text":"An extension is the plug-in in a custom module that extends the core functionality. It is a simple JSON formatted *.extension file and can be placed anywhere in your project. { \"extension\" : \"/project1/extension1\" , \"extension-point\" : \"/project1/extensionPoint1\" , \"description\" : \"Description for Extension 1\" } Note The 'extension' parameter above should point to a valid JavaScript module. For a full example you can look at sample-ide-perspective .","title":"Extensions"},{"location":"development/concepts/extensions/#calling-extensions","text":"Within the core module, you can iterate over the defined extensions and call their functions: let extensions = extensionManager . getExtensions ( \"/project1/extensionPoint1\" ); for ( let i = 0 ; i < extensions . length ; i ++ ) { let extension = require ( extensions [ i ]); response . 
println ( extension . enhanceProcess ()); } In the code above, the extension is a JavaScript module ( extension1.js ) within the same project, and it exposes an enhanceProcess() function.","title":"Calling Extensions"},{"location":"development/concepts/generation/","text":"Generation Template-based generation of artifacts helps developer productivity in the initial phase of building the application. There are several application components that have similar behavior and often very similar implementation. A prominent example is the entity service . It has several predefined methods based on REST concepts and HTTP - GET , POST , PUT , DELETE on an entity level as well as a list of all entities. Additionally, the most notable storage for the entity data is the RDBMS provided by the platform. Another example is user interface templates based on patterns - list , master-detail , input form , etc. Templates can also be provided based on different frameworks for client-side interaction. Note The generation here is a one-time process. Once you have the generated artifact, you can modify it based on your own requirements. In contrast to the approach above, in the case of MDA , you can expect to regenerate the PSMs every time you make changes to PIMs . For this approach, we introduced the entity data modeler where you can define declaratively all the needed components and their attributes. Afterwards, you can use them to generate a complete full-stack data-driven application. Note The enhancements in this case must go via extensions only.","title":"Generation"},{"location":"development/concepts/generation/#generation","text":"Template-based generation of artifacts helps developer productivity in the initial phase of building the application. There are several application components that have similar behavior and often very similar implementation. A prominent example is the entity service . 
It has several predefined methods based on REST concepts and HTTP - GET , POST , PUT , DELETE on an entity level as well as a list of all entities. Additionally, the most notable storage for the entity data is the RDBMS provided by the platform. Another example is user interface templates based on patterns - list , master-detail , input form , etc. Templates can also be provided based on different frameworks for client-side interaction. Note The generation here is a one-time process. Once you have the generated artifact, you can modify it based on your own requirements. In contrast to the approach above, in the case of MDA , you can expect to regenerate the PSMs every time you make changes to PIMs . For this approach, we introduced the entity data modeler where you can define declaratively all the needed components and their attributes. Afterwards, you can use them to generate a complete full-stack data-driven application. Note The enhancements in this case must go via extensions only.","title":"Generation"},{"location":"development/concepts/mobile-apps/","text":"Mobile Applications Overview Mobile application support in Eclipse Dirigible is achieved via Tabris.js . It is a mobile framework that allows you to develop native iOS and Android mobile applications, written entirely in JavaScript. This framework provides native performance, native look and feel, and a single code-base (JavaScript). You can use existing JavaScript libraries and native extensions to extend the core functionality. Unlike other frameworks, which use webviews or cross-platform intermediate runtimes, Tabris.js executes the JavaScript directly on the device and renders everything using native widgets. 
Thanks to the framework capabilities, developers can focus more on mobile application development and less on the platform specifics (iOS and Android).","title":"Mobile Applications"},{"location":"development/concepts/mobile-apps/#mobile-applications","text":"","title":"Mobile Applications"},{"location":"development/concepts/mobile-apps/#overview","text":"Mobile application support in Eclipse Dirigible is achieved via Tabris.js . It is a mobile framework that allows you to develop native iOS and Android mobile applications, written entirely in JavaScript. This framework provides native performance, native look and feel, and a single code-base (JavaScript). You can use existing JavaScript libraries and native extensions to extend the core functionality. Unlike other frameworks, which use webviews or cross-platform intermediate runtimes, Tabris.js executes the JavaScript directly on the device and renders everything using native widgets. Thanks to the framework capabilities, developers can focus more on mobile application development and less on the platform specifics (iOS and Android).","title":"Overview"},{"location":"development/concepts/publishing/","text":"Publishing There is a conceptual separation between the design-time and runtime phases of the development life cycle. During the design-time phase, the source artifacts are created and managed within the isolated developer's area - workspace . When you are ready with a given feature, you have to publish the project so that the application artifacts become available to the other users. The meaning of \"available\" depends on the type of artifact. For example, for JavaScript services this is the registration of a public endpoint, while for web and wiki content, it is just the access to the artifacts themselves, etc. The Publishing action is accessible from the context menu in the Workspace view. 
The space within the repository, where all the public artifacts are placed, is called the \"registry\".","title":"Publishing"},{"location":"development/concepts/publishing/#publishing","text":"There is a conceptual separation between the design-time and runtime phases of the development life cycle. During the design-time phase, the source artifacts are created and managed within the isolated developer's area - workspace . When you are ready with a given feature, you have to publish the project so that the application artifacts become available to the other users. The meaning of \"available\" depends on the type of artifact. For example, for JavaScript services this is the registration of a public endpoint, while for web and wiki content, it is just the access to the artifacts themselves, etc. The Publishing action is accessible from the context menu in the Workspace view. The space within the repository, where all the public artifacts are placed, is called the \"registry\".","title":"Publishing"},{"location":"development/concepts/registry/","text":"Registry The registry is the entry point for searching and browsing for service endpoints, as well as for monitoring and administration at runtime. Technically, it is a space within the repository where all the published artifacts are placed.","title":"Registry"},{"location":"development/concepts/registry/#registry","text":"The registry is the entry point for searching and browsing for service endpoints, as well as for monitoring and administration at runtime. Technically, it is a space within the repository where all the published artifacts are placed.","title":"Registry"},{"location":"development/concepts/repository/","text":"Repository The repository component is the main place where all the project's artifacts are stored. It provides an abstract \"file-system-like\" structure with folders and files that can be backed by different underlying persistence storages - file system, relational database, noSQL database, etc. 
In a single repository instance there are several spaces holding different types of content - users' workspaces, the public registry , search indices, git metadata, versions, etc.","title":"Repository"},{"location":"development/concepts/repository/#repository","text":"The repository component is the main place where all the project's artifacts are stored. It provides an abstract \"file-system-like\" structure with folders and files that can be backed by different underlying persistence storages - file system, relational database, noSQL database, etc. In a single repository instance there are several spaces holding different types of content - users' workspaces, the public registry , search indices, git metadata, versions, etc.","title":"Repository"},{"location":"development/concepts/rest/","text":"REST The http-rs module is designed to define and run a broad range of HTTP REST services. A very simple example hello-api.js : var rs = require ( \"http/v4/rs\" ); // serve GET HTTP requests sent to resource path \"\" (i.e. directly to hello-api.js) rs . service () . resource ( \"\" ) . get ( function ( ctx , request , response ){ response . println ( \"Hello there!\" ); }) . execute (); Hosting the hello-api.js code above as test/hello-api.js and sending a GET request to /services/v4/js/test/hello-api.js will return the response body: Hello there! Overview Let\u2019s have a closer look at the methods shown in the example above. First, we requested a new REST service instance from the framework: rs.service() Next, we configured the instance to serve HTTP GET requests sent to the root path (\"\") using the supplied function: . resource ( \"\" ) . get ( function ( ctx , request , response ){ response . println ( \"Hello there!\" ); }) Technically, configuration is not required to execute a service, but obviously the service will do nothing if you don't instruct it what to do. 
Finally, we run the service and it processes the HTTP request: .execute(); Now, this is a fairly simplistic example aiming to give you a hint of how you can bring a REST API to life with http-rs. There is a whole lot more that we shall explore in the next sections. Creating REST services rs.service() Creating new service instances is as simple as invoking rs.service() . That returns a configurable and/or executable instance of the HttpController class. The controller API allows you to: - start configuring a REST service (method resource() ) - serve requests (method execute() ) - perform a couple of more advanced activities, which will be reviewed in the Advanced section below Additionally, the controller API also features shortcut factory methods that are useful for simplistic configurations (like the one in our initial example) such as get(sPath, fServe, arrConsume, arrProduces) . Read below for more examples of how to use the methods. Serving requests execute() The mechanism for serving requests is implemented in the execute() method of the HttpController. It tries to match the request to the service API configuration. If the mechanism matches the request successfully, it triggers the execution flow of the callback functions. The execution flow processes the request and response. If the mechanism doesn't match the request successfully, it sends a Bad Request error to the client. The request and response objects are implicitly those that were used to request the script where the execute() method invocation occurred. But they can be exchanged for others as shown in the Advanced section. The execute() method is defined in the service instance (class HttpController) obtained with rs.service() . The execute() method can be triggered with rs.service().execute() . The rs API configuration also provides numerous references to the method so you can invoke it at any stage. For example, rs . service (). get ( \"\" ). execute () rs . service (). resource ( \"\" ). get (). 
execute () rs . service (). resource ( \"\" ). get (). produces ([ \"text/json\" ]). execute () are all valid ways to serve requests. What you need to consider is that execute() must be the final method invocation. Even if you retain a reference to a configuration object and change it after that, it will be irrelevant since the response will be flushed and closed by then. Configuring services There are three options as far as configuration is concerned. You can start from scratch and build the configuration using the rs API. You can use configuration objects. They hold the configuration that the rs API produces. You can start with a configuration object and then enhance or override the configuration using the rs API. Configuration objects A configuration object is a JS object with a canonical structure that http-rs can interpret. We will discuss its schema later on in this guide. For now, let's just consider that it's the same thing that the rs-fluent API will actually produce behind the scenes, so it's a completely valid alternative and complement to the rs-fluent API configuration approach. Refer to the Advanced section for more details on using configuration objects. Defining service resources resource(sPath, oConfiguration?) Resources are the top-level configuration objects that represent an HTTP (server) resource , for which we will be defining a protocol. Each resource is identified by a URL on the server. You can have multiple resources per service configuration, provided that their URLs do not overlap. Resource vs Path vs Resource Path In REST terms, a resource is an abstraction of a server-side resource that can be a file, dynamically generated content, or a procedure (although the last is considered heresy by purists). It's virtually anything hosted on a server that has an address and can be accessed with a standard HTTP method. 
It is often referred to as \"path\" or \"resource path\" due to its singular most notable identifying characteristic. But to be precise, \"path\" is only a property of the resource. As far as configuration is concerned, the resource defines the configuration scope for which we define method handlers and constraints, and is identifiable by its \"path\" property. Resource paths and path templates The sPath string parameter (mandatory) of the resource() method will serve as the resource URL. It is relative to the location where the JavaScript service is running (e.g. /services/v4/my-application/api/my-service.js ). An empty path ( \"\" ) sends the request directly to the JavaScript service root path. The path can also be a URL template, i.e. parameterized. For example, consider the path template: {id}/assets/{assetType}/{name} This will resolve request paths such as: /services/js/test.js/1/assets/longterm/building to the service path: 1/assets/longterm/building If a request is matched to such a path, the service mechanism will provide the resolved parameters as an object map to the function that handles the request. Using the sample path above, the path parameters object will look like this: { \"id\" : 1 , \"assetType\" : \"longterm\" , \"name\" : \"building\" } Defining HTTP methods allowed for a resource resource . get () resource . post () resource . put () resource [ \"delete\" ]() and resource . remove () resource . method () By default, only the HTTP request methods that you have configured for a resource are allowed. The fluent API of Resource instances, obtained with the resource(sPath) method that we discussed above, exposes the most popular REST API methods ( get , post , put and delete ). They are simply aliases for the generic method() method. 
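The path-template resolution described above can be sketched as follows. This is an illustrative re-implementation (resolvePathTemplate is not part of http-rs), assuming simple segment-by-segment matching:

```javascript
// Minimal sketch of resolving a request path against a template such as
// "{id}/assets/{assetType}/{name}" - not the actual http-rs matching code.
function resolvePathTemplate(template, path) {
    var templateSegments = template.split("/");
    var pathSegments = path.split("/");
    if (templateSegments.length !== pathSegments.length) {
        return null; // different number of segments - no match
    }
    var params = {};
    for (var i = 0; i < templateSegments.length; i++) {
        var segment = templateSegments[i];
        if (segment.startsWith("{") && segment.endsWith("}")) {
            // a {parameter} segment captures the path segment's value
            params[segment.slice(1, -1)] = pathSegments[i];
        } else if (segment !== pathSegments[i]) {
            return null; // literal segment mismatch
        }
    }
    return params;
}
```

Resolving "1/assets/longterm/building" against the template yields an object map with the id, assetType, and name parameters, as shown above.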
Whichever we consider, we will receive a ResourceMethod instance from the invocation, and its API will allow us to specify processing functions and further constraints on the request/response for which they are applicable: rs.resource('').get().produces([\"application/json\"]).serve(function(){}) Alternatively, as we have already seen, we can supply the serve callback function directly as the first argument to the method, which comes in handy if we have nothing more to set up: rs.resource('').get(function(){}) We can also use a configuration object as a third option, and this will be discussed in the Advanced section. The samples here are all for configuring the HTTP GET method, but the usage pattern is the same for all: rs.resource('').post().consumes([\"application/json\"]).serve(function(){}) Shortcuts You may have noticed that instead of explicitly using serve to configure the callback for serving requests, we can directly provide the function as an argument to the method configuring the HTTP method (e.g. get ). rs.resource('').get(function(){}) rs.resource('').get().serve(function(){}) So why bother provisioning an explicit serve() function in the first place then? The answer is that serve() configures only one of the callback functions that are triggered during the request processing flow. And this shortcut is handy if it is only serve() that you are interested in configuring. Of course, nothing prevents you from using the shortcut and still configuring the other callback functions, unless you find it confusing. These are all valid options. Find out more about configuring request processing callback functions in the section dedicated to this. When the controller API was discussed, it was mentioned that there are shortcut factory methods that combine a couple of operations to directly produce a method handler for a resource path. Example rs . service () . get ( \"\" , function ( ctx , request , response ) { response . print ( 'ok' ); }) . 
execute (); That would be equivalent to the following: rs . service () . resource ( \"\" ) . get ( function ( ctx , request , response ){ response . print ( 'ok' ); }) . execute (); These shortcut methods share the same names with those in Resource that are used for defining HTTP method handlers: get , post , put , delete and its alias remove , but differ in signature (the first argument is a string for the resource path) and the return type (the HttpController instance, instead of ResourceMethod). They are useful as a compact mechanism if you intend to build something simple, such as a single resource and one or a few handler functions for it. You will not be able to go much further with this API, so if you consider anything even slightly more sophisticated you should look into the fluent API of resource instead: rs.service().resource(\"\") . Note Note that the scope of these shortcut methods is the controller, not the resource. That affects method chaining. For clean code, do not confuse them despite the similar names, and avoid mixing them. Defining content types that an API consumes and produces rs . resource ( \"\" ). get (). produces ([ \"application/json\" ]) rs . resource ( \"\" ). post (). consumes ([ \"application/json\" ]) rs . resource ( \"\" ). put (). consumes ([ \"application/json\" ]). produces ([ \"application/json\" ]) Optionally, but also quite likely, we will add some constraints on the data type consumed and produced by the resource method handler that we configure. At request processing runtime, these constraints will be matched for compatibility against the HTTP request headers before delegating to the handler processing function. You can use wildcards (*) in the MIME type arguments both for the type and sub-type, and they will match anything during execution: rs . resource ( \"\" ). post (). consumes ([ \"*/json\" ]) rs . resource ( \"\" ). post (). 
consumes ([ \"*/*\" ]) Request processing flow and events Before we continue, let us take a look at the request processing flow. The request is matched against the resource method handling definitions in the configuration and, if there is a compatible one, it is selected for execution. Otherwise, a Bad Request error is returned to the client. The before callback function is invoked if any was configured. The serve callback function is invoked if any was configured. If an Error was thrown from the serve function, a catch callback function is invoked - either the configured one or the default. A finally (always executed) function is invoked if one was configured. Or in pseudocode: try { before ( ctx , request , response , resourceMethod , controller ); serve ( ctx , request , response , resourceMethod , controller ); } catch ( err ){ catch ( ctx , err , request , response , resourceMethod , controller ); } finally { finally (); } As evident from the flow, it is only the serve event callback handler function that is required to be set up. But if you require fine-grained reaction to the other events, you can configure handlers for each of those you are interested in. Currently, the API supports a single handler function per event, so if a setup method is invoked multiple times on the same resource method, only the last invocation matters. Defining event handling functions resource . get (). before ( function ( ctx , request , response , resourceMethod , controller ){ //Implements pre-processing logic }) resource . get (). serve ( function ( ctx , request , response , resourceMethod , controller ){ //Implements request-processing logic }) resource . get (). catch ( function ( ctx , error , request , response , resourceMethod , controller ){ //Implements error-processing logic overriding the default }) resource . get (). 
finally ( function (){ //Implements post-processing logic regardless of error or success of the serve function }) A valid, executable resource method configuration requires at least the serve callback function to be set up: resource . get (). serve ( function ( ctx , request , response ){ response . println ( 'OK' ); }); The rest are optional and/or have default implementations. Errors thrown from the before and serve callbacks are delegated to the catch callback. There is a default catch callback that sends a formatted error back in the response, and it can be overridden using the catch method to set up different error processing logic. The finally callback is invoked after the response has been flushed and closed (regardless of error or success) and can be used to clean up resources. Example: rs . service (). resource ( \"\" ) . get () . before ( function ( ctx , request , response ){ request . setHeader ( 'X-arestme-version' , '1.0' ); }) . serve ( function ( ctx , request , response ){ response . println ( 'Serving GET request' ); }) . catch ( function ( ctx , err , request , response ){ console . error ( err . message ); }) . finally ( function (){ console . info ( 'GET request processing finished' ); }) Advanced Using configuration objects Configuration objects are particularly useful when you are enhancing or overriding an existing protocol, so you don't start configuring from scratch but rather amend or change pieces of the configuration. They are also useful when you are dealing with dynamically generated HTTP-based protocol configurations. For example, consider the simple sample that we started with. It is completely identical to this one, which uses a configuration object and provides it to the service function: rs . service ({ \"\" : { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] } }). execute (); It is also completely identical to this one: rs . service () . 
resource ( \"\" , { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] }). execute (); or this one: rs . service () . resource ( \"\" ) . get ([{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }]). execute (); In fact, here is a sample of how to define a whole API by providing the configuration directly to the service method and then enhancing it. rs . service ({ \"\" : { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] } }) . resource ( \"\" ) . post () . serve ( function ( ctx , request , response ){ console . info ( request . readText ()); }) . execute (); In this way we are essentially exploiting the fluent API to configure a service without starting from scratch. Many of the API methods accept a configuration object as a second argument, and this doesn't prevent you from continuing the API design with the fluent API to enhance or override it. The sendError method in HttpController The HttpController class instances that we receive when rs.service() is invoked feature a sendError method. It implements the logic for formatting errors and returning them to the client, taking into account the client's type and content type preferences. Should you need to change this behavior globally, you can redefine the function. If you require different behavior for particular resources or resource method handlers, then using the catch callback is the better approach. Sometimes it's useful to reuse the method and send errors from your handler functions. The standard request processing mechanism in HttpController does not account for logical errors. It doesn't know, for example, that a parameter from client input is out of its valid range. For such cases you would normally implement validation either in the before event handler or in serve. And if you need tighter control on what is sent back, e.g. 
the HTTP code, you wouldn't simply throw an Error but invoke the sendError function with the right parameters yourself. For these purposes, the last argument of each event handler function is conveniently the controller instance. rs . service (). resource ( \"\" ) . get () . before ( function ( ctx , request , response , methodHandler , controller ){ //check if requested file exists if ( ! file . exists ()){ controller . sendError (); } }) . serve ( function (){ //return file content }) Defining readonly APIs mappings.readonly() An obvious way of defining readonly APIs is to use only GET resource method definitions. In some cases, though, APIs can be created from external configuration that also contains other resource method handlers, or we can receive an API instance from another module factory, or we want to support two instances of the same API, one readonly and one with edit capabilities, with minimal code. In such cases, we already have non-GET resource methods that we have to get rid of somehow. Here the readonly method steps in and does exactly this - removes all but the GET resource handlers, if any. Example: rs . service () . resource ( \"\" ) . post () . serve ( function (){}) . get () . serve ( function (){}) . readonly () . execute (); If you inspect the configuration after .readonly() is invoked (use resource(\"\").configuration() ) you will notice that the post verb definition is gone. Consequently, POST requests to this resource will end up in Bad Request (400). Note that for this to work, readonly must be the last configuration action for a resource. Otherwise, you are resetting the resource configuration to readonly, only to define write methods again. The readonly method is available both for ResourceMapping and Resource objects, obtained either from invocations of the service mappings() method or from retained references from configuration API invocations. 
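The pruning that readonly() performs on the underlying configuration can be illustrated with a small plain-JavaScript sketch over a configuration object of the shape used throughout this guide. This is only an illustration of the behavior; the actual http/rs implementation works on its internal mappings objects.

```javascript
// Illustrative sketch: conceptually, readonly() keeps only the "get" handler
// definitions of each resource and drops every other verb.
// (Not the actual http/rs implementation.)
function makeReadonly(config) {
  for (var path in config) {
    for (var verb in config[path]) {
      if (verb !== "get") {
        delete config[path][verb];
      }
    }
  }
  return config;
}

var config = {
  "": {
    "get": [{ serve: function () {} }],
    "post": [{ serve: function () {} }]
  }
};

makeReadonly(config);
console.log(Object.keys(config[""])); // prints [ 'get' ]
```

Requests for the removed verbs then have no matching handler definition, which is why they end up in Bad Request (400).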
Disabling a ResourceMethod Handler api.disable(sPath, sVerb, arrConsumesTypes, arrProducesTypes) Similar to the use cases explored for the readonly method above, you might not be in full control of the definition of the API, but rather take over at some point. Similar to the readonly method, this one will remove the handler definition identified by the four parameters - resource path, resource verb, consumes constraint array (not necessarily in the same order), produces constraint array (not necessarily in the same order), but it will do it for any verb, not only GET . In that sense, readonly is a specialization of this one for GET verbs only. Example: var mappings = rs . service ({ \"\" : { \"post\" : [{ serve : function (){} }], \"get\" : [{ serve : function (){} }] } }). mappings (); mappings . disable ( \"\" , \"post\" ); With this API definition, invoking mappings.find(\"\",\"get\") will return a reference to the only get handler defined there and you can manage it. Note that you get a reference to the configuration and not an API. Example: // add produces constraint and redefine the serve callback var mappings = rs . service (). get ( function (){}). mappings (); //later in code var handler = mappings . find ( \"\" , \"get\" ); handler . produces = [ \"application/json\" ] handler . serve = function (){ console . info ( \"I was redefined\" ); } Executing service with explicit request/response arguments The request and response parameters of the execute method are optional. If you don't supply them, the service will take the request/response objects that were used to request the script. Most of the time this is what you want. However, supplying your own request and response arguments can be very handy for testing, as you can easily mock and inspect them. Fluency for execute method The execute method is defined by the service instance (HttpController) obtained with rs.service() and can be executed with: rs.service().execute() . 
The fluent configuration API also provides references to the method, so you can actually invoke it at any stage. Examples: rs . service (). resource ( \"\" ). get ( function (){}). execute () rs . service (). resource ( \"\" ). get (). serve ( function (){}). execute () rs . service (). resource ( \"\" ). get (). produces ([ \"text/json\" ]). serve ( function (){}). execute () rs . service () . resource ( \"\" ) . produces ([ \"application/json\" ]) . get ( function (){}) . resource ( \"\" ) . consumes ([ \"*/json\" ]) . post ( function (){}) . execute () Mappings vs Configurations The API supplies two methods, mappings() and configuration() , that provide configuration in two forms. The mappings method supplies typed API objects such as Resource aggregating ResourceMethod instances. To get a reference to a service mappings, invoke mappings on the service instance: rs.service().mappings() With a reference to mappings you have their fluent API at your disposal. This is useful when extending and enhancing the core rs functionality to build dedicated services. For example, the HttpController constructor function is designed to accept mappings, and if you extend or initialize it internally in another API, you will likely need this form of configuration. An invocation of the configuration method, on the other hand, provides the underlying JS configuration object. It can be used to supply generic configurations that are used to initialize new types of services, as the public fluent API is designed to accept this form of configuration. Both represent configuration, but while the mappings are a sort of internal, parsed version, the configuration object is the version that the public API accepts, and it is therefore a kind of advanced public form of the internal configuration. It is also possible to convert between the two: rs . service ( jsConfig ). mappings () rs . service (). resource (). 
configuration () Finding a ResourceMethod rs.service().mappings().find(sPath, sMethod, arrConsumesTypes, arrProducesTypes) Suppose you want to redefine a handler definition to, e.g., change the serve callback, add a before handler, change or add to the consumes media types constraint, etc. To do that you need a reference to the handler, which is identified by the four parameters - resource path , resource method , consumes constraint array (not necessarily in the same order), produces constraint array (not necessarily in the same order). On a successful search hit, you get a reference to the handler definition and can perform changes on it. Example: rs . service () . resource ( \"\" ) . get ( function (){}); With this API definition, invoking rs.service().mappings().find(\"\",\"get\") will return a reference to the only get handler defined there and you can manage it. Note that you get a reference to the configuration and not an API. Example: // add produces constraint and redefine the serve callback var handler = svc . mappings (). find ( \"\" , \"get\" ); handler . produces = [ \"application/json\" ] handler . serve = function (){ console . info ( \"I was redefined\" ); } With consumes and produces constraints on a resource method handler, getting a reference will require them to be specified too. Example: var svc = rs . service (); svc . mappings () . resource ( \"\" ) . post () . consumes ([ 'application/json' , 'text/json' ]) . produces ([ 'application/json' ]) . serve ( function (){}); var handler = svc . mappings (). find ( \"\" , \"post\" , [ 'text/json' , 'application/json' ], [ 'application/json' ]); Note that the order of the MIME type string values in the consumes/produces array parameters is not significant. They will be sorted before matching the sorted corresponding arrays in the definition. Configuring resource with JS object Having defined a resource with a path, we have two options for configuring it. 
We can proceed using its fluent API, or we can provision a configuration JS object as a second argument to the resource method and have it done in one step. Considering the latter, we will be provisioning configuration for this resource only, so it should be an object with method definitions as root members. rs . service () . resource ( \"\" , { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] }). execute (); Refer to the next sections for a comparison of how to achieve the same using the fluent API and/or configuration objects at the lower levels. JS Configuration object schema In progress. Check back later. Schema: { pathString : { methodString : [{ \"consumes\" : [ \"type/subtype|*/subtype|type/*|*/*\" ] , \"produces\" : [ \"type/subtype|*/subtype|type/*|*/*\" ] , \"before\" : Function , \"serve\" : Function , \"catch\" : Function , \"finally\" : Function }] } } pathString is a string that represents the resource path. There could be 0 or more such non-overlapping members. methodString is a string for the HTTP resource method. There could be 0 or more such non-overlapping members. The value of methodString is an array of 0 or more objects, each defining request method processing that will be executed under unique conditions (constraints) that match the request. A component in the methodString array can consist of constraints (consumes, produces) and request processing flow event handlers (before, serve, catch, finally). consumes value is an array of 0 or more strings, each a valid MIME type string formatted as type/subtype. Can be undefined. produces value is an array of 0 or more strings, each a valid MIME type string formatted as type/subtype. Can be undefined. before , serve , catch and finally values are functions. Except for the serve function, the rest can be undefined . Building a CRUD rest service The code snippet below shows a sample design for a REST API for simple CRUD file operations. 
It is for illustrative purposes. The service is designed to work with files in the HOME directory of the user that currently runs the Dirigible instance. Users can create, read, update and delete files by sending corresponding POST, GET, PUT and DELETE requests using the file name as a path segment (e.g. /services/js/file-service.js/test.json ) and they can also upload files if they don't specify a file name but send a multipart/form-data POST request directly to the service (e.g. /services/js/file-service.js ). Note how the before handler is used to validate that the user has permissions on the resources and how it makes use of the controller's sendError method. var LOGGER = require ( \"log/v4/logging\" ). getLogger ( 'http.filesvc' ); var rs = require ( \"http/v4/rs\" ); var upload = require ( 'http/v4/upload' ); var files = require ( 'io/v4/files' ); var user = require ( 'security/v4/user' ); var env = require ( 'core/v4/env' ); var validateRequest = function ( permissions , ctx , request , response , methodHandler , controller ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; if ( ! files . exists ( filePath )){ LOGGER . info ( \"Requested file \" + filePath + \" does not exist.\" ); controller . sendError ( response . NOT_FOUND , undefined , response . HttpCodesReasons . getReason ( String ( response . NOT_FOUND )), ctx . pathParameters . fileName + \" does not exist.\" ); return ; } if ( permissions ){ var resourcePermissions = files . getPermissions ( filePath ); if ( resourcePermissions !== null && resourcePermissions . indexOf ( permissions ) < 0 ){ var loggedUser = user . getName (); LOGGER . error ( \"User {} does not have sufficient permissions[{}] for {}\" , loggedUser , files . getPermissions ( filePath ), filePath ); controller . sendError ( response . UNAUTHORIZED , undefined , response . HttpCodesReasons . getReason ( String ( response . UNAUTHORIZED )), \"User \" + loggedUser + \" does not have sufficient permissions for \" + ctx . 
pathParameters . fileName ); return ; } } LOGGER . debug ( 'validation successful' ); }; var postProcess = function ( operationName ){ LOGGER . info ( \"{} operation finished\" , operationName ); }; rs . service () . resource ( \"\" ) . post ( function ( ctx , request , response ){ var fileItems = upload . parseRequest (); for ( var i = 0 ; i < fileItems . size (); i ++ ) { var filePath = env . get ( 'HOME' ) + '/' ; var content ; var fileItem = fileItems . get ( i ); if ( ! fileItem . isFormField ()) { filePath += fileItem . getName (); content = String . fromCharCode . apply ( null , fileItem . getBytes ()); } else { filePath += fileItem . getFieldName (); content = fileItem . getText (); } LOGGER . debug ( \"Creating file \" + filePath ); files . writeText ( filePath , content ); } response . setStatus ( response . CREATED ); }) . before ( function ( ctx , request , response , methodHandler , controller ){ var loggedUser = user . getName (); if ( files . getOwner ( ctx . pathParameters . fileName ) !== loggedUser ) controller . sendError ( response . UNAUTHORIZED , undefined , response . HttpCodesReasons . getReason ( String ( response . UNAUTHORIZED )), loggedUser + \" is not owner of \" + ctx . pathParameters . fileName ); }) . finally ( postProcess . bind ( this , \"Upload\" )) . consumes ([ \"multipart/form-data\" ]) . resource ( \"{fileName}\" ) . post ( function ( ctx , request , response ){ var content = request . getText (); var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Creating file \" + filePath ); files . writeText ( filePath , content ); files . setPermissions ( filePath , 'rw' ); response . setStatus ( response . CREATED ); }) . finally ( postProcess . bind ( this , \"Create\" )) . consumes ([ \"application/json\" ]) . get ( function ( ctx , request , response ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . 
debug ( \"Reading file \" + filePath ); var content = files . readText ( filePath ); response . setStatus ( response . OK ); response . print ( content ); }) . before ( validateRequest . bind ( this , 'r' )) . finally ( postProcess . bind ( this , \"Read\" )) . produces ([ \"application/json\" ]) . put ( function ( ctx , request , response ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Updating file \" + filePath ); var content = request . getText (); files . deleteFile ( filePath ); files . writeText ( filePath , content ); response . setStatus ( response . ACCEPTED ); }) . finally ( postProcess . bind ( this , \"Update\" )) . before ( validateRequest . bind ( this , 'rw' )) . consumes ([ \"application/json\" ]) . remove ( function ( ctx , request , response ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Removing file \" + filePath ); files . deleteFile ( filePath ); response . setStatus ( response . NO_CONTENT ); }) . before ( validateRequest . bind ( this , 'w' )) . finally ( postProcess . bind ( this , \"Delete\" )) . execute (); You can find the complete documentation for http/rs and http/rs-data under the API page .","title":"REST"},{"location":"development/concepts/rest/#rest","text":"The http-rs module is designed to define and run a broad range of HTTP REST services. A very simple example hello-api.js : var rs = require ( \"http/v4/rs\" ); // serve GET HTTP requests sent to resource path \"\" (i.e. directly to hello-api.js) rs . service () . resource ( \"\" ) . get ( function ( ctx , request , response ){ response . println ( \"Hello there!\" ); }) . 
execute (); Hosting the hello-api.js code above at test/hello-api.js and sending a GET request to /services/v4/js/test/hello-api.js will return the response body: Hello there!","title":"REST"},{"location":"development/concepts/rest/#overview","text":"Let\u2019s have a closer look at the methods shown in the example above. First, we requested a new REST service instance from the framework: rs.service() Next, we configured the instance to serve HTTP GET requests sent to root path (\"\") using the supplied function: . resource ( \"\" ) . get ( function ( ctx , request , response ){ response . println ( \"Hello there!\" ); }) Technically, configuration is not required to execute a service, but obviously it will do nothing if you don't instruct it what to do. Finally, we run the service and it processes the HTTP request: .execute(); Now, this is a fairly simplistic example aiming to give you a hint of how you can bring a REST API to life with http-rs. There is a whole lot more that we shall explore in the next sections.","title":"Overview"},{"location":"development/concepts/rest/#creating-rest-services","text":"rs.service() Creating new service instances is as simple as invoking rs.service() . That returns a configurable and/or executable instance of the HttpController class. The controller API allows you to: - start configuring a REST service (method resource() ) - serve requests (method execute() ) - perform a couple of more advanced activities, which will be reviewed in the Advanced section below Additionally, the controller API also features shortcut factory methods that are useful for simplistic configurations (like the one in our initial example) such as get(sPath, fServe, arrConsume, arrProduces) . 
Read below for more examples of how to use the methods.","title":"Creating REST services"},{"location":"development/concepts/rest/#serving-requests","text":"execute() The mechanism for serving requests is implemented in the execute() method of the HttpController. It tries to match the request to the service API configuration. If the mechanism matches the request successfully, it triggers the execution flow of the callback functions. The execution flow processes the request and response. If the mechanism doesn't match the request successfully, it sends a Bad Request error to the client. The request and response objects are implicitly those that were used to request the script where the execute() method invocation occurred. But they can be exchanged for others as shown in the Advanced section. The execute() method is defined in the service instance (class HttpController) obtained with rs.service() . The execute() method can be triggered with rs.service().execute() . The rs API configuration also provides numerous references to the method so you can invoke it at any stage. For example, rs . service (). get ( \"\" ). execute () rs . service (). resource ( \"\" ). get (). execute () rs . service (). resource ( \"\" ). get (). produces ([ \"text/json\" ]). execute () are all valid ways to serve requests. What you need to consider is that execute() must be the final method invocation. Even if you retain a reference to a configuration object and change it after that, it will be irrelevant since the response will be flushed and closed by then.","title":"Serving requests"},{"location":"development/concepts/rest/#configuring-services","text":"There are three options as far as configuration is concerned. You can start from scratch and build the configuration using the rs API. You can use configuration objects. They hold the configuration that the rs API produces. You can start with a configuration object and then enhance or override the configuration using the rs API. 
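As an illustration of the configuration-object option, here is the canonical shape such an object takes (per the schema described later in this guide), together with a minimal plain-JavaScript check that every handler definition supplies the mandatory serve callback. The check is only a sketch, not the framework's own validation.

```javascript
// Sketch: the canonical shape of an http/rs configuration object, plus a
// minimal validity check. Only "serve" is mandatory per handler definition;
// "consumes"/"produces"/"before"/"catch"/"finally" are optional.
var config = {
  "": {                                 // resource path
    "get": [{                           // HTTP method -> array of handler defs
      produces: ["application/json"],   // optional constraint
      serve: function (ctx, request, response) {
        response.println("Hello there!");
      }
    }]
  }
};

function hasServeCallbacks(cfg) {
  for (var path in cfg) {
    for (var verb in cfg[path]) {
      var defs = cfg[path][verb];
      for (var i = 0; i < defs.length; i++) {
        if (typeof defs[i].serve !== "function") return false;
      }
    }
  }
  return true;
}

console.log(hasServeCallbacks(config)); // prints true
```

An object of this shape can be passed to rs.service(config) directly, or per resource as the second argument of resource(), as shown in the sections below.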
Configuration objects A configuration object is a JS object with a canonical structure that http-rs can interpret. We will discuss its schema later on in this guide. For now, let's just consider that it's the same thing that the rs-fluent API will actually produce behind the scenes, so it's a completely valid alternative and complement to the rs-fluent API configuration approach. Refer to the Advanced section for more details on using configuration objects.","title":"Configuring services"},{"location":"development/concepts/rest/#defining-service-resources","text":"resource(sPath, oConfiguration?) Resources are the top-level configuration objects that represent an HTTP (server) resource , for which we will be defining a protocol. Each resource is identified by a URL on the server. You can have multiple resources per service configuration, provided that their URLs do not overlap. Resource vs Path vs Resource Path In REST terms, a resource is an abstraction of a server-side resource that can be a file, dynamically generated content, or a procedure (although the last is considered heresy by purists). It's virtually anything hosted on a server that has an address and can be accessed with a standard HTTP method. It is often referred to as \"path\" or \"resource path\" due to its singular most notable identifying characteristic. But to be precise, \"path\" is only a property of the resource. As far as configuration is concerned, the resource defines the configuration scope for which we define method handlers and constraints, and is identifiable by its \"path\" property.","title":"Defining service resources"},{"location":"development/concepts/rest/#resource-paths-and-path-templates","text":"The sPath string parameter (mandatory) of the resource() method will serve as the resource URL. It is relative to the location where the JavaScript service is running (e.g. /services/v4/my-application/api/my-service.js ). 
An empty path ( \"\" ) maps requests directly to the JavaScript service root path. The path can also be a URL template, i.e. parameterized. For example, consider the path template: {id}/assets/{assetType}/{name} This will resolve request paths such as: /services/js/test.js/1/assets/longterm/building to service path: 1/assets/longterm/building If a request is matched to such a path, the service mechanism will provide the resolved parameters as an object map to the function that handles the request. Using the sample path above, the path parameters object will look like this: { \"id\" : 1 , \"assetType\" : \"longterm\" , \"name\" : \"building\" }","title":"Resource paths and path templates"},{"location":"development/concepts/rest/#defining-http-methods-allowed-for-a-resource","text":"resource . get () resource . post () resource . put () resource [ \"delete\" ]() and resource . remove () resource . method () By default, only the HTTP request methods that you have configured for a resource are allowed. The fluent API of Resource instances, obtained with the resource(sPath) method that we discussed above, exposes the most popular REST API methods ( get , post , put and delete ). They are simply aliases for the generic method() method. Whichever we consider, we will receive a ResourceMethod instance from the invocation and its API will allow us to specify processing functions and further specify constraints on the request/response for which they are applicable: rs.resource('').get().produces([\"application/json\"]).serve(function(){}) Alternatively, as we have already seen, we can supply the serve callback function directly as the first argument to the method, which comes in handy if we have nothing more to set up: rs.resource('').get(function(){}) We can also use a configuration object as a third option; this will be discussed in the Advanced section. 
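The path-template resolution described above can be sketched in plain JavaScript. This is a simplified illustration; the actual http/rs matcher is internal to the framework, and unlike the sample object above (where id is numeric), this sketch keeps all parameter values as strings.

```javascript
// Sketch of resolving a path template like "{id}/assets/{assetType}/{name}"
// against a concrete service path. Illustrative only.
function resolvePathParams(template, path) {
  var tSegments = template.split("/");
  var pSegments = path.split("/");
  if (tSegments.length !== pSegments.length) return null; // no match
  var params = {};
  for (var i = 0; i < tSegments.length; i++) {
    var t = tSegments[i];
    if (t.charAt(0) === "{" && t.charAt(t.length - 1) === "}") {
      params[t.slice(1, -1)] = pSegments[i]; // template parameter
    } else if (t !== pSegments[i]) {
      return null; // literal segment mismatch
    }
  }
  return params;
}

var params = resolvePathParams("{id}/assets/{assetType}/{name}",
                               "1/assets/longterm/building");
console.log(params); // prints { id: '1', assetType: 'longterm', name: 'building' }
```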
The samples here are all for configuring the HTTP GET method, but the usage pattern is the same for all: rs.resource('').post().consumes([\"application/json\"]).serve(function(){})","title":"Defining HTTP methods allowed for a resource"},{"location":"development/concepts/rest/#shortcuts","text":"You may have already noticed that, instead of explicitly using serve to configure the callback for serving requests, we could directly provide the function as an argument to the method configuring the HTTP method (e.g. get ). rs.resource('').get(function(){}) rs.resource('').get().serve(function(){}) So why bother provisioning an explicit serve() function in the first place then? The answer is that serve() configures only one of the callback functions that are triggered during the request processing flow. And this shortcut is handy if it is only serve() that you are interested in configuring. Of course, nothing prevents you from using the shortcut and still configuring the other callback functions, unless you find it confusing. These are all valid options. Find out more about configuring request processing callback functions in the section dedicated to this. When the controller API was discussed, it was mentioned that there are shortcut factory methods that combine a couple of operations to directly produce a method handler for a resource path. Example rs . service () . get ( \"\" , function ( ctx , request , response ) { response . print ( 'ok' ); }) . execute (); That would be equivalent to the following: rs . service () . resource ( \"\" ) . get ( function ( ctx , request , response ){ response . print ( 'ok' ); }) . execute (); These shortcut methods share the same names as those in Resource that are used for defining HTTP method handlers: get , post , put , delete and its alias remove , but differ in signature (first argument is a string for the resource path) and the return type (the HttpController instance, instead of ResourceMethod). 
They are useful as a compact mechanism if you intend to build something simple, such as a single resource with one or a few handler functions. You will not be able to go much further with this API, so if you consider anything even slightly more sophisticated, you should look into the fluent API of resource instead: rs.service().resource(\"\") . Note Note that the scope of these shortcut methods is the controller, not the resource. That has an effect on the method chaining. For clean code, do not confuse them despite the similar names, and avoid mixing them.","title":"Shortcuts"},{"location":"development/concepts/rest/#defining-content-types-that-an-api-consumes-and-produces","text":"rs . resource ( \"\" ). get (). produces ([ \"application/json\" ]) rs . resource ( \"\" ). post (). consumes ([ \"application/json\" ]) rs . resource ( \"\" ). put (). consumes ([ \"application/json\" ]). produces ([ \"application/json\" ]) Optionally, but also quite likely, we will add some constraints on the data type consumed and produced by the resource method handler that we configure. At request processing runtime, these constraints will be matched for compatibility against the HTTP request headers before delegating to the handler processing function. You can use wildcards (*) in the MIME type arguments, both for the type and the subtype, and they will match anything during execution: rs . resource ( \"\" ). post (). consumes ([ \"*/json\" ]) rs . resource ( \"\" ). post (). consumes ([ \"*/*\" ])","title":"Defining content types that an API consumes and produces"},{"location":"development/concepts/rest/#request-processing-flow-and-events","text":"Before we continue, let us take a look at the request processing flow. The request is matched against the resource method handling definitions in the configuration and, if there is a compatible one, it is selected for execution. Otherwise, a Bad request error is returned to the client. 
The before callback function is invoked if any was configured. The serve callback function is invoked if any was configured. If an Error was thrown from the serve function, a catch callback function is invoked. The callback function is either the configured or the default one. A finally (always executed) function is invoked if one was configured. Or in pseudocode: try { before ( ctx , request , response , resourceMethod , controller ); serve ( ctx , request , response , resourceMethod , controller ); } catch ( err ){ catch ( ctx , err , request , response , resourceMethod , controller ); } finally { finally (); } As evident from the flow, it is only the serve event callback handler function that is required to be set up. But if you require fine-grained reaction to the other events, you can configure handlers for each of those you are interested in. Currently, the API supports a single handler function per event, so in multiple invocations of a setup method on the same resource method only the last one will matter.","title":"Request processing flow and events"},{"location":"development/concepts/rest/#defining-event-handling-functions","text":"resource . get (). before ( function ( ctx , request , response , resourceMethod , controller ){ //Implements pre-processing logic }) resource . get (). serve ( function ( ctx , request , response , resourceMethod , controller ){ //Implements request-processing logic }) resource . get (). catch ( function ( ctx , error , request , response , resourceMethod , controller ){ //Implements error-processing logic overriding the default }) resource . get (). finally ( function (){ //Implements post-processing logic regardless of error or success of the serve function }) A valid, executable resource method configuration requires at least the serve callback function to be set up: resource . get (). serve ( function ( ctx , request , response ){ response . println ( 'OK' ); }); The rest are optional and/or have default implementations. 
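The before/serve/catch/finally flow shown in the pseudocode can be exercised with a plain-JavaScript stand-in for the dispatcher. This is an illustration of the event ordering only, not the HttpController internals; for the sketch, the mock response carries a log array and the default catch writes into it.

```javascript
// Stand-in for the request-processing flow: before -> serve, errors delegated
// to catch (configured or default), finally always runs. Illustrative only.
function dispatch(handler, ctx, request, response) {
  var noop = function () {};
  var onError = handler.catch || function (c, err) {
    response.log.push("error: " + err.message); // default catch for this sketch
  };
  try {
    (handler.before || noop)(ctx, request, response);
    handler.serve(ctx, request, response);      // serve is the only mandatory callback
  } catch (err) {
    onError(ctx, err, request, response);
  } finally {
    (handler.finally || noop)();
  }
}

var response = { log: [] };
dispatch({
  before: function () { response.log.push("before"); },
  serve: function () { throw new Error("boom"); },
  finally: function () { response.log.push("finally"); }
}, {}, {}, response);

console.log(response.log); // prints [ 'before', 'error: boom', 'finally' ]
```

Note how finally still runs although serve threw, matching the "always executed" guarantee described above.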
Errors thrown from the before and serve callbacks are delegated to the catch callback. There is a default catch callback that sends a formatted error back in the response; it can be overridden using the catch method to set up different error-processing logic. The finally callback is invoked after the response has been flushed and closed (regardless of error or success) and can be used to clean up resources. Example: rs . service (). resource ( \"\" ) . get () . before ( function ( ctx , request , response ){ request . setHeader ( 'X-arestme-version' , '1.0' ); }) . serve ( function ( ctx , request , response ){ response . println ( 'Serving GET request' ); }) . catch ( function ( ctx , err , request , response ){ console . error ( err . message ); }) . finally ( function (){ console . info ( 'GET request processing finished' ); })","title":"Defining event handling functions"},{"location":"development/concepts/rest/#advanced","text":"","title":"Advanced"},{"location":"development/concepts/rest/#using-configuration-objects","text":"Configuration objects are particularly useful when you are enhancing or overriding an existing protocol, so you don't start configuring from scratch but rather amend or change pieces of the configuration. They are also useful when you are dealing with dynamically generated HTTP-based protocol configurations. For example, consider the simple sample that we started with. It is completely identical to this one, which uses a configuration object and provides it to the service function: rs . service ({ \"\" : { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] } }). execute (); It is also completely identical to this one: rs . service () . resource ( \"\" , { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] }). execute (); or this one: rs . service () . resource ( \"\" ) . 
get ([{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }]). execute (); In fact, here is a sample of how to define a whole API providing configuration directly to the service method and then enhance it. rs . service ({ \"\" : { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] } }) . resource ( \"\" ) . post () . serve ( function ( ctx , request , response ){ console . info ( request . readText ()); }) . execute (); In this way we essentially leverage the fluent API to configure a service without starting from scratch. Many of the API methods accept a configuration object as a second argument, and this doesn't prevent you from continuing with the fluent API to enhance or override it.","title":"Using configuration objects"},{"location":"development/concepts/rest/#the-senderror-method-in-httpcontroller","text":"The HttpController instance that we receive when rs.service() is invoked features a sendError method. It implements the logic for formatting errors and returning them to the client, taking into account its type and content type preferences. Should you need to change this behavior globally, you can redefine the function. If you require different behavior for particular resources or resource method handlers, then using the catch callback is the better approach. Sometimes it's useful to reuse the method and send errors from your handler functions. The standard request processing mechanism in HttpController does not account for logical errors. It doesn't know, for example, that a parameter from the client input is out of its valid range. For such cases you would normally implement validation either in the before event handler or in serve. And if you need tighter control on what is sent back, e.g. the HTTP code, you wouldn't simply throw an Error but invoke the sendError function with the right parameters yourself. 
For these purposes the last argument of each event handler function is conveniently the controller instance. rs . service (). resource ( \"\" ) . get () . before ( function ( ctx , request , response , methodHandler , controller ){ //check if requested file exists if ( ! file . exists ()){ controller . sendError (); } }) . serve ( function (){ //return file content })","title":"The sendError method in HttpController"},{"location":"development/concepts/rest/#defining-readonly-apis","text":"mappings.readonly() An obvious way of defining readonly APIs is to use only GET resource method definitions. In some cases, though, APIs can be created from external configuration that also contains other resource method handlers, or we can receive an API instance from another module factory, or we want to support two instances of the same API, one readonly and one with edit capabilities, with minimal code. In such cases, we already have non-GET resource methods that we have to get rid of somehow. Here the readonly method steps in and does exactly this - removes all but the GET resource handlers if any. Example: rs . service () . resource ( \"\" ) . post () . serve ( function (){}) . get () . serve ( function (){}) . readonly () . execute (); If you inspect the configuration after .readonly() is invoked (use resource(\"\").configuration() ) you will notice that the post verb definition is gone. Consequently, POST requests to this resource will end up with Bad Request (400). Note that for this to work, readonly must be the last configuration action for a resource. Otherwise, you are resetting the resource configuration to readonly, only to define write methods again. 
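Conceptually, readonly() reduces a configuration to its GET definitions. The helper below is a hypothetical plain-JavaScript sketch of that pruning applied to a configuration object; it is not the actual implementation, which operates on the internal mappings.

```javascript
// Hypothetical sketch: keep only the "get" verb definitions per resource path.
function toReadonly(config) {
  var result = {};
  Object.keys(config).forEach(function (path) {
    result[path] = {};
    if (config[path].get) {
      result[path].get = config[path].get;
    }
  });
  return result;
}

// Usage: the "post" definition is pruned; only "get" survives.
var config = {
  "": {
    get: [{ serve: function () {} }],
    post: [{ serve: function () {} }]
  }
};
var readonlyConfig = toReadonly(config);
// readonlyConfig[""] now carries "get" but no "post"; against such a
// configuration a POST request would end up with Bad Request (400)
```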
The readonly method is available both for ResourceMapping and Resource objects, returned either by invocations of the service mappings() method or as retained references from configuration API invocations.","title":"Defining readonly APIs"},{"location":"development/concepts/rest/#disabling-a-resourcemethod-handler","text":"api.disable(sPath, sVerb, arrConsumesTypes, arrProducesTypes) Similar to the use cases explored for the readonly method above, you might not be in full control of the definition of the API, but rather take over at some point. Similar to the readonly method, this one will remove the handler definition identified by the four parameters - resource path, resource verb, consumes constraint array (not necessarily in the same order), produces constraint array (not necessarily in the same order), but it will do it for any verb, not only GET . In that sense, readonly is a specialization of this method for GET verbs only. Example: var mappings = rs . service ({ \"\" : { \"post\" : [{ serve : function (){} }], \"get\" : [{ serve : function (){} }] } }). mappings (); mappings . disable ( \"\" , \"post\" ); With this API definition, invoking mappings.find(\"\",\"get\") will return a reference to the only get handler defined there and you can manage it. Note that you get a reference to the configuration and not an API. Example: // add produces constraint and redefine the serve callback var mappings = rs . service (). get ( function (){}). mappings (); //later in code var handler = mappings . find ( \"\" , \"get\" ); handler . produces = [ \"application/json\" ] handler . serve = function (){ console . info ( \"I was redefined\" ); }","title":"Disabling a ResourceMethod Handler"},{"location":"development/concepts/rest/#executing-service-with-explicit-requestresponse-arguments","text":"The request and response parameters of the execute method are optional. If you don't supply them, the service will take the request/response objects that were used to request the script. 
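When you do pass explicit arguments, lightweight mocks are usually enough. The sketch below uses hypothetical mock shapes in plain JavaScript (the real Dirigible request/response objects have many more methods); only `println`, which the documented examples rely on, is modeled, and the sample handler is invoked directly to keep the sketch self-contained.

```javascript
// Hypothetical mock response that captures output instead of writing to HTTP.
function createMockResponse() {
  var lines = [];
  return {
    println: function (text) { lines.push(text); }, // records printed lines
    getLines: function () { return lines; }
  };
}

var mockRequest = { method: "GET", path: "/" }; // illustrative shape only
var mockResponse = createMockResponse();

// A service under test would receive the mocks via, e.g.:
//   rs.service().resource("").get(handler).execute(mockRequest, mockResponse);
function handler(ctx, request, response) {
  response.println("Hello there!");
}
handler({}, mockRequest, mockResponse);
// mockResponse.getLines() now holds ["Hello there!"]
```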
Most of the time this is what you want. However, supplying your own request and response arguments can be very handy for testing, as you can easily mock and inspect them.","title":"Executing service with explicit request/response arguments"},{"location":"development/concepts/rest/#fluency-for-execute-method","text":"The execute method is defined by the service instance (HttpController) obtained with rs.service() and can be executed with: rs.service().execute() . The fluent configuration API also provides references to the method, so you can actually invoke it at any stage. Examples: rs . service (). resource ( \"\" ). get ( function (){}). execute () rs . service (). resource ( \"\" ). get (). serve ( function (){}). execute () rs . service (). resource ( \"\" ). get (). produces ([ \"text/json\" ]). serve ( function (){}). execute () rs . service () . resource ( \"\" ) . produces ([ \"application/json\" ]) . get ( function (){}) . resource ( \"\" ) . consumes ([ \"*/json\" ]) . post ( function (){}) . execute ()","title":"Fluency for execute method"},{"location":"development/concepts/rest/#mappings-vs-configurations","text":"The API supplies two methods, mappings() and configuration() , that provide configuration in two forms. The mappings method supplies typed API objects such as Resource aggregating ResourceMethod instances. To get a reference to a service's mappings, invoke mappings on the service instance: rs.service().mappings() With a reference to mappings you have their fluent API at your disposal. This is useful when extending and enhancing the core rs functionality to build dedicated services. For example, the HttpController constructor function is designed to accept mappings, and if you extend or initialize it internally in another API you will likely need this form of configuration. An invocation of the configuration method, on the other hand, provides the underlying JS configuration object. 
It can be used to supply generic configurations that are used to initialize new types of services, as the public fluent API is designed to accept this form of configuration. Both represent configuration, but while the mappings are an internal, parsed version, the configuration object is the form that the public API accepts; it is therefore the public counterpart of the internal configuration. It is also possible to convert between the two: rs . service ( jsConfig ). mappings () rs . service (). resource (). configuration ()","title":"Mappings vs Configurations"},{"location":"development/concepts/rest/#finding-a-resourcemethod","text":"rs.service().mappings().find(sPath, sMethod, arrConsumesTypes, arrProducesTypes) Suppose you want to redefine a handler definition to e.g. change the serve callback, add a before handler, change or add to the consumes media types constraint, etc. To do that you need a reference to the handler, which is identified by the four parameters - resource path , resource method , consumes constraint array (not necessarily in the same order), produces constraint array (not necessarily in the same order). On a successful search hit you get a reference to the handler definition and can perform changes on it. Example: rs . service () . resource ( \"\" ) . get ( function (){}); With this API definition, invoking rs.service().mappings().find(\"\",\"get\") will return a reference to the only get handler defined there and you can manage it. Note that you get a reference to the configuration and not an API. Example: // add produces constraint and redefine the serve callback var handler = svc . mappings (). find ( \"\" , \"get\" ); handler . produces = [ \"application/json\" ] handler . serve = function (){ console . info ( \"I was redefined\" ); } With consumes and produces constraints on a resource method handler, getting a reference will require them to be specified too. Example: var svc = rs . service (); svc . mappings () . resource ( \"\" ) . 
post () . consumes ([ 'application/json' , 'text/json' ]) . produces ([ 'application/json' ]) . serve ( function (){}); var handler = svc . mappings (). find ( \"\" , \"post\" , [ 'text/json' , 'application/json' ], [ 'application/json' ]); Note that the order of the MIME type string values in the consumes/produces array parameters is not significant. They will be sorted before matching the sorted corresponding arrays in the definition.","title":"Finding a ResourceMethod"},{"location":"development/concepts/rest/#configuring-resource-with-js-object","text":"Having defined a resource with a path, we have two options for configuring it. We can proceed using its fluent API or we can provision a configuration JS object as a second argument to the resource method and have it done in one step. Considering the latter, we will be provisioning configuration for this resource only, so it should be an object with method definitions as root members. rs . service () . resource ( \"\" , { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] }). execute (); Refer to the next sections for a comparison of how to achieve the same using the fluent API and/or configuration objects on the lower levels.","title":"Configuring resource with JS object"},{"location":"development/concepts/rest/#js-configuration-object-schema","text":"In progress. Check back later. Schema: { pathString : { methodString : [{ \"consumes\" : [ \"types/subtype|*/subtype|type/*|*/*\" ] \"produces\" : [ \"types/subtype|*/subtype|type/*|*/*\" ] \"before\" : Function \"serve\" : Function \"catch\" : Function \"finally\" : Function }] } } pathString is a string that represents the resource path. There could be 0 or more such non-overlapping members. methodString is a string for the HTTP resource method. There could be 0 or more such non-overlapping members. 
The value of methodString is an array of 0 or more objects, each defining a request method processing that will be executed under unique conditions (constraints) that match the request. A component in the methodString array can consist of constraints (consumes, produces) and request processing flow event handlers (before, serve, catch, finally). consumes value is an array of 0 or more strings, each a valid MIME type string formatted as types/subtype. Can be undefined. produces value is an array of 0 or more strings, each a valid MIME type string formatted as types/subtype. Can be undefined. before , serve , catch and finally values are functions. Except for the serve function, the rest can be undefined .","title":"JS Configuration object schema"},{"location":"development/concepts/rest/#building-a-crud-rest-service","text":"The code snippet below shows a sample design for a REST API for simple CRUD file operations. It is for illustrative purposes. The service is designed to work with files in the HOME directory of the user that currently runs the Dirigible instance. Users can create, read, update and delete files by sending corresponding POST, GET, PUT and DELETE requests using the file name as a path segment (e.g. /services/js/file-service.js/test.json ) and they can also upload files if they don't specify a file name but send a multipart/form-data POST request directly to the service (e.g. /services/js/file-service.js ). Note how the before handler is used to validate that the user has permissions on resources and how it makes use of the controller's sendError method. var LOGGER = require ( \"log/v4/logging\" ). getLogger ( 'http.filesvc' ); var rs = require ( \"http/v4/rs\" ); var upload = require ( 'http/v4/upload' ); var files = require ( 'io/v4/files' ); var user = require ( 'security/v4/user' ); var env = require ( 'core/v4/env' ); var validateRequest = function ( permissions , ctx , request , response , methodHandler , controller ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . 
pathParameters . fileName ; if ( ! files . exists ( filePath )){ LOGGER . info ( \"Requested file \" + filePath + \" does not exist.\" ); controller . sendError ( response . NOT_FOUND , undefined , response . HttpCodesReasons . getReason ( String ( response . NOT_FOUND )), ctx . pathParameters . fileName + \" does not exist.\" ); return ; } if ( permissions ){ var resourcePermissions = files . getPermissions ( filePath ); if ( resourcePermissions === null || resourcePermissions . indexOf ( permissions ) < 0 ){ var loggedUser = user . getName (); LOGGER . error ( \"User {} does not have sufficient permissions[{}] for {}\" , loggedUser , files . getPermissions ( filePath ), filePath ); controller . sendError ( response . UNAUTHORIZED , undefined , response . HttpCodesReasons . getReason ( String ( response . UNAUTHORIZED )), \"User \" + loggedUser + \" does not have sufficient permissions for \" + ctx . pathParameters . fileName ); return ; } } LOGGER . debug ( 'validation successful' ); }; var postProcess = function ( operationName ){ LOGGER . info ( \"{} operation finished\" , operationName ); }; rs . service () . resource ( \"\" ) . post ( function ( ctx , request , response ){ var fileItems = upload . parseRequest (); for ( var i = 0 ; i < fileItems . size (); i ++ ) { var filePath = env . get ( 'HOME' ) + '/' ; var content ; var fileItem = fileItems . get ( i ); if ( ! fileItem . isFormField ()) { filePath += fileItem . getName (); content = String . fromCharCode . apply ( null , fileItem . getBytes ()); } else { filePath += fileItem . getFieldName (); content = fileItem . getText (); } LOGGER . debug ( \"Creating file \" + filePath ); files . writeText ( filePath , content ); } response . setStatus ( response . CREATED ); }) . before ( function ( ctx , request , response , methodHandler , controller ){ var loggedUser = user . getName (); if ( files . getOwner ( ctx . pathParameters . fileName ) !== loggedUser ) controller . sendError ( response . 
UNAUTHORIZED , undefined , response . HttpCodesReasons . getReason ( String ( response . UNAUTHORIZED )), loggedUser + \" is not owner of \" + ctx . pathParameters . fileName ); }) . finally ( postProcess . bind ( this , \"Upload\" )) . consumes ([ \"multipart/form-data\" ]) . resource ( \"{fileName}\" ) . post ( function ( ctx , request , response ){ var content = request . getText (); var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Creating file \" + filePath ); files . writeText ( filePath , content ); files . setPermissions ( filePath , 'rw' ); response . setStatus ( response . CREATED ); }) . finally ( postProcess . bind ( this , \"Create\" )) . consumes ([ \"application/json\" ]) . get ( function ( ctx , request , response ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . error ( \"Reading file \" + filePath ); var content = files . readText ( filePath ); response . setStatus ( response . OK ); response . print ( content ); }) . before ( validateRequest . bind ( this , 'r' )) . finally ( postProcess . bind ( this , \"Read\" )) . produces ([ \"application/json\" ]) . put ( function ( ctx , request , response ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Updating file \" + filePath ); var content = request . getJSON (); files . deleteFile ( filePath ); files . writeText ( filePath , content ); response . setStatus ( response . ACCEPTED ); }) . finally ( postProcess . bind ( this , \"Update\" )) . before ( validateRequest . bind ( this , 'rw' )) . consumes ([ \"application/json\" ]) . remove ( function ( ctx , request , response ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Removing file \" + filePath ); files . deleteFile ( filePath ); response . setStatus ( response . NO_CONTENT ); }) . before ( validateRequest . bind ( this , 'w' )) . finally ( postProcess . 
bind ( this , \"Delete\" )) . execute (); You can find the complete documentation for http/rs and http/rs-data under the API page .","title":"Building a CRUD rest service"},{"location":"development/concepts/web-content/","text":"Web Content Overview The Web content includes all the static client-side resources, such as HTML files, CSS, and related theme ingredients, as well as the dynamic scripts and the images. In general, a Web content adapter plays the role of a tunnel that takes the desired resource location from the request path, loads the corresponding content from the repository, and sends it back without any modification. By default, the Web content adapter accepts requests to particular resources and responds with an error code to requests to whole collections. This way, the Web content adapter indicates that folder listing is forbidden. Note If the specific application/json Accept header is supplied with the request itself, then a JSON formatted array with sub-folders and resources will be returned. To boost developer productivity in the most common cases, we provide a set of templates that can help during UI creation. There is a set of templates that can be used with the entity services , a list of entities, master-detail, input form, and so on. The other templates can be used as utilities for the creation of an application shell in index.html with a main menu or as samples that show the most common controls on different AJAX UI frameworks, such as jQuery , Bootstrap , AngularJS , and OpenUI5 .","title":"Web Content"},{"location":"development/concepts/web-content/#web-content","text":"","title":"Web Content"},{"location":"development/concepts/web-content/#overview","text":"The Web content includes all the static client-side resources, such as HTML files, CSS, and related theme ingredients, as well as the dynamic scripts and the images. 
In general, a Web content adapter plays the role of a tunnel that takes the desired resource location from the request path, loads the corresponding content from the repository, and sends it back without any modification. By default, the Web content adapter accepts requests to particular resources and responds with an error code to requests to whole collections. This way, the Web content adapter indicates that folder listing is forbidden. Note If the specific application/json Accept header is supplied with the request itself, then a JSON formatted array with sub-folders and resources will be returned. To boost developer productivity in the most common cases, we provide a set of templates that can help during UI creation. There is a set of templates that can be used with the entity services , a list of entities, master-detail, input form, and so on. The other templates can be used as utilities for the creation of an application shell in index.html with a main menu or as samples that show the most common controls on different AJAX UI frameworks, such as jQuery , Bootstrap , AngularJS , and OpenUI5 .","title":"Overview"},{"location":"development/concepts/workspace/","text":"Workspace The workspace is the developer's place where you create and manage the application artifacts. The first-level citizens of the workspace are the projects. Each project can contain multiple folders and files (artifacts). A single user can have multiple workspaces that contain different sets of projects. Managing the artifacts, i.e. the projects, can be done via the views and editors in the Workbench perspective .","title":"Workspace"},{"location":"development/concepts/workspace/#workspace","text":"The workspace is the developer's place where you create and manage the application artifacts. The first-level citizens of the workspace are the projects. Each project can contain multiple folders and files (artifacts). A single user can have multiple workspaces that contain different sets of projects. 
Managing the artifacts, i.e. the projects, can be done via the views and editors in the Workbench perspective .","title":"Workspace"},{"location":"development/extensions/","text":"Extensions Overview Extensibility Extensibility is an important requirement for business applications built to follow custom processes in Line of Business (LoB) areas. In the cloud toolkit, a generic description of the extension points and extensions is provided without explicitly defining the contract. This is a simple but powerful way to define extensions. To learn more about the Extensions concept, click here Extension Points IDE ide-perspective ide-view ide-editor ide-template ide-menu ide-themes ide-workspace-menu-new-template api-modules ide-operations-menu ide-documents-content-type ide-documents-menu ide-git-menu ide-terminal-menu ide-discussions-menu ide-database-menu ide-repository-menu Server ide-workspace-on-save ide-workspace-before-publish ide-workspace-after-publish ide-workspace-before-unpublish ide-workspace-after-unpublish Events IDE editor.file.saved editor.file.dirty status.message status.caret status.error database.database.selection.changed database.datasource.selection.changed database.sql.execute database.sql.run git.repository.run workspace.file.selected workspace.file.created workspace.file.open workspace.file.pull workspace.file.deleted workspace.file.renamed workspace.file.moved workspace.file.copied workspace.file.properties workspace.file.published workspace.project.exported repository.resource.selected repository.resource.created repository.resource.open repository.resource.deleted","title":"Extensions Overview"},{"location":"development/extensions/#extensions-overview","text":"","title":"Extensions Overview"},{"location":"development/extensions/#extensibility","text":"Extensibility is an important requirement for business applications built to follow custom processes in Line of Business (LoB) areas. 
In the cloud toolkit, a generic description of the extension points and extensions is provided without explicitly defining the contract. This is a simple but powerful way to define extensions. To learn more about the Extensions concept, click here","title":"Extensibility"},{"location":"development/extensions/#extension-points","text":"","title":"Extension Points"},{"location":"development/extensions/#ide","text":"ide-perspective ide-view ide-editor ide-template ide-menu ide-themes ide-workspace-menu-new-template api-modules ide-operations-menu ide-documents-content-type ide-documents-menu ide-git-menu ide-terminal-menu ide-discussions-menu ide-database-menu ide-repository-menu","title":"IDE"},{"location":"development/extensions/#server","text":"ide-workspace-on-save ide-workspace-before-publish ide-workspace-after-publish ide-workspace-before-unpublish ide-workspace-after-unpublish","title":"Server"},{"location":"development/extensions/#events","text":"","title":"Events"},{"location":"development/extensions/#ide_1","text":"editor.file.saved editor.file.dirty status.message status.caret status.error database.database.selection.changed database.datasource.selection.changed database.sql.execute database.sql.run git.repository.run workspace.file.selected workspace.file.created workspace.file.open workspace.file.pull workspace.file.deleted workspace.file.renamed workspace.file.moved workspace.file.copied workspace.file.properties workspace.file.published workspace.project.exported repository.resource.selected repository.resource.created repository.resource.open repository.resource.deleted","title":"IDE"},{"location":"development/extensions/editor/","text":"Editor Descriptors To contribute a new Editor (text-based or form-based) to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project: my-editor.extension { \"module\" : \"my-project/services/my-editor.js\" , \"extensionPoint\" : \"ide-editor\" , \"description\" : \"The 
description of my editor\" } module - Points to the corresponding view descriptor (see below). extensionPoint - The name of the built-in extension point to which the current plugin will contribute. my-editor.js exports . getView = function () { var view = { name : \"My Editor\" , factory : \"frame\" , region : \"center-top\" , link : \"../my-project/index.html\" , contentTypes : [ \"application/json\" ] }; return view ; }; name - The exact name of the view, which will be shown in, e.g., the menu. factory - The type of the factory used when instantiating the view. region - The region where the view will be placed initially. link - The location within the same or external project pointing to the entry HTML file which will be rendered as a view. contentTypes - The array of supported file content types. The project structure in this case should look like this: | my-project |---- extensions |----> my-editor.extension |---- services |----> my-editor.js |---- index.html |---- js |---- css |---- ... 
The names of the extensions and services can be different following the layout of your project Implementation < html lang = \"en\" ng-app = \"editor\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1.0\" > < meta name = \"description\" content = \"\" > < meta name = \"author\" content = \"\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"../../../../../services/v4/js/theme/resources.js/bootstrap.min.css\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"../../../../../services/v4/web/resources/font-awesome-4.7.0/css/font-awesome.min.css\" > < link type = \"image/png\" rel = \"shortcut icon\" href = \"../../../../../services/v4/web/resources/images/favicon.png\" /> < link type = \"text/css\" rel = \"stylesheet\" href = \"../../../../../services/v4/js/theme/resources.js/ide.css\" /> < body ng-controller = \"EditorController\" > < div class = \"container\" > < div class = \"page-header\" > < h1 > My Editor Description: {{file}} < form > < div class = \"form-group\" > < label > Group < input type = \"text\" class = \"form-control\" ng-model = \"myModel.group\" value = \"\" > ... 
< button type = \"button\" class = \"btn btn-primary\" ng-click = \"save()\" > Save < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/jquery/2.0.3/jquery.min.js\" > < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/bootstrap/3.3.7/bootstrap.min.js\" async > < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/angular/1.4.7/angular.min.js\" > < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/angular/1.4.7/angular-resource.min.js\" > < script src = \"../../../../../services/v4/web/ide-core/ui/message-hub.js\" > < script type = \"text/javascript\" src = \"editor.js\" > For a real-world example you can look at Jobs Plugin or Monaco Editor .","title":"Editor"},{"location":"development/extensions/editor/#editor","text":"","title":"Editor"},{"location":"development/extensions/editor/#descriptors","text":"To contribute a new Editor (text-based or form-based) to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project:","title":"Descriptors"},{"location":"development/extensions/editor/#my-editorextension","text":"{ \"module\" : \"my-project/services/my-editor.js\" , \"extensionPoint\" : \"ide-editor\" , \"description\" : \"The description of my editor\" } module - Points to the corresponding view descriptor (see below). extensionPoint - The name of the built-in extension point to which the current plugin will contribute.","title":"my-editor.extension"},{"location":"development/extensions/editor/#my-editorjs","text":"exports . getView = function () { var view = { name : \"My Editor\" , factory : \"frame\" , region : \"center-top\" , link : \"../my-project/index.html\" , contentTypes : [ \"application/json\" ] }; return view ; }; name - The exact name of the view, which will be shown in, e.g., the menu. factory - The type of the factory used when instantiating the view. 
region - The region where the view will be placed initially. link - The location within the same or external project pointing to the entry HTML file which will be rendered as a view. contentTypes - The content types array of supported files. The project structure in this case should look like this: | my-project |---- extensions |----> my-editor.extension |---- services |----> my-editor.js |---- index.html |---- js |---- css |---- ... The names of the extensions and services can be different following the layout of your project","title":"my-editor.js"},{"location":"development/extensions/editor/#implementation","text":" < html lang = \"en\" ng-app = \"editor\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1.0\" > < meta name = \"description\" content = \"\" > < meta name = \"author\" content = \"\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"../../../../../services/v4/js/theme/resources.js/bootstrap.min.css\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"../../../../../services/v4/web/resources/font-awesome-4.7.0/css/font-awesome.min.css\" > < link type = \"image/png\" rel = \"shortcut icon\" href = \"../../../../../services/v4/web/resources/images/favicon.png\" /> < link type = \"text/css\" rel = \"stylesheet\" href = \"../../../../../services/v4/js/theme/resources.js/ide.css\" /> < body ng-controller = \"EditorController\" > < div class = \"container\" > < div class = \"page-header\" > < h1 > My Editor Description: {{file}} < form > < div class = \"form-group\" > < label > Group < input type = \"text\" class = \"form-control\" ng-model = \"myModel.group\" value = \"\" > ... 
< button type = \"button\" class = \"btn btn-primary\" ng-click = \"save()\" > Save < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/jquery/2.0.3/jquery.min.js\" > < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/bootstrap/3.3.7/bootstrap.min.js\" async > < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/angular/1.4.7/angular.min.js\" > < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/angular/1.4.7/angular-resource.min.js\" > < script src = \"../../../../../services/v4/web/ide-core/ui/message-hub.js\" > < script type = \"text/javascript\" src = \"editor.js\" > For a real-world example you can look at Jobs Plugin or Monaco Editor .","title":"Implementation"},{"location":"development/extensions/perspective/","text":"Perspective Descriptors To contribute a new Perspective to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project: my-perspective.extension { \"module\" : \"my-project/services/my-perspective.js\" , \"extensionPoint\" : \"ide-perspective\" , \"description\" : \"The description of my perspective\" } module - Points to the corresponding perspective descriptor (see below). extensionPoint - Where and how this perspective will be shown. Some of the possible values are: ide-perspective ide-view ide-editor ide-database-menu ide-documents-content-type ide-workspace-menu-new-template my-perspective.js exports . getPerspective = function () { var perspective = { name : \"My Perspective\" , link : \"../my-project/index.html\" , order : \"901\" , image : \"files-o\" }; return perspective ; }; name - The exact name of the perspective. link - The location within the same or external project pointing to the entry HTML file which will be rendered as a perspective. order - Used to sort the perspective tabs in the sidebar. 
image - The name of the image which will be used for this perspective. This is a Font Awesome icon name. The project structure in this case should look like this: | my-project |---- extensions |----> my-perspective.extension |---- services |----> my-perspective.js |---- index.html |---- js |---- css |---- ... The names of the extensions and services can be different, following the layout of your project. Implementation In general you can embed any valid HTML in the index.html file and it will be rendered in the place where the perspective should be embedded. For a full example you can look at sample-ide-perspective . For a real-world example you can look at Database Perspective Project .","title":"Perspective"},{"location":"development/extensions/perspective/#perspective","text":"","title":"Perspective"},{"location":"development/extensions/perspective/#descriptors","text":"To contribute a new Perspective to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project:","title":"Descriptors"},{"location":"development/extensions/perspective/#my-perspectiveextension","text":"{ \"module\" : \"my-project/services/my-perspective.js\" , \"extensionPoint\" : \"ide-perspective\" , \"description\" : \"The description of my perspective\" } module - Points to the corresponding perspective descriptor (see below). extensionPoint - Where and how this perspective will be shown. Some of the possible values are: ide-perspective ide-view ide-editor ide-database-menu ide-documents-content-type ide-workspace-menu-new-template","title":"my-perspective.extension"},{"location":"development/extensions/perspective/#my-perspectivejs","text":"exports . getPerspective = function () { var perspective = { name : \"My Perspective\" , link : \"../my-project/index.html\" , order : \"901\" , image : \"files-o\" }; return perspective ; }; name - The exact name of the perspective.
link - The location within the same or external project pointing to the entry HTML file which will be rendered as a perspective. order - Used to sort the perspective tabs in the sidebar. image - The name of the image which will be used for this perspective. This is a Font Awesome icon name. The project structure in this case should look like this: | my-project |---- extensions |----> my-perspective.extension |---- services |----> my-perspective.js |---- index.html |---- js |---- css |---- ... The names of the extensions and services can be different, following the layout of your project.","title":"my-perspective.js"},{"location":"development/extensions/perspective/#implementation","text":"In general you can embed any valid HTML in the index.html file and it will be rendered in the place where the perspective should be embedded. For a full example you can look at sample-ide-perspective . For a real-world example you can look at Database Perspective Project .","title":"Implementation"},{"location":"development/extensions/template/","text":"Template Descriptors To contribute a new Template to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project: my-template.extension { \"module\" : \"my-project/services/my-template.js\" , \"extensionPoint\" : \"ide-template\" , \"description\" : \"The description of my template\" } module - Points to the corresponding template descriptor (see below). extensionPoint - The name of the built-in extension point to which the current plugin will contribute. my-template.js exports .
getTemplate = function () { var template = { name : \"My Template\" , description : \"My cool template\" , extension : \"myfile\" , sources : [ { location : \"/my-project/my-source.template\" , action : \"generate\" , rename : \"{{fileName}}.\" , engine : \"velocity\" , start : \"[[\" , end : \"]]\" } ], parameters : [] }; return template ; }; name - The exact name of the template, which will be shown in drop-down boxes. description - Text associated with the template. extension - Optional; if present, the template will be shown only if a file with the specified extension is selected. sources - The list of the templates which will be used during the generation phase. location - The relative path to the template. action - The type of the processing which will be used for this template. rename - The new name, if renaming of the target artifact is needed. engine - The template engine which will be used for this template - \"mustache\" (default), \"velocity\", or \"javascript\". start and end - Tags to use if the default \"{{\" and \"}}\" are not applicable. handler - The JavaScript transformation service, in case of the \"javascript\" engine. parameters - The list of parameters, if any, which will be passed to the generator. The project structure in this case should look like this: | my-project |---- extensions |----> my-template.extension |---- services |----> my-template.js |---- index.html |---- js |---- css |---- ...
The names of the extensions and services can be different, following the layout of your project. Implementation < html xmlns = \"http://www.w3.org/1999/xhtml\" > < head > < meta charset = \"utf-8\" /> < title > ${fileName} < body ng-app = \"my-view\" ng-controller = \"MyController as controller\" class = \"view\" > < form class = \"input-group\" name = \"myForm\" > < span class = \"input-group-btn\" > < button class = \"btn btn-default\" type = \"button\" ng-click = \"myClick()\" >< i class = \"fa fa-bolt\" > For a real-world example you can look at Bookstore Template .","title":"Template"},{"location":"development/extensions/template/#template","text":"","title":"Template"},{"location":"development/extensions/template/#descriptors","text":"To contribute a new Template to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project:","title":"Descriptors"},{"location":"development/extensions/template/#my-templateextension","text":"{ \"module\" : \"my-project/services/my-template.js\" , \"extensionPoint\" : \"ide-template\" , \"description\" : \"The description of my template\" } module - Points to the corresponding template descriptor (see below). extensionPoint - The name of the built-in extension point to which the current plugin will contribute.","title":"my-template.extension"},{"location":"development/extensions/template/#my-templatejs","text":"exports . getTemplate = function () { var template = { name : \"My Template\" , description : \"My cool template\" , extension : \"myfile\" , sources : [ { location : \"/my-project/my-source.template\" , action : \"generate\" , rename : \"{{fileName}}.\" , engine : \"velocity\" , start : \"[[\" , end : \"]]\" } ], parameters : [] }; return template ; }; name - The exact name of the template, which will be shown in drop-down boxes. description - Text associated with the template.
extension - Optional; if present, the template will be shown only if a file with the specified extension is selected. sources - The list of the templates which will be used during the generation phase. location - The relative path to the template. action - The type of the processing which will be used for this template. rename - The new name, if renaming of the target artifact is needed. engine - The template engine which will be used for this template - \"mustache\" (default), \"velocity\", or \"javascript\". start and end - Tags to use if the default \"{{\" and \"}}\" are not applicable. handler - The JavaScript transformation service, in case of the \"javascript\" engine. parameters - The list of parameters, if any, which will be passed to the generator. The project structure in this case should look like this: | my-project |---- extensions |----> my-template.extension |---- services |----> my-template.js |---- index.html |---- js |---- css |---- ... The names of the extensions and services can be different, following the layout of your project.","title":"my-template.js"},{"location":"development/extensions/template/#implementation","text":" < html xmlns = \"http://www.w3.org/1999/xhtml\" > < head > < meta charset = \"utf-8\" /> < title > ${fileName} < body ng-app = \"my-view\" ng-controller = \"MyController as controller\" class = \"view\" > < form class = \"input-group\" name = \"myForm\" > < span class = \"input-group-btn\" > < button class = \"btn btn-default\" type = \"button\" ng-click = \"myClick()\" >< i class = \"fa fa-bolt\" > For a real-world example you can look at Bookstore Template .","title":"Implementation"},{"location":"development/extensions/view/","text":"View Descriptors To contribute a new View to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project: my-view.extension { \"module\" : \"my-project/services/my-view.js\" , \"extensionPoint\" : \"ide-view\" , \"description\" : \"The description of my view\" }
module - Points to the corresponding view descriptor (see below). extensionPoint - The name of the built-in extension point to which the current plugin will contribute. my-view.js exports . getView = function () { var view = { id : \"my-view\" , name : \"My View\" , factory : \"frame\" , region : \"center-bottom\" , label : \"My View\" , link : \"../my-project/index.html\" }; return view ; }; id - The unique id of the view. name - The exact name of the view. factory - The type of the factory used when instantiating the view. region - The region where the view will be placed initially. label - The name which will be used in the heading bar. link - The location within the same or external project pointing to the entry HTML file which will be rendered as a view. The project structure in this case should look like this: | my-project |---- extensions |----> my-view.extension |---- services |----> my-view.js |---- index.html |---- js |---- css |---- ... The names of the extensions and services can be different, following the layout of your project. Implementation For a full example you can look at sample-ide-perspective . For a real-world example you can look at Preview View .","title":"View"},{"location":"development/extensions/view/#view","text":"","title":"View"},{"location":"development/extensions/view/#descriptors","text":"To contribute a new View to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project:","title":"Descriptors"},{"location":"development/extensions/view/#my-viewextension","text":"{ \"module\" : \"my-project/services/my-view.js\" , \"extensionPoint\" : \"ide-view\" , \"description\" : \"The description of my view\" } module - Points to the corresponding view descriptor (see below).
extensionPoint - The name of the built-in extension point to which the current plugin will contribute.","title":"my-view.extension"},{"location":"development/extensions/view/#my-viewjs","text":"exports . getView = function () { var view = { id : \"my-view\" , name : \"My View\" , factory : \"frame\" , region : \"center-bottom\" , label : \"My View\" , link : \"../my-project/index.html\" }; return view ; }; id - The unique id of the view. name - The exact name of the view. factory - The type of the factory used when instantiating the view. region - The region where the view will be placed initially. label - The name which will be used in the heading bar. link - The location within the same or external project pointing to the entry HTML file which will be rendered as a view. The project structure in this case should look like this: | my-project |---- extensions |----> my-view.extension |---- services |----> my-view.js |---- index.html |---- js |---- css |---- ... The names of the extensions and services can be different, following the layout of your project.","title":"my-view.js"},{"location":"development/extensions/view/#implementation","text":"For a full example you can look at sample-ide-perspective . For a real-world example you can look at Preview View .","title":"Implementation"},{"location":"development/ide/","text":"IDE Overview Web IDE The Web-based integrated development environment (Web IDE) runs directly in a browser and, therefore, does not require additional downloads and installations. It has a rich set of editors, viewers, wizards, DevOps productivity tools, and a new Web IDE for in-system application development. The Web IDE is a composition of perspectives, each consisting of the necessary tools to accomplish a certain goal.
Three of the UI elements retain their positions in all perspectives: top-area toolbar for the menus, theme selection, and user control sidebar on the left with shortcuts to the perspectives status bar at the bottom, for notifications and other use by the tools The tools that constitute the perspectives are laid out in predefined regions of the work plot, but you can change their position using drag and drop. The perspectives are simply predefined configurations, hence you can open, move, or close different tools on the work plot of a perspective for your convenience. You can also maximize, minimize, or even pop out any of the tools in a separate window. The tools are the smallest atomic parts in the Web IDE. They are referred to as views or editors, and each type is handled differently. Perspectives By default, the different views and editors are separated into a few perspectives: Workbench Git Database Repository Terminal Operations Documents Debugger Views Each perspective comprises different views. Learn more about them by following the list below: Snapshot Debugger Roles Jobs Documents Git Preview Workspace SQL Extensions Terminal Variables Breakpoints Console Logs Data Structures Access Listeners Database Search Import Registry Repository Editors Monaco is the editor integrated into the Eclipse Dirigible Web IDE. Modelers There are some more sophisticated visual editors: BPMN Modeler Database Schema Modeler Entity Data Modeler Form Designer Layouts The Web IDE layout API delegates the layout management to the GoldenLayout framework. Layouts is a convenience bag of functions that significantly simplifies the work with layouts. It takes care of views registry setup, the work plot regions configuration, layout initialization, serialization, control on the layout manager, open view and open editor functions, global notifications, and others.
The top-area toolbar is a composite that aggregates the drop-down menus, the theme selection, the user name, and sign-out control. It uses the corresponding UI microservices available in the ideUiCore module as Menu, User, and Theme. By convention, all UI components are built with Bootstrap 3.x CSS and the themes in the Web IDE are actually custom Bootstrap CSS. A UI microservice enables dynamic change of the CSS upon change of the theme automatically. It is available as the Angular factory theme. The Angular service User provides the details for the user that are rendered by the Menu directive, such as the user name. The sidebar is an Angular directive that takes care of rendering a standard sidebar in the framework template. It works with the perspectives.js service to populate the registered perspectives as shortcuts. The status bar is an Angular directive that renders a standard, fixed-position footer. The component is subscribed to listen to message types configured as the value of the status-bar-topic attribute, or by default to status-message messages.","title":"IDE Overview"},{"location":"development/ide/#ide-overview","text":"","title":"IDE Overview"},{"location":"development/ide/#web-ide","text":"The Web-based integrated development environment (Web IDE) runs directly in a browser and, therefore, does not require additional downloads and installations. It has a rich set of editors, viewers, wizards, DevOps productivity tools, and a new Web IDE for in-system application development. The Web IDE is a composition of perspectives, each consisting of the necessary tools to accomplish a certain goal.
Three of the UI elements retain their positions in all perspectives: top-area toolbar for the menus, theme selection, and user control sidebar on the left with shortcuts to the perspectives status bar at the bottom, for notifications and other use by the tools The tools that constitute the perspectives are laid out in predefined regions of the work plot, but you can change their position using drag and drop. The perspectives are simply predefined configurations, hence you can open, move, or close different tools on the work plot of a perspective for your convenience. You can also maximize, minimize, or even pop out any of the tools in a separate window. The tools are the smallest atomic parts in the Web IDE. They are referred to as views or editors, and each type is handled differently.","title":"Web IDE"},{"location":"development/ide/#perspectives","text":"By default, the different views and editors are separated into a few perspectives: Workbench Git Database Repository Terminal Operations Documents Debugger","title":"Perspectives"},{"location":"development/ide/#views","text":"Each perspective comprises different views. Learn more about them by following the list below: Snapshot Debugger Roles Jobs Documents Git Preview Workspace SQL Extensions Terminal Variables Breakpoints Console Logs Data Structures Access Listeners Database Search Import Registry Repository","title":"Views"},{"location":"development/ide/#editors","text":"Monaco is the editor integrated into the Eclipse Dirigible Web IDE.","title":"Editors"},{"location":"development/ide/#modelers","text":"There are some more sophisticated visual editors: BPMN Modeler Database Schema Modeler Entity Data Modeler Form Designer","title":"Modelers"},{"location":"development/ide/#layouts","text":"The Web IDE layout API delegates the layout management to the GoldenLayout framework. Layouts is a convenience bag of functions that significantly simplifies the work with layouts.
It takes care of views registry setup, the work plot regions configuration, layout initialization, serialization, control on the layout manager, open view and open editor functions, global notifications, and others. The top-area toolbar is a composite that aggregates the drop-down menus, the theme selection, the user name, and sign-out control. It uses the corresponding UI microservices available in the ideUiCore module as Menu, User, and Theme. By convention, all UI components are built with Bootstrap 3.x CSS and the themes in the Web IDE are actually custom Bootstrap CSS. A UI microservice enables dynamic change of the CSS upon change of the theme automatically. It is available as the Angular factory theme. The Angular service User provides the details for the user that are rendered by the Menu directive, such as the user name. The sidebar is an Angular directive that takes care of rendering a standard sidebar in the framework template. It works with the perspectives.js service to populate the registered perspectives as shortcuts. The status bar is an Angular directive that renders a standard, fixed-position footer. The component is subscribed to listen to message types configured as the value of the status-bar-topic attribute, or by default to status-message messages.","title":"Layouts"},{"location":"development/ide/about/","text":"About The About view contains system information about Dirigible's installation. The different properties and sections are: Version - the version of Dirigible. Commit Id - the commit that Dirigible is from. The commit id link also leads to the GitHub page of the release. Type - the type of the account. Instance - the type of the instance running. Repository - the place where the project repository is housed. Database - the database used. It can be local, custom or managed. Modules - the list of modules in the Dirigible. Engines - the list of engines in the Dirigible. Synchronizers - list of synchronizers in the Dirigible.
Their status is also shown.","title":"About"},{"location":"development/ide/about/#about","text":"The About view contains system information about Dirigible's installation. The different properties and sections are: Version - the version of Dirigible. Commit Id - the commit that Dirigible is from. The commit id link also leads to the GitHub page of the release. Type - the type of the account. Instance - the type of the instance running. Repository - the place where the project repository is housed. Database - the database used. It can be local, custom or managed. Modules - the list of modules in the Dirigible. Engines - the list of engines in the Dirigible. Synchronizers - list of synchronizers in the Dirigible. Their status is also shown.","title":"About"},{"location":"development/ide/editor-access/","text":"Access Editor The Access editor lets you manage access to your project through security constraints files ( *.access ). You can create multiple access constraints within your project as part of one security constraints file. Create a Security Constraints File Right-click on your project in the Workspace view and choose New \u2192 Access Constraints . Enter a name for the security constraints file. Create an Access Constraint Double-click on your security constraints file to open it in the Access editor. Choose New ( + ). In the Create Constraint dialog, fill in the path to the file for which you're creating the access constraint in the Path field. Choose an HTTP or CMIS method for which the access constraint will be valid in the Method field. Select HTTP or CMIS scope from the drop-down list in the Scope field. Fill in a role for which the access constraint is valid in the Roles field. Choose Save . Create a Public Endpoint You can also use the Access editor to make a resource publicly accessible. To do this, fill in the role public in step 6 above. This way, you're effectively creating a new public endpoint for the resource. 
You can access the public endpoint by replacing web with public in the endpoint's URL. Fill in the public role in the Roles field of the Create Constraint dialog and choose Save . Publish your project. Copy the endpoint's URL from the Preview view. Open a browser and replace web with public in the URL. Check if you can access the public endpoint.","title":"Access Editor"},{"location":"development/ide/editor-access/#access-editor","text":"The Access editor lets you manage access to your project through security constraints files ( *.access ). You can create multiple access constraints within your project as part of one security constraints file.","title":"Access Editor"},{"location":"development/ide/editor-access/#create-a-security-constraints-file","text":"Right-click on your project in the Workspace view and choose New \u2192 Access Constraints . Enter a name for the security constraints file.","title":"Create a Security Constraints File"},{"location":"development/ide/editor-access/#create-an-access-constraint","text":"Double-click on your security constraints file to open it in the Access editor. Choose New ( + ). In the Create Constraint dialog, fill in the path to the file for which you're creating the access constraint in the Path field. Choose an HTTP or CMIS method for which the access constraint will be valid in the Method field. Select HTTP or CMIS scope from the drop-down list in the Scope field. Fill in a role for which the access constraint is valid in the Roles field. Choose Save .","title":"Create an Access Constraint"},{"location":"development/ide/editor-access/#create-a-public-endpoint","text":"You can also use the Access editor to make a resource publicly accessible. To do this, fill in the role public in step 6 above. This way, you're effectively creating a new public endpoint for the resource. You can access the public endpoint by replacing web with public in the endpoint's URL. 
Fill in the public role in the Roles field of the Create Constraint dialog and choose Save . Publish your project. Copy the endpoint's URL from the Preview view. Open a browser and replace web with public in the URL. Check if you can access the public endpoint.","title":"Create a Public Endpoint"},{"location":"development/ide/editor-csv/","text":"CSV Editor The CSV editor in the Eclipse Dirigible IDE is based on the AG Grid library. The CSV editor allows you to create, edit, and manage CSV files. Create CSV Files To create a new CSV file in the IDE, first create a project. Right-click your project and create a file. Finally, give the newly created file a name followed by the .csv extension, and press Enter . To open the CSV file, double-click it. By default, the CSV file has the headers already enabled. If you want to disable the headers, click the vertical ellipsis icon (\" \u22ee \") to open the kebab menu, and click Disable Header . Edit CSV Files While editing a CSV file in the Eclipse Dirigible IDE, you can perform the following actions: Add a new column To add a column, right-click on the Column field and click Add Column . You can also edit and delete a column by right-clicking it. Add a new row To add a row, right-click on the field where rows should go and click Add Row . Clicking an already existing row allows you to add a new row before or after it, or delete it. Reorder rows You can use the drag icon (\" \u22ee\u22ee\u22ee\u22ee \") to change the order of existing rows by dragging and dropping them where you want them to be: Select rows You can select: - separate single rows with the \"Cmd (macOS) / Ctrl (Linux & Windows) + left click\" shortcut, or - multiple consecutive rows with the \"Shift + left click\" shortcut. Once you've selected some rows, you can either reorder them using the drag icon (\" \u22ee\u22ee\u22ee\u22ee \") or delete them with right-click and Delete Row(s) .
Manage CSV Files Filter a CSV file You can filter the CSV file using these predefined filter options: Contains Not Contains Equals Not Equal Starts With Ends With To select one of these options, click the \u201c \u2261 \u201d hamburger icon: Export a CSV file To export a CSV file, click Export . The CSV file will be downloaded automatically.","title":"CSV Editor"},{"location":"development/ide/editor-csv/#csv-editor","text":"The CSV editor in the Eclipse Dirigible IDE is based on the AG Grid library. The CSV editor allows you to create, edit, and manage CSV files.","title":"CSV Editor"},{"location":"development/ide/editor-csv/#create-csv-files","text":"To create a new CSV file in the IDE, first create a project. Right-click your project and create a file. Finally, give the newly created file a name followed by the .csv extension, and press Enter . To open the CSV file, double-click it. By default, the CSV file has the headers already enabled. If you want to disable the headers, click the vertical ellipsis icon (\" \u22ee \") to open the kebab menu, and click Disable Header .","title":"Create CSV Files"},{"location":"development/ide/editor-csv/#edit-csv-files","text":"While editing a CSV file in the Eclipse Dirigible IDE, you can perform the following actions: Add a new column To add a column, right-click on the Column field and click Add Column . You can also edit and delete a column by right-clicking it. Add a new row To add a row, right-click on the field where rows should go and click Add Row . Clicking an already existing row allows you to add a new row before or after it, or delete it. Reorder rows You can use the drag icon (\" \u22ee\u22ee\u22ee\u22ee \") to change the order of existing rows by dragging and dropping them where you want them to be: Select rows You can select: - separate single rows with the \"Cmd (macOS) / Ctrl (Linux & Windows) + left click\" shortcut, or - multiple consecutive rows with the \"Shift + left click\" shortcut.
Once you've selected some rows, you can either reorder them using the drag icon (\" \u22ee\u22ee\u22ee\u22ee \") or delete them with right-click and Delete Row(s) .","title":"Edit CSV Files"},{"location":"development/ide/editor-csv/#manage-csv-files","text":"Filter a CSV file You can filter the CSV file using these predefined filter options: Contains Not Contains Equals Not Equal Starts With Ends With To select one of these options, click the \u201c \u2261 \u201d hamburger icon: Export a CSV file To export a CSV file, click Export . The CSV file will be downloaded automatically.","title":"Manage CSV Files"},{"location":"development/ide/editor-csvim/","text":"CSVIM Editor The CSVIM editor in the Eclipse Dirigible IDE allows you to open, save, delete, and edit the properties of CSV files. Such properties are: Table Schema File path Delimiter Quote character Header Use header names Distinguish empty from null Version Table The Table input field can contain letters (a-z, A-Z), numbers (0-9), hyphens (\"-\"), dots (\".\"), underscores (\"_\"), and dollar signs (\"$\"). Schema The Schema input field can contain letters (a-z, A-Z), numbers (0-9), hyphens (\"-\"), dots (\".\"), underscores (\"_\"), and dollar signs (\"$\"). File path The File path input field can contain letters (a-z, A-Z), numbers (0-9), hyphens (\"-\"), forward slashes (\"/\"), dots (\".\"), underscores (\"_\"), and dollar signs (\"$\"). Here\u2019s an example of a correct file path: /workspace/csv/subfolder/bigstats.csv . In this example, the file path consists of: workspace The workspace is the place where you create and manage the artifacts of your application. csv This is the name of your project. subfolder This is the subfolder that contains the CSV file. bigstats.csv This is the CSV file. Note: If the file path is formatted properly but doesn't exist, you will be able to save the CSVIM file, but you won't be able to open it with the CSV editor.
If the file path isn't formatted properly (for example, by having unsupported characters), you won\u2019t be able to save the CSVIM file or open the CSV file. Delimiter The currently supported delimiters are comma (\",\"), tab (\"\\\\t\"), vertical bar (\"|\"), semicolon (\";\"), and number sign or hash (\"#\"). Note: If you\u2019re trying to use an unsupported character such as \"+\", a \"The delimiter is not supported!\" warning message will pop up. Nevertheless, you will be able to open the CSV file and save the CSVIM file. Quote character The currently supported quote characters are apostrophe (\"\u2018\"), quotation mark (\"\u201c\"), and number sign or hash (\"#\"). Note: If you\u2019re trying to use an unsupported character such as \"^\", a \"The quote character is not supported!\" warning message will pop up. Nevertheless, you will be able to open the CSV file or save the CSVIM file. Header If you select this checkbox, the first line of your CSV file will be treated as a column title or header. Use header names If you select this checkbox, the first line of the specified CSV file will be interpreted when importing the file. This option will work only if you have enabled the \"Header\" checkbox. Distinguish empty from null Select this checkbox if you want to make sure that the table-import process interprets correctly all empty values in the CSV file that are enclosed with the character selected in the Quote character dropdown, for example, an empty space. This ensures that an empty space is imported \"as is\" into the target table. If the empty space isn't interpreted correctly, it is imported as null. Version You can specify the version of the CSVIM so you can better manage your CSV and database data.","title":"CSVIM Editor"},{"location":"development/ide/editor-csvim/#csvim-editor","text":"The CSVIM editor in the Eclipse Dirigible IDE allows you to open, save, delete, and edit the properties of CSV files.
Such properties are: Table Schema File path Delimiter Quote character Header Use header names Distinguish empty from null Version","title":"CSVIM Editor"},{"location":"development/ide/editor-csvim/#table","text":"The Table input field can contain letters (a-z, A-Z), numbers (0-9), hyphens (\"-\"), dots (\".\"), underscores (\"_\"), and dollar signs (\"$\").","title":"Table"},{"location":"development/ide/editor-csvim/#schema","text":"The Schema input field can contain letters (a-z, A-Z), numbers (0-9), hyphens (\"-\"), dots (\".\"), underscores (\"_\"), and dollar signs (\"$\").","title":"Schema"},{"location":"development/ide/editor-csvim/#file-path","text":"The File path input field can contain letters (a-z, A-Z), numbers (0-9), hyphens (\"-\"), forward slashes (\"/\"), dots (\".\"), underscores (\"_\"), and dollar signs (\"$\"). Here\u2019s an example of a correct file path: /workspace/csv/subfolder/bigstats.csv . In this example, the file path consists of: workspace The workspace is the place where you create and manage the artifacts of your application. csv This is the name of your project. subfolder This is the subfolder that contains the CSV file. bigstats.csv This is the CSV file. Note: If the file path is formatted properly but doesn't exist, you will be able to save the CSVIM file, but you won't be able to open it with the CSV editor. If the file path isn't formatted properly (for example, by having unsupported characters), you won\u2019t be able to save the CSVIM file or open the CSV file.","title":"File path"},{"location":"development/ide/editor-csvim/#delimiter","text":"The currently supported delimiters are comma (\",\"), tab (\"\\\\t\"), vertical bar (\"|\"), semicolon (\";\"), and number sign or hash (\"#\"). Note: If you\u2019re trying to use an unsupported character such as \"+\", a \"The delimiter is not supported!\" warning message will pop up.
Nevertheless, you will be able to open the CSV file and save the CSVIM file.","title":"Delimiter"},{"location":"development/ide/editor-csvim/#quote-character","text":"The currently supported quote characters are apostrophe (\"\u2018\"), quotation mark (\"\u201c\"), and number sign or hash (\"#\"). Note: If you\u2019re trying to use an unsupported character such as \"^\", the \"The quote character is not supported!\" warning message will pop up. Nevertheless, you will be able to open the CSV file or save the CSVIM file.","title":"Quote character"},{"location":"development/ide/editor-csvim/#header","text":"If you select this checkbox, the first line of your CSV file will be treated as a column title or header.","title":"Header"},{"location":"development/ide/editor-csvim/#use-header-names","text":"If you select this checkbox, the first line of the specified CSV file will be interpreted when importing the file. This option will work only if you have enabled the \"Header\" checkbox.","title":"Use header names"},{"location":"development/ide/editor-csvim/#distinguish-empty-from-null","text":"Select this checkbox if you want to make sure that the table-import process interprets correctly all empty values in the CSV file that are enclosed with the value selected in the Quote character dropdown, for example, as an empty space. This ensures that an empty space is imported \"as is\" into the target table. If the empty space isn't interpreted correctly, it is imported as null.","title":"Distinguish empty from null"},{"location":"development/ide/editor-csvim/#version","text":"You can specify the version of the CSVIM so you can better manage your CSV and database data.","title":"Version"},{"location":"development/ide/editor-monaco/","text":"Monaco Editor Monaco Editor is the code editor that powers VS Code . It is not supported in mobile browsers or mobile web frameworks. 
The Editor supports syntax highlighting for XML, PHP, C#, C++, Razor, Markdown, Diff, Java, VB, CoffeeScript, Handlebars, Batch, Pug, F#, Lua, PowerShell, Python, Sass, R, Objective-C, and side-by-side live comparison for all languages out of the box. Monaco has a rich set of default keyboard shortcuts and also allows you to customize them. Monaco supports multiple cursors for fast simultaneous edits. You can also add secondary cursors.","title":"Monaco Editor"},{"location":"development/ide/editor-monaco/#monaco-editor","text":"Monaco Editor is the code editor that powers VS Code . It is not supported in mobile browsers or mobile web frameworks. The Editor supports syntax highlighting for XML, PHP, C#, C++, Razor, Markdown, Diff, Java, VB, CoffeeScript, Handlebars, Batch, Pug, F#, Lua, PowerShell, Python, Sass, R, Objective-C, and side-by-side live comparison for all languages out of the box. Monaco has a rich set of default keyboard shortcuts and also allows you to customize them. Monaco supports multiple cursors for fast simultaneous edits. You can also add secondary cursors.","title":"Monaco Editor"},{"location":"development/ide/modelers/bpmn/","text":"BPMN Modeler The BPMN Modeler provides capabilities for visual design of a business process. Such business processes can include Dirigible services.","title":"BPMN"},{"location":"development/ide/modelers/bpmn/#bpmn-modeler","text":"The BPMN Modeler provides capabilities for visual design of a business process. 
Such business processes can include Dirigible services.","title":"BPMN Modeler"},{"location":"development/ide/modelers/database-schema/","text":"Database Schema Modeler The Database Schema Modeler provides capabilities for visual design of a database schema.","title":"Database Schema"},{"location":"development/ide/modelers/database-schema/#database-schema-modeler","text":"The Database Schema Modeler provides capabilities for visual design of a database schema.","title":"Database Schema Modeler"},{"location":"development/ide/modelers/entity-data/","text":"Entity Data Modeler The Entity Data Modeler provides capabilities for visual design of a domain model. After that, you can generate a full-stack application for basic operations over the defined entities.","title":"Entity Data"},{"location":"development/ide/modelers/entity-data/#entity-data-modeler","text":"The Entity Data Modeler provides capabilities for visual design of a domain model. After that, you can generate a full-stack application for basic operations over the defined entities.","title":"Entity Data Modeler"},{"location":"development/ide/modelers/form-designer/","text":"Form Designer The Form Designer provides capabilities for visual design of a Web form. You can drag and drop UI controls from a predefined list and edit their properties.","title":"Form Designer"},{"location":"development/ide/modelers/form-designer/#form-designer","text":"The Form Designer provides capabilities for visual design of a Web form. You can drag and drop UI controls from a predefined list and edit their properties.","title":"Form Designer"},{"location":"development/ide/perspectives/database/","text":"Database Perspective The Database perspective contains tools for inspection and manipulation of the artifacts within the underlying relational database. It is comprised of Database , SQL , Console and Result views. 
The Database perspective features a database explorer, a console to execute SQL statements and to preview results in table format.","title":"Database"},{"location":"development/ide/perspectives/database/#database-perspective","text":"The Database perspective contains tools for inspection and manipulation of the artifacts within the underlying relational database. It is comprised of Database , SQL , Console and Result views. The Database perspective features a database explorer, a console to execute SQL statements and to preview results in table format.","title":"Database Perspective"},{"location":"development/ide/perspectives/debugger/","text":"Debugger Perspective The Web IDE includes a Debugger perspective which is comprised of the following views: Debugger Variables Breakpoints Console Preview The Debugger perspective enables you to monitor the execution of your code, stop it, restart it or set breakpoints, and change values in memory.","title":"Debugger"},{"location":"development/ide/perspectives/debugger/#debugger-perspective","text":"The Web IDE includes a Debugger perspective which is comprised of the following views: Debugger Variables Breakpoints Console Preview The Debugger perspective enables you to monitor the execution of your code, stop it, restart it or set breakpoints, and change values in memory.","title":"Debugger Perspective"},{"location":"development/ide/perspectives/documents/","text":"Documents Perspective The Documents perspective is the place where the user manages the binary artifacts such as pictures, spreadsheets, PDF files, etc. It enables him/her to upload, overwrite, download, delete and search for artifacts. 
At the moment the Documents perspective consists of only one view, which is also called Documents.","title":"Documents"},{"location":"development/ide/perspectives/documents/#documents-perspective","text":"The Documents perspective is the place where the user manages the binary artifacts such as pictures, spreadsheets, PDF files, etc. It enables him/her to upload, overwrite, download, delete and search for artifacts. At the moment the Documents perspective consists of only one view, which is also called Documents.","title":"Documents Perspective"},{"location":"development/ide/perspectives/git/","text":"Git Perspective The Git perspective aims at presenting a simplified interface for the most common Git operations. It is built from tools that support Git client operations. The Git perspective is comprised of Git and Console views, and workspace menu. It enables the users to perform simple Git operations such as cloning a repository to a workspace, pulling changes, and pushing commits. The user can create, manage, and switch between multiple workspaces through the workspace menu. Note In case of merge conflict on Push operation, a new branch with your local changes will be created in the remote repository. From this point, you can use your preferred tooling to apply the actual merge between the two branches. Video","title":"Git"},{"location":"development/ide/perspectives/git/#git-perspective","text":"The Git perspective aims at presenting a simplified interface for the most common Git operations. It is built from tools that support Git client operations. The Git perspective is comprised of Git and Console views, and workspace menu. It enables the users to perform simple Git operations such as cloning a repository to a workspace, pulling changes, and pushing commits. The user can create, manage, and switch between multiple workspaces through the workspace menu. 
Note In case of merge conflict on Push operation, a new branch with your local changes will be created in the remote repository. From this point, you can use your preferred tooling to apply the actual merge between the two branches.","title":"Git Perspective"},{"location":"development/ide/perspectives/git/#video","text":"","title":"Video"},{"location":"development/ide/perspectives/operations/","text":"Operations Perspective The Web IDE includes an Operations perspective , which is comprised of the following views: Registry Repository Extension Jobs Listeners Data Structures Access Roles Console Terminal Logs The Operations perspective enables you to monitor the ongoing processes and operation activities.","title":"Operations"},{"location":"development/ide/perspectives/operations/#operations-perspective","text":"The Web IDE includes an Operations perspective , which is comprised of the following views: Registry Repository Extension Jobs Listeners Data Structures Access Roles Console Terminal Logs The Operations perspective enables you to monitor the ongoing processes and operation activities.","title":"Operations Perspective"},{"location":"development/ide/perspectives/repository/","text":"Repository Perspective The Repository perspective gives access to the raw structure of the Dirigible instance. It is comprised of Repository , Snapshot , Preview and Console views. There the user can inspect at low level the project and folder structure, as well as the artifacts content. The user is able to import/export snapshots via the Snapshot view. Caution Editing of the file contents via the Repository perspective is not recommended as it can lead to inconsistencies!","title":"Repository"},{"location":"development/ide/perspectives/repository/#repository-perspective","text":"The Repository perspective gives access to the raw structure of the Dirigible instance. It is comprised of Repository , Snapshot , Preview and Console views. 
There the user can inspect at low level the project and folder structure, as well as the artifacts content. The user is able to import/export snapshots via the Snapshot view. Caution Editing of the file contents via the Repository perspective is not recommended as it can lead to inconsistencies!","title":"Repository Perspective"},{"location":"development/ide/perspectives/terminal/","text":"Terminal Perspective The key view in the perspective is a terminal that emulates a console client connected to the environment of the Dirigible instance and can execute commands. The difference here is that the whole communication goes via HTTP(S) only and does not require the SSH port to be opened.","title":"Terminal"},{"location":"development/ide/perspectives/terminal/#terminal-perspective","text":"The key view in the perspective is a terminal that emulates a console client connected to the environment of the Dirigible instance and can execute commands. The difference here is that the whole communication goes via HTTP(S) only and does not require the SSH port to be opened.","title":"Terminal Perspective"},{"location":"development/ide/perspectives/workbench/","text":"Workbench Perspective This is the place where the user develops the dynamic applications. 
This perspective contains all views and editors that may help in the overall implementation, from domain models via services to the user interface. The Workbench perspective is comprised of Workspace , Import , Properties , Console , and Preview views, plus the editors registered for each file type. In other words, the minimal toolset for file management, preview, and editing operations. The main view opened by default in this perspective is the Workspace view, a standard view with the projects in your workspace .","title":"Workbench Perspective"},{"location":"development/ide/views/about/","text":"About View The About view contains system information about Dirigible's installation. The different properties and sections are: Version - the version of Dirigible. Commit Id - the commit that Dirigible is from. The commit id link also leads to the GitHub page of the release. Type - the type of the account. Instance - the type of the instance running. Repository - the place where the project repository is housed. Database - the database used. It can be local, custom or managed. Modules - the list of modules in the Dirigible. Engines - the list of engines in the Dirigible. Synchronizers - list of synchronizers in the Dirigible. Their status is also shown.","title":"About"},{"location":"development/ide/views/about/#about-view","text":"The About view contains system information about Dirigible's installation. The different properties and sections are: Version - the version of Dirigible. Commit Id - the commit that Dirigible is from. The commit id link also leads to the GitHub page of the release. Type - the type of the account. Instance - the type of the instance running. Repository - the place where the project repository is housed. Database - the database used. It can be local, custom or managed. Modules - the list of modules in the Dirigible. Engines - the list of engines in the Dirigible. Synchronizers - list of synchronizers in the Dirigible. 
Their status is also shown.","title":"About View"},{"location":"development/ide/views/access/","text":"Access View The Access view displays the defined security constraints on HTTP servers access or paths to the document repository. These constraints are defined in *.access files. More info about the type of the artifacts you can find in Artifacts . Related content Documents View Constraints View Documents Perspective","title":"Access"},{"location":"development/ide/views/access/#access-view","text":"The Access view displays the defined security constraints on HTTP servers access or paths to the document repository. These constraints are defined in *.access files. More info about the type of the artifacts you can find in Artifacts . Related content Documents View Constraints View Documents Perspective","title":"Access View"},{"location":"development/ide/views/configurations/","text":"Configurations View The Configurations view contains a list of configuration parameters and environment variables. Each of them begins with \"DIRIGIBLE_\" and continues with a unique name. In addition to Name, each of the other four columns in the table holds a distinct parameter. They are Environment, Runtime, Deployment and Module (priority left to right). Changing a variable The values of the configuration parameters are set by the module, but they can be overwritten. This can be done either during the deployment of Dirigible, by creating a dirigible.properties file with different values or by changing the values during runtime. Changing a variable during runtime Follow steps 1-5 outlined in the Create a hello-world.js service tutorial. Insert the following code at line 2: var response = require ( \"http/v4/response\" ); var config = require ( \"core/v4/configurations\" ); config . set ( \"DIRIGIBLE_BRANDING_NAME\" , \"RuntimeDemo\" ) response . println ( \"Hello World!\" ); response . flush (); response . close (); Save the file. Refresh the page. 
Navigate to Window \u2192 Select View \u2192 Configurations You can learn more about how to set up Environment Variables here .","title":"Configurations"},{"location":"development/ide/views/configurations/#configurations-view","text":"The Configurations view contains a list of configuration parameters and environment variables. Each of them begins with \"DIRIGIBLE_\" and continues with a unique name. In addition to Name, each of the other four columns in the table holds a distinct parameter. They are Environment, Runtime, Deployment and Module (priority left to right).","title":"Configurations View"},{"location":"development/ide/views/configurations/#changing-a-variable","text":"The values of the configuration parameters are set by the module, but they can be overwritten. This can be done either during the deployment of Dirigible, by creating a dirigible.properties file with different values or by changing the values during runtime.","title":"Changing a variable"},{"location":"development/ide/views/configurations/#changing-a-variable-during-runtime","text":"Follow steps 1-5 outlined in the Create a hello-world.js service tutorial. Insert the following code at line 2: var response = require ( \"http/v4/response\" ); var config = require ( \"core/v4/configurations\" ); config . set ( \"DIRIGIBLE_BRANDING_NAME\" , \"RuntimeDemo\" ); response . println ( \"Hello World!\" ); response . flush (); response . close (); Save the file. Refresh the page. Navigate to Window \u2192 Select View \u2192 Configurations You can learn more about how to set up Environment Variables here .","title":"Changing a variable during runtime"},{"location":"development/ide/views/console/","text":"Console View The Console view is a major debugging tool. It displays the output of the code that you are executing.","title":"Console"},{"location":"development/ide/views/console/#console-view","text":"The Console view is a major debugging tool. 
It displays the output of the code that you are executing.","title":"Console View"},{"location":"development/ide/views/constraints/","text":"Constraints View The Constraints view lets you restrict access through the Documents view to specific folders or files by creating constraints. This way, users will be able to access certain resources based on their roles. To create a constraint, you have to specify: a path to the folder or file. For example, /Folder A a method - READ or WRITE ( WRITE constraint includes READ access) a role - the role that the user needs to have in order to be able to see or edit the folder/file. For example, Admin . As specified in the screenshot below, only users with the role Admin can read Folder C that can be accessed by following the path /Folder A/FolderC . The constraints created in the Constraints view are also visible in the Access view. Related content Access View Documents View Documents Perspective","title":"Constraints"},{"location":"development/ide/views/constraints/#constraints-view","text":"The Constraints view lets you restrict access through the Documents view to specific folders or files by creating constraints. This way, users will be able to access certain resources based on their roles. To create a constraint, you have to specify: a path to the folder or file. For example, /Folder A a method - READ or WRITE ( WRITE constraint includes READ access) a role - the role that the user needs to have in order to be able to see or edit the folder/file. For example, Admin . As specified in the screenshot below, only users with the role Admin can read Folder C that can be accessed by following the path /Folder A/FolderC . The constraints created in the Constraints view are also visible in the Access view. 
Related content Access View Documents View Documents Perspective","title":"Constraints View"},{"location":"development/ide/views/database/","text":"Database View The Database view gives you direct access to the configured data source(s). It enables you to expand the schema item and see the list of all tables and views created either via the data structures models or directly via SQL script in SQL View . Note All created tables can be discovered under the PUBLIC schema (for local deployment with H2 database) . The PUBLIC schema will appear, after the local data source type and the DefaultDB data source are selected in the upper right corner.","title":"Database"},{"location":"development/ide/views/database/#database-view","text":"The Database view gives you direct access to the configured data source(s). It enables you to expand the schema item and see the list of all tables and views created either via the data structures models or directly via SQL script in SQL View . Note All created tables can be discovered under the PUBLIC schema (for local deployment with H2 database) . 
The PUBLIC schema will appear, after the local data source type and the DefaultDB data source are selected in the upper right corner.","title":"Database View"},{"location":"development/ide/views/datastructures/","text":"Data Structures View The Data Structures view lists all data structures defined in the following files: *.table - the table layout definition in JSON *.view - the view layout definition in JSON *.schema - the schema layout definition in JSON *.append - append mode data file in DSV *.delete - delete mode data file in DSV *.update - update mode data file in DSV *.replace - replace mode data file in DSV More info about the type of the artifacts you can find in Artifacts .","title":"Data Structures"},{"location":"development/ide/views/datastructures/#data-structures-view","text":"The Data Structures view lists all data structures defined in the following files: *.table - the table layout definition in JSON *.view - the view layout definition in JSON *.schema - the schema layout definition in JSON *.append - append mode data file in DSV *.delete - delete mode data file in DSV *.update - update mode data file in DSV *.replace - replace mode data file in DSV More info about the type of the artifacts you can find in Artifacts .","title":"Data Structures View"},{"location":"development/ide/views/debugger/","text":"Debugger View The Debugger view enables you to navigate the debugging of your code. You can: Start Pause Restart Proceed step by step This view includes a few panes that are helpful during the debugging process. See below for more details. Scope When you're paused on a line of code, the Scope pane shows you what local and global variables are currently defined, along with the value of each variable. It also shows closure variables, when applicable. Double-click a variable value to edit it. When you're not paused on a line of code, the Scope pane is empty. Breakpoints The Breakpoints pane shows any line-of-code breakpoints you've added to your code. 
As the name suggests, you can use a line-of-code breakpoint when you've got a specific line of code that you want to pause on. As you can see in the Breakpoints pane, currently there are two breakpoints added: \"Unnamed\" at row 5 and \"Unnamed\" at row 8. Debug Preview This pane displays the result of executing the debugged file. The Debug Preview is similar in functionality to the Preview view. Related content Console view Debugger perspective","title":"Debugger"},{"location":"development/ide/views/debugger/#debugger-view","text":"The Debugger view enables you to navigate the debugging of your code. You can: Start Pause Restart Proceed step by step This view includes a few panes that are helpful during the debugging process. See below for more details. Scope When you're paused on a line of code, the Scope pane shows you what local and global variables are currently defined, along with the value of each variable. It also shows closure variables, when applicable. Double-click a variable value to edit it. When you're not paused on a line of code, the Scope pane is empty. Breakpoints The Breakpoints pane shows any line-of-code breakpoints you've added to your code. As the name suggests, you can use a line-of-code breakpoint when you've got a specific line of code that you want to pause on. As you can see in the Breakpoints pane, currently there are two breakpoints added: \"Unnamed\" at row 5 and \"Unnamed\" at row 8. Debug Preview This pane displays the result of executing the debugged file. The Debug Preview is similar in functionality to the Preview view. Related content Console view Debugger perspective","title":"Debugger View"},{"location":"development/ide/views/discussions/","text":"Discussions View The Discussions view adds forum-like capabilities to the Eclipse Dirigible's UI. You can review and rate comments, as well as participate in the discussion by commenting under topics. 
There's also the possibility to toggle between thread view and timeline view for each discussion.","title":"Discussions"},{"location":"development/ide/views/discussions/#discussions-view","text":"The Discussions view adds forum-like capabilities to the Eclipse Dirigible's UI. You can review and rate comments, as well as participate in the discussion by commenting under topics. There's also the possibility to toggle between thread view and timeline view for each discussion.","title":"Discussions View"},{"location":"development/ide/views/documents/","text":"Documents View The Documents view enables you to manage the binary artifacts such as pictures, spreadsheets, PDF, etc. You can upload, overwrite, download, delete, and search for artifacts. Related content Access View Constraints View Documents Perspective","title":"Documents"},{"location":"development/ide/views/documents/#documents-view","text":"The Documents view enables you to manage the binary artifacts such as pictures, spreadsheets, PDF, etc. You can upload, overwrite, download, delete, and search for artifacts. Related content Access View Constraints View Documents Perspective","title":"Documents View"},{"location":"development/ide/views/extensions/","text":"Extensions View The Extensions view lists all defined extensions and extension points through *.extension and *.extensionpoint descriptor. More info about the type of the artifacts you can find here","title":"Extensions"},{"location":"development/ide/views/extensions/#extensions-view","text":"The Extensions view lists all defined extensions and extension points through *.extension and *.extensionpoint descriptor. More info about the type of the artifacts you can find here","title":"Extensions View"},{"location":"development/ide/views/git/","text":"Git View The Git view enables you to perform simple Git operations such as cloning a repository to a workspace, pulling changes, and pushing commits. 
The user can create, manage, and switch between multiple workspaces through the Workspace menu. Related content Console view Staging view History view","title":"Git"},{"location":"development/ide/views/git/#git-view","text":"The Git view enables you to perform simple Git operations such as cloning a repository to a workspace, pulling changes, and pushing commits. The user can create, manage, and switch between multiple workspaces through the Workspace menu. Related content Console view Staging view History view","title":"Git View"},{"location":"development/ide/views/history/","text":"History View The History view provides a commit history record that includes ID, message, author, and time of each commit.","title":"History"},{"location":"development/ide/views/history/#history-view","text":"The History view provides a commit history record that includes ID, message, author, and time of each commit.","title":"History View"},{"location":"development/ide/views/import/","text":"Import View The Import view enables the user to upload a *.zip file, containing one or more projects, to the selected Workspace . The view includes a progress bar for navigation of the process. The user can manage and switch between multiple workspaces through the Workspace menu.","title":"Import"},{"location":"development/ide/views/import/#import-view","text":"The Import view enables the user to upload a *.zip file, containing one or more projects, to the selected Workspace . The view includes a progress bar for navigation of the process. The user can manage and switch between multiple workspaces through the Workspace menu.","title":"Import View"},{"location":"development/ide/views/jobs/","text":"Jobs View The Jobs view lists all registered custom jobs scheduled for execution in a *.job file. 
More info about the type of the artifacts you can find here","title":"Jobs"},{"location":"development/ide/views/jobs/#jobs-view","text":"The Jobs view lists all registered custom jobs scheduled for execution in a *.job file. More info about the type of the artifacts you can find here","title":"Jobs View"},{"location":"development/ide/views/listeners/","text":"Listeners View The Listeners view shows all message listeners registered by the *.listener files. Their type depends on the type of the message hub - topic or queue. More info about the type of the artifacts you can find in Artifacts .","title":"Listeners"},{"location":"development/ide/views/listeners/#listeners-view","text":"The Listeners view shows all message listeners registered by the *.listener files. Their type depends on the type of the message hub - topic or queue. More info about the type of the artifacts you can find in Artifacts .","title":"Listeners View"},{"location":"development/ide/views/logs/","text":"Logs View The Logs view lists all available log files.","title":"Logs"},{"location":"development/ide/views/logs/#logs-view","text":"The Logs view lists all available log files.","title":"Logs View"},{"location":"development/ide/views/plugins/","text":"Plugins Info The Plugins view is currently in an initial stage of development and does not have all features. Overview The Plugins view contains a list of plugins that you can install in Dirigible. Each plugin name is a link that leads to a page containing more information about it. Installing a plugin Once you have a running Eclipse Dirigible instance, you can navigate to the Plugins view: Choose Window \u2192 Show View \u2192 Plugins . 
Install the plugin.","title":"Plugins"},{"location":"development/ide/views/plugins/#plugins","text":"Info The Plugins view is currently in an initial stage of development and does not have all features.","title":"Plugins"},{"location":"development/ide/views/plugins/#overview","text":"The Plugins view contains a list of plugins that you can install in Dirigible. Each plugin name is a link that leads to a page containing more information about it.","title":"Overview"},{"location":"development/ide/views/plugins/#installing-a-plugin","text":"Once you have a running Eclipse Dirigible instance, you can navigate to the Plugins view: Choose Window \u2192 Show View \u2192 Plugins . Install the plugin.","title":"Installing a plugin"},{"location":"development/ide/views/preview/","text":"Preview View The Preview view displays the result of executing the selected file. It refreshes automatically during Workspace change events e.g. Save.","title":"Previews"},{"location":"development/ide/views/preview/#preview-view","text":"The Preview view displays the result of executing the selected file. It refreshes automatically during Workspace change events e.g. Save.","title":"Preview View"},{"location":"development/ide/views/registry/","text":"Registry View Technically, the Registry is a space within the Repository where all the published artifacts are placed. Caution Editing of the file contents via the Registry perspective is not recommended as it can lead to inconsistencies!","title":"Registry"},{"location":"development/ide/views/registry/#registry-view","text":"Technically, the Registry is a space within the Repository where all the published artifacts are placed. Caution Editing of the file contents via the Registry perspective is not recommended as it can lead to inconsistencies!","title":"Registry View"},{"location":"development/ide/views/repository/","text":"Repository View The Repository view gives access to the raw structure of the underlying Repository content. 
There you can inspect the project and folder structure, as well as the artifacts' content, at a low level. The view enables the user to create new collections and resources, to delete existing ones, or to export them. Caution Editing of the file contents via the Repository perspective is not recommended as it can lead to inconsistencies!","title":"Repository"},{"location":"development/ide/views/repository/#repository-view","text":"The Repository view gives access to the raw structure of the underlying Repository content. There you can inspect the project and folder structure, as well as the artifacts' content, at a low level. The view enables the user to create new collections and resources, to delete existing ones, or to export them. Caution Editing of the file contents via the Repository perspective is not recommended as it can lead to inconsistencies!","title":"Repository View"},{"location":"development/ide/views/resultview/","text":"Result View The Result view graphically shows you the result of a script executed via the SQL View , or the content of a table when you press Show Content in the Database view .","title":"Result"},{"location":"development/ide/views/resultview/#result-view","text":"The Result view graphically shows you the result of a script executed via the SQL View , or the content of a table when you press Show Content in the Database view .","title":"Result View"},{"location":"development/ide/views/roles/","text":"Roles View The Roles view lists all security roles defined in the roles descriptor *.roles . You can find more info about the artifact types in Artifacts .","title":"Roles"},{"location":"development/ide/views/roles/#roles-view","text":"The Roles view lists all security roles defined in the roles descriptor *.roles . You can find more info about the artifact types in Artifacts .","title":"Roles View"},{"location":"development/ide/views/search/","text":"Search View The Search view enables the user to make a free-text search in the selected workspace.
The user can switch between multiple workspaces through the Workspace menu.","title":"Search"},{"location":"development/ide/views/search/#search-view","text":"The Search view enables the user to make a free-text search in the selected workspace. The user can switch between multiple workspaces through the Workspace menu.","title":"Search View"},{"location":"development/ide/views/snapshot/","text":"Snapshot View The Snapshot view enables the user to upload the whole repository (including all users' Workspaces) and all registry public contents. It includes a progress bar for tracking the progress of the process.","title":"Snapshot"},{"location":"development/ide/views/snapshot/#snapshot-view","text":"The Snapshot view enables the user to upload the whole repository (including all users' Workspaces) and all registry public contents. It includes a progress bar for tracking the progress of the process.","title":"Snapshot View"},{"location":"development/ide/views/sql/","text":"SQL View The SQL view is one of the most powerful tools for database management. In the SQL console you can enter and execute SQL scripts, compliant with the underlying database system. Note Scripts are executed by pressing: Windows : Ctrl + X Mac: Cmd + X In Database View you can press the refresh button, and preview the data by selecting Show Content . You get the result of the execution in the Results view below.","title":"SQL"},{"location":"development/ide/views/sql/#sql-view","text":"The SQL view is one of the most powerful tools for database management. In the SQL console you can enter and execute SQL scripts, compliant with the underlying database system. Note Scripts are executed by pressing: Windows : Ctrl + X Mac: Cmd + X In Database View you can press the refresh button, and preview the data by selecting Show Content .
You get the result of the execution in the Results view below.","title":"SQL View"},{"location":"development/ide/views/staging/","text":"Staging View The Staging view provides a visual alternative to executing Git commands from a terminal. You can manage your locally changed files and prepare them for pushing to your remote repository. Unstaged Files - all files that you've changed are listed here. However, these changes aren't yet ready to be bound to a commit. For this purpose, you have to stage them. Use the downward arrow to move files from unstaged to staged state. Staged Files - all files that you've changed and staged are listed here. These are ready to be bound to a commit. Use the upward arrow to move files from staged back to unstaged state. Commit Message - provide details about the changes included in your commit. Username , Password , Email - provide your authentication credentials. Commit & Push - commit your changes and directly push them to your remote repository. Commit - commit your changes without pushing them. This way, you can organize your changes in several commits and push them together.","title":"Staging"},{"location":"development/ide/views/staging/#staging-view","text":"The Staging view provides a visual alternative to executing Git commands from a terminal. You can manage your locally changed files and prepare them for pushing to your remote repository. Unstaged Files - all files that you've changed are listed here. However, these changes aren't yet ready to be bound to a commit. For this purpose, you have to stage them. Use the downward arrow to move files from unstaged to staged state. Staged Files - all files that you've changed and staged are listed here. These are ready to be bound to a commit. Use the upward arrow to move files from staged back to unstaged state. Commit Message - provide details about the changes included in your commit. Username , Password , Email - provide your authentication credentials.
Commit & Push - commit your changes and directly push them to your remote repository. Commit - commit your changes without pushing them. This way, you can organize your changes in several commits and push them together.","title":"Staging View"},{"location":"development/ide/views/terminal/","text":"Terminal View Via the Terminal view, you can execute OS commands. Examples: Linux OS: ls -al Microsoft Windows OS: dir","title":"Terminal"},{"location":"development/ide/views/terminal/#terminal-view","text":"Via the Terminal view, you can execute OS commands. Examples: Linux OS: ls -al Microsoft Windows OS: dir","title":"Terminal View"},{"location":"development/ide/views/websockets/","text":"Web Sockets View The Web Sockets view lists all the connections that Dirigible has currently established with other ports. The different properties and sections are: Location Endpoint Handler Created Creator","title":"Web Sockets"},{"location":"development/ide/views/websockets/#web-sockets-view","text":"The Web Sockets view lists all the connections that Dirigible has currently established with other ports. The different properties and sections are: Location Endpoint Handler Created Creator","title":"Web Sockets View"},{"location":"development/ide/views/workspace/","text":"Workspace View The Workspace is the developer's place where he/she creates and manages the application artifacts. The first-level citizens of the workspace are the projects. With Eclipse Dirigible the users can create, manage, and switch between multiple workspaces through the Workspace view. Each project can contain multiple folders and files (artifacts). The new template-based project and artifact scaffolding generator features are worth mentioning. The projects' file organization is now non-normative and entirely up to the preferences of the users. The IDE supports multiple editors registered for different file (MIME) types.
More than one editor can be registered for one file type, and in this case an \"Open with\u2026\" context menu entry is rendered for the user to select which one to use. The Workspace explorer displays a standard view on the projects in your workspace . It shows the folder structure along with the files. There is a context menu assigned to the project node: Via this context menu, you can create new artifacts such as: Database Table Database View Database Schema Model Entity Data Model JavaScript Service HTML5 Page Scheduled Job Message Listener Business Process Model Access Constraints Roles Definitions or just regular ones: File Folder You can find more info about the artifact types here . When selecting an artifact, you can use the \"Open\" or \"Open With\" actions to load its content in the corresponding editor, for example, Monaco Editor . A single user can have multiple workspaces, containing different sets of projects. The management of the artifacts, i.e. the projects, can be done via the views and editors in the Workbench Perspective .","title":"Workspace"},{"location":"development/ide/views/workspace/#workspace-view","text":"The Workspace is the developer's place where he/she creates and manages the application artifacts. The first-level citizens of the workspace are the projects. With Eclipse Dirigible the users can create, manage, and switch between multiple workspaces through the Workspace view. Each project can contain multiple folders and files (artifacts). The new template-based project and artifact scaffolding generator features are worth mentioning. The projects' file organization is now non-normative and entirely up to the preferences of the users. The IDE supports multiple editors registered for different file (MIME) types. More than one editor can be registered for one file type, and in this case an \"Open with\u2026\" context menu entry is rendered for the user to select which one to use.
The Workspace explorer displays a standard view on the projects in your workspace . It shows the folder structure along with the files. There is a context menu assigned to the project node: Via this context menu, you can create new artifacts such as: Database Table Database View Database Schema Model Entity Data Model JavaScript Service HTML5 Page Scheduled Job Message Listener Business Process Model Access Constraints Roles Definitions or just regular ones: File Folder You can find more info about the artifact types here . When selecting an artifact, you can use the \"Open\" or \"Open With\" actions to load its content in the corresponding editor, for example, Monaco Editor . A single user can have multiple workspaces, containing different sets of projects. The management of the artifacts, i.e. the projects, can be done via the views and editors in the Workbench Perspective .","title":"Workspace View"},{"location":"overview/","text":"The Eclipse Dirigible Project Eclipse Dirigible is an open source project that provides Integrated Development Environment as a Service (IDEaaS), as well as integrated runtime execution engines. The applications created with Eclipse Dirigible comply with the Dynamic Applications concept and structure. The main project goal is to provide all required capabilities needed to develop and run end-to-end vertical applications in the cloud in the shortest time ever. The environment itself runs directly in a browser and therefore does not require additional downloads and installations. It packs all the needed components, which makes it a self-contained and well-integrated software stack that can be deployed on any Java-based Web server, such as Tomcat, Jetty, JBoss, etc. The Eclipse Dirigible project came out of an internal SAP initiative to address the extension and adaptation use cases related to SOA and Enterprise Services. On one hand, the lessons learned so far from the standard tools and approaches were applied in this project.
On the other hand, features aligned with the most recent technologies and architectural patterns related to Web 2.0 and HTML5 were added. This made it complete enough to be used as the only environment needed for building and running applications in the cloud. From the beginning, the project follows the principles of Simplicity, Openness, Agility, Completeness, and Perfection, which provide a sustainable environment where maximum impact is achieved with minimal effort. Features section describes in detail what is included in the project. Concepts section gives you an overview of the internals and the chosen patterns. Samples section shows you how to start and build your first dynamic Web application in seconds.","title":"The Eclipse Dirigible Project"},{"location":"overview/#the-eclipse-dirigible-project","text":"Eclipse Dirigible is an open source project that provides Integrated Development Environment as a Service (IDEaaS), as well as integrated runtime execution engines. The applications created with Eclipse Dirigible comply with the Dynamic Applications concept and structure. The main project goal is to provide all required capabilities needed to develop and run end-to-end vertical applications in the cloud in the shortest time ever. The environment itself runs directly in a browser and therefore does not require additional downloads and installations. It packs all the needed components, which makes it a self-contained and well-integrated software stack that can be deployed on any Java-based Web server, such as Tomcat, Jetty, JBoss, etc. The Eclipse Dirigible project came out of an internal SAP initiative to address the extension and adaptation use cases related to SOA and Enterprise Services. On one hand, the lessons learned so far from the standard tools and approaches were applied in this project. On the other hand, features aligned with the most recent technologies and architectural patterns related to Web 2.0 and HTML5 were added.
This made it complete enough to be used as the only environment needed for building and running applications in the cloud. From the beginning, the project follows the principles of Simplicity, Openness, Agility, Completeness, and Perfection, which provide a sustainable environment where maximum impact is achieved with minimal effort. Features section describes in detail what is included in the project. Concepts section gives you an overview of the internals and the chosen patterns. Samples section shows you how to start and build your first dynamic Web application in seconds.","title":"The Eclipse Dirigible Project"},{"location":"overview/architecture/","text":"Architecture The Eclipse Dirigible architecture follows the well-proven principles of simplicity and scalability in the classical service-oriented architecture. The components are separated between the design time (definition work, modeling, scripting) and the runtime (execution of services, content provisioning, and monitoring). The transition between design time and runtime is achieved with a repository component. The only linking part is the content itself. At design time, the programmers and designers use the Web-based integrated development environment Web IDE . This tooling is based on the most popular client-side JavaScript framework - AngularJS, as well as Bootstrap for theming and GoldenLayout for window management. The runtime components provide the cloud application after you create it. The underlying technology platform is a Java-Web-Profile-compliant application server (such as Tomcat). On top are the Eclipse Dirigible containers for service execution. Depending on the scripting language and purpose, they can be: GraalVM JS Mylyn Lucene Quartz ActiveMQ Flowable Mustache Chemistry The runtime can be scaled independently from the design time and can be deployed without the design time at all (for productive landscapes).
Depending on the target cloud platform, you can integrate the services provided by the underlying technology platform in Eclipse Dirigible.","title":"Architecture"},{"location":"overview/architecture/#architecture","text":"The Eclipse Dirigible architecture follows the well-proven principles of simplicity and scalability in the classical service-oriented architecture. The components are separated between the design time (definition work, modeling, scripting) and the runtime (execution of services, content provisioning, and monitoring). The transition between design time and runtime is achieved with a repository component. The only linking part is the content itself. At design time, the programmers and designers use the Web-based integrated development environment Web IDE . This tooling is based on the most popular client-side JavaScript framework - AngularJS, as well as Bootstrap for theming and GoldenLayout for window management. The runtime components provide the cloud application after you create it. The underlying technology platform is a Java-Web-Profile-compliant application server (such as Tomcat). On top are the Eclipse Dirigible containers for service execution. Depending on the scripting language and purpose, they can be: GraalVM JS Mylyn Lucene Quartz ActiveMQ Flowable Mustache Chemistry The runtime can be scaled independently from the design time and can be deployed without the design time at all (for productive landscapes). Depending on the target cloud platform, you can integrate the services provided by the underlying technology platform in Eclipse Dirigible.","title":"Architecture"},{"location":"overview/credits/","text":"Credits and Special Thanks We would like to say a big THANK YOU!
to all the open source projects that we use as components of our platform: GraalJS Mylyn CXF Derby Commons HttpClient Xerces Xalan WS Log4j Batik Velocity Quartz Spring Framework StaX Gson Antlr Hamcrest wsdl4j Slf4j jsoap ICU Mockito AOP Alliance jQuery Bootstrap AngularJS GoldenLayout Flowable Monaco Xtermjs ttyd acorn MkDocs Material for MkDocs unDraw and those who boosted our productivity in the past versions: Rhino Eclipse Equinox Eclipse OSGi Remote Application Platform Eclipse Orion Camel Ant Geronimo Felix JUnit Avalon JAF jRuby ACE Editor ASM Woodstox Jettison Groovy CyberNeko HTML EZMorph JCraft JLine","title":"Credits"},{"location":"overview/credits/#credits-and-special-thanks","text":"We would like to say a big THANK YOU! to all the open source projects that we use as components of our platform: GraalJS Mylyn CXF Derby Commons HttpClient Xerces Xalan WS Log4j Batik Velocity Quartz Spring Framework StaX Gson Antlr Hamcrest wsdl4j Slf4j jsoap ICU Mockito AOP Alliance jQuery Bootstrap AngularJS GoldenLayout Flowable Monaco Xtermjs ttyd acorn MkDocs Material for MkDocs unDraw and those who boosted our productivity in the past versions: Rhino Eclipse Equinox Eclipse OSGi Remote Application Platform Eclipse Orion Camel Ant Geronimo Felix JUnit Avalon JAF jRuby ACE Editor ASM Woodstox Jettison Groovy CyberNeko HTML EZMorph JCraft JLine","title":"Credits and Special Thanks"},{"location":"overview/editors-modelers/","text":"Editors & Modelers Editors List Monaco - the editor that powers VS Code . Modelers List Entity Data Modeler - design a domain model. Database Schema Modeler - design a database schema. BPMN Modeler - design a business process.
Form Designer - design a Web form.","title":"Editors & Modelers"},{"location":"overview/editors-modelers/#editors-modelers","text":"","title":"Editors & Modelers"},{"location":"overview/editors-modelers/#editors-list","text":"Monaco - the editor that powers VS Code .","title":"Editors List"},{"location":"overview/editors-modelers/#modelers-list","text":"Entity Data Modeler - design a domain model. Database Schema Modeler - design a database schema. BPMN Modeler - design a business process. Form Designer - design a Web form.","title":"Modelers List"},{"location":"overview/engines/","text":"Engines Engines List Javascript GraalVM JS - a Javascript module based on the GraalVM JS engine. Web - serving the static content via the underlying web container's capabilities e.g. Apache Tomcat . Wiki Markdown - a Wiki engine supporting the Markdown markup language, using the underlying Mylyn framework. BPM - an engine supporting the BPMN specification - Flowable . OData - expose OData services from database tables/views. Command - execute shell commands and bash scripts. Deprecated Javascript Rhino - a Javascript module based on the Mozilla Rhino engine. Javascript Nashorn - a Javascript module based on the built-in Java Nashorn engine. Javascript V8 - a Javascript module based on the Chrome V8 engine.","title":"Engines"},{"location":"overview/engines/#engines","text":"","title":"Engines"},{"location":"overview/engines/#engines-list","text":"Javascript GraalVM JS - a Javascript module based on the GraalVM JS engine. Web - serving the static content via the underlying web container's capabilities e.g. Apache Tomcat . Wiki Markdown - a Wiki engine supporting the Markdown markup language, using the underlying Mylyn framework. BPM - an engine supporting the BPMN specification - Flowable . OData - expose OData services from database tables/views.
Command - execute shell commands and bash scripts.","title":"Engines List"},{"location":"overview/engines/#deprecated","text":"Javascript Rhino - a Javascript module based on the Mozilla Rhino engine. Javascript Nashorn - a Javascript module based on the built-in Java Nashorn engine. Javascript V8 - a Javascript module based on the Chrome V8 engine.","title":"Deprecated"},{"location":"overview/faq/","text":"If you have a question that is not covered here, but it should be, please let us know . Concepts In-System Development In-System Development is a programming model used when you work directly on a live system. Avoid side-effects of a simulated (local) environment by working on a live system. Access live data via the same channel which will be used in production. All the dependencies and integrations are in place as they will be in production. Shortest development turn-around time. Short life-cycle management process. Vertical Scenarios & Horizontal Scaling Covering end-to-end scenarios including all the application layers from an architecture perspective, as well as all the development process phases from a project management perspective. All or nothing \u2013 partial doesn't count. Equal runtime instances based on a single content package for simple and reliable management. Content-Centric & Centralized Repository All application artifacts are in a single repository. Operational repository vs SCM repository. During the development process, an IO-optimized repository is used. After the code is ready, it is committed to SCM - a version-, inspection-, and support-optimized repository. Simple life-cycle management and transport. Workspace, Public Registry separation based on the development life-cycle phases. Dynamic Languages Perfect match to Dynamic Applications - built for change. Can interpret (rather than compile) the execution of tasks. Existing smooth integration within the web servers. No restart required.
Java is used for the core components of the platform, while JavaScript is for the application business logic (the glue code). Injected Services Available out-of-the-box for developers \u2013 request, response, datasource, http, CMIS storage, BPMN engine, wiki, indexer, user, etc. Standardized API for cloud developers. Implementations in different languages can be integrated via the extension point. Different providers' implementations can be exposed to developers on their cloud. Integration Services Why are integration services part of the core? Cloud applications are usually extensions to packaged software (on-premise or on-demand). Re-use of third-party services is very common in this context. Replication use-case - major scenario for on-premise to on-demand cross-platform applications. Scheduled jobs as asynchronous activities are usually needed. Semantic separation of integration and orchestration services from the other general purpose services. Extensibility Why is the extensibility important and for whom? Software vendor's code vs customer's specific extension's code. Update and Upgrade issues. Business agility depends on the process change-ability. Bilateral extension-points and extensions descriptors. Web IDE Why does it look like Eclipse in a web browser? Why not a more webby style? Lower barrier for Eclipse developers. Overall experience comfortable for developers, proven for years from on-premise tools. Use of Resource-like APIs and concepts. There are some themes you can choose from the menu for a more \"webby\" look and feel. Decisions GraalJS Why GraalJS ? What about Rhino, Nashorn and V8? Mature engine with the best performance. Built-in debugger with simple API. Possibility to invoke standard Java objects directly, which is not recommended of course. Angular, Bootstrap & GoldenLayout Why did we move from RAP to the Angular, Bootstrap, and GoldenLayout web frameworks?
RAP is an Eclipse framework providing a rendering of the user interface for standard SWT/JFace widgets remotely, e.g. in a browser. For us, it brings the following: RAP is a mature framework and depends on a reliable API, but it is not so attractive for pure web developers (HTML, JavaScript, etc.). RAP is a stable framework with great support, but the same could be said for Angular 1.x and Bootstrap 3.x. RAP relies on the standard modularization \u2013 OSGi, plugins, but comes with the complexity of Maven, Tycho, OSGi, Orbit, etc. integration. In RAP, developers can write mostly in pure Java with all the benefits it brings by itself, but for web developers it turns out this is not a benefit, but a drawback. In RAP, one can have single-sourced components - reuse of existing functionality written as Eclipse plugins, which has never happened in reality. RAP has the possibility to integrate non-Java modules as well (pure client-side HTML and JavaScript) via the browser component, but it is much more complex than pure web coding. JSON Models Why JSON for models? JSON is a very simple data exchange format. We have chosen it as the standard format for all the models. Simple enough and human readable/writable. Supported by mature frameworks for parsing/serializing. Quite popular and proven in the web applications context. Flat Data Models Why flat data models? Proven by many business applications for years. Straightforward implementation on a relational database. Easy to be understood and used by the developers. Tools for it are also simple and easy to use. REST Why REST instead of server-side generation? We leverage the use of the REST paradigm for the cloud applications created with the toolkit. There are quite enough reasons for this, already well described in blogs related to Web 2.0. Clean separation of the data services from the user interface. Independent development of both, including easy mocking. Possibility of reuse and/or composition of services in different user interfaces.
Possibility of UI-less integration if needed. Better operations and support. Publish Why Publish? Developers can work safely on multiple workspaces. \"Publish\" transfers the artifacts to the central registry space for public use. One-Time-Generation Why one-time-generation? It is enough to boost productivity in some cases. MDA is also supported via Entity Data Modeler. No OSGi OSGi is the only real modularization framework for Java, but comes with much more complexity than needed for our case. We moved from OSGi to build only simple Maven dependency management with Java Services and Guice for runtime injections for the backend. How to How to build my own Dirigible? It is a standard Maven-based project, so: git clone cd dirigible mvn clean install should work. How to add my own templates? It is quite easy - create a project with a layout similar to the ones from DirigibleLabs How to integrate my Java framework? It is even simpler - add it during the packaging phase as a regular Maven module to be packaged in the WAR or the executable JAR files. How to register my Enterprise JavaScript API? Once you make your core framework available as a Maven module packaged into your WAR file, you can implement your own Enterprise JavaScript API facade. How to integrate my non-Java framework? It depends on the particular framework. Usually, it is via the Command feature. Please, contact us in case of interest. How to integrate my dynamic language? There is an Engine API which can be implemented, as well as a REST service which can execute the code. Warning Please, contact us if you plan such an integration.","title":"FAQ"},{"location":"overview/faq/#concepts","text":"In-System Development In-System Development is a programming model used when you work directly on a live system. Avoid side-effects of a simulated (local) environment by working on a live system. Access live data via the same channel which will be used in production.
All the dependencies and integrations are in place as they will be in production. Shortest development turn-around time. Short life-cycle management process. Vertical Scenarios & Horizontal Scaling Covering end-to-end scenarios including all the application layers from an architecture perspective, as well as all the development process phases from a project management perspective. All or nothing \u2013 partial doesn't count. Equal runtime instances based on a single content package for simple and reliable management. Content-Centric & Centralized Repository All application artifacts are in a single repository. Operational repository vs SCM repository. During the development process, an IO-optimized repository is used. After the code is ready, it is committed to SCM - a version-, inspection-, and support-optimized repository. Simple life-cycle management and transport. Workspace, Public Registry separation based on the development life-cycle phases. Dynamic Languages Perfect match to Dynamic Applications - built for change. Can interpret (rather than compile) the execution of tasks. Existing smooth integration within the web servers. No restart required. Java is used for the core components of the platform, while JavaScript is for the application business logic (the glue code). Injected Services Available out-of-the-box for developers \u2013 request, response, datasource, http, CMIS storage, BPMN engine, wiki, indexer, user, etc. Standardized API for cloud developers. Implementations in different languages can be integrated via the extension point. Different providers' implementations can be exposed to developers on their cloud. Integration Services Why are integration services part of the core? Cloud applications are usually extensions to packaged software (on-premise or on-demand). Re-use of third-party services is very common in this context. Replication use-case - major scenario for on-premise to on-demand cross-platform applications.
Scheduled jobs as asynchronous activities are usually needed. Semantic separation of integration and orchestration services from the other general purpose services. Extensibility Why is the extensibility important and for whom? Software vendor's code vs customer's specific extension's code. Update and Upgrade issues. Business agility depends on the process change-ability. Bilateral extension-points and extensions descriptors. Web IDE Why does it look like Eclipse in a web browser? Why not a more webby style? Lower barrier for Eclipse developers. Overall experience comfortable for developers, proven for years from on-premise tools. Use of Resource-like APIs and concepts. There are some themes you can choose from the menu for a more \"webby\" look and feel.","title":"Concepts"},{"location":"overview/faq/#decisions","text":"GraalJS Why GraalJS ? What about Rhino, Nashorn and V8? Mature engine with the best performance. Built-in debugger with simple API. Possibility to invoke standard Java objects directly, which is not recommended of course. Angular, Bootstrap & GoldenLayout Why did we move from RAP to the Angular, Bootstrap, and GoldenLayout web frameworks? RAP is an Eclipse framework providing a rendering of the user interface for standard SWT/JFace widgets remotely, e.g. in a browser. For us, it brings the following: RAP is a mature framework and depends on a reliable API, but it is not so attractive for pure web developers (HTML, JavaScript, etc.). RAP is a stable framework with great support, but the same could be said for Angular 1.x and Bootstrap 3.x. RAP relies on the standard modularization \u2013 OSGi, plugins, but comes with the complexity of Maven, Tycho, OSGi, Orbit, etc. integration. In RAP, developers can write mostly in pure Java with all the benefits it brings by itself, but for web developers it turns out this is not a benefit, but a drawback. In RAP, one can have single-sourced components - reuse of existing functionality written as Eclipse plugins, which has never happened in reality.
RAP has the possibility to integrate non-Java modules as well (pure client-side HTML and JavaScript) via the browser component, but it is much more complex than pure web coding. JSON Models Why JSON for models? JSON is a very simple data exchange format. We have chosen it as the standard format for all the models. Simple enough and human readable/writable. Supported by mature frameworks for parsing/serialization. Quite popular and proven in the context of web applications. Flat Data Models Why flat data models? Proven by many business applications for years. Straightforward implementation on a relational database. Easy for developers to understand and use. Tools for it are also simple and easy to use. REST Why REST instead of server-side generation? We leverage the REST paradigm for the cloud applications created with the toolkit. There are plenty of reasons for this, already well described in blogs related to Web 2.0. Clean separation of the data services from the user interface. Independent development of both, including easy mocking. Possibility of reuse and/or composition of services in different user interfaces. Possibility of UI-less integration if needed. Better operations and support. Publish Why Publish? Developers can work safely on multiple workspaces. \"Publish\" transfers the artifacts to the central registry space for public use. One-Time-Generation Why one-time-generation? It is enough to boost productivity in some cases. MDA is also supported via the Entity Data Modeler. No OSGi OSGi is the only real modularization framework for Java, but it comes with much more complexity than needed for our case. We moved from OSGi to simple Maven dependency management, with Java Services and Guice for runtime injection in the backend.","title":"Decisions"},{"location":"overview/faq/#how-to","text":"How to build my own Dirigible? It is a standard Maven-based project, so: git clone , cd dirigible , mvn clean install should work. How to add my own templates? 
It is quite easy - create a project with a layout similar to the ones from DirigibleLabs How to integrate my Java framework? It is even simpler - add it during the packaging phase as a regular Maven module to be packaged in the WAR or the executable JAR files. How to register my Enterprise JavaScript API? Once you make your core framework available as a Maven module packaged into your WAR file, you can implement your own Enterprise JavaScript API facade. How to integrate my non-Java framework? It depends on the particular framework. Usually, it is via the Command feature. Please contact us in case of interest. How to integrate my dynamic language? There is an Engine API which can be implemented, as well as a REST service which can execute the code. Warning Please contact us if you plan such an integration.","title":"How to"},{"location":"overview/features/","text":"Features Note The feature set listed below contains only the major part of what is currently available. For more insights on what can be done with Eclipse Dirigible, we recommend trying it out . Data Structures Creation of table model (JSON formatted *.table descriptor) and actual creation of the corresponding database table during publishing. Creation of view model (JSON formatted *.view descriptor) and actual creation of the corresponding database view during publishing. Creation of delimiter-separated values ( *.append , *.update , *.delete , *.replace ) data files and populating the corresponding database table during publishing. Automatic altering of existing tables from the models on compatible changes (new columns added). Modeling of the database schema ( *.dsm and *.schema ) files and creation of the tables, views, and constraints during publishing. Scripting Services Support of the JavaScript language by using GraalVM JS as the runtime execution engine ( *.js ). Support for TypeScript services ( *.ts ). 
Support of strictly defined enterprise API for JavaScript to be used by the business application developers. Web Content Support of client-side Web related artifacts, such as HTML, CSS, JS, pictures, etc. Wiki Content Support of Markdown format for Wiki pages. Integration Services Support of listeners for messages from the built-in message bus ( *.listener ). Support of scheduled jobs as triggers for backend services invocation ( *.job ). Support of business processes defined in BPMN 2.0 and executed by the underlying BPM process engine ( *.bpmn ). Support of shell commands execution ( *.command ). Support of OData 2.0 ( *.odata ). Support of websockets ( *.websocket ). Mobile Applications Support of native mobile application development via Tabris.js . Extension Definitions Creation of extension points (JSON formatted descriptor - *.extensionpoint ). Creation of extensions by a given extension point (JSON formatted descriptor - *.extension ). Tooling Workbench perspective for full support of project management (New, Cut, Copy, Paste, Delete, Refresh, Import, Export, etc.) Database perspective for RDBMS management including SQL Console Enhanced code editor with highlight support for JavaScript, HTML, JSON, XML, etc. 
Preview view for easy testing of changes in Web, Wiki, and Scripting Services Configurable Logs view , which provides server-side logs and traces Lots of template-based wizards for creating new content and services Import and export of project content Documents perspective for import of binary files for external documents and pictures Repository perspective for low-level repository content management Debugger perspective for debugging backend JavaScript services Terminal perspective with the corresponding main view for execution of shell commands on the target instance's OS Modeling Modeling of database schema ( *.dsm and *.schema ) files with Database Schema Modeler Modeling of entity data model ( *.edm and *.model ) files with Entity Data Modeler Modeling of BPMN process ( *.bpmn ) files with BPMN Modeler Modeling of Web form layout ( *.form ) files with Form Designer Security Role-based access management for Web services as well as the document repository Security constraints model (JSON formatted *.access ) support Several predefined roles, which can be used out-of-the-box (Everyone, Administrator, Manager, PowerUser, User, ReadWrite, ReadOnly) Registry Publishing support - exposing the artifacts from the user's workspace publicly Auto-publishing support for better usability User interface for browsing and searching within the published content Separate lists of endpoints and viewers per type of services - JavaScript, Web, wiki, etc. Separate browse user interface for Web and wiki content","title":"Features"},{"location":"overview/features/#features","text":"Note The feature set listed below contains only the major part of what is currently available. For more insights on what can be done with Eclipse Dirigible, we recommend trying it out .","title":"Features"},{"location":"overview/features/#data-structures","text":"Creation of table model (JSON formatted *.table descriptor) and actual creation of the corresponding database table during publishing. 
Creation of view model (JSON formatted *.view descriptor) and actual creation of the corresponding database view during publishing. Creation of delimiter-separated values ( *.append , *.update , *.delete , *.replace ) data files and populating the corresponding database table during publishing. Automatic altering of existing tables from the models on compatible changes (new columns added). Modeling of the database schema ( *.dsm and *.schema ) files and creation of the tables, views, and constraints during publishing.","title":"Data Structures"},{"location":"overview/features/#scripting-services","text":"Support of the JavaScript language by using GraalVM JS as the runtime execution engine ( *.js ). Support for TypeScript services ( *.ts ). Support of strictly defined enterprise API for JavaScript to be used by the business application developers.","title":"Scripting Services"},{"location":"overview/features/#web-content","text":"Support of client-side Web related artifacts, such as HTML, CSS, JS, pictures, etc.","title":"Web Content"},{"location":"overview/features/#wiki-content","text":"Support of Markdown format for Wiki pages.","title":"Wiki Content"},{"location":"overview/features/#integration-services","text":"Support of listeners for messages from the built-in message bus ( *.listener ). Support of scheduled jobs as triggers for backend services invocation ( *.job ). Support of business processes defined in BPMN 2.0 and executed by the underlying BPM process engine ( *.bpmn ). Support of shell commands execution ( *.command ). Support of OData 2.0 ( *.odata ). Support of websockets ( *.websocket ).","title":"Integration Services"},{"location":"overview/features/#mobile-applications","text":"Support of native mobile application development via Tabris.js .","title":"Mobile Applications"},{"location":"overview/features/#extension-definitions","text":"Creation of extension points (JSON formatted descriptor - *.extensionpoint ). 
Creation of extensions by a given extension point (JSON formatted descriptor - *.extension ).","title":"Extension Definitions"},{"location":"overview/features/#tooling","text":"Workbench perspective for full support of project management (New, Cut, Copy, Paste, Delete, Refresh, Import, Export, etc.) Database perspective for RDBMS management including SQL Console Enhanced code editor with highlight support for JavaScript, HTML, JSON, XML, etc. Preview view for easy testing of changes in Web, Wiki, and Scripting Services Configurable Logs view , which provides server-side logs and traces Lots of template-based wizards for creating new content and services Import and export of project content Documents perspective for import of binary files for external documents and pictures Repository perspective for low-level repository content management Debugger perspective for debugging backend JavaScript services Terminal perspective with the corresponding main view for execution of shell commands on the target instance's OS","title":"Tooling"},{"location":"overview/features/#modeling","text":"Modeling of database schema ( *.dsm and *.schema ) files with Database Schema Modeler Modeling of entity data model ( *.edm and *.model ) files with Entity Data Modeler Modeling of BPMN process ( *.bpmn ) files with BPMN Modeler Modeling of Web form layout ( *.form ) files with Form Designer","title":"Modeling"},{"location":"overview/features/#security","text":"Role-based access management for Web services as well as the document repository Security constraints model (JSON formatted *.access ) support Several predefined roles, which can be used out-of-the-box (Everyone, Administrator, Manager, PowerUser, User, ReadWrite, ReadOnly)","title":"Security"},{"location":"overview/features/#registry","text":"Publishing support - exposing the artifacts from the user's workspace publicly Auto-publishing support for better usability User interface for browsing and searching within the published 
content Separate lists of endpoints and viewers per type of services - JavaScript, Web, wiki, etc. Separate browse user interface for Web and wiki content","title":"Registry"},{"location":"overview/license/","text":"License The Dirigible project source code base is provided under the Eclipse Public License - v 2.0 Eclipse Public License - v 2.0 THE ACCOMPANYING PROGRAM IS PROVIDED UNDER THE TERMS OF THIS ECLIPSE PUBLIC LICENSE (\"AGREEMENT\"). ANY USE, REPRODUCTION OR DISTRIBUTION OF THE PROGRAM CONSTITUTES RECIPIENT'S ACCEPTANCE OF THIS AGREEMENT. 1. DEFINITIONS \"Contribution\" means: a) in the case of the initial Contributor, the initial code and documentation distributed under this Agreement, and b) in the case of each subsequent Contributor: i) changes to the Program, and ii) additions to the Program; where such changes and/or additions to the Program originate from and are distributed by that particular Contributor. A Contribution 'originates' from a Contributor if it was added to the Program by such Contributor itself or anyone acting on such Contributor's behalf. Contributions do not include additions to the Program which: (i) are separate modules of software distributed in conjunction with the Program under their own license agreement, and (ii) are not derivative works of the Program. \"Contributor\" means any person or entity that distributes the Program. \"Licensed Patents\" mean patent claims licensable by a Contributor which are necessarily infringed by the use or sale of its Contribution alone or when combined with the Program. \"Program\" means the Contributions distributed in accordance with this Agreement. \"Recipient\" means anyone who receives the Program under this Agreement, including all Contributors. 2. 
GRANT OF RIGHTS a) Subject to the terms of this Agreement, each Contributor hereby grants Recipient a non-exclusive, worldwide, royalty-free copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, distribute and sublicense the Contribution of such Contributor, if any, and such derivative works, in source code and object code form. b) Subject to the terms of this Agreement, each Contributor hereby grants Recipient a non-exclusive, worldwide, royalty-free patent license under Licensed Patents to make, use, sell, offer to sell, import and otherwise transfer the Contribution of such Contributor, if any, in source code and object code form. This patent license shall apply to the combination of the Contribution and the Program if, at the time the Contribution is added by the Contributor, such addition of the Contribution causes such combination to be covered by the Licensed Patents. The patent license shall not apply to any other combinations which include the Contribution. No hardware per se is licensed hereunder. c) Recipient understands that although each Contributor grants the licenses to its Contributions set forth herein, no assurances are provided by any Contributor that the Program does not infringe the patent or other intellectual property rights of any other entity. Each Contributor disclaims any liability to Recipient for claims brought by any other entity based on infringement of intellectual property rights or otherwise. As a condition to exercising the rights and licenses granted hereunder, each Recipient hereby assumes sole responsibility to secure any other intellectual property rights needed, if any. For example, if a third party patent license is required to allow Recipient to distribute the Program, it is Recipient's responsibility to acquire that license before distributing the Program. 
d) Each Contributor represents that to its knowledge it has sufficient copyright rights in its Contribution, if any, to grant the copyright license set forth in this Agreement. 3. REQUIREMENTS A Contributor may choose to distribute the Program in object code form under its own license agreement, provided that: a) it complies with the terms and conditions of this Agreement; and b) its license agreement: i) effectively disclaims on behalf of all Contributors all warranties and conditions, express and implied, including warranties or conditions of title and non-infringement, and implied warranties or conditions of merchantability and fitness for a particular purpose; ii) effectively excludes on behalf of all Contributors all liability for damages, including direct, indirect, special, incidental and consequential damages, such as lost profits; iii) states that any provisions which differ from this Agreement are offered by that Contributor alone and not by any other party; and iv) states that source code for the Program is available from such Contributor, and informs licensees how to obtain it in a reasonable manner on or through a medium customarily used for software exchange. When the Program is made available in source code form: a) it must be made available under this Agreement; and b) a copy of this Agreement must be included with each copy of the Program. Contributors may not remove or alter any copyright notices contained within the Program. Each Contributor must identify itself as the originator of its Contribution, if any, in a manner that reasonably allows subsequent Recipients to identify the originator of the Contribution. 4. COMMERCIAL DISTRIBUTION Commercial distributors of software may accept certain responsibilities with respect to end users, business partners and the like. 
While this license is intended to facilitate the commercial use of the Program, the Contributor who includes the Program in a commercial product offering should do so in a manner which does not create potential liability for other Contributors. Therefore, if a Contributor includes the Program in a commercial product offering, such Contributor (\"Commercial Contributor\") hereby agrees to defend and indemnify every other Contributor (\"Indemnified Contributor\") against any losses, damages and costs (collectively \"Losses\") arising from claims, lawsuits and other legal actions brought by a third party against the Indemnified Contributor to the extent caused by the acts or omissions of such Commercial Contributor in connection with its distribution of the Program in a commercial product offering. The obligations in this section do not apply to any claims or Losses relating to any actual or alleged intellectual property infringement. In order to qualify, an Indemnified Contributor must: a) promptly notify the Commercial Contributor in writing of such claim, and b) allow the Commercial Contributor to control, and cooperate with the Commercial Contributor in, the defense and any related settlement negotiations. The Indemnified Contributor may participate in any such claim at its own expense. For example, a Contributor might include the Program in a commercial product offering, Product X. That Contributor is then a Commercial Contributor. If that Commercial Contributor then makes performance claims, or offers warranties related to Product X, those performance claims and warranties are such Commercial Contributor's responsibility alone. Under this section, the Commercial Contributor would have to defend claims against the other Contributors related to those performance claims and warranties, and if a court requires any other Contributor to pay any damages as a result, the Commercial Contributor must pay those damages. 5. 
NO WARRANTY EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, THE PROGRAM IS PROVIDED ON AN \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. Each Recipient is solely responsible for determining the appropriateness of using and distributing the Program and assumes all risks associated with its exercise of rights under this Agreement , including but not limited to the risks and costs of program errors, compliance with applicable laws, damage to or loss of data, programs or equipment, and unavailability or interruption of operations. 6. DISCLAIMER OF LIABILITY EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, NEITHER RECIPIENT NOR ANY CONTRIBUTORS SHALL HAVE ANY LIABILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING WITHOUT LIMITATION LOST PROFITS), HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OR DISTRIBUTION OF THE PROGRAM OR THE EXERCISE OF ANY RIGHTS GRANTED HEREUNDER, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. 7. GENERAL If any provision of this Agreement is invalid or unenforceable under applicable law, it shall not affect the validity or enforceability of the remainder of the terms of this Agreement, and without further action by the parties hereto, such provision shall be reformed to the minimum extent necessary to make such provision valid and enforceable. 
If Recipient institutes patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Program itself (excluding combinations of the Program with other software or hardware) infringes such Recipient's patent(s), then such Recipient's rights granted under Section 2(b) shall terminate as of the date such litigation is filed. All Recipient's rights under this Agreement shall terminate if it fails to comply with any of the material terms or conditions of this Agreement and does not cure such failure in a reasonable period of time after becoming aware of such noncompliance. If all Recipient's rights under this Agreement terminate, Recipient agrees to cease use and distribution of the Program as soon as reasonably practicable. However, Recipient's obligations under this Agreement and any licenses granted by Recipient relating to the Program shall continue and survive. Everyone is permitted to copy and distribute copies of this Agreement, but in order to avoid inconsistency the Agreement is copyrighted and may only be modified in the following manner. The Agreement Steward reserves the right to publish new versions (including revisions) of this Agreement from time to time. No one other than the Agreement Steward has the right to modify this Agreement. The Eclipse Foundation is the initial Agreement Steward. The Eclipse Foundation may assign the responsibility to serve as the Agreement Steward to a suitable separate entity. Each new version of the Agreement will be given a distinguishing version number. The Program (including Contributions) may always be distributed subject to the version of the Agreement under which it was received. In addition, after a new version of the Agreement is published, Contributor may elect to distribute the Program (including its Contributions) under the new version. 
Except as expressly stated in Sections 2(a) and 2(b) above, Recipient receives no rights or licenses to the intellectual property of any Contributor under this Agreement, whether expressly, by implication, estoppel or otherwise. All rights in the Program not expressly granted under this Agreement are reserved. This Agreement is governed by the laws of the State of New York and the intellectual property laws of the United States of America. No party to this Agreement will bring a legal action under this Agreement more than one year after the cause of action arose. Each party waives its rights to a jury trial in any resulting litigation.","title":"License"},{"location":"overview/license/#license","text":"The Dirigible project source code base is provided under the Eclipse Public License - v 2.0","title":"License"},{"location":"overview/license/#eclipse-public-license-v-20","text":"THE ACCOMPANYING PROGRAM IS PROVIDED UNDER THE TERMS OF THIS ECLIPSE PUBLIC LICENSE (\"AGREEMENT\"). ANY USE, REPRODUCTION OR DISTRIBUTION OF THE PROGRAM CONSTITUTES RECIPIENT'S ACCEPTANCE OF THIS AGREEMENT.","title":"Eclipse Public License - v 2.0"},{"location":"overview/license/#1-definitions","text":"\"Contribution\" means: a) in the case of the initial Contributor, the initial code and documentation distributed under this Agreement, and b) in the case of each subsequent Contributor: i) changes to the Program, and ii) additions to the Program; where such changes and/or additions to the Program originate from and are distributed by that particular Contributor. A Contribution 'originates' from a Contributor if it was added to the Program by such Contributor itself or anyone acting on such Contributor's behalf. Contributions do not include additions to the Program which: (i) are separate modules of software distributed in conjunction with the Program under their own license agreement, and (ii) are not derivative works of the Program. 
\"Contributor\" means any person or entity that distributes the Program. \"Licensed Patents\" mean patent claims licensable by a Contributor which are necessarily infringed by the use or sale of its Contribution alone or when combined with the Program. \"Program\" means the Contributions distributed in accordance with this Agreement. \"Recipient\" means anyone who receives the Program under this Agreement, including all Contributors.","title":"1. DEFINITIONS"},{"location":"overview/license/#2-grant-of-rights","text":"a) Subject to the terms of this Agreement, each Contributor hereby grants Recipient a non-exclusive, worldwide, royalty-free copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, distribute and sublicense the Contribution of such Contributor, if any, and such derivative works, in source code and object code form. b) Subject to the terms of this Agreement, each Contributor hereby grants Recipient a non-exclusive, worldwide, royalty-free patent license under Licensed Patents to make, use, sell, offer to sell, import and otherwise transfer the Contribution of such Contributor, if any, in source code and object code form. This patent license shall apply to the combination of the Contribution and the Program if, at the time the Contribution is added by the Contributor, such addition of the Contribution causes such combination to be covered by the Licensed Patents. The patent license shall not apply to any other combinations which include the Contribution. No hardware per se is licensed hereunder. c) Recipient understands that although each Contributor grants the licenses to its Contributions set forth herein, no assurances are provided by any Contributor that the Program does not infringe the patent or other intellectual property rights of any other entity. Each Contributor disclaims any liability to Recipient for claims brought by any other entity based on infringement of intellectual property rights or otherwise. 
As a condition to exercising the rights and licenses granted hereunder, each Recipient hereby assumes sole responsibility to secure any other intellectual property rights needed, if any. For example, if a third party patent license is required to allow Recipient to distribute the Program, it is Recipient's responsibility to acquire that license before distributing the Program. d) Each Contributor represents that to its knowledge it has sufficient copyright rights in its Contribution, if any, to grant the copyright license set forth in this Agreement.","title":"2. GRANT OF RIGHTS"},{"location":"overview/license/#3-requirements","text":"A Contributor may choose to distribute the Program in object code form under its own license agreement, provided that: a) it complies with the terms and conditions of this Agreement; and b) its license agreement: i) effectively disclaims on behalf of all Contributors all warranties and conditions, express and implied, including warranties or conditions of title and non-infringement, and implied warranties or conditions of merchantability and fitness for a particular purpose; ii) effectively excludes on behalf of all Contributors all liability for damages, including direct, indirect, special, incidental and consequential damages, such as lost profits; iii) states that any provisions which differ from this Agreement are offered by that Contributor alone and not by any other party; and iv) states that source code for the Program is available from such Contributor, and informs licensees how to obtain it in a reasonable manner on or through a medium customarily used for software exchange. When the Program is made available in source code form: a) it must be made available under this Agreement; and b) a copy of this Agreement must be included with each copy of the Program. Contributors may not remove or alter any copyright notices contained within the Program. 
Each Contributor must identify itself as the originator of its Contribution, if any, in a manner that reasonably allows subsequent Recipients to identify the originator of the Contribution.","title":"3. REQUIREMENTS"},{"location":"overview/license/#4-commercial-distribution","text":"Commercial distributors of software may accept certain responsibilities with respect to end users, business partners and the like. While this license is intended to facilitate the commercial use of the Program, the Contributor who includes the Program in a commercial product offering should do so in a manner which does not create potential liability for other Contributors. Therefore, if a Contributor includes the Program in a commercial product offering, such Contributor (\"Commercial Contributor\") hereby agrees to defend and indemnify every other Contributor (\"Indemnified Contributor\") against any losses, damages and costs (collectively \"Losses\") arising from claims, lawsuits and other legal actions brought by a third party against the Indemnified Contributor to the extent caused by the acts or omissions of such Commercial Contributor in connection with its distribution of the Program in a commercial product offering. The obligations in this section do not apply to any claims or Losses relating to any actual or alleged intellectual property infringement. In order to qualify, an Indemnified Contributor must: a) promptly notify the Commercial Contributor in writing of such claim, and b) allow the Commercial Contributor to control, and cooperate with the Commercial Contributor in, the defense and any related settlement negotiations. The Indemnified Contributor may participate in any such claim at its own expense. For example, a Contributor might include the Program in a commercial product offering, Product X. That Contributor is then a Commercial Contributor. 
If that Commercial Contributor then makes performance claims, or offers warranties related to Product X, those performance claims and warranties are such Commercial Contributor's responsibility alone. Under this section, the Commercial Contributor would have to defend claims against the other Contributors related to those performance claims and warranties, and if a court requires any other Contributor to pay any damages as a result, the Commercial Contributor must pay those damages.","title":"4. COMMERCIAL DISTRIBUTION"},{"location":"overview/license/#5-no-warranty","text":"EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, THE PROGRAM IS PROVIDED ON AN \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. Each Recipient is solely responsible for determining the appropriateness of using and distributing the Program and assumes all risks associated with its exercise of rights under this Agreement , including but not limited to the risks and costs of program errors, compliance with applicable laws, damage to or loss of data, programs or equipment, and unavailability or interruption of operations.","title":"5. NO WARRANTY"},{"location":"overview/license/#6-disclaimer-of-liability","text":"EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, NEITHER RECIPIENT NOR ANY CONTRIBUTORS SHALL HAVE ANY LIABILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING WITHOUT LIMITATION LOST PROFITS), HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OR DISTRIBUTION OF THE PROGRAM OR THE EXERCISE OF ANY RIGHTS GRANTED HEREUNDER, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.","title":"6. 
DISCLAIMER OF LIABILITY"},{"location":"overview/license/#7-general","text":"If any provision of this Agreement is invalid or unenforceable under applicable law, it shall not affect the validity or enforceability of the remainder of the terms of this Agreement, and without further action by the parties hereto, such provision shall be reformed to the minimum extent necessary to make such provision valid and enforceable. If Recipient institutes patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Program itself (excluding combinations of the Program with other software or hardware) infringes such Recipient's patent(s), then such Recipient's rights granted under Section 2(b) shall terminate as of the date such litigation is filed. All Recipient's rights under this Agreement shall terminate if it fails to comply with any of the material terms or conditions of this Agreement and does not cure such failure in a reasonable period of time after becoming aware of such noncompliance. If all Recipient's rights under this Agreement terminate, Recipient agrees to cease use and distribution of the Program as soon as reasonably practicable. However, Recipient's obligations under this Agreement and any licenses granted by Recipient relating to the Program shall continue and survive. Everyone is permitted to copy and distribute copies of this Agreement, but in order to avoid inconsistency the Agreement is copyrighted and may only be modified in the following manner. The Agreement Steward reserves the right to publish new versions (including revisions) of this Agreement from time to time. No one other than the Agreement Steward has the right to modify this Agreement. The Eclipse Foundation is the initial Agreement Steward. The Eclipse Foundation may assign the responsibility to serve as the Agreement Steward to a suitable separate entity. Each new version of the Agreement will be given a distinguishing version number. 
The Program (including Contributions) may always be distributed subject to the version of the Agreement under which it was received. In addition, after a new version of the Agreement is published, Contributor may elect to distribute the Program (including its Contributions) under the new version. Except as expressly stated in Sections 2(a) and 2(b) above, Recipient receives no rights or licenses to the intellectual property of any Contributor under this Agreement, whether expressly, by implication, estoppel or otherwise. All rights in the Program not expressly granted under this Agreement are reserved. This Agreement is governed by the laws of the State of New York and the intellectual property laws of the United States of America. No party to this Agreement will bring a legal action under this Agreement more than one year after the cause of action arose. Each party waives its rights to a jury trial in any resulting litigation.","title":"7. GENERAL"},{"location":"overview/runtime-services/","text":"Runtime Services There are several REST services available at runtime, which can give you another communication channel with Dirigible containers.","title":"Runtime Services"},{"location":"overview/runtime-services/#runtime-services","text":"There are several REST services available at runtime, which can give you another communication channel with Dirigible containers.","title":"Runtime Services"},{"location":"setup/","text":"Setup in Tomcat Deploy Eclipse Dirigible in Apache Tomcat web container. In this case the built-in H2 database is used. Prerequisites Download the Tomcat binary . More information about how to deploy on Tomcat can be found here . JDK 11 or JDK 13 - OpenJDK versions can be found here . macOS Linux Windows Install ttyd : brew install ttyd Linux support is built-in. 
More info about ttyd can be found at: ttyd You may experience certain functional limitations if you decide to run the Web IDE locally on Windows using Tomcat: Limitations related to the Create symbolic links policy . Some tests in local builds of Dirigible may fail on Windows due to the same policy restriction. You may grant your user account access to create symbolic links by editing the policy: Go to (WIN + R) > gpedit.msc Navigate to: Computer Configuration -> Windows Settings -> Security Settings -> Local Policies -> User Rights Assignment -> Create Symbolic links . Add your Windows user account to the policy. Note : Editing this policy may make your machine vulnerable to symbolic link attacks as noted here . An alternative to the Windows setup is to follow the Setup as a Docker Image . Some parts of Dirigible are sensitive to line endings, and assume Unix-style newlines. Git on Windows may attempt to switch files to use Windows-style CR/LF line endings, which will cause problems when building and running Dirigible on Windows. In order to prevent this, git should be instructed to preserve the line endings of files. From a command prompt, type git config core.autocrlf . If the result is not false , change it with git config core.autocrlf false . Steps Download ROOT.war for Tomcat from: download.dirigible.io Note For local test & development purposes, we recommend the server-all distribution. Configure the Users store under $CATALINA_HOME/conf/tomcat-users.xml : Copy Dirigible's ROOT.war to the $TOMCAT/webapps folder. Configure the target Database setup, if needed: Local (H2) PostgreSQL MySQL HANA Sybase ASE No additional setup is needed.
Install postgresql on Linux (Debian-based) with: sudo apt-get update sudo apt-get install postgresql postgresql-contrib Create a default database for Eclipse Dirigible: sudo -i -u postgres createdb dirigible_database Create a system user for the Eclipse Dirigible database: psql dirigible_database create user dirigible_system with password 'dirigible1234'; grant all on database dirigible_database to dirigible_system; Datasource configuration: Download the postgresql JDBC driver version 4.1 from here . Copy the postgresql-*.jar file to the /lib directory. Set the environment variables: export DIRIGIBLE_DATABASE_PROVIDER=custom export DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=POSTGRES export DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT=POSTGRES export POSTGRES_DRIVER=org.postgresql.Driver export POSTGRES_URL=jdbc:postgresql://localhost:5432/dirigible_database export POSTGRES_USERNAME=dirigible_system export POSTGRES_PASSWORD=dirigible1234 export DIRIGIBLE_SCHEDULER_DATABASE_DRIVER=org.postgresql.Driver export DIRIGIBLE_SCHEDULER_DATABASE_URL=jdbc:postgresql://localhost:5432/dirigible_database export DIRIGIBLE_SCHEDULER_DATABASE_USER=dirigible_system export DIRIGIBLE_SCHEDULER_DATABASE_PASSWORD=dirigible1234 export DIRIGIBLE_SCHEDULER_DATABASE_DELEGATE=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate export DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE=true export DIRIGIBLE_FLOWABLE_USE_DEFAULT_DATABASE=true export DIRIGIBLE_DATABASE_NAMES_CASE_SENSITIVE=true Install mysql on Linux (Debian-based) with: sudo apt-get update sudo apt-get install mysql-server sudo mysql_install_db sudo /usr/bin/mysql_secure_installation Create the default database and a system user for the Eclipse Dirigible database: mysql -u root -p CREATE DATABASE dirigible_database; CREATE USER 'dirigible_system'@'localhost' IDENTIFIED BY 'dirigible1234'; GRANT ALL PRIVILEGES ON dirigible_database.* TO 'dirigible_system'@'localhost' WITH
GRANT OPTION; Datasource configuration: Download the mysql JDBC driver version 5.1 from here . Copy the mysql-*.jar file to the /lib directory. Open the file /conf/context.xml and add the following within the context: web.xml - make sure the initial parameter jndiDefaultDataSource is uncommented: jndiDefaultDataSource java:comp/env/jdbc/DefaultDB Also, the initial parameter jdbcAutoCommit must be set to false (by default). jdbcAutoCommit false The type of the datasource is jndi instead of local . defaultDataSourceType jndi Lastly, the resource reference for the datasource has to be uncommented. jdbc/DefaultDB javax.sql.DataSource Container Install HANA Express . Set the environment variables: export DIRIGIBLE_DATABASE_PROVIDER=custom export DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=HANA export DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT=HANA export HANA_DRIVER=com.sap.db.jdbc.Driver export HANA_URL=jdbc:sap://: export HANA_USERNAME= export HANA_PASSWORD= export DIRIGIBLE_SCHEDULER_DATABASE_DRIVER=com.sap.db.jdbc.Driver export DIRIGIBLE_SCHEDULER_DATABASE_URL=jdbc:sap://: export DIRIGIBLE_SCHEDULER_DATABASE_USER= export DIRIGIBLE_SCHEDULER_DATABASE_PASSWORD= export DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE=false export DIRIGIBLE_FLOWABLE_USE_DEFAULT_DATABASE=false Remember to replace the , , , placeholders. How to setup a test environment on Amazon: Select Image Size: t2.medium Security Group: TCP Custom, 5000 Download Sybase ASE Express from here . 
Transfer: scp -i dirigible-aws.pem ASE_Suite.linuxamd64.tgz ec2-user@:~ scp -i dirigible-aws.pem apache-tomcat-XXX.zip ec2-user@:~ scp -i dirigible-aws.pem ROOT.war ec2-user@:~ scp -i dirigible-aws.pem jdk-8u144-linux-x64.tar.gz ec2-user@:~ Prepare OS: sudo mkdir -p /opt/sybase sudo mkdir -p /var/sybase sudo groupadd sybase sudo useradd -g sybase -d /opt/sybase sybase sudo passwd sybase sudo chown sybase:sybase /opt/sybase sudo chown sybase:sybase /var/sybase Login: ssh ec2-user@ -i dirigible-aws.pem Setup: su - sybase mkdir install cd install cp /home/ec2-user/ASE_Suite.linuxamd64.tgz . tar -xvf ASE_Suite.linuxamd64.tgz ./setup.bin -i console Parameters: Choose Install Folder -> use: /opt/sybase Choose Install Set -> 1- Typical Software License Type Selection -> 2- Install Express Edition of SAP Adaptive Server Enterprise End-user License Agreement -> 1) All regions Configure New Servers -> [X] 1 - Configure new SAP ASE Configure Servers with Different User Account -> 2- No SAP ASE Name ASE160 System Administrator's Password ****** Enable SAP ASE for SAP ASE Cockpit monitoring false Technical user tech_user Technical user password ******** Host Name ip-.eu-central-1.comp Port Number 5000 Application Type Mixed (OLTP/DSS) Create sample databases false Page Size 4k Error Log /opt/sybase/ASE-16_0/install/ASE1 Default Language Default Character Set Default Sort Order Master Device /opt/sybase/data/master.dat Master Device Size (MB) 500 Master Database Size (MB) 250 System Procedure Device /opt/sybase/data/sysprocs.dat System Procedure Device Size (MB) 500 System Procedure Database Size (MB) 500 System Device /opt/sybase/data/sybsysdb.dat System Device Size (MB) 100 System Database Size (MB) 100 Tempdb Device /opt/sybase/data/tempdbdev.dat Tempdb Device Size (MB) 1000 Tempdb Database Size (MB) 1000 Enable PCI false Optimize SAP ASE Configuration false Show Servers: /opt/sybase/ASE-16_0/install/showserver Prepare Test Environment: cd /opt/sybase/install cp 
/home/ec2-user/apache-tomcat-XXX.zip . cp /home/ec2-user/jdk-8u144-linux-x64.tar.gz . unzip apache-tomcat-XXX.zip tar -xvf jdk-8u144-linux-x64.tar.gz export JAVA_HOME=/opt/sybase/install/jdk1.8.0_144 Add the provided JDBC driver to the lib folder: cp /opt/sybase/shared/lib/jconn4.jar /home/ec2-user/apache-tomcat-XXX/lib Useful actions in case of issues: Start Server: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/sybase/OCS-16_0/lib3p64 export LANG=C cd /opt/sybase/ASE-16_0/bin ./startserver -f /opt/sybase/ASE-16_0/install/RUN_ASE160 Stop Server: cd /opt/sybase/OCS-16_0/bin export LANG=C ./isql -Usa -SASE160 shutdown with nowait go Kill Hanging Requests: cd /opt/sybase/OCS-16_0/bin export LANG=C ./isql -Usa -SASE160 sp_who go kill spid Uninstall: cd /opt/sybase/sybuninstall/ASESuite ./uninstall -i console Set the environment variables export DIRIGIBLE_DATABASE_PROVIDER=custom export DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=SYBASE export DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT=SYBASE export SYBASE_DRIVER=com.sybase.jdbc4.jdbc.SybDriver export SYBASE_URL=jdbc:sybase:Tds::?ServiceName= export SYBASE_USERNAME= export SYBASE_PASSWORD= export SYBASE_CONNECTION_PROPERTIES=\"DYNAMIC_PREPARE=true;SSL_TRUST_ALL_CERTS=true;JCONNECT_VERSION=0;ENABLE_SSL=true;\" export DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE=false export DIRIGIBLE_SCHEDULER_DATABASE_DRIVER=com.sybase.jdbc4.jdbc.SybDriver export DIRIGIBLE_SCHEDULER_DATABASE_URL=\"jdbc:sybase:Tds::?ServiceName=&DYNAMIC_PREPARE=true&JCONNECT_VERSION=0&ENABLE_SSL=true&SSL_TRUST_ALL_CERTS=true\" export DIRIGIBLE_SCHEDULER_DATABASE_USER= export DIRIGIBLE_SCHEDULER_DATABASE_PASSWORD= export DIRIGIBLE_SCHEDULER_DATABASE_DELEGATE=org.quartz.impl.jdbcjobstore.SybaseDelegate Remember to replace the , , , placeholders. Start the Tomcat server.
Open a web browser and go to: http://localhost:8080/ Note The default user name and password are dirigible/dirigible Manager App In case you want to use Apache Tomcat's Manager App to deploy the ROOT.war file, you have to increase the file size limit for upload (e.g. to 200MB): conf\\server.xml webapps\\manager\\WEB-INF\\web.xml ... ... 0 209715200 209715200 ... ... ","title":"Tomcat"},{"location":"setup/#setup-in-tomcat","text":"Deploy Eclipse Dirigible in Apache Tomcat web container. In this case the built-in H2 database is used. Prerequisites Download the Tomcat binary . More information about how to deploy on Tomcat can be found here . JDK 11 or JDK 13 - OpenJDK versions can be found here . macOS Linux Windows Install ttyd : brew install ttyd Linux support is built-in. More info about ttyd can be found at: ttyd You may experience certain functional limitations if you decide to run the Web IDE locally on Windows using Tomcat: Limitations related to the Create symbolic links policy . Some tests in local builds of Dirigible may fail on Windows due to the same policy restriction. You may grant your user account access to create symbolic links by editing the policy: Go to (WIN + R) > gpedit.msc Navigate to: Computer Configuration -> Windows Settings -> Security Settings -> Local Policies -> User Rights Assignment -> Create Symbolic links . Add your Windows user account to the policy. Note : Editing this policy may make your machine vulnerable to symbolic link attacks as noted here . An alternative to the Windows setup is to follow the Setup as a Docker Image . Some parts of Dirigible are sensitive to line endings, and assume Unix-style newlines. Git on Windows may attempt to switch files to use Windows-style CR/LF line endings, which will cause problems when building and running Dirigible on Windows. In order to prevent this, git should be instructed to preserve the line endings of files. From a command prompt, type git config core.autocrlf .
If the result is not false , change it with git config core.autocrlf false .","title":"Setup in Tomcat"},{"location":"setup/#steps","text":"Download ROOT.war for Tomcat from: download.dirigible.io Note For local test & development purposes, we recommend the server-all distribution. Configure the Users store under $CATALINA_HOME/conf/tomcat-users.xml : Copy Dirigible's ROOT.war to the $TOMCAT/webapps folder. Configure the target Database setup, if needed: Local (H2) PostgreSQL MySQL HANA Sybase ASE No additional setup is needed. Install postgresql on Linux (Debian-based) with: sudo apt-get update sudo apt-get install postgresql postgresql-contrib Create a default database for Eclipse Dirigible: sudo -i -u postgres createdb dirigible_database Create a system user for the Eclipse Dirigible database: psql dirigible_database create user dirigible_system with password 'dirigible1234'; grant all on database dirigible_database to dirigible_system; Datasource configuration: Download the postgresql JDBC driver version 4.1 from here . Copy the postgresql-*.jar file to the /lib directory.
Set the environment variables: export DIRIGIBLE_DATABASE_PROVIDER=custom export DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=POSTGRES export DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT=POSTGRES export POSTGRES_DRIVER=org.postgresql.Driver export POSTGRES_URL=jdbc:postgresql://localhost:5432/dirigible_database export POSTGRES_USERNAME=dirigible_system export POSTGRES_PASSWORD=dirigible1234 export DIRIGIBLE_SCHEDULER_DATABASE_DRIVER=org.postgresql.Driver export DIRIGIBLE_SCHEDULER_DATABASE_URL=jdbc:postgresql://localhost:5432/dirigible_database export DIRIGIBLE_SCHEDULER_DATABASE_USER=dirigible_system export DIRIGIBLE_SCHEDULER_DATABASE_PASSWORD=dirigible1234 export DIRIGIBLE_SCHEDULER_DATABASE_DELEGATE=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate export DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE=true export DIRIGIBLE_FLOWABLE_USE_DEFAULT_DATABASE=true export DIRIGIBLE_DATABASE_NAMES_CASE_SENSITIVE=true Install mysql on Linux (Debian-based) with: sudo apt-get update sudo apt-get install mysql-server sudo mysql_install_db sudo /usr/bin/mysql_secure_installation Create the default database and a system user for the Eclipse Dirigible database: mysql -u root -p CREATE DATABASE dirigible_database; CREATE USER 'dirigible_system'@'localhost' IDENTIFIED BY 'dirigible1234'; GRANT ALL PRIVILEGES ON dirigible_database.* TO 'dirigible_system'@'localhost' WITH GRANT OPTION; Datasource configuration: Download the mysql JDBC driver version 5.1 from here . Copy the mysql-*.jar file to the /lib directory. Open the file /conf/context.xml and add the following within the context: web.xml - make sure the initial parameter jndiDefaultDataSource is uncommented: jndiDefaultDataSource java:comp/env/jdbc/DefaultDB Also, the initial parameter jdbcAutoCommit must be set to false (by default). jdbcAutoCommit false The type of the datasource is jndi instead of local .
defaultDataSourceType jndi Lastly, the resource reference for the datasource has to be uncommented. jdbc/DefaultDB javax.sql.DataSource Container Install HANA Express . Set the environment variables: export DIRIGIBLE_DATABASE_PROVIDER=custom export DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=HANA export DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT=HANA export HANA_DRIVER=com.sap.db.jdbc.Driver export HANA_URL=jdbc:sap://: export HANA_USERNAME= export HANA_PASSWORD= export DIRIGIBLE_SCHEDULER_DATABASE_DRIVER=com.sap.db.jdbc.Driver export DIRIGIBLE_SCHEDULER_DATABASE_URL=jdbc:sap://: export DIRIGIBLE_SCHEDULER_DATABASE_USER= export DIRIGIBLE_SCHEDULER_DATABASE_PASSWORD= export DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE=false export DIRIGIBLE_FLOWABLE_USE_DEFAULT_DATABASE=false Remember to replace the , , , placeholders. How to setup a test environment on Amazon: Select Image Size: t2.medium Security Group: TCP Custom, 5000 Download Sybase ASE Express from here . Transfer: scp -i dirigible-aws.pem ASE_Suite.linuxamd64.tgz ec2-user@:~ scp -i dirigible-aws.pem apache-tomcat-XXX.zip ec2-user@:~ scp -i dirigible-aws.pem ROOT.war ec2-user@:~ scp -i dirigible-aws.pem jdk-8u144-linux-x64.tar.gz ec2-user@:~ Prepare OS: sudo mkdir -p /opt/sybase sudo mkdir -p /var/sybase sudo groupadd sybase sudo useradd -g sybase -d /opt/sybase sybase sudo passwd sybase sudo chown sybase:sybase /opt/sybase sudo chown sybase:sybase /var/sybase Login: ssh ec2-user@ -i dirigible-aws.pem Setup: su - sybase mkdir install cd install cp /home/ec2-user/ASE_Suite.linuxamd64.tgz . 
tar -xvf ASE_Suite.linuxamd64.tgz ./setup.bin -i console Parameters: Choose Install Folder -> use: /opt/sybase Choose Install Set -> 1- Typical Software License Type Selection -> 2- Install Express Edition of SAP Adaptive Server Enterprise End-user License Agreement -> 1) All regions Configure New Servers -> [X] 1 - Configure new SAP ASE Configure Servers with Different User Account -> 2- No SAP ASE Name ASE160 System Administrator's Password ****** Enable SAP ASE for SAP ASE Cockpit monitoring false Technical user tech_user Technical user password ******** Host Name ip-.eu-central-1.comp Port Number 5000 Application Type Mixed (OLTP/DSS) Create sample databases false Page Size 4k Error Log /opt/sybase/ASE-16_0/install/ASE1 Default Language Default Character Set Default Sort Order Master Device /opt/sybase/data/master.dat Master Device Size (MB) 500 Master Database Size (MB) 250 System Procedure Device /opt/sybase/data/sysprocs.dat System Procedure Device Size (MB) 500 System Procedure Database Size (MB) 500 System Device /opt/sybase/data/sybsysdb.dat System Device Size (MB) 100 System Database Size (MB) 100 Tempdb Device /opt/sybase/data/tempdbdev.dat Tempdb Device Size (MB) 1000 Tempdb Database Size (MB) 1000 Enable PCI false Optimize SAP ASE Configuration false Show Servers: /opt/sybase/ASE-16_0/install/showserver Prepare Test Environment: cd /opt/sybase/install cp /home/ec2-user/apache-tomcat-XXX.zip . cp /home/ec2-user/jdk-8u144-linux-x64.tar.gz . 
unzip apache-tomcat-XXX.zip tar -xvf jdk-8u144-linux-x64.tar.gz export JAVA_HOME=/opt/sybase/install/jdk1.8.0_144 Add the provided JDBC driver to the lib folder: cp /opt/sybase/shared/lib/jconn4.jar /home/ec2-user/apache-tomcat-XXX/lib Useful actions in case of issues: Start Server: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/sybase/OCS-16_0/lib3p64 export LANG=C cd /opt/sybase/ASE-16_0/bin ./startserver -f /opt/sybase/ASE-16_0/install/RUN_ASE160 Stop Server: cd /opt/sybase/OCS-16_0/bin export LANG=C ./isql -Usa -SASE160 shutdown with nowait go Kill Hanging Requests: cd /opt/sybase/OCS-16_0/bin export LANG=C ./isql -Usa -SASE160 sp_who go kill spid Uninstall: cd /opt/sybase/sybuninstall/ASESuite ./uninstall -i console Set the environment variables export DIRIGIBLE_DATABASE_PROVIDER=custom export DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=SYBASE export DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT=SYBASE export SYBASE_DRIVER=com.sybase.jdbc4.jdbc.SybDriver export SYBASE_URL=jdbc:sybase:Tds::?ServiceName= export SYBASE_USERNAME= export SYBASE_PASSWORD= export SYBASE_CONNECTION_PROPERTIES=\"DYNAMIC_PREPARE=true;SSL_TRUST_ALL_CERTS=true;JCONNECT_VERSION=0;ENABLE_SSL=true;\" export DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE=false export DIRIGIBLE_SCHEDULER_DATABASE_DRIVER=com.sybase.jdbc4.jdbc.SybDriver export DIRIGIBLE_SCHEDULER_DATABASE_URL=\"jdbc:sybase:Tds::?ServiceName=&DYNAMIC_PREPARE=true&JCONNECT_VERSION=0&ENABLE_SSL=true&SSL_TRUST_ALL_CERTS=true\" export DIRIGIBLE_SCHEDULER_DATABASE_USER= export DIRIGIBLE_SCHEDULER_DATABASE_PASSWORD= export DIRIGIBLE_SCHEDULER_DATABASE_DELEGATE=org.quartz.impl.jdbcjobstore.SybaseDelegate Remember to replace the , , , placeholders. Start the Tomcat server.
Open a web browser and go to: http://localhost:8080/ Note The default user name and password are dirigible/dirigible","title":"Steps"},{"location":"setup/#manager-app","text":"In case you want to use Apache Tomcat's Manager App to deploy the ROOT.war file, you have to increase the file size limit for upload (e.g. to 200MB): conf\\server.xml webapps\\manager\\WEB-INF\\web.xml ... ... 0 209715200 209715200 ... ... ","title":"Manager App"},{"location":"setup/cloud-foundry/","text":"Setup in Cloud Foundry Deploy Eclipse Dirigible in SAP BTP 1 , Cloud Foundry environment. Prerequisites Install Cloud Foundry Command Line Interface . Access to SAP BTP account (the Trial landscape can be accessed here ). Steps Set the SAP BTP Cloud Foundry API host: cf api Log in to the SAP BTP, Cloud Foundry environment with: cf login Create an XSUAA service instance: Copy and paste the following content into xs-security.json : { \"xsappname\" : \"-xsuaa\" , \"tenant-mode\" : \"shared\" , \"scopes\" : [ { \"name\" : \"$XSAPPNAME.Developer\" , \"description\" : \"Developer scope\" }, { \"name\" : \"$XSAPPNAME.Operator\" , \"description\" : \"Operator scope\" } ], \"role-templates\" : [ { \"name\" : \"Developer\" , \"description\" : \"Developer related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Operator\" , \"description\" : \"Operator related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Operator\" ] } ], \"role-collections\" : [ { \"name\" : \"Dirigible Developer\" , \"description\" : \"Dirigible Developer\" , \"role-template-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Dirigible Operator\" , \"description\" : \"Dirigible Operator\" , \"role-template-references\" : [ \"$XSAPPNAME.Operator\" ] } ] } Note Replace the placeholder with your application name, e.g. dirigible . Create an XSUAA service instance: cf create-service xsuaa application -xsuaa -c xs-security.json Note Use the same as in the previous step.
Deploy Eclipse Dirigible: Docker Buildpack cf push dirigible \\ --docker-image=dirigiblelabs/dirigible-sap-cf:latest \\ -m 2G -k 2G Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . Bind the XSUAA service instance to the Eclipse Dirigible deployment: cf bind-service dirigible -xsuaa Note Replace the placeholder with the application name used in the previous steps. Restart the dirigible deployment: cf restart dirigible Download the sap-cf-all binaries from the downloads site: download.dirigible.io Unzip the downloaded archive to extract the ROOT.war file. Create a manifest.yaml file in the same directory where the ROOT.war is located: applications : - name : dirigible host : dirigible- memory : 2G buildpack : sap_java_buildpack path : ROOT.war env : JBP_CONFIG_COMPONENTS : \"jres: ['com.sap.xs.java.buildpack.jdk.SAPMachineJDK']\" JBP_CONFIG_SAP_MACHINE_JRE : 'jre: { version: 11.+ }' services : - -xsuaa Note Replace the placeholder with your subaccount's Subdomain value. Replace the placeholder with the application name used in the previous steps. Deploy with: cf push Assign the Developer and Operator roles. Log in. Additional Materials A step-by-step tutorial can be found here . SAP Cloud Platform is called SAP Business Technology Platform (SAP BTP) as of 2021. \u21a9","title":"Cloud Foundry"},{"location":"setup/cloud-foundry/#setup-in-cloud-foundry","text":"Deploy Eclipse Dirigible in SAP BTP 1 , Cloud Foundry environment. Prerequisites Install Cloud Foundry Command Line Interface .
Access to SAP BTP account (the Trial landscape can be accessed here ).","title":"Setup in Cloud Foundry"},{"location":"setup/cloud-foundry/#steps","text":"Set the SAP BTP Cloud Foundry API host: cf api Log in to the SAP BTP, Cloud Foundry environment with: cf login Create an XSUAA service instance: Copy and paste the following content into xs-security.json : { \"xsappname\" : \"-xsuaa\" , \"tenant-mode\" : \"shared\" , \"scopes\" : [ { \"name\" : \"$XSAPPNAME.Developer\" , \"description\" : \"Developer scope\" }, { \"name\" : \"$XSAPPNAME.Operator\" , \"description\" : \"Operator scope\" } ], \"role-templates\" : [ { \"name\" : \"Developer\" , \"description\" : \"Developer related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Operator\" , \"description\" : \"Operator related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Operator\" ] } ], \"role-collections\" : [ { \"name\" : \"Dirigible Developer\" , \"description\" : \"Dirigible Developer\" , \"role-template-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Dirigible Operator\" , \"description\" : \"Dirigible Operator\" , \"role-template-references\" : [ \"$XSAPPNAME.Operator\" ] } ] } Note Replace the placeholder with your application name, e.g. dirigible . Create an XSUAA service instance: cf create-service xsuaa application -xsuaa -c xs-security.json Note Use the same as in the previous step. Deploy Eclipse Dirigible: Docker Buildpack cf push dirigible \\ --docker-image=dirigiblelabs/dirigible-sap-cf:latest \\ -m 2G -k 2G Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here .
Bind the XSUAA service instance to the Eclipse Dirigible deployment: cf bind-service dirigible -xsuaa Note Replace the placeholder with the application name used in the previous steps. Restart the dirigible deployment: cf restart dirigible Download the sap-cf-all binaries from the downloads site: download.dirigible.io Unzip the downloaded archive to extract the ROOT.war file. Create a manifest.yaml file in the same directory where the ROOT.war is located: applications : - name : dirigible host : dirigible- memory : 2G buildpack : sap_java_buildpack path : ROOT.war env : JBP_CONFIG_COMPONENTS : \"jres: ['com.sap.xs.java.buildpack.jdk.SAPMachineJDK']\" JBP_CONFIG_SAP_MACHINE_JRE : 'jre: { version: 11.+ }' services : - -xsuaa Note Replace the placeholder with your subaccount's Subdomain value. Replace the placeholder with the application name used in the previous steps. Deploy with: cf push Assign the Developer and Operator roles. Log in. Additional Materials A step-by-step tutorial can be found here . SAP Cloud Platform is called SAP Business Technology Platform (SAP BTP) as of 2021. \u21a9","title":"Steps"},{"location":"setup/docker/","text":"Setup as a Docker Image Deploy Eclipse Dirigible in Docker. Prerequisites Install Docker .
Steps Pull the Dirigible Docker image: docker pull dirigiblelabs/dirigible:latest Start the container: Run with Mounted Volume with Environment Configurations with Java Debugging Options docker run --name dirigible \\ --rm -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest docker run --name dirigible \\ -v :/target \\ --rm -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest docker run --name dirigible \\ -e DIRIGIBLE_BRANDING_NAME=\"My Web IDE\" \\ -e DIRIGIBLE_BRANDING_BRAND=\"WebIDE\" \\ -e DIRIGIBLE_BRANDING_BRAND_URL=\"https://www.eclipse.org\" \\ -e DIRIGIBLE_THEME_DEFAULT=\"fiori\" \\ --rm -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest Note The complete list of Dirigible environment variables can be found here . docker run --name dirigible \\ -e JPDA_ADDRESS=0.0.0.0:8000 \\ -e JPDA_TRANSPORT=dt_socket \\ --rm -p 8000:8000 -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . Open a web browser and go to: http://localhost:8080/ Note The default user name and password are admin/admin Stop the container: docker stop dirigible","title":"Docker"},{"location":"setup/docker/#setup-as-a-docker-image","text":"Deploy Eclipse Dirigible in Docker.
Prerequisites Install Docker .","title":"Setup as a Docker Image"},{"location":"setup/docker/#steps","text":"Pull the Dirigible Docker image: docker pull dirigiblelabs/dirigible:latest Start the container: Run with Mounted Volume with Environment Configurations with Java Debugging Options docker run --name dirigible \\ --rm -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest docker run --name dirigible \\ -v :/target \\ --rm -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest docker run --name dirigible \\ -e DIRIGIBLE_BRANDING_NAME=\"My Web IDE\" \\ -e DIRIGIBLE_BRANDING_BRAND=\"WebIDE\" \\ -e DIRIGIBLE_BRANDING_BRAND_URL=\"https://www.eclipse.org\" \\ -e DIRIGIBLE_THEME_DEFAULT=\"fiori\" \\ --rm -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest Note The complete list of Dirigible environment variables can be found here . docker run --name dirigible \\ -e JPDA_ADDRESS=0.0.0.0:8000 \\ -e JPDA_TRANSPORT=dt_socket \\ --rm -p 8000:8000 -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . Open a web browser and go to: http://localhost:8080/ Note The default user name and password are admin/admin Stop the container: docker stop dirigible","title":"Steps"},{"location":"setup/setup-environment-variables/","text":"Environment Variables Configuration Types Based on the layer they are defined in, configuration variables have the following priorities: Runtime Environment Deployment Module Highest precedence: No rebuild or restart of the application is required when configuration is changed. The Configuration API can be used to apply changes in the Runtime configuration.
Second precedence: No rebuild is required when configuration is changed, however the application should be restarted to apply the environment changes. Usually the Environment configurations are provided during the application deployment, as part of the application descriptor (e.g. Define environment variable for container in Kubernetes or in Cloud Foundry App Manifest ) . Third precedence: Rebuild and re-deployment is required. \"Default\" deployment ( ROOT.war ) configuration variables are taken from the dirigible.properties properties file (a sample can be found here ) . Lowest precedence: Rebuild and re-deployment is required. \"Default\" module (e.g. dirigible-database-custom.jar , dirigible-database-h2.jar ) configuration variables are taken from dirigible-xxx.properties properties files (samples can be found here and here ) Note The precedence order means that if there is an Environment variable with name DIRIGIBLE_TEST and a Runtime variable with the same name, the Runtime variable will have higher priority and will be applied. All applied configuration values can be found under the Configurations View . Configuration Parameters Branding Parameter Description Default* DIRIGIBLE_BRANDING_NAME The brand name Eclipse Dirigible DIRIGIBLE_BRANDING_BRAND The branding name Eclipse Dirigible DIRIGIBLE_BRANDING_BRAND_URL The branding URL https://www.dirigible.io DIRIGIBLE_BRANDING_ICON The branding icon ../../../../services/v4/web/resources/images/favicon.png DIRIGIBLE_BRANDING_WELCOME_PAGE_DEFAULT The branding welcome page ../../../../services/v4/web/ide/welcome.html DIRIGIBLE_BRANDING_HELP_ITEMS The list of the custom help menu items (comma separated) - Branding - Help Items Note Replace CUSTOM_ITEM with the actual name set by DIRIGIBLE_BRANDING_HELP_ITEMS e.g.
ITEM1 Parameter Description Default* DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_NAME The name of the custom help item - DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_URL The url of the custom help item - DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_ORDER (Optional) The order of the custom help item 0 DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_DIVIDER (Optional) Whether to set divider after the custom help item false Server Parameter Description Default* DIRIGIBLE_SERVER_PORT The port that Eclipse Dirigible will start on 8080 Basic Parameter Description Default* DIRIGIBLE_BASIC_ENABLED Whether the Basic authentication is enabled true DIRIGIBLE_BASIC_USERNAME Base64 encoded property, which will be used as user name for basic authentication admin DIRIGIBLE_BASIC_PASSWORD Base64 encoded property, which will be used as password for basic authentication admin OAuth Parameter Description Default* DIRIGIBLE_OAUTH_ENABLED Whether the OAuth authentication is enabled false DIRIGIBLE_OAUTH_AUTHORIZE_URL The OAuth authorization URL (e.g. https://my-oauth-server/oauth/authorize ) - DIRIGIBLE_OAUTH_TOKEN_URL The OAuth token URL (e.g. https://my-oauth-server/oauth/token ) - DIRIGIBLE_OAUTH_TOKEN_REQUEST_METHOD The OAuth token request method ( GET or POST ) GET DIRIGIBLE_OAUTH_CLIENT_ID The OAuth clientid (e.g. sb-xxx-yyy ) - DIRIGIBLE_OAUTH_CLIENT_SECRET The OAuth clientsecret (e.g. PID/cpkD8aZzbGaa6+muYYOOMWPDeM1ug/sQ5ZF... ) - DIRIGIBLE_OAUTH_APPLICATION_HOST The application host (e.g. https://my-application-host ) - DIRIGIBLE_OAUTH_ISSUER The OAuth issuer (e.g. http://xxx.localhost:8080/uaa/oauth/token ) - DIRIGIBLE_OAUTH_VERIFICATION_KEY The OAuth verificationkey (e.g. -----BEGIN PUBLIC KEY-----MIIBIjANBgkqhki... ) - DIRIGIBLE_OAUTH_VERIFICATION_KEY_EXPONENT The OAuth verificationkey exponent (e.g. 
AQAB ) - DIRIGIBLE_OAUTH_CHECK_ISSUER_ENABLED Sets whether the JWT verifier should check the token issuer true DIRIGIBLE_OAUTH_CHECK_AUDIENCE_ENABLED Sets whether the JWT verifier should check the token aud true DIRIGIBLE_OAUTH_APPLICATION_NAME The application name (e.g. dirigible-xxx ) - Redirect/Callback URL Configure the Redirect/Callback URL in the OAuth client to: /services/v4/oauth/callback Keycloak Parameter Description Default* DIRIGIBLE_KEYCLOAK_ENABLED Sets whether the Keycloak Authentication is enabled* false DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL The Keycloak Authentication Server URL (e.g. https://keycloak-server/auth/ ) - DIRIGIBLE_KEYCLOAK_REALM The Keycloak realm (e.g. my-realm ) - DIRIGIBLE_KEYCLOAK_SSL_REQUIRED The Keycloak SSL Required (e.g. none / external ) - DIRIGIBLE_KEYCLOAK_CLIENT_ID The Keycloak Client ID (e.g. my-client ) - DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT The Keycloak Confidential Port (e.g. 443 ) - SERVER_MAXHTTPHEADERSIZE The HTTP header max size (e.g. 48000 ) Default for the underlying server (e.g. Tomcat) Note In addition to setting the DIRIGIBLE_KEYCLOAK_ENABLED property to true , the DIRIGIBLE_BASIC_ENABLED property should be set to false in order to enable the Keycloak integration. For more details about the Keycloak configuration, see Keycloak Java Adapter Configuration .
Git Parameter Description Default* DIRIGIBLE_GIT_ROOT_FOLDER The external folder that will be used for synchronizing git projects - Registry Parameter Description Default* DIRIGIBLE_REGISTRY_EXTERNAL_FOLDER The external folder that will be used for synchronizing the public registry - DIRIGIBLE_REGISTRY_IMPORT_WORKSPACE The external folder that will be imported into the public registry - Repository Parameter Description Default* DIRIGIBLE_REPOSITORY_PROVIDER The name of the repository provider used in this instance local or database DIRIGIBLE_REPOSITORY_CACHE_ENABLED Enable the usage of the repository cache true Local Repository Parameter Description Default* DIRIGIBLE_REPOSITORY_LOCAL_ROOT_FOLDER The location of the root folder where the repository artifacts will be stored . DIRIGIBLE_REPOSITORY_LOCAL_ROOT_FOLDER_IS_ABSOLUTE Whether the location of the root folder is absolute or context dependent false Master Repository Parameter Description Default* DIRIGIBLE_MASTER_REPOSITORY_PROVIDER The name of the master repository provider used in this instance ( filesystem , zip or jar ) - DIRIGIBLE_MASTER_REPOSITORY_ROOT_FOLDER The location of the root folder where the master repository artifacts will be loaded from . DIRIGIBLE_MASTER_REPOSITORY_ZIP_LOCATION The location of the zip file where the master repository artifacts will be loaded from (e.g. /User/data/my-repo.zip ) - DIRIGIBLE_MASTER_REPOSITORY_JAR_PATH The JAR path location of the zip file where the master repository artifacts will be loaded from (e.g. /org/dirigible/example/my-repo.zip ) - Note The JAR path is absolute inside the class path Repository Search Parameter Description Default* DIRIGIBLE_REPOSITORY_SEARCH_ROOT_FOLDER The location of the root folder to be used by the indexing engine . 
DIRIGIBLE_REPOSITORY_SEARCH_ROOT_FOLDER_IS_ABSOLUTE Whether the location of the root folder is absolute or context dependent false DIRIGIBLE_REPOSITORY_SEARCH_INDEX_LOCATION The sub-folder under the root folder where the index files will be stored dirigible/repository/index Repository Versioning Parameter Description Default* DIRIGIBLE_REPOSITORY_VERSIONING_ENABLED The flag whether versioning for repository is enabled false Database Common Parameters Parameter Description Default* DIRIGIBLE_DATABASE_PROVIDER The name of the database provider which will be used in this instance ( local , managed or custom ) local DIRIGIBLE_DATABASE_DEFAULT_SET_AUTO_COMMIT The AUTO_COMMIT data source parameter true DIRIGIBLE_DATABASE_DEFAULT_MAX_CONNECTIONS_COUNT The MAX_CONNECTIONS_COUNT data source parameter 8 DIRIGIBLE_DATABASE_DEFAULT_WAIT_TIMEOUT The WAIT_TIMEOUT data source parameter 500 DIRIGIBLE_DATABASE_DEFAULT_WAIT_COUNT The WAIT_COUNT data source parameter 5 DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT The name of the primary data source used in this instance DefaultDB DIRIGIBLE_DATABASE_DATASOURCE_NAME_SYSTEM The name of the system data source used in this instance SystemDB DIRIGIBLE_DATABASE_NAMES_CASE_SENSITIVE The names of the tables, views and columns to be considered as case sensitive false DIRIGIBLE_DATABASE_TRANSFER_BATCH_SIZE The batch size used during the data transfer 1000 DIRIGIBLE_DATABASE_DEFAULT_QUERY_LIMIT The batch size used when querying data from the database 1000 DIRIGIBLE_DATABASE_SYSTEM_DRIVER The driver used for the SystemDB database connection org.h2.Driver DIRIGIBLE_DATABASE_SYSTEM_URL The JDBC url used for the SystemDB database connection jdbc:h2:file:./target/dirigible/h2/SystemDB DIRIGIBLE_DATABASE_SYSTEM_USERNAME The username used for the SystemDB database connection sa DIRIGIBLE_DATABASE_SYSTEM_PASSWORD The password used for the SystemDB database connection (empty) Custom Database Parameter Description Default* DIRIGIBLE_DATABASE_PROVIDER The
name of the database provider which will be used in this instance to be set to custom local DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES The list of the custom data source names used in this instance e.g. DS1,DS2 `` DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT The name of the primary data source used in this instance e.g. DS1 DefaultDB DS1_DRIVER The JDBC driver used for the exemplary DS1 database connection `` DS1_URL The JDBC url used for the exemplary DS1 database connection `` DS1_SCHEMA The default schema used for the exemplary DS1 database connection `` DS1_USERNAME The username used for the exemplary DS1 database connection `` DS1_PASSWORD The password used for the exemplary DS1 database connection `` Database H2 Parameter Description Default* DIRIGIBLE_DATABASE_H2_ROOT_FOLDER_DEFAULT The location used by H2 database ./target/dirigible/h2 DIRIGIBLE_DATABASE_H2_DRIVER The Driver used by H2 database org.h2.Driver DIRIGIBLE_DATABASE_H2_URL The URL used by H2 database jdbc:h2:./target/dirigible/h2 DIRIGIBLE_DATABASE_H2_USERNAME The Username used by H2 database sa DIRIGIBLE_DATABASE_H2_PASSWORD The Password used by H2 database - Database Snowflake Parameter Description Default* SNOWFLAKE_DATABASE The database used by Snowflake - SNOWFLAKE_SCHEMA The schema used by Snowflake - SNOWFLAKE_WAREHOUSE The warehouse used by Snowflake - SNOWFLAKE_DEFAULT_TABLE_TYPE Default table type for create table statements HYBRID Persistence Parameter Description Default* DIRIGIBLE_PERSISTENCE_CREATE_TABLE_ON_USE Whether the table is created automatically on use if it does not exist true MongoDB Parameter Description Default* DIRIGIBLE_MONGODB_CLIENT_URI The location used by MongoDB server mongodb://localhost:27017 DIRIGIBLE_MONGODB_DATABASE_DEFAULT The default database name db Lifecycle Parameter Description Default* DIRIGIBLE_PUBLISH_DISABLED Disable publishing process in this instance false Scheduler Parameter Description Default* DIRIGIBLE_SCHEDULER_MEMORY_STORE Whether Quartz should use an
in-memory job store false DIRIGIBLE_SCHEDULER_DATABASE_DATASOURCE_TYPE The type of the custom data-source used by Quartz, if not the default one - DIRIGIBLE_SCHEDULER_DATABASE_DATASOURCE_NAME The name of the custom data-source used by Quartz, if not the default one - DIRIGIBLE_SCHEDULER_LOGS_RETANTION_PERIOD The period the logs of the job execution will be kept (the default is one week - 24x7) 168 DIRIGIBLE_SCHEDULER_EMAIL_SENDER The sender for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_RECIPIENTS The recipients list for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_SUBJECT_ERROR The error subject for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_SUBJECT_NORMAL The normal subject for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_TEMPLATE_ERROR The error template for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_TEMPLATE_NORMAL The normal template for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_URL_SCHEME The scheme part of the URL for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_URL_HOST The host part of the URL for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_URL_PORT The port part of the URL for the e-mail notifications - DIRIGIBLE_SCHEDULER_DATABASE_DELEGATE The name of the JDBC delegate used by Quartz, if not the default one org.quartz.impl.jdbcjobstore.StdJDBCDelegate Note Quartz JDBC delegates: org.quartz.impl.jdbcjobstore.StdJDBCDelegate (for fully JDBC-compliant drivers) org.quartz.impl.jdbcjobstore.MSSQLDelegate (for Microsoft SQL Server, and Sybase) org.quartz.impl.jdbcjobstore.PostgreSQLDelegate org.quartz.impl.jdbcjobstore.WebLogicDelegate (for WebLogic drivers) org.quartz.impl.jdbcjobstore.oracle.OracleDelegate org.quartz.impl.jdbcjobstore.oracle.WebLogicOracleDelegate (for Oracle drivers used within Weblogic) org.quartz.impl.jdbcjobstore.oracle.weblogic.WebLogicOracleDelegate (for Oracle drivers used within Weblogic) org.quartz.impl.jdbcjobstore.CloudscapeDelegate 
org.quartz.impl.jdbcjobstore.DB2v6Delegate org.quartz.impl.jdbcjobstore.DB2v7Delegate org.quartz.impl.jdbcjobstore.DB2v8Delegate org.quartz.impl.jdbcjobstore.HSQLDBDelegate org.quartz.impl.jdbcjobstore.PointbaseDelegate org.quartz.impl.jdbcjobstore.SybaseDelegate Synchronizer Parameter Description Default* DIRIGIBLE_SYNCHRONIZER_IGNORE_DEPENDENCIES Whether to ignore dependencies for synchronizers, e.g. for test purposes false DIRIGIBLE_SYNCHRONIZER_EXCLUDE_PATHS Paths to be excluded from processing (comma separated list) `` DIRIGIBLE_SYNCHRONIZER_CROSS_RETRY_COUNT Cross-dependencies processing count 10 DIRIGIBLE_SYNCHRONIZER_CROSS_RETRY_INTERVAL Cross-dependencies processing interval 10000 Job Expression Parameter Description Default* DIRIGIBLE_JOB_EXPRESSION_BPM BPM synchronizer job config 0/50 * * * * ? DIRIGIBLE_JOB_EXPRESSION_DATA_STRUCTURES Data structures job synchronizer config 0/25 * * * * ? DIRIGIBLE_JOB_EXPRESSION_EXTENSIONS Extension synchronizer job config 0/10 * * * * ? DIRIGIBLE_JOB_EXPRESSION_JOBS Jobs synchronizer job config 0/15 * * * * ? DIRIGIBLE_JOB_EXPRESSION_MESSAGING Messaging synchronizer job config 0/25 * * * * ? DIRIGIBLE_JOB_EXPRESSION_MIGRATIONS Migration synchronizer job config 0/55 * * * * ? DIRIGIBLE_JOB_EXPRESSION_ODATA OData synchronizer job config 0/45 * * * * ? DIRIGIBLE_JOB_EXPRESSION_PUBLISHER Publisher synchronizer job config 0/5 * * * * ? DIRIGIBLE_JOB_EXPRESSION_SECURITY Security synchronizer job config 0/10 * * * * ? DIRIGIBLE_JOB_EXPRESSION_REGISTRY Registry synchronizer job config 0/35 * * * * ?
DIRIGIBLE_JOB_DEFAULT_TIMEOUT Default timeout in minutes 3 Runtime Core Parameter Description Default* DIRIGIBLE_HOME_URL The home URL where the user will be redirected on access /services/v4/web/ide/index.html Vert.x Parameter Description Default* DIRIGIBLE_VERTX_PORT The Vert.x server port, if used 8888 CSV Parameter Description Default* DIRIGIBLE_CSV_DATA_MAX_COMPARE_SIZE The maximum number of CSV records for which a comparison with the existing table data will be performed 1000 DIRIGIBLE_CSV_DATA_BATCH_SIZE The number of CSV records to be included in a batch operation 100 CMS Parameter Description Default* DIRIGIBLE_CMS_PROVIDER The type of the CMS provider used in this instance (e.g. cms-provider-internal , cms-provider-s3 , managed or database ) internal DIRIGIBLE_CMS_ROLES_ENABLED Whether RBAC over the CMS content is enabled true CMS - Internal Parameter Description Default* DIRIGIBLE_CMS_INTERNAL_ROOT_FOLDER The location of the CMS internal repository target DIRIGIBLE_CMS_INTERNAL_ROOT_FOLDER_IS_ABSOLUTE Whether the root folder parameter is absolute or not false DIRIGIBLE_CMS_INTERNAL_VERSIONING_ENABLED Whether the versioning of the files is enabled or not false CMS - S3 Parameter Description Default* AWS_ACCESS_KEY_ID The AWS access key used for authentication target AWS_SECRET_ACCESS_KEY The AWS secret key used for authentication target AWS_DEFAULT_REGION The region where the bucket is stored eu-central-1 DIRIGIBLE_S3_BUCKET The bucket to be used for content management. Will be created if the provided one does not exist target DIRIGIBLE_S3_PROVIDER The provider to be used for S3. For local testing an option with localstack is available aws CMS - Managed Parameter Description Default* DIRIGIBLE_CMS_MANAGED_CONFIGURATION_JNDI_NAME The JNDI name of the managed CMS repository java:comp/env/EcmService in case of SAP package DIRIGIBLE_CMS_MANAGED_CONFIGURATION_AUTH_METHOD The authentication method (e.g.
key or destination ) key DIRIGIBLE_CMS_MANAGED_CONFIGURATION_NAME The name of the repository cmis:dirigible DIRIGIBLE_CMS_MANAGED_CONFIGURATION_KEY The key of the repository cmis:dirigible:key DIRIGIBLE_CMS_MANAGED_CONFIGURATION_DESTINATION The name of the destination where the name and the key for the repository are stored (e.g. CMIS_DESTINATION ) - DIRIGIBLE_CONNECTIVITY_CONFIGURATION_JNDI_NAME The JNDI name of the connectivity configuration service java:comp/env/connectivity/Configuration in case of SAP package CMS Database Parameter Description Default* DIRIGIBLE_CMS_DATABASE_DATASOURCE_TYPE Type of the database for CMS repository (e.g. local , managed , custom , dynamic ) managed DIRIGIBLE_CMS_DATABASE_DATASOURCE_NAME The datasource name DefaultDB BPM Parameter Description Default* DIRIGIBLE_BPM_PROVIDER The provider of the BPM engine (e.g. internal , managed , remote ) internal BPM - Flowable Parameter Description Default* DIRIGIBLE_FLOWABLE_DATABASE_DRIVER The driver of the Flowable engine (e.g. org.postgresql.Driver ) - DIRIGIBLE_FLOWABLE_DATABASE_URL The URL of the Flowable engine (e.g. jdbc:postgresql://localhost:5432/ ) - DIRIGIBLE_FLOWABLE_DATABASE_USER The user of the Flowable engine - DIRIGIBLE_FLOWABLE_DATABASE_PASSWORD The password of the Flowable engine - DIRIGIBLE_FLOWABLE_DATABASE_DATASOURCE_NAME The datasource name of the Flowable engine, if any configured - DIRIGIBLE_FLOWABLE_DATABASE_SCHEMA_UPDATE Whether to materialize the database layout or not true DIRIGIBLE_FLOWABLE_USE_DEFAULT_DATABASE Whether to use the DefaultDB datasource or built-in H2 (e.g.
true (DefaultDB) or false (H2)) true Mail Parameter Description Default* DIRIGIBLE_MAIL_USERNAME Mailbox username - DIRIGIBLE_MAIL_PASSWORD Mailbox password - DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL Mail transport protocol smtps DIRIGIBLE_MAIL_SMTPS_HOST Mailbox SMTPS host - DIRIGIBLE_MAIL_SMTPS_PORT Mailbox SMTPS port - DIRIGIBLE_MAIL_SMTPS_AUTH Enable/disable mailbox SMTPS authentication - DIRIGIBLE_MAIL_SMTP_HOST Mailbox SMTP host - DIRIGIBLE_MAIL_SMTP_PORT Mailbox SMTP port - DIRIGIBLE_MAIL_SMTP_AUTH Enable/disable mailbox SMTP authentication - Messaging Parameter Description Default* DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE Whether to use the DefaultDB datasource or built-in KahaDB (e.g. true (DefaultDB) or false (KahaDB)) true Kafka Parameter Description Default* DIRIGIBLE_KAFKA_BOOTSTRAP_SERVER The Kafka server location localhost:9092 DIRIGIBLE_KAFKA_ACKS The number of brokers that must receive the record before considering the write as successful all DIRIGIBLE_KAFKA_KEY_SERIALIZER The Key serializer org.apache.kafka.common.serialization.StringSerializer DIRIGIBLE_KAFKA_VALUE_SERIALIZER The Value serializer org.apache.kafka.common.serialization.StringSerializer DIRIGIBLE_KAFKA_AUTOCOMMIT_ENABLED Whether Auto Commit is enabled true DIRIGIBLE_KAFKA_AUTOCOMMIT_INTERVAL Auto Commit interval in ms 1000 Engines JavaScript Parameter Description Default* DIRIGIBLE_JAVASCRIPT_ENGINE_TYPE_DEFAULT The type of the JavaScript engine provider used in this instance (e.g. 
graalvm , rhino , nashorn or v8 ) graalvm since 5.0 GraalVM Parameter Description Default* DIRIGIBLE_GRAALIUM_ENABLE_DEBUG Whether the debug mode is enabled false DIRIGIBLE_JAVASCRIPT_GRAALVM_DEBUGGER_PORT The GraalVM debugger port 8081 and 0.0.0.0:8081 in Docker environment DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_HOST_ACCESS Whether GraalVM can load classes from custom packages true DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_CREATE_THREAD Whether GraalVM can create threads true DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_CREATE_PROCESS Whether GraalVM can create processes true DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_IO Whether GraalVM can make IO operations true DIRIGIBLE_JAVASCRIPT_GRAALVM_COMPATIBILITY_MODE_NASHORN Whether GraalVM has enabled compatibility mode for Nashorn true DIRIGIBLE_JAVASCRIPT_GRAALVM_COMPATIBILITY_MODE_MOZILLA Whether GraalVM has enabled compatibility mode for Mozilla false TypeScript Parameter Description Default* DIRIGIBLE_PROJECT_TYPESCRIPT Whether the project is TypeScript enabled true OData Parameter Description Default* DIRIGIBLE_ODATA_HANDLER_EXECUTOR_TYPE The type of the JavaScript engine to be used for event handlers in OData DIRIGIBLE_ODATA_HANDLER_EXECUTOR_ON_EVENT The location of the wrapper helper to be used for event handlers in OData FTP Parameter Description Default* DIRIGIBLE_FTP_USERNAME The FTP server username admin DIRIGIBLE_FTP_PASSWORD The FTP server password admin DIRIGIBLE_FTP_PORT The FTP server port 8022 SFTP Parameter Description Default* DIRIGIBLE_SFTP_USERNAME The SFTP server username admin DIRIGIBLE_SFTP_PASSWORD The SFTP server password admin DIRIGIBLE_SFTP_PORT The SFTP server port 8022 Operations Logs Parameter Description Default* DIRIGIBLE_OPERATIONS_LOGS_ROOT_FOLDER_DEFAULT The folder where the log files are stored ../logs DIRIGIBLE_EXEC_COMMAND_LOGGING_ENABLED Whether to log commands executed by the exec API false Look & Feel Theme Parameter Description Default* DIRIGIBLE_THEME_DEFAULT The name of the default theme Default
Terminal Parameter Description Default* DIRIGIBLE_TERMINAL_ENABLED Whether the Terminal view is enabled true","title":"Environment Variables"},{"location":"setup/setup-environment-variables/#environment-variables","text":"","title":"Environment Variables"},{"location":"setup/setup-environment-variables/#configuration-types","text":"Based on the layer in which they are defined, configuration variables have the following priorities: Runtime Environment Deployment Module Highest precedence: No rebuild or restart of the application is required when configuration is changed. The Configuration API can be used to apply changes in the Runtime configuration. Second precedence: No rebuild is required when configuration is changed; however, the application must be restarted to apply the environment changes. Usually, the Environment configurations are provided during application deployment, as part of the application descriptor (e.g. Define environment variable for container in Kubernetes or in Cloud Foundry App Manifest ) . Third precedence: Rebuild and re-deployment is required. \"Default\" deployment ( ROOT.war ) configuration variables are taken from the dirigible.properties properties file (a sample can be found here ) . Lowest precedence: Rebuild and re-deployment is required. \"Default\" module (e.g. dirigible-database-custom.jar , dirigible-database-h2.jar ) configuration variables are taken from dirigible-xxx.properties properties files (samples can be found here and here ) Note The precedence order means that, if there is an Environment variable with name DIRIGIBLE_TEST and a Runtime variable with the same name, the Runtime variable will have higher priority and will be applied.
All applied configuration values could be found under the Configurations View .","title":"Configuration Types"},{"location":"setup/setup-environment-variables/#configuration-parameters","text":"","title":"Configuration Parameters"},{"location":"setup/setup-environment-variables/#branding","text":"Parameter Description Default* DIRIGIBLE_BRANDING_NAME The brand name Eclipse Dirigible DIRIGIBLE_BRANDING_BRAND The branding name Eclipse Dirigible DIRIGIBLE_BRANDING_BRAND_URL The branding URL https://www.dirigible.io DIRIGIBLE_BRANDING_ICON The branding icon ../../../../services/v4/web/resources/images/favicon.png DIRIGIBLE_BRANDING_WELCOME_PAGE_DEFAULT The branding welcome page ../../../../services/v4/web/ide/welcome.html DIRIGIBLE_BRANDING_HELP_ITEMS The list of the custom help menu items (comma separated) -","title":"Branding"},{"location":"setup/setup-environment-variables/#branding-help-items","text":"Note Replace CUSTOM_ITEM with the actual name set by DIRIGIBLE_BRANDING_HELP_ITEMS e.g. 
ITEM1 Parameter Description Default* DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_NAME The name of the custom help item - DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_URL The url of the custom help item - DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_ORDER (Optional) The order of the custom help item 0 DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_DIVIDER (Optional) Whether to set divider after the custom help item false","title":"Branding - Help Items"},{"location":"setup/setup-environment-variables/#server","text":"Parameter Description Default* DIRIGIBLE_SERVER_PORT The port that Eclipse Dirigible will start on 8080","title":"Server"},{"location":"setup/setup-environment-variables/#basic","text":"Parameter Description Default* DIRIGIBLE_BASIC_ENABLED Whether the Basic authentication is enabled true DIRIGIBLE_BASIC_USERNAME Base64 encoded property, which will be used as user name for basic authentication admin DIRIGIBLE_BASIC_PASSWORD Base64 encoded property, which will be used as password for basic authentication admin","title":"Basic"},{"location":"setup/setup-environment-variables/#oauth","text":"Parameter Description Default* DIRIGIBLE_OAUTH_ENABLED Whether the OAuth authentication is enabled false DIRIGIBLE_OAUTH_AUTHORIZE_URL The OAuth authorization URL (e.g. https://my-oauth-server/oauth/authorize ) - DIRIGIBLE_OAUTH_TOKEN_URL The OAuth token URL (e.g. https://my-oauth-server/oauth/token ) - DIRIGIBLE_OAUTH_TOKEN_REQUEST_METHOD The OAuth token request method ( GET or POST ) GET DIRIGIBLE_OAUTH_CLIENT_ID The OAuth clientid (e.g. sb-xxx-yyy ) - DIRIGIBLE_OAUTH_CLIENT_SECRET The OAuth clientsecret (e.g. PID/cpkD8aZzbGaa6+muYYOOMWPDeM1ug/sQ5ZF... ) - DIRIGIBLE_OAUTH_APPLICATION_HOST The application host (e.g. https://my-application-host ) - DIRIGIBLE_OAUTH_ISSUER The OAuth issuer (e.g. http://xxx.localhost:8080/uaa/oauth/token ) - DIRIGIBLE_OAUTH_VERIFICATION_KEY The OAuth verificationkey (e.g. -----BEGIN PUBLIC KEY-----MIIBIjANBgkqhki... 
) - DIRIGIBLE_OAUTH_VERIFICATION_KEY_EXPONENT The OAuth verificationkey exponent (e.g. AQAB ) - DIRIGIBLE_OAUTH_CHECK_ISSUER_ENABLED Sets whether the JWT verifier should check the token issuer true DIRIGIBLE_OAUTH_CHECK_AUDIENCE_ENABLED Sets whether the JWT verifier should check the token aud true DIRIGIBLE_OAUTH_APPLICATION_NAME The application name (e.g. dirigible-xxx ) - Redirect/Callback URL Configure the Redirect/Callback URL in the OAuth client to: /services/v4/oauth/callback","title":"OAuth"},{"location":"setup/setup-environment-variables/#keycloak","text":"Parameter Description Default* DIRIGIBLE_KEYCLOAK_ENABLED Sets whether the Keycloak Authentication is enabled* false DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL The Keycloak Authentication Server URL (e.g. https://keycloak-server/auth/ ) - DIRIGIBLE_KEYCLOAK_REALM The Keycloak realm (e.g. my-realm ) - DIRIGIBLE_KEYCLOAK_SSL_REQUIRED The Keycloak SSL Required (e.g. none / external ) - DIRIGIBLE_KEYCLOAK_CLIENT_ID The Keycloak Client ID (e.g. my-client ) - DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT The Keycloak Confidential Port (e.g. 443 ) - SERVER_MAXHTTPHEADERSIZE The HTTP header max size (e.g. 48000 ) Default for the underlying server (e.g. Tomcat) Note In addition to setting the DIRIGIBLE_KEYCLOAK_ENABLED property to true , the DIRIGIBLE_BASIC_ENABLED property should be set to false in order to enable the Keycloak integration.
To find more details about the Keycloak configuration go to Keycloak Java Adapter Configuration .","title":"Keycloak"},{"location":"setup/setup-environment-variables/#git","text":"Parameter Description Default* DIRIGIBLE_GIT_ROOT_FOLDER The external folder that will be used for synchronizing git projects -","title":"Git"},{"location":"setup/setup-environment-variables/#registry","text":"Parameter Description Default* DIRIGIBLE_REGISTRY_EXTERNAL_FOLDER The external folder that will be used for synchronizing the public registry - DIRIGIBLE_REGISTRY_IMPORT_WORKSPACE The external folder that will be imported into the public registry -","title":"Registry"},{"location":"setup/setup-environment-variables/#repository","text":"Parameter Description Default* DIRIGIBLE_REPOSITORY_PROVIDER The name of the repository provider used in this instance local or database DIRIGIBLE_REPOSITORY_CACHE_ENABLED Enable the usage of the repository cache true","title":"Repository"},{"location":"setup/setup-environment-variables/#local-repository","text":"Parameter Description Default* DIRIGIBLE_REPOSITORY_LOCAL_ROOT_FOLDER The location of the root folder where the repository artifacts will be stored . DIRIGIBLE_REPOSITORY_LOCAL_ROOT_FOLDER_IS_ABSOLUTE Whether the location of the root folder is absolute or context dependent false","title":"Local Repository"},{"location":"setup/setup-environment-variables/#master-repository","text":"Parameter Description Default* DIRIGIBLE_MASTER_REPOSITORY_PROVIDER The name of the master repository provider used in this instance ( filesystem , zip or jar ) - DIRIGIBLE_MASTER_REPOSITORY_ROOT_FOLDER The location of the root folder where the master repository artifacts will be loaded from . DIRIGIBLE_MASTER_REPOSITORY_ZIP_LOCATION The location of the zip file where the master repository artifacts will be loaded from (e.g. 
/User/data/my-repo.zip ) - DIRIGIBLE_MASTER_REPOSITORY_JAR_PATH The JAR path location of the zip file where the master repository artifacts will be loaded from (e.g. /org/dirigible/example/my-repo.zip ) - Note The JAR path is absolute inside the class path","title":"Master Repository"},{"location":"setup/setup-environment-variables/#repository-search","text":"Parameter Description Default* DIRIGIBLE_REPOSITORY_SEARCH_ROOT_FOLDER The location of the root folder to be used by the indexing engine . DIRIGIBLE_REPOSITORY_SEARCH_ROOT_FOLDER_IS_ABSOLUTE Whether the location of the root folder is absolute or context dependent false DIRIGIBLE_REPOSITORY_SEARCH_INDEX_LOCATION The sub-folder under the root folder where the index files will be stored dirigible/repository/index","title":"Repository Search"},{"location":"setup/setup-environment-variables/#repository-versioning","text":"Parameter Description Default* DIRIGIBLE_REPOSITORY_VERSIONING_ENABLED The flag whether versioning for repository is enabled false","title":"Repository Versioning"},{"location":"setup/setup-environment-variables/#database","text":"","title":"Database"},{"location":"setup/setup-environment-variables/#common-parameters","text":"Parameter Description Default* DIRIGIBLE_DATABASE_PROVIDER The name of the database provider which will be used in this instance ( local , managed or custom ) local DIRIGIBLE_DATABASE_DEFAULT_SET_AUTO_COMMIT The AUTO_COMMIT data source parameter true DIRIGIBLE_DATABASE_DEFAULT_MAX_CONNECTIONS_COUNT The MAX_CONNECTIONS_COUNT data source parameter 8 DIRIGIBLE_DATABASE_DEFAULT_WAIT_TIMEOUT The WAIT_TIMEOUT data source parameter 500 DIRIGIBLE_DATABASE_DEFAULT_WAIT_COUNT The WAIT_COUNT data source parameter 5 DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT The name of the primary data source used in this instance DefaultDB DIRIGIBLE_DATABASE_DATASOURCE_NAME_SYSTEM The name of the system data source used in this instance SystemDB DIRIGIBLE_DATABASE_NAMES_CASE_SENSITIVE The names of the 
tables, views and columns to be considered as case sensitive false DIRIGIBLE_DATABASE_TRANSFER_BATCH_SIZE The batch size used during the data transfer 1000 DIRIGIBLE_DATABASE_DEFAULT_QUERY_LIMIT The batch size used when querying data from the database 1000 DIRIGIBLE_DATABASE_SYSTEM_DRIVER The driver used for the SystemDB database connection org.h2.Driver DIRIGIBLE_DATABASE_SYSTEM_URL The JDBC url used for the SystemDB database connection jdbc:h2:file:./target/dirigible/h2/SystemDB DIRIGIBLE_DATABASE_SYSTEM_USERNAME The username used for the SystemDB database connection sa DIRIGIBLE_DATABASE_SYSTEM_PASSWORD The password used for the SystemDB database connection (empty)","title":"Common Parameters"},{"location":"setup/setup-environment-variables/#custom-database","text":"Parameter Description Default* DIRIGIBLE_DATABASE_PROVIDER The name of the database provider which will be used in this instance to be set to custom local DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES The list of the custom data source names used in this instance e.g. DS1,DS2 `` DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT The name of the primary data source used in this instance e.g.
DS1 DefaultDB DS1_DRIVER The JDBC driver used for the exemplary DS1 database connection `` DS1_URL The JDBC url used for the exemplary DS1 database connection `` DS1_SCHEMA The default schema used for the exemplary DS1 database connection `` DS1_USERNAME The username used for the exemplary DS1 database connection `` DS1_PASSWORD The password used for the exemplary DS1 database connection ``","title":"Custom Database"},{"location":"setup/setup-environment-variables/#database-h2","text":"Parameter Description Default* DIRIGIBLE_DATABASE_H2_ROOT_FOLDER_DEFAULT The location used by H2 database ./target/dirigible/h2 DIRIGIBLE_DATABASE_H2_DRIVER The Driver used by H2 database org.h2.Driver DIRIGIBLE_DATABASE_H2_URL The URL used by H2 database jdbc:h2:./target/dirigible/h2 DIRIGIBLE_DATABASE_H2_USERNAME The Username used by H2 database sa DIRIGIBLE_DATABASE_H2_PASSWORD The Password used by H2 database -","title":"Database H2"},{"location":"setup/setup-environment-variables/#database-snowflake","text":"Parameter Description Default* SNOWFLAKE_DATABASE The database used by Snowflake - SNOWFLAKE_SCHEMA The schema used by Snowflake - SNOWFLAKE_WAREHOUSE The warehouse used by Snowflake - SNOWFLAKE_DEFAULT_TABLE_TYPE Default table type for create table statements HYBRID","title":"Database Snowflake"},{"location":"setup/setup-environment-variables/#persistence","text":"Parameter Description Default* DIRIGIBLE_PERSISTENCE_CREATE_TABLE_ON_USE Whether the table is created automatically on use if it does not exist true","title":"Persistence"},{"location":"setup/setup-environment-variables/#mongodb","text":"Parameter Description Default* DIRIGIBLE_MONGODB_CLIENT_URI The location used by MongoDB server mongodb://localhost:27017 DIRIGIBLE_MONGODB_DATABASE_DEFAULT The default database name db","title":"MongoDB"},{"location":"setup/setup-environment-variables/#lifecycle","text":"Parameter Description Default* DIRIGIBLE_PUBLISH_DISABLED Disable publishing process in this instance
false","title":"Lifecycle"},{"location":"setup/setup-environment-variables/#scheduler","text":"Parameter Description Default* DIRIGIBLE_SCHEDULER_MEMORY_STORE Whether Quartz to use in-memory job store false DIRIGIBLE_SCHEDULER_DATABASE_DATASOURCE_TYPE The type of the custom data-source used by Quartz, if not the default one - DIRIGIBLE_SCHEDULER_DATABASE_DATASOURCE_NAME The name of the custom data-source used by Quartz, if not the default one - DIRIGIBLE_SCHEDULER_LOGS_RETANTION_PERIOD The period the logs of the job execution will be kept (the default is one week - 24x7) 168 DIRIGIBLE_SCHEDULER_EMAIL_SENDER The sender for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_RECIPIENTS The recipients list for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_SUBJECT_ERROR The error subject for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_SUBJECT_NORMAL The normal subject for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_TEMPLATE_ERROR The error template for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_TEMPLATE_NORMAL The normal template for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_URL_SCHEME The scheme part of the URL for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_URL_HOST The host part of the URL for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_URL_PORT The port part of the URL for the e-mail notifications - DIRIGIBLE_SCHEDULER_DATABASE_DELEGATE The name of the JDBC delegate used by Quartz, if not the default one org.quartz.impl.jdbcjobstore.StdJDBCDelegate Note Quartz JDBC delegates: org.quartz.impl.jdbcjobstore.StdJDBCDelegate (for fully JDBC-compliant drivers) org.quartz.impl.jdbcjobstore.MSSQLDelegate (for Microsoft SQL Server, and Sybase) org.quartz.impl.jdbcjobstore.PostgreSQLDelegate org.quartz.impl.jdbcjobstore.WebLogicDelegate (for WebLogic drivers) org.quartz.impl.jdbcjobstore.oracle.OracleDelegate org.quartz.impl.jdbcjobstore.oracle.WebLogicOracleDelegate (for Oracle drivers used within Weblogic) 
org.quartz.impl.jdbcjobstore.oracle.weblogic.WebLogicOracleDelegate (for Oracle drivers used within Weblogic) org.quartz.impl.jdbcjobstore.CloudscapeDelegate org.quartz.impl.jdbcjobstore.DB2v6Delegate org.quartz.impl.jdbcjobstore.DB2v7Delegate org.quartz.impl.jdbcjobstore.DB2v8Delegate org.quartz.impl.jdbcjobstore.HSQLDBDelegate org.quartz.impl.jdbcjobstore.PointbaseDelegate org.quartz.impl.jdbcjobstore.SybaseDelegate","title":"Scheduler"},{"location":"setup/setup-environment-variables/#synchronizer","text":"Parameter Description Default* DIRIGIBLE_SYNCHRONIZER_IGNORE_DEPENDENCIES Whether to ignore dependencies for synchronizers, e.g. for test purposes false DIRIGIBLE_SYNCHRONIZER_EXCLUDE_PATHS Paths to be excluded from processing (comma-separated list) `` DIRIGIBLE_SYNCHRONIZER_CROSS_RETRY_COUNT Cross-dependencies processing count 10 DIRIGIBLE_SYNCHRONIZER_CROSS_RETRY_INTERVAL Cross-dependencies processing interval 10000","title":"Synchronizer"},{"location":"setup/setup-environment-variables/#job-expression","text":"Parameter Description Default* DIRIGIBLE_JOB_EXPRESSION_BPM BPM synchronizer job config 0/50 * * * * ? DIRIGIBLE_JOB_EXPRESSION_DATA_STRUCTURES Data structures synchronizer job config 0/25 * * * * ? DIRIGIBLE_JOB_EXPRESSION_EXTENSIONS Extension synchronizer job config 0/10 * * * * ? DIRIGIBLE_JOB_EXPRESSION_JOBS Jobs synchronizer job config 0/15 * * * * ? DIRIGIBLE_JOB_EXPRESSION_MESSAGING Messaging synchronizer job config 0/25 * * * * ? DIRIGIBLE_JOB_EXPRESSION_MIGRATIONS Migration synchronizer job config 0/55 * * * * ? DIRIGIBLE_JOB_EXPRESSION_ODATA OData synchronizer job config 0/45 * * * * ? DIRIGIBLE_JOB_EXPRESSION_PUBLISHER Publisher synchronizer job config 0/5 * * * * ? DIRIGIBLE_JOB_EXPRESSION_SECURITY Security synchronizer job config 0/10 * * * * ? DIRIGIBLE_JOB_EXPRESSION_REGISTRY Registry synchronizer job config 0/35 * * * * ?
DIRIGIBLE_JOB_DEFAULT_TIMEOUT Default timeout in minutes 3","title":"Job Expression"},{"location":"setup/setup-environment-variables/#runtime-core","text":"Parameter Description Default* DIRIGIBLE_HOME_URL The home URL to which the user is redirected on access /services/v4/web/ide/index.html","title":"Runtime Core"},{"location":"setup/setup-environment-variables/#vertx","text":"Parameter Description Default* DIRIGIBLE_VERTX_PORT The Vert.x server port, if used 8888","title":"Vert.x"},{"location":"setup/setup-environment-variables/#csv","text":"Parameter Description Default* DIRIGIBLE_CSV_DATA_MAX_COMPARE_SIZE The maximum number of CSV records for which a comparison with the existing table data will be performed 1000 DIRIGIBLE_CSV_DATA_BATCH_SIZE The number of CSV records to be included in a batch operation 100","title":"CSV"},{"location":"setup/setup-environment-variables/#cms","text":"Parameter Description Default* DIRIGIBLE_CMS_PROVIDER The type of the CMS provider used in this instance (e.g. cms-provider-internal , cms-provider-s3 , managed or database ) internal DIRIGIBLE_CMS_ROLES_ENABLED Whether RBAC over the CMS content is enabled true","title":"CMS"},{"location":"setup/setup-environment-variables/#cms-internal","text":"Parameter Description Default* DIRIGIBLE_CMS_INTERNAL_ROOT_FOLDER The location of the CMS internal repository target DIRIGIBLE_CMS_INTERNAL_ROOT_FOLDER_IS_ABSOLUTE Whether the root folder parameter is absolute or not false DIRIGIBLE_CMS_INTERNAL_VERSIONING_ENABLED Whether the versioning of the files is enabled or not false","title":"CMS - Internal"},{"location":"setup/setup-environment-variables/#cms-s3","text":"Parameter Description Default* AWS_ACCESS_KEY_ID The AWS access key used for authentication target AWS_SECRET_ACCESS_KEY The AWS secret key used for authentication target AWS_DEFAULT_REGION The region where the bucket is stored eu-central-1 DIRIGIBLE_S3_BUCKET The bucket to be used for content management.
Will be created if the provided one does not exist target DIRIGIBLE_S3_PROVIDER The provider to be used for S3. For local testing an option with localstack is available aws","title":"CMS - S3"},{"location":"setup/setup-environment-variables/#cms-managed","text":"Parameter Description Default* DIRIGIBLE_CMS_MANAGED_CONFIGURATION_JNDI_NAME The JNDI name of the managed CMS repository java:comp/env/EcmService in case of SAP package DIRIGIBLE_CMS_MANAGED_CONFIGURATION_AUTH_METHOD The authentication method (e.g. key or destination ) key DIRIGIBLE_CMS_MANAGED_CONFIGURATION_NAME The name of the repository cmis:dirigible DIRIGIBLE_CMS_MANAGED_CONFIGURATION_KEY The key of the repository cmis:dirigible:key DIRIGIBLE_CMS_MANAGED_CONFIGURATION_DESTINATION The name of the destination where the name and the key for the repository are stored (e.g. CMIS_DESTINATION ) - DIRIGIBLE_CONNECTIVITY_CONFIGURATION_JNDI_NAME The JNDI name of the connectivity configuration service java:comp/env/connectivity/Configuration in case of SAP package","title":"CMS - Managed"},{"location":"setup/setup-environment-variables/#cms-database","text":"Parameter Description Default* DIRIGIBLE_CMS_DATABASE_DATASOURCE_TYPE Type of the database for CMS repository (e.g. local , managed , custom , dynamic ) managed DIRIGIBLE_CMS_DATABASE_DATASOURCE_NAME The datasource name DefaultDB","title":"CMS Database"},{"location":"setup/setup-environment-variables/#bpm","text":"Parameter Description Default* DIRIGIBLE_BPM_PROVIDER The provider of the BPM engine (e.g. internal , managed , remote ) internal","title":"BPM"},{"location":"setup/setup-environment-variables/#bpm-flowable","text":"Parameter Description Default* DIRIGIBLE_FLOWABLE_DATABASE_DRIVER The driver of the Flowable engine (e.g. org.postgresql.Driver ) - DIRIGIBLE_FLOWABLE_DATABASE_URL The URL of the Flowable engine (e.g.
jdbc:postgresql://localhost:5432/ ) - DIRIGIBLE_FLOWABLE_DATABASE_USER The user of the Flowable engine - DIRIGIBLE_FLOWABLE_DATABASE_PASSWORD The password of the Flowable engine - DIRIGIBLE_FLOWABLE_DATABASE_DATASOURCE_NAME The datasource name of the Flowable engine, if any configured - DIRIGIBLE_FLOWABLE_DATABASE_SCHEMA_UPDATE Whether to materialize the database layout or not true DIRIGIBLE_FLOWABLE_USE_DEFAULT_DATABASE Whether to use the DefaultDB datasource or built-in H2 (e.g. true (DefaultDB) or false (H2)) true","title":"BPM - Flowable"},{"location":"setup/setup-environment-variables/#mail","text":"Parameter Description Default* DIRIGIBLE_MAIL_USERNAME Mailbox username - DIRIGIBLE_MAIL_PASSWORD Mailbox password - DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL Mail transport protocol smtps DIRIGIBLE_MAIL_SMTPS_HOST Mailbox SMTPS host - DIRIGIBLE_MAIL_SMTPS_PORT Mailbox SMTPS port - DIRIGIBLE_MAIL_SMTPS_AUTH Enable/disable mailbox SMTPS authentication - DIRIGIBLE_MAIL_SMTP_HOST Mailbox SMTP host - DIRIGIBLE_MAIL_SMTP_PORT Mailbox SMTP port - DIRIGIBLE_MAIL_SMTP_AUTH Enable/disable mailbox SMTP authentication -","title":"Mail"},{"location":"setup/setup-environment-variables/#messaging","text":"Parameter Description Default* DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE Whether to use the DefaultDB datasource or built-in KahaDB (e.g.
true (DefaultDB) or false (KahaDB)) true","title":"Messaging"},{"location":"setup/setup-environment-variables/#kafka","text":"Parameter Description Default* DIRIGIBLE_KAFKA_BOOTSTRAP_SERVER The Kafka server location localhost:9092 DIRIGIBLE_KAFKA_ACKS The number of brokers that must receive the record before considering the write as successful all DIRIGIBLE_KAFKA_KEY_SERIALIZER The Key serializer org.apache.kafka.common.serialization.StringSerializer DIRIGIBLE_KAFKA_VALUE_SERIALIZER The Value serializer org.apache.kafka.common.serialization.StringSerializer DIRIGIBLE_KAFKA_AUTOCOMMIT_ENABLED Whether Auto Commit is enabled true DIRIGIBLE_KAFKA_AUTOCOMMIT_INTERVAL Auto Commit interval in ms 1000","title":"Kafka"},{"location":"setup/setup-environment-variables/#engines","text":"","title":"Engines"},{"location":"setup/setup-environment-variables/#javascript","text":"Parameter Description Default* DIRIGIBLE_JAVASCRIPT_ENGINE_TYPE_DEFAULT The type of the JavaScript engine provider used in this instance (e.g. 
graalvm , rhino , nashorn or v8 ) graalvm since 5.0","title":"JavaScript"},{"location":"setup/setup-environment-variables/#graalvm","text":"Parameter Description Default* DIRIGIBLE_GRAALIUM_ENABLE_DEBUG Whether the debug mode is enabled false DIRIGIBLE_JAVASCRIPT_GRAALVM_DEBUGGER_PORT The GraalVM debugger port 8081 and 0.0.0.0:8081 in Docker environment DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_HOST_ACCESS Whether GraalVM can load classes from custom packages true DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_CREATE_THREAD Whether GraalVM can create threads true DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_CREATE_PROCESS Whether GraalVM can create processes true DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_IO Whether GraalVM can make IO operations true DIRIGIBLE_JAVASCRIPT_GRAALVM_COMPATIBILITY_MODE_NASHORN Whether GraalVM has enabled compatibility mode for Nashorn true DIRIGIBLE_JAVASCRIPT_GRAALVM_COMPATIBILITY_MODE_MOZILLA Whether GraalVM has enabled compatibility mode for Mozilla false","title":"GraalVM"},{"location":"setup/setup-environment-variables/#typescript","text":"Parameter Description Default* DIRIGIBLE_PROJECT_TYPESCRIPT Whether the project is TypeScript enabled true","title":"TypeScript"},{"location":"setup/setup-environment-variables/#odata","text":"Parameter Description Default* DIRIGIBLE_ODATA_HANDLER_EXECUTOR_TYPE The type of the JavaScript engine to be used for event handlers in OData DIRIGIBLE_ODATA_HANDLER_EXECUTOR_ON_EVENT The location of the wrapper helper to be used for event handlers in OData","title":"OData"},{"location":"setup/setup-environment-variables/#ftp","text":"Parameter Description Default* DIRIGIBLE_FTP_USERNAME The FTP server username admin DIRIGIBLE_FTP_PASSWORD The FTP server password admin DIRIGIBLE_FTP_PORT The FTP server port 8022","title":"FTP"},{"location":"setup/setup-environment-variables/#sftp","text":"Parameter Description Default* DIRIGIBLE_SFTP_USERNAME The SFTP server username admin DIRIGIBLE_SFTP_PASSWORD The SFTP server password admin
DIRIGIBLE_SFTP_PORT The SFTP server port 8022","title":"SFTP"},{"location":"setup/setup-environment-variables/#operations","text":"","title":"Operations"},{"location":"setup/setup-environment-variables/#logs","text":"Parameter Description Default* DIRIGIBLE_OPERATIONS_LOGS_ROOT_FOLDER_DEFAULT The folder where the log files are stored ../logs DIRIGIBLE_EXEC_COMMAND_LOGGING_ENABLED Whether to log the command executed by the exec API false","title":"Logs"},{"location":"setup/setup-environment-variables/#look-feel","text":"","title":"Look & Feel"},{"location":"setup/setup-environment-variables/#theme","text":"Parameter Description Default* DIRIGIBLE_THEME_DEFAULT The name of the default theme Default","title":"Theme"},{"location":"setup/setup-environment-variables/#terminal","text":"Parameter Description Default* DIRIGIBLE_TERMINAL_ENABLED Whether the Terminal view is enabled true","title":"Terminal"},{"location":"setup/kubernetes/","text":"Setup in Kubernetes You can deploy Eclipse Dirigible Docker images, for example dirigiblelabs/dirigible , in a Kubernetes cluster. Prerequisites Install kubectl . Access to a Kubernetes cluster on an IaaS provider of your choice. Steps Tip This guide describes the generic steps on how to deploy Eclipse Dirigible in a Kubernetes cluster. For more detailed deployment guides go to: Setup in Google Kubernetes Engine . Setup in Azure Kubernetes Service . Setup in Red Hat OpenShift . Setup in SAP BTP Kyma . For additional deployment guides go to: Keycloak Setup . PostgreSQL Setup . GCP DNS Zone Setup . AKS DNS Zone Setup .
Create deployment configuration file: deployment.yaml Pod Deployment Deployment with PVC apiVersion : v1 kind : Pod metadata : name : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always ports : - name : http containerPort : 8080 apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here .
Create service configuration file: service.yaml Service NodePort LoadBalancer Ingress apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : NodePort selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 80 targetPort : 8080 - name : https port : 443 targetPort : 8080 type : LoadBalancer selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible --- apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : dirigible. http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note Replace with your Ingress host. Deploy to the Kubernetes Cluster with: kubectl apply -f deployment.yml kubectl apply -f service.yml Open a web browser and go to: http://dirigible. Note Replace with your Ingress host. Login with user dirigible and password dirigible , which are set by default in the Docker image ( dirigiblelabs/dirigible ) used above. Maintenance Version Update To update the Eclipse Dirigible version either use the kubectl or update the Deployment YAML as follows: with kubectl with Deployment YAML kubectl set image deployment/dirigible dirigible=dirigiblelabs/dirigible: spec : containers : - name : dirigible image : dirigiblelabs/dirigible: imagePullPolicy : Always Eclipse Dirigible versions Update the placeholder with a stable release version: You can find all released versions here . You can find all Eclipse Dirigible Docker images and tags (versions) here . 
Scaling The Eclipse Dirigible Deployment could be scaled horizontally by adding/removing Pods as follows: Scale to Zero Scale Up kubectl scale deployment/dirigible --replicas=0 kubectl scale deployment/dirigible --replicas= Note To learn more about application scaling in Kubernetes, see Horizontal Pod Autoscaling . Debugging To debug the Eclipse Dirigible engine via Remote Java Debugging execute the following commands: Scale the deployment to zero: kubectl scale deployment/dirigible --replicas=0 Set debug environment variables: kubectl set env deployment/dirigible -e JPDA_ADDRESS=0.0.0.0:8000 kubectl set env deployment/dirigible -e JPDA_TRANSPORT=dt_socket Edit the deployment and add command and args : kubectl edit deployment dirigible containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always command : [ \"/bin/sh\" ] args : [ \"/usr/local/tomcat/bin/catalina.sh\" , \"jpda\" , \"run\" ] Scale up the deployment: kubectl scale deployment/dirigible --replicas=1 Forward the debug port: kubectl port-forward deployment/dirigible 8000:8000 Clean-up To clean-up the environment after the debugging is done: Stop the port forwarding. Scale the deployment to zero. Remove the debug environment variables: kubectl set env deployment/dirigible -e JPDA_ADDRESS- kubectl set env deployment/dirigible -e JPDA_TRANSPORT- Edit the deployment and remove command and args . Scale up the deployment.","title":"Kubernetes"},{"location":"setup/kubernetes/#setup-in-kubernetes","text":"You can deploy Eclipse Dirigible Docker images, for example dirigiblelabs/dirigible , in a Kubernetes cluster. Prerequisites Install kubectl . Access to Kubernetes Cluster on IaaS provider of your choice.","title":"Setup in Kubernetes"},{"location":"setup/kubernetes/#steps","text":"Tip This guide describes the generic steps on how to deploy Eclipse Dirigible in a Kubernetes cluster. For more detailed deployment guides go to: Setup in Google Kubernetes Engine . 
Setup in Azure Kubernetes Service . Setup in Red Hat OpenShift . Setup in SAP BTP Kyma . For additional deployment guides go to: Keycloak Setup . PostgreSQL Setup . GCP DNS Zone Setup . AKS DNS Zone Setup . Create deployment configuration file: deployment.yaml Pod Deployment Deployment with PVC apiVersion : v1 kind : Pod metadata : name : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always ports : - name : http containerPort : 8080 apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be
found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . Create service configuration file: service.yaml Service NodePort LoadBalancer Ingress apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : NodePort selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 80 targetPort : 8080 - name : https port : 443 targetPort : 8080 type : LoadBalancer selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible --- apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : dirigible. http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note Replace with your Ingress host. Deploy to the Kubernetes Cluster with: kubectl apply -f deployment.yml kubectl apply -f service.yml Open a web browser and go to: http://dirigible. Note Replace with your Ingress host. 
Login with user dirigible and password dirigible , which are set by default in the Docker image ( dirigiblelabs/dirigible ) used above.","title":"Steps"},{"location":"setup/kubernetes/#maintenance","text":"","title":"Maintenance"},{"location":"setup/kubernetes/#version-update","text":"To update the Eclipse Dirigible version either use the kubectl or update the Deployment YAML as follows: with kubectl with Deployment YAML kubectl set image deployment/dirigible dirigible=dirigiblelabs/dirigible: spec : containers : - name : dirigible image : dirigiblelabs/dirigible: imagePullPolicy : Always Eclipse Dirigible versions Update the placeholder with a stable release version: You can find all released versions here . You can find all Eclipse Dirigible Docker images and tags (versions) here .","title":"Version Update"},{"location":"setup/kubernetes/#scaling","text":"The Eclipse Dirigible Deployment could be scaled horizontally by adding/removing Pods as follows: Scale to Zero Scale Up kubectl scale deployment/dirigible --replicas=0 kubectl scale deployment/dirigible --replicas= Note To learn more about application scaling in Kubernetes, see Horizontal Pod Autoscaling .","title":"Scaling"},{"location":"setup/kubernetes/#debugging","text":"To debug the Eclipse Dirigible engine via Remote Java Debugging execute the following commands: Scale the deployment to zero: kubectl scale deployment/dirigible --replicas=0 Set debug environment variables: kubectl set env deployment/dirigible -e JPDA_ADDRESS=0.0.0.0:8000 kubectl set env deployment/dirigible -e JPDA_TRANSPORT=dt_socket Edit the deployment and add command and args : kubectl edit deployment dirigible containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always command : [ \"/bin/sh\" ] args : [ \"/usr/local/tomcat/bin/catalina.sh\" , \"jpda\" , \"run\" ] Scale up the deployment: kubectl scale deployment/dirigible --replicas=1 Forward the debug port: kubectl port-forward deployment/dirigible 
8000:8000 Clean-up To clean up the environment after the debugging is done: Stop the port forwarding. Scale the deployment to zero. Remove the debug environment variables: kubectl set env deployment/dirigible -e JPDA_ADDRESS- kubectl set env deployment/dirigible -e JPDA_TRANSPORT- Edit the deployment and remove command and args . Scale up the deployment.","title":"Debugging"},{"location":"setup/kubernetes/azure-kubernetes-service/","text":"Setup in Azure Kubernetes Services Deploy Eclipse Dirigible in Azure Kubernetes Services (AKS) environment. Prerequisites Install kubectl . Install Azure CLI . Note Configure Azure DNS Zone Setup letsencrypt certificate for your domain. Steps Access the Azure Kubernetes Services (AKS) environment via the Azure CLI: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC Deployment with Keycloak apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim :
claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-keycloak:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" - name : DIRIGIBLE_KEYCLOAK_ENABLED value : \"true\" - name : DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL value : - name : DIRIGIBLE_KEYCLOAK_REALM value : - name : DIRIGIBLE_KEYCLOAK_SSL_REQUIRED value : external - name : DIRIGIBLE_KEYCLOAK_CLIENT_ID value : - name : DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT value : \"443\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak Auth server (e.g. https://keycloak-server/auth/ ) . with your Keycloak Realm (e.g. my-realm ) . with your Keycloak Client Id (e.g. my-client ) . Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here .
Create service configuration file: service.yaml Service Ingress Custom Domain apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : NodePort selector : app : dirigible --- apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : dirigible http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note You can find more information on this page: Azure DNS Zone Setup . apiVersion : cert-manager.io/v1 kind : Certificate metadata : name : dirigible spec : secretName : dirigible issuerRef : name : letsencrypt kind : ClusterIssuer commonName : \"dirigible.\" dnsNames : - \"dirigible.\" --- apiVersion : networking.istio.io/v1beta1 kind : Gateway metadata : name : dirigible-gateway spec : selector : istio : ingressgateway servers : - hosts : - dirigible. port : name : http number : 80 protocol : HTTP # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true tls : httpsRedirect : false - hosts : - dirigible. port : name : https-443 number : 443 protocol : HTTPS tls : credentialName : dirigible mode : SIMPLE --- apiVersion : networking.istio.io/v1alpha3 kind : VirtualService metadata : name : dirigible spec : hosts : - dirigible.default.svc.cluster.local - dirigible. gateways : - dirigible-gateway - mesh http : - match : - uri : prefix : / route : - destination : port : number : 8080 host : dirigible.default.svc.cluster.local Replace Placeholders Before deploying, replace the following placeholders: with your Cloud DNS Zone name (e.g. my-zone ) . with your Istio Ingress Gateway IP (e.g. 32.118.56.186 ) . with your custom domain (e.g. my-company.com ) . 
To enforce HTTPS, after the initial deployment, update the following fragment: # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true Deploy to the Azure Cluster with: kubectl apply -f deployment.yml kubectl apply -f service.yml Open a web browser and go to: https://dirigible.","title":"Azure Kubernetes Service"},{"location":"setup/kubernetes/azure-kubernetes-service/#setup-in-azure-kubernetes-services","text":"Deploy Eclipse Dirigible in Azure Kubernetes Services (AKS) environment. Prerequisites Install kubectl . Install Azure CLI . Note Configure Azure DNS Zone Setup letsencrypt certificate for your domain.","title":"Setup in Azure Kubernetes Services"},{"location":"setup/kubernetes/azure-kubernetes-service/#steps","text":"Access the Azure Kubernetes Services (AKS) environment via the Azure CLI: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC Deployment with Keycloak apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath :
/usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-keycloak:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" - name : DIRIGIBLE_KEYCLOAK_ENABLED value : \"true\" - name : DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL value : - name : DIRIGIBLE_KEYCLOAK_REALM value : - name : DIRIGIBLE_KEYCLOAK_SSL_REQUIRED value : external - name : DIRIGIBLE_KEYCLOAK_CLIENT_ID value : - name : DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT value : \"443\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak Auth server (e.g. https://keycloak-server/auth/ ) . with your Keycloak Realm (e.g. my-realm ) . with your Keycloak Client Id (e.g. my-client ) . Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recomended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . 
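The tip above recommends pinning a released version instead of the latest tag; that is a one-line edit of deployment.yaml. A sketch — 10.2.7 is a hypothetical example version, pick a real tag from the releases page:

```shell
# Sketch: pin the Dirigible image to a released tag instead of :latest.
# 10.2.7 is a hypothetical example version.
VERSION=10.2.7
printf 'image: dirigiblelabs/dirigible-all:latest\n' > image-line.yaml
sed -i.bak "s|dirigible-all:latest|dirigible-all:$VERSION|" image-line.yaml
cat image-line.yaml
```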
Create service configuration file: service.yaml Service Ingress Custom Domain apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : NodePort selector : app : dirigible --- apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : dirigible http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note You can find more information on this page: Azure DNS Zone Setup . apiVersion : cert-manager.io/v1 kind : Certificate metadata : name : dirigible spec : secretName : dirigible issuerRef : name : letsencrypt kind : ClusterIssuer commonName : \"dirigible.\" dnsNames : - \"dirigible.\" --- apiVersion : networking.istio.io/v1beta1 kind : Gateway metadata : name : dirigible-gateway spec : selector : istio : ingressgateway servers : - hosts : - dirigible. port : name : http number : 80 protocol : HTTP # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true tls : httpsRedirect : false - hosts : - dirigible. port : name : https-443 number : 443 protocol : HTTPS tls : credentialName : dirigible mode : SIMPLE --- apiVersion : networking.istio.io/v1alpha3 kind : VirtualService metadata : name : dirigible spec : hosts : - dirigible.default.svc.cluster.local - dirigible. gateways : - dirigible-gateway - mesh http : - match : - uri : prefix : / route : - destination : port : number : 8080 host : dirigible.default.svc.cluster.local Replace Placeholders Before deploying, replace the following placeholders: with your Cloud DNS Zone name (e.g. my-zone ) . with your Istio Ingress Gateway IP (e.g. 32.118.56.186 ) . with your custom domain (e.g. my-company.com ) . 
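It is easy to miss one of the placeholders above before applying the manifests; a small guard that greps for leftover tokens can catch that. A sketch, assuming a {{...}} token style that is illustrative rather than mandated by these docs:

```shell
# Sketch: fail fast if any illustrative {{...}} placeholder token remains in a manifest.
check_placeholders() {
  if grep -nE '\{\{[A-Z_]+\}\}' "$1"; then
    echo "unresolved placeholders in $1" >&2
    return 1
  fi
  return 0
}
# Example run against a fully substituted line (my-company.com is an example value).
printf 'host: dirigible.my-company.com\n' > service-check.yaml
check_placeholders service-check.yaml && echo OK
```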
To enforce HTTPS, after the initial deployment, update the following fragment: # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true Deploy to the Azure Cluster with: kubectl apply -f deployment.yml kubectl apply -f service.yml Open a web browser and go to: https://dirigible.","title":"Steps"},{"location":"setup/kubernetes/google-kubernetes-engine/","text":"Setup in Google Kubernetes Engine Deploy Eclipse Dirigible in Google Kubernetes Engine (GKE) environment. Prerequisites Install kubectl . Access to Google Kubernetes Engine . Note Create GKE cluster . How to create Google DNS Zone How to setup Istio . How to create certificate for your domain . How to create GCP Cloud SQL instances Steps Access the Google Kubernetes Engine (GKE) environment via the Google Cloud Console: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC Deployment with Keycloak apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : 
/usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-keycloak:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" - name : DIRIGIBLE_KEYCLOAK_ENABLED value : \"true\" - name : DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL value : - name : DIRIGIBLE_KEYCLOAK_REALM value : - name : DIRIGIBLE_KEYCLOAK_SSL_REQUIRED value : external - name : DIRIGIBLE_KEYCLOAK_CLIENT_ID value : - name : DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT value : \"443\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak Auth server (e.g. https://keycloak-server/auth/ ) . with your Keycloak Realm (e.g. my-realm ) . with your Keycloak Client Id (e.g. my-client ) . Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recomended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . 
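For the Keycloak variant above, a quick shape-check of the auth-server value helps before deploying, since the documented example (https://keycloak-server/auth/) ends in /auth/. A sketch using that example URL:

```shell
# Sketch: warn if the Keycloak auth-server URL does not end in /auth/,
# matching the https://keycloak-server/auth/ example from the note.
AUTH_URL=https://keycloak-server/auth/
case "$AUTH_URL" in
  */auth/) echo "auth URL looks well-formed" ;;
  *) echo "warning: expected the URL to end with /auth/" >&2 ;;
esac
```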
Create service configuration file: service.yaml Service Ingress Custom Domain apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : NodePort selector : app : dirigible --- apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note You can find more information on this page: GCP DNS Zone Setup . Prerequisites Install Istio , if not already installed. Install cert-manager , if not already installed. Register your zone in Google Cloud Platform \u2192 Cloud DNS , if not already registered. Register DNS Record Set Get the Istio Ingress Gateway IP: kubectl get service -n istio-ingress istio-ingressgateway -o jsonpath=\"{.status.loadBalancer.ingress[0].ip}\" Register DNS Record Set: gcloud dns record-sets transaction start --zone= gcloud dns record-sets transaction add \\ --name=dirigible. \\ --ttl=300 \\ --type=A \\ --zone= gcloud dns record-sets transaction execute --zone= apiVersion : cert-manager.io/v1 kind : Certificate metadata : name : dirigible spec : secretName : dirigible issuerRef : name : letsencrypt kind : ClusterIssuer commonName : \"dirigible.\" dnsNames : - \"dirigible.\" --- apiVersion : networking.istio.io/v1beta1 kind : Gateway metadata : name : dirigible-gateway spec : selector : istio : ingressgateway servers : - hosts : - dirigible. port : name : http number : 80 protocol : HTTP # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true tls : httpsRedirect : false - hosts : - dirigible. 
port : name : https-443 number : 443 protocol : HTTPS tls : credentialName : dirigible mode : SIMPLE --- apiVersion : networking.istio.io/v1alpha3 kind : VirtualService metadata : name : dirigible spec : hosts : - dirigible.default.svc.cluster.local - dirigible. gateways : - dirigible-gateway - mesh http : - match : - uri : prefix : / route : - destination : port : number : 8080 host : dirigible.default.svc.cluster.local Replace Placeholders Before deploying, replace the following placeholders: with your Cloud DNS Zone name (e.g. my-zone ) . with your Istio Ingress Gateway IP (e.g. 32.118.56.186 ) . with your custom domain (e.g. my-company.com ) . To enforce HTTPS, after the initial deployment, update the following fragment: # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true Deploy to the Google Kubernetes Engine Cluster with: kubectl apply -f deployment.yml kubectl apply -f service.yml Open a web browser and go to: https://dirigible.","title":"Google Kubernetes Engine"},{"location":"setup/kubernetes/google-kubernetes-engine/#setup-in-google-kubernetes-engine","text":"Deploy Eclipse Dirigible in Google Kubernetes Engine (GKE) environment. Prerequisites Install kubectl . Access to Google Kubernetes Engine . Note Create GKE cluster . How to create Google DNS Zone How to setup Istio . How to create certificate for your domain . 
How to create GCP Cloud SQL instances","title":"Setup in Google Kubernetes Engine"},{"location":"setup/kubernetes/google-kubernetes-engine/#steps","text":"Access the Google Kubernetes Engine (GKE) environment via the Google Cloud Console: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC Deployment with Keycloak apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-keycloak:latest imagePullPolicy : Always resources : 
requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" - name : DIRIGIBLE_KEYCLOAK_ENABLED value : \"true\" - name : DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL value : - name : DIRIGIBLE_KEYCLOAK_REALM value : - name : DIRIGIBLE_KEYCLOAK_SSL_REQUIRED value : external - name : DIRIGIBLE_KEYCLOAK_CLIENT_ID value : - name : DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT value : \"443\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak Auth server (e.g. https://keycloak-server/auth/ ) . with your Keycloak Realm (e.g. my-realm ) . with your Keycloak Client Id (e.g. my-client ) . Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recomended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . 
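The Keycloak placeholders above can be rendered into an env fragment once and then pasted into deployment.yaml. A sketch — my-realm and my-client are the example values from the note, not chart defaults:

```shell
# Sketch: render the Keycloak env entries from shell variables.
# my-realm and my-client are the example values from the note.
REALM=my-realm
CLIENT_ID=my-client
cat > keycloak-env.yaml <<EOF
- name: DIRIGIBLE_KEYCLOAK_REALM
  value: "$REALM"
- name: DIRIGIBLE_KEYCLOAK_CLIENT_ID
  value: "$CLIENT_ID"
EOF
cat keycloak-env.yaml
```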
Create service configuration file: service.yaml Service Ingress Custom Domain apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : NodePort selector : app : dirigible --- apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note You can find more information on this page: GCP DNS Zone Setup . Prerequisites Install Istio , if not already installed. Install cert-manager , if not already installed. Register your zone in Google Cloud Platform \u2192 Cloud DNS , if not already registered. Register DNS Record Set Get the Istio Ingress Gateway IP: kubectl get service -n istio-ingress istio-ingressgateway -o jsonpath=\"{.status.loadBalancer.ingress[0].ip}\" Register DNS Record Set: gcloud dns record-sets transaction start --zone= gcloud dns record-sets transaction add \\ --name=dirigible. \\ --ttl=300 \\ --type=A \\ --zone= gcloud dns record-sets transaction execute --zone= apiVersion : cert-manager.io/v1 kind : Certificate metadata : name : dirigible spec : secretName : dirigible issuerRef : name : letsencrypt kind : ClusterIssuer commonName : \"dirigible.\" dnsNames : - \"dirigible.\" --- apiVersion : networking.istio.io/v1beta1 kind : Gateway metadata : name : dirigible-gateway spec : selector : istio : ingressgateway servers : - hosts : - dirigible. port : name : http number : 80 protocol : HTTP # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true tls : httpsRedirect : false - hosts : - dirigible. 
port : name : https-443 number : 443 protocol : HTTPS tls : credentialName : dirigible mode : SIMPLE --- apiVersion : networking.istio.io/v1alpha3 kind : VirtualService metadata : name : dirigible spec : hosts : - dirigible.default.svc.cluster.local - dirigible. gateways : - dirigible-gateway - mesh http : - match : - uri : prefix : / route : - destination : port : number : 8080 host : dirigible.default.svc.cluster.local Replace Placeholders Before deploying, replace the following placeholders: with your Cloud DNS Zone name (e.g. my-zone ) . with your Istio Ingress Gateway IP (e.g. 32.118.56.186 ) . with your custom domain (e.g. my-company.com ) . To enforce HTTPS, after the initial deployment, update the following fragment: # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true Deploy to the Google Kubernetes Engine Cluster with: kubectl apply -f deployment.yml kubectl apply -f service.yml Open a web browser and go to: https://dirigible.","title":"Steps"},{"location":"setup/kubernetes/helm/","text":"Setup with Helm You can deploy Dirigible via Helm Chart in a Kubernetes cluster. Prerequisites Helm Kubernetes Cluster on IaaS provider of your choice Steps Add the Eclipse Dirigible Helm repository: helm repo add dirigible https://eclipse.github.io/dirigible helm repo update Verify Eclipse Dirigible Helm chart: helm pull dirigible/dirigible --prov curl -o ~/.gnupg/pubring.gpg https://eclipse.github.io/dirigible/charts/pubring.gpg helm verify dirigible-.tgz You shoul see message: Signed by: Using Key With Fingerprint: Chart Hash Verified: Basic: helm install dirigible dirigible/dirigible Access This will install Eclipse Dirigible Deployment and Service with ClusterIP only. To access the Dirigible instance execute the command that was printed in the console. 
Example: export POD_NAME=$(kubectl get pods --namespace default -l \"app.kubernetes.io/name=dirigible,app.kubernetes.io/instance=dirigible\" -o jsonpath=\"{.items[0].metadata.name}\") echo \"Visit http://127.0.0.1:8080 to use your application\" kubectl --namespace default port-forward $POD_NAME 8080:8080 Navigate to: http://127.0.0.1:8080 Login with: dirigible / dirigible Kubernetes: Basic Istio PostgreSQL PostgreSQL & Keycloak GCP Cloud SQL Postgre & Keycloak helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host= This will expose the Dirigible instance through Ingress host ( http://... ). Prerequisites Install Istio . kubectl label namespace default istio-injection=enabled helm install dirigible dirigible/dirigible \\ --set istio.enabled=true \\ --set ingress.host= This will install Eclipse Dirigible Deployment , Service with ClusterIP only and Istio Gateway and Virtual Service . To access the Dirigible instance execute the command that was printed in the console. kubectl get svc istio-ingressgateway -n istio-system \\ -o jsonpath=\"{.status.loadBalancer.ingress[*].hostname}\" helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host= \\ --set database.enabled=true This will install also PostgreSQL database with 1Gi storage and update the Dirigible datasource configuration to consume the database. helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host= \\ --set database.enabled=true \\ --set keycloak.enabled=true \\ --set keycloak.install=true \\ --set keycloak.database.enabled=true In addition Keycloak will be deployed and configured. Disable HTTPS In some cases you might want to disable the \"Required HTTPS\" for Keycloak. 
Login to the PostgreSQL Pod: kubectl exec -it keycloak-database- /bin/bash Connect to the keycloak database: psql --u keycloak Set the ssl_required to NONE : update REALM set ssl_required='NONE' where id = 'master'; Restart the Keycloak pod to apply the updated configuration: kubectl delete pod keycloak- Now the Required HTTPS should be disabled and the keycloak instance should be accessible via http:// Prerequisites Install the gcloud CLI Install kubectl and configure cluster access Install Helm Info You can check the blog for more details. helm upgrade --install dirigible dirigible -n dirigible-demo \\ --set volume.enabled=true \\ --set serviceAccount.create=false \\ --set keycloak.serviceAccountCreate=false \\ --set ingress.tls=true \\ --set keycloak.enabled=true \\ --set keycloak.install=true \\ --set istio.enabled=true \\ --set istio.enableHttps=true \\ --set gke.cloudSQL=true \\ --set gke.projectId= \\ --set gke.region= \\ --set ingress.host= Kyma: Basic PostgreSQL PostgreSQL & Keycloak helm install dirigible dirigible/dirigible \\ --set kyma.enabled=true \\ --set kyma.apirule.host= This will install additionally an ApiRule and XSUAA ServiceInstance and ServiceBinding . The appropriate roles should be assigned to the user. helm install dirigible dirigible/dirigible \\ --set kyma.enabled=true \\ --set kyma.apirule.host= \\ --set database.enabled=true This will install also PostgreSQL database with 1Gi storage and update the Dirigible datasource configuration to consume the database. helm install dirigible dirigible/dirigible \\ --set kyma.enabled=true \\ --set kyma.apirule.host= \\ --set database.enabled=true \\ --set keycloak.enabled=true \\ --set keycloak.install=true In addition Keycloak will be deployed and configured. Disable HTTPS In some cases you might want to disable the \"Required HTTPS\" for Keycloak. 
Login to the PostgreSQL Pod: kubectl exec -it keycloak-database- /bin/bash Connect to the keycloak database: psql -U keycloak Set the ssl_required to NONE : update REALM set ssl_required='NONE' where id = 'master'; Restart the Keycloak pod to apply the updated configuration: kubectl delete pod keycloak- Now the Required HTTPS should be disabled and the keycloak instance should be accessible via http:// Uninstall: helm uninstall dirigible Configuration The following table lists all the configurable parameters exposed by the Dirigible chart and their default values. Generic Name Description Default dirigible.image Custom Dirigible image \"\" image.repository Dirigible image repo dirigiblelabs/dirigible-all image.repositoryKyma Dirigible Kyma image repo dirigiblelabs/dirigible-sap-kyma image.repositoryKeycloak Dirigible Keycloak image repo dirigiblelabs/dirigible-keycloak image.pullPolicy Image pull policy IfNotPresent service.type Service type ClusterIP service.port Service port 8080 replicaCount Number of replicas 1 imagePullSecrets Image pull secrets [] nameOverride Name override \"\" fullnameOverride Fullname override \"\" podSecurityContext Pod security context {} nodeSelector Node selector {} tolerations Tolerations [] affinity Affinity {} resources Resources {} Basic Name Description Default volume.enabled Volume to be mounted true volume.storage Volume storage size 1Gi database.enabled Database to be deployed false database.image Database image postgres:13 database.driver Database JDBC driver org.postgresql.Driver database.storage Database storage size 1Gi database.username Database username dirigible database.password Database password dirigible ingress.enabled Ingress to be created false ingress.annotations Ingress annotations {} ingress.host Ingress host \"\" ingress.tls Ingress tls false Istio Name Description Default istio.enabled Istio to be enabled false istio.gatewayName Istio gateway name gateway istio.serversPortNumber Istio servers port number 80
istio.serversPortName Istio servers port name http istio.serversPortProtocol Istio servers port protocol HTTP istio.serversHost Istio servers host * istio.virtualserviceName Istio virtual service name dirigible istio.virtualserviceHosts Istio virtual service hosts * istio.virtualserviceGateways Istio virtual service gateway gateway istio.virtualserviceDestination Istio virtual service destination dirigible istio.virtualservicePort Istio virtual service port 8080 Kyma Name Description Default kyma.enabled Kyma environment to be used false kyma.apirule.enabled Kyma ApiRule to be created true kyma.apirule.host Kyma host to be used in ApiRule \"\" Keycloak Name Description Default keycloak.enabled Keycloak environment to be used false keycloak.install Keycloak to be installed false keycloak.name Keycloak deployment name keycloak keycloak.image Keycloak image jboss/keycloak:12.0.4 keycloak.username Keycloak username admin keycloak.password Keycloak password admin keycloak.replicaCount Keycloak number of replicas 1 keycloak.realm Keycloak realm to be set master keycloak.clientId Keycloak clientId to be used dirigible keycloak.database.enabled Keycloak database to be used false keycloak.database.image Keycloak database image postgres:13 keycloak.database.storage Keycloak database storage size 1Gi keycloak.database.username Keycloak database username keycloak keycloak.database.password Keycloak database password keycloak Usage Specify the parameters you wish to customize using the --set argument to the helm install command. For instance, helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host=my-ingress-host.com The above command sets the ingress.host to my-ingress-host.com . Alternatively, a YAML file that specifies the values for the above parameters can be provided while installing the chart.
For example, helm install dirigible dirigible/dirigible --values values.yaml Tip You can use the default values.yaml .","title":"Helm"},{"location":"setup/kubernetes/helm/#setup-with-helm","text":"You can deploy Dirigible via Helm Chart in a Kubernetes cluster. Prerequisites Helm Kubernetes Cluster on IaaS provider of your choice","title":"Setup with Helm"},{"location":"setup/kubernetes/helm/#steps","text":"Add the Eclipse Dirigible Helm repository: helm repo add dirigible https://eclipse.github.io/dirigible helm repo update Verify Eclipse Dirigible Helm chart: helm pull dirigible/dirigible --prov curl -o ~/.gnupg/pubring.gpg https://eclipse.github.io/dirigible/charts/pubring.gpg helm verify dirigible-.tgz You should see a message: Signed by: Using Key With Fingerprint: Chart Hash Verified: Basic: helm install dirigible dirigible/dirigible Access This will install Eclipse Dirigible Deployment and Service with ClusterIP only. To access the Dirigible instance execute the command that was printed in the console. Example: export POD_NAME=$(kubectl get pods --namespace default -l \"app.kubernetes.io/name=dirigible,app.kubernetes.io/instance=dirigible\" -o jsonpath=\"{.items[0].metadata.name}\") echo \"Visit http://127.0.0.1:8080 to use your application\" kubectl --namespace default port-forward $POD_NAME 8080:8080 Navigate to: http://127.0.0.1:8080 Login with: dirigible / dirigible Kubernetes: Basic Istio PostgreSQL PostgreSQL & Keycloak GCP Cloud SQL Postgre & Keycloak helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host= This will expose the Dirigible instance through Ingress host ( http://... ). Prerequisites Install Istio . kubectl label namespace default istio-injection=enabled helm install dirigible dirigible/dirigible \\ --set istio.enabled=true \\ --set ingress.host= This will install Eclipse Dirigible Deployment , Service with ClusterIP only and Istio Gateway and Virtual Service .
To access the Dirigible instance execute the command that was printed in the console. kubectl get svc istio-ingressgateway -n istio-system \\ -o jsonpath=\"{.status.loadBalancer.ingress[*].hostname}\" helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host= \\ --set database.enabled=true This will install also PostgreSQL database with 1Gi storage and update the Dirigible datasource configuration to consume the database. helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host= \\ --set database.enabled=true \\ --set keycloak.enabled=true \\ --set keycloak.install=true \\ --set keycloak.database.enabled=true In addition Keycloak will be deployed and configured. Disable HTTPS In some cases you might want to disable the \"Required HTTPS\" for Keycloak. Login to the PostgreSQL Pod: kubectl exec -it keycloak-database- /bin/bash Connect to the keycloak database: psql --u keycloak Set the ssl_required to NONE : update REALM set ssl_required='NONE' where id = 'master'; Restart the Keycloak pod to apply the updated configuration: kubectl delete pod keycloak- Now the Required HTTPS should be disabled and the keycloak instance should be accessible via http:// Prerequisites Install the gcloud CLI Install kubectl and configure cluster access Install Helm Info You can check the blog for more details. 
helm upgrade --install dirigible dirigible -n dirigible-demo \\ --set volume.enabled=true \\ --set serviceAccount.create=false \\ --set keycloak.serviceAccountCreate=false \\ --set ingress.tls=true \\ --set keycloak.enabled=true \\ --set keycloak.install=true \\ --set istio.enabled=true \\ --set istio.enableHttps=true \\ --set gke.cloudSQL=true \\ --set gke.projectId= \\ --set gke.region= \\ --set ingress.host= Kyma: Basic PostgreSQL PostgreSQL & Keycloak helm install dirigible dirigible/dirigible \\ --set kyma.enabled=true \\ --set kyma.apirule.host= This will install additionally an ApiRule and XSUAA ServiceInstance and ServiceBinding . The appropriate roles should be assigned to the user. helm install dirigible dirigible/dirigible \\ --set kyma.enabled=true \\ --set kyma.apirule.host= \\ --set database.enabled=true This will install also PostgreSQL database with 1Gi storage and update the Dirigible datasource configuration to consume the database. helm install dirigible dirigible/dirigible \\ --set kyma.enabled=true \\ --set kyma.apirule.host= \\ --set database.enabled=true \\ --set keycloak.enabled=true \\ --set keycloak.install=true In addition Keycloak will be deployed and configured. Disable HTTPS In some cases you might want to disable the \"Required HTTPS\" for Keycloak. 
Login to the PostgreSQL Pod: kubectl exec -it keycloak-database- /bin/bash Connect to the keycloak database: psql -U keycloak Set the ssl_required to NONE : update REALM set ssl_required='NONE' where id = 'master'; Restart the Keycloak pod to apply the updated configuration: kubectl delete pod keycloak- Now the Required HTTPS should be disabled and the keycloak instance should be accessible via http:// Uninstall: helm uninstall dirigible","title":"Steps"},{"location":"setup/kubernetes/helm/#configuration","text":"The following table lists all the configurable parameters exposed by the Dirigible chart and their default values.","title":"Configuration"},{"location":"setup/kubernetes/helm/#generic","text":"Name Description Default dirigible.image Custom Dirigible image \"\" image.repository Dirigible image repo dirigiblelabs/dirigible-all image.repositoryKyma Dirigible Kyma image repo dirigiblelabs/dirigible-sap-kyma image.repositoryKeycloak Dirigible Keycloak image repo dirigiblelabs/dirigible-keycloak image.pullPolicy Image pull policy IfNotPresent service.type Service type ClusterIP service.port Service port 8080 replicaCount Number of replicas 1 imagePullSecrets Image pull secrets [] nameOverride Name override \"\" fullnameOverride Fullname override \"\" podSecurityContext Pod security context {} nodeSelector Node selector {} tolerations Tolerations [] affinity Affinity {} resources Resources {}","title":"Generic"},{"location":"setup/kubernetes/helm/#basic","text":"Name Description Default volume.enabled Volume to be mounted true volume.storage Volume storage size 1Gi database.enabled Database to be deployed false database.image Database image postgres:13 database.driver Database JDBC driver org.postgresql.Driver database.storage Database storage size 1Gi database.username Database username dirigible database.password Database password dirigible ingress.enabled Ingress to be created false ingress.annotations Ingress annotations {} ingress.host Ingress host \"\"
ingress.tls Ingress tls false","title":"Basic"},{"location":"setup/kubernetes/helm/#istio","text":"Name Description Default istio.enabled Istio to be enabled false istio.gatewayName Istio gateway name gateway istio.serversPortNumber Istio servers port number 80 istio.serversPortName Istio servers port name http istio.serversPortProtocol Istio servers port protocol HTTP istio.serversHost Istio servers host * istio.virtualserviceName Istio virtual service name dirigible istio.virtualserviceHosts Istio virtual service hosts * istio.virtualserviceGateways Istio virtual service gateway gateway istio.virtualserviceDestination Istio virtual service destination dirigible istio.virtualservicePort Istio virtual service port 8080","title":"Istio"},{"location":"setup/kubernetes/helm/#kyma","text":"Name Description Default kyma.enabled Kyma environment to be used false kyma.apirule.enabled Kyma ApiRule to be created true kyma.apirule.host Kyma host to be used in ApiRule \"\"","title":"Kyma"},{"location":"setup/kubernetes/helm/#keycloak","text":"Name Description Default keycloak.enabled Keycloak environment to be used false keycloak.install Keycloak to be installed false keycloak.name Keycloak deployment name keycloak keycloak.image Keycloak image jboss/keycloak:12.0.4 keycloak.username Keycloak username admin keycloak.password Keycloak password admin keycloak.replicaCount Keycloak number of replicas 1 keycloak.realm Keycloak realm to be set master keycloak.clientId Keycloak clientId to be used dirigible keycloak.database.enabled Keycloak database to be used false keycloak.database.enabled Keycloak database to be used true keycloak.database.image Keycloak database image postgres:13 keycloak.database.storage Keycloak database storage size 1Gi keycloak.database.username Keycloak database username keycloak keycloak.database.password Keycloak database password keycloak","title":"Keycloak"},{"location":"setup/kubernetes/helm/#usage","text":"Specify the parameters you wish to 
customize using the --set argument to the helm install command. For instance, helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host=my-ingress-host.com The above command sets the ingress.host to my-ingress-host.com . Alternatively, a YAML file that specifies the values for the above parameters can be provided while installing the chart. For example, helm install dirigible dirigible/dirigible --values values.yaml Tip You can use the default values.yaml .","title":"Usage"},{"location":"setup/kubernetes/red-hat-openshift/","text":"Setup in Red Hat OpenShift Deploy Eclipse Dirigible in Red Hat OpenShift environment. Prerequisites Install kubectl . Access to Red Hat OpenShift . Steps Access the Red Hat OpenShift environment via the OpenShift Console: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC Deployment with Keycloak apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-temp-data mountPath : /usr/local/tomcat/target volumes : - name : dirigible-temp-data emptyDir : {} apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT 
value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository - name : dirigible-temp-data mountPath : /usr/local/tomcat/target/dirigible volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" - name : dirigible-temp-data emptyDir : {} --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-keycloak:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" - name : DIRIGIBLE_KEYCLOAK_ENABLED value : \"true\" - name : DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL value : - name : DIRIGIBLE_KEYCLOAK_REALM value : - name : DIRIGIBLE_KEYCLOAK_SSL_REQUIRED value : external - name : DIRIGIBLE_KEYCLOAK_CLIENT_ID value : - name : DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT value : \"443\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository - name : dirigible-temp-data mountPath : /usr/local/tomcat/target/dirigible volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" - name : dirigible-temp-data emptyDir : {} --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak Auth server (e.g. https://keycloak-server/auth/ ) . with your Keycloak Realm (e.g. my-realm ) . with your Keycloak Client Id (e.g. my-client ) . 
Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . Create service configuration file: service.yaml Service Route apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible --- kind : Route apiVersion : route.openshift.io/v1 metadata : name : dirigible spec : host : dirigible. to : kind : Service name : dirigible port : targetPort : http tls : termination : edge insecureEdgeTerminationPolicy : Redirect wildcardPolicy : None Replace Placeholders Before deploying, replace the following placeholders: with your OpenShift domain (e.g. apps.sandbox.xxxx.yy.openshiftapps.com ) . Deploy to the OpenShift Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml OpenShift Console Alternatively, the OpenShift Console can be used to deploy the artifacts via the Console UI. Open a web browser and go to: https://dirigible.","title":"Red Hat OpenShift"},{"location":"setup/kubernetes/red-hat-openshift/#setup-in-red-hat-openshift","text":"Deploy Eclipse Dirigible in the Red Hat OpenShift environment. Prerequisites Install kubectl . 
Access to Red Hat OpenShift .","title":"Setup in Red Hat OpenShift"},{"location":"setup/kubernetes/red-hat-openshift/#steps","text":"Access the Red Hat OpenShift environment via the OpenShift Console: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC Deployment with Keycloak apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-temp-data mountPath : /usr/local/tomcat/target volumes : - name : dirigible-temp-data emptyDir : {} apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository - name : dirigible-temp-data mountPath : /usr/local/tomcat/target/dirigible volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" - name : dirigible-temp-data emptyDir : {} --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate 
selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-keycloak:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" - name : DIRIGIBLE_KEYCLOAK_ENABLED value : \"true\" - name : DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL value : - name : DIRIGIBLE_KEYCLOAK_REALM value : - name : DIRIGIBLE_KEYCLOAK_SSL_REQUIRED value : external - name : DIRIGIBLE_KEYCLOAK_CLIENT_ID value : - name : DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT value : \"443\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository - name : dirigible-temp-data mountPath : /usr/local/tomcat/target/dirigible volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" - name : dirigible-temp-data emptyDir : {} --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak Auth server (e.g. https://keycloak-server/auth/ ) . with your Keycloak Realm (e.g. my-realm ) . with your Keycloak Client Id (e.g. my-client ) . Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recomended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . 
Create service configuration file: service.yaml Service Route apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible --- kind : Route apiVersion : route.openshift.io/v1 metadata : name : dirigible spec : host : dirigible. to : kind : Service name : dirigible port : targetPort : http tls : termination : edge insecureEdgeTerminationPolicy : Redirect wildcardPolicy : None Replace Placeholders Before deploying, replace the following placeholders: with your OpenShift domain (e.g. apps.sandbox.xxxx.yy.openshiftapps.com ) . Deploy to the OpenShift Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml OpenShift Console Alternatively, the OpenShift Console can be used to deploy the artifacts via the Console UI. Open a web browser and go to: https://dirigible.","title":"Steps"},{"location":"setup/kubernetes/sap-btp-kyma/","text":"Setup in SAP BTP Kyma Deploy Eclipse Dirigible in SAP BTP 1 , Kyma environment. Prerequisites Install kubectl - this step is optional. Access to SAP BTP account (the Trial landscape can be accessed here ). Steps Access the SAP BTP, Kyma environment via the SAP BTP cockpit: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-sap-kyma:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" env : - name : DIRIGIBLE_THEME_DEFAULT value : fiori - name : DIRIGIBLE_HOST value : https://dirigible. 
ports : - containerPort : 8080 name : dirigible protocol : TCP apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-sap-kyma:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" env : - name : DIRIGIBLE_THEME_DEFAULT value : fiori - name : DIRIGIBLE_HOST value : https://dirigible. volumeMounts : - name : dirigible-volume mountPath : /usr/local/tomcat/target/dirigible/repository ports : - containerPort : 8080 name : dirigible protocol : TCP volumes : - name : dirigible-volume persistentVolumeClaim : claimName : dirigible-claim --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-claim spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Kyma cluster host (e.g. c-xxxx.kyma.yyyy.ondemand.com ) . Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recomended to use stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . 
Create service configuration file: service.yaml Service APIRule apiVersion : v1 kind : Service metadata : labels : app : dirigible name : dirigible spec : ports : - name : dirigible port : 8080 protocol : TCP targetPort : 8080 selector : app : dirigible type : ClusterIP apiVersion : v1 kind : Service metadata : labels : app : dirigible name : dirigible spec : ports : - name : dirigible port : 8080 protocol : TCP targetPort : 8080 selector : app : dirigible type : ClusterIP --- apiVersion : gateway.kyma-project.io/v1alpha1 kind : APIRule metadata : name : dirigible spec : gateway : kyma-gateway.kyma-system.svc.cluster.local rules : - accessStrategies : - config : {} handler : noop methods : - GET - POST - PUT - PATCH - DELETE - HEAD path : /.* service : host : dirigible. name : dirigible port : 8080 Replace Placeholders Before deploying, replace the following placeholders: with your Kyma cluster host (e.g. c-xxxx.kyma.yyyy.ondemand.com ) . Click on the Deploy new resource button and select the deployment.yaml and service.yaml files. Note Alternatively, kubectl can be used to deploy the resources: kubectl apply -f deployment.yaml kubectl apply -f service.yaml Create XSUAA service instance: From the Kyma dashboard, go to Service Management \u2192 Catalog . Find the Authorization & Trust Management service. Create a new service instance. Provide the following additional parameters. 
{ \"xsappname\" : \"dirigible-xsuaa\" , \"oauth2-configuration\" : { \"token-validity\" : 7200 , \"redirect-uris\" : [ \"https://dirigible.\" ] }, \"scopes\" : [ { \"name\" : \"$XSAPPNAME.Developer\" , \"description\" : \"Developer scope\" }, { \"name\" : \"$XSAPPNAME.Operator\" , \"description\" : \"Operator scope\" } ], \"role-templates\" : [ { \"name\" : \"Developer\" , \"description\" : \"Developer related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Operator\" , \"description\" : \"Operator related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Operator\" ] } ], \"role-collections\" : [ { \"name\" : \"Dirigible Developer\" , \"description\" : \"Dirigible Developer\" , \"role-template-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Dirigible Operator\" , \"description\" : \"Dirigible Operator\" , \"role-template-references\" : [ \"$XSAPPNAME.Operator\" ] } ] } Note Replace the placeholder with your Kyma cluster host (e.g. c-xxxxxxx.kyma.xxx.xxx.xxx.ondemand.com ). Bind the servce instance to the dirigible application. Assign the Developer and Operator roles. Log in. SAP Cloud Platform is called SAP Business Technology Platform (SAP BTP) as of 2021. \u21a9","title":"SAP BTP Kyma"},{"location":"setup/kubernetes/sap-btp-kyma/#setup-in-sap-btp-kyma","text":"Deploy Eclipse Dirigible in SAP BTP 1 , Kyma environment. Prerequisites Install kubectl - this step is optional. 
Access to SAP BTP account (the Trial landscape can be accessed here ).","title":"Setup in SAP BTP Kyma"},{"location":"setup/kubernetes/sap-btp-kyma/#steps","text":"Access the SAP BTP, Kyma environment via the SAP BTP cockpit: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-sap-kyma:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" env : - name : DIRIGIBLE_THEME_DEFAULT value : fiori - name : DIRIGIBLE_HOST value : https://dirigible. ports : - containerPort : 8080 name : dirigible protocol : TCP apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-sap-kyma:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" env : - name : DIRIGIBLE_THEME_DEFAULT value : fiori - name : DIRIGIBLE_HOST value : https://dirigible. volumeMounts : - name : dirigible-volume mountPath : /usr/local/tomcat/target/dirigible/repository ports : - containerPort : 8080 name : dirigible protocol : TCP volumes : - name : dirigible-volume persistentVolumeClaim : claimName : dirigible-claim --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-claim spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Kyma cluster host (e.g. c-xxxx.kyma.yyyy.ondemand.com ) . 
Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . Create service configuration file: service.yaml Service APIRule apiVersion : v1 kind : Service metadata : labels : app : dirigible name : dirigible spec : ports : - name : dirigible port : 8080 protocol : TCP targetPort : 8080 selector : app : dirigible type : ClusterIP apiVersion : v1 kind : Service metadata : labels : app : dirigible name : dirigible spec : ports : - name : dirigible port : 8080 protocol : TCP targetPort : 8080 selector : app : dirigible type : ClusterIP --- apiVersion : gateway.kyma-project.io/v1alpha1 kind : APIRule metadata : name : dirigible spec : gateway : kyma-gateway.kyma-system.svc.cluster.local rules : - accessStrategies : - config : {} handler : noop methods : - GET - POST - PUT - PATCH - DELETE - HEAD path : /.* service : host : dirigible. name : dirigible port : 8080 Replace Placeholders Before deploying, replace the following placeholders: with your Kyma cluster host (e.g. c-xxxx.kyma.yyyy.ondemand.com ) . Click on the Deploy new resource button and select the deployment.yaml and service.yaml files. Note Alternatively, kubectl can be used to deploy the resources: kubectl apply -f deployment.yaml kubectl apply -f service.yaml Create XSUAA service instance: From the Kyma dashboard, go to Service Management \u2192 Catalog . Find the Authorization & Trust Management service. Create a new service instance. Provide the following additional parameters. 
{ \"xsappname\" : \"dirigible-xsuaa\" , \"oauth2-configuration\" : { \"token-validity\" : 7200 , \"redirect-uris\" : [ \"https://dirigible.\" ] }, \"scopes\" : [ { \"name\" : \"$XSAPPNAME.Developer\" , \"description\" : \"Developer scope\" }, { \"name\" : \"$XSAPPNAME.Operator\" , \"description\" : \"Operator scope\" } ], \"role-templates\" : [ { \"name\" : \"Developer\" , \"description\" : \"Developer related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Operator\" , \"description\" : \"Operator related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Operator\" ] } ], \"role-collections\" : [ { \"name\" : \"Dirigible Developer\" , \"description\" : \"Dirigible Developer\" , \"role-template-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Dirigible Operator\" , \"description\" : \"Dirigible Operator\" , \"role-template-references\" : [ \"$XSAPPNAME.Operator\" ] } ] } Note Replace the placeholder with your Kyma cluster host (e.g. c-xxxxxxx.kyma.xxx.xxx.xxx.ondemand.com ). Bind the servce instance to the dirigible application. Assign the Developer and Operator roles. Log in. SAP Cloud Platform is called SAP Business Technology Platform (SAP BTP) as of 2021. \u21a9","title":"Steps"},{"location":"setup/kubernetes/addons/azure-dns-zone/","text":"Create Google DNS Zone Setup Prerequisites Install Azure cli . Steps Create a resource group az group create \\ --name DirigibleResourceGroup \\ --location Create static public IP address az aks show \\ --resource-group DirigibleResourceGroup \\ --name dirigible \\ --query nodeResourceGroup \\ -o tsv After you run the previus command you will receive MC_.... and add to next command. 
az network public-ip create \\ --resource-group MC_DirigibleResourceGroup_dirigible_ \\ --name PublicIP \\ --sku Standard \\ --allocation-method static \\ --query publicIp.ipAddress \\ -o tsv Create DNS zone az network dns zone create \\ -g DirigibleResourceGroup \\ -n dirigible.io Create DNS Record Get IP address kubectl get svc -n istio-ingress istio-ingressgateway -o jsonpath=\"{.status.loadBalancer.ingress[0].ip}\" Set A DNS record az network dns record-set a add-record \\ -g DirigibleResourceGroup \\ -z \\ -n dirigible \\ -a ","title":"Azure DNS Zone"},{"location":"setup/kubernetes/addons/azure-dns-zone/#create-google-dns-zone-setup","text":"Prerequisites Install the Azure CLI .","title":"Create Azure DNS Zone Setup"},{"location":"setup/kubernetes/addons/azure-dns-zone/#steps","text":"Create a resource group az group create \\ --name DirigibleResourceGroup \\ --location Create a static public IP address az aks show \\ --resource-group DirigibleResourceGroup \\ --name dirigible \\ --query nodeResourceGroup \\ -o tsv After you run the previous command you will receive MC_.... and add it to the next command. 
az network public-ip create \\ --resource-group MC_DirigibleResourceGroup_dirigible_ \\ --name PublicIP \\ --sku Standard \\ --allocation-method static \\ --query publicIp.ipAddress \\ -o tsv Create DNS zone az network dns zone create \\ -g DirigibleResourceGroup \\ -n dirigible.io Create DNS Record Get ip address kubectl get svc -n istio-ingress istio-ingressgateway -o jsonpath=\"{.status.loadBalancer.ingress[0].ip}\" Set A dns record az network dns record-set a add-record \\ -g DirigibleResourceGroup \\ -z \\ -n dirigible \\ -a ","title":"Steps"},{"location":"setup/kubernetes/addons/gke-cluster/","text":"Create Google Kubernetes cluster Setup Prerequisites First you will need to add your billing information Install the gcloud CLI Install kubectl and configure cluster access Steps Create organization Create project List the organizations gcloud organizations list gcloud projects create dirigible-demo --name=dirigible --organization= You can check for the new project with: gcloud projects list --filter 'parent.id=' Enable Engine Api Go to Kubernetes Engine -> Clusters and click on Enable to allow creating cluster. Create cluster Set the project Set the project on which you will create DNS Zone gcloud config set project PROJECT_ID Set the project in every command --project . Create an IAM service account with the minimum permissions required to operate GKE SA_NAME: the name of the new service account. DISPLAY_NAME: the display name for the new service account, which makes the account easier to identify. PROJECT_ID: the project ID of the project in which you want to create the new service account. 
SA_NAME=sa-minimum-pemissions-gke-demo \\ DISPLAY_NAME='SA minimum permissions required to operate GKE' \\ PROJECT_ID= gcloud iam service-accounts create $SA_NAME \\ --display-name=\"$DISPLAY_NAME\" \\ --project $PROJECT_ID gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/logging.logWriter gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/monitoring.metricWriter gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/monitoring.viewer gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/stackdriver.resourceMetadata.writer Create the cluster gcloud container clusters create \\ --region europe-west1-b \\ --project=$PROJECT_ID \\ --service-account=$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com Get connection to the cluster gcloud container clusters get-credentials Note How to create Google DNS Zone How to setup Istio . How to create certificate for your domain . How to create GCP Cloud SQL instances","title":"GKE cluster"},{"location":"setup/kubernetes/addons/gke-cluster/#create-google-kubernetes-cluster-setup","text":"Prerequisites First you will need to add your billing information Install the gcloud CLI Install kubectl and configure cluster access","title":"Create Google Kubernetes cluster Setup"},{"location":"setup/kubernetes/addons/gke-cluster/#steps","text":"Create organization Create project List the organizations gcloud organizations list gcloud projects create dirigible-demo --name=dirigible --organization= You can check for the new project with: gcloud projects list --filter 'parent.id=' Enable Engine Api Go to Kubernetes Engine -> Clusters and click on Enable to allow creating cluster. 
Create cluster Set the project Set the project on which you will create DNS Zone gcloud config set project PROJECT_ID Set the project in every command --project . Create an IAM service account with the minimum permissions required to operate GKE SA_NAME: the name of the new service account. DISPLAY_NAME: the display name for the new service account, which makes the account easier to identify. PROJECT_ID: the project ID of the project in which you want to create the new service account. SA_NAME=sa-minimum-pemissions-gke-demo \\ DISPLAY_NAME='SA minimum permissions required to operate GKE' \\ PROJECT_ID= gcloud iam service-accounts create $SA_NAME \\ --display-name=\"$DISPLAY_NAME\" \\ --project $PROJECT_ID gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/logging.logWriter gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/monitoring.metricWriter gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/monitoring.viewer gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/stackdriver.resourceMetadata.writer Create the cluster gcloud container clusters create \\ --region europe-west1-b \\ --project=$PROJECT_ID \\ --service-account=$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com Get connection to the cluster gcloud container clusters get-credentials Note How to create Google DNS Zone How to setup Istio . How to create certificate for your domain . How to create GCP Cloud SQL instances","title":"Steps"},{"location":"setup/kubernetes/addons/google-dns-zone/","text":"Create Google DNS Zone Setup Prerequisites Enable Cloud DNS API . 
Install gcloud . Install the kubectl component: gcloud components install kubectl Access to the Kubernetes cluster: gcloud auth login . Update the kubectl configuration to use the plugin: gcloud container clusters get-credentials --zone Steps Create managed DNS Zone Console gcloud Google Cloud console In the Google Cloud console, go to the Create a DNS zone page. `Go to Create a DNS zone` For the Zone type, select Public. Enter a Zone name such as my-new-zone. Enter a DNS name suffix for the zone using a domain name that you own. All records in the zone share this suffix, for example: example.com. Under DNSSEC, select Off, On, or Transfer. For more information, see Enable DNSSEC for existing managed zones. Click Create. The Zone details page is displayed. Set the project Set the project on which you will create the DNS Zone gcloud config set project PROJECT_ID Alternatively, set the project in every command with --project . gcloud dns managed-zones create NAME \\ --description=DESCRIPTION \\ --dns-name=DNS_SUFFIX \\ --labels=LABELS \\ --visibility=public Replace Placeholders DESCRIPTION with your description. LABELS with your label. DNS_SUFFIX with your main domain or subdomain. Get Ingress IP address Kubernetes Ingress Istio Ingress kubectl get ingress check column ADDRESS kubectl get service -n istio-ingress istio-ingressgateway -o jsonpath=\"{.status.loadBalancer.ingress[0].ip}\" Change the namespace istio-ingress to match your installation. Note You can check Istio setup Create A record in Cloud DNS Set the zone for which you will create records gcloud dns record-sets transaction start --zone= Add A record gcloud dns record-sets transaction add \\ --name=dirigible. 
\\ --ttl=300 \\ --type=A \\ --zone= Apply the new record gcloud dns record-sets transaction execute --zone= - Promote the ephemeral IP to a reserved static IP ``` gcloud compute addresses create --addresses= \\ --region= ``` Get your current DNS records for your zone gcloud dns record-sets list --zone= Replace Placeholders Before running the commands, replace the following placeholders: with your Google Cloud DNS zone name. Add name servers Subdomain Main domain Note If you configure a subdomain, add Google name servers to your main domain control panel for this subdomain, example: ns-cloud-d1.googledomains.com , ns-cloud-d2.googledomains.com , ns-cloud-d3.googledomains.com , ns-cloud-d4.googledomains.com Note At the end you need to update your domain's name servers to use Cloud DNS to publish your new records to the internet. Example: ns-cloud-d1.googledomains.com , ns-cloud-d2.googledomains.com , ns-cloud-d3.googledomains.com , ns-cloud-d4.googledomains.com Note How to create a certificate for your domain .","title":"GCP DNS Zone"},{"location":"setup/kubernetes/addons/google-dns-zone/#create-google-dns-zone-setup","text":"Prerequisites Enable Cloud DNS API . Install gcloud . Install the kubectl component: gcloud components install kubectl Access to the Kubernetes cluster: gcloud auth login . Update the kubectl configuration to use the plugin: gcloud container clusters get-credentials --zone ","title":"Create Google DNS Zone Setup"},{"location":"setup/kubernetes/addons/google-dns-zone/#steps","text":"Create managed DNS Zone Console gcloud Google Cloud console In the Google Cloud console, go to the Create a DNS zone page. `Go to Create a DNS zone` For the Zone type, select Public. Enter a Zone name such as my-new-zone. Enter a DNS name suffix for the zone using a domain name that you own. All records in the zone share this suffix, for example: example.com. Under DNSSEC, select Off, On, or Transfer. For more information, see Enable DNSSEC for existing managed zones. 
Click Create. The Zone details page is displayed. Set the project Set the project in which you will create the DNS Zone: gcloud config set project PROJECT_ID Alternatively, set the project in every command with --project . gcloud dns managed-zones create NAME \\ --description=DESCRIPTION \\ --dns-name=DNS_SUFFIX \\ --labels=LABELS \\ --visibility=public Replace Placeholders DESCRIPTION with your description. LABELS with your label. DNS_SUFFIX with your main domain or subdomain. Get Ingress IP address Kubernetes Ingress Istio Ingress kubectl get ingress and check the ADDRESS column kubectl get service -n istio-ingress istio-ingressgateway -o jsonpath=\"{.status.loadBalancer.ingress[0].ip}\" Change the namespace istio-ingress to match your installation. Note You can check the Istio setup Create A record in Cloud DNS Set the zone for which you will create records: gcloud dns record-sets transaction start --zone= Add the A record: gcloud dns record-sets transaction add \\ --name=dirigible. \\ --ttl=300 \\ --type=A \\ --zone= Apply the new record: gcloud dns record-sets transaction execute --zone= - Promote the ephemeral IP to a reserved one: ``` gcloud compute addresses create --addresses= \\ --region= ``` Get your current DNS records for your zone: gcloud dns record-sets list --zone= Replace Placeholders Before running the commands, replace the following placeholders: with your Google Cloud DNS zone name. Add name servers Subdomain Main domain Note If you configure a subdomain, add the Google name servers to your main domain control panel for this subdomain, for example: ns-cloud-d1.googledomains.com , ns-cloud-d2.googledomains.com , ns-cloud-d3.googledomains.com , ns-cloud-d4.googledomains.com Note At the end you need to update your domain's name servers to use Cloud DNS to publish your new records to the internet.
Example: ns-cloud-d1.googledomains.com , ns-cloud-d2.googledomains.com , ns-cloud-d3.googledomains.com , ns-cloud-d4.googledomains.com Note How to create a certificate for your domain .","title":"Steps"},{"location":"setup/kubernetes/addons/istio/","text":"Istio Setup Prerequisites Install istioctl . Install kubectl Access to Kubernetes cluster. Create istio-system namespace kubectl create namespace istio-system Install the Istio control plane service istiod apiVersion : v1 kind : Service metadata : labels : app : istiod istio : pilot release : istio name : istiod namespace : istio-system spec : type : ClusterIP ports : - name : grpc-xds port : 15010 - name : https-dns port : 15012 - name : https-webhook port : 443 targetPort : 15017 - name : http-monitoring port : 15014 selector : app : istiod Install the minimal profile and reduce the gateway config. Create a control-plane.yaml file apiVersion : install.istio.io/v1alpha1 kind : IstioOperator metadata : name : control-plane spec : profile : minimal components : pilot : k8s : env : - name : PILOT_FILTER_GATEWAY_CLUSTER_CONFIG value : \"true\" meshConfig : defaultConfig : proxyMetadata : ISTIO_META_DNS_CAPTURE : \"true\" enablePrometheusMerge : true Check the latest version istioctl install -y -n istio-system -f control-plane.yaml --revision 1-14-3 Add Istio injection kubectl label namespace default istio-injection=enabled --overwrite Enable the istio-ingressgateway component Create namespace istio-ingress kubectl create namespace istio-ingress Create istio-ingress-gw-install.yaml apiVersion : install.istio.io/v1alpha1 kind : IstioOperator metadata : name : istio-ingress-gw-install spec : profile : empty values : gateways : istio-ingressgateway : autoscaleEnabled : false components : ingressGateways : - name : istio-ingressgateway namespace : istio-ingress enabled : true k8s : overlays : - apiVersion : apps/v1 kind : Deployment name : istio-ingressgateway patches : - path : spec.template.spec.containers[name:istio-proxy].lifecycle value :
preStop : exec : command : [ \"sh\" , \"-c\" , \"sleep 5\" ] Apply the latest revision istioctl install -y -n istio-ingress -f istio-ingress-gw-install.yaml --revision 1-14-3 Apply Strict mTLS apiVersion : security.istio.io/v1beta1 kind : PeerAuthentication metadata : name : default namespace : istio-system spec : mtls : mode : STRICT","title":"Istio"},{"location":"setup/kubernetes/addons/istio/#istio-setup","text":"Prerequisites Install istioctl . Install kubectl Access to Kubernetes cluster. Create istio-system namespace kubectl create namespace istio-system Install the Istio control plane service istiod apiVersion : v1 kind : Service metadata : labels : app : istiod istio : pilot release : istio name : istiod namespace : istio-system spec : type : ClusterIP ports : - name : grpc-xds port : 15010 - name : https-dns port : 15012 - name : https-webhook port : 443 targetPort : 15017 - name : http-monitoring port : 15014 selector : app : istiod Install the minimal profile and reduce the gateway config. Create a control-plane.yaml file apiVersion : install.istio.io/v1alpha1 kind : IstioOperator metadata : name : control-plane spec : profile : minimal components : pilot : k8s : env : - name : PILOT_FILTER_GATEWAY_CLUSTER_CONFIG value : \"true\" meshConfig : defaultConfig : proxyMetadata : ISTIO_META_DNS_CAPTURE : \"true\" enablePrometheusMerge : true Check the latest version istioctl install -y -n istio-system -f control-plane.yaml --revision 1-14-3 Add Istio injection kubectl label namespace default istio-injection=enabled --overwrite Enable the istio-ingressgateway component Create namespace istio-ingress kubectl create namespace istio-ingress Create istio-ingress-gw-install.yaml apiVersion : install.istio.io/v1alpha1 kind : IstioOperator metadata : name : istio-ingress-gw-install spec : profile : empty values : gateways : istio-ingressgateway : autoscaleEnabled : false components : ingressGateways : - name : istio-ingressgateway namespace : istio-ingress enabled : true k8s : overlays : -
apiVersion : apps/v1 kind : Deployment name : istio-ingressgateway patches : - path : spec.template.spec.containers[name:istio-proxy].lifecycle value : preStop : exec : command : [ \"sh\" , \"-c\" , \"sleep 5\" ] Apply latest revision istioctl install -y -n istio-ingress -f istio-ingress-gw-install.yaml --revision 1-14-3 Apply Strict mTLS apiVersion : security.istio.io/v1beta1 kind : PeerAuthentication metadata : name : default namespace : istio-system spec : mtls : mode : STRICT","title":"Istio Setup"},{"location":"setup/kubernetes/addons/keycloak/","text":"Keycloak Setup Deploy Keycloak in Kubernetes environment. Prerequisites Install kubectl . Access to Kubernetes cluster. Steps Create deployment configuration file: deployment.yaml Deployment Deployment with PostgreSQL apiVersion : apps/v1 kind : Deployment metadata : name : keycloak labels : app : keycloak spec : replicas : 1 selector : matchLabels : app : keycloak template : metadata : labels : app : keycloak spec : containers : - name : keycloak image : jboss/keycloak:12.0.4 env : - name : PROXY_ADDRESS_FORWARDING value : \"true\" - name : KEYCLOAK_USER value : - name : KEYCLOAK_PASSWORD value : - name : KEYCLOAK_FRONTEND_URL value : \"https://keycloak./auth/\" ports : - name : http containerPort : 8080 protocol : TCP Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak username (e.g. admin ) . with your Keycloak password (e.g. admin ) . with your Keycloak host (e.g. my-company.com ) . 
apiVersion : apps/v1 kind : Deployment metadata : name : keycloak labels : app : keycloak spec : replicas : 1 selector : matchLabels : app : keycloak template : metadata : labels : app : keycloak spec : initContainers : - name : wait-db-ready image : busybox:1.28 command : - sh - -c - for i in {1..15}; do echo \"Waiting for database creation.\"; sleep 2; done; containers : - name : keycloak image : jboss/keycloak:12.0.4 env : - name : PROXY_ADDRESS_FORWARDING value : \"true\" - name : KEYCLOAK_USER value : - name : KEYCLOAK_PASSWORD value : - name : KEYCLOAK_FRONTEND_URL value : \"https://keycloak./auth/\" - name : DB_VENDOR value : postgres - name : DB_USER value : - name : DB_PASSWORD value : - name : DB_DATABASE value : - name : DB_ADDR value : keycloak-database ports : - name : http containerPort : 8080 protocol : TCP --- apiVersion : apps/v1 kind : Deployment metadata : name : keycloak-database labels : app : keycloak-database spec : replicas : 1 selector : matchLabels : app : keycloak-database template : metadata : labels : app : keycloak-database spec : containers : - name : keycloak-database image : postgres:13 volumeMounts : - name : keycloak-database-data mountPath : /var/lib/postgresql/data env : - name : PGDATA value : \"/var/lib/postgresql/data/pgdata\" - name : POSTGRES_USER value : - name : POSTGRES_PASSWORD value : ports : - name : jdbc containerPort : 5432 protocol : TCP volumes : - name : keycloak-database-data persistentVolumeClaim : claimName : keycloak-database-data --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : keycloak-database-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 2Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak username (e.g. admin ) . with your Keycloak password (e.g. admin ) . with your Keycloak host (e.g. my-company.com ) . with your Keycloak database username (e.g. dbadmin ) . with your Keycloak database password (e.g. dbadmin ) . 
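After filling in the placeholders, the deployment can be applied and verified with standard kubectl commands (a minimal sketch; the resource name keycloak comes from the manifests above):

```shell
# Apply the Keycloak deployment and wait for the rollout to complete
kubectl apply -f deployment.yaml
kubectl rollout status deployment/keycloak
# Tail the Keycloak logs to confirm a clean startup
kubectl logs deployment/keycloak --tail=20
```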
Create service configuration file: service.yaml Service Service with PostgreSQL Route (OpenShift) apiVersion : v1 kind : Service metadata : name : keycloak labels : app : keycloak spec : type : ClusterIP ports : - port : 8080 targetPort : http protocol : TCP name : http selector : app : keycloak apiVersion : v1 kind : Service metadata : name : keycloak labels : app : keycloak spec : type : ClusterIP ports : - port : 8080 targetPort : http protocol : TCP name : http selector : app : keycloak --- apiVersion : v1 kind : Service metadata : name : keycloak-database labels : app : keycloak-database spec : type : ClusterIP ports : - port : 5432 targetPort : jdbc protocol : TCP name : jdbc selector : app : keycloak-database apiVersion : v1 kind : Service metadata : name : keycloak labels : app : keycloak spec : type : ClusterIP ports : - port : 8080 targetPort : http protocol : TCP name : http selector : app : keycloak --- apiVersion : v1 kind : Service metadata : name : keycloak-database labels : app : keycloak-database spec : type : ClusterIP ports : - port : 5432 targetPort : jdbc protocol : TCP name : jdbc selector : app : keycloak-database --- kind : Route apiVersion : route.openshift.io/v1 metadata : name : keycloak spec : host : keycloak. to : kind : Service name : keycloak port : targetPort : http tls : termination : edge insecureEdgeTerminationPolicy : Redirect wildcardPolicy : None Replace Placeholders Before deploying, replace the following placeholders: with your OpenShift domain (e.g. apps.sandbox.xxxx.yy.openshiftapps.com ) . Deploy to the Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml Open a web browser and go to: https://keycloak.","title":"Keycloak"},{"location":"setup/kubernetes/addons/keycloak/#keycloak-setup","text":"Deploy Keycloak in a Kubernetes environment. Prerequisites Install kubectl .
Access to Kubernetes cluster.","title":"Keycloak Setup"},{"location":"setup/kubernetes/addons/keycloak/#steps","text":"Create deployment configuration file: deployment.yaml Deployment Deployment with PostgreSQL apiVersion : apps/v1 kind : Deployment metadata : name : keycloak labels : app : keycloak spec : replicas : 1 selector : matchLabels : app : keycloak template : metadata : labels : app : keycloak spec : containers : - name : keycloak image : jboss/keycloak:12.0.4 env : - name : PROXY_ADDRESS_FORWARDING value : \"true\" - name : KEYCLOAK_USER value : - name : KEYCLOAK_PASSWORD value : - name : KEYCLOAK_FRONTEND_URL value : \"https://keycloak./auth/\" ports : - name : http containerPort : 8080 protocol : TCP Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak username (e.g. admin ) . with your Keycloak password (e.g. admin ) . with your Keycloak host (e.g. my-company.com ) . apiVersion : apps/v1 kind : Deployment metadata : name : keycloak labels : app : keycloak spec : replicas : 1 selector : matchLabels : app : keycloak template : metadata : labels : app : keycloak spec : initContainers : - name : wait-db-ready image : busybox:1.28 command : - sh - -c - for i in {1..15}; do echo \"Waiting for database creation.\"; sleep 2; done; containers : - name : keycloak image : jboss/keycloak:12.0.4 env : - name : PROXY_ADDRESS_FORWARDING value : \"true\" - name : KEYCLOAK_USER value : - name : KEYCLOAK_PASSWORD value : - name : KEYCLOAK_FRONTEND_URL value : \"https://keycloak./auth/\" - name : DB_VENDOR value : postgres - name : DB_USER value : - name : DB_PASSWORD value : - name : DB_DATABASE value : - name : DB_ADDR value : keycloak-database ports : - name : http containerPort : 8080 protocol : TCP --- apiVersion : apps/v1 kind : Deployment metadata : name : keycloak-database labels : app : keycloak-database spec : replicas : 1 selector : matchLabels : app : keycloak-database template : metadata : labels : app : 
keycloak-database spec : containers : - name : keycloak-database image : postgres:13 volumeMounts : - name : keycloak-database-data mountPath : /var/lib/postgresql/data env : - name : PGDATA value : \"/var/lib/postgresql/data/pgdata\" - name : POSTGRES_USER value : - name : POSTGRES_PASSWORD value : ports : - name : jdbc containerPort : 5432 protocol : TCP volumes : - name : keycloak-database-data persistentVolumeClaim : claimName : keycloak-database-data --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : keycloak-database-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 2Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak username (e.g. admin ) . with your Keycloak password (e.g. admin ) . with your Keycloak host (e.g. my-company.com ) . with your Keycloak database username (e.g. dbadmin ) . with your Keycloak database password (e.g. dbadmin ) . Create service configuration file: service.yaml Service Service with PostgreSQL Route (OpenShift) apiVersion : v1 kind : Service metadata : name : keycloak labels : app : keycloak spec : type : ClusterIP ports : - port : 8080 targetPort : http protocol : TCP name : http selector : app : keycloak apiVersion : v1 kind : Service metadata : name : keycloak labels : app : keycloak spec : type : ClusterIP ports : - port : 8080 targetPort : http protocol : TCP name : http selector : app : keycloak --- apiVersion : v1 kind : Service metadata : name : keycloak-database labels : app : keycloak-database spec : type : ClusterIP ports : - port : 5432 targetPort : jdbc protocol : TCP name : jdbc selector : app : keycloak-database apiVersion : v1 kind : Service metadata : name : keycloak labels : app : keycloak spec : type : ClusterIP ports : - port : 8080 targetPort : http protocol : TCP name : http selector : app : keycloak --- apiVersion : v1 kind : Service metadata : name : keycloak-database labels : app : keycloak-database spec : type : ClusterIP 
ports : - port : 5432 targetPort : jdbc protocol : TCP name : jdbc selector : app : keycloak-database --- kind : Route apiVersion : route.openshift.io/v1 metadata : name : keycloak spec : host : keycloak. to : kind : Service name : keycloak port : targetPort : http tls : termination : edge insecureEdgeTerminationPolicy : Redirect wildcardPolicy : None Replace Placeholders Before deploying, replace the following placeholders: with your OpenShift domain (e.g. apps.sandbox.xxxx.yy.openshiftapps.com ) . Deploy to the Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml Open a web browser and go to: https://keycloak.","title":"Steps"},{"location":"setup/kubernetes/addons/letsencrypt/","text":"Letsencrypt Setup Deploy Cert Manager in a Kubernetes environment. Prerequisites Install kubectl . Install Helm Access to Kubernetes cluster. Steps Install cert-manager: Add Jetstack Helm repository: helm repo add jetstack https://charts.jetstack.io Update your local Helm chart repository cache: helm repo update Install cert-manager and CustomResourceDefinitions : helm install \\ cert-manager jetstack/cert-manager \\ --namespace cert-manager \\ --create-namespace \\ --version v1.9.1 \\ --set installCRDs=true Note Check the current version of the Installation with Helm . Create Cluster Issuer: apiVersion : cert-manager.io/v1alpha2 kind : ClusterIssuer metadata : name : dirigible spec : acme : server : https://acme-v02.api.letsencrypt.org/directory email : privateKeySecretRef : name : dirigible http01 : {} Note Replace the placeholder with a valid email address.
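Before relying on the ClusterIssuer, it is worth confirming that the cert-manager components came up after the Helm install (a sketch using standard kubectl commands):

```shell
# cert-manager, cert-manager-cainjector and cert-manager-webhook
# should all be Running before issuers are created
kubectl get pods -n cert-manager
# List registered cluster issuers and their readiness
kubectl get clusterissuer
```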
Update ClusterIssuer If your ingress is Istio, change the ClusterIssuer and add: solvers: - selector: {} http01: ingress: class: istio Create a certificate: apiVersion : cert-manager.io/v1 kind : Certificate metadata : name : dirigible spec : secretName : dirigible issuerRef : name : dirigible kind : ClusterIssuer commonName : \"\" dnsNames : - \"\" Note Replace the placeholder with your domain from the previous step. Add Namespace If your Istio Ingress is installed in namespace istio-ingress , add namespace: istio-ingress . Create Ingress: Kubernetes Ingress Istio Ingress apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : dirigible http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note You can install Istio with the default profile ( istioctl install ), which installs istio-ingressgateway and istiod , or you can install them manually: apiVersion : networking.istio.io/v1alpha3 kind : Gateway metadata : name : dirigible-gateway spec : selector : istio : ingressgateway servers : - port : number : 80 name : http protocol : HTTP hosts : - dirigible. # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true - port : number : 443 name : https-443 protocol : HTTPS hosts : - dirigible. tls : mode : SIMPLE credentialName : dirigible Replace the placeholder with your domain from the previous step. Create a Virtual Service for Istio: apiVersion : networking.istio.io/v1beta1 kind : VirtualService metadata : name : dirigible spec : hosts : - \"dirigible.\" gateways : - dirigible-gateway - mesh http : - match : - uri : prefix : / route : - destination : host : dirigible.default.svc.cluster.local port : number : 8080 Replace the placeholder with your domain from the previous step.
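The Istio routing objects created above can be verified with kubectl (a sketch; the names dirigible-gateway and dirigible come from the manifests above):

```shell
# Confirm the Gateway and VirtualService were admitted by the cluster
kubectl get gateway dirigible-gateway
kubectl get virtualservice dirigible
```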
Check the certificate status in cert-manager: kubectl logs -n cert-manager -lapp=cert-manager","title":"Letsencrypt"},{"location":"setup/kubernetes/addons/letsencrypt/#letsencrypt-setup","text":"Deploy Cert Manager in a Kubernetes environment. Prerequisites Install kubectl . Install Helm Access to Kubernetes cluster.","title":"Letsencrypt Setup"},{"location":"setup/kubernetes/addons/letsencrypt/#steps","text":"Install cert-manager: Add Jetstack Helm repository: helm repo add jetstack https://charts.jetstack.io Update your local Helm chart repository cache: helm repo update Install cert-manager and CustomResourceDefinitions : helm install \\ cert-manager jetstack/cert-manager \\ --namespace cert-manager \\ --create-namespace \\ --version v1.9.1 \\ --set installCRDs=true Note Check the current version of the Installation with Helm . Create Cluster Issuer: apiVersion : cert-manager.io/v1alpha2 kind : ClusterIssuer metadata : name : dirigible spec : acme : server : https://acme-v02.api.letsencrypt.org/directory email : privateKeySecretRef : name : dirigible http01 : {} Note Replace the placeholder with a valid email address. Update ClusterIssuer If your ingress is Istio, change the ClusterIssuer and add: solvers: - selector: {} http01: ingress: class: istio Create a certificate: apiVersion : cert-manager.io/v1 kind : Certificate metadata : name : dirigible spec : secretName : dirigible issuerRef : name : dirigible kind : ClusterIssuer commonName : \"\" dnsNames : - \"\" Note Replace the placeholder with your domain from the previous step. Add Namespace If your Istio Ingress is installed in namespace istio-ingress , add namespace: istio-ingress .
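Besides the cert-manager logs, the Certificate resource itself reports readiness (a sketch; the name dirigible comes from the manifest above):

```shell
# READY should turn True once the ACME HTTP-01 challenge completes
kubectl get certificate dirigible
# The Events and Conditions sections explain any issuance failures
kubectl describe certificate dirigible
```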
Create Ingress: Kubernetes Ingress Istio Ingress apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : dirigible http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note You can install Istio with the default profile ( istioctl install ), which installs istio-ingressgateway and istiod , or you can install them manually: apiVersion : networking.istio.io/v1alpha3 kind : Gateway metadata : name : dirigible-gateway spec : selector : istio : ingressgateway servers : - port : number : 80 name : http protocol : HTTP hosts : - dirigible. # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true - port : number : 443 name : https-443 protocol : HTTPS hosts : - dirigible. tls : mode : SIMPLE credentialName : dirigible Replace the placeholder with your domain from the previous step. Create a Virtual Service for Istio: apiVersion : networking.istio.io/v1beta1 kind : VirtualService metadata : name : dirigible spec : hosts : - \"dirigible.\" gateways : - dirigible-gateway - mesh http : - match : - uri : prefix : / route : - destination : host : dirigible.default.svc.cluster.local port : number : 8080 Replace the placeholder with your domain from the previous step. Check the certificate status in cert-manager: kubectl logs -n cert-manager -lapp=cert-manager","title":"Steps"},{"location":"setup/kubernetes/addons/postgresql/","text":"PostgreSQL Setup Deploy PostgreSQL in a Kubernetes environment. Prerequisites Install kubectl . Access to Kubernetes cluster.
Steps Kubernetes Create deployment configuration file: deployment.yaml Deployment Deployment with PVC apiVersion : apps/v1 kind : Deployment metadata : name : dirigible-database labels : app : dirigible-database spec : replicas : 1 selector : matchLabels : app : dirigible-database template : metadata : labels : app : dirigible-database spec : containers : - name : dirigible-database image : postgres:13 env : - name : PGDATA value : \"/var/lib/postgresql/data/pgdata\" - name : POSTGRES_USER value : - name : POSTGRES_PASSWORD value : ports : - name : jdbc containerPort : 5432 protocol : TCP apiVersion : apps/v1 kind : Deployment metadata : name : dirigible-database labels : app : dirigible-database spec : replicas : 1 selector : matchLabels : app : dirigible-database template : metadata : labels : app : dirigible-database spec : containers : - name : dirigible-database image : postgres:13 volumeMounts : - name : dirigible-database-data mountPath : /var/lib/postgresql/data env : - name : PGDATA value : \"/var/lib/postgresql/data/pgdata\" - name : POSTGRES_USER value : - name : POSTGRES_PASSWORD value : ports : - name : jdbc containerPort : 5432 protocol : TCP volumes : - name : dirigible-database-data persistentVolumeClaim : claimName : dirigible-database-data --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-database-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 2Gi Replace Placeholders Before deploying, replace the following placeholders: with your PostgreSQL username (e.g. admin ) . with your PostgreSQL password (e.g. admin ) .
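Once the deployment and the service from the next step are applied, connectivity can be smoke-tested from a throwaway pod (a hypothetical check; the service name dirigible-database comes from the manifests, while the client pod name and credentials are placeholders to replace with your own values):

```shell
# Run a one-off psql client pod against the in-cluster database;
# replace admin/admin with the user and password from the deployment
kubectl run psql-client --rm -it --restart=Never --image=postgres:13 --env=PGPASSWORD=admin -- psql -h dirigible-database -U admin -c 'SELECT 1;'
```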
Create service configuration file: service.yaml Service apiVersion : v1 kind : Service metadata : name : dirigible-database labels : app : dirigible-database spec : type : ClusterIP ports : - port : 5432 targetPort : jdbc protocol : TCP name : jdbc selector : app : dirigible-database Deploy to the Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml GCP Cloud Dirigible PostgreSQL instances Enable API Console gcloud Enable API Cloud SQL Admin API gcloud services enable sqladmin.googleapis.com \\ servicenetworking.googleapis.com Create an instance with a private IP address and SSL enabled Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Click Create instance. Click PostgreSQL. Enter quickstart-instance for Instance ID. Enter a password for the postgres user. Save this password for future use. Click the Single zone option for Choose region and zonal availability. Click and expand Show configuration options. For Machine Type, select Lightweight. In Connections, select Private IP. Select default in the Network drop-down menu. If you see a dialog stating Private services access connection required, click the Set Up Connection button. In the Enable Service Networking API dialog, click the Enable API button. In the Allocate an IP range dialog, select Use an automatically allocated IP range and click Continue. In the Create a connection dialog, click Create Connection. Clear the Public IP checkbox to create an instance only with a private IP. Click Create instance and then wait for the instance to initialize and start. Click Connections. In the Security section, select Allow only SSL connections to enable SSL connections. In the Allow only SSL connections dialog, click Save and then wait for the instance to restart. Creating an instance with a private IP address only requires configuring private services access to enable connections from other Google Cloud services, such as GKE.
gcloud compute addresses create google-managed-services-default \\ --global \\ --purpose=VPC_PEERING \\ --prefix-length=16 \\ --description=\"peering range for Google\" \\ --network=default Run the gcloud services vpc-peerings connect command to create the private services access connection: gcloud services vpc-peerings connect \\ --service=servicenetworking.googleapis.com \\ --ranges=google-managed-services-default \\ --network=default Create PostgreSQL instance gcloud beta sql instances create YOUR_DIRIGIBLE_SQL_INSTANCE \\ --database-version=POSTGRES_13 \\ --cpu=1 \\ --memory=4GB \\ --region= \\ --root-password='DB_ROOT_PASSWORD' \\ --no-assign-ip \\ --network=default Run the gcloud sql instances patch command to allow only SSL connections for the instance. gcloud sql instances patch YOUR_DIRIGIBLE_SQL_INSTANCE --require-ssl Create Dirigible database Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Select quickstart-instance. Open the Databases tab. Click Create database. In the New database dialog box, enter quickstart_db as the name of the database. Click Create. gcloud sql databases create YOUR_DIRIGIBLE_DB_NAME \\ --instance=YOUR_DIRIGIBLE_SQL_INSTANCE Create Dirigible user Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances To open the Overview page of an instance, click the instance name. Select Users from the SQL navigation menu. Click Add user account. In the Add a user account to instance instance_name page, add the following information: Username: Set to quickstart-user Password: Specify a password for your database user. Make a note of this for use in a later step of this quickstart. Click Add. 
gcloud sql users create YOUR_DIRIGIBLE_DB_USER \\ --instance=YOUR_DIRIGIBLE_SQL_INSTANCE \\ --password='DB_PASS' Create a new service account Run the gcloud iam service-accounts create command as follows to create a new service account: gcloud iam service-accounts create YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT \\ --display-name=\"GKE Quickstart Service Account\" Run the gcloud projects add-iam-policy-binding command as follows to add the Cloud SQL Client role to the Google Cloud service account you just created. Replace YOUR_PROJECT_ID with the project ID. gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \\ --member=\"serviceAccount:YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com\" \\ --role=\"roles/cloudsql.client\" Enable Workload Identity To get access to Cloud SQL, bind the Kubernetes Service Account to the Google Cloud service account using Workload Identity. gcloud container clusters update CLUSTER_NAME \\ --region=COMPUTE_REGION \\ --workload-pool=PROJECT_ID.svc.id.goog gcloud container node-pools update YOUR_NODE_GKE_CLUSTER_NODE_POOL \\ --cluster=YOUR_GKE_CLUSTER_NAME \\ --workload-metadata=GKE_METADATA Create a Kubernetes Service Account kubectl create sa YOUR_KUBERNETES_SERVICE_ACCOUNT Enable IAM binding Run the gcloud iam service-accounts add-iam-policy-binding command as follows to enable IAM binding of the Google Cloud Service Account and the Kubernetes Service Account. Make the following replacements: gcloud iam service-accounts add-iam-policy-binding \\ --role=\"roles/iam.workloadIdentityUser\" \\ --member=\"serviceAccount:YOUR_PROJECT_ID.svc.id.goog[YOUR_K8S_NAMESPACE/YOUR_KSA_NAME]\" \\ YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Annotate the Kubernetes Service Account Run the kubectl annotate command as follows to annotate the Kubernetes Service Account with IAM binding.
Make the following replacements: kubectl annotate serviceaccount \\ YOUR_KSA_NAME \\ iam.gke.io/gcp-service-account=YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Use your new service account in your deployment spec : serviceAccountName : YOUR_SERVICE_ACCOUNT_NAME Configure secrets kubectl create secret generic YOUR_DIRIGIBLE_SECRET_NAME \\ --from-literal=database=YOUR_DIRIGIBLE_DATABASE \\ --from-literal=username=YOUR_DIRIGIBLE_USERNAME \\ --from-literal=password=DB_PASS Deploy the app that connects to your Cloud SQL instance env : - name : POSTGRE_URL valueFrom : secretKeyRef : name : YOUR_DIRIGIBLE_SECRET_NAME key : postgre_url - name : POSTGRE_DRIVER value : org.postgresql.Driver - name : POSTGRE_USERNAME valueFrom : secretKeyRef : name : YOUR_DIRIGIBLE_SECRET_NAME key : username - name : POSTGRE_PASSWORD valueFrom : secretKeyRef : name : YOUR_DIRIGIBLE_SECRET_NAME key : password - name : cloud-sql-proxy # This uses the latest version of the Cloud SQL proxy # It is recommended to use a specific version for production environments. # See: https://github.com/GoogleCloudPlatform/cloudsql-proxy image : gcr.io/cloudsql-docker/gce-proxy:latest command : - \"/cloud_sql_proxy\" # If connecting from a VPC-native GKE cluster, you can use the # following flag to have the proxy connect over private IP - \"-ip_address_types=PRIVATE\" # tcp should be set to the port the proxy should listen on # and should match the DB_PORT value set above. # Defaults: MySQL: 3306, Postgres: 5432, SQLServer: 1433 - \"-instances=::=tcp:5432\" securityContext : # The default Cloud SQL proxy image runs as the # \"nonroot\" user and group (uid: 65532) by default.
runAsNonRoot : true GCP Cloud Keycloak PostgreSQL instances Enable API Console gcloud Enable API Cloud SQL Admin API gcloud services enable sqladmin.googleapis.com \\ servicenetworking.googleapis.com Create an instance with a private IP address and SSL enabled Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Click Create instance. Click PostgreSQL. Enter quickstart-instance for Instance ID. Enter a password for the postgres user. Save this password for future use. Click the Single zone option for Choose region and zonal availability. Click and expand Show configuration options. For Machine Type, select Lightweight. In Connections, select Private IP. Select default in the Network drop-down menu. If you see a dialog stating Private services access connection required, click the Set Up Connection button. In the Enable Service Networking API dialog, click the Enable API button. In the Allocate an IP range dialog, select Use an automatically allocated IP range and click Continue. In the Create a connection dialog, click Create Connection. Clear the Public IP checkbox to create an instance only with a private IP. Click Create instance and then wait for the instance to initialize and start. Click Connections. In the Security section, select Allow only SSL connections to enable SSL connections. In the Allow only SSL connections dialog, click Save and then wait for the instance to restart. Creating an instance with a private IP address only requires configuring private services access to enable connections from other Google Cloud services, such as GKE. 
gcloud compute addresses create google-managed-services-default \\ --global \\ --purpose=VPC_PEERING \\ --prefix-length=16 \\ --description=\"peering range for Google\" \\ --network=default Run the gcloud services vpc-peerings connect command to create the private services access connection: gcloud services vpc-peerings connect \\ --service=servicenetworking.googleapis.com \\ --ranges=google-managed-services-default \\ --network=default Create Keycloak PostgreSQL instance gcloud beta sql instances create YOUR_KEYCLOAK_INSTANCE \\ --database-version=POSTGRES_13 \\ --cpu=1 \\ --memory=4GB \\ --region= \\ --root-password='DB_ROOT_PASSWORD' \\ --no-assign-ip \\ --network=default Run the gcloud sql instances patch command to allow only SSL connections for the instance. gcloud sql instances patch YOUR_KEYCLOAK_INSTANCE --require-ssl Create Keycloak database Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Select quickstart-instance. Open the Databases tab. Click Create database. In the New database dialog box, enter quickstart_db as the name of the database. Click Create. gcloud sql databases create YOUR_KEYCLOAK_DB \\ --instance=YOUR_KEYCLOAK_INSTANCE Create Keycloak user Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances To open the Overview page of an instance, click the instance name. Select Users from the SQL navigation menu. Click Add user account. In the Add a user account to instance instance_name page, add the following information: Username: Set to quickstart-user Password: Specify a password for your database user. Make a note of this for use in a later step of this quickstart. Click Add. 
gcloud sql users create YOUR_KEYCLOAK_USER \\ --instance=YOUR_KEYCLOAK_INSTANCE \\ --password='DB_PASS' Create new service account Run the gcloud iam service-accounts create command as follows to create a new service account: gcloud iam service-accounts create YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT \\ --display-name=\"GKE Keycloak Service Account\" Run the gcloud projects add-iam-policy-binding command as follows to add the Cloud SQL Client role to the Google Cloud service account you just created. Replace YOUR_PROJECT_ID with the project ID. gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \\ --member=\"serviceAccount:YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com\" \\ --role=\"roles/cloudsql.client\" Enable Workload Identity Grant access to Cloud SQL by binding the Kubernetes Service Account to the Google Cloud service account using Workload Identity. gcloud container clusters update CLUSTER_NAME \\ --region=COMPUTE_REGION \\ --workload-pool=PROJECT_ID.svc.id.goog Update the node pool if it has not been updated yet gcloud container node-pools update YOUR_NODE_GKE_CLUSTER_NODE_POOL \\ --cluster=YOUR_CLUSTER \\ --workload-metadata=GKE_METADATA Create a Kubernetes Service Account kubectl create sa YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME Enable IAM binding Run the gcloud iam service-accounts add-iam-policy-binding command as follows to enable IAM binding of the Google Cloud Service Account and the Kubernetes Service Account. Make the following replacements: gcloud iam service-accounts add-iam-policy-binding \\ --role=\"roles/iam.workloadIdentityUser\" \\ --member=\"serviceAccount:YOUR_PROJECT_ID.svc.id.goog[YOUR_K8S_NAMESPACE/YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME]\" \\ YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Annotate the Kubernetes Service Account Run the kubectl annotate command as follows to annotate the Kubernetes Service Account with IAM binding. 
Make the following replacements: kubectl annotate serviceaccount \\ YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME \\ iam.gke.io/gcp-service-account=YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Use your new service account in your deployment spec : serviceAccountName : YOUR_KUBERNETES_SERVICE_ACCOUNT Configure secrets kubectl create secret generic YOUR_KEYCLOAK_SECRET_NAME \\ --from-literal=database=YOUR_KEYCLOAK_DB_NAME \\ --from-literal=username=YOUR_KEYCLOAK_USER_NAME \\ --from-literal=password=YOUR_KEYCLOAK_DB_PASS \\ --from-literal=postgre_url=jdbc:postgresql://127.0.0.1:5432/YOUR_KEYCLOAK_DB_NAME Set the environment variables in the Keycloak deployment. env : - name : DB_VENDOR value : postgres - name : DB_USER valueFrom : secretKeyRef : name : YOUR_KEYCLOAK_SECRET_NAME key : username - name : DB_PASSWORD valueFrom : secretKeyRef : name : YOUR_KEYCLOAK_SECRET_NAME key : password - name : DB_DATABASE valueFrom : secretKeyRef : name : YOUR_KEYCLOAK_SECRET_NAME key : database - name : DB_ADDR value : 127.0.0.1 - name : cloud-sql-proxy # This uses the latest version of the Cloud SQL proxy # It is recommended to use a specific version for production environments. # See: https://github.com/GoogleCloudPlatform/cloudsql-proxy image : gcr.io/cloudsql-docker/gce-proxy:latest command : - \"/cloud_sql_proxy\" # If connecting from a VPC-native GKE cluster, you can use the # following flag to have the proxy connect over private IP - \"-ip_address_types=PRIVATE\" # tcp should be set to the port the proxy should listen on # and should match the DB_PORT value set above. # Defaults: MySQL: 3306, Postgres: 5432, SQLServer: 1433 - \"-instances==tcp:5432\" securityContext : # The default Cloud SQL proxy image runs as the # \"nonroot\" user and group (uid: 65532) by default. runAsNonRoot : true","title":"PostgreSQL"},{"location":"setup/kubernetes/addons/postgresql/#postgresql-setup","text":"Deploy PostgreSQL in a Kubernetes environment. Prerequisites Install kubectl . 
Access to Kubernetes cluster.","title":"PostgreSQL Setup"},{"location":"setup/kubernetes/addons/postgresql/#steps","text":"","title":"Steps"},{"location":"setup/kubernetes/addons/postgresql/#kubernetes","text":"Create deployment configuration file: deployment.yaml Deployment Deployment with PVC apiVersion : apps/v1 kind : Deployment metadata : name : dirigible-database labels : app : dirigible-database spec : replicas : 1 selector : matchLabels : app : dirigible-database template : metadata : labels : app : dirigible-database spec : containers : - name : dirigible-database image : postgres:13 env : - name : PGDATA value : \"/var/lib/postgresql/data/pgdata\" - name : POSTGRES_USER value : - name : POSTGRES_PASSWORD value : ports : - name : jdbc containerPort : 5432 protocol : TCP apiVersion : apps/v1 kind : Deployment metadata : name : dirigible-database labels : app : dirigible-database spec : replicas : 1 selector : matchLabels : app : dirigible-database template : metadata : labels : app : dirigible-database spec : containers : - name : dirigible-database image : postgres:13 volumeMounts : - name : dirigible-database-data mountPath : /var/lib/postgresql/data env : - name : PGDATA value : \"/var/lib/postgresql/data/pgdata\" - name : POSTGRES_USER value : - name : POSTGRES_PASSWORD value : ports : - name : jdbc containerPort : 5432 protocol : TCP volumes : - name : dirigible-database-data persistentVolumeClaim : claimName : dirigible-database-data --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-database-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 2Gi Replace Placeholders Before deploying, replace the following placeholders: with your database username (e.g. admin ) . with your database password (e.g. admin ) . 
Create service configuration file: service.yaml Service apiVersion : v1 kind : Service metadata : name : dirigible-database labels : app : dirigible-database spec : type : ClusterIP ports : - port : 5432 targetPort : jdbc protocol : TCP name : jdbc selector : app : dirigible-database Deploy to the Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml","title":"Kubernetes"},{"location":"setup/kubernetes/addons/postgresql/#gcp-cloud-dirigible-postgresql-instances","text":"Enable API Console gcloud Enable API Cloud SQL Admin API gcloud services enable sqladmin.googleapis.com \\ servicenetworking.googleapis.com Create an instance with a private IP address and SSL enabled Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Click Create instance. Click PostgreSQL. Enter quickstart-instance for Instance ID. Enter a password for the postgres user. Save this password for future use. Click the Single zone option for Choose region and zonal availability. Click and expand Show configuration options. For Machine Type, select Lightweight. In Connections, select Private IP. Select default in the Network drop-down menu. If you see a dialog stating Private services access connection required, click the Set Up Connection button. In the Enable Service Networking API dialog, click the Enable API button. In the Allocate an IP range dialog, select Use an automatically allocated IP range and click Continue. In the Create a connection dialog, click Create Connection. Clear the Public IP checkbox to create an instance only with a private IP. Click Create instance and then wait for the instance to initialize and start. Click Connections. In the Security section, select Allow only SSL connections to enable SSL connections. In the Allow only SSL connections dialog, click Save and then wait for the instance to restart. 
Creating an instance with only a private IP address requires configuring private services access to enable connections from other Google Cloud services, such as GKE. gcloud compute addresses create google-managed-services-default \\ --global \\ --purpose=VPC_PEERING \\ --prefix-length=16 \\ --description=\"peering range for Google\" \\ --network=default Run the gcloud services vpc-peerings connect command to create the private services access connection: gcloud services vpc-peerings connect \\ --service=servicenetworking.googleapis.com \\ --ranges=google-managed-services-default \\ --network=default Create PostgreSQL instance gcloud beta sql instances create YOUR_DIRIGIBLE_SQL_INSTANCE \\ --database-version=POSTGRES_13 \\ --cpu=1 \\ --memory=4GB \\ --region= \\ --root-password='DB_ROOT_PASSWORD' \\ --no-assign-ip \\ --network=default Run the gcloud sql instances patch command to allow only SSL connections for the instance. gcloud sql instances patch YOUR_DIRIGIBLE_SQL_INSTANCE --require-ssl Create Dirigible database Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Select quickstart-instance. Open the Databases tab. Click Create database. In the New database dialog box, enter quickstart_db as the name of the database. Click Create. gcloud sql databases create YOUR_DIRIGIBLE_DB_NAME \\ --instance=YOUR_DIRIGIBLE_SQL_INSTANCE Create Dirigible user Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances To open the Overview page of an instance, click the instance name. Select Users from the SQL navigation menu. Click Add user account. In the Add a user account to instance instance_name page, add the following information: Username: Set to quickstart-user Password: Specify a password for your database user. Make a note of this for use in a later step of this quickstart. Click Add. 
gcloud sql users create YOUR_DIRIGIBLE_DB_USER \\ --instance=YOUR_DIRIGIBLE_SQL_INSTANCE \\ --password='DB_PASS' Create new service account Run the gcloud iam service-accounts create command as follows to create a new service account: gcloud iam service-accounts create YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT \\ --display-name=\"GKE Quickstart Service Account\" Run the gcloud projects add-iam-policy-binding command as follows to add the Cloud SQL Client role to the Google Cloud service account you just created. Replace YOUR_PROJECT_ID with the project ID. gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \\ --member=\"serviceAccount:YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com\" \\ --role=\"roles/cloudsql.client\" Enable Workload Identity Grant access to Cloud SQL by binding the Kubernetes Service Account to the Google Cloud service account using Workload Identity. gcloud container clusters update CLUSTER_NAME \\ --region=COMPUTE_REGION \\ --workload-pool=PROJECT_ID.svc.id.goog gcloud container node-pools update YOUR_NODE_GKE_CLUSTER_NODE_POOL \\ --cluster=YOUR_GKE_CLUSTER_NAME \\ --workload-metadata=GKE_METADATA Create a Kubernetes Service Account kubectl create sa YOUR_KUBERNETES_SERVICE_ACCOUNT Enable IAM binding Run the gcloud iam service-accounts add-iam-policy-binding command as follows to enable IAM binding of the Google Cloud Service Account and the Kubernetes Service Account. Make the following replacements: gcloud iam service-accounts add-iam-policy-binding \\ --role=\"roles/iam.workloadIdentityUser\" \\ --member=\"serviceAccount:YOUR_PROJECT_ID.svc.id.goog[YOUR_K8S_NAMESPACE/YOUR_KSA_NAME]\" \\ YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Annotate the Kubernetes Service Account Run the kubectl annotate command as follows to annotate the Kubernetes Service Account with IAM binding. 
Make the following replacements: kubectl annotate serviceaccount \\ YOUR_KSA_NAME \\ iam.gke.io/gcp-service-account=YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Use your new service account in your deployment spec : serviceAccountName : YOUR_SERVICE_ACCOUNT_NAME Configure secrets kubectl create secret generic YOUR_DIRIGIBLE_SECRET_NAME \\ --from-literal=database=YOUR_DIRIGIBLE_DATABASE \\ --from-literal=username=YOUR_DIRIGIBLE_USERNAME \\ --from-literal=password=DB_PASS \\ --from-literal=postgre_url=jdbc:postgresql://127.0.0.1:5432/YOUR_DIRIGIBLE_DATABASE Deploy the app that connects to your Cloud SQL instance env : - name : POSTGRE_URL valueFrom : secretKeyRef : name : YOUR_DIRIGIBLE_SECRET_NAME key : postgre_url - name : POSTGRE_DRIVER value : org.postgresql.Driver - name : POSTGRE_USERNAME valueFrom : secretKeyRef : name : YOUR_DIRIGIBLE_SECRET_NAME key : username - name : POSTGRE_PASSWORD valueFrom : secretKeyRef : name : YOUR_DIRIGIBLE_SECRET_NAME key : password - name : cloud-sql-proxy # This uses the latest version of the Cloud SQL proxy # It is recommended to use a specific version for production environments. # See: https://github.com/GoogleCloudPlatform/cloudsql-proxy image : gcr.io/cloudsql-docker/gce-proxy:latest command : - \"/cloud_sql_proxy\" # If connecting from a VPC-native GKE cluster, you can use the # following flag to have the proxy connect over private IP - \"-ip_address_types=PRIVATE\" # tcp should be set to the port the proxy should listen on # and should match the DB_PORT value set above. # Defaults: MySQL: 3306, Postgres: 5432, SQLServer: 1433 - \"-instances=::=tcp:5432\" securityContext : # The default Cloud SQL proxy image runs as the # \"nonroot\" user and group (uid: 65532) by default. 
runAsNonRoot : true","title":"GCP Cloud Dirigible PostgreSQL instances"},{"location":"setup/kubernetes/addons/postgresql/#gcp-cloud-keycloak-postgresql-instances","text":"Enable API Console gcloud Enable API Cloud SQL Admin API gcloud services enable sqladmin.googleapis.com \\ servicenetworking.googleapis.com Create an instance with a private IP address and SSL enabled Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Click Create instance. Click PostgreSQL. Enter quickstart-instance for Instance ID. Enter a password for the postgres user. Save this password for future use. Click the Single zone option for Choose region and zonal availability. Click and expand Show configuration options. For Machine Type, select Lightweight. In Connections, select Private IP. Select default in the Network drop-down menu. If you see a dialog stating Private services access connection required, click the Set Up Connection button. In the Enable Service Networking API dialog, click the Enable API button. In the Allocate an IP range dialog, select Use an automatically allocated IP range and click Continue. In the Create a connection dialog, click Create Connection. Clear the Public IP checkbox to create an instance only with a private IP. Click Create instance and then wait for the instance to initialize and start. Click Connections. In the Security section, select Allow only SSL connections to enable SSL connections. In the Allow only SSL connections dialog, click Save and then wait for the instance to restart. Creating an instance with only a private IP address requires configuring private services access to enable connections from other Google Cloud services, such as GKE. 
gcloud compute addresses create google-managed-services-default \\ --global \\ --purpose=VPC_PEERING \\ --prefix-length=16 \\ --description=\"peering range for Google\" \\ --network=default Run the gcloud services vpc-peerings connect command to create the private services access connection: gcloud services vpc-peerings connect \\ --service=servicenetworking.googleapis.com \\ --ranges=google-managed-services-default \\ --network=default Create Keycloak PostgreSQL instance gcloud beta sql instances create YOUR_KEYCLOAK_INSTANCE \\ --database-version=POSTGRES_13 \\ --cpu=1 \\ --memory=4GB \\ --region= \\ --root-password='DB_ROOT_PASSWORD' \\ --no-assign-ip \\ --network=default Run the gcloud sql instances patch command to allow only SSL connections for the instance. gcloud sql instances patch YOUR_KEYCLOAK_INSTANCE --require-ssl Create Keycloak database Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Select quickstart-instance. Open the Databases tab. Click Create database. In the New database dialog box, enter quickstart_db as the name of the database. Click Create. gcloud sql databases create YOUR_KEYCLOAK_DB \\ --instance=YOUR_KEYCLOAK_INSTANCE Create Keycloak user Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances To open the Overview page of an instance, click the instance name. Select Users from the SQL navigation menu. Click Add user account. In the Add a user account to instance instance_name page, add the following information: Username: Set to quickstart-user Password: Specify a password for your database user. Make a note of this for use in a later step of this quickstart. Click Add. 
gcloud sql users create YOUR_KEYCLOAK_USER \\ --instance=YOUR_KEYCLOAK_INSTANCE \\ --password='DB_PASS' Create new service account Run the gcloud iam service-accounts create command as follows to create a new service account: gcloud iam service-accounts create YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT \\ --display-name=\"GKE Keycloak Service Account\" Run the gcloud projects add-iam-policy-binding command as follows to add the Cloud SQL Client role to the Google Cloud service account you just created. Replace YOUR_PROJECT_ID with the project ID. gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \\ --member=\"serviceAccount:YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com\" \\ --role=\"roles/cloudsql.client\" Enable Workload Identity Grant access to Cloud SQL by binding the Kubernetes Service Account to the Google Cloud service account using Workload Identity. gcloud container clusters update CLUSTER_NAME \\ --region=COMPUTE_REGION \\ --workload-pool=PROJECT_ID.svc.id.goog Update the node pool if it has not been updated yet gcloud container node-pools update YOUR_NODE_GKE_CLUSTER_NODE_POOL \\ --cluster=YOUR_CLUSTER \\ --workload-metadata=GKE_METADATA Create a Kubernetes Service Account kubectl create sa YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME Enable IAM binding Run the gcloud iam service-accounts add-iam-policy-binding command as follows to enable IAM binding of the Google Cloud Service Account and the Kubernetes Service Account. Make the following replacements: gcloud iam service-accounts add-iam-policy-binding \\ --role=\"roles/iam.workloadIdentityUser\" \\ --member=\"serviceAccount:YOUR_PROJECT_ID.svc.id.goog[YOUR_K8S_NAMESPACE/YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME]\" \\ YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Annotate the Kubernetes Service Account Run the kubectl annotate command as follows to annotate the Kubernetes Service Account with IAM binding. 
Make the following replacements: kubectl annotate serviceaccount \\ YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME \\ iam.gke.io/gcp-service-account=YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Use your new service account in your deployment spec : serviceAccountName : YOUR_KUBERNETES_SERVICE_ACCOUNT Configure secrets kubectl create secret generic YOUR_KEYCLOAK_SECRET_NAME \\ --from-literal=database=YOUR_KEYCLOAK_DB_NAME \\ --from-literal=username=YOUR_KEYCLOAK_USER_NAME \\ --from-literal=password=YOUR_KEYCLOAK_DB_PASS \\ --from-literal=postgre_url=jdbc:postgresql://127.0.0.1:5432/YOUR_KEYCLOAK_DB_NAME Set the environment variables in the Keycloak deployment. env : - name : DB_VENDOR value : postgres - name : DB_USER valueFrom : secretKeyRef : name : YOUR_KEYCLOAK_SECRET_NAME key : username - name : DB_PASSWORD valueFrom : secretKeyRef : name : YOUR_KEYCLOAK_SECRET_NAME key : password - name : DB_DATABASE valueFrom : secretKeyRef : name : YOUR_KEYCLOAK_SECRET_NAME key : database - name : DB_ADDR value : 127.0.0.1 - name : cloud-sql-proxy # This uses the latest version of the Cloud SQL proxy # It is recommended to use a specific version for production environments. # See: https://github.com/GoogleCloudPlatform/cloudsql-proxy image : gcr.io/cloudsql-docker/gce-proxy:latest command : - \"/cloud_sql_proxy\" # If connecting from a VPC-native GKE cluster, you can use the # following flag to have the proxy connect over private IP - \"-ip_address_types=PRIVATE\" # tcp should be set to the port the proxy should listen on # and should match the DB_PORT value set above. # Defaults: MySQL: 3306, Postgres: 5432, SQLServer: 1433 - \"-instances==tcp:5432\" securityContext : # The default Cloud SQL proxy image runs as the # \"nonroot\" user and group (uid: 65532) by default. 
runAsNonRoot : true","title":"GCP Cloud Keycloak PostgreSQL instances"},{"location":"tutorials/application-development/file-upload/","text":"File Upload Overview This sample shows how to create a simple web application for uploading files. Steps Create a project named file-upload-project . Right click on the file-upload-project project and select New \u2192 File . Enter tsconfig.json for the name of the File. Replace the content with the following: { \"compilerOptions\" : { \"module\" : \"ESNext\" } } Right click on the file-upload-project project and select New \u2192 File . Enter project.json for the name of the File. Replace the content with the following: { \"guid\" : \"file-upload-project\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } TypeScript Compilation The tsconfig.json and project.json files are needed for the compilation of the TypeScript files. In order to run the compilation a Publish action should be performed on the Project level (right click on the project and select Publish ) . Right click on the file-upload-project project and select New \u2192 TypeScript Service . Enter service.ts for the name of the TypeScript Service. Replace the content with the following code: import { upload , request , response } from \"sdk/http\" ; import { cmis } from \"sdk/cms\" ; import { streams } from \"sdk/io\" ; if ( request . getMethod () === \"POST\" ) { if ( upload . isMultipartContent ()) { const fileItems = upload . parseRequest (); for ( let i = 0 ; i < fileItems . size (); i ++ ) { const fileItem = fileItems . get ( i ); const fileName = fileItem . getName (); const contentType = fileItem . getContentType (); const bytes = fileItem . getBytes (); const inputStream = streams . createByteArrayInputStream ( bytes ); const cmisSession = cmis . getSession (); const contentStream = cmisSession . 
getObjectFactory (). createContentStream ( fileName , bytes . length , contentType , inputStream ); cmisSession . createDocument ( \"file-upload-project/uploads\" , { [ cmis . OBJECT_TYPE_ID ] : cmis . OBJECT_TYPE_DOCUMENT , [ cmis . NAME ] : fileName }, contentStream , cmis . VERSIONING_STATE_MAJOR ); } response . sendRedirect ( \"/services/web/ide-documents/\" ); } else { response . println ( \"The request's content must be 'multipart'\" ); } } else { response . println ( \"Use POST request.\" ); } response . flush (); response . close (); Save & Publish In order to run the compilation of TypeScript files, a Publish action should be performed on the Project level (right click on the project and select Publish ) . http/upload Take a look at the http/upload documentation for more details about the API. Right click on the file-upload-project project and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Replace the content with the following code: < html > < body > < form action = \"/services/ts/file-upload-project/service.ts\" method = \"post\" enctype = \"multipart/form-data\" > < label for = \"file\" > Filename: < input type = \"file\" name = \"file\" id = \"file\" multiple > < br > < input type = \"submit\" name = \"submit\" value = \"Submit\" > < p >< b > Note: After successful upload you'll be redirected to the < a href = \"/services/web/ide-documents/\" > Documents perspective where the file can be found under the < b > file-upload-project/uploads folder. Save & Publish Saving the files will trigger a Publish action, which will build and deploy the TypeScript Service and the HTML5 Page . Select the index.html file and open the Preview view to test the file upload. Summary Tutorial Completed After completing the steps in this tutorial, you would have: HTML page to submit the uploaded file to the TypeScript service. Backend TypeScript service that handles the uploaded file. 
Note: The complete content of the File Upload tutorial is available at: https://github.com/dirigiblelabs/tutorial-file-upload-project","title":"File Upload"},{"location":"tutorials/application-development/file-upload/#file-upload","text":"","title":"File Upload"},{"location":"tutorials/application-development/file-upload/#overview","text":"This sample shows how to create a simple web application for uploading files.","title":"Overview"},{"location":"tutorials/application-development/file-upload/#steps","text":"Create a project named file-upload-project . Right click on the file-upload-project project and select New \u2192 File . Enter tsconfig.json for the name of the File. Replace the content with the following: { \"compilerOptions\" : { \"module\" : \"ESNext\" } } Right click on the file-upload-project project and select New \u2192 File . Enter project.json for the name of the File. Replace the content with the following: { \"guid\" : \"file-upload-project\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } TypeScript Compilation The tsconfig.json and project.json files are needed for the compilation of the TypeScript files. In order to run the compilation a Publish action should be performed on the Project level (right click on the project and select Publish ) . Right click on the file-upload-project project and select New \u2192 TypeScript Service . Enter service.ts for the name of the TypeScript Service. Replace the content with the following code: import { upload , request , response } from \"sdk/http\" ; import { cmis } from \"sdk/cms\" ; import { streams } from \"sdk/io\" ; if ( request . getMethod () === \"POST\" ) { if ( upload . isMultipartContent ()) { const fileItems = upload . parseRequest (); for ( let i = 0 ; i < fileItems . size (); i ++ ) { const fileItem = fileItems . 
get ( i ); const fileName = fileItem . getName (); const contentType = fileItem . getContentType (); const bytes = fileItem . getBytes (); const inputStream = streams . createByteArrayInputStream ( bytes ); const cmisSession = cmis . getSession (); const contentStream = cmisSession . getObjectFactory (). createContentStream ( fileName , bytes . length , contentType , inputStream ); cmisSession . createDocument ( \"file-upload-project/uploads\" , { [ cmis . OBJECT_TYPE_ID ] : cmis . OBJECT_TYPE_DOCUMENT , [ cmis . NAME ] : fileName }, contentStream , cmis . VERSIONING_STATE_MAJOR ); } response . sendRedirect ( \"/services/web/ide-documents/\" ); } else { response . println ( \"The request's content must be 'multipart'\" ); } } else { response . println ( \"Use POST request.\" ); } response . flush (); response . close (); Save & Publish In order to run the compilation of TypeScript files, a Publish action should be performed on the Project level (right click on the project and select Publish ) . http/upload Take a look at the http/upload documentation for more details about the API. Right click on the file-upload-project project and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Replace the content with the following code: < html > < body > < form action = \"/services/ts/file-upload-project/service.ts\" method = \"post\" enctype = \"multipart/form-data\" > < label for = \"file\" > Filename: < input type = \"file\" name = \"file\" id = \"file\" multiple > < br > < input type = \"submit\" name = \"submit\" value = \"Submit\" > < p >< b > Note: After successful upload you'll be redirected to the < a href = \"/services/web/ide-documents/\" > Documents perspective where the file can be found under the < b > file-upload-project/uploads folder. Save & Publish Saving the files will trigger a Publish action, which will build and deploy the TypeScript Service and the HTML5 Page . 
Select the index.html file and open the Preview view to test the file upload.","title":"Steps"},{"location":"tutorials/application-development/file-upload/#summary","text":"Tutorial Completed After completing the steps in this tutorial, you would have: HTML page to submit the uploaded file to the TypeScript service. Backend TypeScript service that handles the uploaded file. Note: The complete content of the File Upload tutorial is available at: https://github.com/dirigiblelabs/tutorial-file-upload-project","title":"Summary"},{"location":"tutorials/application-development/kafka/","text":"Kafka Producer and Consumer Prerequisites Run a local Kafka server following the steps (1 and 2) from here: https://kafka.apache.org/quickstart Steps Create a project kafka_project Then create a JavaScript service named my_kafka_handler.js Replace the service code with the following content: Handler exports . onMessage = function ( message ) { console . log ( \"Hello from My Kafka Listener! Message: \" + message ); }; exports . onError = function ( error ) { console . error ( \"Error from My Kafka Listener! Error: \" + error ); }; Then create a Kafka Consumer named my_kafka_consumer.js Replace the file content with the following code: var consumer = require ( \"kafka/consumer\" ); consumer . topic ( \"topic1\" , \"{}\" ). startListening ( \"kafka_project/my_kafka_handler\" , 1000 ); Then create another back-end service which will play the role of a trigger my_kafka_producer.js Replace the trigger content with the following code: var producer = require ( \"kafka/producer\" ); producer . topic ( \"topic1\" , \"{}\" ). send ( \"key1\" , \"value1\" ); Publish the project Select the my_kafka_producer.js file in the Workspace view to be able to trigger the invocation of this service via the Preview view In the Console view you should see the following lines: 2020-11-01 23:33:54.272 [INFO ] [Thread-275] o.e.dirigible.api.v3.core.Console - Hello from My Kafka Listener! 
Message: {\"topic\":\"topic1\",\"partition\":0,\"offset\":29,\"timestamp\":1604266434251,\"timestampType\":\"CREATE_TIME\",\"serializedKeySize\":4,\"serializedValueSize\":6,\"headers\":{\"headers\":[],\"isReadOnly\":false},\"key\":\"key1\",\"value\":\"value1\",\"leaderEpoch\":{\"value\":0}} Note: the log messages in the Console view are in reverse order - the newest are on top For more information, see the API documentation.","title":"Kafka Producer and Consumer"},{"location":"tutorials/application-development/kafka/#kafka-producer-and-counsmer","text":"","title":"Kafka Producer and Consumer"},{"location":"tutorials/application-development/kafka/#prerequisites","text":"Run a local Kafka server following the steps (1 and 2) from here: https://kafka.apache.org/quickstart","title":"Prerequisites"},{"location":"tutorials/application-development/kafka/#steps","text":"Create a project kafka_project Then create a JavaScript service named my_kafka_handler.js Replace the service code with the following content:","title":"Steps"},{"location":"tutorials/application-development/kafka/#handler","text":"exports . onMessage = function ( message ) { console . log ( \"Hello from My Kafka Listener! Message: \" + message ); }; exports . onError = function ( error ) { console . error ( \"Error from My Kafka Listener! Error: \" + error ); }; Then create a Kafka Consumer named my_kafka_consumer.js Replace the file content with the following code: var consumer = require ( \"kafka/consumer\" ); consumer . topic ( \"topic1\" , \"{}\" ). startListening ( \"kafka_project/my_kafka_handler\" , 1000 ); Then create another back-end service which will play the role of a trigger my_kafka_producer.js Replace the trigger content with the following code: var producer = require ( \"kafka/producer\" ); producer . topic ( \"topic1\" , \"{}\" ). 
send ( \"key1\" , \"value1\" ); Publish the project Select the my_kafka_producer.js file in the Workspace view to be able to trigger the invocation of this service via the Preview view In the Console view you should see the following lines: 2020-11-01 23:33:54.272 [INFO ] [Thread-275] o.e.dirigible.api.v3.core.Console - Hello from My Kafka Listener! Message: {\"topic\":\"topic1\",\"partition\":0,\"offset\":29,\"timestamp\":1604266434251,\"timestampType\":\"CREATE_TIME\",\"serializedKeySize\":4,\"serializedValueSize\":6,\"headers\":{\"headers\":[],\"isReadOnly\":false},\"key\":\"key1\",\"value\":\"value1\",\"leaderEpoch\":{\"value\":0}} Note: the log messages in the Console view are in reverse order - the newest are on top For more information, see the API documentation.","title":"Handler"},{"location":"tutorials/application-development/listener-queue/","text":"Listener of a Queue Steps Create a project message_queue_listener_project Then create a JavaScript service named my_listener_handler.js Replace the service code with the following content: Handler exports . onMessage = function ( message ) { console . log ( \"Hello from My Listener! Message: \" + message ); }; exports . onError = function ( error ) { console . error ( \"Error from My Listener! Error: \" + error ); }; Then create a Message Listener named my_listener.listener Replace the file content with the following JSON code: { \"name\" : \"message_queue_listener_project/my_queue\" , \"type\" : \"Q\" , \"handler\" : \"message_queue_listener_project/my_listener_handler.js\" , \"description\" : \"My Listener\" } Then create another back-end service which will play the role of a trigger my_trigger.js Replace the trigger content with the following code: var producer = require ( 'messaging/v3/producer' ); var message = \"*** I am a message created at: \" + new Date () + \" ***\" ; producer . queue ( \"message_queue_listener_project/my_queue\" ). send ( message ); console . log ( \"Hello from My Trigger!
Message: \" + message ); Publish the project Select the my_trigger.js file in the Workspace view to be able to trigger the invocation of this service via the Preview view In the Console view you should see the following lines: [2018-05-14T11:57:13.197Z] [INFO] Hello from My Listener! Message: I am a message created at: Mon May 14 2018 14:57:13 GMT+0300 (EEST) [2018-05-14T11:57:13.174Z] [INFO] Hello from My Trigger! Message: I am a message created at: Mon May 14 2018 14:57:13 GMT+0300 (EEST) Note: the log messages in the Console view are in reverse order - the newest are on top For more information, see the API documentation.","title":"Listener of a Queue"},{"location":"tutorials/application-development/listener-queue/#listener-of-a-queue","text":"","title":"Listener of a Queue"},{"location":"tutorials/application-development/listener-queue/#steps","text":"Create a project message_queue_listener_project Then create a JavaScript service named my_listener_handler.js Replace the service code with the following content:","title":"Steps"},{"location":"tutorials/application-development/listener-queue/#handler","text":"exports . onMessage = function ( message ) { console . log ( \"Hello from My Listener! Message: \" + message ); }; exports . onError = function ( error ) { console . error ( \"Error from My Listener! Error: \" + error ); }; Then create a Message Listener named my_listener.listener Replace the file content with the following JSON code: { \"name\" : \"message_queue_listener_project/my_queue\" , \"type\" : \"Q\" , \"handler\" : \"message_queue_listener_project/my_listener_handler.js\" , \"description\" : \"My Listener\" } Then create another back-end service which will play the role of a trigger my_trigger.js Replace the trigger content with the following code: var producer = require ( 'messaging/v3/producer' ); var message = \"*** I am a message created at: \" + new Date () + \" ***\" ; producer . queue ( \"message_queue_listener_project/my_queue\" ).
send ( message ); console . log ( \"Hello from My Trigger! Message: \" + message ); Publish the project Select the my_trigger.js file in the Workspace view to be able to trigger the invocation of this service via the Preview view In the Console view you should see the following lines: [2018-05-14T11:57:13.197Z] [INFO] Hello from My Listener! Message: I am a message created at: Mon May 14 2018 14:57:13 GMT+0300 (EEST) [2018-05-14T11:57:13.174Z] [INFO] Hello from My Trigger! Message: I am a message created at: Mon May 14 2018 14:57:13 GMT+0300 (EEST) Note: the log messages in the Console view are in reverse order - the newest are on top For more information, see the API documentation.","title":"Handler"},{"location":"tutorials/application-development/shell-command/","text":"Shell Command Steps Create a project shell_command_project Then create a file named my_command.sh Replace the code with the following content: uname -an echo variable1=$variable1 Then create a Command named my_command.command Replace the content with the following JSON code: { \"description\" : \"command description\" , \"contentType\" : \"text/plain\" , \"commands\" :[ { \"os\" : \"mac\" , \"command\" : \"sh shell_command_project/my_command.sh\" }, { \"os\" : \"linux\" , \"command\" : \"sh shell_command_project/my_command.sh\" } ], \"set\" :{ \"variable1\" : \"value1\" }, \"unset\" :[ \"variable2\" ] } Publish the project Select the *.command file in the Workspace explorer and inspect the result in the Preview: Darwin XXXXXXXXXXXXX 17.7.0 Darwin Kernel Version 17.7.0: Thu Jun 21 22:53:14 PDT 2018; root:xnu-4570.71.2~1/RELEASE_X86_64 x86_64 variable1=value1 Note: The working folder is set to the registry/public space under the file-based Repository. You can execute an arbitrary command, e.g. Node, Python, Julia, etc., by using the Dirigible projects' content published and available under the registry space.
For this case, the given framework has to be set up in advance and the entry point executable added to the PATH environment variable. The standard output is redirected to the service response. For more information, see the API documentation.","title":"Shell Command"},{"location":"tutorials/application-development/shell-command/#shell-command","text":"","title":"Shell Command"},{"location":"tutorials/application-development/shell-command/#steps","text":"Create a project shell_command_project Then create a file named my_command.sh Replace the code with the following content: uname -an echo variable1=$variable1 Then create a Command named my_command.command Replace the content with the following JSON code: { \"description\" : \"command description\" , \"contentType\" : \"text/plain\" , \"commands\" :[ { \"os\" : \"mac\" , \"command\" : \"sh shell_command_project/my_command.sh\" }, { \"os\" : \"linux\" , \"command\" : \"sh shell_command_project/my_command.sh\" } ], \"set\" :{ \"variable1\" : \"value1\" }, \"unset\" :[ \"variable2\" ] } Publish the project Select the *.command file in the Workspace explorer and inspect the result in the Preview: Darwin XXXXXXXXXXXXX 17.7.0 Darwin Kernel Version 17.7.0: Thu Jun 21 22:53:14 PDT 2018; root:xnu-4570.71.2~1/RELEASE_X86_64 x86_64 variable1=value1 Note: The working folder is set to the registry/public space under the file-based Repository. You can execute an arbitrary command, e.g. Node, Python, Julia, etc., by using the Dirigible projects' content published and available under the registry space. For this case, the given framework has to be set up in advance and the entry point executable added to the PATH environment variable. The standard output is redirected to the service response.
For more information, see the API documentation.","title":"Steps"},{"location":"tutorials/application-development/bookstore/","text":"Bookstore Application Overview This sample shows how to create a simple web application for managing a single entity called Books . It contains a database table definition, a RESTful service and a web page for managing the instances via a user interface. Sections Database Table and Data Layer REST API User Interface Note The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"Bookstore Application"},{"location":"tutorials/application-development/bookstore/#bookstore-application","text":"","title":"Bookstore Application"},{"location":"tutorials/application-development/bookstore/#overview","text":"This sample shows how to create a simple web application for managing a single entity called Books . It contains a database table definition, a RESTful service and a web page for managing the instances via a user interface.","title":"Overview"},{"location":"tutorials/application-development/bookstore/#sections","text":"Database Table and Data Layer REST API User Interface Note The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"Sections"},{"location":"tutorials/application-development/bookstore/api/","text":"Bookstore Application - API Overview This section shows how to create the API layer for the Bookstore application. It contains a Books REST API . Steps REST API Right click on the babylon-project project and select New \u2192 Folder . Enter api for the name of the folder. Right click on the api folder and select New \u2192 TypeScript Service . Enter books.ts for the name of the TypeScript Service. Replace the content with the following code: import { rs } from \"sdk/http\" ; import { BookRepository , Book } from '../data/BookRepository' ; const repository = new BookRepository (); rs .
service () . resource ( \"\" ) . get ( function ( ctx , request , response ) { const entities : Book [] = repository . list (); response . setContentType ( \"application/json\" ); response . setStatus ( 200 ); response . println ( JSON . stringify ( entities )); }). produces ([ \"application/json\" ]) . resource ( \"{id}\" ) . get ( function ( ctx , request , response ) { const id : number = ctx . pathParameters . id ; const entity : Book = repository . findById ( id ); response . setContentType ( \"application/json\" ); if ( entity ) { response . setStatus ( 200 ); response . println ( JSON . stringify ( entity )); } else { response . setStatus ( 404 ); response . println ( JSON . stringify ({ code : 404 , message : \"Book not found\" })); } }). produces ([ \"application/json\" ]) . resource ( \"/count\" ) . get ( function ( ctx , request , response ) { const count : number = repository . count (); response . setStatus ( 200 ); response . println ( ` ${ count } ` ); }) . resource ( \"\" ) . post ( function ( ctx , request , response ) { const entity = request . getJSON (); entity . id = repository . create ( entity ); response . setContentType ( \"application/json\" ); response . setStatus ( 201 ); response . setHeader ( \"Content-Location\" , `/services/ts/babylon-project/service/Books.ts/ ${ entity . id } ` ); response . println ( JSON . stringify ( entity )); }). produces ([ \"application/json\" ]) . resource ( \"{id}\" ) . put ( function ( ctx , request , response ) { const entity = request . getJSON (); entity . id = ctx . pathParameters . id ; repository . update ( entity ); response . setContentType ( \"application/json\" ); response . setStatus ( 200 ); response . println ( JSON . stringify ( entity )); }). produces ([ \"application/json\" ]) . resource ( \"{id}\" ) . delete ( function ( ctx , request , response ) { const id : number = ctx . pathParameters . id ; const entity : Book = repository . findById ( id ); if ( entity ) { repository . 
deleteById ( id ); response . setStatus ( 204 ); } else { response . setContentType ( \"application/json\" ); response . setStatus ( 404 ); response . println ( JSON . stringify ({ code : 404 , message : \"Book not found\" })); } }). produces ([ \"*/*\" ]) . execute (); Save & Publish After saving the file, right click on the project and select Publish in order to run the compilation and the deployment of the TypeScript Service . The tsconfig.json and project.json files should be present at the project root in order to run the compilation (they can be found in the Bookstore Application - Database tutorial) . REST API Execution A GET request to the root path of the REST API is triggered by selecting the books.ts file and opening the Preview view. The TypeScript Service is available at the http://localhost:8080/services/ts/babylon-project/api/books.ts URL. It can be accessed in a separate browser tab, consumed by a third-party application or API tools like Postman or cURL . http/rs Take a look at the http/rs documentation for more details about the API. OpenAPI Right click on the babylon-project/api folder and select New \u2192 File . Enter books.openapi for the name of the file. Replace the content with the following definition: openapi : 3.0.3 info : title : Bookstore Application description : Bookstore application based on the following tutorial - [https://www.dirigible.io/help/tutorials/application-development/bookstore/](https://www.dirigible.io/help/tutorials/application-development/bookstore/).
contact : name : Eclipse Dirigible url : https://dirigible.io license : name : Eclipse Public License - v 2.0 url : https://github.com/dirigiblelabs/tutorial-babylon-project/blob/master/LICENSE version : 1.0.0 servers : - url : /services/ts tags : - name : Books paths : /babylon-project/api/books.ts : get : tags : - Books responses : 200 : content : application/json : schema : type : array items : $ref : '#/components/schemas/Book' post : tags : - Books requestBody : required : true content : application/json : schema : $ref : '#/components/schemas/Book' responses : 201 : content : application/json : schema : $ref : '#/components/schemas/Book' 400 : content : application/json : schema : $ref : '#/components/schemas/Error' /babylon-project/api/books.ts/{id} : get : tags : - Books parameters : - name : id in : path required : true schema : type : integer example : 10001 responses : 200 : content : application/json : schema : $ref : '#/components/schemas/Book' 404 : content : application/json : schema : $ref : '#/components/schemas/Error' example : code : 404 message : Not Found put : tags : - Books parameters : - name : id in : path required : true schema : type : integer example : 10001 requestBody : required : true content : application/json : schema : $ref : '#/components/schemas/Book' responses : 200 : content : application/json : schema : $ref : '#/components/schemas/Book' 400 : content : application/json : schema : $ref : '#/components/schemas/Error' 404 : content : application/json : schema : $ref : '#/components/schemas/Error' example : code : 404 message : Not Found delete : tags : - Books parameters : - name : id in : path required : true schema : type : integer example : 10000 responses : 204 : description : The resource was deleted successfully. 
404 : content : application/json : schema : $ref : '#/components/schemas/Error' example : code : 404 message : Not Found components : schemas : Error : type : object properties : code : type : integer example : 400 message : type : string example : Bad Request Book : type : object properties : id : type : integer isbn : type : string maxLength : 17 pattern : ^\\d{3}-\\d{1}-\\d{3}-\\d{5}-\\d{1}$ example : 978-1-599-86977-3 title : type : string maxLength : 120 example : The Art of War publisher : type : string maxLength : 120 example : Filiquarian date : type : string format : date example : \"2006-01-01\" price : type : number format : float minimum : 0 example : 18.99 Save & Publish Saving the file will trigger a Publish action, which will build and deploy the OpenAPI definition. To display the embedded SwaggerUI select the books.openapi file and open the Preview view. The SwaggerUI can be accessed at http://localhost:8080/services/web/ide-swagger/ui/index.html?openapi=/services/web/babylon-project/api/books.openapi Note: All published OpenAPI definitions can be seen at http://localhost:8080/services/web/ide-swagger/ui/ Next Steps Section Completed After completing the steps in this tutorial, you would have: REST API and business logic to perform CRUD operations on the Book entity. Continue to the User Interface section to build a UI for the Book entity. Note: The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"API"},{"location":"tutorials/application-development/bookstore/api/#bookstore-application-api","text":"","title":"Bookstore Application - API"},{"location":"tutorials/application-development/bookstore/api/#overview","text":"This section shows how to create the API layer for the Bookstore application. 
It contains a Books REST API .","title":"Overview"},{"location":"tutorials/application-development/bookstore/api/#steps","text":"","title":"Steps"},{"location":"tutorials/application-development/bookstore/api/#rest-api","text":"Right click on the babylon-project project and select New \u2192 Folder . Enter api for the name of the folder. Right click on the api folder and select New \u2192 TypeScript Service . Enter books.ts for the name of the TypeScript Service. Replace the content with the following code: import { rs } from \"sdk/http\" ; import { BookRepository , Book } from '../data/BookRepository' ; const repository = new BookRepository (); rs . service () . resource ( \"\" ) . get ( function ( ctx , request , response ) { const entities : Book [] = repository . list (); response . setContentType ( \"application/json\" ); response . setStatus ( 200 ); response . println ( JSON . stringify ( entities )); }). produces ([ \"application/json\" ]) . resource ( \"{id}\" ) . get ( function ( ctx , request , response ) { const id : number = ctx . pathParameters . id ; const entity : Book = repository . findById ( id ); response . setContentType ( \"application/json\" ); if ( entity ) { response . setStatus ( 200 ); response . println ( JSON . stringify ( entity )); } else { response . setStatus ( 404 ); response . println ( JSON . stringify ({ code : 404 , message : \"Book not found\" })); } }). produces ([ \"application/json\" ]) . resource ( \"/count\" ) . get ( function ( ctx , request , response ) { const count : number = repository . count (); response . setStatus ( 200 ); response . println ( ` ${ count } ` ); }) . resource ( \"\" ) . post ( function ( ctx , request , response ) { const entity = request . getJSON (); entity . id = repository . create ( entity ); response . setContentType ( \"application/json\" ); response . setStatus ( 201 ); response . setHeader ( \"Content-Location\" , `/services/ts/babylon-project/service/Books.ts/ ${ entity . id } ` ); response .
println ( JSON . stringify ( entity )); }). produces ([ \"application/json\" ]) . resource ( \"{id}\" ) . put ( function ( ctx , request , response ) { const entity = request . getJSON (); entity . id = ctx . pathParameters . id ; repository . update ( entity ); response . setContentType ( \"application/json\" ); response . setStatus ( 200 ); response . println ( JSON . stringify ( entity )); }). produces ([ \"application/json\" ]) . resource ( \"{id}\" ) . delete ( function ( ctx , request , response ) { const id : number = ctx . pathParameters . id ; const entity : Book = repository . findById ( id ); if ( entity ) { repository . deleteById ( id ); response . setStatus ( 204 ); } else { response . setContentType ( \"application/json\" ); response . setStatus ( 404 ); response . println ( JSON . stringify ({ code : 404 , message : \"Book not found\" })); } }). produces ([ \"*/*\" ]) . execute (); Save & Publish After saving the file, right click on the project and select Publish in order to run the compilation and the deployment of the TypeScript Service . The tsconfig.json and project.json files should be present at the project root in order to run the compilation (they can be found in the Bookstore Application - Database tutorial) . REST API Execution A GET request to the root path of the REST API is triggered by selecting the books.ts file and opening the Preview view. The TypeScript Service is available at the http://localhost:8080/services/ts/babylon-project/api/books.ts URL. It can be accessed in a separate browser tab, consumed by a third-party application or API tools like Postman or cURL . http/rs Take a look at the http/rs documentation for more details about the API.","title":"REST API"},{"location":"tutorials/application-development/bookstore/api/#openapi","text":"Right click on the babylon-project/api folder and select New \u2192 File . Enter books.openapi for the name of the file.
Replace the content with the following definition: openapi : 3.0.3 info : title : Bookstore Application description : Bookstore application based on the following tutorial - [https://www.dirigible.io/help/tutorials/application-development/bookstore/](https://www.dirigible.io/help/tutorials/application-development/bookstore/). contact : name : Eclipse Dirigible url : https://dirigible.io license : name : Eclipse Public License - v 2.0 url : https://github.com/dirigiblelabs/tutorial-babylon-project/blob/master/LICENSE version : 1.0.0 servers : - url : /services/ts tags : - name : Books paths : /babylon-project/api/books.ts : get : tags : - Books responses : 200 : content : application/json : schema : type : array items : $ref : '#/components/schemas/Book' post : tags : - Books requestBody : required : true content : application/json : schema : $ref : '#/components/schemas/Book' responses : 201 : content : application/json : schema : $ref : '#/components/schemas/Book' 400 : content : application/json : schema : $ref : '#/components/schemas/Error' /babylon-project/api/books.ts/{id} : get : tags : - Books parameters : - name : id in : path required : true schema : type : integer example : 10001 responses : 200 : content : application/json : schema : $ref : '#/components/schemas/Book' 404 : content : application/json : schema : $ref : '#/components/schemas/Error' example : code : 404 message : Not Found put : tags : - Books parameters : - name : id in : path required : true schema : type : integer example : 10001 requestBody : required : true content : application/json : schema : $ref : '#/components/schemas/Book' responses : 200 : content : application/json : schema : $ref : '#/components/schemas/Book' 400 : content : application/json : schema : $ref : '#/components/schemas/Error' 404 : content : application/json : schema : $ref : '#/components/schemas/Error' example : code : 404 message : Not Found delete : tags : - Books parameters : - name : id in : path required : 
true schema : type : integer example : 10000 responses : 204 : description : The resource was deleted successfully. 404 : content : application/json : schema : $ref : '#/components/schemas/Error' example : code : 404 message : Not Found components : schemas : Error : type : object properties : code : type : integer example : 400 message : type : string example : Bad Request Book : type : object properties : id : type : integer isbn : type : string maxLength : 17 pattern : ^\\d{3}-\\d{1}-\\d{3}-\\d{5}-\\d{1}$ example : 978-1-599-86977-3 title : type : string maxLength : 120 example : The Art of War publisher : type : string maxLength : 120 example : Filiquarian date : type : string format : date example : \"2006-01-01\" price : type : number format : float minimum : 0 example : 18.99 Save & Publish Saving the file will trigger a Publish action, which will build and deploy the OpenAPI definition. To display the embedded SwaggerUI select the books.openapi file and open the Preview view. The SwaggerUI can be accessed at http://localhost:8080/services/web/ide-swagger/ui/index.html?openapi=/services/web/babylon-project/api/books.openapi Note: All published OpenAPI definitions can be seen at http://localhost:8080/services/web/ide-swagger/ui/","title":"OpenAPI"},{"location":"tutorials/application-development/bookstore/api/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would have: REST API and business logic to perform CRUD operations on the Book entity. Continue to the User Interface section to build a UI for the Book entity. Note: The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"Next Steps"},{"location":"tutorials/application-development/bookstore/database/","text":"Bookstore Application - Database Overview This section shows how to create the database layer for the Bookstore application. 
It contains a database table definition for the BOOKS table, CSV data, CSVIM import definition and TypeScript Repository class. Steps Table Definition Create a project named babylon-project . Right click on the babylon-project project and select New \u2192 Folder . Enter data for the name of the folder. Right click on the data folder and select New \u2192 Database Table . Enter BABYLON_BOOKS.table for the name of the database table descriptor. Right click on BABYLON_BOOKS.table and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"name\" : \"BABYLON_BOOKS\" , \"type\" : \"TABLE\" , \"columns\" : [ { \"name\" : \"BOOK_ID\" , \"type\" : \"INTEGER\" , \"primaryKey\" : true , \"identity\" : \"true\" , \"unique\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_ISBN\" , \"type\" : \"CHAR\" , \"length\" : \"17\" , \"unique\" : true , \"primaryKey\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_TITLE\" , \"type\" : \"VARCHAR\" , \"length\" : \"120\" , \"primaryKey\" : false , \"unique\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_PUBLISHER\" , \"type\" : \"VARCHAR\" , \"length\" : \"120\" , \"primaryKey\" : false , \"unique\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_DATE\" , \"type\" : \"DATE\" , \"nullable\" : true , \"unique\" : false }, { \"name\" : \"BOOK_PRICE\" , \"type\" : \"DOUBLE\" , \"primaryKey\" : false , \"unique\" : false , \"nullable\" : false } ], \"dependencies\" : [] } Save the changes and close the Code Editor . Double click on BABYLON_BOOKS.table to view the definition with the Table Editor . Save & Publish Saving the file will trigger a Publish action, which will create the database table in the target database schema. Usually this action should take several seconds to complete, after which the database table would be visible in the Database Perspective . Note: Manual Publish can be performed by right clicking on the artifact and selecting Publish from the context menu. 
The Publish action can be performed also on project level. CSV Data Right click on the babylon-project/data folder and select New \u2192 File . Enter books.csv for the name of the file. Right click on books.csv and select Open With \u2192 Code Editor . Paste the following CSV data: BOOK_ID,BOOK_ISBN,BOOK_TITLE,BOOK_PUBLISHER,BOOK_DATE,BOOK_PRICE 10001,978-3-598-21500-1,Beartown,Simon & Schuster,2019-05-01,17.0 10002,978-3-598-21501-8,Beneath a Scarlet Sky,Lake Union Publishing,2017-05-01,9.74 10003,978-3-598-21529-2,Dead Certain,Free Press,2007-09-04,7.19 10004,978-3-598-21550-6,Everything We Keep,Lake Union Publishing,2016-08-01,14.65 10005,978-3-598-21550-9,Exit West,Hamish Hamilton,2017-02-27,11.45 Save the changes and close the Code Editor . Double click on books.csv to view the data with the CSV Editor . CSVIM Right click on the babylon-project/data folder and select New \u2192 File . Enter books.csvim for the name of the file. Right click on books.csvim and select Open With \u2192 Code Editor . Paste the following CSVIM definition: { \"files\" : [ { \"table\" : \"BABYLON_BOOKS\" , \"schema\" : \"PUBLIC\" , \"file\" : \"/babylon-project/data/books.csv\" , \"header\" : true , \"useHeaderNames\" : true , \"delimField\" : \",\" , \"delimEnclosing\" : \"\\\"\" , \"distinguishEmptyFromNull\" : true , \"version\" : \"\" } ] } Save the changes and close the Code Editor . Double click on books.csvim to view the definition with the CSVIM Editor . Save & Publish Once the file is saved, a Publish action will be triggered, which will result in the data from the CSV file being imported into the database table. Note: Navigate to the Database Perspective to check that the BABYLON_BOOKS table is created and perform the following SQL query to check that the data from the CSV file is imported. select * from BABYLON_BOOKS ; Repository Right click on the babylon-project project and select New \u2192 File . Enter tsconfig.json for the name of the File.
Replace the content with the following: { \"compilerOptions\" : { \"module\" : \"ESNext\" } } Right click on the babylon-project project and select New \u2192 File . Enter project.json for the name of the File. Replace the content with the following: { \"guid\" : \"babylon-project\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } TypeScript Compilation The tsconfig.json and project.json files are needed for the compilation of the TypeScript files. In order to run the compilation a Publish action should be performed on the Project level (right click on the project and select Publish ) . Right click on the babylon-project/data folder and select New \u2192 TypeScript Service . Enter BookRepository.ts for the name of the TypeScript Service. Replace the content with the following code: import { dao as daoApi } from \"sdk/db\" export interface Book { readonly id? : number ; readonly isbn : string ; readonly title : string ; readonly publisher : string ; readonly date : Date ; readonly price : number ; } export class BookRepository { private repository ; constructor ( dataSourceName? : string , logCtxName? : string ) { this . repository = daoApi . 
create ({ table : \"BABYLON_BOOKS\" , properties : [ { name : \"id\" , column : \"BOOK_ID\" , type : \"INTEGER\" , id : true , required : true }, { name : \"isbn\" , column : \"BOOK_ISBN\" , type : \"CHAR\" , id : false , required : false }, { name : \"title\" , column : \"BOOK_TITLE\" , type : \"VARCHAR\" , id : false , required : false }, { name : \"publisher\" , column : \"BOOK_PUBLISHER\" , type : \"VARCHAR\" , id : false , required : false }, { name : \"date\" , column : \"BOOK_DATE\" , type : \"DATE\" , id : false , required : true }, { name : \"price\" , column : \"BOOK_PRICE\" , type : \"DOUBLE\" , id : false , required : true }] }, logCtxName , dataSourceName ); } public list = ( settings ? ) : Book [] => { return this . repository . list ( settings ); }; public findById = ( id : number ) : Book | null => { return this . repository . find ( id ); }; public create = ( entity : Book ) : Book => { return this . repository . insert ( entity ); }; public update = ( entity : Book ) : Book => { return this . repository . update ( entity ); }; public deleteById = ( id : number ) : void => { this . repository . remove ( id ); }; public count = () : number => { return this . repository . count (); } } Save & Publish In order to run the compilation of TypeScript files a Publish action should be performed on the Project level (right click on the project and select Publish ) . db/dao Take a look at the db/dao documentation for more details about the API. Next Steps Section Completed After completing the steps in this tutorial, you would have: Database table named BABYLON_BOOKS . Initial data imported into the database table. TypeScript repository class to perform basic data operations. Continue to the API section to build a REST API for the Book entity. 
Note: The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"Database"},{"location":"tutorials/application-development/bookstore/database/#bookstore-application-database","text":"","title":"Bookstore Application - Database"},{"location":"tutorials/application-development/bookstore/database/#overview","text":"This section shows how to create the database layer for the Bookstore application. It contains a database table definition for the BOOKS table, CSV data, CSVIM import definition and TypeScript Repository class.","title":"Overview"},{"location":"tutorials/application-development/bookstore/database/#steps","text":"","title":"Steps"},{"location":"tutorials/application-development/bookstore/database/#table-definition","text":"Create a project named babylon-project . Right click on the babylon-project project and select New \u2192 Folder . Enter data for the name of the folder. Right click on the data folder and select New \u2192 Database Table . Enter BABYLON_BOOKS.table for the name of the database table descriptor. Right click on BABYLON_BOOKS.table and select Open With \u2192 Code Editor . 
Replace the content with the following definition: { \"name\" : \"BABYLON_BOOKS\" , \"type\" : \"TABLE\" , \"columns\" : [ { \"name\" : \"BOOK_ID\" , \"type\" : \"INTEGER\" , \"primaryKey\" : true , \"identity\" : \"true\" , \"unique\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_ISBN\" , \"type\" : \"CHAR\" , \"length\" : \"17\" , \"unique\" : true , \"primaryKey\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_TITLE\" , \"type\" : \"VARCHAR\" , \"length\" : \"120\" , \"primaryKey\" : false , \"unique\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_PUBLISHER\" , \"type\" : \"VARCHAR\" , \"length\" : \"120\" , \"primaryKey\" : false , \"unique\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_DATE\" , \"type\" : \"DATE\" , \"nullable\" : true , \"unique\" : false }, { \"name\" : \"BOOK_PRICE\" , \"type\" : \"DOUBLE\" , \"primaryKey\" : false , \"unique\" : false , \"nullable\" : false } ], \"dependencies\" : [] } Save the changes and close the Code Editor . Double click on BABYLON_BOOKS.table to view the definition with the Table Editor . Save & Publish Saving the file will trigger a Publish action, which will create the database table in the target database schema. Usually this action should take several seconds to complete, after which the database table would be visible in the Database Perspective . Note: Manual Publish can be performed by right clicking on the artifact and selecting Publish from the context menu. The Publish action can be performed also on project level.","title":"Table Definition"},{"location":"tutorials/application-development/bookstore/database/#csv-data","text":"Right click on the babylon-project/data folder and select New \u2192 File . Enter books.csv for the name of the file. Right click on books.csv and select Open With \u2192 Code Editor . 
Paste the following CSV data: BOOK_ID,BOOK_ISBN,BOOK_TITLE,BOOK_PUBLISHER,BOOK_DATE,BOOK_PRICE 10001,978-3-598-21500-1,Beartown,Simon & Schuster,2019-05-01,17.0 10002,978-3-598-21501-8,Beneath a Scarlet Sky,Lake Union Publishing,2017-05-01,9.74 10003,978-3-598-21529-2,Dead Certain,Free Press,2007-09-04,7.19 10004,978-3-598-21550-6,Everything We Keep,Lake Union Publishing,2016-08-01,14.65 10005,978-3-598-21550-9,Exit West,Hamish Hamilton,2017-02-27,11.45 Save the changes and close the Code Editor . Double click on books.csv to view the data with the CSV Editor .","title":"CSV Data"},{"location":"tutorials/application-development/bookstore/database/#csvim","text":"Right click on the babylon-project/data folder and select New \u2192 File . Enter books.csvim for the name of the file. Right click on books.csvim and select Open With \u2192 Code Editor . Paste the following CSVIM definition: { \"files\" : [ { \"table\" : \"BABYLON_BOOKS\" , \"schema\" : \"PUBLIC\" , \"file\" : \"/babylon-project/data/books.csv\" , \"header\" : true , \"useHeaderNames\" : true , \"delimField\" : \",\" , \"delimEnclosing\" : \"\\\"\" , \"distinguishEmptyFromNull\" : true , \"version\" : \"\" } ] } Save the changes and close the Code Editor . Double click on books.csvim to view the definition with the CSVIM Editor . Save & Publish Once the file is saved, a Publish action is triggered, which imports the data from the CSV file into the database table. Note: Navigate to the Database Perspective to check that the BABYLON_BOOKS table is created and perform the following SQL query to check that the data from the CSV file is imported. select * from BABYLON_BOOKS ;","title":"CSVIM"},{"location":"tutorials/application-development/bookstore/database/#repository","text":"Right click on the babylon-project project and select New \u2192 File . Enter tsconfig.json for the name of the File. 
Replace the content with the following: { \"compilerOptions\" : { \"module\" : \"ESNext\" } } Right click on the babylon-project project and select New \u2192 File . Enter project.json for the name of the File. Replace the content with the following: { \"guid\" : \"babylon-project\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } TypeScript Compilation The tsconfig.json and project.json files are needed for the compilation of the TypeScript files. In order to run the compilation a Publish action should be performed on the Project level (right click on the project and select Publish ) . Right click on the babylon-project/data folder and select New \u2192 TypeScript Service . Enter BookRepository.ts for the name of the TypeScript Service. Replace the content with the following code: import { dao as daoApi } from \"sdk/db\" export interface Book { readonly id? : number ; readonly isbn : string ; readonly title : string ; readonly publisher : string ; readonly date : Date ; readonly price : number ; } export class BookRepository { private repository ; constructor ( dataSourceName? : string , logCtxName? : string ) { this . repository = daoApi . 
create ({ table : \"BABYLON_BOOKS\" , properties : [ { name : \"id\" , column : \"BOOK_ID\" , type : \"INTEGER\" , id : true , required : true }, { name : \"isbn\" , column : \"BOOK_ISBN\" , type : \"CHAR\" , id : false , required : false }, { name : \"title\" , column : \"BOOK_TITLE\" , type : \"VARCHAR\" , id : false , required : false }, { name : \"publisher\" , column : \"BOOK_PUBLISHER\" , type : \"VARCHAR\" , id : false , required : false }, { name : \"date\" , column : \"BOOK_DATE\" , type : \"DATE\" , id : false , required : true }, { name : \"price\" , column : \"BOOK_PRICE\" , type : \"DOUBLE\" , id : false , required : true }] }, logCtxName , dataSourceName ); } public list = ( settings ? ) : Book [] => { return this . repository . list ( settings ); }; public findById = ( id : number ) : Book | null => { return this . repository . find ( id ); }; public create = ( entity : Book ) : Book => { return this . repository . insert ( entity ); }; public update = ( entity : Book ) : Book => { return this . repository . update ( entity ); }; public deleteById = ( id : number ) : void => { this . repository . remove ( id ); }; public count = () : number => { return this . repository . count (); } } Save & Publish In order to run the compilation of TypeScript files a Publish action should be performed on the Project level (right click on the project and select Publish ) . db/dao Take a look at the db/dao documentation for more details about the API.","title":"Repository"},{"location":"tutorials/application-development/bookstore/database/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would have: Database table named BABYLON_BOOKS . Initial data imported into the database table. TypeScript repository class to perform basic data operations. Continue to the API section to build a REST API for the Book entity. 
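The list(settings) call above accepts paging options; a caller typically derives them from a 1-based page number. The helper below is a sketch, assuming the db/dao list options include $limit and $offset (see the db/dao documentation for the exact option names) — the function itself is plain arithmetic and not part of the tutorial's code.

```typescript
// Build hypothetical db/dao list settings for a 1-based page number.
// $limit / $offset are the paging options assumed to be supported by db/dao.
interface PageSettings {
  $limit: number;
  $offset: number;
}

function pageSettings(pageNumber: number, pageSize: number): PageSettings {
  if (pageNumber < 1 || pageSize < 1) {
    throw new Error("pageNumber and pageSize must be positive");
  }
  return {
    $limit: pageSize,
    // Page 1 starts at offset 0; page N skips (N - 1) full pages.
    $offset: (pageNumber - 1) * pageSize,
  };
}

// Page 3 with 20 items per page skips the first 40 records.
console.log(pageSettings(3, 20)); // → { $limit: 20, $offset: 40 }
```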
Note: The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"Next Steps"},{"location":"tutorials/application-development/bookstore/ui/","text":"Bookstore Application - UI Overview This section shows how to create the User Interface layer for the Bookstore application. It contains a Books Perspective , a View for displaying the data, and a Dialog for modifying the Books data. Steps Perspective Right click on the babylon-project project and select New \u2192 Folder . Enter ui for the name of the folder. Create index.html , perspective.js and perspective.extension as shown below: index.html perspective.js perspective.extension Right click on the ui folder and select New \u2192 File . Enter index.html for the name of the file. Replace the content with the following code: < html lang = \"en\" ng-app = \"app\" ng-controller = \"ApplicationController\" xmlns = \"http://www.w3.org/1999/xhtml\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < script type = \"text/javascript\" src = \"perspective.js\" > < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-perspective-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-perspective-css\" /> < body > < ide-header menu-ext-id = \"books-menu\" > < ide-container > < ide-layout views-layout-model = \"layoutModel\" > < ide-dialogs > < ide-status-bar > < script type = \"text/javascript\" > angular . module ( 'app' , [ 'ngResource' , 'ideLayout' , 'ideUI' ]) . 
constant ( 'branding' , { name : 'Babylon' , brand : 'Eclipse Dirigible' , brandUrl : 'https://dirigible.io' , icons : { faviconIco : '/services/web/resources/images/favicon.ico' , favicon32 : '/services/web/resources/images/favicon-32x32.png' , favicon16 : '/services/web/resources/images/favicon-16x16.png' , }, logo : '/services/web/resources/images/dirigible.svg' , }) . constant ( 'extensionPoint' , { perspectives : \"books\" , views : \"books-view\" , dialogWindows : \"books-dialog-window\" }) . controller ( 'ApplicationController' , [ \"$scope\" , \"messageHub\" , function ( $scope , messageHub ) { const httpRequest = new XMLHttpRequest (); httpRequest . open ( \"GET\" , \"/services/js/resources-core/services/views.js?extensionPoint=books-view\" , false ); httpRequest . send (); $scope . layoutModel = { views : JSON . parse ( httpRequest . responseText ). filter ( e => ! e . isLaunchpad && e . perspectiveName === \"books\" ). map ( e => e . id ) }; }]); Right click on the ui folder and select New \u2192 File . Enter perspective.js for the name of the file. Replace the content with the following code: const perspectiveData = { id : \"books\" , name : \"books\" , link : \"/services/web/babylon-project/ui/index.html\" , order : \"100\" , icon : \"/services/web/resources/unicons/copy.svg\" , }; if ( typeof exports !== 'undefined' ) { exports . getPerspective = function () { return perspectiveData ; } } Right click on the ui folder and select New \u2192 Extension . Enter perspective.extension for the name of the Extension. Right click on perspective.extension and select Open With \u2192 Code Editor . Replace the content with the following code: { \"module\" : \"babylon-project/ui/perspective.js\" , \"extensionPoint\" : \"books\" , \"description\" : \"Books - Perspective\" } Note The index.html , perspective.js and perspective.extension files should be located at the babylon-project/ui folder. 
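The ApplicationController above filters the view descriptors returned by the views.js service down to those belonging to the books perspective. That selection logic can be isolated as a small pure function — a sketch with made-up sample data; the real descriptors come from /services/js/resources-core/services/views.js and carry more fields than shown here.

```typescript
// Minimal shape of the view descriptors the controller receives; only the
// fields the filter actually touches are modeled here.
interface ViewDescriptor {
  id: string;
  perspectiveName: string;
  isLaunchpad?: boolean;
}

// Same filter/map chain as in the ApplicationController: keep non-launchpad
// views of the given perspective and reduce them to their ids.
function viewsForPerspective(views: ViewDescriptor[], perspective: string): string[] {
  return views
    .filter(v => !v.isLaunchpad && v.perspectiveName === perspective)
    .map(v => v.id);
}

// Hypothetical sample data mimicking a views.js response.
const sample: ViewDescriptor[] = [
  { id: "Books", perspectiveName: "books" },
  { id: "Launchpad", perspectiveName: "books", isLaunchpad: true },
  { id: "Other", perspectiveName: "other" },
];
console.log(viewsForPerspective(sample, "books")); // → ["Books"]
```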
View Right click on the babylon-project/ui folder and select New \u2192 Folder . Enter Books for the name of the folder. Create index.html , controller.js , view.js and view.extension as shown below: index.html controller.js view.js view.extension Right click on the Books folder and select New \u2192 File . Enter index.html for the name of the file. Replace the content with the following code: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" > < fd-toolbar has-title = \"true\" > < fd-toolbar-title > Items ({{dataCount}}) < fd-toolbar-spacer > < fd-button compact = \"true\" dg-type = \"transparent\" dg-label = \"Create\" ng-click = \"createEntity()\" > < fd-scrollbar class = \"dg-full-height\" ng-hide = \"data == null\" > < table fd-table display-mode = \"compact\" inner-borders = \"top\" outer-borders = \"none\" > < thead fd-table-header sticky = \"true\" > < tr fd-table-row > < th fd-table-header-cell > Title < th fd-table-header-cell > Publisher < th fd-table-header-cell > Date < th fd-table-header-cell > Price < th fd-table-header-cell > < tbody fd-table-body > < tr fd-table-row hoverable = \"true\" ng-show = \"data.length == 0\" > < td fd-table-cell no-data = \"true\" > No data available. 
< tr fd-table-row hoverable = \"true\" ng-repeat = \"next in data\" dg-selected = \"next.id === selectedEntity.id\" ng-click = \"selectEntity(next)\" > < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > {{next.title}} < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > {{next.publisher}} < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > < fd-input type = \"date\" ng-model = \"next.date\" ng-readonly = \"true\" > < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > {{next.price}} < td fd-table-cell fit-content = \"true\" > < fd-popover > < fd-popover-control > < fd-button compact = \"true\" glyph = \"sap-icon--overflow\" dg-type = \"transparent\" aria-label = \"Table Row Menu Button\" ng-click = \"setTristate()\" > < fd-popover-body dg-align = \"bottom-right\" > < fd-menu aria-label = \"Table Row Menu\" no-backdrop = \"true\" no-shadow = \"true\" > < fd-menu-item title = \"View Details\" ng-click = \"openDetails(next)\" > < fd-menu-item title = \"Edit\" ng-click = \"updateEntity(next)\" > < fd-menu-item title = \"Delete\" ng-click = \"deleteEntity(next)\" > < fd-pagination total-items = \"dataCount\" items-per-page = \"dataLimit\" items-per-page-options = \"[10, 20, 50]\" page-change = \"loadPage(pageNumber)\" items-per-page-change = \"loadPage(pageNumber)\" items-per-page-placement = \"top-start\" compact = \"true\" display-total-items = \"true\" ng-hide = \"dataCount == 0\" > Right click on the Books folder and select New \u2192 File . Enter controller.js for the name of the file. Replace the content with the following code: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" , \"entityApi\" ]) . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = 'babylon-project.books.Books' ; }]) . 
config ([ \"entityApiProvider\" , function ( entityApiProvider ) { entityApiProvider . baseUrl = \"/services/ts/babylon-project/api/books.ts\" ; }]) . controller ( 'PageController' , [ '$scope' , 'messageHub' , 'entityApi' , function ( $scope , messageHub , entityApi ) { function resetPagination () { $scope . dataPage = 1 ; $scope . dataCount = 0 ; $scope . dataLimit = 20 ; } resetPagination (); //-----------------Events-------------------// messageHub . onDidReceiveMessage ( \"entityCreated\" , function ( msg ) { $scope . loadPage ( $scope . dataPage ); }); messageHub . onDidReceiveMessage ( \"entityUpdated\" , function ( msg ) { $scope . loadPage ( $scope . dataPage ); }); //-----------------Events-------------------// $scope . loadPage = function ( pageNumber ) { $scope . dataPage = pageNumber ; entityApi . count (). then ( function ( response ) { if ( response . status != 200 ) { messageHub . showAlertError ( \"Books\" , `Unable to count Books: ' ${ response . message } '` ); return ; } $scope . dataCount = parseInt ( response . data ); let offset = ( pageNumber - 1 ) * $scope . dataLimit ; let limit = $scope . dataLimit ; entityApi . list ( offset , limit ). then ( function ( response ) { if ( response . status != 200 ) { messageHub . showAlertError ( \"Books\" , `Unable to list Books: ' ${ response . message } '` ); return ; } response . data . forEach ( e => { if ( e . date ) { e . date = new Date ( e . date ); } }); $scope . data = response . data ; }); }); }; $scope . loadPage ( $scope . dataPage ); $scope . selectEntity = function ( entity ) { $scope . selectedEntity = entity ; }; $scope . openDetails = function ( entity ) { $scope . selectedEntity = entity ; messageHub . showDialogWindow ( \"Books-details\" , { action : \"select\" , entity : entity , }); }; $scope . createEntity = function () { $scope . selectedEntity = null ; messageHub . showDialogWindow ( \"Books-details\" , { action : \"create\" , entity : {}, }, null , false ); }; $scope . 
updateEntity = function ( entity ) { messageHub . showDialogWindow ( \"Books-details\" , { action : \"update\" , entity : entity , }, null , false ); }; $scope . deleteEntity = function ( entity ) { let id = entity . id ; messageHub . showDialogAsync ( 'Delete Books?' , `Are you sure you want to delete Books? This action cannot be undone.` , [{ id : \"delete-btn-yes\" , type : \"emphasized\" , label : \"Yes\" , }, { id : \"delete-btn-no\" , type : \"normal\" , label : \"No\" , }], ). then ( function ( msg ) { if ( msg . data === \"delete-btn-yes\" ) { entityApi . delete ( id ). then ( function ( response ) { if ( response . status != 204 ) { messageHub . showAlertError ( \"Books\" , `Unable to delete Books: ' ${ response . message } '` ); return ; } $scope . loadPage ( $scope . dataPage ); messageHub . postMessage ( \"clearDetails\" ); }); } }); }; }]); Right click on the Books folder and select New \u2192 File . Enter view.js for the name of the file. Replace the content with the following code: const viewData = { id : \"Books\" , label : \"Books\" , factory : \"frame\" , region : \"center\" , link : \"/services/web/babylon-project/ui/Books/index.html\" , perspectiveName : \"books\" }; if ( typeof exports !== 'undefined' ) { exports . getView = function () { return viewData ; } } Right click on the Books folder and select New \u2192 Extension . Enter view.extension for the name of the Extension. Right click on view.extension and select Open With \u2192 Code Editor . Replace the content with the following code: { \"module\" : \"babylon-project/ui/Books/view.js\" , \"extensionPoint\" : \"books-view\" , \"description\" : \"Books - Application View\" } Note The index.html , controller.js , view.js and view.extension files should be located at the babylon-project/ui/Books folder. Dialog Right click on the babylon-project/ui/Books folder and select New \u2192 Folder . Enter dialog-window for the name of the folder. 
Create index.html , controller.js , view.js and view.extension as shown below: index.html controller.js view.js view.extension Right click on the dialog-window folder and select New \u2192 File . Enter index.html for the name of the file. Replace the content with the following code: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" dg-contextmenu = \"contextMenuContent\" > < fd-scrollbar class = \"dg-full-height\" > < div class = \"fd-margin--md fd-message-strip fd-message-strip--error fd-message-strip--dismissible\" role = \"alert\" ng-show = \"errorMessage\" > < p class = \"fd-message-strip__text\" > {{ errorMessage }} < fd-button glyph = \"sap-icon--decline\" compact = \"true\" dg-type = \"transparent\" aria-label = \"Close\" in-msg-strip = \"true\" ng-click = \"clearErrorMessage()\" > < fd-fieldset class = \"fd-margin--md\" ng-form = \"formFieldset\" > < fd-form-group dg-header = \"{{formHeaders[action]}}\" name = \"entityForm\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idisbn\" dg-required = \"false\" dg-colon = \"true\" > ISBN < fd-form-input-message-group dg-inactive = \"{{ formErrors.isbn ? false : true }}\" > < fd-input id = \"idisbn\" name = \"isbn\" state = \"{{ formErrors.isbn ? 
'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['isbn'].$valid, 'isbn')\" ng-model = \"entity.isbn\" ng-readonly = \"action === 'select'\" ng-minlength = \"0.0 || 0\" ng-maxlength = \"20.0 || -1\" dg-input-rules = \"{ patterns: [''] }\" type = \"text\" placeholder = \"Enter isbn\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idtitle\" dg-required = \"false\" dg-colon = \"true\" > Title < fd-form-input-message-group dg-inactive = \"{{ formErrors.title ? false : true }}\" > < fd-input id = \"idtitle\" name = \"title\" state = \"{{ formErrors.title ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['title'].$valid, 'title')\" ng-model = \"entity.title\" ng-readonly = \"action === 'select'\" ng-minlength = \"0.0 || 0\" ng-maxlength = \"20.0 || -1\" dg-input-rules = \"{ patterns: [''] }\" type = \"text\" placeholder = \"Enter title\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idpublisher\" dg-required = \"false\" dg-colon = \"true\" > Publisher < fd-form-input-message-group dg-inactive = \"{{ formErrors.publisher ? false : true }}\" > < fd-input id = \"idpublisher\" name = \"publisher\" state = \"{{ formErrors.publisher ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['publisher'].$valid, 'publisher')\" ng-model = \"entity.publisher\" ng-readonly = \"action === 'select'\" ng-minlength = \"0.0 || 0\" ng-maxlength = \"20.0 || -1\" dg-input-rules = \"{ patterns: [''] }\" type = \"text\" placeholder = \"Enter publisher\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"iddate\" dg-required = \"false\" dg-colon = \"true\" > Date < fd-form-input-message-group dg-inactive = \"{{ formErrors.date ? 
false : true }}\" > < fd-input id = \"iddate\" name = \"date\" state = \"{{ formErrors.date ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['date'].$valid, 'date')\" ng-model = \"entity.date\" ng-readonly = \"action === 'select'\" type = \"date\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idprice\" dg-required = \"false\" dg-colon = \"true\" > Price < fd-form-input-message-group dg-inactive = \"{{ formErrors.price ? false : true }}\" > < fd-input id = \"idprice\" name = \"price\" state = \"{{ formErrors.price ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['price'].$valid, 'price')\" ng-model = \"entity.price\" ng-readonly = \"action === 'select'\" type = \"number\" placeholder = \"Enter price\" > < fd-form-message dg-type = \"error\" > Incorrect Input < footer class = \"fd-dialog__footer fd-bar fd-bar--footer\" ng-show = \"action !== 'select'\" > < div class = \"fd-bar__right\" > < fd-button class = \"fd-margin-end--tiny fd-dialog__decisive-button\" compact = \"true\" dg-type = \"emphasized\" dg-label = \"{{action === 'create' ? 'Create' : 'Update'}}\" ng-click = \"action === 'create' ? create() : update()\" state = \"{{ !isFormValid ? 'disabled' : '' }}\" > < fd-button class = \"fd-dialog__decisive-button\" compact = \"true\" dg-type = \"transparent\" dg-label = \"Cancel\" ng-click = \"cancel()\" > Right click on the dialog-window folder and select New \u2192 File . Enter controller.js for the name of the file. Replace the content with the following code: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" , \"entityApi\" ]) . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = 'babylon-project.books.Books' ; }]) . config ([ \"entityApiProvider\" , function ( entityApiProvider ) { entityApiProvider . baseUrl = \"/services/ts/babylon-project/api/books.ts\" ; }]) . 
controller ( 'PageController' , [ '$scope' , 'messageHub' , 'entityApi' , function ( $scope , messageHub , entityApi ) { $scope . entity = {}; $scope . formHeaders = { select : \"Books Details\" , create : \"Create Books\" , update : \"Update Books\" }; $scope . formErrors = {}; $scope . action = 'select' ; if ( window != null && window . frameElement != null && window . frameElement . hasAttribute ( \"data-parameters\" )) { let dataParameters = window . frameElement . getAttribute ( \"data-parameters\" ); if ( dataParameters ) { let params = JSON . parse ( dataParameters ); $scope . action = params . action ; if ( $scope . action == \"create\" ) { $scope . formErrors = { }; } if ( params . entity . date ) { params . entity . date = new Date ( params . entity . date ); } $scope . entity = params . entity ; $scope . selectedMainEntityKey = params . selectedMainEntityKey ; $scope . selectedMainEntityId = params . selectedMainEntityId ; } } $scope . isValid = function ( isValid , property ) { $scope . formErrors [ property ] = ! isValid ? true : undefined ; for ( let next in $scope . formErrors ) { if ( $scope . formErrors [ next ] === true ) { $scope . isFormValid = false ; return ; } } $scope . isFormValid = true ; }; $scope . create = function () { let entity = $scope . entity ; entity [ $scope . selectedMainEntityKey ] = $scope . selectedMainEntityId ; entityApi . create ( entity ). then ( function ( response ) { if ( response . status != 201 ) { $scope . errorMessage = `Unable to create Books: ' ${ response . message } '` ; return ; } messageHub . postMessage ( \"entityCreated\" , response . data ); $scope . cancel (); messageHub . showAlertSuccess ( \"Books\" , \"Books successfully created\" ); }); }; $scope . update = function () { let id = $scope . entity . id ; let entity = $scope . entity ; entity [ $scope . selectedMainEntityKey ] = $scope . selectedMainEntityId ; entityApi . update ( id , entity ). then ( function ( response ) { if ( response . 
status != 200 ) { $scope . errorMessage = `Unable to update Books: ' ${ response . message } '` ; return ; } messageHub . postMessage ( \"entityUpdated\" , response . data ); $scope . cancel (); messageHub . showAlertSuccess ( \"Books\" , \"Books successfully updated\" ); }); }; $scope . cancel = function () { $scope . entity = {}; $scope . action = 'select' ; messageHub . closeDialogWindow ( \"Books-details\" ); }; $scope . clearErrorMessage = function () { $scope . errorMessage = null ; }; }]); Right click on the dialog-window folder and select New \u2192 File . Enter view.js for the name of the file. Replace the content with the following code: const viewData = { id : \"Books-details\" , label : \"Books\" , link : \"/services/web/babylon-project/ui/Books/dialog-window/index.html\" , perspectiveName : \"books\" }; if ( typeof exports !== 'undefined' ) { exports . getDialogWindow = function () { return viewData ; } } Right click on the dialog-window folder and select New \u2192 Extension . Enter view.extension for the name of the Extension. Right click on view.extension and select Open With \u2192 Code Editor . Replace the content with the following code: { \"module\" : \"babylon-project/ui/Books/dialog-window/view.js\" , \"extensionPoint\" : \"books-dialog-window\" , \"description\" : \"Books - Application Dialog Window\" } Note The index.html , controller.js , view.js and view.extension files should be located at the babylon-project/ui/Books/dialog-window folder. Publish and Preview (optional) Right click on the babylon-project project and select Publish . Select the babylon-project/ui/index.html in the Projects view In the Preview window you should see the web page for management of Books. Try to enter a few books to test how it works. 
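The dialog controller above records per-field validation state in formErrors and derives a single isFormValid flag from it: a field maps to true while invalid and to undefined once corrected. That aggregation can be expressed as a standalone function — a sketch for illustration; the names mirror the controller, but nothing here is Dirigible-specific.

```typescript
// formErrors maps a field name to true when invalid and undefined when
// valid, mirroring how the dialog controller records validation state.
type FormErrors = Record<string, true | undefined>;

function isFormValid(formErrors: FormErrors): boolean {
  // The form is valid only while no field is flagged as invalid.
  return Object.values(formErrors).every(flag => flag !== true);
}

const errors: FormErrors = {};
errors["isbn"] = true;            // isbn failed validation
console.log(isFormValid(errors)); // → false
errors["isbn"] = undefined;       // isbn corrected
console.log(isFormValid(errors)); // → true
```

Keeping corrected fields as undefined (rather than deleting the key) matches the controller's behavior, where the loop over formErrors simply skips entries that are not strictly true.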
Application URL The Bookstore Application is available at: http://localhost:8080/services/web/babylon-project/ui/ Summary Tutorial Completed After completing all steps in this tutorial, you would have: Extendable UI Perspective for the book-related views. Books View to display the books data. Books Dialog for modifying the books data. Note: The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"User Interface"},{"location":"tutorials/application-development/bookstore/ui/#bookstore-application-ui","text":"","title":"Bookstore Application - UI"},{"location":"tutorials/application-development/bookstore/ui/#overview","text":"This section shows how to create the User Interface layer for the Bookstore application. It contains a Books Perspective , a View for displaying the data, and a Dialog for modifying the Books data.","title":"Overview"},{"location":"tutorials/application-development/bookstore/ui/#steps","text":"","title":"Steps"},{"location":"tutorials/application-development/bookstore/ui/#perspective","text":"Right click on the babylon-project project and select New \u2192 Folder . Enter ui for the name of the folder. Create index.html , perspective.js and perspective.extension as shown below: index.html perspective.js perspective.extension Right click on the ui folder and select New \u2192 File . Enter index.html for the name of the file. 
Replace the content with the following code: < html lang = \"en\" ng-app = \"app\" ng-controller = \"ApplicationController\" xmlns = \"http://www.w3.org/1999/xhtml\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < script type = \"text/javascript\" src = \"perspective.js\" > < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-perspective-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-perspective-css\" /> < body > < ide-header menu-ext-id = \"books-menu\" > < ide-container > < ide-layout views-layout-model = \"layoutModel\" > < ide-dialogs > < ide-status-bar > < script type = \"text/javascript\" > angular . module ( 'app' , [ 'ngResource' , 'ideLayout' , 'ideUI' ]) . constant ( 'branding' , { name : 'Babylon' , brand : 'Eclipse Dirigible' , brandUrl : 'https://dirigible.io' , icons : { faviconIco : '/services/web/resources/images/favicon.ico' , favicon32 : '/services/web/resources/images/favicon-32x32.png' , favicon16 : '/services/web/resources/images/favicon-16x16.png' , }, logo : '/services/web/resources/images/dirigible.svg' , }) . constant ( 'extensionPoint' , { perspectives : \"books\" , views : \"books-view\" , dialogWindows : \"books-dialog-window\" }) . controller ( 'ApplicationController' , [ \"$scope\" , \"messageHub\" , function ( $scope , messageHub ) { const httpRequest = new XMLHttpRequest (); httpRequest . open ( \"GET\" , \"/services/js/resources-core/services/views.js?extensionPoint=books-view\" , false ); httpRequest . send (); $scope . layoutModel = { views : JSON . parse ( httpRequest . responseText ). filter ( e => ! e . isLaunchpad && e . perspectiveName === \"books\" ). 
map ( e => e . id ) }; }]); Right click on the ui folder and select New \u2192 File . Enter perspective.js for the name of the file. Replace the content with the following code: const perspectiveData = { id : \"books\" , name : \"books\" , link : \"/services/web/babylon-project/ui/index.html\" , order : \"100\" , icon : \"/services/web/resources/unicons/copy.svg\" , }; if ( typeof exports !== 'undefined' ) { exports . getPerspective = function () { return perspectiveData ; } } Right click on the ui folder and select New \u2192 Extension . Enter perspective.extension for the name of the Extension. Right click on perspective.extension and select Open With \u2192 Code Editor . Replace the content with the following code: { \"module\" : \"babylon-project/ui/perspective.js\" , \"extensionPoint\" : \"books\" , \"description\" : \"Books - Perspective\" } Note The index.html , perspective.js and perspective.extension files should be located at the babylon-project/ui folder.","title":"Perspective"},{"location":"tutorials/application-development/bookstore/ui/#view","text":"Right click on the babylon-project/ui folder and select New \u2192 Folder . Enter Books for the name of the folder. Create index.html , controller.js , view.js and view.extension as shown below: index.html controller.js view.js view.extension Right click on the Books folder and select New \u2192 File . Enter index.html for the name of the file. 
Replace the content with the following code: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" > < fd-toolbar has-title = \"true\" > < fd-toolbar-title > Items ({{dataCount}}) < fd-toolbar-spacer > < fd-button compact = \"true\" dg-type = \"transparent\" dg-label = \"Create\" ng-click = \"createEntity()\" > < fd-scrollbar class = \"dg-full-height\" ng-hide = \"data == null\" > < table fd-table display-mode = \"compact\" inner-borders = \"top\" outer-borders = \"none\" > < thead fd-table-header sticky = \"true\" > < tr fd-table-row > < th fd-table-header-cell > Title < th fd-table-header-cell > Publisher < th fd-table-header-cell > Date < th fd-table-header-cell > Price < th fd-table-header-cell > < tbody fd-table-body > < tr fd-table-row hoverable = \"true\" ng-show = \"data.length == 0\" > < td fd-table-cell no-data = \"true\" > No data available. 
< tr fd-table-row hoverable = \"true\" ng-repeat = \"next in data\" dg-selected = \"next.id === selectedEntity.id\" ng-click = \"selectEntity(next)\" > < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > {{next.title}} < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > {{next.publisher}} < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > < fd-input type = \"date\" ng-model = \"next.date\" ng-readonly = \"true\" > < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > {{next.price}} < td fd-table-cell fit-content = \"true\" > < fd-popover > < fd-popover-control > < fd-button compact = \"true\" glyph = \"sap-icon--overflow\" dg-type = \"transparent\" aria-label = \"Table Row Menu Button\" ng-click = \"setTristate()\" > < fd-popover-body dg-align = \"bottom-right\" > < fd-menu aria-label = \"Table Row Menu\" no-backdrop = \"true\" no-shadow = \"true\" > < fd-menu-item title = \"View Details\" ng-click = \"openDetails(next)\" > < fd-menu-item title = \"Edit\" ng-click = \"updateEntity(next)\" > < fd-menu-item title = \"Delete\" ng-click = \"deleteEntity(next)\" > < fd-pagination total-items = \"dataCount\" items-per-page = \"dataLimit\" items-per-page-options = \"[10, 20, 50]\" page-change = \"loadPage(pageNumber)\" items-per-page-change = \"loadPage(pageNumber)\" items-per-page-placement = \"top-start\" compact = \"true\" display-total-items = \"true\" ng-hide = \"dataCount == 0\" > Right click on the Books folder and select New \u2192 File . Enter controller.js for the name of the file. Replace the content with the following code: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" , \"entityApi\" ]) . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = 'babylon-project.books.Books' ; }]) . 
config ([ \"entityApiProvider\" , function ( entityApiProvider ) { entityApiProvider . baseUrl = \"/services/ts/babylon-project/api/books.ts\" ; }]) . controller ( 'PageController' , [ '$scope' , 'messageHub' , 'entityApi' , function ( $scope , messageHub , entityApi ) { function resetPagination () { $scope . dataPage = 1 ; $scope . dataCount = 0 ; $scope . dataLimit = 20 ; } resetPagination (); //-----------------Events-------------------// messageHub . onDidReceiveMessage ( \"entityCreated\" , function ( msg ) { $scope . loadPage ( $scope . dataPage ); }); messageHub . onDidReceiveMessage ( \"entityUpdated\" , function ( msg ) { $scope . loadPage ( $scope . dataPage ); }); //-----------------Events-------------------// $scope . loadPage = function ( pageNumber ) { $scope . dataPage = pageNumber ; entityApi . count (). then ( function ( response ) { if ( response . status != 200 ) { messageHub . showAlertError ( \"Books\" , `Unable to count Books: ' ${ response . message } '` ); return ; } $scope . dataCount = parseInt ( response . data ); let offset = ( pageNumber - 1 ) * $scope . dataLimit ; let limit = $scope . dataLimit ; entityApi . list ( offset , limit ). then ( function ( response ) { if ( response . status != 200 ) { messageHub . showAlertError ( \"Books\" , `Unable to list Books: ' ${ response . message } '` ); return ; } response . data . forEach ( e => { if ( e . date ) { e . date = new Date ( e . date ); } }); $scope . data = response . data ; }); }); }; $scope . loadPage ( $scope . dataPage ); $scope . selectEntity = function ( entity ) { $scope . selectedEntity = entity ; }; $scope . openDetails = function ( entity ) { $scope . selectedEntity = entity ; messageHub . showDialogWindow ( \"Books-details\" , { action : \"select\" , entity : entity , }); }; $scope . createEntity = function () { $scope . selectedEntity = null ; messageHub . showDialogWindow ( \"Books-details\" , { action : \"create\" , entity : {}, }, null , false ); }; $scope . 
updateEntity = function ( entity ) { messageHub . showDialogWindow ( \"Books-details\" , { action : \"update\" , entity : entity , }, null , false ); }; $scope . deleteEntity = function ( entity ) { let id = entity . id ; messageHub . showDialogAsync ( 'Delete Books?' , `Are you sure you want to delete Books? This action cannot be undone.` , [{ id : \"delete-btn-yes\" , type : \"emphasized\" , label : \"Yes\" , }, { id : \"delete-btn-no\" , type : \"normal\" , label : \"No\" , }], ). then ( function ( msg ) { if ( msg . data === \"delete-btn-yes\" ) { entityApi . delete ( id ). then ( function ( response ) { if ( response . status != 204 ) { messageHub . showAlertError ( \"Books\" , `Unable to delete Books: ' ${ response . message } '` ); return ; } $scope . loadPage ( $scope . dataPage ); messageHub . postMessage ( \"clearDetails\" ); }); } }); }; }]); Right click on the Books folder and select New \u2192 File . Enter view.js for the name of the file. Replace the content with the following code: const viewData = { id : \"Books\" , label : \"Books\" , factory : \"frame\" , region : \"center\" , link : \"/services/web/babylon-project/ui/Books/index.html\" , perspectiveName : \"books\" }; if ( typeof exports !== 'undefined' ) { exports . getView = function () { return viewData ; } } Right click on the Books folder and select New \u2192 Extension . Enter view.extension for the name of the Extension. Right click on view.extension and select Open With \u2192 Code Editor . Replace the content with the following code: { \"module\" : \"babylon-project/ui/Books/view.js\" , \"extensionPoint\" : \"books-view\" , \"description\" : \"Books - Application View\" } Note The index.html , controller.js , view.js and view.extension files should be located at the babylon-project/ui/Books folder.","title":"View"},{"location":"tutorials/application-development/bookstore/ui/#dialog","text":"Right click on the babylon-project/ui/Books folder and select New \u2192 Folder . 
Enter dialog-window for the name of the folder. Create index.html , controller.js , view.js and view.extension as shown below: index.html controller.js view.js view.extension Right click on the dialog-window folder and select New \u2192 File . Enter index.html for the name of the file. Replace the content with the following code: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" dg-contextmenu = \"contextMenuContent\" > < fd-scrollbar class = \"dg-full-height\" > < div class = \"fd-margin--md fd-message-strip fd-message-strip--error fd-message-strip--dismissible\" role = \"alert\" ng-show = \"errorMessage\" > < p class = \"fd-message-strip__text\" > {{ errorMessage }} < fd-button glyph = \"sap-icon--decline\" compact = \"true\" dg-type = \"transparent\" aria-label = \"Close\" in-msg-strip = \"true\" ng-click = \"clearErrorMessage()\" > < fd-fieldset class = \"fd-margin--md\" ng-form = \"formFieldset\" > < fd-form-group dg-header = \"{{formHeaders[action]}}\" name = \"entityForm\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idisbn\" dg-required = \"false\" dg-colon = \"true\" > ISBN < fd-form-input-message-group dg-inactive = \"{{ formErrors.isbn ? false : true }}\" > < fd-input id = \"idisbn\" name = \"isbn\" state = \"{{ formErrors.isbn ? 
'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['isbn'].$valid, 'isbn')\" ng-model = \"entity.isbn\" ng-readonly = \"action === 'select'\" ng-minlength = \"0.0 || 0\" ng-maxlength = \"20.0 || -1\" dg-input-rules = \"{ patterns: [''] }\" type = \"text\" placeholder = \"Enter isbn\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idtitle\" dg-required = \"false\" dg-colon = \"true\" > Title < fd-form-input-message-group dg-inactive = \"{{ formErrors.title ? false : true }}\" > < fd-input id = \"idtitle\" name = \"title\" state = \"{{ formErrors.title ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['title'].$valid, 'title')\" ng-model = \"entity.title\" ng-readonly = \"action === 'select'\" ng-minlength = \"0.0 || 0\" ng-maxlength = \"20.0 || -1\" dg-input-rules = \"{ patterns: [''] }\" type = \"text\" placeholder = \"Enter title\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idpublisher\" dg-required = \"false\" dg-colon = \"true\" > Publisher < fd-form-input-message-group dg-inactive = \"{{ formErrors.publisher ? false : true }}\" > < fd-input id = \"idpublisher\" name = \"publisher\" state = \"{{ formErrors.publisher ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['publisher'].$valid, 'publisher')\" ng-model = \"entity.publisher\" ng-readonly = \"action === 'select'\" ng-minlength = \"0.0 || 0\" ng-maxlength = \"20.0 || -1\" dg-input-rules = \"{ patterns: [''] }\" type = \"text\" placeholder = \"Enter publisher\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"iddate\" dg-required = \"false\" dg-colon = \"true\" > Date < fd-form-input-message-group dg-inactive = \"{{ formErrors.date ? 
false : true }}\" > < fd-input id = \"iddate\" name = \"date\" state = \"{{ formErrors.date ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['date'].$valid, 'date')\" ng-model = \"entity.date\" ng-readonly = \"action === 'select'\" type = \"date\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idprice\" dg-required = \"false\" dg-colon = \"true\" > Price < fd-form-input-message-group dg-inactive = \"{{ formErrors.price ? false : true }}\" > < fd-input id = \"idprice\" name = \"price\" state = \"{{ formErrors.price ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['price'].$valid, 'price')\" ng-model = \"entity.price\" ng-readonly = \"action === 'select'\" type = \"number\" placeholder = \"Enter price\" > < fd-form-message dg-type = \"error\" > Incorrect Input < footer class = \"fd-dialog__footer fd-bar fd-bar--footer\" ng-show = \"action !== 'select'\" > < div class = \"fd-bar__right\" > < fd-button class = \"fd-margin-end--tiny fd-dialog__decisive-button\" compact = \"true\" dg-type = \"emphasized\" dg-label = \"{{action === 'create' ? 'Create' : 'Update'}}\" ng-click = \"action === 'create' ? create() : update()\" state = \"{{ !isFormValid ? 'disabled' : '' }}\" > < fd-button class = \"fd-dialog__decisive-button\" compact = \"true\" dg-type = \"transparent\" dg-label = \"Cancel\" ng-click = \"cancel()\" > Right click on the dialog-window folder and select New \u2192 File . Enter controller.js for the name of the file. Replace the content with the following code: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" , \"entityApi\" ]) . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = 'babylon-project.books.Books' ; }]) . config ([ \"entityApiProvider\" , function ( entityApiProvider ) { entityApiProvider . baseUrl = \"/services/ts/babylon-project/api/books.ts\" ; }]) . 
controller ( 'PageController' , [ '$scope' , 'messageHub' , 'entityApi' , function ( $scope , messageHub , entityApi ) { $scope . entity = {}; $scope . formHeaders = { select : \"Books Details\" , create : \"Create Books\" , update : \"Update Books\" }; $scope . formErrors = {}; $scope . action = 'select' ; if ( window != null && window . frameElement != null && window . frameElement . hasAttribute ( \"data-parameters\" )) { let dataParameters = window . frameElement . getAttribute ( \"data-parameters\" ); if ( dataParameters ) { let params = JSON . parse ( dataParameters ); $scope . action = params . action ; if ( $scope . action == \"create\" ) { $scope . formErrors = { }; } if ( params . entity . date ) { params . entity . date = new Date ( params . entity . date ); } $scope . entity = params . entity ; $scope . selectedMainEntityKey = params . selectedMainEntityKey ; $scope . selectedMainEntityId = params . selectedMainEntityId ; } } $scope . isValid = function ( isValid , property ) { $scope . formErrors [ property ] = ! isValid ? true : undefined ; for ( let next in $scope . formErrors ) { if ( $scope . formErrors [ next ] === true ) { $scope . isFormValid = false ; return ; } } $scope . isFormValid = true ; }; $scope . create = function () { let entity = $scope . entity ; entity [ $scope . selectedMainEntityKey ] = $scope . selectedMainEntityId ; entityApi . create ( entity ). then ( function ( response ) { if ( response . status != 201 ) { $scope . errorMessage = `Unable to create Books: ' ${ response . message } '` ; return ; } messageHub . postMessage ( \"entityCreated\" , response . data ); $scope . cancel (); messageHub . showAlertSuccess ( \"Books\" , \"Books successfully created\" ); }); }; $scope . update = function () { let id = $scope . entity . id ; let entity = $scope . entity ; entity [ $scope . selectedMainEntityKey ] = $scope . selectedMainEntityId ; entityApi . update ( id , entity ). then ( function ( response ) { if ( response . 
status != 200 ) { $scope . errorMessage = `Unable to update Books: ' ${ response . message } '` ; return ; } messageHub . postMessage ( \"entityUpdated\" , response . data ); $scope . cancel (); messageHub . showAlertSuccess ( \"Books\" , \"Books successfully updated\" ); }); }; $scope . cancel = function () { $scope . entity = {}; $scope . action = 'select' ; messageHub . closeDialogWindow ( \"Books-details\" ); }; $scope . clearErrorMessage = function () { $scope . errorMessage = null ; }; }]); Right click on the dialog-window folder and select New \u2192 File . Enter view.js for the name of the file. Replace the content with the following code: const viewData = { id : \"Books-details\" , label : \"Books\" , link : \"/services/web/babylon-project/ui/Books/dialog-window/index.html\" , perspectiveName : \"books\" }; if ( typeof exports !== 'undefined' ) { exports . getDialogWindow = function () { return viewData ; } } Right click on the dialog-window folder and select New \u2192 Extension . Enter view.extension for the name of the Extension. Right click on view.extension and select Open With \u2192 Code Editor . Replace the content with the following code: { \"module\" : \"babylon-project/ui/Books/dialog-window/view.js\" , \"extensionPoint\" : \"books-dialog-window\" , \"description\" : \"Books - Application Dialog Window\" } Note The index.html , controller.js , view.js and view.extension files should be located at the babylon-project/ui/Books/dialog-window folder.","title":"Dialog"},{"location":"tutorials/application-development/bookstore/ui/#publish-and-preview","text":"(optional) Right click on the babylon-project project and select Publish . Select the babylon-project/ui/index.html in the Projects view In the Preview window you should see the web page for management of Books. Try to enter a few books to test how it works. 
Application URL The Bookstore Application is available at: http://localhost:8080/services/web/babylon-project/ui/\",\"title\":\"Publish and Preview\"},{\"location\":\"tutorials/application-development/bookstore/ui/#summary\",\"text\":\"Tutorial Completed After completing all steps in this tutorial, you would have: Extendable UI Perspective for the book related views. Books View to display the books data. Books Dialog for modifying the books data. Note: The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project\",\"title\":\"Summary\"},{\"location\":\"tutorials/application-development/scheduled-job/\",\"text\":\"Scheduled Job Overview This sample shows how to create a simple application with a scheduled job for log events. It contains a Database Table to store log events, helper Logger class to create log events, Job Definition and Job Handler . Sections Database Table Job Handler Job Definition Note The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project\",\"title\":\"Scheduled Job\"},{\"location\":\"tutorials/application-development/scheduled-job/#scheduled-job\",\"text\":\"\",\"title\":\"Scheduled Job\"},{\"location\":\"tutorials/application-development/scheduled-job/#overview\",\"text\":\"This sample shows how to create a simple application with a scheduled job for log events.
It contains a Database Table to store log events, helper Logger class to create log events, Job Definition and Job Handler .\",\"title\":\"Overview\"},{\"location\":\"tutorials/application-development/scheduled-job/#sections\",\"text\":\"Database Table Job Handler Job Definition Note The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project\",\"title\":\"Sections\"},{\"location\":\"tutorials/application-development/scheduled-job/database/\",\"text\":\"Scheduled Job - Database Overview This section shows how to create the database table for the Scheduled Job application. Steps Database Table Navigate to the Database Perspective . In the SQL View enter the following script: create table LOG_EVENTS ( LOG_ID integer primary key auto_increment , LOG_SEVERITY varchar ( 16 ), LOG_MESSAGE varchar ( 120 ), LOG_TIMESTAMP timestamp ); Press the Run icon to execute the SQL script. Keyboard Shortcut Press Ctrl + X for Windows, Cmd + X for macOS to execute the SQL script. Note: You can execute all or part of the SQL scripts in the SQL View by making a selection and pressing the Run icon or the keyboard shortcut. Press the Refresh button to see the LOG_EVENTS table. Table Content Right click on the LOG_EVENTS table and select Show Contents . The table data would be displayed in the Result View . As the table is empty, there should be no data: Next Steps Section Completed After completing the steps in this tutorial, you would: Have database table named LOG_EVENTS . Be familiar with the Database Perspective , the SQL View and the Result View . Continue to the Job Handler section to create a Job Handler , that would be executed by the Scheduled Job .
Note: The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Database"},{"location":"tutorials/application-development/scheduled-job/database/#scheduled-job-database","text":"","title":"Scheduled Job - Database"},{"location":"tutorials/application-development/scheduled-job/database/#overview","text":"This section shows how to create the database table for the Scheduled Job application.","title":"Overview"},{"location":"tutorials/application-development/scheduled-job/database/#steps","text":"","title":"Steps"},{"location":"tutorials/application-development/scheduled-job/database/#database-table","text":"Navigate to the Database Perspective . In the SQL View enter the following script: create table LOG_EVENTS ( LOG_ID integer primary key auto_increment , LOG_SEVERITY varchar ( 16 ), LOG_MESSAGE varchar ( 120 ), LOG_TIMESTAMP timestamp ); Press the Run icon to execute the SQL script. Keyboard Shortcut Press Ctrl + X for Windows, Cmd + X for macOS to execute the SQL script. Note: You can execute all or part of the SQL scripts in the SQL View by making a selection and pressing the Run icon or the keyboard shortcut. Press the Refresh button to see the LOG_EVENTS table.","title":"Database Table"},{"location":"tutorials/application-development/scheduled-job/database/#table-content","text":"Right click on the LOG_EVENTS table and select Show Contents . The table data would be displayed in the Result View . As the table is empty, there should be no data:","title":"Table Content"},{"location":"tutorials/application-development/scheduled-job/database/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would: Have database table named LOG_EVENTS . Be familiar with the Database Perspective , the SQL View and the Result View . Continue to the Job Handler section to create a Job Handler , that would be executed by the Scheduled Job . 
Note: The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Next Steps"},{"location":"tutorials/application-development/scheduled-job/handler/","text":"Scheduled Job - Job Handler Overview This section shows how to create helper Logger class to create log events and Job Handler that would be executed by the Scheduled Job . Steps Logger Right click on the scheduled-job-project project and select New \u2192 TypeScript Service . Enter Logger.ts for the name of the TypeScript Service. Replace the content with the following code: import { update } from \"sdk/db\" ; export enum LogDataSeverity { INFO = 'Info' , WARNING = 'Warning' , ERROR = 'Error' } export interface LogData { readonly date : Date ; readonly severity : LogDataSeverity ; readonly message : string ; } export class Logger { public static log ( logData : LogData ) { Logger . saveLogEvent ( logData ); const message = `---> [ ${ logData . severity } ] [ ${ Logger . toDateString ( logData . date ) } ]: ${ logData . message } <---` ; switch ( logData . severity ) { case LogDataSeverity.INFO : console.info ( message ); break ; case LogDataSeverity.WARNING : console.warn ( message ); break ; case LogDataSeverity.ERROR : console.error ( message ); break ; } } private static saveLogEvent ( logData : LogData ) { const sql = `insert into LOG_EVENTS (\"LOG_SEVERITY\", \"LOG_MESSAGE\", \"LOG_TIMESTAMP\") values (?, ?, ?)` ; const queryParameters = [ logData . severity , logData . message , logData . date ]; update . execute ( sql , queryParameters , null ); } private static toDateString ( date : Date ) : string { return ` ${ date . toLocaleDateString () } ; ${ date . toLocaleTimeString () } ` ; } } db/update Take a look at the db/update documentation for more details about the API. Logger Right click on the scheduled-job-project project and select New \u2192 JavaScript ESM Service . 
Enter handler.mjs for the name of the JavaScript Service. Replace the content with the following code: import { Logger , LogDataSeverity } from './Logger' ; const logData = [{ date : new Date (), severity : LogDataSeverity . INFO , message : 'Success feels so good!' }, { date : new Date (), severity : LogDataSeverity . INFO , message : 'You made it!' }, { date : new Date (), severity : LogDataSeverity . WARNING , message : 'Open Sesame!' }, { date : new Date (), severity : LogDataSeverity . WARNING , message : 'Password updated!' }, { date : new Date (), severity : LogDataSeverity . ERROR , message : 'Welcome to the dark side!' }, { date : new Date (), severity : LogDataSeverity . INFO , message : 'So glad you are back!' }]; const randomIndex = Math . floor ( Math . random () * logData . length ); Logger . log ( logData [ randomIndex ]); Navigate to the Database Perspective to check that there is a record in the LOG_EVENTS table. Save & Publish Saving the file will trigger a Publish action, which will build and deploy the JavaScript and TypeScript services. The handler.mjs service would be executed by the Preview view. As it's expected to be executed by a Scheduled Job and not by HTTP Request nothing would be displayed in the Preview view, however the log event data would be inserted into the LOG_EVENTS table. JavaScript ESM Handler At the time of writing the tutorial, it was not possible to create a TypeScript handler for the Scheduled Job . Next Steps Section Completed After completing the steps in this tutorial, you would have: Job Handler and Logger class to create log events. At least one record in the LOG_EVENTS table. Continue to the Job Definition section to create a Scheduled Job , that would trigger the Job Handler .
Note: The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Job Handler"},{"location":"tutorials/application-development/scheduled-job/handler/#scheduled-job-job-handler","text":"","title":"Scheduled Job - Job Handler"},{"location":"tutorials/application-development/scheduled-job/handler/#overview","text":"This section shows how to create helper Logger class to create log events and Job Handler that would be executed by the Scheduled Job .","title":"Overview"},{"location":"tutorials/application-development/scheduled-job/handler/#steps","text":"","title":"Steps"},{"location":"tutorials/application-development/scheduled-job/handler/#logger","text":"Right click on the scheduled-job-project project and select New \u2192 TypeScript Service . Enter Logger.ts for the name of the TypeScript Service. Replace the content with the following code: import { update } from \"sdk/db\" ; export enum LogDataSeverity { INFO = 'Info' , WARNING = 'Warning' , ERROR = 'Error' } export interface LogData { readonly date : Date ; readonly severity : LogDataSeverity ; readonly message : string ; } export class Logger { public static log ( logData : LogData ) { Logger . saveLogEvent ( logData ); const message = `---> [ ${ logData . severity } ] [ ${ Logger . toDateString ( logData . date ) } ]: ${ logData . message } <---` ; switch ( logData . severity ) { case LogDataSeverity.INFO : console.info ( message ); break ; case LogDataSeverity.WARNING : console.warn ( message ); break ; case LogDataSeverity.ERROR : console.error ( message ); break ; } } private static saveLogEvent ( logData : LogData ) { const sql = `insert into LOG_EVENTS (\"LOG_SEVERITY\", \"LOG_MESSAGE\", \"LOG_TIMESTAMP\") values (?, ?, ?)` ; const queryParameters = [ logData . severity , logData . message , logData . date ]; update . 
execute ( sql , queryParameters , null ); } private static toDateString ( date : Date ) : string { return ` ${ date . toLocaleDateString () } ; ${ date . toLocaleTimeString () } ` ; } } db/update Take a look at the db/update documentation for more details about the API.","title":"Logger"},{"location":"tutorials/application-development/scheduled-job/handler/#logger_1","text":"Right click on the scheduled-job-project project and select New \u2192 JavaScript ESM Service . Enter handler.mjs for the name of the JavaScript Service. Replace the content with the following code: import { Logger , LogDataSeverity } from './Logger' ; const logData = [{ date : new Date (), severity : LogDataSeverity . INFO , message : 'Success feels so good!' }, { date : new Date (), severity : LogDataSeverity . INFO , message : 'You made it!' }, { date : new Date (), severity : LogDataSeverity . WARNING , message : 'Open Sesame!' }, { date : new Date (), severity : LogDataSeverity . WARNING , message : 'Password updated!' }, { date : new Date (), severity : LogDataSeverity . ERROR , message : 'Welcome to the dark side!' }, { date : new Date (), severity : LogDataSeverity . INFO , message : 'So glad you are back!' }]; const randomIndex = Math . floor ( Math . random () * logData . length ); Logger . log ( logData [ randomIndex ]); Navigate to the Database Perspective to check that there is a record in the LOG_EVENTS table. Save & Publish Saving the file will trigger a Publish action, which will build and deploy the JavaScript and TypeScript services. The handler.mjs service would be executed by the Preview view. As it's expected to be executed by a Scheduled Job and not by HTTP Request nothing would be displayed in the Preview view, however the log event data would be inserted into the LOG_EVENTS table.
JavaScript ESM Handler At the time of writing the tutorial, it was not possible to create a TypeScript handler for the Scheduled Job .","title":"Logger"},{"location":"tutorials/application-development/scheduled-job/handler/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would have: Job Handler and Logger class to create log events. At least one record in the LOG_EVENTS table. Continue to the Job Definition section to create a Scheduled Job , that would trigger the Job Handler . Note: The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Next Steps"},{"location":"tutorials/application-development/scheduled-job/job/","text":"Scheduled Job - Job Definition Overview This section shows how to create and manage Job Definition for the Scheduled Job application. Steps Job Definition Right click on the scheduled-job-project project and select New \u2192 Scheduled Job . Enter log.job for the name of the Scheduled Job. Right click on log.job and select Open With \u2192 Code Editor . Replace the content with the following definition: { "expression" : "0/10 * * * * ?" , "group" : "dirigible-defined" , "handler" : "scheduled-job-project/handler.mjs" , "description" : "Scheduled Log Job" , "parameters" : [ { "name" : "severity" , "type" : "choice" , "defaultValue" : "" , "choices" : "Info,Warning,Error" , "description" : "The log severity" }, { "name" : "message" , "type" : "string" , "defaultValue" : "" , "description" : "The log message" } ] } Double click on log.job to open it with the Job Editor . Save & Publish Saving the file will trigger a Publish action, that would schedule the job. As defined by the expression ( 0/10 * * * * ? ) , the job handler would be executed every 10 seconds and data would be inserted into the LOG_EVENTS table.
Log Events Data Navigate to the Database Perspective to check that there are new records in the LOG_EVENTS table. You can notice in the LOG_TIMESTAMP column that the last records are 10 seconds apart each. Manage Jobs Navigate to the Jobs Perspective to see a list of the Scheduled Jobs on the instance. Click on the Enable/Disable icon to stop the log Scheduled Job. Navigate to the Database Perspective to check that there are no new records in the LOG_EVENTS table after the job was disabled. Go back to the Jobs Perspective . Click on the Trigger icon and then on the Trigger button to start new Job Execution . Force Trigger This action would instantly trigger the Job Handler without respecting the Job Schedule Expression or whether the Job Schedule is enabled or disabled. Navigate back to the Database Perspective to check that there was a new record added in the LOG_EVENTS table after the job was disabled. Summary Tutorial Completed After completing all steps in this tutorial, you would: Have Scheduled Job . New records in the LOG_EVENTS table. Experience with the Jobs Perspective . Note: The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Scheduled Job"},{"location":"tutorials/application-development/scheduled-job/job/#scheduled-job-job-definition","text":"","title":"Scheduled Job - Job Definition"},{"location":"tutorials/application-development/scheduled-job/job/#overview","text":"This section shows how to create and manage Job Definition for the Scheduled Job application.","title":"Overview"},{"location":"tutorials/application-development/scheduled-job/job/#steps","text":"","title":"Steps"},{"location":"tutorials/application-development/scheduled-job/job/#job-definition","text":"Right click on the scheduled-job-project project and select New \u2192 Scheduled Job . Enter log.job for the name of the Scheduled Job. Right click on log.job and select Open With \u2192 Code Editor . 
Replace the content with the following definition: { \"expression\" : \"0/10 * * * * ?\" , \"group\" : \"dirigible-defined\" , \"handler\" : \"scheduled-job-project/handler.mjs\" , \"description\" : \"Scheduled Log Job\" , \"parameters\" : [ { \"name\" : \"severity\" , \"type\" : \"choice\" , \"defaultValue\" : \"\" , \"choices\" : \"Info,Warning,Error\" , \"description\" : \"The log severity\" }, { \"name\" : \"message\" , \"type\" : \"string\" , \"defaultValue\" : \"\" , \"description\" : \"The log message\" } ] } Double click on log.job to open it with the Job Editor . Save & Publish Saving the file will trigger a Publish action that would schedule the job. As defined by the expression ( 0/10 * * * * ? ) , the job handler would be executed every 10 seconds and data would be inserted into the LOG_EVENTS table.","title":"Job Definition"},{"location":"tutorials/application-development/scheduled-job/job/#log-events-data","text":"Navigate to the Database Perspective to check that there are new records in the LOG_EVENTS table. You can see in the LOG_TIMESTAMP column that the records are 10 seconds apart.","title":"Log Events Data"},{"location":"tutorials/application-development/scheduled-job/job/#manage-jobs","text":"Navigate to the Jobs Perspective to see a list of the Scheduled Jobs on the instance. Click on the Enable/Disable icon to stop the log Scheduled Job. Navigate to the Database Perspective to check that there are no new records in the LOG_EVENTS table after the job was disabled. Go back to the Jobs Perspective . Click on the Trigger icon and then on the Trigger button to start a new Job Execution . Force Trigger This action would instantly trigger the Job Handler without respecting the Job Schedule Expression or whether the Job Schedule is enabled or disabled.
Navigate back to the Database Perspective to check that there was a new record added in the LOG_EVENTS table after the job was disabled.","title":"Manage Jobs"},{"location":"tutorials/application-development/scheduled-job/job/#summary","text":"Tutorial Completed After completing all steps in this tutorial, you would: Have a Scheduled Job . New records in the LOG_EVENTS table. Experience with the Jobs Perspective . Note: The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Summary"},{"location":"tutorials/customizations/custom-stack/","text":"Custom Stack Overview This tutorial will guide you through the creation of a custom Eclipse Dirigible stack. Sections Project Structure Branding Facade Advanced Facade Dependency Note The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Custom Stack"},{"location":"tutorials/customizations/custom-stack/#custom-stack","text":"","title":"Custom Stack"},{"location":"tutorials/customizations/custom-stack/#overview","text":"This tutorial will guide you through the creation of a custom Eclipse Dirigible stack.","title":"Overview"},{"location":"tutorials/customizations/custom-stack/#sections","text":"Project Structure Branding Facade Advanced Facade Dependency Note The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Sections"},{"location":"tutorials/customizations/custom-stack/advanced-facade/","text":"Custom Stack - Advanced Facade Overview This section will guide you through the different ways of creating a TypeScript API for the Eclipse Dirigible Custom Stack. Prerequisites Node.js 18+ - Node.js versions can be found here esbuild 0.19+ - esbuild versions can be found here tsc 5.2+ - tsc versions can be found here The Facade section is completed.
Steps Create Java Facade Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/java/io/dirigible/samples/ folder. Create Example.java , SubExample.java , ExampleRequest.java , ExampleResponse.java and ExampleService.java files. Example.java SubExample.java ExampleRequest.java ExampleResponse.java ExampleService.java Create new apis/src/main/java/io/dirigible/samples/api/domain/Example.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/Example.java package io.dirigible.samples.api.domain ; import java.util.ArrayList ; import java.util.List ; public class Example { private String id ; private String name ; private List < SubExample > subexamples = new ArrayList <> (); public String getId () { return id ; } public String getName () { return name ; } public List < SubExample > getSubexamples () { return subexamples ; } public void setId ( String id ) { this . id = id ; } public void setName ( String name ) { this . name = name ; } public void setSubexamples ( List < SubExample > subexamples ) { this . subexamples = subexamples ; } public Example withId ( String id ) { setId ( id ); return this ; } public Example withName ( String name ) { setName ( name ); return this ; } public Example withSubexamples ( List < SubExample > subexamples ) { setSubexamples ( subexamples ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/domain/SubExample.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/SubExample.java package io.dirigible.samples.api.domain ; import java.util.Date ; public class SubExample { private Date date ; public Date getDate () { return date ; } public void setDate ( Date date ) { this . date = date ; } public SubExample withDate ( Date date ) { setDate ( date ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/domain/input/ExampleRequest.java file. 
Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/input/ExampleRequest.java package io.dirigible.samples.api.domain.input ; public class ExampleRequest { private String exampleId ; private String exampleName ; public String getExampleId () { return exampleId ; } public void setExampleId ( String exampleId ) { this . exampleId = exampleId ; } public String getExampleName () { return exampleName ; } public void setExampleName ( String exampleName ) { this . exampleName = exampleName ; } public ExampleRequest withExampleId ( String exampleId ) { setExampleId ( exampleId ); return this ; } public ExampleRequest withExampleName ( String exampleName ) { setExampleName ( exampleName ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/domain/output/ExampleResponse.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/output/ExampleResponse.java package io.dirigible.samples.api.domain.output ; import java.util.ArrayList ; import java.util.List ; import io.dirigible.samples.api.domain.Example ; public class ExampleResponse { private List < Example > examples = new ArrayList <> (); public List < Example > getExamples () { return examples ; } public void setExamples ( List < Example > examples ) { this . examples = examples ; } public ExampleResponse withExamples ( List < Example > examples ) { setExamples ( examples ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/service/ExampleService.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/service/ExampleService.java package io.dirigible.samples.api.service ; import io.dirigible.samples.api.domain.input.ExampleRequest ; import io.dirigible.samples.api.domain.output.ExampleResponse ; public interface ExampleService { ExampleResponse doExample ( ExampleRequest request ); } Create TypeScript API Navigate to the root folder of the custom stack (e.g. /custom-stack ) . 
Navigate to the apis/src/main/resources/META-INF/dirigible/custom-api/ folder. Create Example.ts , SubExample.ts , ExampleRequest.ts and ExampleResponse.ts files. Note The TypeScript files are 1:1 representation of the Java classes. They have the same methods, signature and logic as the Java classes. All TypeScript files are in the custom-api folder and don't follow the Java packages nesting, just for simplicity. Example.ts SubExample.ts ExampleRequest.ts ExampleResponse.ts Create new apis/src/main/resources/META-INF/dirigible/custom-api/Example.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/Example.ts import { SubExample } from \"./SubExample\" ; export class Example { // @ts-ignore private id : string ; // @ts-ignore private name : string ; // @ts-ignore private subexamples : SubExample [] = []; public getId () : string { return this . id ; } public getName () : string { return this . name ; } public getSubexamples () : SubExample [] { return this . subexamples ; } public setId ( id : string ) : void { this . id = id ; } public setName ( name : string ) : void { this . name = name ; } public setSubexamples ( subexamples : SubExample []) : void { this . subexamples = subexamples ; } public withId ( id : string ) : Example { this . setId ( id ); return this ; } public withName ( name : string ) : Example { this . setName ( name ); return this ; } public withSubexamples ( subexamples : SubExample []) : Example { this . setSubexamples ( subexamples ); return this ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/SubExample.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/SubExample.ts export class SubExample { // @ts-ignore private date : Date ; public getDate () : Date { return this . date ; } public setDate ( date : Date ) : void { this . date = date ; } public withDate ( date : Date ) : SubExample { this . 
setDate ( date ); return this ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequest.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequest.ts export class ExampleRequest { // @ts-ignore private exampleId : string ; // @ts-ignore private exampleName : string ; public getExampleId () : string { return this . exampleId ; } public setExampleId ( exampleId : string ) : void { this . exampleId = exampleId ; } public getExampleName () : string { return this . exampleName ; } public setExampleName ( exampleName : string ) : void { this . exampleName = exampleName ; } public withExampleId ( exampleId : string ) : ExampleRequest { this . setExampleId ( exampleId ); return this ; } public withExampleName ( exampleName : string ) : ExampleRequest { this . setExampleName ( exampleName ); return this ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponse.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponse.ts import { Example } from \"./Example\" ; export class ExampleResponse { private examples : Example [] = []; public getExamples () : Example [] { return this . examples ; } public setExamples ( examples : Example []) : void { this . examples = examples ; } public withExamples ( examples : Example []) : ExampleResponse { this . setExamples ( examples ); return this ; } } Create Java Client Facade Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/java/io/dirigible/samples/ folder. Create ExampleClient.java and ExampleClientV2.java files. ExampleClient.java vs ExampleClientV2.java There is a difference in the method signature of the ExampleClient and the ExampleClientV2 classes. Although they have the same functionality there is a difference in the input parameter type and the return type .
In ExampleClient : public ExampleResponse doExample ( ExampleRequest request ) In ExampleClientV2 : public String doExample ( String requestAsString ) The ExampleClientV2 accepts a String input parameter instead of ExampleRequest and also returns a String instead of ExampleResponse . Inside the implementation, Gson is used to parse and stringify the JSON representation of the ExampleRequest and the ExampleResponse . This technique is used to simplify the integration between the Java facade and the TypeScript API. ExampleClient.java ExampleClientV2.java Create new apis/src/main/java/io/dirigible/samples/api/client/ExampleClient.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/client/ExampleClient.java package io.dirigible.samples.api.client ; import java.util.Date ; import com.google.gson.Gson ; import io.dirigible.samples.api.domain.Example ; import io.dirigible.samples.api.domain.SubExample ; import io.dirigible.samples.api.domain.input.ExampleRequest ; import io.dirigible.samples.api.domain.output.ExampleResponse ; import io.dirigible.samples.api.service.ExampleService ; public class ExampleClient implements ExampleService { @Override public ExampleResponse doExample ( ExampleRequest request ) { final var exampleResponse = new ExampleResponse (); final var subexample = new SubExample (). withDate ( new Date ()); final var example = new Example (). withId ( request . getExampleId ()). withName ( \"Example Name\" ); example . getSubexamples (). add ( subexample ); exampleResponse . getExamples (). add ( example ); return exampleResponse ; } } Create new apis/src/main/java/io/dirigible/samples/api/client/ExampleClientV2.java file.
Paste the following content: apis/src/main/java/io/dirigible/samples/api/client/ExampleClientV2.java package io.dirigible.samples.api.client ; import java.util.Date ; import com.google.gson.Gson ; import io.dirigible.samples.api.domain.Example ; import io.dirigible.samples.api.domain.SubExample ; import io.dirigible.samples.api.domain.input.ExampleRequest ; import io.dirigible.samples.api.domain.output.ExampleResponse ; public class ExampleClientV2 { public String doExample ( String requestAsString ) { final var gson = new Gson (); final var request = gson . fromJson ( requestAsString , ExampleRequest . class ); final var exampleResponse = new ExampleResponse (); final var subexample = new SubExample (). withDate ( new Date ()); final var example = new Example (). withId ( request . getExampleId ()). withName ( \"Example Name\" ); example . getSubexamples (). add ( subexample ); exampleResponse . getExamples (). add ( example ); return gson . toJson ( exampleResponse ); } } Create TypeScript API Client Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/resources/META-INF/dirigible/custom-api/ folder. Create ExampleClient.ts , ExampleClientV2.ts , ExampleRequestV2.ts and ExampleResponseV2.ts files. ExampleClient.ts vs ExampleClientV2.ts The ExampleClient uses the native Java objects, so it has to follow the Java way of creation of objects and assigning properties. The ExampleClientV2 uses TypeScript interfaces that represent the Java classes (see ExampleRequestV2.ts and ExampleResponseV2.ts ) to follow the TypeScript way of creation of objects and assigning properties. ExampleClient.ts ExampleClientV2.ts ExampleRequestV2.ts ExampleResponseV2.ts Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClient.ts file.
Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClient.ts import { ExampleResponse } from \"./ExampleResponse\" ; import { ExampleRequest } from \"./ExampleRequest\" ; import { Example } from \"./Example\" ; import { SubExample } from \"./SubExample\" ; const ExampleClientClass = Java . type ( \"io.dirigible.samples.api.client.ExampleClient\" ); const ExampleRequestClass = Java . type ( \"io.dirigible.samples.api.domain.input.ExampleRequest\" ); export class ExampleClient { public doExample ( request : ExampleRequest ) : ExampleResponse { const requestObj = new ExampleRequestClass (); requestObj . setExampleId ( request . getExampleId ()); requestObj . setExampleName ( request . getExampleName ()); const responseObj = new ExampleClientClass (). doExample ( requestObj ); const examples : Example [] = []; for ( const exampleObj of responseObj . getExamples ()) { const example = new Example (); const subExamples : SubExample [] = []; example . setId ( exampleObj . getId ()); example . setName ( exampleObj . getName ()); for ( const subexampleObj of exampleObj . getSubexamples ()) { const subexample = new SubExample (); subexample . setDate ( subexampleObj . getDate ()); subExamples . push ( subexample ); } example . setSubexamples ( subExamples ) examples . push ( example ); } const response = new ExampleResponse (); response . setExamples ( examples ); return response ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClientV2.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClientV2.ts import { ExampleResponseV2 } from \"./ExampleResponseV2\" ; import { ExampleRequestV2 } from \"./ExampleRequestV2\" ; const ExampleClientV2Class = Java . type ( \"io.dirigible.samples.api.client.ExampleClientV2\" ); export class ExampleClientV2 { public doExample ( request : ExampleRequestV2 ) : ExampleResponseV2 { const response = new ExampleClientV2Class (). 
doExample ( JSON . stringify ( request )); return JSON . parse ( response ); } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequestV2.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequestV2.ts export interface ExampleRequestV2 { readonly exampleId : string ; readonly exampleName : string ; } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponseV2.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponseV2.ts export interface SubExampleV2 { readonly date : Date ; } export interface ExampleV2 { readonly id : string ; readonly name : string ; readonly subexamples : SubExampleV2 []; } export interface ExampleResponseV2 { readonly examples : ExampleV2 []; } Build the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install Run the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack . Test the Advanced TypeScript API Create a project named demo-application . Right click on the demo-application project and select New \u2192 TypeScript Service . Enter demo-client.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; import { ExampleClient } from \"custom-api/ExampleClient\" ; import { ExampleRequest } from \"custom-api/ExampleRequest\" ; const exampleRequest = new ExampleRequest (); exampleRequest . setExampleId ( 'example-id-1234' ); exampleRequest . 
setExampleName ( 'Custom Stack Example' ); const exampleClient = new ExampleClient (); const exampleResponse = exampleClient . doExample ( exampleRequest ); response . println ( JSON . stringify ( exampleResponse , null , 2 )); Save the changes. Enter demo-client-v2.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; import { ExampleClientV2 } from \"custom-api/ExampleClientV2\" ; import { ExampleRequestV2 } from \"custom-api/ExampleRequestV2\" ; const exampleRequest : ExampleRequestV2 = { exampleId : 'example-id-1234' , exampleName : 'Custom Stack Example' }; const exampleClient = new ExampleClientV2 (); const exampleResponse = exampleClient . doExample ( exampleRequest ); response . println ( JSON . stringify ( exampleResponse , null , 2 )); Save the changes. Right click on the demo-application project and select New \u2192 File . Enter tsconfig.json for the name of the File. { \"compilerOptions\" : { \"module\" : \"ESNext\" , \"target\" : \"ES6\" , \"moduleResolution\" : \"Node\" , \"baseUrl\" : \"../\" , \"lib\" : [ \"ESNext\" , \"DOM\" ], \"paths\" : { \"sdk/*\" : [ \"../modules/src/*\" ], \"/*\" : [ \"../*\" ] }, \"types\" : [ \"../modules/types\" ] } } Save the changes. Right click on the demo-application project and select New \u2192 File . Enter project.json for the name of the File. { \"guid\" : \"demo-application\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } Save the changes. Right click on the demo-application project and select Publish . Select the demo-client.ts from the Projects explorer and open the Preview view to see the result. Select the demo-client-v2.ts from the Projects explorer and open the Preview view to see the result. 
Tip As in the TypeScript API Client section, there is a difference between the usage of the ExampleClient and the ExampleClientV2 in the application code. The demo-client.ts uses the ExampleClient and the native Java objects, so it has to follow the Java way of creation of objects and assigning properties, while the demo-client-v2.ts follows the TypeScript way of creation of objects and assigning properties. Next Steps Section Completed After completing the steps in this tutorial, you would have: Two different versions of the ExampleClient Java Facades. Two different versions of the ExampleClient TypeScript APIs. Learned the difference between the native Java way and native TypeScript way of implementing the Java Facades and the TypeScript APIs Continue to the Dependency section where external Maven dependency is added and used in the Custom Stack without creating a Java Facade and TypeScript API. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Advanced Facade"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#custom-stack-advanced-facade","text":"","title":"Custom Stack - Advanced Facade"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#overview","text":"This section will guide you through the different ways of creating a TypeScript API for the Eclipse Dirigible Custom Stack. Prerequisites Node.js 18+ - Node.js versions can be found here esbuild 0.19+ - esbuild versions can be found here tsc 5.2+ - tsc versions can be found here The Facade section is completed.","title":"Overview"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#steps","text":"","title":"Steps"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#create-java-facade","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/java/io/dirigible/samples/ folder. 
Create Example.java , SubExample.java , ExampleRequest.java , ExampleResponse.java and ExampleService.java files. Example.java SubExample.java ExampleRequest.java ExampleResponse.java ExampleService.java Create new apis/src/main/java/io/dirigible/samples/api/domain/Example.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/Example.java package io.dirigible.samples.api.domain ; import java.util.ArrayList ; import java.util.List ; public class Example { private String id ; private String name ; private List < SubExample > subexamples = new ArrayList <> (); public String getId () { return id ; } public String getName () { return name ; } public List < SubExample > getSubexamples () { return subexamples ; } public void setId ( String id ) { this . id = id ; } public void setName ( String name ) { this . name = name ; } public void setSubexamples ( List < SubExample > subexamples ) { this . subexamples = subexamples ; } public Example withId ( String id ) { setId ( id ); return this ; } public Example withName ( String name ) { setName ( name ); return this ; } public Example withSubexamples ( List < SubExample > subexamples ) { setSubexamples ( subexamples ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/domain/SubExample.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/SubExample.java package io.dirigible.samples.api.domain ; import java.util.Date ; public class SubExample { private Date date ; public Date getDate () { return date ; } public void setDate ( Date date ) { this . date = date ; } public SubExample withDate ( Date date ) { setDate ( date ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/domain/input/ExampleRequest.java file. 
Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/input/ExampleRequest.java package io.dirigible.samples.api.domain.input ; public class ExampleRequest { private String exampleId ; private String exampleName ; public String getExampleId () { return exampleId ; } public void setExampleId ( String exampleId ) { this . exampleId = exampleId ; } public String getExampleName () { return exampleName ; } public void setExampleName ( String exampleName ) { this . exampleName = exampleName ; } public ExampleRequest withExampleId ( String exampleId ) { setExampleId ( exampleId ); return this ; } public ExampleRequest withExampleName ( String exampleName ) { setExampleName ( exampleName ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/domain/output/ExampleResponse.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/output/ExampleResponse.java package io.dirigible.samples.api.domain.output ; import java.util.ArrayList ; import java.util.List ; import io.dirigible.samples.api.domain.Example ; public class ExampleResponse { private List < Example > examples = new ArrayList <> (); public List < Example > getExamples () { return examples ; } public void setExamples ( List < Example > examples ) { this . examples = examples ; } public ExampleResponse withExamples ( List < Example > examples ) { setExamples ( examples ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/service/ExampleService.java file. 
Paste the following content: apis/src/main/java/io/dirigible/samples/api/service/ExampleService.java package io.dirigible.samples.api.service ; import io.dirigible.samples.api.domain.input.ExampleRequest ; import io.dirigible.samples.api.domain.output.ExampleResponse ; public interface ExampleService { ExampleResponse doExample ( ExampleRequest request ); }","title":"Create Java Facade"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#create-typescript-api","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/resources/META-INF/dirigible/custom-api/ folder. Create Example.ts , SubExample.ts , ExampleRequest.ts and ExampleResponse.ts files. Note The TypeScript files are 1:1 representation of the Java classes. They have the same methods, signature and logic as the Java classes. All TypeScript files are in the custom-api folder and don't follow the Java packages nesting, just for simplicity. Example.ts SubExample.ts ExampleRequest.ts ExampleResponse.ts Create new apis/src/main/resources/META-INF/dirigible/custom-api/Example.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/Example.ts import { SubExample } from \"./SubExample\" ; export class Example { // @ts-ignore private id : string ; // @ts-ignore private name : string ; // @ts-ignore private subexamples : SubExample [] = []; public getId () : string { return this . id ; } public getName () : string { return this . name ; } public getSubexamples () : SubExample [] { return this . subexamples ; } public setId ( id : string ) : void { this . id = id ; } public setName ( name : string ) : void { this . name = name ; } public setSubexamples ( subexamples : SubExample []) : void { this . subexamples = subexamples ; } public withId ( id : string ) : Example { this . setId ( id ); return this ; } public withName ( name : string ) : Example { this . 
setName ( name ); return this ; } public withSubexamples ( subexamples : SubExample []) : Example { this . setSubexamples ( subexamples ); return this ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/SubExample.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/SubExample.ts export class SubExample { // @ts-ignore private date : Date ; public getDate () : Date { return this . date ; } public setDate ( date : Date ) : void { this . date = date ; } public withDate ( date : Date ) : SubExample { this . setDate ( date ); return this ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequest.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequest.ts export class ExampleRequest { // @ts-ignore private exampleId : string ; // @ts-ignore private exampleName : string ; public getExampleId () : string { return this . exampleId ; } public setExampleId ( exampleId : string ) : void { this . exampleId = exampleId ; } public getExampleName () : string { return this . exampleName ; } public setExampleName ( exampleName : string ) : void { this . exampleName = exampleName ; } public withExampleId ( exampleId : string ) : ExampleRequest { this . setExampleId ( exampleId ); return this ; } public withExampleName ( exampleName : string ) : ExampleRequest { this . setExampleName ( exampleName ); return this ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponse.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponse.ts import { Example } from \"./Example\" ; export class ExampleResponse { private examples : Example [] = []; public getExamples () : Example [] { return this . examples ; } public setExamples ( examples : Example []) : void { this . examples = examples ; } public withExamples ( examples : Example []) : ExampleResponse { this . 
setExamples ( examples ); return this ; } }","title":"Create TypeScript API"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#create-java-client-facade","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/java/io/dirigible/samples/ folder. Create ExampleClient.java and ExampleClientV2.java files. ExampleClient.java vs ExampleClientV2.java There is a difference in the method signature of the ExampleClient and the ExampleClientV2 classes. Although they have the same functionality there is a difference in the input parameter type and the return type . In ExampleClient : public ExampleResponse doExample ( ExampleRequest request ) In ExampleClientV2 : public String doExample ( String requestAsString ) The ExampleClientV2 accepts a String input parameter instead of ExampleRequest and also returns a String instead of ExampleResponse . Inside the implementation, Gson is used to parse and stringify the JSON representation of the ExampleRequest and the ExampleResponse . This technique is used to simplify the integration between the Java facade and the TypeScript API. ExampleClient.java ExampleClientV2.java Create new apis/src/main/java/io/dirigible/samples/api/client/ExampleClient.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/client/ExampleClient.java package io.dirigible.samples.api.client ; import java.util.Date ; import com.google.gson.Gson ; import io.dirigible.samples.api.domain.Example ; import io.dirigible.samples.api.domain.SubExample ; import io.dirigible.samples.api.domain.input.ExampleRequest ; import io.dirigible.samples.api.domain.output.ExampleResponse ; import io.dirigible.samples.api.service.ExampleService ; public class ExampleClient implements ExampleService { @Override public ExampleResponse doExample ( ExampleRequest request ) { final var exampleResponse = new ExampleResponse (); final var subexample = new SubExample ().
withDate ( new Date ()); final var example = new Example (). withId ( request . getExampleId ()). withName ( \"Example Name\" ); example . getSubexamples (). add ( subexample ); exampleResponse . getExamples (). add ( example ); return exampleResponse ; } } Create new apis/src/main/java/io/dirigible/samples/api/client/ExampleClientV2.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/client/ExampleClientV2.java package io.dirigible.samples.api.client ; import java.util.Date ; import com.google.gson.Gson ; import io.dirigible.samples.api.domain.Example ; import io.dirigible.samples.api.domain.SubExample ; import io.dirigible.samples.api.domain.input.ExampleRequest ; import io.dirigible.samples.api.domain.output.ExampleResponse ; public class ExampleClientV2 { public String doExample ( String requestAsString ) { final var gson = new Gson (); final var request = gson . fromJson ( requestAsString , ExampleRequest . class ); final var exampleResponse = new ExampleResponse (); final var subexample = new SubExample (). withDate ( new Date ()); final var example = new Example (). withId ( request . getExampleId ()). withName ( \"Example Name\" ); example . getSubexamples (). add ( subexample ); exampleResponse . getExamples (). add ( example ); return gson . toJson ( exampleResponse ); } }","title":"Create Java Client Facade"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#create-typescript-api-client","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/resources/META-INF/dirigible/custom-api/ folder. Create ExampleClient.ts , ExampleClientV2.ts , ExampleRequestV2.ts and ExampleResponseV2.ts files. ExampleClient.ts vs ExampleClientV2.ts The ExampleClient uses the native Java objects, so it has to follow the Java way of creation of objects and assigning properties. 
The ExampleClientV2 uses TypeScript interfaces that represent the Java classes (see ExampleRequestV2.ts and ExampleResponseV2.ts ) to follow the TypeScript way of creation of objects and assigning properties. ExampleClient.ts ExampleClientV2.ts ExampleRequestV2.ts ExampleResponseV2.ts Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClient.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClient.ts import { ExampleResponse } from \"./ExampleResponse\" ; import { ExampleRequest } from \"./ExampleRequest\" ; import { Example } from \"./Example\" ; import { SubExample } from \"./SubExample\" ; const ExampleClientClass = Java . type ( \"io.dirigible.samples.api.client.ExampleClient\" ); const ExampleRequestClass = Java . type ( \"io.dirigible.samples.api.domain.input.ExampleRequest\" ); export class ExampleClient { public doExample ( request : ExampleRequest ) : ExampleResponse { const requestObj = new ExampleRequestClass (); requestObj . setExampleId ( request . getExampleId ()); requestObj . setExampleName ( request . getExampleName ()); const responseObj = new ExampleClientClass (). doExample ( requestObj ); const examples : Example [] = []; for ( const exampleObj of responseObj . getExamples ()) { const example = new Example (); const subExamples : SubExample [] = []; example . setId ( exampleObj . getId ()); example . setName ( exampleObj . getName ()); for ( const subexampleObj of exampleObj . getSubexamples ()) { const subexample = new SubExample (); subexample . setDate ( subexampleObj . getDate ()); subExamples . push ( subexample ); } example . setSubexamples ( subExamples ) examples . push ( example ); } const response = new ExampleResponse (); response . setExamples ( examples ); return response ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClientV2.ts file. 
Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClientV2.ts import { ExampleResponseV2 } from \"./ExampleResponseV2\" ; import { ExampleRequestV2 } from \"./ExampleRequestV2\" ; const ExampleClientV2Class = Java . type ( \"io.dirigible.samples.api.client.ExampleClientV2\" ); export class ExampleClientV2 { public doExample ( request : ExampleRequestV2 ) : ExampleResponseV2 { const response = new ExampleClientV2Class (). doExample ( JSON . stringify ( request )); return JSON . parse ( response ); } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequestV2.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequestV2.ts export interface ExampleRequestV2 { readonly exampleId : string ; readonly exampleName : string ; } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponseV2.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponseV2.ts export interface SubExampleV2 { readonly date : Date ; } export interface ExampleV2 { readonly id : string ; readonly name : string ; readonly subexamples : SubExampleV2 []; } export interface ExampleResponseV2 { readonly examples : ExampleV2 []; }","title":"Create TypeScript API Client"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#build-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install","title":"Build the Custom Platform"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#run-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . 
Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack .","title":"Run the Custom Platform"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#test-the-advanced-typescript-api","text":"Create a project named demo-application . Right click on the demo-application project and select New \u2192 TypeScript Service . Enter demo-client.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; import { ExampleClient } from \"custom-api/ExampleClient\" ; import { ExampleRequest } from \"custom-api/ExampleRequest\" ; const exampleRequest = new ExampleRequest (); exampleRequest . setExampleId ( 'example-id-1234' ); exampleRequest . setExampleName ( 'Custom Stack Example' ); const exampleClient = new ExampleClient (); const exampleResponse = exampleClient . doExample ( exampleRequest ); response . println ( JSON . stringify ( exampleResponse , null , 2 )); Save the changes. Enter demo-client-v2.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; import { ExampleClientV2 } from \"custom-api/ExampleClientV2\" ; import { ExampleRequestV2 } from \"custom-api/ExampleRequestV2\" ; const exampleRequest : ExampleRequestV2 = { exampleId : 'example-id-1234' , exampleName : 'Custom Stack Example' }; const exampleClient = new ExampleClientV2 (); const exampleResponse = exampleClient . doExample ( exampleRequest ); response . println ( JSON . stringify ( exampleResponse , null , 2 )); Save the changes. Right click on the demo-application project and select New \u2192 File . Enter tsconfig.json for the name of the File. 
{ \"compilerOptions\" : { \"module\" : \"ESNext\" , \"target\" : \"ES6\" , \"moduleResolution\" : \"Node\" , \"baseUrl\" : \"../\" , \"lib\" : [ \"ESNext\" , \"DOM\" ], \"paths\" : { \"sdk/*\" : [ \"../modules/src/*\" ], \"/*\" : [ \"../*\" ] }, \"types\" : [ \"../modules/types\" ] } } Save the changes. Right click on the demo-application project and select New \u2192 File . Enter project.json for the name of the File. { \"guid\" : \"demo-application\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } Save the changes. Right click on the demo-application project and select Publish . Select the demo-client.ts from the Projects explorer and open the Preview view to see the result. Select the demo-client-v2.ts from the Projects explorer and open the Preview view to see the result. Tip As in the TypeScript API Client section, there is a difference between the usage of the ExampleClient and the ExampleClientV2 in the application code. The demo-client.ts uses the ExampleClient and the native Java objects, so it has to follow the Java way of creation of objects and assigning properties, while the demo-client-v2.ts follows the TypeScript way of creation of objects and assigning properties.","title":"Test the Advanced TypeScript API"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would have: Two different versions of the ExampleClient Java Facades. Two different versions of the ExampleClient TypeScript APIs. Learned the difference between the native Java way and native TypeScript way of implementing the Java Facades and the TypeScript APIs Continue to the Dependency section where external Maven dependency is added and used in the Custom Stack without creating a Java Facade and TypeScript API. 
Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Next Steps"},{"location":"tutorials/customizations/custom-stack/branding/","text":"Custom Stack - Branding Overview This section will guide you through the process of rebranding of Eclipse Dirigible Custom Stack. Steps Create Maven Module Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Create branding folder and navigate to it. Create pom.xml file. pom.xml Create new branding/pom.xml file. Paste the following content: branding/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - branding custom-stack-branding 1.0.0-SNAPSHOT jar Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Open the pom.xml file. Navigate to the section. Add the following module: pom.xml Final pom.xml application branding pom.xml 4.0.0 org.sonatype.oss oss-parent 7 custom - stack - parent Custom Stack - Sample io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT pom 2024 https://www.dirigible.io Eclipse Foundation https://www.eclipse.org https://github.com/dirigiblelabs/tutorial-custom-stack application branding org.slf4j slf4j-api compile ch.qos.logback logback-core compile ch.qos.logback logback-classic compile org.eclipse.dirigible dirigible-commons-config org.springframework.boot spring-boot-starter-web org.apache.logging.log4j log4j-to-slf4j org.springframework.boot spring-boot-starter-websocket org.springframework.boot spring-boot-starter-data-jdbc org.springframework.boot spring-boot-starter-data-jpa org.springframework.boot spring-boot-starter-security org.springframework.boot spring-boot-starter-validation org.springframework.boot spring-boot-starter-actuator org.springframework.security spring-security-web org.springframework.boot spring-boot-starter-test test org.junit.vintage junit-vintage-engine org.springframework.security spring-security-test test 
com.fasterxml.jackson.datatype jackson-datatype-joda org.springdoc springdoc-openapi-ui ${org.springdoc.openapi.ui.version} com.h2database h2 org.webjars webjars-locator ${webjars-locator} org.apache.olingo olingo-odata2-lib ${olingo.version} pom javax.ws.rs javax.ws.rs-api com.google.code.gson gson org.eclipse.dirigible dirigible-dependencies ${dirigible.version} pom import default true org.jacoco jacoco-maven-plugin ${jacoco.version} prepare-agent prepare-agent SOURCEFILE *src/test/* org.apache.maven.plugins maven-compiler-plugin ${maven.compiler.plugin.version} ${maven.compiler.source} ${maven.compiler.target} true lines,vars,source custom stack UTF-8 10.2.7 17 17 17 3.3.0 3.2.0 src/main/resources/META-INF/dirigible 3.13.0 2.22.2 1.13.0 branch 2.11.0 1.15 3.12.0 1.3 1.10.0 2.10.1 4.11.0 1.3 1.8.0 4.10.0 1.7.36 1.7.12 1.4.5 2.9.0 42.7.0 2.20.11 3.15.0 5.17.3 1.0 9.4.48.v20220622 9.4.2 1.1.0 6.8.0 2.3.0 2.3.3 2.1.5 4.3 2.2.3 6.4.0.202211300538-r 1.6.4 2.0.13 3.3.1 4.9.10 3.12.11 3.1.2 4.16.1 1.9.0 1.13.0 11.1.42 0.24.4 1.8.2 1.6.5 5.1.0 0.33.0 2.3.6 3.3.12 3.6.0 1.0.8r1250 3.3.7 4.6.7 2.6.1 1.8.2 4.7.0 4.8.154 1.22 1.17.6 1.17.6 1.17.6 5.16.0 7.7.1 0.7.5 0.5.4 2.7.10 3.0.0 5.3.24 0.51 20.0.2 5.0.1 1.7 2.3.2 0.9.5.5 22.3.1 31.1-jre 72.1 3.2.2 4.4 2.3 3.0.45.202211090110 0.64.0 2.3.1 1.12.386 1.17.6 3.11.3 4.3.1 3.3.1 6.2.1 1.1 0.8.11 3.0.2 1.7.0 1.6.9 none Create Branding Resources Navigate to the branding folder. Create src/main/resources/META-INF/dirigible/ide-branding/ folder structure and navigate to it. Create branding.js and custom-stack.svg files. branding.js custom-stack.svg Create new branding/src/main/resources/META-INF/dirigible/ide-branding/branding.js file. 
Paste the following content: branding/src/main/resources/META-INF/dirigible/ide-branding/branding.js const brandingInfo = { name : 'Custom Stack' , brand : 'Custom Stack' , brandUrl : 'https://github.com/dirigiblelabs/tutorial-custom-stack' , icons : { faviconIco : '/services/web/ide-branding/favicon.ico' , favicon32 : '/services/web/ide-branding/favicon-32x32.png' , favicon16 : '/services/web/ide-branding/favicon-16x16.png' , }, logo : '/services/web/ide-branding/custom-stack.svg' }; Favicons For the sake of simplicity, the favicon files were omitted. Create new branding/src/main/resources/META-INF/dirigible/ide-branding/custom-stack.svg file. Paste the following content: branding/src/main/resources/META-INF/dirigible/ide-branding/custom-stack.svg Add Branding Dependency Navigate to the application folder. Open the pom.xml file. Make the following changes: Add Branding Dependency Exclude Default Branding Final pom.xml Navigate to the section. Add the following dependency: io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT Navigate to the section. 
Edit the dirigible-components-group-ide dependency: org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding application/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true Build the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install Run the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . 
Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack . Reset Theme If the branding changes aren't visible, clear the browser cache and reset the theme by selecting Theme \u2192 Reset in the top right corner. Next Steps Section Completed After completing the steps in this tutorial, you would have: Branding Maven Module. Eclipse Dirigible Stack with custom branding running at http://localhost:8080 . Continue to the Facade section to create a Java facade and TypeScript API for the Custom Stack. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Branding"},{"location":"tutorials/customizations/custom-stack/branding/#custom-stack-branding","text":"","title":"Custom Stack - Branding"},{"location":"tutorials/customizations/custom-stack/branding/#overview","text":"This section will guide you through the process of rebranding the Eclipse Dirigible Custom Stack.","title":"Overview"},{"location":"tutorials/customizations/custom-stack/branding/#steps","text":"","title":"Steps"},{"location":"tutorials/customizations/custom-stack/branding/#create-maven-module","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Create branding folder and navigate to it. Create pom.xml file. pom.xml Create new branding/pom.xml file. Paste the following content: branding/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - branding custom-stack-branding 1.0.0-SNAPSHOT jar Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Open the pom.xml file. Navigate to the section. 
Add the following module: pom.xml Final pom.xml application branding pom.xml 4.0.0 org.sonatype.oss oss-parent 7 custom - stack - parent Custom Stack - Sample io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT pom 2024 https://www.dirigible.io Eclipse Foundation https://www.eclipse.org https://github.com/dirigiblelabs/tutorial-custom-stack application branding org.slf4j slf4j-api compile ch.qos.logback logback-core compile ch.qos.logback logback-classic compile org.eclipse.dirigible dirigible-commons-config org.springframework.boot spring-boot-starter-web org.apache.logging.log4j log4j-to-slf4j org.springframework.boot spring-boot-starter-websocket org.springframework.boot spring-boot-starter-data-jdbc org.springframework.boot spring-boot-starter-data-jpa org.springframework.boot spring-boot-starter-security org.springframework.boot spring-boot-starter-validation org.springframework.boot spring-boot-starter-actuator org.springframework.security spring-security-web org.springframework.boot spring-boot-starter-test test org.junit.vintage junit-vintage-engine org.springframework.security spring-security-test test com.fasterxml.jackson.datatype jackson-datatype-joda org.springdoc springdoc-openapi-ui ${org.springdoc.openapi.ui.version} com.h2database h2 org.webjars webjars-locator ${webjars-locator} org.apache.olingo olingo-odata2-lib ${olingo.version} pom javax.ws.rs javax.ws.rs-api com.google.code.gson gson org.eclipse.dirigible dirigible-dependencies ${dirigible.version} pom import default true org.jacoco jacoco-maven-plugin ${jacoco.version} prepare-agent prepare-agent SOURCEFILE *src/test/* org.apache.maven.plugins maven-compiler-plugin ${maven.compiler.plugin.version} ${maven.compiler.source} ${maven.compiler.target} true lines,vars,source custom stack UTF-8 10.2.7 17 17 17 3.3.0 3.2.0 src/main/resources/META-INF/dirigible 3.13.0 2.22.2 1.13.0 branch 2.11.0 1.15 3.12.0 1.3 1.10.0 2.10.1 4.11.0 1.3 1.8.0 4.10.0 1.7.36 1.7.12 1.4.5 2.9.0 42.7.0 2.20.11 3.15.0 
5.17.3 1.0 9.4.48.v20220622 9.4.2 1.1.0 6.8.0 2.3.0 2.3.3 2.1.5 4.3 2.2.3 6.4.0.202211300538-r 1.6.4 2.0.13 3.3.1 4.9.10 3.12.11 3.1.2 4.16.1 1.9.0 1.13.0 11.1.42 0.24.4 1.8.2 1.6.5 5.1.0 0.33.0 2.3.6 3.3.12 3.6.0 1.0.8r1250 3.3.7 4.6.7 2.6.1 1.8.2 4.7.0 4.8.154 1.22 1.17.6 1.17.6 1.17.6 5.16.0 7.7.1 0.7.5 0.5.4 2.7.10 3.0.0 5.3.24 0.51 20.0.2 5.0.1 1.7 2.3.2 0.9.5.5 22.3.1 31.1-jre 72.1 3.2.2 4.4 2.3 3.0.45.202211090110 0.64.0 2.3.1 1.12.386 1.17.6 3.11.3 4.3.1 3.3.1 6.2.1 1.1 0.8.11 3.0.2 1.7.0 1.6.9 none ","title":"Create Maven Module"},{"location":"tutorials/customizations/custom-stack/branding/#create-branding-resources","text":"Navigate to the branding folder. Create src/main/resources/META-INF/dirigible/ide-branding/ folder structure and navigate to it. Create branding.js and custom-stack.svg files. branding.js custom-stack.svg Create new branding/src/main/resources/META-INF/dirigible/ide-branding/branding.js file. Paste the following content: branding/src/main/resources/META-INF/dirigible/ide-branding/branding.js const brandingInfo = { name : 'Custom Stack' , brand : 'Custom Stack' , brandUrl : 'https://github.com/dirigiblelabs/tutorial-custom-stack' , icons : { faviconIco : '/services/web/ide-branding/favicon.ico' , favicon32 : '/services/web/ide-branding/favicon-32x32.png' , favicon16 : '/services/web/ide-branding/favicon-16x16.png' , }, logo : '/services/web/ide-branding/custom-stack.svg' }; Favicons For the sake of simplicity, the favicon files were omitted. Create new branding/src/main/resources/META-INF/dirigible/ide-branding/custom-stack.svg file. Paste the following content: branding/src/main/resources/META-INF/dirigible/ide-branding/custom-stack.svg ","title":"Create Branding Resources"},{"location":"tutorials/customizations/custom-stack/branding/#add-branding-dependency","text":"Navigate to the application folder. Open the pom.xml file. 
Make the following changes: Add Branding Dependency Exclude Default Branding Final pom.xml Navigate to the section. Add the following dependency: io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT Navigate to the section. Edit the dirigible-components-group-ide dependency: org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding application/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true ","title":"Add Branding 
Dependency"},{"location":"tutorials/customizations/custom-stack/branding/#build-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install","title":"Build the Custom Platform"},{"location":"tutorials/customizations/custom-stack/branding/#run-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack . Reset Theme If the branding changes aren't visible, clear the browser cache and reset the theme by selecting Theme \u2192 Reset in the top right corner.","title":"Run the Custom Platform"},{"location":"tutorials/customizations/custom-stack/branding/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would have: Branding Maven Module. Eclipse Dirigible Stack with custom branding running at http://localhost:8080 . Continue to the Facade section to create a Java facade and TypeScript API for the Custom Stack. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Next Steps"},{"location":"tutorials/customizations/custom-stack/dependency/","text":"Custom Stack - Dependency Overview This section will guide you through the process of adding an external Maven dependency for generating barcodes and using it in the Eclipse Dirigible Custom Stack without creating a separate Java Facade and/or TypeScript API. Note Creating TypeScript APIs is always recommended, as there is no out-of-the-box code completion for native Java objects. 
Steps Add External Dependency: Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the application folder. Open the pom.xml file. Make the following changes: Add External Dependency Final pom.xml Navigate to the section. Add the following dependency: uk.org.okapibarcode okapibarcode 0.3.3 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar io.dirigible.samples custom-stack-apis 1.0.0-SNAPSHOT io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT uk.org.okapibarcode okapibarcode 0.3.3 org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true Build the Custom Platform Navigate to the root folder of the 
project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install Run the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack . Test the Changes Create a project named demo-application . Right click on the demo-application project and select New \u2192 TypeScript Service . Enter barcode.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; const Code128 = Java . type ( \"uk.org.okapibarcode.backend.Code128\" ); const BufferedImage = Java . type ( \"java.awt.image.BufferedImage\" ); const Java2DRenderer = Java . type ( \"uk.org.okapibarcode.output.Java2DRenderer\" ); const Color = Java . type ( \"java.awt.Color\" ); const File = Java . type ( \"java.io.File\" ); const ImageIO = Java . type ( \"javax.imageio.ImageIO\" ); const FileUtils = Java . type ( \"org.apache.commons.io.FileUtils\" ); const barcode = new Code128 (); barcode . setFontName ( \"Monospaced\" ); barcode . setFontSize ( 16 ); barcode . setContent ( \"custom-stack-1234\" ); const image = new BufferedImage ( barcode . getWidth (), barcode . getHeight (), BufferedImage . TYPE_BYTE_GRAY ); const g2d = image . createGraphics (); const renderer = new Java2DRenderer ( g2d , 1 , Color . WHITE , Color . BLACK ); renderer . render ( barcode ); const file = new File ( \"code128.png\" ); ImageIO . write ( image , \"png\" , file ); const bytes = FileUtils . readFileToByteArray ( file ); response . setContentType ( \"image/png\" ); response . write ( bytes ); response . flush (); response . 
close (); Access to Java Classes Java classes can be accessed through the Java.type() function by providing the Fully Qualified Name (FQN) (e.g. io.dirigible.samples.MyFacade ) : const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); To invoke static method of the MyFacade class: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); MyFacade . greet (); To create class instance and call a method: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); const facadeInstance = new MyFacade (); facadeInstance . add ( 5 , 3 ); Right click on the demo-application project and select New \u2192 File . Enter tsconfig.json for the name of the File. { \"compilerOptions\" : { \"module\" : \"ESNext\" , \"target\" : \"ES6\" , \"moduleResolution\" : \"Node\" , \"baseUrl\" : \"../\" , \"lib\" : [ \"ESNext\" , \"DOM\" ], \"paths\" : { \"sdk/*\" : [ \"../modules/src/*\" ], \"/*\" : [ \"../*\" ] }, \"types\" : [ \"../modules/types\" ] } } Save the changes. Right click on the demo-application project and select New \u2192 File . Enter project.json for the name of the File. { \"guid\" : \"demo-application\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } Save the changes. Right click on the demo-application project and select Publish Select the barcode.ts from the Projects explorer and open the Preview view to see the result. Summary Tutorial Completed After completing all steps in this tutorial, you would have: Custom Eclipse Dirigible Stack. Custom branding of the Eclipse Dirigible Stack. Custom Java Facade and TypeScript API. 
Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Dependency"},{"location":"tutorials/customizations/custom-stack/dependency/#custom-stack-dependency","text":"","title":"Custom Stack - Dependency"},{"location":"tutorials/customizations/custom-stack/dependency/#overview","text":"This section will guide you through the process of adding an external Maven dependency for generating barcodes and using it in the Eclipse Dirigible Custom Stack without creating a separate Java Facade and/or TypeScript API. Note Creating TypeScript APIs is always recommended, as there is no out-of-the-box code completion for native Java objects.","title":"Overview"},{"location":"tutorials/customizations/custom-stack/dependency/#steps","text":"","title":"Steps"},{"location":"tutorials/customizations/custom-stack/dependency/#add-external-dependency","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the application folder. Open the pom.xml file. Make the following changes: Add External Dependency Final pom.xml Navigate to the section. 
Add the following dependency: uk.org.okapibarcode okapibarcode 0.3.3 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar io.dirigible.samples custom-stack-apis 1.0.0-SNAPSHOT io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT uk.org.okapibarcode okapibarcode 0.3.3 org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true ","title":"Add External Dependency:"},{"location":"tutorials/customizations/custom-stack/dependency/#build-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . 
Open the Terminal and execute the following command to build the Custom Platform : mvn clean install","title":"Build the Custom Platform"},{"location":"tutorials/customizations/custom-stack/dependency/#run-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack .","title":"Run the Custom Platform"},{"location":"tutorials/customizations/custom-stack/dependency/#test-the-changes","text":"Create a project named demo-application . Right click on the demo-application project and select New \u2192 TypeScript Service . Enter barcode.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; const Code128 = Java . type ( \"uk.org.okapibarcode.backend.Code128\" ); const BufferedImage = Java . type ( \"java.awt.image.BufferedImage\" ); const Java2DRenderer = Java . type ( \"uk.org.okapibarcode.output.Java2DRenderer\" ); const Color = Java . type ( \"java.awt.Color\" ); const File = Java . type ( \"java.io.File\" ); const ImageIO = Java . type ( \"javax.imageio.ImageIO\" ); const FileUtils = Java . type ( \"org.apache.commons.io.FileUtils\" ); const barcode = new Code128 (); barcode . setFontName ( \"Monospaced\" ); barcode . setFontSize ( 16 ); barcode . setContent ( \"custom-stack-1234\" ); const image = new BufferedImage ( barcode . getWidth (), barcode . getHeight (), BufferedImage . TYPE_BYTE_GRAY ); const g2d = image . createGraphics (); const renderer = new Java2DRenderer ( g2d , 1 , Color . WHITE , Color . BLACK ); renderer . render ( barcode ); const file = new File ( \"code128.png\" ); ImageIO . 
write ( image , \"png\" , file ); const bytes = FileUtils . readFileToByteArray ( file ); response . setContentType ( \"image/png\" ); response . write ( bytes ); response . flush (); response . close (); Access to Java Classes Java classes can be accessed through the Java.type() function by providing the Fully Qualified Name (FQN) (e.g. io.dirigible.samples.MyFacade ) : const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); To invoke a static method of the MyFacade class: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); MyFacade . greet (); To create a class instance and call a method: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); const facadeInstance = new MyFacade (); facadeInstance . add ( 5 , 3 ); Right click on the demo-application project and select New \u2192 File . Enter tsconfig.json for the name of the File. { \"compilerOptions\" : { \"module\" : \"ESNext\" , \"target\" : \"ES6\" , \"moduleResolution\" : \"Node\" , \"baseUrl\" : \"../\" , \"lib\" : [ \"ESNext\" , \"DOM\" ], \"paths\" : { \"sdk/*\" : [ \"../modules/src/*\" ], \"/*\" : [ \"../*\" ] }, \"types\" : [ \"../modules/types\" ] } } Save the changes. Right click on the demo-application project and select New \u2192 File . Enter project.json for the name of the File. { \"guid\" : \"demo-application\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } Save the changes. Right click on the demo-application project and select Publish . Select the barcode.ts from the Projects explorer and open the Preview view to see the result.","title":"Test the Changes"},{"location":"tutorials/customizations/custom-stack/dependency/#summary","text":"Tutorial Completed After completing all steps in this tutorial, you would have: Custom Eclipse Dirigible Stack. Custom branding of the Eclipse Dirigible Stack. 
Custom Java Facade and TypeScript API. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Summary"},{"location":"tutorials/customizations/custom-stack/facade/","text":"Custom Stack - Facade Overview This section will guide you through the process of creating a Java Facade and a TypeScript API for the Eclipse Dirigible Custom Stack. Prerequisites Node.js 18+ - Node.js versions can be found here esbuild 0.19+ - esbuild versions can be found here tsc 5.2+ - tsc versions can be found here Steps Create APIs Module Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Create an apis folder and navigate to it. Create pom.xml , MyFacade.java , MyApi.ts , project.json and tsconfig.json files. pom.xml MyFacade.java MyApi.ts Create new apis/pom.xml file. Paste the following content: apis/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - apis custom-stack-apis jar Note The creation of a Java facade is optional, as the same logic can be wrapped/implemented in the TypeScript API only by using the Java.type() function. Create src/main/java/io/dirigible/samples/ folder structure and navigate to it. Create new apis/src/main/java/io/dirigible/samples/MyFacade.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/MyFacade.java package io.dirigible.samples ; public class MyFacade { public static String greet () { return \"Hello, welcome to my custom Eclipse Dirigible stack!\" ; } public int add ( int a , int b ) { return a + b ; } public int multiply ( int a , int b ) { return a * b ; } public String customMethod ( String input ) { // Your custom logic here return \"Processed input: \" + input ; } } Create src/main/resources/META-INF/dirigible/custom-api/ folder structure and navigate to it. Create new apis/src/main/resources/META-INF/dirigible/custom-api/MyApi.ts file. 
Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/MyApi.ts const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); export class MyApi { private facadeInstance = new MyFacade (); public static greet () : string { return MyFacade . greet (); } public add ( a : number , b : number ) : number { return this . facadeInstance . add ( a , b ); } public multiply ( a : number , b : number ) : number { return this . facadeInstance . multiply ( a , b ); } public customMethod ( input : string ) : string { return this . facadeInstance . customMethod ( input ); } } Access to Java Classes Java classes can be accessed through the Java.type() function by providing the Fully Qualified Name (FQN) (e.g. io.dirigible.samples.MyFacade ) : const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); To invoke a static method of the MyFacade class: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); MyFacade . greet (); To create a class instance and call a method: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); const facadeInstance = new MyFacade (); facadeInstance . add ( 5 , 3 ); Add Module Dependency Navigate to the root folder of the project (e.g. /custom-stack ) . Open the pom.xml file. Make the following changes: Add APIs Module Final pom.xml Navigate to the section. 
Add the following module: apis application branding pom.xml 4.0.0 org.sonatype.oss oss-parent 7 custom - stack - parent Custom Stack - Sample io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT pom 2024 https://www.dirigible.io Eclipse Foundation https://www.eclipse.org https://github.com/dirigiblelabs/tutorial-custom-stack apis application branding org.slf4j slf4j-api compile ch.qos.logback logback-core compile ch.qos.logback logback-classic compile org.eclipse.dirigible dirigible-commons-config org.springframework.boot spring-boot-starter-web org.apache.logging.log4j log4j-to-slf4j org.springframework.boot spring-boot-starter-websocket org.springframework.boot spring-boot-starter-data-jdbc org.springframework.boot spring-boot-starter-data-jpa org.springframework.boot spring-boot-starter-security org.springframework.boot spring-boot-starter-validation org.springframework.boot spring-boot-starter-actuator org.springframework.security spring-security-web org.springframework.boot spring-boot-starter-test test org.junit.vintage junit-vintage-engine org.springframework.security spring-security-test test com.fasterxml.jackson.datatype jackson-datatype-joda org.springdoc springdoc-openapi-ui ${org.springdoc.openapi.ui.version} com.h2database h2 org.webjars webjars-locator ${webjars-locator} org.apache.olingo olingo-odata2-lib ${olingo.version} pom javax.ws.rs javax.ws.rs-api com.google.code.gson gson org.eclipse.dirigible dirigible-dependencies ${dirigible.version} pom import default true org.jacoco jacoco-maven-plugin ${jacoco.version} prepare-agent prepare-agent SOURCEFILE *src/test/* org.apache.maven.plugins maven-compiler-plugin ${maven.compiler.plugin.version} ${maven.compiler.source} ${maven.compiler.target} true lines,vars,source custom stack UTF-8 10.2.7 17 17 17 3.3.0 3.2.0 src/main/resources/META-INF/dirigible 3.13.0 2.22.2 1.13.0 branch 2.11.0 1.15 3.12.0 1.3 1.10.0 2.10.1 4.11.0 1.3 1.8.0 4.10.0 1.7.36 1.7.12 1.4.5 2.9.0 42.7.0 2.20.11 3.15.0 5.17.3 1.0 
9.4.48.v20220622 9.4.2 1.1.0 6.8.0 2.3.0 2.3.3 2.1.5 4.3 2.2.3 6.4.0.202211300538-r 1.6.4 2.0.13 3.3.1 4.9.10 3.12.11 3.1.2 4.16.1 1.9.0 1.13.0 11.1.42 0.24.4 1.8.2 1.6.5 5.1.0 0.33.0 2.3.6 3.3.12 3.6.0 1.0.8r1250 3.3.7 4.6.7 2.6.1 1.8.2 4.7.0 4.8.154 1.22 1.17.6 1.17.6 1.17.6 5.16.0 7.7.1 0.7.5 0.5.4 2.7.10 3.0.0 5.3.24 0.51 20.0.2 5.0.1 1.7 2.3.2 0.9.5.5 22.3.1 31.1-jre 72.1 3.2.2 4.4 2.3 3.0.45.202211090110 0.64.0 2.3.1 1.12.386 1.17.6 3.11.3 4.3.1 3.3.1 6.2.1 1.1 0.8.11 3.0.2 1.7.0 1.6.9 none Navigate to the application folder. Open the pom.xml file. Make the following changes: Add APIs Dependency Final pom.xml Navigate to the section. Add the following dependency: io.dirigible.samples custom-stack-apis 1.0.0-SNAPSHOT application/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar io.dirigible.samples custom-stack-apis 1.0.0-SNAPSHOT io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql 
org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true Build the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install Run the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack . Test the TypeScript API Create a project named demo-application . Right click on the demo-application project and select New \u2192 TypeScript CJS Service . Enter demo.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; import { MyApi } from \"custom-api/MyApi\" ; const myApiInstance = new MyApi (); const firstNumber = myApiInstance . add ( 5 , 3 ); const secondNumber = myApiInstance . multiply ( 5 , 3 ); const customMethod = myApiInstance . customMethod ( \"tutorial-custom-stack\" ); const greetingMessage = MyApi . greet (); const data = { firstNumber : firstNumber , secondNumber : secondNumber , customMethod : customMethod , greetingMessage : greetingMessage , }; response . println ( JSON . stringify ( data , null , 2 )); Save the changes. Right click on the demo-application project and select New \u2192 File . Enter tsconfig.json for the name of the File. 
{ \"compilerOptions\" : { \"module\" : \"ESNext\" , \"target\" : \"ES6\" , \"moduleResolution\" : \"Node\" , \"baseUrl\" : \"../\" , \"lib\" : [ \"ESNext\" , \"DOM\" ], \"paths\" : { \"sdk/*\" : [ \"../modules/src/*\" ], \"/*\" : [ \"../*\" ] }, \"types\" : [ \"../modules/types\" ] } } Save the changes. Right click on the demo-application project and select New \u2192 File . Enter project.json for the name of the File. { \"guid\" : \"demo-application\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } Save the changes. Right click on the demo-application project and select Publish . Select the demo.ts from the Projects explorer and open the Preview view to see the result. Next Steps Section Completed After completing the steps in this tutorial, you would have: APIs Maven Module. Java Facade io.dirigible.samples.MyFacade . TypeScript API custom-api/MyApi exposing the Java Facade. Sample project utilizing the TypeScript API. Continue either to the the Advanced Facade section or to the Dependency section where external Maven dependency is added and used in the Custom Stack without creating a Java Facade and TypeScript API. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Facade"},{"location":"tutorials/customizations/custom-stack/facade/#custom-stack-facade","text":"","title":"Custom Stack - Facade"},{"location":"tutorials/customizations/custom-stack/facade/#overview","text":"This section will guide you through the process of creation of Java Facade and TypeScript API for the Eclipse Dirigible Custom Stack. 
Prerequisites Node.js 18+ - Node.js versions can be found here esbuild 0.19+ - esbuild versions can be found here tsc 5.2+ - tsc versions can be found here","title":"Overview"},{"location":"tutorials/customizations/custom-stack/facade/#steps","text":"","title":"Steps"},{"location":"tutorials/customizations/custom-stack/facade/#create-apis-module","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Create an apis folder and navigate to it. Create pom.xml , MyFacade.java , MyApi.ts , project.json and tsconfig.json files. pom.xml MyFacade.java MyApi.ts Create new apis/pom.xml file. Paste the following content: apis/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - apis custom-stack-apis jar Note The creation of a Java facade is optional, as the same logic can be wrapped/implemented in the TypeScript API only by using the Java.type() function. Create src/main/java/io/dirigible/samples/ folder structure and navigate to it. Create new apis/src/main/java/io/dirigible/samples/MyFacade.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/MyFacade.java package io.dirigible.samples ; public class MyFacade { public static String greet () { return \"Hello, welcome to my custom Eclipse Dirigible stack!\" ; } public int add ( int a , int b ) { return a + b ; } public int multiply ( int a , int b ) { return a * b ; } public String customMethod ( String input ) { // Your custom logic here return \"Processed input: \" + input ; } } Create src/main/resources/META-INF/dirigible/custom-api/ folder structure and navigate to it. Create new apis/src/main/resources/META-INF/dirigible/custom-api/MyApi.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/MyApi.ts const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); export class MyApi { private facadeInstance = new MyFacade (); public static greet () : string { return MyFacade . 
greet (); } public add ( a : number , b : number ) : number { return this . facadeInstance . add ( a , b ); } public multiply ( a : number , b : number ) : number { return this . facadeInstance . multiply ( a , b ); } public customMethod ( input : string ) : string { return this . facadeInstance . customMethod ( input ); } } Access to Java Classes Java classes can be accessed through the Java.type() function by providing the Fully Qualified Name (FQN) (e.g. io.dirigible.samples.MyFacade ) : const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); To invoke a static method of the MyFacade class: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); MyFacade . greet (); To create a class instance and call a method: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); const facadeInstance = new MyFacade (); facadeInstance . add ( 5 , 3 );","title":"Create APIs Module"},{"location":"tutorials/customizations/custom-stack/facade/#add-module-dependency","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the pom.xml file. Make the following changes: Add APIs Module Final pom.xml Navigate to the section. 
Add the following module: apis application branding pom.xml 4.0.0 org.sonatype.oss oss-parent 7 custom - stack - parent Custom Stack - Sample io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT pom 2024 https://www.dirigible.io Eclipse Foundation https://www.eclipse.org https://github.com/dirigiblelabs/tutorial-custom-stack apis application branding org.slf4j slf4j-api compile ch.qos.logback logback-core compile ch.qos.logback logback-classic compile org.eclipse.dirigible dirigible-commons-config org.springframework.boot spring-boot-starter-web org.apache.logging.log4j log4j-to-slf4j org.springframework.boot spring-boot-starter-websocket org.springframework.boot spring-boot-starter-data-jdbc org.springframework.boot spring-boot-starter-data-jpa org.springframework.boot spring-boot-starter-security org.springframework.boot spring-boot-starter-validation org.springframework.boot spring-boot-starter-actuator org.springframework.security spring-security-web org.springframework.boot spring-boot-starter-test test org.junit.vintage junit-vintage-engine org.springframework.security spring-security-test test com.fasterxml.jackson.datatype jackson-datatype-joda org.springdoc springdoc-openapi-ui ${org.springdoc.openapi.ui.version} com.h2database h2 org.webjars webjars-locator ${webjars-locator} org.apache.olingo olingo-odata2-lib ${olingo.version} pom javax.ws.rs javax.ws.rs-api com.google.code.gson gson org.eclipse.dirigible dirigible-dependencies ${dirigible.version} pom import default true org.jacoco jacoco-maven-plugin ${jacoco.version} prepare-agent prepare-agent SOURCEFILE *src/test/* org.apache.maven.plugins maven-compiler-plugin ${maven.compiler.plugin.version} ${maven.compiler.source} ${maven.compiler.target} true lines,vars,source custom stack UTF-8 10.2.7 17 17 17 3.3.0 3.2.0 src/main/resources/META-INF/dirigible 3.13.0 2.22.2 1.13.0 branch 2.11.0 1.15 3.12.0 1.3 1.10.0 2.10.1 4.11.0 1.3 1.8.0 4.10.0 1.7.36 1.7.12 1.4.5 2.9.0 42.7.0 2.20.11 3.15.0 5.17.3 1.0 
9.4.48.v20220622 9.4.2 1.1.0 6.8.0 2.3.0 2.3.3 2.1.5 4.3 2.2.3 6.4.0.202211300538-r 1.6.4 2.0.13 3.3.1 4.9.10 3.12.11 3.1.2 4.16.1 1.9.0 1.13.0 11.1.42 0.24.4 1.8.2 1.6.5 5.1.0 0.33.0 2.3.6 3.3.12 3.6.0 1.0.8r1250 3.3.7 4.6.7 2.6.1 1.8.2 4.7.0 4.8.154 1.22 1.17.6 1.17.6 1.17.6 5.16.0 7.7.1 0.7.5 0.5.4 2.7.10 3.0.0 5.3.24 0.51 20.0.2 5.0.1 1.7 2.3.2 0.9.5.5 22.3.1 31.1-jre 72.1 3.2.2 4.4 2.3 3.0.45.202211090110 0.64.0 2.3.1 1.12.386 1.17.6 3.11.3 4.3.1 3.3.1 6.2.1 1.1 0.8.11 3.0.2 1.7.0 1.6.9 none Navigate to the application folder. Open the pom.xml file. Make the following changes: Add APIs Dependency Final pom.xml Navigate to the section. Add the following dependency: io.dirigible.samples custom-stack-apis 1.0.0-SNAPSHOT application/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar io.dirigible.samples custom-stack-apis 1.0.0-SNAPSHOT io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql 
org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true ","title":"Add Module Dependency"},{"location":"tutorials/customizations/custom-stack/facade/#build-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install","title":"Build the Custom Platform"},{"location":"tutorials/customizations/custom-stack/facade/#run-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack .","title":"Run the Custom Platform"},{"location":"tutorials/customizations/custom-stack/facade/#test-the-typescript-api","text":"Create a project named demo-application . Right click on the demo-application project and select New \u2192 TypeScript CJS Service . Enter demo.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; import { MyApi } from \"custom-api/MyApi\" ; const myApiInstance = new MyApi (); const firstNumber = myApiInstance . add ( 5 , 3 ); const secondNumber = myApiInstance . multiply ( 5 , 3 ); const customMethod = myApiInstance . customMethod ( \"tutorial-custom-stack\" ); const greetingMessage = MyApi . 
greet (); const data = { firstNumber : firstNumber , secondNumber : secondNumber , customMethod : customMethod , greetingMessage : greetingMessage , }; response . println ( JSON . stringify ( data , null , 2 )); Save the changes. Right click on the demo-application project and select New \u2192 File . Enter tsconfig.json for the name of the File. { \"compilerOptions\" : { \"module\" : \"ESNext\" , \"target\" : \"ES6\" , \"moduleResolution\" : \"Node\" , \"baseUrl\" : \"../\" , \"lib\" : [ \"ESNext\" , \"DOM\" ], \"paths\" : { \"sdk/*\" : [ \"../modules/src/*\" ], \"/*\" : [ \"../*\" ] }, \"types\" : [ \"../modules/types\" ] } } Save the changes. Right click on the demo-application project and select New \u2192 File . Enter project.json for the name of the File. { \"guid\" : \"demo-application\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } Save the changes. Right click on the demo-application project and select Publish . Select the demo.ts from the Projects explorer and open the Preview view to see the result.","title":"Test the TypeScript API"},{"location":"tutorials/customizations/custom-stack/facade/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would have: APIs Maven Module. Java Facade io.dirigible.samples.MyFacade . TypeScript API custom-api/MyApi exposing the Java Facade. Sample project utilizing the TypeScript API. Continue either to the Advanced Facade section or to the Dependency section where an external Maven dependency is added and used in the Custom Stack without creating a Java Facade and TypeScript API. 
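As a quick sanity check of the demo.ts service described above, its payload can be reproduced with the facade logic alone. This is a plain-TypeScript sketch: the arithmetic and string logic are copied from the tutorial's MyFacade, while the real service goes through the custom-api/MyApi wrapper and the sdk/http response object, neither of which exists outside the Dirigible runtime.

```typescript
// Plain-TypeScript re-implementation of the facade logic from the tutorial,
// used only to show the JSON shape the demo.ts service prints.
const add = (a: number, b: number): number => a + b;
const multiply = (a: number, b: number): number => a * b;
const customMethod = (input: string): string => 'Processed input: ' + input;
const greet = (): string => 'Hello, welcome to my custom Eclipse Dirigible stack!';

const data = {
  firstNumber: add(5, 3),       // 8
  secondNumber: multiply(5, 3), // 15
  customMethod: customMethod('tutorial-custom-stack'),
  greetingMessage: greet(),
};

// Stands in for response.println(...) in the real service.
console.log(JSON.stringify(data, null, 2));
```

The Preview view should show the same four fields once the project is published.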
Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Next Steps"},{"location":"tutorials/customizations/custom-stack/project-structure/","text":"Custom Stack - Project Structure Overview This section shows how to create the project structure of the Custom Stack. It contains the creation of several Maven pom.xml files, static content resources, application.properties configuration files and a Spring Boot Java class. Prerequisites JDK 21+ - OpenJDK versions can be found here . Maven 3.5+ - Maven version 3.5.3 can be found here . Steps Create Maven Project Create new folder on your machine, for the custom stack (e.g. /custom-stack ) . Create pom.xml and application/pom.xml files. pom.xml application/pom.xml Create new pom.xml file. Paste the following content: pom.xml 4.0.0 org.sonatype.oss oss-parent 7 custom - stack - parent Custom Stack - Sample io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT pom 2024 https://www.dirigible.io Eclipse Foundation https://www.eclipse.org https://github.com/dirigiblelabs/tutorial-custom-stack application org.slf4j slf4j-api compile ch.qos.logback logback-core compile ch.qos.logback logback-classic compile org.eclipse.dirigible dirigible-commons-config org.springframework.boot spring-boot-starter-web org.apache.logging.log4j log4j-to-slf4j org.springframework.boot spring-boot-starter-websocket org.springframework.boot spring-boot-starter-data-jdbc org.springframework.boot spring-boot-starter-data-jpa org.springframework.boot spring-boot-starter-security org.springframework.boot spring-boot-starter-validation org.springframework.boot spring-boot-starter-actuator org.springframework.security spring-security-web org.springframework.boot spring-boot-starter-test test org.junit.vintage junit-vintage-engine org.springframework.security spring-security-test test com.fasterxml.jackson.datatype jackson-datatype-joda org.springdoc springdoc-openapi-ui 
${org.springdoc.openapi.ui.version} com.h2database h2 org.webjars webjars-locator ${webjars-locator} org.apache.olingo olingo-odata2-lib ${olingo.version} pom javax.ws.rs javax.ws.rs-api com.google.code.gson gson org.eclipse.dirigible dirigible-dependencies ${dirigible.version} pom import default true org.jacoco jacoco-maven-plugin ${jacoco.version} prepare-agent prepare-agent SOURCEFILE *src/test/* org.apache.maven.plugins maven-compiler-plugin ${maven.compiler.plugin.version} ${maven.compiler.source} ${maven.compiler.target} true lines,vars,source custom stack UTF-8 10.2.7 17 17 17 3.3.0 3.2.0 src/main/resources/META-INF/dirigible 3.13.0 2.22.2 1.13.0 branch 2.11.0 1.15 3.12.0 1.3 1.10.0 2.10.1 4.11.0 1.3 1.8.0 4.10.0 1.7.36 1.7.12 1.4.5 2.9.0 42.7.0 2.20.11 3.15.0 5.17.3 1.0 9.4.48.v20220622 9.4.2 1.1.0 6.8.0 2.3.0 2.3.3 2.1.5 4.3 2.2.3 6.4.0.202211300538-r 1.6.4 2.0.13 3.3.1 4.9.10 3.12.11 3.1.2 4.16.1 1.9.0 1.13.0 11.1.42 0.24.4 1.8.2 1.6.5 5.1.0 0.33.0 2.3.6 3.3.12 3.6.0 1.0.8r1250 3.3.7 4.6.7 2.6.1 1.8.2 4.7.0 4.8.154 1.22 1.17.6 1.17.6 1.17.6 5.16.0 7.7.1 0.7.5 0.5.4 2.7.10 3.0.0 5.3.24 0.51 20.0.2 5.0.1 1.7 2.3.2 0.9.5.5 22.3.1 31.1-jre 72.1 3.2.2 4.4 2.3 3.0.45.202211090110 0.64.0 2.3.1 1.12.386 1.17.6 3.11.3 4.3.1 3.3.1 6.2.1 1.1 0.8.11 3.0.2 1.7.0 1.6.9 none Eclipse Dirigible version The tutorial is using Eclipse Dirigible version 10.2.7 as highlighted on line 229 . To check for a more recent and stable version go to Eclipse Dirigible Releases . Create new folder application and navigate to it. Create new application/pom.xml file. 
Paste the following content: application/pom.xml Git Repository For git repositories uncomment the following lines, in order to receive the Commit Id information in the About view: 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true Create Eclipse Dirigible Resources Navigate to the application folder. Create src/main/resources/ folder structure and navigate to it. Create dirigible.properties , index.html and index-busy.html files. dirigible.properties static/index.html static/index-busy.html Create application/src/main/resources/dirigible.properties file. 
Paste the following content: application/src/main/resources/dirigible.properties # General DIRIGIBLE_PRODUCT_NAME=${project.title} DIRIGIBLE_PRODUCT_VERSION=${project.version} DIRIGIBLE_PRODUCT_COMMIT_ID=${git.commit.id} DIRIGIBLE_PRODUCT_REPOSITORY=https://github.com/dirigiblelabs/tutorial-custom-stack DIRIGIBLE_PRODUCT_TYPE=all DIRIGIBLE_INSTANCE_NAME=custom-stack DIRIGIBLE_DATABASE_PROVIDER=local DIRIGIBLE_JAVASCRIPT_HANDLER_CLASS_NAME=org.eclipse.dirigible.graalium.handler.GraaliumJavascriptHandler DIRIGIBLE_GRAALIUM_ENABLE_DEBUG=true DIRIGIBLE_HOME_URL=services/web/ide/ DIRIGIBLE_FTP_PORT=22 Environment Variables The properties file will be packaged inside the Custom Stack , and the above environment variables will be set by default. These environment variables could be overridden during Deployment or at Runtime . To learn more about the supported configurations go to Environment Variables . Create static folder and navigate to it. Create application/src/main/resources/static/index.html file. Paste the following content: application/src/main/resources/static/index.html < html lang = \"en-US\" > < meta charset = \"utf-8\" > < title > Redirecting … < link rel = \"canonical\" href = \"/home\" > < script > location = \"/home\" < meta http-equiv = \"refresh\" content = \"0; url=/home\" > < meta name = \"robots\" content = \"noindex\" > < h1 > Redirecting … < a href = \"/home\" > Click here if you are not redirected. Create static folder and navigate to it. Create application/src/main/resources/static/index-busy.html file. 
Paste the following content: application/src/main/resources/static/index-busy.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"busyPage\" ng-controller = \"BusyController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Loading ... < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"padding-left: 10rem; padding-right: 10rem; margin-top: 3rem;\" > < div class = \"fd-panel fd-panel--fixed\" > < div class = \"fd-panel__header\" > < h4 class = \"fd-panel__title\" > Preparing Custom Stack Instance < fd-list > < fd-list-item ng-repeat = \"job in jobs\" > < span fd-object-status status = \"{{job.status}}\" glyph = \"{{job.statusIcon}}\" text = \"{{job.name}}\" > < fd-busy-indicator style = \"margin-top: 3rem;\" dg-size = \"l\" > < script > let busyPage = angular . module ( 'busyPage' , [ 'ideUI' , 'ideView' ]); busyPage . controller ( 'BusyController' , [ '$scope' , '$http' , 'theming' , function ( $scope , $http , theming ) { setInterval ( function () { $http ({ method : 'GET' , url : '/services/healthcheck' }). then ( function ( healthStatus ){ if ( healthStatus . data . status === \"Ready\" ) { window . location = '/home' ; } let jobs = []; for ( const [ key , value ] of Object . entries ( healthStatus . data . jobs . statuses )) { let job = new Object (); job . name = key ; switch ( value ) { case \"Succeeded\" : job . status = \"positive\" ; job . statusIcon = \"sap-icon--message-success\" break ; case \"Failed\" : job . 
status = \"negative\" ; job . statusIcon = \"sap-icon--message-error\" ; break ; default : job . status = \"informative\" ; job . statusIcon = \"sap-icon--message-information\" break ; } jobs . push ( job ); } $scope . jobs = jobs . sort (( x , y ) => x . name > y . name ? 1 : - 1 ); }, function ( e ){ console . error ( \"Error retrieving the health status\" , e ); }); }, 1000 ); }]); (optional) Create Eclipse Dirigible Error Resources Navigate to the application/src/main/resources folder. Create public folder and navigate to it. Create error.html , 403.html and 404.html files. error.html 403.html 404.html Create application/src/main/resources/public/error/error.html file. Paste the following content: application/src/main/resources/public/error/error.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"errorPage\" ng-controller = \"ErrorPageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Custom Stack | Unexpected Error Occurred < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"height: 600px; width: 100%;\" > < fd-message-page glyph = \"sap-icon--error\" > < fd-message-page-title > Unexpected Error Occurred < fd-message-page-subtitle > < b > There was a problem serving the requested page . < br > Usually this means that an unexpected error happened while processing your request. Here's what you can try next: < br > < br > < i >< b > Reload the page , the problem may be temporary. 
If the problem persists, < b > contact us and we'll help get you on your way. < fd-message-page-actions > < fd-button compact = \"true\" dg-label = \"Reload Page\" dg-type = \"emphasized\" style = \"margin: 0 0.25rem;\" ng-click = \"reloadPage()\" > < fd-button compact = \"true\" dg-label = \"Contact Support\" ng-click = \"contactSupport()\" > < script > let errorPage = angular . module ( 'errorPage' , [ 'ideUI' , 'ideView' ]); errorPage . controller ( 'ErrorPageController' , [ '$scope' , 'theming' , function ( $scope , theming ) { $scope . reloadPage = function () { location . reload (); }; $scope . contactSupport = function () { window . open ( \"https://bugs.dirigible.io\" , \"_blank\" ); }; }]); Create error folder and navigate to it. Create application/src/main/resources/error/403.html file. Paste the following content: application/src/main/resources/error/403.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"errorPage\" ng-controller = \"ErrorPageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Custom Stack | Access Denied < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"height: 600px; width: 100%;\" > < fd-message-page glyph = \"sap-icon--alert\" > < fd-message-page-title > Access Denied < fd-message-page-subtitle > < b > The page you're trying to access has restricted access . < br > Please contact your system administrator for more details. < script > let errorPage = angular . 
module ( 'errorPage' , [ 'ideUI' , 'ideView' ]); errorPage . controller ( 'ErrorPageController' , [ '$scope' , 'theming' , function ( $scope , theming ) { }]); Create error folder and navigate to it. Create application/src/main/resources/error/404.html file. Paste the following content: application/src/main/resources/error/404.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"errorPage\" ng-controller = \"ErrorPageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Custom Stack | Page Not Found < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"height: 600px; width: 100%;\" > < fd-message-page glyph = \"sap-icon--documents\" > < fd-message-page-title > Page Not Found < fd-message-page-subtitle > < b > It looks like you've reached a URL that doesn't exist . < br > The page you are looking for is no longer here, or never existed in the first place. < br > < br > < i > You can go to the < b > previous page , or start over from the < b > home page . < fd-message-page-actions > < fd-button compact = \"true\" dg-label = \"Go Back\" dg-type = \"emphasized\" style = \"margin: 0 0.25rem;\" ng-click = \"goBack()\" > < fd-button compact = \"true\" dg-label = \"Take Me Home\" ng-click = \"goHome()\" > < script > let errorPage = angular . module ( 'errorPage' , [ 'ideUI' , 'ideView' ]); errorPage . controller ( 'ErrorPageController' , [ '$scope' , 'theming' , function ( $scope , theming ) { $scope . 
goBack = function () { history . back (); }; $scope . goHome = function () { window . location = \"/home\" ; }; }]); Create Spring Boot Resources Navigate to the application folder. Create application.properties , quartz.properties and CustomStackApplication.java files. application.properties quartz.properties CustomStackApplication.java Navigate to the src/main/resources/ folder. Create application/src/main/resources/application.properties file. Paste the following content: application/src/main/resources/application.properties server.port=8080 spring.main.allow-bean-definition-overriding=true server.error.include-message=always spring.servlet.multipart.enabled=true spring.servlet.multipart.file-size-threshold=2KB spring.servlet.multipart.max-file-size=1GB spring.servlet.multipart.max-request-size=1GB spring.servlet.multipart.max-file-size=200MB spring.servlet.multipart.max-request-size=215MB spring.servlet.multipart.location=${java.io.tmpdir} spring.datasource.hikari.connectionTimeout=3600000 spring.mvc.async.request-timeout=3600000 basic.enabled=${DIRIGIBLE_BASIC_ENABLED:true} terminal.enabled=${DIRIGIBLE_TERMINAL_ENABLED:false} keycloak.enabled=${DIRIGIBLE_KEYCLOAK_ENABLED:false} keycloak.realm=${DIRIGIBLE_KEYCLOAK_REALM:null} keycloak.auth-server-url=${DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL:null} keycloak.ssl-required=${DIRIGIBLE_KEYCLOAK_SSL_REQUIRED:external} keycloak.resource=${DIRIGIBLE_KEYCLOAK_CLIENT_ID:null} keycloak.public-client=true keycloak.principal-attribute=preferred_username keycloak.confidential-port=${DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT:443} keycloak.use-resource-role-mappings=true management.metrics.mongo.command.enabled=false management.metrics.mongo.connectionpool.enabled=false management.endpoints.jmx.exposure.include=* management.endpoints.jmx.exposure.exclude= management.endpoints.web.exposure.include=* management.endpoints.web.exposure.exclude= management.endpoint.health.show-details=always springdoc.api-docs.path=/api-docs 
cxf.path=/odata/v2 # the following are used to force Spring to create QUARTZ tables # quartz properties are managed in quartz.properties, don't try to add them here spring.quartz.job-store-type=jdbc spring.quartz.jdbc.initialize-schema=always Navigate to the src/main/resources/ folder. Create application/src/main/resources/quartz.properties file. Paste the following content: application/src/main/resources/quartz.properties # thread-pool org.quartz.threadPool.class=org.quartz.simpl.SimpleThreadPool org.quartz.threadPool.threadCount=2 org.quartz.threadPool.threadsInheritContextClassLoaderOfInitializingThread=true # job-store # Enable this property for RAMJobStore org.quartz.jobStore.class=org.quartz.simpl.RAMJobStore # Enable these properties for a JDBCJobStore using JobStoreTX #org.quartz.jobStore.class=org.quartz.impl.jdbcjobstore.JobStoreTX #org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.StdJDBCDelegate #org.quartz.jobStore.dataSource=quartzDataSource # Enable this property for JobStoreCMT #org.quartz.jobStore.nonManagedTXDataSource=quartzDataSource # H2 database # use an in-memory database & initialise Quartz using their standard SQL script #org.quartz.dataSource.quartzDataSource.URL=jdbc:h2:mem:spring-quartz;INIT=RUNSCRIPT FROM 'classpath:/org/quartz/impl/jdbcjobstore/tables_h2.sql' #org.quartz.dataSource.quartzDataSource.driver=org.h2.Driver #org.quartz.dataSource.quartzDataSource.user=sa #org.quartz.dataSource.quartzDataSource.password= #org.quartz.jdbc.initialize-schema=never Navigate to the src/main folder. Create java/io/dirigible/samples/ and navigate to it. Create application/src/main/java/io/dirigible/samples/CustomStackApplication.java file. 
Paste the following content: application/src/main/java/io/dirigible/samples/CustomStackApplication.java package io.dirigible.samples ; import org.springframework.boot.SpringApplication ; import org.springframework.boot.autoconfigure.SpringBootApplication ; import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration ; import org.springframework.boot.autoconfigure.jdbc.DataSourceTransactionManagerAutoConfiguration ; import org.springframework.boot.autoconfigure.jdbc.JdbcTemplateAutoConfiguration ; import org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration ; import org.springframework.data.jpa.repository.config.EnableJpaAuditing ; import org.springframework.data.jpa.repository.config.EnableJpaRepositories ; import org.springframework.scheduling.annotation.EnableScheduling ; @EnableJpaAuditing @EnableJpaRepositories @SpringBootApplication ( scanBasePackages = { \"io.dirigible.samples\" , \"org.eclipse.dirigible.components\" }, exclude = { DataSourceAutoConfiguration . class , DataSourceTransactionManagerAutoConfiguration . class , HibernateJpaAutoConfiguration . class , JdbcTemplateAutoConfiguration . class }) @EnableScheduling public class CustomStackApplication { public static void main ( String [] args ) { SpringApplication . run ( CustomStackApplication . class , args ); } } Build the Custom Stack Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Stack : mvn clean install Run the Custom Stack Navigate to the root folder of the project (e.g. /custom-stack ) . 
Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Debugging To run in debug mode, execute the following command: java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=8000 -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack . Next Steps Section Completed After completing the steps in this tutorial, you will have: A Maven project structure. A Spring Boot application. An Eclipse Dirigible Stack running at http://localhost:8080 . Continue to the Branding section to customize the branding of the Custom Stack. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Project Structure"},{"location":"tutorials/customizations/custom-stack/project-structure/#custom-stack-project-structure","text":"","title":"Custom Stack - Project Structure"},{"location":"tutorials/customizations/custom-stack/project-structure/#overview","text":"This section shows how to create the project structure of the Custom Stack. It covers the creation of several Maven pom.xml files, static content resources, application.properties configuration files, and a Spring Boot Java class. Prerequisites JDK 21+ - OpenJDK versions can be found here . Maven 3.5+ - Maven version 3.5.3 can be found here .","title":"Overview"},{"location":"tutorials/customizations/custom-stack/project-structure/#steps","text":"","title":"Steps"},{"location":"tutorials/customizations/custom-stack/project-structure/#create-maven-project","text":"Create a new folder on your machine for the custom stack (e.g. /custom-stack ) . 
Create pom.xml and application/pom.xml files. pom.xml application/pom.xml Create new pom.xml file. Paste the following content: pom.xml 4.0.0 org.sonatype.oss oss-parent 7 custom - stack - parent Custom Stack - Sample io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT pom 2024 https://www.dirigible.io Eclipse Foundation https://www.eclipse.org https://github.com/dirigiblelabs/tutorial-custom-stack application org.slf4j slf4j-api compile ch.qos.logback logback-core compile ch.qos.logback logback-classic compile org.eclipse.dirigible dirigible-commons-config org.springframework.boot spring-boot-starter-web org.apache.logging.log4j log4j-to-slf4j org.springframework.boot spring-boot-starter-websocket org.springframework.boot spring-boot-starter-data-jdbc org.springframework.boot spring-boot-starter-data-jpa org.springframework.boot spring-boot-starter-security org.springframework.boot spring-boot-starter-validation org.springframework.boot spring-boot-starter-actuator org.springframework.security spring-security-web org.springframework.boot spring-boot-starter-test test org.junit.vintage junit-vintage-engine org.springframework.security spring-security-test test com.fasterxml.jackson.datatype jackson-datatype-joda org.springdoc springdoc-openapi-ui ${org.springdoc.openapi.ui.version} com.h2database h2 org.webjars webjars-locator ${webjars-locator} org.apache.olingo olingo-odata2-lib ${olingo.version} pom javax.ws.rs javax.ws.rs-api com.google.code.gson gson org.eclipse.dirigible dirigible-dependencies ${dirigible.version} pom import default true org.jacoco jacoco-maven-plugin ${jacoco.version} prepare-agent prepare-agent SOURCEFILE *src/test/* org.apache.maven.plugins maven-compiler-plugin ${maven.compiler.plugin.version} ${maven.compiler.source} ${maven.compiler.target} true lines,vars,source custom stack UTF-8 10.2.7 17 17 17 3.3.0 3.2.0 src/main/resources/META-INF/dirigible 3.13.0 2.22.2 1.13.0 branch 2.11.0 1.15 3.12.0 1.3 1.10.0 2.10.1 4.11.0 1.3 1.8.0 
4.10.0 1.7.36 1.7.12 1.4.5 2.9.0 42.7.0 2.20.11 3.15.0 5.17.3 1.0 9.4.48.v20220622 9.4.2 1.1.0 6.8.0 2.3.0 2.3.3 2.1.5 4.3 2.2.3 6.4.0.202211300538-r 1.6.4 2.0.13 3.3.1 4.9.10 3.12.11 3.1.2 4.16.1 1.9.0 1.13.0 11.1.42 0.24.4 1.8.2 1.6.5 5.1.0 0.33.0 2.3.6 3.3.12 3.6.0 1.0.8r1250 3.3.7 4.6.7 2.6.1 1.8.2 4.7.0 4.8.154 1.22 1.17.6 1.17.6 1.17.6 5.16.0 7.7.1 0.7.5 0.5.4 2.7.10 3.0.0 5.3.24 0.51 20.0.2 5.0.1 1.7 2.3.2 0.9.5.5 22.3.1 31.1-jre 72.1 3.2.2 4.4 2.3 3.0.45.202211090110 0.64.0 2.3.1 1.12.386 1.17.6 3.11.3 4.3.1 3.3.1 6.2.1 1.1 0.8.11 3.0.2 1.7.0 1.6.9 none Eclipse Dirigible version The tutorial is using Eclipse Dirigible version 10.2.7 as highlighted on line 229 . To check for a more recent and stable version go to Eclipse Dirigible Releases . Create new folder application and navigate to it. Create new application/pom.xml file. Paste the following content: application/pom.xml Git Repository For git repositories uncomment the following lines, in order to receive the Commit Id information in the About view: 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom 
org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true ","title":"Create Maven Project"},{"location":"tutorials/customizations/custom-stack/project-structure/#create-eclipse-dirigible-resources","text":"Navigate to the application folder. Create src/main/resources/ folder structure and navigate to it. Create dirigible.properties , index.html and index-busy.html files. dirigible.properties static/index.html static/index-busy.html Create application/src/main/resources/dirigible.properties file. Paste the following content: application/src/main/resources/dirigible.properties # General DIRIGIBLE_PRODUCT_NAME=${project.title} DIRIGIBLE_PRODUCT_VERSION=${project.version} DIRIGIBLE_PRODUCT_COMMIT_ID=${git.commit.id} DIRIGIBLE_PRODUCT_REPOSITORY=https://github.com/dirigiblelabs/tutorial-custom-stack DIRIGIBLE_PRODUCT_TYPE=all DIRIGIBLE_INSTANCE_NAME=custom-stack DIRIGIBLE_DATABASE_PROVIDER=local DIRIGIBLE_JAVASCRIPT_HANDLER_CLASS_NAME=org.eclipse.dirigible.graalium.handler.GraaliumJavascriptHandler DIRIGIBLE_GRAALIUM_ENABLE_DEBUG=true DIRIGIBLE_HOME_URL=services/web/ide/ DIRIGIBLE_FTP_PORT=22 Environment Variables The properties file will be packaged inside the Custom Stack , and the above environment variables will be set by default. These environment variables could be overridden during Deployment or at Runtime . To learn more about the supported configurations go to Environment Variables . Create static folder and navigate to it. Create application/src/main/resources/static/index.html file. 
Paste the following content: application/src/main/resources/static/index.html < html lang = \"en-US\" > < meta charset = \"utf-8\" > < title > Redirecting … < link rel = \"canonical\" href = \"/home\" > < script > location = \"/home\" < meta http-equiv = \"refresh\" content = \"0; url=/home\" > < meta name = \"robots\" content = \"noindex\" > < h1 > Redirecting … < a href = \"/home\" > Click here if you are not redirected. Create static folder and navigate to it. Create application/src/main/resources/static/index-busy.html file. Paste the following content: application/src/main/resources/static/index-busy.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"busyPage\" ng-controller = \"BusyController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Loading ... < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"padding-left: 10rem; padding-right: 10rem; margin-top: 3rem;\" > < div class = \"fd-panel fd-panel--fixed\" > < div class = \"fd-panel__header\" > < h4 class = \"fd-panel__title\" > Preparing Custom Stack Instance < fd-list > < fd-list-item ng-repeat = \"job in jobs\" > < span fd-object-status status = \"{{job.status}}\" glyph = \"{{job.statusIcon}}\" text = \"{{job.name}}\" > < fd-busy-indicator style = \"margin-top: 3rem;\" dg-size = \"l\" > < script > let busyPage = angular . module ( 'busyPage' , [ 'ideUI' , 'ideView' ]); busyPage . 
controller ( 'BusyController' , [ '$scope' , '$http' , 'theming' , function ( $scope , $http , theming ) { setInterval ( function () { $http ({ method : 'GET' , url : '/services/healthcheck' }). then ( function ( healthStatus ){ if ( healthStatus . data . status === \"Ready\" ) { window . location = '/home' ; } let jobs = []; for ( const [ key , value ] of Object . entries ( healthStatus . data . jobs . statuses )) { let job = new Object (); job . name = key ; switch ( value ) { case \"Succeeded\" : job . status = \"positive\" ; job . statusIcon = \"sap-icon--message-success\" break ; case \"Failed\" : job . status = \"negative\" ; job . statusIcon = \"sap-icon--message-error\" ; break ; default : job . status = \"informative\" ; job . statusIcon = \"sap-icon--message-information\" break ; } jobs . push ( job ); } $scope . jobs = jobs . sort (( x , y ) => x . name > y . name ? 1 : - 1 ); }, function ( e ){ console . error ( \"Error retrieving the health status\" , e ); }); }, 1000 ); }]); ","title":"Create Eclipse Dirigible Resources"},{"location":"tutorials/customizations/custom-stack/project-structure/#optional-create-eclipse-dirigible-error-resources","text":"Navigate to the application/src/main/resources folder. Create public folder and navigate to it. Create error.html , 403.html and 404.html files. error.html 403.html 404.html Create application/src/main/resources/public/error/error.html file. 
Paste the following content: application/src/main/resources/public/error/error.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"errorPage\" ng-controller = \"ErrorPageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Custom Stack | Unexpected Error Occurred < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"height: 600px; width: 100%;\" > < fd-message-page glyph = \"sap-icon--error\" > < fd-message-page-title > Unexpected Error Occurred < fd-message-page-subtitle > < b > There was a problem serving the requested page . < br > Usually this means that an unexpected error happened while processing your request. Here's what you can try next: < br > < br > < i >< b > Reload the page , the problem may be temporary. If the problem persists, < b > contact us and we'll help get you on your way. < fd-message-page-actions > < fd-button compact = \"true\" dg-label = \"Reload Page\" dg-type = \"emphasized\" style = \"margin: 0 0.25rem;\" ng-click = \"reloadPage()\" > < fd-button compact = \"true\" dg-label = \"Contact Support\" ng-click = \"contactSupport()\" > < script > let errorPage = angular . module ( 'errorPage' , [ 'ideUI' , 'ideView' ]); errorPage . controller ( 'ErrorPageController' , [ '$scope' , 'theming' , function ( $scope , theming ) { $scope . reloadPage = function () { location . reload (); }; $scope . contactSupport = function () { window . 
open ( \"https://bugs.dirigible.io\" , \"_blank\" ); }; }]); Create error folder and navigate to it. Create application/src/main/resources/error/403.html file. Paste the following content: application/src/main/resources/error/403.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"errorPage\" ng-controller = \"ErrorPageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Custom Stack | Access Denied < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"height: 600px; width: 100%;\" > < fd-message-page glyph = \"sap-icon--alert\" > < fd-message-page-title > Access Denied < fd-message-page-subtitle > < b > The page you're trying to access has restricted access . < br > Please contact your system administrator for more details. < script > let errorPage = angular . module ( 'errorPage' , [ 'ideUI' , 'ideView' ]); errorPage . controller ( 'ErrorPageController' , [ '$scope' , 'theming' , function ( $scope , theming ) { }]); Create error folder and navigate to it. Create application/src/main/resources/error/404.html file. 
Paste the following content: application/src/main/resources/error/404.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"errorPage\" ng-controller = \"ErrorPageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Custom Stack | Page Not Found < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"height: 600px; width: 100%;\" > < fd-message-page glyph = \"sap-icon--documents\" > < fd-message-page-title > Page Not Found < fd-message-page-subtitle > < b > It looks like you've reached a URL that doesn't exist . < br > The page you are looking for is no longer here, or never existed in the first place. < br > < br > < i > You can go to the < b > previous page , or start over from the < b > home page . < fd-message-page-actions > < fd-button compact = \"true\" dg-label = \"Go Back\" dg-type = \"emphasized\" style = \"margin: 0 0.25rem;\" ng-click = \"goBack()\" > < fd-button compact = \"true\" dg-label = \"Take Me Home\" ng-click = \"goHome()\" > < script > let errorPage = angular . module ( 'errorPage' , [ 'ideUI' , 'ideView' ]); errorPage . controller ( 'ErrorPageController' , [ '$scope' , 'theming' , function ( $scope , theming ) { $scope . goBack = function () { history . back (); }; $scope . goHome = function () { window . 
location = \"/home\" ; }; }]); ","title":"(optional) Create Eclipse Dirigible Error Resources"},{"location":"tutorials/customizations/custom-stack/project-structure/#create-spring-boot-resources","text":"Navigate to the application folder. Create application.properties , quartz.properties and CustomStackApplication.java files. application.properties quartz.properties CustomStackApplication.java Navigate to the src/main/resources/ folder. Create application/src/main/resources/application.properties file. Paste the following content: application/src/main/resources/application.properties server.port=8080 spring.main.allow-bean-definition-overriding=true server.error.include-message=always spring.servlet.multipart.enabled=true spring.servlet.multipart.file-size-threshold=2KB spring.servlet.multipart.max-file-size=1GB spring.servlet.multipart.max-request-size=1GB spring.servlet.multipart.max-file-size=200MB spring.servlet.multipart.max-request-size=215MB spring.servlet.multipart.location=${java.io.tmpdir} spring.datasource.hikari.connectionTimeout=3600000 spring.mvc.async.request-timeout=3600000 basic.enabled=${DIRIGIBLE_BASIC_ENABLED:true} terminal.enabled=${DIRIGIBLE_TERMINAL_ENABLED:false} keycloak.enabled=${DIRIGIBLE_KEYCLOAK_ENABLED:false} keycloak.realm=${DIRIGIBLE_KEYCLOAK_REALM:null} keycloak.auth-server-url=${DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL:null} keycloak.ssl-required=${DIRIGIBLE_KEYCLOAK_SSL_REQUIRED:external} keycloak.resource=${DIRIGIBLE_KEYCLOAK_CLIENT_ID:null} keycloak.public-client=true keycloak.principal-attribute=preferred_username keycloak.confidential-port=${DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT:443} keycloak.use-resource-role-mappings=true management.metrics.mongo.command.enabled=false management.metrics.mongo.connectionpool.enabled=false management.endpoints.jmx.exposure.include=* management.endpoints.jmx.exposure.exclude= management.endpoints.web.exposure.include=* management.endpoints.web.exposure.exclude= 
management.endpoint.health.show-details=always springdoc.api-docs.path=/api-docs cxf.path=/odata/v2 # the following are used to force Spring to create the Quartz tables # quartz properties are managed in quartz.properties; don't try to add them here spring.quartz.job-store-type=jdbc spring.quartz.jdbc.initialize-schema=always Navigate to the src/main/resources/ folder. Create application/src/main/resources/quartz.properties file. Paste the following content: application/src/main/resources/quartz.properties # thread-pool org.quartz.threadPool.class=org.quartz.simpl.SimpleThreadPool org.quartz.threadPool.threadCount=2 org.quartz.threadPool.threadsInheritContextClassLoaderOfInitializingThread=true # job-store # Enable this property for RAMJobStore org.quartz.jobStore.class=org.quartz.simpl.RAMJobStore # Enable these properties for a JDBCJobStore using JobStoreTX #org.quartz.jobStore.class=org.quartz.impl.jdbcjobstore.JobStoreTX #org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.StdJDBCDelegate #org.quartz.jobStore.dataSource=quartzDataSource # Enable this property for JobStoreCMT #org.quartz.jobStore.nonManagedTXDataSource=quartzDataSource # H2 database # use an in-memory database & initialise Quartz using its standard SQL script #org.quartz.dataSource.quartzDataSource.URL=jdbc:h2:mem:spring-quartz;INIT=RUNSCRIPT FROM 'classpath:/org/quartz/impl/jdbcjobstore/tables_h2.sql' #org.quartz.dataSource.quartzDataSource.driver=org.h2.Driver #org.quartz.dataSource.quartzDataSource.user=sa #org.quartz.dataSource.quartzDataSource.password= #org.quartz.jdbc.initialize-schema=never Navigate to the src/main folder. Create java/io/dirigible/samples/ and navigate to it. Create application/src/main/java/io/dirigible/samples/CustomStackApplication.java file. 
Paste the following content: application/src/main/java/io/dirigible/samples/CustomStackApplication.java package io.dirigible.samples ; import org.springframework.boot.SpringApplication ; import org.springframework.boot.autoconfigure.SpringBootApplication ; import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration ; import org.springframework.boot.autoconfigure.jdbc.DataSourceTransactionManagerAutoConfiguration ; import org.springframework.boot.autoconfigure.jdbc.JdbcTemplateAutoConfiguration ; import org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration ; import org.springframework.data.jpa.repository.config.EnableJpaAuditing ; import org.springframework.data.jpa.repository.config.EnableJpaRepositories ; import org.springframework.scheduling.annotation.EnableScheduling ; @EnableJpaAuditing @EnableJpaRepositories @SpringBootApplication ( scanBasePackages = { \"io.dirigible.samples\" , \"org.eclipse.dirigible.components\" }, exclude = { DataSourceAutoConfiguration . class , DataSourceTransactionManagerAutoConfiguration . class , HibernateJpaAutoConfiguration . class , JdbcTemplateAutoConfiguration . class }) @EnableScheduling public class CustomStackApplication { public static void main ( String [] args ) { SpringApplication . run ( CustomStackApplication . class , args ); } }","title":"Create Spring Boot Resources"},{"location":"tutorials/customizations/custom-stack/project-structure/#build-the-custom-stack","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Stack : mvn clean install","title":"Build the Custom Stack"},{"location":"tutorials/customizations/custom-stack/project-structure/#run-the-custom-stack","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . 
Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Debugging To run in debug mode, execute the following command: java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=8000 -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack .","title":"Run the Custom Stack"},{"location":"tutorials/customizations/custom-stack/project-structure/#next-steps","text":"Section Completed After completing the steps in this tutorial, you should have: A Maven project structure. A Spring Boot application. An Eclipse Dirigible Stack running at http://localhost:8080 . Continue to the Branding section to customize the branding of the Custom Stack. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Next Steps"},{"location":"tutorials/customizations/ide/create-perspective/","text":"Create Perspective All perspectives in Eclipse Dirigible are loaded via the ide-perspective extension point. A list of all extension points can be found at the Extensions Overview page. To develop a new perspective, an extension , a perspective definition and frontend resources should be created. The following example uses AngularJS and Fundamental Library . Steps Start Eclipse Dirigible. Info You can find more information on how to do that by following: Getting Started section. Setup section. Go to the Projects perspective and create New Project . Enter my-perspective for the name of the project. The project will appear under the projects list. 
Create perspective extension: Right click on the my-perspective project and select New \u2192 Folder . Enter perspective for the name of the folder. Right click on the perspective folder and select New \u2192 Folder . Enter extensions for the name of the folder. Create perspective.extension , perspective-menu.extension , perspective-menu-window.extension and perspective-menu-help.extension files. perspective.extension perspective-menu.extension perspective-menu-window.extension perspective-menu-help.extension Right click on the extensions folder and select New \u2192 Extension . Enter perspective.extension for the name of the file. Right click on the perspective.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"my-perspective/perspective/perspective.js\" , \"extensionPoint\" : \"ide-perspective\" , \"description\" : \"My Perspective\" } Save the changes and close the Code Editor . (optional) Double click on the perspective.extension file to open the extension with the Extension Editor . Right click on the extensions folder and select New \u2192 Extension . Enter perspective-menu.extension for the name of the file. Right click on the perspective-menu.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"my-perspective/perspective/perspective-menu.js\" , \"extensionPoint\" : \"my-perspective-menu\" , \"description\" : \"My Perspective Menu\" } Save the changes and close the Code Editor . (optional) Double click on the perspective-menu.extension file to open the extension with the Extension Editor . Right click on the extensions folder and select New \u2192 Extension . Enter perspective-menu-window.extension for the name of the file. Right click on the perspective-menu-window.extension file and select Open With \u2192 Code Editor . 
Replace the content with the following definition: { \"module\" : \"ide-core/services/menus/window.js\" , \"extensionPoint\" : \"my-perspective-menu\" , \"description\" : \"Window Menu\" } Save the changes and close the Code Editor . (optional) Double click on the perspective-menu-window.extension file to open the extension with the Extension Editor . Right click on the extensions folder and select New \u2192 Extension . Enter perspective-menu-help.extension for the name of the file. Right click on the perspective-menu-help.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"ide-core/services/menus/help.js\" , \"extensionPoint\" : \"my-perspective-menu\" , \"description\" : \"Help Menu\" } Save the changes and close the Code Editor . (optional) Double click on the perspective-menu-help.extension file to open the extension with the Extension Editor . Create perspective definition: Create perspective.js and perspective-menu.js files. perspective.js perspective-menu.js Right click on the perspective folder and select New \u2192 JavaScript CJS Service . Enter perspective.js for the name of the file. Double click on the perspective.js file to open it with the Code Editor . Replace the content with the following code: const perspectiveData = { id : \"my-perspective\" , name : \"My Perspective\" , link : \"../my-perspective/index.html\" , order : \"1000\" , icon : \"../my-perspective/icon.svg\" , }; if ( typeof exports !== 'undefined' ) { exports . getPerspective = function () { return perspectiveData ; } } Save the changes and close the Code Editor . Right click on the perspective folder and select New \u2192 JavaScript CJS Service . Enter perspective-menu.js for the name of the file. Double click on the perspective-menu.js file to open it with the Code Editor . Replace the content with the following code: exports . 
getMenu = function () { return { label : \"My Menu\" , order : 1 , items : [ { label : \"Empty item\" , order : 1 }, { label : \"Empty item with divider\" , divider : true , order : 2 }, { label : \"Submenu\" , order : 3 , items : [ { label : \"GitHub page\" , data : \"https://github.com/eclipse/dirigible\" , action : \"open\" , order : 1 } ] }, { label : \"About\" , action : \"openDialogWindow\" , dialogId : \"about\" , order : 4 } ] }; } Save the changes and close the Code Editor . Create perspective frontend resources: Create index.html , controller.js and icon.svg files. index.html controller.js icon.svg Right click on the my-perspective project and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double click on the index.html file to open it with the Code Editor . Replace the content with the following code: < html lang = \"en\" ng-app = \"myPerspective\" ng-controller = \"MyPerspectiveController\" xmlns = \"http://www.w3.org/1999/xhtml\" > < head > < meta charset = \"utf-8\" /> < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < title dg-brand-title > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < script type = \"text/javascript\" src = \"/services/v4/web/my-perspective/perspective/perspective.js\" > < theme > < script type = \"text/javascript\" src = \"/services/v4/js/ide-core/services/loader.js?id=ide-perspective-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/js/ide-core/services/loader.js?id=ide-perspective-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body dg-contextmenu = \"contextMenuContent\" > < ide-header menu-ext-id = \"my-perspective-menu\" > < ide-contextmenu > < ide-container > < ide-layout views-layout-model = \"layoutModel\" > < ide-dialogs > < ide-status-bar > Save the changes and close the Code Editor . Right click on the my-perspective project and select New \u2192 File . 
Enter controller.js for the name of the file. Double click on the controller.js file to open it with the Code Editor . Replace the content with the following code: let myPerspective = angular . module ( \"myPerspective\" , [ \"ngResource\" , \"ideLayout\" , \"ideUI\" ]); myPerspective . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = 'example' ; }]); myPerspective . controller ( \"MyPerspectiveController\" , [ \"$scope\" , \"messageHub\" , function ( $scope , messageHub ) { $scope . layoutModel = { // Array of view ids views : [ \"import\" , \"welcome\" , \"console\" ], layoutSettings : { hideEditorsPane : false }, events : { \"example.alert.info\" : function ( msg ) { console . info ( msg . data . message ); } } }; $scope . contextMenuContent = function ( element ) { return { callbackTopic : \"example.contextmenu\" , items : [ { id : \"new\" , label : \"New\" , icon : \"sap-icon--create\" , items : [ { id : \"tab\" , label : \"Tab\" }, ] }, { id : \"other\" , label : \"Other\" , divider : true , icon : \"sap-icon--question-mark\" } ] } }; messageHub . onDidReceiveMessage ( \"contextmenu\" , function ( msg ) { if ( msg . data == \"other\" ) { messageHub . showAlertSuccess ( \"Success\" , \"You have selected the other option!\" ); } else { messageHub . showAlertInfo ( \"Nothing will happen\" , \"This is just a demo after all.\" ); } } ); }]); Save the changes and close the Code Editor . Right click on the my-perspective project and select New \u2192 File . Enter icon.svg for the name of the file. Right click on the icon.svg file and select Open With \u2192 Code Editor . Replace the content with the following code: Save the changes and close the Code Editor . Refresh the browser. Info In some cases you may want to go to Theme \u2192 Reset to clean Web IDE state. The new perspective should be visible at the bottom of the perspectives list. 
Info Alternatively go to Window \u2192 Open Perspective \u2192 My Perspective to open the new perspective.","title":"Perspective"},{"location":"tutorials/customizations/ide/create-perspective/#create-perspective","text":"Create Perspective All perspectives in Eclipse Dirigible are loaded via the ide-perspective extension point. A list of all extension points can be found at the Extensions Overview page. To develop a new perspective, an extension , a perspective definition and frontend resources should be created. The following example uses AngularJS and Fundamental Library .","title":"Create Perspective"},{"location":"tutorials/customizations/ide/create-perspective/#steps","text":"Steps Start Eclipse Dirigible. Info You can find more information on how to do that by following: Getting Started section. Setup section. Go to the Projects perspective and create New Project . Enter my-perspective for the name of the project. The project will appear under the projects list. Create perspective extension: Right click on the my-perspective project and select New \u2192 Folder . Enter perspective for the name of the folder. Right click on the perspective folder and select New \u2192 Folder . Enter extensions for the name of the folder. Create perspective.extension , perspective-menu.extension , perspective-menu-window.extension and perspective-menu-help.extension files. perspective.extension perspective-menu.extension perspective-menu-window.extension perspective-menu-help.extension Right click on the extensions folder and select New \u2192 Extension . Enter perspective.extension for the name of the file. Right click on the perspective.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"my-perspective/perspective/perspective.js\" , \"extensionPoint\" : \"ide-perspective\" , \"description\" : \"My Perspective\" } Save the changes and close the Code Editor . 
(optional) Double click on the perspective.extension file to open the extension with the Extension Editor . Right click on the extensions folder and select New \u2192 Extension . Enter perspective-menu.extension for the name of the file. Right click on the perspective-menu.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"my-perspective/perspective/perspective-menu.js\" , \"extensionPoint\" : \"my-perspective-menu\" , \"description\" : \"My Perspective Menu\" } Save the changes and close the Code Editor . (optional) Double click on the perspective-menu.extension file to open the extension with the Extension Editor . Right click on the extensions folder and select New \u2192 Extension . Enter perspective-menu-window.extension for the name of the file. Right click on the perspective-menu-window.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"ide-core/services/menus/window.js\" , \"extensionPoint\" : \"my-perspective-menu\" , \"description\" : \"Window Menu\" } Save the changes and close the Code Editor . (optional) Double click on the perspective-menu-window.extension file to open the extension with the Extension Editor . Right click on the extensions folder and select New \u2192 Extension . Enter perspective-menu-help.extension for the name of the file. Right click on the perspective-menu-help.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"ide-core/services/menus/help.js\" , \"extensionPoint\" : \"my-perspective-menu\" , \"description\" : \"Help Menu\" } Save the changes and close the Code Editor . (optional) Double click on the perspective-menu-help.extension file to open the extension with the Extension Editor . Create perspective definition: Create perspective.js and perspective-menu.js files. 
perspective.js perspective-menu.js Right click on the perspective folder and select New \u2192 JavaScript CJS Service . Enter perspective.js for the name of the file. Double click on the perspective.js file to open it with the Code Editor . Replace the content with the following code: const perspectiveData = { id : \"my-perspective\" , name : \"My Perspective\" , link : \"../my-perspective/index.html\" , order : \"1000\" , icon : \"../my-perspective/icon.svg\" , }; if ( typeof exports !== 'undefined' ) { exports . getPerspective = function () { return perspectiveData ; } } Save the changes and close the Code Editor . Right click on the perspective folder and select New \u2192 JavaScript CJS Service . Enter perspective-menu.js for the name of the file. Double click on the perspective-menu.js file to open it with the Code Editor . Replace the content with the following code: exports . getMenu = function () { return { label : \"My Menu\" , order : 1 , items : [ { label : \"Empty item\" , order : 1 }, { label : \"Empty item with divider\" , divider : true , order : 2 }, { label : \"Submenu\" , order : 3 , items : [ { label : \"GitHub page\" , data : \"https://github.com/eclipse/dirigible\" , action : \"open\" , order : 1 } ] }, { label : \"About\" , action : \"openDialogWindow\" , dialogId : \"about\" , order : 4 } ] }; } Save the changes and close the Code Editor . Create perspective frontend resources: Create index.html , controller.js and icon.svg files. index.html controller.js icon.svg Right click on the my-perspective project and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double click on the index.html file to open it with the Code Editor . 
Replace the content with the following code: < html lang = \"en\" ng-app = \"myPerspective\" ng-controller = \"MyPerspectiveController\" xmlns = \"http://www.w3.org/1999/xhtml\" > < head > < meta charset = \"utf-8\" /> < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < title dg-brand-title > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < script type = \"text/javascript\" src = \"/services/v4/web/my-perspective/perspective/perspective.js\" > < theme > < script type = \"text/javascript\" src = \"/services/v4/js/ide-core/services/loader.js?id=ide-perspective-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/js/ide-core/services/loader.js?id=ide-perspective-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body dg-contextmenu = \"contextMenuContent\" > < ide-header menu-ext-id = \"my-perspective-menu\" > < ide-contextmenu > < ide-container > < ide-layout views-layout-model = \"layoutModel\" > < ide-dialogs > < ide-status-bar > Save the changes and close the Code Editor . Right click on the my-perspective project and select New \u2192 File . Enter controller.js for the name of the file. Double click on the controller.js file to open it with the Code Editor . Replace the content with the following code: let myPerspective = angular . module ( \"myPerspective\" , [ \"ngResource\" , \"ideLayout\" , \"ideUI\" ]); myPerspective . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = 'example' ; }]); myPerspective . controller ( \"MyPerspectiveController\" , [ \"$scope\" , \"messageHub\" , function ( $scope , messageHub ) { $scope . layoutModel = { // Array of view ids views : [ \"import\" , \"welcome\" , \"console\" ], layoutSettings : { hideEditorsPane : false }, events : { \"example.alert.info\" : function ( msg ) { console . info ( msg . data . message ); } } }; $scope . 
contextMenuContent = function ( element ) { return { callbackTopic : \"example.contextmenu\" , items : [ { id : \"new\" , label : \"New\" , icon : \"sap-icon--create\" , items : [ { id : \"tab\" , label : \"Tab\" }, ] }, { id : \"other\" , label : \"Other\" , divider : true , icon : \"sap-icon--question-mark\" } ] } }; messageHub . onDidReceiveMessage ( \"contextmenu\" , function ( msg ) { if ( msg . data == \"other\" ) { messageHub . showAlertSuccess ( \"Success\" , \"You have selected the other option!\" ); } else { messageHub . showAlertInfo ( \"Nothing will happen\" , \"This is just a demo after all.\" ); } } ); }]); Save the changes and close the Code Editor . Right click on the my-perspective project and select New \u2192 File . Enter icon.svg for the name of the file. Right click on the icon.svg file and select Open With \u2192 Code Editor . Replace the content with the following code: Save the changes and close the Code Editor . Refresh the browser. Info In some cases you may want to go to Theme \u2192 Reset to clean Web IDE state. The new perspective should be visible at the bottom of the perspectives list. Info Alternatively go to Window \u2192 Open Perspective \u2192 My Perspective to open the new perspective.","title":"Steps"},{"location":"tutorials/customizations/ide/create-view/","text":"Create View All views in Eclipse Dirigible are loaded via the ide-view extension point. A list of all extension points can be found at the Extensions Overview page. To develop a new view, an extension , a view definition and frontend resources should be created. The following example uses AngularJS and Fundamental Library . Steps Start Eclipse Dirigible. Info You can find more information on how to do that by following: Getting Started section. Setup section. Go to the Projects perspective and create New Project . Enter my-view for the name of the project. The project will appear under the projects list. 
Create view extension: Right click on the my-view project and select New \u2192 Folder . Enter view for the name of the folder. Create view.extension and view.js files. view.extension view.js Right click on the view folder and select New \u2192 Extension . Enter view.extension for the name of the file. Right click on the view.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"my-view/view/view.js\" , \"extensionPoint\" : \"ide-view\" , \"description\" : \"My View\" } Save the changes and close the Code Editor . (optional) Double click on the view.extension file to open the extension with the Extension Editor . Right click on the view folder and select New \u2192 JavaScript CJS Service . Enter view.js for the name of the file. Double click on the view.js file to open it with the Code Editor . Replace the content with the following code: const viewData = { id : \"my-view\" , label : \"My View\" , factory : \"frame\" , region : \"bottom\" , link : \"../my-view/index.html\" , }; if ( typeof exports !== 'undefined' ) { exports . getView = function () { return viewData ; } } Save the changes and close the Code Editor . Create view frontend resources: Create index.html and controller.js files. index.html controller.js Right click on the my-view project and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double click on the index.html file to open it with the Code Editor . 
Replace the content with the following code: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"myView\" ng-controller = \"MyViewController as mvc\" > < head > < meta charset = \"utf-8\" /> < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" sizes = \"any\" href = \"data:;base64,iVBORw0KGgo=\" > < title dg-view-title > < script type = \"text/javascript\" src = \"/webjars/jquery/3.6.0/jquery.min.js\" > < script type = \"text/javascript\" src = \"/webjars/angularjs/1.8.2/angular.min.js\" > < script type = \"text/javascript\" src = \"/webjars/angularjs/1.8.2/angular-resource.min.js\" > < script type = \"text/javascript\" src = \"/webjars/angular-aria/1.8.2/angular-aria.min.js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/webjars/fundamental-styles/0.24.0/dist/fundamental-styles.css\" > < theme > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/web/resources/styles/core.css\" /> < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/web/resources/styles/widgets.css\" /> < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/core/message-hub.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/core/ide-message-hub.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/ui/theming.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/ui/widgets.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/ui/view.js\" > < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < fd-fieldset > < fd-form-group dg-header = \"My Form\" > < fd-form-item horizontal = \"true\" > < fd-form-label for = \"idName\" dg-required = \"true\" dg-colon = \"true\" > Name < fd-input id = \"idName\" type = \"text\" placeholder = \"Enter name here\" ng-model = \"inputData.name\" > < fd-form-item 
horizontal = \"true\" > < fd-form-label for = \"idEmail\" dg-required = \"true\" dg-colon = \"true\" > Email < fd-input id = \"idEmail\" type = \"text\" placeholder = \"Enter email here\" ng-model = \"inputData.email\" > < button class = \"fd-button fd-button--emphasized\" ng-click = \"saveForm()\" style = \"margin: 6px;\" > Save < table fd-table display-mode = \"compact\" style = \"margin-top: 20px\" > < thead fd-table-header > < tr fd-table-row > < th fd-table-header-cell > Name < th fd-table-header-cell > Email < tbody fd-table-body > < tr fd-table-row hoverable = \"true\" ng-repeat = \"next in data\" > < td fd-table-cell > {{next.name}} < td fd-table-cell activable = \"true\" >< a class = \"fd-link\" > {{next.email}} Save the changes and close the Code Editor . Right click on the my-view project and select New \u2192 File . Enter controller.js for the name of the file. Double click on the controller.js file to open it with the Code Editor . Replace the content with the following code: let myView = angular . module ( \"myView\" , [ \"ideUI\" , \"ideView\" ]); myView . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = \"myView\" ; }]); myView . controller ( \"MyViewController\" , [ \"$scope\" , \"$http\" , \"messageHub\" , function ( $scope , $http , messageHub ) { $scope . inputData = {}; $scope . data = [{ name : \"John Doe\" , email : \"john.doe@email.com\" }, { name : \"Jane Doe\" , email : \"jane.doe@email.com\" }]; $scope . saveForm = function () { messageHub . showAlertInfo ( \"Form Successfully Saved\" , `Name: ${ $scope . inputData . name } , Email: ${ $scope . inputData . email } ` ); }; }]); Save the changes and close the Code Editor . Refresh the browser. Info In some cases you may want to go to Theme \u2192 Reset to clean Web IDE state. 
Go to Window \u2192 Show View \u2192 My View to open the new view.","title":"View"},{"location":"tutorials/customizations/ide/create-view/#create-view","text":"Create View All views in Eclipse Dirigible are loaded via the ide-view extension point. A list of all extension points can be found at the Extensions Overview page. To develop a new view, an extension , a view definition and frontend resources should be created. The following example uses AngularJS and Fundamental Library .","title":"Create View"},{"location":"tutorials/customizations/ide/create-view/#steps","text":"Steps Start Eclipse Dirigible. Info You can find more information on how to do that by following: Getting Started section. Setup section. Go to the Projects perspective and create New Project . Enter my-view for the name of the project. The project will appear under the projects list. Create view extension: Right click on the my-view project and select New \u2192 Folder . Enter view for the name of the folder. Create view.extension and view.js files. view.extension view.js Right click on the view folder and select New \u2192 Extension . Enter view.extension for the name of the file. Right click on the view.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"my-view/view/view.js\" , \"extensionPoint\" : \"ide-view\" , \"description\" : \"My View\" } Save the changes and close the Code Editor . (optional) Double click on the view.extension file to open the extension with the Extension Editor . Right click on the view folder and select New \u2192 JavaScript CJS Service . Enter view.js for the name of the file. Double click on the view.js file to open it with the Code Editor . Replace the content with the following code: const viewData = { id : \"my-view\" , label : \"My View\" , factory : \"frame\" , region : \"bottom\" , link : \"../my-view/index.html\" , }; if ( typeof exports !== 'undefined' ) { exports . 
getView = function () { return viewData ; } } Save the changes and close the Code Editor . Create view frontend resources: Create index.html and controller.js files. index.html controller.js Right click on the my-view project and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double click on the index.html file to open it with the Code Editor . Replace the content with the following code: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"myView\" ng-controller = \"MyViewController as mvc\" > < head > < meta charset = \"utf-8\" /> < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" sizes = \"any\" href = \"data:;base64,iVBORw0KGgo=\" > < title dg-view-title > < script type = \"text/javascript\" src = \"/webjars/jquery/3.6.0/jquery.min.js\" > < script type = \"text/javascript\" src = \"/webjars/angularjs/1.8.2/angular.min.js\" > < script type = \"text/javascript\" src = \"/webjars/angularjs/1.8.2/angular-resource.min.js\" > < script type = \"text/javascript\" src = \"/webjars/angular-aria/1.8.2/angular-aria.min.js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/webjars/fundamental-styles/0.24.0/dist/fundamental-styles.css\" > < theme > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/web/resources/styles/core.css\" /> < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/web/resources/styles/widgets.css\" /> < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/core/message-hub.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/core/ide-message-hub.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/ui/theming.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/ui/widgets.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/ui/view.js\" > < script type = \"text/javascript\" src = \"controller.js\" > < body 
class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < fd-fieldset > < fd-form-group dg-header = \"My Form\" > < fd-form-item horizontal = \"true\" > < fd-form-label for = \"idName\" dg-required = \"true\" dg-colon = \"true\" > Name < fd-input id = \"idName\" type = \"text\" placeholder = \"Enter name here\" ng-model = \"inputData.name\" > < fd-form-item horizontal = \"true\" > < fd-form-label for = \"idEmail\" dg-required = \"true\" dg-colon = \"true\" > Email < fd-input id = \"idEmail\" type = \"text\" placeholder = \"Enter email here\" ng-model = \"inputData.email\" > < button class = \"fd-button fd-button--emphasized\" ng-click = \"saveForm()\" style = \"margin: 6px;\" > Save < table fd-table display-mode = \"compact\" style = \"margin-top: 20px\" > < thead fd-table-header > < tr fd-table-row > < th fd-table-header-cell > Name < th fd-table-header-cell > Email < tbody fd-table-body > < tr fd-table-row hoverable = \"true\" ng-repeat = \"next in data\" > < td fd-table-cell > {{next.name}} < td fd-table-cell activable = \"true\" >< a class = \"fd-link\" > {{next.email}} Save the changes and close the Code Editor . Right click on the my-view project and select New \u2192 File . Enter controller.js for the name of the file. Double click on the controller.js file to open it with the Code Editor . Replace the content with the following code: let myView = angular . module ( \"myView\" , [ \"ideUI\" , \"ideView\" ]); myView . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = \"myView\" ; }]); myView . controller ( \"MyViewController\" , [ \"$scope\" , \"$http\" , \"messageHub\" , function ( $scope , $http , messageHub ) { $scope . inputData = {}; $scope . data = [{ name : \"John Doe\" , email : \"john.doe@email.com\" }, { name : \"Jane Doe\" , email : \"jane.doe@email.com\" }]; $scope . saveForm = function () { messageHub . showAlertInfo ( \"Form Successfully Save\" , `Name: ${ $scope . inputData . 
name } , Email: ${ $scope . inputData . email } ` ); }; }]); Save the changes and close the Code Editor . Refresh the browser. Info In some cases you may want to go to Theme \u2192 Reset to clean the Web IDE state. Go to Window \u2192 Show View \u2192 My View to open the new view.","title":"Steps"},{"location":"tutorials/modeling/bpmn-process/","text":"BPMN Process This tutorial will guide you through the steps of creating a Business Process with Service Task , User Task and Choice Gateway elements. The result of the business process modeling would be a Time Entry Request process that, once started, would trigger an approval process (with mail notifications, if configured) with the following steps: Steps Start Eclipse Dirigible Info You can find more information on how to do that by following: Getting Started section. Setup section. Create Project Go to the Projects perspective and create New Project . Enter sample-bpm for the name of the project. The project will appear under the projects list. Create JavaScript Process Task Handlers JavaScript handlers should be provided for the Service Task steps in the Business Process . The following handlers will be executed during the Approve Time Entry Request , Reject Time Entry Request and Send Notification tasks. Right click on the sample-bpm project and select New \u2192 Folder . Enter tasks for the name of the folder. Create approve-request.js , reject-request.js and send-notification.js files. approve-request.js reject-request.js send-notification.js Right click on the tasks folder and select New \u2192 JavaScript CJS Service . Enter approve-request.js for the name of the file. Double-click to open the file. Replace the content with the following: const process = require ( \"bpm/v4/process\" ); const mailClient = require ( \"mail/v4/client\" ); const config = require ( \"core/v4/configurations\" ); let execution = process . getExecutionContext (); let executionId = execution . getId (); let user = process .
getVariable ( executionId , \"user\" ); console . log ( `Time Entry Request Approved for User [ ${ user } ]` ); if ( isMailConfigured ()) { let from = config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ); let to = config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ); let subject = \"Time Entry Request - Approved\" ; let content = `

Status:

Time Entry Request for [ ${ user } ] - Approved` ; let subType = \"html\" ; mailClient . send ( from , to , subject , content , subType ); } else { console . error ( \"Missing mail configuration\" ); } function isMailConfigured () { return config . get ( \"DIRIGIBLE_MAIL_USERNAME\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_PASSWORD\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_HOST\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_PORT\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ) != \"\" } Save the changes. Right click on the tasks folder and select New \u2192 JavaScript CJS Service . Enter reject-request.js for the name of the file. Double-click to open the file. Replace the content with the following: const process = require ( \"bpm/v4/process\" ); const mailClient = require ( \"mail/v4/client\" ); const config = require ( \"core/v4/configurations\" ); let execution = process . getExecutionContext (); let executionId = execution . getId (); let user = process . getVariable ( executionId , \"user\" ); console . error ( `Time Entry Request Rejected for User [ ${ user } ]` ); if ( isMailConfigured ()) { let from = config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ); let to = config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ); let subject = \"Time Entry Request - Rejected\" ; let content = `

Status:

Time Entry Request for [ ${ user } ] - Rejected

` ; let subType = \"html\" ; mailClient . send ( from , to , subject , content , subType ); } else { console . error ( \"Missing mail configuration\" ); } function isMailConfigured () { return config . get ( \"DIRIGIBLE_MAIL_USERNAME\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_PASSWORD\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_HOST\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_PORT\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ) != \"\" } Save the changes. Right click on the tasks folder and select New \u2192 JavaScript CJS Service . Enter send-notification.js for the name of the file. Double-click to open the file. Replace the content with the following: const process = require ( \"bpm/v4/process\" ); const base64 = require ( \"utils/v4/base64\" ); const mailClient = require ( \"mail/v4/client\" ); const config = require ( \"core/v4/configurations\" ); let execution = process . getExecutionContext (); let executionId = execution . getId (); let data = { executionId : executionId , User : process . getVariable ( executionId , \"User\" ), Project : process . getVariable ( executionId , \"Project\" ), Start : process . getVariable ( executionId , \"Start\" ), End : process . getVariable ( executionId , \"End\" ), Hours : process . getVariable ( executionId , \"Hours\" ) }; let urlEncodedData = base64 . encode ( JSON . stringify ( data )); let url = `http://localhost:8080/services/v4/web/sample-bpm/process/?data= ${ urlEncodedData } ` ; console . log ( `Approve Request URL: ${ url } ` ); if ( isMailConfigured ()) { let from = config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ); let to = config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ); let subject = \"Time Entry Request - Pending\" ; let content = `

Status:

Time Entry Request for [ ${ data . User } ] - Pending

Click here to process request.` ; let subType = \"html\" ; mailClient . send ( from , to , subject , content , subType ); } else { console . error ( \"Missing mail configuration\" ); } function isMailConfigured () { return config . get ( \"DIRIGIBLE_MAIL_USERNAME\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_PASSWORD\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_HOST\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_PORT\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ) != \"\" } Save the changes. Create Business Process Model Right click on the sample-bpm project and select New \u2192 Business Process Model . Enter time-entry-request.bpmn for the name of the business process. Manual Steps XML Content Double-click the time-entry-request.bpmn file to open it with the Flowable Editor . Click on the Process identifier field and change the value to time-entry-request . Click on the MyServiceTasks to select the first step of the business process. Click on the Name field and change the value to Send Notification . Scroll down to the Class field and click on it. Change the handler field to sample-bpm/tasks/send-notification.js . JavaScript Task Handler The value of the handler field (e.g. sample-bpm/tasks/send-notification.js ) points to the location of the JavaScript task handler created in the previous step. Delete the arrow coming out of the Send Notification step. Expand the Activities group and drag and drop a new User task to the editor area. Connect the Send Notification task and the newly created user task. User Task Once the business process is triggered, it would stop at the Process Time Entry Request user task and wait for process continuation after the user task is completed. Select the user task.
Click on the Name field and change the value to Process Time Entry Request . Create a Choice gateway coming out of the Process Time Entry Request user task. Expand the Activities group and drag and drop a new Service task to the editor area. Select the service task. Click on the Name field and change the value to Approve Time Entry Request . Scroll down to the Class field and click on it. Change the handler field to sample-bpm/tasks/approve-request.js . Expand the Activities group and drag and drop a new Service task to the editor area. Select the service task. Click on the Name field and change the value to Reject Time Entry Request . Scroll down to the Class field and click on it. Change the handler field to sample-bpm/tasks/reject-request.js . Connect the Choice gateway with the Approve Time Entry Request and Reject Time Entry Request steps. Select the connection between the Choice gateway and the Reject Time Entry Request step. Click on the Default flow checkbox. Select the connection between the Choice gateway and the Approve Time Entry Request step. Click on the Flow condition field and change the value to ${isRequestApproved} . Flow Condition In the flow condition isRequestApproved is a process context variable that would be set as part of the process continuation after the completion of the Process Time Entry Request user task. Connect the Approve Time Entry Request and Reject Time Entry Request steps with the end event. Save the changes. Right click on the time-entry-request.bpmn file and select Open With \u2192 Code Editor . Replace the content with the following: Save the changes. Business Process Synchronization Usually, when the *.bpmn process is saved, it would take between one and two minutes to be deployed and become active. After that period of time the business process can be executed. The synchronization period by default is set to 50 seconds ( 0/50 * * * * ? ) . Find out more about the Job Expression environment variables.
Updating the *.bpmn file would result in a new synchronization being triggered and the updated process flow would be available after a minute or two. Updating the JavaScript Task Handler won't require a new synchronization and the new behaviour of the handlers will be available on the fly. Create Process API To trigger and continue the BPMN Process execution, a server-side JavaScript API will be created. Right click on the sample-bpm project and select New \u2192 Folder . Enter api for the name of the folder. Create process.js file. process.js Right click on the api folder and select New \u2192 JavaScript CJS Service . Enter process.js for the name of the file. Double-click to open the file. Replace the content with the following: const rs = require ( \"http/v4/rs\" ); const process = require ( \"bpm/v4/process\" ); const tasks = require ( \"bpm/v4/tasks\" ); const user = require ( \"security/v4/user\" ); rs . service () . post ( \"\" , ( ctx , request , response ) => { let data = request . getJSON (); process . start ( 'time-entry-request' , { \"User\" : \"\" + user . getName (), \"Project\" : \"\" + data . Project , \"Start\" : \"\" + data . Start , \"End\" : \"\" + data . End , \"Hours\" : \"\" + data . Hours }); response . setStatus ( response . ACCEPTED ); }) . resource ( \"continue/:executionId\" ) . post (( ctx , request , response ) => { let executionId = request . params . executionId ; let tasksList = tasks . list (); let data = request . getJSON (); for ( const task of tasksList ) { if ( task . executionId . toString () === executionId . toString ()) { tasks . completeTask ( task . id , { isRequestApproved : data . approved , user : data . user }); break ; } } response . setStatus ( response . ACCEPTED ); }) . execute () Create Submit Form The submit form would call the server-side JavaScript API that was created in the previous step and trigger the business process. Right click on the sample-bpm project and select New \u2192 Folder .
Enter submit for the name of the folder. Create index.html and controller.js files. index.html controller.js Right click on the submit folder and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double-click to open the file. Replace the content with the following: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/v4/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" dg-contextmenu = \"contextMenuContent\" > < div > < fd-message-page glyph = \"sap-icon--time-entry-request\" > < fd-message-page-title > Submit Time Entry Request < fd-message-page-subtitle > < fd-scrollbar class = \"dg-full-height\" > < fd-fieldset class = \"fd-margin--md\" ng-form = \"formFieldset\" > < fd-form-group name = \"entityForm\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idProject\" dg-required = \"true\" dg-colon = \"true\" > Project < fd-combobox-input id = \"idProject\" name = \"Project\" state = \"{{ formErrors.Project ? 
'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['Project'].$valid, 'Project')\" ng-model = \"entity.Project\" dropdown-items = \"optionsProject\" dg-placeholder = \"Search Project ...\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idStart\" dg-required = \"true\" dg-colon = \"true\" > Start < fd-form-input-message-group dg-inactive = \"{{ formErrors.Start ? false : true }}\" > < fd-input id = \"idStart\" name = \"Start\" state = \"{{ formErrors.Start ? 'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['Start'].$valid, 'Start')\" ng-model = \"entity.Start\" type = \"date\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idEnd\" dg-required = \"true\" dg-colon = \"true\" > End < fd-form-input-message-group dg-inactive = \"{{ formErrors.End ? false : true }}\" > < fd-input id = \"idEnd\" name = \"End\" state = \"{{ formErrors.End ? 'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['End'].$valid, 'End')\" ng-model = \"entity.End\" type = \"date\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idHours\" dg-required = \"true\" dg-colon = \"true\" > Hours < fd-form-input-message-group dg-inactive = \"{{ formErrors.Hours ? false : true }}\" > < fd-input id = \"idHours\" name = \"Hours\" state = \"{{ formErrors.Hours ? 'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['Hours'].$valid, 'Hours')\" ng-model = \"entity.Hours\" min = \"0\" max = \"40\" dg-input-rules = \"{ patterns: [''] }\" type = \"number\" placeholder = \"Enter Hours\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-message-page-actions > < fd-button class = \"fd-margin-end--tiny fd-dialog__decisive-button\" compact = \"true\" dg-type = \"emphasized\" dg-label = \"Submit\" ng-click = \"submit()\" state = \"{{ !isFormValid ? 
">
'disabled' : '' }}\" > < fd-button class = \"fd-dialog__decisive-button\" compact = \"true\" dg-type = \"transparent\" dg-label = \"Cancel\" ng-click = \"resetForm()\" > Right click on the submit folder and select New \u2192 File . Enter controller.js for the name of the file. Double-click to open the file. Replace the content with the following: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" ]) . controller ( 'PageController' , [ '$scope' , '$http' , function ( $scope , $http ) { $scope . entity = {}; $scope . optionsProject = [{ text : \"Project Alpha\" , value : \"Project Alpha\" }, { text : \"Project Beta\" , value : \"Project Beta\" }, { text : \"Project Evolution\" , value : \"Project Evolution\" }, { text : \"Project Next\" , value : \"Project Next\" }]; $scope . isValid = function ( isValid , property ) { $scope . formErrors [ property ] = ! isValid ? true : undefined ; for ( let next in $scope . formErrors ) { if ( $scope . formErrors [ next ] === true ) { $scope . isFormValid = false ; return ; } } $scope . isFormValid = true ; }; $scope . submit = function () { $http . post ( \"/services/v4/js/sample-bpm/api/process.js\" , JSON . stringify ( $scope . entity )). then ( function ( response ) { if ( response . status != 202 ) { alert ( `Unable to submit Time Entry Request: ' ${ response . message } '` ); $scope . resetForm (); return ; } alert ( \"Time Entry Request successfully submitted\" ); $scope . resetForm (); }); }; $scope . resetForm = function () { $scope . entity = {}; $scope . formErrors = { Project : true , Start : true , End : true , Hours : true , }; }; $scope . resetForm (); }]); Create Process Form The process form would call the server-side JavaScript API that was created before and resume the business process execution. Right click on the sample-bpm project and select New \u2192 Folder . Enter process for the name of the folder. Create index.html and controller.js files.
index.html controller.js Right click on the process folder and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double-click to open the file. Replace the content with the following: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/v4/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" dg-contextmenu = \"contextMenuContent\" > < div > < fd-message-page glyph = \"sap-icon--approvals\" > < fd-message-page-title > Approve Time Entry Request < fd-message-page-subtitle > < fd-scrollbar class = \"dg-full-height\" > < fd-fieldset class = \"fd-margin--md\" ng-form = \"formFieldset\" > < fd-form-group name = \"entityForm\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idProject\" dg-colon = \"true\" > Project < fd-form-input-message-group > < fd-input id = \"idProject\" name = \"Project\" ng-model = \"entity.Project\" type = \"input\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idStart\" dg-colon = \"true\" > Start < fd-form-input-message-group > < fd-input id = \"idStart\" name = \"Start\" ng-model = \"entity.Start\" type = \"date\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idEnd\" dg-colon =
\"true\" > End < fd-form-input-message-group > < fd-input id = \"idEnd\" name = \"End\" ng-model = \"entity.End\" type = \"date\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idHours\" dg-colon = \"true\" > Hours < fd-form-input-message-group > < fd-input id = \"idHours\" name = \"Hours\" ng-model = \"entity.Hours\" type = \"number\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-message-page-actions > < fd-button class = \"fd-margin-end--tiny fd-dialog__decisive-button\" compact = \"true\" dg-type = \"emphasized\" dg-label = \"Approve\" ng-click = \"approve()\" > < fd-button class = \"fd-dialog__decisive-button\" compact = \"true\" dg-type = \"negative\" dg-label = \"Reject\" ng-click = \"reject()\" > Right click on the process folder and select New \u2192 File . Enter controller.js for the name of the file. Double-click to open the file. Replace the content with the following: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" ]) . controller ( 'PageController' , [ '$scope' , '$http' , '$location' , function ( $scope , $http , $location ) { let data = JSON . parse ( atob ( window . location . search . split ( \"=\" )[ 1 ])); $scope . executionId = data . executionId ; $scope . user = data . User ; $scope . entity = { Project : data . Project , Start : new Date ( data . Start ), End : new Date ( data . End ), Hours : parseInt ( data . Hours ) }; $scope . approve = function () { $http . post ( \"/services/v4/js/sample-bpm/api/process.js/continue/\" + $scope . executionId , JSON . stringify ( { user : $scope . user , approved : true } )). then ( function ( response ) { if ( response . status != 202 ) { alert ( `Unable to approve Time Entry Request: ' ${ response . message } '` ); return ; } $scope . entity = {}; alert ( \"Time Entry Request Approved\" ); }); }; $scope . reject = function () { $http . 
post ( \"/services/v4/js/sample-bpm/api/process.js/continue/\" + $scope . executionId , JSON . stringify ( { user : $scope . user , approved : false } )). then ( function ( response ) { if ( response . status != 202 ) { alert ( `Unable to reject Time Entry Request: ' ${ response . message } '` ); return ; } $scope . entity = {}; alert ( \"Time Entry Request Rejected\" ); }); }; }]); (Optional) Email Configuration In order to recieve email notifications about the process steps a mail configuration should be provided. The following environment variables are needed: DIRIGIBLE_MAIL_USERNAME= DIRIGIBLE_MAIL_PASSWORD= DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL=smtps DIRIGIBLE_MAIL_SMTPS_HOST= DIRIGIBLE_MAIL_SMTPS_PORT=465 APP_SAMPLE_BPM_FROM_EMAIL= APP_SAMPLE_BPM_TO_EMAIL= Connecting Eclipse Dirigible with SendGrid SMTP Relay To use a gmail account for the mail configuration follow the steps in the Connecting Eclipse Dirigible with SendGrid SMTP Relay blog. DIRIGIBLE_MAIL_USERNAME=apikey DIRIGIBLE_MAIL_PASSWORD= DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL=smtps DIRIGIBLE_MAIL_SMTPS_HOST=smtp.sendgrid.net DIRIGIBLE_MAIL_SMTPS_PORT=465 APP_SAMPLE_BPM_FROM_EMAIL= APP_SAMPLE_BPM_TO_EMAIL= Demo Navigate to http://localhost:8080/services/v4/web/sample-bpm/submit/ to open the Submit form . Enter the required data and press the Submit button. If email configuration was provided an email notification will be send to the email address set by the APP_SAMPLE_BPM_TO_EMAIL= environment variable. If email configuration wasn't provided then in the Console view the following message can be found: Approve Request URL: http://localhost:8080/services/v4/web/sample-bpm/process/?data=eyJleGVjdXRpb25JZCI6IjE4Ni... Open the URL from the Console view or open it from the email notification. The Process form would be prefilled with the data that was entered in the Submit form . Press the Approve or Reject button to resume the process execution. 
One more email notification would be sent and a message would be logged in the Console as part of the last step of the Business Process . BPM Sample GitHub Repository Go to https://github.com/dirigiblelabs/sample-bpm to find the complete sample. The repository can be cloned in the Git perspective and after a few minutes the BPM Sample would be active.","title":"BPMN Process"},{"location":"tutorials/modeling/bpmn-process/#bpmn-process","text":"This tutorial will guide you through the steps of creating a Business Process with Service Task , User Task and Choice Gateway elements. The result of the business process modeling would be a Time Entry Request process that, once started, would trigger an approval process (with mail notifications, if configured) with the following steps:","title":"BPMN Process"},{"location":"tutorials/modeling/bpmn-process/#steps","text":"","title":"Steps"},{"location":"tutorials/modeling/bpmn-process/#start-eclipse-dirigible","text":"Info You can find more information on how to do that by following: Getting Started section. Setup section.","title":"Start Eclipse Dirigible"},{"location":"tutorials/modeling/bpmn-process/#create-project","text":"Go to the Projects perspective and create New Project . Enter sample-bpm for the name of the project. The project will appear under the projects list.","title":"Create Project"},{"location":"tutorials/modeling/bpmn-process/#create-javascript-process-task-handlers","text":"JavaScript handlers should be provided for the Service Task steps in the Business Process . The following handlers will be executed during the Approve Time Entry Request , Reject Time Entry Request and Send Notification tasks. Right click on the sample-bpm project and select New \u2192 Folder . Enter tasks for the name of the folder. Create approve-request.js , reject-request.js and send-notification.js files.
approve-request.js reject-request.js send-notification.js Right click on the tasks folder and select New \u2192 JavaScript CJS Service . Enter approve-request.js for the name of the file. Double-click to open the file. Replace the content with the following: const process = require ( \"bpm/v4/process\" ); const mailClient = require ( \"mail/v4/client\" ); const config = require ( \"core/v4/configurations\" ); let execution = process . getExecutionContext (); let executionId = execution . getId (); let user = process . getVariable ( executionId , \"user\" ); console . log ( `Time Entry Request Approved for User [ ${ user } ]` ); if ( isMailConfigured ()) { let from = config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ); let to = config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ); let subject = \"Time Entry Request - Approved\" ; let content = `

Status:

Time Entry Request for [ ${ user } ] - Approved` ; let subType = \"html\" ; mailClient . send ( from , to , subject , content , subType ); } else { console . error ( \"Missing mail configuration\" ); } function isMailConfigured () { return config . get ( \"DIRIGIBLE_MAIL_USERNAME\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_PASSWORD\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_HOST\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_PORT\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ) != \"\" } Save the changes. Right click on the tasks folder and select New \u2192 JavaScript CJS Service . Enter reject-request.js for the name of the file. Double-click to open the file. Replace the content with the following: const process = require ( \"bpm/v4/process\" ); const mailClient = require ( \"mail/v4/client\" ); const config = require ( \"core/v4/configurations\" ); let execution = process . getExecutionContext (); let executionId = execution . getId (); let user = process . getVariable ( executionId , \"user\" ); console . error ( `Time Entry Request Rejected for User [ ${ user } ]` ); if ( isMailConfigured ()) { let from = config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ); let to = config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ); let subject = \"Time Entry Request - Rejected\" ; let content = `

Status:

Time Entry Request for [ ${ user } ] - Rejected

` ; let subType = \"html\" ; mailClient . send ( from , to , subject , content , subType ); } else { console . error ( \"Missing mail configuration\" ); } function isMailConfigured () { return config . get ( \"DIRIGIBLE_MAIL_USERNAME\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_PASSWORD\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_HOST\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_PORT\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ) != \"\" } Save the changes. Right click on the tasks folder and select New \u2192 JavaScript CJS Service . Enter send-notification.js for the name of the file. Double-click to open the file. Replace the content with the following: const process = require ( \"bpm/v4/process\" ); const base64 = require ( \"utils/v4/base64\" ); const mailClient = require ( \"mail/v4/client\" ); const config = require ( \"core/v4/configurations\" ); let execution = process . getExecutionContext (); let executionId = execution . getId (); let data = { executionId : executionId , User : process . getVariable ( executionId , \"User\" ), Project : process . getVariable ( executionId , \"Project\" ), Start : process . getVariable ( executionId , \"Start\" ), End : process . getVariable ( executionId , \"End\" ), Hours : process . getVariable ( executionId , \"Hours\" ) }; let urlEncodedData = base64 . encode ( JSON . stringify ( data )); let url = `http://localhost:8080/services/v4/web/sample-bpm/process/?data= ${ urlEncodedData } ` ; console . log ( `Approve Request URL: ${ url } ` ); if ( isMailConfigured ()) { let from = config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ); let to = config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ); let subject = \"Time Entry Request - Pending\" ; let content = `

Status:

Time Entry Request for [ ${ data . User } ] - Pending

Click here to process request.` ; let subType = \"html\" ; mailClient . send ( from , to , subject , content , subType ); } else { console . error ( \"Missing mail configuration\" ); } function isMailConfigured () { return config . get ( \"DIRIGIBLE_MAIL_USERNAME\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_PASSWORD\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_HOST\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_PORT\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ) != \"\" } Save the changes.","title":"Create JavaScript Process Task Handlers"},{"location":"tutorials/modeling/bpmn-process/#create-business-process-model","text":"Right click on the sample-bpm project and select New \u2192 Business Process Model . Enter time-entry-request.bpmn for the name of the business process. Manual Steps XML Content Double-click the time-entry-request.bpmn file to open it with the Flowable Editor . Click on the Process identifier field and change the value to time-entry-request . Click on the Name field and change the value to Time Entry Request . Click on the MyServiceTasks to select the first step of the business process. Click on the Name field and change the value to Send Notification . Scroll down to the Class field and click on it. Change the handler field to sample-bpm/tasks/send-notification.js . JavaScript Task Handler The value of the handler field (e.g. sample-bpm/tasks/send-notification.js ) points to the location of the JavaScript task handler created in the previous step. Delete the arrow coming out of the Send Notification step. Expand the Activities group and drag and drop a new User task to the editor area. Connect the Send Notification task and the newly created user task.
User Task Once the business process is triggered, it will stop at the Process Time Entry Request user task and wait for process continuation after the user task is completed. Select the user task. Click on the Name field and change the value to Process Time Entry Request . Create a Choice gateway coming out of the Process Time Entry Request user task. Expand the Activities group and drag and drop a new Service task to the editor area. Select the service task. Click on the Name field and change the value to Approve Time Entry Request . Scroll down to the Class field and click on it. Change the handler field to sample-bpm/tasks/approve-request.js . Expand the Activities group and drag and drop a new Service task to the editor area. Select the service task. Click on the Name field and change the value to Reject Time Entry Request . Scroll down to the Class field and click on it. Change the handler field to sample-bpm/tasks/reject-request.js . Connect the Choice gateway with the Approve Time Entry Request and Reject Time Entry Request steps. Select the connection between the Choice gateway and the Reject Time Entry Request step. Click on the Default flow checkbox. Select the connection between the Choice gateway and the Approve Time Entry Request step. Click on the Flow condition field and change the value to ${isRequestApproved} . Flow Condition In the flow condition isRequestApproved is a process context variable that will be set as part of the process continuation after the completion of the Process Time Entry Request user task. Connect the Approve Time Entry Request and Reject Time Entry Request steps with the end event. Save the changes. Right click on the time-entry-request.bpmn file and select Open With \u2192 Code Editor . Replace the content with the following: Save the changes. Business Process Synchronization Usually, when the *.bpmn process is saved, it takes between one and two minutes to be deployed and activated.
After that period of time the business process can be executed. The synchronization period by default is set to 50 seconds ( 0/50 * * * * ? ) . Find out more about the Job Expression environment variables. Updating the *.bpmn file triggers a new synchronization, and the updated process flow becomes available after a minute or two. Updating the JavaScript Task Handler doesn't require a new synchronization, and the new behaviour of the handlers is available on the fly.","title":"Create Business Process Model"},{"location":"tutorials/modeling/bpmn-process/#create-process-api","text":"To trigger and continue the BPMN Process execution, a server-side JavaScript API will be created. Right click on the sample-bpm project and select New \u2192 Folder . Enter api for the name of the folder. Create a process.js file. process.js Right click on the api folder and select New \u2192 JavaScript CJS Service . Enter process.js for the name of the file. Double-click to open the file. Replace the content with the following: const rs = require ( \"http/v4/rs\" ); const process = require ( \"bpm/v4/process\" ); const tasks = require ( \"bpm/v4/tasks\" ); const user = require ( \"security/v4/user\" ); rs . service () . post ( \"\" , ( ctx , request , response ) => { let data = request . getJSON (); process . start ( 'time-entry-request' , { \"User\" : \"\" + user . getName (), \"Project\" : \"\" + data . Project , \"Start\" : \"\" + data . Start , \"End\" : \"\" + data . End , \"Hours\" : \"\" + data . Hours }); response . setStatus ( response . ACCEPTED ); }) . resource ( \"continue/:executionId\" ) . post (( ctx , request , response ) => { let executionId = request . params . executionId ; let tasksList = tasks . list (); let data = request . getJSON (); for ( const task of tasksList ) { if ( task . executionId . toString () === executionId . toString ()) { tasks . completeTask ( task . id , { isRequestApproved : data . approved , user : data .
user }); break ; } } response . setStatus ( response . ACCEPTED ); }) . execute ()","title":"Create Process API"},{"location":"tutorials/modeling/bpmn-process/#create-submit-form","text":"The submit form calls the server-side JavaScript API that was created in the previous step and triggers the business process. Right click on the sample-bpm project and select New \u2192 Folder . Enter submit for the name of the folder. Create index.html and controller.js files. index.html controller.js Right click on the submit folder and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double-click to open the file. Replace the content with the following: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/v4/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" dg-contextmenu = \"contextMenuContent\" > < div > < fd-message-page glyph = \"sap-icon--time-entry-request\" > < fd-message-page-title > Submit Time Entry Request < fd-message-page-subtitle > < fd-scrollbar class = \"dg-full-height\" > < fd-fieldset class = \"fd-margin--md\" ng-form = \"formFieldset\" > < fd-form-group name = \"entityForm\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idProject\" dg-required = \"true\" dg-colon = \"true\" > Project < fd-combobox-input id = \"idProject\" name = \"Project\" state = \"{{
formErrors.Project ? 'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['Project'].$valid, 'Project')\" ng-model = \"entity.Project\" dropdown-items = \"optionsProject\" dg-placeholder = \"Search Project ...\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idStart\" dg-required = \"true\" dg-colon = \"true\" > Start < fd-form-input-message-group dg-inactive = \"{{ formErrors.Start ? false : true }}\" > < fd-input id = \"idStart\" name = \"Start\" state = \"{{ formErrors.Start ? 'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['Start'].$valid, 'Start')\" ng-model = \"entity.Start\" type = \"date\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idEnd\" dg-required = \"true\" dg-colon = \"true\" > End < fd-form-input-message-group dg-inactive = \"{{ formErrors.End ? false : true }}\" > < fd-input id = \"idEnd\" name = \"End\" state = \"{{ formErrors.End ? 'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['End'].$valid, 'End')\" ng-model = \"entity.End\" type = \"date\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idHours\" dg-required = \"true\" dg-colon = \"true\" > Hours < fd-form-input-message-group dg-inactive = \"{{ formErrors.Hours ? false : true }}\" > < fd-input id = \"idHours\" name = \"Hours\" state = \"{{ formErrors.Hours ? 
'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['Hours'].$valid, 'Hours')\" ng-model = \"entity.Hours\" min = \"0\" max = \"40\" dg-input-rules = \"{ patterns: [''] }\" type = \"number\" placeholder = \"Enter Hours\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-message-page-actions > < fd-button class = \"fd-margin-end--tiny fd-dialog__decisive-button\" compact = \"true\" dg-type = \"emphasized\" dg-label = \"Submit\" ng-click = \"submit()\" state = \"{{ !isFormValid ? 'disabled' : '' }}\" > < fd-button class = \"fd-dialog__decisive-button\" compact = \"true\" dg-type = \"transparent\" dg-label = \"Cancel\" ng-click = \"resetForm()\" > Right click on the api folder and select New \u2192 File . Enter controller.js for the name of the file. Double-click to open the file. Replace the content with the following: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" ]) . controller ( 'PageController' , [ '$scope' , '$http' , function ( $scope , $http ) { $scope . entity = {}; $scope . optionsProject = [{ text : \"Project Alpha\" , value : \"Project Alpha\" }, { text : \"Project Beta\" , value : \"Project Beta\" }, { text : \"Project Evolution\" , value : \"Project Evolution\" }, { text : \"Project Next\" , value : \"Project Next\" }]; $scope . isValid = function ( isValid , property ) { $scope . formErrors [ property ] = ! isValid ? true : undefined ; for ( let next in $scope . formErrors ) { if ( $scope . formErrors [ next ] === true ) { $scope . isFormValid = false ; return ; } } $scope . isFormValid = true ; }; $scope . submit = function () { $http . post ( \"/services/v4/js/sample-bpm/api/process.js\" , JSON . stringify ( $scope . entity )). then ( function ( response ) { if ( response . status != 202 ) { alert ( `Unable to submit Time Entry Request: ' ${ response . message } '` ); $scope . resetForm (); return ; } alert ( \"Time Entry Request successfully submitted\" ); $scope . resetForm (); }); }; $scope . 
resetForm = function () { $scope . entity = {}; $scope . formErrors = { Project : true , Start : true , End : true , Hours : true , }; }; $scope . resetForm (); }]);","title":"Create Submit Form"},{"location":"tutorials/modeling/bpmn-process/#create-process-form","text":"The process form calls the server-side JavaScript API that was created before and resumes the business process execution. Right click on the sample-bpm project and select New \u2192 Folder . Enter process for the name of the folder. Create index.html and controller.js files. index.html controller.js Right click on the process folder and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double-click to open the file. Replace the content with the following: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/v4/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" dg-contextmenu = \"contextMenuContent\" > < div > < fd-message-page glyph = \"sap-icon--approvals\" > < fd-message-page-title > Approve Time Entry Request < fd-message-page-subtitle > < fd-scrollbar class = \"dg-full-height\" > < fd-fieldset class = \"fd-margin--md\" ng-form = \"formFieldset\" > < fd-form-group name = \"entityForm\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idProject\" dg-colon = \"true\" > Project < fd-form-input-message-group > <
fd-input id = \"idProject\" name = \"Project\" ng-model = \"entity.Project\" type = \"input\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idStart\" dg-colon = \"true\" > Start < fd-form-input-message-group > < fd-input id = \"idStart\" name = \"Start\" ng-model = \"entity.Start\" type = \"date\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idEnd\" dg-colon = \"true\" > End < fd-form-input-message-group > < fd-input id = \"idEnd\" name = \"End\" ng-model = \"entity.End\" type = \"date\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idHours\" dg-colon = \"true\" > Hours < fd-form-input-message-group > < fd-input id = \"idHours\" name = \"Hours\" ng-model = \"entity.Hours\" type = \"number\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-message-page-actions > < fd-button class = \"fd-margin-end--tiny fd-dialog__decisive-button\" compact = \"true\" dg-type = \"emphasized\" dg-label = \"Approve\" ng-click = \"approve()\" > < fd-button class = \"fd-dialog__decisive-button\" compact = \"true\" dg-type = \"negative\" dg-label = \"Reject\" ng-click = \"reject()\" > Right click on the process folder and select New \u2192 File . Enter controller.js for the name of the file. Double-click to open the file. Replace the content with the following: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" ]) . controller ( 'PageController' , [ '$scope' , '$http' , '$location' , function ( $scope , $http , $location ) { let data = JSON . parse ( atob ( window . location . search . split ( \"=\" )[ 1 ])); $scope . executionId = data . executionId ; $scope . user = data . User ; $scope . entity = { Project : data . Project , Start : new Date ( data . 
Start ), End : new Date ( data . End ), Hours : parseInt ( data . Hours ) }; $scope . approve = function () { $http . post ( \"/services/v4/js/sample-bpm/api/process.js/continue/\" + $scope . executionId , JSON . stringify ( { user : $scope . user , approved : true } )). then ( function ( response ) { if ( response . status != 202 ) { alert ( `Unable to approve Time Entry Request: ' ${ response . message } '` ); return ; } $scope . entity = {}; alert ( \"Time Entry Request Approved\" ); }); }; $scope . reject = function () { $http . post ( \"/services/v4/js/sample-bpm/api/process.js/continue/\" + $scope . executionId , JSON . stringify ( { user : $scope . user , approved : false } )). then ( function ( response ) { if ( response . status != 202 ) { alert ( `Unable to reject Time Entry Request: ' ${ response . message } '` ); return ; } $scope . entity = {}; alert ( \"Time Entry Request Rejected\" ); }); }; }]);","title":"Create Process Form"},{"location":"tutorials/modeling/bpmn-process/#optional-email-configuration","text":"In order to receive email notifications about the process steps, a mail configuration should be provided. The following environment variables are needed: DIRIGIBLE_MAIL_USERNAME= DIRIGIBLE_MAIL_PASSWORD= DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL=smtps DIRIGIBLE_MAIL_SMTPS_HOST= DIRIGIBLE_MAIL_SMTPS_PORT=465 APP_SAMPLE_BPM_FROM_EMAIL= APP_SAMPLE_BPM_TO_EMAIL= Connecting Eclipse Dirigible with SendGrid SMTP Relay To use a SendGrid account for the mail configuration, follow the steps in the Connecting Eclipse Dirigible with SendGrid SMTP Relay blog.
DIRIGIBLE_MAIL_USERNAME=apikey DIRIGIBLE_MAIL_PASSWORD= DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL=smtps DIRIGIBLE_MAIL_SMTPS_HOST=smtp.sendgrid.net DIRIGIBLE_MAIL_SMTPS_PORT=465 APP_SAMPLE_BPM_FROM_EMAIL= APP_SAMPLE_BPM_TO_EMAIL=","title":"(Optional) Email Configuration"},{"location":"tutorials/modeling/bpmn-process/#demo","text":"Navigate to http://localhost:8080/services/v4/web/sample-bpm/submit/ to open the Submit form . Enter the required data and press the Submit button. If email configuration was provided, an email notification will be sent to the email address set by the APP_SAMPLE_BPM_TO_EMAIL environment variable. If email configuration wasn't provided, then in the Console view the following message can be found: Approve Request URL: http://localhost:8080/services/v4/web/sample-bpm/process/?data=eyJleGVjdXRpb25JZCI6IjE4Ni... Open the URL from the Console view or open it from the email notification. The Process form will be prefilled with the data that was entered in the Submit form . Press the Approve or Reject button to resume the process execution. One more email notification will be sent, and a message will be logged in the Console as part of the last step of the Business Process . BPM Sample GitHub Repository Go to https://github.com/dirigiblelabs/sample-bpm to find the complete sample. The repository can be cloned in the Git perspective, and after a few minutes the BPM Sample will be active.","title":"Demo"},{"location":"tutorials/modeling/generate-application-from-datasource/","text":"Generate Application from Datasource This tutorial will guide you through the creation of an Entity Data Model (EDM) and the generation of a full-stack Dirigible application from a datasource. We will be using MySQL for that purpose, but Eclipse Dirigible supports other databases as well.
Prerequisites Access to the latest version of Eclipse Dirigible (10.2.7+) Docker Image setup (follow the steps below) Steps Pull the Docker Image Pull the official Eclipse Dirigible Docker Image to your local environment. docker pull dirigiblelabs/dirigible:latest Run the Docker Image Run with Environment Configurations Launch the Docker Image using the following command: docker run --name dirigible --rm -p 8080:8080 -p 8081:8081 dirigiblelabs/dirigible:latest Optional If you want to use environment variables to automatically import your datasource, prepare the following file: my_env.list DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=TUTORIAL TUTORIAL_DRIVER=com.mysql.cj.jdbc.Driver TUTORIAL_URL=jdbc:mysql://host.docker.internal/my_db TUTORIAL_USERNAME=*my_username* TUTORIAL_PASSWORD=*my_pass* Launch the Docker Image using the following command: docker run --env-file ./my_env.list --name dirigible --rm -p 8080:8080 -p 8081:8081 dirigiblelabs/dirigible:latest Add the Data Source There are several ways to add a datasource ( via the Web IDE , via Environment Variables , via *.datasource file ) : via the Web IDE via Environment Variables via *.datasource file Note Note that this method may result in the loss of the datasource upon restart. Navigate to the Database perspective In the bottom right corner select the + sign and input the information for your datasource Test your connection with a simple query Set the following environment variables: DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES= _DRIVER=com.mysql.cj.jdbc.Driver _URL=jdbc:mysql://host.docker.internal/my_db _USERNAME=*my_username* _PASSWORD=*my_pass* Note In the previous section the steps were explained in more detail.
Create a *.datasource file in your application with the following content: { \"location\" : \"///.datasource\" , \"name\" : \"\" , \"driver\" : \"com.mysql.cj.jdbc.Driver\" , \"url\" : \"jdbc:mysql://${MY_DB_HOST}:${MY_DB_PORT}/${MY_DB_NAME}\" , \"username\" : \"${MY_DB_USER}\" , \"password\" : \"${MY_DB_PASS}\" } Note Replace the following placeholders: ///.datasource with the location of the datasource file in the application. with the name of the datasource. Set the following environment variables: - MY_DB_HOST - the database host. - MY_DB_PORT - the database port. - MY_DB_NAME - the database name. - MY_DB_USER - the database user. - MY_DB_PASS - the database password. Application Generation Steps Once the datasource is added, proceed with the following steps to generate the application: Right-click the database you want and select Export Schema as Model . Navigate to the Workbench perspective and you should see a project and a *.model file created from your database. Right click the *.model file: Click on the Generate option. From the Generate from template pop-up select Application - Full Stack . Input additional information for your application Note The Data Source field is where your records are going to be saved. For this tutorial we want to use our imported datasource TUTORIAL , but if you want you can use the Eclipse Dirigible H2 datasource (by default named DefaultDB ) . In the TUTORIAL project a couple of files will be generated - this is our application. Right click the project and publish your application using the Publish button. Navigate to the gen folder in the TUTORIAL project, select the index.html and in the Preview section below you can fetch your link and start using your application: Conclusion By following the steps outlined above, you can seamlessly generate an application in Eclipse Dirigible using a datasource.
Make sure to set up the datasource correctly and choose the appropriate method based on your requirements.","title":"Generate Application from Datasource"},{"location":"tutorials/modeling/generate-application-from-datasource/#generate-application-from-datasource","text":"This tutorial will guide you through the creation of an Entity Data Model (EDM) and the generation of a full-stack Dirigible application from a datasource. We will be using MySQL for that purpose, but Eclipse Dirigible supports other databases as well. Prerequisites Access to the latest version of Eclipse Dirigible (10.2.7+) Docker Image setup (follow the steps below)","title":"Generate Application from Datasource"},{"location":"tutorials/modeling/generate-application-from-datasource/#steps","text":"","title":"Steps"},{"location":"tutorials/modeling/generate-application-from-datasource/#pull-the-docker-image","text":"Pull the official Eclipse Dirigible Docker Image to your local environment. docker pull dirigiblelabs/dirigible:latest","title":"Pull the Docker Image"},{"location":"tutorials/modeling/generate-application-from-datasource/#run-the-docker-image","text":"Run with Environment Configurations Launch the Docker Image using the following command: docker run --name dirigible --rm -p 8080:8080 -p 8081:8081 dirigiblelabs/dirigible:latest Optional If you want to use environment variables to automatically import your datasource, prepare the following file: my_env.list DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=TUTORIAL TUTORIAL_DRIVER=com.mysql.cj.jdbc.Driver TUTORIAL_URL=jdbc:mysql://host.docker.internal/my_db TUTORIAL_USERNAME=*my_username* TUTORIAL_PASSWORD=*my_pass* Launch the Docker Image using the following command: docker run --env-file ./my_env.list --name dirigible --rm -p 8080:8080 -p 8081:8081 dirigiblelabs/dirigible:latest","title":"Run the Docker Image"},{"location":"tutorials/modeling/generate-application-from-datasource/#add-the-data-source","text":"There are several ways to add a
datasource ( via the Web IDE , via Environment Variables , via *.datasource file ) : via the Web IDE via Environment Variables via *.datasource file Note Note that this method may result in the loss of the datasource upon restart. Navigate to the Database perspective In the bottom right corner select the + sign and input the information for your datasource Test your connection with a simple query Set the following environment variables: DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES= _DRIVER=com.mysql.cj.jdbc.Driver _URL=jdbc:mysql://host.docker.internal/my_db _USERNAME=*my_username* _PASSWORD=*my_pass* Note In the previous section the steps were explained in more detail. Create a *.datasource file in your application with the following content: { \"location\" : \"///.datasource\" , \"name\" : \"\" , \"driver\" : \"com.mysql.cj.jdbc.Driver\" , \"url\" : \"jdbc:mysql://${MY_DB_HOST}:${MY_DB_PORT}/${MY_DB_NAME}\" , \"username\" : \"${MY_DB_USER}\" , \"password\" : \"${MY_DB_PASS}\" } Note Replace the following placeholders: ///.datasource with the location of the datasource file in the application. with the name of the datasource. Set the following environment variables: - MY_DB_HOST - the database host. - MY_DB_PORT - the database port. - MY_DB_NAME - the database name. - MY_DB_USER - the database user. - MY_DB_PASS - the database password.","title":"Add the Data Source"},{"location":"tutorials/modeling/generate-application-from-datasource/#application-generation-steps","text":"Once the datasource is added, proceed with the following steps to generate the application: Right-click the database you want and select Export Schema as Model . Navigate to the Workbench perspective and you should see a project and a *.model file created from your database. Right click the *.model file: Click on the Generate option. From the Generate from template pop-up select Application - Full Stack .
Input additional information for your application Note The Data Source field is where your records are going to be saved. For this tutorial we want to use our imported datasource TUTORIAL , but if you want you can use the Eclipse Dirigible H2 datasource (by default named DefaultDB ) . In the TUTORIAL project a couple of files will be generated - this is our application. Right click the project and publish your application using the Publish button. Navigate to the gen folder in the TUTORIAL project, select the index.html and in the Preview section below you can fetch your link and start using your application:","title":"Application Generation Steps"},{"location":"tutorials/modeling/generate-application-from-datasource/#conclusion","text":"By following the steps outlined above, you can seamlessly generate an application in Eclipse Dirigible using a datasource. Make sure to set up the datasource correctly and choose the appropriate method based on your requirements.","title":"Conclusion"},{"location":"tutorials/modeling/generate-application-from-model/","text":"Generate Application from Model This tutorial will guide you through the creation of an entity data model and the generation of a full-stack Dirigible application from this model. Prerequisites Access to the latest version of Eclipse Dirigible (3.2.2+) Overview In this tutorial we will create an entity model of car service bookings and generate a full-stack Dirigible application from it. The complete sample can be found here .
Setup Car Service Bookings Create new project car-service-bookings Right click -> New -> Entity Data Model Rename file.edm to car-service-bookings.edm Open car-service-bookings.edm Brands Drag and drop new entity Name it Brands Rename entityId to Id Drag and drop new property Rename property2 to Name Open the properties of the Brands entity Open the General tab Set the Type to Primary Entity Switch to the User Interface tab Set the Layout Type to Manage Master Entities Models Drag and drop new entity Name it Models Rename entityId to Id Drag and drop new property Rename property2 to Name Add new relation between Models and Brands Rename the relation property in the Models entity to BrandId Open the relation properties Set Name to Brand Set Relationship Type to Composition Set Relationship Cardinality to one-to-many Open the properties of the BrandId property Switch to the User Interface tab Set Is Major to Show in form only Open the properties of the Models entity Open the General tab Set the Type to Dependent Entity Switch to the User Interface tab Set the Layout Type to Manage Details Entities Cars Drag and drop new entity Name it Cars Rename entityId to Id Drag and drop new property Rename property2 to PlateNumber Add new relation between Cars and Models Rename the relation property in the Cars entity to ModelId Open the properties of the ModelId property Open the Data tab Set the Data Type to INTEGER Switch to the User Interface tab Set Widget Type to Dropdown Set Label to Model Set Dropdown Key to Id Set Dropdown Value to Name > Note : the dropdown key and value refer respectively to the Models:Id and Models:Name values Generation Save the model Right click on car-service-bookings.model and select Generate Set Template to Full-stack Application (AngularJS) Set Extension to car-service Check Embedded mode Set Title to Car Service Set Brand to Car Service Click Generate Publish the project Extensibility Sample view based extension can be found here Wrap up The whole
application can be found here Resources Sample Car Service Bookings: sample-v3-car-service-bookings Sample Data: sample-v3-car-service-bookings-data Sample Extension: sample-v3-car-service-bookings-extension","title":"Generate Application from Model"},{"location":"tutorials/modeling/generate-application-from-model/#generate-application-from-model","text":"This tutorial will guide you through the creation of an entity data model and the generation of a full-stack Dirigible application from this model.","title":"Generate Application from Model"},{"location":"tutorials/modeling/generate-application-from-model/#prerequisites","text":"Access to the latest version of Eclipse Dirigible (3.2.2+)","title":"Prerequisites"},{"location":"tutorials/modeling/generate-application-from-model/#overview","text":"In this tutorial we will create an entity model of car service bookings and generate a full-stack Dirigible application from it. The complete sample can be found here .","title":"Overview"},{"location":"tutorials/modeling/generate-application-from-model/#setup-car-service-bookings","text":"Create new project car-service-bookings Right click -> New -> Entity Data Model Rename file.edm to car-service-bookings.edm Open car-service-bookings.edm","title":"Setup Car Service Bookings"},{"location":"tutorials/modeling/generate-application-from-model/#brands","text":"Drag and drop new entity Name it Brands Rename entityId to Id Drag and drop new property Rename property2 to Name Open the properties of the Brands entity Open the General tab Set the Type to Primary Entity Switch to the User Interface tab Set the Layout Type to Manage Master Entities","title":"Brands"},{"location":"tutorials/modeling/generate-application-from-model/#models","text":"Drag and drop new entity Name it Models Rename entityId to Id Drag and drop new property Rename property2 to Name Add new relation between Models and Brands Rename the relation property in the Models entity to BrandId Open the relation properties
Set Name to Brand Set Relationship Type to Composition Set Relationship Cardinality to one-to-many Open the properties of the BrandId property Switch to the User Interface tab Set Is Major to Show in form only Open the properties of the Models entity Open the General tab Set the Type to Dependent Entity Switch to the User Interface tab Set the Layout Type to Manage Details Entities","title":"Models"},{"location":"tutorials/modeling/generate-application-from-model/#cars","text":"Drag and drop new entity Name it Cars Rename entityId to Id Drag and drop new property Rename property2 to PlateNumber Add new relation between Cars and Models Rename the relation property in the Cars entity to ModelId Open the properties of the ModelId property Open the Data tab Set the Data Type to INTEGER Switch to the User Interface tab Set Widget Type to Dropdown Set Label to Model Set Dropdown Key to Id Set Dropdown Value to Name > Note : the dropdown key and value refer respectively to the Models:Id and Models:Name values","title":"Cars"},{"location":"tutorials/modeling/generate-application-from-model/#generation","text":"Save the model Right click on car-service-bookings.model and select Generate Set Template to Full-stack Application (AngularJS) Set Extension to car-service Check Embedded mode Set Title to Car Service Set Brand to Car Service Click Generate Publish the project","title":"Generation"},{"location":"tutorials/modeling/generate-application-from-model/#extensibility","text":"Sample view based extension can be found here","title":"Extensibility"},{"location":"tutorials/modeling/generate-application-from-model/#wrap-up","text":"The whole application can be found here","title":"Wrap up"},{"location":"tutorials/modeling/generate-application-from-model/#resources","text":"Sample Car Service Bookings: sample-v3-car-service-bookings Sample Data: sample-v3-car-service-bookings-data Sample Extension: sample-v3-car-service-bookings-extension","title":"Resources"}]} \ No newline at end
of file +{"config":{"indexing":"full","lang":["en"],"min_search_length":3,"prebuild_index":false,"separator":"[\\s\\-]+"},"docs":[{"location":"","text":"Welcome! How can we help? Build your first service with Eclipse Dirigible. Get Started Explore different setup options. Setup Understand the nuts and bolts. Architecture More helpful information Read about our vision and opinions on cloud development and Eclipse Dirigible. Blogs Read essential definitions. Concepts Review major features. Features Learn more about Enterprise JavaScript API availability, versions, and status. API Find out what, why, and how. FAQ Learn how to contribute. Community","title":"Welcome"},{"location":"#welcome-how-can-we-help","text":"Build your first service with Eclipse Dirigible. Get Started Explore different setup options. Setup Understand the nuts and bolts. Architecture","title":"Welcome! How can we help?"},{"location":"#more-helpful-information","text":"Read about our vision and opinions on cloud development and Eclipse Dirigible. Blogs Read essential definitions. Concepts Review major features. Features Learn more about Enterprise JavaScript API availability, versions, and status. API Find out what, why, and how. FAQ Learn how to contribute. Community","title":"More helpful information"},{"location":"community/","text":"Community Welcome to the community page for contributors! Here you will find resources to help you create even better documentation for Dirigible. Contributor Guide Eclipse Dirigible is an open source project, which means that you can propose contributions by sending pull requests through GitHub . Before you get started, here are some prerequisites that you need to complete: Legal considerations Please read the Eclipse Foundation policy on accepting contributions via Git . Please read the Code of Conduct . Your contribution cannot be accepted unless you have an Eclipse Contributor Agreement in place. 
Here is the checklist for contributions to be acceptable: Create an account at Eclipse Add your GitHub user name in your account settings Log into the projects portal , look for \"Eclipse Contributor Agreement\" , and agree to the terms. Ensure that you sign off your Git commits with the same email address as your Eclipse Foundation profile. For more information see the Contributor Guide . Style Guide In this section we have outlined text styling options and the elements they should be used for. If everyone follows it, we will have visually consistent documentation . Bold How it looks as text: Bold Text How it looks in markdown: **Bold Text** Use it for: UI elements Navigation paths Monospace How it looks as text: Monospace Text How it looks in markdown: `Monospace Text` Use it for: File names and extensions Terms File paths Monospace/Bold How it looks as text: Monospace/Bold Text How it looks in markdown: **`Monospace/Bold Text`** Use it for: User input Headings How it looks: Use Heading 1 for the titles Heading 2 is for main topics Continue with Heading 3 and 4 where needed Structure your topic with no more than 3 heading levels (heading 2, 3 and 4) Blogs We'd welcome any contribution to our Blogs site as long as it conforms with our Legal considerations outlined above. Below we've provided more details about the organization of the Blogs site and the frontmatter that needs to be added so new blogs have the same look and feel as all the others. Add Your Blog to the Right Folder All blogs are organized in folders by year, month, and day of publishing. Hence, a blog written on November 19, 2020 is placed in the directory docs/2020/11/19/ : This also helps arrange the blogs by year of publishing. When publishing, add your blog to the right folder depending on the date. You can also create folders if needed. Include Markdown Frontmatter A big part of any blog's layout is controlled by its .md file frontmatter.
This is metadata about the .md file and is denoted by the triple dashes at the start and end of the block. Here's an example with the title of this Community page: --- title : Community --- Welcome to the community page for contributors! Here you will find resources to help you create even better documentation for Dirigible. ## Contributor Guide Set the title of the blog in the frontmatter: --- title : --- When the title is set in the frontmatter, use Heading 2 level ( ## This is a heading 2 ) as the highest heading level in your blog. Otherwise, the first Heading 1 you use will overwrite the title from the frontmatter and cause formatting issues. Set the author: --- title : author : --- Set your GitHub user: --- title : author : author_gh_user : --- Set reading time and publishing date: --- title : author : author_gh_user : read_time : publish_date : --- Providing all the metadata in the frontmatter as described will include: the title at the beginning of the page your GitHub avatar, your name, and a link to your GitHub profile in the author section reading time and publishing date in the details section Here's an example from one of our recent blogs: Happy Blogging! Join the Discussion Reach out to other contributors and join in the discussion around Dirigible here .","title":"Community"},{"location":"community/#community","text":"Welcome to the community page for contributors! Here you will find resources to help you create even better documentation for Dirigible.","title":"Community"},{"location":"community/#contributor-guide","text":"Eclipse Dirigible is an open source project, which means that you can propose contributions by sending pull requests through GitHub . Before you get started, here are some prerequisites that you need to complete:","title":"Contributor Guide"},{"location":"community/#legal-considerations","text":"Please read the Eclipse Foundation policy on accepting contributions via Git . Please read the Code of Conduct .
Your contribution cannot be accepted unless you have an Eclipse Contributor Agreement in place. Here is the checklist for contributions to be acceptable: Create an account at Eclipse Add your GitHub user name in your account settings Log into the projects portal , look for \"Eclipse Contributor Agreement\" , and agree to the terms. Ensure that you sign off your Git commits with the same email address as your Eclipse Foundation profile. For more information see the Contributor Guide .","title":"Legal considerations"},{"location":"community/#style-guide","text":"In this section we have outlined text styling options and the elements they should be used for. If everyone follows it, we will have visually consistent documentation . Bold How it looks as text: Bold Text How it looks in markdown: **Bold Text** Use it for: UI elements Navigation paths Monospace How it looks as text: Monospace Text How it looks in markdown: `Monospace Text` Use it for: File names and extensions Terms File paths Monospace/Bold How it looks as text: Monospace/Bold Text How it looks in markdown: **`Monospace/Bold Text`** Use it for: User input Headings How it looks: Use Heading 1 for the titles Heading 2 is for main topics Continue with Heading 3 and 4 where needed Structure your topic with no more than 3 heading levels (heading 2, 3 and 4)","title":"Style Guide"},{"location":"community/#blogs","text":"We'd welcome any contribution to our Blogs site as long as it conforms with our Legal considerations outlined above. Below we've provided more details about the organization of the Blogs site and the frontmatter that needs to be added so new blogs have the same look and feel as all the others.","title":"Blogs"},{"location":"community/#add-your-blog-to-the-right-folder","text":"All blogs are organized in folders by year, month, and day of publishing. Hence, a blog written on November 19, 2020 is placed in the directory docs/2020/11/19/ : This also helps arrange the blogs by year of publishing.
When publishing, add your blog to the right folder depending on the date. You can also create folders if needed.","title":"Add Your Blog to the Right Folder"},{"location":"community/#include-markdown-frontmatter","text":"A big part of any blog's layout is controlled by its .md file frontmatter. This is metadata about the .md file and is denoted by the triple dashes at the start and end of the block. Here's an example with the title of this Community page: --- title : Community --- Welcome to the community page for contributors! Here you will find resources to help you create even better documentation for Dirigible. ## Contributor Guide Set the title of the blog in the frontmatter: --- title : --- When the title is set in the frontmatter, use Heading 2 level ( ## This is a heading 2 ) as the highest heading level in your blog. Otherwise, the first Heading 1 you use will overwrite the title from the frontmatter and cause formatting issues. Set the author: --- title : author : --- Set your GitHub user: --- title : author : author_gh_user : --- Set reading time and publishing date: --- title : author : author_gh_user : read_time : publish_date : --- Providing all the metadata in the frontmatter as described will include: the title at the beginning of the page your GitHub avatar, your name, and a link to your GitHub profile in the author section reading time and publishing date in the details section Here's an example from one of our recent blogs: Happy Blogging!","title":"Include Markdown Frontmatter"},{"location":"community/#join-the-discussion","text":"Reach out to other contributors and join in the discussion around Dirigible here .","title":"Join the Discussion"},{"location":"developer-resources/cheat-sheet/","text":"Cheat Sheet Clean Up Database Go to the Database perspective. Switch to the local datasource type and select the SystemDB .
Execute the following queries: Delete Data Drop Tables DELETE FROM DIRIGIBLE_BPM ; DELETE FROM DIRIGIBLE_DATA_STRUCTURES ; DELETE FROM DIRIGIBLE_EXTENSIONS ; DELETE FROM DIRIGIBLE_EXTENSION_POINTS ; DELETE FROM DIRIGIBLE_IDENTITY ; DELETE FROM DIRIGIBLE_JOBS ; DELETE FROM DIRIGIBLE_LISTENERS ; DELETE FROM DIRIGIBLE_MIGRATIONS ; DELETE FROM DIRIGIBLE_ODATA ; DELETE FROM DIRIGIBLE_ODATA_CONTAINER ; DELETE FROM DIRIGIBLE_ODATA_MAPPING ; DELETE FROM DIRIGIBLE_ODATA_SCHEMA ; DELETE FROM DIRIGIBLE_ODATA_HANDLER ; DELETE FROM DIRIGIBLE_PUBLISH_LOGS ; DELETE FROM DIRIGIBLE_PUBLISH_REQUESTS ; DELETE FROM DIRIGIBLE_SECURITY_ACCESS ; DELETE FROM DIRIGIBLE_SECURITY_ROLES ; DELETE FROM DIRIGIBLE_SYNCHRONIZER_STATE ; DELETE FROM DIRIGIBLE_SYNCHRONIZER_STATE_ARTEFACTS ; DELETE FROM DIRIGIBLE_SYNCHRONIZER_STATE_LOG ; DELETE FROM DIRIGIBLE_WEBSOCKETS ; DELETE FROM QUARTZ_BLOB_TRIGGERS ; DELETE FROM QUARTZ_CALENDARS ; DELETE FROM QUARTZ_CRON_TRIGGERS ; DELETE FROM QUARTZ_FIRED_TRIGGERS ; DELETE FROM QUARTZ_LOCKS ; DELETE FROM QUARTZ_PAUSED_TRIGGER_GRPS ; DELETE FROM QUARTZ_SCHEDULER_STATE ; DELETE FROM QUARTZ_SIMPLE_TRIGGERS ; DELETE FROM QUARTZ_SIMPROP_TRIGGERS ; DELETE FROM QUARTZ_TRIGGERS ; DELETE FROM QUARTZ_JOB_DETAILS ; DROP TABLE DIRIGIBLE_BPM ; DROP TABLE DIRIGIBLE_DATA_STRUCTURES ; DROP TABLE DIRIGIBLE_EXTENSIONS ; DROP TABLE DIRIGIBLE_EXTENSION_POINTS ; DROP TABLE DIRIGIBLE_IDENTITY ; DROP TABLE DIRIGIBLE_JOBS ; DROP TABLE DIRIGIBLE_LISTENERS ; DROP TABLE DIRIGIBLE_MIGRATIONS ; DROP TABLE DIRIGIBLE_ODATA ; DROP TABLE DIRIGIBLE_ODATA_CONTAINER ; DROP TABLE DIRIGIBLE_ODATA_MAPPING ; DROP TABLE DIRIGIBLE_ODATA_SCHEMA ; DROP TABLE DIRIGIBLE_ODATA_HANDLER ; DROP TABLE DIRIGIBLE_PUBLISH_LOGS ; DROP TABLE DIRIGIBLE_PUBLISH_REQUESTS ; DROP TABLE DIRIGIBLE_SECURITY_ACCESS ; DROP TABLE DIRIGIBLE_SECURITY_ROLES ; DROP TABLE DIRIGIBLE_SYNCHRONIZER_STATE ; DROP TABLE DIRIGIBLE_SYNCHRONIZER_STATE_ARTEFACTS ; DROP TABLE DIRIGIBLE_SYNCHRONIZER_STATE_LOG ; DROP TABLE DIRIGIBLE_WEBSOCKETS ;
DROP TABLE QUARTZ_BLOB_TRIGGERS ; DROP TABLE QUARTZ_CALENDARS ; DROP TABLE QUARTZ_CRON_TRIGGERS ; DROP TABLE QUARTZ_FIRED_TRIGGERS ; DROP TABLE QUARTZ_LOCKS ; DROP TABLE QUARTZ_PAUSED_TRIGGER_GRPS ; DROP TABLE QUARTZ_SCHEDULER_STATE ; DROP TABLE QUARTZ_SIMPLE_TRIGGERS ; DROP TABLE QUARTZ_SIMPROP_TRIGGERS ; DROP TABLE QUARTZ_TRIGGERS ; DROP TABLE QUARTZ_JOB_DETAILS ;","title":"Cheat Sheet"},{"location":"developer-resources/cheat-sheet/#cheat-sheet","text":"","title":"Cheat Sheet"},{"location":"developer-resources/cheat-sheet/#clean-up-database","text":"Go to the Database perspective. Switch to the local datasource type and select the SystemDB . Execute the following queries: Delete Data Drop Tables DELETE FROM DIRIGIBLE_BPM ; DELETE FROM DIRIGIBLE_DATA_STRUCTURES ; DELETE FROM DIRIGIBLE_EXTENSIONS ; DELETE FROM DIRIGIBLE_EXTENSION_POINTS ; DELETE FROM DIRIGIBLE_IDENTITY ; DELETE FROM DIRIGIBLE_JOBS ; DELETE FROM DIRIGIBLE_LISTENERS ; DELETE FROM DIRIGIBLE_MIGRATIONS ; DELETE FROM DIRIGIBLE_ODATA ; DELETE FROM DIRIGIBLE_ODATA_CONTAINER ; DELETE FROM DIRIGIBLE_ODATA_MAPPING ; DELETE FROM DIRIGIBLE_ODATA_SCHEMA ; DELETE FROM DIRIGIBLE_ODATA_HANDLER ; DELETE FROM DIRIGIBLE_PUBLISH_LOGS ; DELETE FROM DIRIGIBLE_PUBLISH_REQUESTS ; DELETE FROM DIRIGIBLE_SECURITY_ACCESS ; DELETE FROM DIRIGIBLE_SECURITY_ROLES ; DELETE FROM DIRIGIBLE_SYNCHRONIZER_STATE ; DELETE FROM DIRIGIBLE_SYNCHRONIZER_STATE_ARTEFACTS ; DELETE FROM DIRIGIBLE_SYNCHRONIZER_STATE_LOG ; DELETE FROM DIRIGIBLE_WEBSOCKETS ; DELETE FROM QUARTZ_BLOB_TRIGGERS ; DELETE FROM QUARTZ_CALENDARS ; DELETE FROM QUARTZ_CRON_TRIGGERS ; DELETE FROM QUARTZ_FIRED_TRIGGERS ; DELETE FROM QUARTZ_LOCKS ; DELETE FROM QUARTZ_PAUSED_TRIGGER_GRPS ; DELETE FROM QUARTZ_SCHEDULER_STATE ; DELETE FROM QUARTZ_SIMPLE_TRIGGERS ; DELETE FROM QUARTZ_SIMPROP_TRIGGERS ; DELETE FROM QUARTZ_TRIGGERS ; DELETE FROM QUARTZ_JOB_DETAILS ; DROP TABLE DIRIGIBLE_BPM ; DROP TABLE DIRIGIBLE_DATA_STRUCTURES ; DROP TABLE DIRIGIBLE_EXTENSIONS ; DROP TABLE
DIRIGIBLE_EXTENSION_POINTS ; DROP TABLE DIRIGIBLE_IDENTITY ; DROP TABLE DIRIGIBLE_JOBS ; DROP TABLE DIRIGIBLE_LISTENERS ; DROP TABLE DIRIGIBLE_MIGRATIONS ; DROP TABLE DIRIGIBLE_ODATA ; DROP TABLE DIRIGIBLE_ODATA_CONTAINER ; DROP TABLE DIRIGIBLE_ODATA_MAPPING ; DROP TABLE DIRIGIBLE_ODATA_SCHEMA ; DROP TABLE DIRIGIBLE_ODATA_HANDLER ; DROP TABLE DIRIGIBLE_PUBLISH_LOGS ; DROP TABLE DIRIGIBLE_PUBLISH_REQUESTS ; DROP TABLE DIRIGIBLE_SECURITY_ACCESS ; DROP TABLE DIRIGIBLE_SECURITY_ROLES ; DROP TABLE DIRIGIBLE_SYNCHRONIZER_STATE ; DROP TABLE DIRIGIBLE_SYNCHRONIZER_STATE_ARTEFACTS ; DROP TABLE DIRIGIBLE_SYNCHRONIZER_STATE_LOG ; DROP TABLE DIRIGIBLE_WEBSOCKETS ; DROP TABLE QUARTZ_BLOB_TRIGGERS ; DROP TABLE QUARTZ_CALENDARS ; DROP TABLE QUARTZ_CRON_TRIGGERS ; DROP TABLE QUARTZ_FIRED_TRIGGERS ; DROP TABLE QUARTZ_LOCKS ; DROP TABLE QUARTZ_PAUSED_TRIGGER_GRPS ; DROP TABLE QUARTZ_SCHEDULER_STATE ; DROP TABLE QUARTZ_SIMPLE_TRIGGERS ; DROP TABLE QUARTZ_SIMPROP_TRIGGERS ; DROP TABLE QUARTZ_TRIGGERS ; DROP TABLE QUARTZ_JOB_DETAILS ;","title":"Clean Up Database"},{"location":"developer-resources/java-remote-debugging/","text":"Java Remote Debugging Debugging To connect for remote Java debugging of Eclipse Dirigible, follow these steps: Start the Tomcat server in JPDA (debug) mode: Run Tomcat in JPDA mode on macOS on Linux on Windows Docker Image ./catalina.sh jpda run ./catalina.sh jpda run catalina.bat jpda run Run the docker image with Java Debugging Options as described here . Eclipse IDE IntelliJ IDEA Create a new Debug Configuration : New Remote Java Application configuration: Note Double click on the Remote Java Application to create a new configuration. Update the host and port properties, if needed. Press the Debug button to start a new remote debug session. Create a new Debug Configuration from the Edit Configurations..
option: Add a new Remote JVM Debug configuration using the + button and double click on Remote JVM Debug : Use the configuration provided on the screenshot below, update the host and port properties if needed: Press the Debug button to start a new remote debug session.","title":"Java Remote Debugging"},{"location":"developer-resources/java-remote-debugging/#java-remote-debugging","text":"","title":"Java Remote Debugging"},{"location":"developer-resources/java-remote-debugging/#debugging","text":"To connect for remote Java debugging of Eclipse Dirigible, follow these steps: Start the Tomcat server in JPDA (debug) mode: Run Tomcat in JPDA mode on macOS on Linux on Windows Docker Image ./catalina.sh jpda run ./catalina.sh jpda run catalina.bat jpda run Run the docker image with Java Debugging Options as described here . Eclipse IDE IntelliJ IDEA Create a new Debug Configuration : New Remote Java Application configuration: Note Double click on the Remote Java Application to create a new configuration. Update the host and port properties, if needed. Press the Debug button to start a new remote debug session. Create a new Debug Configuration from the Edit Configurations.. option: Add a new Remote JVM Debug configuration using the + button and double click on Remote JVM Debug : Use the configuration provided on the screenshot below, update the host and port properties if needed: Press the Debug button to start a new remote debug session.","title":"Debugging"},{"location":"developer-resources/keyboard-shortcuts/","text":"Keyboard Shortcuts Keyboard shortcuts represent combinations of two or more keyboard buttons that, when pressed at the same time, yield actions that can also be achieved by clicking a button on the UI.
Keyboard Combination Action Ctrl + S / Cmd + S Save Alt + W / Option + W Close active editor Alt + Shift + W / Option + Shift + W Close all opened editors Ctrl + Shift + F / Cmd + Shift + F Open Search view Command Palette The command palette gives you access to the most common operations in Eclipse Dirigible along with their keyboard shortcuts. You can access the command palette by pressing F1 on your keyboard.","title":"Keyboard Shortcuts"},{"location":"developer-resources/keyboard-shortcuts/#keyboard-shortcuts","text":"Keyboard shortcuts represent combinations of two or more keyboard buttons that, when pressed at the same time, yield actions that can also be achieved by clicking a button on the UI. Keyboard Combination Action Ctrl + S / Cmd + S Save Alt + W / Option + W Close active editor Alt + Shift + W / Option + Shift + W Close all opened editors Ctrl + Shift + F / Cmd + Shift + F Open Search view","title":"Keyboard Shortcuts"},{"location":"developer-resources/keyboard-shortcuts/#command-pallette","text":"The command palette gives you access to the most common operations in Eclipse Dirigible along with their keyboard shortcuts. You can access the command palette by pressing F1 on your keyboard.","title":"Command Palette"},{"location":"development/","text":"Getting Started Overview This guide explains how to set up an Eclipse Dirigible instance and how to use it to build your very first Hello World service. The references section below points to the documentation with more technical details for the different aspects of the platform and its components and capabilities. Setup Trial Environment In case you are using the shared https://trial.dirigible.io environment, you can skip this section. Get the binary In case you want to use a prebuilt package, you can get the one built for your environment from the downloads section. To build Eclipse Dirigible from sources by yourself, just follow the instructions in the README .
Choose the environment You can choose one of the setup options available to get an Eclipse Dirigible instance depending on your target environment. A shared trial instance is also available and can be accessed from here: https://trial.dirigible.io Environment Variables There are many configuration options , so you can connect to different databases, use different platforms, choose a specific set of plugins, and many more. Access the instance In case of a local setup on your machine, you can access Eclipse Dirigible at the following location: http://localhost:8080 Default Credentials The default username is admin and the default password is admin . The credentials can be updated, as described in the configuration options . Hello World Application Create a Hello World service Once you have a running Eclipse Dirigible instance, you can start with your project: Right-click inside the Projects view. From the menu select the New Project option. Enter hello-world for the name of the project and click the Create button. Right-click on the hello-world project in the Projects view and choose TypeScript or JavaScript ECMA6 service from the New dropdown: TypeScript JavaScript ECMA6 Select the New \u2192 TypeScript Service option: Enter service.ts for the name of the TypeScript Service : Double-click on the service.ts to open the file in the editor on the right. Info The file already contains a Hello World service implementation. As it's not specified otherwise, the service can be executed by performing any of the following HTTP methods: GET , POST , PUT , DELETE and PATCH . Right-click on the hello-world project and choose Publish option from the menu: With the service.ts selected in the Projects view, check the result of the execution of the server-side TypeScript Service in the Preview view: Note The TypeScript Service is published and available at the http://localhost:8080/services/ts/hello-world/service.ts URL. 
It can be accessed in a separate browser tab, consumed by a third-party application or API tools like Postman or cURL . Select the New \u2192 JavaScript ESM Service option: Enter service.mjs for the name of the JavaScript ESM Service : Double-click on the service.mjs to open the file in the editor on the right. Info The file already contains a Hello World service implementation. As it's not specified otherwise, the service can be executed by performing any of the following HTTP methods: GET , POST , PUT , DELETE and PATCH . Right-click on the hello-world project and choose Publish option from the menu: With the service.mjs selected in the Projects view, check the result of the execution of the server-side JavaScript ESM Service in the Preview view: Note The JavaScript ESM Service is published and available at the http://localhost:8080/services/js/hello-world/service.mjs URL. It can be accessed in a separate browser tab, consumed by a third-party application or API tools like Postman or cURL . Update the Hello World service Go to line 3 in the editor and change the Hello World! message to Hello Eclipse Dirigible! . TypeScript JavaScript ECMA6 import { response } from \"sdk/http\" ; response . println ( \"Hello Eclipse Dirigible!\" ); import { response } from \"sdk/http\" ; response . println ( \"Hello Eclipse Dirigible!\" ); Save the file: Ctrl + S for Windows, Cmd + S for macOS The output in the Preview view changes immediately. Note This is due to the default configuration of auto-publish on save . You can find more about this dynamic behavior in Dynamic Applications . References So far we have seen how easy it is to create and modify a Hello World RESTful service, but Eclipse Dirigible's capabilities go way beyond that. References You can explore the Tutorials section for more scenarios. If you would like to build complex services, you can go to the API section to find more JavaScript APIs that Eclipse Dirigible provides out-of-the-box.
If you are curious what you can do with Eclipse Dirigible apart from writing server-side JavaScript services, you can have a look at the features section. In case you are interested in Modeling and Generation with the Low-Code/No-Code tooling of Eclipse Dirigible, you can read about Entity Data Models and Generation .","title":"Getting Started"},{"location":"development/#getting-started","text":"","title":"Getting Started"},{"location":"development/#overview","text":"This guide explains how to set up an Eclipse Dirigible instance and how to use it to build your very first Hello World service. The references section below points to the documentation with more technical details for the different aspects of the platform and its components and capabilities.","title":"Overview"},{"location":"development/#setup","text":"Trial Environment In case you are using the shared https://trial.dirigible.io environment, you can skip this section.","title":"Setup"},{"location":"development/#get-the-binary","text":"In case you want to use a prebuilt package, you can get the one built for your environment from the downloads section. To build Eclipse Dirigible from sources by yourself, just follow the instructions in the README .","title":"Get the binary"},{"location":"development/#choose-the-environment","text":"You can choose one of the setup options available to get an Eclipse Dirigible instance depending on your target environment.
A shared trial instance is also available and can be accessed from here: https://trial.dirigible.io Environment Variables There are many configuration options , so you can connect to different databases, use different platforms, choose a specific set of plugins, and many more.","title":"Choose the environment"},{"location":"development/#access-the-instance","text":"In case of a local setup on your machine, you can access Eclipse Dirigible at the following location: http://localhost:8080 Default Credentials The default username is admin and the default password is admin . The credentials can be updated, as described in the configuration options .","title":"Access the instance"},{"location":"development/#hello-world-application","text":"","title":"Hello World Application"},{"location":"development/#create-a-hello-world-service","text":"Once you have a running Eclipse Dirigible instance, you can start with your project: Right-click inside the Projects view. From the menu select the New Project option. Enter hello-world for the name of the project and click the Create button. Right-click on the hello-world project in the Projects view and choose TypeScript or JavaScript ECMA6 service from the New dropdown: TypeScript JavaScript ECMA6 Select the New \u2192 TypeScript Service option: Enter service.ts for the name of the TypeScript Service : Double-click on the service.ts to open the file in the editor on the right. Info The file already contains a Hello World service implementation. As it's not specified otherwise, the service can be executed by performing any of the following HTTP methods: GET , POST , PUT , DELETE and PATCH . Right-click on the hello-world project and choose Publish option from the menu: With the service.ts selected in the Projects view, check the result of the execution of the server-side TypeScript Service in the Preview view: Note The TypeScript Service is published and available at the http://localhost:8080/services/ts/hello-world/service.ts URL. 
It can be accessed in a separate browser tab, consumed by a third-party application or API tools like Postman or cURL . Select the New \u2192 JavaScript ESM Service option: Enter service.mjs for the name of the JavaScript ESM Service : Double-click on the service.mjs to open the file in the editor on the right. Info The file already contains a Hello World service implementation. As it's not specified otherwise, the service can be executed by performing any of the following HTTP methods: GET , POST , PUT , DELETE and PATCH . Right-click on the hello-world project and choose Publish option from the menu: With the service.mjs selected in the Projects view, check the result of the execution of the server-side JavaScript ESM Service in the Preview view: Note The JavaScript ESM Service is published and available at the http://localhost:8080/services/js/hello-world/service.mjs URL. It can be accessed in a separate browser tab, consumed by a third-party application or API tools like Postman or cURL .","title":"Create a Hello World service"},{"location":"development/#update-the-hello-world-service","text":"Go to line 3 in the editor and change the Hello World! message to Hello Eclipse Dirigible! . TypeScript JavaScript ECMA6 import { response } from \"sdk/http\" ; response . println ( \"Hello Eclipse Dirigible!\" ); import { response } from \"sdk/http\" ; response . println ( \"Hello Eclipse Dirigible!\" ); Save the file: Ctrl + S for Windows, Cmd + S for macOS The output in the Preview view changes immediately. Note This is due to the default configuration of auto-publish on save . You can find more about this dynamic behavior in Dynamic Applications .","title":"Update the Hello World service"},{"location":"development/#references","text":"So far we have seen how easy it is to create and modify a Hello World RESTful service, but Eclipse Dirigible's capabilities go way beyond that. References You can explore the Tutorials section for more scenarios.
If you would like to build complex services, you can go to the API section to find more JavaScript APIs that Eclipse Dirigible provides out-of-the-box. If you are curious what you can do with Eclipse Dirigible apart from writing server-side JavaScript services, you can have a look at the features section. In case you are interested in Modeling and Generation with the Low-Code/No-Code tooling of Eclipse Dirigible, you can read about Entity Data Models and Generation .","title":"References"},{"location":"development/devops/","text":"Development & Operations Providing modern business applications in the Cloud nowadays requires a tight relation between the development and operations activities. Dirigible, by promoting in-system development for full-stack applications, needs to cover both phases with the necessary tools and backend frameworks. Development The front-facing Web IDE component is a collection of plugins for project management, source code editing, modeling, SCM integration, database management, and many more. Workbench Git Database Debugger Documents Search Import Preview Editor - Monaco BPMN Modeler Database Schema Modeler Entity Data Modeler Operations The functionality for import and export of projects or workspaces as well as cloning of a whole Dirigible instance, monitoring, document management, etc. is also integrated in the Web IDE component. Operations Database Repository Terminal Snapshot Logs Console","title":"Development & Operations"},{"location":"development/devops/#development-operations","text":"Providing modern business applications in the Cloud nowadays requires a tight relation between the development and operations activities.
Dirigible, by promoting in-system development for full-stack applications, needs to cover both phases with the necessary tools and backend frameworks.","title":"Development & Operations"},{"location":"development/devops/#development","text":"The front-facing Web IDE component is a collection of plugins for project management, source code editing, modeling, SCM integration, database management, and many more. Workbench Git Database Debugger Documents Search Import Preview Editor - Monaco BPMN Modeler Database Schema Modeler Entity Data Modeler","title":"Development"},{"location":"development/devops/#operations","text":"The functionality for import and export of projects or workspaces as well as cloning of a whole Dirigible instance, monitoring, document management, etc. is also integrated in the Web IDE component. Operations Database Repository Terminal Snapshot Logs Console","title":"Operations"},{"location":"development/artifacts/","text":"Artifacts Overview File extensions Database *.table - a JSON based database table descriptor file. Data structures synchroniser automatically reads all the available *.table files in the repository (including the classpath resources) and creates the underlying database tables into the default database. The definition also supports dependencies, which enables the synchroniser to make a topological sorting before starting the creation of the database artefacts. *.view - a JSON based database view descriptor file. The synchroniser reads and creates the database views as defined in the model. *.csvim - a JSON based descriptor file pointing to a CSV file to be imported to the configured database table. Security *.access - security constraints file. It defines the access permissions for the given endpoints. *.role - roles definition file. Flows *.listener - listener definition describing the link between the message queue or topic and the corresponding handler.
*.job - job definition describing the period in which the scheduled handler will be executed. Scripting *.js - a JavaScript file intended to be executed either server-side by the supported engine (GraalJS) or at the client-side by the browser's built-in engine. *.command - a Shell Command service *.md - a Markdown Wiki file. ES6 and TypeScript Starting from version 8.x of Eclipse Dirigible, it's also possible to use *.mjs (ES6 modules) and *.ts (TypeScript) for the development of server-side services. Modeling *.dsm - an internal XML based format file containing a database schema model diagram. *.schema - a JSON descriptor for a database schema layout produced by the Database Schema Modeler *.edm - an internal XML based format file containing an entity data model diagram. *.model - a JSON descriptor for an entity data model produced by the Entity Data Modeler *.bpmn - a BPMN 2.0 XML file containing a definition of a business process.","title":"Artifacts Overview"},{"location":"development/artifacts/#artifacts-overview","text":"","title":"Artifacts Overview"},{"location":"development/artifacts/#file-extensions","text":"","title":"File extensions"},{"location":"development/artifacts/#database","text":"*.table - a JSON based database table descriptor file. Data structures synchroniser automatically reads all the available *.table files in the repository (including the classpath resources) and creates the underlying database tables into the default database. The definition also supports dependencies, which enables the synchroniser to make a topological sorting before starting the creation of the database artefacts. *.view - a JSON based database view descriptor file. The synchroniser reads and creates the database views as defined in the model.
*.csvim - a JSON based descriptor file pointing to a CSV file to be imported into the configured database table.","title":"Database"},{"location":"development/artifacts/#security","text":"*.access - a security constraints file. It defines the access permissions for the given endpoints. *.role - a roles definition file.","title":"Security"},{"location":"development/artifacts/#flows","text":"*.listener - a listener definition describing the link between the message queue or topic and the corresponding handler. *.job - a job definition describing the period in which the scheduled handler will be executed.","title":"Flows"},{"location":"development/artifacts/#scripting","text":"*.js - a JavaScript file executed either server-side by the supported engine (GraalJS) or client-side by the browser's built-in engine. *.command - a Shell Command service. *.md - a Markdown Wiki file. ES6 and TypeScript Starting from version 8.x of Eclipse Dirigible, it's also possible to use *.mjs (ES6 modules) and *.ts (TypeScript) for the development of server-side services.","title":"Scripting"},{"location":"development/artifacts/#modeling","text":"*.dsm - an internal XML based format file containing a database schema model diagram. *.schema - a JSON descriptor for a database schema layout produced by the Database Schema Modeler. *.edm - an internal XML based format file containing an entity data model diagram. *.model - a JSON descriptor for an entity data model produced by the Entity Data Modeler. *.bpmn - a BPMN 2.0 XML file containing a definition of a business process.","title":"Modeling"},{"location":"development/artifacts/data-files/","text":"Data Files Delimiter Separated Values *.dsv data files are used for importing test data during development or for defining static content, e.g. for nomenclatures. The data file name has to be the same as the target table name.
The delimiter is the | character, and the order of the data fields should match the natural order of the columns in the target table. Be careful when using static data in tables. Entity Services (generated by the templates) use a sequence algorithm for identity columns starting from 1. The automatic re-initialization of static content from the data file can be achieved when you create a *.dsv file in your project. To make this more flexible, the following semantic file types are introduced: REPLACE (*.replace) - the rows in the database table always correspond to the lines in the data file. Processing this type of file means first deleting all the records in the database table and then inserting the rows from the file. This is the behavior of the initial format - DSV (*.dsv). The processing is triggered on restart of the App/Web Server or on publishing of the project containing these files. APPEND (*.append) - the rows from these files are imported only once into the corresponding database tables. If the tables already contain some records, the insert is skipped. After the initial import, the corresponding sequence is set to the max ID of the table, so that the table can be used afterwards as persistence storage, e.g. for the standard CRUD JavaScript Services. DELETE (*.delete) - if the file contains * as the only line, the whole table is cleaned up. Otherwise, only the listed records get deleted by ID (first column = ID = primary key). UPDATE (*.update) - the records in the database table get updated with the corresponding lines in the data files. The first column is the ID = primary key, used as the selection parameter for the update clause. The existing records in the table are not deleted in advance, as in the REPLACE case. If no record exists with the given ID, it gets inserted. Samples Data Structures and Data Files samples can be found here: Database Table (*.table) . Database View (*.view) . Data Replace (*.replace) . Data Append (*.append) . Data Delete (*.delete) .
Data Update (*.update) .","title":"Data Files"},{"location":"development/artifacts/data-files/#data-files","text":"Delimiter Separated Values *.dsv data files are used for importing test data during development or for defining static content, e.g. for nomenclatures. The data file name has to be the same as the target table name. The delimiter is the | character, and the order of the data fields should match the natural order of the columns in the target table. Be careful when using static data in tables. Entity Services (generated by the templates) use a sequence algorithm for identity columns starting from 1. The automatic re-initialization of static content from the data file can be achieved when you create a *.dsv file in your project. To make this more flexible, the following semantic file types are introduced: REPLACE (*.replace) - the rows in the database table always correspond to the lines in the data file. Processing this type of file means first deleting all the records in the database table and then inserting the rows from the file. This is the behavior of the initial format - DSV (*.dsv). The processing is triggered on restart of the App/Web Server or on publishing of the project containing these files. APPEND (*.append) - the rows from these files are imported only once into the corresponding database tables. If the tables already contain some records, the insert is skipped. After the initial import, the corresponding sequence is set to the max ID of the table, so that the table can be used afterwards as persistence storage, e.g. for the standard CRUD JavaScript Services. DELETE (*.delete) - if the file contains * as the only line, the whole table is cleaned up. Otherwise, only the listed records get deleted by ID (first column = ID = primary key). UPDATE (*.update) - the records in the database table get updated with the corresponding lines in the data files. The first column is the ID = primary key, used as the selection parameter for the update clause.
The existing records in the table are not deleted in advance, as in the REPLACE case. If no record exists with the given ID, it gets inserted. Samples Data Structures and Data Files samples can be found here: Database Table (*.table) . Database View (*.view) . Data Replace (*.replace) . Data Append (*.append) . Data Delete (*.delete) . Data Update (*.update) .","title":"Data Files"},{"location":"development/artifacts/database-table/","text":"Table Model The Table Model is a JSON formatted *.table descriptor. It represents the layout of the database table, which will be created during the activation process. The data structures synchroniser automatically reads all the available *.table files in the repository (including the classpath resources) and creates the underlying database tables in the default database. The definition also supports dependencies, which enables the synchroniser to perform a topological sort before creating the database artifacts. Example descriptor: { \"tableName\" : \"TEST001\" , \"columns\" : [ { \"name\" : \"ID\" , \"type\" : \"INTEGER\" , \"length\" : \"0\" , \"notNull\" : \"true\" , \"primaryKey\" : \"true\" , \"defaultValue\" : \"\" }, { \"name\" : \"NAME\" , \"type\" : \"VARCHAR\" , \"length\" : \"20\" , \"notNull\" : \"false\" , \"primaryKey\" : \"false\" , \"defaultValue\" : \"\" }, { \"name\" : \"DATEOFBIRTH\" , \"type\" : \"DATE\" , \"length\" : \"0\" , \"notNull\" : \"false\" , \"primaryKey\" : \"false\" , \"defaultValue\" : \"\" }, { \"name\" : \"SALARY\" , \"type\" : \"DOUBLE\" , \"length\" : \"0\" , \"notNull\" : \"false\" , \"primaryKey\" : \"false\" , \"defaultValue\" : \"\" } ] } The supported database types are: VARCHAR - for text-based fields up to 2K characters long CHAR - for text-based fields with a fixed length of up to 255 characters INTEGER - 32 bit BIGINT - 64 bit SMALLINT - 16 bit REAL - 7 digits of mantissa DOUBLE - 15 digits of mantissa DATE - represents a date consisting of day, month, and year
TIME - represents a time consisting of hours, minutes, and seconds TIMESTAMP - represents DATE, TIME, a nanosecond field, and a time zone BLOB - a binary object, such as an image, audio, etc. The activation of the table descriptor is the process of creating a database table in the target database. The activator constructs a CREATE TABLE SQL statement considering the dialect of the target database system. If a particular table name already exists, the activator checks whether there is a compatible change, such as adding new columns, and constructs an ALTER TABLE SQL statement. If the change is incompatible, the activator returns an error that has to be solved manually through the SQL console. Data Structures Creation of table model (JSON formatted *.table descriptor) and actual creation of the corresponding database table during publishing. Creation of view model (JSON formatted *.view descriptor) and actual creation of the corresponding database view during publishing. Creation of delimiter separated values ( *.append , *.update , *.delete , *.replace ) data files and populating the corresponding database table during publishing. Automatic altering of existing tables from the models on compatible changes (new columns added). Modeling of the database schema ( *.dsm and *.schema ) files and creation of the tables, views, and constraints during publishing. Scripting Services Support of JavaScript language by using GraalVM JS as runtime execution engine ( *.js ). Support of strictly defined enterprise API for JavaScript to be used by the business application developers. Web Content Support of client-side Web related artifacts, such as HTML, CSS, JS, pictures, etc. Wiki Content Support of Markdown format for Wiki pages. Integration Services Support of listeners for messages from the built-in message bus ( *.listener ). Support of scheduled jobs as triggers for backend services invocation ( *.job ). 
Support of business processes defined in BPMN 2.0 and executed by the underlying BPM process engine ( *.bpmn ). Support of shell commands execution ( *.command ). Support of OData 2.0 ( *.odata ). Support of websockets ( *.websocket ). Mobile Applications Support of native mobile application development via Tabris.js . Extension Definitions Creation of extension points (JSON formatted descriptor - *.extensionpoint ). Creation of extensions by a given extension point (JSON formatted descriptor - *.extension ). Tooling Workbench perspective for full support of project management (New, Cut, Copy, Paste, Delete, Refresh, Import, Export, etc.) Database perspective for RDBMS management including SQL Console Enhanced code editor with highlight support for JavaScript, HTML, JSON, XML, etc. Preview view for easy testing of changes in Web, Wiki, and Scripting Services Configurable Logs view , which provides server-side logs and traces Lots of template-based wizards for creating new content and services Import and export of project content Documents perspective for import of binary files for external documents and pictures Repository perspective for low-level repository content management Debugger perspective for debugging backend JavaScript services Terminal perspective with the corresponding main view for execution of shell commands on the target instance's OS Modeling Modeling of database schema ( *.dsm and *.schema ) files with Database Schema Modeler Modeling of entity data model ( *.edm and *.model ) files with Entity Data Modeler Modeling of BPMN process ( *.bpmn ) files with BPMN Modeler Modeling of Web form layout ( *.form ) files with Form Designer Security Role-based access management for Web services as well as the document repository Security constraints model (JSON formatted *.access ) support Several predefined roles, which can be used out-of-the-box (Everyone, Administrator, Manager, PowerUser, User, ReadWrite, ReadOnly) Registry Publishing support - exposing 
the artifacts from the user's workspace publicly Auto-publishing support for better usability User interface for browsing and searching within the published content Separate lists of endpoints and viewers per type of services - JavaScript, Web, wiki, etc. Separate browse user interface for Web and wiki content","title":"Database Table"},{"location":"development/artifacts/database-table/#table-model","text":"The Table Model is a JSON formatted *.table descriptor. It represents the layout of the database table, which will be created during the activation process. The data structures synchroniser automatically reads all the available *.table files in the repository (including the classpath resources) and creates the underlying database tables in the default database. The definition also supports dependencies, which enables the synchroniser to perform a topological sort before creating the database artifacts. Example descriptor: { \"tableName\" : \"TEST001\" , \"columns\" : [ { \"name\" : \"ID\" , \"type\" : \"INTEGER\" , \"length\" : \"0\" , \"notNull\" : \"true\" , \"primaryKey\" : \"true\" , \"defaultValue\" : \"\" }, { \"name\" : \"NAME\" , \"type\" : \"VARCHAR\" , \"length\" : \"20\" , \"notNull\" : \"false\" , \"primaryKey\" : \"false\" , \"defaultValue\" : \"\" }, { \"name\" : \"DATEOFBIRTH\" , \"type\" : \"DATE\" , \"length\" : \"0\" , \"notNull\" : \"false\" , \"primaryKey\" : \"false\" , \"defaultValue\" : \"\" }, { \"name\" : \"SALARY\" , \"type\" : \"DOUBLE\" , \"length\" : \"0\" , \"notNull\" : \"false\" , \"primaryKey\" : \"false\" , \"defaultValue\" : \"\" } ] } The supported database types are: VARCHAR - for text-based fields up to 2K characters long CHAR - for text-based fields with a fixed length of up to 255 characters INTEGER - 32 bit BIGINT - 64 bit SMALLINT - 16 bit REAL - 7 digits of mantissa DOUBLE - 15 digits of mantissa DATE - represents a date consisting of day, month, and year TIME - represents a time consisting of hours,
minutes, and seconds TIMESTAMP - represents DATE, TIME, a nanosecond field, and a time zone BLOB - a binary object, such as an image, audio, etc. The activation of the table descriptor is the process of creating a database table in the target database. The activator constructs a CREATE TABLE SQL statement considering the dialect of the target database system. If a particular table name already exists, the activator checks whether there is a compatible change, such as adding new columns, and constructs an ALTER TABLE SQL statement. If the change is incompatible, the activator returns an error that has to be solved manually through the SQL console.","title":"Table Model"},{"location":"development/artifacts/database-table/#data-structures","text":"Creation of table model (JSON formatted *.table descriptor) and actual creation of the corresponding database table during publishing. Creation of view model (JSON formatted *.view descriptor) and actual creation of the corresponding database view during publishing. Creation of delimiter separated values ( *.append , *.update , *.delete , *.replace ) data files and populating the corresponding database table during publishing. Automatic altering of existing tables from the models on compatible changes (new columns added). Modeling of the database schema ( *.dsm and *.schema ) files and creation of the tables, views, and constraints during publishing.","title":"Data Structures"},{"location":"development/artifacts/database-table/#scripting-services","text":"Support of JavaScript language by using GraalVM JS as runtime execution engine ( *.js ). 
Support of strictly defined enterprise API for JavaScript to be used by the business application developers.","title":"Scripting Services"},{"location":"development/artifacts/database-table/#web-content","text":"Support of client-side Web related artifacts, such as HTML, CSS, JS, pictures, etc.","title":"Web Content"},{"location":"development/artifacts/database-table/#wiki-content","text":"Support of Markdown format for Wiki pages.","title":"Wiki Content"},{"location":"development/artifacts/database-table/#integration-services","text":"Support of listeners for messages from the built-in message bus ( *.listener ). Support of scheduled jobs as triggers for backend services invocation ( *.job ). Support of business processes defined in BPMN 2.0 and executed by the underlying BPM process engine ( *.bpmn ). Support of shell commands execution ( *.command ). Support of OData 2.0 ( *.odata ). Support of websockets ( *.websocket ).","title":"Integration Services"},{"location":"development/artifacts/database-table/#mobile-applications","text":"Support of native mobile application development via Tabris.js .","title":"Mobile Applications"},{"location":"development/artifacts/database-table/#extension-definitions","text":"Creation of extension points (JSON formatted descriptor - *.extensionpoint ). Creation of extensions by a given extension point (JSON formatted descriptor - *.extension ).","title":"Extension Definitions"},{"location":"development/artifacts/database-table/#tooling","text":"Workbench perspective for full support of project management (New, Cut, Copy, Paste, Delete, Refresh, Import, Export, etc.) Database perspective for RDBMS management including SQL Console Enhanced code editor with highlight support for JavaScript, HTML, JSON, XML, etc. 
Preview view for easy testing of changes in Web, Wiki, and Scripting Services Configurable Logs view , which provides server-side logs and traces Lots of template-based wizards for creating new content and services Import and export of project content Documents perspective for import of binary files for external documents and pictures Repository perspective for low-level repository content management Debugger perspective for debugging backend JavaScript services Terminal perspective with the corresponding main view for execution of shell commands on the target instance's OS","title":"Tooling"},{"location":"development/artifacts/database-table/#modeling","text":"Modeling of database schema ( *.dsm and *.schema ) files with Database Schema Modeler Modeling of entity data model ( *.edm and *.model ) files with Entity Data Modeler Modeling of BPMN process ( *.bpmn ) files with BPMN Modeler Modeling of Web form layout ( *.form ) files with Form Designer","title":"Modeling"},{"location":"development/artifacts/database-table/#security","text":"Role-based access management for Web services as well as the document repository Security constraints model (JSON formatted *.access ) support Several predefined roles, which can be used out-of-the-box (Everyone, Administrator, Manager, PowerUser, User, ReadWrite, ReadOnly)","title":"Security"},{"location":"development/artifacts/database-table/#registry","text":"Publishing support - exposing the artifacts from the user's workspace publicly Auto-publishing support for better usability User interface for browsing and searching within the published content Separate lists of endpoints and viewers per type of services - JavaScript, Web, wiki, etc. Separate browse user interface for Web and wiki content","title":"Registry"},{"location":"development/concepts/","text":"Concepts Overview Dynamic Applications There are several must-know concepts that are implied in the cloud toolkit and have to be understood before getting started. 
Some of them are closely related to the nature and behavior of dynamic applications, others just follow the best practices from service architecture, also reflected in cloud applications. Repository First comes the concept of a repository . It is the place where the application's content is stored, acting as the database for the Eclipse Dirigible instance. Workspace Next is the concept of a workspace that is very similar to the well-known workspace from desktop IDEs (e.g., Eclipse). The workspace can hold one or more projects. One user can have multiple workspaces, but can work in only one at a given moment. Registry The registry and the related publishing processes are taken from SOA (UDDI) and recent API management trends to bring some of their strengths, such as discoverability, reusability, loose coupling, relevance, etc. Generation To boost development productivity in the very initial phase, we introduced template-based generation of application artifacts via wizards. Entity Services The new Web 2.0 paradigm and the leveraged REST architectural style changed the way services should behave and be described. Although there is a push for bilateral contracts only and free description of services, we decided to introduce a more sophisticated kind of service for special purposes - entity services . Modeling This is the visual definition of database schema models, entity data models, and BPMN processes. In Eclipse Dirigible, modeling is enabled by several editors and modelers . REST framework Along with the low-level HTTP request, response, and session handling, Eclipse Dirigible provides a higher-level framework for building REST services. More information on how to use this framework can be found here . Web Content This is the client-side application code transported via the container web channel. More information can be found here . Mobile Apps Mobile application support in Eclipse Dirigible is achieved via Tabris.js .
Extensions Extensibility is an important requirement for business applications built to follow custom processes in Line of Business (LoB) areas. In the cloud toolkit, a generic description of the extension points and extensions is provided without explicitly defining the contract. This is a simple but powerful way to define extensions .","title":"Concepts Overview"},{"location":"development/concepts/#concepts-overview","text":"","title":"Concepts Overview"},{"location":"development/concepts/#dynamic-applications","text":"There are several must-know concepts that are implied in the cloud toolkit and have to be understood before getting started. Some of them are closely related to the nature and behavior of dynamic applications, others just follow the best practices from service architecture, also reflected in cloud applications.","title":"Dynamic Applications"},{"location":"development/concepts/#repository","text":"First comes the concept of a repository . It is the place where the application's content is stored, acting as the database for the Eclipse Dirigible instance.","title":"Repository"},{"location":"development/concepts/#workspace","text":"Next is the concept of a workspace that is very similar to the well-known workspace from desktop IDEs (e.g., Eclipse). The workspace can hold one or more projects.
One user can have multiple workspaces, but can work in only one at a given moment.","title":"Workspace"},{"location":"development/concepts/#registry","text":"The registry and the related publishing processes are taken from SOA (UDDI) and recent API management trends to bring some of their strengths, such as discoverability, reusability, loose coupling, relevance, etc.","title":"Registry"},{"location":"development/concepts/#generation","text":"To boost development productivity in the very initial phase, we introduced template-based generation of application artifacts via wizards.","title":"Generation"},{"location":"development/concepts/#entity-services","text":"The new Web 2.0 paradigm and the leveraged REST architectural style changed the way services should behave and be described. Although there is a push for bilateral contracts only and free description of services, we decided to introduce a more sophisticated kind of service for special purposes - entity services .","title":"Entity Services"},{"location":"development/concepts/#modeling","text":"This is the visual definition of database schema models, entity data models, and BPMN processes. In Eclipse Dirigible, modeling is enabled by several editors and modelers .","title":"Modeling"},{"location":"development/concepts/#rest-framework","text":"Along with the low-level HTTP request, response, and session handling, Eclipse Dirigible provides a higher-level framework for building REST services. More information on how to use this framework can be found here .","title":"REST framework"},{"location":"development/concepts/#web-content","text":"This is the client-side application code transported via the container web channel.
More information can be found here .","title":"Web Content"},{"location":"development/concepts/#mobile-apps","text":"Mobile application support in Eclipse Dirigible is achieved via Tabris.js .","title":"Mobile Apps"},{"location":"development/concepts/#extensions","text":"Extensibility is an important requirement for business applications built to follow custom processes in Line of Business (LoB) areas. In the cloud toolkit, a generic description of the extension points and extensions is provided without explicitly defining the contract. This is a simple but powerful way to define extensions .","title":"Extensions"},{"location":"development/concepts/dynamic-applications/","text":"Dynamic Applications We introduced the term \"dynamic applications\" as one that narrows the scope of the target applications that can be created using Eclipse Dirigible. The overall process of building dynamic applications relies on well-known and proven principles: In-system development - known from microcontrollers to business software systems. A major benefit is working on a live system where all changes you make take effect immediately, hence the impact and side effects can be realized in the early stages of the development process. Content-centric - known from networking to development processes. In the context of dynamic applications, all the artifacts are text-based models or executable scripts stored in a generic repository (along with the related binaries, such as images). This makes the life-cycle management of the application itself and the transport between the landscapes (Dev/Test/Prod) straightforward. As a result, you can set up the whole system just by pulling the content from a remote source code repository such as git . Scripting languages - programming languages written for a special runtime environment that can interpret (rather than compile) the execution of tasks.
Today's dynamic languages, as well as their smooth integration in Web servers, make the rise of in-system development in the cloud possible. Shortest turn-around time - the driving principle for our tooling, because instant access and instant value are some of the most important requirements for developers. In general, the components of a dynamic application can be separated into the following categories: Data structures - The artifacts representing the domain model of the application. In our case, we have chosen the well-accepted JSON format for describing a normalized entity model. There is no intermediate adaptation layer, hence all entities directly represent the database artifacts - tables and views. Entity services - Once we have the domain model entities, the next step is to expose them as Web services. Following the modern Web patterns, we provide scripting capabilities so you can create your RESTful services in JavaScript, Ruby, and Groovy. Scripting services - During development, you can use a rich set of APIs that give you access to the database and HTTP layer, utilities, and the direct Java APIs underneath. Support for creating unit tests is important and is, therefore, integrated as an atomic part of the scripting support itself - you can use the same language for the tests as the one for the services themselves. User interface - The Web 2.0 paradigm, as well as the HTML5 specification, brings the Web UI to another level. There are already many cool client-side AJAX frameworks that you can use depending on the nature of your application. Integration services - Following the principle of atomicity, one dynamic application should be as self-contained as possible. Unfortunately, in the real world there are always some external services that have to be integrated into your application - for data transfer, triggering external processes, lookup in external sources, etc.
For this purpose, we provide capabilities for creating simple routing services and dynamic EIPs. Documentation - The documentation is an integral part of your application. The target format for describing services and for the overall development documentation is already well accepted - wiki .","title":"Dynamic Applications"},{"location":"development/concepts/dynamic-applications/#dynamic-applications","text":"We introduced the term \"dynamic applications\" as one that narrows the scope of the target applications that can be created using Eclipse Dirigible. The overall process of building dynamic applications relies on well-known and proven principles: In-system development - known from microcontrollers to business software systems. A major benefit is working on a live system where all changes you make take effect immediately, hence the impact and side effects can be realized in the early stages of the development process. Content-centric - known from networking to development processes. In the context of dynamic applications, all the artifacts are text-based models or executable scripts stored in a generic repository (along with the related binaries, such as images). This makes the life-cycle management of the application itself and the transport between the landscapes (Dev/Test/Prod) straightforward. As a result, you can set up the whole system just by pulling the content from a remote source code repository such as git . Scripting languages - programming languages written for a special runtime environment that can interpret (rather than compile) the execution of tasks. Today's dynamic languages, as well as their smooth integration in Web servers, make the rise of in-system development in the cloud possible. Shortest turn-around time - the driving principle for our tooling, because instant access and instant value are some of the most important requirements for developers.
In general, the components of a dynamic application can be separated into the following categories: Data structures - The artifacts representing the domain model of the application. In our case, we have chosen the well-accepted JSON format for describing a normalized entity model. There is no intermediate adaptation layer, hence all entities directly represent the database artifacts - tables and views. Entity services - Once we have the domain model entities, the next step is to expose them as Web services. Following the modern Web patterns, we provide scripting capabilities so you can create your RESTful services in JavaScript, Ruby, and Groovy. Scripting services - During development, you can use a rich set of APIs that give you access to the database and HTTP layer, utilities, and the direct Java APIs underneath. Support for creating unit tests is important and is, therefore, integrated as an atomic part of the scripting support itself - you can use the same language for the tests as the one for the services themselves. User interface - The Web 2.0 paradigm, as well as the HTML5 specification, brings the Web UI to another level. There are already many cool client-side AJAX frameworks that you can use depending on the nature of your application. Integration services - Following the principle of atomicity, one dynamic application should be as self-contained as possible. Unfortunately, in the real world there are always some external services that have to be integrated into your application - for data transfer, triggering external processes, lookup in external sources, etc. For this purpose, we provide capabilities for creating simple routing services and dynamic EIPs. Documentation - The documentation is an integral part of your application.
The target format for describing services and for the overall development documentation is already well accepted - wiki .","title":"Dynamic Applications"},{"location":"development/concepts/entity-service/","text":"Entity Service In general, the entity service is a fully capable RESTful service, as defined by the REST architectural style, for performance, scalability, simplicity, and so on. It exposes the CRUD operations of a given domain model object. Underneath it, the database store is connected as a data transfer layer. Domain object management is the service pattern that is used most often when following the RESTful paradigm for business software components. In Eclipse Dirigible, the standard functionality of Web services is enhanced without breaking the REST principles. This is useful for generic utilities and user interface generation. Standard functionality: GET method If the requested path points directly to the service endpoint (no additional parameters), it lists all the entities of this type (in this collection). If the request contains an id parameter, the service returns only the requested entity. POST method - creates an entity, getting the fields from the request body (JSON formatted), with an auto-generated ID. PUT method - updates the entity, getting the ID from the request body (JSON formatted). DELETE method - deletes the entity by the provided ID parameter, which is mandatory. Enhancements to the standard functionality of GET with the following parameters: count - returns the size of the entities collection. metadata - returns the simplified descriptor of the entity in JSON (see below). sort - indicates the order of the entities. desc - indicates the reverse order, used with the above parameter. limit - used for paging, returns a limited result set. offset - used for paging, the result set starts from the offset value.
Example metadata for an entity: { \"name\" : \"books\" , \"type\" : \"object\" , \"properties\" : [ { \"name\" : \"book_id\" , \"type\" : \"integer\" , \"key\" : \"true\" , \"required\" : \"true\" }, { \"name\" : \"book_isbn\" , \"type\" : \"string\" }, { \"name\" : \"book_title\" , \"type\" : \"string\" }, { \"name\" : \"book_author\" , \"type\" : \"string\" }, { \"name\" : \"book_editor\" , \"type\" : \"string\" }, { \"name\" : \"book_publisher\" , \"type\" : \"string\" }, { \"name\" : \"book_format\" , \"type\" : \"string\" }, { \"name\" : \"book_publication_date\" , \"type\" : \"date\" }, { \"name\" : \"book_price\" , \"type\" : \"double\" } ] } All these features of entity services are implied during the generation process. As an input, the template uses a database table and an entity service name that are entered in the Entity Data Modeler . Just select the *.entity artifact in the Workspace view. Choose Generate \u2192 User Interface for Entity Service . Limitations for the table to be entity-service compliant: There should be only one primary key column, which will be used as its identity . Only database column types that are supported by default for generation can be used (simple types only, as clob and blob are not supported). Generic query methods are not generated because: They would cover only very simple cases with reasonable performance. For complex queries, the introduction of an additional layer results in worse performance in comparison to an SQL script. 
Entity services are generated in JavaScript, hence they can be accessed right after generation and publishing at: <protocol>://<host>:<port>/services/v4/js/<project>/<service> Here's an example: https://example.com/services/v4/js/bookstore/books.js Or just select them in the Workspace view and check the result in the Preview view.","title":"Entity Service"},{"location":"development/concepts/entity-service/#entity-service","text":"In general, the entity service is a fully capable RESTful service as defined by the REST architectural style for performance, scalability, simplicity, and so on. It exposes the CRUD operations of a given domain model object. Underneath it, the database store is connected as a data transfer layer. The domain object management is the service pattern that is used most often when following the RESTful paradigm on business software components. In Eclipse Dirigible, the standard functionality of Web services is enhanced but without breaking the REST principles. This is useful for generic utilities and user interface generation. Standard functionality: GET method If the requested path points directly to the service endpoint (no additional parameters), it lists all the entities of this type (in this collection). If the request contains an id parameter, the service returns only the requested entity. POST method - creates an entity, getting the fields from the request body (JSON formatted) and an auto-generated ID. PUT method - updates the entity, getting the ID from the request body (JSON formatted). DELETE method - deletes the entity by the provided ID parameter, which is mandatory. Enhancements to the standard functionality of GET with the following parameters: count - returns the size of the entities collection. metadata - returns the simplified descriptor of the entity in JSON (see below). sort - indicates the order of the entities. desc - indicates the reverse order; used with the above parameter. limit - used for paging, returns a limited result set. 
offset - used for paging, the result set starts from the offset value. Example metadata for an entity: { \"name\" : \"books\" , \"type\" : \"object\" , \"properties\" : [ { \"name\" : \"book_id\" , \"type\" : \"integer\" , \"key\" : \"true\" , \"required\" : \"true\" }, { \"name\" : \"book_isbn\" , \"type\" : \"string\" }, { \"name\" : \"book_title\" , \"type\" : \"string\" }, { \"name\" : \"book_author\" , \"type\" : \"string\" }, { \"name\" : \"book_editor\" , \"type\" : \"string\" }, { \"name\" : \"book_publisher\" , \"type\" : \"string\" }, { \"name\" : \"book_format\" , \"type\" : \"string\" }, { \"name\" : \"book_publication_date\" , \"type\" : \"date\" }, { \"name\" : \"book_price\" , \"type\" : \"double\" } ] } All these features of entity services are implied during the generation process. As an input, the template uses a database table and an entity service name that are entered in the Entity Data Modeler . Just select the *.entity artifact in the Workspace view. Choose Generate \u2192 User Interface for Entity Service . Limitations for the table to be entity-service compliant: There should be only one primary key column, which will be used as its identity . Only database column types that are supported by default for generation can be used (simple types only, as clob and blob are not supported). Generic query methods are not generated because: They would cover only very simple cases with reasonable performance. For complex queries, the introduction of an additional layer results in worse performance in comparison to an SQL script. 
Entity services are generated in JavaScript, hence they can be accessed right after generation and publishing at: <protocol>://<host>:<port>/services/v4/js/<project>/<service> Here's an example: https://example.com/services/v4/js/bookstore/books.js Or just select them in the Workspace view and check the result in the Preview view.","title":"Entity Service"},{"location":"development/concepts/extensions/","text":"Extension Definitions Extensibility is an important requirement for business applications built to follow custom processes in Line of Business (LoB) areas. In the cloud toolkit, a generic description of the extension points and extensions is provided without explicitly defining the contract. This is a simple but powerful way to define extensions. Extension Points An extension point is the place in the core module that is expected to be enhanced by custom-created modules. It is a simple JSON-formatted *.extensionpoint file and can be placed anywhere in your project. { \"extension-point\" : \"/project1/extensionPoint1\" , \"description\" : \"Description for Extension Point 1\" } Extensions An extension is the plug-in in the custom module that extends the core functionality. It is a simple JSON-formatted *.extension file and can be placed anywhere in your project. { \"extension\" : \"/project1/extension1\" , \"extension-point\" : \"/project1/extensionPoint1\" , \"description\" : \"Description for Extension 1\" } Note The 'extension' parameter above should point to a valid JavaScript module. For a full example, you can look at sample-ide-perspective . Calling Extensions Within the core module, you can iterate over the defined extensions and call their functions: let extensions = extensionManager . getExtensions ( \"/project1/extensionPoint1\" ); for ( let i = 0 ; i < extensions . length ; i ++ ) { let extension = require ( extensions [ i ]); response . println ( extension . 
enhanceProcess ()); } In the code above, the extension is a JavaScript module ( extension1.js ) within the same project, and it has exposed an enhanceProcess() function.","title":"Extension Definitions"},{"location":"development/concepts/extensions/#extension-definitions","text":"Extensibility is an important requirement for business applications built to follow custom processes in Line of Business (LoB) areas. In the cloud toolkit, a generic description of the extension points and extensions is provided without explicitly defining the contract. This is a simple but powerful way to define extensions.","title":"Extension Definitions"},{"location":"development/concepts/extensions/#extension-points","text":"An extension point is the place in the core module that is expected to be enhanced by custom-created modules. It is a simple JSON-formatted *.extensionpoint file and can be placed anywhere in your project. { \"extension-point\" : \"/project1/extensionPoint1\" , \"description\" : \"Description for Extension Point 1\" }","title":"Extension Points"},{"location":"development/concepts/extensions/#extensions","text":"An extension is the plug-in in the custom module that extends the core functionality. It is a simple JSON-formatted *.extension file and can be placed anywhere in your project. { \"extension\" : \"/project1/extension1\" , \"extension-point\" : \"/project1/extensionPoint1\" , \"description\" : \"Description for Extension 1\" } Note The 'extension' parameter above should point to a valid JavaScript module. For a full example, you can look at sample-ide-perspective .","title":"Extensions"},{"location":"development/concepts/extensions/#calling-extensions","text":"Within the core module, you can iterate over the defined extensions and call their functions: let extensions = extensionManager . getExtensions ( \"/project1/extensionPoint1\" ); for ( let i = 0 ; i < extensions . length ; i ++ ) { let extension = require ( extensions [ i ]); response . 
println ( extension . enhanceProcess ()); } In the code above, the extension is a JavaScript module ( extension1.js ) within the same project, and it has exposed an enhanceProcess() function.","title":"Calling Extensions"},{"location":"development/concepts/generation/","text":"Generation Template-based generation of artifacts helps developer productivity in the initial phase of building the application. There are several application components that have similar behavior and often very similar implementation. A prominent example is the entity service . It has several predefined methods based on REST concepts and HTTP - GET , POST , PUT , DELETE on an entity level as well as a list of all entities. Additionally, the most notable storage for the entity data is the RDBMS provided by the platform. Another example is user interface templates based on patterns - list , master-detail , input form , etc. Templates can also be provided based on different frameworks for client-side interaction. Note The generation here is a one-time process. Once you have the generated artifact, you can modify it based on your own requirements. In contrast to the approach above, in the case of MDA , you can expect to regenerate the PSMs every time you make changes to PIMs . For this approach, we introduced the entity data modeler where you can define declaratively all the needed components and their attributes. Afterwards, you can use them to generate a complete full-stack data-driven application. Note The enhancements in this case must go via extensions only.","title":"Generation"},{"location":"development/concepts/generation/#generation","text":"Template-based generation of artifacts helps developer productivity in the initial phase of building the application. There are several application components that have similar behavior and often very similar implementation. A prominent example is the entity service . 
It has several predefined methods based on REST concepts and HTTP - GET , POST , PUT , DELETE on an entity level as well as a list of all entities. Additionally, the most notable storage for the entity data is the RDBMS provided by the platform. Another example is user interface templates based on patterns - list , master-detail , input form , etc. Templates can also be provided based on different frameworks for client-side interaction. Note The generation here is a one-time process. Once you have the generated artifact, you can modify it based on your own requirements. In contrast to the approach above, in the case of MDA , you can expect to regenerate the PSMs every time you make changes to PIMs . For this approach, we introduced the entity data modeler where you can define declaratively all the needed components and their attributes. Afterwards, you can use them to generate a complete full-stack data-driven application. Note The enhancements in this case must go via extensions only.","title":"Generation"},{"location":"development/concepts/mobile-apps/","text":"Mobile Applications Overview Mobile application support in Eclipse Dirigible is achieved via Tabris.js . It is a mobile framework that allows you to develop native iOS and Android mobile applications, written entirely in JavaScript. This framework provides native performance, native look and feel, and a single code-base (JavaScript). You can use existing JavaScript libraries and native extensions to extend the core functionality. Unlike other frameworks, which use webviews or cross-platform intermediate runtimes, Tabris.js executes the JavaScript directly on the device and renders everything using native widgets. 
Thanks to the framework capabilities, developers can focus more on mobile application development and less on the platform specifics (iOS and Android).","title":"Mobile Applications"},{"location":"development/concepts/mobile-apps/#mobile-applications","text":"","title":"Mobile Applications"},{"location":"development/concepts/mobile-apps/#overview","text":"Mobile application support in Eclipse Dirigible is achieved via Tabris.js . It is a mobile framework that allows you to develop native iOS and Android mobile applications, written entirely in JavaScript. This framework provides native performance, native look and feel, and a single code-base (JavaScript). You can use existing JavaScript libraries and native extensions to extend the core functionality. Unlike other frameworks, which use webviews or cross-platform intermediate runtimes, Tabris.js executes the JavaScript directly on the device and renders everything using native widgets. Thanks to the framework capabilities, developers can focus more on mobile application development and less on the platform specifics (iOS and Android).","title":"Overview"},{"location":"development/concepts/publishing/","text":"Publishing There is a conceptual separation between design-time and runtime phases of the development life cycle. During the design-time phase, the source artifacts are created and managed within the isolated developer's area - workspace . When you are ready with a given feature, you have to publish the project so that the application artifacts become available to other users. The meaning of \"available\" depends on the type of artifact. For example, for JavaScript services this is the registration of a public endpoint, while for web and wiki content, it is just the access to the artifacts themselves, etc. The Publishing action is accessible from the context menu in the Workspace view. 
The space within the repository, where all the public artifacts are placed, is called \"registry\".","title":"Publishing"},{"location":"development/concepts/publishing/#publishing","text":"There is a conceptual separation between design-time and runtime phases of the development life cycle. During the design-time phase, the source artifacts are created and managed within the isolated developer's area - workspace . When you are ready with a given feature, you have to publish the project so that the application artifacts become available to other users. The meaning of \"available\" depends on the type of artifact. For example, for JavaScript services this is the registration of a public endpoint, while for web and wiki content, it is just the access to the artifacts themselves, etc. The Publishing action is accessible from the context menu in the Workspace view. The space within the repository, where all the public artifacts are placed, is called \"registry\".","title":"Publishing"},{"location":"development/concepts/registry/","text":"Registry The registry is the entry point for searching and browsing for service endpoints, as well as for monitoring and administration at runtime. Technically, it is a space within the repository where all the published artifacts are placed.","title":"Registry"},{"location":"development/concepts/registry/#registry","text":"The registry is the entry point for searching and browsing for service endpoints, as well as for monitoring and administration at runtime. Technically, it is a space within the repository where all the published artifacts are placed.","title":"Registry"},{"location":"development/concepts/repository/","text":"Repository The repository component is the main place where all the project's artifacts are stored. It provides an abstract \"file-system-like\" structure with folders and files that can be backed by different underlying persistence storages - file system, relational database, noSQL database, etc. 
In a single repository instance there are several spaces holding different types of content - users' workspaces, public registry , search indices, git metadata, versions, etc.","title":"Repository"},{"location":"development/concepts/repository/#repository","text":"The repository component is the main place where all the project's artifacts are stored. It provides an abstract \"file-system-like\" structure with folders and files that can be backed by different underlying persistence storages - file system, relational database, noSQL database, etc. In a single repository instance there are several spaces holding different types of content - users' workspaces, public registry , search indices, git metadata, versions, etc.","title":"Repository"},{"location":"development/concepts/rest/","text":"REST The http-rs module is designed to define and run a broad range of HTTP REST services. A very simple example hello-api.js : var rs = require ( \"http/v4/rs\" ); // serve GET HTTP requests sent to resource path \"\" (i.e. directly to hello-api.js) rs . service () . resource ( \"\" ) . get ( function ( ctx , request , response ){ response . println ( \"Hello there!\" ); }) . execute (); Hosting the hello-api.js code above at test/hello-api.js and sending a GET request to /services/v4/js/test/hello-api.js will return the response body: Hello there! Overview Let\u2019s have a closer look at the methods shown in the example above. First, we requested a new REST service instance from the framework: rs.service() Next, we configured the instance to serve HTTP GET requests sent to the root path (\"\") using the supplied function: . resource ( \"\" ) . get ( function ( ctx , request , response ){ response . println ( \"Hello there!\" ); }) Technically, configuration is not required to execute a service, but obviously the service will do nothing if you don't instruct it what to do. 
Finally, we run the service and it processes the HTTP request: .execute(); Now, this is a fairly simplistic example aiming to give you a hint of how you can bring a REST API to life with http-rs. There is a whole lot more that we shall explore in the next sections. Creating REST services rs.service() Creating new service instances is as simple as invoking rs.service() . That returns a configurable and/or executable instance of the HttpController class. The controller API allows you to: - start configuring a REST service (method resource() ) - serve requests (method execute() ) - perform a couple of more advanced activities, which are reviewed in the Advanced section below Additionally, the controller API also features shortcut factory methods that are useful for simple configurations (like the one in our initial example) such as get(sPath, fServe, arrConsume, arrProduces) . Read below for more examples of how to use the methods. Serving requests execute() The mechanism for serving requests is implemented in the execute() method of the HttpController. It tries to match the request to the service API configuration. If the mechanism matches the request successfully, it triggers the execution flow of the callback functions. The execution flow processes the request and response. If the mechanism doesn't match the request successfully, it sends a Bad Request error to the client. The request and response objects are implicitly those that were used to request the script where the execute() method invocation occurred. But they can be exchanged for others as shown in the Advanced section. The execute() method is defined in the service instance (class HttpController) obtained with rs.service() . The execute() method can be triggered with rs.service().execute() . The rs API configuration also provides numerous references to the method so you can invoke it at any stage. For example, rs . service (). get ( \"\" ). execute () rs . service (). resource ( \"\" ). get (). 
execute () rs . service (). resource ( \"\" ). get (). produces ([ \"text/json\" ]). execute () are all valid ways to serve requests. What you need to consider is that execute() must be the final method invocation. Even if you retain a reference to a configuration object and change it after that, it will be irrelevant since the response will be flushed and closed by then. Configuring services There are three options as far as configuration is concerned. You can start from scratch and build the configuration using the rs API. You can use configuration objects. They hold the configuration that the rs API produces. You can start with a configuration object and then enhance or override the configuration using the rs API. Configuration objects A configuration object is a JS object with canonical structure that the http-rs can interpret. We will discuss its schema later on in this guide. For now, let's just consider that it's the same thing that the rs-fluent API will actually produce behind the scenes, so it's a completely valid alternative and complement to the rs-fluent API configuration approach. Refer to the Advanced section for more details on using configuration objects. Defining service resources resource(sPath, oConfiguration?) Resources are the top-level configuration objects that represent an HTTP (server) resource , for which we will be defining a protocol. Each resource is identified by a URL on the server. You can have multiple resources per service configuration, provided that their URLs do not overlap. Resource vs Path vs Resource Path In REST terms, a resource is an abstraction of a server-side resource that can be a file, dynamically generated content, or a procedure (although the last is considered heresy by purists). It's virtually anything hosted on a server that has an address and can be accessed with a standard HTTP method. 
It is often referred to as \"path\" or \"resource path\" due to its singular most notable identifying characteristic. But to be precise, \"path\" is only a property of the resource. As far as configuration is concerned, the resource defines the configuration scope for which we define method handlers and constraints, and is identifiable by its \"path\" property. Resource paths and path templates The sPath string parameter (mandatory) of the resource() method will serve as the resource URL. It is relative to the location where the JavaScript service is running (e.g. /services/v4/my-application/api/my-service.js ). An empty path ( \"\" ) routes the request directly to the JavaScript service root path. The path can also be a URL template, i.e. parameterized. For example, consider the path template: {id}/assets/{assetType}/{name} This will resolve request paths such as: /services/js/test.js/1/assets/longterm/building to service path: 1/assets/longterm/building If a request is matched to such a path, the service mechanism will provide the resolved parameters as an object map to the function that handles the request. Using the sample path above, the path parameters object will look like this: { \"id\" : 1 , \"assetType\" : \"longterm\" , \"name\" : \"building\" } Defining HTTP methods allowed for a resource resource . get () resource . post () resource . put () resource [ \"delete\" ]() and resource . remove () resource . method () By default, only the HTTP request methods that you have configured for a resource are allowed. The fluent API of Resource instances, obtained with the resource(sPath) method that we discussed above, exposes the most popular REST API methods ( get , post , put and delete ). They are simply aliases for the generic method() method . 
Whichever we consider, we will receive a ResourceMethod instance from the invocation and its API will allow us to specify processing functions and further specify constraints on the request/response for which they are applicable: rs.resource('').get().produces([\"application/json\"]).serve(function(){}) Alternatively, as we have already seen, we can supply the serve callback function directly as the first argument to the method, which comes in handy if we have nothing more to set up: rs.resource('').get(function(){}) We can also use a configuration object as a third option, and this will be discussed in the Advanced section. The samples here are all for configuring the HTTP GET method, but the usage pattern is the same for all: rs.resource('').post().consumes([\"application/json\"]).serve(function(){}) Shortcuts You may have already noticed that instead of explicitly using serve to configure the callback for serving requests, we can directly provide the function as an argument to the method configuring the HTTP method (e.g. get ). rs.resource('').get(function(){}) rs.resource('').get().serve(function(){}) So why bother provisioning an explicit serve() function in the first place then? The answer is that serve() configures only one of the callback functions that are triggered during the request processing flow. And this shortcut is handy if it is only serve() that you are interested in configuring. Of course, nothing prevents you from using the shortcut and still configuring the other callback functions, unless you find it confusing. These are all valid options. Find out more about configuring request processing callback functions in the section dedicated to this. When the controller API was discussed, it was mentioned that there are shortcut factory methods that combine a couple of operations to directly produce a method handler for a resource path. Example rs . service () . get ( \"\" , function ( ctx , request , response ) { response . print ( 'ok' ); }) . 
execute (); That would be equivalent to the following: rs . service () . resource ( \"\" ) . get ( function ( ctx , request , response ){ response . print ( 'ok' ); }) . execute (); These shortcut methods share the same names as those in Resource that are used for defining HTTP method handlers: get , post , put , delete and its alias remove , but differ in signature (first argument is a string for the resource path) and the return type (the HttpController instance, instead of ResourceMethod). They are useful as a compact mechanism if you intend to build something simple, such as a single resource and one or a few handler functions for it. You will not be able to go much further with this API, so if you consider anything even slightly more sophisticated, you should look into the fluent API of resource instead: rs.service().resource(\"\") . Note Note that the scope of these shortcut methods is the controller, not the resource. That has an effect on method chaining. For clean code, do not confuse them despite the similar names, and avoid mixing them. Defining content types that an API consumes and produces rs . resource ( \"\" ). get (). produces ([ \"application/json\" ]) rs . resource ( \"\" ). post (). consumes ([ \"application/json\" ]) rs . resource ( \"\" ). put (). consumes ([ \"application/json\" ]). produces ([ \"application/json\" ]) Optionally, but also quite likely, we will add some constraints on the data types consumed and produced by the resource method handler that we configure. At request processing runtime, these constraints will be matched for compatibility against the HTTP request headers before delegating to the handler processing function. You can use wildcards (*) in the MIME type arguments, both for type and sub-type, and they will match anything during execution: rs . resource ( \"\" ). post (). consumes ([ \"*/json\" ]) rs . resource ( \"\" ). post (). 
consumes ( \"[*/*]\" ) Request processing flow and events Before we continue, let us take a look at the request processing flow. The request is matched against the resource method handling definitions in the configuration and if there is a compatible one it is elicited for execution. Otherwise, a Bad request error is returned back to the client. The before callback function is invoked if any was configured. The serve callback function is invoked if any was configured. If an Error was thrown from the serve function, a catch callback function is invoked. The callback function is either configured or the default one. A finally (always executed) function is invoked if one was configured. Or in pseudocode: try { before ( ctx , request , response , resourceMethod , controller ); serve ( ctx , request , response , resourceMethod , controller ); } catch ( err ){ catch ( ctx , err , request , response , resourceMethod , controller ); } finally { finally (); } As evident form the flow, it is only the serve event callback handler function that is required to be setup. But if you require fine grained reaction to the other events, you can configure handlers for each of those you are interested in. Currently, the API supports a single handler function per event so in multiple invocation of a setup method on the same resource method only the last will matter. Defining event handling functions resource . get (). before ( function ( ctx , request , response , resourceMethod , controller ){ //Implements pre-processing logic }) resource . get (). serve ( function ( ctx , request , response , resourceMethod , controller ){ //Implements request-processing logic }) resource . get (). catch ( function ( ctx , error , request , response , resourceMethod , controller ){ //Implements error-processing logic overriding the default }) resource . get (). 
finally ( function (){ //Implements post-processing logic regardless of error or success of the serve function }) A valid, executable resource method configuration requires at least the serve callback function to be set up: resource . get (). serve ( function ( ctx , request , response ){ response . println ( 'OK' ); }); The rest are optional and/or have default implementations. Errors thrown from the before and serve callbacks are delegated to the catch callback. There is a default catch callback that sends a formatted error back in the response, and it can be overridden using the catch method to set up different error-processing logic. The finally callback is invoked after the response has been flushed and closed (regardless of error or success) and can be used to clean up resources. Example: rs . service (). resource ( \"\" ) . get () . before ( function ( ctx , request , response ){ request . setHeader ( 'X-arestme-version' , '1.0' ); }) . serve ( function ( ctx , request , response ){ response . println ( 'Serving GET request' ); }) . catch ( function ( ctx , err , request , response ){ console . error ( err . message ); }) . finally ( function (){ console . info ( 'GET request processing finished' ); }) Advanced Using configuration objects Configuration objects are particularly useful when you are enhancing or overriding an existing protocol, so you don't start configuring from scratch but rather amend or change pieces of the configuration. They are also useful when you are dealing with dynamically generated HTTP-based protocol configurations. For example, consider the simple sample that we started with. It is completely identical to this one, which uses a configuration object and provides it to the service function: rs . service ({ \"\" : { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] } }). execute (); It is also completely identical to this one: rs . service () . 
resource ( \"\" , { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] }). execute (); or this one: rs . service () . resource ( \"\" ) . get ([{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }]). execute (); In fact, here is a sample how to define a whole API providing configuration directly to the service method and then enhance it. rs . service ({ \"\" : { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] } }) . resource ( \"\" ) . post () . serve ( function ( ctx , request , response ){ console . info ( request . readText ()); }) . execute (); In this way we essentially are exploiting the fluent API to configure a service but we will not start from scratch. Many of the API methods accept as a second argument configuration object and this doesn't prevent you to continue the API design with fluent API to enhance or override it. The sendError method in HttpController The HttpController class instances that we receive when rs.service() is invoked, features a sendError method. It implements the logic for formatting errors and returning them back to the client taking into account its type and content type preferences. Should you require to change this behavior globally you can redefine the function. If you require different behavior for particular resources or resource method handlers, then using the catch callback is the better approach. Sometimes it's useful to reuse the method and send error in your handler functions. The standard request processing mechanism in HttpController does not account for logical errors. It doesn't know for example that a parameter form a client input is out of valid range. For such cases you would normally implement validation either in before event handler or in serve. And if you need tighter control on what is sent back, e.g. 
the HTTP status code, you wouldn't simply throw an Error but would invoke the sendError function with the right parameters yourself. For these purposes, the last argument of each event handler function is conveniently the controller instance. rs . service (). resource ( \"\" ) . get () . before ( function ( ctx , request , response , methodHandler , controller ){ //check if requested file exists if ( ! file . exists ()){ controller . sendError (); } }) . serve ( function (){ //return file content }) Defining readonly APIs mappings.readonly() An obvious way of defining readonly APIs is to use only GET resource method definitions. In some cases, though, APIs can be created from external configuration that also contains other resource method handlers, or we can receive an API instance from another module factory, or we want to support two instances of the same API, one readonly and one with edit capabilities, with minimal code. In such cases, we already have non-GET resource methods that we have to get rid of somehow. Here the readonly method steps in and does exactly this - it removes all but the GET resource handlers, if any. Example: rs . service () . resource ( \"\" ) . post () . serve ( function (){}) . get () . serve ( function (){}) . readonly () . execute (); If you inspect the configuration after .readonly() is invoked (use resource(\"\").configuration() ) you will notice that the post verb definition is gone. Consequently, POST requests to this resource will end up in Bad Request (400). Note that for this to work, this must be the last configuration action for a resource. Otherwise, you are resetting the resource configuration to readonly, only to define write methods again. The readonly method is available both for ResourceMapping and Resource objects, returned either by invocations of the service mappings() method or from retained references from configuration API invocations. 
Disabling a ResourceMethod Handler api.disable(sPath, sVerb, arrConsumesTypes, arrProducesTypes) Similar to the use cases explored for the readonly method above, you might not be in full control of the API definition, but rather take over at some point. Similar to the readonly method, this one will remove the handler definition identified by the four parameters - resource path, resource verb, consumes constraint array (not necessarily in same order), produces constraint array (not necessarily in same order), but it will do it for any verb, not only GET . In that sense readonly is a specialization of this one only for GET verbs. Example: var mappings = rs . service ({ \"\" : { \"post\" : [{ serve : function (){} }], \"get\" : [{ serve : function (){} }] } }). mappings (); mappings . disable ( \"\" , \"post\" ); With this API definition, invoking mappings.find(\"\",\"get\") will return a reference to the only get handler defined there and you can manage it. Note that you get a reference to the configuration and not an API. Example: // add produces constraint and redefine the serve callback var mappings = rs . service (). get ( function (){}). mappings (); //later in code var handler = mappings . find ( \"\" , \"get\" ); handler . produces = [ \"application/json\" ]; handler . serve = function (){ console . info ( \"I was redefined\" ); }; Executing service with explicit request/response arguments The request and response parameters of the execute method are optional. If you don't supply them, the service will take the request/response objects that were used to request the script. Most of the time this is what you want. However, supplying your own request and response arguments can be very handy for testing as you can easily mock and inspect them. Fluency for execute method The execute method is defined by the service instance (HttpController) obtained with rs.service() and can be executed with: rs.service().execute() . 
The fluent configuration API also provides references to the method, so you can actually invoke it at any stage. Examples: rs . service (). resource ( \"\" ). get ( function (){}). execute () rs . service (). resource ( \"\" ). get (). serve ( function (){}). execute () rs . service (). resource ( \"\" ). get (). produces ([ \"text/json\" ]). serve ( function (){}). execute () rs . service () . resource ( \"\" ) . produces ([ \"application/json\" ]) . get ( function (){}) . resource ( \"\" ) . consumes ([ \"*/json\" ]) . post ( function (){}) . execute () Mappings vs Configurations The API supplies two methods mappings() and configuration() that provide configuration in two forms. The mappings method supplies typed API objects such as Resource aggregating ResourceMethod instances. To get a reference to a service's mappings, invoke mappings on the service instance: rs.service().mappings() With a reference to mappings you have their fluent API at your disposal. This is useful when extending and enhancing the core rs functionality to build dedicated services. For example, the HttpController constructor function is designed to accept mappings, and if you extend or initialize it internally in another API, you will likely need this form of configuration. An invocation of the configuration method, on the other hand, provides the underlying JS configuration object. It can be used to supply generic configurations that are used to initialize new types of services, as the public fluent API is designed to accept this form of configuration. Both represent configuration, but while the mappings are an internal, parsed version, the configuration object is the form that the public API accepts, and is therefore a kind of public counterpart of the internal configuration. It is also possible to convert between the two: rs . service ( jsConfig ). mappings () rs . service (). resource (). 
configuration () Finding a ResourceMethod rs.service().mappings().find(sPath, sMethod, arrConsumesTypes, arrProducesTypes) Suppose you want to redefine a handler definition to, e.g., change the serve callback, add a before handler, change or add to the consumes media types constraint, etc. To do that you need a reference to the handler, which is identified by the four parameters - resource path , resource method , consumes constraint array (not necessarily in same order), produces constraint array (not necessarily in same order). On a successful search hit you get a reference to the handler definition and can perform changes on it. Example: rs . service () . resource ( \"\" ) . get ( function (){}); With this API definition, invoking rs.service().mappings().find(\"\",\"get\") will return a reference to the only get handler defined there and you can manage it. Note that you get a reference to the configuration and not an API. Example: // add produces constraint and redefine the serve callback var handler = svc . mappings (). find ( \"\" , \"get\" ); handler . produces = [ \"application/json\" ]; handler . serve = function (){ console . info ( \"I was redefined\" ); }; With consumes and produces constraints on a resource method handler, getting a reference will require them to be specified too. Example: var svc = rs . service (); svc . mappings () . resource ( \"\" ) . post () . consumes ([ 'application/json' , 'text/json' ]) . produces ([ 'application/json' ]) . serve ( function (){}); var handler = svc . mappings (). find ( \"\" , \"post\" , [ 'text/json' , 'application/json' ], [ 'application/json' ]); Note that the order of the MIME type string values in the consumes/produces array parameters is not significant. They will be sorted before matching the sorted corresponding arrays in the definition. Configuring resource with JS object Having defined a resource with a path, we have two options for configuring it. 
We can proceed using its fluent API or we can provision a configuration JS object as a second argument to the resource method and have it done in one step. Considering the latter, we will be provisioning configuration for this resource only, so it should be an object with method definitions as root members. rs . service () . resource ( \"\" , { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] }). execute (); Refer to the next sections for a comparison of how to achieve the same using the fluent API and/or configuration objects at the lower levels. JS Configuration object schema In progress. Check back later. Schema: { pathString : { methodString : [{ \"consumes\" : [ \"type/subtype|*/subtype|type/*|*/*\" ], \"produces\" : [ \"type/subtype|*/subtype|type/*|*/*\" ], \"before\" : Function , \"serve\" : Function , \"catch\" : Function , \"finally\" : Function }] } } pathString is a string that represents the resource path. There could be 0 or more such non-overlapping members. methodString is a string for the HTTP resource method. There could be 0 or more such non-overlapping members. The value of methodString is an array of 0 or more objects, each defining request method processing that will be executed under unique conditions (constraints) that match the request. A component in the methodString array can consist of constraints (consumes, produces) and request processing flow event handlers (before, serve, catch, finally). The consumes value is an array of 0 or more strings, each a valid MIME type string formatted as type/subtype. Can be undefined. The produces value is an array of 0 or more strings, each a valid MIME type string formatted as type/subtype. Can be undefined. The before , serve , catch and finally values are functions. Except for the serve function, the rest can be undefined . Building a CRUD REST service The code snippet below shows a sample design for a REST API for simple CRUD file operations. 
It is for illustrative purposes. The service works with files in the HOME directory of the user currently running the Dirigible instance. Users can create, read, update and delete files by sending corresponding POST, GET, PUT and DELETE requests using the file name as a path segment (e.g. /services/js/file-service.js/test.json ) and they can also upload files if they don't specify a file name but send a multipart/form-data POST request directly to the service (e.g. /services/js/file-service.js ). Note how the before handler is used to validate that the user has permissions on the resources and how it makes use of the controller's sendError method. var LOGGER = require ( \"log/v4/logging\" ). getLogger ( 'http.filesvc' ); var rs = require ( \"http/v4/rs\" ); var upload = require ( 'http/v4/upload' ); var files = require ( 'io/v4/files' ); var user = require ( 'security/v4/user' ); var env = require ( 'core/v4/env' ); var validateRequest = function ( permissions , ctx , request , response , methodHandler , controller ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; if ( ! files . exists ( filePath )){ LOGGER . info ( \"Requested file \" + filePath + \" does not exist.\" ); controller . sendError ( response . NOT_FOUND , undefined , response . HttpCodesReasons . getReason ( String ( response . NOT_FOUND )), ctx . pathParameters . fileName + \" does not exist.\" ); return ; } if ( permissions ){ var resourcePermissions = files . getPermissions ( filePath ); if ( resourcePermissions !== null && resourcePermissions . indexOf ( permissions ) < 0 ){ var loggedUser = user . getName (); LOGGER . error ( \"User {} does not have sufficient permissions[{}] for {}\" , loggedUser , files . getPermissions ( filePath ), filePath ); controller . sendError ( response . UNAUTHORIZED , undefined , response . HttpCodesReasons . getReason ( String ( response . UNAUTHORIZED )), \"User \" + loggedUser + \" does not have sufficient permissions for \" + ctx . 
pathParameters . fileName ); return ; } } LOGGER . debug ( 'validation successful' ); }; var postProcess = function ( operationName ){ LOGGER . info ( \"{} operation finished\" , operationName ); }; rs . service () . resource ( \"\" ) . post ( function ( ctx , request , response ){ var fileItems = upload . parseRequest (); for ( var i = 0 ; i < fileItems . size (); i ++ ) { var filePath = env . get ( 'HOME' ) + '/' ; var content ; var fileItem = fileItems . get ( i ); if ( ! fileItem . isFormField ()) { filePath += fileItem . getName (); content = String . fromCharCode . apply ( null , fileItem . getBytes ()); } else { filePath += fileItem . getFieldName (); content = fileItem . getText (); } LOGGER . debug ( \"Creating file \" + filePath ); files . writeText ( filePath , content ); } response . setStatus ( response . CREATED ); }) . before ( function ( ctx , request , response , methodHandler , controller ){ var loggedUser = user . getName (); if ( files . getOwner ( ctx . pathParameters . fileName ) !== loggedUser ) controller . sendError ( response . UNAUTHORIZED , undefined , response . HttpCodesReasons . getReason ( String ( response . UNAUTHORIZED )), loggedUser + \" is not the owner of \" + ctx . pathParameters . fileName ); }) . finally ( postProcess . bind ( this , \"Upload\" )) . consumes ([ \"multipart/form-data\" ]) . resource ( \"{fileName}\" ) . post ( function ( ctx , request , response ){ var content = request . getText (); var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Creating file \" + filePath ); files . writeText ( filePath , content ); files . setPermissions ( filePath , 'rw' ); response . setStatus ( response . CREATED ); }) . finally ( postProcess . bind ( this , \"Create\" )) . consumes ([ \"application/json\" ]) . get ( function ( ctx , request , response ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . 
debug ( \"Reading file \" + filePath ); var content = files . readText ( filePath ); response . setStatus ( response . OK ); response . print ( content ); }) . before ( validateRequest . bind ( this , 'r' )) . finally ( postProcess . bind ( this , \"Read\" )) . produces ([ \"application/json\" ]) . put ( function ( ctx , request , response ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Updating file \" + filePath ); var content = request . getText (); files . deleteFile ( filePath ); files . writeText ( filePath , content ); response . setStatus ( response . ACCEPTED ); }) . finally ( postProcess . bind ( this , \"Update\" )) . before ( validateRequest . bind ( this , 'rw' )) . consumes ([ \"application/json\" ]) . remove ( function ( ctx , request , response ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Removing file \" + filePath ); files . deleteFile ( filePath ); response . setStatus ( response . NO_CONTENT ); }) . before ( validateRequest . bind ( this , 'w' )) . finally ( postProcess . bind ( this , \"Delete\" )) . execute (); You can find the complete documentation for http/rs and http/rs-data under the API page .","title":"REST"},{"location":"development/concepts/rest/#rest","text":"The http-rs module is designed to define and run a broad range of HTTP REST services. A very simple example hello-api.js : var rs = require ( \"http/v4/rs\" ); // serve GET HTTP requests sent to resource path \"\" (i.e. directly to hello-api.js) rs . service () . resource ( \"\" ) . get ( function ( ctx , request , response ){ response . println ( \"Hello there!\" ); }) . 
execute (); Hosting the hello-api.js code above in test/hello-api.js and sending a GET request to /services/v4/js/test/hello-api.js will return the response body: Hello there!","title":"REST"},{"location":"development/concepts/rest/#overview","text":"Let\u2019s have a closer look at the methods shown in the example above. First, we requested a new REST service instance from the framework: rs.service() Next, we configured the instance to serve HTTP GET requests sent to the root path (\"\") using the supplied function: . resource ( \"\" ) . get ( function ( ctx , request , response ){ response . println ( \"Hello there!\" ); }) Technically, configuration is not required to execute a service, but obviously it will do nothing if you don't instruct it what to do. Finally, we run the service and it processes the HTTP request: .execute(); Now, this is a fairly simplistic example aiming to give you a hint of how you can bring a REST API to life with http-rs. There is a whole lot more that we shall explore in the next sections.","title":"Overview"},{"location":"development/concepts/rest/#creating-rest-services","text":"rs.service() Creating new service instances is as simple as invoking rs.service() . That returns a configurable and/or executable instance of the HttpController class. The controller API allows you to: - start configuring a REST service (method resource() ) - serve requests (method execute() ) - perform a couple of more advanced activities, which will be reviewed in the Advanced section below Additionally, the controller API also features shortcut factory methods that are useful for simple configurations (like the one in our initial example) such as get(sPath, fServe, arrConsume, arrProduces) . 
Read below for more examples of how to use the methods.","title":"Creating REST services"},{"location":"development/concepts/rest/#serving-requests","text":"execute() The mechanism for serving requests is implemented in the execute() method of the HttpController. It tries to match the request to the service API configuration. If the mechanism matches the request successfully, it triggers the execution flow of the callback functions. The execution flow processes the request and response. If the mechanism doesn't match the request successfully, it sends a Bad Request error to the client. The request and response objects are implicitly those that were used to request the script where the execute() method invocation occurred. But they can be exchanged for others as shown in the Advanced section. The execute() method is defined in the service instance (class HttpController) obtained with rs.service() . The execute() method can be triggered with rs.service().execute() . The rs API configuration also provides numerous references to the method so you can invoke it at any stage. For example, rs . service (). get ( \"\" ). execute () rs . service (). resource ( \"\" ). get (). execute () rs . service (). resource ( \"\" ). get (). produces ([ \"text/json\" ]). execute () are all valid ways to serve requests. What you need to consider is that execute() must be the final method invocation. Even if you retain a reference to a configuration object and change it after that, it will be irrelevant since the response will be flushed and closed by then.","title":"Serving requests"},{"location":"development/concepts/rest/#configuring-services","text":"There are three options as far as configuration is concerned. You can start from scratch and build the configuration using the rs API. You can use configuration objects. They hold the configuration that the rs API produces. You can start with a configuration object and then enhance or override the configuration using the rs API. 
Configuration objects A configuration object is a JS object with canonical structure that http-rs can interpret. We will discuss its schema later on in this guide. For now, let's just consider that it's the same thing that the rs-fluent API will actually produce behind the scenes, so it's a completely valid alternative and complement to the rs-fluent API configuration approach. Refer to the Advanced section for more details on using configuration objects.","title":"Configuring services"},{"location":"development/concepts/rest/#defining-service-resources","text":"resource(sPath, oConfiguration?) Resources are the top-level configuration objects that represent an HTTP (server) resource , for which we will be defining a protocol. Each resource is identified by a URL on the server. You can have multiple resources per service configuration, provided that their URLs do not overlap. Resource vs Path vs Resource Path As per the REST terms, a resource is an abstraction of a server-side resource that can be a file, dynamically generated content, or a procedure (although the last is considered heresy by purists). It's virtually anything hosted on a server that has an address and can be accessed with a standard HTTP method. It is often referred to as \"path\" or \"resource path\" due to its singular most notable identifying characteristic. But to be precise, \"path\" is only a property of the resource. As far as configuration is concerned, the resource defines the configuration scope for which we define method handlers and constraints, and is identifiable by its \"path\" property.","title":"Defining service resources"},{"location":"development/concepts/rest/#resource-paths-and-path-templates","text":"The sPath string parameter (mandatory) of the resource() method will serve as the resource URL. It is relative to the location where the JavaScript service is running (e.g. /services/v4/my-application/api/my-service.js ). 
With no path ( \"\" ), requests go directly to the JavaScript service root path. The path can also be a URL template, i.e. parameterized. For example, consider the path template: {id}/assets/{assetType}/{name} This will resolve request paths such as: /services/js/test.js/1/assets/longterm/building to service path: 1/assets/longterm/building If a request is matched to such a path, the service mechanism will provide the resolved parameters as an object map to the function that handles the request. Using the sample path above, the path parameters object will look like this: { \"id\" : 1 , \"assetType\" : \"longterm\" , \"name\" : \"building\" }","title":"Resource paths and path templates"},{"location":"development/concepts/rest/#defining-http-methods-allowed-for-a-resource","text":"resource . get () resource . post () resource . put () resource [ \"delete\" ]() and resource . remove () resource . method () By default, only the HTTP request methods that you have configured for a resource are allowed. The fluent API of Resource instances, obtained with the resource(sPath) method that we discussed above, exposes the most popular REST API methods ( get , post , put and delete ). They are simply aliases for the generic method method . Whichever we consider, we will receive a ResourceMethod instance from the invocation and its API will allow us to specify processing functions and further specify constraints on the request/response for which they are applicable: rs.resource('').get().produces([\"application/json\"]).serve(function(){}) Alternatively, as we have already seen, we can supply the serve callback function directly as the first argument to the method, which comes in handy if we have nothing more to set up: rs.resource('').get(function(){}) We can also use a configuration object as a third option, and this will be discussed in the Advanced section. 
The samples here are all for configuring the HTTP GET method, but the usage pattern is still the same for all: rs.resource('').post().consumes([\"application/json\"]).serve(function(){})","title":"Defining HTTP methods allowed for a resource"},{"location":"development/concepts/rest/#shortcuts","text":"You already noticed that instead of explicitly using serve to configure the callback for serving requests, we could directly provide the function as an argument to the method configuring the HTTP method (e.g. get ). rs.resource('').get(function(){}) rs.resource('').get().serve(function(){}) So why bother provisioning an explicit serve() function in the first place then? The answer is that serve() configures only one of the callback functions that are triggered during the request processing flow. And this shortcut is handy if it is only serve() that you are interested in configuring. Of course, nothing prevents you from using the shortcut and still configuring the other callback functions, unless you find it confusing. These are all valid options. Find out more about configuring request processing callback functions in the section dedicated to this. When the controller API was discussed, it was mentioned that there are shortcut factory methods that combine a couple of operations to directly produce a method handler for a resource path. Example: rs . service () . get ( \"\" , function ( ctx , request , response ) { response . print ( 'ok' ); }) . execute (); That would be equivalent to the following: rs . service () . resource ( \"\" ) . get ( function ( ctx , request , response ){ response . print ( 'ok' ); }) . execute (); These shortcut methods share the same names as those in Resource that are used for defining HTTP method handlers: get , post , put , delete and its alias remove , but differ in signature (first argument is a string for the resource path) and the return type (the HttpController instance, instead of ResourceMethod). 
They are useful as a compact mechanism if you intend to build something simple, such as a single resource with one or a few handler functions for it. You will not be able to go much further with this API, so if you consider anything even slightly more sophisticated, you should look into the fluent API of resource instead: rs.service().resource(\"\") . Note Note that the scope of these shortcut methods is the controller, not the resource. That has an effect on method chaining. For clean code, do not confuse the two despite the similar names, and avoid mixing them.","title":"Shortcuts"},{"location":"development/concepts/rest/#defining-content-types-that-an-api-consumes-and-produces","text":"rs . resource ( \"\" ). get (). produces ([ \"application/json\" ]) rs . resource ( \"\" ). post (). consumes ([ \"application/json\" ]) rs . resource ( \"\" ). put (). consumes ([ \"application/json\" ]). produces ([ \"application/json\" ]) Optionally, but also quite likely, we will add some constraints on the data type consumed and produced by the resource method handler that we configure. At request processing runtime, these constraints will be matched for compatibility against the HTTP request headers before delegating to the handler processing function. You can use wildcards (*) in the MIME type arguments, both for type and subtype, and they will match anything during execution: rs . resource ( \"\" ). post (). consumes ([ \"*/json\" ]) rs . resource ( \"\" ). post (). consumes ([ \"*/*\" ])","title":"Defining content types that an API consumes and produces"},{"location":"development/concepts/rest/#request-processing-flow-and-events","text":"Before we continue, let us take a look at the request processing flow. The request is matched against the resource method handling definitions in the configuration, and if there is a compatible one, it is selected for execution. Otherwise, a Bad Request error is returned to the client. 
The before callback function is invoked if any was configured. The serve callback function is invoked if any was configured. If an Error was thrown from the before or serve function, a catch callback function is invoked. The callback function is either the configured one or the default one. A finally (always executed) function is invoked if one was configured. Or in pseudocode: try { before ( ctx , request , response , resourceMethod , controller ); serve ( ctx , request , response , resourceMethod , controller ); } catch ( err ){ catch ( ctx , err , request , response , resourceMethod , controller ); } finally { finally (); } As evident from the flow, it is only the serve event callback handler function that is required to be set up. But if you require fine-grained reaction to the other events, you can configure handlers for each of those you are interested in. Currently, the API supports a single handler function per event, so if a setup method is invoked multiple times on the same resource method, only the last invocation will matter.","title":"Request processing flow and events"},{"location":"development/concepts/rest/#defining-event-handling-functions","text":"resource . get (). before ( function ( ctx , request , response , resourceMethod , controller ){ //Implements pre-processing logic }) resource . get (). serve ( function ( ctx , request , response , resourceMethod , controller ){ //Implements request-processing logic }) resource . get (). catch ( function ( ctx , error , request , response , resourceMethod , controller ){ //Implements error-processing logic overriding the default }) resource . get (). finally ( function (){ //Implements post-processing logic regardless of error or success of the serve function }) A valid, executable resource method configuration requires at least the serve callback function to be set up: resource . get (). serve ( function ( ctx , request , response ){ response . println ( 'OK' ); }); The rest are optional and/or have default implementations. 
Errors thrown from the before and serve callbacks are delegated to the catch callback. There is a default catch callback that sends a formatted error back in the response and it can be overridden using the catch method to set up another error processing logic. The finally callback is invoked after the response has been flushed and closed (regardless of error or success) and can be used to clean up resources. Example: rs . service (). resource ( \"\" ) . get () . before ( function ( ctx , request , response ){ request . setHeader ( 'X-arestme-version' , '1.0' ); }) . serve ( function ( ctx , request , response ){ response . println ( 'Serving GET request' ); }) . catch ( function ( ctx , err , request , response ){ console . error ( err . message ); }) . finally ( function (){ console . info ( 'GET request processing finished' ); })","title":"Defining event handling functions"},{"location":"development/concepts/rest/#advanced","text":"","title":"Advanced"},{"location":"development/concepts/rest/#using-configuration-objects","text":"Configuration objects are particularly useful when you are enhancing or overriding an existing protocol, so you don't start configuring from scratch but rather amend or change pieces of the configuration. They are also useful when you are dealing with dynamically generated HTTP-based protocol configurations. For example, consider the simple sample that we started with. It is completely identical to this one, which uses a configuration object and provides it to the service function: rs . service ({ \"\" : { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] } }). execute (); It is also completely identical to this one: rs . service () . resource ( \"\" , { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] }). execute (); or this one: rs . service () . resource ( \"\" ) . 
get ([{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }]). execute (); In fact, here is a sample of how to define a whole API by providing configuration directly to the service method and then enhancing it. rs . service ({ \"\" : { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] } }) . resource ( \"\" ) . post () . serve ( function ( ctx , request , response ){ console . info ( request . readText ()); }) . execute (); In this way we are essentially using the fluent API to configure a service without starting from scratch. Many of the API methods accept a configuration object as a second argument, and this doesn't prevent you from continuing the API design with the fluent API to enhance or override it.","title":"Using configuration objects"},{"location":"development/concepts/rest/#the-senderror-method-in-httpcontroller","text":"The HttpController class instances that we receive when rs.service() is invoked feature a sendError method. It implements the logic for formatting errors and returning them back to the client, taking into account the client's type and content type preferences. Should you require to change this behavior globally, you can redefine the function. If you require different behavior for particular resources or resource method handlers, then using the catch callback is the better approach. Sometimes it's useful to reuse the method and send errors in your handler functions. The standard request processing mechanism in HttpController does not account for logical errors. It doesn't know, for example, that a parameter from the client input is out of valid range. For such cases you would normally implement validation either in the before event handler or in serve. And if you need tighter control on what is sent back, e.g. the HTTP status code, you wouldn't simply throw an Error but would invoke the sendError function with the right parameters yourself. 
For these purposes the last argument of each event handler function is conveniently the controller instance. rs . service (). resource ( \"\" ) . get () . before ( function ( ctx , request , response , methodHandler , controller ){ //check if the requested file exists if ( ! file . exists ()){ controller . sendError (); } }) . serve ( function (){ //return file content })","title":"The sendError method in HttpController"},{"location":"development/concepts/rest/#defining-readonly-apis","text":"mappings.readonly() An obvious way of defining readonly APIs is to use only GET resource method definitions. In some cases though, APIs can be created from external configuration that also contains other resource method handlers, or we can receive an API instance from another module factory, or we want to support two instances of the same API, one readonly and one with edit capabilities, with minimal code. In such cases, we already have non-GET resource methods that we have to get rid of somehow. Here the readonly method steps in and does exactly this - it removes all but the GET resource handlers, if any. Example: rs . service () . resource ( \"\" ) . post () . serve ( function (){}) . get () . serve ( function (){}) . readonly () . execute (); If you inspect the configuration after .readonly() is invoked (use resource(\"\").configuration() ) you will notice that the post verb definition is gone. Consequently, POST requests to this resource will end up in Bad Request (400). Note that for this to work, this must be the last configuration action for a resource. Otherwise, you are resetting the resource configuration to readonly, only to define write methods again. 
The readonly method is available both for ResourceMapping and Resource objects, obtained either by invoking the service mappings() method or from retained references from configuration API invocations.","title":"Defining readonly APIs"},{"location":"development/concepts/rest/#disabling-a-resourcemethod-handler","text":"api.disable(sPath, sVerb, arrConsumesTypes, arrProducesTypes) Similar to the use cases explored for the readonly method above, you might not be in full control of the definition of the API, but rather take over at some point. Similar to the readonly method, this one will remove the handler definition identified by the four parameters - resource path, resource verb, consumes constraint array (not necessarily in the same order), produces constraint array (not necessarily in the same order), but it will do it for any verb, not only GET . In that sense, readonly is a specialization of this method for GET verbs only. Example: var mappings = rs . service ({ \"\" : { \"post\" : [{ serve : function (){} }], \"get\" : [{ serve : function (){} }] } }). mappings (); mappings . disable ( \"\" , \"post\" ); With this API definition, invoking mappings.find(\"\",\"get\") will return a reference to the only get handler defined there and you can manage it. Note that you get a reference to the configuration and not an API. Example: // add produces constraint and redefine the serve callback var mappings = rs . service (). get ( function (){}). mappings (); //later in code var handler = mappings . find ( \"\" , \"get\" ); handler . produces = [ \"application/json\" ]; handler . serve = function (){ console . info ( \"I was redefined\" ); };","title":"Disabling a ResourceMethod Handler"},{"location":"development/concepts/rest/#executing-service-with-explicit-requestresponse-arguments","text":"The request and response parameters of the execute method are optional. If you don't supply them, the service will take the request/response objects that were used to request the script. 
Most of the time this is what you want. However, supplying your own request and response arguments can be very handy for testing, as you can easily mock and inspect them.","title":"Executing service with explicit request/response arguments"},{"location":"development/concepts/rest/#fluency-for-execute-method","text":"The execute method is defined by the service instance (HttpController) obtained with rs.service() and can be executed with: rs.service().execute() . The fluent configuration API also provides references to the method, so you can actually invoke it at any stage. Examples: rs . service (). resource ( \"\" ). get ( function (){}). execute () rs . service (). resource ( \"\" ). get (). serve ( function (){}). execute () rs . service (). resource ( \"\" ). get (). produces ([ \"text/json\" ]). serve ( function (){}). execute () rs . service () . resource ( \"\" ) . produces ([ \"application/json\" ]) . get ( function (){}) . resource ( \"\" ) . consumes ([ \"*/json\" ]) . post ( function (){}) . execute ()","title":"Fluency for execute method"},{"location":"development/concepts/rest/#mappings-vs-configurations","text":"The API supplies two methods, mappings() and configuration() , that provide configuration in two forms. The mappings method supplies typed API objects, such as Resource aggregating ResourceMethod instances. To get a reference to a service's mappings, invoke mappings on the service instance: rs.service().mappings() With a reference to the mappings, you have their fluent API at your disposal. This is useful when extending and enhancing the core rs functionality to build dedicated services. For example, the HttpController constructor function is designed to accept mappings, and if you extend or initialize it internally in another API, you will likely need this form of configuration. An invocation of the configuration method, on the other hand, provides the underlying JS configuration object. 
It can be used to supply generic configurations that are used to initialize new types of services, as the public fluent API is designed to accept this form of configuration. Both represent configuration, but while the mappings are a sort of internal, parsed version, the configuration object is the form that the public API accepts; it is therefore a kind of advanced, public form of the internal configuration. It is also possible to convert between the two: rs . service ( jsConfig ). mappings () rs . service (). resource (). configuration ()","title":"Mappings vs Configurations"},{"location":"development/concepts/rest/#finding-a-resourcemethod","text":"rs.service().mappings().find(sPath, sMethod, arrConsumesTypes, arrProducesTypes) Suppose you want to redefine a handler definition to, e.g., change the serve callback, add a before handler, or change or add to the consumes media types constraint. To do that you need a reference to the handler, which is identified by the four parameters - resource path , resource method , consumes constraint array (not necessarily in the same order), produces constraint array (not necessarily in the same order). On a successful search hit, you get a reference to the handler definition and can perform changes on it. Example: rs . service () . resource ( \"\" ) . get ( function (){}); With this API definition, invoking rs.service().mappings().find(\"\",\"get\") will return a reference to the only get handler defined there and you can manage it. Note that you get a reference to the configuration and not an API. Example: // add produces constraint and redefine the serve callback var handler = svc . mappings (). find ( \"\" , \"get\" ); handler . produces = [ \"application/json\" ]; handler . serve = function (){ console . info ( \"I was redefined\" ); }; With consumes and produces constraints on a resource method handler, getting a reference will require them to be specified too. Example: var svc = rs . service (); svc . mappings () . resource ( \"\" ) . 
post () . consumes ([ 'application/json' , 'text/json' ]) . produces ([ 'application/json' ]) . serve ( function (){}); var handler = svc . mappings (). find ( \"\" , \"post\" , [ 'text/json' , 'application/json' ], [ 'application/json' ]); Note that the order of the MIME type string values in the consumes/produces array parameters is not significant. They will be sorted before matching the sorted corresponding arrays in the definition.","title":"Finding a ResourceMethod"},{"location":"development/concepts/rest/#configuring-resource-with-js-object","text":"Having defined a resource with a path, we have two options for configuring it. We can proceed using its fluent API, or we can provision a configuration JS object as a second argument to the resource method and have it done in one step. Considering the latter, we will be provisioning configuration for this resource only, so it should be an object with method definitions as root members. rs . service () . resource ( \"\" , { \"get\" : [{ \"serve\" : function ( ctx , request , response ){ response . println ( \"Hello there!\" ); } }] }). execute (); Refer to the next sections for a comparison of how to achieve the same result using the fluent API and/or configuration objects at the lower levels.","title":"Configuring resource with JS object"},{"location":"development/concepts/rest/#js-configuration-object-schema","text":"In progress. Check back later. Schema: { pathString : { methodString : [{ \"consumes\" : [ \"type/subtype|*/subtype|type/*|*/*\" ] , \"produces\" : [ \"type/subtype|*/subtype|type/*|*/*\" ] , \"before\" : Function , \"serve\" : Function , \"catch\" : Function , \"finally\" : Function }] } } pathString is a string that represents the resource path. There could be 0 or more such non-overlapping members. methodString is a string for the HTTP resource method. There could be 0 or more such non-overlapping members. 
The value of methodString is an array of 0 or more objects, each defining request method processing that will be executed under unique conditions (constraints) that match the request. A component in the methodString array can consist of constraints (consumes, produces) and request processing flow event handlers (before, serve, catch, finally). consumes value is an array of 0 or more strings, each a valid MIME type string formatted as type/subtype. Can be undefined. produces value is an array of 0 or more strings, each a valid MIME type string formatted as type/subtype. Can be undefined. before , serve , catch and finally values are functions. Except for the serve function, the rest can be undefined .","title":"JS Configuration object schema"},{"location":"development/concepts/rest/#building-a-crud-rest-service","text":"The code snippet below shows a sample design for a REST API for simple CRUD file operations. It is for illustrative purposes. The service is designed to work with files in the HOME directory of the user that currently runs the Dirigible instance. Users can create, read, update and delete files by sending corresponding POST, GET, PUT and DELETE requests using the file name as a path segment (e.g. /services/js/file-service.js/test.json ) and they can also upload files if they don't specify a file name but send a multipart/form-data POST request directly to the service (e.g. /services/js/file-service.js ). Note how the before handler is used to validate that the user has permissions on resources and how it makes use of the controller's sendError method. var LOGGER = require ( \"log/v4/logging\" ). getLogger ( 'http.filesvc' ); var rs = require ( \"http/v4/rs\" ); var upload = require ( 'http/v4/upload' ); var files = require ( 'io/v4/files' ); var user = require ( 'security/v4/user' ); var env = require ( 'core/v4/env' ); var validateRequest = function ( permissions , ctx , request , response , methodHandler , controller ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . 
pathParameters . fileName ; if ( ! files . exists ( filePath )){ LOGGER . info ( \"Requested file \" + filePath + \" does not exist.\" ); controller . sendError ( response . NOT_FOUND , undefined , response . HttpCodesReasons . getReason ( String ( response . NOT_FOUND )), ctx . pathParameters . fileName + \" does not exist.\" ); return ; } if ( permissions ){ var resourcePermissions = files . getPermissions ( filePath ); if ( resourcePermissions !== null && resourcePermissions . indexOf ( permissions ) < 0 ){ var loggedUser = user . getName (); LOGGER . error ( \"User {} does not have sufficient permissions[{}] for {}\" , loggedUser , files . getPermissions ( filePath ), filePath ); controller . sendError ( response . UNAUTHORIZED , undefined , response . HttpCodesReasons . getReason ( String ( response . UNAUTHORIZED )), \"User \" + loggedUser + \" does not have sufficient permissions for \" + ctx . pathParameters . fileName ); return ; } } LOGGER . info ( 'validation successful' ); }; var postProcess = function ( operationName ){ LOGGER . info ( \"{} operation finished\" , operationName ); }; rs . service () . resource ( \"\" ) . post ( function ( ctx , request , response ){ var fileItems = upload . parseRequest (); for ( var i = 0 ; i < fileItems . size (); i ++ ) { var filePath = env . get ( 'HOME' ) + '/' ; var content ; var fileItem = fileItems . get ( i ); if ( ! fileItem . isFormField ()) { filePath += fileItem . getName (); content = String . fromCharCode . apply ( null , fileItem . getBytes ()); } else { filePath += fileItem . getFieldName (); content = fileItem . getText (); } LOGGER . debug ( \"Creating file \" + filePath ); files . writeText ( filePath , content ); } response . setStatus ( response . CREATED ); }) . before ( function ( ctx , request , response , methodHandler , controller ){ var loggedUser = user . getName (); if ( files . getOwner ( ctx . pathParameters . fileName ) !== loggedUser ) controller . sendError ( response . 
UNAUTHORIZED , undefined , response . HttpCodesReasons . getReason ( String ( response . UNAUTHORIZED )), loggedUser + \" is not owner of \" + ctx . pathParameters . fileName ); }) . finally ( postProcess . bind ( this , \"Upload\" )) . consumes ([ \"multipart/form-data\" ]) . resource ( \"{fileName}\" ) . post ( function ( ctx , request , response ){ var content = request . getText (); var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Creating file \" + filePath ); files . writeText ( filePath , content ); files . setPermissions ( filePath , 'rw' ); response . setStatus ( response . CREATED ); }) . finally ( postProcess . bind ( this , \"Create\" )) . consumes ([ \"application/json\" ]) . get ( function ( ctx , request , response ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Reading file \" + filePath ); var content = files . readText ( filePath ); response . setStatus ( response . OK ); response . print ( content ); }) . before ( validateRequest . bind ( this , 'r' )) . finally ( postProcess . bind ( this , \"Read\" )) . produces ([ \"application/json\" ]) . put ( function ( ctx , request , response ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Updating file \" + filePath ); var content = request . getText (); files . deleteFile ( filePath ); files . writeText ( filePath , content ); response . setStatus ( response . ACCEPTED ); }) . finally ( postProcess . bind ( this , \"Update\" )) . before ( validateRequest . bind ( this , 'rw' )) . consumes ([ \"application/json\" ]) . remove ( function ( ctx , request , response ){ var filePath = env . get ( 'HOME' ) + '/' + ctx . pathParameters . fileName ; LOGGER . debug ( \"Removing file \" + filePath ); files . deleteFile ( filePath ); response . setStatus ( response . NO_CONTENT ); }) . before ( validateRequest . bind ( this , 'w' )) . finally ( postProcess . 
bind ( this , \"Delete\" )) . execute (); You can find the complete documentation for http/rs and http/rs-data under the API page .","title":"Building a CRUD rest service"},{"location":"development/concepts/web-content/","text":"Web Content Overview The Web content includes all the static client-side resources, such as HTML files, CSS, and related theme ingredients, as well as the dynamic scripts and the images. In general, a Web content adapter plays the role of a tunnel that takes the desired resource location from the request path, loads the corresponding content from the repository, and sends it back without any modification. By default, the Web content adapter accepts requests to particular resources and responds with an error code to requests to whole collections. This way, the Web content adapter indicates that folder listing is forbidden. Note If the specific application/json Accept header is supplied with the request itself, then a JSON formatted array with sub-folders and resources will be returned. To boost developer productivity in the most common cases, we provide a set of templates that can help during UI creation. There is a set of templates that can be used with the entity services , a list of entities, master-detail, input form, and so on. The other templates can be used as utilities for the creation of an application shell in index.html with a main menu, or as samples that show the most common controls on different AJAX UI frameworks, such as jQuery , Bootstrap , AngularJS , and OpenUI5 .","title":"Web Content"},{"location":"development/concepts/web-content/#web-content","text":"","title":"Web Content"},{"location":"development/concepts/web-content/#overview","text":"The Web content includes all the static client-side resources, such as HTML files, CSS, and related theme ingredients, as well as the dynamic scripts and the images. 
In general, a Web content adapter plays the role of a tunnel that takes the desired resource location from the request path, loads the corresponding content from the repository, and sends it back without any modification. By default, the Web content adapter accepts requests to particular resources and responds with an error code to requests to whole collections. This way, the Web content adapter indicates that folder listing is forbidden. Note If the specific application/json Accept header is supplied with the request itself, then a JSON formatted array with sub-folders and resources will be returned. To boost developer productivity in the most common cases, we provide a set of templates that can help during UI creation. There is a set of templates that can be used with the entity services , a list of entities, master-detail, input form, and so on. The other templates can be used as utilities for the creation of an application shell in index.html with a main menu, or as samples that show the most common controls on different AJAX UI frameworks, such as jQuery , Bootstrap , AngularJS , and OpenUI5 .","title":"Overview"},{"location":"development/concepts/workspace/","text":"Workspace The workspace is the developer's place where you create and manage the application artifacts. The first-level citizens of the workspace are the projects. Each project can contain multiple folders and files (artifacts). A single user can have multiple workspaces that contain different sets of projects. Operations on the artifacts, i.e. project management, can be done via the views and editors in the Workbench perspective .","title":"Workspace"},{"location":"development/concepts/workspace/#workspace","text":"The workspace is the developer's place where you create and manage the application artifacts. The first-level citizens of the workspace are the projects. Each project can contain multiple folders and files (artifacts). A single user can have multiple workspaces that contain different sets of projects. 
Operations on the artifacts, i.e. project management, can be done via the views and editors in the Workbench perspective .","title":"Workspace"},{"location":"development/extensions/","text":"Extensions Overview Extensibility Extensibility is an important requirement for business applications built to follow custom processes in Line of Business (LoB) areas. In the cloud toolkit, a generic description of the extension points and extensions is provided without explicitly defining the contract. This is a simple but powerful way to define extensions. To learn more about the Extensions concept, click here Extension Points IDE ide-perspective ide-view ide-editor ide-template ide-menu ide-themes ide-workspace-menu-new-template api-modules ide-operations-menu ide-documents-content-type ide-documents-menu ide-git-menu ide-terminal-menu ide-discussions-menu ide-database-menu ide-repository-menu Server ide-workspace-on-save ide-workspace-before-publish ide-workspace-after-publish ide-workspace-before-unpublish ide-workspace-after-unpublish Events IDE editor.file.saved editor.file.dirty status.message status.caret status.error database.database.selection.changed database.datasource.selection.changed database.sql.execute database.sql.run git.repository.run workspace.file.selected workspace.file.created workspace.file.open workspace.file.pull workspace.file.deleted workspace.file.renamed workspace.file.moved workspace.file.copied workspace.file.properties workspace.file.published workspace.project.exported repository.resource.selected repository.resource.created repository.resource.open repository.resource.deleted","title":"Extensions Overview"},{"location":"development/extensions/#extensions-overview","text":"","title":"Extensions Overview"},{"location":"development/extensions/#extensibility","text":"Extensibility is an important requirement for business applications built to follow custom processes in Line of Business (LoB) areas. 
In the cloud toolkit, a generic description of the extension points and extensions is provided without explicitly defining the contract. This is a simple but powerful way to define extensions. To learn more about the Extensions concept, click here","title":"Extensibility"},{"location":"development/extensions/#extension-points","text":"","title":"Extension Points"},{"location":"development/extensions/#ide","text":"ide-perspective ide-view ide-editor ide-template ide-menu ide-themes ide-workspace-menu-new-template api-modules ide-operations-menu ide-documents-content-type ide-documents-menu ide-git-menu ide-terminal-menu ide-discussions-menu ide-database-menu ide-repository-menu","title":"IDE"},{"location":"development/extensions/#server","text":"ide-workspace-on-save ide-workspace-before-publish ide-workspace-after-publish ide-workspace-before-unpublish ide-workspace-after-unpublish","title":"Server"},{"location":"development/extensions/#events","text":"","title":"Events"},{"location":"development/extensions/#ide_1","text":"editor.file.saved editor.file.dirty status.message status.caret status.error database.database.selection.changed database.datasource.selection.changed database.sql.execute database.sql.run git.repository.run workspace.file.selected workspace.file.created workspace.file.open workspace.file.pull workspace.file.deleted workspace.file.renamed workspace.file.moved workspace.file.copied workspace.file.properties workspace.file.published workspace.project.exported repository.resource.selected repository.resource.created repository.resource.open repository.resource.deleted","title":"IDE"},{"location":"development/extensions/editor/","text":"Editor Descriptors To contribute a new Editor (text-based or form-based) to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project: my-editor.extension { \"module\" : \"my-project/services/my-editor.js\" , \"extensionPoint\" : \"ide-editor\" , \"description\" : \"The 
description of my editor\" } module - Points to the corresponding view descriptor (see below). extensionPoint - The name of the built-in extension point to which the current plugin will contribute. my-editor.js exports . getView = function () { var view = { name : \"My Editor\" , factory : \"frame\" , region : \"center-top\" , link : \"../my-project/index.html\" , contentTypes : [ \"application/json\" ] }; return view ; }; name - The exact name of the view, which will be shown, e.g., in the menu. factory - The type of the factory used when instantiating the view. region - The region where the view will be placed initially. link - The location within the same or external project pointing to the entry HTML file which will be rendered as a view. contentTypes - The content types array of supported files. The project structure in this case should look like this: | my-project |---- extensions |----> my-editor.extension |---- services |----> my-editor.js |---- index.html |---- js |---- css |---- ... 
The names of the extensions and services can be different following the layout of your project Implementation < html lang = \"en\" ng-app = \"editor\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1.0\" > < meta name = \"description\" content = \"\" > < meta name = \"author\" content = \"\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"../../../../../services/v4/js/theme/resources.js/bootstrap.min.css\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"../../../../../services/v4/web/resources/font-awesome-4.7.0/css/font-awesome.min.css\" > < link type = \"image/png\" rel = \"shortcut icon\" href = \"../../../../../services/v4/web/resources/images/favicon.png\" /> < link type = \"text/css\" rel = \"stylesheet\" href = \"../../../../../services/v4/js/theme/resources.js/ide.css\" /> < body ng-controller = \"EditorController\" > < div class = \"container\" > < div class = \"page-header\" > < h1 > My Editor Description: {{file}} < form > < div class = \"form-group\" > < label > Group < input type = \"text\" class = \"form-control\" ng-model = \"myModel.group\" value = \"\" > ... 
< button type = \"button\" class = \"btn btn-primary\" ng-click = \"save()\" > Save < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/jquery/2.0.3/jquery.min.js\" > < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/bootstrap/3.3.7/bootstrap.min.js\" async > < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/angular/1.4.7/angular.min.js\" > < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/angular/1.4.7/angular-resource.min.js\" > < script src = \"../../../../../services/v4/web/ide-core/ui/message-hub.js\" > < script type = \"text/javascript\" src = \"editor.js\" > For \u0430 real world example you can look at Jobs Plugin or Monaco Editor .","title":"Editor"},{"location":"development/extensions/editor/#editor","text":"","title":"Editor"},{"location":"development/extensions/editor/#descriptors","text":"To contribute a new Editor (text-based or form-based) to the Web IDE you need to create one model ( *.extension ) and one descriptor (in *.js ) files in your project:","title":"Descriptors"},{"location":"development/extensions/editor/#my-editorextension","text":"{ \"module\" : \"my-project/services/my-editor.js\" , \"extensionPoint\" : \"ide-editor\" , \"description\" : \"The description of my editor\" } module - Points to the corresponding view descriptor (see below). extensionPoint - The name of the built-in extension point to which the current plugin will contribute.","title":"my-editor.extension"},{"location":"development/extensions/editor/#my-editorjs","text":"exports . getView = function () { var view = { name : \"My Editor\" , factory : \"frame\" , region : \"center-top\" , link : \"../my-project/index.html\" , contentTypes : [ \"application/json\" ] }; return view ; }; name - The exact name of the view, which will be shown in the e.g. menu. factory - The type of the factory used during instantiating the view. 
region - The region where the view will be placed initially. link - The location within the same or external project pointing to the entry HTML file which will be rendered as a view. contentTypes - The content types array of supported files. The project structure in this case should look like this: | my-project |---- extensions |----> my-editor.extension |---- services |----> my-editor.js |---- index.html |---- js |---- css |---- ... The names of the extensions and services can be different following the layout of your project","title":"my-editor.js"},{"location":"development/extensions/editor/#implementation","text":" < html lang = \"en\" ng-app = \"editor\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1.0\" > < meta name = \"description\" content = \"\" > < meta name = \"author\" content = \"\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"../../../../../services/v4/js/theme/resources.js/bootstrap.min.css\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"../../../../../services/v4/web/resources/font-awesome-4.7.0/css/font-awesome.min.css\" > < link type = \"image/png\" rel = \"shortcut icon\" href = \"../../../../../services/v4/web/resources/images/favicon.png\" /> < link type = \"text/css\" rel = \"stylesheet\" href = \"../../../../../services/v4/js/theme/resources.js/ide.css\" /> < body ng-controller = \"EditorController\" > < div class = \"container\" > < div class = \"page-header\" > < h1 > My Editor Description: {{file}} < form > < div class = \"form-group\" > < label > Group < input type = \"text\" class = \"form-control\" ng-model = \"myModel.group\" value = \"\" > ... 
< button type = \"button\" class = \"btn btn-primary\" ng-click = \"save()\" > Save < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/jquery/2.0.3/jquery.min.js\" > < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/bootstrap/3.3.7/bootstrap.min.js\" async > < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/angular/1.4.7/angular.min.js\" > < script type = \"text/javascript\" src = \"../../../../../services/v4/web/resources/angular/1.4.7/angular-resource.min.js\" > < script src = \"../../../../../services/v4/web/ide-core/ui/message-hub.js\" > < script type = \"text/javascript\" src = \"editor.js\" > For \u0430 real world example you can look at Jobs Plugin or Monaco Editor .","title":"Implementation"},{"location":"development/extensions/perspective/","text":"Perspective Descriptors To contribute a new Perspective to the Web IDE you need to create one model ( *.extension ) and one descriptor (in *.js ) files in your project: my-perspective.extension { \"module\" : \"my-project/services/my-perspective.js\" , \"extensionPoint\" : \"ide-perspective\" , \"description\" : \"The description of my perspective\" } module - Points to the corresponding perspective descriptor (see below). extensionPoint - Where and how this perspective will be shown. Some of the possible values are: ide-perspective ide-view ide-editor ide-database-menu ide-documents-content-type ide-workspace-menu-new-template my-perspective.js exports . getPerspective = function () { var perspective = { name : \"My Perspective\" , link : \"../my-project/index.html\" , order : \"901\" , image : \"files-o\" }; return perspective ; }; name - The exact name of the perspective. link - The location within the same or external project pointing to the entry HTML file which will be rendered as a perspective order - Used to sort the perspective tabs in the sidebar. 
image - The name of the image which will be used for this perspective. This is a Font Awesome icon name. The project structure in this case should look like this: | my-project |---- extensions |----> my-perspective.extension |---- services |----> my-perspective.js |---- index.html |---- js |---- css |---- ... The names of the extensions and services can be different, following the layout of your project. Implementation In general, you can embed any valid HTML in the index.html file and it will be rendered in the place where the perspective should be embedded. For a full example you can look at sample-ide-perspective . For a real-world example you can look at Database Perspective Project .","title":"Perspective"},{"location":"development/extensions/perspective/#perspective","text":"","title":"Perspective"},{"location":"development/extensions/perspective/#descriptors","text":"To contribute a new Perspective to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project:","title":"Descriptors"},{"location":"development/extensions/perspective/#my-perspectiveextension","text":"{ \"module\" : \"my-project/services/my-perspective.js\" , \"extensionPoint\" : \"ide-perspective\" , \"description\" : \"The description of my perspective\" } module - Points to the corresponding perspective descriptor (see below). extensionPoint - Where and how this perspective will be shown. Some of the possible values are: ide-perspective ide-view ide-editor ide-database-menu ide-documents-content-type ide-workspace-menu-new-template","title":"my-perspective.extension"},{"location":"development/extensions/perspective/#my-perspectivejs","text":"exports . getPerspective = function () { var perspective = { name : \"My Perspective\" , link : \"../my-project/index.html\" , order : \"901\" , image : \"files-o\" }; return perspective ; }; name - The exact name of the perspective. 
link - The location within the same or external project pointing to the entry HTML file which will be rendered as a perspective. order - Used to sort the perspective tabs in the sidebar. image - The name of the image which will be used for this perspective. This is a Font Awesome icon name. The project structure in this case should look like this: | my-project |---- extensions |----> my-perspective.extension |---- services |----> my-perspective.js |---- index.html |---- js |---- css |---- ... The names of the extensions and services can be different following the layout of your project.","title":"my-perspective.js"},{"location":"development/extensions/perspective/#implementation","text":"In general you can embed any valid HTML in the index.html file and it will be rendered in the place where the perspective should be embedded. For a full example you can look at sample-ide-perspective . For a real world example you can look at Database Perspective Project .","title":"Implementation"},{"location":"development/extensions/template/","text":"Template Descriptors To contribute a new Template to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project: my-template.extension { \"module\" : \"my-project/services/my-template.js\" , \"extensionPoint\" : \"ide-template\" , \"description\" : \"The description of my template\" } module - Points to the corresponding template descriptor (see below). extensionPoint - The name of the built-in extension point to which the current plugin will contribute. my-template.js exports .
getTemplate = function () { var template = { name : \"My Template\" , description : \"My cool template\" , extension : \"myfile\" , sources : [ { location : \"/my-project/my-source.template\" , action : \"generate\" , rename : \"{{fileName}}.\" , engine : \"velocity\" , start : \"[[\" , end : \"]]\" } ], parameters : [] }; return template ; }; name - The exact name of the template, which will be shown in drop-down boxes. description - Text associated with the template. extension - Optional, if present the template will be shown only if a given file with the specified extension is selected. sources - The list of the templates which will be used during the generation phase. location - The relative path to the template. action - The type of processing which will be used for this template. rename - Used if renaming of the target artifact is needed. engine - The template engine which will be used for this template - \"mustache\" (default), \"velocity\" and \"javascript\". start and end - Custom tags, if the default \"{{\" and \"}}\" are not applicable. handler - The javascript transformation service, in case of the javascript engine. parameters - The list of parameters, if any, which will be passed to the generator. The project structure in this case should look like this: | my-project |---- extensions |----> my-template.extension |---- services |----> my-template.js |---- index.html |---- js |---- css |---- ...
The names of the extensions and services can be different following the layout of your project. Implementation < html xmlns = \"http://www.w3.org/1999/xhtml\" > < head > < meta charset = \"utf-8\" /> < title > ${fileName} < body ng-app = \"my-view\" ng-controller = \"MyController as controller\" class = \"view\" > < form class = \"input-group\" name = \"myForm\" > < span class = \"input-group-btn\" > < button class = \"btn btn-default\" type = \"button\" ng-click = \"myClick()\" >< i class = \"fa fa-bolt\" > For a real world example you can look at Bookstore Template","title":"Template"},{"location":"development/extensions/template/#template","text":"","title":"Template"},{"location":"development/extensions/template/#descriptors","text":"To contribute a new Template to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project:","title":"Descriptors"},{"location":"development/extensions/template/#my-templateextension","text":"{ \"module\" : \"my-project/services/my-template.js\" , \"extensionPoint\" : \"ide-template\" , \"description\" : \"The description of my template\" } module - Points to the corresponding template descriptor (see below). extensionPoint - The name of the built-in extension point to which the current plugin will contribute.","title":"my-template.extension"},{"location":"development/extensions/template/#my-templatejs","text":"exports . getTemplate = function () { var template = { name : \"My Template\" , description : \"My cool template\" , extension : \"myfile\" , sources : [ { location : \"/my-project/my-source.template\" , action : \"generate\" , rename : \"{{fileName}}.\" , engine : \"velocity\" , start : \"[[\" , end : \"]]\" } ], parameters : [] }; return template ; }; name - The exact name of the template, which will be shown in drop-down boxes. description - Text associated with the template.
extension - Optional, if present the template will be shown only if a given file with the specified extension is selected. sources - The list of the templates which will be used during the generation phase. location - The relative path to the template. action - The type of processing which will be used for this template. rename - Used if renaming of the target artifact is needed. engine - The template engine which will be used for this template - \"mustache\" (default), \"velocity\" and \"javascript\". start and end - Custom tags, if the default \"{{\" and \"}}\" are not applicable. handler - The javascript transformation service, in case of the javascript engine. parameters - The list of parameters, if any, which will be passed to the generator. The project structure in this case should look like this: | my-project |---- extensions |----> my-template.extension |---- services |----> my-template.js |---- index.html |---- js |---- css |---- ... The names of the extensions and services can be different following the layout of your project.","title":"my-template.js"},{"location":"development/extensions/template/#implementation","text":" < html xmlns = \"http://www.w3.org/1999/xhtml\" > < head > < meta charset = \"utf-8\" /> < title > ${fileName} < body ng-app = \"my-view\" ng-controller = \"MyController as controller\" class = \"view\" > < form class = \"input-group\" name = \"myForm\" > < span class = \"input-group-btn\" > < button class = \"btn btn-default\" type = \"button\" ng-click = \"myClick()\" >< i class = \"fa fa-bolt\" > For a real world example you can look at Bookstore Template","title":"Implementation"},{"location":"development/extensions/view/","text":"View Descriptors To contribute a new View to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project: my-view.extension { \"module\" : \"my-project/services/my-view.js\" , \"extensionPoint\" : \"ide-view\" , \"description\" : \"The description of my view\" }
module - Points to the corresponding view descriptor (see below). extensionPoint - The name of the built-in extension point to which the current plugin will contribute. my-view.js exports . getView = function () { var view = { id : \"my-view\" , name : \"My View\" , factory : \"frame\" , region : \"center-bottom\" , label : \"My View\" , link : \"../my-project/index.html\" }; return view ; }; id - The unique id of the view. name - The exact name of the view. factory - The type of the factory used when instantiating the view. region - The region where the view will be placed initially. label - The name which will be used in the heading bar. link - The location within the same or external project pointing to the entry HTML file which will be rendered as a view. The project structure in this case should look like this: | my-project |---- extensions |----> my-view.extension |---- services |----> my-view.js |---- index.html |---- js |---- css |---- ... The names of the extensions and services can be different following the layout of your project. Implementation For a full example you can look at sample-ide-perspective . For a real world example you can look at Preview View .","title":"View"},{"location":"development/extensions/view/#view","text":"","title":"View"},{"location":"development/extensions/view/#descriptors","text":"To contribute a new View to the Web IDE you need to create one model file ( *.extension ) and one descriptor file ( *.js ) in your project:","title":"Descriptors"},{"location":"development/extensions/view/#my-viewextension","text":"{ \"module\" : \"my-project/services/my-view.js\" , \"extensionPoint\" : \"ide-view\" , \"description\" : \"The description of my view\" } module - Points to the corresponding view descriptor (see below).
extensionPoint - The name of the built-in extension point to which the current plugin will contribute.","title":"my-view.extension"},{"location":"development/extensions/view/#my-viewjs","text":"exports . getView = function () { var view = { id : \"my-view\" , name : \"My View\" , factory : \"frame\" , region : \"center-bottom\" , label : \"My View\" , link : \"../my-project/index.html\" }; return view ; }; id - The unique id of the view. name - The exact name of the view. factory - The type of the factory used when instantiating the view. region - The region where the view will be placed initially. label - The name which will be used in the heading bar. link - The location within the same or external project pointing to the entry HTML file which will be rendered as a view. The project structure in this case should look like this: | my-project |---- extensions |----> my-view.extension |---- services |----> my-view.js |---- index.html |---- js |---- css |---- ... The names of the extensions and services can be different following the layout of your project.","title":"my-view.js"},{"location":"development/extensions/view/#implementation","text":"For a full example you can look at sample-ide-perspective . For a real world example you can look at Preview View .","title":"Implementation"},{"location":"development/ide/","text":"IDE Overview Web IDE The Web-based integrated development environment (Web IDE) runs directly in a browser and, therefore, does not require additional downloads and installations. It has a rich set of editors, viewers, wizards, DevOps productivity tools, and a new Web IDE for in-system application development. The Web IDE is a composition of perspectives, each consisting of the necessary tools to accomplish a certain goal.
Three of the UI elements retain their positions in all perspectives: top-area toolbar for the menus, theme selection, and user control sidebar on the left with shortcuts to the perspectives status bar at the bottom, for notifications and other use by the tools The tools that constitute the perspectives are laid out in predefined regions of the work plot, but you can change their position using drag and drop. The perspectives are simply predefined configurations, hence you can open, move, or close different tools on the work plot of a perspective for your convenience. You can also maximize, minimize, or even pop out any of the tools in a separate window. The tools are the smallest atomic parts in the Web IDE. They are referred to as views or editors, and each type is handled differently. Perspectives By default, the different views and editors are separated into a few perspectives: Workbench Git Database Repository Terminal Operations Documents Debugger Views Each perspective is comprised of different views. Learn more about them following the list below: Snapshot Debugger Roles Jobs Documents Git Preview Workspace SQL Extensions Terminal Variables Breakpoints Console Logs Data Structures Access Listeners Database Search Import Registry Repository Editors Monaco is the editor integrated into the Eclipse Dirigible Web IDE. Modelers There are some more sophisticated visual editors: BPMN Modeler Database Schema Modeler Entity Data Modeler Form Designer Layouts The Web IDE layout API delegates the layout management to the GoldenLayout framework. Layouts is a convenience bag of functions that significantly simplifies the work with layouts. It takes care of views registry setup, the work plot regions configuration, layout initialization, serialization, control on the layout manager, open view and open editor functions, global notifications, and others.
The top-area toolbar is a composite that aggregates the drop-down menus, the theme selection, the user name, and sign-out control. It uses the corresponding UI microservices available in the ideUiCore module as Menu, User, and Theme. By convention, all UI components are built with Bootstrap 3.x CSS and the themes in the Web IDE are actually custom Bootstrap CSS. A UI microservice enables dynamic change of the CSS upon change of the theme automatically. It is available as Angular factory theme. The Angular service User provides the details for the user that are rendered by the Menu directive, such as the user name. The sidebar is an Angular directive that takes care of rendering a standard sidebar in the framework template. It works with the perspectives.js service to populate the registered perspectives as shortcuts. The status bar is an Angular directive that renders a standard, fixed-position footer. The component is subscribed to listen to message types configured as the value of the status-bar-topic attribute, or by default to status-message messages.","title":"IDE Overview"},{"location":"development/ide/#ide-overview","text":"","title":"IDE Overview"},{"location":"development/ide/#web-ide","text":"The Web-based integrated development environment (Web IDE) runs directly in a browser and, therefore, does not require additional downloads and installations. It has a rich set of editors, viewers, wizards, DevOps productivity tools, and a new Web IDE for in-system application development. The Web IDE is a composition of perspectives, each consisting of the necessary tools to accomplish a certain goal.
Three of the UI elements retain their positions in all perspectives: top-area toolbar for the menus, theme selection, and user control sidebar on the left with shortcuts to the perspectives status bar at the bottom, for notifications and other use by the tools The tools that constitute the perspectives are laid out in predefined regions of the work plot, but you can change their position using drag and drop. The perspectives are simply predefined configurations, hence you can open, move, or close different tools on the work plot of a perspective for your convenience. You can also maximize, minimize, or even pop out any of the tools in a separate window. The tools are the smallest atomic parts in the Web IDE. They are referred to as views or editors, and each type is handled differently.","title":"Web IDE"},{"location":"development/ide/#perspectives","text":"By default, the different views and editors are separated into a few perspectives: Workbench Git Database Repository Terminal Operations Documents Debugger","title":"Perspectives"},{"location":"development/ide/#views","text":"Each perspective is comprised of different views. Learn more about them following the list below: Snapshot Debugger Roles Jobs Documents Git Preview Workspace SQL Extensions Terminal Variables Breakpoints Console Logs Data Structures Access Listeners Database Search Import Registry Repository","title":"Views"},{"location":"development/ide/#editors","text":"Monaco is the editor integrated into the Eclipse Dirigible Web IDE.","title":"Editors"},{"location":"development/ide/#modelers","text":"There are some more sophisticated visual editors: BPMN Modeler Database Schema Modeler Entity Data Modeler Form Designer","title":"Modelers"},{"location":"development/ide/#layouts","text":"The Web IDE layout API delegates the layout management to the GoldenLayout framework. Layouts is a convenience bag of functions that significantly simplifies the work with layouts.
It takes care of views registry setup, the work plot regions configuration, layout initialization, serialization, control on the layout manager, open view and open editor functions, global notifications, and others. The top-area toolbar is a composite that aggregates the drop-down menus, the theme selection, the user name, and sign-out control. It uses the corresponding UI microservices available in the ideUiCore module as Menu, User, and Theme. By convention, all UI components are built with Bootstrap 3.x CSS and the themes in the Web IDE are actually custom Bootstrap CSS. A UI microservice enables dynamic change of the CSS upon change of the theme automatically. It is available as Angular factory theme. The Angular service User provides the details for the user that are rendered by the Menu directive, such as the user name. The sidebar is an Angular directive that takes care of rendering a standard sidebar in the framework template. It works with the perspectives.js service to populate the registered perspectives as shortcuts. The status bar is an Angular directive that renders a standard, fixed-position footer. The component is subscribed to listen to message types configured as the value of the status-bar-topic attribute, or by default to status-message messages.","title":"Layouts"},{"location":"development/ide/about/","text":"About The About view contains system information about Dirigible's installation. The different properties and sections are: Version - the version of Dirigible. Commit Id - the commit that Dirigible is from. The commit id link also leads to the GitHub page of the release. Type - the type of the account. Instance - the type of the instance running. Repository - the place where the project repository is housed. Database - the database used. It can be local, custom or managed. Modules - the list of modules in the Dirigible. Engines - the list of engines in the Dirigible. Synchronizers - list of synchronizers in the Dirigible.
Their status is also shown.","title":"About"},{"location":"development/ide/about/#about","text":"The About view contains system information about Dirigible's installation. The different properties and sections are: Version - the version of Dirigible. Commit Id - the commit that Dirigible is from. The commit id link also leads to the GitHub page of the release. Type - the type of the account. Instance - the type of the instance running. Repository - the place where the project repository is housed. Database - the database used. It can be local, custom or managed. Modules - the list of modules in the Dirigible. Engines - the list of engines in the Dirigible. Synchronizers - list of synchronizers in the Dirigible. Their status is also shown.","title":"About"},{"location":"development/ide/editor-access/","text":"Access Editor The Access editor lets you manage access to your project through security constraints files ( *.access ). You can create multiple access constraints within your project as part of one security constraints file. Create a Security Constraints File Right-click on your project in the Workspace view and choose New \u2192 Access Constraints . Enter a name for the security constraints file. Create an Access Constraint Double-click on your security constraints file to open it in the Access editor. Choose New ( + ). In the Create Constraint dialog, fill in the path to the file for which you're creating the access constraint in the Path field. Choose an HTTP or CMIS method for which the access constraint will be valid in the Method field. Select HTTP or CMIS scope from the drop-down list in the Scope field. Fill in a role for which the access constraint is valid in the Roles field. Choose Save . Create a Public Endpoint You can also use the Access editor to make a resource publicly accessible. To do this, fill in the role public in step 6 above. This way, you're effectively creating a new public endpoint for the resource. 
You can access the public endpoint by replacing web with public in the endpoint's URL. Fill in the public role in the Roles field of the Create Constraint dialog and choose Save . Publish your project. Copy the endpoint's URL from the Preview view. Open a browser and replace web with public in the URL. Check if you can access the public endpoint.","title":"Access Editor"},{"location":"development/ide/editor-access/#access-editor","text":"The Access editor lets you manage access to your project through security constraints files ( *.access ). You can create multiple access constraints within your project as part of one security constraints file.","title":"Access Editor"},{"location":"development/ide/editor-access/#create-a-security-constraints-file","text":"Right-click on your project in the Workspace view and choose New \u2192 Access Constraints . Enter a name for the security constraints file.","title":"Create a Security Constraints File"},{"location":"development/ide/editor-access/#create-an-access-constraint","text":"Double-click on your security constraints file to open it in the Access editor. Choose New ( + ). In the Create Constraint dialog, fill in the path to the file for which you're creating the access constraint in the Path field. Choose an HTTP or CMIS method for which the access constraint will be valid in the Method field. Select HTTP or CMIS scope from the drop-down list in the Scope field. Fill in a role for which the access constraint is valid in the Roles field. Choose Save .","title":"Create an Access Constraint"},{"location":"development/ide/editor-access/#create-a-public-endpoint","text":"You can also use the Access editor to make a resource publicly accessible. To do this, fill in the role public in step 6 above. This way, you're effectively creating a new public endpoint for the resource. You can access the public endpoint by replacing web with public in the endpoint's URL. 
Fill in the public role in the Roles field of the Create Constraint dialog and choose Save . Publish your project. Copy the endpoint's URL from the Preview view. Open a browser and replace web with public in the URL. Check if you can access the public endpoint.","title":"Create a Public Endpoint"},{"location":"development/ide/editor-csv/","text":"CSV Editor The CSV editor in the Eclipse Dirigible IDE is based on the AG Grid library. The CSV editor allows you to create, edit, and manage CSV files. Create CSV Files To create a new CSV file in the IDE, first create a project. Right-click your project and create a file. Finally, give the newly created file a name followed by the .csv extension, and press Enter . To open the CSV file, double-click it. By default, the CSV file has the headers already enabled. If you want to disable the headers, click the vertical ellipsis icon (\" \u22ee \") to open the kebab menu, and click Disable Header . Edit CSV Files While editing a CSV file in the Eclipse Dirigible IDE, you can perform the following actions: Add a new column To add a column, right-click on the Column field and click Add Column . You can also edit and delete a column by right-clicking it. Add a new row To add a row, right-click on the field where rows should go and click Add Row . Clicking an already existing row allows you to add a new row before or after it, or delete it. Reorder rows You can use the drag icon (\" \u22ee\u22ee\u22ee\u22ee \") to change the order of existing rows by dragging and dropping them where you want them to be: Select rows You can select: - separate single rows with the \"Cmd (macOS) / Ctrl (Linux & Windows) + left click\" shortcut, or - multiple consecutive rows with the \"Shift + left click\" shortcut. Once you've selected some rows, you can either reorder them using the drag icon (\" \u22ee\u22ee\u22ee\u22ee \") or delete them with right-click and Delete Row(s) .
Manage CSV Files Filter a CSV file You can filter the CSV file using these predefined filter options: Contains Not Contains Equals Not Equal Starts With Ends With To select one of these options, click the \u201c \u2261 \u201d hamburger icon: Export a CSV file To export a CSV file, click Export . The CSV file will be downloaded automatically.","title":"CSV Editor"},{"location":"development/ide/editor-csv/#csv-editor","text":"The CSV editor in the Eclipse Dirigible IDE is based on the AG Grid library. The CSV editor allows you to create, edit, and manage CSV files.","title":"CSV Editor"},{"location":"development/ide/editor-csv/#create-csv-files","text":"To create a new CSV file in the IDE, first create a project. Right-click your project and create a file. Finally, give the newly created file a name followed by the .csv extension, and press Enter . To open the CSV file, double-click it. By default, the CSV file has the headers already enabled. If you want to disable the headers, click the vertical ellipsis icon (\" \u22ee \") to open the kebab menu, and click Disable Header .","title":"Create CSV Files"},{"location":"development/ide/editor-csv/#edit-csv-files","text":"While editing a CSV file in the Eclipse Dirigible IDE, you can perform the following actions: Add a new column To add a column, right-click on the Column field and click Add Column . You can also edit and delete a column by right-clicking it. Add a new row To add a row, right-click on the field where rows should go and click Add Row . Clicking an already existing row allows you to add a new row before or after it, or delete it. Reorder rows You can use the drag icon (\" \u22ee\u22ee\u22ee\u22ee \") to change the order of existing rows by dragging and dropping them where you want them to be: Select rows You can select: - separate single rows with the \"Cmd (macOS) / Ctrl (Linux & Windows) + left click\" shortcut, or - multiple consecutive rows with the \"Shift + left click\" shortcut.
Once you've selected some rows, you can either reorder them using the drag icon (\" \u22ee\u22ee\u22ee\u22ee \") or delete them with right-click and Delete Row(s) .","title":"Edit CSV Files"},{"location":"development/ide/editor-csv/#manage-csv-files","text":"Filter a CSV file You can filter the CSV file using these predefined filter options: Contains Not Contains Equals Not Equal Starts With Ends With To select one of these options, click the \u201c \u2261 \u201d hamburger icon: Export a CSV file To export a CSV file, click Export . The CSV file will be downloaded automatically.","title":"Manage CSV Files"},{"location":"development/ide/editor-csvim/","text":"CSVIM Editor The CSVIM editor in the Eclipse Dirigible IDE allows you to open, save, delete, and edit the properties of CSV files. Such properties are: Table Schema File path Delimiter Quote character Header Use header names Distinguish empty from null Version Table The Table input field can contain letters (a-z, A-Z), numbers (0-9), hyphens (\"-\"), dots (\".\"), underscores (\"_\"), and dollar signs (\"$\"). Schema The Schema input field can contain letters (a-z, A-Z), numbers (0-9), hyphens (\"-\"), dots (\".\"), underscores (\"_\"), and dollar signs (\"$\"). File path The File path input field can contain letters (a-z, A-Z), numbers (0-9), hyphens (\"-\"), forward slashes (\"/\"), dots (\".\"), underscores (\"_\"), and dollar signs (\"$\"). Here\u2019s an example of a correct file path: /workspace/csv/subfolder/bigstats.csv . In this example, the file path consists of: workspace The workspace is the place where you create and manage the artifacts of your application. csv This is the name of your project. subfolder This is the subfolder that contains the CSV file. bigstats.csv This is the CSV file. Note: If the file path is formatted properly but doesn't exist, you will be able to save the CSVIM file, but you won't be able to open it with the CSV editor.
If the file path isn't formatted properly (for example, by having unsupported characters), you won\u2019t be able to save the CSVIM file or open the CSV file. Delimiter The currently supported delimiters are comma (\",\"), tab (\"\\t\"), vertical bar (\"|\"), semicolon (\";\"), and number sign or hash (\"#\"). Note: If you\u2019re trying to use an unsupported character such as \"+\", the \"The delimiter is not supported!\" warning message will pop up. Nevertheless, you will be able to open the CSV file and save the CSVIM file. Quote character The currently supported quote characters are apostrophe (\"\u2018\"), quotation mark (\"\u201c\"), and number sign or hash (\"#\"). Note: If you\u2019re trying to use an unsupported character such as \"^\", the \"The quote character is not supported!\" warning message will pop up. Nevertheless, you will be able to open the CSV file or save the CSVIM file. Header If you select this checkbox, the first line of your CSV file will be treated as a column title or header. Use header names If you select this checkbox, the first line of the specified CSV file will be interpreted when importing the file. This option will work only if you have enabled the \"Header\" checkbox. Distinguish empty from null Select this checkbox if you want to make sure that the table-import process interprets correctly all empty values in the CSV file, which is enclosed with the value selected in the Quote character dropdown, for example, as an empty space. This ensures that an empty space is imported \"as is\" into the target table. If the empty space isn't interpreted correctly, it is imported as null. Version You can specify the version of the CSVIM so you can better manage your CSV and database data.","title":"CSVIM Editor"},{"location":"development/ide/editor-csvim/#csvim-editor","text":"The CSVIM editor in the Eclipse Dirigible IDE allows you to open, save, delete, and edit the properties of CSV files.
Such properties are: Table Schema File path Delimiter Quote character Header Use header names Distinguish empty from null Version","title":"CSVIM Editor"},{"location":"development/ide/editor-csvim/#table","text":"The Table input field can contain letters (a-z, A-Z), numbers (0-9), hyphens (\"-\"), dots (\".\"), underscores (\"_\"), and dollar signs (\"$\").","title":"Table"},{"location":"development/ide/editor-csvim/#schema","text":"The Schema input field can contain letters (a-z, A-Z), numbers (0-9), hyphens (\"-\"), dots (\".\"), underscores (\"_\"), and dollar signs (\"$\").","title":"Schema"},{"location":"development/ide/editor-csvim/#file-path","text":"The File path input field can contain letters (a-z, A-Z), numbers (0-9), hyphens (\"-\"), forward slashes (\"/\"), dots (\".\"), underscores (\"_\"), and dollar signs (\"$\"). Here\u2019s an example of a correct file path: /workspace/csv/subfolder/bigstats.csv . In this example, the file path consists of: workspace The workspace is the place where you create and manage the artifacts of your application. csv This is the name of your project. subfolder This is the subfolder that contains the CSV file. bigstats.csv This is the CSV file. Note: If the file path is formatted properly but doesn't exist, you will be able to save the CSVIM file, but you won't be able to open it with the CSV editor. If the file path isn't formatted properly (for example, by having unsupported characters), you won\u2019t be able to save the CSVIM file or open the CSV file.","title":"File path"},{"location":"development/ide/editor-csvim/#delimiter","text":"The currently supported delimiters are comma (\",\"), tab (\"\\t\"), vertical bar (\"|\"), semicolon (\";\"), and number sign or hash (\"#\"). Note: If you\u2019re trying to use an unsupported character such as \"+\", the \"The delimiter is not supported!\" warning message will pop up.
Nevertheless, you will be able to open the CSV file and save the CSVIM file.","title":"Delimiter"},{"location":"development/ide/editor-csvim/#quote-character","text":"The currently supported quote characters are apostrophe (\"\u2018\"), quotation mark (\"\u201c\"), and number sign or hash (\"#\"). Note: If you\u2019re trying to use an unsupported character such as \"^\", the \"The quote character is not supported!\" warning message will pop up. Nevertheless, you will be able to open the CSV file or save the CSVIM file.","title":"Quote character"},{"location":"development/ide/editor-csvim/#header","text":"If you select this checkbox, the first line of your CSV file will be treated as a column title or header.","title":"Header"},{"location":"development/ide/editor-csvim/#use-header-names","text":"If you select this checkbox, the first line of the specified CSV file will be interpreted when importing the file. This option will work only if you have enabled the \"Header\" checkbox.","title":"Use header names"},{"location":"development/ide/editor-csvim/#distinguish-empty-from-null","text":"Select this checkbox if you want to make sure that the table-import process interprets correctly all empty values in the CSV file, which is enclosed with the value selected in the Quote character dropdown, for example, as an empty space. This ensures that an empty space is imported \"as is\" into the target table. If the empty space isn't interpreted correctly, it is imported as null.","title":"Distinguish empty from null"},{"location":"development/ide/editor-csvim/#version","text":"You can specify the version of the CSVIM so you can better manage your CSV and database data.","title":"Version"},{"location":"development/ide/editor-monaco/","text":"Monaco Editor Monaco Editor is the code editor that powers VS Code . It is not supported in mobile browsers or mobile web frameworks.
The Editor supports syntax highlighting for XML, PHP, C#, C++, Razor, Markdown, Diff, Java, VB, CoffeeScript, Handlebars, Batch, Pug, F#, Lua, Powershell, Python, SASS, R, Objective-C and side by side live comparison for all languages out of the box. Monaco has a rich set of default keyboard shortcuts as well as allowing you to customize them. Monaco supports multiple cursors for fast simultaneous edits. You can also add secondary cursors.","title":"Monaco Editor"},{"location":"development/ide/editor-monaco/#monaco-editor","text":"Monaco Editor is the code editor that powers VS Code . It is not supported in mobile browsers or mobile web frameworks. The Editor supports syntax highlighting for XML, PHP, C#, C++, Razor, Markdown, Diff, Java, VB, CoffeeScript, Handlebars, Batch, Pug, F#, Lua, Powershell, Python, SASS, R, Objective-C and side by side live comparison for all languages out of the box. Monaco has a rich set of default keyboard shortcuts as well as allowing you to customize them. Monaco supports multiple cursors for fast simultaneous edits. You can also add secondary cursors.","title":"Monaco Editor"},{"location":"development/ide/modelers/bpmn/","text":"BPMN Modeler The BPMN Modeler provides capabilities for visual design of a business process. Such business processes can include Dirigible services.","title":"BPMN"},{"location":"development/ide/modelers/bpmn/#bpmn-modeler","text":"The BPMN Modeler provides capabilities for visual design of a business process. 
Such business processes can include Dirigible services.","title":"BPMN Modeler"},{"location":"development/ide/modelers/database-schema/","text":"Database Schema Modeler The Database Schema Modeler provides capabilities for visual design of a database schema.","title":"Database Schema"},{"location":"development/ide/modelers/database-schema/#database-schema-modeler","text":"The Database Schema Modeler provides capabilities for visual design of a database schema.","title":"Database Schema Modeler"},{"location":"development/ide/modelers/entity-data/","text":"Entity Data Modeler The Entity Data Modeler provides capabilities for visual design of a domain model. After that, you can generate a full-stack application for basic operations over the defined entities.","title":"Entity Data"},{"location":"development/ide/modelers/entity-data/#entity-data-modeler","text":"The Entity Data Modeler provides capabilities for visual design of a domain model. After that, you can generate a full-stack application for basic operations over the defined entities.","title":"Entity Data Modeler"},{"location":"development/ide/modelers/form-designer/","text":"Form Designer The Form Designer provides capabilities for visual design of a Web form. You can drag and drop UI controls from a predefined list and edit their properties.","title":"Form Designer"},{"location":"development/ide/modelers/form-designer/#form-designer","text":"The Form Designer provides capabilities for visual design of a Web form. You can drag and drop UI controls from a predefined list and edit their properties.","title":"Form Designer"},{"location":"development/ide/perspectives/database/","text":"Database Perspective The Database perspective contains tools for inspection and manipulation of the artifacts within the underlying relational database. It is comprised of Database , SQL , Console and Result views. 
The Database perspective features a database explorer, a console to execute SQL statements and to preview results in table format.","title":"Database"},{"location":"development/ide/perspectives/database/#database-perspective","text":"The Database perspective contains tools for inspection and manipulation of the artifacts within the underlying relational database. It is comprised of Database , SQL , Console and Result views. The Database perspective features a database explorer, a console to execute SQL statements and to preview results in table format.","title":"Database Perspective"},{"location":"development/ide/perspectives/debugger/","text":"Debugger Perspective The Web IDE includes a Debugger perspective which is comprised of the following views: Debugger Variables Breakpoints Console Preview The Debugger perspective enables you to monitor the execution of your code, stop it, restart it or set breakpoints, and change values in memory.","title":"Debugger"},{"location":"development/ide/perspectives/debugger/#debugger-perspective","text":"The Web IDE includes a Debugger perspective which is comprised of the following views: Debugger Variables Breakpoints Console Preview The Debugger perspective enables you to monitor the execution of your code, stop it, restart it or set breakpoints, and change values in memory.","title":"Debugger Perspective"},{"location":"development/ide/perspectives/documents/","text":"Documents Perspective The Documents perspective is the place where the user manages the binary artifacts such as pictures, spreadsheets, PDF files, etc. It enables him/her to upload, overwrite, download, delete and search for artifacts. 
At the moment the Documents perspective consists of only one view, which is also called Documents.","title":"Documents"},{"location":"development/ide/perspectives/documents/#documents-perspective","text":"The Documents perspective is the place where the user manages the binary artifacts such as pictures, spreadsheets, PDF files, etc. It enables him/her to upload, overwrite, download, delete and search for artifacts. At the moment the Documents perspective consists of only one view, which is also called Documents.","title":"Documents Perspective"},{"location":"development/ide/perspectives/git/","text":"Git Perspective The Git perspective aims at presenting a simplified interface for the most common Git operations. It is built from tools that support Git client operations. The Git perspective is comprised of Git and Console views, and workspace menu. It enables the users to perform simple Git operations such as cloning a repository to a workspace, pulling changes, and pushing commits. The user can create, manage, and switch between multiple workspaces through the workspace menu. Note In case of merge conflict on Push operation, a new branch with your local changes will be created in the remote repository. From this point, you can use your preferred tooling to apply the actual merge between the two branches. Video","title":"Git"},{"location":"development/ide/perspectives/git/#git-perspective","text":"The Git perspective aims at presenting a simplified interface for the most common Git operations. It is built from tools that support Git client operations. The Git perspective is comprised of Git and Console views, and workspace menu. It enables the users to perform simple Git operations such as cloning a repository to a workspace, pulling changes, and pushing commits. The user can create, manage, and switch between multiple workspaces through the workspace menu. 
Note In case of merge conflict on Push operation, a new branch with your local changes will be created in the remote repository. From this point, you can use your preferred tooling to apply the actual merge between the two branches.","title":"Git Perspective"},{"location":"development/ide/perspectives/git/#video","text":"","title":"Video"},{"location":"development/ide/perspectives/operations/","text":"Operations Perspective The Web IDE includes an Operations perspective , which is comprised of the following views: Registry Repository Extension Jobs Listeners Data Structures Access Roles Console Terminal Logs The Operations perspective enables you to monitor the ongoing processes and operation activities.","title":"Operations"},{"location":"development/ide/perspectives/operations/#operations-perspective","text":"The Web IDE includes an Operations perspective , which is comprised of the following views: Registry Repository Extension Jobs Listeners Data Structures Access Roles Console Terminal Logs The Operations perspective enables you to monitor the ongoing processes and operation activities.","title":"Operations Perspective"},{"location":"development/ide/perspectives/repository/","text":"Repository Perspective The Repository perspective gives access to the raw structure of the Dirigible instance. It is comprised of Repository , Snapshot , Preview and Console views. There the user can inspect at low level the project and folder structure, as well as the artifacts content. The user is able to import/export snapshots via the Snapshot view. Caution Editing of the file contents via the Repository perspective is not recommended as it can lead to inconsistencies!","title":"Repository"},{"location":"development/ide/perspectives/repository/#repository-perspective","text":"The Repository perspective gives access to the raw structure of the Dirigible instance. It is comprised of Repository , Snapshot , Preview and Console views. 
There the user can inspect at low level the project and folder structure, as well as the artifacts content. The user is able to import/export snapshots via the Snapshot view. Caution Editing of the file contents via the Repository perspective is not recommended as it can lead to inconsistencies!","title":"Repository Perspective"},{"location":"development/ide/perspectives/terminal/","text":"Terminal Perspective The key view in the perspective is a terminal that emulates a console client connected to the environment of the Dirigible instance and can execute commands. The difference here is that the whole communication goes via HTTP(S) only and does not require the SSH port to be opened.","title":"Terminal"},{"location":"development/ide/perspectives/terminal/#terminal-perspective","text":"The key view in the perspective is a terminal that emulates a console client connected to the environment of the Dirigible instance and can execute commands. The difference here is that the whole communication goes via HTTP(S) only and does not require the SSH port to be opened.","title":"Terminal Perspective"},{"location":"development/ide/perspectives/workbench/","text":"Workbench Perspective This is the place where the user develops dynamic applications. This perspective contains all views and editors that may help in the overall implementation, from domain models via services to the user interface. The Workbench perspective is comprised of Workspace , Import , Properties , Console , and Preview views, plus the editors registered for each file type. In other words, the minimal toolset for file management, preview, and editing operations. The main view opened by default in this perspective is the Workspace view, a standard view with the projects in your workspace .","title":"Workbench"},{"location":"development/ide/perspectives/workbench/#workbench-perspective","text":"This is the place where the user develops dynamic applications. 
This perspective contains all views and editors that may help in the overall implementation, from domain models via services to the user interface. The Workbench perspective is comprised of Workspace , Import , Properties , Console , and Preview views, plus the editors registered for each file type. In other words, the minimal toolset for file management, preview, and editing operations. The main view opened by default in this perspective is the Workspace view, a standard view with the projects in your workspace .","title":"Workbench Perspective"},{"location":"development/ide/views/about/","text":"About View The About view contains system information about Dirigible's installation. The different properties and sections are: Version - the version of Dirigible. Commit Id - the commit that Dirigible is from. The commit id link also leads to the GitHub page of the release. Type - the type of the account. Instance - the type of the instance running. Repository - the place where the project repository is housed. Database - the database used. It can be local, custom or managed. Modules - the list of modules in the Dirigible. Engines - the list of engines in the Dirigible. Synchronizers - list of synchronizers in the Dirigible. Their status is also shown.","title":"About"},{"location":"development/ide/views/about/#about-view","text":"The About view contains system information about Dirigible's installation. The different properties and sections are: Version - the version of Dirigible. Commit Id - the commit that Dirigible is from. The commit id link also leads to the GitHub page of the release. Type - the type of the account. Instance - the type of the instance running. Repository - the place where the project repository is housed. Database - the database used. It can be local, custom or managed. Modules - the list of modules in the Dirigible. Engines - the list of engines in the Dirigible. Synchronizers - list of synchronizers in the Dirigible. 
Their status is also shown.","title":"About View"},{"location":"development/ide/views/access/","text":"Access View The Access view displays the defined security constraints on HTTP servers access or paths to the document repository. These constraints are defined in *.access files. More info about the type of the artifacts you can find in Artifacts . Related content Documents View Constraints View Documents Perspective","title":"Access"},{"location":"development/ide/views/access/#access-view","text":"The Access view displays the defined security constraints on HTTP servers access or paths to the document repository. These constraints are defined in *.access files. More info about the type of the artifacts you can find in Artifacts . Related content Documents View Constraints View Documents Perspective","title":"Access View"},{"location":"development/ide/views/configurations/","text":"Configurations View The Configurations view contains a list of configuration parameters and environment variables. Each of them begins with \"DIRIGIBLE_\" and continues with a unique name. In addition to Name, each of the other four columns in the table holds a distinct parameter. They are Environment, Runtime, Deployment and Module (priority left to right). Changing a variable The values of the configuration parameters are set by the module, but they can be overwritten. This can be done either during the deployment of Dirigible, by creating a dirigible.properties file with different values or by changing the values during runtime. Changing a variable during runtime Follow steps 1-5 outlined in the Create a hello-world.js service tutorial. Insert the following code at line 2: var response = require ( \"http/v4/response\" ); var config = require ( \"core/v4/configurations\" ); config . set ( \"DIRIGIBLE_BRANDING_NAME\" , \"RuntimeDemo\" ) response . println ( \"Hello World!\" ); response . flush (); response . close (); Save the file. Refresh the page. 
Navigate to Window \u2192 Select View \u2192 Configurations You can learn more about how to set up Environment Variables here .","title":"Configurations"},{"location":"development/ide/views/configurations/#configurations-view","text":"The Configurations view contains a list of configuration parameters and environment variables. Each of them begins with \"DIRIGIBLE_\" and continues with a unique name. In addition to Name, each of the other four columns in the table holds a distinct parameter. They are Environment, Runtime, Deployment and Module (priority left to right).","title":"Configurations View"},{"location":"development/ide/views/configurations/#changing-a-variable","text":"The values of the configuration parameters are set by the module, but they can be overwritten. This can be done either during the deployment of Dirigible, by creating a dirigible.properties file with different values, or by changing the values during runtime.","title":"Changing a variable"},{"location":"development/ide/views/configurations/#changing-a-variable-during-runtime","text":"Follow steps 1-5 outlined in the Create a hello-world.js service tutorial. Insert the following code at line 2: var response = require ( \"http/v4/response\" ); var config = require ( \"core/v4/configurations\" ); config . set ( \"DIRIGIBLE_BRANDING_NAME\" , \"RuntimeDemo\" ) response . println ( \"Hello World!\" ); response . flush (); response . close (); Save the file. Refresh the page. Navigate to Window \u2192 Select View \u2192 Configurations You can learn more about how to set up Environment Variables here .","title":"Changing a variable during runtime"},{"location":"development/ide/views/console/","text":"Console View The Console view is a major debugging tool. It displays the output of the code that you are executing.","title":"Console"},{"location":"development/ide/views/console/#console-view","text":"The Console view is a major debugging tool. 
It displays the output of the code that you are executing.","title":"Console View"},{"location":"development/ide/views/constraints/","text":"Constraints View The Constraints view lets you restrict access through the Documents view to specific folders or files by creating constraints. This way, users will be able to access certain resources based on their roles. To create a constraint, you have to specify: a path to the folder or file. For example, /Folder A a method - READ or WRITE ( WRITE constraint includes READ access) a role - the role that the user needs to have in order to be able to see or edit the folder/file. For example, Admin . As specified in the screenshot below, only users with the role Admin can read Folder C that can be accessed by following the path /Folder A/FolderC . The constraints created in the Constraints view are also visible in the Access view. Related content Access View Documents View Documents Perspective","title":"Constraints"},{"location":"development/ide/views/constraints/#constraints-view","text":"The Constraints view lets you restrict access through the Documents view to specific folders or files by creating constraints. This way, users will be able to access certain resources based on their roles. To create a constraint, you have to specify: a path to the folder or file. For example, /Folder A a method - READ or WRITE ( WRITE constraint includes READ access) a role - the role that the user needs to have in order to be able to see or edit the folder/file. For example, Admin . As specified in the screenshot below, only users with the role Admin can read Folder C that can be accessed by following the path /Folder A/FolderC . The constraints created in the Constraints view are also visible in the Access view. 
Related content Access View Documents View Documents Perspective","title":"Constraints View"},{"location":"development/ide/views/database/","text":"Database View The Database view gives you direct access to the configured data source(s). It enables you to expand the schema item and see the list of all tables and views created either via the data structures models or directly via SQL script in SQL View . Note All created tables can be discovered under the PUBLIC schema (for local deployment with H2 database) . The PUBLIC schema will appear, after the local data source type and the DefaultDB data source are selected in the upper right corner.","title":"Database"},{"location":"development/ide/views/database/#database-view","text":"The Database view gives you direct access to the configured data source(s). It enables you to expand the schema item and see the list of all tables and views created either via the data structures models or directly via SQL script in SQL View . Note All created tables can be discovered under the PUBLIC schema (for local deployment with H2 database) . 
The PUBLIC schema will appear, after the local data source type and the DefaultDB data source are selected in the upper right corner.","title":"Database View"},{"location":"development/ide/views/datastructures/","text":"Data Structures View The Data Structures view lists all data structures defined in the following files: *.table - the table layout definition in JSON *.view - the view layout definition in JSON *.schema - the schema layout definition in JSON *.append - append mode data file in DSV *.delete - delete mode data file in DSV *.update - update mode data file in DSV *.replace - replace mode data file in DSV More info about the type of the artifacts you can find in Artifacts .","title":"Data Structures"},{"location":"development/ide/views/datastructures/#data-structures-view","text":"The Data Structures view lists all data structures defined in the following files: *.table - the table layout definition in JSON *.view - the view layout definition in JSON *.schema - the schema layout definition in JSON *.append - append mode data file in DSV *.delete - delete mode data file in DSV *.update - update mode data file in DSV *.replace - replace mode data file in DSV More info about the type of the artifacts you can find in Artifacts .","title":"Data Structures View"},{"location":"development/ide/views/debugger/","text":"Debugger View The Debugger view enables you to navigate the debugging of your code. You can: Start Pause Restart Proceed step by step This view includes a few panes that are helpful during the debugging process. See below for more details. Scope When you're paused on a line of code, the Scope pane shows you what local and global variables are currently defined, along with the value of each variable. It also shows closure variables, when applicable. Double-click a variable value to edit it. When you're not paused on a line of code, the Scope pane is empty. Breakpoints The Breakpoints pane shows any line-of-code breakpoints you've added to your code. 
As the name suggests, you can use a line-of-code breakpoint when you've got a specific line of code that you want to pause on. As you can see in the Breakpoints pane, currently there are two breakpoints added: \"Unnamed\" at row 5 and \"Unnamed\" at row 8. Debug Preview This pane displays the result of executing the debugged file. The Debug Preview is similar in functionality to the Preview view. Related content Console view Debugger perspective","title":"Debugger"},{"location":"development/ide/views/debugger/#debugger-view","text":"The Debugger view enables you to navigate the debugging of your code. You can: Start Pause Restart Proceed step by step This view includes a few panes that are helpful during the debugging process. See below for more details. Scope When you're paused on a line of code, the Scope pane shows you what local and global variables are currently defined, along with the value of each variable. It also shows closure variables, when applicable. Double-click a variable value to edit it. When you're not paused on a line of code, the Scope pane is empty. Breakpoints The Breakpoints pane shows any line-of-code breakpoints you've added to your code. As the name suggests, you can use a line-of-code breakpoint when you've got a specific line of code that you want to pause on. As you can see in the Breakpoints pane, currently there are two breakpoints added: \"Unnamed\" at row 5 and \"Unnamed\" at row 8. Debug Preview This pane displays the result of executing the debugged file. The Debug Preview is similar in functionality to the Preview view. Related content Console view Debugger perspective","title":"Debugger View"},{"location":"development/ide/views/discussions/","text":"Discussions View The Discussions view adds forum-like capabilities to the Eclipse Dirigible's UI. You can review and rate comments, as well as participate in the discussion by commenting under topics. 
There's also the possibility to toggle between thread view and timeline view for each discussion.","title":"Discussions"},{"location":"development/ide/views/discussions/#discussions-view","text":"The Discussions view adds forum-like capabilities to the Eclipse Dirigible's UI. You can review and rate comments, as well as participate in the discussion by commenting under topics. There's also the possibility to toggle between thread view and timeline view for each discussion.","title":"Discussions View"},{"location":"development/ide/views/documents/","text":"Documents View The Documents view enables you to manage the binary artifacts such as pictures, spreadsheets, PDF, etc. You can upload, overwrite, download, delete, and search for artifacts. Related content Access View Constraints View Documents Perspective","title":"Documents"},{"location":"development/ide/views/documents/#documents-view","text":"The Documents view enables you to manage the binary artifacts such as pictures, spreadsheets, PDF, etc. You can upload, overwrite, download, delete, and search for artifacts. Related content Access View Constraints View Documents Perspective","title":"Documents View"},{"location":"development/ide/views/extensions/","text":"Extensions View The Extensions view lists all defined extensions and extension points through *.extension and *.extensionpoint descriptor. More info about the type of the artifacts you can find here","title":"Extensions"},{"location":"development/ide/views/extensions/#extensions-view","text":"The Extensions view lists all defined extensions and extension points through *.extension and *.extensionpoint descriptor. More info about the type of the artifacts you can find here","title":"Extensions View"},{"location":"development/ide/views/git/","text":"Git View The Git view enables you to perform simple Git operations such as cloning a repository to a workspace, pulling changes, and pushing commits. 
The user can create, manage, and switch between multiple workspaces through the Workspace menu. Related content Console view Staging view History view","title":"Git"},{"location":"development/ide/views/git/#git-view","text":"The Git view enables you to perform simple Git operations such as cloning a repository to a workspace, pulling changes, and pushing commits. The user can create, manage, and switch between multiple workspaces through the Workspace menu. Related content Console view Staging view History view","title":"Git View"},{"location":"development/ide/views/history/","text":"History View The History view provides a commit history record that includes ID, message, author, and time of each commit.","title":"History"},{"location":"development/ide/views/history/#history-view","text":"The History view provides a commit history record that includes ID, message, author, and time of each commit.","title":"History View"},{"location":"development/ide/views/import/","text":"Import View The Import view enables the user to upload a *.zip file, containing one or more projects, to the selected Workspace . The view includes a progress bar for navigation of the process. The user can manage and switch between multiple workspaces through the Workspace menu.","title":"Import"},{"location":"development/ide/views/import/#import-view","text":"The Import view enables the user to upload a *.zip file, containing one or more projects, to the selected Workspace . The view includes a progress bar for navigation of the process. The user can manage and switch between multiple workspaces through the Workspace menu.","title":"Import View"},{"location":"development/ide/views/jobs/","text":"Jobs View The Jobs view lists all registered custom jobs scheduled for execution in a *.job file. 
More info about the type of the artifacts you can find here","title":"Jobs"},{"location":"development/ide/views/jobs/#jobs-view","text":"The Jobs view lists all registered custom jobs scheduled for execution in a *.job file. More info about the type of the artifacts you can find here","title":"Jobs View"},{"location":"development/ide/views/listeners/","text":"Listeners View The Listeners view shows all message listeners registered by the *.listener files. Their type depends on the type of the message hub - topic or queue. More info about the type of the artifacts you can find in Artifacts .","title":"Listeners"},{"location":"development/ide/views/listeners/#listeners-view","text":"The Listeners view shows all message listeners registered by the *.listener files. Their type depends on the type of the message hub - topic or queue. More info about the type of the artifacts you can find in Artifacts .","title":"Listeners View"},{"location":"development/ide/views/logs/","text":"Logs View The Logs view lists all available log files.","title":"Logs"},{"location":"development/ide/views/logs/#logs-view","text":"The Logs view lists all available log files.","title":"Logs View"},{"location":"development/ide/views/plugins/","text":"Plugins Info The Plugins view is currently in an initial stage of development and does not have all features. Overview The Plugins view contains a list of plugins that you can install in Dirigible. Each plugin name is a link that leads to a page containing more information about it. Installing a plugin Once you have a running Eclipse Dirigible instance, you can navigate to the Plugins view: Choose Window \u2192 Show View \u2192 Plugins . 
Install the plugin.","title":"Plugins"},{"location":"development/ide/views/plugins/#plugins","text":"Info The Plugins view is currently in an initial stage of development and does not have all features.","title":"Plugins"},{"location":"development/ide/views/plugins/#overview","text":"The Plugins view contains a list of plugins that you can install in Dirigible. Each plugin name is a link that leads to a page containing more information about it.","title":"Overview"},{"location":"development/ide/views/plugins/#installing-a-plugin","text":"Once you have a running Eclipse Dirigible instance, you can navigate to the Plugins view: Choose Window \u2192 Show View \u2192 Plugins . Install the plugin.","title":"Installing a plugin"},{"location":"development/ide/views/preview/","text":"Preview View The Preview view displays the result of executing the selected file. It refreshes automatically during Workspace change events e.g. Save.","title":"Previews"},{"location":"development/ide/views/preview/#preview-view","text":"The Preview view displays the result of executing the selected file. It refreshes automatically during Workspace change events e.g. Save.","title":"Preview View"},{"location":"development/ide/views/registry/","text":"Registry View Technically, the Registry is a space within the Repository where all the published artifacts are placed. Caution Editing of the file contents via the Registry perspective is not recommended as it can lead to inconsistencies!","title":"Registry"},{"location":"development/ide/views/registry/#registry-view","text":"Technically, the Registry is a space within the Repository where all the published artifacts are placed. Caution Editing of the file contents via the Registry perspective is not recommended as it can lead to inconsistencies!","title":"Registry View"},{"location":"development/ide/views/repository/","text":"Repository View The Repository view gives access to the raw structure of the underlying Repository content. 
There you can inspect at low level the project and folder structure, as well as the artifacts content. The view enables the user to create new collections and resources, to delete existing ones, or to export them. Caution Editing of the file contents via the Repository perspective is not recommended as it can lead to inconsistencies!","title":"Repository View"},{"location":"development/ide/views/resultview/","text":"The Result view graphically shows you the result of a script executed via the SQL View , or the content of a table when you select Show Content in the Database view .","title":"Result"},{"location":"development/ide/views/resultview/#result-view","text":"The Result view graphically shows you the result of a script executed via the SQL View , or the content of a table when you select Show Content in the Database view .","title":"Result View"},{"location":"development/ide/views/roles/","text":"Roles View The Roles view lists all security roles defined in the roles descriptor *.roles . More info about the type of the artifacts you can find in Artifacts .","title":"Roles"},{"location":"development/ide/views/roles/#roles-view","text":"The Roles view lists all security roles defined in the roles descriptor *.roles . More info about the type of the artifacts you can find in Artifacts .","title":"Roles View"},{"location":"development/ide/views/search/","text":"Search View The Search view enables the user to make a free-text search in the selected workspace. 
The user can switch between multiple workspaces through the Workspace menu.","title":"Search"},{"location":"development/ide/views/search/#search-view","text":"The Search view enables the user to make a free-text search in the selected workspace. The user can switch between multiple workspaces through the Workspace menu.","title":"Search View"},{"location":"development/ide/views/snapshot/","text":"Snapshot View The Snapshot view enables the user to upload the whole repository (including all users' Workspaces) and all registry public contents. It includes a progress bar for monitoring the process.","title":"Snapshot"},{"location":"development/ide/views/snapshot/#snapshot-view","text":"The Snapshot view enables the user to upload the whole repository (including all users' Workspaces) and all registry public contents. It includes a progress bar for monitoring the process.","title":"Snapshot View"},{"location":"development/ide/views/sql/","text":"SQL View The SQL view is one of the most powerful tools for database management. In the SQL console you can enter and execute SQL scripts compliant with the underlying database system. Note Scripts are executed by pressing: Windows : Ctrl + X Mac : Cmd + X In the Database View you can press the refresh button, and preview the data by selecting Show Content . You get the result of the execution in the Results view below.","title":"SQL"},{"location":"development/ide/views/sql/#sql-view","text":"The SQL view is one of the most powerful tools for database management. In the SQL console you can enter and execute SQL scripts compliant with the underlying database system. Note Scripts are executed by pressing: Windows : Ctrl + X Mac : Cmd + X In the Database View you can press the refresh button, and preview the data by selecting Show Content .
You get the result of the execution in the Results view below.","title":"SQL View"},{"location":"development/ide/views/staging/","text":"Staging View The Staging view provides a visual alternative to executing Git commands from a terminal. You can manage your locally changed files and prepare them for pushing to your remote repository. Unstaged Files - all files that you've changed are listed here. However, these changes aren't yet ready to be bound to a commit. For this purpose, you have to stage them. Use the downward arrow to move files from the unstaged to the staged state. Staged Files - all files that you've changed and staged are listed here. These are ready to be bound to a commit. Use the upward arrow to move files from the staged back to the unstaged state. Commit Message - provide details about the changes included in your commit. Username , Password , Email - provide your authentication credentials. Commit & Push - commit your changes and directly push them to your remote repository. Commit - commit your changes without pushing them. This way, you can organize your changes in several commits and push them together.","title":"Staging"},{"location":"development/ide/views/staging/#staging-view","text":"The Staging view provides a visual alternative to executing Git commands from a terminal. You can manage your locally changed files and prepare them for pushing to your remote repository. Unstaged Files - all files that you've changed are listed here. However, these changes aren't yet ready to be bound to a commit. For this purpose, you have to stage them. Use the downward arrow to move files from the unstaged to the staged state. Staged Files - all files that you've changed and staged are listed here. These are ready to be bound to a commit. Use the upward arrow to move files from the staged back to the unstaged state. Commit Message - provide details about the changes included in your commit. Username , Password , Email - provide your authentication credentials.
Commit & Push - commit your changes and directly push them to your remote repository. Commit - commit your changes without pushing them. This way, you can organize your changes in several commits and push them together.","title":"Staging View"},{"location":"development/ide/views/terminal/","text":"Terminal View Via the Terminal view, you can execute OS commands. Examples: Linux OS: ls -al Microsoft Windows OS: dir","title":"Terminal"},{"location":"development/ide/views/terminal/#terminal-view","text":"Via the Terminal view, you can execute OS commands. Examples: Linux OS: ls -al Microsoft Windows OS: dir","title":"Terminal View"},{"location":"development/ide/views/websockets/","text":"Web Sockets View The Web Sockets view lists all the connections that Dirigible has currently established with other endpoints. The different properties and sections are: Location Endpoint Handler Created Creator","title":"Web Sockets"},{"location":"development/ide/views/websockets/#web-sockets-view","text":"The Web Sockets view lists all the connections that Dirigible has currently established with other endpoints. The different properties and sections are: Location Endpoint Handler Created Creator","title":"Web Sockets View"},{"location":"development/ide/views/workspace/","text":"Workspace View The Workspace is the place where developers create and manage the application artifacts. The first-level citizens of the workspace are the projects. With Eclipse Dirigible, the users can create, manage, and switch between multiple workspaces through the Workspace view. Each project can contain multiple folders and files (artifacts). The new template-based project and artifact scaffolding generators are worth mentioning. The projects' file organization is now non-normative and entirely up to the preferences of the users. The IDE supports multiple editors registered for different file (MIME) types.
More than one editor can be registered for one file type; in this case, an \"Open with\u2026\" context menu entry is rendered so the user can select which one to use. The Workspace explorer displays a standard view of the projects in your workspace . It shows the folder structure along with the files. There is a context menu assigned to the project node: Via this context menu, you can create new artifacts such as: Database Table Database View Database Schema Model Entity Data Model JavaScript Service HTML5 Page Scheduled Job Message Listener Business Process Model Access Constraints Roles Definitions or just regular ones: File Folder More info about the artifact types can be found here . When selecting an artifact, you can use the \"Open\" or \"Open With\" actions to load its content in the corresponding editor, for example, Monaco Editor . A single user can have multiple workspaces, containing different sets of projects. The artifact and project management can be done via the views and editors in the Workbench Perspective .","title":"Workspace"},{"location":"development/ide/views/workspace/#workspace-view","text":"The Workspace is the place where developers create and manage the application artifacts. The first-level citizens of the workspace are the projects. With Eclipse Dirigible, the users can create, manage, and switch between multiple workspaces through the Workspace view. Each project can contain multiple folders and files (artifacts). The new template-based project and artifact scaffolding generators are worth mentioning. The projects' file organization is now non-normative and entirely up to the preferences of the users. The IDE supports multiple editors registered for different file (MIME) types. More than one editor can be registered for one file type; in this case, an \"Open with\u2026\" context menu entry is rendered so the user can select which one to use.
The Workspace explorer displays a standard view of the projects in your workspace . It shows the folder structure along with the files. There is a context menu assigned to the project node: Via this context menu, you can create new artifacts such as: Database Table Database View Database Schema Model Entity Data Model JavaScript Service HTML5 Page Scheduled Job Message Listener Business Process Model Access Constraints Roles Definitions or just regular ones: File Folder More info about the artifact types can be found here . When selecting an artifact, you can use the \"Open\" or \"Open With\" actions to load its content in the corresponding editor, for example, Monaco Editor . A single user can have multiple workspaces, containing different sets of projects. The artifact and project management can be done via the views and editors in the Workbench Perspective .","title":"Workspace View"},{"location":"overview/","text":"The Eclipse Dirigible Project Eclipse Dirigible is an open source project that provides an Integrated Development Environment as a Service (IDEaaS), as well as integrated runtime execution engines. The applications created with Eclipse Dirigible comply with the Dynamic Applications concept and structure. The main project goal is to provide all required capabilities needed to develop and run end-to-end vertical applications in the cloud in the shortest possible time. The environment itself runs directly in a browser and therefore does not require additional downloads and installations. It packs all the needed components, which makes it a self-contained and well-integrated software stack that can be deployed on any Java-based Web server, such as Tomcat, Jetty, JBoss, etc. The Eclipse Dirigible project came out of an internal SAP initiative to address the extension and adaptation use cases related to SOA and Enterprise Services. On one hand, the project applied the lessons learned from the standard tools and approaches so far.
On the other hand, it added features aligned with the most recent technologies and architectural patterns related to Web 2.0 and HTML5 . This made it complete enough to be used as the only environment needed for building and running applications in the cloud. From the beginning, the project follows the principles of Simplicity, Openness, Agility, Completeness, and Perfection, which provide a sustainable environment where maximum impact is achieved with minimal effort. The Features section describes in detail what is included in the project. The Concepts section gives you an overview of the internals and the chosen patterns. The Samples section shows you how to start and build your first dynamic Web application in seconds.","title":"The Eclipse Dirigible Project"},{"location":"overview/#the-eclipse-dirigible-project","text":"Eclipse Dirigible is an open source project that provides an Integrated Development Environment as a Service (IDEaaS), as well as integrated runtime execution engines. The applications created with Eclipse Dirigible comply with the Dynamic Applications concept and structure. The main project goal is to provide all required capabilities needed to develop and run end-to-end vertical applications in the cloud in the shortest possible time. The environment itself runs directly in a browser and therefore does not require additional downloads and installations. It packs all the needed components, which makes it a self-contained and well-integrated software stack that can be deployed on any Java-based Web server, such as Tomcat, Jetty, JBoss, etc. The Eclipse Dirigible project came out of an internal SAP initiative to address the extension and adaptation use cases related to SOA and Enterprise Services. On one hand, the project applied the lessons learned from the standard tools and approaches so far. On the other hand, it added features aligned with the most recent technologies and architectural patterns related to Web 2.0 and HTML5 .
This made it complete enough to be used as the only environment needed for building and running applications in the cloud. From the beginning, the project follows the principles of Simplicity, Openness, Agility, Completeness, and Perfection, which provide a sustainable environment where maximum impact is achieved with minimal effort. The Features section describes in detail what is included in the project. The Concepts section gives you an overview of the internals and the chosen patterns. The Samples section shows you how to start and build your first dynamic Web application in seconds.","title":"The Eclipse Dirigible Project"},{"location":"overview/architecture/","text":"Architecture The Eclipse Dirigible architecture follows the well-proven principles of simplicity and scalability in the classical service-oriented architecture. The components are separated between the design time (definition work, modeling, scripting) and the runtime (execution of services, content provisioning, and monitoring). The transition between design time and runtime is achieved with a repository component. The only linking part is the content itself. At design time, the programmers and designers use the Web-based integrated development environment Web IDE . This tooling is based on the most popular client-side JavaScript framework - AngularJS - as well as Bootstrap for theming and GoldenLayout for window management. The runtime components provide the cloud application after you create it. The underlying technology platform is a Java-Web-Profile-compliant application server (such as Tomcat). On top are the Eclipse Dirigible containers for service execution. Depending on the scripting language and purpose, they can be: GraalVM JS Mylyn Lucene Quartz ActiveMQ Flowable Mustache Chemistry The runtime can be scaled independently from the design time and can be deployed without the design time at all (for productive landscapes).
Depending on the target cloud platform, you can integrate the services provided by the underlying technology platform in Eclipse Dirigible.","title":"Architecture"},{"location":"overview/architecture/#architecture","text":"The Eclipse Dirigible architecture follows the well-proven principles of simplicity and scalability in the classical service-oriented architecture. The components are separated between the design time (definition work, modeling, scripting) and the runtime (execution of services, content provisioning, and monitoring). The transition between design time and runtime is achieved with a repository component. The only linking part is the content itself. At design time, the programmers and designers use the Web-based integrated development environment Web IDE . This tooling is based on the most popular client-side JavaScript framework - AngularJS - as well as Bootstrap for theming and GoldenLayout for window management. The runtime components provide the cloud application after you create it. The underlying technology platform is a Java-Web-Profile-compliant application server (such as Tomcat). On top are the Eclipse Dirigible containers for service execution. Depending on the scripting language and purpose, they can be: GraalVM JS Mylyn Lucene Quartz ActiveMQ Flowable Mustache Chemistry The runtime can be scaled independently from the design time and can be deployed without the design time at all (for productive landscapes). Depending on the target cloud platform, you can integrate the services provided by the underlying technology platform in Eclipse Dirigible.","title":"Architecture"},{"location":"overview/credits/","text":"Credits and Special Thanks We would like to say a big THANK YOU!
to all the open source projects that we use as components of our platform: GraalJS Mylyn CXF Derby Commons HttpClient Xerces Xalan WS Log4j Batik Velocity Quartz Spring Framework StaX Gson Antlr Hamcrest wsdl4j Slf4j jsoap ICU Mockito AOP Alliance jQuery Bootstrap AngularJS GoldenLayout Flowable Monaco Xtermjs ttyd acorn MkDocs Material for MkDocs unDraw and those who boosted our productivity in the past versions: Rhino Eclipse Equinox Eclipse OSGi Remote Application Platform Eclipse Orion Camel Ant Geronimo Felix JUnit Avalon JAF jRuby ACE Editor ASM Woodstox Jettison Groovy CyberNeko HTML EZMorph JCraft JLine","title":"Credits"},{"location":"overview/credits/#credits-and-special-thanks","text":"We would like to say a big THANK YOU! to all the open source projects that we use as components of our platform: GraalJS Mylyn CXF Derby Commons HttpClient Xerces Xalan WS Log4j Batik Velocity Quartz Spring Framework StaX Gson Antlr Hamcrest wsdl4j Slf4j jsoap ICU Mockito AOP Alliance jQuery Bootstrap AngularJS GoldenLayout Flowable Monaco Xtermjs ttyd acorn MkDocs Material for MkDocs unDraw and those who boosted our productivity in the past versions: Rhino Eclipse Equinox Eclipse OSGi Remote Application Platform Eclipse Orion Camel Ant Geronimo Felix JUnit Avalon JAF jRuby ACE Editor ASM Woodstox Jettison Groovy CyberNeko HTML EZMorph JCraft JLine","title":"Credits and Special Thanks"},{"location":"overview/editors-modelers/","text":"Editors & Modelers Editors List Monaco - the editor that powers VS Code . Modelers List Entity Data Modeler - design a domain model. Database Schema Modeler - design a database schema. BPMN Modeler - design a business process.
Form Designer - design a Web form.","title":"Editors & Modelers"},{"location":"overview/editors-modelers/#editors-modelers","text":"","title":"Editors & Modelers"},{"location":"overview/editors-modelers/#editors-list","text":"Monaco - the editor that powers VS Code .","title":"Editors List"},{"location":"overview/editors-modelers/#modelers-list","text":"Entity Data Modeler - design a domain model. Database Schema Modeler - design a database schema. BPMN Modeler - design a business process. Form Designer - design a Web form.","title":"Modelers List"},{"location":"overview/engines/","text":"Engines Engines List Javascript GraalVM JS - a Javascript module based on the GraalVM JS engine. Web - serving the static content via the underlying web container's capabilities e.g. Apache Tomcat . Wiki Markdown - a Wiki engine supporting the Markdown markup language, based on the underlying Mylyn framework. BPM - an engine supporting the BPMN specification, based on Flowable . OData - expose OData services from database tables/views. Command - execute shell commands and bash scripts. Deprecated Javascript Rhino - a Javascript module based on the Mozilla Rhino engine. Javascript Nashorn - a Javascript module based on the built-in Java Nashorn engine. Javascript V8 - a Javascript module based on the Chrome V8 engine.","title":"Engines"},{"location":"overview/engines/#engines","text":"","title":"Engines"},{"location":"overview/engines/#engines-list","text":"Javascript GraalVM JS - a Javascript module based on the GraalVM JS engine. Web - serving the static content via the underlying web container's capabilities e.g. Apache Tomcat . Wiki Markdown - a Wiki engine supporting the Markdown markup language, based on the underlying Mylyn framework. BPM - an engine supporting the BPMN specification, based on Flowable . OData - expose OData services from database tables/views.
Command - execute shell commands and bash scripts.","title":"Engines List"},{"location":"overview/engines/#deprecated","text":"Javascript Rhino - a Javascript module based on the Mozilla Rhino engine. Javascript Nashorn - a Javascript module based on the built-in Java Nashorn engine. Javascript V8 - a Javascript module based on the Chrome V8 engine.","title":"Deprecated"},{"location":"overview/faq/","text":"If you have a question that is not covered here, but it should be, please let us know . Concepts In-System Development In-System Development is a programming model used when you work directly on a live system. Avoid the side-effects of a simulated (local) environment by working on a live system. Access live data via the same channel that will be used in production. All the dependencies and integrations are in place as they will be in production. Shortest development turn-around time. Short life-cycle management process. Vertical Scenarios & Horizontal Scaling Covering end-to-end scenarios, including all the application layers from an architecture perspective, as well as all the development process phases from a project management perspective. All or nothing \u2013 partial doesn't count. Equal runtime instances based on a single content package for simple and reliable management. Content-Centric & Centralized Repository All application artifacts are in a single repository. Operational repository vs SCM repository. During development, an IO-optimized repository is used. After the code is ready, it is committed to SCM - a repository optimized for versioning, inspection, and support. Simple life-cycle management and transport. Workspace and Public Registry separation based on the development life-cycle phases. Dynamic Languages A perfect match for Dynamic Applications - built for change. Can interpret (rather than compile) the execution of tasks. Existing smooth integration within the web servers. No restart required.
Java is used for the core components of the platform, while JavaScript is for the application business logic (the glue code). Injected Services Available out-of-the-box for developers \u2013 request, response, datasource, http, CMIS storage, BPMN engine, wiki, indexer, user, etc. Standardized API for cloud developers. Implementations in different languages can be integrated via the extension point. Different providers' implementations can be exposed to developers on their cloud. Integration Services Why are integration services part of the core? Cloud applications usually are extensions to packaged software (on-premise or on-demand). Reuse of third-party services is very common in this context. Replication use-case - a major scenario for on-premise to on-demand cross-platform applications. Scheduled jobs are usually needed as asynchronous activities. Semantic separation of integration and orchestration services from the other general purpose services. Extensibility Why is the extensibility important and for whom? Software vendor's code vs customer's specific extension's code. Update and Upgrade issues. Business agility depends on the process changeability. Bilateral extension-points and extensions descriptors. Web IDE Why does it look like Eclipse in a web browser? Why not a more webby style? Lower barrier for Eclipse developers. An overall experience comfortable for developers, proven for years by on-premise tools. Use of Resource-like APIs and concepts. There are some themes you can choose from the menu for a more \"webby\" look and feel. Decisions GraalJS Why GraalJS ? What about Rhino, Nashorn and V8? A mature engine with the best performance. Built-in debugger with a simple API. Possibility to invoke standard Java objects directly, which is of course not recommended. Angular, Bootstrap & GoldenLayout Why did we move from RAP to the Angular, Bootstrap, and GoldenLayout web frameworks?
RAP is an Eclipse framework providing rendering of the user interface for standard SWT/JFace widgets remotely, e.g. in a browser. For us, it came down to the following: RAP is a mature framework and depends on a reliable API, but it is not so attractive for pure web developers (HTML, JavaScript, etc.). RAP is a stable framework with great support, but the same could be said for Angular 1.x and Bootstrap 3.x. RAP relies on standard modularization \u2013 OSGi, plugins \u2013 but comes with the complexity of Maven, Tycho, OSGi, Orbit, etc. integration. In RAP, developers can write mostly in pure Java with all the benefits it brings by itself, but for web developers it turns out to be not a benefit but a drawback. In RAP, one can have single-sourced components - reuse of existing functionality written as Eclipse plugins - which has never happened in reality. RAP has the possibility to integrate non-Java modules as well (pure client-side HTML and JavaScript) via the browser component, but it is much more complex than pure web coding. JSON Models Why JSON for models? JSON is a very simple data exchange format. We have chosen it as the standard format for all the models. Simple enough and human readable/writable. Supported by mature frameworks for parsing/serializing. Quite popular and proven in the context of web applications. Flat Data Models Why flat data models? Proven by many business applications for years. Straightforward implementation on a relational database. Easy to be understood and used by the developers. The tools for it are also simple and easy to use. REST Why REST instead of server-side generation? We leverage the use of the REST paradigm for the cloud applications created with the toolkit. There are plenty of reasons for this, already well described in blogs related to Web 2.0. Clean separation of the data services from the user interface. Independent development of both, including easy mocking. Possibility of reuse and/or composition of services in different user interfaces.
Possibility of UI-less integration if needed. Better operations and support. Publish Why Publish? Developers can work safely on multiple workspaces. \"Publish\" transfers the artifacts to the central registry space for public use. One-Time-Generation Why one-time-generation? It is enough to boost productivity in some cases. MDA is also supported via the Entity Data Modeler. No OSGi OSGi is the only real modularization framework for Java, but it comes with much more complexity than needed for our case. We moved from OSGi to simple Maven dependency management, with Java Services and Guice for runtime injections in the backend. How to How to build my own Dirigible? It is a standard Maven-based project, so: git clone cd dirigible mvn clean install should work. How to add my own templates? It is quite easy - create a project with a layout similar to the ones from DirigibleLabs How to integrate my Java framework? It is even simpler - add it during the packaging phase as a regular Maven module to be packaged in the WAR or the executable JAR files. How to register my Enterprise JavaScript API? Once you make your core framework available as a Maven module packaged into your WAR file, you can implement your own Enterprise JavaScript API facade. How to integrate my non-Java framework? It depends on the particular framework. Usually, it is done via the Command feature. Please contact us in case of interest. How to integrate my dynamic language? There is an Engine API which can be implemented, as well as a REST service which can execute the code. Warning Please contact us if you plan such an integration.","title":"FAQ"},{"location":"overview/faq/#concepts","text":"In-System Development In-System Development is a programming model used when you work directly on a live system. Avoid the side-effects of a simulated (local) environment by working on a live system. Access live data via the same channel that will be used in production.
All the dependencies and integrations are in place as they will be in production. Shortest development turn-around time. Short life-cycle management process. Vertical Scenarios & Horizontal Scaling Covering end-to-end scenarios, including all the application layers from an architecture perspective, as well as all the development process phases from a project management perspective. All or nothing \u2013 partial doesn't count. Equal runtime instances based on a single content package for simple and reliable management. Content-Centric & Centralized Repository All application artifacts are in a single repository. Operational repository vs SCM repository. During development, an IO-optimized repository is used. After the code is ready, it is committed to SCM - a repository optimized for versioning, inspection, and support. Simple life-cycle management and transport. Workspace and Public Registry separation based on the development life-cycle phases. Dynamic Languages A perfect match for Dynamic Applications - built for change. Can interpret (rather than compile) the execution of tasks. Existing smooth integration within the web servers. No restart required. Java is used for the core components of the platform, while JavaScript is for the application business logic (the glue code). Injected Services Available out-of-the-box for developers \u2013 request, response, datasource, http, CMIS storage, BPMN engine, wiki, indexer, user, etc. Standardized API for cloud developers. Implementations in different languages can be integrated via the extension point. Different providers' implementations can be exposed to developers on their cloud. Integration Services Why are integration services part of the core? Cloud applications usually are extensions to packaged software (on-premise or on-demand). Reuse of third-party services is very common in this context. Replication use-case - a major scenario for on-premise to on-demand cross-platform applications.
Scheduled jobs are usually needed as asynchronous activities. Semantic separation of integration and orchestration services from the other general purpose services. Extensibility Why is the extensibility important and for whom? Software vendor's code vs customer's specific extension's code. Update and Upgrade issues. Business agility depends on the process changeability. Bilateral extension-points and extensions descriptors. Web IDE Why does it look like Eclipse in a web browser? Why not a more webby style? Lower barrier for Eclipse developers. An overall experience comfortable for developers, proven for years by on-premise tools. Use of Resource-like APIs and concepts. There are some themes you can choose from the menu for a more \"webby\" look and feel.","title":"Concepts"},{"location":"overview/faq/#decisions","text":"GraalJS Why GraalJS ? What about Rhino, Nashorn and V8? A mature engine with the best performance. Built-in debugger with a simple API. Possibility to invoke standard Java objects directly, which is of course not recommended. Angular, Bootstrap & GoldenLayout Why did we move from RAP to the Angular, Bootstrap, and GoldenLayout web frameworks? RAP is an Eclipse framework providing rendering of the user interface for standard SWT/JFace widgets remotely, e.g. in a browser. For us, it came down to the following: RAP is a mature framework and depends on a reliable API, but it is not so attractive for pure web developers (HTML, JavaScript, etc.). RAP is a stable framework with great support, but the same could be said for Angular 1.x and Bootstrap 3.x. RAP relies on standard modularization \u2013 OSGi, plugins \u2013 but comes with the complexity of Maven, Tycho, OSGi, Orbit, etc. integration. In RAP, developers can write mostly in pure Java with all the benefits it brings by itself, but for web developers it turns out to be not a benefit but a drawback. In RAP, one can have single-sourced components - reuse of existing functionality written as Eclipse plugins - which has never happened in reality.
RAP has the possibility to integrate non-Java modules as well (pure client-side HTML and JavaScript) via the browser component, but it is much more complex than pure web coding. JSON Models Why JSON for models? JSON is a very simple data exchange format. We have chosen it as the standard format for all the models. Simple enough and human readable/writable. Supported by mature frameworks for parsing/serializing. Quite popular and proven in the context of web applications. Flat Data Models Why flat data models? Proven by many business applications for years. Straightforward implementation on a relational database. Easy to be understood and used by the developers. The tools for it are also simple and easy to use. REST Why REST instead of server-side generation? We leverage the use of the REST paradigm for the cloud applications created with the toolkit. There are plenty of reasons for this, already well described in blogs related to Web 2.0. Clean separation of the data services from the user interface. Independent development of both, including easy mocking. Possibility of reuse and/or composition of services in different user interfaces. Possibility of UI-less integration if needed. Better operations and support. Publish Why Publish? Developers can work safely on multiple workspaces. \"Publish\" transfers the artifacts to the central registry space for public use. One-Time-Generation Why one-time-generation? It is enough to boost productivity in some cases. MDA is also supported via the Entity Data Modeler. No OSGi OSGi is the only real modularization framework for Java, but it comes with much more complexity than needed for our case. We moved from OSGi to simple Maven dependency management, with Java Services and Guice for runtime injections in the backend.","title":"Decisions"},{"location":"overview/faq/#how-to","text":"How to build my own Dirigible? It is a standard Maven-based project, so: git clone cd dirigible mvn clean install should work. How to add my own templates?
It is quite easy - create a project with a layout similar to the ones from DirigibleLabs How to integrate my Java framework? It is even simpler - add it during the packaging phase as a regular Maven module to be packaged in the WAR or the executable JAR files. How to register my Enterprise JavaScript API? Once you make your core framework available as a Maven module packaged into your WAR file, you can implement your own Enterprise JavaScript API facade. How to integrate my non-Java framework? It depends on the particular framework. Usually, it is via the Command feature. Please contact us in case of interest. How to integrate my dynamic language? There is an Engine API which can be implemented, as well as a REST service which can execute the code. Warning Please contact us if you plan such an integration.","title":"How to"},{"location":"overview/features/","text":"Features Note The feature set listed below contains only the major part of what is currently available. For more insights on what can be done with Eclipse Dirigible, we recommend trying it out . Data Structures Creation of table model (JSON formatted *.table descriptor) and actual creation of the corresponding database table during publishing. Creation of view model (JSON formatted *.view descriptor) and actual creation of the corresponding database view during publishing. Creation of delimiter-separated values ( *.append , *.update , *.delete , *.replace ) data files and populating the corresponding database table during publishing. Automatic altering of existing tables from the models on compatible changes (new columns added). Modeling of the database schema ( *.dsm and *.schema ) files and creation of the tables, views, and constraints during publishing. Scripting Services Support of JavaScript language by using GraalVM JS as the runtime execution engine ( *.js ). Support for TypeScript services ( *.ts ). 
Support of strictly defined enterprise API for JavaScript to be used by the business application developers. Web Content Support of client-side Web-related artifacts, such as HTML, CSS, JS, pictures, etc. Wiki Content Support of Markdown format for Wiki pages. Integration Services Support of listeners for messages from the built-in message bus ( *.listener ). Support of scheduled jobs as triggers for backend services invocation ( *.job ). Support of business processes defined in BPMN 2.0 and executed by the underlying BPM process engine ( *.bpmn ). Support of shell commands execution ( *.command ). Support of OData 2.0 ( *.odata ). Support of websockets ( *.websocket ). Mobile Applications Support of native mobile application development via Tabris.js . Extension Definitions Creation of extension points (JSON formatted descriptor - *.extensionpoint ). Creation of extensions by a given extension point (JSON formatted descriptor - *.extension ). Tooling Workbench perspective for full support of project management (New, Cut, Copy, Paste, Delete, Refresh, Import, Export, etc.) Database perspective for RDBMS management including SQL Console Enhanced code editor with highlighting support for JavaScript, HTML, JSON, XML, etc. 
Preview view for easy testing of changes in Web, Wiki, and Scripting Services Configurable Logs view , which provides server-side logs and traces Lots of template-based wizards for creating new content and services Import and export of project content Documents perspective for import of binary files for external documents and pictures Repository perspective for low-level repository content management Debugger perspective for debugging backend JavaScript services Terminal perspective with the corresponding main view for execution of shell commands on the target instance's OS Modeling Modeling of database schema ( *.dsm and *.schema ) files with Database Schema Modeler Modeling of entity data model ( *.edm and *.model ) files with Entity Data Modeler Modeling of BPMN process ( *.bpmn ) files with BPMN Modeler Modeling of Web form layout ( *.form ) files with Form Designer Security Role-based access management for Web services as well as the document repository Security constraints model (JSON formatted *.access ) support Several predefined roles, which can be used out-of-the-box (Everyone, Administrator, Manager, PowerUser, User, ReadWrite, ReadOnly) Registry Publishing support - exposing the artifacts from the user's workspace publicly Auto-publishing support for better usability User interface for browsing and searching within the published content Separate lists of endpoints and viewers per type of services - JavaScript, Web, wiki, etc. Separate browse user interface for Web and wiki content","title":"Features"},{"location":"overview/features/#features","text":"Note The feature set listed below contains only the major part of what is currently available. For more insights on what can be done with Eclipse Dirigible, we recommend trying it out .","title":"Features"},{"location":"overview/features/#data-structures","text":"Creation of table model (JSON formatted *.table descriptor) and actual creation of the corresponding database table during publishing. 
Creation of view model (JSON formatted *.view descriptor) and actual creation of the corresponding database view during publishing. Creation of delimiter-separated values ( *.append , *.update , *.delete , *.replace ) data files and populating the corresponding database table during publishing. Automatic altering of existing tables from the models on compatible changes (new columns added). Modeling of the database schema ( *.dsm and *.schema ) files and creation of the tables, views, and constraints during publishing.","title":"Data Structures"},{"location":"overview/features/#scripting-services","text":"Support of JavaScript language by using GraalVM JS as the runtime execution engine ( *.js ). Support for TypeScript services ( *.ts ). Support of strictly defined enterprise API for JavaScript to be used by the business application developers.","title":"Scripting Services"},{"location":"overview/features/#web-content","text":"Support of client-side Web-related artifacts, such as HTML, CSS, JS, pictures, etc.","title":"Web Content"},{"location":"overview/features/#wiki-content","text":"Support of Markdown format for Wiki pages.","title":"Wiki Content"},{"location":"overview/features/#integration-services","text":"Support of listeners for messages from the built-in message bus ( *.listener ). Support of scheduled jobs as triggers for backend services invocation ( *.job ). Support of business processes defined in BPMN 2.0 and executed by the underlying BPM process engine ( *.bpmn ). Support of shell commands execution ( *.command ). Support of OData 2.0 ( *.odata ). Support of websockets ( *.websocket ).","title":"Integration Services"},{"location":"overview/features/#mobile-applications","text":"Support of native mobile application development via Tabris.js .","title":"Mobile Applications"},{"location":"overview/features/#extension-definitions","text":"Creation of extension points (JSON formatted descriptor - *.extensionpoint ). 
Creation of extensions by a given extension point (JSON formatted descriptor - *.extension ).","title":"Extension Definitions"},{"location":"overview/features/#tooling","text":"Workbench perspective for full support of project management (New, Cut, Copy, Paste, Delete, Refresh, Import, Export, etc.) Database perspective for RDBMS management including SQL Console Enhanced code editor with highlight support for JavaScript, HTML, JSON, XML, etc. Preview view for easy testing of changes in Web, Wiki, and Scripting Services Configurable Logs view , which provides server-side logs and traces Lots of template-based wizards for creating new content and services Import and export of project content Documents perspective for import of binary files for external documents and pictures Repository perspective for low-level repository content management Debugger perspective for debugging backend JavaScript services Terminal perspective with the corresponding main view for execution of shell commands on the target instance's OS","title":"Tooling"},{"location":"overview/features/#modeling","text":"Modeling of database schema ( *.dsm and *.schema ) files with Database Schema Modeler Modeling of entity data model ( *.edm and *.model ) files with Entity Data Modeler Modeling of BPMN process ( *.bpmn ) files with BPMN Modeler Modeling of Web form layout ( *.form ) files with Form Designer","title":"Modeling"},{"location":"overview/features/#security","text":"Role-based access management for Web services as well as the document repository Security constraints model (JSON formatted *.access ) support Several predefined roles, which can be used out-of-the-box (Everyone, Administrator, Manager, PowerUser, User, ReadWrite, ReadOnly)","title":"Security"},{"location":"overview/features/#registry","text":"Publishing support - exposing the artifacts from the user's workspace publicly Auto-publishing support for better usability User interface for browsing and searching within the published 
content Separate lists of endpoints and viewers per type of services - JavaScript, Web, wiki, etc. Separate browse user interface for Web and wiki content","title":"Registry"},{"location":"overview/license/","text":"License The Dirigible project source code base is provided under the Eclipse Public License - v 2.0 Eclipse Public License - v 2.0 THE ACCOMPANYING PROGRAM IS PROVIDED UNDER THE TERMS OF THIS ECLIPSE PUBLIC LICENSE (\"AGREEMENT\"). ANY USE, REPRODUCTION OR DISTRIBUTION OF THE PROGRAM CONSTITUTES RECIPIENT'S ACCEPTANCE OF THIS AGREEMENT. 1. DEFINITIONS \"Contribution\" means: a) in the case of the initial Contributor, the initial code and documentation distributed under this Agreement, and b) in the case of each subsequent Contributor: i) changes to the Program, and ii) additions to the Program; where such changes and/or additions to the Program originate from and are distributed by that particular Contributor. A Contribution 'originates' from a Contributor if it was added to the Program by such Contributor itself or anyone acting on such Contributor's behalf. Contributions do not include additions to the Program which: (i) are separate modules of software distributed in conjunction with the Program under their own license agreement, and (ii) are not derivative works of the Program. \"Contributor\" means any person or entity that distributes the Program. \"Licensed Patents\" mean patent claims licensable by a Contributor which are necessarily infringed by the use or sale of its Contribution alone or when combined with the Program. \"Program\" means the Contributions distributed in accordance with this Agreement. \"Recipient\" means anyone who receives the Program under this Agreement, including all Contributors. 2. 
GRANT OF RIGHTS a) Subject to the terms of this Agreement, each Contributor hereby grants Recipient a non-exclusive, worldwide, royalty-free copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, distribute and sublicense the Contribution of such Contributor, if any, and such derivative works, in source code and object code form. b) Subject to the terms of this Agreement, each Contributor hereby grants Recipient a non-exclusive, worldwide, royalty-free patent license under Licensed Patents to make, use, sell, offer to sell, import and otherwise transfer the Contribution of such Contributor, if any, in source code and object code form. This patent license shall apply to the combination of the Contribution and the Program if, at the time the Contribution is added by the Contributor, such addition of the Contribution causes such combination to be covered by the Licensed Patents. The patent license shall not apply to any other combinations which include the Contribution. No hardware per se is licensed hereunder. c) Recipient understands that although each Contributor grants the licenses to its Contributions set forth herein, no assurances are provided by any Contributor that the Program does not infringe the patent or other intellectual property rights of any other entity. Each Contributor disclaims any liability to Recipient for claims brought by any other entity based on infringement of intellectual property rights or otherwise. As a condition to exercising the rights and licenses granted hereunder, each Recipient hereby assumes sole responsibility to secure any other intellectual property rights needed, if any. For example, if a third party patent license is required to allow Recipient to distribute the Program, it is Recipient's responsibility to acquire that license before distributing the Program. 
d) Each Contributor represents that to its knowledge it has sufficient copyright rights in its Contribution, if any, to grant the copyright license set forth in this Agreement. 3. REQUIREMENTS A Contributor may choose to distribute the Program in object code form under its own license agreement, provided that: a) it complies with the terms and conditions of this Agreement; and b) its license agreement: i) effectively disclaims on behalf of all Contributors all warranties and conditions, express and implied, including warranties or conditions of title and non-infringement, and implied warranties or conditions of merchantability and fitness for a particular purpose; ii) effectively excludes on behalf of all Contributors all liability for damages, including direct, indirect, special, incidental and consequential damages, such as lost profits; iii) states that any provisions which differ from this Agreement are offered by that Contributor alone and not by any other party; and iv) states that source code for the Program is available from such Contributor, and informs licensees how to obtain it in a reasonable manner on or through a medium customarily used for software exchange. When the Program is made available in source code form: a) it must be made available under this Agreement; and b) a copy of this Agreement must be included with each copy of the Program. Contributors may not remove or alter any copyright notices contained within the Program. Each Contributor must identify itself as the originator of its Contribution, if any, in a manner that reasonably allows subsequent Recipients to identify the originator of the Contribution. 4. COMMERCIAL DISTRIBUTION Commercial distributors of software may accept certain responsibilities with respect to end users, business partners and the like. 
While this license is intended to facilitate the commercial use of the Program, the Contributor who includes the Program in a commercial product offering should do so in a manner which does not create potential liability for other Contributors. Therefore, if a Contributor includes the Program in a commercial product offering, such Contributor (\"Commercial Contributor\") hereby agrees to defend and indemnify every other Contributor (\"Indemnified Contributor\") against any losses, damages and costs (collectively \"Losses\") arising from claims, lawsuits and other legal actions brought by a third party against the Indemnified Contributor to the extent caused by the acts or omissions of such Commercial Contributor in connection with its distribution of the Program in a commercial product offering. The obligations in this section do not apply to any claims or Losses relating to any actual or alleged intellectual property infringement. In order to qualify, an Indemnified Contributor must: a) promptly notify the Commercial Contributor in writing of such claim, and b) allow the Commercial Contributor to control, and cooperate with the Commercial Contributor in, the defense and any related settlement negotiations. The Indemnified Contributor may participate in any such claim at its own expense. For example, a Contributor might include the Program in a commercial product offering, Product X. That Contributor is then a Commercial Contributor. If that Commercial Contributor then makes performance claims, or offers warranties related to Product X, those performance claims and warranties are such Commercial Contributor's responsibility alone. Under this section, the Commercial Contributor would have to defend claims against the other Contributors related to those performance claims and warranties, and if a court requires any other Contributor to pay any damages as a result, the Commercial Contributor must pay those damages. 5. 
NO WARRANTY EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, THE PROGRAM IS PROVIDED ON AN \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. Each Recipient is solely responsible for determining the appropriateness of using and distributing the Program and assumes all risks associated with its exercise of rights under this Agreement , including but not limited to the risks and costs of program errors, compliance with applicable laws, damage to or loss of data, programs or equipment, and unavailability or interruption of operations. 6. DISCLAIMER OF LIABILITY EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, NEITHER RECIPIENT NOR ANY CONTRIBUTORS SHALL HAVE ANY LIABILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING WITHOUT LIMITATION LOST PROFITS), HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OR DISTRIBUTION OF THE PROGRAM OR THE EXERCISE OF ANY RIGHTS GRANTED HEREUNDER, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. 7. GENERAL If any provision of this Agreement is invalid or unenforceable under applicable law, it shall not affect the validity or enforceability of the remainder of the terms of this Agreement, and without further action by the parties hereto, such provision shall be reformed to the minimum extent necessary to make such provision valid and enforceable. 
If Recipient institutes patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Program itself (excluding combinations of the Program with other software or hardware) infringes such Recipient's patent(s), then such Recipient's rights granted under Section 2(b) shall terminate as of the date such litigation is filed. All Recipient's rights under this Agreement shall terminate if it fails to comply with any of the material terms or conditions of this Agreement and does not cure such failure in a reasonable period of time after becoming aware of such noncompliance. If all Recipient's rights under this Agreement terminate, Recipient agrees to cease use and distribution of the Program as soon as reasonably practicable. However, Recipient's obligations under this Agreement and any licenses granted by Recipient relating to the Program shall continue and survive. Everyone is permitted to copy and distribute copies of this Agreement, but in order to avoid inconsistency the Agreement is copyrighted and may only be modified in the following manner. The Agreement Steward reserves the right to publish new versions (including revisions) of this Agreement from time to time. No one other than the Agreement Steward has the right to modify this Agreement. The Eclipse Foundation is the initial Agreement Steward. The Eclipse Foundation may assign the responsibility to serve as the Agreement Steward to a suitable separate entity. Each new version of the Agreement will be given a distinguishing version number. The Program (including Contributions) may always be distributed subject to the version of the Agreement under which it was received. In addition, after a new version of the Agreement is published, Contributor may elect to distribute the Program (including its Contributions) under the new version. 
Except as expressly stated in Sections 2(a) and 2(b) above, Recipient receives no rights or licenses to the intellectual property of any Contributor under this Agreement, whether expressly, by implication, estoppel or otherwise. All rights in the Program not expressly granted under this Agreement are reserved. This Agreement is governed by the laws of the State of New York and the intellectual property laws of the United States of America. No party to this Agreement will bring a legal action under this Agreement more than one year after the cause of action arose. Each party waives its rights to a jury trial in any resulting litigation.","title":"License"},{"location":"overview/license/#license","text":"The Dirigible project source code base is provided under the Eclipse Public License - v 2.0","title":"License"},{"location":"overview/license/#eclipse-public-license-v-20","text":"THE ACCOMPANYING PROGRAM IS PROVIDED UNDER THE TERMS OF THIS ECLIPSE PUBLIC LICENSE (\"AGREEMENT\"). ANY USE, REPRODUCTION OR DISTRIBUTION OF THE PROGRAM CONSTITUTES RECIPIENT'S ACCEPTANCE OF THIS AGREEMENT.","title":"Eclipse Public License - v 2.0"},{"location":"overview/license/#1-definitions","text":"\"Contribution\" means: a) in the case of the initial Contributor, the initial code and documentation distributed under this Agreement, and b) in the case of each subsequent Contributor: i) changes to the Program, and ii) additions to the Program; where such changes and/or additions to the Program originate from and are distributed by that particular Contributor. A Contribution 'originates' from a Contributor if it was added to the Program by such Contributor itself or anyone acting on such Contributor's behalf. Contributions do not include additions to the Program which: (i) are separate modules of software distributed in conjunction with the Program under their own license agreement, and (ii) are not derivative works of the Program. 
\"Contributor\" means any person or entity that distributes the Program. \"Licensed Patents\" mean patent claims licensable by a Contributor which are necessarily infringed by the use or sale of its Contribution alone or when combined with the Program. \"Program\" means the Contributions distributed in accordance with this Agreement. \"Recipient\" means anyone who receives the Program under this Agreement, including all Contributors.","title":"1. DEFINITIONS"},{"location":"overview/license/#2-grant-of-rights","text":"a) Subject to the terms of this Agreement, each Contributor hereby grants Recipient a non-exclusive, worldwide, royalty-free copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, distribute and sublicense the Contribution of such Contributor, if any, and such derivative works, in source code and object code form. b) Subject to the terms of this Agreement, each Contributor hereby grants Recipient a non-exclusive, worldwide, royalty-free patent license under Licensed Patents to make, use, sell, offer to sell, import and otherwise transfer the Contribution of such Contributor, if any, in source code and object code form. This patent license shall apply to the combination of the Contribution and the Program if, at the time the Contribution is added by the Contributor, such addition of the Contribution causes such combination to be covered by the Licensed Patents. The patent license shall not apply to any other combinations which include the Contribution. No hardware per se is licensed hereunder. c) Recipient understands that although each Contributor grants the licenses to its Contributions set forth herein, no assurances are provided by any Contributor that the Program does not infringe the patent or other intellectual property rights of any other entity. Each Contributor disclaims any liability to Recipient for claims brought by any other entity based on infringement of intellectual property rights or otherwise. 
As a condition to exercising the rights and licenses granted hereunder, each Recipient hereby assumes sole responsibility to secure any other intellectual property rights needed, if any. For example, if a third party patent license is required to allow Recipient to distribute the Program, it is Recipient's responsibility to acquire that license before distributing the Program. d) Each Contributor represents that to its knowledge it has sufficient copyright rights in its Contribution, if any, to grant the copyright license set forth in this Agreement.","title":"2. GRANT OF RIGHTS"},{"location":"overview/license/#3-requirements","text":"A Contributor may choose to distribute the Program in object code form under its own license agreement, provided that: a) it complies with the terms and conditions of this Agreement; and b) its license agreement: i) effectively disclaims on behalf of all Contributors all warranties and conditions, express and implied, including warranties or conditions of title and non-infringement, and implied warranties or conditions of merchantability and fitness for a particular purpose; ii) effectively excludes on behalf of all Contributors all liability for damages, including direct, indirect, special, incidental and consequential damages, such as lost profits; iii) states that any provisions which differ from this Agreement are offered by that Contributor alone and not by any other party; and iv) states that source code for the Program is available from such Contributor, and informs licensees how to obtain it in a reasonable manner on or through a medium customarily used for software exchange. When the Program is made available in source code form: a) it must be made available under this Agreement; and b) a copy of this Agreement must be included with each copy of the Program. Contributors may not remove or alter any copyright notices contained within the Program. 
Each Contributor must identify itself as the originator of its Contribution, if any, in a manner that reasonably allows subsequent Recipients to identify the originator of the Contribution.","title":"3. REQUIREMENTS"},{"location":"overview/license/#4-commercial-distribution","text":"Commercial distributors of software may accept certain responsibilities with respect to end users, business partners and the like. While this license is intended to facilitate the commercial use of the Program, the Contributor who includes the Program in a commercial product offering should do so in a manner which does not create potential liability for other Contributors. Therefore, if a Contributor includes the Program in a commercial product offering, such Contributor (\"Commercial Contributor\") hereby agrees to defend and indemnify every other Contributor (\"Indemnified Contributor\") against any losses, damages and costs (collectively \"Losses\") arising from claims, lawsuits and other legal actions brought by a third party against the Indemnified Contributor to the extent caused by the acts or omissions of such Commercial Contributor in connection with its distribution of the Program in a commercial product offering. The obligations in this section do not apply to any claims or Losses relating to any actual or alleged intellectual property infringement. In order to qualify, an Indemnified Contributor must: a) promptly notify the Commercial Contributor in writing of such claim, and b) allow the Commercial Contributor to control, and cooperate with the Commercial Contributor in, the defense and any related settlement negotiations. The Indemnified Contributor may participate in any such claim at its own expense. For example, a Contributor might include the Program in a commercial product offering, Product X. That Contributor is then a Commercial Contributor. 
If that Commercial Contributor then makes performance claims, or offers warranties related to Product X, those performance claims and warranties are such Commercial Contributor's responsibility alone. Under this section, the Commercial Contributor would have to defend claims against the other Contributors related to those performance claims and warranties, and if a court requires any other Contributor to pay any damages as a result, the Commercial Contributor must pay those damages.","title":"4. COMMERCIAL DISTRIBUTION"},{"location":"overview/license/#5-no-warranty","text":"EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, THE PROGRAM IS PROVIDED ON AN \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. Each Recipient is solely responsible for determining the appropriateness of using and distributing the Program and assumes all risks associated with its exercise of rights under this Agreement , including but not limited to the risks and costs of program errors, compliance with applicable laws, damage to or loss of data, programs or equipment, and unavailability or interruption of operations.","title":"5. NO WARRANTY"},{"location":"overview/license/#6-disclaimer-of-liability","text":"EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, NEITHER RECIPIENT NOR ANY CONTRIBUTORS SHALL HAVE ANY LIABILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING WITHOUT LIMITATION LOST PROFITS), HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OR DISTRIBUTION OF THE PROGRAM OR THE EXERCISE OF ANY RIGHTS GRANTED HEREUNDER, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.","title":"6. 
DISCLAIMER OF LIABILITY"},{"location":"overview/license/#7-general","text":"If any provision of this Agreement is invalid or unenforceable under applicable law, it shall not affect the validity or enforceability of the remainder of the terms of this Agreement, and without further action by the parties hereto, such provision shall be reformed to the minimum extent necessary to make such provision valid and enforceable. If Recipient institutes patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Program itself (excluding combinations of the Program with other software or hardware) infringes such Recipient's patent(s), then such Recipient's rights granted under Section 2(b) shall terminate as of the date such litigation is filed. All Recipient's rights under this Agreement shall terminate if it fails to comply with any of the material terms or conditions of this Agreement and does not cure such failure in a reasonable period of time after becoming aware of such noncompliance. If all Recipient's rights under this Agreement terminate, Recipient agrees to cease use and distribution of the Program as soon as reasonably practicable. However, Recipient's obligations under this Agreement and any licenses granted by Recipient relating to the Program shall continue and survive. Everyone is permitted to copy and distribute copies of this Agreement, but in order to avoid inconsistency the Agreement is copyrighted and may only be modified in the following manner. The Agreement Steward reserves the right to publish new versions (including revisions) of this Agreement from time to time. No one other than the Agreement Steward has the right to modify this Agreement. The Eclipse Foundation is the initial Agreement Steward. The Eclipse Foundation may assign the responsibility to serve as the Agreement Steward to a suitable separate entity. Each new version of the Agreement will be given a distinguishing version number. 
The Program (including Contributions) may always be distributed subject to the version of the Agreement under which it was received. In addition, after a new version of the Agreement is published, Contributor may elect to distribute the Program (including its Contributions) under the new version. Except as expressly stated in Sections 2(a) and 2(b) above, Recipient receives no rights or licenses to the intellectual property of any Contributor under this Agreement, whether expressly, by implication, estoppel or otherwise. All rights in the Program not expressly granted under this Agreement are reserved. This Agreement is governed by the laws of the State of New York and the intellectual property laws of the United States of America. No party to this Agreement will bring a legal action under this Agreement more than one year after the cause of action arose. Each party waives its rights to a jury trial in any resulting litigation.","title":"7. GENERAL"},{"location":"overview/runtime-services/","text":"Runtime Services There are several REST services available at runtime, which can give you another communication channel with Dirigible containers.","title":"Runtime Services"},{"location":"overview/runtime-services/#runtime-services","text":"There are several REST services available at runtime, which can give you another communication channel with Dirigible containers.","title":"Runtime Services"},{"location":"setup/","text":"Setup in Tomcat Deploy Eclipse Dirigible in Apache Tomcat web container. In this case the built-in H2 database is used. Prerequisites Download the Tomcat binary . More information about how to deploy on Tomcat can be found here . JDK 11 or JDK 13 - OpenJDK versions can be found here . macOS Linux Windows Install ttyd : brew install ttyd Linux support is built-in. 
More info about ttyd can be found at: ttyd You may experience certain functional limitations if you decide to run the Web IDE locally on Windows using Tomcat: Limitations related to the Create symbolic links policy . Some tests in local builds of Dirigible may fail on Windows due to the same policy restriction. You may grant your user account access to create symbolic links by editing the policy: Go to (WIN + R) > gpedit.msc Navigate to: Computer Configuration -> Windows Settings -> Security Settings -> Local Policies -> User Rights Assignment -> Create Symbolic links . Add your Windows user account to the policy. Note : Editing this policy may make your machine vulnerable to symbolic link attacks as noted here . An alternative to the Windows setup is to follow the Setup as a Docker Image . Some parts of Dirigible are sensitive to line endings and assume Unix-style newlines. Git on Windows may attempt to switch files to use Windows-style CR/LF endings, which will cause problems when building and running Dirigible on Windows. To prevent this, git should be instructed to preserve the line endings of files. From a command prompt, type git config core.autocrlf . If the result is not false , change it with git config core.autocrlf false . Steps Download ROOT.war for Tomcat from: download.dirigible.io Note For local test & development purposes, we recommend the server-all distribution. Configure the Users store under $CATALINA_HOME/conf/tomcat-users.xml : Copy Dirigible's ROOT.war to the $TOMCAT/webapps folder. Configure the target Database setup, if needed: Local (H2) PostgreSQL MySQL HANA Sybase ASE No additional setup is needed. 
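The tomcat-users.xml snippet from the original page is stripped out of this index; the following is a minimal sketch only. The role name, user name, password, and default folder are placeholders, not values mandated by Dirigible:

```shell
# Sketch: write a minimal Users store for Tomcat.
# CATALINA_HOME default, role name, and credentials are placeholders.
CATALINA_HOME=${CATALINA_HOME:-/tmp/tomcat}
mkdir -p $CATALINA_HOME/conf
cat > $CATALINA_HOME/conf/tomcat-users.xml <<'EOF'
<?xml version='1.0' encoding='UTF-8'?>
<tomcat-users>
  <role rolename='Developer'/>
  <user username='dirigible' password='dirigible' roles='Developer'/>
</tomcat-users>
EOF
```

Adjust CATALINA_HOME and the roles to match your installation and security setup.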
Install postgresql on Linux (Debian-based) with: sudo apt-get update sudo apt-get install postgresql postgresql-contrib Create a default database for Eclipse Dirigible: sudo -i -u postgres createdb dirigible_database Create a system user for the Eclipse Dirigible database: psql dirigible_database create user dirigible_system with password 'dirigible1234'; grant all on database dirigible_database to dirigible_system; Datasource configuration: Download the postgresql JDBC driver version 4.1 from here . Copy the postgresql-*.jar file to the /lib directory. Set the environment variables: export DIRIGIBLE_DATABASE_PROVIDER=custom export DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=POSTGRES export DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT=POSTGRES export POSTGRES_DRIVER=org.postgresql.Driver export POSTGRES_URL=jdbc:postgresql://localhost:5432/dirigible_database export POSTGRES_USERNAME=dirigible_system export POSTGRES_PASSWORD=dirigible1234 export DIRIGIBLE_SCHEDULER_DATABASE_DRIVER=org.postgresql.Driver export DIRIGIBLE_SCHEDULER_DATABASE_URL=jdbc:postgresql://localhost:5432/dirigible_database export DIRIGIBLE_SCHEDULER_DATABASE_USER=dirigible_system export DIRIGIBLE_SCHEDULER_DATABASE_PASSWORD=dirigible1234 export DIRIGIBLE_SCHEDULER_DATABASE_DELEGATE=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate export DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE=true export DIRIGIBLE_FLOWABLE_USE_DEFAULT_DATABASE=true export DIRIGIBLE_DATABASE_NAMES_CASE_SENSITIVE=true Install mysql on Linux (Debian-based) with: sudo apt-get update sudo apt-get install mysql-server sudo mysql_install_db sudo /usr/bin/mysql_secure_installation Create the default database and a system user for the Eclipse Dirigible database: mysql -u root -p CREATE DATABASE dirigible_database; CREATE USER 'dirigible_system'@'localhost' IDENTIFIED BY 'dirigible1234'; GRANT ALL PRIVILEGES ON dirigible_database.* TO 'dirigible_system'@'localhost' WITH 
GRANT OPTION; Datasource configuration: Download the mysql JDBC driver version 5.1 from here . Copy the mysql-*.jar file to the /lib directory. Open the file /conf/context.xml and add the following within the context: web.xml - make sure the initial parameter jndiDefaultDataSource is uncommented: jndiDefaultDataSource java:comp/env/jdbc/DefaultDB Also, the initial parameter jdbcAutoCommit must be set to false (by default). jdbcAutoCommit false The type of the datasource is jndi instead of local . defaultDataSourceType jndi Lastly, the resource reference for the datasource has to be uncommented. jdbc/DefaultDB javax.sql.DataSource Container Install HANA Express . Set the environment variables: export DIRIGIBLE_DATABASE_PROVIDER=custom export DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=HANA export DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT=HANA export HANA_DRIVER=com.sap.db.jdbc.Driver export HANA_URL=jdbc:sap://: export HANA_USERNAME= export HANA_PASSWORD= export DIRIGIBLE_SCHEDULER_DATABASE_DRIVER=com.sap.db.jdbc.Driver export DIRIGIBLE_SCHEDULER_DATABASE_URL=jdbc:sap://: export DIRIGIBLE_SCHEDULER_DATABASE_USER= export DIRIGIBLE_SCHEDULER_DATABASE_PASSWORD= export DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE=false export DIRIGIBLE_FLOWABLE_USE_DEFAULT_DATABASE=false Remember to replace the , , , placeholders. How to set up a test environment on Amazon: Select Image Size: t2.medium Security Group: TCP Custom, 5000 Download Sybase ASE Express from here . 
Transfer: scp -i dirigible-aws.pem ASE_Suite.linuxamd64.tgz ec2-user@:~ scp -i dirigible-aws.pem apache-tomcat-XXX.zip ec2-user@:~ scp -i dirigible-aws.pem ROOT.war ec2-user@:~ scp -i dirigible-aws.pem jdk-8u144-linux-x64.tar.gz ec2-user@:~ Prepare OS: sudo mkdir -p /opt/sybase sudo mkdir -p /var/sybase sudo groupadd sybase sudo useradd -g sybase -d /opt/sybase sybase sudo passwd sybase sudo chown sybase:sybase /opt/sybase sudo chown sybase:sybase /var/sybase Login: ssh ec2-user@ -i dirigible-aws.pem Setup: su - sybase mkdir install cd install cp /home/ec2-user/ASE_Suite.linuxamd64.tgz . tar -xvf ASE_Suite.linuxamd64.tgz ./setup.bin -i console Parameters: Choose Install Folder -> use: /opt/sybase Choose Install Set -> 1- Typical Software License Type Selection -> 2- Install Express Edition of SAP Adaptive Server Enterprise End-user License Agreement -> 1) All regions Configure New Servers -> [X] 1 - Configure new SAP ASE Configure Servers with Different User Account -> 2- No SAP ASE Name ASE160 System Administrator's Password ****** Enable SAP ASE for SAP ASE Cockpit monitoring false Technical user tech_user Technical user password ******** Host Name ip-.eu-central-1.comp Port Number 5000 Application Type Mixed (OLTP/DSS) Create sample databases false Page Size 4k Error Log /opt/sybase/ASE-16_0/install/ASE1 Default Language Default Character Set Default Sort Order Master Device /opt/sybase/data/master.dat Master Device Size (MB) 500 Master Database Size (MB) 250 System Procedure Device /opt/sybase/data/sysprocs.dat System Procedure Device Size (MB) 500 System Procedure Database Size (MB) 500 System Device /opt/sybase/data/sybsysdb.dat System Device Size (MB) 100 System Database Size (MB) 100 Tempdb Device /opt/sybase/data/tempdbdev.dat Tempdb Device Size (MB) 1000 Tempdb Database Size (MB) 1000 Enable PCI false Optimize SAP ASE Configuration false Show Servers: /opt/sybase/ASE-16_0/install/showserver Prepare Test Environment: cd /opt/sybase/install cp 
/home/ec2-user/apache-tomcat-XXX.zip . cp /home/ec2-user/jdk-8u144-linux-x64.tar.gz . unzip apache-tomcat-XXX.zip tar -xvf jdk-8u144-linux-x64.tar.gz export JAVA_HOME=/opt/sybase/install/jdk1.8.0_144 Add the provided JDBC driver to the lib folder: cp /opt/sybase/shared/lib/jconn4.jar /home/ec2-user/apache-tomcat-XXX/lib Useful actions in case of issues: Start Server: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/sybase/OCS-16_0/lib3p64 export LANG=C cd /opt/sybase/ASE-16_0/bin ./startserver -f /opt/sybase/ASE-16_0/install/RUN_ASE160 Stop Server: cd /opt/sybase/OCS-16_0/bin export LANG=C ./isql -Usa -SASE160 shutdown with nowait go Kill Hanging Requests: cd /opt/sybase/OCS-16_0/bin export LANG=C ./isql -Usa -SASE160 sp_who go kill spid Uninstall: cd /opt/sybase/sybuninstall/ASESuite ./uninstall -i console Set the environment variables: export DIRIGIBLE_DATABASE_PROVIDER=custom export DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=SYBASE export DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT=SYBASE export SYBASE_DRIVER=com.sybase.jdbc4.jdbc.SybDriver export SYBASE_URL=jdbc:sybase:Tds::?ServiceName= export SYBASE_USERNAME= export SYBASE_PASSWORD= export SYBASE_CONNECTION_PROPERTIES=\"DYNAMIC_PREPARE=true;SSL_TRUST_ALL_CERTS=true;JCONNECT_VERSION=0;ENABLE_SSL=true;\" export DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE=false export DIRIGIBLE_SCHEDULER_DATABASE_DRIVER=com.sybase.jdbc4.jdbc.SybDriver export DIRIGIBLE_SCHEDULER_DATABASE_URL=\"jdbc:sybase:Tds::?ServiceName=&DYNAMIC_PREPARE=true&JCONNECT_VERSION=0&ENABLE_SSL=true&SSL_TRUST_ALL_CERTS=true\" export DIRIGIBLE_SCHEDULER_DATABASE_USER= export DIRIGIBLE_SCHEDULER_DATABASE_PASSWORD= export DIRIGIBLE_SCHEDULER_DATABASE_DELEGATE=org.quartz.impl.jdbcjobstore.SybaseDelegate Remember to replace the , , , placeholders. Start the Tomcat server. 
Open a web browser and go to: http://localhost:8080/ Note The default user name and password are dirigible/dirigible Manager App In case you want to use Apache Tomcat's Manager App to deploy the ROOT.war file, you have to increase the file size limit for upload (e.g. to 200MB): conf\\server.xml webapps\\manager\\WEB-INF\\web.xml ... ... 0 209715200 209715200 ... ... ","title":"Tomcat"},{"location":"setup/#setup-in-tomcat","text":"Deploy Eclipse Dirigible in Apache Tomcat web container. In this case the built-in H2 database is used. Prerequisites Download the Tomcat binary . More information about how to deploy on Tomcat can be found here . JDK 11 or JDK 13 - OpenJDK versions can be found here . macOS Linux Windows Install ttyd : brew install ttyd Linux support is built-in. More info about ttyd can be found at: ttyd You may experience certain functional limitations if you decide to run the Web IDE locally on Windows using Tomcat: Limitations related to the Create symbolic links policy . Some tests in local builds of Dirigible may fail on Windows due to the same policy restriction. You may grant your user account access to create symbolic links by editing the policy: Go to (WIN + R) > gpedit.msc Navigate to: Computer Configuration -> Windows Settings -> Security Settings -> Local Policies -> User Rights Assignment -> Create Symbolic links . Add your Windows user account to the policy. Note : Editing this policy may make your machine vulnerable to symbolic link attacks as noted here . An alternative to the Windows setup is to follow the Setup as a Docker Image . Some parts of Dirigible are sensitive to line endings and assume Unix-style newlines. Git on Windows may attempt to switch files to use Windows-style CR/LF endings, which will cause problems when building and running Dirigible on Windows. To prevent this, git should be instructed to preserve the line endings of files. From a command prompt, type git config core.autocrlf . 
If the result is not false , change it with git config core.autocrlf false .","title":"Setup in Tomcat"},{"location":"setup/#steps","text":"Download ROOT.war for Tomcat from: download.dirigible.io Note For local test & development purposes, we recommend the server-all distribution. Configure the Users store under $CATALINA_HOME/conf/tomcat-users.xml : Copy Dirigible's ROOT.war to the $TOMCAT/webapps folder. Configure the target Database setup, if needed: Local (H2) PostgreSQL MySQL HANA Sybase ASE No additional setup is needed. Install postgresql on Linux (Debian-based) with: sudo apt-get update sudo apt-get install postgresql postgresql-contrib Create a default database for Eclipse Dirigible: sudo -i -u postgres createdb dirigible_database Create a system user for the Eclipse Dirigible database: psql dirigible_database create user dirigible_system with password 'dirigible1234'; grant all on database dirigible_database to dirigible_system; Datasource configuration: Download the postgresql JDBC driver version 4.1 from here . Copy the postgresql-*.jar file to the /lib directory. 
Set the environment variables: export DIRIGIBLE_DATABASE_PROVIDER=custom export DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=POSTGRES export DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT=POSTGRES export POSTGRES_DRIVER=org.postgresql.Driver export POSTGRES_URL=jdbc:postgresql://localhost:5432/dirigible_database export POSTGRES_USERNAME=dirigible_system export POSTGRES_PASSWORD=dirigible1234 export DIRIGIBLE_SCHEDULER_DATABASE_DRIVER=org.postgresql.Driver export DIRIGIBLE_SCHEDULER_DATABASE_URL=jdbc:postgresql://localhost:5432/dirigible_database export DIRIGIBLE_SCHEDULER_DATABASE_USER=dirigible_system export DIRIGIBLE_SCHEDULER_DATABASE_PASSWORD=dirigible1234 export DIRIGIBLE_SCHEDULER_DATABASE_DELEGATE=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate export DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE=true export DIRIGIBLE_FLOWABLE_USE_DEFAULT_DATABASE=true export DIRIGIBLE_DATABASE_NAMES_CASE_SENSITIVE=true Install mysql on Linux (Debian-based) with: sudo apt-get update sudo apt-get install mysql-server sudo mysql_install_db sudo /usr/bin/mysql_secure_installation Create the default database and a system user for the Eclipse Dirigible database: mysql -u root -p CREATE DATABASE dirigible_database; CREATE USER 'dirigible_system'@'localhost' IDENTIFIED BY 'dirigible1234'; GRANT ALL PRIVILEGES ON dirigible_database.* TO 'dirigible_system'@'localhost' WITH GRANT OPTION; Datasource configuration: Download the mysql JDBC driver version 5.1 from here . Copy the mysql-*.jar file to the /lib directory. Open the file /conf/context.xml and add the following within the context: web.xml - make sure the initial parameter jndiDefaultDataSource is uncommented: jndiDefaultDataSource java:comp/env/jdbc/DefaultDB Also, the initial parameter jdbcAutoCommit must be set to false (by default). jdbcAutoCommit false The type of the datasource is jndi instead of local . 
defaultDataSourceType jndi Lastly, the resource reference for the datasource has to be uncommented. jdbc/DefaultDB javax.sql.DataSource Container Install HANA Express . Set the environment variables: export DIRIGIBLE_DATABASE_PROVIDER=custom export DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=HANA export DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT=HANA export HANA_DRIVER=com.sap.db.jdbc.Driver export HANA_URL=jdbc:sap://: export HANA_USERNAME= export HANA_PASSWORD= export DIRIGIBLE_SCHEDULER_DATABASE_DRIVER=com.sap.db.jdbc.Driver export DIRIGIBLE_SCHEDULER_DATABASE_URL=jdbc:sap://: export DIRIGIBLE_SCHEDULER_DATABASE_USER= export DIRIGIBLE_SCHEDULER_DATABASE_PASSWORD= export DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE=false export DIRIGIBLE_FLOWABLE_USE_DEFAULT_DATABASE=false Remember to replace the , , , placeholders. How to set up a test environment on Amazon: Select Image Size: t2.medium Security Group: TCP Custom, 5000 Download Sybase ASE Express from here . Transfer: scp -i dirigible-aws.pem ASE_Suite.linuxamd64.tgz ec2-user@:~ scp -i dirigible-aws.pem apache-tomcat-XXX.zip ec2-user@:~ scp -i dirigible-aws.pem ROOT.war ec2-user@:~ scp -i dirigible-aws.pem jdk-8u144-linux-x64.tar.gz ec2-user@:~ Prepare OS: sudo mkdir -p /opt/sybase sudo mkdir -p /var/sybase sudo groupadd sybase sudo useradd -g sybase -d /opt/sybase sybase sudo passwd sybase sudo chown sybase:sybase /opt/sybase sudo chown sybase:sybase /var/sybase Login: ssh ec2-user@ -i dirigible-aws.pem Setup: su - sybase mkdir install cd install cp /home/ec2-user/ASE_Suite.linuxamd64.tgz . 
tar -xvf ASE_Suite.linuxamd64.tgz ./setup.bin -i console Parameters: Choose Install Folder -> use: /opt/sybase Choose Install Set -> 1- Typical Software License Type Selection -> 2- Install Express Edition of SAP Adaptive Server Enterprise End-user License Agreement -> 1) All regions Configure New Servers -> [X] 1 - Configure new SAP ASE Configure Servers with Different User Account -> 2- No SAP ASE Name ASE160 System Administrator's Password ****** Enable SAP ASE for SAP ASE Cockpit monitoring false Technical user tech_user Technical user password ******** Host Name ip-.eu-central-1.comp Port Number 5000 Application Type Mixed (OLTP/DSS) Create sample databases false Page Size 4k Error Log /opt/sybase/ASE-16_0/install/ASE1 Default Language Default Character Set Default Sort Order Master Device /opt/sybase/data/master.dat Master Device Size (MB) 500 Master Database Size (MB) 250 System Procedure Device /opt/sybase/data/sysprocs.dat System Procedure Device Size (MB) 500 System Procedure Database Size (MB) 500 System Device /opt/sybase/data/sybsysdb.dat System Device Size (MB) 100 System Database Size (MB) 100 Tempdb Device /opt/sybase/data/tempdbdev.dat Tempdb Device Size (MB) 1000 Tempdb Database Size (MB) 1000 Enable PCI false Optimize SAP ASE Configuration false Show Servers: /opt/sybase/ASE-16_0/install/showserver Prepare Test Environment: cd /opt/sybase/install cp /home/ec2-user/apache-tomcat-XXX.zip . cp /home/ec2-user/jdk-8u144-linux-x64.tar.gz . 
unzip apache-tomcat-XXX.zip tar -xvf jdk-8u144-linux-x64.tar.gz export JAVA_HOME=/opt/sybase/install/jdk1.8.0_144 Add the provided JDBC driver to the lib folder: cp /opt/sybase/shared/lib/jconn4.jar /home/ec2-user/apache-tomcat-XXX/lib Useful actions in case of issues: Start Server: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/sybase/OCS-16_0/lib3p64 export LANG=C cd /opt/sybase/ASE-16_0/bin ./startserver -f /opt/sybase/ASE-16_0/install/RUN_ASE160 Stop Server: cd /opt/sybase/OCS-16_0/bin export LANG=C ./isql -Usa -SASE160 shutdown with nowait go Kill Hanging Requests: cd /opt/sybase/OCS-16_0/bin export LANG=C ./isql -Usa -SASE160 sp_who go kill spid Uninstall: cd /opt/sybase/sybuninstall/ASESuite ./uninstall -i console Set the environment variables: export DIRIGIBLE_DATABASE_PROVIDER=custom export DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=SYBASE export DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT=SYBASE export SYBASE_DRIVER=com.sybase.jdbc4.jdbc.SybDriver export SYBASE_URL=jdbc:sybase:Tds::?ServiceName= export SYBASE_USERNAME= export SYBASE_PASSWORD= export SYBASE_CONNECTION_PROPERTIES=\"DYNAMIC_PREPARE=true;SSL_TRUST_ALL_CERTS=true;JCONNECT_VERSION=0;ENABLE_SSL=true;\" export DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE=false export DIRIGIBLE_SCHEDULER_DATABASE_DRIVER=com.sybase.jdbc4.jdbc.SybDriver export DIRIGIBLE_SCHEDULER_DATABASE_URL=\"jdbc:sybase:Tds::?ServiceName=&DYNAMIC_PREPARE=true&JCONNECT_VERSION=0&ENABLE_SSL=true&SSL_TRUST_ALL_CERTS=true\" export DIRIGIBLE_SCHEDULER_DATABASE_USER= export DIRIGIBLE_SCHEDULER_DATABASE_PASSWORD= export DIRIGIBLE_SCHEDULER_DATABASE_DELEGATE=org.quartz.impl.jdbcjobstore.SybaseDelegate Remember to replace the , , , placeholders. Start the Tomcat server. 
Open a web browser and go to: http://localhost:8080/ Note The default user name and password are dirigible/dirigible","title":"Steps"},{"location":"setup/#manager-app","text":"In case you want to use Apache Tomcat's Manager App to deploy the ROOT.war file, you have to increase the file size limit for upload (e.g. to 200MB): conf\\server.xml webapps\\manager\\WEB-INF\\web.xml ... ... 0 209715200 209715200 ... ... ","title":"Manager App"},{"location":"setup/cloud-foundry/","text":"Setup in Cloud Foundry Deploy Eclipse Dirigible in SAP BTP 1 , Cloud Foundry environment. Prerequisites Install Cloud Foundry Command Line Interface . Access to SAP BTP account (the Trial landscape can be accessed here ). Steps Set the SAP BTP Cloud Foundry API host: cf api Log in to the SAP BTP, Cloud Foundry environment with: cf login Create XSUAA service instance: Copy and paste the following content into xs-security.json : { \"xsappname\" : \"-xsuaa\" , \"tenant-mode\" : \"shared\" , \"scopes\" : [ { \"name\" : \"$XSAPPNAME.Developer\" , \"description\" : \"Developer scope\" }, { \"name\" : \"$XSAPPNAME.Operator\" , \"description\" : \"Operator scope\" } ], \"role-templates\" : [ { \"name\" : \"Developer\" , \"description\" : \"Developer related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Operator\" , \"description\" : \"Operator related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Operator\" ] } ], \"role-collections\" : [ { \"name\" : \"Dirigible Developer\" , \"description\" : \"Dirigible Developer\" , \"role-template-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Dirigible Operator\" , \"description\" : \"Dirigible Operator\" , \"role-template-references\" : [ \"$XSAPPNAME.Operator\" ] } ] } Note Replace the placeholder with your application name, e.g. dirigible . Create a XSUAA service instance: cf create-service xsuaa application -xsuaa -c xs-security.json Note Use the same as in the previous step. 
Deploy Eclipse Dirigible: Docker Buildpack cf push dirigible \\ --docker-image=dirigiblelabs/dirigible-sap-cf:latest \\ -m 2G -k 2G Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . Bind the XSUAA service instance to the Eclipse Dirigible deployment: cf bind-service dirigible -xsuaa Note Replace the placeholder with the application name used in the previous steps. Restart the dirigible deployment: cf restart dirigible Download the sap-cf-all binaries from the downloads site: download.dirigible.io Unzip the downloaded archive to extract the ROOT.war file. Create a manifest.yaml file in the same directory where the ROOT.war is located: applications : - name : dirigible host : dirigible- memory : 2G buildpack : sap_java_buildpack path : ROOT.war env : JBP_CONFIG_COMPONENTS : \"jres: ['com.sap.xs.java.buildpack.jdk.SAPMachineJDK']\" JBP_CONFIG_SAP_MACHINE_JRE : 'jre: { version: 11.+ }' services : - -xsuaa Note Replace the placeholder with your subaccount's Subdomain value. Replace the placeholder with the application name used in the previous steps. Deploy with: cf push Assign the Developer and Operator roles. Log in. Additional Materials Step-by-step tutorial can be found here . SAP Cloud Platform is called SAP Business Technology Platform (SAP BTP) as of 2021. \u21a9","title":"Cloud Foundry"},{"location":"setup/cloud-foundry/#setup-in-cloud-foundry","text":"Deploy Eclipse Dirigible in SAP BTP 1 , Cloud Foundry environment. Prerequisites Install Cloud Foundry Command Line Interface . 
Access to SAP BTP account (the Trial landscape can be accessed here ).","title":"Setup in Cloud Foundry"},{"location":"setup/cloud-foundry/#steps","text":"Set the SAP BTP Cloud Foundry API host: cf api Log in to the SAP BTP, Cloud Foundry environment with: cf login Create an XSUAA service instance: Copy and paste the following content into xs-security.json : { \"xsappname\" : \"-xsuaa\" , \"tenant-mode\" : \"shared\" , \"scopes\" : [ { \"name\" : \"$XSAPPNAME.Developer\" , \"description\" : \"Developer scope\" }, { \"name\" : \"$XSAPPNAME.Operator\" , \"description\" : \"Operator scope\" } ], \"role-templates\" : [ { \"name\" : \"Developer\" , \"description\" : \"Developer related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Operator\" , \"description\" : \"Operator related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Operator\" ] } ], \"role-collections\" : [ { \"name\" : \"Dirigible Developer\" , \"description\" : \"Dirigible Developer\" , \"role-template-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Dirigible Operator\" , \"description\" : \"Dirigible Operator\" , \"role-template-references\" : [ \"$XSAPPNAME.Operator\" ] } ] } Note Replace the placeholder with your application name, e.g. dirigible . Create an XSUAA service instance: cf create-service xsuaa application -xsuaa -c xs-security.json Note Use the same as in the previous step. Deploy Eclipse Dirigible: Docker Buildpack cf push dirigible \\ --docker-image=dirigiblelabs/dirigible-sap-cf:latest \\ -m 2G -k 2G Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . 
Bind the XSUAA service instance to the Eclipse Dirigible deployment: cf bind-service dirigible -xsuaa Note Replace the placeholder with the application name used in the previous steps. Restart the dirigible deployment: cf restart dirigible Download the sap-cf-all binaries from the downloads site: download.dirigible.io Unzip the downloaded archive to extract the ROOT.war file. Create a manifest.yaml file in the same directory where the ROOT.war is located: applications : - name : dirigible host : dirigible- memory : 2G buildpack : sap_java_buildpack path : ROOT.war env : JBP_CONFIG_COMPONENTS : \"jres: ['com.sap.xs.java.buildpack.jdk.SAPMachineJDK']\" JBP_CONFIG_SAP_MACHINE_JRE : 'jre: { version: 11.+ }' services : - -xsuaa Note Replace the placeholder with your subaccount's Subdomain value. Replace the placeholder with the application name used in the previous steps. Deploy with: cf push Assign the Developer and Operator roles. Log in. Additional Materials Step-by-step tutorial can be found here . SAP Cloud Platform is called SAP Business Technology Platform (SAP BTP) as of 2021. \u21a9","title":"Steps"},{"location":"setup/docker/","text":"Setup as a Docker Image Deploy Eclipse Dirigible in Docker. Prerequisites Install Docker . 
Steps Pull the Dirigible Docker image: docker pull dirigiblelabs/dirigible:latest Start the container: Run with Mounted Volume with Environment Configurations with Java Debugging Options docker run --name dirigible \\ --rm -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest docker run --name dirigible \\ -v :/target \\ --rm -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest docker run --name dirigible \\ -e DIRIGIBLE_BRANDING_NAME=\"My Web IDE\" \\ -e DIRIGIBLE_BRANDING_BRAND=\"WebIDE\" \\ -e DIRIGIBLE_BRANDING_BRAND_URL=\"https://www.eclipse.org\" \\ -e DIRIGIBLE_THEME_DEFAULT=\"fiori\" \\ --rm -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest Note The complete list of Dirigible environment variables can be found here . docker run --name dirigible \\ -e JPDA_ADDRESS=0.0.0.0:8000 \\ -e JPDA_TRANSPORT=dt_socket \\ --rm -p 8000:8000 -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . Open a web browser and go to: http://localhost:8080/ Note The default user name and password are admin/admin Stop the container: docker stop dirigible","title":"Docker"},{"location":"setup/docker/#setup-as-a-docker-image","text":"Deploy Eclipse Dirigible in Docker. 
Prerequisites Install Docker .","title":"Setup as a Docker Image"},{"location":"setup/docker/#steps","text":"Pull the Dirigible Docker image: docker pull dirigiblelabs/dirigible:latest Start the container: Run with Mounted Volume with Environment Configurations with Java Debugging Options docker run --name dirigible \\ --rm -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest docker run --name dirigible \\ -v :/target \\ --rm -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest docker run --name dirigible \\ -e DIRIGIBLE_BRANDING_NAME=\"My Web IDE\" \\ -e DIRIGIBLE_BRANDING_BRAND=\"WebIDE\" \\ -e DIRIGIBLE_BRANDING_BRAND_URL=\"https://www.eclipse.org\" \\ -e DIRIGIBLE_THEME_DEFAULT=\"fiori\" \\ --rm -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest Note The complete list of Dirigible environment variables can be found here . docker run --name dirigible \\ -e JPDA_ADDRESS=0.0.0.0:8000 \\ -e JPDA_TRANSPORT=dt_socket \\ --rm -p 8000:8000 -p 8080:8080 -p 8081:8081 \\ dirigiblelabs/dirigible:latest Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . Open a web browser and go to: http://localhost:8080/ Note The default user name and password are admin/admin Stop the container: docker stop dirigible","title":"Steps"},{"location":"setup/setup-environment-variables/","text":"Environment Variables Configuration Types Based on the layer where they are defined, configuration variables have the following priorities: Runtime Environment Deployment Module Highest precedence: No rebuild or restart of the application is required when configuration is changed. The Configuration API can be used to apply changes in the Runtime configuration. 
Second precedence: No rebuild is required when configuration is changed, however the application must be restarted to apply the environment changes. Usually the Environment configurations are provided during the application deployment, as part of the application descriptor (e.g. Define environment variable for container in Kubernetes or in Cloud Foundry App Manifest ) . Third precedence: Rebuild and re-deployment are required. \"Default\" deployment ( ROOT.war ) configuration variables are taken from the dirigible.properties properties file (a sample can be found here ) . Lowest precedence: Rebuild and re-deployment are required. \"Default\" module (e.g. dirigible-database-custom.jar , dirigible-database-h2.jar ) configuration variables are taken from dirigible-xxx.properties properties files (samples can be found here and here ) Note The precedence order means that, if there is an Environment variable with name DIRIGIBLE_TEST and a Runtime variable with the same name, the Runtime variable will have higher priority and will be applied. All applied configuration values can be found under the Configurations View . Configuration Parameters Branding Parameter Description Default* DIRIGIBLE_BRANDING_NAME The brand name Eclipse Dirigible DIRIGIBLE_BRANDING_BRAND The branding name Eclipse Dirigible DIRIGIBLE_BRANDING_BRAND_URL The branding URL https://www.dirigible.io DIRIGIBLE_BRANDING_ICON The branding icon ../../../../services/v4/web/resources/images/favicon.png DIRIGIBLE_BRANDING_WELCOME_PAGE_DEFAULT The branding welcome page ../../../../services/v4/web/ide/welcome.html DIRIGIBLE_BRANDING_HELP_ITEMS The list of the custom help menu items (comma separated) - Branding - Help Items Note Replace CUSTOM_ITEM with the actual name set by DIRIGIBLE_BRANDING_HELP_ITEMS e.g. 
ITEM1 Parameter Description Default* DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_NAME The name of the custom help item - DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_URL The url of the custom help item - DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_ORDER (Optional) The order of the custom help item 0 DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_DIVIDER (Optional) Whether to set divider after the custom help item false Server Parameter Description Default* DIRIGIBLE_SERVER_PORT The port that Eclipse Dirigible will start on 8080 Basic Parameter Description Default* DIRIGIBLE_BASIC_ENABLED Whether the Basic authentication is enabled true DIRIGIBLE_BASIC_USERNAME Base64 encoded property, which will be used as user name for basic authentication admin DIRIGIBLE_BASIC_PASSWORD Base64 encoded property, which will be used as password for basic authentication admin OAuth Parameter Description Default* DIRIGIBLE_OAUTH_ENABLED Whether the OAuth authentication is enabled false DIRIGIBLE_OAUTH_AUTHORIZE_URL The OAuth authorization URL (e.g. https://my-oauth-server/oauth/authorize ) - DIRIGIBLE_OAUTH_TOKEN_URL The OAuth token URL (e.g. https://my-oauth-server/oauth/token ) - DIRIGIBLE_OAUTH_TOKEN_REQUEST_METHOD The OAuth token request method ( GET or POST ) GET DIRIGIBLE_OAUTH_CLIENT_ID The OAuth clientid (e.g. sb-xxx-yyy ) - DIRIGIBLE_OAUTH_CLIENT_SECRET The OAuth clientsecret (e.g. PID/cpkD8aZzbGaa6+muYYOOMWPDeM1ug/sQ5ZF... ) - DIRIGIBLE_OAUTH_APPLICATION_HOST The application host (e.g. https://my-application-host ) - DIRIGIBLE_OAUTH_ISSUER The OAuth issuer (e.g. http://xxx.localhost:8080/uaa/oauth/token ) - DIRIGIBLE_OAUTH_VERIFICATION_KEY The OAuth verificationkey (e.g. -----BEGIN PUBLIC KEY-----MIIBIjANBgkqhki... ) - DIRIGIBLE_OAUTH_VERIFICATION_KEY_EXPONENT The OAuth verificationkey exponent (e.g. 
AQAB ) - DIRIGIBLE_OAUTH_CHECK_ISSUER_ENABLED Sets whether the JWT verifier should check the token issuer true DIRIGIBLE_OAUTH_CHECK_AUDIENCE_ENABLED Sets whether the JWT verifier should check the token aud true DIRIGIBLE_OAUTH_APPLICATION_NAME The application name (e.g. dirigible-xxx ) - Redirect/Callback URL Configure the Redirect/Callback URL in the OAuth client to: /services/v4/oauth/callback Keycloak Parameter Description Default* DIRIGIBLE_KEYCLOAK_ENABLED Sets whether the Keycloak Authentication is enabled* false DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL The Keycloak Authentication Server URL (e.g. https://keycloak-server/auth/ ) - DIRIGIBLE_KEYCLOAK_REALM The Keycloak realm (e.g. my-realm ) - DIRIGIBLE_KEYCLOAK_SSL_REQUIRED The Keycloak SSL Required (e.g. none / external ) - DIRIGIBLE_KEYCLOAK_CLIENT_ID The Keycloak Client ID (e.g. my-client ) - DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT The Keycloak Confidential Port (e.g. 443 ) - SERVER_MAXHTTPHEADERSIZE The HTTP header max size (e.g. 48000 ) Default for the underlying server (e.g. Tomcat) Note In addition to setting the DIRIGIBLE_KEYCLOAK_ENABLED property to true , the DIRIGIBLE_BASIC_ENABLED property should be set to false in order to enable the Keycloak integration. To find more details about the Keycloak configuration, go to Keycloak Java Adapter Configuration . 
Git Parameter Description Default* DIRIGIBLE_GIT_ROOT_FOLDER The external folder that will be used for synchronizing git projects - Registry Parameter Description Default* DIRIGIBLE_REGISTRY_EXTERNAL_FOLDER The external folder that will be used for synchronizing the public registry - DIRIGIBLE_REGISTRY_IMPORT_WORKSPACE The external folder that will be imported into the public registry - Repository Parameter Description Default* DIRIGIBLE_REPOSITORY_PROVIDER The name of the repository provider used in this instance local or database DIRIGIBLE_REPOSITORY_CACHE_ENABLED Enable the usage of the repository cache true Local Repository Parameter Description Default* DIRIGIBLE_REPOSITORY_LOCAL_ROOT_FOLDER The location of the root folder where the repository artifacts will be stored . DIRIGIBLE_REPOSITORY_LOCAL_ROOT_FOLDER_IS_ABSOLUTE Whether the location of the root folder is absolute or context dependent false Master Repository Parameter Description Default* DIRIGIBLE_MASTER_REPOSITORY_PROVIDER The name of the master repository provider used in this instance ( filesystem , zip or jar ) - DIRIGIBLE_MASTER_REPOSITORY_ROOT_FOLDER The location of the root folder where the master repository artifacts will be loaded from . DIRIGIBLE_MASTER_REPOSITORY_ZIP_LOCATION The location of the zip file where the master repository artifacts will be loaded from (e.g. /User/data/my-repo.zip ) - DIRIGIBLE_MASTER_REPOSITORY_JAR_PATH The JAR path location of the zip file where the master repository artifacts will be loaded from (e.g. /org/dirigible/example/my-repo.zip ) - Note The JAR path is absolute inside the class path Repository Search Parameter Description Default* DIRIGIBLE_REPOSITORY_SEARCH_ROOT_FOLDER The location of the root folder to be used by the indexing engine . 
DIRIGIBLE_REPOSITORY_SEARCH_ROOT_FOLDER_IS_ABSOLUTE Whether the location of the root folder is absolute or context dependent false DIRIGIBLE_REPOSITORY_SEARCH_INDEX_LOCATION The sub-folder under the root folder where the index files will be stored dirigible/repository/index Repository Versioning Parameter Description Default* DIRIGIBLE_REPOSITORY_VERSIONING_ENABLED The flag whether versioning for the repository is enabled false Database Common Parameters Parameter Description Default* DIRIGIBLE_DATABASE_PROVIDER The name of the database provider which will be used in this instance ( local , managed or custom ) local DIRIGIBLE_DATABASE_DEFAULT_SET_AUTO_COMMIT The AUTO_COMMIT data source parameter true DIRIGIBLE_DATABASE_DEFAULT_MAX_CONNECTIONS_COUNT The MAX_CONNECTIONS_COUNT data source parameter 8 DIRIGIBLE_DATABASE_DEFAULT_WAIT_TIMEOUT The WAIT_TIMEOUT data source parameter 500 DIRIGIBLE_DATABASE_DEFAULT_WAIT_COUNT The WAIT_COUNT data source parameter 5 DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT The name of the primary data source used in this instance DefaultDB DIRIGIBLE_DATABASE_DATASOURCE_NAME_SYSTEM The name of the system data source used in this instance SystemDB DIRIGIBLE_DATABASE_NAMES_CASE_SENSITIVE The names of the tables, views and columns to be considered as case sensitive false DIRIGIBLE_DATABASE_TRANSFER_BATCH_SIZE The batch size used during the data transfer 1000 DIRIGIBLE_DATABASE_DEFAULT_QUERY_LIMIT The batch size used when querying data from the database 1000 DIRIGIBLE_DATABASE_SYSTEM_DRIVER The driver used for the SystemDB database connection org.h2.Driver DIRIGIBLE_DATABASE_SYSTEM_URL The JDBC url used for the SystemDB database connection jdbc:h2:file:./target/dirigible/h2/SystemDB DIRIGIBLE_DATABASE_SYSTEM_USERNAME The username used for the SystemDB database connection sa DIRIGIBLE_DATABASE_SYSTEM_PASSWORD The password used for the SystemDB database connection (empty) Custom Database Parameter Description Default* DIRIGIBLE_DATABASE_PROVIDER The 
name of the database provider which will be used in this instance to be set to custom local DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES The list of the custom data source names used in this instance e.g. DS1,DS2 `` DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT The name of the primary data source used in this instance e.g. DS1 DefaultDB DS1_DRIVER The JDBC driver used for the exemplary DS1 database connection `` DS1_URL The JDBC url used for the exemplary DS1 database connection `` DS1_SCHEMA The default schema used for the exemplary DS1 database connection `` DS1_USERNAME The username used for the exemplary DS1 database connection `` DS1_PASSWORD The password used for the exemplary DS1 database connection `` Database H2 Parameter Description Default* DIRIGIBLE_DATABASE_H2_ROOT_FOLDER_DEFAULT The location used by H2 database ./target/dirigible/h2 DIRIGIBLE_DATABASE_H2_DRIVER The Driver used by H2 database org.h2.Driver DIRIGIBLE_DATABASE_H2_URL The URL used by H2 database jdbc:h2:./target/dirigible/h2 DIRIGIBLE_DATABASE_H2_USERNAME The Username used by H2 database sa DIRIGIBLE_DATABASE_H2_PASSWORD The Password used by H2 database - Database Snowflake Parameter Description Default* SNOWFLAKE_DATABASE The database used by Snowflake - SNOWFLAKE_SCHEMA The schema used by Snowflake - SNOWFLAKE_WAREHOUSE The warehouse used by Snowflake - SNOWFLAKE_DEFAULT_TABLE_TYPE Default table type for create table statements HYBRID Persistence Parameter Description Default* DIRIGIBLE_PERSISTENCE_CREATE_TABLE_ON_USE Whether the table to be created automatically on use if it does not exist true MongoDB Parameter Description Default* DIRIGIBLE_MONGODB_CLIENT_URI The location used by MongoDB server mongodb://localhost:27017 DIRIGIBLE_MONGODB_DATABASE_DEFAULT The default database name db Lifecycle Parameter Description Default* DIRIGIBLE_PUBLISH_DISABLED Disable publishing process in this instance false Scheduler Parameter Description Default* DIRIGIBLE_SCHEDULER_MEMORY_STORE Whether Quartz should use 
in-memory job store false DIRIGIBLE_SCHEDULER_DATABASE_DATASOURCE_TYPE The type of the custom data-source used by Quartz, if not the default one - DIRIGIBLE_SCHEDULER_DATABASE_DATASOURCE_NAME The name of the custom data-source used by Quartz, if not the default one - DIRIGIBLE_SCHEDULER_LOGS_RETANTION_PERIOD The period the logs of the job execution will be kept (the default is one week - 24x7) 168 DIRIGIBLE_SCHEDULER_EMAIL_SENDER The sender for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_RECIPIENTS The recipients list for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_SUBJECT_ERROR The error subject for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_SUBJECT_NORMAL The normal subject for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_TEMPLATE_ERROR The error template for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_TEMPLATE_NORMAL The normal template for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_URL_SCHEME The scheme part of the URL for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_URL_HOST The host part of the URL for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_URL_PORT The port part of the URL for the e-mail notifications - DIRIGIBLE_SCHEDULER_DATABASE_DELEGATE The name of the JDBC delegate used by Quartz, if not the default one org.quartz.impl.jdbcjobstore.StdJDBCDelegate Note Quartz JDBC delegates: org.quartz.impl.jdbcjobstore.StdJDBCDelegate (for fully JDBC-compliant drivers) org.quartz.impl.jdbcjobstore.MSSQLDelegate (for Microsoft SQL Server, and Sybase) org.quartz.impl.jdbcjobstore.PostgreSQLDelegate org.quartz.impl.jdbcjobstore.WebLogicDelegate (for WebLogic drivers) org.quartz.impl.jdbcjobstore.oracle.OracleDelegate org.quartz.impl.jdbcjobstore.oracle.WebLogicOracleDelegate (for Oracle drivers used within Weblogic) org.quartz.impl.jdbcjobstore.oracle.weblogic.WebLogicOracleDelegate (for Oracle drivers used within Weblogic) org.quartz.impl.jdbcjobstore.CloudscapeDelegate 
org.quartz.impl.jdbcjobstore.DB2v6Delegate org.quartz.impl.jdbcjobstore.DB2v7Delegate org.quartz.impl.jdbcjobstore.DB2v8Delegate org.quartz.impl.jdbcjobstore.HSQLDBDelegate org.quartz.impl.jdbcjobstore.PointbaseDelegate org.quartz.impl.jdbcjobstore.SybaseDelegate Synchronizer Parameter Description Default* DIRIGIBLE_SYNCHRONIZER_IGNORE_DEPENDENCIES Whether to ignore dependencies for synchronizers, e.g. for test purposes false DIRIGIBLE_SYNCHRONIZER_EXCLUDE_PATHS Paths to be excluded from processing (comma separated list) `` DIRIGIBLE_SYNCHRONIZER_CROSS_RETRY_COUNT Cross-dependencies processing count 10 DIRIGIBLE_SYNCHRONIZER_CROSS_RETRY_INTERVAL Cross-dependencies processing interval 10000 Job Expression Parameter Description Default* DIRIGIBLE_JOB_EXPRESSION_BPM BPM synchronizer job config 0/50 * * * * ? DIRIGIBLE_JOB_EXPRESSION_DATA_STRUCTURES Data structures job synchronizer config 0/25 * * * * ? DIRIGIBLE_JOB_EXPRESSION_EXTENSIONS Extension synchronizer job config 0/10 * * * * ? DIRIGIBLE_JOB_EXPRESSION_JOBS Jobs synchronizer job config 0/15 * * * * ? DIRIGIBLE_JOB_EXPRESSION_MESSAGING Messaging synchronizer job config 0/25 * * * * ? DIRIGIBLE_JOB_EXPRESSION_MIGRATIONS Migration synchronizer job config 0/55 * * * * ? DIRIGIBLE_JOB_EXPRESSION_ODATA OData synchronizer job config 0/45 * * * * ? DIRIGIBLE_JOB_EXPRESSION_PUBLISHER Publisher synchronizer job config 0/5 * * * * ? DIRIGIBLE_JOB_EXPRESSION_SECURITY Security synchronizer job config 0/10 * * * * ? DIRIGIBLE_JOB_EXPRESSION_REGISTRY Registry synchronizer job config 0/35 * * * * ? 
DIRIGIBLE_JOB_DEFAULT_TIMEOUT Default timeout in minutes 3 Runtime Core Parameter Description Default* DIRIGIBLE_HOME_URL The home URL where the user to be redirected on access /services/v4/web/ide/index.html Vert.x Parameter Description Default* DIRIGIBLE_VERTX_PORT The Vert.x server port, if used 8888 CSV Parameter Description Default* DIRIGIBLE_CSV_DATA_MAX_COMPARE_SIZE The maximum number of CSV records for which will be performed comparison with the existing table data 1000 DIRIGIBLE_CSV_DATA_BATCH_SIZE The number of CSV records to be included in a batch operation 100 CMS Parameter Description Default* DIRIGIBLE_CMS_PROVIDER The type of the CMS provider used in this instance (e.g. cms-provider-internal , cms-provider-s3 , managed or database ) internal DIRIGIBLE_CMS_ROLES_ENABLED Whether the RBAC over the CMS content to be enabled true CMS - Internal Parameter Description Default* DIRIGIBLE_CMS_INTERNAL_ROOT_FOLDER The location of the CMS internal repository target DIRIGIBLE_CMS_INTERNAL_ROOT_FOLDER_IS_ABSOLUTE Whether the root folder parameter is absolute or not false DIRIGIBLE_CMS_INTERNAL_VERSIONING_ENABLED Whether the versioning of the files is enabled or not false CMS - S3 Parameter Description Default* AWS_ACCESS_KEY_ID The AWS access key used for authentication target AWS_SECRET_ACCESS_KEY The AWS secret key used for authentication target AWS_DEFAULT_REGION The region where the bucket is stored eu-central-1 DIRIGIBLE_S3_BUCKET The bucket to be used for content management. Will be created if the provided one does not exist target DIRIGIBLE_S3_PROVIDER The provider to be used for S3. For local testing an option with localstack is available aws CMS - Managed Parameter Description Default* DIRIGIBLE_CMS_MANAGED_CONFIGURATION_JNDI_NAME The JNDI name of the managed CMS repository java:comp/env/EcmService in case of SAP package DIRIGIBLE_CMS_MANAGED_CONFIGURATION_AUTH_METHOD The authentication method (e.g. 
key or destination ) key DIRIGIBLE_CMS_MANAGED_CONFIGURATION_NAME The name of the repository cmis:dirigible DIRIGIBLE_CMS_MANAGED_CONFIGURATION_KEY The key of the repository cmis:dirigible:key DIRIGIBLE_CMS_MANAGED_CONFIGURATION_DESTINATION The name of the destination where the name and the key for the repository are stored (e.g. CMIS_DESTINATION ) - DIRIGIBLE_CONNECTIVITY_CONFIGURATION_JNDI_NAME The JNDI name of the connectivity configuration service java:comp/env/connectivity/Configuration in case of SAP package CMS Database Parameter Description Default* DIRIGIBLE_CMS_DATABASE_DATASOURCE_TYPE Type of the database for CMS repository (e.g. local , managed , custom , dynamic ) managed DIRIGIBLE_CMS_DATABASE_DATASOURCE_NAME The datasource name DefaultDB BPM Parameter Description Default* DIRIGIBLE_BPM_PROVIDER The provider of the BPM engine (e.g. internal , managed , remote ) internal BPM - Flowable Parameter Description Default* DIRIGIBLE_FLOWABLE_DATABASE_DRIVER The driver of the Flowable engine (e.g. org.postgresql.Driver ) - DIRIGIBLE_FLOWABLE_DATABASE_URL The URL of the Flowable engine (e.g. jdbc:postgresql://localhost:5432/ ) - DIRIGIBLE_FLOWABLE_DATABASE_USER The user of the Flowable engine - DIRIGIBLE_FLOWABLE_DATABASE_PASSWORD The password of the Flowable engine - DIRIGIBLE_FLOWABLE_DATABASE_DATASOURCE_NAME The datasource name of the Flowable engine, if any configured - DIRIGIBLE_FLOWABLE_DATABASE_SCHEMA_UPDATE Whether to materialize the database layout or not true DIRIGIBLE_FLOWABLE_USE_DEFAULT_DATABASE Whether to use the DefaultDB datasource or built-in H2 (e.g. 
true (DefaultDB) or false (H2)) true Mail Parameter Description Default* DIRIGIBLE_MAIL_USERNAME Mailbox username - DIRIGIBLE_MAIL_PASSWORD Mailbox password - DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL Mail transport protocol smtps DIRIGIBLE_MAIL_SMTPS_HOST Mailbox SMTPS host - DIRIGIBLE_MAIL_SMTPS_PORT Mailbox SMTPS port - DIRIGIBLE_MAIL_SMTPS_AUTH Enable/disable mailbox SMTPS authentication - DIRIGIBLE_MAIL_SMTP_HOST Mailbox SMTP host - DIRIGIBLE_MAIL_SMTP_PORT Mailbox SMTP port - DIRIGIBLE_MAIL_SMTP_AUTH Enable/disable mailbox SMTP authentication - Messaging Parameter Description Default* DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE Whether to use the DefaultDB datasource or built-in KahaDB (e.g. true (DefaultDB) or false (KahaDB)) true Kafka Parameter Description Default* DIRIGIBLE_KAFKA_BOOTSTRAP_SERVER The Kafka server location localhost:9092 DIRIGIBLE_KAFKA_ACKS The number of brokers that must receive the record before considering the write as successful all DIRIGIBLE_KAFKA_KEY_SERIALIZER The Key serializer org.apache.kafka.common.serialization.StringSerializer DIRIGIBLE_KAFKA_VALUE_SERIALIZER The Value serializer org.apache.kafka.common.serialization.StringSerializer DIRIGIBLE_KAFKA_AUTOCOMMIT_ENABLED Whether Auto Commit is enabled true DIRIGIBLE_KAFKA_AUTOCOMMIT_INTERVAL Auto Commit interval in ms 1000 Engines JavaScript Parameter Description Default* DIRIGIBLE_JAVASCRIPT_ENGINE_TYPE_DEFAULT The type of the JavaScript engine provider used in this instance (e.g. 
graalvm , rhino , nashorn or v8 ) graalvm since 5.0 GraalVM Parameter Description Default* DIRIGIBLE_GRAALIUM_ENABLE_DEBUG Whether the debug mode is enabled false DIRIGIBLE_JAVASCRIPT_GRAALVM_DEBUGGER_PORT The GraalVM debugger port 8081 and 0.0.0.0:8081 in Docker environment DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_HOST_ACCESS Whether GraalVM can load classes from custom packages true DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_CREATE_THREAD Whether GraalVM can create threads true DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_CREATE_PROCESS Whether GraalVM can create processes true DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_IO Whether GraalVM can make IO operations true DIRIGIBLE_JAVASCRIPT_GRAALVM_COMPATIBILITY_MODE_NASHORN Whether GraalVM has enabled compatibility mode for Nashorn true DIRIGIBLE_JAVASCRIPT_GRAALVM_COMPATIBILITY_MODE_MOZILLA Whether GraalVM has enabled compatibility mode for Mozilla false TypeScript Parameter Description Default* DIRIGIBLE_PROJECT_TYPESCRIPT Whether the project is TypeScript enabled true OData Parameter Description Default* DIRIGIBLE_ODATA_HANDLER_EXECUTOR_TYPE The type of the JavaScript engine to be used for event handlers in OData DIRIGIBLE_ODATA_HANDLER_EXECUTOR_ON_EVENT The location of the wrapper helper to be used for event handlers in OData FTP Parameter Description Default* DIRIGIBLE_FTP_USERNAME The FTP server username admin DIRIGIBLE_FTP_PASSWORD The FTP server password admin DIRIGIBLE_FTP_PORT The FTP server port 8022 SFTP Parameter Description Default* DIRIGIBLE_SFTP_USERNAME The SFTP server username admin DIRIGIBLE_SFTP_PASSWORD The SFTP server password admin DIRIGIBLE_SFTP_PORT The SFTP server port 8022 Operations Logs Parameter Description Default* DIRIGIBLE_OPERATIONS_LOGS_ROOT_FOLDER_DEFAULT The folder where the log files are stored ../logs DIRIGIBLE_EXEC_COMMAND_LOGGING_ENABLED Whether to log the executed command by the exec API false Look & Feel Theme Parameter Description Default* DIRIGIBLE_THEME_DEFAULT The name of the default theme Default 
Terminal Parameter Description Default* DIRIGIBLE_TERMINAL_ENABLED Whether the Terminal view is enabled true","title":"Environment Variables"},{"location":"setup/setup-environment-variables/#environment-variables","text":"","title":"Environment Variables"},{"location":"setup/setup-environment-variables/#configuration-types","text":"Based on the layer they are defined in, configuration variables have the following priorities: Runtime Environment Deployment Module Highest precedence: No rebuild or restart of the application is required when configuration is changed. The Configuration API could be used to apply changes in the Runtime configuration. Second precedence: No rebuild is required when configuration is changed, however the application should be restarted to apply the environment changes. Usually the Environment configurations are provided during the application deployment, as part of the application descriptor (e.g. Define environment variable for container in Kubernetes or in Cloud Foundry App Manifest ) . Third precedence: Rebuild and re-deployment is required. \"Default\" deployment ( ROOT.war ) configuration variables are taken from dirigible.properties properties file (sample could be found here ) . Lowest precedence: Rebuild and re-deployment is required. \"Default\" module (e.g. dirigible-database-custom.jar , dirigible-database-h2.jar ) configuration variables are taken from dirigible-xxx.properties properties files (sample could be found here and here ) Note The precedence order means that, if there is an Environment variable with name DIRIGIBLE_TEST and a Runtime variable with the same name, the Runtime variable will have higher priority and will be applied. 
All applied configuration values could be found under the Configurations View .","title":"Configuration Types"},{"location":"setup/setup-environment-variables/#configuration-parameters","text":"","title":"Configuration Parameters"},{"location":"setup/setup-environment-variables/#branding","text":"Parameter Description Default* DIRIGIBLE_BRANDING_NAME The brand name Eclipse Dirigible DIRIGIBLE_BRANDING_BRAND The branding name Eclipse Dirigible DIRIGIBLE_BRANDING_BRAND_URL The branding URL https://www.dirigible.io DIRIGIBLE_BRANDING_ICON The branding icon ../../../../services/v4/web/resources/images/favicon.png DIRIGIBLE_BRANDING_WELCOME_PAGE_DEFAULT The branding welcome page ../../../../services/v4/web/ide/welcome.html DIRIGIBLE_BRANDING_HELP_ITEMS The list of the custom help menu items (comma separated) -","title":"Branding"},{"location":"setup/setup-environment-variables/#branding-help-items","text":"Note Replace CUSTOM_ITEM with the actual name set by DIRIGIBLE_BRANDING_HELP_ITEMS e.g. 
ITEM1 Parameter Description Default* DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_NAME The name of the custom help item - DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_URL The URL of the custom help item - DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_ORDER (Optional) The order of the custom help item 0 DIRIGIBLE_BRANDING_HELP_ITEM_CUSTOM_ITEM_DIVIDER (Optional) Whether to set a divider after the custom help item false","title":"Branding - Help Items"},{"location":"setup/setup-environment-variables/#server","text":"Parameter Description Default* DIRIGIBLE_SERVER_PORT The port that Eclipse Dirigible will start on 8080","title":"Server"},{"location":"setup/setup-environment-variables/#basic","text":"Parameter Description Default* DIRIGIBLE_BASIC_ENABLED Whether the Basic authentication is enabled true DIRIGIBLE_BASIC_USERNAME Base64 encoded property, which will be used as user name for basic authentication admin DIRIGIBLE_BASIC_PASSWORD Base64 encoded property, which will be used as password for basic authentication admin","title":"Basic"},{"location":"setup/setup-environment-variables/#oauth","text":"Parameter Description Default* DIRIGIBLE_OAUTH_ENABLED Whether the OAuth authentication is enabled false DIRIGIBLE_OAUTH_AUTHORIZE_URL The OAuth authorization URL (e.g. https://my-oauth-server/oauth/authorize ) - DIRIGIBLE_OAUTH_TOKEN_URL The OAuth token URL (e.g. https://my-oauth-server/oauth/token ) - DIRIGIBLE_OAUTH_TOKEN_REQUEST_METHOD The OAuth token request method ( GET or POST ) GET DIRIGIBLE_OAUTH_CLIENT_ID The OAuth client id (e.g. sb-xxx-yyy ) - DIRIGIBLE_OAUTH_CLIENT_SECRET The OAuth client secret (e.g. PID/cpkD8aZzbGaa6+muYYOOMWPDeM1ug/sQ5ZF... ) - DIRIGIBLE_OAUTH_APPLICATION_HOST The application host (e.g. https://my-application-host ) - DIRIGIBLE_OAUTH_ISSUER The OAuth issuer (e.g. http://xxx.localhost:8080/uaa/oauth/token ) - DIRIGIBLE_OAUTH_VERIFICATION_KEY The OAuth verification key (e.g. -----BEGIN PUBLIC KEY-----MIIBIjANBgkqhki... 
) - DIRIGIBLE_OAUTH_VERIFICATION_KEY_EXPONENT The OAuth verification key exponent (e.g. AQAB ) - DIRIGIBLE_OAUTH_CHECK_ISSUER_ENABLED Sets whether the JWT verifier should check the token issuer true DIRIGIBLE_OAUTH_CHECK_AUDIENCE_ENABLED Sets whether the JWT verifier should check the token aud true DIRIGIBLE_OAUTH_APPLICATION_NAME The application name (e.g. dirigible-xxx ) - Redirect/Callback URL Configure the Redirect/Callback URL in the OAuth client to: /services/v4/oauth/callback","title":"OAuth"},{"location":"setup/setup-environment-variables/#keycloak","text":"Parameter Description Default* DIRIGIBLE_KEYCLOAK_ENABLED Sets whether the Keycloak Authentication is enabled* false DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL The Keycloak Authentication Server URL (e.g. https://keycloak-server/auth/ ) - DIRIGIBLE_KEYCLOAK_REALM The Keycloak realm (e.g. my-realm ) - DIRIGIBLE_KEYCLOAK_SSL_REQUIRED The Keycloak SSL Required (e.g. none / external ) - DIRIGIBLE_KEYCLOAK_CLIENT_ID The Keycloak Client ID (e.g. my-client ) - DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT The Keycloak Confidential Port (e.g. 443 ) - SERVER_MAXHTTPHEADERSIZE The HTTP header max size (e.g. 48000 ) Default for the underlying server (e.g. Tomcat) Note In addition to setting the DIRIGIBLE_KEYCLOAK_ENABLED property to true , the DIRIGIBLE_BASIC_ENABLED property should be set to false in order to enable the Keycloak integration. 
To find more details about the Keycloak configuration go to Keycloak Java Adapter Configuration .","title":"Keycloak"},{"location":"setup/setup-environment-variables/#git","text":"Parameter Description Default* DIRIGIBLE_GIT_ROOT_FOLDER The external folder that will be used for synchronizing git projects -","title":"Git"},{"location":"setup/setup-environment-variables/#registry","text":"Parameter Description Default* DIRIGIBLE_REGISTRY_EXTERNAL_FOLDER The external folder that will be used for synchronizing the public registry - DIRIGIBLE_REGISTRY_IMPORT_WORKSPACE The external folder that will be imported into the public registry -","title":"Registry"},{"location":"setup/setup-environment-variables/#repository","text":"Parameter Description Default* DIRIGIBLE_REPOSITORY_PROVIDER The name of the repository provider used in this instance local or database DIRIGIBLE_REPOSITORY_CACHE_ENABLED Enable the usage of the repository cache true","title":"Repository"},{"location":"setup/setup-environment-variables/#local-repository","text":"Parameter Description Default* DIRIGIBLE_REPOSITORY_LOCAL_ROOT_FOLDER The location of the root folder where the repository artifacts will be stored . DIRIGIBLE_REPOSITORY_LOCAL_ROOT_FOLDER_IS_ABSOLUTE Whether the location of the root folder is absolute or context dependent false","title":"Local Repository"},{"location":"setup/setup-environment-variables/#master-repository","text":"Parameter Description Default* DIRIGIBLE_MASTER_REPOSITORY_PROVIDER The name of the master repository provider used in this instance ( filesystem , zip or jar ) - DIRIGIBLE_MASTER_REPOSITORY_ROOT_FOLDER The location of the root folder where the master repository artifacts will be loaded from . DIRIGIBLE_MASTER_REPOSITORY_ZIP_LOCATION The location of the zip file where the master repository artifacts will be loaded from (e.g. 
/User/data/my-repo.zip ) - DIRIGIBLE_MASTER_REPOSITORY_JAR_PATH The JAR path location of the zip file where the master repository artifacts will be loaded from (e.g. /org/dirigible/example/my-repo.zip ) - Note The JAR path is absolute inside the class path","title":"Master Repository"},{"location":"setup/setup-environment-variables/#repository-search","text":"Parameter Description Default* DIRIGIBLE_REPOSITORY_SEARCH_ROOT_FOLDER The location of the root folder to be used by the indexing engine . DIRIGIBLE_REPOSITORY_SEARCH_ROOT_FOLDER_IS_ABSOLUTE Whether the location of the root folder is absolute or context dependent false DIRIGIBLE_REPOSITORY_SEARCH_INDEX_LOCATION The sub-folder under the root folder where the index files will be stored dirigible/repository/index","title":"Repository Search"},{"location":"setup/setup-environment-variables/#repository-versioning","text":"Parameter Description Default* DIRIGIBLE_REPOSITORY_VERSIONING_ENABLED The flag whether versioning for repository is enabled false","title":"Repository Versioning"},{"location":"setup/setup-environment-variables/#database","text":"","title":"Database"},{"location":"setup/setup-environment-variables/#common-parameters","text":"Parameter Description Default* DIRIGIBLE_DATABASE_PROVIDER The name of the database provider which will be used in this instance ( local , managed or custom ) local DIRIGIBLE_DATABASE_DEFAULT_SET_AUTO_COMMIT The AUTO_COMMIT data source parameter true DIRIGIBLE_DATABASE_DEFAULT_MAX_CONNECTIONS_COUNT The MAX_CONNECTIONS_COUNT data source parameter 8 DIRIGIBLE_DATABASE_DEFAULT_WAIT_TIMEOUT The WAIT_TIMEOUT data source parameter 500 DIRIGIBLE_DATABASE_DEFAULT_WAIT_COUNT The WAIT_COUNT data source parameter 5 DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT The name of the primary data source used in this instance DefaultDB DIRIGIBLE_DATABASE_DATASOURCE_NAME_SYSTEM The name of the system data source used in this instance SystemDB DIRIGIBLE_DATABASE_NAMES_CASE_SENSITIVE The names of the 
tables, views and columns to be considered as case sensitive false DIRIGIBLE_DATABASE_TRANSFER_BATCH_SIZE The batch size used during the data transfer 1000 DIRIGIBLE_DATABASE_DEFAULT_QUERY_LIMIT The batch size used when querying data from the database 1000 DIRIGIBLE_DATABASE_SYSTEM_DRIVER The driver used for the SystemDB database connection org.h2.Driver DIRIGIBLE_DATABASE_SYSTEM_URL The JDBC url used for the SystemDB database connection jdbc:h2:file:./target/dirigible/h2/SystemDB DIRIGIBLE_DATABASE_SYSTEM_USERNAME The username used for the SystemDB database connection sa DIRIGIBLE_DATABASE_SYSTEM_PASSWORD The password used for the SystemDB database connection (empty)","title":"Common Parameters"},{"location":"setup/setup-environment-variables/#custom-database","text":"Parameter Description Default* DIRIGIBLE_DATABASE_PROVIDER The name of the database provider which will be used in this instance to be set to custom local DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES The list of the custom data source names used in this instance e.g. DS1,DS2 `` DIRIGIBLE_DATABASE_DATASOURCE_NAME_DEFAULT The name of the primary data source used in this instance e.g. 
DS1 DefaultDB DS1_DRIVER The JDBC driver used for the exemplary DS1 database connection `` DS1_URL The JDBC url used for the exemplary DS1 database connection `` DS1_SCHEMA The default schema used for the exemplary DS1 database connection `` DS1_USERNAME The username used for the exemplary DS1 database connection `` DS1_PASSWORD The password used for the exemplary DS1 database connection ``","title":"Custom Database"},{"location":"setup/setup-environment-variables/#database-h2","text":"Parameter Description Default* DIRIGIBLE_DATABASE_H2_ROOT_FOLDER_DEFAULT The location used by H2 database ./target/dirigible/h2 DIRIGIBLE_DATABASE_H2_DRIVER The Driver used by H2 database org.h2.Driver DIRIGIBLE_DATABASE_H2_URL The URL used by H2 database jdbc:h2:./target/dirigible/h2 DIRIGIBLE_DATABASE_H2_USERNAME The Username used by H2 database sa DIRIGIBLE_DATABASE_H2_PASSWORD The Password used by H2 database -","title":"Database H2"},{"location":"setup/setup-environment-variables/#database-snowflake","text":"Parameter Description Default* SNOWFLAKE_DATABASE The database used by Snowflake - SNOWFLAKE_SCHEMA The schema used by Snowflake - SNOWFLAKE_WAREHOUSE The warehouse used by Snowflake - SNOWFLAKE_DEFAULT_TABLE_TYPE Default table type for create table statements HYBRID","title":"Database Snowflake"},{"location":"setup/setup-environment-variables/#persistence","text":"Parameter Description Default* DIRIGIBLE_PERSISTENCE_CREATE_TABLE_ON_USE Whether the table to be created automatically on use if it does not exist true","title":"Persistence"},{"location":"setup/setup-environment-variables/#mongodb","text":"Parameter Description Default* DIRIGIBLE_MONGODB_CLIENT_URI The location used by MongoDB server mongodb://localhost:27017 DIRIGIBLE_MONGODB_DATABASE_DEFAULT The default database name db","title":"MongoDB"},{"location":"setup/setup-environment-variables/#lifecycle","text":"Parameter Description Default* DIRIGIBLE_PUBLISH_DISABLED Disable publishing process in this instance 
false","title":"Lifecycle"},{"location":"setup/setup-environment-variables/#scheduler","text":"Parameter Description Default* DIRIGIBLE_SCHEDULER_MEMORY_STORE Whether Quartz to use in-memory job store false DIRIGIBLE_SCHEDULER_DATABASE_DATASOURCE_TYPE The type of the custom data-source used by Quartz, if not the default one - DIRIGIBLE_SCHEDULER_DATABASE_DATASOURCE_NAME The name of the custom data-source used by Quartz, if not the default one - DIRIGIBLE_SCHEDULER_LOGS_RETANTION_PERIOD The period the logs of the job execution will be kept (the default is one week - 24x7) 168 DIRIGIBLE_SCHEDULER_EMAIL_SENDER The sender for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_RECIPIENTS The recipients list for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_SUBJECT_ERROR The error subject for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_SUBJECT_NORMAL The normal subject for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_TEMPLATE_ERROR The error template for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_TEMPLATE_NORMAL The normal template for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_URL_SCHEME The scheme part of the URL for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_URL_HOST The host part of the URL for the e-mail notifications - DIRIGIBLE_SCHEDULER_EMAIL_URL_PORT The port part of the URL for the e-mail notifications - DIRIGIBLE_SCHEDULER_DATABASE_DELEGATE The name of the JDBC delegate used by Quartz, if not the default one org.quartz.impl.jdbcjobstore.StdJDBCDelegate Note Quartz JDBC delegates: org.quartz.impl.jdbcjobstore.StdJDBCDelegate (for fully JDBC-compliant drivers) org.quartz.impl.jdbcjobstore.MSSQLDelegate (for Microsoft SQL Server, and Sybase) org.quartz.impl.jdbcjobstore.PostgreSQLDelegate org.quartz.impl.jdbcjobstore.WebLogicDelegate (for WebLogic drivers) org.quartz.impl.jdbcjobstore.oracle.OracleDelegate org.quartz.impl.jdbcjobstore.oracle.WebLogicOracleDelegate (for Oracle drivers used within Weblogic) 
org.quartz.impl.jdbcjobstore.oracle.weblogic.WebLogicOracleDelegate (for Oracle drivers used within Weblogic) org.quartz.impl.jdbcjobstore.CloudscapeDelegate org.quartz.impl.jdbcjobstore.DB2v6Delegate org.quartz.impl.jdbcjobstore.DB2v7Delegate org.quartz.impl.jdbcjobstore.DB2v8Delegate org.quartz.impl.jdbcjobstore.HSQLDBDelegate org.quartz.impl.jdbcjobstore.PointbaseDelegate org.quartz.impl.jdbcjobstore.SybaseDelegate","title":"Scheduler"},{"location":"setup/setup-environment-variables/#synchronizer","text":"Parameter Description Default* DIRIGIBLE_SYNCHRONIZER_IGNORE_DEPENDENCIES Whether to ignore dependencies for synchronizers, e.g. for test purposes false DIRIGIBLE_SYNCHRONIZER_EXCLUDE_PATHS Paths to be excluded from processing (comma separated list) `` DIRIGIBLE_SYNCHRONIZER_CROSS_RETRY_COUNT Cross-dependencies processing count 10 DIRIGIBLE_SYNCHRONIZER_CROSS_RETRY_INTERVAL Cross-dependencies processing interval 10000","title":"Synchronizer"},{"location":"setup/setup-environment-variables/#job-expression","text":"Parameter Description Default* DIRIGIBLE_JOB_EXPRESSION_BPM BPM synchronizer job config 0/50 * * * * ? DIRIGIBLE_JOB_EXPRESSION_DATA_STRUCTURES Data structures job synchronizer config 0/25 * * * * ? DIRIGIBLE_JOB_EXPRESSION_EXTENSIONS Extension synchronizer job config 0/10 * * * * ? DIRIGIBLE_JOB_EXPRESSION_JOBS Jobs synchronizer job config 0/15 * * * * ? DIRIGIBLE_JOB_EXPRESSION_MESSAGING Messaging synchronizer job config 0/25 * * * * ? DIRIGIBLE_JOB_EXPRESSION_MIGRATIONS Migration synchronizer job config 0/55 * * * * ? DIRIGIBLE_JOB_EXPRESSION_ODATA OData synchronizer job config 0/45 * * * * ? DIRIGIBLE_JOB_EXPRESSION_PUBLISHER Publisher synchronizer job config 0/5 * * * * ? DIRIGIBLE_JOB_EXPRESSION_SECURITY Security synchronizer job config 0/10 * * * * ? DIRIGIBLE_JOB_EXPRESSION_REGISTRY Registry synchronizer job config 0/35 * * * * ?
DIRIGIBLE_JOB_DEFAULT_TIMEOUT Default timeout in minutes 3","title":"Job Expression"},{"location":"setup/setup-environment-variables/#runtime-core","text":"Parameter Description Default* DIRIGIBLE_HOME_URL The home URL to which the user is redirected on access /services/v4/web/ide/index.html","title":"Runtime Core"},{"location":"setup/setup-environment-variables/#vertx","text":"Parameter Description Default* DIRIGIBLE_VERTX_PORT The Vert.x server port, if used 8888","title":"Vert.x"},{"location":"setup/setup-environment-variables/#csv","text":"Parameter Description Default* DIRIGIBLE_CSV_DATA_MAX_COMPARE_SIZE The maximum number of CSV records for which a comparison with the existing table data will be performed 1000 DIRIGIBLE_CSV_DATA_BATCH_SIZE The number of CSV records to be included in a batch operation 100","title":"CSV"},{"location":"setup/setup-environment-variables/#cms","text":"Parameter Description Default* DIRIGIBLE_CMS_PROVIDER The type of the CMS provider used in this instance (e.g. cms-provider-internal , cms-provider-s3 , managed or database ) internal DIRIGIBLE_CMS_ROLES_ENABLED Whether RBAC over the CMS content is enabled true","title":"CMS"},{"location":"setup/setup-environment-variables/#cms-internal","text":"Parameter Description Default* DIRIGIBLE_CMS_INTERNAL_ROOT_FOLDER The location of the CMS internal repository target DIRIGIBLE_CMS_INTERNAL_ROOT_FOLDER_IS_ABSOLUTE Whether the root folder parameter is absolute or not false DIRIGIBLE_CMS_INTERNAL_VERSIONING_ENABLED Whether the versioning of the files is enabled or not false","title":"CMS - Internal"},{"location":"setup/setup-environment-variables/#cms-s3","text":"Parameter Description Default* AWS_ACCESS_KEY_ID The AWS access key used for authentication target AWS_SECRET_ACCESS_KEY The AWS secret key used for authentication target AWS_DEFAULT_REGION The region where the bucket is stored eu-central-1 DIRIGIBLE_S3_BUCKET The bucket to be used for content management.
Will be created if the provided one does not exist target DIRIGIBLE_S3_PROVIDER The provider to be used for S3. For local testing an option with localstack is available aws","title":"CMS - S3"},{"location":"setup/setup-environment-variables/#cms-managed","text":"Parameter Description Default* DIRIGIBLE_CMS_MANAGED_CONFIGURATION_JNDI_NAME The JNDI name of the managed CMS repository java:comp/env/EcmService in case of SAP package DIRIGIBLE_CMS_MANAGED_CONFIGURATION_AUTH_METHOD The authentication method (e.g. key or destination ) key DIRIGIBLE_CMS_MANAGED_CONFIGURATION_NAME The name of the repository cmis:dirigible DIRIGIBLE_CMS_MANAGED_CONFIGURATION_KEY The key of the repository cmis:dirigible:key DIRIGIBLE_CMS_MANAGED_CONFIGURATION_DESTINATION The name of the destination where the name and the key for the repository are stored (e.g. CMIS_DESTINATION ) - DIRIGIBLE_CONNECTIVITY_CONFIGURATION_JNDI_NAME The JNDI name of the connectivity configuration service java:comp/env/connectivity/Configuration in case of SAP package","title":"CMS - Managed"},{"location":"setup/setup-environment-variables/#cms-database","text":"Parameter Description Default* DIRIGIBLE_CMS_DATABASE_DATASOURCE_TYPE Type of the database for CMS repository (e.g. local , managed , custom , dynamic ) managed DIRIGIBLE_CMS_DATABASE_DATASOURCE_NAME The datasource name DefaultDB","title":"CMS Database"},{"location":"setup/setup-environment-variables/#bpm","text":"Parameter Description Default* DIRIGIBLE_BPM_PROVIDER The provider of the BPM engine (e.g. internal , managed , remote ) internal","title":"BPM"},{"location":"setup/setup-environment-variables/#bpm-flowable","text":"Parameter Description Default* DIRIGIBLE_FLOWABLE_DATABASE_DRIVER The driver of the Flowable engine (e.g. org.postgresql.Driver ) - DIRIGIBLE_FLOWABLE_DATABASE_URL The URL of the Flowable engine (e.g.
jdbc:postgresql://localhost:5432/ ) - DIRIGIBLE_FLOWABLE_DATABASE_USER The user of the Flowable engine - DIRIGIBLE_FLOWABLE_DATABASE_PASSWORD The password of the Flowable engine - DIRIGIBLE_FLOWABLE_DATABASE_DATASOURCE_NAME The datasource name of the Flowable engine, if any configured - DIRIGIBLE_FLOWABLE_DATABASE_SCHEMA_UPDATE Whether to materialize the database layout or not true DIRIGIBLE_FLOWABLE_USE_DEFAULT_DATABASE Whether to use the DefaultDB datasource or built-in H2 (e.g. true (DefaultDB) or false (H2)) true","title":"BPM - Flowable"},{"location":"setup/setup-environment-variables/#mail","text":"Parameter Description Default* DIRIGIBLE_MAIL_USERNAME Mailbox username - DIRIGIBLE_MAIL_PASSWORD Mailbox password - DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL Mail transport protocol smtps DIRIGIBLE_MAIL_SMTPS_HOST Mailbox SMTPS host - DIRIGIBLE_MAIL_SMTPS_PORT Mailbox SMTPS port - DIRIGIBLE_MAIL_SMTPS_AUTH Enable/disable mailbox SMTPS authentication - DIRIGIBLE_MAIL_SMTP_HOST Mailbox SMTP host - DIRIGIBLE_MAIL_SMTP_PORT Mailbox SMTP port - DIRIGIBLE_MAIL_SMTP_AUTH Enable/disable mailbox SMTP authentication -","title":"Mail"},{"location":"setup/setup-environment-variables/#messaging","text":"Parameter Description Default* DIRIGIBLE_MESSAGING_USE_DEFAULT_DATABASE Whether to use the DefaultDB datasource or built-in KahaDB (e.g.
true (DefaultDB) or false (KahaDB)) true","title":"Messaging"},{"location":"setup/setup-environment-variables/#kafka","text":"Parameter Description Default* DIRIGIBLE_KAFKA_BOOTSTRAP_SERVER The Kafka server location localhost:9092 DIRIGIBLE_KAFKA_ACKS The number of brokers that must receive the record before considering the write as successful all DIRIGIBLE_KAFKA_KEY_SERIALIZER The Key serializer org.apache.kafka.common.serialization.StringSerializer DIRIGIBLE_KAFKA_VALUE_SERIALIZER The Value serializer org.apache.kafka.common.serialization.StringSerializer DIRIGIBLE_KAFKA_AUTOCOMMIT_ENABLED Whether Auto Commit is enabled true DIRIGIBLE_KAFKA_AUTOCOMMIT_INTERVAL Auto Commit interval in ms 1000","title":"Kafka"},{"location":"setup/setup-environment-variables/#engines","text":"","title":"Engines"},{"location":"setup/setup-environment-variables/#javascript","text":"Parameter Description Default* DIRIGIBLE_JAVASCRIPT_ENGINE_TYPE_DEFAULT The type of the JavaScript engine provider used in this instance (e.g. 
graalvm , rhino , nashorn or v8 ) graalvm since 5.0","title":"JavaScript"},{"location":"setup/setup-environment-variables/#graalvm","text":"Parameter Description Default* DIRIGIBLE_GRAALIUM_ENABLE_DEBUG Whether the debug mode is enabled false DIRIGIBLE_JAVASCRIPT_GRAALVM_DEBUGGER_PORT The GraalVM debugger port 8081 and 0.0.0.0:8081 in Docker environment DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_HOST_ACCESS Whether GraalVM can load classes from custom packages true DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_CREATE_THREAD Whether GraalVM can create threads true DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_CREATE_PROCESS Whether GraalVM can create processes true DIRIGIBLE_JAVASCRIPT_GRAALVM_ALLOW_IO Whether GraalVM can make IO operations true DIRIGIBLE_JAVASCRIPT_GRAALVM_COMPATIBILITY_MODE_NASHORN Whether GraalVM has enabled compatibility mode for Nashorn true DIRIGIBLE_JAVASCRIPT_GRAALVM_COMPATIBILITY_MODE_MOZILLA Whether GraalVM has enabled compatibility mode for Mozilla false","title":"GraalVM"},{"location":"setup/setup-environment-variables/#typescript","text":"Parameter Description Default* DIRIGIBLE_PROJECT_TYPESCRIPT Whether the project is TypeScript enabled true","title":"TypeScript"},{"location":"setup/setup-environment-variables/#odata","text":"Parameter Description Default* DIRIGIBLE_ODATA_HANDLER_EXECUTOR_TYPE The type of the JavaScript engine to be used for event handlers in OData DIRIGIBLE_ODATA_HANDLER_EXECUTOR_ON_EVENT The location of the wrapper helper to be used for event handlers in OData","title":"OData"},{"location":"setup/setup-environment-variables/#ftp","text":"Parameter Description Default* DIRIGIBLE_FTP_USERNAME The FTP server username admin DIRIGIBLE_FTP_PASSWORD The FTP server password admin DIRIGIBLE_FTP_PORT The FTP server port 8022","title":"FTP"},{"location":"setup/setup-environment-variables/#sftp","text":"Parameter Description Default* DIRIGIBLE_SFTP_USERNAME The SFTP server username admin DIRIGIBLE_SFTP_PASSWORD The SFTP server password admin
DIRIGIBLE_SFTP_PORT The SFTP server port 8022","title":"SFTP"},{"location":"setup/setup-environment-variables/#operations","text":"","title":"Operations"},{"location":"setup/setup-environment-variables/#logs","text":"Parameter Description Default* DIRIGIBLE_OPERATIONS_LOGS_ROOT_FOLDER_DEFAULT The folder where the log files are stored ../logs DIRIGIBLE_EXEC_COMMAND_LOGGING_ENABLED Whether to log the executed command by the exec API false","title":"Logs"},{"location":"setup/setup-environment-variables/#look-feel","text":"","title":"Look & Feel"},{"location":"setup/setup-environment-variables/#theme","text":"Parameter Description Default* DIRIGIBLE_THEME_DEFAULT The name of the default theme Default","title":"Theme"},{"location":"setup/setup-environment-variables/#terminal","text":"Parameter Description Default* DIRIGIBLE_TERMINAL_ENABLED Whether the Terminal view is enabled true","title":"Terminal"},{"location":"setup/kubernetes/","text":"Setup in Kubernetes You can deploy Eclipse Dirigible Docker images, for example dirigiblelabs/dirigible , in a Kubernetes cluster. Prerequisites Install kubectl . Access to Kubernetes Cluster on IaaS provider of your choice. Steps Tip This guide describes the generic steps on how to deploy Eclipse Dirigible in a Kubernetes cluster. For more detailed deployment guides go to: Setup in Google Kubernetes Engine . Setup in Azure Kubernetes Service . Setup in Red Hat OpenShift . Setup in SAP BTP Kyma . For additional deployment guides go to: Keycloak Setup . PostgreSQL Setup . GCP DNS Zone Setup . AKS DNS Zone Setup .
Create deployment configuration file: deployment.yaml Pod Deployment Deployment with PVC apiVersion : v1 kind : Pod metadata : name : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always ports : - name : http containerPort : 8080 apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here .
Create service configuration file: service.yaml Service NodePort LoadBalancer Ingress apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : NodePort selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 80 targetPort : 8080 - name : https port : 443 targetPort : 8080 type : LoadBalancer selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible --- apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : dirigible. http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note Replace with your Ingress host. Deploy to the Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml Open a web browser and go to: http://dirigible. Note Replace with your Ingress host. Login with user dirigible and password dirigible , which are set by default in the Docker image ( dirigiblelabs/dirigible ) used above. Maintenance Version Update To update the Eclipse Dirigible version either use kubectl or update the Deployment YAML as follows: with kubectl with Deployment YAML kubectl set image deployment/dirigible dirigible=dirigiblelabs/dirigible: spec : containers : - name : dirigible image : dirigiblelabs/dirigible: imagePullPolicy : Always Eclipse Dirigible versions Update the placeholder with a stable release version: You can find all released versions here . You can find all Eclipse Dirigible Docker images and tags (versions) here .
Scaling The Eclipse Dirigible Deployment can be scaled horizontally by adding/removing Pods as follows: Scale to Zero Scale Up kubectl scale deployment/dirigible --replicas=0 kubectl scale deployment/dirigible --replicas= Note To learn more about application scaling in Kubernetes, see Horizontal Pod Autoscaling . Debugging To debug the Eclipse Dirigible engine via Remote Java Debugging execute the following commands: Scale the deployment to zero: kubectl scale deployment/dirigible --replicas=0 Set debug environment variables: kubectl set env deployment/dirigible -e JPDA_ADDRESS=0.0.0.0:8000 kubectl set env deployment/dirigible -e JPDA_TRANSPORT=dt_socket Edit the deployment and add command and args : kubectl edit deployment dirigible containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always command : [ \"/bin/sh\" ] args : [ \"/usr/local/tomcat/bin/catalina.sh\" , \"jpda\" , \"run\" ] Scale up the deployment: kubectl scale deployment/dirigible --replicas=1 Forward the debug port: kubectl port-forward deployment/dirigible 8000:8000 Clean-up To clean up the environment after the debugging is done: Stop the port forwarding. Scale the deployment to zero. Remove the debug environment variables: kubectl set env deployment/dirigible -e JPDA_ADDRESS- kubectl set env deployment/dirigible -e JPDA_TRANSPORT- Edit the deployment and remove command and args . Scale up the deployment.","title":"Kubernetes"},{"location":"setup/kubernetes/#setup-in-kubernetes","text":"You can deploy Eclipse Dirigible Docker images, for example dirigiblelabs/dirigible , in a Kubernetes cluster. Prerequisites Install kubectl . Access to Kubernetes Cluster on IaaS provider of your choice.","title":"Setup in Kubernetes"},{"location":"setup/kubernetes/#steps","text":"Tip This guide describes the generic steps on how to deploy Eclipse Dirigible in a Kubernetes cluster. For more detailed deployment guides go to: Setup in Google Kubernetes Engine .
Setup in Azure Kubernetes Service . Setup in Red Hat OpenShift . Setup in SAP BTP Kyma . For additional deployment guides go to: Keycloak Setup . PostgreSQL Setup . GCP DNS Zone Setup . AKS DNS Zone Setup . Create deployment configuration file: deployment.yaml Pod Deployment Deployment with PVC apiVersion : v1 kind : Pod metadata : name : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always ports : - name : http containerPort : 8080 apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be
found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . Create service configuration file: service.yaml Service NodePort LoadBalancer Ingress apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : NodePort selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 80 targetPort : 8080 - name : https port : 443 targetPort : 8080 type : LoadBalancer selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible --- apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : dirigible. http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note Replace with your Ingress host. Deploy to the Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml Open a web browser and go to: http://dirigible. Note Replace with your Ingress host.
Login with user dirigible and password dirigible , which are set by default in the Docker image ( dirigiblelabs/dirigible ) used above.","title":"Steps"},{"location":"setup/kubernetes/#maintenance","text":"","title":"Maintenance"},{"location":"setup/kubernetes/#version-update","text":"To update the Eclipse Dirigible version either use kubectl or update the Deployment YAML as follows: with kubectl with Deployment YAML kubectl set image deployment/dirigible dirigible=dirigiblelabs/dirigible: spec : containers : - name : dirigible image : dirigiblelabs/dirigible: imagePullPolicy : Always Eclipse Dirigible versions Update the placeholder with a stable release version: You can find all released versions here . You can find all Eclipse Dirigible Docker images and tags (versions) here .","title":"Version Update"},{"location":"setup/kubernetes/#scaling","text":"The Eclipse Dirigible Deployment can be scaled horizontally by adding/removing Pods as follows: Scale to Zero Scale Up kubectl scale deployment/dirigible --replicas=0 kubectl scale deployment/dirigible --replicas= Note To learn more about application scaling in Kubernetes, see Horizontal Pod Autoscaling .","title":"Scaling"},{"location":"setup/kubernetes/#debugging","text":"To debug the Eclipse Dirigible engine via Remote Java Debugging execute the following commands: Scale the deployment to zero: kubectl scale deployment/dirigible --replicas=0 Set debug environment variables: kubectl set env deployment/dirigible -e JPDA_ADDRESS=0.0.0.0:8000 kubectl set env deployment/dirigible -e JPDA_TRANSPORT=dt_socket Edit the deployment and add command and args : kubectl edit deployment dirigible containers : - name : dirigible image : dirigiblelabs/dirigible:latest imagePullPolicy : Always command : [ \"/bin/sh\" ] args : [ \"/usr/local/tomcat/bin/catalina.sh\" , \"jpda\" , \"run\" ] Scale up the deployment: kubectl scale deployment/dirigible --replicas=1 Forward the debug port: kubectl port-forward deployment/dirigible
8000:8000 Clean-up To clean up the environment after the debugging is done: Stop the port forwarding. Scale the deployment to zero. Remove the debug environment variables: kubectl set env deployment/dirigible -e JPDA_ADDRESS- kubectl set env deployment/dirigible -e JPDA_TRANSPORT- Edit the deployment and remove command and args . Scale up the deployment.","title":"Debugging"},{"location":"setup/kubernetes/azure-kubernetes-service/","text":"Setup in Azure Kubernetes Services Deploy Eclipse Dirigible in Azure Kubernetes Services (AKS) environment. Prerequisites Install kubectl . Install Azure CLI . Note Configure Azure DNS Zone Setup letsencrypt certificate for your domain. Steps Access the Azure Kubernetes Services (AKS) environment via the Azure CLI: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC Deployment with Keycloak apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim :
claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-keycloak:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" - name : DIRIGIBLE_KEYCLOAK_ENABLED value : \"true\" - name : DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL value : - name : DIRIGIBLE_KEYCLOAK_REALM value : - name : DIRIGIBLE_KEYCLOAK_SSL_REQUIRED value : external - name : DIRIGIBLE_KEYCLOAK_CLIENT_ID value : - name : DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT value : \"443\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak Auth server (e.g. https://keycloak-server/auth/ ) . with your Keycloak Realm (e.g. my-realm ) . with your Keycloak Client Id (e.g. my-client ) . Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here .
Create service configuration file: service.yaml Service Ingress Custom Domain apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : NodePort selector : app : dirigible --- apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : dirigible http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note You can find more information on this page: Azure DNS Zone Setup . apiVersion : cert-manager.io/v1 kind : Certificate metadata : name : dirigible spec : secretName : dirigible issuerRef : name : letsencrypt kind : ClusterIssuer commonName : \"dirigible.\" dnsNames : - \"dirigible.\" --- apiVersion : networking.istio.io/v1beta1 kind : Gateway metadata : name : dirigible-gateway spec : selector : istio : ingressgateway servers : - hosts : - dirigible. port : name : http number : 80 protocol : HTTP # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true tls : httpsRedirect : false - hosts : - dirigible. port : name : https-443 number : 443 protocol : HTTPS tls : credentialName : dirigible mode : SIMPLE --- apiVersion : networking.istio.io/v1alpha3 kind : VirtualService metadata : name : dirigible spec : hosts : - dirigible.default.svc.cluster.local - dirigible. gateways : - dirigible-gateway - mesh http : - match : - uri : prefix : / route : - destination : port : number : 8080 host : dirigible.default.svc.cluster.local Replace Placeholders Before deploying, replace the following placeholders: with your Cloud DNS Zone name (e.g. my-zone ) . with your Istio Ingress Gateway IP (e.g. 32.118.56.186 ) . with your custom domain (e.g. my-company.com ) . 
To enforce HTTPS, after the initial deployment, update the following fragment: # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true Deploy to the Azure Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml Open a web browser and go to: https://dirigible.","title":"Azure Kubernetes Service"},{"location":"setup/kubernetes/azure-kubernetes-service/#setup-in-azure-kubernetes-services","text":"Deploy Eclipse Dirigible in Azure Kubernetes Services (AKS) environment. Prerequisites Install kubectl . Install Azure CLI . Note Configure Azure DNS Zone Setup letsencrypt certificate for your domain.","title":"Setup in Azure Kubernetes Services"},{"location":"setup/kubernetes/azure-kubernetes-service/#steps","text":"Access the Azure Kubernetes Services (AKS) environment via the Azure CLI: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC Deployment with Keycloak apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath :
/usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-keycloak:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" - name : DIRIGIBLE_KEYCLOAK_ENABLED value : \"true\" - name : DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL value : - name : DIRIGIBLE_KEYCLOAK_REALM value : - name : DIRIGIBLE_KEYCLOAK_SSL_REQUIRED value : external - name : DIRIGIBLE_KEYCLOAK_CLIENT_ID value : - name : DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT value : \"443\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak Auth server (e.g. https://keycloak-server/auth/ ) . with your Keycloak Realm (e.g. my-realm ) . with your Keycloak Client Id (e.g. my-client ) . Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here .
Create service configuration file: service.yaml Service Ingress Custom Domain apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : NodePort selector : app : dirigible --- apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : dirigible http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note You can find more information on this page: Azure DNS Zone Setup . apiVersion : cert-manager.io/v1 kind : Certificate metadata : name : dirigible spec : secretName : dirigible issuerRef : name : letsencrypt kind : ClusterIssuer commonName : \"dirigible.\" dnsNames : - \"dirigible.\" --- apiVersion : networking.istio.io/v1beta1 kind : Gateway metadata : name : dirigible-gateway spec : selector : istio : ingressgateway servers : - hosts : - dirigible. port : name : http number : 80 protocol : HTTP # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true tls : httpsRedirect : false - hosts : - dirigible. port : name : https-443 number : 443 protocol : HTTPS tls : credentialName : dirigible mode : SIMPLE --- apiVersion : networking.istio.io/v1alpha3 kind : VirtualService metadata : name : dirigible spec : hosts : - dirigible.default.svc.cluster.local - dirigible. gateways : - dirigible-gateway - mesh http : - match : - uri : prefix : / route : - destination : port : number : 8080 host : dirigible.default.svc.cluster.local Replace Placeholders Before deploying, replace the following placeholders: with your Cloud DNS Zone name (e.g. my-zone ) . with your Istio Ingress Gateway IP (e.g. 32.118.56.186 ) . with your custom domain (e.g. my-company.com ) . 
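As an illustration, with the example domain my-company.com from above, the Certificate resource would look roughly like this (the domain is a placeholder; the issuer name assumes the letsencrypt ClusterIssuer from the prerequisites):

```yaml
# Illustrative only: Certificate for the example domain my-company.com.
apiVersion: cert-manager.io/v1
kind: Certificate
metadata:
  name: dirigible
spec:
  secretName: dirigible
  issuerRef:
    name: letsencrypt
    kind: ClusterIssuer
  commonName: "dirigible.my-company.com"
  dnsNames:
    - "dirigible.my-company.com"
```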
To enforce HTTPS, after the initial deployment, update the following fragment: # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true Deploy to the Azure Cluster with: kubectl apply -f deployment.yml kubectl apply -f service.yml Open a web browser and go to: https://dirigible.","title":"Steps"},{"location":"setup/kubernetes/google-kubernetes-engine/","text":"Setup in Google Kubernetes Engine Deploy Eclipse Dirigible in Google Kubernetes Engine (GKE) environment. Prerequisites Install kubectl . Access to Google Kubernetes Engine . Note Create GKE cluster . How to create Google DNS Zone How to setup Istio . How to create certificate for your domain . How to create GCP Cloud SQL instances Steps Access the Google Kubernetes Engine (GKE) environment via the Google Cloud Console: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC Deployment with Keycloak apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : 
/usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-keycloak:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" - name : DIRIGIBLE_KEYCLOAK_ENABLED value : \"true\" - name : DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL value : - name : DIRIGIBLE_KEYCLOAK_REALM value : - name : DIRIGIBLE_KEYCLOAK_SSL_REQUIRED value : external - name : DIRIGIBLE_KEYCLOAK_CLIENT_ID value : - name : DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT value : \"443\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak Auth server (e.g. https://keycloak-server/auth/ ) . with your Keycloak Realm (e.g. my-realm ) . with your Keycloak Client Id (e.g. my-client ) . Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . 
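For example, pinning the image to a released tag instead of latest would look like this (the version number below is only a placeholder; pick an actual tag from the release pages linked above):

```yaml
# Illustrative only: pin a released tag instead of latest (the version number is a placeholder).
containers:
  - name: dirigible
    image: dirigiblelabs/dirigible-all:10.2.7
    imagePullPolicy: IfNotPresent
```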
Create service configuration file: service.yaml Service Ingress Custom Domain apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : NodePort selector : app : dirigible --- apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note You can find more information on this page: GCP DNS Zone Setup . Prerequisites Install Istio , if not already installed. Install cert-manager , if not already installed. Register your zone in Google Cloud Platform \u2192 Cloud DNS , if not already registered. Register DNS Record Set Get the Istio Ingress Gateway IP: kubectl get service -n istio-ingress istio-ingressgateway -o jsonpath=\"{.status.loadBalancer.ingress[0].ip}\" Register DNS Record Set: gcloud dns record-sets transaction start --zone= gcloud dns record-sets transaction add \\ --name=dirigible. \\ --ttl=300 \\ --type=A \\ --zone= gcloud dns record-sets transaction execute --zone= apiVersion : cert-manager.io/v1 kind : Certificate metadata : name : dirigible spec : secretName : dirigible issuerRef : name : letsencrypt kind : ClusterIssuer commonName : \"dirigible.\" dnsNames : - \"dirigible.\" --- apiVersion : networking.istio.io/v1beta1 kind : Gateway metadata : name : dirigible-gateway spec : selector : istio : ingressgateway servers : - hosts : - dirigible. port : name : http number : 80 protocol : HTTP # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true tls : httpsRedirect : false - hosts : - dirigible. 
port : name : https-443 number : 443 protocol : HTTPS tls : credentialName : dirigible mode : SIMPLE --- apiVersion : networking.istio.io/v1alpha3 kind : VirtualService metadata : name : dirigible spec : hosts : - dirigible.default.svc.cluster.local - dirigible. gateways : - dirigible-gateway - mesh http : - match : - uri : prefix : / route : - destination : port : number : 8080 host : dirigible.default.svc.cluster.local Replace Placeholders Before deploying, replace the following placeholders: with your Cloud DNS Zone name (e.g. my-zone ) . with your Istio Ingress Gateway IP (e.g. 32.118.56.186 ) . with your custom domain (e.g. my-company.com ) . To enforce HTTPS, after the initial deployment, update the following fragment: # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true Deploy to the Google Kubernetes Engine Cluster with: kubectl apply -f deployment.yml kubectl apply -f service.yml Open a web browser and go to: https://dirigible.","title":"Google Kubernetes Engine"},{"location":"setup/kubernetes/google-kubernetes-engine/#setup-in-google-kubernetes-engine","text":"Deploy Eclipse Dirigible in Google Kubernetes Engine (GKE) environment. Prerequisites Install kubectl . Access to Google Kubernetes Engine . Note Create GKE cluster . How to create Google DNS Zone How to setup Istio . How to create certificate for your domain . 
How to create GCP Cloud SQL instances","title":"Setup in Google Kubernetes Engine"},{"location":"setup/kubernetes/google-kubernetes-engine/#steps","text":"Access the Google Kubernetes Engine (GKE) environment via the Google Cloud Console: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC Deployment with Keycloak apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-keycloak:latest imagePullPolicy : Always resources : 
requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" - name : DIRIGIBLE_KEYCLOAK_ENABLED value : \"true\" - name : DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL value : - name : DIRIGIBLE_KEYCLOAK_REALM value : - name : DIRIGIBLE_KEYCLOAK_SSL_REQUIRED value : external - name : DIRIGIBLE_KEYCLOAK_CLIENT_ID value : - name : DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT value : \"443\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak Auth server (e.g. https://keycloak-server/auth/ ) . with your Keycloak Realm (e.g. my-realm ) . with your Keycloak Client Id (e.g. my-client ) . Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . 
Create service configuration file: service.yaml Service Ingress Custom Domain apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : NodePort selector : app : dirigible --- apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note You can find more information on this page: GCP DNS Zone Setup . Prerequisites Install Istio , if not already installed. Install cert-manager , if not already installed. Register your zone in Google Cloud Platform \u2192 Cloud DNS , if not already registered. Register DNS Record Set Get the Istio Ingress Gateway IP: kubectl get service -n istio-ingress istio-ingressgateway -o jsonpath=\"{.status.loadBalancer.ingress[0].ip}\" Register DNS Record Set: gcloud dns record-sets transaction start --zone= gcloud dns record-sets transaction add \\ --name=dirigible. \\ --ttl=300 \\ --type=A \\ --zone= gcloud dns record-sets transaction execute --zone= apiVersion : cert-manager.io/v1 kind : Certificate metadata : name : dirigible spec : secretName : dirigible issuerRef : name : letsencrypt kind : ClusterIssuer commonName : \"dirigible.\" dnsNames : - \"dirigible.\" --- apiVersion : networking.istio.io/v1beta1 kind : Gateway metadata : name : dirigible-gateway spec : selector : istio : ingressgateway servers : - hosts : - dirigible. port : name : http number : 80 protocol : HTTP # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true tls : httpsRedirect : false - hosts : - dirigible. 
port : name : https-443 number : 443 protocol : HTTPS tls : credentialName : dirigible mode : SIMPLE --- apiVersion : networking.istio.io/v1alpha3 kind : VirtualService metadata : name : dirigible spec : hosts : - dirigible.default.svc.cluster.local - dirigible. gateways : - dirigible-gateway - mesh http : - match : - uri : prefix : / route : - destination : port : number : 8080 host : dirigible.default.svc.cluster.local Replace Placeholders Before deploying, replace the following placeholders: with your Cloud DNS Zone name (e.g. my-zone ) . with your Istio Ingress Gateway IP (e.g. 32.118.56.186 ) . with your custom domain (e.g. my-company.com ) . To enforce HTTPS, after the initial deployment, update the following fragment: # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true Deploy to the Google Kubernetes Engine Cluster with: kubectl apply -f deployment.yml kubectl apply -f service.yml Open a web browser and go to: https://dirigible.","title":"Steps"},{"location":"setup/kubernetes/helm/","text":"Setup with Helm You can deploy Dirigible via Helm Chart in a Kubernetes cluster. Prerequisites Helm Kubernetes Cluster on IaaS provider of your choice Steps Add the Eclipse Dirigible Helm repository: helm repo add dirigible https://eclipse.github.io/dirigible helm repo update Verify Eclipse Dirigible Helm chart: helm pull dirigible/dirigible --prov curl -o ~/.gnupg/pubring.gpg https://eclipse.github.io/dirigible/charts/pubring.gpg helm verify dirigible-.tgz You should see a message: Signed by: Using Key With Fingerprint: Chart Hash Verified: Basic: helm install dirigible dirigible/dirigible Access This will install Eclipse Dirigible Deployment and Service with ClusterIP only. To access the Dirigible instance, execute the command that was printed in the console. 
Example: export POD_NAME=$(kubectl get pods --namespace default -l \"app.kubernetes.io/name=dirigible,app.kubernetes.io/instance=dirigible\" -o jsonpath=\"{.items[0].metadata.name}\") echo \"Visit http://127.0.0.1:8080 to use your application\" kubectl --namespace default port-forward $POD_NAME 8080:8080 Navigate to: http://127.0.0.1:8080 Login with: dirigible / dirigible Kubernetes: Basic Istio PostgreSQL PostgreSQL & Keycloak GCP Cloud SQL Postgre & Keycloak helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host= This will expose the Dirigible instance through Ingress host ( http://... ). Prerequisites Install Istio . kubectl label namespace default istio-injection=enabled helm install dirigible dirigible/dirigible \\ --set istio.enabled=true \\ --set ingress.host= This will install Eclipse Dirigible Deployment , Service with ClusterIP only and Istio Gateway and Virtual Service . To access the Dirigible instance, execute the command that was printed in the console. kubectl get svc istio-ingressgateway -n istio-system \\ -o jsonpath=\"{.status.loadBalancer.ingress[*].hostname}\" helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host= \\ --set database.enabled=true This will also install a PostgreSQL database with 1Gi storage and update the Dirigible datasource configuration to consume the database. helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host= \\ --set database.enabled=true \\ --set keycloak.enabled=true \\ --set keycloak.install=true \\ --set keycloak.database.enabled=true In addition, Keycloak will be deployed and configured. Disable HTTPS In some cases you might want to disable the \"Required HTTPS\" for Keycloak. 
Login to the PostgreSQL Pod: kubectl exec -it keycloak-database- /bin/bash Connect to the keycloak database: psql -U keycloak Set the ssl_required to NONE : update REALM set ssl_required='NONE' where id = 'master'; Restart the Keycloak pod to apply the updated configuration: kubectl delete pod keycloak- Now the Required HTTPS should be disabled and the keycloak instance should be accessible via http:// Prerequisites Install the gcloud CLI Install kubectl and configure cluster access Install Helm Info You can check the blog for more details. 
Login to the PostgreSQL Pod: kubectl exec -it keycloak-database- /bin/bash Connect to the keycloak database: psql -U keycloak Set the ssl_required to NONE : update REALM set ssl_required='NONE' where id = 'master'; Restart the Keycloak pod to apply the updated configuration: kubectl delete pod keycloak- Now the Required HTTPS should be disabled and the keycloak instance should be accessible via http:// Uninstall: helm uninstall dirigible Configuration The following table lists all the configurable parameters exposed by the Dirigible chart and their default values. Generic Name Description Default dirigible.image Custom Dirigible image \"\" image.repository Dirigible image repo dirigiblelabs/dirigible-all image.repositoryKyma Dirigible Kyma image repo dirigiblelabs/dirigible-sap-kyma image.repositoryKeycloak Dirigible Keycloak image repo dirigiblelabs/dirigible-keycloak image.pullPolicy Image pull policy IfNotPresent service.type Service type ClusterIP service.port Service port 8080 replicaCount Number of replicas 1 imagePullSecrets Image pull secrets [] nameOverride Name override \"\" fullnameOverride Fullname override \"\" podSecurityContext Pod security context {} nodeSelector Node selector {} tolerations Tolerations [] affinity Affinity {} resources Resources {} Basic Name Description Default volume.enabled Volume to be mounted true volume.storage Volume storage size 1Gi database.enabled Database to be deployed false database.image Database image postgres:13 database.driver Database JDBC driver org.postgresql.Driver database.storage Database storage size 1Gi database.username Database username dirigible database.password Database password dirigible ingress.enabled Ingress to be created false ingress.annotations Ingress annotations {} ingress.host Ingress host \"\" ingress.tls Ingress tls false Istio Name Description Default istio.enabled Istio to be enabled false istio.gatewayName Istio gateway name gateway istio.serversPortNumber Istio servers port number 80 
istio.serversPortName Istio servers port name http istio.serversPortProtocol Istio servers port protocol HTTP istio.serversHost Istio servers host * istio.virtualserviceName Istio virtual service name dirigible istio.virtualserviceHosts Istio virtual service hosts * istio.virtualserviceGateways Istio virtual service gateway gateway istio.virtualserviceDestination Istio virtual service destination dirigible istio.virtualservicePort Istio virtual service port 8080 Kyma Name Description Default kyma.enabled Kyma environment to be used false kyma.apirule.enabled Kyma ApiRule to be created true kyma.apirule.host Kyma host to be used in ApiRule \"\" Keycloak Name Description Default keycloak.enabled Keycloak environment to be used false keycloak.install Keycloak to be installed false keycloak.name Keycloak deployment name keycloak keycloak.image Keycloak image jboss/keycloak:12.0.4 keycloak.username Keycloak username admin keycloak.password Keycloak password admin keycloak.replicaCount Keycloak number of replicas 1 keycloak.realm Keycloak realm to be set master keycloak.clientId Keycloak clientId to be used dirigible keycloak.database.enabled Keycloak database to be used false keycloak.database.enabled Keycloak database to be used true keycloak.database.image Keycloak database image postgres:13 keycloak.database.storage Keycloak database storage size 1Gi keycloak.database.username Keycloak database username keycloak keycloak.database.password Keycloak database password keycloak Usage Specify the parameters you wish to customize using the --set argument to the helm install command. For instance, helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host=my-ingress-host.com The above command sets the ingress.host to my-ingress-host.com . Alternatively, a YAML file that specifies the values for the above parameters can be provided while installing the chart. 
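Such a values.yaml might look like this (a minimal sketch assembled from the parameters listed in the tables above; the ingress host is a placeholder and the selection of keys should be adjusted to your environment):

```yaml
# Illustrative only: a minimal values.yaml built from the chart parameters listed above.
ingress:
  enabled: true
  host: my-ingress-host.com
database:
  enabled: true
keycloak:
  enabled: true
  install: true
```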
For example, helm install dirigible dirigible/dirigible --values values.yaml Tip You can use the default values.yaml .","title":"Helm"},{"location":"setup/kubernetes/helm/#setup-with-helm","text":"You can deploy Dirigible via Helm Chart in a Kubernetes cluster. Prerequisites Helm Kubernetes Cluster on IaaS provider of your choice","title":"Setup with Helm"},{"location":"setup/kubernetes/helm/#steps","text":"Add the Eclipse Dirigible Helm repository: helm repo add dirigible https://eclipse.github.io/dirigible helm repo update Verify Eclipse Dirigible Helm chart: helm pull dirigible/dirigible --prov curl -o ~/.gnupg/pubring.gpg https://eclipse.github.io/dirigible/charts/pubring.gpg helm verify dirigible-.tgz You should see a message: Signed by: Using Key With Fingerprint: Chart Hash Verified: Basic: helm install dirigible dirigible/dirigible Access This will install Eclipse Dirigible Deployment and Service with ClusterIP only. To access the Dirigible instance, execute the command that was printed in the console. Example: export POD_NAME=$(kubectl get pods --namespace default -l \"app.kubernetes.io/name=dirigible,app.kubernetes.io/instance=dirigible\" -o jsonpath=\"{.items[0].metadata.name}\") echo \"Visit http://127.0.0.1:8080 to use your application\" kubectl --namespace default port-forward $POD_NAME 8080:8080 Navigate to: http://127.0.0.1:8080 Login with: dirigible / dirigible Kubernetes: Basic Istio PostgreSQL PostgreSQL & Keycloak GCP Cloud SQL Postgre & Keycloak helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host= This will expose the Dirigible instance through Ingress host ( http://... ). Prerequisites Install Istio . kubectl label namespace default istio-injection=enabled helm install dirigible dirigible/dirigible \\ --set istio.enabled=true \\ --set ingress.host= This will install Eclipse Dirigible Deployment , Service with ClusterIP only and Istio Gateway and Virtual Service . 
To access the Dirigible instance, execute the command that was printed in the console. kubectl get svc istio-ingressgateway -n istio-system \\ -o jsonpath=\"{.status.loadBalancer.ingress[*].hostname}\" helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host= \\ --set database.enabled=true This will also install a PostgreSQL database with 1Gi storage and update the Dirigible datasource configuration to consume the database. helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host= \\ --set database.enabled=true \\ --set keycloak.enabled=true \\ --set keycloak.install=true \\ --set keycloak.database.enabled=true In addition, Keycloak will be deployed and configured. Disable HTTPS In some cases you might want to disable the \"Required HTTPS\" for Keycloak. Login to the PostgreSQL Pod: kubectl exec -it keycloak-database- /bin/bash Connect to the keycloak database: psql -U keycloak Set the ssl_required to NONE : update REALM set ssl_required='NONE' where id = 'master'; Restart the Keycloak pod to apply the updated configuration: kubectl delete pod keycloak- Now the Required HTTPS should be disabled and the keycloak instance should be accessible via http:// Prerequisites Install the gcloud CLI Install kubectl and configure cluster access Install Helm Info You can check the blog for more details. 
helm upgrade --install dirigible dirigible -n dirigible-demo \\ --set volume.enabled=true \\ --set serviceAccount.create=false \\ --set keycloak.serviceAccountCreate=false \\ --set ingress.tls=true \\ --set keycloak.enabled=true \\ --set keycloak.install=true \\ --set istio.enabled=true \\ --set istio.enableHttps=true \\ --set gke.cloudSQL=true \\ --set gke.projectId= \\ --set gke.region= \\ --set ingress.host= Kyma: Basic PostgreSQL PostgreSQL & Keycloak helm install dirigible dirigible/dirigible \\ --set kyma.enabled=true \\ --set kyma.apirule.host= This will additionally install an ApiRule and XSUAA ServiceInstance and ServiceBinding . The appropriate roles should be assigned to the user. helm install dirigible dirigible/dirigible \\ --set kyma.enabled=true \\ --set kyma.apirule.host= \\ --set database.enabled=true This will also install a PostgreSQL database with 1Gi storage and update the Dirigible datasource configuration to consume the database. helm install dirigible dirigible/dirigible \\ --set kyma.enabled=true \\ --set kyma.apirule.host= \\ --set database.enabled=true \\ --set keycloak.enabled=true \\ --set keycloak.install=true In addition, Keycloak will be deployed and configured. Disable HTTPS In some cases you might want to disable the \"Required HTTPS\" for Keycloak. 
Login to the PostgreSQL Pod: kubectl exec -it keycloak-database- /bin/bash Connect to the keycloak database: psql -U keycloak Set the ssl_required to NONE : update REALM set ssl_required='NONE' where id = 'master'; Restart the Keycloak pod to apply the updated configuration: kubectl delete pod keycloak- Now the Required HTTPS should be disabled and the keycloak instance should be accessible via http:// Uninstall: helm uninstall dirigible","title":"Steps"},{"location":"setup/kubernetes/helm/#configuration","text":"The following table lists all the configurable parameters exposed by the Dirigible chart and their default values.","title":"Configuration"},{"location":"setup/kubernetes/helm/#generic","text":"Name Description Default dirigible.image Custom Dirigible image \"\" image.repository Dirigible image repo dirigiblelabs/dirigible-all image.repositoryKyma Dirigible Kyma image repo dirigiblelabs/dirigible-sap-kyma image.repositoryKeycloak Dirigible Keycloak image repo dirigiblelabs/dirigible-keycloak image.pullPolicy Image pull policy IfNotPresent service.type Service type ClusterIP service.port Service port 8080 replicaCount Number of replicas 1 imagePullSecrets Image pull secrets [] nameOverride Name override \"\" fullnameOverride Fullname override \"\" podSecurityContext Pod security context {} nodeSelector Node selector {} tolerations Tolerations [] affinity Affinity {} resources Resources {}","title":"Generic"},{"location":"setup/kubernetes/helm/#basic","text":"Name Description Default volume.enabled Volume to be mounted true volume.storage Volume storage size 1Gi database.enabled Database to be deployed false database.image Database image postgres:13 database.driver Database JDBC driver org.postgresql.Driver database.storage Database storage size 1Gi database.username Database username dirigible database.password Database password dirigible ingress.enabled Ingress to be created false ingress.annotations Ingress annotations {} ingress.host Ingress host \"\" 
ingress.tls Ingress tls false","title":"Basic"},{"location":"setup/kubernetes/helm/#istio","text":"Name Description Default istio.enabled Istio to be enabled false istio.gatewayName Istio gateway name gateway istio.serversPortNumber Istio servers port number 80 istio.serversPortName Istio servers port name http istio.serversPortProtocol Istio servers port protocol HTTP istio.serversHost Istio servers host * istio.virtualserviceName Istio virtual service name dirigible istio.virtualserviceHosts Istio virtual service hosts * istio.virtualserviceGateways Istio virtual service gateway gateway istio.virtualserviceDestination Istio virtual service destination dirigible istio.virtualservicePort Istio virtual service port 8080","title":"Istio"},{"location":"setup/kubernetes/helm/#kyma","text":"Name Description Default kyma.enabled Kyma environment to be used false kyma.apirule.enabled Kyma ApiRule to be created true kyma.apirule.host Kyma host to be used in ApiRule \"\"","title":"Kyma"},{"location":"setup/kubernetes/helm/#keycloak","text":"Name Description Default keycloak.enabled Keycloak environment to be used false keycloak.install Keycloak to be installed false keycloak.name Keycloak deployment name keycloak keycloak.image Keycloak image jboss/keycloak:12.0.4 keycloak.username Keycloak username admin keycloak.password Keycloak password admin keycloak.replicaCount Keycloak number of replicas 1 keycloak.realm Keycloak realm to be set master keycloak.clientId Keycloak clientId to be used dirigible keycloak.database.enabled Keycloak database to be used false keycloak.database.enabled Keycloak database to be used true keycloak.database.image Keycloak database image postgres:13 keycloak.database.storage Keycloak database storage size 1Gi keycloak.database.username Keycloak database username keycloak keycloak.database.password Keycloak database password keycloak","title":"Keycloak"},{"location":"setup/kubernetes/helm/#usage","text":"Specify the parameters you wish to 
customize using the --set argument to the helm install command. For instance, helm install dirigible dirigible/dirigible \\ --set ingress.enabled=true \\ --set ingress.host=my-ingress-host.com The above command sets the ingress.host to my-ingress-host.com . Alternatively, a YAML file that specifies the values for the above parameters can be provided while installing the chart. For example, helm install dirigible dirigible/dirigible --values values.yaml Tip You can use the default values.yaml .","title":"Usage"},{"location":"setup/kubernetes/red-hat-openshift/","text":"Setup in Red Hat OpenShift Deploy Eclipse Dirigible in Red Hat OpenShift environment. Prerequisites Install kubectl . Access to Red Hat OpenShift . Steps Access the Red Hat OpenShift environment via the OpenShift Console: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC Deployment with Keycloak apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-temp-data mountPath : /usr/local/tomcat/target volumes : - name : dirigible-temp-data emptyDir : {} apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT 
value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository - name : dirigible-temp-data mountPath : /usr/local/tomcat/target/dirigible volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" - name : dirigible-temp-data emptyDir : {} --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-keycloak:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" - name : DIRIGIBLE_KEYCLOAK_ENABLED value : \"true\" - name : DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL value : - name : DIRIGIBLE_KEYCLOAK_REALM value : - name : DIRIGIBLE_KEYCLOAK_SSL_REQUIRED value : external - name : DIRIGIBLE_KEYCLOAK_CLIENT_ID value : - name : DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT value : \"443\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository - name : dirigible-temp-data mountPath : /usr/local/tomcat/target/dirigible volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" - name : dirigible-temp-data emptyDir : {} --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak Auth server (e.g. https://keycloak-server/auth/ ) . with your Keycloak Realm (e.g. my-realm ) . with your Keycloak Client Id (e.g. my-client ) . 
Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . Create service configuration file: service.yaml Service Route apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible --- kind : Route apiVersion : route.openshift.io/v1 metadata : name : dirigible spec : host : dirigible. to : kind : Service name : dirigible port : targetPort : http tls : termination : edge insecureEdgeTerminationPolicy : Redirect wildcardPolicy : None Replace Placeholders Before deploying, replace the following placeholders: with your OpenShift domain (e.g. apps.sandbox.xxxx.yy.openshiftapps.com ) . Deploy to the OpenShift Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml OpenShift Console Alternatively, the OpenShift Console can be used to deploy the artifacts via the Console UI. Open a web browser and go to: https://dirigible.","title":"Red Hat OpenShift"},{"location":"setup/kubernetes/red-hat-openshift/#setup-in-red-hat-openshift","text":"Deploy Eclipse Dirigible in Red Hat OpenShift environment. Prerequisites Install kubectl .
Access to Red Hat OpenShift .","title":"Setup in Red Hat OpenShift"},{"location":"setup/kubernetes/red-hat-openshift/#steps","text":"Access the Red Hat OpenShift environment via the OpenShift Console: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC Deployment with Keycloak apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-temp-data mountPath : /usr/local/tomcat/target volumes : - name : dirigible-temp-data emptyDir : {} apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-all:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository - name : dirigible-temp-data mountPath : /usr/local/tomcat/target/dirigible volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" - name : dirigible-temp-data emptyDir : {} --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate 
selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-keycloak:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" ports : - name : http containerPort : 8080 env : - name : DIRIGIBLE_THEME_DEFAULT value : \"fiori\" - name : DIRIGIBLE_KEYCLOAK_ENABLED value : \"true\" - name : DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL value : - name : DIRIGIBLE_KEYCLOAK_REALM value : - name : DIRIGIBLE_KEYCLOAK_SSL_REQUIRED value : external - name : DIRIGIBLE_KEYCLOAK_CLIENT_ID value : - name : DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT value : \"443\" volumeMounts : - name : dirigible-data mountPath : /usr/local/tomcat/target/dirigible/repository - name : dirigible-temp-data mountPath : /usr/local/tomcat/target/dirigible volumes : - name : dirigible-data persistentVolumeClaim : claimName : \"dirigible-data\" - name : dirigible-temp-data emptyDir : {} --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak Auth server (e.g. https://keycloak-server/auth/ ) . with your Keycloak Realm (e.g. my-realm ) . with your Keycloak Client Id (e.g. my-client ) . Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here .
Create service configuration file: service.yaml Service Route apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible apiVersion : v1 kind : Service metadata : name : dirigible labels : app : dirigible spec : ports : - name : http port : 8080 type : ClusterIP selector : app : dirigible --- kind : Route apiVersion : route.openshift.io/v1 metadata : name : dirigible spec : host : dirigible. to : kind : Service name : dirigible port : targetPort : http tls : termination : edge insecureEdgeTerminationPolicy : Redirect wildcardPolicy : None Replace Placeholders Before deploying, replace the following placeholders: with your OpenShift domain (e.g. apps.sandbox.xxxx.yy.openshiftapps.com ) . Deploy to the OpenShift Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml OpenShift Console Alternatively, the OpenShift Console can be used to deploy the artifacts via the Console UI. Open a web browser and go to: https://dirigible.","title":"Steps"},{"location":"setup/kubernetes/sap-btp-kyma/","text":"Setup in SAP BTP Kyma Deploy Eclipse Dirigible in SAP BTP 1 , Kyma environment. Prerequisites Install kubectl - this step is optional. Access to SAP BTP account (the Trial landscape can be accessed here ). Steps Access the SAP BTP, Kyma environment via the SAP BTP cockpit: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-sap-kyma:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" env : - name : DIRIGIBLE_THEME_DEFAULT value : fiori - name : DIRIGIBLE_HOST value : https://dirigible.
ports : - containerPort : 8080 name : dirigible protocol : TCP apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-sap-kyma:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" env : - name : DIRIGIBLE_THEME_DEFAULT value : fiori - name : DIRIGIBLE_HOST value : https://dirigible. volumeMounts : - name : dirigible-volume mountPath : /usr/local/tomcat/target/dirigible/repository ports : - containerPort : 8080 name : dirigible protocol : TCP volumes : - name : dirigible-volume persistentVolumeClaim : claimName : dirigible-claim --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-claim spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Kyma cluster host (e.g. c-xxxx.kyma.yyyy.ondemand.com ) . Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here .
Create service configuration file: service.yaml Service APIRule apiVersion : v1 kind : Service metadata : labels : app : dirigible name : dirigible spec : ports : - name : dirigible port : 8080 protocol : TCP targetPort : 8080 selector : app : dirigible type : ClusterIP apiVersion : v1 kind : Service metadata : labels : app : dirigible name : dirigible spec : ports : - name : dirigible port : 8080 protocol : TCP targetPort : 8080 selector : app : dirigible type : ClusterIP --- apiVersion : gateway.kyma-project.io/v1alpha1 kind : APIRule metadata : name : dirigible spec : gateway : kyma-gateway.kyma-system.svc.cluster.local rules : - accessStrategies : - config : {} handler : noop methods : - GET - POST - PUT - PATCH - DELETE - HEAD path : /.* service : host : dirigible. name : dirigible port : 8080 Replace Placeholders Before deploying, replace the following placeholders: with your Kyma cluster host (e.g. c-xxxx.kyma.yyyy.ondemand.com ) . Click on the Deploy new resource button and select the deployment.yaml and service.yaml files. Note Alternatively, kubectl can be used to deploy the resources: kubectl apply -f deployment.yaml kubectl apply -f service.yaml Create XSUAA service instance: From the Kyma dashboard, go to Service Management \u2192 Catalog . Find the Authorization & Trust Management service. Create a new service instance. Provide the following additional parameters.
{ \"xsappname\" : \"dirigible-xsuaa\" , \"oauth2-configuration\" : { \"token-validity\" : 7200 , \"redirect-uris\" : [ \"https://dirigible.\" ] }, \"scopes\" : [ { \"name\" : \"$XSAPPNAME.Developer\" , \"description\" : \"Developer scope\" }, { \"name\" : \"$XSAPPNAME.Operator\" , \"description\" : \"Operator scope\" } ], \"role-templates\" : [ { \"name\" : \"Developer\" , \"description\" : \"Developer related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Operator\" , \"description\" : \"Operator related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Operator\" ] } ], \"role-collections\" : [ { \"name\" : \"Dirigible Developer\" , \"description\" : \"Dirigible Developer\" , \"role-template-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Dirigible Operator\" , \"description\" : \"Dirigible Operator\" , \"role-template-references\" : [ \"$XSAPPNAME.Operator\" ] } ] } Note Replace the placeholder with your Kyma cluster host (e.g. c-xxxxxxx.kyma.xxx.xxx.xxx.ondemand.com ). Bind the service instance to the dirigible application. Assign the Developer and Operator roles. Log in. SAP Cloud Platform is called SAP Business Technology Platform (SAP BTP) as of 2021. \u21a9","title":"SAP BTP Kyma"},{"location":"setup/kubernetes/sap-btp-kyma/#setup-in-sap-btp-kyma","text":"Deploy Eclipse Dirigible in SAP BTP 1 , Kyma environment. Prerequisites Install kubectl - this step is optional.
Access to SAP BTP account (the Trial landscape can be accessed here ).","title":"Setup in SAP BTP Kyma"},{"location":"setup/kubernetes/sap-btp-kyma/#steps","text":"Access the SAP BTP, Kyma environment via the SAP BTP cockpit: Create deployment configuration file: deployment.yaml Deployment Deployment with PVC apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-sap-kyma:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" env : - name : DIRIGIBLE_THEME_DEFAULT value : fiori - name : DIRIGIBLE_HOST value : https://dirigible. ports : - containerPort : 8080 name : dirigible protocol : TCP apiVersion : apps/v1 kind : Deployment metadata : name : dirigible spec : replicas : 1 strategy : type : Recreate selector : matchLabels : app : dirigible template : metadata : labels : app : dirigible spec : containers : - name : dirigible image : dirigiblelabs/dirigible-sap-kyma:latest imagePullPolicy : Always resources : requests : memory : \"1Gi\" cpu : \"0.5\" limits : memory : \"4Gi\" cpu : \"2\" env : - name : DIRIGIBLE_THEME_DEFAULT value : fiori - name : DIRIGIBLE_HOST value : https://dirigible. volumeMounts : - name : dirigible-volume mountPath : /usr/local/tomcat/target/dirigible/repository ports : - containerPort : 8080 name : dirigible protocol : TCP volumes : - name : dirigible-volume persistentVolumeClaim : claimName : dirigible-claim --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-claim spec : accessModes : - ReadWriteOnce resources : requests : storage : 1Gi Replace Placeholders Before deploying, replace the following placeholders: with your Kyma cluster host (e.g. c-xxxx.kyma.yyyy.ondemand.com ) . 
Eclipse Dirigible versions Instead of using the latest tag (version), for production and development use cases it is recommended to use a stable release version: All released versions can be found here . All Eclipse Dirigible Docker images and tags (versions) can be found here . Create service configuration file: service.yaml Service APIRule apiVersion : v1 kind : Service metadata : labels : app : dirigible name : dirigible spec : ports : - name : dirigible port : 8080 protocol : TCP targetPort : 8080 selector : app : dirigible type : ClusterIP apiVersion : v1 kind : Service metadata : labels : app : dirigible name : dirigible spec : ports : - name : dirigible port : 8080 protocol : TCP targetPort : 8080 selector : app : dirigible type : ClusterIP --- apiVersion : gateway.kyma-project.io/v1alpha1 kind : APIRule metadata : name : dirigible spec : gateway : kyma-gateway.kyma-system.svc.cluster.local rules : - accessStrategies : - config : {} handler : noop methods : - GET - POST - PUT - PATCH - DELETE - HEAD path : /.* service : host : dirigible. name : dirigible port : 8080 Replace Placeholders Before deploying, replace the following placeholders: with your Kyma cluster host (e.g. c-xxxx.kyma.yyyy.ondemand.com ) . Click on the Deploy new resource button and select the deployment.yaml and service.yaml files. Note Alternatively, kubectl can be used to deploy the resources: kubectl apply -f deployment.yaml kubectl apply -f service.yaml Create XSUAA service instance: From the Kyma dashboard, go to Service Management \u2192 Catalog . Find the Authorization & Trust Management service. Create a new service instance. Provide the following additional parameters.
{ \"xsappname\" : \"dirigible-xsuaa\" , \"oauth2-configuration\" : { \"token-validity\" : 7200 , \"redirect-uris\" : [ \"https://dirigible.\" ] }, \"scopes\" : [ { \"name\" : \"$XSAPPNAME.Developer\" , \"description\" : \"Developer scope\" }, { \"name\" : \"$XSAPPNAME.Operator\" , \"description\" : \"Operator scope\" } ], \"role-templates\" : [ { \"name\" : \"Developer\" , \"description\" : \"Developer related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Operator\" , \"description\" : \"Operator related roles\" , \"scope-references\" : [ \"$XSAPPNAME.Operator\" ] } ], \"role-collections\" : [ { \"name\" : \"Dirigible Developer\" , \"description\" : \"Dirigible Developer\" , \"role-template-references\" : [ \"$XSAPPNAME.Developer\" ] }, { \"name\" : \"Dirigible Operator\" , \"description\" : \"Dirigible Operator\" , \"role-template-references\" : [ \"$XSAPPNAME.Operator\" ] } ] } Note Replace the placeholder with your Kyma cluster host (e.g. c-xxxxxxx.kyma.xxx.xxx.xxx.ondemand.com ). Bind the service instance to the dirigible application. Assign the Developer and Operator roles. Log in. SAP Cloud Platform is called SAP Business Technology Platform (SAP BTP) as of 2021. \u21a9","title":"Steps"},{"location":"setup/kubernetes/addons/azure-dns-zone/","text":"Create Azure DNS Zone Setup Prerequisites Install the Azure CLI . Steps Create a resource group az group create \\ --name DirigibleResourceGroup \\ --location Create static public IP address az aks show \\ --resource-group DirigibleResourceGroup \\ --name dirigible \\ --query nodeResourceGroup \\ -o tsv After you run the previous command you will receive MC_.... Add it to the next command.
az network public-ip create \\ --resource-group MC_DirigibleResourceGroup_dirigible_ \\ --name PublicIP \\ --sku Standard \\ --allocation-method static \\ --query publicIp.ipAddress \\ -o tsv Create DNS zone az network dns zone create \\ -g DirigibleResourceGroup \\ -n dirigible.io Create DNS Record Get the IP address kubectl get svc -n istio-ingress istio-ingressgateway -o jsonpath=\"{.status.loadBalancer.ingress[0].ip}\" Set an A DNS record az network dns record-set a add-record \\ -g DirigibleResourceGroup \\ -z \\ -n dirigible \\ -a ","title":"Azure DNS Zone"},{"location":"setup/kubernetes/addons/azure-dns-zone/#create-azure-dns-zone-setup","text":"Prerequisites Install the Azure CLI .","title":"Create Azure DNS Zone Setup"},{"location":"setup/kubernetes/addons/azure-dns-zone/#steps","text":"Create a resource group az group create \\ --name DirigibleResourceGroup \\ --location Create static public IP address az aks show \\ --resource-group DirigibleResourceGroup \\ --name dirigible \\ --query nodeResourceGroup \\ -o tsv After you run the previous command you will receive MC_.... Add it to the next command.
az network public-ip create \\ --resource-group MC_DirigibleResourceGroup_dirigible_ \\ --name PublicIP \\ --sku Standard \\ --allocation-method static \\ --query publicIp.ipAddress \\ -o tsv Create DNS zone az network dns zone create \\ -g DirigibleResourceGroup \\ -n dirigible.io Create DNS Record Get the IP address kubectl get svc -n istio-ingress istio-ingressgateway -o jsonpath=\"{.status.loadBalancer.ingress[0].ip}\" Set an A DNS record az network dns record-set a add-record \\ -g DirigibleResourceGroup \\ -z \\ -n dirigible \\ -a ","title":"Steps"},{"location":"setup/kubernetes/addons/gke-cluster/","text":"Create Google Kubernetes cluster Setup Prerequisites First you will need to add your billing information Install the gcloud CLI Install kubectl and configure cluster access Steps Create organization Create project List the organizations gcloud organizations list gcloud projects create dirigible-demo --name=dirigible --organization= You can check for the new project with: gcloud projects list --filter 'parent.id=' Enable Engine Api Go to Kubernetes Engine -> Clusters and click on Enable to allow creating clusters. Create cluster Set the project Set the project in which you will create the cluster gcloud config set project PROJECT_ID Alternatively, set the project in every command with --project . Create an IAM service account with the minimum permissions required to operate GKE SA_NAME: the name of the new service account. DISPLAY_NAME: the display name for the new service account, which makes the account easier to identify. PROJECT_ID: the project ID of the project in which you want to create the new service account.
SA_NAME=sa-minimum-permissions-gke-demo \\ DISPLAY_NAME='SA minimum permissions required to operate GKE' \\ PROJECT_ID= gcloud iam service-accounts create $SA_NAME \\ --display-name=\"$DISPLAY_NAME\" \\ --project $PROJECT_ID gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/logging.logWriter gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/monitoring.metricWriter gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/monitoring.viewer gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/stackdriver.resourceMetadata.writer Create the cluster gcloud container clusters create \\ --region europe-west1-b \\ --project=$PROJECT_ID \\ --service-account=$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com Get a connection to the cluster gcloud container clusters get-credentials Note How to create Google DNS Zone How to setup Istio . How to create certificate for your domain . How to create GCP Cloud SQL instances","title":"GKE cluster"},{"location":"setup/kubernetes/addons/gke-cluster/#create-google-kubernetes-cluster-setup","text":"Prerequisites First you will need to add your billing information Install the gcloud CLI Install kubectl and configure cluster access","title":"Create Google Kubernetes cluster Setup"},{"location":"setup/kubernetes/addons/gke-cluster/#steps","text":"Create organization Create project List the organizations gcloud organizations list gcloud projects create dirigible-demo --name=dirigible --organization= You can check for the new project with: gcloud projects list --filter 'parent.id=' Enable Engine Api Go to Kubernetes Engine -> Clusters and click on Enable to allow creating clusters.
Create cluster Set the project Set the project in which you will create the cluster gcloud config set project PROJECT_ID Alternatively, set the project in every command with --project . Create an IAM service account with the minimum permissions required to operate GKE SA_NAME: the name of the new service account. DISPLAY_NAME: the display name for the new service account, which makes the account easier to identify. PROJECT_ID: the project ID of the project in which you want to create the new service account. SA_NAME=sa-minimum-permissions-gke-demo \\ DISPLAY_NAME='SA minimum permissions required to operate GKE' \\ PROJECT_ID= gcloud iam service-accounts create $SA_NAME \\ --display-name=\"$DISPLAY_NAME\" \\ --project $PROJECT_ID gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/logging.logWriter gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/monitoring.metricWriter gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/monitoring.viewer gcloud projects add-iam-policy-binding $PROJECT_ID \\ --member \"serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com\" \\ --role roles/stackdriver.resourceMetadata.writer Create the cluster gcloud container clusters create \\ --region europe-west1-b \\ --project=$PROJECT_ID \\ --service-account=$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com Get a connection to the cluster gcloud container clusters get-credentials Note How to create Google DNS Zone How to setup Istio . How to create certificate for your domain . How to create GCP Cloud SQL instances","title":"Steps"},{"location":"setup/kubernetes/addons/google-dns-zone/","text":"Create Google DNS Zone Setup Prerequisites Enable Cloud DNS API .
Install gcloud . Install the gcloud kubectl component: gcloud components install kubectl Access to a Kubernetes cluster: gcloud auth login . Update the kubectl configuration to use the plugin: gcloud container clusters get-credentials --zone Steps Create managed DNS Zone Console gcloud Google Cloud console In the Google Cloud console, go to the Create a DNS zone page. `Go to Create a DNS zone` For the Zone type, select Public. Enter a Zone name such as my-new-zone. Enter a DNS name suffix for the zone using a domain name that you own. All records in the zone share this suffix, for example: example.com. Under DNSSEC, select Off, On, or Transfer. For more information, see Enable DNSSEC for existing managed zones. Click Create. The Zone details page is displayed. Set the project Set the project on which you will create the DNS Zone gcloud config set project PROJECT_ID Alternatively, set the project in every command with --project . gcloud dns managed-zones create NAME \\ --description=DESCRIPTION \\ --dns-name=DNS_SUFFIX \\ --labels=LABELS \\ --visibility=public Replace Placeholders DESCRIPTION with your description. LABELS with your label. DNS_SUFFIX with your main domain or subdomain. Get Ingress IP address Kubernetes Ingress Istio Ingress kubectl get ingress and check the ADDRESS column kubectl get service -n istio-ingress istio-ingressgateway -o jsonpath=\"{.status.loadBalancer.ingress[0].ip}\" Change the namespace istio-ingress to match your installation. Note You can check the Istio setup Create A record in Cloud DNS Set the zone for which you will create records gcloud dns record-sets transaction start --zone= Add an A record gcloud dns record-sets transaction add \\ --name=dirigible.
\\ --ttl=300 \\ --type=A \\ --zone= Apply the new record gcloud dns record-sets transaction execute --zone= - Promote the ephemeral IP to a reserved one ``` gcloud compute addresses create --addresses= \\ --region= ``` Get your current DNS records for your zone gcloud dns record-sets list --zone= Replace Placeholders Before running the commands, replace the following placeholders: with your Google Cloud DNS zone name. Add name servers Subdomain Main domain Note If you configure a subdomain, add the Google name servers to your main domain's control panel for this subdomain, for example: ns-cloud-d1.googledomains.com , ns-cloud-d2.googledomains.com , ns-cloud-d3.googledomains.com , ns-cloud-d4.googledomains.com Note At the end you need to update your domain's name servers to use Cloud DNS to publish your new records to the internet. Example: ns-cloud-d1.googledomains.com , ns-cloud-d2.googledomains.com , ns-cloud-d3.googledomains.com , ns-cloud-d4.googledomains.com Note How to create certificate for your domain .","title":"GCP DNS Zone"},{"location":"setup/kubernetes/addons/google-dns-zone/#create-google-dns-zone-setup","text":"Prerequisites Enable Cloud DNS API . Install gcloud . Install the gcloud kubectl component: gcloud components install kubectl Access to a Kubernetes cluster: gcloud auth login . Update the kubectl configuration to use the plugin: gcloud container clusters get-credentials --zone ","title":"Create Google DNS Zone Setup"},{"location":"setup/kubernetes/addons/google-dns-zone/#steps","text":"Create managed DNS Zone Console gcloud Google Cloud console In the Google Cloud console, go to the Create a DNS zone page. `Go to Create a DNS zone` For the Zone type, select Public. Enter a Zone name such as my-new-zone. Enter a DNS name suffix for the zone using a domain name that you own. All records in the zone share this suffix, for example: example.com. Under DNSSEC, select Off, On, or Transfer. For more information, see Enable DNSSEC for existing managed zones.
Click Create. The Zone details page is displayed. Set the project Set the project on which you will create the DNS Zone gcloud config set project PROJECT_ID Alternatively, set the project in every command with --project . gcloud dns managed-zones create NAME \\ --description=DESCRIPTION \\ --dns-name=DNS_SUFFIX \\ --labels=LABELS \\ --visibility=public Replace Placeholders DESCRIPTION with your description. LABELS with your label. DNS_SUFFIX with your main domain or subdomain. Get Ingress IP address Kubernetes Ingress Istio Ingress kubectl get ingress and check the ADDRESS column kubectl get service -n istio-ingress istio-ingressgateway -o jsonpath=\"{.status.loadBalancer.ingress[0].ip}\" Change the namespace istio-ingress to match your installation. Note You can check the Istio setup Create A record in Cloud DNS Set the zone for which you will create records gcloud dns record-sets transaction start --zone= Add an A record gcloud dns record-sets transaction add \\ --name=dirigible. \\ --ttl=300 \\ --type=A \\ --zone= Apply the new record gcloud dns record-sets transaction execute --zone= - Promote the ephemeral IP to a reserved one ``` gcloud compute addresses create --addresses= \\ --region= ``` Get your current DNS records for your zone gcloud dns record-sets list --zone= Replace Placeholders Before running the commands, replace the following placeholders: with your Google Cloud DNS zone name. Add name servers Subdomain Main domain Note If you configure a subdomain, add the Google name servers to your main domain's control panel for this subdomain, for example: ns-cloud-d1.googledomains.com , ns-cloud-d2.googledomains.com , ns-cloud-d3.googledomains.com , ns-cloud-d4.googledomains.com Note At the end you need to update your domain's name servers to use Cloud DNS to publish your new records to the internet.
Example: ns-cloud-d1.googledomains.com , ns-cloud-d2.googledomains.com , ns-cloud-d3.googledomains.com , ns-cloud-d4.googledomains.com Note How to create a certificate for your domain .","title":"Steps"},{"location":"setup/kubernetes/addons/istio/","text":"Istio Setup Prerequisites Install istioctl . Install kubectl . Access to a Kubernetes cluster. Create istio-system namespace kubectl create namespace istio-system Install the Istio control plane service istiod apiVersion : v1 kind : Service metadata : labels : app : istiod istio : pilot release : istio name : istiod namespace : istio-system spec : type : ClusterIP ports : - name : grpc-xds port : 15010 - name : https-dns port : 15012 - name : https-webhook port : 443 targetPort : 15017 - name : http-monitoring port : 15014 selector : app : istiod Install the minimal profile and reduce the gateway config. Create the control-plane.yaml file apiVersion : install.istio.io/v1alpha1 kind : IstioOperator metadata : name : control-plane spec : profile : minimal components : pilot : k8s : env : - name : PILOT_FILTER_GATEWAY_CLUSTER_CONFIG value : \"true\" meshConfig : defaultConfig : proxyMetadata : ISTIO_META_DNS_CAPTURE : \"true\" enablePrometheusMerge : true Check the latest version istioctl install -y -n istio-system -f control-plane.yaml --revision 1-14-3 Add Istio injection kubectl label namespace default istio-injection=enabled --overwrite Enable the istio-ingressgateway component Create namespace istio-ingress kubectl create namespace istio-ingress Create istio-ingress-gw-install.yaml apiVersion : install.istio.io/v1alpha1 kind : IstioOperator metadata : name : istio-ingress-gw-install spec : profile : empty values : gateways : istio-ingressgateway : autoscaleEnabled : false components : ingressGateways : - name : istio-ingressgateway namespace : istio-ingress enabled : true k8s : overlays : - apiVersion : apps/v1 kind : Deployment name : istio-ingressgateway patches : - path : spec.template.spec.containers[name:istio-proxy].lifecycle value :
preStop : exec : command : [ \"sh\" , \"-c\" , \"sleep 5\" ] Apply the latest revision istioctl install -y -n istio-ingress -f istio-ingress-gw-install.yaml --revision 1-14-3 Apply Strict mTLS apiVersion : security.istio.io/v1beta1 kind : PeerAuthentication metadata : name : default namespace : istio-system spec : mtls : mode : STRICT","title":"Istio"},{"location":"setup/kubernetes/addons/istio/#istio-setup","text":"Prerequisites Install istioctl . Install kubectl . Access to a Kubernetes cluster. Create istio-system namespace kubectl create namespace istio-system Install the Istio control plane service istiod apiVersion : v1 kind : Service metadata : labels : app : istiod istio : pilot release : istio name : istiod namespace : istio-system spec : type : ClusterIP ports : - name : grpc-xds port : 15010 - name : https-dns port : 15012 - name : https-webhook port : 443 targetPort : 15017 - name : http-monitoring port : 15014 selector : app : istiod Install the minimal profile and reduce the gateway config. Create the control-plane.yaml file apiVersion : install.istio.io/v1alpha1 kind : IstioOperator metadata : name : control-plane spec : profile : minimal components : pilot : k8s : env : - name : PILOT_FILTER_GATEWAY_CLUSTER_CONFIG value : \"true\" meshConfig : defaultConfig : proxyMetadata : ISTIO_META_DNS_CAPTURE : \"true\" enablePrometheusMerge : true Check the latest version istioctl install -y -n istio-system -f control-plane.yaml --revision 1-14-3 Add Istio injection kubectl label namespace default istio-injection=enabled --overwrite Enable the istio-ingressgateway component Create namespace istio-ingress kubectl create namespace istio-ingress Create istio-ingress-gw-install.yaml apiVersion : install.istio.io/v1alpha1 kind : IstioOperator metadata : name : istio-ingress-gw-install spec : profile : empty values : gateways : istio-ingressgateway : autoscaleEnabled : false components : ingressGateways : - name : istio-ingressgateway namespace : istio-ingress enabled : true k8s : overlays : -
apiVersion : apps/v1 kind : Deployment name : istio-ingressgateway patches : - path : spec.template.spec.containers[name:istio-proxy].lifecycle value : preStop : exec : command : [ \"sh\" , \"-c\" , \"sleep 5\" ] Apply latest revision istioctl install -y -n istio-ingress -f istio-ingress-gw-install.yaml --revision 1-14-3 Apply Strict mTLS apiVersion : security.istio.io/v1beta1 kind : PeerAuthentication metadata : name : default namespace : istio-system spec : mtls : mode : STRICT","title":"Istio Setup"},{"location":"setup/kubernetes/addons/keycloak/","text":"Keycloak Setup Deploy Keycloak in Kubernetes environment. Prerequisites Install kubectl . Access to Kubernetes cluster. Steps Create deployment configuration file: deployment.yaml Deployment Deployment with PostgreSQL apiVersion : apps/v1 kind : Deployment metadata : name : keycloak labels : app : keycloak spec : replicas : 1 selector : matchLabels : app : keycloak template : metadata : labels : app : keycloak spec : containers : - name : keycloak image : jboss/keycloak:12.0.4 env : - name : PROXY_ADDRESS_FORWARDING value : \"true\" - name : KEYCLOAK_USER value : - name : KEYCLOAK_PASSWORD value : - name : KEYCLOAK_FRONTEND_URL value : \"https://keycloak./auth/\" ports : - name : http containerPort : 8080 protocol : TCP Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak username (e.g. admin ) . with your Keycloak password (e.g. admin ) . with your Keycloak host (e.g. my-company.com ) . 
apiVersion : apps/v1 kind : Deployment metadata : name : keycloak labels : app : keycloak spec : replicas : 1 selector : matchLabels : app : keycloak template : metadata : labels : app : keycloak spec : initContainers : - name : wait-db-ready image : busybox:1.28 command : - sh - -c - i=1; while [ $i -le 15 ]; do echo \"Waiting for database creation.\"; sleep 2; i=$((i+1)); done; containers : - name : keycloak image : jboss/keycloak:12.0.4 env : - name : PROXY_ADDRESS_FORWARDING value : \"true\" - name : KEYCLOAK_USER value : - name : KEYCLOAK_PASSWORD value : - name : KEYCLOAK_FRONTEND_URL value : \"https://keycloak./auth/\" - name : DB_VENDOR value : postgres - name : DB_USER value : - name : DB_PASSWORD value : - name : DB_DATABASE value : - name : DB_ADDR value : keycloak-database ports : - name : http containerPort : 8080 protocol : TCP --- apiVersion : apps/v1 kind : Deployment metadata : name : keycloak-database labels : app : keycloak-database spec : replicas : 1 selector : matchLabels : app : keycloak-database template : metadata : labels : app : keycloak-database spec : containers : - name : keycloak-database image : postgres:13 volumeMounts : - name : keycloak-database-data mountPath : /var/lib/postgresql/data env : - name : PGDATA value : \"/var/lib/postgresql/data/pgdata\" - name : POSTGRES_USER value : - name : POSTGRES_PASSWORD value : ports : - name : jdbc containerPort : 5432 protocol : TCP volumes : - name : keycloak-database-data persistentVolumeClaim : claimName : keycloak-database-data --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : keycloak-database-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 2Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak username (e.g. admin ) . with your Keycloak password (e.g. admin ) . with your Keycloak host (e.g. my-company.com ) . with your Keycloak database username (e.g. dbadmin ) . with your Keycloak database password (e.g. dbadmin ) .
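The "Replace Placeholders" steps above can be scripted rather than done by hand. A minimal sketch using sed on a one-line fragment of the deployment; the __HOST__ marker token and the my-company.com value are hypothetical, chosen only to illustrate the substitution:

```shell
# Hypothetical host value -- replace with your real Keycloak host.
KC_HOST="my-company.com"

# A one-line fragment of the deployment with a marker token standing in
# for the placeholder; sed swaps in the real value.
TEMPLATE='value : "https://keycloak.__HOST__/auth/"'
RENDERED=$(printf '%s' "$TEMPLATE" | sed "s/__HOST__/$KC_HOST/")
echo "$RENDERED"
```

The same pattern applied to the full manifest (one sed expression per placeholder) keeps the rendered YAML reproducible across environments.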
Create service configuration file: service.yaml Service Service with PostgreSQL Route (OpenShift) apiVersion : v1 kind : Service metadata : name : keycloak labels : app : keycloak spec : type : ClusterIP ports : - port : 8080 targetPort : http protocol : TCP name : http selector : app : keycloak apiVersion : v1 kind : Service metadata : name : keycloak labels : app : keycloak spec : type : ClusterIP ports : - port : 8080 targetPort : http protocol : TCP name : http selector : app : keycloak --- apiVersion : v1 kind : Service metadata : name : keycloak-database labels : app : keycloak-database spec : type : ClusterIP ports : - port : 5432 targetPort : jdbc protocol : TCP name : jdbc selector : app : keycloak-database apiVersion : v1 kind : Service metadata : name : keycloak labels : app : keycloak spec : type : ClusterIP ports : - port : 8080 targetPort : http protocol : TCP name : http selector : app : keycloak --- apiVersion : v1 kind : Service metadata : name : keycloak-database labels : app : keycloak-database spec : type : ClusterIP ports : - port : 5432 targetPort : jdbc protocol : TCP name : jdbc selector : app : keycloak-database --- kind : Route apiVersion : route.openshift.io/v1 metadata : name : keycloak spec : host : keycloak. to : kind : Service name : keycloak port : targetPort : http tls : termination : edge insecureEdgeTerminationPolicy : Redirect wildcardPolicy : None Replace Placeholders Before deploying, replace the following placeholders: with your OpenShift domain (e.g. apps.sandbox.xxxx.yy.openshiftapps.com ) . Deploy to the Kubernetes Cluster with: kubectl apply -f deployment.yml kubectl apply -f service.yml Open a web browser and go to: https://keycloak.","title":"Keycloak"},{"location":"setup/kubernetes/addons/keycloak/#keycloak-setup","text":"Deploy Keycloak in Kubernetes environment. Prerequisites Install kubectl . 
Access to Kubernetes cluster.","title":"Keycloak Setup"},{"location":"setup/kubernetes/addons/keycloak/#steps","text":"Create deployment configuration file: deployment.yaml Deployment Deployment with PostgreSQL apiVersion : apps/v1 kind : Deployment metadata : name : keycloak labels : app : keycloak spec : replicas : 1 selector : matchLabels : app : keycloak template : metadata : labels : app : keycloak spec : containers : - name : keycloak image : jboss/keycloak:12.0.4 env : - name : PROXY_ADDRESS_FORWARDING value : \"true\" - name : KEYCLOAK_USER value : - name : KEYCLOAK_PASSWORD value : - name : KEYCLOAK_FRONTEND_URL value : \"https://keycloak./auth/\" ports : - name : http containerPort : 8080 protocol : TCP Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak username (e.g. admin ) . with your Keycloak password (e.g. admin ) . with your Keycloak host (e.g. my-company.com ) . apiVersion : apps/v1 kind : Deployment metadata : name : keycloak labels : app : keycloak spec : replicas : 1 selector : matchLabels : app : keycloak template : metadata : labels : app : keycloak spec : initContainers : - name : wait-db-ready image : busybox:1.28 command : - sh - -c - for i in {1..15}; do echo \"Waiting for database creation.\"; sleep 2; done; containers : - name : keycloak image : jboss/keycloak:12.0.4 env : - name : PROXY_ADDRESS_FORWARDING value : \"true\" - name : KEYCLOAK_USER value : - name : KEYCLOAK_PASSWORD value : - name : KEYCLOAK_FRONTEND_URL value : \"https://keycloak./auth/\" - name : DB_VENDOR value : postgres - name : DB_USER value : - name : DB_PASSWORD value : - name : DB_DATABASE value : - name : DB_ADDR value : keycloak-database ports : - name : http containerPort : 8080 protocol : TCP --- apiVersion : apps/v1 kind : Deployment metadata : name : keycloak-database labels : app : keycloak-database spec : replicas : 1 selector : matchLabels : app : keycloak-database template : metadata : labels : app : 
keycloak-database spec : containers : - name : keycloak-database image : postgres:13 volumeMounts : - name : keycloak-database-data mountPath : /var/lib/postgresql/data env : - name : PGDATA value : \"/var/lib/postgresql/data/pgdata\" - name : POSTGRES_USER value : - name : POSTGRES_PASSWORD value : ports : - name : jdbc containerPort : 5432 protocol : TCP volumes : - name : keycloak-database-data persistentVolumeClaim : claimName : keycloak-database-data --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : keycloak-database-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 2Gi Replace Placeholders Before deploying, replace the following placeholders: with your Keycloak username (e.g. admin ) . with your Keycloak password (e.g. admin ) . with your Keycloak host (e.g. my-company.com ) . with your Keycloak database username (e.g. dbadmin ) . with your Keycloak database password (e.g. dbadmin ) . Create service configuration file: service.yaml Service Service with PostgreSQL Route (OpenShift) apiVersion : v1 kind : Service metadata : name : keycloak labels : app : keycloak spec : type : ClusterIP ports : - port : 8080 targetPort : http protocol : TCP name : http selector : app : keycloak apiVersion : v1 kind : Service metadata : name : keycloak labels : app : keycloak spec : type : ClusterIP ports : - port : 8080 targetPort : http protocol : TCP name : http selector : app : keycloak --- apiVersion : v1 kind : Service metadata : name : keycloak-database labels : app : keycloak-database spec : type : ClusterIP ports : - port : 5432 targetPort : jdbc protocol : TCP name : jdbc selector : app : keycloak-database apiVersion : v1 kind : Service metadata : name : keycloak labels : app : keycloak spec : type : ClusterIP ports : - port : 8080 targetPort : http protocol : TCP name : http selector : app : keycloak --- apiVersion : v1 kind : Service metadata : name : keycloak-database labels : app : keycloak-database spec : type : ClusterIP 
ports : - port : 5432 targetPort : jdbc protocol : TCP name : jdbc selector : app : keycloak-database --- kind : Route apiVersion : route.openshift.io/v1 metadata : name : keycloak spec : host : keycloak. to : kind : Service name : keycloak port : targetPort : http tls : termination : edge insecureEdgeTerminationPolicy : Redirect wildcardPolicy : None Replace Placeholders Before deploying, replace the following placeholders: with your OpenShift domain (e.g. apps.sandbox.xxxx.yy.openshiftapps.com ) . Deploy to the Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml Open a web browser and go to: https://keycloak.","title":"Steps"},{"location":"setup/kubernetes/addons/letsencrypt/","text":"Letsencrypt Setup Deploy cert-manager in a Kubernetes environment. Prerequisites Install kubectl . Install Helm . Access to a Kubernetes cluster. Steps Install cert-manager: Add the Jetstack Helm repository: helm repo add jetstack https://charts.jetstack.io Update your local Helm chart repository cache: helm repo update Install cert-manager and the CustomResourceDefinitions : helm install \\ cert-manager jetstack/cert-manager \\ --namespace cert-manager \\ --create-namespace \\ --version v1.9.1 \\ --set installCRDs=true Note Check the current version of the Installation with Helm . Create Cluster Issuer: apiVersion : cert-manager.io/v1alpha2 kind : ClusterIssuer metadata : name : dirigible spec : acme : server : https://acme-v02.api.letsencrypt.org/directory email : privateKeySecretRef : name : dirigible http01 : {} Note Replace the placeholder with a valid email address.
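Once the Certificate resource created in a later step becomes Ready, cert-manager stores a TLS key pair in the dirigible secret. To preview the shape of such a certificate locally, you can generate a throwaway self-signed pair with openssl (the dirigible.example.com common name is a hypothetical domain, and this local pair is purely illustrative, not a substitute for the issued certificate):

```shell
# Generate a short-lived self-signed key pair -- illustrative only;
# cert-manager issues the real one from Let's Encrypt.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout /tmp/tls.key -out /tmp/tls.crt \
  -days 1 -subj "/CN=dirigible.example.com" 2>/dev/null

# Inspect the subject the same way you would inspect the issued cert.
SUBJECT=$(openssl x509 -in /tmp/tls.crt -noout -subject)
echo "$SUBJECT"
```

The same `openssl x509 -noout -subject` inspection works on the certificate cert-manager writes into the secret's tls.crt key.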
Update ClusterIssuer If your ingress is Istio, change the ClusterIssuer and add: solvers: - selector: {} http01: ingress: class: istio Create certificate: apiVersion : cert-manager.io/v1 kind : Certificate metadata : name : dirigible spec : secretName : dirigible issuerRef : name : dirigible kind : ClusterIssuer commonName : \"\" dnsNames : - \"\" Note Replace the placeholder with your domain from the previous step. Add Namespace If your Istio Ingress is installed in the istio-ingress namespace, add namespace: istio-ingress . Create Ingress: Kubernetes Ingress Istio Ingress apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : dirigible http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note You can install Istio with the default profile via istioctl install , which installs istio-ingressgateway and istiod , or you can install them manually: apiVersion : networking.istio.io/v1alpha3 kind : Gateway metadata : name : dirigible-gateway spec : selector : istio : ingressgateway servers : - port : number : 80 name : http protocol : HTTP hosts : - dirigible. # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true - port : number : 443 name : https-443 protocol : HTTPS hosts : - dirigible. tls : mode : SIMPLE credentialName : dirigible Replace the placeholder with your domain from the previous step. Create a Virtual Service for Istio: apiVersion : networking.istio.io/v1beta1 kind : VirtualService metadata : name : dirigible spec : hosts : - \"dirigible.\" gateways : - dirigible-gateway - mesh http : - match : - uri : prefix : / route : - destination : host : dirigible.default.svc.cluster.local port : number : 8080 Replace the placeholder with your domain from the previous step.
Check certificate status in cert-manager: kubectl logs -n cert-manager -lapp=cert-manager","title":"Letsencrypt"},{"location":"setup/kubernetes/addons/letsencrypt/#letsencrypt-setup","text":"Deploy Cert Manager in Kubernetes environment. Prerequisites Install kubectl . Install Helm Access to Kubernetes cluster.","title":"Letsencrypt Setup"},{"location":"setup/kubernetes/addons/letsencrypt/#steps","text":"Install cert-manager: Add Jetstack Helm repository: helm repo add jetstack https://charts.jetstack.io Update your local Helm chart repository cache: helm repo update Intall cert-manager and CustomResourceDefinitions : helm install \\ cert-manager jetstack/cert-manager \\ --namespace cert-manager \\ --create-namespace \\ --version v1.9.1 \\ --set installCRDs=true Note Check the current version of the Installation with Helm . Create Cluster Issuer: apiVersion : cert-manager.io/v1alpha2 kind : ClusterIssuer metadata : name : dirigible spec : acme : server : https://acme-v02.api.letsencrypt.org/directory email : privateKeySecretRef : name : dirigible http01 : {} Note Replace the placeholder with valid email address. Update ClusterIssuer If your ingress is Istio change the ClusterIssuer and add: solvers: - selector: {} http01: ingress: class: istio Create certificate: apiVersion : cert-manager.io/v1 kind : Certificate metadata : name : dirigible spec : secretName : dirigible issuerRef : name : dirigible kind : ClusterIssuer commonName : \"\" dnsNames : - \"\" Note Replace the placeholder with your domain from previous step. Add Namespace If your Istio Ingress is installed to namespace istio-ingress add namespace: istio-ingress . 
Create Ingress: Kubernetes Ingress Istio Ingress apiVersion : networking.k8s.io/v1 kind : Ingress metadata : name : dirigible spec : rules : - host : dirigible http : paths : - path : / pathType : Prefix backend : service : name : dirigible port : number : 8080 Note You can install istio with default profile istioctl install this will install istio-ingressgateway and istiod and you can install manually : apiVersion : networking.istio.io/v1alpha3 kind : Gateway metadata : name : dirigible-gateway spec : selector : istio : ingressgateway servers : - port : number : 80 name : http protocol : HTTP hosts : - dirigible. # Initially it should be commented, then uncomment to enforce https! # tls: # httpsRedirect: true - port : number : 443 name : https-443 protocol : HTTPS hosts : - dirigible. tls : mode : SIMPLE credentialName : dirigible Replace the placeholder with your domain from previous step. Create Virtual Service for Istio: apiVersion : networking.istio.io/v1beta1 kind : VirtualService metadata : name : dirigible spec : hosts : - \"dirigible.\" gateways : - dirigible-gateway - mesh http : - match : - uri : prefix : / route : - destination : host : dirigible.default.svc.cluster.local port : number : 8080 Replace the placeholder with your domain from previous step. Check certificate status in cert-manager: kubectl logs -n cert-manager -lapp=cert-manager","title":"Steps"},{"location":"setup/kubernetes/addons/postgresql/","text":"PostgreSQL Setup Deploy PostgreSQL in Kubernetes environment. Prerequisites Install kubectl . Access to Kubernetes cluster. 
Steps Kubernetes Create deployment configuration file: deployment.yaml Deployment Deployment with PVC apiVersion : apps/v1 kind : Deployment metadata : name : dirigible-database labels : app : dirigible-database spec : replicas : 1 selector : matchLabels : app : dirigible-database template : metadata : labels : app : dirigible-database spec : containers : - name : dirigible-database image : postgres:13 env : - name : PGDATA value : \"/var/lib/postgresql/data/pgdata\" - name : POSTGRES_USER value : - name : POSTGRES_PASSWORD value : ports : - name : jdbc containerPort : 5432 protocol : TCP apiVersion : apps/v1 kind : Deployment metadata : name : dirigible-database labels : app : dirigible-database spec : replicas : 1 selector : matchLabels : app : dirigible-database template : metadata : labels : app : dirigible-database spec : containers : - name : dirigible-database image : postgres:13 volumeMounts : - name : dirigible-database-data mountPath : /var/lib/postgresql/data env : - name : PGDATA value : \"/var/lib/postgresql/data/pgdata\" - name : POSTGRES_USER value : - name : POSTGRES_PASSWORD value : ports : - name : jdbc containerPort : 5432 protocol : TCP volumes : - name : dirigible-database-data persistentVolumeClaim : claimName : dirigible-database-data --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-database-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 2Gi Replace Placeholders Before deploying, replace the following placeholders: with your database username (e.g. admin ) . with your database password (e.g. admin ) .
Create service configuration file: service.yaml Service apiVersion : v1 kind : Service metadata : name : dirigible-database labels : app : dirigible-database spec : type : ClusterIP ports : - port : 5432 targetPort : jdbc protocol : TCP name : jdbc selector : app : dirigible-database Deploy to the Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml GCP Cloud Dirigible PostgreSQL instances Enable API Console gcloud Enable API Cloud SQL Admin API gcloud services enable sqladmin.googleapis.com \\ servicenetworking.googleapis.com Create an instance with a private IP address and SSL enabled Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Click Create instance. Click PostgreSQL. Enter quickstart-instance for Instance ID. Enter a password for the postgres user. Save this password for future use. Click the Single zone option for Choose region and zonal availability. Click and expand Show configuration options. For Machine Type, select Lightweight. In Connections, select Private IP. Select default in the Network drop-down menu. If you see a dialog stating Private services access connection required, click the Set Up Connection button. In the Enable Service Networking API dialog, click the Enable API button. In the Allocate an IP range dialog, select Use an automatically allocated IP range and click Continue. In the Create a connection dialog, click Create Connection. Clear the Public IP checkbox to create an instance only with a private IP. Click Create instance and then wait for the instance to initialize and start. Click Connections. In the Security section, select Allow only SSL connections to enable SSL connections. In the Allow only SSL connections dialog, click Save and then wait for the instance to restart. Creating an instance with a private IP address only requires configuring private services access to enable connections from other Google Cloud services, such as GKE.
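The Cloud SQL proxy used further below addresses the instance by its connection name in the form project:region:instance. A sketch composing the proxy's -instances flag from hypothetical values (my-project, europe-west1, and quickstart-instance are illustrative placeholders):

```shell
# Hypothetical project, region, and instance names -- substitute the
# values from your own Cloud SQL instance.
PROJECT_ID="my-project"
REGION="europe-west1"
INSTANCE="quickstart-instance"

# Connection name format expected by the proxy: project:region:instance,
# followed by the local TCP port it should listen on (5432 for Postgres).
INSTANCES_FLAG="-instances=$PROJECT_ID:$REGION:$INSTANCE=tcp:5432"
echo "$INSTANCES_FLAG"
```

You can also read the connection name directly from the instance with `gcloud sql instances describe` rather than composing it by hand.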
gcloud compute addresses create google-managed-services-default \\ --global \\ --purpose=VPC_PEERING \\ --prefix-length=16 \\ --description=\"peering range for Google\" \\ --network=default Run the gcloud services vpc-peerings connect command to create the private services access connection: gcloud services vpc-peerings connect \\ --service=servicenetworking.googleapis.com \\ --ranges=google-managed-services-default \\ --network=default Create PostgreSQL instance gcloud beta sql instances create YOUR_DIRIGIBLE_SQL_INSTANCE \\ --database-version=POSTGRES_13 \\ --cpu=1 \\ --memory=4GB \\ --region= \\ --root-password='DB_ROOT_PASSWORD' \\ --no-assign-ip \\ --network=default Run the gcloud sql instances patch command to allow only SSL connections for the instance. gcloud sql instances patch YOUR_DIRIGIBLE_SQL_INSTANCE --require-ssl Create Dirigible database Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Select quickstart-instance. Open the Databases tab. Click Create database. In the New database dialog box, enter quickstart_db as the name of the database. Click Create. gcloud sql databases create YOUR_DIRIGIBLE_DB_NAME \\ --instance=YOUR_DIRIGIBLE_SQL_INSTANCE Create Dirigible user Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances To open the Overview page of an instance, click the instance name. Select Users from the SQL navigation menu. Click Add user account. In the Add a user account to instance instance_name page, add the following information: Username: Set to quickstart-user Password: Specify a password for your database user. Make a note of this for use in a later step of this quickstart. Click Add. 
gcloud sql users create YOUR_DIRIGIBLE_DB_USER \\ --instance=YOUR_DIRIGIBLE_SQL_INSTANCE \\ --password='DB_PASS' Create a new service account Run the gcloud iam service-accounts create command as follows to create a new service account: gcloud iam service-accounts create YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT \\ --display-name=\"GKE Quickstart Service Account\" Run the gcloud projects add-iam-policy-binding command as follows to add the Cloud SQL Client role to the Google Cloud service account you just created. Replace YOUR_PROJECT_ID with the project ID. gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \\ --member=\"serviceAccount:YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com\" \\ --role=\"roles/cloudsql.client\" Enable Workload Identity Grant access to Cloud SQL by binding the Kubernetes service account to the Google Cloud service account using Workload Identity. gcloud container clusters update CLUSTER_NAME \\ --region=COMPUTE_REGION \\ --workload-pool=PROJECT_ID.svc.id.goog gcloud container node-pools update YOUR_NODE_GKE_CLUSTER_NODE_POOL \\ --cluster=YOUR_GKE_CLUSTER_NAME \\ --workload-metadata=GKE_METADATA Create a Kubernetes Service Account kubectl create sa YOUR_KUBERNETES_SERVICE_ACCOUNT Enable IAM binding Run the gcloud iam service-accounts add-iam-policy-binding command as follows to enable IAM binding of the Google Cloud Service Account and the Kubernetes Service Account. Make the following replacements: gcloud iam service-accounts add-iam-policy-binding \\ --role=\"roles/iam.workloadIdentityUser\" \\ --member=\"serviceAccount:YOUR_PROJECT_ID.svc.id.goog[YOUR_K8S_NAMESPACE/YOUR_KSA_NAME]\" \\ YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Annotate the Kubernetes Service Account Run the kubectl annotate command as follows to annotate the Kubernetes Service Account with IAM binding.
Make the following replacements: kubectl annotate serviceaccount \\ YOUR_KSA_NAME \\ iam.gke.io/gcp-service-account=YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Use your new service account in your deployment spec : serviceAccountName : YOUR_SERVICE_ACCOUNT_NAME Configure secrets kubectl create secret generic YOUR_DIRIGIBLE_SECRET_NAME \\ --from-literal=database=YOUR_DIRIGIBLE_DATABASE \\ --from-literal=username=YOUR_DIRIGIBLE_USERNAME \\ --from-literal=password=DB_PASS \\ --from-literal=postgre_url=jdbc:postgresql://127.0.0.1:5432/YOUR_DIRIGIBLE_DATABASE Deploy the app that connects to your Cloud SQL instance env : - name : POSTGRE_URL valueFrom : secretKeyRef : name : YOUR_DIRIGIBLE_SECRET_NAME key : postgre_url - name : POSTGRE_DRIVER value : org.postgresql.Driver - name : POSTGRE_USERNAME valueFrom : secretKeyRef : name : YOUR_DIRIGIBLE_SECRET_NAME key : username - name : POSTGRE_PASSWORD valueFrom : secretKeyRef : name : YOUR_DIRIGIBLE_SECRET_NAME key : password - name : cloud-sql-proxy # This uses the latest version of the Cloud SQL proxy # It is recommended to use a specific version for production environments. # See: https://github.com/GoogleCloudPlatform/cloudsql-proxy image : gcr.io/cloudsql-docker/gce-proxy:latest command : - \"/cloud_sql_proxy\" # If connecting from a VPC-native GKE cluster, you can use the # following flag to have the proxy connect over private IP - \"-ip_address_types=PRIVATE\" # tcp should be set to the port the proxy should listen on # and should match the DB_PORT value set above. # Defaults: MySQL: 3306, Postgres: 5432, SQLServer: 1433 - \"-instances=::=tcp:5432\" securityContext : # The default Cloud SQL proxy image runs as the # \"nonroot\" user and group (uid: 65532) by default.
runAsNonRoot : true GCP Cloud Keycloak PostgreSQL instances Enable API Console gcloud Enable API Cloud SQL Admin API gcloud services enable sqladmin.googleapis.com \\ servicenetworking.googleapis.com Create an instance with a private IP address and SSL enabled Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Click Create instance. Click PostgreSQL. Enter quickstart-instance for Instance ID. Enter a password for the postgres user. Save this password for future use. Click the Single zone option for Choose region and zonal availability. Click and expand Show configuration options. For Machine Type, select Lightweight. In Connections, select Private IP. Select default in the Network drop-down menu. If you see a dialog stating Private services access connection required, click the Set Up Connection button. In the Enable Service Networking API dialog, click the Enable API button. In the Allocate an IP range dialog, select Use an automatically allocated IP range and click Continue. In the Create a connection dialog, click Create Connection. Clear the Public IP checkbox to create an instance only with a private IP. Click Create instance and then wait for the instance to initialize and start. Click Connections. In the Security section, select Allow only SSL connections to enable SSL connections. In the Allow only SSL connections dialog, click Save and then wait for the instance to restart. Creating an instance with a private IP address only requires configuring private services access to enable connections from other Google Cloud services, such as GKE. 
gcloud compute addresses create google-managed-services-default \\ --global \\ --purpose=VPC_PEERING \\ --prefix-length=16 \\ --description=\"peering range for Google\" \\ --network=default Run the gcloud services vpc-peerings connect command to create the private services access connection: gcloud services vpc-peerings connect \\ --service=servicenetworking.googleapis.com \\ --ranges=google-managed-services-default \\ --network=default Create Keycloak PostgreSQL instance gcloud beta sql instances create YOUR_KEYCLOAK_INSTANCE \\ --database-version=POSTGRES_13 \\ --cpu=1 \\ --memory=4GB \\ --region= \\ --root-password='DB_ROOT_PASSWORD' \\ --no-assign-ip \\ --network=default Run the gcloud sql instances patch command to allow only SSL connections for the instance. gcloud sql instances patch YOUR_KEYCLOAK_INSTANCE --require-ssl Create Keycloak database Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Select quickstart-instance. Open the Databases tab. Click Create database. In the New database dialog box, enter quickstart_db as the name of the database. Click Create. gcloud sql databases create YOUR_KEYCLOAK_DB \\ --instance=YOUR_KEYCLOAK_INSTANCE Create Keycloak user Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances To open the Overview page of an instance, click the instance name. Select Users from the SQL navigation menu. Click Add user account. In the Add a user account to instance instance_name page, add the following information: Username: Set to quickstart-user Password: Specify a password for your database user. Make a note of this for use in a later step of this quickstart. Click Add. 
gcloud sql users create YOUR_KEYCLOAK_USER \\ --instance=YOUR_KEYCLOAK_INSTANCE \\ --password='DB_PASS' Create new service account Run the gcloud iam service-accounts create command as follows to create a new service account: gcloud iam service-accounts create YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT \\ --display-name=\"GKE Keycloak Service Account\" Run the gcloud projects add-iam-policy-binding command as follows to add the Cloud SQL Client role to the Google Cloud service account you just created. Replace YOUR_PROJECT_ID with the project ID. gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \\ --member=\"serviceAccount:YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com\" \\ --role=\"roles/cloudsql.client\" Enable Workload Identity Grant access to Cloud SQL by binding the Kubernetes Service Account to the Google Cloud service account using Workload Identity. gcloud container clusters update CLUSTER_NAME \\ --region=COMPUTE_REGION \\ --workload-pool=PROJECT_ID.svc.id.goog Update the node pool if it has not been updated yet: gcloud container node-pools update YOUR_NODE_GKE_CLUSTER_NODE_POOL \\ --cluster=YOUR_CLUSTER \\ --workload-metadata=GKE_METADATA Create a Kubernetes Service Account kubectl create sa YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME Enable IAM binding Run the gcloud iam service-accounts add-iam-policy-binding command as follows to enable IAM binding of the Google Cloud Service Account and the Kubernetes Service Account. Make the following replacements: gcloud iam service-accounts add-iam-policy-binding \\ --role=\"roles/iam.workloadIdentityUser\" \\ --member=\"serviceAccount:YOUR_PROJECT_ID.svc.id.goog[YOUR_K8S_NAMESPACE/YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME]\" \\ YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Annotate the Kubernetes Service Account Run the kubectl annotate command as follows to annotate the Kubernetes Service Account with IAM binding. 
Make the following replacements: kubectl annotate serviceaccount \\ YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME \\ iam.gke.io/gcp-service-account=YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Use your new account in your deployment spec : serviceAccountName : YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME Configure secrets kubectl create secret generic YOUR_KEYCLOAK_SECRET_NAME \\ --from-literal=database=YOUR_KEYCLOAK_DB_NAME \\ --from-literal=username=YOUR_KEYCLOAK_USER_NAME \\ --from-literal=password=YOUR_KEYCLOAK_DB_PASS \\ --from-literal=postgre_url=jdbc:postgresql://127.0.0.1:5432/YOUR_KEYCLOAK_DB_NAME Set the environment variables in the Keycloak deployment. env : - name : DB_VENDOR value : postgres - name : DB_USER valueFrom : secretKeyRef : name : YOUR_KEYCLOAK_SECRET_NAME key : username - name : DB_PASSWORD valueFrom : secretKeyRef : name : YOUR_KEYCLOAK_SECRET_NAME key : password - name : DB_DATABASE valueFrom : secretKeyRef : name : YOUR_KEYCLOAK_SECRET_NAME key : database - name : DB_ADDR value : 127.0.0.1 - name : cloud-sql-proxy # This uses the latest version of the Cloud SQL proxy # It is recommended to use a specific version for production environments. # See: https://github.com/GoogleCloudPlatform/cloudsql-proxy image : gcr.io/cloudsql-docker/gce-proxy:latest command : - \"/cloud_sql_proxy\" # If connecting from a VPC-native GKE cluster, you can use the # following flag to have the proxy connect over private IP - \"-ip_address_types=PRIVATE\" # tcp should be set to the port the proxy should listen on # and should match the DB_PORT value set above. # Defaults: MySQL: 3306, Postgres: 5432, SQLServer: 1433 - \"-instances==tcp:5432\" securityContext : # The default Cloud SQL proxy image runs as the # \"nonroot\" user and group (uid: 65532) by default. runAsNonRoot : true","title":"PostgreSQL"},{"location":"setup/kubernetes/addons/postgresql/#postgresql-setup","text":"Deploy PostgreSQL in a Kubernetes environment. Prerequisites Install kubectl . 
Access to a Kubernetes cluster.","title":"PostgreSQL Setup"},{"location":"setup/kubernetes/addons/postgresql/#steps","text":"","title":"Steps"},{"location":"setup/kubernetes/addons/postgresql/#kubernetes","text":"Create deployment configuration file: deployment.yaml Deployment Deployment with PVC apiVersion : apps/v1 kind : Deployment metadata : name : dirigible-database labels : app : dirigible-database spec : replicas : 1 selector : matchLabels : app : dirigible-database template : metadata : labels : app : dirigible-database spec : containers : - name : dirigible-database image : postgres:13 env : - name : PGDATA value : \"/var/lib/postgresql/data/pgdata\" - name : POSTGRES_USER value : - name : POSTGRES_PASSWORD value : ports : - name : jdbc containerPort : 5432 protocol : TCP apiVersion : apps/v1 kind : Deployment metadata : name : dirigible-database labels : app : dirigible-database spec : replicas : 1 selector : matchLabels : app : dirigible-database template : metadata : labels : app : dirigible-database spec : containers : - name : dirigible-database image : postgres:13 volumeMounts : - name : dirigible-database-data mountPath : /var/lib/postgresql/data env : - name : PGDATA value : \"/var/lib/postgresql/data/pgdata\" - name : POSTGRES_USER value : - name : POSTGRES_PASSWORD value : ports : - name : jdbc containerPort : 5432 protocol : TCP volumes : - name : dirigible-database-data persistentVolumeClaim : claimName : dirigible-database-data --- apiVersion : v1 kind : PersistentVolumeClaim metadata : name : dirigible-database-data spec : accessModes : - ReadWriteOnce resources : requests : storage : 2Gi Replace Placeholders Before deploying, replace the following placeholders: the POSTGRES_USER value with your database username (e.g. admin ) . the POSTGRES_PASSWORD value with your database password (e.g. admin ) . 
Create service configuration file: service.yaml Service apiVersion : v1 kind : Service metadata : name : dirigible-database labels : app : dirigible-database spec : type : ClusterIP ports : - port : 5432 targetPort : jdbc protocol : TCP name : jdbc selector : app : dirigible-database Deploy to the Kubernetes Cluster with: kubectl apply -f deployment.yaml kubectl apply -f service.yaml","title":"Kubernetes"},{"location":"setup/kubernetes/addons/postgresql/#gcp-cloud-dirigible-postgresql-instances","text":"Enable API Console gcloud Enable API Cloud SQL Admin API gcloud services enable sqladmin.googleapis.com \\ servicenetworking.googleapis.com Create an instance with a private IP address and SSL enabled Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Click Create instance. Click PostgreSQL. Enter quickstart-instance for Instance ID. Enter a password for the postgres user. Save this password for future use. Click the Single zone option for Choose region and zonal availability. Click and expand Show configuration options. For Machine Type, select Lightweight. In Connections, select Private IP. Select default in the Network drop-down menu. If you see a dialog stating Private services access connection required, click the Set Up Connection button. In the Enable Service Networking API dialog, click the Enable API button. In the Allocate an IP range dialog, select Use an automatically allocated IP range and click Continue. In the Create a connection dialog, click Create Connection. Clear the Public IP checkbox to create an instance only with a private IP. Click Create instance and then wait for the instance to initialize and start. Click Connections. In the Security section, select Allow only SSL connections to enable SSL connections. In the Allow only SSL connections dialog, click Save and then wait for the instance to restart. 
Creating an instance with a private IP address only requires configuring private services access to enable connections from other Google Cloud services, such as GKE. gcloud compute addresses create google-managed-services-default \\ --global \\ --purpose=VPC_PEERING \\ --prefix-length=16 \\ --description=\"peering range for Google\" \\ --network=default Run the gcloud services vpc-peerings connect command to create the private services access connection: gcloud services vpc-peerings connect \\ --service=servicenetworking.googleapis.com \\ --ranges=google-managed-services-default \\ --network=default Create PostgreSQL instance gcloud beta sql instances create YOUR_DIRIGIBLE_SQL_INSTANCE \\ --database-version=POSTGRES_13 \\ --cpu=1 \\ --memory=4GB \\ --region= \\ --root-password='DB_ROOT_PASSWORD' \\ --no-assign-ip \\ --network=default Run the gcloud sql instances patch command to allow only SSL connections for the instance. gcloud sql instances patch YOUR_DIRIGIBLE_SQL_INSTANCE --require-ssl Create Dirigible database Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Select quickstart-instance. Open the Databases tab. Click Create database. In the New database dialog box, enter quickstart_db as the name of the database. Click Create. gcloud sql databases create YOUR_DIRIGIBLE_DB_NAME \\ --instance=YOUR_DIRIGIBLE_SQL_INSTANCE Create Dirigible user Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances To open the Overview page of an instance, click the instance name. Select Users from the SQL navigation menu. Click Add user account. In the Add a user account to instance instance_name page, add the following information: Username: Set to quickstart-user Password: Specify a password for your database user. Make a note of this for use in a later step of this quickstart. Click Add. 
gcloud sql users create YOUR_DIRIGIBLE_DB_USER \\ --instance=YOUR_DIRIGIBLE_SQL_INSTANCE \\ --password='DB_PASS' Create new service account Run the gcloud iam service-accounts create command as follows to create a new service account: gcloud iam service-accounts create YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT \\ --display-name=\"GKE Quickstart Service Account\" Run the gcloud projects add-iam-policy-binding command as follows to add the Cloud SQL Client role to the Google Cloud service account you just created. Replace YOUR_PROJECT_ID with the project ID. gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \\ --member=\"serviceAccount:YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com\" \\ --role=\"roles/cloudsql.client\" Enable Workload Identity Grant access to Cloud SQL by binding the Kubernetes Service Account to the Google Cloud service account using Workload Identity. gcloud container clusters update CLUSTER_NAME \\ --region=COMPUTE_REGION \\ --workload-pool=PROJECT_ID.svc.id.goog gcloud container node-pools update YOUR_NODE_GKE_CLUSTER_NODE_POOL \\ --cluster=YOUR_GKE_CLUSTER_NAME \\ --workload-metadata=GKE_METADATA Create a Kubernetes Service Account kubectl create sa YOUR_KUBERNETES_SERVICE_ACCOUNT Enable IAM binding Run the gcloud iam service-accounts add-iam-policy-binding command as follows to enable IAM binding of the Google Cloud Service Account and the Kubernetes Service Account. Make the following replacements: gcloud iam service-accounts add-iam-policy-binding \\ --role=\"roles/iam.workloadIdentityUser\" \\ --member=\"serviceAccount:YOUR_PROJECT_ID.svc.id.goog[YOUR_K8S_NAMESPACE/YOUR_KSA_NAME]\" \\ YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Annotate the Kubernetes Service Account Run the kubectl annotate command as follows to annotate the Kubernetes Service Account with IAM binding. 
Make the following replacements: kubectl annotate serviceaccount \\ YOUR_KSA_NAME \\ iam.gke.io/gcp-service-account=YOUR_DIRIGIBLE_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Use your new account in your deployment spec : serviceAccountName : YOUR_SERVICE_ACCOUNT_NAME Configure secrets kubectl create secret generic YOUR_DIRIGIBLE_SECRET_NAME \\ --from-literal=database=YOUR_DIRIGIBLE_DATABASE \\ --from-literal=username=YOUR_DIRIGIBLE_USERNAME \\ --from-literal=password=DB_PASS \\ --from-literal=postgre_url=jdbc:postgresql://127.0.0.1:5432/YOUR_DIRIGIBLE_DATABASE Deploy the app that connects to your Cloud SQL instance env : - name : POSTGRE_URL valueFrom : secretKeyRef : name : YOUR_DIRIGIBLE_SECRET_NAME key : postgre_url - name : POSTGRE_DRIVER value : org.postgresql.Driver - name : POSTGRE_USERNAME valueFrom : secretKeyRef : name : YOUR_DIRIGIBLE_SECRET_NAME key : username - name : POSTGRE_PASSWORD valueFrom : secretKeyRef : name : YOUR_DIRIGIBLE_SECRET_NAME key : password - name : cloud-sql-proxy # This uses the latest version of the Cloud SQL proxy # It is recommended to use a specific version for production environments. # See: https://github.com/GoogleCloudPlatform/cloudsql-proxy image : gcr.io/cloudsql-docker/gce-proxy:latest command : - \"/cloud_sql_proxy\" # If connecting from a VPC-native GKE cluster, you can use the # following flag to have the proxy connect over private IP - \"-ip_address_types=PRIVATE\" # tcp should be set to the port the proxy should listen on # and should match the DB_PORT value set above. # Defaults: MySQL: 3306, Postgres: 5432, SQLServer: 1433 - \"-instances=::=tcp:5432\" securityContext : # The default Cloud SQL proxy image runs as the # \"nonroot\" user and group (uid: 65532) by default. 
runAsNonRoot : true","title":"GCP Cloud Dirigible PostgreSQL instances"},{"location":"setup/kubernetes/addons/postgresql/#gcp-cloud-keycloak-postgresql-instances","text":"Enable API Console gcloud Enable API Cloud SQL Admin API gcloud services enable sqladmin.googleapis.com \\ servicenetworking.googleapis.com Create an instance with a private IP address and SSL enabled Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Click Create instance. Click PostgreSQL. Enter quickstart-instance for Instance ID. Enter a password for the postgres user. Save this password for future use. Click the Single zone option for Choose region and zonal availability. Click and expand Show configuration options. For Machine Type, select Lightweight. In Connections, select Private IP. Select default in the Network drop-down menu. If you see a dialog stating Private services access connection required, click the Set Up Connection button. In the Enable Service Networking API dialog, click the Enable API button. In the Allocate an IP range dialog, select Use an automatically allocated IP range and click Continue. In the Create a connection dialog, click Create Connection. Clear the Public IP checkbox to create an instance only with a private IP. Click Create instance and then wait for the instance to initialize and start. Click Connections. In the Security section, select Allow only SSL connections to enable SSL connections. In the Allow only SSL connections dialog, click Save and then wait for the instance to restart. Creating an instance with a private IP address only requires configuring private services access to enable connections from other Google Cloud services, such as GKE. 
gcloud compute addresses create google-managed-services-default \\ --global \\ --purpose=VPC_PEERING \\ --prefix-length=16 \\ --description=\"peering range for Google\" \\ --network=default Run the gcloud services vpc-peerings connect command to create the private services access connection: gcloud services vpc-peerings connect \\ --service=servicenetworking.googleapis.com \\ --ranges=google-managed-services-default \\ --network=default Create Keycloak PostgreSQL instance gcloud beta sql instances create YOUR_KEYCLOAK_INSTANCE \\ --database-version=POSTGRES_13 \\ --cpu=1 \\ --memory=4GB \\ --region= \\ --root-password='DB_ROOT_PASSWORD' \\ --no-assign-ip \\ --network=default Run the gcloud sql instances patch command to allow only SSL connections for the instance. gcloud sql instances patch YOUR_KEYCLOAK_INSTANCE --require-ssl Create Keycloak database Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances Select quickstart-instance. Open the Databases tab. Click Create database. In the New database dialog box, enter quickstart_db as the name of the database. Click Create. gcloud sql databases create YOUR_KEYCLOAK_DB \\ --instance=YOUR_KEYCLOAK_INSTANCE Create Keycloak user Console gcloud In the Google Cloud console, go to the Cloud SQL Instances page. Go to Cloud SQL Instances To open the Overview page of an instance, click the instance name. Select Users from the SQL navigation menu. Click Add user account. In the Add a user account to instance instance_name page, add the following information: Username: Set to quickstart-user Password: Specify a password for your database user. Make a note of this for use in a later step of this quickstart. Click Add. 
gcloud sql users create YOUR_KEYCLOAK_USER \\ --instance=YOUR_KEYCLOAK_INSTANCE \\ --password='DB_PASS' Create new service account Run the gcloud iam service-accounts create command as follows to create a new service account: gcloud iam service-accounts create YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT \\ --display-name=\"GKE Keycloak Service Account\" Run the gcloud projects add-iam-policy-binding command as follows to add the Cloud SQL Client role to the Google Cloud service account you just created. Replace YOUR_PROJECT_ID with the project ID. gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \\ --member=\"serviceAccount:YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com\" \\ --role=\"roles/cloudsql.client\" Enable Workload Identity Grant access to Cloud SQL by binding the Kubernetes Service Account to the Google Cloud service account using Workload Identity. gcloud container clusters update CLUSTER_NAME \\ --region=COMPUTE_REGION \\ --workload-pool=PROJECT_ID.svc.id.goog Update the node pool if it has not been updated yet: gcloud container node-pools update YOUR_NODE_GKE_CLUSTER_NODE_POOL \\ --cluster=YOUR_CLUSTER \\ --workload-metadata=GKE_METADATA Create a Kubernetes Service Account kubectl create sa YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME Enable IAM binding Run the gcloud iam service-accounts add-iam-policy-binding command as follows to enable IAM binding of the Google Cloud Service Account and the Kubernetes Service Account. Make the following replacements: gcloud iam service-accounts add-iam-policy-binding \\ --role=\"roles/iam.workloadIdentityUser\" \\ --member=\"serviceAccount:YOUR_PROJECT_ID.svc.id.goog[YOUR_K8S_NAMESPACE/YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME]\" \\ YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Annotate the Kubernetes Service Account Run the kubectl annotate command as follows to annotate the Kubernetes Service Account with IAM binding. 
Make the following replacements: kubectl annotate serviceaccount \\ YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME \\ iam.gke.io/gcp-service-account=YOUR_KEYCLOAK_GOOGLE_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com Use your new account in your deployment spec : serviceAccountName : YOUR_KUBERNETES_SERVICE_ACCOUNT_NAME Configure secrets kubectl create secret generic YOUR_KEYCLOAK_SECRET_NAME \\ --from-literal=database=YOUR_KEYCLOAK_DB_NAME \\ --from-literal=username=YOUR_KEYCLOAK_USER_NAME \\ --from-literal=password=YOUR_KEYCLOAK_DB_PASS \\ --from-literal=postgre_url=jdbc:postgresql://127.0.0.1:5432/YOUR_KEYCLOAK_DB_NAME Set the environment variables in the Keycloak deployment. env : - name : DB_VENDOR value : postgres - name : DB_USER valueFrom : secretKeyRef : name : YOUR_KEYCLOAK_SECRET_NAME key : username - name : DB_PASSWORD valueFrom : secretKeyRef : name : YOUR_KEYCLOAK_SECRET_NAME key : password - name : DB_DATABASE valueFrom : secretKeyRef : name : YOUR_KEYCLOAK_SECRET_NAME key : database - name : DB_ADDR value : 127.0.0.1 - name : cloud-sql-proxy # This uses the latest version of the Cloud SQL proxy # It is recommended to use a specific version for production environments. # See: https://github.com/GoogleCloudPlatform/cloudsql-proxy image : gcr.io/cloudsql-docker/gce-proxy:latest command : - \"/cloud_sql_proxy\" # If connecting from a VPC-native GKE cluster, you can use the # following flag to have the proxy connect over private IP - \"-ip_address_types=PRIVATE\" # tcp should be set to the port the proxy should listen on # and should match the DB_PORT value set above. # Defaults: MySQL: 3306, Postgres: 5432, SQLServer: 1433 - \"-instances==tcp:5432\" securityContext : # The default Cloud SQL proxy image runs as the # \"nonroot\" user and group (uid: 65532) by default. 
runAsNonRoot : true","title":"GCP Cloud Keycloak PostgreSQL instances"},{"location":"tutorials/application-development/file-upload/","text":"File Upload Overview This sample shows how to create a simple web application for uploading files. Steps Create a project named file-upload-project . Right click on the file-upload-project project and select New \u2192 File . Enter tsconfig.json for the name of the File. Replace the content with the following: { \"compilerOptions\" : { \"module\" : \"ESNext\" } } Right click on the file-upload-project project and select New \u2192 File . Enter project.json for the name of the File. Replace the content with the following: { \"guid\" : \"file-upload-project\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } TypeScript Compilation The tsconfig.json and project.json files are needed for the compilation of the TypeScript files. In order to run the compilation a Publish action should be performed on the Project level (right click on the project and select Publish ) . Right click on the file-upload-project project and select New \u2192 TypeScript Service . Enter service.ts for the name of the TypeScript Service. Replace the content with the following code: import { upload , request , response } from \"sdk/http\" ; import { cmis } from \"sdk/cms\" ; import { streams } from \"sdk/io\" ; if ( request . getMethod () === \"POST\" ) { if ( upload . isMultipartContent ()) { const fileItems = upload . parseRequest (); for ( let i = 0 ; i < fileItems . size (); i ++ ) { const fileItem = fileItems . get ( i ); const fileName = fileItem . getName (); const contentType = fileItem . getContentType (); const bytes = fileItem . getBytes (); const inputStream = streams . createByteArrayInputStream ( bytes ); const cmisSession = cmis . getSession (); const contentStream = cmisSession . 
getObjectFactory (). createContentStream ( fileName , bytes . length , contentType , inputStream ); cmisSession . createDocument ( \"file-upload-project/uploads\" , { [ cmis . OBJECT_TYPE_ID ] : cmis . OBJECT_TYPE_DOCUMENT , [ cmis . NAME ] : fileName }, contentStream , cmis . VERSIONING_STATE_MAJOR ); } response . sendRedirect ( \"/services/web/ide-documents/\" ); } else { response . println ( \"The request's content must be 'multipart'\" ); } } else { response . println ( \"Use POST request.\" ); } response . flush (); response . close (); Save & Publish In order to run the compilation of TypeScript files a Publish action should be performed on the Project level (right click on the project and select Publish ) . http/upload Take a look at the http/upload documentation for more details about the API. Right click on the file-upload-project project and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Replace the content with the following code: < html > < body > < form action = \"/services/ts/file-upload-project/service.ts\" method = \"post\" enctype = \"multipart/form-data\" > < label for = \"file\" > Filename: < input type = \"file\" name = \"file\" id = \"file\" multiple > < br > < input type = \"submit\" name = \"submit\" value = \"Submit\" > < p >< b > Note: After successful upload you'll be redirected to the < a href = \"/services/web/ide-documents/\" > Documents perspective where the file can be found under the < b > file-upload-project/uploads folder. Save & Publish Saving the files will trigger a Publish action, which will build and deploy the TypeScript Service and the HTML5 Page . Select the index.html file and open the Preview view to test the file upload. Summary Tutorial Completed After completing the steps in this tutorial, you would have: HTML page to submit the uploaded file to the TypeScript service. Backend TypeScript service that stores the uploaded file. 
Note: The complete content of the File Upload tutorial is available at: https://github.com/dirigiblelabs/tutorial-file-upload-project","title":"File Upload"},{"location":"tutorials/application-development/file-upload/#file-upload","text":"","title":"File Upload"},{"location":"tutorials/application-development/file-upload/#overview","text":"This sample shows how to create a simple web application for uploading files.","title":"Overview"},{"location":"tutorials/application-development/file-upload/#steps","text":"Create a project named file-upload-project . Right click on the file-upload-project project and select New \u2192 File . Enter tsconfig.json for the name of the File. Replace the content with the following: { \"compilerOptions\" : { \"module\" : \"ESNext\" } } Right click on the file-upload-project project and select New \u2192 File . Enter project.json for the name of the File. Replace the content with the following: { \"guid\" : \"file-upload-project\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } TypeScript Compilation The tsconfig.json and project.json files are needed for the compilation of the TypeScript files. In order to run the compilation a Publish action should be performed on the Project level (right click on the project and select Publish ) . Right click on the file-upload-project project and select New \u2192 TypeScript Service . Enter service.ts for the name of the TypeScript Service. Replace the content with the following code: import { upload , request , response } from \"sdk/http\" ; import { cmis } from \"sdk/cms\" ; import { streams } from \"sdk/io\" ; if ( request . getMethod () === \"POST\" ) { if ( upload . isMultipartContent ()) { const fileItems = upload . parseRequest (); for ( let i = 0 ; i < fileItems . size (); i ++ ) { const fileItem = fileItems . 
get ( i ); const fileName = fileItem . getName (); const contentType = fileItem . getContentType (); const bytes = fileItem . getBytes (); const inputStream = streams . createByteArrayInputStream ( bytes ); const cmisSession = cmis . getSession (); const contentStream = cmisSession . getObjectFactory (). createContentStream ( fileName , bytes . length , contentType , inputStream ); cmisSession . createDocument ( \"file-upload-project/uploads\" , { [ cmis . OBJECT_TYPE_ID ] : cmis . OBJECT_TYPE_DOCUMENT , [ cmis . NAME ] : fileName }, contentStream , cmis . VERSIONING_STATE_MAJOR ); } response . sendRedirect ( \"/services/web/ide-documents/\" ); } else { response . println ( \"The request's content must be 'multipart'\" ); } } else { response . println ( \"Use POST request.\" ); } response . flush (); response . close (); Save & Publish In order to run the compilation of TypeScript files a Publish action should be performed on the Project level (right click on the project and select Publish ) . http/upload Take a look at the http/upload documentation for more details about the API. Right click on the file-upload-project project and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Replace the content with the following code: < html > < body > < form action = \"/services/ts/file-upload-project/service.ts\" method = \"post\" enctype = \"multipart/form-data\" > < label for = \"file\" > Filename: < input type = \"file\" name = \"file\" id = \"file\" multiple > < br > < input type = \"submit\" name = \"submit\" value = \"Submit\" > < p >< b > Note: After successful upload you'll be redirected to the < a href = \"/services/web/ide-documents/\" > Documents perspective where the file can be found under the < b > file-upload-project/uploads folder. Save & Publish Saving the files will trigger a Publish action, which will build and deploy the TypeScript Service and the HTML5 Page . 
Select the index.html file and open the Preview view to test the file upload.","title":"Steps"},{"location":"tutorials/application-development/file-upload/#summary","text":"Tutorial Completed After completing the steps in this tutorial, you would have: HTML page to submit the uploaded file to the TypeScript service. Backend TypeScript service that stores the uploaded file. Note: The complete content of the File Upload tutorial is available at: https://github.com/dirigiblelabs/tutorial-file-upload-project","title":"Summary"},{"location":"tutorials/application-development/kafka/","text":"Kafka Producer and Consumer Prerequisites Run a local Kafka server following the steps (1 and 2) from here: https://kafka.apache.org/quickstart Steps Create a project kafka_project Then create a JavaScript service named my_kafka_handler.js Replace the service code with the following content: Handler exports . onMessage = function ( message ) { console . log ( \"Hello from My Kafka Listener! Message: \" + message ); }; exports . onError = function ( error ) { console . error ( \"Error from My Kafka Listener! Error: \" + error ); }; Then create a Kafka Consumer named my_kafka_consumer.js Replace the file content with the following code: var consumer = require ( \"kafka/consumer\" ); consumer . topic ( \"topic1\" , \"{}\" ). startListening ( \"kafka_project/my_kafka_handler\" , 1000 ); Then create another back-end service which will play the role of a trigger my_kafka_producer.js Replace the trigger content with the following code: var producer = require ( \"kafka/producer\" ); producer . topic ( \"topic1\" , \"{}\" ). send ( \"key1\" , \"value1\" ); Publish the project Select the my_kafka_producer.js file in the Workspace view to be able to trigger the invocation of this service via the Preview view In the Console view you should see the following lines: 2020-11-01 23:33:54.272 [INFO ] [Thread-275] o.e.dirigible.api.v3.core.Console - Hello from My Kafka Listener! 
Message: {\"topic\":\"topic1\",\"partition\":0,\"offset\":29,\"timestamp\":1604266434251,\"timestampType\":\"CREATE_TIME\",\"serializedKeySize\":4,\"serializedValueSize\":6,\"headers\":{\"headers\":[],\"isReadOnly\":false},\"key\":\"key1\",\"value\":\"value1\",\"leaderEpoch\":{\"value\":0}} Note: the log messages in the Console view are in a reverse order - the newest are on top For more information, see the API documentation.","title":"Kafka Producer and Consumer"},{"location":"tutorials/application-development/kafka/#kafka-producer-and-counsmer","text":"","title":"Kafka Producer and Consumer"},{"location":"tutorials/application-development/kafka/#prerequisites","text":"Run a local Kafka server following the steps (1 and 2) from here: https://kafka.apache.org/quickstart","title":"Prerequisites"},{"location":"tutorials/application-development/kafka/#steps","text":"Create a project kafka_project Then create a JavaScript service named my_kafka_handler.js Replace the service code with the following content:","title":"Steps"},{"location":"tutorials/application-development/kafka/#handler","text":"exports . onMessage = function ( message ) { console . log ( \"Hello from My Kafka Listener! Message: \" + message ); }; exports . onError = function ( error ) { console . error ( \"Error from My Kafka Listener! Error: \" + error ); }; Then create a Kafka Consumer named my_kafka_consumer.js Replace the file content with the following code: var consumer = require ( \"kafka/consumer\" ); consumer . topic ( \"topic1\" , \"{}\" ). startListening ( \"kafka_project/my_kafka_handler\" , 1000 ); Then create another back-end service which will play the role of a trigger my_kafka_producer.js Replace the trigger content with the following code: var producer = require ( \"kafka/producer\" ); producer . topic ( \"topic1\" , \"{}\" ). 
send ( \"key1\" , \"value1\" ); Publish the project Select the my_kafka_producer.js file in the Workspace view to be able to trigger the invocation of this service via the Preview view In the Console view you should see the following lines: 2020-11-01 23:33:54.272 [INFO ] [Thread-275] o.e.dirigible.api.v3.core.Console - Hello from My Kafka Listener! Message: {\"topic\":\"topic1\",\"partition\":0,\"offset\":29,\"timestamp\":1604266434251,\"timestampType\":\"CREATE_TIME\",\"serializedKeySize\":4,\"serializedValueSize\":6,\"headers\":{\"headers\":[],\"isReadOnly\":false},\"key\":\"key1\",\"value\":\"value1\",\"leaderEpoch\":{\"value\":0}} Note: the log messages in the Console view are in a reverse order - the newest are on top For more information, see the API documentation.","title":"Handler"},{"location":"tutorials/application-development/listener-queue/","text":"Listener of a Queue Steps Create a project message_queue_listener_project Then create a JavaScript service named my_listener_handler.js Replace the service code with the following content: Handler exports . onMessage = function ( message ) { console . log ( \"Hello from My Listener! Message: \" + message ); }; exports . onError = function ( error ) { console . error ( \"Error from My Listener! Error: \" + error ); }; Then create a Message Listener named my_listener.listener Replace the file content with the following JSON code: { \"name\" : \"message_queue_listener_project/my_queue\" , \"type\" : \"Q\" , \"handler\" : \"message_queue_listener_project/my_listener_handler.js\" , \"description\" : \"My Listener\" } Then create another back-end service which will play the role of a trigger my_trigger.js Replace the trigger content with the following code: var producer = require ( 'messaging/v3/producer' ); var message = \"*** I am a message created at: \" + new Date () + \" ***\" ; producer . queue ( \"message_queue_listener_project/my_queue\" ). send ( message ); console . log ( \"Hello from My Trigger! 
Message: \" + message ); Publish the project Select the my_trigger.js file in the Workspace view to be able to trigger the invocation of this service via the Preview view In the Console view you should see the following lines: [2018-05-14T11:57:13.197Z] [INFO] Hello from My Listener! Message: I am a message created at: Mon May 14 2018 14:57:13 GMT+0300 (EEST) [2018-05-14T11:57:13.174Z] [INFO] Hello from My Trigger! Message: I am a message created at: Mon May 14 2018 14:57:13 GMT+0300 (EEST) Note: the log messages in the Console view are in a reverse order - the newest are on top For more information, see the API documentation.","title":"Listener of a Queue"},{"location":"tutorials/application-development/listener-queue/#listener-of-a-queue","text":"","title":"Listener of a Queue"},{"location":"tutorials/application-development/listener-queue/#steps","text":"Create a project message_queue_listener_project Then create a JavaScript service named my_listener_handler.js Replace the service code with the following content:","title":"Steps"},{"location":"tutorials/application-development/listener-queue/#handler","text":"exports . onMessage = function ( message ) { console . log ( \"Hello from My Listener! Message: \" + message ); }; exports . onError = function ( error ) { console . error ( \"Error from My Listener! Error: \" + error ); }; Then create a Message Listener named my_listener.listener Replace the file content with the following JSON code: { \"name\" : \"message_queue_listener_project/my_queue\" , \"type\" : \"Q\" , \"handler\" : \"message_queue_listener_project/my_listener_handler.js\" , \"description\" : \"My Listener\" } Then create another back-end service which will play the role of a trigger my_trigger.js Replace the trigger content with the following code: var producer = require ( 'messaging/v3/producer' ); var message = \"*** I am a message created at: \" + new Date () + \" ***\" ; producer . queue ( \"message_queue_listener_project/my_queue\" ). 
send ( message ); console . log ( \"Hello from My Trigger! Message: \" + message ); Publish the project Select the my_trigger.js file in the Workspace view to be able to trigger the invocation of this service via the Preview view In the Console view you should see the following lines: [2018-05-14T11:57:13.197Z] [INFO] Hello from My Listener! Message: I am a message created at: Mon May 14 2018 14:57:13 GMT+0300 (EEST) [2018-05-14T11:57:13.174Z] [INFO] Hello from My Trigger! Message: I am a message created at: Mon May 14 2018 14:57:13 GMT+0300 (EEST) Note: the log messages in the Console view are in a reverse order - the newest are on top For more information, see the API documentation.","title":"Handler"},{"location":"tutorials/application-development/shell-command/","text":"Shell Command Steps Create a project shell_command_project Then create a file named my_command.sh Replace the code with the following content: uname -an echo variable1=$variable1 Then create a Command named my_command.command Replace the content with the following JSON code: { \"description\" : \"command description\" , \"contentType\" : \"text/plain\" , \"commands\" :[ { \"os\" : \"mac\" , \"command\" : \"sh shell_command_project/my_command.sh\" }, { \"os\" : \"linux\" , \"command\" : \"sh shell_command_project/my_command.sh\" } ], \"set\" :{ \"variable1\" : \"value1\" }, \"unset\" :[ \"variable2\" ] } Publish the project Select the *.command file in the Workspace explorer and inspect the result in the Preview: Darwin XXXXXXXXXXXXX 17.7.0 Darwin Kernel Version 17.7.0: Thu Jun 21 22:53:14 PDT 2018; root:xnu-4570.71.2~1/RELEASE_X86_64 x86_64 variable1=value1 Note: The working folder is set to the registry/public space under the file-based Repository. You can execute an arbitrary command e.g. even Node, Python, Julia, etc., by using the dirigible projects' content published and available under the registry space. 
In this case, the given framework has to be set up in advance and the entry point executable added to the PATH environment variable. The standard output is redirected to the service response. For more information, see the API documentation.","title":"Shell Command"},{"location":"tutorials/application-development/shell-command/#shell-command","text":"","title":"Shell Command"},{"location":"tutorials/application-development/shell-command/#steps","text":"Create a project shell_command_project Then create a file named my_command.sh Replace the code with the following content: uname -an echo variable1=$variable1 Then create a Command named my_command.command Replace the content with the following JSON code: { \"description\" : \"command description\" , \"contentType\" : \"text/plain\" , \"commands\" :[ { \"os\" : \"mac\" , \"command\" : \"sh shell_command_project/my_command.sh\" }, { \"os\" : \"linux\" , \"command\" : \"sh shell_command_project/my_command.sh\" } ], \"set\" :{ \"variable1\" : \"value1\" }, \"unset\" :[ \"variable2\" ] } Publish the project Select the *.command file in the Workspace explorer and inspect the result in the Preview: Darwin XXXXXXXXXXXXX 17.7.0 Darwin Kernel Version 17.7.0: Thu Jun 21 22:53:14 PDT 2018; root:xnu-4570.71.2~1/RELEASE_X86_64 x86_64 variable1=value1 Note: The working folder is set to the registry/public space under the file-based Repository. You can execute an arbitrary command (e.g. Node, Python, Julia) by using the Dirigible projects' content published and available under the registry space. In this case, the given framework has to be set up in advance and the entry point executable added to the PATH environment variable. The standard output is redirected to the service response. 
For more information, see the API documentation.","title":"Steps"},{"location":"tutorials/application-development/bookstore/","text":"Bookstore Application Overview This sample shows how to create a simple web application for managing a single entity called Books . It contains a database table definition, a RESTful service and a web page for managing the instances via a user interface. Sections Database Table and Data Layer REST API User Interface Note The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"Bookstore Application"},{"location":"tutorials/application-development/bookstore/#bookstore-application","text":"","title":"Bookstore Application"},{"location":"tutorials/application-development/bookstore/#overview","text":"This sample shows how to create a simple web application for managing a single entity called Books . It contains a database table definition, a RESTful service and a web page for managing the instances via a user interface.","title":"Overview"},{"location":"tutorials/application-development/bookstore/#sections","text":"Database Table and Data Layer REST API User Interface Note The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"Sections"},{"location":"tutorials/application-development/bookstore/api/","text":"Bookstore Application - API Overview This section shows how to create the API layer for the Bookstore application. It contains a Books REST API . Steps REST API Right click on the babylon-project project and select New \u2192 Folder . Enter api for the name of the folder. Right click on the api folder and select New \u2192 TypeScript Service . Enter books.ts for the name of the TypeScript Service. Replace the content with the following code: import { rs } from \"sdk/http\" ; import { BookRepository , Book } from '../data/BookRepository' ; const repository = new BookRepository (); rs . 
service () . resource ( \"\" ) . get ( function ( ctx , request , response ) { const entities : Book [] = repository . list (); response . setContentType ( \"application/json\" ); response . setStatus ( 200 ); response . println ( JSON . stringify ( entities )); }). produces ([ \"application/json\" ]) . resource ( \"{id}\" ) . get ( function ( ctx , request , response ) { const id : number = ctx . pathParameters . id ; const entity : Book = repository . findById ( id ); response . setContentType ( \"application/json\" ); if ( entity ) { response . setStatus ( 200 ); response . println ( JSON . stringify ( entity )); } else { response . setStatus ( 404 ); response . println ( JSON . stringify ({ code : 404 , message : \"Book not found\" })); } }). produces ([ \"application/json\" ]) . resource ( \"/count\" ) . get ( function ( ctx , request , response ) { const count : number = repository . count (); response . setStatus ( 200 ); response . println ( ` ${ count } ` ); }) . resource ( \"\" ) . post ( function ( ctx , request , response ) { const entity = request . getJSON (); entity . id = repository . create ( entity ); response . setContentType ( \"application/json\" ); response . setStatus ( 201 ); response . setHeader ( \"Content-Location\" , `/services/ts/babylon-project/service/Books.ts/ ${ entity . id } ` ); response . println ( JSON . stringify ( entity )); }). produces ([ \"application/json\" ]) . resource ( \"{id}\" ) . put ( function ( ctx , request , response ) { const entity = request . getJSON (); entity . id = ctx . pathParameters . id ; repository . update ( entity ); response . setContentType ( \"application/json\" ); response . setStatus ( 200 ); response . println ( JSON . stringify ( entity )); }). produces ([ \"application/json\" ]) . resource ( \"{id}\" ) . delete ( function ( ctx , request , response ) { const id : number = ctx . pathParameters . id ; const entity : Book = repository . findById ( id ); if ( entity ) { repository . 
deleteById ( id ); response . setStatus ( 204 ); } else { response . setContentType ( \"application/json\" ); response . setStatus ( 404 ); response . println ( JSON . stringify ({ code : 404 , message : \"Book not found\" })); } }). produces ([ \"*/*\" ]) . execute (); Save & Publish After saving the file, right click on the project and select Publish in order to run the compilation and the deployment of the TypeScript Service . The tsconfig.json and project.json files should be present at the project root in order to run the compilation (they can be found in the Bookstore Application - Database tutorial) . REST API Execution A GET request to the root path of the REST API is triggered by selecting the books.ts file and opening the Preview view. The TypeScript Service is available at the http://localhost:8080/services/ts/babylon-project/api/books.ts URL. It can be accessed in a separate browser tab, consumed by a third-party application or API tools like Postman or cURL . http/rs Take a look at the http/rs documentation for more details about the API. OpenAPI Right click on the babylon-project/api folder and select New \u2192 File . Enter books.openapi for the name of the file. Replace the content with the following definition: openapi : 3.0.3 info : title : Bookstore Application description : Bookstore application based on the following tutorial - [https://www.dirigible.io/help/tutorials/application-development/bookstore/](https://www.dirigible.io/help/tutorials/application-development/bookstore/). 
contact : name : Eclipse Dirigible url : https://dirigible.io license : name : Eclipse Public License - v 2.0 url : https://github.com/dirigiblelabs/tutorial-babylon-project/blob/master/LICENSE version : 1.0.0 servers : - url : /services/ts tags : - name : Books paths : /babylon-project/api/books.ts : get : tags : - Books responses : 200 : content : application/json : schema : type : array items : $ref : '#/components/schemas/Book' post : tags : - Books requestBody : required : true content : application/json : schema : $ref : '#/components/schemas/Book' responses : 201 : content : application/json : schema : $ref : '#/components/schemas/Book' 400 : content : application/json : schema : $ref : '#/components/schemas/Error' /babylon-project/api/books.ts/{id} : get : tags : - Books parameters : - name : id in : path required : true schema : type : integer example : 10001 responses : 200 : content : application/json : schema : $ref : '#/components/schemas/Book' 404 : content : application/json : schema : $ref : '#/components/schemas/Error' example : code : 404 message : Not Found put : tags : - Books parameters : - name : id in : path required : true schema : type : integer example : 10001 requestBody : required : true content : application/json : schema : $ref : '#/components/schemas/Book' responses : 200 : content : application/json : schema : $ref : '#/components/schemas/Book' 400 : content : application/json : schema : $ref : '#/components/schemas/Error' 404 : content : application/json : schema : $ref : '#/components/schemas/Error' example : code : 404 message : Not Found delete : tags : - Books parameters : - name : id in : path required : true schema : type : integer example : 10000 responses : 204 : description : The resource was deleted successfully. 
404 : content : application/json : schema : $ref : '#/components/schemas/Error' example : code : 404 message : Not Found components : schemas : Error : type : object properties : code : type : integer example : 400 message : type : string example : Bad Request Book : type : object properties : id : type : integer isbn : type : string maxLength : 17 pattern : ^\\d{3}-\\d{1}-\\d{3}-\\d{5}-\\d{1}$ example : 978-1-599-86977-3 title : type : string maxLength : 120 example : The Art of War publisher : type : string maxLength : 120 example : Filiquarian date : type : string format : date example : \"2006-01-01\" price : type : number format : float minimum : 0 example : 18.99 Save & Publish Saving the file will trigger a Publish action, which will build and deploy the OpenAPI definition. To display the embedded SwaggerUI select the books.openapi file and open the Preview view. The SwaggerUI can be accessed at http://localhost:8080/services/web/ide-swagger/ui/index.html?openapi=/services/web/babylon-project/api/books.openapi Note: All published OpenAPI definitions can be seen at http://localhost:8080/services/web/ide-swagger/ui/ Next Steps Section Completed After completing the steps in this tutorial, you would have: REST API and business logic to perform CRUD operations on the Book entity. Continue to the User Interface section to build a UI for the Book entity. Note: The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"API"},{"location":"tutorials/application-development/bookstore/api/#bookstore-application-api","text":"","title":"Bookstore Application - API"},{"location":"tutorials/application-development/bookstore/api/#overview","text":"This section shows how to create the API layer for the Bookstore application. 
It contains a Books REST API .","title":"Overview"},{"location":"tutorials/application-development/bookstore/api/#steps","text":"","title":"Steps"},{"location":"tutorials/application-development/bookstore/api/#rest-api","text":"Right click on the babylon-project project and select New \u2192 Folder . Enter api for the name of the folder. Right click on the api folder and select New \u2192 TypeScript Service . Enter books.ts for the name of the TypeScript Service. Replace the content with the following code: import { rs } from \"sdk/http\" ; import { BookRepository , Book } from '../data/BookRepository' ; const repository = new BookRepository (); rs . service () . resource ( \"\" ) . get ( function ( ctx , request , response ) { const entities : Book [] = repository . list (); response . setContentType ( \"application/json\" ); response . setStatus ( 200 ); response . println ( JSON . stringify ( entities )); }). produces ([ \"application/json\" ]) . resource ( \"{id}\" ) . get ( function ( ctx , request , response ) { const id : number = ctx . pathParameters . id ; const entity : Book = repository . findById ( id ); response . setContentType ( \"application/json\" ); if ( entity ) { response . setStatus ( 200 ); response . println ( JSON . stringify ( entity )); } else { response . setStatus ( 404 ); response . println ( JSON . stringify ({ code : 404 , message : \"Book not found\" })); } }). produces ([ \"application/json\" ]) . resource ( \"/count\" ) . get ( function ( ctx , request , response ) { const count : number = repository . count (); response . setStatus ( 200 ); response . println ( ` ${ count } ` ); }) . resource ( \"\" ) . post ( function ( ctx , request , response ) { const entity = request . getJSON (); entity . id = repository . create ( entity ); response . setContentType ( \"application/json\" ); response . setStatus ( 201 ); response . setHeader ( \"Content-Location\" , `/services/ts/babylon-project/service/Books.ts/ ${ entity . id } ` ); response . 
println ( JSON . stringify ( entity )); }). produces ([ \"application/json\" ]) . resource ( \"{id}\" ) . put ( function ( ctx , request , response ) { const entity = request . getJSON (); entity . id = ctx . pathParameters . id ; repository . update ( entity ); response . setContentType ( \"application/json\" ); response . setStatus ( 200 ); response . println ( JSON . stringify ( entity )); }). produces ([ \"application/json\" ]) . resource ( \"{id}\" ) . delete ( function ( ctx , request , response ) { const id : number = ctx . pathParameters . id ; const entity : Book = repository . findById ( id ); if ( entity ) { repository . deleteById ( id ); response . setStatus ( 204 ); } else { response . setContentType ( \"application/json\" ); response . setStatus ( 404 ); response . println ( JSON . stringify ({ code : 404 , message : \"Book not found\" })); } }). produces ([ \"*/*\" ]) . execute (); Save & Publish After saving the file, right click on the project and select Publish in order to run the compilation and the deployment of the TypeScript Service . The tsconfig.json and project.json files should be present at the project root in order to run the compilation (they can be found in the Bookstore Application - Database tutorial) . REST API Execution A GET request to the root path of the REST API is triggered by selecting the books.ts file and opening the Preview view. The TypeScript Service is available at the http://localhost:8080/services/ts/babylon-project/api/books.ts URL. It can be accessed in a separate browser tab, consumed by a third-party application or API tools like Postman or cURL . http/rs Take a look at the http/rs documentation for more details about the API.","title":"REST API"},{"location":"tutorials/application-development/bookstore/api/#openapi","text":"Right click on the babylon-project/api folder and select New \u2192 File . Enter books.openapi for the name of the file. 
Replace the content with the following definition: openapi : 3.0.3 info : title : Bookstore Application description : Bookstore application based on the following tutorial - [https://www.dirigible.io/help/tutorials/application-development/bookstore/](https://www.dirigible.io/help/tutorials/application-development/bookstore/). contact : name : Eclipse Dirigible url : https://dirigible.io license : name : Eclipse Public License - v 2.0 url : https://github.com/dirigiblelabs/tutorial-babylon-project/blob/master/LICENSE version : 1.0.0 servers : - url : /services/ts tags : - name : Books paths : /babylon-project/api/books.ts : get : tags : - Books responses : 200 : content : application/json : schema : type : array items : $ref : '#/components/schemas/Book' post : tags : - Books requestBody : required : true content : application/json : schema : $ref : '#/components/schemas/Book' responses : 201 : content : application/json : schema : $ref : '#/components/schemas/Book' 400 : content : application/json : schema : $ref : '#/components/schemas/Error' /babylon-project/api/books.ts/{id} : get : tags : - Books parameters : - name : id in : path required : true schema : type : integer example : 10001 responses : 200 : content : application/json : schema : $ref : '#/components/schemas/Book' 404 : content : application/json : schema : $ref : '#/components/schemas/Error' example : code : 404 message : Not Found put : tags : - Books parameters : - name : id in : path required : true schema : type : integer example : 10001 requestBody : required : true content : application/json : schema : $ref : '#/components/schemas/Book' responses : 200 : content : application/json : schema : $ref : '#/components/schemas/Book' 400 : content : application/json : schema : $ref : '#/components/schemas/Error' 404 : content : application/json : schema : $ref : '#/components/schemas/Error' example : code : 404 message : Not Found delete : tags : - Books parameters : - name : id in : path required : 
true schema : type : integer example : 10000 responses : 204 : description : The resource was deleted successfully. 404 : content : application/json : schema : $ref : '#/components/schemas/Error' example : code : 404 message : Not Found components : schemas : Error : type : object properties : code : type : integer example : 400 message : type : string example : Bad Request Book : type : object properties : id : type : integer isbn : type : string maxLength : 17 pattern : ^\\d{3}-\\d{1}-\\d{3}-\\d{5}-\\d{1}$ example : 978-1-599-86977-3 title : type : string maxLength : 120 example : The Art of War publisher : type : string maxLength : 120 example : Filiquarian date : type : string format : date example : \"2006-01-01\" price : type : number format : float minimum : 0 example : 18.99 Save & Publish Saving the file will trigger a Publish action, which will build and deploy the OpenAPI definition. To display the embedded SwaggerUI select the books.openapi file and open the Preview view. The SwaggerUI can be accessed at http://localhost:8080/services/web/ide-swagger/ui/index.html?openapi=/services/web/babylon-project/api/books.openapi Note: All published OpenAPI definitions can be seen at http://localhost:8080/services/web/ide-swagger/ui/","title":"OpenAPI"},{"location":"tutorials/application-development/bookstore/api/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would have: REST API and business logic to perform CRUD operations on the Book entity. Continue to the User Interface section to build a UI for the Book entity. Note: The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"Next Steps"},{"location":"tutorials/application-development/bookstore/database/","text":"Bookstore Application - Database Overview This section shows how to create the database layer for the Bookstore application. 
It contains a database table definition for the BOOKS table, CSV data, CSVIM import definition and TypeScript Repository class. Steps Table Definition Create a project named babylon-project . Right click on the babylon-project project and select New \u2192 Folder . Enter data for the name of the folder. Right click on the data folder and select New \u2192 Database Table . Enter BABYLON_BOOKS.table for the name of the database table descriptor. Right click on BABYLON_BOOKS.table and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"name\" : \"BABYLON_BOOKS\" , \"type\" : \"TABLE\" , \"columns\" : [ { \"name\" : \"BOOK_ID\" , \"type\" : \"INTEGER\" , \"primaryKey\" : true , \"identity\" : \"true\" , \"unique\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_ISBN\" , \"type\" : \"CHAR\" , \"length\" : \"17\" , \"unique\" : true , \"primaryKey\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_TITLE\" , \"type\" : \"VARCHAR\" , \"length\" : \"120\" , \"primaryKey\" : false , \"unique\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_PUBLISHER\" , \"type\" : \"VARCHAR\" , \"length\" : \"120\" , \"primaryKey\" : false , \"unique\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_DATE\" , \"type\" : \"DATE\" , \"nullable\" : true , \"unique\" : false }, { \"name\" : \"BOOK_PRICE\" , \"type\" : \"DOUBLE\" , \"primaryKey\" : false , \"unique\" : false , \"nullable\" : false } ], \"dependencies\" : [] } Save the changes and close the Code Editor . Double click on BABYLON_BOOKS.table to view the definition with the Table Editor . Save & Publish Saving the file will trigger a Publish action, which will create the database table in the target database schema. Usually this action should take several seconds to complete, after which the database table would be visible in the Database Perspective . Note: Manual Publish can be performed by right clicking on the artifact and selecting Publish from the context menu. 
The Publish action can also be performed on project level. CSV Data Right click on the babylon-project/data folder and select New \u2192 File . Enter books.csv for the name of the file. Right click on books.csv and select Open With \u2192 Code Editor . Paste the following CSV data: BOOK_ID,BOOK_ISBN,BOOK_TITLE,BOOK_PUBLISHER,BOOK_DATE,BOOK_PRICE 10001,978-3-598-21500-1,Beartown,Simon & Schuster,2019-05-01,17.0 10002,978-3-598-21501-8,Beneath a Scarlet Sky,Lake Union Publishing,2017-05-01,9.74 10003,978-3-598-21529-2,Dead Certain,Free Press,2007-09-04,7.19 10004,978-3-598-21550-6,Everything We Keep,Lake Union Publishing,2016-08-01,14.65 10005,978-3-598-21550-9,Exit West,Hamish Hamilton,2017-02-27,11.45 Save the changes and close the Code Editor . Double click on books.csv to view the data with the CSV Editor . CSVIM Right click on the babylon-project/data folder and select New \u2192 File . Enter books.csvim for the name of the file. Right click on books.csvim and select Open With \u2192 Code Editor . Paste the following CSVIM definition: { \"files\" : [ { \"table\" : \"BABYLON_BOOKS\" , \"schema\" : \"PUBLIC\" , \"file\" : \"/babylon-project/data/books.csv\" , \"header\" : true , \"useHeaderNames\" : true , \"delimField\" : \",\" , \"delimEnclosing\" : \"\\\"\" , \"distinguishEmptyFromNull\" : true , \"version\" : \"\" } ] } Save the changes and close the Code Editor . Double click on books.csvim to view the definition with the CSVIM Editor . Save & Publish Once the file is saved, a Publish action will be triggered, which will result in the data from the CSV file being imported into the database table. Note: Navigate to the Database Perspective to check that the BABYLON_BOOKS table is created and perform the following SQL query to check that the data from the CSV file is imported. select * from BABYLON_BOOKS ; Repository Right click on the babylon-project project and select New \u2192 File . Enter tsconfig.json for the name of the File. 
Replace the content with the following: { \"compilerOptions\" : { \"module\" : \"ESNext\" } } Right click on the babylon-project project and select New \u2192 File . Enter project.json for the name of the File. Replace the content with the following: { \"guid\" : \"babylon-project\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } TypeScript Compilation The tsconfig.json and project.json files are needed for the compilation of the TypeScript files. In order to run the compilation a Publish action should be performed on the Project level (right click on the project and select Publish ) . Right click on the babylon-project/data folder and select New \u2192 TypeScript Service . Enter BookRepository.ts for the name of the TypeScript Service. Replace the content with the following code: import { dao as daoApi } from \"sdk/db\" export interface Book { readonly id? : number ; readonly isbn : string ; readonly title : string ; readonly publisher : string ; readonly date : Date ; readonly price : number ; } export class BookRepository { private repository ; constructor ( dataSourceName? : string , logCtxName? : string ) { this . repository = daoApi . 
create ({ table : \"BABYLON_BOOKS\" , properties : [ { name : \"id\" , column : \"BOOK_ID\" , type : \"INTEGER\" , id : true , required : true }, { name : \"isbn\" , column : \"BOOK_ISBN\" , type : \"CHAR\" , id : false , required : false }, { name : \"title\" , column : \"BOOK_TITLE\" , type : \"VARCHAR\" , id : false , required : false }, { name : \"publisher\" , column : \"BOOK_PUBLISHER\" , type : \"VARCHAR\" , id : false , required : false }, { name : \"date\" , column : \"BOOK_DATE\" , type : \"DATE\" , id : false , required : true }, { name : \"price\" , column : \"BOOK_PRICE\" , type : \"DOUBLE\" , id : false , required : true }] }, logCtxName , dataSourceName ); } public list = ( settings ? ) : Book [] => { return this . repository . list ( settings ); }; public findById = ( id : number ) : Book | null => { return this . repository . find ( id ); }; public create = ( entity : Book ) : Book => { return this . repository . insert ( entity ); }; public update = ( entity : Book ) : Book => { return this . repository . update ( entity ); }; public deleteById = ( id : number ) : void => { this . repository . remove ( id ); }; public count = () : number => { return this . repository . count (); } } Save & Publish In order to run the compilation of TypeScript files a Publish action should be performed on the Project level (right click on the project and select Publish ) . db/dao Take a look at the db/dao documentation for more details about the API. Next Steps Section Completed After completing the steps in this tutorial, you would have: Database table named BABYLON_BOOKS . Initial data imported into the database table. TypeScript repository class to perform basic data operations. Continue to the API section to build a REST API for the Book entity. 
Note: The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"Database"},{"location":"tutorials/application-development/bookstore/database/#bookstore-application-database","text":"","title":"Bookstore Application - Database"},{"location":"tutorials/application-development/bookstore/database/#overview","text":"This section shows how to create the database layer for the Bookstore application. It contains a database table definition for the BOOKS table, CSV data, CSVIM import definition and TypeScript Repository class.","title":"Overview"},{"location":"tutorials/application-development/bookstore/database/#steps","text":"","title":"Steps"},{"location":"tutorials/application-development/bookstore/database/#table-definition","text":"Create a project named babylon-project . Right click on the babylon-project project and select New \u2192 Folder . Enter data for the name of the folder. Right click on the data folder and select New \u2192 Database Table . Enter BABYLON_BOOKS.table for the name of the database table descriptor. Right click on BABYLON_BOOKS.table and select Open With \u2192 Code Editor . 
Replace the content with the following definition: { \"name\" : \"BABYLON_BOOKS\" , \"type\" : \"TABLE\" , \"columns\" : [ { \"name\" : \"BOOK_ID\" , \"type\" : \"INTEGER\" , \"primaryKey\" : true , \"identity\" : \"true\" , \"unique\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_ISBN\" , \"type\" : \"CHAR\" , \"length\" : \"17\" , \"unique\" : true , \"primaryKey\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_TITLE\" , \"type\" : \"VARCHAR\" , \"length\" : \"120\" , \"primaryKey\" : false , \"unique\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_PUBLISHER\" , \"type\" : \"VARCHAR\" , \"length\" : \"120\" , \"primaryKey\" : false , \"unique\" : false , \"nullable\" : false }, { \"name\" : \"BOOK_DATE\" , \"type\" : \"DATE\" , \"nullable\" : true , \"unique\" : false }, { \"name\" : \"BOOK_PRICE\" , \"type\" : \"DOUBLE\" , \"primaryKey\" : false , \"unique\" : false , \"nullable\" : false } ], \"dependencies\" : [] } Save the changes and close the Code Editor . Double click on BABYLON_BOOKS.table to view the definition with the Table Editor . Save & Publish Saving the file will trigger a Publish action, which will create the database table in the target database schema. Usually this action should take several seconds to complete, after which the database table would be visible in the Database Perspective . Note: Manual Publish can be performed by right clicking on the artifact and selecting Publish from the context menu. The Publish action can be performed also on project level.","title":"Table Definition"},{"location":"tutorials/application-development/bookstore/database/#csv-data","text":"Right click on the babylon-project/data folder and select New \u2192 File . Enter books.csv for the name of the file. Right click on books.csv and select Open With \u2192 Code Editor . 
Paste the following CSV data: BOOK_ID,BOOK_ISBN,BOOK_TITLE,BOOK_PUBLISHER,BOOK_DATE,BOOK_PRICE 10001,978-3-598-21500-1,Beartown,Simon & Schuster,2019-05-01,17.0 10002,978-3-598-21501-8,Beneath a Scarlet Sky,Lake Union Publishing,2017-05-01,9.74 10003,978-3-598-21529-2,Dead Certain,Free Press,2007-09-04,7.19 10004,978-3-598-21550-6,Everything We Keep,Lake Union Publishing,2016-08-01,14.65 10005,978-3-598-21550-9,Exit West,Hamish Hamilton,2017-02-27,11.45 Save the changes and close the Code Editor . Double click on books.csv to view the data with the CSV Editor .","title":"CSV Data"},{"location":"tutorials/application-development/bookstore/database/#csvim","text":"Right click on the babylon-project/data folder and select New \u2192 File . Enter books.csvim for the name of the file. Right click on books.csvim and select Open With \u2192 Code Editor . Paste the following CSVIM definition: { \"files\" : [ { \"table\" : \"BABYLON_BOOKS\" , \"schema\" : \"PUBLIC\" , \"file\" : \"/babylon-project/data/books.csv\" , \"header\" : true , \"useHeaderNames\" : true , \"delimField\" : \",\" , \"delimEnclosing\" : \"\\\"\" , \"distinguishEmptyFromNull\" : true , \"version\" : \"\" } ] } Save the changes and close the Code Editor . Double click on books.csvim to view the definition with the CSVIM Editor . Save & Publish Once the file is saved, a Publish action will be triggered, which will import the data from the CSV file into the database table. Note: Navigate to the Database Perspective to check that the BABYLON_BOOKS table is created and perform the following SQL query to check that the data from the CSV file is imported. select * from BABYLON_BOOKS ;","title":"CSVIM"},{"location":"tutorials/application-development/bookstore/database/#repository","text":"Right click on the babylon-project project and select New \u2192 File . Enter tsconfig.json for the name of the File.
Replace the content with the following: { \"compilerOptions\" : { \"module\" : \"ESNext\" } } Right click on the babylon-project project and select New \u2192 File . Enter project.json for the name of the File. Replace the content with the following: { \"guid\" : \"babylon-project\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } TypeScript Compilation The tsconfig.json and project.json files are needed for the compilation of the TypeScript files. In order to run the compilation a Publish action should be performed on the Project level (right click on the project and select Publish ) . Right click on the babylon-project/data folder and select New \u2192 TypeScript Service . Enter BookRepository.ts for the name of the TypeScript Service. Replace the content with the following code: import { dao as daoApi } from \"sdk/db\" export interface Book { readonly id? : number ; readonly isbn : string ; readonly title : string ; readonly publisher : string ; readonly date : Date ; readonly price : number ; } export class BookRepository { private repository ; constructor ( dataSourceName? : string , logCtxName? : string ) { this . repository = daoApi . 
create ({ table : \"BABYLON_BOOKS\" , properties : [ { name : \"id\" , column : \"BOOK_ID\" , type : \"INTEGER\" , id : true , required : true }, { name : \"isbn\" , column : \"BOOK_ISBN\" , type : \"CHAR\" , id : false , required : false }, { name : \"title\" , column : \"BOOK_TITLE\" , type : \"VARCHAR\" , id : false , required : false }, { name : \"publisher\" , column : \"BOOK_PUBLISHER\" , type : \"VARCHAR\" , id : false , required : false }, { name : \"date\" , column : \"BOOK_DATE\" , type : \"DATE\" , id : false , required : true }, { name : \"price\" , column : \"BOOK_PRICE\" , type : \"DOUBLE\" , id : false , required : true }] }, logCtxName , dataSourceName ); } public list = ( settings ? ) : Book [] => { return this . repository . list ( settings ); }; public findById = ( id : number ) : Book | null => { return this . repository . find ( id ); }; public create = ( entity : Book ) : Book => { return this . repository . insert ( entity ); }; public update = ( entity : Book ) : Book => { return this . repository . update ( entity ); }; public deleteById = ( id : number ) : void => { this . repository . remove ( id ); }; public count = () : number => { return this . repository . count (); } } Save & Publish In order to run the compilation of TypeScript files a Publish action should be performed on the Project level (right click on the project and select Publish ) . db/dao Take a look at the db/dao documentation for more details about the API.","title":"Repository"},{"location":"tutorials/application-development/bookstore/database/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would have: Database table named BABYLON_BOOKS . Initial data imported into the database table. TypeScript repository class to perform basic data operations. Continue to the API section to build a REST API for the Book entity. 
Note: The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"Next Steps"},{"location":"tutorials/application-development/bookstore/ui/","text":"Bookstore Application - UI Overview This section shows how to create the User Interface layer for the Bookstore application. It contains a Books Perspective , View for displaying the data and Dialog for modifying the Books data. Steps Perspective Right click on the babylon-project project and select New \u2192 Folder . Enter ui for the name of the folder. Create index.html , perspective.js and perspective.extension as shown below: index.html perspective.js perspective.extension Right click on the ui folder and select New \u2192 File . Enter index.html for the name of the file. Replace the content with the following code: < html lang = \"en\" ng-app = \"app\" ng-controller = \"ApplicationController\" xmlns = \"http://www.w3.org/1999/xhtml\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < script type = \"text/javascript\" src = \"perspective.js\" > < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-perspective-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-perspective-css\" /> < body > < ide-header menu-ext-id = \"books-menu\" > < ide-container > < ide-layout views-layout-model = \"layoutModel\" > < ide-dialogs > < ide-status-bar > < script type = \"text/javascript\" > angular . module ( 'app' , [ 'ngResource' , 'ideLayout' , 'ideUI' ]) .
constant ( 'branding' , { name : 'Babylon' , brand : 'Eclipse Dirigible' , brandUrl : 'https://dirigible.io' , icons : { faviconIco : '/services/web/resources/images/favicon.ico' , favicon32 : '/services/web/resources/images/favicon-32x32.png' , favicon16 : '/services/web/resources/images/favicon-16x16.png' , }, logo : '/services/web/resources/images/dirigible.svg' , }) . constant ( 'extensionPoint' , { perspectives : \"books\" , views : \"books-view\" , dialogWindows : \"books-dialog-window\" }) . controller ( 'ApplicationController' , [ \"$scope\" , \"messageHub\" , function ( $scope , messageHub ) { const httpRequest = new XMLHttpRequest (); httpRequest . open ( \"GET\" , \"/services/js/resources-core/services/views.js?extensionPoint=books-view\" , false ); httpRequest . send (); $scope . layoutModel = { views : JSON . parse ( httpRequest . responseText ). filter ( e => ! e . isLaunchpad && e . perspectiveName === \"books\" ). map ( e => e . id ) }; }]); Right click on the ui folder and select New \u2192 File . Enter perspective.js for the name of the file. Replace the content with the following code: const perspectiveData = { id : \"books\" , name : \"books\" , link : \"/services/web/babylon-project/ui/index.html\" , order : \"100\" , icon : \"/services/web/resources/unicons/copy.svg\" , }; if ( typeof exports !== 'undefined' ) { exports . getPerspective = function () { return perspectiveData ; } } Right click on the ui folder and select New \u2192 Extension . Enter perspective.extension for the name of the Extension. Right click on perspective.extension and select Open With \u2192 Code Editor . Replace the content with the following code: { \"module\" : \"babylon-project/ui/perspective.js\" , \"extensionPoint\" : \"books\" , \"description\" : \"Books - Perspective\" } Note The index.html , perspective.js and perspective.extension files should be located at the babylon-project/ui folder. 
View Right click on the babylon-project/ui folder and select New \u2192 Folder . Enter Books for the name of the folder. Create index.html , controller.js , view.js and view.extension as shown below: index.html controller.js view.js view.extension Right click on the Books folder and select New \u2192 File . Enter index.html for the name of the file. Replace the content with the following code: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" > < fd-toolbar has-title = \"true\" > < fd-toolbar-title > Items ({{dataCount}}) < fd-toolbar-spacer > < fd-button compact = \"true\" dg-type = \"transparent\" dg-label = \"Create\" ng-click = \"createEntity()\" > < fd-scrollbar class = \"dg-full-height\" ng-hide = \"data == null\" > < table fd-table display-mode = \"compact\" inner-borders = \"top\" outer-borders = \"none\" > < thead fd-table-header sticky = \"true\" > < tr fd-table-row > < th fd-table-header-cell > Title < th fd-table-header-cell > Publisher < th fd-table-header-cell > Date < th fd-table-header-cell > Price < th fd-table-header-cell > < tbody fd-table-body > < tr fd-table-row hoverable = \"true\" ng-show = \"data.length == 0\" > < td fd-table-cell no-data = \"true\" > No data available. 
< tr fd-table-row hoverable = \"true\" ng-repeat = \"next in data\" dg-selected = \"next.id === selectedEntity.id\" ng-click = \"selectEntity(next)\" > < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > {{next.title}} < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > {{next.publisher}} < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > < fd-input type = \"date\" ng-model = \"next.date\" ng-readonly = \"true\" > < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > {{next.price}} < td fd-table-cell fit-content = \"true\" > < fd-popover > < fd-popover-control > < fd-button compact = \"true\" glyph = \"sap-icon--overflow\" dg-type = \"transparent\" aria-label = \"Table Row Menu Button\" ng-click = \"setTristate()\" > < fd-popover-body dg-align = \"bottom-right\" > < fd-menu aria-label = \"Table Row Menu\" no-backdrop = \"true\" no-shadow = \"true\" > < fd-menu-item title = \"View Details\" ng-click = \"openDetails(next)\" > < fd-menu-item title = \"Edit\" ng-click = \"updateEntity(next)\" > < fd-menu-item title = \"Delete\" ng-click = \"deleteEntity(next)\" > < fd-pagination total-items = \"dataCount\" items-per-page = \"dataLimit\" items-per-page-options = \"[10, 20, 50]\" page-change = \"loadPage(pageNumber)\" items-per-page-change = \"loadPage(pageNumber)\" items-per-page-placement = \"top-start\" compact = \"true\" display-total-items = \"true\" ng-hide = \"dataCount == 0\" > Right click on the Books folder and select New \u2192 File . Enter controller.js for the name of the file. Replace the content with the following code: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" , \"entityApi\" ]) . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = 'babylon-project.books.Books' ; }]) . 
config ([ \"entityApiProvider\" , function ( entityApiProvider ) { entityApiProvider . baseUrl = \"/services/ts/babylon-project/api/books.ts\" ; }]) . controller ( 'PageController' , [ '$scope' , 'messageHub' , 'entityApi' , function ( $scope , messageHub , entityApi ) { function resetPagination () { $scope . dataPage = 1 ; $scope . dataCount = 0 ; $scope . dataLimit = 20 ; } resetPagination (); //-----------------Events-------------------// messageHub . onDidReceiveMessage ( \"entityCreated\" , function ( msg ) { $scope . loadPage ( $scope . dataPage ); }); messageHub . onDidReceiveMessage ( \"entityUpdated\" , function ( msg ) { $scope . loadPage ( $scope . dataPage ); }); //-----------------Events-------------------// $scope . loadPage = function ( pageNumber ) { $scope . dataPage = pageNumber ; entityApi . count (). then ( function ( response ) { if ( response . status != 200 ) { messageHub . showAlertError ( \"Books\" , `Unable to count Books: ' ${ response . message } '` ); return ; } $scope . dataCount = parseInt ( response . data ); let offset = ( pageNumber - 1 ) * $scope . dataLimit ; let limit = $scope . dataLimit ; entityApi . list ( offset , limit ). then ( function ( response ) { if ( response . status != 200 ) { messageHub . showAlertError ( \"Books\" , `Unable to list Books: ' ${ response . message } '` ); return ; } response . data . forEach ( e => { if ( e . date ) { e . date = new Date ( e . date ); } }); $scope . data = response . data ; }); }); }; $scope . loadPage ( $scope . dataPage ); $scope . selectEntity = function ( entity ) { $scope . selectedEntity = entity ; }; $scope . openDetails = function ( entity ) { $scope . selectedEntity = entity ; messageHub . showDialogWindow ( \"Books-details\" , { action : \"select\" , entity : entity , }); }; $scope . createEntity = function () { $scope . selectedEntity = null ; messageHub . showDialogWindow ( \"Books-details\" , { action : \"create\" , entity : {}, }, null , false ); }; $scope . 
updateEntity = function ( entity ) { messageHub . showDialogWindow ( \"Books-details\" , { action : \"update\" , entity : entity , }, null , false ); }; $scope . deleteEntity = function ( entity ) { let id = entity . id ; messageHub . showDialogAsync ( 'Delete Books?' , `Are you sure you want to delete Books? This action cannot be undone.` , [{ id : \"delete-btn-yes\" , type : \"emphasized\" , label : \"Yes\" , }, { id : \"delete-btn-no\" , type : \"normal\" , label : \"No\" , }], ). then ( function ( msg ) { if ( msg . data === \"delete-btn-yes\" ) { entityApi . delete ( id ). then ( function ( response ) { if ( response . status != 204 ) { messageHub . showAlertError ( \"Books\" , `Unable to delete Books: ' ${ response . message } '` ); return ; } $scope . loadPage ( $scope . dataPage ); messageHub . postMessage ( \"clearDetails\" ); }); } }); }; }]); Right click on the Books folder and select New \u2192 File . Enter view.js for the name of the file. Replace the content with the following code: const viewData = { id : \"Books\" , label : \"Books\" , factory : \"frame\" , region : \"center\" , link : \"/services/web/babylon-project/ui/Books/index.html\" , perspectiveName : \"books\" }; if ( typeof exports !== 'undefined' ) { exports . getView = function () { return viewData ; } } Right click on the Books folder and select New \u2192 Extension . Enter view.extension for the name of the Extension. Right click on view.extension and select Open With \u2192 Code Editor . Replace the content with the following code: { \"module\" : \"babylon-project/ui/Books/view.js\" , \"extensionPoint\" : \"books-view\" , \"description\" : \"Books - Application View\" } Note The index.html , controller.js , view.js and view.extension files should be located at the babylon-project/ui/Books folder. Dialog Right click on the babylon-project/ui/Books folder and select New \u2192 Folder . Enter dialog-window for the name of the folder. 
Create index.html , controller.js , view.js and view.extension as shown below: index.html controller.js view.js view.extension Right click on the dialog-window folder and select New \u2192 File . Enter index.html for the name of the file. Replace the content with the following code: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" dg-contextmenu = \"contextMenuContent\" > < fd-scrollbar class = \"dg-full-height\" > < div class = \"fd-margin--md fd-message-strip fd-message-strip--error fd-message-strip--dismissible\" role = \"alert\" ng-show = \"errorMessage\" > < p class = \"fd-message-strip__text\" > {{ errorMessage }} < fd-button glyph = \"sap-icon--decline\" compact = \"true\" dg-type = \"transparent\" aria-label = \"Close\" in-msg-strip = \"true\" ng-click = \"clearErrorMessage()\" > < fd-fieldset class = \"fd-margin--md\" ng-form = \"formFieldset\" > < fd-form-group dg-header = \"{{formHeaders[action]}}\" name = \"entityForm\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idisbn\" dg-required = \"false\" dg-colon = \"true\" > ISBN < fd-form-input-message-group dg-inactive = \"{{ formErrors.isbn ? false : true }}\" > < fd-input id = \"idisbn\" name = \"isbn\" state = \"{{ formErrors.isbn ? 
'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['isbn'].$valid, 'isbn')\" ng-model = \"entity.isbn\" ng-readonly = \"action === 'select'\" ng-minlength = \"0.0 || 0\" ng-maxlength = \"20.0 || -1\" dg-input-rules = \"{ patterns: [''] }\" type = \"text\" placeholder = \"Enter isbn\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idtitle\" dg-required = \"false\" dg-colon = \"true\" > Title < fd-form-input-message-group dg-inactive = \"{{ formErrors.title ? false : true }}\" > < fd-input id = \"idtitle\" name = \"title\" state = \"{{ formErrors.title ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['title'].$valid, 'title')\" ng-model = \"entity.title\" ng-readonly = \"action === 'select'\" ng-minlength = \"0.0 || 0\" ng-maxlength = \"20.0 || -1\" dg-input-rules = \"{ patterns: [''] }\" type = \"text\" placeholder = \"Enter title\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idpublisher\" dg-required = \"false\" dg-colon = \"true\" > Publisher < fd-form-input-message-group dg-inactive = \"{{ formErrors.publisher ? false : true }}\" > < fd-input id = \"idpublisher\" name = \"publisher\" state = \"{{ formErrors.publisher ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['publisher'].$valid, 'publisher')\" ng-model = \"entity.publisher\" ng-readonly = \"action === 'select'\" ng-minlength = \"0.0 || 0\" ng-maxlength = \"20.0 || -1\" dg-input-rules = \"{ patterns: [''] }\" type = \"text\" placeholder = \"Enter publisher\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"iddate\" dg-required = \"false\" dg-colon = \"true\" > Date < fd-form-input-message-group dg-inactive = \"{{ formErrors.date ? 
false : true }}\" > < fd-input id = \"iddate\" name = \"date\" state = \"{{ formErrors.date ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['date'].$valid, 'date')\" ng-model = \"entity.date\" ng-readonly = \"action === 'select'\" type = \"date\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idprice\" dg-required = \"false\" dg-colon = \"true\" > Price < fd-form-input-message-group dg-inactive = \"{{ formErrors.price ? false : true }}\" > < fd-input id = \"idprice\" name = \"price\" state = \"{{ formErrors.price ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['price'].$valid, 'price')\" ng-model = \"entity.price\" ng-readonly = \"action === 'select'\" type = \"number\" placeholder = \"Enter price\" > < fd-form-message dg-type = \"error\" > Incorrect Input < footer class = \"fd-dialog__footer fd-bar fd-bar--footer\" ng-show = \"action !== 'select'\" > < div class = \"fd-bar__right\" > < fd-button class = \"fd-margin-end--tiny fd-dialog__decisive-button\" compact = \"true\" dg-type = \"emphasized\" dg-label = \"{{action === 'create' ? 'Create' : 'Update'}}\" ng-click = \"action === 'create' ? create() : update()\" state = \"{{ !isFormValid ? 'disabled' : '' }}\" > < fd-button class = \"fd-dialog__decisive-button\" compact = \"true\" dg-type = \"transparent\" dg-label = \"Cancel\" ng-click = \"cancel()\" > Right click on the dialog-window folder and select New \u2192 File . Enter controller.js for the name of the file. Replace the content with the following code: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" , \"entityApi\" ]) . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = 'babylon-project.books.Books' ; }]) . config ([ \"entityApiProvider\" , function ( entityApiProvider ) { entityApiProvider . baseUrl = \"/services/ts/babylon-project/api/books.ts\" ; }]) . 
controller ( 'PageController' , [ '$scope' , 'messageHub' , 'entityApi' , function ( $scope , messageHub , entityApi ) { $scope . entity = {}; $scope . formHeaders = { select : \"Books Details\" , create : \"Create Books\" , update : \"Update Books\" }; $scope . formErrors = {}; $scope . action = 'select' ; if ( window != null && window . frameElement != null && window . frameElement . hasAttribute ( \"data-parameters\" )) { let dataParameters = window . frameElement . getAttribute ( \"data-parameters\" ); if ( dataParameters ) { let params = JSON . parse ( dataParameters ); $scope . action = params . action ; if ( $scope . action == \"create\" ) { $scope . formErrors = { }; } if ( params . entity . date ) { params . entity . date = new Date ( params . entity . date ); } $scope . entity = params . entity ; $scope . selectedMainEntityKey = params . selectedMainEntityKey ; $scope . selectedMainEntityId = params . selectedMainEntityId ; } } $scope . isValid = function ( isValid , property ) { $scope . formErrors [ property ] = ! isValid ? true : undefined ; for ( let next in $scope . formErrors ) { if ( $scope . formErrors [ next ] === true ) { $scope . isFormValid = false ; return ; } } $scope . isFormValid = true ; }; $scope . create = function () { let entity = $scope . entity ; entity [ $scope . selectedMainEntityKey ] = $scope . selectedMainEntityId ; entityApi . create ( entity ). then ( function ( response ) { if ( response . status != 201 ) { $scope . errorMessage = `Unable to create Books: ' ${ response . message } '` ; return ; } messageHub . postMessage ( \"entityCreated\" , response . data ); $scope . cancel (); messageHub . showAlertSuccess ( \"Books\" , \"Books successfully created\" ); }); }; $scope . update = function () { let id = $scope . entity . id ; let entity = $scope . entity ; entity [ $scope . selectedMainEntityKey ] = $scope . selectedMainEntityId ; entityApi . update ( id , entity ). then ( function ( response ) { if ( response . 
status != 200 ) { $scope . errorMessage = `Unable to update Books: ' ${ response . message } '` ; return ; } messageHub . postMessage ( \"entityUpdated\" , response . data ); $scope . cancel (); messageHub . showAlertSuccess ( \"Books\" , \"Books successfully updated\" ); }); }; $scope . cancel = function () { $scope . entity = {}; $scope . action = 'select' ; messageHub . closeDialogWindow ( \"Books-details\" ); }; $scope . clearErrorMessage = function () { $scope . errorMessage = null ; }; }]); Right click on the dialog-window folder and select New \u2192 File . Enter view.js for the name of the file. Replace the content with the following code: const viewData = { id : \"Books-details\" , label : \"Books\" , link : \"/services/web/babylon-project/ui/Books/dialog-window/index.html\" , perspectiveName : \"books\" }; if ( typeof exports !== 'undefined' ) { exports . getDialogWindow = function () { return viewData ; } } Right click on the dialog-window folder and select New \u2192 Extension . Enter view.extension for the name of the Extension. Right click on view.extension and select Open With \u2192 Code Editor . Replace the content with the following code: { \"module\" : \"babylon-project/ui/Books/dialog-window/view.js\" , \"extensionPoint\" : \"books-dialog-window\" , \"description\" : \"Books - Application Dialog Window\" } Note The index.html , controller.js , view.js and view.extension files should be located at the babylon-project/ui/Books/dialog-window folder. Publish and Preview (optional) Right click on the babylon-project project and select Publish . Select the babylon-project/ui/index.html in the Projects view In the Preview window you should see the web page for management of Books. Try to enter a few books to test how it works. 
Application URL The Bookstore Application is available at: http://localhost:8080/services/web/babylon-project/ui/ Summary Tutorial Completed After completing all steps in this tutorial, you would have: Extendable UI Perspective for the book-related views. Books View to display the books data. Books Dialog for modifying the books data. Note: The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"User Interface"},{"location":"tutorials/application-development/bookstore/ui/#bookstore-application-ui","text":"","title":"Bookstore Application - UI"},{"location":"tutorials/application-development/bookstore/ui/#overview","text":"This section shows how to create the User Interface layer for the Bookstore application. It contains a Books Perspective , View for displaying the data and Dialog for modifying the Books data.","title":"Overview"},{"location":"tutorials/application-development/bookstore/ui/#steps","text":"","title":"Steps"},{"location":"tutorials/application-development/bookstore/ui/#perspective","text":"Right click on the babylon-project project and select New \u2192 Folder . Enter ui for the name of the folder. Create index.html , perspective.js and perspective.extension as shown below: index.html perspective.js perspective.extension Right click on the ui folder and select New \u2192 File . Enter index.html for the name of the file.
Replace the content with the following code: < html lang = \"en\" ng-app = \"app\" ng-controller = \"ApplicationController\" xmlns = \"http://www.w3.org/1999/xhtml\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < script type = \"text/javascript\" src = \"perspective.js\" > < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-perspective-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-perspective-css\" /> < body > < ide-header menu-ext-id = \"books-menu\" > < ide-container > < ide-layout views-layout-model = \"layoutModel\" > < ide-dialogs > < ide-status-bar > < script type = \"text/javascript\" > angular . module ( 'app' , [ 'ngResource' , 'ideLayout' , 'ideUI' ]) . constant ( 'branding' , { name : 'Babylon' , brand : 'Eclipse Dirigible' , brandUrl : 'https://dirigible.io' , icons : { faviconIco : '/services/web/resources/images/favicon.ico' , favicon32 : '/services/web/resources/images/favicon-32x32.png' , favicon16 : '/services/web/resources/images/favicon-16x16.png' , }, logo : '/services/web/resources/images/dirigible.svg' , }) . constant ( 'extensionPoint' , { perspectives : \"books\" , views : \"books-view\" , dialogWindows : \"books-dialog-window\" }) . controller ( 'ApplicationController' , [ \"$scope\" , \"messageHub\" , function ( $scope , messageHub ) { const httpRequest = new XMLHttpRequest (); httpRequest . open ( \"GET\" , \"/services/js/resources-core/services/views.js?extensionPoint=books-view\" , false ); httpRequest . send (); $scope . layoutModel = { views : JSON . parse ( httpRequest . responseText ). filter ( e => ! e . isLaunchpad && e . perspectiveName === \"books\" ). 
map ( e => e . id ) }; }]); Right click on the ui folder and select New \u2192 File . Enter perspective.js for the name of the file. Replace the content with the following code: const perspectiveData = { id : \"books\" , name : \"books\" , link : \"/services/web/babylon-project/ui/index.html\" , order : \"100\" , icon : \"/services/web/resources/unicons/copy.svg\" , }; if ( typeof exports !== 'undefined' ) { exports . getPerspective = function () { return perspectiveData ; } } Right click on the ui folder and select New \u2192 Extension . Enter perspective.extension for the name of the Extension. Right click on perspective.extension and select Open With \u2192 Code Editor . Replace the content with the following code: { \"module\" : \"babylon-project/ui/perspective.js\" , \"extensionPoint\" : \"books\" , \"description\" : \"Books - Perspective\" } Note The index.html , perspective.js and perspective.extension files should be located at the babylon-project/ui folder.","title":"Perspective"},{"location":"tutorials/application-development/bookstore/ui/#view","text":"Right click on the babylon-project/ui folder and select New \u2192 Folder . Enter Books for the name of the folder. Create index.html , controller.js , view.js and view.extension as shown below: index.html controller.js view.js view.extension Right click on the Books folder and select New \u2192 File . Enter index.html for the name of the file. 
Replace the content with the following code: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" > < fd-toolbar has-title = \"true\" > < fd-toolbar-title > Items ({{dataCount}}) < fd-toolbar-spacer > < fd-button compact = \"true\" dg-type = \"transparent\" dg-label = \"Create\" ng-click = \"createEntity()\" > < fd-scrollbar class = \"dg-full-height\" ng-hide = \"data == null\" > < table fd-table display-mode = \"compact\" inner-borders = \"top\" outer-borders = \"none\" > < thead fd-table-header sticky = \"true\" > < tr fd-table-row > < th fd-table-header-cell > Title < th fd-table-header-cell > Publisher < th fd-table-header-cell > Date < th fd-table-header-cell > Price < th fd-table-header-cell > < tbody fd-table-body > < tr fd-table-row hoverable = \"true\" ng-show = \"data.length == 0\" > < td fd-table-cell no-data = \"true\" > No data available. 
< tr fd-table-row hoverable = \"true\" ng-repeat = \"next in data\" dg-selected = \"next.id === selectedEntity.id\" ng-click = \"selectEntity(next)\" > < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > {{next.title}} < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > {{next.publisher}} < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > < fd-input type = \"date\" ng-model = \"next.date\" ng-readonly = \"true\" > < td fd-table-cell ng-click = \"openDetails(next)\" hoverable = \"true\" activable = \"true\" > {{next.price}} < td fd-table-cell fit-content = \"true\" > < fd-popover > < fd-popover-control > < fd-button compact = \"true\" glyph = \"sap-icon--overflow\" dg-type = \"transparent\" aria-label = \"Table Row Menu Button\" ng-click = \"setTristate()\" > < fd-popover-body dg-align = \"bottom-right\" > < fd-menu aria-label = \"Table Row Menu\" no-backdrop = \"true\" no-shadow = \"true\" > < fd-menu-item title = \"View Details\" ng-click = \"openDetails(next)\" > < fd-menu-item title = \"Edit\" ng-click = \"updateEntity(next)\" > < fd-menu-item title = \"Delete\" ng-click = \"deleteEntity(next)\" > < fd-pagination total-items = \"dataCount\" items-per-page = \"dataLimit\" items-per-page-options = \"[10, 20, 50]\" page-change = \"loadPage(pageNumber)\" items-per-page-change = \"loadPage(pageNumber)\" items-per-page-placement = \"top-start\" compact = \"true\" display-total-items = \"true\" ng-hide = \"dataCount == 0\" > Right click on the Books folder and select New \u2192 File . Enter controller.js for the name of the file. Replace the content with the following code: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" , \"entityApi\" ]) . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = 'babylon-project.books.Books' ; }]) . 
config ([ \"entityApiProvider\" , function ( entityApiProvider ) { entityApiProvider . baseUrl = \"/services/ts/babylon-project/api/books.ts\" ; }]) . controller ( 'PageController' , [ '$scope' , 'messageHub' , 'entityApi' , function ( $scope , messageHub , entityApi ) { function resetPagination () { $scope . dataPage = 1 ; $scope . dataCount = 0 ; $scope . dataLimit = 20 ; } resetPagination (); //-----------------Events-------------------// messageHub . onDidReceiveMessage ( \"entityCreated\" , function ( msg ) { $scope . loadPage ( $scope . dataPage ); }); messageHub . onDidReceiveMessage ( \"entityUpdated\" , function ( msg ) { $scope . loadPage ( $scope . dataPage ); }); //-----------------Events-------------------// $scope . loadPage = function ( pageNumber ) { $scope . dataPage = pageNumber ; entityApi . count (). then ( function ( response ) { if ( response . status != 200 ) { messageHub . showAlertError ( \"Books\" , `Unable to count Books: ' ${ response . message } '` ); return ; } $scope . dataCount = parseInt ( response . data ); let offset = ( pageNumber - 1 ) * $scope . dataLimit ; let limit = $scope . dataLimit ; entityApi . list ( offset , limit ). then ( function ( response ) { if ( response . status != 200 ) { messageHub . showAlertError ( \"Books\" , `Unable to list Books: ' ${ response . message } '` ); return ; } response . data . forEach ( e => { if ( e . date ) { e . date = new Date ( e . date ); } }); $scope . data = response . data ; }); }); }; $scope . loadPage ( $scope . dataPage ); $scope . selectEntity = function ( entity ) { $scope . selectedEntity = entity ; }; $scope . openDetails = function ( entity ) { $scope . selectedEntity = entity ; messageHub . showDialogWindow ( \"Books-details\" , { action : \"select\" , entity : entity , }); }; $scope . createEntity = function () { $scope . selectedEntity = null ; messageHub . showDialogWindow ( \"Books-details\" , { action : \"create\" , entity : {}, }, null , false ); }; $scope . 
updateEntity = function ( entity ) { messageHub . showDialogWindow ( \"Books-details\" , { action : \"update\" , entity : entity , }, null , false ); }; $scope . deleteEntity = function ( entity ) { let id = entity . id ; messageHub . showDialogAsync ( 'Delete Books?' , `Are you sure you want to delete Books? This action cannot be undone.` , [{ id : \"delete-btn-yes\" , type : \"emphasized\" , label : \"Yes\" , }, { id : \"delete-btn-no\" , type : \"normal\" , label : \"No\" , }], ). then ( function ( msg ) { if ( msg . data === \"delete-btn-yes\" ) { entityApi . delete ( id ). then ( function ( response ) { if ( response . status != 204 ) { messageHub . showAlertError ( \"Books\" , `Unable to delete Books: ' ${ response . message } '` ); return ; } $scope . loadPage ( $scope . dataPage ); messageHub . postMessage ( \"clearDetails\" ); }); } }); }; }]); Right click on the Books folder and select New \u2192 File . Enter view.js for the name of the file. Replace the content with the following code: const viewData = { id : \"Books\" , label : \"Books\" , factory : \"frame\" , region : \"center\" , link : \"/services/web/babylon-project/ui/Books/index.html\" , perspectiveName : \"books\" }; if ( typeof exports !== 'undefined' ) { exports . getView = function () { return viewData ; } } Right click on the Books folder and select New \u2192 Extension . Enter view.extension for the name of the Extension. Right click on view.extension and select Open With \u2192 Code Editor . Replace the content with the following code: { \"module\" : \"babylon-project/ui/Books/view.js\" , \"extensionPoint\" : \"books-view\" , \"description\" : \"Books - Application View\" } Note The index.html , controller.js , view.js and view.extension files should be located at the babylon-project/ui/Books folder.","title":"View"},{"location":"tutorials/application-development/bookstore/ui/#dialog","text":"Right click on the babylon-project/ui/Books folder and select New \u2192 Folder . 
Enter dialog-window for the name of the folder. Create index.html , controller.js , view.js and view.extension as shown below: index.html controller.js view.js view.extension Right click on the dialog-window folder and select New \u2192 File . Enter index.html for the name of the file. Replace the content with the following code: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" dg-contextmenu = \"contextMenuContent\" > < fd-scrollbar class = \"dg-full-height\" > < div class = \"fd-margin--md fd-message-strip fd-message-strip--error fd-message-strip--dismissible\" role = \"alert\" ng-show = \"errorMessage\" > < p class = \"fd-message-strip__text\" > {{ errorMessage }} < fd-button glyph = \"sap-icon--decline\" compact = \"true\" dg-type = \"transparent\" aria-label = \"Close\" in-msg-strip = \"true\" ng-click = \"clearErrorMessage()\" > < fd-fieldset class = \"fd-margin--md\" ng-form = \"formFieldset\" > < fd-form-group dg-header = \"{{formHeaders[action]}}\" name = \"entityForm\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idisbn\" dg-required = \"false\" dg-colon = \"true\" > ISBN < fd-form-input-message-group dg-inactive = \"{{ formErrors.isbn ? false : true }}\" > < fd-input id = \"idisbn\" name = \"isbn\" state = \"{{ formErrors.isbn ? 
'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['isbn'].$valid, 'isbn')\" ng-model = \"entity.isbn\" ng-readonly = \"action === 'select'\" ng-minlength = \"0.0 || 0\" ng-maxlength = \"20.0 || -1\" dg-input-rules = \"{ patterns: [''] }\" type = \"text\" placeholder = \"Enter isbn\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idtitle\" dg-required = \"false\" dg-colon = \"true\" > Title < fd-form-input-message-group dg-inactive = \"{{ formErrors.title ? false : true }}\" > < fd-input id = \"idtitle\" name = \"title\" state = \"{{ formErrors.title ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['title'].$valid, 'title')\" ng-model = \"entity.title\" ng-readonly = \"action === 'select'\" ng-minlength = \"0.0 || 0\" ng-maxlength = \"20.0 || -1\" dg-input-rules = \"{ patterns: [''] }\" type = \"text\" placeholder = \"Enter title\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idpublisher\" dg-required = \"false\" dg-colon = \"true\" > Publisher < fd-form-input-message-group dg-inactive = \"{{ formErrors.publisher ? false : true }}\" > < fd-input id = \"idpublisher\" name = \"publisher\" state = \"{{ formErrors.publisher ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['publisher'].$valid, 'publisher')\" ng-model = \"entity.publisher\" ng-readonly = \"action === 'select'\" ng-minlength = \"0.0 || 0\" ng-maxlength = \"20.0 || -1\" dg-input-rules = \"{ patterns: [''] }\" type = \"text\" placeholder = \"Enter publisher\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"iddate\" dg-required = \"false\" dg-colon = \"true\" > Date < fd-form-input-message-group dg-inactive = \"{{ formErrors.date ? 
false : true }}\" > < fd-input id = \"iddate\" name = \"date\" state = \"{{ formErrors.date ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['date'].$valid, 'date')\" ng-model = \"entity.date\" ng-readonly = \"action === 'select'\" type = \"date\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idprice\" dg-required = \"false\" dg-colon = \"true\" > Price < fd-form-input-message-group dg-inactive = \"{{ formErrors.price ? false : true }}\" > < fd-input id = \"idprice\" name = \"price\" state = \"{{ formErrors.price ? 'error' : '' }}\" ng-required = \"false\" ng-change = \"isValid(formFieldset['price'].$valid, 'price')\" ng-model = \"entity.price\" ng-readonly = \"action === 'select'\" type = \"number\" placeholder = \"Enter price\" > < fd-form-message dg-type = \"error\" > Incorrect Input < footer class = \"fd-dialog__footer fd-bar fd-bar--footer\" ng-show = \"action !== 'select'\" > < div class = \"fd-bar__right\" > < fd-button class = \"fd-margin-end--tiny fd-dialog__decisive-button\" compact = \"true\" dg-type = \"emphasized\" dg-label = \"{{action === 'create' ? 'Create' : 'Update'}}\" ng-click = \"action === 'create' ? create() : update()\" state = \"{{ !isFormValid ? 'disabled' : '' }}\" > < fd-button class = \"fd-dialog__decisive-button\" compact = \"true\" dg-type = \"transparent\" dg-label = \"Cancel\" ng-click = \"cancel()\" > Right click on the dialog-window folder and select New \u2192 File . Enter controller.js for the name of the file. Replace the content with the following code: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" , \"entityApi\" ]) . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = 'babylon-project.books.Books' ; }]) . config ([ \"entityApiProvider\" , function ( entityApiProvider ) { entityApiProvider . baseUrl = \"/services/ts/babylon-project/api/books.ts\" ; }]) . 
controller ( 'PageController' , [ '$scope' , 'messageHub' , 'entityApi' , function ( $scope , messageHub , entityApi ) { $scope . entity = {}; $scope . formHeaders = { select : \"Books Details\" , create : \"Create Books\" , update : \"Update Books\" }; $scope . formErrors = {}; $scope . action = 'select' ; if ( window != null && window . frameElement != null && window . frameElement . hasAttribute ( \"data-parameters\" )) { let dataParameters = window . frameElement . getAttribute ( \"data-parameters\" ); if ( dataParameters ) { let params = JSON . parse ( dataParameters ); $scope . action = params . action ; if ( $scope . action == \"create\" ) { $scope . formErrors = { }; } if ( params . entity . date ) { params . entity . date = new Date ( params . entity . date ); } $scope . entity = params . entity ; $scope . selectedMainEntityKey = params . selectedMainEntityKey ; $scope . selectedMainEntityId = params . selectedMainEntityId ; } } $scope . isValid = function ( isValid , property ) { $scope . formErrors [ property ] = ! isValid ? true : undefined ; for ( let next in $scope . formErrors ) { if ( $scope . formErrors [ next ] === true ) { $scope . isFormValid = false ; return ; } } $scope . isFormValid = true ; }; $scope . create = function () { let entity = $scope . entity ; entity [ $scope . selectedMainEntityKey ] = $scope . selectedMainEntityId ; entityApi . create ( entity ). then ( function ( response ) { if ( response . status != 201 ) { $scope . errorMessage = `Unable to create Books: ' ${ response . message } '` ; return ; } messageHub . postMessage ( \"entityCreated\" , response . data ); $scope . cancel (); messageHub . showAlertSuccess ( \"Books\" , \"Books successfully created\" ); }); }; $scope . update = function () { let id = $scope . entity . id ; let entity = $scope . entity ; entity [ $scope . selectedMainEntityKey ] = $scope . selectedMainEntityId ; entityApi . update ( id , entity ). then ( function ( response ) { if ( response . 
status != 200 ) { $scope . errorMessage = `Unable to update Books: ' ${ response . message } '` ; return ; } messageHub . postMessage ( \"entityUpdated\" , response . data ); $scope . cancel (); messageHub . showAlertSuccess ( \"Books\" , \"Books successfully updated\" ); }); }; $scope . cancel = function () { $scope . entity = {}; $scope . action = 'select' ; messageHub . closeDialogWindow ( \"Books-details\" ); }; $scope . clearErrorMessage = function () { $scope . errorMessage = null ; }; }]); Right click on the dialog-window folder and select New \u2192 File . Enter view.js for the name of the file. Replace the content with the following code: const viewData = { id : \"Books-details\" , label : \"Books\" , link : \"/services/web/babylon-project/ui/Books/dialog-window/index.html\" , perspectiveName : \"books\" }; if ( typeof exports !== 'undefined' ) { exports . getDialogWindow = function () { return viewData ; } } Right click on the dialog-window folder and select New \u2192 Extension . Enter view.extension for the name of the Extension. Right click on view.extension and select Open With \u2192 Code Editor . Replace the content with the following code: { \"module\" : \"babylon-project/ui/Books/dialog-window/view.js\" , \"extensionPoint\" : \"books-dialog-window\" , \"description\" : \"Books - Application Dialog Window\" } Note The index.html , controller.js , view.js and view.extension files should be located at the babylon-project/ui/Books/dialog-window folder.","title":"Dialog"},{"location":"tutorials/application-development/bookstore/ui/#publish-and-preview","text":"(optional) Right click on the babylon-project project and select Publish . Select the babylon-project/ui/index.html in the Projects view In the Preview window you should see the web page for management of Books. Try to enter a few books to test how it works. 
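The dialog controller's isValid function keeps a map of per-field errors and derives an overall form-valid flag from it. A standalone TypeScript sketch of that aggregation logic (the names FormErrors and recordValidity are illustrative, not part of the generated project):

```typescript
// Mirror of the dialog controller's check: a failing field records `true`
// in the error map, a passing field clears its entry, and the form is
// valid only while no field has a recorded error.
type FormErrors = { [field: string]: boolean | undefined };

function recordValidity(errors: FormErrors, field: string, isValid: boolean): boolean {
    errors[field] = !isValid ? true : undefined;
    for (const next in errors) {
        if (errors[next] === true) {
            return false;
        }
    }
    return true;
}
```

Marking a single field such as title invalid flips the whole form to invalid until that field validates again, which is what drives the disabled state of the Create/Update button.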
Application URL The Bookstore Application is available at: http://localhost:8080/services/web/babylon-project/ui/","title":"Publish and Preview"},{"location":"tutorials/application-development/bookstore/ui/#summary","text":"Tutorial Completed After completing all steps in this tutorial, you would have: Extendable UI Perspective for the book-related views. Books View to display the books data. Books Dialog for modifying the books data. Note: The complete content of the Bookstore tutorial is available at: https://github.com/dirigiblelabs/tutorial-babylon-project","title":"Summary"},{"location":"tutorials/application-development/scheduled-job/","text":"Scheduled Job Overview This sample shows how to create a simple application with a scheduled job for log events. It contains a Database Table to store log events, a helper Logger class to create log events, a Job Definition and a Job Handler . Sections Database Table Job Handler Job Definition Note The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Scheduled Job"},{"location":"tutorials/application-development/scheduled-job/#scheduled-job","text":"","title":"Scheduled Job"},{"location":"tutorials/application-development/scheduled-job/#overview","text":"This sample shows how to create a simple application with a scheduled job for log events.
It contains a Database Table to store log events, a helper Logger class to create log events, a Job Definition and a Job Handler .","title":"Overview"},{"location":"tutorials/application-development/scheduled-job/#sections","text":"Database Table Job Handler Job Definition Note The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Sections"},{"location":"tutorials/application-development/scheduled-job/database/","text":"Scheduled Job - Database Overview This section shows how to create the database table for the Scheduled Job application. Steps Database Table Navigate to the Database Perspective . In the SQL View enter the following script: create table LOG_EVENTS ( LOG_ID integer primary key auto_increment , LOG_SEVERITY varchar ( 16 ), LOG_MESSAGE varchar ( 120 ), LOG_TIMESTAMP timestamp ); Press the Run icon to execute the SQL script. Keyboard Shortcut Press Ctrl + X for Windows, Cmd + X for macOS to execute the SQL script. Note: You can execute all or part of the SQL scripts in the SQL View by making a selection and pressing the Run icon or the keyboard shortcut. Press the Refresh button to see the LOG_EVENTS table. Table Content Right click on the LOG_EVENTS table and select Show Contents . The table data would be displayed in the Result View . As the table is empty, there should be no data: Next Steps Section Completed After completing the steps in this tutorial, you would: Have a database table named LOG_EVENTS . Be familiar with the Database Perspective , the SQL View and the Result View . Continue to the Job Handler section to create a Job Handler that would be executed by the Scheduled Job .
Note: The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Database"},{"location":"tutorials/application-development/scheduled-job/database/#scheduled-job-database","text":"","title":"Scheduled Job - Database"},{"location":"tutorials/application-development/scheduled-job/database/#overview","text":"This section shows how to create the database table for the Scheduled Job application.","title":"Overview"},{"location":"tutorials/application-development/scheduled-job/database/#steps","text":"","title":"Steps"},{"location":"tutorials/application-development/scheduled-job/database/#database-table","text":"Navigate to the Database Perspective . In the SQL View enter the following script: create table LOG_EVENTS ( LOG_ID integer primary key auto_increment , LOG_SEVERITY varchar ( 16 ), LOG_MESSAGE varchar ( 120 ), LOG_TIMESTAMP timestamp ); Press the Run icon to execute the SQL script. Keyboard Shortcut Press Ctrl + X for Windows, Cmd + X for macOS to execute the SQL script. Note: You can execute all or part of the SQL scripts in the SQL View by making a selection and pressing the Run icon or the keyboard shortcut. Press the Refresh button to see the LOG_EVENTS table.","title":"Database Table"},{"location":"tutorials/application-development/scheduled-job/database/#table-content","text":"Right click on the LOG_EVENTS table and select Show Contents . The table data would be displayed in the Result View . As the table is empty, there should be no data:","title":"Table Content"},{"location":"tutorials/application-development/scheduled-job/database/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would: Have a database table named LOG_EVENTS . Be familiar with the Database Perspective , the SQL View and the Result View . Continue to the Job Handler section to create a Job Handler that would be executed by the Scheduled Job .
Note: The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Next Steps"},{"location":"tutorials/application-development/scheduled-job/handler/","text":"Scheduled Job - Job Handler Overview This section shows how to create a helper Logger class to create log events and a Job Handler that would be executed by the Scheduled Job . Steps Logger Right click on the scheduled-job-project project and select New \u2192 TypeScript Service . Enter Logger.ts for the name of the TypeScript Service. Replace the content with the following code: import { update } from \"sdk/db\" ; export enum LogDataSeverity { INFO = 'Info' , WARNING = 'Warning' , ERROR = 'Error' } export interface LogData { readonly date : Date ; readonly severity : LogDataSeverity ; readonly message : string ; } export class Logger { public static log ( logData : LogData ) { Logger . saveLogEvent ( logData ); const message = `---> [ ${ logData . severity } ] [ ${ Logger . toDateString ( logData . date ) } ]: ${ logData . message } <---` ; switch ( logData . severity ) { case LogDataSeverity.INFO : console.info ( message ); break ; case LogDataSeverity.WARNING : console.warn ( message ); break ; case LogDataSeverity.ERROR : console.error ( message ); break ; } } private static saveLogEvent ( logData : LogData ) { const sql = `insert into LOG_EVENTS (\"LOG_SEVERITY\", \"LOG_MESSAGE\", \"LOG_TIMESTAMP\") values (?, ?, ?)` ; const queryParameters = [ logData . severity , logData . message , logData . date ]; update . execute ( sql , queryParameters , null ); } private static toDateString ( date : Date ) : string { return ` ${ date . toLocaleDateString () } ; ${ date . toLocaleTimeString () } ` ; } } db/update Take a look at the db/update documentation for more details about the API. Logger Right click on the scheduled-job-project project and select New \u2192 JavaScript ESM Service .
Enter handler.mjs for the name of the JavaScript Service. Replace the content with the following code: import { Logger , LogDataSeverity } from './Logger' ; const logData = [{ date : new Date (), severity : LogDataSeverity . INFO , message : 'Success feels so good!' }, { date : new Date (), severity : LogDataSeverity . INFO , message : 'You made it!' }, { date : new Date (), severity : LogDataSeverity . WARNING , message : 'Open Sesame!' }, { date : new Date (), severity : LogDataSeverity . WARNING , message : 'Password updated!' }, { date : new Date (), severity : LogDataSeverity . ERROR , message : 'Welcome to the dark side!' }, { date : new Date (), severity : LogDataSeverity . INFO , message : 'So glad you are back!' }]; const randomIndex = Math . floor ( Math . random () * logData . length ); Logger . log ( logData [ randomIndex ]); Navigate to the Database Perspective to check that there is a record in the LOG_EVENTS table. Save & Publish Saving the file will trigger a Publish action, which will build and deploy the JavaScript and TypeScript services. The handler.mjs service would be executed by the Preview view. As it's expected to be executed by a Scheduled Job and not by an HTTP Request, nothing would be displayed in the Preview view; however, the log event data would be inserted into the LOG_EVENTS table. JavaScript ESM Handler At the time of writing the tutorial, it was not possible to create a TypeScript handler for the Scheduled Job . Next Steps Section Completed After completing the steps in this tutorial, you would have: A Job Handler and a Logger class to create log events. At least one record in the LOG_EVENTS table. Continue to the Job Definition section to create a Scheduled Job that would trigger the Job Handler .
Note: The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Job Handler"},{"location":"tutorials/application-development/scheduled-job/handler/#scheduled-job-job-handler","text":"","title":"Scheduled Job - Job Handler"},{"location":"tutorials/application-development/scheduled-job/handler/#overview","text":"This section shows how to create a helper Logger class to create log events and a Job Handler that would be executed by the Scheduled Job .","title":"Overview"},{"location":"tutorials/application-development/scheduled-job/handler/#steps","text":"","title":"Steps"},{"location":"tutorials/application-development/scheduled-job/handler/#logger","text":"Right click on the scheduled-job-project project and select New \u2192 TypeScript Service . Enter Logger.ts for the name of the TypeScript Service. Replace the content with the following code: import { update } from \"sdk/db\" ; export enum LogDataSeverity { INFO = 'Info' , WARNING = 'Warning' , ERROR = 'Error' } export interface LogData { readonly date : Date ; readonly severity : LogDataSeverity ; readonly message : string ; } export class Logger { public static log ( logData : LogData ) { Logger . saveLogEvent ( logData ); const message = `---> [ ${ logData . severity } ] [ ${ Logger . toDateString ( logData . date ) } ]: ${ logData . message } <---` ; switch ( logData . severity ) { case LogDataSeverity.INFO : console.info ( message ); break ; case LogDataSeverity.WARNING : console.warn ( message ); break ; case LogDataSeverity.ERROR : console.error ( message ); break ; } } private static saveLogEvent ( logData : LogData ) { const sql = `insert into LOG_EVENTS (\"LOG_SEVERITY\", \"LOG_MESSAGE\", \"LOG_TIMESTAMP\") values (?, ?, ?)` ; const queryParameters = [ logData . severity , logData . message , logData . date ]; update .
execute ( sql , queryParameters , null ); } private static toDateString ( date : Date ) : string { return ` ${ date . toLocaleDateString () } ; ${ date . toLocaleTimeString () } ` ; } } db/update Take a look at the db/update documentation for more details about the API.","title":"Logger"},{"location":"tutorials/application-development/scheduled-job/handler/#logger_1","text":"Right click on the scheduled-job-project project and select New \u2192 JavaScript ESM Service . Enter handler.mjs for the name of the JavaScript Service. Replace the content with the following code: import { Logger , LogDataSeverity } from './Logger' ; const logData = [{ date : new Date (), severity : LogDataSeverity . INFO , message : 'Success feels so good!' }, { date : new Date (), severity : LogDataSeverity . INFO , message : 'You made it!' }, { date : new Date (), severity : LogDataSeverity . WARNING , message : 'Open Sesame!' }, { date : new Date (), severity : LogDataSeverity . WARNING , message : 'Password updated!' }, { date : new Date (), severity : LogDataSeverity . ERROR , message : 'Welcome to the dark side!' }, { date : new Date (), severity : LogDataSeverity . INFO , message : 'So glad you are back!' }]; const randomIndex = Math . floor ( Math . random () * logData . length ); Logger . log ( logData [ randomIndex ]); Navigate to the Database Perspective to check that there is a record in the LOG_EVENTS table. Save & Publish Saving the file will trigger a Publish action, which will build and deploy the JavaScript and TypeScript services. The handler.mjs service would be executed by the Preview view. As it's expected to be executed by a Scheduled Job and not by an HTTP Request, nothing would be displayed in the Preview view; however, the log event data would be inserted into the LOG_EVENTS table.
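The handler picks one of the six log entries with Math.floor(Math.random() * logData.length). A minimal sketch of why this idiom always yields a valid array index (the helper name randomIndex is illustrative):

```typescript
// Math.random() returns a value in [0, 1), so the product lies in
// [0, length) and flooring it gives an integer in 0 .. length - 1 -
// always a valid index into the array.
function randomIndex(length: number): number {
    return Math.floor(Math.random() * length);
}
```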
JavaScript ESM Handler At the time of writing the tutorial, it was not possible to create a TypeScript handler for the Scheduled Job .","title":"Logger"},{"location":"tutorials/application-development/scheduled-job/handler/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would have: A Job Handler and a Logger class to create log events. At least one record in the LOG_EVENTS table. Continue to the Job Definition section to create a Scheduled Job that would trigger the Job Handler . Note: The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Next Steps"},{"location":"tutorials/application-development/scheduled-job/job/","text":"Scheduled Job - Job Definition Overview This section shows how to create and manage a Job Definition for the Scheduled Job application. Steps Job Definition Right click on the scheduled-job-project project and select New \u2192 Scheduled Job . Enter log.job for the name of the Scheduled Job. Right click on log.job and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"expression\" : \"0/10 * * * * ?\" , \"group\" : \"dirigible-defined\" , \"handler\" : \"scheduled-job-project/handler.mjs\" , \"description\" : \"Scheduled Log Job\" , \"parameters\" : [ { \"name\" : \"severity\" , \"type\" : \"choice\" , \"defaultValue\" : \"\" , \"choices\" : \"Info,Warning,Error\" , \"description\" : \"The log severity\" }, { \"name\" : \"message\" , \"type\" : \"string\" , \"defaultValue\" : \"\" , \"description\" : \"The log message\" } ] } Double click on log.job to open it with the Job Editor . Save & Publish Saving the file will trigger a Publish action that would schedule the job. As defined by the expression ( 0/10 * * * * ? ) , the job handler would be executed every 10 seconds and data would be inserted into the LOG_EVENTS table.
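The expression 0/10 * * * * ? follows the Quartz cron format: seconds, minutes, hours, day-of-month, month, day-of-week. Its seconds field, 0/10, means "start at 0 and repeat every 10 units", so the handler fires at seconds 0, 10, 20, 30, 40 and 50 of every minute. A simplified TypeScript sketch of just that start/step rule (not a full Quartz parser; the function name matchesStepField is illustrative):

```typescript
// Evaluate a Quartz-style "start/step" field such as "0/10" against a
// concrete value: the value matches when it is at or past the start and
// a whole number of steps away from it.
function matchesStepField(field: string, value: number): boolean {
    const [start, step] = field.split("/").map(Number);
    return (value - start) >= 0 && (value - start) % step === 0;
}
```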
Log Events Data Navigate to the Database Perspective to check that there are new records in the LOG_EVENTS table. You can notice in the LOG_TIMESTAMP column that the last records are 10 seconds apart. Manage Jobs Navigate to the Jobs Perspective to see a list of the Scheduled Jobs on the instance. Click on the Enable/Disable icon to stop the log Scheduled Job. Navigate to the Database Perspective to check that there are no new records in the LOG_EVENTS table after the job was disabled. Go back to the Jobs Perspective . Click on the Trigger icon and then on the Trigger button to start a new Job Execution . Force Trigger This action would instantly trigger the Job Handler without respecting the Job Schedule Expression or whether the Job Schedule is enabled or disabled. Navigate back to the Database Perspective to check that there was a new record added in the LOG_EVENTS table even though the job was disabled. Summary Tutorial Completed After completing all steps in this tutorial, you would: Have a Scheduled Job . New records in the LOG_EVENTS table. Experience with the Jobs Perspective . Note: The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Scheduled Job"},{"location":"tutorials/application-development/scheduled-job/job/#scheduled-job-job-definition","text":"","title":"Scheduled Job - Job Definition"},{"location":"tutorials/application-development/scheduled-job/job/#overview","text":"This section shows how to create and manage a Job Definition for the Scheduled Job application.","title":"Overview"},{"location":"tutorials/application-development/scheduled-job/job/#steps","text":"","title":"Steps"},{"location":"tutorials/application-development/scheduled-job/job/#job-definition","text":"Right click on the scheduled-job-project project and select New \u2192 Scheduled Job . Enter log.job for the name of the Scheduled Job. Right click on log.job and select Open With \u2192 Code Editor .
Replace the content with the following definition: { \"expression\" : \"0/10 * * * * ?\" , \"group\" : \"dirigible-defined\" , \"handler\" : \"scheduled-job-project/handler.mjs\" , \"description\" : \"Scheduled Log Job\" , \"parameters\" : [ { \"name\" : \"severity\" , \"type\" : \"choice\" , \"defaultValue\" : \"\" , \"choices\" : \"Info,Warning,Error\" , \"description\" : \"The log severity\" }, { \"name\" : \"message\" , \"type\" : \"string\" , \"defaultValue\" : \"\" , \"description\" : \"The log message\" } ] } Double click on log.job to open it with the Job Editor . Save & Publish Saving the file will trigger a Publish action that schedules the job. As defined by the expression ( 0/10 * * * * ? ) , the job handler would be executed every 10 seconds and data would be inserted into the LOG_EVENTS table.","title":"Job Definition"},{"location":"tutorials/application-development/scheduled-job/job/#log-events-data","text":"Navigate to the Database Perspective to check that there are new records in the LOG_EVENTS table. You can notice in the LOG_TIMESTAMP column that the last records are 10 seconds apart.","title":"Log Events Data"},{"location":"tutorials/application-development/scheduled-job/job/#manage-jobs","text":"Navigate to the Jobs Perspective to see a list of the Scheduled Jobs on the instance. Click on the Enable/Disable icon to stop the log Scheduled Job. Navigate to the Database Perspective to check that there are no new records in the LOG_EVENTS table after the job was disabled. Go back to the Jobs Perspective . Click on the Trigger icon and then on the Trigger button to start a new Job Execution . Force Trigger This action would instantly trigger the Job Handler without respecting the Job Schedule Expression or whether the Job Schedule is enabled or disabled. 
Navigate back to the Database Perspective to check that there was a new record added in the LOG_EVENTS table after the job was disabled.","title":"Manage Jobs"},{"location":"tutorials/application-development/scheduled-job/job/#summary","text":"Tutorial Completed After completing all steps in this tutorial, you would have: A Scheduled Job . New records in the LOG_EVENTS table. Experience with the Jobs Perspective . Note: The complete content of the Scheduled Job tutorial is available at: https://github.com/dirigiblelabs/tutorial-scheduled-job-project","title":"Summary"},{"location":"tutorials/customizations/custom-stack/","text":"Custom Stack Overview This tutorial will guide you through the creation of a custom Eclipse Dirigible stack. Sections Project Structure Branding Facade Advanced Facade Dependency Note The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Custom Stack"},{"location":"tutorials/customizations/custom-stack/#custom-stack","text":"","title":"Custom Stack"},{"location":"tutorials/customizations/custom-stack/#overview","text":"This tutorial will guide you through the creation of a custom Eclipse Dirigible stack.","title":"Overview"},{"location":"tutorials/customizations/custom-stack/#sections","text":"Project Structure Branding Facade Advanced Facade Dependency Note The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Sections"},{"location":"tutorials/customizations/custom-stack/advanced-facade/","text":"Custom Stack - Advanced Facade Overview This section will guide you through the different ways of creating a TypeScript API for the Eclipse Dirigible Custom Stack. Prerequisites Node.js 18+ - Node.js versions can be found here esbuild 0.19+ - esbuild versions can be found here tsc 5.2+ - tsc versions can be found here The Facade section is completed. 
Steps Create Java Facade Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/java/io/dirigible/samples/ folder. Create Example.java , SubExample.java , ExampleRequest.java , ExampleResponse.java and ExampleService.java files. Example.java SubExample.java ExampleRequest.java ExampleResponse.java ExampleService.java Create new apis/src/main/java/io/dirigible/samples/api/domain/Example.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/Example.java package io.dirigible.samples.api.domain ; import java.util.ArrayList ; import java.util.List ; public class Example { private String id ; private String name ; private List < SubExample > subexamples = new ArrayList <> (); public String getId () { return id ; } public String getName () { return name ; } public List < SubExample > getSubexamples () { return subexamples ; } public void setId ( String id ) { this . id = id ; } public void setName ( String name ) { this . name = name ; } public void setSubexamples ( List < SubExample > subexamples ) { this . subexamples = subexamples ; } public Example withId ( String id ) { setId ( id ); return this ; } public Example withName ( String name ) { setName ( name ); return this ; } public Example withSubexamples ( List < SubExample > subexamples ) { setSubexamples ( subexamples ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/domain/SubExample.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/SubExample.java package io.dirigible.samples.api.domain ; import java.util.Date ; public class SubExample { private Date date ; public Date getDate () { return date ; } public void setDate ( Date date ) { this . date = date ; } public SubExample withDate ( Date date ) { setDate ( date ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/domain/input/ExampleRequest.java file. 
Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/input/ExampleRequest.java package io.dirigible.samples.api.domain.input ; public class ExampleRequest { private String exampleId ; private String exampleName ; public String getExampleId () { return exampleId ; } public void setExampleId ( String exampleId ) { this . exampleId = exampleId ; } public String getExampleName () { return exampleName ; } public void setExampleName ( String exampleName ) { this . exampleName = exampleName ; } public ExampleRequest withExampleId ( String exampleId ) { setExampleId ( exampleId ); return this ; } public ExampleRequest withExampleName ( String exampleName ) { setExampleName ( exampleName ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/domain/output/ExampleResponse.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/output/ExampleResponse.java package io.dirigible.samples.api.domain.output ; import java.util.ArrayList ; import java.util.List ; import io.dirigible.samples.api.domain.Example ; public class ExampleResponse { private List < Example > examples = new ArrayList <> (); public List < Example > getExamples () { return examples ; } public void setExamples ( List < Example > examples ) { this . examples = examples ; } public ExampleResponse withExamples ( List < Example > examples ) { setExamples ( examples ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/service/ExampleService.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/service/ExampleService.java package io.dirigible.samples.api.service ; import io.dirigible.samples.api.domain.input.ExampleRequest ; import io.dirigible.samples.api.domain.output.ExampleResponse ; public interface ExampleService { ExampleResponse doExample ( ExampleRequest request ); } Create TypeScript API Navigate to the root folder of the custom stack (e.g. /custom-stack ) . 
Navigate to the apis/src/main/resources/META-INF/dirigible/custom-api/ folder. Create Example.ts , SubExample.ts , ExampleRequest.ts and ExampleResponse.ts files. Note The TypeScript files are 1:1 representation of the Java classes. They have the same methods, signature and logic as the Java classes. All TypeScript files are in the custom-api folder and don't follow the Java packages nesting, just for simplicity. Example.ts SubExample.ts ExampleRequest.ts ExampleResponse.ts Create new apis/src/main/resources/META-INF/dirigible/custom-api/Example.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/Example.ts import { SubExample } from \"./SubExample\" ; export class Example { // @ts-ignore private id : string ; // @ts-ignore private name : string ; // @ts-ignore private subexamples : SubExample [] = []; public getId () : string { return this . id ; } public getName () : string { return this . name ; } public getSubexamples () : SubExample [] { return this . subexamples ; } public setId ( id : string ) : void { this . id = id ; } public setName ( name : string ) : void { this . name = name ; } public setSubexamples ( subexamples : SubExample []) : void { this . subexamples = subexamples ; } public withId ( id : string ) : Example { this . setId ( id ); return this ; } public withName ( name : string ) : Example { this . setName ( name ); return this ; } public withSubexamples ( subexamples : SubExample []) : Example { this . setSubexamples ( subexamples ); return this ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/SubExample.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/SubExample.ts export class SubExample { // @ts-ignore private date : Date ; public getDate () : Date { return this . date ; } public setDate ( date : Date ) : void { this . date = date ; } public withDate ( date : Date ) : SubExample { this . 
setDate ( date ); return this ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequest.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequest.ts export class ExampleRequest { // @ts-ignore private exampleId : string ; // @ts-ignore private exampleName : string ; public getExampleId () : string { return this . exampleId ; } public setExampleId ( exampleId : string ) : void { this . exampleId = exampleId ; } public getExampleName () : string { return this . exampleName ; } public setExampleName ( exampleName : string ) : void { this . exampleName = exampleName ; } public withExampleId ( exampleId : string ) : ExampleRequest { this . setExampleId ( exampleId ); return this ; } public withExampleName ( exampleName : string ) : ExampleRequest { this . setExampleName ( exampleName ); return this ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponse.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponse.ts import { Example } from \"./Example\" ; export class ExampleResponse { private examples : Example [] = []; public getExamples () : Example [] { return this . examples ; } public setExamples ( examples : Example []) : void { this . examples = examples ; } public withExamples ( examples : Example []) : ExampleResponse { this . setExamples ( examples ); return this ; } } Create Java Client Facade Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/java/io/dirigible/samples/ folder. Create ExampleClient.java and ExampleClientV2.java files. ExampleClient.java vs ExampleClientV2.java There is a difference in the method signature of the ExampleClient and the ExampleClientV2 classes. Although they have the same functionality, there is a difference in the input parameter type and the return type . 
In ExampleClient : public ExampleResponse doExample ( ExampleRequest request ) In ExampleClientV2 : public String doExample ( String requestAsString ) The ExampleClientV2 accepts a String input parameter instead of ExampleRequest and also returns a String instead of ExampleResponse . Inside the implementation, Gson is used to parse and to stringify the JSON representation of the ExampleRequest and the ExampleResponse . This technique is used to simplify the integration between the Java facade and the TypeScript API. ExampleClient.java ExampleClientV2.java Create new apis/src/main/java/io/dirigible/samples/api/client/ExampleClient.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/client/ExampleClient.java package io.dirigible.samples.api.client ; import java.util.Date ; import com.google.gson.Gson ; import io.dirigible.samples.api.domain.Example ; import io.dirigible.samples.api.domain.SubExample ; import io.dirigible.samples.api.domain.input.ExampleRequest ; import io.dirigible.samples.api.domain.output.ExampleResponse ; import io.dirigible.samples.api.service.ExampleService ; public class ExampleClient implements ExampleService { @Override public ExampleResponse doExample ( ExampleRequest request ) { final var exampleResponse = new ExampleResponse (); final var subexample = new SubExample (). withDate ( new Date ()); final var example = new Example (). withId ( request . getExampleId ()). withName ( \"Example Name\" ); example . getSubexamples (). add ( subexample ); exampleResponse . getExamples (). add ( example ); return exampleResponse ; } } Create new apis/src/main/java/io/dirigible/samples/api/client/ExampleClientV2.java file. 
Paste the following content: apis/src/main/java/io/dirigible/samples/api/client/ExampleClientV2.java package io.dirigible.samples.api.client ; import java.util.Date ; import com.google.gson.Gson ; import io.dirigible.samples.api.domain.Example ; import io.dirigible.samples.api.domain.SubExample ; import io.dirigible.samples.api.domain.input.ExampleRequest ; import io.dirigible.samples.api.domain.output.ExampleResponse ; public class ExampleClientV2 { public String doExample ( String requestAsString ) { final var gson = new Gson (); final var request = gson . fromJson ( requestAsString , ExampleRequest . class ); final var exampleResponse = new ExampleResponse (); final var subexample = new SubExample (). withDate ( new Date ()); final var example = new Example (). withId ( request . getExampleId ()). withName ( \"Example Name\" ); example . getSubexamples (). add ( subexample ); exampleResponse . getExamples (). add ( example ); return gson . toJson ( exampleResponse ); } } Create TypeScript API Client Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/resources/META-INF/dirigible/custom-api/ folder. Create ExampleClient.ts , ExampleClientV2.ts , ExampleRequestV2.ts and ExampleResponseV2.ts files. ExampleClient.ts vs ExampleClientV2.ts The ExampleClient uses the native Java objects, so it has to follow the Java way of creation of objects and assigning properties. The ExampleClientV2 uses TypeScript interfaces that represent the Java classes (see ExampleRequestV2.ts and ExampleResponseV2.ts ) to follow the TypeScript way of creation of objects and assigning properties. ExampleClient.ts ExampleClientV2.ts ExampleRequestV2.ts ExampleResponseV2.ts Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClient.ts file. 
Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClient.ts import { ExampleResponse } from \"./ExampleResponse\" ; import { ExampleRequest } from \"./ExampleRequest\" ; import { Example } from \"./Example\" ; import { SubExample } from \"./SubExample\" ; const ExampleClientClass = Java . type ( \"io.dirigible.samples.api.client.ExampleClient\" ); const ExampleRequestClass = Java . type ( \"io.dirigible.samples.api.domain.input.ExampleRequest\" ); export class ExampleClient { public doExample ( request : ExampleRequest ) : ExampleResponse { const requestObj = new ExampleRequestClass (); requestObj . setExampleId ( request . getExampleId ()); requestObj . setExampleName ( request . getExampleName ()); const responseObj = new ExampleClientClass (). doExample ( requestObj ); const examples : Example [] = []; for ( const exampleObj of responseObj . getExamples ()) { const example = new Example (); const subExamples : SubExample [] = []; example . setId ( exampleObj . getId ()); example . setName ( exampleObj . getName ()); for ( const subexampleObj of exampleObj . getSubexamples ()) { const subexample = new SubExample (); subexample . setDate ( subexampleObj . getDate ()); subExamples . push ( subexample ); } example . setSubexamples ( subExamples ) examples . push ( example ); } const response = new ExampleResponse (); response . setExamples ( examples ); return response ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClientV2.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClientV2.ts import { ExampleResponseV2 } from \"./ExampleResponseV2\" ; import { ExampleRequestV2 } from \"./ExampleRequestV2\" ; const ExampleClientV2Class = Java . type ( \"io.dirigible.samples.api.client.ExampleClientV2\" ); export class ExampleClientV2 { public doExample ( request : ExampleRequestV2 ) : ExampleResponseV2 { const response = new ExampleClientV2Class (). 
doExample ( JSON . stringify ( request )); return JSON . parse ( response ); } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequestV2.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequestV2.ts export interface ExampleRequestV2 { readonly exampleId : string ; readonly exampleName : string ; } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponseV2.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponseV2.ts export interface SubExampleV2 { readonly date : Date ; } export interface ExampleV2 { readonly id : string ; readonly name : string ; readonly subexamples : SubExampleV2 []; } export interface ExampleResponseV2 { readonly examples : ExampleV2 []; } Build the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install Run the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack . Test the Advanced TypeScript API Create a project named demo-application . Right click on the demo-application project and select New \u2192 TypeScript Service . Enter demo-client.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; import { ExampleClient } from \"custom-api/ExampleClient\" ; import { ExampleRequest } from \"custom-api/ExampleRequest\" ; const exampleRequest = new ExampleRequest (); exampleRequest . setExampleId ( 'example-id-1234' ); exampleRequest . 
setExampleName ( 'Custom Stack Example' ); const exampleClient = new ExampleClient (); const exampleResponse = exampleClient . doExample ( exampleRequest ); response . println ( JSON . stringify ( exampleResponse , null , 2 )); Save the changes. Enter demo-client-v2.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; import { ExampleClientV2 } from \"custom-api/ExampleClientV2\" ; import { ExampleRequestV2 } from \"custom-api/ExampleRequestV2\" ; const exampleRequest : ExampleRequestV2 = { exampleId : 'example-id-1234' , exampleName : 'Custom Stack Example' }; const exampleClient = new ExampleClientV2 (); const exampleResponse = exampleClient . doExample ( exampleRequest ); response . println ( JSON . stringify ( exampleResponse , null , 2 )); Save the changes. Right click on the demo-application project and select New \u2192 File . Enter tsconfig.json for the name of the File. { \"compilerOptions\" : { \"module\" : \"ESNext\" , \"target\" : \"ES6\" , \"moduleResolution\" : \"Node\" , \"baseUrl\" : \"../\" , \"lib\" : [ \"ESNext\" , \"DOM\" ], \"paths\" : { \"sdk/*\" : [ \"../modules/src/*\" ], \"/*\" : [ \"../*\" ] }, \"types\" : [ \"../modules/types\" ] } } Save the changes. Right click on the demo-application project and select New \u2192 File . Enter project.json for the name of the File. { \"guid\" : \"demo-application\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } Save the changes. Right click on the demo-application project and select Publish . Select the demo-client.ts from the Projects explorer and open the Preview view to see the result. Select the demo-client-v2.ts from the Projects explorer and open the Preview view to see the result. 
Tip As in the TypeScript API Client section, there is a difference between the usage of the ExampleClient and the ExampleClientV2 in the application code. The demo-client.ts uses the ExampleClient and the native Java objects, so it has to follow the Java way of creation of objects and assigning properties, while the demo-client-v2.ts follows the TypeScript way of creation of objects and assigning properties. Next Steps Section Completed After completing the steps in this tutorial, you would have: Two different versions of the ExampleClient Java Facade. Two different versions of the ExampleClient TypeScript API. An understanding of the difference between the native Java way and the native TypeScript way of implementing the Java Facades and the TypeScript APIs. Continue to the Dependency section where an external Maven dependency is added and used in the Custom Stack without creating a Java Facade and a TypeScript API. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Advanced Facade"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#custom-stack-advanced-facade","text":"","title":"Custom Stack - Advanced Facade"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#overview","text":"This section will guide you through the different ways of creating a TypeScript API for the Eclipse Dirigible Custom Stack. Prerequisites Node.js 18+ - Node.js versions can be found here esbuild 0.19+ - esbuild versions can be found here tsc 5.2+ - tsc versions can be found here The Facade section is completed.","title":"Overview"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#steps","text":"","title":"Steps"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#create-java-facade","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/java/io/dirigible/samples/ folder. 
Create Example.java , SubExample.java , ExampleRequest.java , ExampleResponse.java and ExampleService.java files. Example.java SubExample.java ExampleRequest.java ExampleResponse.java ExampleService.java Create new apis/src/main/java/io/dirigible/samples/api/domain/Example.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/Example.java package io.dirigible.samples.api.domain ; import java.util.ArrayList ; import java.util.List ; public class Example { private String id ; private String name ; private List < SubExample > subexamples = new ArrayList <> (); public String getId () { return id ; } public String getName () { return name ; } public List < SubExample > getSubexamples () { return subexamples ; } public void setId ( String id ) { this . id = id ; } public void setName ( String name ) { this . name = name ; } public void setSubexamples ( List < SubExample > subexamples ) { this . subexamples = subexamples ; } public Example withId ( String id ) { setId ( id ); return this ; } public Example withName ( String name ) { setName ( name ); return this ; } public Example withSubexamples ( List < SubExample > subexamples ) { setSubexamples ( subexamples ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/domain/SubExample.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/SubExample.java package io.dirigible.samples.api.domain ; import java.util.Date ; public class SubExample { private Date date ; public Date getDate () { return date ; } public void setDate ( Date date ) { this . date = date ; } public SubExample withDate ( Date date ) { setDate ( date ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/domain/input/ExampleRequest.java file. 
Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/input/ExampleRequest.java package io.dirigible.samples.api.domain.input ; public class ExampleRequest { private String exampleId ; private String exampleName ; public String getExampleId () { return exampleId ; } public void setExampleId ( String exampleId ) { this . exampleId = exampleId ; } public String getExampleName () { return exampleName ; } public void setExampleName ( String exampleName ) { this . exampleName = exampleName ; } public ExampleRequest withExampleId ( String exampleId ) { setExampleId ( exampleId ); return this ; } public ExampleRequest withExampleName ( String exampleName ) { setExampleName ( exampleName ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/domain/output/ExampleResponse.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/domain/output/ExampleResponse.java package io.dirigible.samples.api.domain.output ; import java.util.ArrayList ; import java.util.List ; import io.dirigible.samples.api.domain.Example ; public class ExampleResponse { private List < Example > examples = new ArrayList <> (); public List < Example > getExamples () { return examples ; } public void setExamples ( List < Example > examples ) { this . examples = examples ; } public ExampleResponse withExamples ( List < Example > examples ) { setExamples ( examples ); return this ; } } Create new apis/src/main/java/io/dirigible/samples/api/service/ExampleService.java file. 
Paste the following content: apis/src/main/java/io/dirigible/samples/api/service/ExampleService.java package io.dirigible.samples.api.service ; import io.dirigible.samples.api.domain.input.ExampleRequest ; import io.dirigible.samples.api.domain.output.ExampleResponse ; public interface ExampleService { ExampleResponse doExample ( ExampleRequest request ); }","title":"Create Java Facade"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#create-typescript-api","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/resources/META-INF/dirigible/custom-api/ folder. Create Example.ts , SubExample.ts , ExampleRequest.ts and ExampleResponse.ts files. Note The TypeScript files are 1:1 representation of the Java classes. They have the same methods, signature and logic as the Java classes. All TypeScript files are in the custom-api folder and don't follow the Java packages nesting, just for simplicity. Example.ts SubExample.ts ExampleRequest.ts ExampleResponse.ts Create new apis/src/main/resources/META-INF/dirigible/custom-api/Example.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/Example.ts import { SubExample } from \"./SubExample\" ; export class Example { // @ts-ignore private id : string ; // @ts-ignore private name : string ; // @ts-ignore private subexamples : SubExample [] = []; public getId () : string { return this . id ; } public getName () : string { return this . name ; } public getSubexamples () : SubExample [] { return this . subexamples ; } public setId ( id : string ) : void { this . id = id ; } public setName ( name : string ) : void { this . name = name ; } public setSubexamples ( subexamples : SubExample []) : void { this . subexamples = subexamples ; } public withId ( id : string ) : Example { this . setId ( id ); return this ; } public withName ( name : string ) : Example { this . 
setName ( name ); return this ; } public withSubexamples ( subexamples : SubExample []) : Example { this . setSubexamples ( subexamples ); return this ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/SubExample.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/SubExample.ts export class SubExample { // @ts-ignore private date : Date ; public getDate () : Date { return this . date ; } public setDate ( date : Date ) : void { this . date = date ; } public withDate ( date : Date ) : SubExample { this . setDate ( date ); return this ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequest.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequest.ts export class ExampleRequest { // @ts-ignore private exampleId : string ; // @ts-ignore private exampleName : string ; public getExampleId () : string { return this . exampleId ; } public setExampleId ( exampleId : string ) : void { this . exampleId = exampleId ; } public getExampleName () : string { return this . exampleName ; } public setExampleName ( exampleName : string ) : void { this . exampleName = exampleName ; } public withExampleId ( exampleId : string ) : ExampleRequest { this . setExampleId ( exampleId ); return this ; } public withExampleName ( exampleName : string ) : ExampleRequest { this . setExampleName ( exampleName ); return this ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponse.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponse.ts import { Example } from \"./Example\" ; export class ExampleResponse { private examples : Example [] = []; public getExamples () : Example [] { return this . examples ; } public setExamples ( examples : Example []) : void { this . examples = examples ; } public withExamples ( examples : Example []) : ExampleResponse { this . 
setExamples ( examples ); return this ; } }","title":"Create TypeScript API"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#create-java-client-facade","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/java/io/dirigible/samples/ folder. Create ExampleClient.java and ExampleClientV2.java files. ExampleClient.java vs ExampleClientV2.java There is a difference in the method signature of the ExampleClient and the ExampleClientV2 classes. Although they have the same functionality, there is a difference in the input parameter type and the return type . In ExampleClient : public ExampleResponse doExample ( ExampleRequest request ) In ExampleClientV2 : public String doExample ( String requestAsString ) The ExampleClientV2 accepts a String input parameter instead of ExampleRequest and also returns a String instead of ExampleResponse . Inside the implementation, Gson is used to parse and to stringify the JSON representation of the ExampleRequest and the ExampleResponse . This technique is used to simplify the integration between the Java facade and the TypeScript API. ExampleClient.java ExampleClientV2.java Create new apis/src/main/java/io/dirigible/samples/api/client/ExampleClient.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/client/ExampleClient.java package io.dirigible.samples.api.client ; import java.util.Date ; import com.google.gson.Gson ; import io.dirigible.samples.api.domain.Example ; import io.dirigible.samples.api.domain.SubExample ; import io.dirigible.samples.api.domain.input.ExampleRequest ; import io.dirigible.samples.api.domain.output.ExampleResponse ; import io.dirigible.samples.api.service.ExampleService ; public class ExampleClient implements ExampleService { @Override public ExampleResponse doExample ( ExampleRequest request ) { final var exampleResponse = new ExampleResponse (); final var subexample = new SubExample (). 
withDate ( new Date ()); final var example = new Example (). withId ( request . getExampleId ()). withName ( \"Example Name\" ); example . getSubexamples (). add ( subexample ); exampleResponse . getExamples (). add ( example ); return exampleResponse ; } } Create new apis/src/main/java/io/dirigible/samples/api/client/ExampleClientV2.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/api/client/ExampleClientV2.java package io.dirigible.samples.api.client ; import java.util.Date ; import com.google.gson.Gson ; import io.dirigible.samples.api.domain.Example ; import io.dirigible.samples.api.domain.SubExample ; import io.dirigible.samples.api.domain.input.ExampleRequest ; import io.dirigible.samples.api.domain.output.ExampleResponse ; public class ExampleClientV2 { public String doExample ( String requestAsString ) { final var gson = new Gson (); final var request = gson . fromJson ( requestAsString , ExampleRequest . class ); final var exampleResponse = new ExampleResponse (); final var subexample = new SubExample (). withDate ( new Date ()); final var example = new Example (). withId ( request . getExampleId ()). withName ( \"Example Name\" ); example . getSubexamples (). add ( subexample ); exampleResponse . getExamples (). add ( example ); return gson . toJson ( exampleResponse ); } }","title":"Create Java Client Facade"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#create-typescript-api-client","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the apis/src/main/resources/META-INF/dirigible/custom-api/ folder. Create ExampleClient.ts , ExampleClientV2.ts , ExampleRequestV2.ts and ExampleResponseV2.ts files. ExampleClient.ts vs ExampleClientV2.ts The ExampleClient uses the native Java objects, so it has to follow the Java way of creation of objects and assigning properties. 
The ExampleClientV2 uses TypeScript interfaces that represent the Java classes (see ExampleRequestV2.ts and ExampleResponseV2.ts ) to follow the TypeScript way of creating objects and assigning properties. ExampleClient.ts ExampleClientV2.ts ExampleRequestV2.ts ExampleResponseV2.ts Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClient.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClient.ts import { ExampleResponse } from \"./ExampleResponse\" ; import { ExampleRequest } from \"./ExampleRequest\" ; import { Example } from \"./Example\" ; import { SubExample } from \"./SubExample\" ; const ExampleClientClass = Java . type ( \"io.dirigible.samples.api.client.ExampleClient\" ); const ExampleRequestClass = Java . type ( \"io.dirigible.samples.api.domain.input.ExampleRequest\" ); export class ExampleClient { public doExample ( request : ExampleRequest ) : ExampleResponse { const requestObj = new ExampleRequestClass (); requestObj . setExampleId ( request . getExampleId ()); requestObj . setExampleName ( request . getExampleName ()); const responseObj = new ExampleClientClass (). doExample ( requestObj ); const examples : Example [] = []; for ( const exampleObj of responseObj . getExamples ()) { const example = new Example (); const subExamples : SubExample [] = []; example . setId ( exampleObj . getId ()); example . setName ( exampleObj . getName ()); for ( const subexampleObj of exampleObj . getSubexamples ()) { const subexample = new SubExample (); subexample . setDate ( subexampleObj . getDate ()); subExamples . push ( subexample ); } example . setSubexamples ( subExamples ) examples . push ( example ); } const response = new ExampleResponse (); response . setExamples ( examples ); return response ; } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClientV2.ts file. 
Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleClientV2.ts import { ExampleResponseV2 } from \"./ExampleResponseV2\" ; import { ExampleRequestV2 } from \"./ExampleRequestV2\" ; const ExampleClientV2Class = Java . type ( \"io.dirigible.samples.api.client.ExampleClientV2\" ); export class ExampleClientV2 { public doExample ( request : ExampleRequestV2 ) : ExampleResponseV2 { const response = new ExampleClientV2Class (). doExample ( JSON . stringify ( request )); return JSON . parse ( response ); } } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequestV2.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleRequestV2.ts export interface ExampleRequestV2 { readonly exampleId : string ; readonly exampleName : string ; } Create new apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponseV2.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/ExampleResponseV2.ts export interface SubExampleV2 { readonly date : Date ; } export interface ExampleV2 { readonly id : string ; readonly name : string ; readonly subexamples : SubExampleV2 []; } export interface ExampleResponseV2 { readonly examples : ExampleV2 []; }","title":"Create TypeScript API Client"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#build-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install","title":"Build the Custom Platform"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#run-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . 
Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack .","title":"Run the Custom Platform"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#test-the-advanced-typescript-api","text":"Create a project named demo-application . Right click on the demo-application project and select New \u2192 TypeScript Service . Enter demo-client.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; import { ExampleClient } from \"custom-api/ExampleClient\" ; import { ExampleRequest } from \"custom-api/ExampleRequest\" ; const exampleRequest = new ExampleRequest (); exampleRequest . setExampleId ( 'example-id-1234' ); exampleRequest . setExampleName ( 'Custom Stack Example' ); const exampleClient = new ExampleClient (); const exampleResponse = exampleClient . doExample ( exampleRequest ); response . println ( JSON . stringify ( exampleResponse , null , 2 )); Save the changes. Enter demo-client-v2.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; import { ExampleClientV2 } from \"custom-api/ExampleClientV2\" ; import { ExampleRequestV2 } from \"custom-api/ExampleRequestV2\" ; const exampleRequest : ExampleRequestV2 = { exampleId : 'example-id-1234' , exampleName : 'Custom Stack Example' }; const exampleClient = new ExampleClientV2 (); const exampleResponse = exampleClient . doExample ( exampleRequest ); response . println ( JSON . stringify ( exampleResponse , null , 2 )); Save the changes. Right click on the demo-application project and select New \u2192 File . Enter tsconfig.json for the name of the File. 
{ \"compilerOptions\" : { \"module\" : \"ESNext\" , \"target\" : \"ES6\" , \"moduleResolution\" : \"Node\" , \"baseUrl\" : \"../\" , \"lib\" : [ \"ESNext\" , \"DOM\" ], \"paths\" : { \"sdk/*\" : [ \"../modules/src/*\" ], \"/*\" : [ \"../*\" ] }, \"types\" : [ \"../modules/types\" ] } } Save the changes. Right click on the demo-application project and select New \u2192 File . Enter project.json for the name of the File. { \"guid\" : \"demo-application\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } Save the changes. Right click on the demo-application project and select Publish . Select the demo-client.ts from the Projects explorer and open the Preview view to see the result. Select the demo-client-v2.ts from the Projects explorer and open the Preview view to see the result. Tip As in the TypeScript API Client section, there is a difference between the usage of the ExampleClient and the ExampleClientV2 in the application code. The demo-client.ts uses the ExampleClient and the native Java objects, so it has to follow the Java way of creation of objects and assigning properties, while the demo-client-v2.ts follows the TypeScript way of creation of objects and assigning properties.","title":"Test the Advanced TypeScript API"},{"location":"tutorials/customizations/custom-stack/advanced-facade/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would have: Two different versions of the ExampleClient Java Facades. Two different versions of the ExampleClient TypeScript APIs. Learned the difference between the native Java way and native TypeScript way of implementing the Java Facades and the TypeScript APIs Continue to the Dependency section where external Maven dependency is added and used in the Custom Stack without creating a Java Facade and TypeScript API. 
Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Next Steps"},{"location":"tutorials/customizations/custom-stack/branding/","text":"Custom Stack - Branding Overview This section will guide you through the process of rebranding of Eclipse Dirigible Custom Stack. Steps Create Maven Module Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Create branding folder and navigate to it. Create pom.xml file. pom.xml Create new branding/pom.xml file. Paste the following content: branding/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - branding custom-stack-branding 1.0.0-SNAPSHOT jar Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Open the pom.xml file. Navigate to the section. Add the following module: pom.xml Final pom.xml application branding pom.xml 4.0.0 org.sonatype.oss oss-parent 7 custom - stack - parent Custom Stack - Sample io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT pom 2024 https://www.dirigible.io Eclipse Foundation https://www.eclipse.org https://github.com/dirigiblelabs/tutorial-custom-stack application branding org.slf4j slf4j-api compile ch.qos.logback logback-core compile ch.qos.logback logback-classic compile org.eclipse.dirigible dirigible-commons-config org.springframework.boot spring-boot-starter-web org.apache.logging.log4j log4j-to-slf4j org.springframework.boot spring-boot-starter-websocket org.springframework.boot spring-boot-starter-data-jdbc org.springframework.boot spring-boot-starter-data-jpa org.springframework.boot spring-boot-starter-security org.springframework.boot spring-boot-starter-validation org.springframework.boot spring-boot-starter-actuator org.springframework.security spring-security-web org.springframework.boot spring-boot-starter-test test org.junit.vintage junit-vintage-engine org.springframework.security spring-security-test test 
com.fasterxml.jackson.datatype jackson-datatype-joda org.springdoc springdoc-openapi-ui ${org.springdoc.openapi.ui.version} com.h2database h2 org.webjars webjars-locator ${webjars-locator} org.apache.olingo olingo-odata2-lib ${olingo.version} pom javax.ws.rs javax.ws.rs-api com.google.code.gson gson org.eclipse.dirigible dirigible-dependencies ${dirigible.version} pom import default true org.jacoco jacoco-maven-plugin ${jacoco.version} prepare-agent prepare-agent SOURCEFILE *src/test/* org.apache.maven.plugins maven-compiler-plugin ${maven.compiler.plugin.version} ${maven.compiler.source} ${maven.compiler.target} true lines,vars,source custom stack UTF-8 10.2.7 17 17 17 3.3.0 3.2.0 src/main/resources/META-INF/dirigible 3.13.0 2.22.2 1.13.0 branch 2.11.0 1.15 3.12.0 1.3 1.10.0 2.10.1 4.11.0 1.3 1.8.0 4.10.0 1.7.36 1.7.12 1.4.5 2.9.0 42.7.0 2.20.11 3.15.0 5.17.3 1.0 9.4.48.v20220622 9.4.2 1.1.0 6.8.0 2.3.0 2.3.3 2.1.5 4.3 2.2.3 6.4.0.202211300538-r 1.6.4 2.0.13 3.3.1 4.9.10 3.12.11 3.1.2 4.16.1 1.9.0 1.13.0 11.1.42 0.24.4 1.8.2 1.6.5 5.1.0 0.33.0 2.3.6 3.3.12 3.6.0 1.0.8r1250 3.3.7 4.6.7 2.6.1 1.8.2 4.7.0 4.8.154 1.22 1.17.6 1.17.6 1.17.6 5.16.0 7.7.1 0.7.5 0.5.4 2.7.10 3.0.0 5.3.24 0.51 20.0.2 5.0.1 1.7 2.3.2 0.9.5.5 22.3.1 31.1-jre 72.1 3.2.2 4.4 2.3 3.0.45.202211090110 0.64.0 2.3.1 1.12.386 1.17.6 3.11.3 4.3.1 3.3.1 6.2.1 1.1 0.8.11 3.0.2 1.7.0 1.6.9 none Create Branding Resources Navigate to the branding folder. Create src/main/resources/META-INF/dirigible/ide-branding/ folder structure and navigate to it. Create branding.js and custom-stack.svg files. branding.js custom-stack.svg Create new branding/src/main/resources/META-INF/dirigible/ide-branding/branding.js file. 
Paste the following content: branding/src/main/resources/META-INF/dirigible/ide-branding/branding.js const brandingInfo = { name : 'Custom Stack' , brand : 'Custom Stack' , brandUrl : 'https://github.com/dirigiblelabs/tutorial-custom-stack' , icons : { faviconIco : '/services/web/ide-branding/favicon.ico' , favicon32 : '/services/web/ide-branding/favicon-32x32.png' , favicon16 : '/services/web/ide-branding/favicon-16x16.png' , }, logo : '/services/web/ide-branding/custom-stack.svg' }; Favicons For the sake of simplicity, the favicon files were omitted. Create new branding/src/main/resources/META-INF/dirigible/ide-branding/custom-stack.svg file. Paste the following content: branding/src/main/resources/META-INF/dirigible/ide-branding/custom-stack.svg Add Branding Dependency Navigate to the application folder. Open the pom.xml file. Make the following changes: Add Branding Dependency Exclude Default Branding Final pom.xml Navigate to the section. Add the following dependency: io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT Navigate to the section. 
Edit the dirigible-components-group-ide dependency: org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding application/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true Build the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install Run the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . 
Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack . Reset Theme If the branding changes aren't visible, clear the browser cache and reset the theme by selecting Theme \u2192 Reset in the top right corner. Next Steps Section Completed After completing the steps in this tutorial, you would have: Branding Maven Module. Eclipse Dirigible Stack with custom branding running at http://localhost:8080 . Continue to the Facade section to create Java facade and TypeScript API for the Custom Stack. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Branding"},{"location":"tutorials/customizations/custom-stack/branding/#custom-stack-branding","text":"","title":"Custom Stack - Branding"},{"location":"tutorials/customizations/custom-stack/branding/#overview","text":"This section will guide you through the process of rebranding of Eclipse Dirigible Custom Stack.","title":"Overview"},{"location":"tutorials/customizations/custom-stack/branding/#steps","text":"","title":"Steps"},{"location":"tutorials/customizations/custom-stack/branding/#create-maven-module","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Create branding folder and navigate to it. Create pom.xml file. pom.xml Create new branding/pom.xml file. Paste the following content: branding/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - branding custom-stack-branding 1.0.0-SNAPSHOT jar Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Open the pom.xml file. Navigate to the section. 
Add the following module: pom.xml Final pom.xml application branding pom.xml 4.0.0 org.sonatype.oss oss-parent 7 custom - stack - parent Custom Stack - Sample io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT pom 2024 https://www.dirigible.io Eclipse Foundation https://www.eclipse.org https://github.com/dirigiblelabs/tutorial-custom-stack application branding org.slf4j slf4j-api compile ch.qos.logback logback-core compile ch.qos.logback logback-classic compile org.eclipse.dirigible dirigible-commons-config org.springframework.boot spring-boot-starter-web org.apache.logging.log4j log4j-to-slf4j org.springframework.boot spring-boot-starter-websocket org.springframework.boot spring-boot-starter-data-jdbc org.springframework.boot spring-boot-starter-data-jpa org.springframework.boot spring-boot-starter-security org.springframework.boot spring-boot-starter-validation org.springframework.boot spring-boot-starter-actuator org.springframework.security spring-security-web org.springframework.boot spring-boot-starter-test test org.junit.vintage junit-vintage-engine org.springframework.security spring-security-test test com.fasterxml.jackson.datatype jackson-datatype-joda org.springdoc springdoc-openapi-ui ${org.springdoc.openapi.ui.version} com.h2database h2 org.webjars webjars-locator ${webjars-locator} org.apache.olingo olingo-odata2-lib ${olingo.version} pom javax.ws.rs javax.ws.rs-api com.google.code.gson gson org.eclipse.dirigible dirigible-dependencies ${dirigible.version} pom import default true org.jacoco jacoco-maven-plugin ${jacoco.version} prepare-agent prepare-agent SOURCEFILE *src/test/* org.apache.maven.plugins maven-compiler-plugin ${maven.compiler.plugin.version} ${maven.compiler.source} ${maven.compiler.target} true lines,vars,source custom stack UTF-8 10.2.7 17 17 17 3.3.0 3.2.0 src/main/resources/META-INF/dirigible 3.13.0 2.22.2 1.13.0 branch 2.11.0 1.15 3.12.0 1.3 1.10.0 2.10.1 4.11.0 1.3 1.8.0 4.10.0 1.7.36 1.7.12 1.4.5 2.9.0 42.7.0 2.20.11 3.15.0 
5.17.3 1.0 9.4.48.v20220622 9.4.2 1.1.0 6.8.0 2.3.0 2.3.3 2.1.5 4.3 2.2.3 6.4.0.202211300538-r 1.6.4 2.0.13 3.3.1 4.9.10 3.12.11 3.1.2 4.16.1 1.9.0 1.13.0 11.1.42 0.24.4 1.8.2 1.6.5 5.1.0 0.33.0 2.3.6 3.3.12 3.6.0 1.0.8r1250 3.3.7 4.6.7 2.6.1 1.8.2 4.7.0 4.8.154 1.22 1.17.6 1.17.6 1.17.6 5.16.0 7.7.1 0.7.5 0.5.4 2.7.10 3.0.0 5.3.24 0.51 20.0.2 5.0.1 1.7 2.3.2 0.9.5.5 22.3.1 31.1-jre 72.1 3.2.2 4.4 2.3 3.0.45.202211090110 0.64.0 2.3.1 1.12.386 1.17.6 3.11.3 4.3.1 3.3.1 6.2.1 1.1 0.8.11 3.0.2 1.7.0 1.6.9 none ","title":"Create Maven Module"},{"location":"tutorials/customizations/custom-stack/branding/#create-branding-resources","text":"Navigate to the branding folder. Create src/main/resources/META-INF/dirigible/ide-branding/ folder structure and navigate to it. Create branding.js and custom-stack.svg files. branding.js custom-stack.svg Create new branding/src/main/resources/META-INF/dirigible/ide-branding/branding.js file. Paste the following content: branding/src/main/resources/META-INF/dirigible/ide-branding/branding.js const brandingInfo = { name : 'Custom Stack' , brand : 'Custom Stack' , brandUrl : 'https://github.com/dirigiblelabs/tutorial-custom-stack' , icons : { faviconIco : '/services/web/ide-branding/favicon.ico' , favicon32 : '/services/web/ide-branding/favicon-32x32.png' , favicon16 : '/services/web/ide-branding/favicon-16x16.png' , }, logo : '/services/web/ide-branding/custom-stack.svg' }; Favicons For the sake of simplicity, the favicon files were omitted. Create new branding/src/main/resources/META-INF/dirigible/ide-branding/custom-stack.svg file. Paste the following content: branding/src/main/resources/META-INF/dirigible/ide-branding/custom-stack.svg ","title":"Create Branding Resources"},{"location":"tutorials/customizations/custom-stack/branding/#add-branding-dependency","text":"Navigate to the application folder. Open the pom.xml file. 
Make the following changes: Add Branding Dependency Exclude Default Branding Final pom.xml Navigate to the section. Add the following dependency: io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT Navigate to the section. Edit the dirigible-components-group-ide dependency: org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding application/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true ","title":"Add Branding 
Dependency"},{"location":"tutorials/customizations/custom-stack/branding/#build-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install","title":"Build the Custom Platform"},{"location":"tutorials/customizations/custom-stack/branding/#run-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack . Reset Theme If the branding changes aren't visible, clear the browser cache and reset the theme by selecting Theme \u2192 Reset in the top right corner.","title":"Run the Custom Platform"},{"location":"tutorials/customizations/custom-stack/branding/#next-steps","text":"Section Completed After completing the steps in this tutorial, you would have: Branding Maven Module. Eclipse Dirigible Stack with custom branding running at http://localhost:8080 . Continue to the Facade section to create Java facade and TypeScript API for the Custom Stack. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Next Steps"},{"location":"tutorials/customizations/custom-stack/dependency/","text":"Custom Stack - Dependency Overview This section will guide you through the process of adding external Maven dependency for generating barcodes and using it in the Eclipse Dirigible Custom Stack without creating separate Java Facade and/or TypeScript API. Note Creating TypeScript APIs is always recommended, as there is no out of the box code completion for native Java objects. 
Steps Add External Dependency: Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the application folder. Open the pom.xml file. Make the following changes: Add External Dependency Final pom.xml Navigate to the section. Add the following dependency: uk.org.okapibarcode okapibarcode 0.3.3 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar io.dirigible.samples custom-stack-apis 1.0.0-SNAPSHOT io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT uk.org.okapibarcode okapibarcode 0.3.3 org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true Build the Custom Platform Navigate to the root folder of the 
project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install Run the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack . Test the Changes Create a project named demo-application . Right click on the demo-application project and select New \u2192 TypeScript Service . Enter barcode.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; const Code128 = Java . type ( \"uk.org.okapibarcode.backend.Code128\" ); const BufferedImage = Java . type ( \"java.awt.image.BufferedImage\" ); const Java2DRenderer = Java . type ( \"uk.org.okapibarcode.output.Java2DRenderer\" ); const Color = Java . type ( \"java.awt.Color\" ); const File = Java . type ( \"java.io.File\" ); const ImageIO = Java . type ( \"javax.imageio.ImageIO\" ); const FileUtils = Java . type ( \"org.apache.commons.io.FileUtils\" ); const barcode = new Code128 (); barcode . setFontName ( \"Monospaced\" ); barcode . setFontSize ( 16 ); barcode . setContent ( \"custom-stack-1234\" ); const image = new BufferedImage ( barcode . getWidth (), barcode . getHeight (), BufferedImage . TYPE_BYTE_GRAY ); const g2d = image . createGraphics (); const renderer = new Java2DRenderer ( g2d , 1 , Color . WHITE , Color . BLACK ); renderer . render ( barcode ); const file = new File ( \"code128.png\" ); ImageIO . write ( image , \"png\" , file ); const bytes = FileUtils . readFileToByteArray ( file ); response . setContentType ( \"image/png\" ); response . write ( bytes ); response . flush (); response . 
close (); Access to Java Classes Java classes can be accessed through the Java.type() function by providing the Fully Qualified Name (FQN) (e.g. io.dirigible.samples.MyFacade ) : const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); To invoke static method of the MyFacade class: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); MyFacade . greet (); To create class instance and call a method: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); const facadeInstance = new MyFacade (); facadeInstance . add ( 5 , 3 ); Right click on the demo-application project and select New \u2192 File . Enter tsconfig.json for the name of the File. { \"compilerOptions\" : { \"module\" : \"ESNext\" , \"target\" : \"ES6\" , \"moduleResolution\" : \"Node\" , \"baseUrl\" : \"../\" , \"lib\" : [ \"ESNext\" , \"DOM\" ], \"paths\" : { \"sdk/*\" : [ \"../modules/src/*\" ], \"/*\" : [ \"../*\" ] }, \"types\" : [ \"../modules/types\" ] } } Save the changes. Right click on the demo-application project and select New \u2192 File . Enter project.json for the name of the File. { \"guid\" : \"demo-application\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } Save the changes. Right click on the demo-application project and select Publish Select the barcode.ts from the Projects explorer and open the Preview view to see the result. Summary Tutorial Completed After completing all steps in this tutorial, you would have: Custom Eclipse Dirigible Stack. Custom branding of the Eclipse Dirigible Stack. Custom Java Facade and TypeScript API. 
Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Dependency"},{"location":"tutorials/customizations/custom-stack/dependency/#custom-stack-dependency","text":"","title":"Custom Stack - Dependency"},{"location":"tutorials/customizations/custom-stack/dependency/#overview","text":"This section will guide you through the process of adding external Maven dependency for generating barcodes and using it in the Eclipse Dirigible Custom Stack without creating separate Java Facade and/or TypeScript API. Note Creating TypeScript APIs is always recommended, as there is no out of the box code completion for native Java objects.","title":"Overview"},{"location":"tutorials/customizations/custom-stack/dependency/#steps","text":"","title":"Steps"},{"location":"tutorials/customizations/custom-stack/dependency/#add-external-dependency","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Navigate to the application folder. Open the pom.xml file. Make the following changes: Add External Dependency Final pom.xml Navigate to the section. 
Add the following dependency: uk.org.okapibarcode okapibarcode 0.3.3 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar io.dirigible.samples custom-stack-apis 1.0.0-SNAPSHOT io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT uk.org.okapibarcode okapibarcode 0.3.3 org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true ","title":"Add External Dependency:"},{"location":"tutorials/customizations/custom-stack/dependency/#build-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . 
Open the Terminal and execute the following command to build the Custom Platform : mvn clean install","title":"Build the Custom Platform"},{"location":"tutorials/customizations/custom-stack/dependency/#run-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack .","title":"Run the Custom Platform"},{"location":"tutorials/customizations/custom-stack/dependency/#test-the-changes","text":"Create a project named demo-application . Right click on the demo-application project and select New \u2192 TypeScript Service . Enter barcode.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; const Code128 = Java . type ( \"uk.org.okapibarcode.backend.Code128\" ); const BufferedImage = Java . type ( \"java.awt.image.BufferedImage\" ); const Java2DRenderer = Java . type ( \"uk.org.okapibarcode.output.Java2DRenderer\" ); const Color = Java . type ( \"java.awt.Color\" ); const File = Java . type ( \"java.io.File\" ); const ImageIO = Java . type ( \"javax.imageio.ImageIO\" ); const FileUtils = Java . type ( \"org.apache.commons.io.FileUtils\" ); const barcode = new Code128 (); barcode . setFontName ( \"Monospaced\" ); barcode . setFontSize ( 16 ); barcode . setContent ( \"custom-stack-1234\" ); const image = new BufferedImage ( barcode . getWidth (), barcode . getHeight (), BufferedImage . TYPE_BYTE_GRAY ); const g2d = image . createGraphics (); const renderer = new Java2DRenderer ( g2d , 1 , Color . WHITE , Color . BLACK ); renderer . render ( barcode ); const file = new File ( \"code128.png\" ); ImageIO . 
write ( image , \"png\" , file ); const bytes = FileUtils . readFileToByteArray ( file ); response . setContentType ( \"image/png\" ); response . write ( bytes ); response . flush (); response . close (); Access to Java Classes Java classes can be accessed through the Java.type() function by providing the Fully Qualified Name (FQN) (e.g. io.dirigible.samples.MyFacade ) : const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); To invoke a static method of the MyFacade class: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); MyFacade . greet (); To create a class instance and call a method: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); const facadeInstance = new MyFacade (); facadeInstance . add ( 5 , 3 ); Right click on the demo-application project and select New \u2192 File . Enter tsconfig.json for the name of the File. { \"compilerOptions\" : { \"module\" : \"ESNext\" , \"target\" : \"ES6\" , \"moduleResolution\" : \"Node\" , \"baseUrl\" : \"../\" , \"lib\" : [ \"ESNext\" , \"DOM\" ], \"paths\" : { \"sdk/*\" : [ \"../modules/src/*\" ], \"/*\" : [ \"../*\" ] }, \"types\" : [ \"../modules/types\" ] } } Save the changes. Right click on the demo-application project and select New \u2192 File . Enter project.json for the name of the File. { \"guid\" : \"demo-application\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } Save the changes. Right click on the demo-application project and select Publish . Select the barcode.ts from the Projects explorer and open the Preview view to see the result.","title":"Test the Changes"},{"location":"tutorials/customizations/custom-stack/dependency/#summary","text":"Tutorial Completed After completing all steps in this tutorial, you will have: Custom Eclipse Dirigible Stack. Custom branding of the Eclipse Dirigible Stack. 
Custom Java Facade and TypeScript API. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Summary"},{"location":"tutorials/customizations/custom-stack/facade/","text":"Custom Stack - Facade Overview This section will guide you through the process of creating a Java Facade and a TypeScript API for the Eclipse Dirigible Custom Stack. Prerequisites Node.js 18+ - Node.js versions can be found here esbuild 0.19+ - esbuild versions can be found here tsc 5.2+ - tsc versions can be found here Steps Create APIs Module Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Create apis folder and navigate to it. Create pom.xml , MyFacade.java , MyApi.ts , project.json and tsconfig.json files. pom.xml MyFacade.java MyApi.ts Create new apis/pom.xml file. Paste the following content: apis/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - apis custom-stack-apis jar Note The creation of a Java facade is optional, as the same logic can be wrapped/implemented in the TypeScript API only by using the Java.type() function. Create src/main/java/io/dirigible/samples/ folder structure and navigate to it. Create new apis/src/main/java/io/dirigible/samples/MyFacade.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/MyFacade.java package io.dirigible.samples ; public class MyFacade { public static String greet () { return \"Hello, welcome to my custom Eclipse Dirigible stack!\" ; } public int add ( int a , int b ) { return a + b ; } public int multiply ( int a , int b ) { return a * b ; } public String customMethod ( String input ) { // Your custom logic here return \"Processed input: \" + input ; } } Create src/main/resources/META-INF/dirigible/custom-api/ folder structure and navigate to it. Create new apis/src/main/resources/META-INF/dirigible/custom-api/MyApi.ts file. 
Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/MyApi.ts const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); export class MyApi { private facadeInstance = new MyFacade (); public static greet () : string { return MyFacade . greet (); } public add ( a : number , b : number ) : number { return this . facadeInstance . add ( a , b ); } public multiply ( a : number , b : number ) : number { return this . facadeInstance . multiply ( a , b ); } public customMethod ( input : string ) : string { return this . facadeInstance . customMethod ( input ); } } Access to Java Classes Java classes can be accessed through the Java.type() function by providing the Fully Qualified Name (FQN) (e.g. io.dirigible.samples.MyFacade ) : const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); To invoke a static method of the MyFacade class: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); MyFacade . greet (); To create a class instance and call a method: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); const facadeInstance = new MyFacade (); facadeInstance . add ( 5 , 3 ); Add Module Dependency Navigate to the root folder of the project (e.g. /custom-stack ) . Open the pom.xml file. Make the following changes: Add APIs Module Final pom.xml Navigate to the section. 
Add the following module: apis application branding pom.xml 4.0.0 org.sonatype.oss oss-parent 7 custom - stack - parent Custom Stack - Sample io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT pom 2024 https://www.dirigible.io Eclipse Foundation https://www.eclipse.org https://github.com/dirigiblelabs/tutorial-custom-stack apis application branding org.slf4j slf4j-api compile ch.qos.logback logback-core compile ch.qos.logback logback-classic compile org.eclipse.dirigible dirigible-commons-config org.springframework.boot spring-boot-starter-web org.apache.logging.log4j log4j-to-slf4j org.springframework.boot spring-boot-starter-websocket org.springframework.boot spring-boot-starter-data-jdbc org.springframework.boot spring-boot-starter-data-jpa org.springframework.boot spring-boot-starter-security org.springframework.boot spring-boot-starter-validation org.springframework.boot spring-boot-starter-actuator org.springframework.security spring-security-web org.springframework.boot spring-boot-starter-test test org.junit.vintage junit-vintage-engine org.springframework.security spring-security-test test com.fasterxml.jackson.datatype jackson-datatype-joda org.springdoc springdoc-openapi-ui ${org.springdoc.openapi.ui.version} com.h2database h2 org.webjars webjars-locator ${webjars-locator} org.apache.olingo olingo-odata2-lib ${olingo.version} pom javax.ws.rs javax.ws.rs-api com.google.code.gson gson org.eclipse.dirigible dirigible-dependencies ${dirigible.version} pom import default true org.jacoco jacoco-maven-plugin ${jacoco.version} prepare-agent prepare-agent SOURCEFILE *src/test/* org.apache.maven.plugins maven-compiler-plugin ${maven.compiler.plugin.version} ${maven.compiler.source} ${maven.compiler.target} true lines,vars,source custom stack UTF-8 10.2.7 17 17 17 3.3.0 3.2.0 src/main/resources/META-INF/dirigible 3.13.0 2.22.2 1.13.0 branch 2.11.0 1.15 3.12.0 1.3 1.10.0 2.10.1 4.11.0 1.3 1.8.0 4.10.0 1.7.36 1.7.12 1.4.5 2.9.0 42.7.0 2.20.11 3.15.0 5.17.3 1.0 
9.4.48.v20220622 9.4.2 1.1.0 6.8.0 2.3.0 2.3.3 2.1.5 4.3 2.2.3 6.4.0.202211300538-r 1.6.4 2.0.13 3.3.1 4.9.10 3.12.11 3.1.2 4.16.1 1.9.0 1.13.0 11.1.42 0.24.4 1.8.2 1.6.5 5.1.0 0.33.0 2.3.6 3.3.12 3.6.0 1.0.8r1250 3.3.7 4.6.7 2.6.1 1.8.2 4.7.0 4.8.154 1.22 1.17.6 1.17.6 1.17.6 5.16.0 7.7.1 0.7.5 0.5.4 2.7.10 3.0.0 5.3.24 0.51 20.0.2 5.0.1 1.7 2.3.2 0.9.5.5 22.3.1 31.1-jre 72.1 3.2.2 4.4 2.3 3.0.45.202211090110 0.64.0 2.3.1 1.12.386 1.17.6 3.11.3 4.3.1 3.3.1 6.2.1 1.1 0.8.11 3.0.2 1.7.0 1.6.9 none Navigate to the application folder. Open the pom.xml file. Make the following changes: Add APIs Dependency Final pom.xml Navigate to the section. Add the following dependency: io.dirigible.samples custom-stack-apis 1.0.0-SNAPSHOT application/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar io.dirigible.samples custom-stack-apis 1.0.0-SNAPSHOT io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql 
org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true Build the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install Run the Custom Platform Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack . Test the TypeScript API Create a project named demo-application . Right click on the demo-application project and select New \u2192 TypeScript CJS Service . Enter demo.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; import { MyApi } from \"custom-api/MyApi\" ; const myApiInstance = new MyApi (); const firstNumber = myApiInstance . add ( 5 , 3 ); const secondNumber = myApiInstance . multiply ( 5 , 3 ); const customMethod = myApiInstance . customMethod ( \"tutorial-custom-stack\" ); const greetingMessage = MyApi . greet (); const data = { firstNumber : firstNumber , secondNumber : secondNumber , customMethod : customMethod , greetingMessage : greetingMessage , }; response . println ( JSON . stringify ( data , null , 2 )); Save the changes. Right click on the demo-application project and select New \u2192 File . Enter tsconfig.json for the name of the File. 
{ \"compilerOptions\" : { \"module\" : \"ESNext\" , \"target\" : \"ES6\" , \"moduleResolution\" : \"Node\" , \"baseUrl\" : \"../\" , \"lib\" : [ \"ESNext\" , \"DOM\" ], \"paths\" : { \"sdk/*\" : [ \"../modules/src/*\" ], \"/*\" : [ \"../*\" ] }, \"types\" : [ \"../modules/types\" ] } } Save the changes. Right click on the demo-application project and select New \u2192 File . Enter project.json for the name of the File. { \"guid\" : \"demo-application\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } Save the changes. Right click on the demo-application project and select Publish . Select the demo.ts from the Projects explorer and open the Preview view to see the result. Next Steps Section Completed After completing the steps in this tutorial, you will have: APIs Maven Module. Java Facade io.dirigible.samples.MyFacade . TypeScript API custom-api/MyApi exposing the Java Facade. Sample project utilizing the TypeScript API. Continue either to the Advanced Facade section or to the Dependency section where an external Maven dependency is added and used in the Custom Stack without creating a Java Facade and TypeScript API. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Facade"},{"location":"tutorials/customizations/custom-stack/facade/#custom-stack-facade","text":"","title":"Custom Stack - Facade"},{"location":"tutorials/customizations/custom-stack/facade/#overview","text":"This section will guide you through the process of creating a Java Facade and a TypeScript API for the Eclipse Dirigible Custom Stack. 
Prerequisites Node.js 18+ - Node.js versions can be found here esbuild 0.19+ - esbuild versions can be found here tsc 5.2+ - tsc versions can be found here","title":"Overview"},{"location":"tutorials/customizations/custom-stack/facade/#steps","text":"","title":"Steps"},{"location":"tutorials/customizations/custom-stack/facade/#create-apis-module","text":"Navigate to the root folder of the custom stack (e.g. /custom-stack ) . Create apis folder and navigate to it. Create pom.xml , MyFacade.java , MyApi.ts , project.json and tsconfig.json files. pom.xml MyFacade.java MyApi.ts Create new apis/pom.xml file. Paste the following content: apis/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - apis custom-stack-apis jar Note The creation of a Java facade is optional, as the same logic can be wrapped/implemented in the TypeScript API only by using the Java.type() function. Create src/main/java/io/dirigible/samples/ folder structure and navigate to it. Create new apis/src/main/java/io/dirigible/samples/MyFacade.java file. Paste the following content: apis/src/main/java/io/dirigible/samples/MyFacade.java package io.dirigible.samples ; public class MyFacade { public static String greet () { return \"Hello, welcome to my custom Eclipse Dirigible stack!\" ; } public int add ( int a , int b ) { return a + b ; } public int multiply ( int a , int b ) { return a * b ; } public String customMethod ( String input ) { // Your custom logic here return \"Processed input: \" + input ; } } Create src/main/resources/META-INF/dirigible/custom-api/ folder structure and navigate to it. Create new apis/src/main/resources/META-INF/dirigible/custom-api/MyApi.ts file. Paste the following content: apis/src/main/resources/META-INF/dirigible/custom-api/MyApi.ts const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); export class MyApi { private facadeInstance = new MyFacade (); public static greet () : string { return MyFacade . 
greet (); } public add ( a : number , b : number ) : number { return this . facadeInstance . add ( a , b ); } public multiply ( a : number , b : number ) : number { return this . facadeInstance . multiply ( a , b ); } public customMethod ( input : string ) : string { return this . facadeInstance . customMethod ( input ); } } Access to Java Classes Java classes can be accessed through the Java.type() function by providing the Fully Qualified Name (FQN) (e.g. io.dirigible.samples.MyFacade ) : const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); To invoke a static method of the MyFacade class: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); MyFacade . greet (); To create a class instance and call a method: const MyFacade = Java . type ( \"io.dirigible.samples.MyFacade\" ); const facadeInstance = new MyFacade (); facadeInstance . add ( 5 , 3 );","title":"Create APIs Module"},{"location":"tutorials/customizations/custom-stack/facade/#add-module-dependency","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the pom.xml file. Make the following changes: Add APIs Module Final pom.xml Navigate to the section. 
Add the following module: apis application branding pom.xml 4.0.0 org.sonatype.oss oss-parent 7 custom - stack - parent Custom Stack - Sample io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT pom 2024 https://www.dirigible.io Eclipse Foundation https://www.eclipse.org https://github.com/dirigiblelabs/tutorial-custom-stack apis application branding org.slf4j slf4j-api compile ch.qos.logback logback-core compile ch.qos.logback logback-classic compile org.eclipse.dirigible dirigible-commons-config org.springframework.boot spring-boot-starter-web org.apache.logging.log4j log4j-to-slf4j org.springframework.boot spring-boot-starter-websocket org.springframework.boot spring-boot-starter-data-jdbc org.springframework.boot spring-boot-starter-data-jpa org.springframework.boot spring-boot-starter-security org.springframework.boot spring-boot-starter-validation org.springframework.boot spring-boot-starter-actuator org.springframework.security spring-security-web org.springframework.boot spring-boot-starter-test test org.junit.vintage junit-vintage-engine org.springframework.security spring-security-test test com.fasterxml.jackson.datatype jackson-datatype-joda org.springdoc springdoc-openapi-ui ${org.springdoc.openapi.ui.version} com.h2database h2 org.webjars webjars-locator ${webjars-locator} org.apache.olingo olingo-odata2-lib ${olingo.version} pom javax.ws.rs javax.ws.rs-api com.google.code.gson gson org.eclipse.dirigible dirigible-dependencies ${dirigible.version} pom import default true org.jacoco jacoco-maven-plugin ${jacoco.version} prepare-agent prepare-agent SOURCEFILE *src/test/* org.apache.maven.plugins maven-compiler-plugin ${maven.compiler.plugin.version} ${maven.compiler.source} ${maven.compiler.target} true lines,vars,source custom stack UTF-8 10.2.7 17 17 17 3.3.0 3.2.0 src/main/resources/META-INF/dirigible 3.13.0 2.22.2 1.13.0 branch 2.11.0 1.15 3.12.0 1.3 1.10.0 2.10.1 4.11.0 1.3 1.8.0 4.10.0 1.7.36 1.7.12 1.4.5 2.9.0 42.7.0 2.20.11 3.15.0 5.17.3 1.0 
9.4.48.v20220622 9.4.2 1.1.0 6.8.0 2.3.0 2.3.3 2.1.5 4.3 2.2.3 6.4.0.202211300538-r 1.6.4 2.0.13 3.3.1 4.9.10 3.12.11 3.1.2 4.16.1 1.9.0 1.13.0 11.1.42 0.24.4 1.8.2 1.6.5 5.1.0 0.33.0 2.3.6 3.3.12 3.6.0 1.0.8r1250 3.3.7 4.6.7 2.6.1 1.8.2 4.7.0 4.8.154 1.22 1.17.6 1.17.6 1.17.6 5.16.0 7.7.1 0.7.5 0.5.4 2.7.10 3.0.0 5.3.24 0.51 20.0.2 5.0.1 1.7 2.3.2 0.9.5.5 22.3.1 31.1-jre 72.1 3.2.2 4.4 2.3 3.0.45.202211090110 0.64.0 2.3.1 1.12.386 1.17.6 3.11.3 4.3.1 3.3.1 6.2.1 1.1 0.8.11 3.0.2 1.7.0 1.6.9 none Navigate to the application folder. Open the pom.xml file. Make the following changes: Add APIs Dependency Final pom.xml Navigate to the section. Add the following dependency: io.dirigible.samples custom-stack-apis 1.0.0-SNAPSHOT application/pom.xml 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar io.dirigible.samples custom-stack-apis 1.0.0-SNAPSHOT io.dirigible.samples custom-stack-branding 1.0.0-SNAPSHOT org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-ide-ui-branding org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql 
org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true ","title":"Add Module Dependency"},{"location":"tutorials/customizations/custom-stack/facade/#build-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Platform : mvn clean install","title":"Build the Custom Platform"},{"location":"tutorials/customizations/custom-stack/facade/#run-the-custom-platform","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack .","title":"Run the Custom Platform"},{"location":"tutorials/customizations/custom-stack/facade/#test-the-typescript-api","text":"Create a project named demo-application . Right click on the demo-application project and select New \u2192 TypeScript CJS Service . Enter demo.ts for the name of the TypeScript Service. Replace the content with the following code: import { response } from \"sdk/http\" ; import { MyApi } from \"custom-api/MyApi\" ; const myApiInstance = new MyApi (); const firstNumber = myApiInstance . add ( 5 , 3 ); const secondNumber = myApiInstance . multiply ( 5 , 3 ); const customMethod = myApiInstance . customMethod ( \"tutorial-custom-stack\" ); const greetingMessage = MyApi . 
greet (); const data = { firstNumber : firstNumber , secondNumber : secondNumber , customMethod : customMethod , greetingMessage : greetingMessage , }; response . println ( JSON . stringify ( data , null , 2 )); Save the changes. Right click on the demo-application project and select New \u2192 File . Enter tsconfig.json for the name of the File. { \"compilerOptions\" : { \"module\" : \"ESNext\" , \"target\" : \"ES6\" , \"moduleResolution\" : \"Node\" , \"baseUrl\" : \"../\" , \"lib\" : [ \"ESNext\" , \"DOM\" ], \"paths\" : { \"sdk/*\" : [ \"../modules/src/*\" ], \"/*\" : [ \"../*\" ] }, \"types\" : [ \"../modules/types\" ] } } Save the changes. Right click on the demo-application project and select New \u2192 File . Enter project.json for the name of the File. { \"guid\" : \"demo-application\" , \"actions\" : [ { \"name\" : \"Build TypeScript\" , \"commands\" : [ { \"os\" : \"unix\" , \"command\" : \"tsc\" }, { \"os\" : \"windows\" , \"command\" : \"cmd /c tsc\" } ], \"registry\" : \"true\" } ] } Save the changes. Right click on the demo-application project and select Publish . Select the demo.ts from the Projects explorer and open the Preview view to see the result.","title":"Test the TypeScript API"},{"location":"tutorials/customizations/custom-stack/facade/#next-steps","text":"Section Completed After completing the steps in this tutorial, you will have: APIs Maven Module. Java Facade io.dirigible.samples.MyFacade . TypeScript API custom-api/MyApi exposing the Java Facade. Sample project utilizing the TypeScript API. Continue either to the Advanced Facade section or to the Dependency section where an external Maven dependency is added and used in the Custom Stack without creating a Java Facade and TypeScript API. 
Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Next Steps"},{"location":"tutorials/customizations/custom-stack/project-structure/","text":"Custom Stack - Project Structure Overview This section shows how to create the project structure of the Custom Stack. It contains the creation of several Maven pom.xml files, static content resources, application.properties configuration files and a Spring Boot Java class. Prerequisites JDK 21+ - OpenJDK versions can be found here . Maven 3.5+ - Maven version 3.5.3 can be found here . Steps Create Maven Project Create new folder on your machine, for the custom stack (e.g. /custom-stack ) . Create pom.xml and application/pom.xml files. pom.xml application/pom.xml Create new pom.xml file. Paste the following content: pom.xml 4.0.0 org.sonatype.oss oss-parent 7 custom - stack - parent Custom Stack - Sample io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT pom 2024 https://www.dirigible.io Eclipse Foundation https://www.eclipse.org https://github.com/dirigiblelabs/tutorial-custom-stack application org.slf4j slf4j-api compile ch.qos.logback logback-core compile ch.qos.logback logback-classic compile org.eclipse.dirigible dirigible-commons-config org.springframework.boot spring-boot-starter-web org.apache.logging.log4j log4j-to-slf4j org.springframework.boot spring-boot-starter-websocket org.springframework.boot spring-boot-starter-data-jdbc org.springframework.boot spring-boot-starter-data-jpa org.springframework.boot spring-boot-starter-security org.springframework.boot spring-boot-starter-validation org.springframework.boot spring-boot-starter-actuator org.springframework.security spring-security-web org.springframework.boot spring-boot-starter-test test org.junit.vintage junit-vintage-engine org.springframework.security spring-security-test test com.fasterxml.jackson.datatype jackson-datatype-joda org.springdoc springdoc-openapi-ui 
${org.springdoc.openapi.ui.version} com.h2database h2 org.webjars webjars-locator ${webjars-locator} org.apache.olingo olingo-odata2-lib ${olingo.version} pom javax.ws.rs javax.ws.rs-api com.google.code.gson gson org.eclipse.dirigible dirigible-dependencies ${dirigible.version} pom import default true org.jacoco jacoco-maven-plugin ${jacoco.version} prepare-agent prepare-agent SOURCEFILE *src/test/* org.apache.maven.plugins maven-compiler-plugin ${maven.compiler.plugin.version} ${maven.compiler.source} ${maven.compiler.target} true lines,vars,source custom stack UTF-8 10.2.7 17 17 17 3.3.0 3.2.0 src/main/resources/META-INF/dirigible 3.13.0 2.22.2 1.13.0 branch 2.11.0 1.15 3.12.0 1.3 1.10.0 2.10.1 4.11.0 1.3 1.8.0 4.10.0 1.7.36 1.7.12 1.4.5 2.9.0 42.7.0 2.20.11 3.15.0 5.17.3 1.0 9.4.48.v20220622 9.4.2 1.1.0 6.8.0 2.3.0 2.3.3 2.1.5 4.3 2.2.3 6.4.0.202211300538-r 1.6.4 2.0.13 3.3.1 4.9.10 3.12.11 3.1.2 4.16.1 1.9.0 1.13.0 11.1.42 0.24.4 1.8.2 1.6.5 5.1.0 0.33.0 2.3.6 3.3.12 3.6.0 1.0.8r1250 3.3.7 4.6.7 2.6.1 1.8.2 4.7.0 4.8.154 1.22 1.17.6 1.17.6 1.17.6 5.16.0 7.7.1 0.7.5 0.5.4 2.7.10 3.0.0 5.3.24 0.51 20.0.2 5.0.1 1.7 2.3.2 0.9.5.5 22.3.1 31.1-jre 72.1 3.2.2 4.4 2.3 3.0.45.202211090110 0.64.0 2.3.1 1.12.386 1.17.6 3.11.3 4.3.1 3.3.1 6.2.1 1.1 0.8.11 3.0.2 1.7.0 1.6.9 none Eclipse Dirigible version The tutorial is using Eclipse Dirigible version 10.2.7 as highlighted on line 229 . To check for a more recent and stable version go to Eclipse Dirigible Releases . Create new folder application and navigate to it. Create new application/pom.xml file. 
Paste the following content: application/pom.xml Git Repository For git repositories uncomment the following lines, in order to receive the Commit Id information in the About view: 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true Create Eclipse Dirigible Resources Navigate to the application folder. Create src/main/resources/ folder structure and navigate to it. Create dirigible.properties , index.html and index-busy.html files. dirigible.properties static/index.html static/index-busy.html Create application/src/main/resources/dirigible.properties file. 
Paste the following content: application/src/main/resources/dirigible.properties # General DIRIGIBLE_PRODUCT_NAME=${project.title} DIRIGIBLE_PRODUCT_VERSION=${project.version} DIRIGIBLE_PRODUCT_COMMIT_ID=${git.commit.id} DIRIGIBLE_PRODUCT_REPOSITORY=https://github.com/dirigiblelabs/tutorial-custom-stack DIRIGIBLE_PRODUCT_TYPE=all DIRIGIBLE_INSTANCE_NAME=custom-stack DIRIGIBLE_DATABASE_PROVIDER=local DIRIGIBLE_JAVASCRIPT_HANDLER_CLASS_NAME=org.eclipse.dirigible.graalium.handler.GraaliumJavascriptHandler DIRIGIBLE_GRAALIUM_ENABLE_DEBUG=true DIRIGIBLE_HOME_URL=services/web/ide/ DIRIGIBLE_FTP_PORT=22 Environment Variables The properties file will be packaged inside the Custom Stack , and the above environment variables will be set by default. These environment variables could be overridden during Deployment or at Runtime . To learn more about the supported configurations go to Environment Variables . Create static folder and navigate to it. Create application/src/main/resources/static/index.html file. Paste the following content: application/src/main/resources/static/index.html < html lang = \"en-US\" > < meta charset = \"utf-8\" > < title > Redirecting … < link rel = \"canonical\" href = \"/home\" > < script > location = \"/home\" < meta http-equiv = \"refresh\" content = \"0; url=/home\" > < meta name = \"robots\" content = \"noindex\" > < h1 > Redirecting … < a href = \"/home\" > Click here if you are not redirected. Create static folder and navigate to it. Create application/src/main/resources/static/index-busy.html file. 
Paste the following content: application/src/main/resources/static/index-busy.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"busyPage\" ng-controller = \"BusyController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Loading ... < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"padding-left: 10rem; padding-right: 10rem; margin-top: 3rem;\" > < div class = \"fd-panel fd-panel--fixed\" > < div class = \"fd-panel__header\" > < h4 class = \"fd-panel__title\" > Preparing Custom Stack Instance < fd-list > < fd-list-item ng-repeat = \"job in jobs\" > < span fd-object-status status = \"{{job.status}}\" glyph = \"{{job.statusIcon}}\" text = \"{{job.name}}\" > < fd-busy-indicator style = \"margin-top: 3rem;\" dg-size = \"l\" > < script > let busyPage = angular . module ( 'busyPage' , [ 'ideUI' , 'ideView' ]); busyPage . controller ( 'BusyController' , [ '$scope' , '$http' , 'theming' , function ( $scope , $http , theming ) { setInterval ( function () { $http ({ method : 'GET' , url : '/services/healthcheck' }). then ( function ( healthStatus ){ if ( healthStatus . data . status === \"Ready\" ) { window . location = '/home' ; } let jobs = []; for ( const [ key , value ] of Object . entries ( healthStatus . data . jobs . statuses )) { let job = new Object (); job . name = key ; switch ( value ) { case \"Succeeded\" : job . status = \"positive\" ; job . statusIcon = \"sap-icon--message-success\" break ; case \"Failed\" : job . 
status = \"negative\" ; job . statusIcon = \"sap-icon--message-error\" ; break ; default : job . status = \"informative\" ; job . statusIcon = \"sap-icon--message-information\" break ; } jobs . push ( job ); } $scope . jobs = jobs . sort (( x , y ) => x . name > y . name ? 1 : - 1 ); }, function ( e ){ console . error ( \"Error retrieving the health status\" , e ); }); }, 1000 ); }]); (optional) Create Eclipse Dirigible Error Resources Navigate to the application/src/main/resources folder. Create public folder and navigate to it. Create error.html , 403.html and 404.html files. error.html 403.html 404.html Create application/src/main/resources/public/error/error.html file. Paste the following content: application/src/main/resources/public/error/error.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"errorPage\" ng-controller = \"ErrorPageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Custom Stack | Unexpected Error Occurred < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"height: 600px; width: 100%;\" > < fd-message-page glyph = \"sap-icon--error\" > < fd-message-page-title > Unexpected Error Occurred < fd-message-page-subtitle > < b > There was a problem serving the requested page . < br > Usually this means that an unexpected error happened while processing your request. Here's what you can try next: < br > < br > < i >< b > Reload the page , the problem may be temporary.
If the problem persists, < b > contact us and we'll help get you on your way. < fd-message-page-actions > < fd-button compact = \"true\" dg-label = \"Reload Page\" dg-type = \"emphasized\" style = \"margin: 0 0.25rem;\" ng-click = \"reloadPage()\" > < fd-button compact = \"true\" dg-label = \"Contact Support\" ng-click = \"contactSupport()\" > < script > let errorPage = angular . module ( 'errorPage' , [ 'ideUI' , 'ideView' ]); errorPage . controller ( 'ErrorPageController' , [ '$scope' , 'theming' , function ( $scope , theming ) { $scope . reloadPage = function () { location . reload (); }; $scope . contactSupport = function () { window . open ( \"https://bugs.dirigible.io\" , \"_blank\" ); }; }]); Create error folder and navigate to it. Create application/src/main/resources/error/403.html file. Paste the following content: application/src/main/resources/error/403.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"errorPage\" ng-controller = \"ErrorPageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Custom Stack | Access Denied < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"height: 600px; width: 100%;\" > < fd-message-page glyph = \"sap-icon--alert\" > < fd-message-page-title > Access Denied < fd-message-page-subtitle > < b > The page you're trying to access has restricted access . < br > Please contact your system administrator for more details. < script > let errorPage = angular .
module ( 'errorPage' , [ 'ideUI' , 'ideView' ]); errorPage . controller ( 'ErrorPageController' , [ '$scope' , 'theming' , function ( $scope , theming ) { }]); Create error folder and navigate to it. Create application/src/main/resources/error/404.html file. Paste the following content: application/src/main/resources/error/404.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"errorPage\" ng-controller = \"ErrorPageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Custom Stack | Page Not Found < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"height: 600px; width: 100%;\" > < fd-message-page glyph = \"sap-icon--documents\" > < fd-message-page-title > Page Not Found < fd-message-page-subtitle > < b > It looks like you've reached a URL that doesn't exist . < br > The page you are looking for is no longer here, or never existed in the first place. < br > < br > < i > You can go to the < b > previous page , or start over from the < b > home page . < fd-message-page-actions > < fd-button compact = \"true\" dg-label = \"Go Back\" dg-type = \"emphasized\" style = \"margin: 0 0.25rem;\" ng-click = \"goBack()\" > < fd-button compact = \"true\" dg-label = \"Take Me Home\" ng-click = \"goHome()\" > < script > let errorPage = angular . module ( 'errorPage' , [ 'ideUI' , 'ideView' ]); errorPage . controller ( 'ErrorPageController' , [ '$scope' , 'theming' , function ( $scope , theming ) { $scope .
goBack = function () { history . back (); }; $scope . goHome = function () { window . location = \"/home\" ; }; }]); Create Spring Boot Resources Navigate to the application folder. Create application.properties , quartz.properties and CustomStackApplication.java files. application.properties quartz.properties CustomStackApplication.java Navigate to the src/main/resources/ folder. Create application/src/main/resources/application.properties file. Paste the following content: application/src/main/resources/application.properties server.port=8080 spring.main.allow-bean-definition-overriding=true server.error.include-message=always spring.servlet.multipart.enabled=true spring.servlet.multipart.file-size-threshold=2KB spring.servlet.multipart.max-file-size=1GB spring.servlet.multipart.max-request-size=1GB spring.servlet.multipart.max-file-size=200MB spring.servlet.multipart.max-request-size=215MB spring.servlet.multipart.location=${java.io.tmpdir} spring.datasource.hikari.connectionTimeout=3600000 spring.mvc.async.request-timeout=3600000 basic.enabled=${DIRIGIBLE_BASIC_ENABLED:true} terminal.enabled=${DIRIGIBLE_TERMINAL_ENABLED:false} keycloak.enabled=${DIRIGIBLE_KEYCLOAK_ENABLED:false} keycloak.realm=${DIRIGIBLE_KEYCLOAK_REALM:null} keycloak.auth-server-url=${DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL:null} keycloak.ssl-required=${DIRIGIBLE_KEYCLOAK_SSL_REQUIRED:external} keycloak.resource=${DIRIGIBLE_KEYCLOAK_CLIENT_ID:null} keycloak.public-client=true keycloak.principal-attribute=preferred_username keycloak.confidential-port=${DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT:443} keycloak.use-resource-role-mappings=true management.metrics.mongo.command.enabled=false management.metrics.mongo.connectionpool.enabled=false management.endpoints.jmx.exposure.include=* management.endpoints.jmx.exposure.exclude= management.endpoints.web.exposure.include=* management.endpoints.web.exposure.exclude= management.endpoint.health.show-details=always springdoc.api-docs.path=/api-docs 
cxf.path=/odata/v2 # the following are used to force Spring to create the QUARTZ tables # quartz properties are managed in quartz.properties; don't try to add them here spring.quartz.job-store-type=jdbc spring.quartz.jdbc.initialize-schema=always Navigate to the src/main/resources/ folder. Create application/src/main/resources/quartz.properties file. Paste the following content: application/src/main/resources/quartz.properties # thread-pool org.quartz.threadPool.class=org.quartz.simpl.SimpleThreadPool org.quartz.threadPool.threadCount=2 org.quartz.threadPool.threadsInheritContextClassLoaderOfInitializingThread=true # job-store # Enable this property for RAMJobStore org.quartz.jobStore.class=org.quartz.simpl.RAMJobStore # Enable these properties for a JDBCJobStore using JobStoreTX #org.quartz.jobStore.class=org.quartz.impl.jdbcjobstore.JobStoreTX #org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.StdJDBCDelegate #org.quartz.jobStore.dataSource=quartzDataSource # Enable this property for JobStoreCMT #org.quartz.jobStore.nonManagedTXDataSource=quartzDataSource # H2 database # use an in-memory database & initialise Quartz using their standard SQL script #org.quartz.dataSource.quartzDataSource.URL=jdbc:h2:mem:spring-quartz;INIT=RUNSCRIPT FROM 'classpath:/org/quartz/impl/jdbcjobstore/tables_h2.sql' #org.quartz.dataSource.quartzDataSource.driver=org.h2.Driver #org.quartz.dataSource.quartzDataSource.user=sa #org.quartz.dataSource.quartzDataSource.password= #org.quartz.jdbc.initialize-schema=never Navigate to the src/main folder. Create java/io/dirigible/samples/ and navigate to it. Create application/src/main/java/io/dirigible/samples/CustomStackApplication.java file.
Paste the following content: application/src/main/java/io/dirigible/samples/CustomStackApplication.java package io.dirigible.samples ; import org.springframework.boot.SpringApplication ; import org.springframework.boot.autoconfigure.SpringBootApplication ; import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration ; import org.springframework.boot.autoconfigure.jdbc.DataSourceTransactionManagerAutoConfiguration ; import org.springframework.boot.autoconfigure.jdbc.JdbcTemplateAutoConfiguration ; import org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration ; import org.springframework.data.jpa.repository.config.EnableJpaAuditing ; import org.springframework.data.jpa.repository.config.EnableJpaRepositories ; import org.springframework.scheduling.annotation.EnableScheduling ; @EnableJpaAuditing @EnableJpaRepositories @SpringBootApplication ( scanBasePackages = { \"io.dirigible.samples\" , \"org.eclipse.dirigible.components\" }, exclude = { DataSourceAutoConfiguration . class , DataSourceTransactionManagerAutoConfiguration . class , HibernateJpaAutoConfiguration . class , JdbcTemplateAutoConfiguration . class }) @EnableScheduling public class CustomStackApplication { public static void main ( String [] args ) { SpringApplication . run ( CustomStackApplication . class , args ); } } Build the Custom Stack Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Stack : mvn clean install Run the Custom Stack Navigate to the root folder of the project (e.g. /custom-stack ) . 
Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Debugging To run in debug mode, execute the following command: java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=8000 -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack . Next Steps Section Completed After completing the steps in this tutorial, you would have: Maven project structure. Spring Boot application. Eclipse Dirigible Stack running at http://localhost:8080 . Continue to the Branding section to customize the branding of the Custom Stack. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Project Structure"},{"location":"tutorials/customizations/custom-stack/project-structure/#custom-stack-project-structure","text":"","title":"Custom Stack - Project Structure"},{"location":"tutorials/customizations/custom-stack/project-structure/#overview","text":"This section shows how to create the project structure of the Custom Stack. It contains the creation of several Maven pom.xml files, static content resources, application.properties configuration files and a Spring Boot Java class. Prerequisites JDK 21+ - OpenJDK versions can be found here . Maven 3.5+ - Maven version 3.5.3 can be found here .","title":"Overview"},{"location":"tutorials/customizations/custom-stack/project-structure/#steps","text":"","title":"Steps"},{"location":"tutorials/customizations/custom-stack/project-structure/#create-maven-project","text":"Create new folder on your machine, for the custom stack (e.g. /custom-stack ) . 
Create pom.xml and application/pom.xml files. pom.xml application/pom.xml Create new pom.xml file. Paste the following content: pom.xml 4.0.0 org.sonatype.oss oss-parent 7 custom - stack - parent Custom Stack - Sample io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT pom 2024 https://www.dirigible.io Eclipse Foundation https://www.eclipse.org https://github.com/dirigiblelabs/tutorial-custom-stack application org.slf4j slf4j-api compile ch.qos.logback logback-core compile ch.qos.logback logback-classic compile org.eclipse.dirigible dirigible-commons-config org.springframework.boot spring-boot-starter-web org.apache.logging.log4j log4j-to-slf4j org.springframework.boot spring-boot-starter-websocket org.springframework.boot spring-boot-starter-data-jdbc org.springframework.boot spring-boot-starter-data-jpa org.springframework.boot spring-boot-starter-security org.springframework.boot spring-boot-starter-validation org.springframework.boot spring-boot-starter-actuator org.springframework.security spring-security-web org.springframework.boot spring-boot-starter-test test org.junit.vintage junit-vintage-engine org.springframework.security spring-security-test test com.fasterxml.jackson.datatype jackson-datatype-joda org.springdoc springdoc-openapi-ui ${org.springdoc.openapi.ui.version} com.h2database h2 org.webjars webjars-locator ${webjars-locator} org.apache.olingo olingo-odata2-lib ${olingo.version} pom javax.ws.rs javax.ws.rs-api com.google.code.gson gson org.eclipse.dirigible dirigible-dependencies ${dirigible.version} pom import default true org.jacoco jacoco-maven-plugin ${jacoco.version} prepare-agent prepare-agent SOURCEFILE *src/test/* org.apache.maven.plugins maven-compiler-plugin ${maven.compiler.plugin.version} ${maven.compiler.source} ${maven.compiler.target} true lines,vars,source custom stack UTF-8 10.2.7 17 17 17 3.3.0 3.2.0 src/main/resources/META-INF/dirigible 3.13.0 2.22.2 1.13.0 branch 2.11.0 1.15 3.12.0 1.3 1.10.0 2.10.1 4.11.0 1.3 1.8.0 
4.10.0 1.7.36 1.7.12 1.4.5 2.9.0 42.7.0 2.20.11 3.15.0 5.17.3 1.0 9.4.48.v20220622 9.4.2 1.1.0 6.8.0 2.3.0 2.3.3 2.1.5 4.3 2.2.3 6.4.0.202211300538-r 1.6.4 2.0.13 3.3.1 4.9.10 3.12.11 3.1.2 4.16.1 1.9.0 1.13.0 11.1.42 0.24.4 1.8.2 1.6.5 5.1.0 0.33.0 2.3.6 3.3.12 3.6.0 1.0.8r1250 3.3.7 4.6.7 2.6.1 1.8.2 4.7.0 4.8.154 1.22 1.17.6 1.17.6 1.17.6 5.16.0 7.7.1 0.7.5 0.5.4 2.7.10 3.0.0 5.3.24 0.51 20.0.2 5.0.1 1.7 2.3.2 0.9.5.5 22.3.1 31.1-jre 72.1 3.2.2 4.4 2.3 3.0.45.202211090110 0.64.0 2.3.1 1.12.386 1.17.6 3.11.3 4.3.1 3.3.1 6.2.1 1.1 0.8.11 3.0.2 1.7.0 1.6.9 none Eclipse Dirigible version The tutorial is using Eclipse Dirigible version 10.2.7 as highlighted on line 229 . To check for a more recent and stable version go to Eclipse Dirigible Releases . Create new folder application and navigate to it. Create new application/pom.xml file. Paste the following content: application/pom.xml Git Repository For git repositories uncomment the following lines, in order to receive the Commit Id information in the About view: 4.0.0 io.dirigible.samples custom-stack-parent 1.0.0-SNAPSHOT ../pom.xml custom - stack - application custom-stack-application jar org.eclipse.dirigible dirigible-components-group-core pom com.zaxxer HikariCP-java7 org.eclipse.dirigible dirigible-components-security-basic org.eclipse.dirigible dirigible-components-security-keycloak org.eclipse.dirigible dirigible-components-group-database pom org.eclipse.dirigible dirigible-components-group-engines pom javax.validation validation-api javax.servlet javax.servlet-api org.apache.cxf cxf-rt-frontend-jaxrs org.apache.cxf cxf-spring-boot-starter-jaxrs org.eclipse.dirigible dirigible-components-engine-command org.eclipse.dirigible dirigible-components-group-ide pom org.eclipse.dirigible dirigible-components-group-api pom org.eclipse.dirigible dirigible-components-group-resources pom org.eclipse.dirigible dirigible-components-security-oauth2 org.eclipse.dirigible dirigible-components-group-templates pom 
org.springframework.boot spring-boot-starter-validation com.codeborne selenide 7.2.2 test org.postgresql postgresql org.eclipse.dirigible dirigible-database-mongodb-jdbc com.sap.cloud.db.jdbc ngdbc ${ngdbc.version} net.snowflake snowflake-jdbc ${snowflake.version} org.eclipse.dirigible dirigible-tests-framework org.springframework.boot spring-boot-maven-plugin io.dirigible.samples.CustomStackApplication repackage src/main/resources true ","title":"Create Maven Project"},{"location":"tutorials/customizations/custom-stack/project-structure/#create-eclipse-dirigible-resources","text":"Navigate to the application folder. Create src/main/resources/ folder structure and navigate to it. Create dirigible.properties , index.html and index-busy.html files. dirigible.properties static/index.html static/index-busy.html Create application/src/main/resources/dirigible.properties file. Paste the following content: application/src/main/resources/dirigible.properties # General DIRIGIBLE_PRODUCT_NAME=${project.title} DIRIGIBLE_PRODUCT_VERSION=${project.version} DIRIGIBLE_PRODUCT_COMMIT_ID=${git.commit.id} DIRIGIBLE_PRODUCT_REPOSITORY=https://github.com/dirigiblelabs/tutorial-custom-stack DIRIGIBLE_PRODUCT_TYPE=all DIRIGIBLE_INSTANCE_NAME=custom-stack DIRIGIBLE_DATABASE_PROVIDER=local DIRIGIBLE_JAVASCRIPT_HANDLER_CLASS_NAME=org.eclipse.dirigible.graalium.handler.GraaliumJavascriptHandler DIRIGIBLE_GRAALIUM_ENABLE_DEBUG=true DIRIGIBLE_HOME_URL=services/web/ide/ DIRIGIBLE_FTP_PORT=22 Environment Variables The properties file will be packaged inside the Custom Stack , and the above environment variables will be set by default. These environment variables could be overridden during Deployment or at Runtime . To learn more about the supported configurations go to Environment Variables . Create static folder and navigate to it. Create application/src/main/resources/static/index.html file. 
Paste the following content: application/src/main/resources/static/index.html < html lang = \"en-US\" > < meta charset = \"utf-8\" > < title > Redirecting … < link rel = \"canonical\" href = \"/home\" > < script > location = \"/home\" < meta http-equiv = \"refresh\" content = \"0; url=/home\" > < meta name = \"robots\" content = \"noindex\" > < h1 > Redirecting … < a href = \"/home\" > Click here if you are not redirected. Create static folder and navigate to it. Create application/src/main/resources/static/index-busy.html file. Paste the following content: application/src/main/resources/static/index-busy.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"busyPage\" ng-controller = \"BusyController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Loading ... < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"padding-left: 10rem; padding-right: 10rem; margin-top: 3rem;\" > < div class = \"fd-panel fd-panel--fixed\" > < div class = \"fd-panel__header\" > < h4 class = \"fd-panel__title\" > Preparing Custom Stack Instance < fd-list > < fd-list-item ng-repeat = \"job in jobs\" > < span fd-object-status status = \"{{job.status}}\" glyph = \"{{job.statusIcon}}\" text = \"{{job.name}}\" > < fd-busy-indicator style = \"margin-top: 3rem;\" dg-size = \"l\" > < script > let busyPage = angular . module ( 'busyPage' , [ 'ideUI' , 'ideView' ]); busyPage . 
controller ( 'BusyController' , [ '$scope' , '$http' , 'theming' , function ( $scope , $http , theming ) { setInterval ( function () { $http ({ method : 'GET' , url : '/services/healthcheck' }). then ( function ( healthStatus ){ if ( healthStatus . data . status === \"Ready\" ) { window . location = '/home' ; } let jobs = []; for ( const [ key , value ] of Object . entries ( healthStatus . data . jobs . statuses )) { let job = new Object (); job . name = key ; switch ( value ) { case \"Succeeded\" : job . status = \"positive\" ; job . statusIcon = \"sap-icon--message-success\" break ; case \"Failed\" : job . status = \"negative\" ; job . statusIcon = \"sap-icon--message-error\" ; break ; default : job . status = \"informative\" ; job . statusIcon = \"sap-icon--message-information\" break ; } jobs . push ( job ); } $scope . jobs = jobs . sort (( x , y ) => x . name > y . name ? 1 : - 1 ); }, function ( e ){ console . error ( \"Error retrieving the health status\" , e ); }); }, 1000 ); }]); ","title":"Create Eclipse Dirigible Resources"},{"location":"tutorials/customizations/custom-stack/project-structure/#optional-create-eclipse-dirigible-error-resources","text":"Navigate to the application/src/main/resources folder. Create public folder and navigate to it. Create error.html , 403.html and 404.html files. error.html 403.html 404.html Create application/src/main/resources/public/error/error.html file.
Paste the following content: application/src/main/resources/public/error/error.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"errorPage\" ng-controller = \"ErrorPageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Custom Stack | Unexpected Error Occurred < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"height: 600px; width: 100%;\" > < fd-message-page glyph = \"sap-icon--error\" > < fd-message-page-title > Unexpected Error Occurred < fd-message-page-subtitle > < b > There was a problem serving the requested page . < br > Usually this means that an unexpected error happened while processing your request. Here's what you can try next: < br > < br > < i >< b > Reload the page , the problem may be temporary. If the problem persists, < b > contact us and we'll help get you on your way. < fd-message-page-actions > < fd-button compact = \"true\" dg-label = \"Reload Page\" dg-type = \"emphasized\" style = \"margin: 0 0.25rem;\" ng-click = \"reloadPage()\" > < fd-button compact = \"true\" dg-label = \"Contact Support\" ng-click = \"contactSupport()\" > < script > let errorPage = angular . module ( 'errorPage' , [ 'ideUI' , 'ideView' ]); errorPage . controller ( 'ErrorPageController' , [ '$scope' , 'theming' , function ( $scope , theming ) { $scope . reloadPage = function () { location . reload (); }; $scope . contactSupport = function () { window .
open ( \"https://bugs.dirigible.io\" , \"_blank\" ); }; }]); Create error folder and navigate to it. Create application/src/main/resources/error/403.html file. Paste the following content: application/src/main/resources/error/403.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"errorPage\" ng-controller = \"ErrorPageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Custom Stack | Access Denied < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"height: 600px; width: 100%;\" > < fd-message-page glyph = \"sap-icon--alert\" > < fd-message-page-title > Access Denied < fd-message-page-subtitle > < b > The page you're trying to access has restricted access . < br > Please contact your system administrator for more details. < script > let errorPage = angular . module ( 'errorPage' , [ 'ideUI' , 'ideView' ]); errorPage . controller ( 'ErrorPageController' , [ '$scope' , 'theming' , function ( $scope , theming ) { }]); Create error folder and navigate to it. Create application/src/main/resources/error/404.html file.
Paste the following content: application/src/main/resources/error/404.html < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"errorPage\" ng-controller = \"ErrorPageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"/services/web/resources/images/favicon.ico\" /> < title > Custom Stack | Page Not Found < theme > < script type = \"text/javascript\" src = \"/services/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/js/resources-core/services/loader.js?id=application-view-css\" /> < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < div style = \"height: 600px; width: 100%;\" > < fd-message-page glyph = \"sap-icon--documents\" > < fd-message-page-title > Page Not Found < fd-message-page-subtitle > < b > It looks like you've reached a URL that doesn't exist . < br > The page you are looking for is no longer here, or never existed in the first place. < br > < br > < i > You can go to the < b > previous page , or start over from the < b > home page . < fd-message-page-actions > < fd-button compact = \"true\" dg-label = \"Go Back\" dg-type = \"emphasized\" style = \"margin: 0 0.25rem;\" ng-click = \"goBack()\" > < fd-button compact = \"true\" dg-label = \"Take Me Home\" ng-click = \"goHome()\" > < script > let errorPage = angular . module ( 'errorPage' , [ 'ideUI' , 'ideView' ]); errorPage . controller ( 'ErrorPageController' , [ '$scope' , 'theming' , function ( $scope , theming ) { $scope . goBack = function () { history . back (); }; $scope . goHome = function () { window .
location = \"/home\" ; }; }]); ","title":"(optional) Create Eclipse Dirigible Error Resources"},{"location":"tutorials/customizations/custom-stack/project-structure/#create-spring-boot-resources","text":"Navigate to the application folder. Create application.properties , quartz.properties and CustomStackApplication.java files. application.properties quartz.properties CustomStackApplication.java Navigate to the src/main/resources/ folder. Create application/src/main/resources/application.properties file. Paste the following content: application/src/main/resources/application.properties server.port=8080 spring.main.allow-bean-definition-overriding=true server.error.include-message=always spring.servlet.multipart.enabled=true spring.servlet.multipart.file-size-threshold=2KB spring.servlet.multipart.max-file-size=1GB spring.servlet.multipart.max-request-size=1GB spring.servlet.multipart.max-file-size=200MB spring.servlet.multipart.max-request-size=215MB spring.servlet.multipart.location=${java.io.tmpdir} spring.datasource.hikari.connectionTimeout=3600000 spring.mvc.async.request-timeout=3600000 basic.enabled=${DIRIGIBLE_BASIC_ENABLED:true} terminal.enabled=${DIRIGIBLE_TERMINAL_ENABLED:false} keycloak.enabled=${DIRIGIBLE_KEYCLOAK_ENABLED:false} keycloak.realm=${DIRIGIBLE_KEYCLOAK_REALM:null} keycloak.auth-server-url=${DIRIGIBLE_KEYCLOAK_AUTH_SERVER_URL:null} keycloak.ssl-required=${DIRIGIBLE_KEYCLOAK_SSL_REQUIRED:external} keycloak.resource=${DIRIGIBLE_KEYCLOAK_CLIENT_ID:null} keycloak.public-client=true keycloak.principal-attribute=preferred_username keycloak.confidential-port=${DIRIGIBLE_KEYCLOAK_CONFIDENTIAL_PORT:443} keycloak.use-resource-role-mappings=true management.metrics.mongo.command.enabled=false management.metrics.mongo.connectionpool.enabled=false management.endpoints.jmx.exposure.include=* management.endpoints.jmx.exposure.exclude= management.endpoints.web.exposure.include=* management.endpoints.web.exposure.exclude= 
management.endpoint.health.show-details=always springdoc.api-docs.path=/api-docs cxf.path=/odata/v2 # the following are used to force Spring to create the QUARTZ tables # quartz properties are managed in quartz.properties, don't try to add them here spring.quartz.job-store-type=jdbc spring.quartz.jdbc.initialize-schema=always Navigate to the src/main/resources/ folder. Create application/src/main/resources/quartz.properties file. Paste the following content: application/src/main/resources/quartz.properties # thread-pool org.quartz.threadPool.class=org.quartz.simpl.SimpleThreadPool org.quartz.threadPool.threadCount=2 org.quartz.threadPool.threadsInheritContextClassLoaderOfInitializingThread=true # job-store # Enable this property for RAMJobStore org.quartz.jobStore.class=org.quartz.simpl.RAMJobStore # Enable these properties for a JDBCJobStore using JobStoreTX #org.quartz.jobStore.class=org.quartz.impl.jdbcjobstore.JobStoreTX #org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.StdJDBCDelegate #org.quartz.jobStore.dataSource=quartzDataSource # Enable this property for JobStoreCMT #org.quartz.jobStore.nonManagedTXDataSource=quartzDataSource # H2 database # use an in-memory database & initialise Quartz using their standard SQL script #org.quartz.dataSource.quartzDataSource.URL=jdbc:h2:mem:spring-quartz;INIT=RUNSCRIPT FROM 'classpath:/org/quartz/impl/jdbcjobstore/tables_h2.sql' #org.quartz.dataSource.quartzDataSource.driver=org.h2.Driver #org.quartz.dataSource.quartzDataSource.user=sa #org.quartz.dataSource.quartzDataSource.password= #org.quartz.jdbc.initialize-schema=never Navigate to the src/main folder. Create java/io/dirigible/samples/ and navigate to it. Create application/src/main/java/io/dirigible/samples/CustomStackApplication.java file.
Paste the following content: application/src/main/java/io/dirigible/samples/CustomStackApplication.java package io.dirigible.samples ; import org.springframework.boot.SpringApplication ; import org.springframework.boot.autoconfigure.SpringBootApplication ; import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration ; import org.springframework.boot.autoconfigure.jdbc.DataSourceTransactionManagerAutoConfiguration ; import org.springframework.boot.autoconfigure.jdbc.JdbcTemplateAutoConfiguration ; import org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration ; import org.springframework.data.jpa.repository.config.EnableJpaAuditing ; import org.springframework.data.jpa.repository.config.EnableJpaRepositories ; import org.springframework.scheduling.annotation.EnableScheduling ; @EnableJpaAuditing @EnableJpaRepositories @SpringBootApplication ( scanBasePackages = { \"io.dirigible.samples\" , \"org.eclipse.dirigible.components\" }, exclude = { DataSourceAutoConfiguration . class , DataSourceTransactionManagerAutoConfiguration . class , HibernateJpaAutoConfiguration . class , JdbcTemplateAutoConfiguration . class }) @EnableScheduling public class CustomStackApplication { public static void main ( String [] args ) { SpringApplication . run ( CustomStackApplication . class , args ); } }","title":"Create Spring Boot Resources"},{"location":"tutorials/customizations/custom-stack/project-structure/#build-the-custom-stack","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . Open the Terminal and execute the following command to build the Custom Stack : mvn clean install","title":"Build the Custom Stack"},{"location":"tutorials/customizations/custom-stack/project-structure/#run-the-custom-stack","text":"Navigate to the root folder of the project (e.g. /custom-stack ) . 
Open the Terminal and execute the following command to run the Custom Stack : java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -jar application/target/custom-stack-application-*.jar Debugging To run in debug mode, execute the following command: java --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=8000 -jar application/target/custom-stack-application-*.jar Go to http://localhost:8080 to access the Custom Stack .","title":"Run the Custom Stack"},{"location":"tutorials/customizations/custom-stack/project-structure/#next-steps","text":"Section Completed After completing the steps in this tutorial, you will have: A Maven project structure. A Spring Boot application. An Eclipse Dirigible Stack running at http://localhost:8080 . Continue to the Branding section to customize the branding of the Custom Stack. Note: The complete content of the Custom Stack tutorial is available at: https://github.com/dirigiblelabs/tutorial-custom-stack","title":"Next Steps"},{"location":"tutorials/customizations/ide/create-perspective/","text":"Create Perspective All perspectives in Eclipse Dirigible are loaded via the ide-perspective extension point. A list of all extension points can be found on the Extensions Overview page. To develop a new perspective, an extension , a perspective definition , and frontend resources should be created. The following example uses AngularJS and Fundamental Library . Steps Start Eclipse Dirigible. Info You can find more information on how to do that by following: Getting Started section. Setup section. Go to the Projects perspective and create New Project . Enter my-perspective for the name of the project. The project will appear under the projects list.
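Before creating the individual files, it helps to see how the pieces fit together: each *.extension file points an extension point at a CJS module, and the platform calls that module's exported factory. The following stand-alone sketch is illustrative only — the in-memory registry is hypothetical, and only the getPerspective() contract comes from this tutorial:

```javascript
// Hypothetical in-memory stand-in for the contributed CJS modules.
// Only the getPerspective() return shape mirrors perspective.js below.
const modules = {
  "my-perspective/perspective/perspective.js": {
    getPerspective: () => ({
      id: "my-perspective",
      name: "My Perspective",
      link: "../my-perspective/index.html",
      order: "1000",
      icon: "../my-perspective/icon.svg"
    })
  }
};

// Each *.extension file contributes { module, extensionPoint } metadata.
const extensions = [
  { module: "my-perspective/perspective/perspective.js", extensionPoint: "ide-perspective" }
];

// Resolve every module contributed to a given extension point and
// collect the descriptors its factory returns.
function resolvePerspectives() {
  return extensions
    .filter(e => e.extensionPoint === "ide-perspective")
    .map(e => modules[e.module].getPerspective());
}

console.log(resolvePerspectives()[0].name); // My Perspective
```

Dirigible's actual loader works against the registry of deployed projects, but the contract is the same: the extension metadata names the module, and the module's export supplies the descriptor.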
Create perspective extension: Right click on the my-perspective project and select New \u2192 Folder . Enter perspective for the name of the folder. Right click on the perspective folder and select New \u2192 Folder . Enter extensions for the name of the folder. Create perspective.extension , perspective-menu.extension , perspective-menu-window.extension and perspective-menu-help.extension files. perspective.extension perspective-menu.extension perspective-menu-window.extension perspective-menu-help.extension Right click on the extensions folder and select New \u2192 Extension . Enter perspective.extension for the name of the file. Right click on the perspective.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"my-perspective/perspective/perspective.js\" , \"extensionPoint\" : \"ide-perspective\" , \"description\" : \"My Perspective\" } Save the changes and close the Code Editor . (optional) Double click on the perspective.extension file to open the extension with the Extension Editor . Right click on the extensions folder and select New \u2192 Extension . Enter perspective-menu.extension for the name of the file. Right click on the perspective-menu.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"my-perspective/perspective/perspective-menu.js\" , \"extensionPoint\" : \"my-perspective-menu\" , \"description\" : \"My Perspective Menu\" } Save the changes and close the Code Editor . (optional) Double click on the perspective-menu.extension file to open the extension with the Extension Editor . Right click on the extensions folder and select New \u2192 Extension . Enter perspective-menu-window.extension for the name of the file. Right click on the perspective-menu-window.extension file and select Open With \u2192 Code Editor . 
Replace the content with the following definition: { \"module\" : \"ide-core/services/menus/window.js\" , \"extensionPoint\" : \"my-perspective-menu\" , \"description\" : \"Window Menu\" } Save the changes and close the Code Editor . (optional) Double click on the perspective-menu-window.extension file to open the extension with the Extension Editor . Right click on the extensions folder and select New \u2192 Extension . Enter perspective-menu-help.extension for the name of the file. Right click on the perspective-menu-help.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"ide-core/services/menus/help.js\" , \"extensionPoint\" : \"my-perspective-menu\" , \"description\" : \"Help Menu\" } Save the changes and close the Code Editor . (optional) Double click on the perspective-menu-help.extension file to open the extension with the Extension Editor . Create perspective definition: Create perspective.js and perspective-menu.js files. perspective.js perspective-menu.js Right click on the perspective folder and select New \u2192 JavaScript CJS Service . Enter perspective.js for the name of the file. Double click on the perspective.js file to open it with the Code Editor . Replace the content with the following code: const perspectiveData = { id : \"my-perspective\" , name : \"My Perspective\" , link : \"../my-perspective/index.html\" , order : \"1000\" , icon : \"../my-perspective/icon.svg\" , }; if ( typeof exports !== 'undefined' ) { exports . getPerspective = function () { return perspectiveData ; } } Save the changes and close the Code Editor . Right click on the perspective folder and select New \u2192 JavaScript CJS Service . Enter perspective-menu.js for the name of the file. Double click on the perspective-menu.js file to open it with the Code Editor . Replace the content with the following code: exports . 
getMenu = function () { return { label : \"My Menu\" , order : 1 , items : [ { label : \"Empty item\" , order : 1 }, { label : \"Empty item with divider\" , divider : true , order : 2 }, { label : \"Submenu\" , order : 3 , items : [ { label : \"GitHub page\" , data : \"https://github.com/eclipse/dirigible\" , action : \"open\" , order : 1 } ] }, { label : \"About\" , action : \"openDialogWindow\" , dialogId : \"about\" , order : 4 } ] }; } Save the changes and close the Code Editor . Create perspective frontend resources: Create index.html , controller.js and icon.svg files. index.html controller.js icon.svg Right click on the my-perspective project and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double click on the index.html file to open it with the Code Editor . Replace the content with the following code: < html lang = \"en\" ng-app = \"myPerspective\" ng-controller = \"MyPerspectiveController\" xmlns = \"http://www.w3.org/1999/xhtml\" > < head > < meta charset = \"utf-8\" /> < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < title dg-brand-title > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < script type = \"text/javascript\" src = \"/services/v4/web/my-perspective/perspective/perspective.js\" > < theme > < script type = \"text/javascript\" src = \"/services/v4/js/ide-core/services/loader.js?id=ide-perspective-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/js/ide-core/services/loader.js?id=ide-perspective-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body dg-contextmenu = \"contextMenuContent\" > < ide-header menu-ext-id = \"my-perspective-menu\" > < ide-contextmenu > < ide-container > < ide-layout views-layout-model = \"layoutModel\" > < ide-dialogs > < ide-status-bar > Save the changes and close the Code Editor . Right click on the my-perspective project and select New \u2192 File . 
Enter controller.js for the name of the file. Double click on the controller.js file to open it with the Code Editor . Replace the content with the following code: let myPerspective = angular . module ( \"myPerspective\" , [ \"ngResource\" , \"ideLayout\" , \"ideUI\" ]); myPerspective . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = 'example' ; }]); myPerspective . controller ( \"MyPerspectiveController\" , [ \"$scope\" , \"messageHub\" , function ( $scope , messageHub ) { $scope . layoutModel = { // Array of view ids views : [ \"import\" , \"welcome\" , \"console\" ], layoutSettings : { hideEditorsPane : false }, events : { \"example.alert.info\" : function ( msg ) { console . info ( msg . data . message ); } } }; $scope . contextMenuContent = function ( element ) { return { callbackTopic : \"example.contextmenu\" , items : [ { id : \"new\" , label : \"New\" , icon : \"sap-icon--create\" , items : [ { id : \"tab\" , label : \"Tab\" }, ] }, { id : \"other\" , label : \"Other\" , divider : true , icon : \"sap-icon--question-mark\" } ] } }; messageHub . onDidReceiveMessage ( \"contextmenu\" , function ( msg ) { if ( msg . data == \"other\" ) { messageHub . showAlertSuccess ( \"Success\" , \"You have selected the other option!\" ); } else { messageHub . showAlertInfo ( \"Nothing will happen\" , \"This is just a demo after all.\" ); } } ); }]); Save the changes and close the Code Editor . Right click on the my-perspective project and select New \u2192 File . Enter icon.svg for the name of the file. Right click on the icon.svg file and select Open With \u2192 Code Editor . Replace the content with the following code: Save the changes and close the Code Editor . Refresh the browser. Info In some cases you may want to go to Theme \u2192 Reset to clean Web IDE state. The new perspective should be visible at the bottom of the perspectives list.
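The getMenu() definition shown earlier positions every entry with its numeric order field, including entries inside submenus. How an order-based sort could work is easy to sketch in plain JavaScript (illustrative only — Dirigible's actual menu renderer is not part of this tutorial; the menu data mirrors perspective-menu.js):

```javascript
// Illustrative only: sort a menu tree like the one returned by getMenu()
// by each entry's numeric `order`, recursing into nested submenus.
function sortMenu(items) {
  return [...items]
    .sort((a, b) => a.order - b.order)
    .map(item => item.items ? { ...item, items: sortMenu(item.items) } : item);
}

// Same shape as the tutorial's getMenu() result, deliberately out of order.
const menu = {
  label: "My Menu",
  order: 1,
  items: [
    { label: "Submenu", order: 3, items: [{ label: "GitHub page", order: 1 }] },
    { label: "Empty item", order: 1 },
    { label: "About", order: 4 },
    { label: "Empty item with divider", divider: true, order: 2 }
  ]
};

const sorted = sortMenu(menu.items);
console.log(sorted.map(i => i.label).join(", "));
// Empty item, Empty item with divider, Submenu, About
```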
Info Alternatively go to Window \u2192 Open Perspective \u2192 My Perspective to open the new perspective.","title":"Perspective"},{"location":"tutorials/customizations/ide/create-perspective/#create-perspective","text":"All perspectives in Eclipse Dirigible are loaded via the ide-perspective extension point. A list of all extension points can be found on the Extensions Overview page. To develop a new perspective, an extension , a perspective definition , and frontend resources should be created. The following example uses AngularJS and Fundamental Library .","title":"Create Perspective"},{"location":"tutorials/customizations/ide/create-perspective/#steps","text":"Start Eclipse Dirigible. Info You can find more information on how to do that by following: Getting Started section. Setup section. Go to the Projects perspective and create New Project . Enter my-perspective for the name of the project. The project will appear under the projects list. Create perspective extension: Right click on the my-perspective project and select New \u2192 Folder . Enter perspective for the name of the folder. Right click on the perspective folder and select New \u2192 Folder . Enter extensions for the name of the folder. Create perspective.extension , perspective-menu.extension , perspective-menu-window.extension and perspective-menu-help.extension files. perspective.extension perspective-menu.extension perspective-menu-window.extension perspective-menu-help.extension Right click on the extensions folder and select New \u2192 Extension . Enter perspective.extension for the name of the file. Right click on the perspective.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"my-perspective/perspective/perspective.js\" , \"extensionPoint\" : \"ide-perspective\" , \"description\" : \"My Perspective\" } Save the changes and close the Code Editor .
(optional) Double click on the perspective.extension file to open the extension with the Extension Editor . Right click on the extensions folder and select New \u2192 Extension . Enter perspective-menu.extension for the name of the file. Right click on the perspective-menu.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"my-perspective/perspective/perspective-menu.js\" , \"extensionPoint\" : \"my-perspective-menu\" , \"description\" : \"My Perspective Menu\" } Save the changes and close the Code Editor . (optional) Double click on the perspective-menu.extension file to open the extension with the Extension Editor . Right click on the extensions folder and select New \u2192 Extension . Enter perspective-menu-window.extension for the name of the file. Right click on the perspective-menu-window.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"ide-core/services/menus/window.js\" , \"extensionPoint\" : \"my-perspective-menu\" , \"description\" : \"Window Menu\" } Save the changes and close the Code Editor . (optional) Double click on the perspective-menu-window.extension file to open the extension with the Extension Editor . Right click on the extensions folder and select New \u2192 Extension . Enter perspective-menu-help.extension for the name of the file. Right click on the perspective-menu-help.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"ide-core/services/menus/help.js\" , \"extensionPoint\" : \"my-perspective-menu\" , \"description\" : \"Help Menu\" } Save the changes and close the Code Editor . (optional) Double click on the perspective-menu-help.extension file to open the extension with the Extension Editor . Create perspective definition: Create perspective.js and perspective-menu.js files. 
perspective.js perspective-menu.js Right click on the perspective folder and select New \u2192 JavaScript CJS Service . Enter perspective.js for the name of the file. Double click on the perspective.js file to open it with the Code Editor . Replace the content with the following code: const perspectiveData = { id : \"my-perspective\" , name : \"My Perspective\" , link : \"../my-perspective/index.html\" , order : \"1000\" , icon : \"../my-perspective/icon.svg\" , }; if ( typeof exports !== 'undefined' ) { exports . getPerspective = function () { return perspectiveData ; } } Save the changes and close the Code Editor . Right click on the perspective folder and select New \u2192 JavaScript CJS Service . Enter perspective-menu.js for the name of the file. Double click on the perspective-menu.js file to open it with the Code Editor . Replace the content with the following code: exports . getMenu = function () { return { label : \"My Menu\" , order : 1 , items : [ { label : \"Empty item\" , order : 1 }, { label : \"Empty item with divider\" , divider : true , order : 2 }, { label : \"Submenu\" , order : 3 , items : [ { label : \"GitHub page\" , data : \"https://github.com/eclipse/dirigible\" , action : \"open\" , order : 1 } ] }, { label : \"About\" , action : \"openDialogWindow\" , dialogId : \"about\" , order : 4 } ] }; } Save the changes and close the Code Editor . Create perspective frontend resources: Create index.html , controller.js and icon.svg files. index.html controller.js icon.svg Right click on the my-perspective project and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double click on the index.html file to open it with the Code Editor . 
Replace the content with the following code: < html lang = \"en\" ng-app = \"myPerspective\" ng-controller = \"MyPerspectiveController\" xmlns = \"http://www.w3.org/1999/xhtml\" > < head > < meta charset = \"utf-8\" /> < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < title dg-brand-title > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < script type = \"text/javascript\" src = \"/services/v4/web/my-perspective/perspective/perspective.js\" > < theme > < script type = \"text/javascript\" src = \"/services/v4/js/ide-core/services/loader.js?id=ide-perspective-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/js/ide-core/services/loader.js?id=ide-perspective-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body dg-contextmenu = \"contextMenuContent\" > < ide-header menu-ext-id = \"my-perspective-menu\" > < ide-contextmenu > < ide-container > < ide-layout views-layout-model = \"layoutModel\" > < ide-dialogs > < ide-status-bar > Save the changes and close the Code Editor . Right click on the my-perspective project and select New \u2192 File . Enter controller.js for the name of the file. Double click on the controller.js file to open it with the Code Editor . Replace the content with the following code: let myPerspective = angular . module ( \"myPerspective\" , [ \"ngResource\" , \"ideLayout\" , \"ideUI\" ]); myPerspective . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = 'example' ; }]); myPerspective . controller ( \"MyPerspectiveController\" , [ \"$scope\" , \"messageHub\" , function ( $scope , messageHub ) { $scope . layoutModel = { // Array of view ids views : [ \"import\" , \"welcome\" , \"console\" ], layoutSettings : { hideEditorsPane : false }, events : { \"example.alert.info\" : function ( msg ) { console . info ( msg . data . message ); } } }; $scope . 
contextMenuContent = function ( element ) { return { callbackTopic : \"example.contextmenu\" , items : [ { id : \"new\" , label : \"New\" , icon : \"sap-icon--create\" , items : [ { id : \"tab\" , label : \"Tab\" }, ] }, { id : \"other\" , label : \"Other\" , divider : true , icon : \"sap-icon--question-mark\" } ] } }; messageHub . onDidReceiveMessage ( \"contextmenu\" , function ( msg ) { if ( msg . data == \"other\" ) { messageHub . showAlertSuccess ( \"Success\" , \"You have selected the other option!\" ); } else { messageHub . showAlertInfo ( \"Nothing will happen\" , \"This is just a demo after all.\" ); } } ); }]); Save the changes and close the Code Editor . Right click on the my-perspective project and select New \u2192 File . Enter icon.svg for the name of the file. Right click on the icon.svg file and select Open With \u2192 Code Editor . Replace the content with the following code: Save the changes and close the Code Editor . Refresh the browser. Info In some cases you may want to go to Theme \u2192 Reset to clean Web IDE state. The new perspective should be visible at the bottom of the perspectives list. Info Alternatively go to Window \u2192 Open Perspective \u2192 My Perspective to open the new perspective.","title":"Steps"},{"location":"tutorials/customizations/ide/create-view/","text":"Create View All views in Eclipse Dirigible are loaded via the ide-view extension point. A list of all extension points can be found on the Extensions Overview page. To develop a new view, an extension , a view definition , and frontend resources should be created. The following example uses AngularJS and Fundamental Library . Steps Start Eclipse Dirigible. Info You can find more information on how to do that by following: Getting Started section. Setup section. Go to the Projects perspective and create New Project . Enter my-view for the name of the project. The project will appear under the projects list.
Create view extension: Right click on the my-view project and select New \u2192 Folder . Enter view for the name of the folder. Create view.extension and view.js files. view.extension view.js Right click on the view folder and select New \u2192 Extension . Enter view.extension for the name of the file. Right click on the view.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"my-view/view/view.js\" , \"extensionPoint\" : \"ide-view\" , \"description\" : \"My View\" } Save the changes and close the Code Editor . (optional) Double click on the view.extension file to open the extension with the Extension Editor . Right click on the view folder and select New \u2192 JavaScript CJS Service . Enter view.js for the name of the file. Double click on the view.js file to open it with the Code Editor . Replace the content with the following code: const viewData = { id : \"my-view\" , label : \"My View\" , factory : \"frame\" , region : \"bottom\" , link : \"../my-view/index.html\" , }; if ( typeof exports !== 'undefined' ) { exports . getView = function () { return viewData ; } } Save the changes and close the Code Editor . Create view frontend resources: Create index.html and controller.js files. index.html controller.js Right click on the my-view project and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double click on the index.html file to open it with the Code Editor . 
Replace the content with the following code: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"myView\" ng-controller = \"MyViewController as mvc\" > < head > < meta charset = \"utf-8\" /> < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" sizes = \"any\" href = \"data:;base64,iVBORw0KGgo=\" > < title dg-view-title > < script type = \"text/javascript\" src = \"/webjars/jquery/3.6.0/jquery.min.js\" > < script type = \"text/javascript\" src = \"/webjars/angularjs/1.8.2/angular.min.js\" > < script type = \"text/javascript\" src = \"/webjars/angularjs/1.8.2/angular-resource.min.js\" > < script type = \"text/javascript\" src = \"/webjars/angular-aria/1.8.2/angular-aria.min.js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/webjars/fundamental-styles/0.24.0/dist/fundamental-styles.css\" > < theme > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/web/resources/styles/core.css\" /> < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/web/resources/styles/widgets.css\" /> < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/core/message-hub.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/core/ide-message-hub.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/ui/theming.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/ui/widgets.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/ui/view.js\" > < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < fd-fieldset > < fd-form-group dg-header = \"My Form\" > < fd-form-item horizontal = \"true\" > < fd-form-label for = \"idName\" dg-required = \"true\" dg-colon = \"true\" > Name < fd-input id = \"idName\" type = \"text\" placeholder = \"Enter name here\" ng-model = \"inputData.name\" > < fd-form-item 
horizontal = \"true\" > < fd-form-label for = \"idEmail\" dg-required = \"true\" dg-colon = \"true\" > Email < fd-input id = \"idEmail\" type = \"text\" placeholder = \"Enter email here\" ng-model = \"inputData.email\" > < button class = \"fd-button fd-button--emphasized\" ng-click = \"saveForm()\" style = \"margin: 6px;\" > Save < table fd-table display-mode = \"compact\" style = \"margin-top: 20px\" > < thead fd-table-header > < tr fd-table-row > < th fd-table-header-cell > Name < th fd-table-header-cell > Email < tbody fd-table-body > < tr fd-table-row hoverable = \"true\" ng-repeat = \"next in data\" > < td fd-table-cell > {{next.name}} < td fd-table-cell activable = \"true\" >< a class = \"fd-link\" > {{next.email}} Save the changes and close the Code Editor . Right click on the my-view project and select New \u2192 File . Enter controller.js for the name of the file. Double click on the controller.js file to open it with the Code Editor . Replace the content with the following code: let myView = angular . module ( \"myView\" , [ \"ideUI\" , \"ideView\" ]); myView . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = \"myView\" ; }]); myView . controller ( \"MyViewController\" , [ \"$scope\" , \"$http\" , \"messageHub\" , function ( $scope , $http , messageHub ) { $scope . inputData = {}; $scope . data = [{ name : \"John Doe\" , email : \"john.doe@email.com\" }, { name : \"Jane Doe\" , email : \"jane.doe@email.com\" }]; $scope . saveForm = function () { messageHub . showAlertInfo ( \"Form Successfully Saved\" , `Name: ${ $scope . inputData . name } , Email: ${ $scope . inputData . email } ` ); }; }]); Save the changes and close the Code Editor . Refresh the browser. Info In some cases you may want to go to Theme \u2192 Reset to clean Web IDE state.
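The viewData descriptor in view.js declares region: \"bottom\", which is what places the view alongside the other bottom-region views such as the console. Conceptually, the layout groups every registered descriptor by its region before rendering. A small illustrative sketch (only the \"my-view\" descriptor's shape comes from this tutorial; the grouping function and the other descriptors are hypothetical):

```javascript
// Illustrative only: group view descriptors by their `region`,
// as a layout engine conceptually would before rendering panes.
function groupByRegion(views) {
  return views.reduce((regions, view) => {
    (regions[view.region] = regions[view.region] || []).push(view.id);
    return regions;
  }, {});
}

const views = [
  // Shape taken from this tutorial's view.js:
  { id: "my-view", label: "My View", factory: "frame", region: "bottom", link: "../my-view/index.html" },
  // Hypothetical descriptors for built-in views:
  { id: "console", label: "Console", factory: "frame", region: "bottom", link: "../console/index.html" },
  { id: "import", label: "Import", factory: "frame", region: "left", link: "../import/index.html" }
];

console.log(JSON.stringify(groupByRegion(views)));
// {"bottom":["my-view","console"],"left":["import"]}
```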
Go to Window \u2192 Show View \u2192 My View to open the new view.","title":"View"},{"location":"tutorials/customizations/ide/create-view/#create-view","text":"All views in Eclipse Dirigible are loaded via the ide-view extension point. A list of all extension points can be found on the Extensions Overview page. To develop a new view, an extension , a view definition , and frontend resources should be created. The following example uses AngularJS and Fundamental Library .","title":"Create View"},{"location":"tutorials/customizations/ide/create-view/#steps","text":"Start Eclipse Dirigible. Info You can find more information on how to do that by following: Getting Started section. Setup section. Go to the Projects perspective and create New Project . Enter my-view for the name of the project. The project will appear under the projects list. Create view extension: Right click on the my-view project and select New \u2192 Folder . Enter view for the name of the folder. Create view.extension and view.js files. view.extension view.js Right click on the view folder and select New \u2192 Extension . Enter view.extension for the name of the file. Right click on the view.extension file and select Open With \u2192 Code Editor . Replace the content with the following definition: { \"module\" : \"my-view/view/view.js\" , \"extensionPoint\" : \"ide-view\" , \"description\" : \"My View\" } Save the changes and close the Code Editor . (optional) Double click on the view.extension file to open the extension with the Extension Editor . Right click on the view folder and select New \u2192 JavaScript CJS Service . Enter view.js for the name of the file. Double click on the view.js file to open it with the Code Editor . Replace the content with the following code: const viewData = { id : \"my-view\" , label : \"My View\" , factory : \"frame\" , region : \"bottom\" , link : \"../my-view/index.html\" , }; if ( typeof exports !== 'undefined' ) { exports .
getView = function () { return viewData ; } } Save the changes and close the Code Editor . Create view frontend resources: Create index.html and controller.js files. index.html controller.js Right click on the my-view project and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double click on the index.html file to open it with the Code Editor . Replace the content with the following code: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"myView\" ng-controller = \"MyViewController as mvc\" > < head > < meta charset = \"utf-8\" /> < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" sizes = \"any\" href = \"data:;base64,iVBORw0KGgo=\" > < title dg-view-title > < script type = \"text/javascript\" src = \"/webjars/jquery/3.6.0/jquery.min.js\" > < script type = \"text/javascript\" src = \"/webjars/angularjs/1.8.2/angular.min.js\" > < script type = \"text/javascript\" src = \"/webjars/angularjs/1.8.2/angular-resource.min.js\" > < script type = \"text/javascript\" src = \"/webjars/angular-aria/1.8.2/angular-aria.min.js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/webjars/fundamental-styles/0.24.0/dist/fundamental-styles.css\" > < theme > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/web/resources/styles/core.css\" /> < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/web/resources/styles/widgets.css\" /> < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/core/message-hub.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/core/ide-message-hub.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/ui/theming.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/ui/widgets.js\" > < script type = \"text/javascript\" src = \"/services/v4/web/ide-core/ui/view.js\" > < script type = \"text/javascript\" src = \"controller.js\" > < body 
class = \"fd-scrollbar\" dg-contextmenu = \"contextMenuContent\" > < fd-fieldset > < fd-form-group dg-header = \"My Form\" > < fd-form-item horizontal = \"true\" > < fd-form-label for = \"idName\" dg-required = \"true\" dg-colon = \"true\" > Name < fd-input id = \"idName\" type = \"text\" placeholder = \"Enter name here\" ng-model = \"inputData.name\" > < fd-form-item horizontal = \"true\" > < fd-form-label for = \"idEmail\" dg-required = \"true\" dg-colon = \"true\" > Email < fd-input id = \"idEmail\" type = \"text\" placeholder = \"Enter email here\" ng-model = \"inputData.email\" > < button class = \"fd-button fd-button--emphasized\" ng-click = \"saveForm()\" style = \"margin: 6px;\" > Save < table fd-table display-mode = \"compact\" style = \"margin-top: 20px\" > < thead fd-table-header > < tr fd-table-row > < th fd-table-header-cell > Name < th fd-table-header-cell > Email < tbody fd-table-body > < tr fd-table-row hoverable = \"true\" ng-repeat = \"next in data\" > < td fd-table-cell > {{next.name}} < td fd-table-cell activable = \"true\" >< a class = \"fd-link\" > {{next.email}} Save the changes and close the Code Editor . Right click on the my-view project and select New \u2192 File . Enter controller.js for the name of the file. Double click on the controller.js file to open it with the Code Editor . Replace the content with the following code: let myView = angular . module ( \"myView\" , [ \"ideUI\" , \"ideView\" ]); myView . config ([ \"messageHubProvider\" , function ( messageHubProvider ) { messageHubProvider . eventIdPrefix = \"myView\" ; }]); myView . controller ( \"MyViewController\" , [ \"$scope\" , \"$http\" , \"messageHub\" , function ( $scope , $http , messageHub ) { $scope . inputData = {}; $scope . data = [{ name : \"John Doe\" , email : \"john.doe@email.com\" }, { name : \"Jane Doe\" , email : \"jane.doe@email.com\" }]; $scope . saveForm = function () { messageHub . showAlertInfo ( \"Form Successfully Saved\" , `Name: ${ $scope . inputData .
name } , Email: ${ $scope . inputData . email } ` ); }; }]); Save the changes and close the Code Editor . Refresh the browser. Info In some cases you may want to go to Theme \u2192 Reset to clean the Web IDE state. Go to Window \u2192 Show View \u2192 My View to open the new view.","title":"Steps"},{"location":"tutorials/modeling/bpmn-process/","text":"BPMN Process This tutorial will guide you through the steps of creating a Business Process with Service Task , User Task and Choice Gateway elements. The result of the business process modeling would be a Time Entry Request process that, once started, would trigger an approval process (with mail notifications, if configured) with the following steps: Steps Start Eclipse Dirigible Info You can find more information on how to do that by following: Getting Started section. Setup section. Create Project Go to the Projects perspective and create New Project . Enter sample-bpm for the name of the project. The project will appear under the projects list. Create JavaScript Process Task Handlers JavaScript handlers should be provided for the Service Task steps in the Business Process . The following handlers will be executed during the Approve Time Entry Request , Reject Time Entry Request and Send Notification tasks. Right click on the sample-bpm project and select New \u2192 Folder . Enter tasks for the name of the folder. Create approve-request.js , reject-request.js and send-notification.js files. approve-request.js reject-request.js send-notification.js Right click on the tasks folder and select New \u2192 JavaScript CJS Service . Enter approve-request.js for the name of the file. Double-click to open the file. Replace the content with the following: const process = require ( \"bpm/v4/process\" ); const mailClient = require ( \"mail/v4/client\" ); const config = require ( \"core/v4/configurations\" ); let execution = process . getExecutionContext (); let executionId = execution . getId (); let user = process .
getVariable ( executionId , \"user\" ); console . log ( `Time Entry Request Approved for User [ ${ user } ]` ); if ( isMailConfigured ()) { let from = config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ); let to = config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ); let subject = \"Time Entry Request - Approved\" ; let content = `

Status:

Time Entry Request for [ ${ user } ] - Approved` ; let subType = \"html\" ; mailClient . send ( from , to , subject , content , subType ); } else { console . error ( \"Missing mail configuration\" ); } function isMailConfigured () { return config . get ( \"DIRIGIBLE_MAIL_USERNAME\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_PASSWORD\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_HOST\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_PORT\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ) != \"\" } Save the changes. Right click on the tasks folder and select New \u2192 JavaScript CJS Service . Enter reject-request.js for the name of the file. Double-click to open the file. Replace the content with the following: const process = require ( \"bpm/v4/process\" ); const mailClient = require ( \"mail/v4/client\" ); const config = require ( \"core/v4/configurations\" ); let execution = process . getExecutionContext (); let executionId = execution . getId (); let user = process . getVariable ( executionId , \"user\" ); console . error ( `Time Entry Request Rejected for User [ ${ user } ]` ); if ( isMailConfigured ()) { let from = config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ); let to = config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ); let subject = \"Time Entry Request - Rejected\" ; let content = `

Status:

Time Entry Request for [ ${ user } ] - Rejected

` ; let subType = \"html\" ; mailClient . send ( from , to , subject , content , subType ); } else { console . error ( \"Missing mail configuration\" ); } function isMailConfigured () { return config . get ( \"DIRIGIBLE_MAIL_USERNAME\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_PASSWORD\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_HOST\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_PORT\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ) != \"\" } Save the changes. Right click on the tasks folder and select New \u2192 JavaScript CJS Service . Enter send-notification.js for the name of the file. Double-click to open the file. Replace the content with the following: const process = require ( \"bpm/v4/process\" ); const base64 = require ( \"utils/v4/base64\" ); const mailClient = require ( \"mail/v4/client\" ); const config = require ( \"core/v4/configurations\" ); let execution = process . getExecutionContext (); let executionId = execution . getId (); let data = { executionId : executionId , User : process . getVariable ( executionId , \"User\" ), Project : process . getVariable ( executionId , \"Project\" ), Start : process . getVariable ( executionId , \"Start\" ), End : process . getVariable ( executionId , \"End\" ), Hours : process . getVariable ( executionId , \"Hours\" ) }; let urlEncodedData = base64 . encode ( JSON . stringify ( data )); let url = `http://localhost:8080/services/v4/web/sample-bpm/process/?data= ${ urlEncodedData } ` ; console . log ( `Approve Request URL: ${ url } ` ); if ( isMailConfigured ()) { let from = config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ); let to = config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ); let subject = \"Time Entry Request - Pending\" ; let content = `

Status:

Time Entry Request for [ ${ data . User } ] - Pending

Click here to process request.` ; let subType = \"html\" ; mailClient . send ( from , to , subject , content , subType ); } else { console . error ( \"Missing mail configuration\" ); } function isMailConfigured () { return config . get ( \"DIRIGIBLE_MAIL_USERNAME\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_PASSWORD\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_HOST\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_PORT\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ) != \"\" } Save the changes. Create Business Process Model Right click on the sample-bpm project and select New \u2192 Business Process Model . Enter time-entry-request.bpmn for the name of the business process. Manual Steps XML Content Double-click the time-entry-request.bpmn file to open it with the Flowable Editor . Click on the Process identifier field and change the value to time-entry-request . Click on the Name field and change the value to Time Entry Request . Click on the MyServiceTasks to select the first step of the business process. Click on the Name field and change the value to Send Notification . Scroll down to the Class field and click on it. Change the handler field to sample-bpm/tasks/send-notification.js . JavaScript Task Handler The value of the handler field (e.g. sample-bpm/tasks/send-notification.js ) points to the location of the JavaScript task handler created in the previous step. Delete the arrow coming out of the Send Notification step. Expand the Activities group and drag and drop a new User task to the editor area. Connect the Send Notification task and the newly created user task. User Task Once the business process is triggered, it would stop at the Process Time Entry Request user task and wait for process continuation after the user task is completed. Select the user task.
Click on the Name field and change the value to Process Time Entry Request . Create a Choice gateway coming out of the Process Time Entry Request user task. Expand the Activities group and drag and drop a new Service task to the editor area. Select the service task. Click on the Name field and change the value to Approve Time Entry Request . Scroll down to the Class field and click on it. Change the handler field to sample-bpm/tasks/approve-request.js . Expand the Activities group and drag and drop a new Service task to the editor area. Select the service task. Click on the Name field and change the value to Reject Time Entry Request . Scroll down to the Class field and click on it. Change the handler field to sample-bpm/tasks/reject-request.js . Connect the Choice gateway with the Approve Time Entry Request and Reject Time Entry Request steps. Select the connection between the Choice gateway and the Reject Time Entry Request step. Click on the Default flow checkbox. Select the connection between the Choice gateway and the Approve Time Entry Request step. Click on the Flow condition field and change the value to ${isRequestApproved} . Flow Condition In the flow condition isRequestApproved is a process context variable that would be set as part of the process continuation after the completion of the Process Time Entry Request user task. Connect the Approve Time Entry Request and Reject Time Entry Request steps with the end event. Save the changes. Right click on the time-entry-request.bpmn file and select Open With \u2192 Code Editor . Replace the content with the following: Save the changes. Business Process Synchronization Usually, when the *.bpmn process is saved, it would take between one and two minutes to be deployed and active. After that period of time the business process can be executed. The synchronization period by default is set to 50 seconds ( 0/50 * * * * ? ) . Find out more about the Job Expression environment variables.
Updating the *.bpmn file would result in new synchronization being triggered and the updated process flow would be available after a minute or two. Updating the JavaScript Task Handler won't require new synchronization and the new behaviour of the handlers will be available on the fly. Create Process API To trigger and continue the BPMN Process execution, a server-side JavaScript API will be created. Right click on the sample-bpm project and select New \u2192 Folder . Enter api for the name of the folder. Create a process.js file. process.js Right click on the api folder and select New \u2192 JavaScript CJS Service . Enter process.js for the name of the file. Double-click to open the file. Replace the content with the following: const rs = require ( \"http/v4/rs\" ); const process = require ( \"bpm/v4/process\" ); const tasks = require ( \"bpm/v4/tasks\" ); const user = require ( \"security/v4/user\" ); rs . service () . post ( \"\" , ( ctx , request , response ) => { let data = request . getJSON (); process . start ( 'time-entry-request' , { \"User\" : \"\" + user . getName (), \"Project\" : \"\" + data . Project , \"Start\" : \"\" + data . Start , \"End\" : \"\" + data . End , \"Hours\" : \"\" + data . Hours }); response . setStatus ( response . ACCEPTED ); }) . resource ( \"continue/:executionId\" ) . post (( ctx , request , response ) => { let executionId = request . params . executionId ; let tasksList = tasks . list (); let data = request . getJSON (); for ( const task of tasksList ) { if ( task . executionId . toString () === executionId . toString ()) { tasks . completeTask ( task . id , { isRequestApproved : data . approved , user : data . user }); break ; } } response . setStatus ( response . ACCEPTED ); }) . execute () Create Submit Form The submit form would call the server-side JavaScript API that was created in the previous step and trigger the business process. Right click on the sample-bpm project and select New \u2192 Folder .
Enter submit for the name of the folder. Create index.html and controller.js files. index.html controller.js Right click on the submit folder and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double-click to open the file. Replace the content with the following: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/v4/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" dg-contextmenu = \"contextMenuContent\" > < div > < fd-message-page glyph = \"sap-icon--time-entry-request\" > < fd-message-page-title > Submit Time Entry Request < fd-message-page-subtitle > < fd-scrollbar class = \"dg-full-height\" > < fd-fieldset class = \"fd-margin--md\" ng-form = \"formFieldset\" > < fd-form-group name = \"entityForm\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idProject\" dg-required = \"true\" dg-colon = \"true\" > Project < fd-combobox-input id = \"idProject\" name = \"Project\" state = \"{{ formErrors.Project ? 
'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['Project'].$valid, 'Project')\" ng-model = \"entity.Project\" dropdown-items = \"optionsProject\" dg-placeholder = \"Search Project ...\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idStart\" dg-required = \"true\" dg-colon = \"true\" > Start < fd-form-input-message-group dg-inactive = \"{{ formErrors.Start ? false : true }}\" > < fd-input id = \"idStart\" name = \"Start\" state = \"{{ formErrors.Start ? 'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['Start'].$valid, 'Start')\" ng-model = \"entity.Start\" type = \"date\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idEnd\" dg-required = \"true\" dg-colon = \"true\" > End < fd-form-input-message-group dg-inactive = \"{{ formErrors.End ? false : true }}\" > < fd-input id = \"idEnd\" name = \"End\" state = \"{{ formErrors.End ? 'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['End'].$valid, 'End')\" ng-model = \"entity.End\" type = \"date\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idHours\" dg-required = \"true\" dg-colon = \"true\" > Hours < fd-form-input-message-group dg-inactive = \"{{ formErrors.Hours ? false : true }}\" > < fd-input id = \"idHours\" name = \"Hours\" state = \"{{ formErrors.Hours ? 'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['Hours'].$valid, 'Hours')\" ng-model = \"entity.Hours\" min = \"0\" max = \"40\" dg-input-rules = \"{ patterns: [''] }\" type = \"number\" placeholder = \"Enter Hours\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-message-page-actions > < fd-button class = \"fd-margin-end--tiny fd-dialog__decisive-button\" compact = \"true\" dg-type = \"emphasized\" dg-label = \"Submit\" ng-click = \"submit()\" state = \"{{ !isFormValid ? 
'disabled' : '' }}\" > < fd-button class = \"fd-dialog__decisive-button\" compact = \"true\" dg-type = \"transparent\" dg-label = \"Cancel\" ng-click = \"resetForm()\" > Right click on the submit folder and select New \u2192 File . Enter controller.js for the name of the file. Double-click to open the file. Replace the content with the following: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" ]) . controller ( 'PageController' , [ '$scope' , '$http' , function ( $scope , $http ) { $scope . entity = {}; $scope . optionsProject = [{ text : \"Project Alpha\" , value : \"Project Alpha\" }, { text : \"Project Beta\" , value : \"Project Beta\" }, { text : \"Project Evolution\" , value : \"Project Evolution\" }, { text : \"Project Next\" , value : \"Project Next\" }]; $scope . isValid = function ( isValid , property ) { $scope . formErrors [ property ] = ! isValid ? true : undefined ; for ( let next in $scope . formErrors ) { if ( $scope . formErrors [ next ] === true ) { $scope . isFormValid = false ; return ; } } $scope . isFormValid = true ; }; $scope . submit = function () { $http . post ( \"/services/v4/js/sample-bpm/api/process.js\" , JSON . stringify ( $scope . entity )). then ( function ( response ) { if ( response . status != 202 ) { alert ( `Unable to submit Time Entry Request: ' ${ response . message } '` ); $scope . resetForm (); return ; } alert ( \"Time Entry Request successfully submitted\" ); $scope . resetForm (); }); }; $scope . resetForm = function () { $scope . entity = {}; $scope . formErrors = { Project : true , Start : true , End : true , Hours : true , }; }; $scope . resetForm (); }]); Create Process Form The process form would call the server-side JavaScript API that was created before and resume the business process execution. Right click on the sample-bpm project and select New \u2192 Folder . Enter process for the name of the folder. Create index.html and controller.js files.
index.html controller.js Right click on the process folder and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double-click to open the file. Replace the content with the following: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/v4/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" dg-contextmenu = \"contextMenuContent\" > < div > < fd-message-page glyph = \"sap-icon--approvals\" > < fd-message-page-title > Approve Time Entry Request < fd-message-page-subtitle > < fd-scrollbar class = \"dg-full-height\" > < fd-fieldset class = \"fd-margin--md\" ng-form = \"formFieldset\" > < fd-form-group name = \"entityForm\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idProject\" dg-colon = \"true\" > Project < fd-form-input-message-group > < fd-input id = \"idProject\" name = \"Project\" ng-model = \"entity.Project\" type = \"input\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idStart\" dg-colon = \"true\" > Start < fd-form-input-message-group > < fd-input id = \"idStart\" name = \"Start\" ng-model = \"entity.Start\" type = \"date\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idEnd\" dg-colon =
\"true\" > End < fd-form-input-message-group > < fd-input id = \"idEnd\" name = \"End\" ng-model = \"entity.End\" type = \"date\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idHours\" dg-colon = \"true\" > Hours < fd-form-input-message-group > < fd-input id = \"idHours\" name = \"Hours\" ng-model = \"entity.Hours\" type = \"number\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-message-page-actions > < fd-button class = \"fd-margin-end--tiny fd-dialog__decisive-button\" compact = \"true\" dg-type = \"emphasized\" dg-label = \"Approve\" ng-click = \"approve()\" > < fd-button class = \"fd-dialog__decisive-button\" compact = \"true\" dg-type = \"negative\" dg-label = \"Reject\" ng-click = \"reject()\" > Right click on the process folder and select New \u2192 File . Enter controller.js for the name of the file. Double-click to open the file. Replace the content with the following: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" ]) . controller ( 'PageController' , [ '$scope' , '$http' , '$location' , function ( $scope , $http , $location ) { let data = JSON . parse ( atob ( window . location . search . split ( \"=\" )[ 1 ])); $scope . executionId = data . executionId ; $scope . user = data . User ; $scope . entity = { Project : data . Project , Start : new Date ( data . Start ), End : new Date ( data . End ), Hours : parseInt ( data . Hours ) }; $scope . approve = function () { $http . post ( \"/services/v4/js/sample-bpm/api/process.js/continue/\" + $scope . executionId , JSON . stringify ( { user : $scope . user , approved : true } )). then ( function ( response ) { if ( response . status != 202 ) { alert ( `Unable to approve Time Entry Request: ' ${ response . message } '` ); return ; } $scope . entity = {}; alert ( \"Time Entry Request Approved\" ); }); }; $scope . reject = function () { $http . 
post ( \"/services/v4/js/sample-bpm/api/process.js/continue/\" + $scope . executionId , JSON . stringify ( { user : $scope . user , approved : false } )). then ( function ( response ) { if ( response . status != 202 ) { alert ( `Unable to reject Time Entry Request: ' ${ response . message } '` ); return ; } $scope . entity = {}; alert ( \"Time Entry Request Rejected\" ); }); }; }]); (Optional) Email Configuration In order to receive email notifications about the process steps, a mail configuration should be provided. The following environment variables are needed: DIRIGIBLE_MAIL_USERNAME= DIRIGIBLE_MAIL_PASSWORD= DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL=smtps DIRIGIBLE_MAIL_SMTPS_HOST= DIRIGIBLE_MAIL_SMTPS_PORT=465 APP_SAMPLE_BPM_FROM_EMAIL= APP_SAMPLE_BPM_TO_EMAIL= Connecting Eclipse Dirigible with SendGrid SMTP Relay To use a SendGrid account for the mail configuration follow the steps in the Connecting Eclipse Dirigible with SendGrid SMTP Relay blog. DIRIGIBLE_MAIL_USERNAME=apikey DIRIGIBLE_MAIL_PASSWORD= DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL=smtps DIRIGIBLE_MAIL_SMTPS_HOST=smtp.sendgrid.net DIRIGIBLE_MAIL_SMTPS_PORT=465 APP_SAMPLE_BPM_FROM_EMAIL= APP_SAMPLE_BPM_TO_EMAIL= Demo Navigate to http://localhost:8080/services/v4/web/sample-bpm/submit/ to open the Submit form . Enter the required data and press the Submit button. If email configuration was provided, an email notification will be sent to the email address set by the APP_SAMPLE_BPM_TO_EMAIL environment variable. If email configuration wasn't provided, the following message can be found in the Console view: Approve Request URL: http://localhost:8080/services/v4/web/sample-bpm/process/?data=eyJleGVjdXRpb25JZCI6IjE4Ni... Open the URL from the Console view or open it from the email notification. The Process form would be prefilled with the data that was entered in the Submit form . Press the Approve or Reject button to resume the process execution.
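The Approve Request URL carries the process data in its ?data= parameter as base64-encoded JSON: send-notification.js builds it with base64.encode(JSON.stringify(data)), and the process form's controller.js reverses it with JSON.parse(atob(...)). A minimal Node.js sketch of that round trip, with made-up values (the real payload is assembled from the process context variables):

```javascript
// Illustrative values only; the real payload is built in send-notification.js
// from the process context variables.
const data = {
    executionId: "186",
    User: "john.doe",
    Project: "Project Alpha",
    Start: "2023-06-01",
    End: "2023-06-05",
    Hours: "40"
};

// Server side: base64.encode(JSON.stringify(data)) in Dirigible,
// approximated here with Node's Buffer.
const encoded = Buffer.from(JSON.stringify(data)).toString("base64");
const url = `http://localhost:8080/services/v4/web/sample-bpm/process/?data=${encoded}`;

// Client side: controller.js does JSON.parse(atob(window.location.search.split("=")[1])).
// split("=") drops any trailing base64 padding, which forgiving decoders accept.
const payload = url.split("=")[1];
const decoded = JSON.parse(Buffer.from(payload, "base64").toString("utf8"));

console.log(decoded.User);        // "john.doe"
console.log(decoded.executionId); // "186"
```

This also shows why the payload values are stringified on the server side: everything that survives the URL round trip is a string, which is why the process form's controller converts Start, End, and Hours back with new Date(...) and parseInt(...).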
One more email notification would be sent and a message would be logged in the Console as part of the last step of the Business Process . BPM Sample GitHub Repository Go to https://github.com/dirigiblelabs/sample-bpm to find the complete sample. The repository can be cloned in the Git perspective and after a few minutes the BPM Sample would be active.","title":"BPMN Process"},{"location":"tutorials/modeling/bpmn-process/#bpmn-process","text":"This tutorial will guide you through the steps of creating a Business Process with Service Task , User Task and Choice Gateway elements. The result of the business process modeling would be a Time Entry Request process that, once started, would trigger an approval process (with mail notifications, if configured) with the following steps:","title":"BPMN Process"},{"location":"tutorials/modeling/bpmn-process/#steps","text":"","title":"Steps"},{"location":"tutorials/modeling/bpmn-process/#start-eclipse-dirigible","text":"Info You can find more information on how to do that by following: Getting Started section. Setup section.","title":"Start Eclipse Dirigible"},{"location":"tutorials/modeling/bpmn-process/#create-project","text":"Go to the Projects perspective and create New Project . Enter sample-bpm for the name of the project. The project will appear under the projects list.","title":"Create Project"},{"location":"tutorials/modeling/bpmn-process/#create-javascript-process-task-handlers","text":"JavaScript handlers should be provided for the Service Task steps in the Business Process . The following handlers will be executed during the Approve Time Entry Request , Reject Time Entry Request and Send Notification tasks. Right click on the sample-bpm project and select New \u2192 Folder . Enter tasks for the name of the folder. Create approve-request.js , reject-request.js and send-notification.js files.
approve-request.js reject-request.js send-notification.js Right click on the tasks folder and select New \u2192 JavaScript CJS Service . Enter approve-request.js for the name of the file. Double-click to open the file. Replace the content with the following: const process = require ( \"bpm/v4/process\" ); const mailClient = require ( \"mail/v4/client\" ); const config = require ( \"core/v4/configurations\" ); let execution = process . getExecutionContext (); let executionId = execution . getId (); let user = process . getVariable ( executionId , \"user\" ); console . log ( `Time Entry Request Approved for User [ ${ user } ]` ); if ( isMailConfigured ()) { let from = config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ); let to = config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ); let subject = \"Time Entry Request - Approved\" ; let content = `

Status:

Time Entry Request for [ ${ user } ] - Approved` ; let subType = \"html\" ; mailClient . send ( from , to , subject , content , subType ); } else { console . error ( \"Missing mail configuration\" ); } function isMailConfigured () { return config . get ( \"DIRIGIBLE_MAIL_USERNAME\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_PASSWORD\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_HOST\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_PORT\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ) != \"\" } Save the changes. Right click on the tasks folder and select New \u2192 JavaScript CJS Service . Enter reject-request.js for the name of the file. Double-click to open the file. Replace the content with the following: const process = require ( \"bpm/v4/process\" ); const mailClient = require ( \"mail/v4/client\" ); const config = require ( \"core/v4/configurations\" ); let execution = process . getExecutionContext (); let executionId = execution . getId (); let user = process . getVariable ( executionId , \"user\" ); console . error ( `Time Entry Request Rejected for User [ ${ user } ]` ); if ( isMailConfigured ()) { let from = config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ); let to = config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ); let subject = \"Time Entry Request - Rejected\" ; let content = `

Status:

Time Entry Request for [ ${ user } ] - Rejected

` ; let subType = \"html\" ; mailClient . send ( from , to , subject , content , subType ); } else { console . error ( \"Missing mail configuration\" ); } function isMailConfigured () { return config . get ( \"DIRIGIBLE_MAIL_USERNAME\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_PASSWORD\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_HOST\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_PORT\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ) != \"\" } Save the changes. Right click on the tasks folder and select New \u2192 JavaScript CJS Service . Enter send-notification.js for the name of the file. Double-click to open the file. Replace the content with the following: const process = require ( \"bpm/v4/process\" ); const base64 = require ( \"utils/v4/base64\" ); const mailClient = require ( \"mail/v4/client\" ); const config = require ( \"core/v4/configurations\" ); let execution = process . getExecutionContext (); let executionId = execution . getId (); let data = { executionId : executionId , User : process . getVariable ( executionId , \"User\" ), Project : process . getVariable ( executionId , \"Project\" ), Start : process . getVariable ( executionId , \"Start\" ), End : process . getVariable ( executionId , \"End\" ), Hours : process . getVariable ( executionId , \"Hours\" ) }; let urlEncodedData = base64 . encode ( JSON . stringify ( data )); let url = `http://localhost:8080/services/v4/web/sample-bpm/process/?data= ${ urlEncodedData } ` ; console . log ( `Approve Request URL: ${ url } ` ); if ( isMailConfigured ()) { let from = config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ); let to = config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ); let subject = \"Time Entry Request - Pending\" ; let content = `

Status:

Time Entry Request for [ ${ data . User } ] - Pending

Click here to process request.` ; let subType = \"html\" ; mailClient . send ( from , to , subject , content , subType ); } else { console . error ( \"Missing mail configuration\" ); } function isMailConfigured () { return config . get ( \"DIRIGIBLE_MAIL_USERNAME\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_PASSWORD\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_HOST\" ) != \"\" && config . get ( \"DIRIGIBLE_MAIL_SMTPS_PORT\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_FROM_EMAIL\" ) != \"\" && config . get ( \"APP_SAMPLE_BPM_TO_EMAIL\" ) != \"\" } Save the changes.","title":"Create JavaScript Process Task Handlers"},{"location":"tutorials/modeling/bpmn-process/#create-business-process-model","text":"Right click on the sample-bpm project and select New \u2192 Business Process Model . Enter time-entry-request.bpmn for the name of the business process. Manual Steps XML Content Double-click the time-entry-request.bpmn file to open it with the Flowable Editor . Click on the Process identifier field and change the value to time-entry-request . Click on the Name field and change the value to Time Entry Request . Click on the MyServiceTasks to select the first step of the business process. Click on the Name field and change the value to Send Notification . Scroll down to the Class field and click on it. Change the handler field to sample-bpm/tasks/send-notification.js . JavaScript Task Handler The value of the handler field (e.g. sample-bpm/tasks/send-notification.js ) points to the location of the JavaScript task handler created in the previous step. Delete the arrow coming out of the Send Notification step. Expand the Activities group and drag and drop a new User task to the editor area. Connect the Send Notification task and the newly created user task.
User Task Once the business process is triggered, it will stop at the Process Time Entry Request user task and wait for process continuation after the user task is completed. Select the user task. Click on the Name field and change the value to Process Time Entry Request . Create a Choice gateway coming out of the Process Time Entry Request user task. Expand the Activities group and drag and drop a new Service task onto the editor area. Select the service task. Click on the Name field and change the value to Approve Time Entry Request . Scroll down to the Class field and click on it. Change the handler field to sample-bpm/tasks/approve-request.js . Expand the Activities group and drag and drop a new Service task onto the editor area. Select the service task. Click on the Name field and change the value to Reject Time Entry Request . Scroll down to the Class field and click on it. Change the handler field to sample-bpm/tasks/reject-request.js . Connect the Choice gateway with the Approve Time Entry Request and Reject Time Entry Request steps. Select the connection between the Choice gateway and the Reject Time Entry Request step. Click on the Default flow checkbox. Select the connection between the Choice gateway and the Approve Time Entry Request step. Click on the Flow condition field and change the value to ${isRequestApproved} . Flow Condition In the flow condition, isRequestApproved is a process context variable that will be set as part of the process continuation after the completion of the Process Time Entry Request user task. Connect the Approve Time Entry Request and Reject Time Entry Request steps with the end event. Save the changes. Right click on the time-entry-request.bpmn file and select Open With \u2192 Code Editor . Replace the content with the following: Save the changes. Business Process Synchronization Usually, when the *.bpmn process is saved, it takes one to two minutes to be deployed and active.
After that period of time the business process can be executed. By default, the synchronization period is set to 50 seconds ( 0/50 * * * * ? ) . Find out more about the Job Expression environment variables. Updating the *.bpmn file triggers a new synchronization, and the updated process flow becomes available after a minute or two. Updating the JavaScript Task Handler doesn't require a new synchronization, and the new behaviour of the handlers is available on the fly.","title":"Create Business Process Model"},{"location":"tutorials/modeling/bpmn-process/#create-process-api","text":"To trigger and continue the BPMN Process execution, a server-side JavaScript API will be created. Right click on the sample-bpm project and select New \u2192 Folder . Enter api for the name of the folder. Create a process.js file. process.js Right click on the api folder and select New \u2192 JavaScript CJS Service . Enter process.js for the name of the file. Double-click to open the file. Replace the content with the following: const rs = require ( \"http/v4/rs\" ); const process = require ( \"bpm/v4/process\" ); const tasks = require ( \"bpm/v4/tasks\" ); const user = require ( \"security/v4/user\" ); rs . service () . post ( \"\" , ( ctx , request , response ) => { let data = request . getJSON (); process . start ( 'time-entry-request' , { \"User\" : \"\" + user . getName (), \"Project\" : \"\" + data . Project , \"Start\" : \"\" + data . Start , \"End\" : \"\" + data . End , \"Hours\" : \"\" + data . Hours }); response . setStatus ( response . ACCEPTED ); }) . resource ( \"continue/:executionId\" ) . post (( ctx , request , response ) => { let executionId = request . params . executionId ; let tasksList = tasks . list (); let data = request . getJSON (); for ( const task of tasksList ) { if ( task . executionId . toString () === executionId . toString ()) { tasks . completeTask ( task . id , { isRequestApproved : data . approved , user : data .
user }); break ; } } response . setStatus ( response . ACCEPTED ); }) . execute ()","title":"Create Process API"},{"location":"tutorials/modeling/bpmn-process/#create-submit-form","text":"The submit form calls the server-side JavaScript API created in the previous step and triggers the business process. Right click on the sample-bpm project and select New \u2192 Folder . Enter submit for the name of the folder. Create index.html and controller.js files. index.html controller.js Right click on the submit folder and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double-click to open the file. Replace the content with the following: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/v4/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" dg-contextmenu = \"contextMenuContent\" > < div > < fd-message-page glyph = \"sap-icon--time-entry-request\" > < fd-message-page-title > Submit Time Entry Request < fd-message-page-subtitle > < fd-scrollbar class = \"dg-full-height\" > < fd-fieldset class = \"fd-margin--md\" ng-form = \"formFieldset\" > < fd-form-group name = \"entityForm\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idProject\" dg-required = \"true\" dg-colon = \"true\" > Project < fd-combobox-input id = \"idProject\" name = \"Project\" state = \"{{
formErrors.Project ? 'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['Project'].$valid, 'Project')\" ng-model = \"entity.Project\" dropdown-items = \"optionsProject\" dg-placeholder = \"Search Project ...\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idStart\" dg-required = \"true\" dg-colon = \"true\" > Start < fd-form-input-message-group dg-inactive = \"{{ formErrors.Start ? false : true }}\" > < fd-input id = \"idStart\" name = \"Start\" state = \"{{ formErrors.Start ? 'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['Start'].$valid, 'Start')\" ng-model = \"entity.Start\" type = \"date\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idEnd\" dg-required = \"true\" dg-colon = \"true\" > End < fd-form-input-message-group dg-inactive = \"{{ formErrors.End ? false : true }}\" > < fd-input id = \"idEnd\" name = \"End\" state = \"{{ formErrors.End ? 'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['End'].$valid, 'End')\" ng-model = \"entity.End\" type = \"date\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idHours\" dg-required = \"true\" dg-colon = \"true\" > Hours < fd-form-input-message-group dg-inactive = \"{{ formErrors.Hours ? false : true }}\" > < fd-input id = \"idHours\" name = \"Hours\" state = \"{{ formErrors.Hours ? 
'error' : '' }}\" ng-required = \"true\" ng-change = \"isValid(formFieldset['Hours'].$valid, 'Hours')\" ng-model = \"entity.Hours\" min = \"0\" max = \"40\" dg-input-rules = \"{ patterns: [''] }\" type = \"number\" placeholder = \"Enter Hours\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-message-page-actions > < fd-button class = \"fd-margin-end--tiny fd-dialog__decisive-button\" compact = \"true\" dg-type = \"emphasized\" dg-label = \"Submit\" ng-click = \"submit()\" state = \"{{ !isFormValid ? 'disabled' : '' }}\" > < fd-button class = \"fd-dialog__decisive-button\" compact = \"true\" dg-type = \"transparent\" dg-label = \"Cancel\" ng-click = \"resetForm()\" > Right click on the api folder and select New \u2192 File . Enter controller.js for the name of the file. Double-click to open the file. Replace the content with the following: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" ]) . controller ( 'PageController' , [ '$scope' , '$http' , function ( $scope , $http ) { $scope . entity = {}; $scope . optionsProject = [{ text : \"Project Alpha\" , value : \"Project Alpha\" }, { text : \"Project Beta\" , value : \"Project Beta\" }, { text : \"Project Evolution\" , value : \"Project Evolution\" }, { text : \"Project Next\" , value : \"Project Next\" }]; $scope . isValid = function ( isValid , property ) { $scope . formErrors [ property ] = ! isValid ? true : undefined ; for ( let next in $scope . formErrors ) { if ( $scope . formErrors [ next ] === true ) { $scope . isFormValid = false ; return ; } } $scope . isFormValid = true ; }; $scope . submit = function () { $http . post ( \"/services/v4/js/sample-bpm/api/process.js\" , JSON . stringify ( $scope . entity )). then ( function ( response ) { if ( response . status != 202 ) { alert ( `Unable to submit Time Entry Request: ' ${ response . message } '` ); $scope . resetForm (); return ; } alert ( \"Time Entry Request successfully submitted\" ); $scope . resetForm (); }); }; $scope . 
resetForm = function () { $scope . entity = {}; $scope . formErrors = { Project : true , Start : true , End : true , Hours : true , }; }; $scope . resetForm (); }]);","title":"Create Submit Form"},{"location":"tutorials/modeling/bpmn-process/#create-process-form","text":"The process form calls the server-side JavaScript API created earlier and resumes the business process execution. Right click on the sample-bpm project and select New \u2192 Folder . Enter process for the name of the folder. Create index.html and controller.js files. index.html controller.js Right click on the process folder and select New \u2192 HTML5 Page . Enter index.html for the name of the file. Double-click to open the file. Replace the content with the following: < html lang = \"en\" xmlns = \"http://www.w3.org/1999/xhtml\" ng-app = \"page\" ng-controller = \"PageController\" > < head > < meta charset = \"utf-8\" > < meta http-equiv = \"X-UA-Compatible\" content = \"IE=edge\" > < meta name = \"viewport\" content = \"width=device-width, initial-scale=1\" > < link rel = \"icon\" href = \"data:;base64,iVBORw0KGgo=\" dg-brand-icon /> < title dg-brand-title > < theme > < script type = \"text/javascript\" src = \"/services/v4/js/resources-core/services/loader.js?id=application-view-js\" > < link type = \"text/css\" rel = \"stylesheet\" href = \"/services/v4/js/resources-core/services/loader.js?id=application-view-css\" /> < script type = \"text/javascript\" src = \"controller.js\" > < body class = \"dg-vbox\" dg-contextmenu = \"contextMenuContent\" > < div > < fd-message-page glyph = \"sap-icon--approvals\" > < fd-message-page-title > Approve Time Entry Request < fd-message-page-subtitle > < fd-scrollbar class = \"dg-full-height\" > < fd-fieldset class = \"fd-margin--md\" ng-form = \"formFieldset\" > < fd-form-group name = \"entityForm\" > < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idProject\" dg-colon = \"true\" > Project < fd-form-input-message-group > <
fd-input id = \"idProject\" name = \"Project\" ng-model = \"entity.Project\" type = \"input\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idStart\" dg-colon = \"true\" > Start < fd-form-input-message-group > < fd-input id = \"idStart\" name = \"Start\" ng-model = \"entity.Start\" type = \"date\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idEnd\" dg-colon = \"true\" > End < fd-form-input-message-group > < fd-input id = \"idEnd\" name = \"End\" ng-model = \"entity.End\" type = \"date\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-form-item horizontal = \"false\" > < fd-form-label for = \"idHours\" dg-colon = \"true\" > Hours < fd-form-input-message-group > < fd-input id = \"idHours\" name = \"Hours\" ng-model = \"entity.Hours\" type = \"number\" ng-readonly = \"true\" > < fd-form-message dg-type = \"error\" > Incorrect Input < fd-message-page-actions > < fd-button class = \"fd-margin-end--tiny fd-dialog__decisive-button\" compact = \"true\" dg-type = \"emphasized\" dg-label = \"Approve\" ng-click = \"approve()\" > < fd-button class = \"fd-dialog__decisive-button\" compact = \"true\" dg-type = \"negative\" dg-label = \"Reject\" ng-click = \"reject()\" > Right click on the process folder and select New \u2192 File . Enter controller.js for the name of the file. Double-click to open the file. Replace the content with the following: angular . module ( 'page' , [ \"ideUI\" , \"ideView\" ]) . controller ( 'PageController' , [ '$scope' , '$http' , '$location' , function ( $scope , $http , $location ) { let data = JSON . parse ( atob ( window . location . search . split ( \"=\" )[ 1 ])); $scope . executionId = data . executionId ; $scope . user = data . User ; $scope . entity = { Project : data . Project , Start : new Date ( data . 
Start ), End : new Date ( data . End ), Hours : parseInt ( data . Hours ) }; $scope . approve = function () { $http . post ( \"/services/v4/js/sample-bpm/api/process.js/continue/\" + $scope . executionId , JSON . stringify ( { user : $scope . user , approved : true } )). then ( function ( response ) { if ( response . status != 202 ) { alert ( `Unable to approve Time Entry Request: ' ${ response . message } '` ); return ; } $scope . entity = {}; alert ( \"Time Entry Request Approved\" ); }); }; $scope . reject = function () { $http . post ( \"/services/v4/js/sample-bpm/api/process.js/continue/\" + $scope . executionId , JSON . stringify ( { user : $scope . user , approved : false } )). then ( function ( response ) { if ( response . status != 202 ) { alert ( `Unable to reject Time Entry Request: ' ${ response . message } '` ); return ; } $scope . entity = {}; alert ( \"Time Entry Request Rejected\" ); }); }; }]);","title":"Create Process Form"},{"location":"tutorials/modeling/bpmn-process/#optional-email-configuration","text":"In order to receive email notifications about the process steps, a mail configuration should be provided. The following environment variables are needed: DIRIGIBLE_MAIL_USERNAME= DIRIGIBLE_MAIL_PASSWORD= DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL=smtps DIRIGIBLE_MAIL_SMTPS_HOST= DIRIGIBLE_MAIL_SMTPS_PORT=465 APP_SAMPLE_BPM_FROM_EMAIL= APP_SAMPLE_BPM_TO_EMAIL= Connecting Eclipse Dirigible with SendGrid SMTP Relay To use a SendGrid account for the mail configuration, follow the steps in the Connecting Eclipse Dirigible with SendGrid SMTP Relay blog.
DIRIGIBLE_MAIL_USERNAME=apikey DIRIGIBLE_MAIL_PASSWORD= DIRIGIBLE_MAIL_TRANSPORT_PROTOCOL=smtps DIRIGIBLE_MAIL_SMTPS_HOST=smtp.sendgrid.net DIRIGIBLE_MAIL_SMTPS_PORT=465 APP_SAMPLE_BPM_FROM_EMAIL= APP_SAMPLE_BPM_TO_EMAIL=","title":"(Optional) Email Configuration"},{"location":"tutorials/modeling/bpmn-process/#demo","text":"Navigate to http://localhost:8080/services/v4/web/sample-bpm/submit/ to open the Submit form . Enter the required data and press the Submit button. If an email configuration was provided, an email notification will be sent to the email address set by the APP_SAMPLE_BPM_TO_EMAIL environment variable. If no email configuration was provided, the following message can be found in the Console view: Approve Request URL: http://localhost:8080/services/v4/web/sample-bpm/process/?data=eyJleGVjdXRpb25JZCI6IjE4Ni... Open the URL from the Console view or open it from the email notification. The Process form will be prefilled with the data that was entered in the Submit form . Press the Approve or Reject button to resume the process execution. One more email notification will be sent and a message will be logged in the Console as part of the last step of the Business Process . BPM Sample GitHub Repository Go to https://github.com/dirigiblelabs/sample-bpm to find the complete sample. The repository can be cloned in the Git perspective and after a few minutes the BPM Sample will be active.","title":"Demo"},{"location":"tutorials/modeling/generate-application-from-datasource/","text":"Generate Application from Datasource This tutorial will guide you through the creation of an Entity Data Model (EDM) and the generation of a full-stack Dirigible application from a datasource. We will be using MySQL for that purpose, but Eclipse Dirigible supports other databases as well.
Prerequisites Access to the latest version of Eclipse Dirigible (10.2.7+) Docker Image setup (follow the steps below) Steps Pull the Docker Image Pull the official Eclipse Dirigible Docker Image to your local environment. docker pull dirigiblelabs/dirigible:latest Run the Docker Image Run with Environment Configurations Launch the Docker Image using the following command: docker run --name dirigible --rm -p 8080:8080 -p 8081:8081 dirigiblelabs/dirigible:latest Optional If you want to use environment variables to automatically import your datasource, prepare the following file: my_env.list DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=TUTORIAL TUTORIAL_DRIVER=com.mysql.cj.jdbc.Driver TUTORIAL_URL=jdbc:mysql://host.docker.internal/my_db TUTORIAL_USERNAME=*my_username* TUTORIAL_PASSWORD=*my_pass* Launch the Docker Image using the following command: docker run --env-file ./my_env.list --name dirigible --rm -p 8080:8080 -p 8081:8081 dirigiblelabs/dirigible:latest Add the Data Source There are several ways to add a datasource ( via the Web IDE , via Environment Variables , via a *.datasource file ) : via the Web IDE via Environment Variables via *.datasource file Note Note that this method may result in the loss of the datasource upon restart. Navigate to the Database perspective In the bottom right corner select the + sign and input the information for your datasource Test your connection with a simple query Set the following environment variables: DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES= _DRIVER=com.mysql.cj.jdbc.Driver _URL=jdbc:mysql://host.docker.internal/my_db _USERNAME=*my_username* _PASSWORD=*my_pass* Note In the previous section the steps were explained in more detail.
Create a *.datasource file in your application with the following content: { \"location\" : \"///.datasource\" , \"name\" : \"\" , \"driver\" : \"com.mysql.cj.jdbc.Driver\" , \"url\" : \"jdbc:mysql://${MY_DB_HOST}:${MY_DB_PORT}/${MY_DB_NAME}\" , \"username\" : \"${MY_DB_USER}\" , \"password\" : \"${MY_DB_PASS}\" } Note Replace the following placeholders: ///.datasource with the location of the datasource file in the application. with the name of the datasource. Set the following environment variables: - MY_DB_HOST - the database host. - MY_DB_PORT - the database port. - MY_DB_NAME - the database name. - MY_DB_USER - the database user. - MY_DB_PASS - the database password. Application Generation Steps Once the datasource is added, proceed with the following steps to generate the application: Right-click the database you want and select Export Schema as Model . Navigate to the Workbench perspective and you should see a project and a *.model file created from your database. Right click the *.model file: Click on the Generate option. From the Generate from template pop-up select Application - Full Stack . Input additional information for your application Note The Data Source field is where your records are going to be saved. For this tutorial we want to use our imported datasource TUTORIAL , but you can also use the Eclipse Dirigible H2 datasource (by default named DefaultDB ) . In the TUTORIAL project a couple of files will be generated - this is our application. Right click the project and publish your application using the Publish button. Navigate to the gen folder in the TUTORIAL project, select the index.html and in the Preview section below you can fetch your link and start using your application: Conclusion By following the steps outlined above, you can seamlessly generate an application in Eclipse Dirigible using a datasource.
Ensure that you set up the datasource correctly and choose the appropriate method based on your requirements.","title":"Generate Application from Datasource"},{"location":"tutorials/modeling/generate-application-from-datasource/#generate-application-from-datasource","text":"This tutorial will guide you through the creation of an Entity Data Model (EDM) and the generation of a full-stack Dirigible application from a datasource. We will be using MySQL for that purpose, but Eclipse Dirigible supports other databases as well. Prerequisites Access to the latest version of Eclipse Dirigible (10.2.7+) Docker Image setup (follow the steps below)","title":"Generate Application from Datasource"},{"location":"tutorials/modeling/generate-application-from-datasource/#steps","text":"","title":"Steps"},{"location":"tutorials/modeling/generate-application-from-datasource/#pull-the-docker-image","text":"Pull the official Eclipse Dirigible Docker Image to your local environment. docker pull dirigiblelabs/dirigible:latest","title":"Pull the Docker Image"},{"location":"tutorials/modeling/generate-application-from-datasource/#run-the-docker-image","text":"Run with Environment Configurations Launch the Docker Image using the following command: docker run --name dirigible --rm -p 8080:8080 -p 8081:8081 dirigiblelabs/dirigible:latest Optional If you want to use environment variables to automatically import your datasource, prepare the following file: my_env.list DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES=TUTORIAL TUTORIAL_DRIVER=com.mysql.cj.jdbc.Driver TUTORIAL_URL=jdbc:mysql://host.docker.internal/my_db TUTORIAL_USERNAME=*my_username* TUTORIAL_PASSWORD=*my_pass* Launch the Docker Image using the following command: docker run --env-file ./my_env.list --name dirigible --rm -p 8080:8080 -p 8081:8081 dirigiblelabs/dirigible:latest","title":"Run the Docker Image"},{"location":"tutorials/modeling/generate-application-from-datasource/#add-the-data-source","text":"There are several ways to add a
datasource ( via the Web IDE , via Environment Variables , via a *.datasource file ) : via the Web IDE via Environment Variables via *.datasource file Note Note that this method may result in the loss of the datasource upon restart. Navigate to the Database perspective In the bottom right corner select the + sign and input the information for your datasource Test your connection with a simple query Set the following environment variables: DIRIGIBLE_DATABASE_CUSTOM_DATASOURCES= _DRIVER=com.mysql.cj.jdbc.Driver _URL=jdbc:mysql://host.docker.internal/my_db _USERNAME=*my_username* _PASSWORD=*my_pass* Note In the previous section the steps were explained in more detail. Create a *.datasource file in your application with the following content: { \"location\" : \"///.datasource\" , \"name\" : \"\" , \"driver\" : \"com.mysql.cj.jdbc.Driver\" , \"url\" : \"jdbc:mysql://${MY_DB_HOST}:${MY_DB_PORT}/${MY_DB_NAME}\" , \"username\" : \"${MY_DB_USER}\" , \"password\" : \"${MY_DB_PASS}\" } Note Replace the following placeholders: ///.datasource with the location of the datasource file in the application. with the name of the datasource. Set the following environment variables: - MY_DB_HOST - the database host. - MY_DB_PORT - the database port. - MY_DB_NAME - the database name. - MY_DB_USER - the database user. - MY_DB_PASS - the database password.","title":"Add the Data Source"},{"location":"tutorials/modeling/generate-application-from-datasource/#application-generation-steps","text":"Once the datasource is added, proceed with the following steps to generate the application: Right-click the database you want and select Export Schema as Model . Navigate to the Workbench perspective and you should see a project and a *.model file created from your database. Right click the *.model file: Click on the Generate option. From the Generate from template pop-up select Application - Full Stack .
Input additional information for your application Note The Data Source field is where your records are going to be saved. For this tutorial we want to use our imported datasource TUTORIAL , but you can also use the Eclipse Dirigible H2 datasource (by default named DefaultDB ) . In the TUTORIAL project a couple of files will be generated - this is our application. Right click the project and publish your application using the Publish button. Navigate to the gen folder in the TUTORIAL project, select the index.html and in the Preview section below you can fetch your link and start using your application:","title":"Application Generation Steps"},{"location":"tutorials/modeling/generate-application-from-datasource/#conclusion","text":"By following the steps outlined above, you can seamlessly generate an application in Eclipse Dirigible using a datasource. Ensure that you set up the datasource correctly and choose the appropriate method based on your requirements.","title":"Conclusion"},{"location":"tutorials/modeling/generate-application-from-model/","text":"Generate Application from Model This tutorial will guide you through the creation of an entity data model and the generation of a full-stack Dirigible application from this model. Prerequisites Access to the latest version of Eclipse Dirigible (3.2.2+) Overview In this tutorial we will create an entity model for car service bookings and generate a full-stack Dirigible application from it. The complete sample can be found here .
Setup Car Service Bookings Create a new project car-service-bookings Right click -> New -> Entity Data Model Rename file.edm to car-service-bookings.edm Open car-service-bookings.edm Brands Drag and drop a new entity Name it Brands Rename entityId to Id Drag and drop a new property Rename property2 to Name Open the properties of the Brands entity Open the General tab Set the Type to Primary Entity Switch to the User Interface tab Set the Layout Type to Manage Master Entities Models Drag and drop a new entity Name it Models Rename entityId to Id Drag and drop a new property Rename property2 to Name Add a new relation between Models and Brands Rename the relation property in the Models entity to BrandId Open the relation properties Set Name to Brand Set Relationship Type to Composition Set Relationship Cardinality to one-to-many Open the properties of the BrandId property Switch to the User Interface tab Set Is Major to Show in form only Open the properties of the Models entity Open the General tab Set the Type to Dependent Entity Switch to the User Interface tab Set the Layout Type to Manage Details Entities Cars Drag and drop a new entity Name it Cars Rename entityId to Id Drag and drop a new property Rename property2 to PlateNumber Add a new relation between Cars and Models Rename the relation property in the Cars entity to ModelId Open the properties of the ModelId property Open the Data tab Set the Data Type to INTEGER Switch to the User Interface tab Set Widget Type to Dropdown Set Label to Model Set Dropdown Key to Id Set Dropdown Value to Name > Note : the dropdown key and value refer, respectively, to the Models:Id and Models:Name values Generation Save the model Right click on car-service-bookings.model and select Generate Set Template to Full-stack Application (AngularJS) Set Extension to car-service Check Embedded mode Set Title to Car Service Set Brand to Car Service Click Generate Publish the project Extensibility A sample view-based extension can be found here Wrap up The whole
application can be found here Resources Sample Car Service Bookings: sample-v3-car-service-bookings Sample Data: sample-v3-car-service-bookings-data Sample Extension: sample-v3-car-service-bookings-extension","title":"Generate Application from Model"},{"location":"tutorials/modeling/generate-application-from-model/#generate-application-from-model","text":"This tutorial will guide you through the creation of an entity data model and the generation of a full-stack Dirigible application from this model.","title":"Generate Application from Model"},{"location":"tutorials/modeling/generate-application-from-model/#prerequisites","text":"Access to the latest version of Eclipse Dirigible (3.2.2+)","title":"Prerequisites"},{"location":"tutorials/modeling/generate-application-from-model/#overview","text":"In this tutorial we will create an entity model for car service bookings and generate a full-stack Dirigible application from it. The complete sample can be found here .","title":"Overview"},{"location":"tutorials/modeling/generate-application-from-model/#setup-car-service-bookings","text":"Create a new project car-service-bookings Right click -> New -> Entity Data Model Rename file.edm to car-service-bookings.edm Open car-service-bookings.edm","title":"Setup Car Service Bookings"},{"location":"tutorials/modeling/generate-application-from-model/#brands","text":"Drag and drop a new entity Name it Brands Rename entityId to Id Drag and drop a new property Rename property2 to Name Open the properties of the Brands entity Open the General tab Set the Type to Primary Entity Switch to the User Interface tab Set the Layout Type to Manage Master Entities","title":"Brands"},{"location":"tutorials/modeling/generate-application-from-model/#models","text":"Drag and drop a new entity Name it Models Rename entityId to Id Drag and drop a new property Rename property2 to Name Add a new relation between Models and Brands Rename the relation property in the Models entity to BrandId Open the relation properties
Set Name to Brand Set Relationship Type to Composition Set Relationship Cardinality to one-to-many Open the properties of the BrandId property Switch to the User Interface tab Set Is Major to Show in form only Open the properties of the Models entity Open the General tab Set the Type to Dependent Entity Switch to the User Interface tab Set the Layout Type to Manage Details Entities","title":"Models"},{"location":"tutorials/modeling/generate-application-from-model/#cars","text":"Drag and drop a new entity Name it Cars Rename entityId to Id Drag and drop a new property Rename property2 to PlateNumber Add a new relation between Cars and Models Rename the relation property in the Cars entity to ModelId Open the properties of the ModelId property Open the Data tab Set the Data Type to INTEGER Switch to the User Interface tab Set Widget Type to Dropdown Set Label to Model Set Dropdown Key to Id Set Dropdown Value to Name > Note : the dropdown key and value refer, respectively, to the Models:Id and Models:Name values","title":"Cars"},{"location":"tutorials/modeling/generate-application-from-model/#generation","text":"Save the model Right click on car-service-bookings.model and select Generate Set Template to Full-stack Application (AngularJS) Set Extension to car-service Check Embedded mode Set Title to Car Service Set Brand to Car Service Click Generate Publish the project","title":"Generation"},{"location":"tutorials/modeling/generate-application-from-model/#extensibility","text":"A sample view-based extension can be found here","title":"Extensibility"},{"location":"tutorials/modeling/generate-application-from-model/#wrap-up","text":"The whole application can be found here","title":"Wrap up"},{"location":"tutorials/modeling/generate-application-from-model/#resources","text":"Sample Car Service Bookings: sample-v3-car-service-bookings Sample Data: sample-v3-car-service-bookings-data Sample Extension: sample-v3-car-service-bookings-extension","title":"Resources"}]} \ No newline at end
of file diff --git a/docs/help/setup/setup-environment-variables/index.html b/docs/help/setup/setup-environment-variables/index.html index 5d05514ad..4371f8c5f 100644 --- a/docs/help/setup/setup-environment-variables/index.html +++ b/docs/help/setup/setup-environment-variables/index.html @@ -4524,27 +4524,27 @@

Custom Database

DS1_DRIVER -The JDBC driver used for the examplary DS1 database connection +The JDBC driver used for the exemplary DS1 database connection `` DS1_URL -The JDBC url used for the examplary DS1 database connection +The JDBC url used for the exemplary DS1 database connection `` DS1_SCHEMA -The default schema used for the examplary DS1 database connection +The default schema used for the exemplary DS1 database connection `` DS1_USERNAME -The username used for the examplary DS1 database connection +The username used for the exemplary DS1 database connection `` DS1_PASSWORD -The password used for the examplary DS1 database connection +The password used for the exemplary DS1 database connection `` diff --git a/docs/help/sitemap.xml.gz b/docs/help/sitemap.xml.gz index 527cbd68f..b9c862b26 100644 Binary files a/docs/help/sitemap.xml.gz and b/docs/help/sitemap.xml.gz differ diff --git a/docs/news/sitemap.xml.gz b/docs/news/sitemap.xml.gz index 7b67784f7..27a8eac7d 100644 Binary files a/docs/news/sitemap.xml.gz and b/docs/news/sitemap.xml.gz differ diff --git a/docs/releases/sitemap.xml.gz b/docs/releases/sitemap.xml.gz index dca2c58dd..15427e6c7 100644 Binary files a/docs/releases/sitemap.xml.gz and b/docs/releases/sitemap.xml.gz differ diff --git a/docs/sitemap.xml.gz b/docs/sitemap.xml.gz index e972079ab..5f4c2baa4 100644 Binary files a/docs/sitemap.xml.gz and b/docs/sitemap.xml.gz differ