
News

Posted about 22 hours ago
In my blog post about Getting Started with OSGi Declarative Services I provided an introduction to OSGi declarative services. How to create them, how they behave at runtime, how to reference other services, and so on. But I left out an important topic there: configuring OSGi components. Well, to be precise, I mentioned it, and one sort of configuration was also used in the examples, but it was not explained in detail. As there are multiple aspects with regards to component configuration I wanted to write a blog post that is dedicated to that topic, and here it is. After reading this blog post you should have a deeper understanding of how OSGi components can be configured.

Basics

A component can be configured via Component Properties. Properties are key-value pairs that can be accessed via a Map<String, Object>. With DS 1.3 the Component Property Types are introduced for type safe access to Component Properties. Component Properties can be defined in different ways:
- inline
- via Java properties file
- via OSGi Configuration Admin
- via argument of the ComponentFactory.newInstance method (only for factory components, and as I didn't cover them in the previous blog post, I won't cover that topic here either)

Component Properties that are defined inline or via properties file can be overridden by using the OSGi Configuration Admin or the ComponentFactory.newInstance argument. Basically the property propagation is executed sequentially. Therefore it is even possible to override inline properties with properties from a properties file, if the properties file is specified after the inline properties. The SCR (Service Component Runtime) always adds the following Component Properties that can't be overridden:
- component.name – The component name.
- component.id – A unique value (Long) that is larger than all previously assigned values. These values are not persistent across restarts.

In a life cycle method (activate/modified/deactivate) you can get the Component Properties via method parameter. The properties that are retrieved in event methods for referencing other services (bind/updated/unbind) are called Service Properties. The SCR performs a property propagation in that case, which means that all non-private Component Properties are propagated as Service Properties. To mark a property as private, the property name needs to be prefixed with a full stop ('.').

First I will explain how to specify Component Properties in different ways. I will use a simple example that inspects the properties in a life cycle method. After that I will show some examples on the usage of properties of service references. Let's start by creating a new project for the configurable components:
- Create a new Plug-in Project via File -> New -> Plug-in Project. (Plug-in Perspective needs to be active)
- Set the Plug-in name to org.fipro.ds.configurable
- Press Next
- Ensure that no Activator is generated, no UI contributions will be added and that no Rich Client Application is created
- Press Finish
- Open the MANIFEST.MF file and switch to the Dependencies tab
- Add the following dependency on the Imported Packages side: org.osgi.service.component.annotations (1.2.0)
- Mark org.osgi.service.component.annotations as Optional via Properties… to ensure there are no runtime dependencies. We only need this dependency at build time.
- Create the package org.fipro.ds.configurable

Inline Component Properties

You can add Component Properties to a declarative service component via the @Component annotation property type element.
The value of that annotation type element is an array of Strings, which need to be given as key-value pairs in the format name(:type)?=value, where the type information is optional and defaults to String. The following types are supported:
- String (default)
- Boolean
- Byte
- Short
- Integer
- Long
- Float
- Double
- Character

There are typically two use cases for specifying Component Properties inline:
- Define default values for Component Properties
- Specify some sort of meta-data that is examined by referencing components

Of course the same applies for Component Properties that are applied via a properties file, as they have an equal ranking.

Create a new class StaticConfiguredComponent like shown below. It is a simple Immediate Component with the Component Properties message and iteration, where message is a String and iteration is an Integer value. In the @Activate method the Component Properties will be inspected and the message will be printed out to the console as often as specified in iteration. Remember that it is an Immediate Component, as it doesn't implement an interface and it doesn't specify the service type element.

package org.fipro.ds.configurable;

import java.util.Map;

import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;

@Component(
    property = {
        "message=Welcome to the inline configured service",
        "iteration:Integer=3" }
)
public class StaticConfiguredComponent {

    @Activate
    void activate(Map<String, Object> properties) {
        String msg = (String) properties.get("message");
        Integer iter = (Integer) properties.get("iteration");
        for (int i = 1; i <= iter; i++) {
            System.out.println(i + ": " + msg);
        }
        System.out.println();
    }
}

Now execute the example as a new OSGi Framework run configuration (please have a look at Getting Started with OSGi Declarative Services – 6. Run to see how to set up such a configuration). If you used the same property values as specified in the above example, you should see the welcome message printed out 3 times to the console. It is for sure not a typical use case to inspect the inline specified properties at activation time. But it should give an idea on how to specify Component Properties statically inline via @Component.

Component Properties from resource files

Another way to specify Component Properties statically is to use a Java Properties File that is located inside the bundle. It can be specified via the @Component annotation properties type element, where the value needs to be an entry path relative to the root of the bundle. Create a simple properties file named config.properties inside the OSGI-INF folder of the org.fipro.ds.configurable bundle.

message=Welcome to the file configured service
iteration=4

Create a new class FileConfiguredComponent like shown below. It is a simple Immediate Component like the one before, getting the Component Properties message and iteration from the properties file.
package org.fipro.ds.configurable; import java.util.Map; import org.osgi.service.component.annotations.Activate; import org.osgi.service.component.annotations.Component; @Component( properties="OSGI-INF/config.properties" ) public class FileConfiguredComponent { @Activate void activate(Map properties) { String msg = (String) properties.get("message"); String iter = (String) properties.get("iteration"); if (msg != null && iter != null) { Integer count = Integer.valueOf(iter); for (int i = 1; i <= count; i++) { System.out.println(i + ": " + msg); } System.out.println(); } } } Add the OSGI-INF/config.properties file to the build.properties to include it in the resulting bundle jar file. This is of course only necessary in case you haven’t added the whole directory to the build.properties. On executing the example you should now see the console outputs for both components. I’ve noticed two things when playing around with the Java Properties File approach: Compared with the inline properties it is not possible to specify a type. You can only get Strings, which leads to manual conversions (at least before DS 1.3 – see below). The properties file needs to be located in the same bundle as the component. It can not be added via fragment. Having these two facts in mind, there are not many use cases for this approach. IMHO this approach was intended to support client specific properties that are for example placed inside the bundle in the build process. Bndtools vs. PDE Create the config.properties file in the project root Add the -includeresource instruction to the bnd.bnd file This is necessary to include the config.properties file to the resulting bundle jar file. The instruction should look similar to the following snippet to specify the destination and the source. -includeresource: OSGI-INF/config.properties=config.properties Note: The destination is on the left side of the assignment and the source is on the right. If only the source is specified (that means no assignment), the file is added to the bundle root without the folder where it is included in the sources. Component Properties via OSGi Configuration Admin Now let’s have a look at the dynamic configuration by using the OSGi Configuration Admin. For this we create a new component, although it would not be necessary, as we could also use one of the examples before (remember that we could override the statically defined Component Properties dynamically via the Configuration Admin). But I wanted to start with creating a new component, to have a class that can be directly compared with the previous ones. To specify properties via Configuration Admin it is not required to use any additional type element. You only need to know the configuration PID of the component to be able to provide a configuration object for it. The configuration PID (Persistent IDentity) is used as a key for objects that need a configuration dictionary. With regards to the Component Configuration this means, we need the configuration PID to be able to provide the configuration object for the component. The PID can be specified via the configurationPid type element of the @Component annotation. If not specified explicitly it is the same as the component name, which is the fully qualified class name, if not explicitly set to another value. Via the configurationPolicy type element it is possible to configure the relationship between component and component configuration, e.g. 
whether there needs to be a configuration object provided via Configuration Admin to satisfy the component. The following values are available: ConfigurationPolicy.OPTIONAL Use the corresponding configuration object if present, but allow the component to be satisfied even if the corresponding configuration object is not present. This is the default value. ConfigurationPolicy.REQUIRE There must be a corresponding configuration object for the component configuration to become satisfied. This means that there needs to be a configuration object that is set via Configuration Admin before the component is satisfied and therefore can be activated. With this policy it is for example possible to control the startup order or component activation based on configurations. ConfigurationPolicy.IGNORE Always allow the component configuration to be satisfied and do not use the corresponding configuration object even if it is present. This basically means that the Component Properties can not be changed dynamically using the Configuration Admin. If a configuration change happens at runtime, the SCR needs to take actions based on the configuration policy. Configuration changes can be creating, modifying or deleting configuration objects. Corresponding actions can be for example that a Component Configuration becomes unsatisfied and therefore Component Instances are deactivated, or to call the modified life cycle method, so the component is able to react on a change. To be able to react on a configuration change at runtime, a method to handle the modified life cycle can be implemented. Using the DS annotations this can be done by using the @Modified annotation, where the method parameters can be the same as for the other life cycle methods (see the Getting Started Tutorial for further information on that). Note: If you do not specify a modified life cycle method, the Component Configuration is deactivated and afterwards activated again with the new configuration object. This is true for the configuration policy require as well as for the configuration policy optional. Now create a component similar to the previous ones, that should only be satisfied if a configuration object is provided via the Configuration Admin. It should also be prepared to react on configuration changes at runtime. Specify an alternative configuration PID so it is not necessary to use the full qualified class name of the component. Create a new class AdminConfiguredComponent like shown below. It is an Immediate Component that prints out a message for a specified number of iterations. Specify the configuration PID AdminConfiguredComponent so it is not necessary to use the full qualified class name of the component when trying to configure it. Set the configuration policy REQUIRE, so the component will only be activated once a configuration object is set by the Configuration Admin. Add life cycle methods for modified and deactivate to be able to play around with different scenarios. 
package org.fipro.ds.configurable; import java.util.Map; import org.osgi.service.component.annotations.Activate; import org.osgi.service.component.annotations.Component; import org.osgi.service.component.annotations.ConfigurationPolicy; import org.osgi.service.component.annotations.Deactivate; import org.osgi.service.component.annotations.Modified; @Component( configurationPid = "AdminConfiguredComponent", configurationPolicy = ConfigurationPolicy.REQUIRE ) public class AdminConfiguredComponent { @Activate void activate(Map properties) { System.out.println(); System.out.println("AdminConfiguredComponent activated"); printMessage(properties); } @Modified void modified(Map properties) { System.out.println(); System.out.println("AdminConfiguredComponent modified"); printMessage(properties); } @Deactivate void deactivate() { System.out.println("AdminConfiguredComponent deactivated"); System.out.println(); } private void printMessage(Map properties) { String msg = (String) properties.get("message"); Integer iter = (Integer) properties.get("iteration"); if (msg != null && iter != null) { for (int i = 1; i <= iter; i++) { System.out.println(i + ": " + msg); } } } } If we now execute our example, we will see nothing new. The reason is of course that there is no configuration object yet provided by the Configuration Admin. Before we are able to do this we need to prepare our environment. That means that we need to install the Configuration Admin Service to the Eclipse IDE or the used Target Platform, as it is not part of the default installation. To install the Configuration Admin to the Eclipse IDE you need to perform the following steps: Select Help -> Install New Software… from the main menu Select the Neon – http://download.eclipse.org/releases/neon repository (assuming you are following the tutorial with Eclipse Neon, otherwise use the matching update site) Disable Group items by category Filter for Equinox Select the Equinox Compendium SDK Click Next Click Next Accept the license agreement and Finish Restart the Eclipse IDE to safely apply the changes Now we can create a Gogo Shell command that will be used to change a configuration object at runtime. Open MANIFEST.MF of org.fipro.ds.configurable Add org.osgi.service.cm to the Imported Packages Create a new package org.fipro.ds.configurable.command Create a new class ConfigureServiceCommand in that package that looks similar to the following snippet. It is a Delayed Component that will be registered as a service for the ConfigureCommand class. It has a reference to the ConfigurationAdmin service, which is used to create/get the Configuration object for the PID AdminConfiguredComponent and updates the configuration with the given values. 
package org.fipro.ds.configurable.command; import java.io.IOException; import java.util.Hashtable; import org.osgi.service.cm.Configuration; import org.osgi.service.cm.ConfigurationAdmin; import org.osgi.service.component.annotations.Component; import org.osgi.service.component.annotations.Reference; @Component( property = { "osgi.command.scope=fipro", "osgi.command.function=configure" }, service=ConfigureCommand.class ) public class ConfigureCommand { ConfigurationAdmin cm; @Reference void setConfigurationAdmin(ConfigurationAdmin cm) { this.cm = cm; } public void configure(String msg, int count) throws IOException { Configuration config = cm.getConfiguration("AdminConfiguredComponent"); Hashtable props = new Hashtable<>(); props.put("message", msg); props.put("iteration", count); config.update(props); } } Note: The ConfigurationAdmin reference is a static reference. Therefore it doesn’t need an unbind method. If you follow the example with Eclipse Neon you will probably see an error mentioning the missing unbind method. Either implement the unbind method for now or disable the error via Preferences. This is fixed with Eclipse Oxygen M2. Note: The two Component Properties osgi.command.scope and osgi.command.function are specified inline. These are necessary so the Apache Gogo Shell recognizes the component as a service that can be triggered by entering the corresponding values as a command to the console. This shows the usage of Component Properties as additional meta-data that is examined by other components. Also note that we need to set the service type element, as only services can be referenced by other components. To execute the example you need to include the org.eclipse.equinox.cm bundle to the Run configuration. On executing the example you should notice that the AdminConfiguredComponent is not activated on startup, although it is an Immediate Component. Now execute the following command on the console: configure foo 2 As a result you should get an output like this: AdminConfiguredComponent activated 1: foo 2: foo If you execute the command a second time with different parameters (e.g. configure bar 3), the output should change to this: AdminConfiguredComponent modified 1: bar 2: bar 3: bar The component gets activated after we created a configuration object via the Configuration Admin. The reason for this is ConfigurationPolicy.REQUIRED which means that there needs to be a configuration object for the component configuration in order to be satisfied. Subsequent executions change the configuration object, so the modified method is called then. Now you can play around with the implementation to get a better feeling. For example, remove the modified method and see how the component life cycle handling changes on configuration changes. Note: To start from a clean state again you need to check the option Clear the configuration area before launching in the Settings tab of the Run configuration. Using the modified life cycle event enables to react on configuration changes inside the component itself. To be able to react to configuration changes inside components that reference the service, the updated event method can be used. 
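As a side note, a configuration object created this way can also be removed again at runtime instead of clearing the configuration area. The following is a minimal, hedged sketch of an additional Gogo command for that purpose (the command name unconfigure and the class are made up for illustration):

package org.fipro.ds.configurable.command;

import java.io.IOException;

import org.osgi.service.cm.Configuration;
import org.osgi.service.cm.ConfigurationAdmin;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

@Component(
    property = {
        "osgi.command.scope=fipro",
        "osgi.command.function=unconfigure" },
    service = UnconfigureCommand.class
)
public class UnconfigureCommand {

    ConfigurationAdmin cm;

    @Reference
    void setConfigurationAdmin(ConfigurationAdmin cm) {
        this.cm = cm;
    }

    public void unconfigure() throws IOException {
        // getConfiguration() returns the existing configuration object
        // (or creates an empty one); delete() removes it, which should make
        // the REQUIRE component unsatisfied and deactivate it again
        Configuration config = cm.getConfiguration("AdminConfiguredComponent");
        config.delete();
    }
}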
Create a simple component that references the AdminConfiguredComponent to test this: package org.fipro.ds.configurable; import java.util.Map; import org.osgi.service.component.annotations.Activate; import org.osgi.service.component.annotations.Component; import org.osgi.service.component.annotations.Deactivate; import org.osgi.service.component.annotations.Modified; import org.osgi.service.component.annotations.Reference; @Component public class AdminReferencingComponent { AdminConfiguredComponent component; @Activate void activate() { System.out.println("AdminReferencingComponent activated"); } @Modified void modified() { System.out.println("AdminReferencingComponent modified"); } @Deactivate void deactivate() { System.out.println("AdminReferencingComponent deactivated"); } @Reference void setAdminConfiguredComponent( AdminConfiguredComponent comp, Map properties) { System.out.println("AdminReferencingComponent: set service"); printMessage(properties); } void updatedAdminConfiguredComponent( AdminConfiguredComponent comp, Map properties) { System.out.println("AdminReferencingComponent: update service"); printMessage(properties); } void unsetAdminConfiguredComponent( AdminConfiguredComponent comp) { System.out.println("AdminReferencingComponent: unset service"); } private void printMessage(Map properties) { String msg = (String) properties.get("message"); Integer iter = (Integer) properties.get("iteration"); System.out.println("[" + msg + "|" + iter + "]"); } } Configure the AdminConfiguredComponent to be a service component by adding the attribute service=AdminConfiguredComponent.class to the @Component annotation. Otherwise it can not be referenced. @Component( configurationPid = "AdminConfiguredComponent", configurationPolicy = ConfigurationPolicy.REQUIRE, service=AdminConfiguredComponent.class ) public class AdminConfiguredComponent { Now execute the example and call the configure command two times. The result should look similar to this: osgi> configure blubb 2 AdminConfiguredComponent activated 1: blubb 2: blubb AdminReferencingComponent: set service [blubb|2] AdminReferencingComponent activated osgi> configure dingens 3 AdminConfiguredComponent modified 1: dingens 2: dingens 3: dingens AdminReferencingComponent: update service [dingens|3] Calling the configure command the first time triggers the activation of the AdminConfiguredComponent, which then can be bound to the AdminReferencingComponent, which is satisfied and therefore can be activated afterwards. The second execution of the configure command triggers the modified life cycle event of the AdminConfiguredComponent and the updated event method of the AdminReferencingComponent. If you ask yourself why the AdminConfiguredComponent is still immediately activated, although we made it a service now, the answer is, because it is referenced by an Immediate Component. Therefore the target services need to be bound, which means the referenced services need to be activated too. This example is also helpful in getting a better understanding of the component life cycle. For example, if you remove the modified life cycle method from the AdminConfiguredComponent and call the configure command subsequently, both components get deactivated and activated, which results in new instances. Modifying the @Reference attributes will also lead to different results then. Change the cardinality, the policy and the policyOption to see the different behavior. 
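For illustration, here is a hedged sketch of what the OPTIONAL|DYNAMIC|GREEDY variant of that reference could look like in AdminReferencingComponent (only the changed bind method declaration is shown; the unset method shown above is still needed for the dynamic policy):

@Reference(
    cardinality = ReferenceCardinality.OPTIONAL,
    policy = ReferencePolicy.DYNAMIC,
    policyOption = ReferencePolicyOption.GREEDY
)
void setAdminConfiguredComponent(
        AdminConfiguredComponent comp, Map<String, Object> properties) {
    System.out.println("AdminReferencingComponent: set service");
    printMessage(properties);
}

The corresponding imports (ReferenceCardinality, ReferencePolicy, ReferencePolicyOption from org.osgi.service.component.annotations) need to be added to the class as well.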
Making the service reference OPTIONAL|DYNAMIC|GREEDY results in only re-activating the AdminConfiguredComponent but keeping the AdminReferencingComponent in active state. Changing it to OPTIONAL|STATIC|GREEDY will lead to re-activation of both components, while setting it OPTIONAL|STATIC|RELUCTANT any changes will be ignored, and actually nothing happens as the AdminReferencingComponent never gets satisfied, and therefore the AdminConfiguredComponent never gets activated. The correlation between cardinality, reference policy and reference policy option is explained in detail in the OSGi Compendium Specification (table 112.1 in chapter 112.3.7 Reference Policy Option in Specification Version 6). Location Binding Some words about location binding here. The example above created a configuration object using the single parameter version of ConfigurationAdmin#getConfiguration(String). The parameter specifies the PID for which a configuration object is requested or should be created. This means that the configuration is bound to the location of the calling bundle. It then can not be consumed by other bundles. So the method is used to ensure that only the components inside the same bundle are affected. A so-called bound configuration object is sufficient for the example above, as all created components are located in the same bundle. But there are also other cases where for example a configuration service in another bundle should be used to configure the components in all bundles of the application. This can be done by creating an unbound configuration object using the two argument version of ConfigurationAdmin#getConfiguration(String, String). The first parameter is the PID and the second parameter specifies the bundle location string. Note: The location parameter only becomes important if a configuration object will be created. If a configuration for the given PID already exists in the ConfigurationAdmin service, the location parameter will be ignored and the existing object will be returned. You can use different values for the location argument: Exact bundle location identifier In this case you explicitly specify the location identifier of the bundle to which the configuration object should be bound. The location identifier is set when a bundle is installed and typically it is a file URL that points to the bundle jar. It is impossible to have that hard coded and work across multiple installations. But you could retrieve it via a snippet similar to this: Bundle adminBundle = FrameworkUtil.getBundle(AdminConfiguredComponent.class); adminBundle.getLocation() But doing this introduces a dependency to the bundle that should be configured, which is typically not a good practice. null The location value for the binding will be set when a service with the corresponding PID is registered the first time. Note that this could lead to issues if you have multiple services with the same PID in different bundles. In that case only the services in the first bundle that requests a configuration object would be able to get it because of the binding. Multi-locations By using a multi-location binding, the configurations are dispatched to any target that has visibility to the configuration. A multi-location is specified with a leading question mark. It is possible to use only the question mark or adding a multi-location name behind the question mark, e.g. 
Configuration config = cm.getConfiguration("AdminConfiguredComponent", "?");
Configuration config = cm.getConfiguration("AdminConfiguredComponent", "?org.fipro");

Note: The multi-location name only has importance in case security is turned on and a ConfigurationPermission is specified. Otherwise it doesn't have an effect. That means it can not be used to restrict the targets based on the bundle symbolic name without security turned on.

Note: The Equinox DS implementation has some bugs with regards to location binding. Basically the location binding is ignored. I had a discussion on Stackoverflow (thanks again to Neil Bartlett) and created the ticket Bug 493637 to address that issue. I also created Bug 501898 to report that multi-location binding doesn't work.

To get familiar with the location binding basics create two additional bundles:
- Create the bundle org.fipro.ds.configurator
  - Open the MANIFEST.MF file and switch to the Dependencies tab
  - Add the following dependencies on the Imported Packages side: org.osgi.service.cm, org.osgi.service.component.annotations (1.2.0)
  - Mark org.osgi.service.component.annotations as Optional
  - Create the package org.fipro.ds.configurator
  - Create the class ConfCommand
  - Copy the ConfigureCommand implementation
  - Change the property value for osgi.command.function to conf
  - Change the method name from configure to conf to match the osgi.command.function property
- Create the bundle org.fipro.ds.other
  - Open the MANIFEST.MF file and switch to the Dependencies tab
  - Add the following dependency on the Imported Packages side: org.osgi.service.component.annotations (1.2.0)
  - Mark org.osgi.service.component.annotations as Optional
  - Create the package org.fipro.ds.other
  - Create the class OtherConfiguredComponent
  - Copy the AdminConfiguredComponent implementation
  - Change the console outputs to show the new class name
  - Ensure that it is an Immediate Component (i.e. remove the service property or add the immediate property)
  - Ensure that configurationPid and configurationPolicy are the same as in AdminConfiguredComponent

Use three different scenarios (a sketch of the multi-location variant of the conf command follows below):
- Use the single parameter getConfiguration(String): Calling the conf command on the console will result in nothing. As the configuration object is bound to the bundle of the command, the other bundles don't see it and the contained components don't get activated.
- Use the double parameter getConfiguration(String, String) where location == null: Only the component(s) of one bundle will receive the configuration object, as it will be bound to the bundle that first registers a service for the corresponding PID.
- Use the double parameter getConfiguration(String, String) where location == "?": The components of both bundles will receive the configuration object, as it is dispatched to all bundles that have visibility to the configuration. And as we didn't mention and configure permissions, all our bundles receive it.

Note: Because of the location binding issues in Equinox DS (see above), these examples don't work using it. For testing I replaced Equinox DS with Apache Felix SCR in the Run Configuration, which worked well. To make this work just download SCR (Declarative Services) from the Apache Felix Download page and put it in the dropins folder of your Eclipse installation. After restarting the IDE you are able to select org.apache.felix.scr as bundle in the Run Configuration. Remember to remove org.eclipse.equinox.ds to ensure that only one SCR implementation is running.
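For the third scenario, here is a minimal sketch of how the conf command could create such an unbound configuration (hedged: only the changed method is shown, the rest of the class stays as in the copied ConfigureCommand implementation):

public void conf(String msg, int count) throws IOException {
    // "?" requests a multi-location configuration that is dispatched to all
    // bundles that have visibility to it (no permissions are configured here)
    Configuration config = cm.getConfiguration("AdminConfiguredComponent", "?");
    Hashtable<String, Object> props = new Hashtable<>();
    props.put("message", msg);
    props.put("iteration", count);
    config.update(props);
}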
Bndtools vs. PDE

For the org.fipro.ds.configurable bundle you need to add the package org.fipro.ds.configurable.command to the Private Packages in the bnd.bnd file. Otherwise it will not be part of the resulting bundle. While we needed to add the Import-Package statement for org.osgi.service.cm manually in PDE, that import is automatically calculated by Bndtools. So at that point there is no action necessary. Only the launch configuration needs to be updated manually to include the Configuration Admin bundle:
- Open the launch.bndrun file
- On the Run tab click on Resolve
- Verify the values shown in the opened dialog in the Required Resources section
- Click Finish

If you change a component class while the example is running, you will notice that the OSGi framework automatically restarts and the values set before via Configuration Admin are gone. This is because the Bndtools OSGi Framework launch configuration has two options enabled by default on the OSGi tab: Framework: Update bundles during runtime and Framework: Clean storage area before launch. To test the behavior of components in case of persisted configuration values, you need to disable these settings.

DS 1.3

A new feature added to the DS 1.3 specification is Component Property Types. They can be used as an alternative to the component property Map parameter for retrieving the Configuration Properties in a life cycle method. The Component Property Type is specified as a custom annotation type that contains property names, property types and default values. The following snippet shows the definition of such an annotation for the above examples:

package org.fipro.ds.configurable;

public @interface MessageConfig {
    String message() default "";
    int iteration() default 0;
}

Most of the examples found on the web show the definition of the annotation inside the component class. But of course it is also possible to create a public annotation in a separate file so it is reusable in multiple components. The following snippet shows one of the examples above, modified to use a Component Property Type instead of the property Map.

package org.fipro.ds.configurable;

import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;

@Component(
    property = {
        "message=Welcome to the inline configured service",
        "iteration:Integer=3" }
)
public class StaticConfiguredComponent {

    @Activate
    void activate(MessageConfig config) {
        String msg = config.message();
        int iter = config.iteration();
        for (int i = 1; i <= iter; i++) {
            System.out.println(i + ": " + msg);
        }
    }
}

Note: If properties are needed that are not specified in the Component Property Type, you can have both as method arguments. Since DS 1.3 different method signatures are supported, including the combination of Component Property Type and the component property Map (see the sketch below).

Although the Component Property Type is defined as an annotation type, it is not used as an annotation. The reasons for choosing annotation types are:
- Limitations on annotation type definitions match component property types (no-argument methods and limited return types supported)
- Support of default values

As Component Property Types are intended to be type safe, an automatic conversion happens. This is also true for Component Properties that are specified via Java Properties files. To set configuration values via the ConfigurationAdmin service you still need to operate on a Dictionary, which means you need to know the parameter names.
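Here is a hedged sketch of such a combined signature, reusing the MessageConfig annotation from above (this variant needs the java.util.Map import again; the component.name lookup is just for illustration):

@Activate
void activate(MessageConfig config, Map<String, Object> properties) {
    // type safe access for the properties defined in the property type ...
    String msg = config.message();
    int iter = config.iteration();
    // ... and the raw map for everything else, e.g. properties added by the SCR
    System.out.println("activated: " + properties.get("component.name"));
    for (int i = 1; i <= iter; i++) {
        System.out.println(i + ": " + msg);
    }
}

When configuring such a component via the Configuration Admin, the keys in the Dictionary have to match the method names of the property type (message and iteration here).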
But of course on setting the values you are type safe. Another new feature in DS 1.3 is that you can specify multiple configuration PIDs for a component. This way it is for example possible to specify configuration objects for multiple components that share a common PID, while at the same time having a specific configuration object for a single component. To specify multiple configuration PIDs and still keep the default (that is the component name), the placeholder "$" can be used. By adding the following property to the StaticConfiguredComponent and the FileConfiguredComponent created before, the execution of the configure command will update all three components at once.

@Component(
    configurationPid = {"$", "AdminConfiguredComponent"},
    ...
)

Note that we don't update the configurationPid value of AdminConfiguredComponent. The reason for this is that we use the configuration policy REQUIRE, which means that the component only gets satisfied if there are configuration objects available for BOTH configuration PIDs. And our example does not create a configuration object for the default PID of the AdminConfiguredComponent. The order of the configuration PIDs matters with regards to property propagation. The configuration object for a PID at the end overrides values that were applied by another configuration object for a PID before. This is similar to the propagation of inline properties or property files. The processing is sequential and therefore later processed instructions override previous ones.

Service Properties

As initially explained there is a slight difference between Component Properties and Service Properties. Component Properties are all properties specified for a component that can be accessed in life cycle methods via method parameter. Service Properties can be retrieved via Event Methods (bind/updated/unbind) or since DS 1.3 via field strategy. They contain all public Component Properties, which means all excluding those whose property names start with a full stop. Additionally some service properties are added that are intended to give additional information about the service. These properties are prefixed with service, set by the framework and specified in the OSGi Core Specification (service.id, service.scope and service.bundleid). To play around with Service Properties we set up another playground. For this create the following bundles to simulate a data provider service:

API bundle
- Create the bundle org.fipro.ds.data.api
- Add the following service interface

package org.fipro.ds.data; public interface DataService { /** * @param id * The id of the requested data value. * @return The data value for the given id.
*/ String getData(int id); } Modify the MANIFEST.MF to export the package Online data service provider bundle Create the bundle org.fipro.ds.data.online Add the necessary package import statements to the MANIFEST.MF Create the following simple service implementation, that specifies the property fipro.connectivity=online for further use package org.fipro.ds.data.online; import org.fipro.ds.data.DataService; import org.osgi.service.component.annotations.Component; @Component(property="fipro.connectivity=online") public class OnlineDataService implements DataService { @Override public String getData(int id) { return "ONLINE data for id " + id; } } Offline data service provider bundle Create the bundle org.fipro.ds.data.offline Add the necessary package import statements to the MANIFEST.MF Create the following simple service implementation, that specifies the property fipro.connectivity=offline for further use package org.fipro.ds.data.offline; import org.fipro.ds.data.DataService; import org.osgi.service.component.annotations.Component; @Component(property="fipro.connectivity=offline") public class OfflineDataService implements DataService { @Override public String getData(int id) { return "OFFLINE data for id " + id; } } Note: For Java best practices you would of course specify the property name and the possible values as constants in the API bundle to prevent typing errors. To be able to interact with the data provider services, we create an additional console command in the bundle  that references the services and shows the retrieved data on the console on execution. Add it to the bundle org.fipro.ds.configurator or create a new bundle if you skipped the location binding example. package org.fipro.ds.configurator; import java.util.ArrayList; import java.util.List; import java.util.Map; import org.fipro.ds.data.DataService; import org.osgi.service.component.annotations.Component; import org.osgi.service.component.annotations.Reference; import org.osgi.service.component.annotations.ReferenceCardinality; import org.osgi.service.component.annotations.ReferencePolicy; @Component( property= { "osgi.command.scope:String=fipro", "osgi.command.function:String=retrieve"}, service=DataRetriever.class ) public class DataRetriever { private List dataServices = new ArrayList<>(); @Reference( cardinality=ReferenceCardinality.MULTIPLE, policy=ReferencePolicy.DYNAMIC ) void addDataService( DataService service, Map properties) { this.dataServices.add(service); System.out.println( "Added " + service.getClass().getName()); } void removeDataService(DataService service) { this.dataServices.remove(service); System.out.println( "Removed " + service.getClass().getName()); } public void retrieve(int id) { for (DataService service : this.dataServices) { System.out.println(service.getData(id)); } } } Add the new bundles to an existing Run Configuration and execute it. By calling the retrieve command on the console you should get an output similar to this: osgi> retrieve 3 OFFLINE data for id 3 ONLINE data for id 3 Nothing special so far. Now let’s modify the example to verify the Service Properties. 
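A quick aside before we continue: the best-practice note above suggested keeping the property name and the possible values as constants in the API bundle. A minimal sketch of how that could look (the class name DataServiceProperties is made up for illustration):

package org.fipro.ds.data;

public final class DataServiceProperties {

    private DataServiceProperties() {
        // constants only
    }

    public static final String CONNECTIVITY = "fipro.connectivity";
    public static final String CONNECTIVITY_ONLINE = "online";
    public static final String CONNECTIVITY_OFFLINE = "offline";
}

Since these are compile-time constants, they could also be used directly in the @Component property element via string concatenation, e.g. DataServiceProperties.CONNECTIVITY + "=" + DataServiceProperties.CONNECTIVITY_ONLINE.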
Modify DataRetriever#addDataService() to print the given properties to the console:

@Reference(
    cardinality=ReferenceCardinality.MULTIPLE,
    policy=ReferencePolicy.DYNAMIC
)
void addDataService(
        DataService service, Map<String, Object> properties) {
    this.dataServices.add(service);
    System.out.println("Added " + service.getClass().getName());
    properties.forEach((k, v) -> {
        System.out.println(k + "=" + v);
    });
    System.out.println();
}

Start the example and execute the retrieve command. The result should now look like this:

osgi> retrieve 3
org.fipro.ds.data.offline.OfflineDataService
fipro.connectivity=offline
component.id=3
component.name=org.fipro.ds.data.offline.OfflineDataService
service.id=51
objectClass=[Ljava.lang.String;@1403f0fa
service.scope=bundle
service.bundleid=5

org.fipro.ds.data.online.OnlineDataService
fipro.connectivity=online
component.id=4
component.name=org.fipro.ds.data.online.OnlineDataService
service.id=52
objectClass=[Ljava.lang.String;@c63166
service.scope=bundle
service.bundleid=6

OFFLINE data for id 3
ONLINE data for id 3

The Service Properties contain the fipro.connectivity property specified by us, as well as several properties that are set by the SCR.

Note: The DataRetriever is not an Immediate Component and therefore gets activated when the retrieve command is executed the first time. The target services are bound at activation time, therefore the setter is called at that time and not before.

Modify the OfflineDataService:
- Add an Activate life cycle method
- Add a property with a property name that starts with a full stop

package org.fipro.ds.data.offline;

import java.util.Map;

import org.fipro.ds.data.DataService;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;

@Component(
    property= {
        "fipro.connectivity=offline",
        ".private=private configuration" }
)
public class OfflineDataService implements DataService {

    @Activate
    void activate(Map<String, Object> properties) {
        System.out.println("OfflineDataService activated");
        properties.forEach((k, v) -> {
            System.out.println(k + "=" + v);
        });
        System.out.println();
    }

    @Override
    public String getData(int id) {
        return "OFFLINE data for id " + id;
    }
}

Execute the retrieve command again and verify the console output. You will notice that the output from the Activate life cycle method contains the .private property but no properties with a service prefix. The output from the bind event method on the other hand does not contain the .private property, as the leading full stop marks it as a private property.

osgi> retrieve 3
OfflineDataService activated
objectClass=[Ljava.lang.String;@c60d42
component.name=org.fipro.ds.data.offline.OfflineDataService
component.id=3
.private=private configuration
fipro.connectivity=offline

org.fipro.ds.data.offline.OfflineDataService
fipro.connectivity=offline
component.id=3
component.name=org.fipro.ds.data.offline.OfflineDataService
service.id=51
objectClass=[Ljava.lang.String;@2b5d77a6
service.scope=bundle
service.bundleid=5
...

Service Ranking

In case multiple services of the same type are available, the service ranking is taken into account to determine which service will get bound. In case of multiple bindings the service ranking affects the order in which the services are bound.
The ranking order is defined as follows:
- Sorted on descending ranking order (highest first)
- If the ranking numbers are equal, sorted on ascending service.id property (oldest first)

As service ids are never reused and handed out in order of their registration time, the ordering is always complete. The property service.ranking can be used to specify the ranking order and in case of OSGi components it can be specified as a Component Property via @Component, where the value needs to be of type Integer. The default ranking value is zero if the property is not specified explicitly. Modify the two DataService implementations to specify the initial service.ranking property.

@Component(
    property = {
        "fipro.connectivity=online",
        "service.ranking:Integer=7" }
)
public class OnlineDataService implements DataService { ...

@Component(
    property = {
        "fipro.connectivity=offline",
        "service.ranking:Integer=5",
        ".private=private configuration" }
)
public class OfflineDataService implements DataService { ...

If you start the application and execute the retrieve command now, you will notice that the OnlineDataService is called first. Change the service.ranking of the OnlineDataService to 3 and restart the application. Now executing the retrieve command will first call the OfflineDataService. To make this more obvious and show that the service ranking can also be changed dynamically, create a new DataGetter command in the org.fipro.ds.configurator bundle:

package org.fipro.ds.configurator;

import java.util.Map;

import org.fipro.ds.data.DataService;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.osgi.service.component.annotations.ReferencePolicy;
import org.osgi.service.component.annotations.ReferencePolicyOption;

@Component(
    property= {
        "osgi.command.scope:String=fipro",
        "osgi.command.function:String=get" },
    service=DataGetter.class
)
public class DataGetter {

    private DataService dataService;

    @Reference(
        policy=ReferencePolicy.DYNAMIC,
        policyOption=ReferencePolicyOption.GREEDY
    )
    void setDataService(DataService service, Map<String, Object> properties) {
        this.dataService = service;
    }

    void unsetDataService(DataService service) {
        if (service == this.dataService) {
            this.dataService = null;
        }
    }

    public void get(int id) {
        System.out.println(this.dataService.getData(id));
    }
}

This command has a MANDATORY reference to a DataService. The policy option is set to GREEDY, which is necessary to bind to a higher ranked service if available. The policy is set to DYNAMIC to avoid re-activation of the DataGetter component if a service changes. If you change the policy to STATIC, the binding to the higher ranked service is done by re-activating the component.

Note: For dynamic references the unbind event method is mandatory. This is necessary because the component is not re-activated if the bound services change, which means there will be no new Component Instance. Therefore the Component Instance state needs to be secured in the unbind method. In our case we check if the current service reference is the same that should be unbound. In that case we set the reference to null, otherwise there is already another service bound.

Finally create a toggle command, which dynamically toggles the service.ranking property of OnlineDataService.
package org.fipro.ds.configurator;

import java.io.IOException;
import java.util.Dictionary;
import java.util.Hashtable;

import org.osgi.service.cm.Configuration;
import org.osgi.service.cm.ConfigurationAdmin;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

@Component(
    property= {
        "osgi.command.scope:String=fipro",
        "osgi.command.function:String=ranking" },
    service=ToggleRankingCommand.class
)
public class ToggleRankingCommand {

    ConfigurationAdmin admin;

    @Reference
    void setConfigurationAdmin(ConfigurationAdmin admin) {
        this.admin = admin;
    }

    public void ranking() throws IOException {
        Configuration configOnline = this.admin.getConfiguration(
            "org.fipro.ds.data.online.OnlineDataService", null);

        Dictionary<String, Object> propsOnline = null;
        if (configOnline != null && configOnline.getProperties() != null) {
            propsOnline = configOnline.getProperties();
        } else {
            propsOnline = new Hashtable<>();
        }

        int onlineRanking = 7;
        if (configOnline != null && configOnline.getProperties() != null) {
            Object rank = configOnline.getProperties().get("service.ranking");
            if (rank != null) {
                onlineRanking = (Integer) rank;
            }
        }

        // toggle between 3 and 7
        onlineRanking = (onlineRanking == 7) ? 3 : 7;

        propsOnline.put("service.ranking", onlineRanking);
        configOnline.update(propsOnline);
    }
}

Starting the example application the first time and executing the get command will return the ONLINE data. After executing the ranking command, the get command will return the OFFLINE data (or vice versa, dependent on the initial state).

Note: Equinox DS will log an error or warning to the console every second time. Probably an issue with processing the service reference update in Equinox DS. The example will still work, and if you replace Equinox DS with Felix SCR the message does not come up. So it looks like another Equinox DS issue.

Reference Properties

Reference Properties are special Component Properties that are associated with specific component references. They are used to configure component references more specifically. With DS 1.2 the target property is the only supported Reference Property. The reference property name needs to follow the pattern <reference name>.<reference property> so it can be accessed dynamically. The target property can be specified via the @Reference annotation on the bind event method via the target annotation type element. The value needs to be an LDAP filter expression and is used to select target services for the reference. The following example specifies a target property for the DataService reference of the DataRetriever command to only select target services which specify the Service Property fipro.connectivity with value online.

@Reference(
    cardinality=ReferenceCardinality.MULTIPLE,
    policy=ReferencePolicy.DYNAMIC,
    target="(fipro.connectivity=online)"
)

If you change that in the example and execute the retrieve command in the console again, you will notice that only the OnlineDataService will be selected by the DataRetriever. Specifying the target property directly on the reference is a static way of defining the filter. The registering of custom commands to the Apache Gogo Shell seems to work that way, as you can register any service to become a console command when the necessary properties are specified. In a dynamic environment it needs to be possible to change the target property at runtime as well. This way it is possible to react on changes to the environment for example, like whether there is an active internet connection or not.
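Since the target value is a standard OSGi LDAP filter, conditions can also be combined or use wildcards. A hedged sketch for the same DataService reference (the ranking condition is purely illustrative, not part of the tutorial sources):

@Reference(
    cardinality=ReferenceCardinality.MULTIPLE,
    policy=ReferencePolicy.DYNAMIC,
    // only bind services that declare fipro.connectivity at all AND
    // have a service.ranking of at least 5
    target="(&(fipro.connectivity=*)(service.ranking>=5))"
)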
To change the target property dynamically you can use the ConfigurationAdmin service. For this the reference property name needs to be known. Following the pattern <reference_name>.<reference_property>, this means for our example, where

    reference_name = DataService
    reference_property = target

the reference property name is

    DataService.target

To test this we implement a new command component in org.fipro.ds.configurator that allows us to toggle the connectivity state filter on the DataService reference target property.

package org.fipro.ds.configurator;

import java.io.IOException;
import java.util.Dictionary;
import java.util.Hashtable;

import org.osgi.service.cm.Configuration;
import org.osgi.service.cm.ConfigurationAdmin;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

@Component(
    property= {
        "osgi.command.scope:String=fipro",
        "osgi.command.function:String=toggle" },
    service=ToggleConnectivityCommand.class
)
public class ToggleConnectivityCommand {

    ConfigurationAdmin admin;

    @Reference
    void setConfigurationAdmin(ConfigurationAdmin admin) {
        this.admin = admin;
    }

    public void toggle() throws IOException {
        Configuration config = this.admin.getConfiguration(
            "org.fipro.ds.configurator.DataRetriever");

        Dictionary<String, Object> props = null;
        Object target = null;
        if (config != null && config.getProperties() != null) {
            props = config.getProperties();
            target = props.get("DataService.target");
        } else {
            props = new Hashtable<>();
        }

        boolean isOnline = (target == null || target.toString().contains("online"));

        // toggle the state
        StringBuilder filter = new StringBuilder("(fipro.connectivity=");
        filter.append(isOnline ? "offline" : "online").append(")");

        props.put("DataService.target", filter.toString());
        config.update(props);
    }
}

Some things to notice here:
- We use the default PID org.fipro.ds.configurator.DataRetriever to get a configuration object.
- We check if there is already an existing configuration. If there is an existing configuration we operate on the existing Dictionary. Otherwise we create a new one.
- We try to get the current state from the Dictionary.
- We create an LDAP filter String based on the retrieved information (or a default if the configuration is created) and set it as the reference target property.
- We update the configuration with the new values.

From my observation the reference policy and reference policy option don't matter in that case. On changing the reference target property dynamically, the component gets re-activated to ensure a consistent state.

DS 1.3

With DS 1.3 the Minimum Cardinality Reference Property was introduced. Via this reference property it is possible to modify the minimum cardinality value at runtime. While it is only possible to specify the optionality via the @Reference cardinality attribute (this means 0 or 1), you can specify any positive number for MULTIPLE or AT_LEAST_ONE references. So it can be used for example to specify that at least 2 services of a special type need to be available in order to satisfy the Component Configuration. The name of the minimum cardinality property is the name of the reference appended with .cardinality.minimum. In our example this would be

    DataService.cardinality.minimum

Note: The minimum cardinality can only be specified via the cardinality attribute of the reference element. So it is only possible to specify the optionality to be 0 or 1. To specify the minimum cardinality in an extended way, the minimum cardinality reference property needs to be applied via Configuration Admin.
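For orientation, the static counterpart that this runtime property builds on is the cardinality element of @Reference. A hedged sketch of a component with an AT_LEAST_ONE (1..n) reference (DataCollector is just an illustrative name, not part of the tutorial sources):

package org.fipro.ds.configurator;

import java.util.List;
import java.util.Map;
import java.util.concurrent.CopyOnWriteArrayList;

import org.fipro.ds.data.DataService;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.osgi.service.component.annotations.ReferenceCardinality;
import org.osgi.service.component.annotations.ReferencePolicy;

@Component
public class DataCollector {

    // thread safe list, as dynamic references can be bound/unbound concurrently
    private final List<DataService> services = new CopyOnWriteArrayList<>();

    @Reference(
        // statically only "at least one" can be declared; a higher minimum
        // (e.g. 2) would be set via DataService.cardinality.minimum
        cardinality = ReferenceCardinality.AT_LEAST_ONE,
        policy = ReferencePolicy.DYNAMIC
    )
    void addDataService(DataService service, Map<String, Object> properties) {
        services.add(service);
    }

    void removeDataService(DataService service) {
        services.remove(service);
    }
}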
Create a command component in org.fipro.ds.configurator to modify the minimum cardinality property dynamically. It should look like the following example:

package org.fipro.ds.configurator;

import java.io.IOException;
import java.util.Dictionary;
import java.util.Hashtable;

import org.osgi.service.cm.Configuration;
import org.osgi.service.cm.ConfigurationAdmin;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

@Component(
    property = {
        "osgi.command.scope=fipro",
        "osgi.command.function=cardinality" },
    service=ToggleMinimumCardinalityCommand.class
)
public class ToggleMinimumCardinalityCommand {

    @Reference
    ConfigurationAdmin admin;

    public void cardinality(int count) throws IOException {
        Configuration config = this.admin.getConfiguration(
            "org.fipro.ds.configurator.DataRetriever");

        Dictionary<String, Object> props = null;
        if (config != null && config.getProperties() != null) {
            props = config.getProperties();
        } else {
            props = new Hashtable<>();
        }

        props.put("DataService.cardinality.minimum", count);
        config.update(props);
    }
}

Launch the example and execute retrieve 3. You should get a valid response like before from a single service (online or offline, dependent on the target property that is set). Now if you execute cardinality 2 and afterwards retrieve 3 you should get a CommandNotFoundException. Checking the components on the console via scr:list will show that org.fipro.ds.configurator.DataRetriever now has an unsatisfied reference. Calling cardinality 1 afterwards will resolve that again. Now you can play around and create additional services to test if this is also working for values > 1.

While I was writing this blog post, finding and reporting some issues in Equinox DS, the following ticket was created: Bug 501950. If everything works out, Equinox DS will be replaced with Felix SCR. This would solve several issues and finally bring DS 1.3 also to Eclipse. So I cross my fingers that this ticket will be fixed for Oxygen. (Which on the other hand means some work for the DS Annotations, @pnehrer.)

That's it for this blog post. It again got much longer than I intended. But on the way of writing it I again learned a lot that wasn't clear to me before. I hope you could also take something out of it to use declarative services even more in your projects. Of course you can find the sources of this tutorial in my GitHub account: PDE Examples, Bndtools Examples
Posted 3 days ago
With a recent patch, Eclipse can now show you the return value of a method during a debug session. For years, when I was debugging and I needed to see the return value of a method, I would change code like:

return function();

To:

String retVal = function();
return retVal;

And then step through the code and inspect the value of "retVal". Recently [September 2016] a patch was merged to support this feature. Now when you return from a method, in the upper method, the variable view shows the return value of the previously finished call. As a side note, the reason this was not implemented sooner is that the Java virtual machine debugger did not provide this information until Java 1.6. If your version of Eclipse doesn't yet have that feature, try downloading a recent integration or nightly build. Happy debugging.
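If you want to try it quickly, here is a tiny hedged example (the class is made up for demonstration): set a breakpoint in main, step into function() and then return from it; back in main the Variables view should list the value returned by the just finished call.

public class ReturnValueDemo {

    static String function() {
        return "computed value";
    }

    public static void main(String[] args) {
        // step into function() and return from it to see its
        // return value in the Variables view of the caller
        System.out.println(function());
    }
}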
Posted 6 days ago
JBoss Tools 4.4.1 and Red Hat JBoss Developer Studio 10.1 for Eclipse Neon are here waiting for you. Check it out!

Installation

JBoss Developer Studio comes with everything pre-bundled in its installer. Simply download it from our JBoss Products page and run it like this:

java -jar jboss-devstudio-.jar

JBoss Tools or Bring-Your-Own-Eclipse (BYOE) JBoss Developer Studio require a bit more: This release requires at least Eclipse 4.6 (Neon), but we recommend using the latest Eclipse 4.6 Neon JEE Bundle since then you get most of the dependencies preinstalled. Once you have installed Eclipse, you can either find us on the Eclipse Marketplace under "JBoss Tools" or "Red Hat JBoss Developer Studio". For JBoss Tools, you can also use our update site directly: http://download.jboss.org/jbosstools/neon/stable/updates/

What is new?

Our main focus for this release was improvements for container based development and bug fixing.

Improved OpenShift 3 and Docker Tools

We continue to work on providing a better experience for container based development in JBoss Tools and Developer Studio. Let's go through a few interesting updates here.

Support for Container Labels

Users can now specify labels when running a container. The labels are saved in the launch configuration and can also be edited before relaunching the container.

Automatically detect known Docker daemon connections

When the Docker Explorer view is opened, the list of existing connections (saved from a previous session) is reloaded. In addition to this behaviour, the view will also attempt to find new connections using default settings such as the 'unix:///var/run/docker.sock' Unix socket or the 'DOCKER_HOST', 'DOCKER_CERT_PATH' and 'DOCKER_TLS_VERIFY' environment variables. This means that by default, in a new workspace, if a Docker daemon is reachable using one of those methods, the user does not have to use the "New Connection" wizard to get a connection.

Extension point for Docker daemon connection settings

An extension point has been added to the Docker core plugin to allow for custom connection settings provisioning.

Support for Docker Compose

Support for Docker Compose has finally landed! Users can select a docker-compose.yml file and start Docker Compose from the context menu, using the Run > Docker Compose launcher shortcut. The Docker Compose process displays its logs (with support for text coloring based on ANSI escape codes) and provides a stop button to stop the underlying process. Also, as with the support for building and running containers, a launch configuration is created after the first call to Docker Compose on the selected docker-compose.yml file.

Docker Image Hierarchy View Improvements

The new Docker Image Hierarchy view not only shows the relationships between images (which is particularly interesting when an image is built using a Dockerfile), but it also includes containers based on the images in the tree view, while providing all relevant commands (in the context menu) for containers and images.

Server templates can now be displayed / edited

Server templates are now displayed in the property view under the Templates tab. You can access/edit the content of the template with the Edit command.
Events can now be displayed

Events generated as part of the application lifecycle are now displayed in the property view under the Events tab (available at the project level). You can refresh the content of the event with the Refresh command or open the event in the OpenShift web console with the Show In → Web Console command.

Volume claims can now be displayed

Volume claims are now displayed in the property view under the Storage tab (available at the project level). You can create a new volume claim using a resource file like the following:

{
  "apiVersion": "v1",
  "kind": "PersistentVolumeClaim",
  "metadata": {
    "name": "claim1"
  },
  "spec": {
    "accessModes": [ "ReadWriteOnce" ],
    "resources": {
      "requests": {
        "storage": "1Gi"
      }
    }
  }
}

If you deploy such a resource file with the New → Resource command at the project level, the Storage tab will be updated. You can access/edit the content of the volume claim with the Edit command or open the volume claim in the OpenShift web console with the Show In → Web Console command.

Server Tools

QuickFixes now available in runtime detection

Runtime detection has been a feature of JBoss Tools for a long while; however, it would sometimes create runtime and server adapters with configuration errors without alerting the user. Now the user will have an opportunity to execute quickfixes before completing the creation of their runtimes and servers. To see this in action, we can first open up the runtime-detection preference page. We can see that our runtime-detection will automatically search three paths for valid runtimes of any type. Once we click search, the runtime-detection's search dialog appears with the results it has found. In this case, it has located an EAP 6.4 and an EAP 7.0 installation. However, we can see that both have errors. If we click on the error column for the discovered EAP 7.0, the error is expanded, and we see that we're missing a valid / compatible JRE. To fix the issue, we should click on this item. When we click on the problem for EAP 7, the new JRE dialog appears, allowing us to add a compatible JRE. The dialog helpfully informs us of what the restrictions are for this specific runtime. In this case, we're asked to define a JRE with a minimum version of Java 8. If we continue along with the process by locating and adding a Java 8 JRE, as shown above, and finish the dialog, we'll see that all the errors disappear for both runtimes. In this example, the EAP 6.4 required a JRE of Java 7 or higher; the addition of the Java 8 JRE fixed this issue as well. Hopefully, this will help users preemptively discover and fix errors before being hit with surprising errors when trying to use the created server adapters.

Support for WildFly 10.1

The WildFly 10.0 Server adapter has been renamed to WildFly 10.x. It has been tested and verified to work for WildFly 10.1 installations.

Hibernate Tools

Hibernate Runtime Provider Updates

A number of additions and updates have been performed on the available Hibernate runtime providers.

New Hibernate 5.2 Runtime Provider

With final releases available in the Hibernate 5.2 stream, the time was right to make available a corresponding Hibernate 5.2 runtime provider. This runtime provider incorporates Hibernate Core version 5.2.2.Final and Hibernate Tools version 5.2.0.Beta1.

Figure 1. Hibernate 5.2 is available

Other Runtime Provider Updates

The Hibernate 4.3 runtime provider now incorporates Hibernate Core version 4.3.11.Final and Hibernate Tools version 4.3.5.Final.
The Hibernate 5.0 runtime provider now incorporates Hibernate Core version 5.0.10.Final and Hibernate Tools version 5.0.2.Final. The Hibernate 5.1 runtime provider now incorporates Hibernate Core version 5.1.1.Final and Hibernate Tools version 5.1.0.CR1.

Forge Tools

Added Install addon from the catalog command

From Forge 3.3.0.Final onwards it is now possible to query and install addons listed in the Forge addons page.

Forge Runtime updated to 3.3.1.Final

The included Forge runtime is now 3.3.1.Final. Read the official announcement here.

Freemarker

Freemarker 2.3.25

The FreeMarker library included in FreeMarker IDE was updated to the latest available version, 2.3.25.

flth / fltx file extensions added

The new flth and fltx extensions have been added and associated with FreeMarker IDE. flth stands for HTML content, whereas fltx stands for XML content.

Overhaul of the plugin template parser

The parser that FreeMarker IDE uses to extract IDE-centric information (needed for syntax highlighting, related tag highlighting, auto-completion, outline view, etc.) was overhauled. Several bugs were fixed, and support for the newer template language features was added. Also, the syntax highlighting is now more detailed inside expressions.

Related tag background highlighting fixed

Fixed the issue where the (by default) yellow highlighting of related FTL tags would shift away from under the tag as you type.

Showing whitespace, block selection mode

The standard "Show whitespace characters" and "Toggle block selection mode" icons are now available when editing a template.

Improved automatic finishing of FreeMarker constructs

When you type <#, <@, ${ or #{, the FreeMarker editor now automatically closes them.

Error positions links on the console

When a FreeMarker exception is printed to the console, the error position in it is a link that navigates to the error. This worked long ago, but had been broken for quite a while.

Fixed auto-indentation

When hitting Enter, the new line sometimes did not inherit the indentation of the previous line.

Updated the "database" used for auto completion

Auto completion now knows all directives and "built-ins" up to FreeMarker 2.3.25.

What is next?

Having JBoss Tools 4.4.1 and Developer Studio 10.1 out, we are already working on the next maintenance release for Eclipse Neon.1. Enjoy!

Jeff Maury
Posted 6 days ago by nore...@blogger.com (Christian Pontesegger)
Internal browser support often does not work out of the box on Linux. You can check the status by opening your Preferences/General/Web Browser settings. If the radio button "Use internal web browser" is enabled (not necessarily activated), internal browser support is working; otherwise it is not. Most annoyingly, without internal browser support the help hovers in your text editors use a fallback mode that does not render links or images. To solve this issue you may first check the SWT FAQ. For me, working on Gentoo Linux, the following command fixed the problem:

emerge net-libs/webkit-gtk:2

It is important not to install only the latest version of webkit-gtk, which will not be recognized by Eclipse. After installation, restart Eclipse and your browser should work. Verified on Eclipse Neon.
Posted 7 days ago
Angular 2 is a framework for building desktop and mobile web applications. After hearing rave reviews about Angular 2, I decided to check it out and take my first steps into modern web development. In this article, I'll show you how to create a simple master-details application using Angular 2, TypeScript, Angular CLI and Eclipse […] The post Creating My First Web App with Angular 2 in Eclipse appeared first on Genuitec.
Posted 8 days ago
Eclipse 4.7 M2 is out with a focus on usability. From simplified filter functionality in the Problems, Bookmark and Task views, improved color usage for popups, simplified editor assignments for file extensions, enhancements to quick access, a configurable compare direction in the compare editor, and more, you will find lots of nice goodies which will increase your love for the Eclipse IDE. The background jobs API has also been improved, and jobs still run fast even if you do a lot of status updates in your job implementation. Check out the Eclipse 4.7 M2 New and Noteworthy for the details.
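As an illustration of the kind of job implementation the announcement refers to, here is a minimal sketch using the Eclipse Jobs API; the job name and the amount of work are invented for the example, and the point is simply a job that reports many fine-grained status updates through its progress monitor:

import org.eclipse.core.runtime.IProgressMonitor;
import org.eclipse.core.runtime.IStatus;
import org.eclipse.core.runtime.Status;
import org.eclipse.core.runtime.jobs.Job;

public class StatusHeavyJob {

    public static void schedule() {
        Job job = new Job("Status-heavy background work") {
            @Override
            protected IStatus run(IProgressMonitor monitor) {
                monitor.beginTask("Crunching", 10000);
                for (int i = 0; i < 10000; i++) {
                    if (monitor.isCanceled()) {
                        return Status.CANCEL_STATUS;
                    }
                    // Frequent, fine-grained progress updates.
                    monitor.worked(1);
                }
                monitor.done();
                return Status.OK_STATUS;
            }
        };
        job.schedule();
    }
}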
Posted 8 days ago
Just a basic intro to Eclipse, aimed at people who are new to Java. Covers creating a new project, debugging, common shortcuts/navigation, and git.

Workspace

A workspace contains your settings, e.g. your keyboard shortcut preferences and the list of your open projects. You can have multiple workspaces and switch between them via File -> Switch Workspace.

Projects

A project is essentially an application, or a library used by an application. Projects can be opened or closed. Content of closed projects doesn't appear in searches.

Hello world Project

To run some basic Java code: File -> New -> Java Project. Give the project some name -> Finish. Right click on src -> New -> Class. Give your class some name and check "public static void main(String[] args)". Add a "Hello World" print line: System.out.println("Hello world"); Right click on "SomeName.java" -> Run As -> Java Application. The output is printed in the Console. Next time you can run the file via the run button, or via Ctrl+F11.

Debugging

Set a breakpoint by double clicking on the line numbers in the margin, then click on the bug icon, or right click and choose "Debug As" -> "Java Application". For more info on debugging, head over to Vogella: http://www.vogella.com/tutorials/EclipseDebugging/article.html

Switching perspectives

Eclipse has the notion of perspectives. One is for Java development, one for debugging (others could be C++ development, task planning, etc.). It's basically a customisation of features and layout. When you finish debugging, you can switch back to the Java perspective.

Common keyboard shortcuts

Ctrl+/ – comment code with "//"
Ctrl+Shift+/ – comment code with "/* … */"
Ctrl+F11 – run the last run configuration
Ctrl+Shift+L – keyboard reminder cue sheet (type to search); pressing Ctrl+Shift+L again opens the keyboard preferences
Ctrl+O – Java quick method outline. Note: regex and case search work, e.g. "*Key" will find "getBackgroundColorKey()", and so will "gBFCK".
Ctrl+Shift+R – search for a resource (navigate between your classes)
Ctrl+Shift+F – automatically format the selected code (or all code if no block is selected)

For more on shortcuts, head over to Vogella: http://www.vogella.com/tutorials/EclipseShortcuts/article.html

Source code navigation

Right click on a method/variable to bring up a context menu, and from there select:

Open Declaration (F3): This is one of the most used functions. It's a universal "jump to where the method/variable/class/constant is defined".

Open Call Hierarchy: See where a variable or method is called. Tip: for variables, you can narrow down the field access so that it only shows where a field is read/written.

Quick Outline (Ctrl+O): The quick outline is a quick way to find a function in your class. It supports regex and case search, e.g. "*size" will find any method with 'size' in it and "cSI" will find 'computeSizeInPixels'. Tip: press Ctrl+O again and you will also be shown methods inherited from parent classes.

Navigate to super/implementation (Ctrl+click): Sometimes you may want to see which sub-classes override a method. You can hover over a method, Ctrl+click it, then click "Open Implementation". You will be presented with a list of sub-implementations. You can similarly navigate to parent classes.

Code completion

Code completion predicts variable and method names. Start typing something and press Ctrl+Space. It can complete by case as well, e.g. if you type "mOF" and press Ctrl+Space, it will expand to "myOtherFunction()".

Templates

Typing "System.out.println();" is tedious.
Instead you can type "syso" and then press Ctrl+Space, and Eclipse will fill in the template code. You can find more on templates in the Eclipse preferences.

Git integration

99% of my git workflow happens inside Eclipse. You will want to open three useful views via Window -> Show View -> Other…: Team -> History, Git -> Git Repositories, and Git -> Git Staging. You can manage git repositories in the "Git Repositories" view. You can add changed files in the "Git Staging" view via drag and drop, and fill in the commit message. You can view your changes by double clicking on the files. In the "History" view, you can create new branches, cherry-pick commits, check out older versions, compare current files to previous versions, and so on.

More on Eclipse

If you want to know more about the Eclipse interface, feel free to head over to Vogella's in-depth Eclipse tutorial: http://www.vogella.com/tutorials/Eclipse/article.html

Also feel free to leave comments with questions.
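For reference, here is roughly what the class from the Hello World walkthrough above ends up looking like once the wizard has generated the main method stub; the class name SomeName follows the placeholder used in the post:

public class SomeName {

    public static void main(String[] args) {
        // Printed to the Console view when run as a Java Application.
        System.out.println("Hello world");
    }
}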
Posted 9 days ago
It's been a crazy week if you follow the ide-dev mailing list at Eclipse. We've had many posts over the years discussing our competitive relationship with IntelliJ, the depression that sets in when we try to figure out how to make Eclipse better so people don't hate on it so much, and then how nothing changes. This time, though, the discussion was sparked by what seemed to be an innocent post by Mickael Istria about yet another claim that IntelliJ has better content assist (which, from what I've seen, it actually does). It grew into a huge conversation, with many Eclipse contributors chiming in with their thoughts about where we are with the Eclipse IDE and what needs to be done to make things better. A great summary of the last few days has been captured in a German-language Jaxenter article. The difference this time is that it has actually sparked action. Mickael, Pascal Rapicault, and others have switched some of their focus to the low-hanging user experience issues and are providing fixes for them. The community has been activated and I love seeing it. Someone asked why the Architecture Council at Eclipse doesn't step in and help guide some of this effort, and after discussing it at our monthly call, we've decided to do just that. Dani Megert and I will revive the UI Guidelines effort, update the current set, and extend it to more general user experience guidance. We'll use the UI Best Practices group mailing list to hold public discussions to help with that. Everyone is welcome to participate. And I'm sure the ide-dev list will continue to be busy as contributors discuss implementation details. Eclipse became the number one Java IDE with little marketing. Back in the 2000s developers were hungry for a good Java IDE, and since Eclipse was free, easy to set up (yes, unzipping the IDE wasn't that bad an experience), worked well, and had great static analysis and refactoring, they fell in love with it. Other IDEs have caught up and in certain areas passed Eclipse, and, yes, IntelliJ has become more popular. It's not because of marketing. Developers decide what they like to use by downloading it and trying it out. As long as we keep our web presence in shape so that developers can find the IDE, especially the Java one, and then keep working to make it functionally the best IDE we can, we'll be able to continue to serve the needs of developers for a long time. Our best marketing comes from our users. That's the same with all technology these days. I'd rather hear from someone who's tried Docker Swarm than believe what the Docker people are telling me (for example). That's how we got Eclipse to number one, and where we need to focus to keep the ball rolling. And as a contributor community, we're working hard to give them something good to talk about.
Posted 11 days ago
Over the last few days, a large group of my minions and admirers met in Sweden at EMD2017 to talk about me…in all my incarnations. One of the most polarizing discussions was about whether I should stay graphical or whether I also needed to be textual. For those who do not know, I am a UML-based modeling tool and therefore graphical by nature. However, some of my minions think that I would be more usable if I also allowed them to create/edit models using text (just like this posting, but in a model instead of a blog post). During the meeting, there was a lot of discussion about whether it was a good idea or not, whether it was useful or not, whether I was even able to support this! The main point made by the pro-text minions was that many things are simply easier to do by writing text rather than drawing images, but that both could be supported. Other minions were saying that it was simply impossible. Now, this is all a bit strange to me. After all, when I look at my picture, I am an image, but then I can express myself in text (again, like in this posting). Regardless, any new capability given to me makes me happy! And I wonder how I would look as text… I think I like myself better as an image, but it's good to have a choice. In the end, I trust my minions. Filed under: Papyrus, Textual Tagged: modeling, Textual, uml
Posted 11 days ago
There is a lot of documentation about the Eclipse "Plug-in Spy" feature (Plug-in Spy for UI parts or Eclipse 3.5 - Plug-in Spy and menus). In my opinion one piece of information is missing: what you need to install to use the Spy feature in your Eclipse Neon IDE. Here is my small how-to.

Select "Install New Software…" in the "Help" menu. In the dialog, switch to the "The Eclipse Project Updates" update site (or enter its location http://download.eclipse.org/eclipse/updates/4.6). Filter with "PDE" and select "Eclipse PDE Plug-in Developer Resources". Validate your choices with "Next" and "Finish"; Eclipse will install the feature and ask for a restart.

Figure 1. Install new Software in Eclipse

If you prefer the Oomph way, you can paste the snippet contained in Listing 1 into your installation.setup file (open it with the menu: Navigate ▸ Open Setup ▸ Installation). Your Oomph editor should look like Figure 2. Save the file and select "Perform Setup Task…" (in the Help menu). Oomph will update your installation and ask for a restart.

Figure 2. Oomph setup Editor: installation.setup File

In both cases, after the restart you can press Alt+Shift+F1 and use the Plug-in Spy as in Figure 3.

Figure 3. Plug-in Spy in Eclipse Neon