Spark + Openfire Secondary Development


Spark + Openfire Secondary Development (1)


1. Preparations

Download Openfire 3.6.4 from the official website and use SVN to check out the source code of Openfire, Spark, and SparkWeb.

The official website address is as follows:

http://www.igniterealtime.org/downloads/index.jsp

Note that the latest Spark version on the official website is 2.5.8; it is best to use JDK 1.6 as its Java environment.

2. Environment Setup: Spark Source Code Installation and Configuration

Double-click openfire_3_6_4.exe to install Openfire. The installation process is straightforward.

In this example, the development environment is based on Eclipse.

1) Select File > New > Project > Java Project.

Enter the project name spark.

Under Contents, select "Create project from existing source" and add the folder that contains the Spark source.

 

Click Finish.

2) Build Spark:

Click Window > Show View > Ant.
Right-click in the Ant panel and choose Add Buildfiles.
Expand the spark build folder, select build.xml, and click OK.
In the Ant panel, expand spark and double-click the "release" target. After a while, "BUILD SUCCESSFUL" is reported.

3) Run Spark:

Click Run > Open Debug Dialog.... The Run window appears.
Select "Java Application", right-click, and choose New.
On the Main tab, replace New_configuration with spark.
Under Project, click Browse, select spark, and click OK.
Under Main class, click Search, select the main class org.jivesoftware.launcher.Startup, and click OK.
We recommend selecting "Stop in main".
Click the Classpath tab and select User Entries so that the Advanced... button becomes available. Click Advanced, select Add Folders in the Advanced Options window, and click OK. In the Folder Selection window, select the spark/src/resources folder and click OK.
Select the Common tab and check the Debug and Run boxes.
Click Apply, and then click Close.

4) Select spark under Run. Spark starts successfully.

Spark + Openfire Secondary Development (2)

 

1. Openfire Source Code Deployment

Decompress openfire_src_3_6_4.tar.gz and rename the folder openfire_src. For the source code deployment method, see the Spark source code deployment steps in Spark + Openfire Secondary Development (1).

2. Openfire Run Configuration

1) Click Run > Open Debug Dialog.... The Run window appears.
2) Select "Java Application", right-click, and choose New.
3) On the Main tab, replace New_configuration with openfire.
4) Under Project, click Browse, select openfire, and click OK.
5) Under Main class, click Search, select the main class ServerStarter, and click OK. We recommend selecting "Stop in main".

6) On the Arguments tab, add the following to VM arguments (a short note on this property follows the list):

-DopenfireHome="${workspace_loc:openfire}/target/openfire"

7) Click the Classpath tab and select User Entries so that the Advanced... button becomes available. Click Advanced, select Add Folders in the Advanced Options window, and click OK. In the Folder Selection window, select the openfire/src/i18n folder; in the same way, add the openfire/src/resources/jar folder, and click OK.
8) Select the Common tab and check the Debug and Run boxes.
9) Click Apply, and then click Close.
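Note on step 6: the value supplied with -DopenfireHome is an ordinary JVM system property that the server uses to locate its home directory. As a purely illustrative sketch (not Openfire's actual startup code, and with a hypothetical class name), reading such a property looks like this:

Java code

public class OpenfireHomeDemo {
    public static void main(String[] args) {
        // Reads the directory passed on the command line via -DopenfireHome=...
        // System.getProperty returns null when the property is not set.
        String openfireHome = System.getProperty("openfireHome");
        System.out.println("openfireHome = " + openfireHome);
    }
}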

3. Compile

1) Copy the openfire_i18n_en.properties file from the openfire/src/i18n folder and the admin-sidebar.xml file from the openfire/src/resources/jar folder to the src/bin directory, then open the Ant panel and run the openfire [default] target.

4. Run

Run Openfire. The following information is displayed on the console:

Openfire 3.6.4 [Mar 15,201 am]
Admin console listening at http://127.0.0.1:9090

Open the address shown above in a browser to manage and configure Openfire.
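As a quick sanity check before opening the browser, you can probe the admin console port from Java. The snippet below is only illustrative (the class name is hypothetical) and assumes the default address shown above:

Java code

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class AdminConsoleCheck {
    public static void main(String[] args) {
        // Default admin console address printed by Openfire on startup.
        InetSocketAddress console = new InetSocketAddress("127.0.0.1", 9090);
        Socket socket = new Socket();
        try {
            socket.connect(console, 2000); // two-second connect timeout
            System.out.println("Admin console is reachable at http://127.0.0.1:9090");
        } catch (IOException e) {
            System.out.println("Admin console is not reachable yet: " + e.getMessage());
        } finally {
            try {
                socket.close();
            } catch (IOException ignored) {
                // nothing to do
            }
        }
    }
}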

Spark + Openfire Secondary Development (3)


Spark plug-ins are mainly used to extend the client's functionality. The following describes the process of developing a Spark plug-in.

1. The final result is as follows:

A "My Plugin" menu is added to the client; clicking it displays a "Hello" panel.

 

2. Development Process

1) Plug-in Structure

The jar package structure of the plug-in is as follows:

example.jar
|- plugin.xml   plug-in definition file
|- libs/        all the classes required to run this plug-in


Define your plugin.xml file. Spark automatically reads the plugin.xml file inside the plug-in jar to load the plug-in. An example file follows:

 

<!-- Define your plugin -->
<plugin>
    <name>Examples Plugin</name>
    <version>1.0</version>
    <author>Derek DeMoro</author>
    <homePage>http://www.jivesoftware.com</homePage>
    <email>derek@jivesoftware.com</email>
    <description>Shows some simple ways to create plugins.</description>
    <!-- Plug-in interface implementation class -->
    <class>com.jivesoftware.spark.examples.ExamplePlugin</class>
    <minSparkVersion>2.5.0</minSparkVersion>
</plugin>

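To make the role of the <class> element concrete, here is a minimal, purely illustrative sketch of how a host application could read such a descriptor and instantiate the named class by reflection. It is not Spark's actual loader code, and the file path and class name are assumptions.

Java code

import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class PluginDescriptorDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical location of an extracted plug-in descriptor.
        File descriptor = new File("plugins/example/plugin.xml");

        // Parse plugin.xml and read the <class> element, which names the Plugin implementation.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(descriptor);
        String className = doc.getElementsByTagName("class")
                .item(0).getTextContent().trim();

        // Instantiate the named class by reflection, as a plug-in host might do.
        Object plugin = Class.forName(className).newInstance();
        System.out.println("Loaded plug-in class: " + plugin.getClass().getName());
    }
}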

2) Develop your plug-in

Directory structure of the plug-in source code:

example
|- src         source code
|- lib         additional jar files required by this plug-in
|- resources   images and other resource files
|- build       build folder
|- build.xml   Ant configuration file for packaging the plug-in


 

Plug-in implementation class: your class must implement the Plugin interface provided by Spark and implement its methods.

Java code

package org.jivesoftware.spark.examples;

import org.jivesoftware.spark.plugin.Plugin;

/**
 * Demonstrates the life-cycle methods a plug-in implements.
 */
public class ExamplePlugin implements Plugin {

    /**
     * Called after the plug-in is installed, to initialize it.
     */
    public void initialize() {
        System.out.println("Welcome to Spark");
    }

    /**
     * Called when Spark shuts down, to persist information or release resources.
     */
    public void shutdown() {
    }

    /**
     * Called when the user asks to shut down Spark; return true to allow it.
     */
    public boolean canShutdown() {
        return true;
    }

    /**
     * Called when the plug-in is uninstalled, to remove everything the plug-in
     * left on disk, such as files, images, and other components.
     */
    public void uninstall() {
        // Remove all resources belonging to this plugin.
    }
}


The development process is to be continued.
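Until then, the sketch below is only a rough, assumption-based illustration of how initialize() might add the "My Plugin" menu and pop up a "Hello" panel shown in the screenshot. It assumes that SparkManager.getMainWindow() returns the client's main Swing frame (a JFrame subclass) whose menu bar can be extended; the class name, menu text, and dialog are hypothetical, not Spark's official example code.

Java code

package org.jivesoftware.spark.examples;

import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import javax.swing.JFrame;
import javax.swing.JMenu;
import javax.swing.JMenuBar;
import javax.swing.JMenuItem;
import javax.swing.JOptionPane;

import org.jivesoftware.spark.SparkManager;
import org.jivesoftware.spark.plugin.Plugin;

public class MyMenuPlugin implements Plugin {

    public void initialize() {
        // Assumption: the main window is a Swing JFrame whose menu bar we can extend.
        final JFrame mainWindow = SparkManager.getMainWindow();
        JMenuBar menuBar = mainWindow.getJMenuBar();

        // Add a "My Plugin" menu with a single item.
        JMenu myMenu = new JMenu("My Plugin");
        JMenuItem helloItem = new JMenuItem("Hello");
        helloItem.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent e) {
                // Show a simple "Hello" panel when the item is clicked.
                JOptionPane.showMessageDialog(mainWindow, "Hello", "My Plugin",
                        JOptionPane.INFORMATION_MESSAGE);
            }
        });
        myMenu.add(helloItem);

        menuBar.add(myMenu);
        menuBar.revalidate();
    }

    public void shutdown() {
    }

    public boolean canShutdown() {
        return true;
    }

    public void uninstall() {
    }
}

Packaged in a jar together with a plugin.xml whose <class> element names this class, such a plug-in would be picked up by Spark at startup.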

 
