About message filter with sparkplug

Hi there,

I have a strange problem with message filters in a sparkplug. Basically, what I want is to append some text to the message body on the sender side and remove it again on the receiver side. The code looks like the following.

private void installMessageFilter() {
    ChatManager chatManager = SparkManager.getChatManager();

    chatManager.addMessageFilter(new MessageFilter() {

        XMPPConnection conn = SparkManager.getSessionManager().getConnection();

        public void filterOutgoing(ChatRoom room, Message message) {
            // Append the marker text to every outgoing message.
            String before = message.getBody();
            message.setBody(before + "some text");
        }

        public void filterIncoming(ChatRoom room, Message message) {
            // Strip the marker text from incoming messages.
            String before = message.getBody();
            int i = before.lastIndexOf("some text");
            message.setBody(before.substring(0, i));
        }
    });
}

What I have observed:

a) Both filters never work at the same time. In one session, filterOutgoing always works and filterIncoming never does. If I restart Spark, it might be the other way around.

b) filterOutgoing only filters certain messages. For example, if I say "hi", it won't append the text; if I say "new" or something else, it may append it. Whichever way it decides for a given message, it always does the same thing for that message.

c) In both filters, if I don't read the message but just set the body to something directly, it seems to work, i.e. calling message.setBody() without calling getBody() first.
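
For reference, the variant in (c) that does seem to work looks roughly like this (the literal replacement text is just an example):

public void filterOutgoing(ChatRoom room, Message message) {
    // Overwrite the body directly, without calling getBody() first -- this path seems to work.
    message.setBody("replacement text");
}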

I've been working on this for a while and am very frustrated now. Could someone help me, please? Thanks a lot.

BTW, the Spark I'm using is 2.5.8, and the downloaded sparkplug kit is 2.0.7. I also found that the Sparkplug development guide documentation is somewhat outdated. For example, it says:

======

To set up a project to run Spark within your IDE, you will need to do the following:

  • It is required that you use the 1.4 JRE to build Spark.

  • Add all *.jar files in the Sparkplugs/spark/lib and Sparkplugs/spark/lib/windows directories to your classpath.

  • Add the resource directory (Sparkplugins/spark/resource) to your classpath for the native libraries.

======

It turns out that all three bullets are unnecessary, at least for Spark 2.5.8, although this is slightly off topic.