Wednesday, March 25th, 2009

I finally got Ignite Realtime’s Spark to work. I don’t particularly like Spark, but it’s a necessity, and I’m sure others have had trouble with it too.

As some readers might be aware, I’m using 64-bit Arch Linux. Spark runs on top of a JRE, independent of the base platform. Therefore, this shouldn’t be an issue. However, Spark appears to come bundled with a 32-bit JRE.

After a lot of hassle, I eventually figured out that all I had to do was hide or remove (rename or delete) the bundled JRE. That way, Spark’s startup script wouldn’t find the bundled JRE and would be forced to fall back to the one installed on the system. I had previously installed openjdk, an open-source JRE, from Arch’s [extra] repository.

There also happens to be a minor bug in the startup script: it looks for a folder called “windows” when there’s clearly no such folder, only one named “linux”. Go figure.
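
If you want to see it for yourself, a quick grep of the startup script (this is against the 2.5.8 tarball) should show the offending path:

grep -n 'lib/windows' Spark/Spark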

Anyway, here’s the gist of the installation if you’re doing it manually on 64-bit and you already have a JRE (such as openjdk) installed on your system:

# Download and unpack Spark 2.5.8 into ~/src
mkdir -p ~/src
cd ~/src
wget http://download.igniterealtime.org/spark/spark_2_5_8.tar.gz
tar -zxvf spark_2_5_8.tar.gz
# Hide the bundled 32-bit JRE so the startup script falls back to the system one
mv Spark/jre Spark/jre.not
# Point the startup script at lib/linux instead of the non-existent lib/windows
sed -i 's/\/lib\/windows/\/lib\/linux/g' Spark/Spark
# Move the unpacked tree into /opt
sudo mkdir -p /opt
sudo mv Spark /opt
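
After that, launching Spark from the install directory should be all that’s left; on my setup, running the startup script that ships in the tarball does the trick:

cd /opt/Spark
./Spark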
Category: linux, networking, web

6 Responses

  1. Bel

    Thank you for sharing this. This is what I want, a step by step, because I am new to Linux. I don’t know the meaning of those commands. I performed all the commands; what’s the next step after that? Can I run Spark now? Thank you so much!!!

  2. Bel

    I was able to run Spark after this. The window appeared! Great!

    /opt/Spark$ ./starter
    testing JVM in /usr
    /opt/Spark$ sh -x ./Spark

    But when I entered jabber.apac.local in the server box, it can’t connect. On Windows I am using the same server and it’s working. Is there anything that I can do here?

    Please advise. Thank you so much!!!!

  3. Tricky

    Hi Bel

    I’m afraid that I’m not familiar with troubleshooting Spark once it is actually running since I’ve never had any similar issues. I’m happy I at least helped one person get a few steps further. 🙂

    I think the next step would be to check whether a firewall-type issue is preventing your requests from leaving the Linux desktop. If that is not the issue, I’m not at all certain how to proceed.
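
    For example, something like this from a terminal should show whether the server’s XMPP port is reachable at all (5222 is the standard XMPP client port; your server may use something different):

    telnet jabber.apac.local 5222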

  4. Bel

    Thanks Tricky! I’m gonna ask our security team if they are blocking something when users are on a Linux box. I tried Pidgin as well and it’s not working. Thank you so much again!!!

  5. Bel

    Here I am again! 🙂 I was able to get it working, Tricky! But using Pidgin. Looks like there is a DNS issue: in the server box I entered the actual IP and yes, it worked. But through the DNS name? No joy.

    I have an issue now with Spark. Once I disable OpenJDK in the system software, I can’t run Spark through the terminal. It’s because my VPN will not work if it’s enabled. I don’t know what happened. I was getting session timed-out when connecting to the Juniper VPN. It was working OK, but just today when I opened it, it started timing out, and what I found can fix that issue is disabling the JDK. 🙁 That worked, but Spark doesn’t!

    Any ideas? Here is my java -version output:

    java version “1.6.0_20”
    OpenJDK Runtime Environment (IcedTea6 1.9.1) (6b20-1.9.1-1ubuntu3)
    OpenJDK Server VM (build 17.0-b16, mixed mode)

    Many thanks!!!

  6. Bel

    OK Tricky, it’s working now! Hehe. Spark and the VPN don’t have any conflict now. What I did was:

    apt-get autoremove

    sudo apt-get install openjdk-6-jdk

    then restarted, and it worked OK! Thank you so much!!! God bless!
