Wednesday, March 25, 2009 | Author:

I finally got Ignite Realtime's Spark working. I'm not particularly fond of Spark, but it's a necessity, and I'm sure other people have had trouble with it too.

As some readers may know, I use 64-bit Arch Linux. Spark runs on top of a JRE, independent of the underlying platform, so that shouldn't be a problem. However, Spark seems to ship with a bundled 32-bit JRE.
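
If you want to verify that for yourself once the archive is unpacked, something like the following should report a 32-bit ELF binary (assuming the usual jre/bin/java layout inside the bundled runtime):

file Spark/jre/bin/java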

After a lot of trouble, I finally figured out that all I needed to do was mask or remove (rename or delete) the bundled JRE. That way, Spark's startup script won't find the bundled JRE and will be forced to look for one installed on the system. I had previously installed OpenJDK, an open-source JRE, from Arch's [extra] repository.
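
If OpenJDK is in place, the system Java should already be on the PATH. A quick sanity check (your exact output will differ):

which java
java -version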

There also happens to be a small bug in the startup script: it looks for a folder named "windows" when there is clearly no such folder, only one named "linux". Go figure.
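
Once you've unpacked the tarball (see below), you can see the offending reference for yourself with something along these lines:

grep -n 'lib/windows' Spark/Spark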

Anyway, here is the gist of the install if you are doing it manually on 64-bit and already have a JRE (such as OpenJDK) installed on the system:

mkdir -p /src
cd /src
wget http://download.igniterealtime.org/spark/spark_2_5_8.tar.gz
tar -zxvf spark_2_5_8.tar.gz
mv Spark/jre Spark/jre.not                                 # hide the bundled 32-bit JRE so the script falls back to the system one
sed -i 's/\/lib\/windows\//\/lib\/linux\//g' Spark/Spark   # the startup script looks for lib/windows; point it at lib/linux instead
mkdir -p /opt
mv Spark /opt
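
From there, launching it should just be a matter of running the startup script from /opt/Spark:

cd /opt/Spark
./Spark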

6 Responses

  1. Bel

    Thank you for sharing this. This is what I want, a step-by-step, because I am new to Linux. I don't know what those commands mean. I performed all the commands; what's the next step after that? Can I run Spark now? Thank you so much!!!

  2. Bel

    I was able to run Spark after this. The window appeared! Great!

    /opt/Spark$ ./starter
    testing JVM in /usr
    /opt/Spark$ sh -x ./Spark

    But when I entered jabber.apac.local in the server box, it couldn't connect. On Windows I use the same server and it works. Is there anything I can do here?

    Please advise. Thank you so much!!!!

  3. Tricky

    Hi Bel

    I’m afraid that I’m not familiar with troubleshooting Spark once it is actually running since I’ve never had any similar issues. I’m happy I at least helped one person get a few steps further. 🙂

    I think the next step would be to check whether a firewall-type issue is preventing your requests from leaving the Linux desktop. If that is not it, I'm not at all certain how to proceed.
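
    For what it's worth, something along these lines (assuming the default XMPP port, 5222) would at least tell you whether the machine can reach the server at all:

    nc -zv jabber.apac.local 5222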

  4. Bel

    Thanks Tricky! I'm gonna ask our security team if they are blocking something when users are on a Linux box. I tried Pidgin as well and it's not working. Thank you so much again!!!

  5. Bel

    Here I am again! 🙂 I was able to get it working, Tricky! But using Pidgin. Looks like there is a DNS issue: in the server box I entered the actual IP and yes, it worked. But through the DNS name? No joy.

    I have an issue now with Spark. Once I disable OpenJDK in the system software, I can't run Spark through the terminal. It's because my VPN will not work if it is enabled. I don't know what happened. I was getting session timed-out errors when connecting to the Juniper VPN. It was working OK, but just today when I opened it, it kept timing out, and from what I researched, the fix for that issue is to disable the JDK. 🙁 That worked, but then Spark doesn't!

    Any ideas? Here is my java -version output:

    java version "1.6.0_20"
    OpenJDK Runtime Environment (IcedTea6 1.9.1) (6b20-1.9.1-1ubuntu3)
    OpenJDK Server VM (build 17.0-b16, mixed mode)

    Many thanks!!!

  6. Bel

    OK Tricky, it's working now! Hehe. Spark and the VPN don't have any conflict now. What I did was:

    apt-get autoremove

    sudo apt-get install openjdk-6-jdk

    then restart, and it worked OK! Thank you so much!!! God bless!
