CrossSpider
Revision as of 20:16, 23 October 2016
CrossSpider is a web frontend for DX cluster software. Some DX clusters use a Java applet client, but that technology is rather obsolete now, so I wrote this small frontend for OK0DXI. The name CrossSpider refers to DX Spider, which is used here.
- CrossSpider provides only read-only access to the DX cluster. It does not handle user connections; it connects as a special user which has no filters set.
- Band filtering is done on the client, so all spots are transferred over the network.
How it works
On the DX cluster computer, a small Perl daemon, robot.pl, is running. It connects to the DX cluster via telnet (actually using the netcat program).
The daemon inserts all spots and other messages into a PostgreSQL database.
The next part is a web page which uses HTML5/Ajax to fetch the latest spots from PostgreSQL.
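The spot lines that robot.pl reads from the cluster look roughly like the sample below. A minimal shell sketch of pulling the spotter, frequency and spotted call out of such a line — the sample line and the field positions are assumptions for illustration, not taken from robot.pl:

```shell
# Hypothetical spot line as a DX cluster sends it to a telnet client
line='DX de OK1ZIA:    14074.0  DL1ABC       FT8 tnx qso                    2016Z'

# Spotter call is between "DX de " and the colon
spotter=$(printf '%s\n' "$line" | sed -n 's/^DX de \([A-Z0-9/]*\):.*/\1/p')
# Frequency and spotted call are the 4th and 5th whitespace-separated fields
qrg=$(printf '%s\n' "$line" | awk '{print $4}')
call=$(printf '%s\n' "$line" | awk '{print $5}')

echo "$spotter $qrg $call"
```

In the real daemon these values end up in the stamp/call/qrg/spot columns of the lines table.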
First set up the PostgreSQL database. In this text the database name is crossspider and the user is robot:
# apt-get install libdbd-pg-perl postgresql postgresql-client
# su - postgres
$ createdb crossspider
$ psql crossspider
crossspider=# create table lines(id serial, stamp timestamp with time zone, call text, qrg numeric(10,1), spot text);
crossspider=# create user robot with password '01001';
crossspider=# grant all on lines TO robot;
crossspider=# grant all on lines_id_seq TO robot;
crossspider=# \q
$ exit
#
Allow access for user robot. Edit /etc/postgresql/*/main/pg_hba.conf and add this line:
# TYPE  DATABASE     USER   ADDRESS  METHOD
local   crossspider  robot           password
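After editing pg_hba.conf, PostgreSQL has to re-read it before the new rule takes effect. A sketch of reloading the server and checking that robot can actually log in (exact service name and socket location may differ on your distribution):

```shell
# Re-read pg_hba.conf without restarting the server
systemctl reload postgresql

# Verify the robot user can connect and read the lines table;
# you will be asked for the password from the CREATE USER step
psql -U robot -d crossspider -c 'SELECT count(*) FROM lines;'
```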
The name robot is a reference to PE1NWL's DX robot.
Of course you have to install Perl first, but it is probably already part of your distribution. Copy robot.pl somewhere, e.g. /usr/local/sbin.
If you use a current Linux distribution, you probably use systemd for service management. Its config directory is /etc/systemd/system. Create a file there named robot.service:
[Unit]
Description=DX-robot daemon
# depends on how you start the DX Spider
Requires=spider.service
After=network.target spider.service

[Service]
Type=simple
User=nobody
StandardOutput=journal
StandardError=journal
Restart=always
RestartSec=3
ExecStart=/usr/bin/perl /usr/local/sbin/robot.pl

[Install]
WantedBy=multi-user.target
Now set up the service:
# systemctl daemon-reload
# systemctl enable robot.service
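Enabling the unit only arranges the start at boot; to run it right away and see whether it stays up, something like this works (the journalctl call assumes the StandardOutput=journal setting from the unit above):

```shell
# Start the daemon now and check its state
systemctl start robot.service
systemctl status robot.service --no-pager

# Inspect the last log lines in case the database or cluster connection fails
journalctl -u robot.service -n 20 --no-pager
```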
If you use an older distro, you can use the traditional /etc/rc.local or similar:
# /etc/rc.local
nohup /usr/bin/perl /usr/local/sbin/robot.pl &
Edit the robot.pl file and modify the head part to set up access to the database and the DX cluster. Of course you also have to enable this access in your DX cluster software.
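Once the daemon runs, a quick way to confirm that spots are really being inserted is to look at the newest rows directly. A sketch, using the table and user created above:

```shell
# Show the five most recent spots stored by robot.pl
psql -U robot -d crossspider \
     -c 'SELECT stamp, call, qrg, spot FROM lines ORDER BY id DESC LIMIT 5;'
```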
If you do not have any web server with PHP support and want to use Apache, run:
# apt-get install apache2 libapache2-mod-php5 php5-pgsql
Download all the *.txt files, copy them to your website and rename them to *.php. (The files on the download server cannot be named *.php, because otherwise you would get the output of the PHP scripts instead of their source code.) Edit the first part of ajax.php and set up access to the database.
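To check that the PHP side can reach the database, fetch the Ajax endpoint once by hand — the URL below assumes the files were copied to the web root of a local Apache; adjust the host and path to your setup:

```shell
# Fetch the first lines of the Ajax response; a PHP or database error
# will show up here instead of spot data
curl -s 'http://localhost/ajax.php' | head
```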
I'm not going to develop new complex features, at least not per-user access to the DX cluster. If you want a new feature, the best way is to write it and send me a patch or the affected files.
73 Lada, OK1ZIA