
December 2005

The on-line magazine of the Burnham Beeches Radio Club.

Welcome to the December 2005 edition of Beechlog.

A long time since the last issue! What with selling the house, I haven't had much inspiration for writing Beechlog. The house sale fell through at the last hurdle: on the day contracts were being exchanged, the buyer at the bottom of the chain dropped out! So it was back to square one. In the meantime I have paid a visit to Princess Margaret Hospital, from which I emerged yesterday, rather sore. As walking is a rather delicate process at the moment, what better opportunity to write this stuff! If my aerials were still up, I could have gone onto eighty metres and told everyone about the op, as I gather that's what the band's for!
Incidentally, I've changed the email address below as the spam was getting a bit too much. The new one is constructed by javascript so it might be more difficult for the spammers to read.

Roger GØHZK, Editor


DVD-RAM
Locator calculating
Ofcom's Surveys


I wrote last time about the way computer data can be preserved by duplicating it onto additional storage media. People still assume that their data is safe on their hard drives, yet in the past few weeks I have been called out to rescue data from non-functioning systems.

One of these involved a broadband-connected system which had become somewhat infected. I was not surprised, because the ADSL modem supplied by the ISP had absolutely no way of shielding the user from the multitude of risks. It is surely criminal to supply kit with no firewall of any sort. The world must be awash with people with compromised and often unusable computers.

DVD-RAM

I have still been fiddling about, trying to improve methods of backing up my data. Recently I have been looking at DVD-RAM, a system which seems to have been half forgotten by most people. DVD-RAM was designed for storing computer data, but the other DVD formats are much more popular for this job. I suppose it's not surprising, as a DVD-RAM disc costs from about £2, much more expensive than other DVD formats. But there are distinct advantages in spending the extra, although this might not be evident at first.

DVD-RAM discs look the same as DVD-R, although you will see a pattern on the writing side of each disc. Some types come in a plastic housing, although the discs can often be removed for use in equipment that doesn't accept the cartridge. Most DVD drives don't cater for the discs, although some do, particularly LG drives and recent Toshiba laptops. The reason is probably that although all DVD discs look similar, the RAM variety is quite different. The others all write to a spiral track, but DVD-RAM has concentric tracks, rather like a floppy, Zip or hard drive. This permits true random access, which in turn allows independent, near-simultaneous reading and writing. In addition, the discs are usually FAT32 formatted and work in much the same way as hard drives, including a means of marking and dealing with damaged sectors.

This last point is important. It means that duff areas are marked as such during formatting (which takes about 2 minutes), so that the writer avoids them. In addition, most writers verify data as they write, and data that doesn't verify can immediately be written to another sector, just like a hard disk.

Thus the writing integrity is much better than that of any other writeable DVD medium. The cost is that writing is slower. Current discs write at up to 5X speed, although this is effectively halved by the built-in verify process. However, the discs have a longer life, with 100 times as many write cycles as other RW discs. There is also greater reflectivity, so discs should be easier for drives to read. For those of us with a modern operating system, such as Windows XP or a recent Apple Mac OS, no additional software is needed. You just drag and drop. DVD-RAM works just like your hard drive. Of course it is slower, especially if you are copying large amounts of data, although with smaller amounts it seems almost instant. It's just like using a Zip or Jaz drive, but with greater capacity.

It is, of course, possible to get similar behaviour from a DVD+RW disc by using something like InCD. There is some debate about reliability, although I think this was due to early versions of some packet-writing software being a little unstable. While the process seems similar, currently it's a bit of a bodge. You don't get the error mapping and protection of DVD-RAM, which is a much more robust process. And the latter discs are quoted as having 30-year data integrity, although I wouldn't like to guess what will be around in 2035. There is a standard called Mt. Rainier, which is intended to improve DVD+RW error handling. This has been around for a while, although I have yet to see any drives that handle it. DVD-R and +R speeds have now increased to 16X, and the +RW versions, including dual layer, are likely to reach that speed in a matter of months. Much the same goes for DVD-RAM (will there be a dual-layer version?), although being more of a niche product, the discs will probably be hard to find.

These improvements may be overshadowed by the new HD-DVD and Blu-ray drives. These are primarily intended for mass-produced high definition video. Whether writers will be available early on, and whether either will come in a form intended for data, who knows?

Locator calculating

This article has a slight radio connection! Since I published my Javascript map and locator conversion programs on the web, I have received a few enquiries from others interested in map co-ordinates. Many of these relate to integrating the MS and Google world mapping systems with each other and with internet street maps. These systems require different sorts of inputs, so my imperfect code has been incorporated into various other web applications.
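For anyone curious about what such a conversion involves, here is a minimal Javascript sketch of the standard latitude/longitude to Maidenhead locator calculation. The function name is my own invention and this is not the code from the published programs.

```javascript
// Convert latitude/longitude in degrees to a six-character
// Maidenhead locator. A sketch of the standard algorithm only -
// not the code from the published conversion programs.
function latLonToLocator(lat, lon) {
  var x = lon + 180;                       // shift ranges to start at zero
  var y = lat + 90;
  var A = "A".charCodeAt(0), a = "a".charCodeAt(0);
  return String.fromCharCode(A + Math.floor(x / 20)) +       // field: 20 deg lon
         String.fromCharCode(A + Math.floor(y / 10)) +       //        10 deg lat
         Math.floor((x % 20) / 2) +                          // square: 2 deg lon
         Math.floor(y % 10) +                                //         1 deg lat
         String.fromCharCode(a + Math.floor((x % 2) * 12)) + // subsquare: 5 min lon
         String.fromCharCode(a + Math.floor((y % 1) * 24));  //            2.5 min lat
}
```

Feeding in roughly the Burnham Beeches area, latLonToLocator(51.5, -0.6) returns "IO91qm".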

I have also had a few thoughts. Javascript seemed a good idea, as it's cross-platform and well established. But with the plethora of viruses, trojans and other nasty things that can infect computers so easily, there have been many suggestions that you should turn off your browser's scripting. I have also been unhappy with some of my algorithms, which, although standard, are lacking in some areas.

To cater for the perceived problem with scripting, I have decided to try running the number crunching on the web server. There are many ways of doing this, but I have currently been working on Perl code, as this is available on the servers that host Beechlog and my other web sites. And to improve accuracy, I have moved to rather more involved maths to calculate the map references.

Perl is a fairly simple scripting language that's been around for years. Scripts can be started from HTML, and using the POST or GET methods you can pass data entered by users to the script. If you look at the browser address bar after a Google search, you will see how GET works - it simply encodes your data into the web address. POST works another way: long web addresses are not used, and the data is sent to the server in a form not visible to the user. The script reads this data from the web server. POST is generally used where there is a lot of data, as there are sometimes limits to the length of web addresses (I have seen some containing over a thousand characters). In either case, there are standard scripts available which read the data for you, so I didn't spend too much time working out how to do it!
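The GET half can be sketched quite simply - this is roughly what the browser does when it encodes form fields into the address. It's written in Javascript here for familiarity, and the script name and field names are invented for the example:

```javascript
// Build a GET-style web address from a base URL and some form
// fields - the same encoding you see after a Google search.
// "/cgi-bin/locator.pl" and the field names are invented examples.
function buildGetUrl(base, fields) {
  var parts = [];
  for (var name in fields) {
    // Reserved characters (spaces etc.) are percent-encoded.
    parts.push(encodeURIComponent(name) + "=" + encodeURIComponent(fields[name]));
  }
  return base + "?" + parts.join("&");
}
```

So buildGetUrl("/cgi-bin/locator.pl", { gridref: "SU 958 851" }) produces /cgi-bin/locator.pl?gridref=SU%20958%20851, with the spaces percent-encoded.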

Perl scripts can be written in any text editor, although you can buy software dedicated to this task. I use Monkey Editor, a free offering which uses colour highlighting for a variety of programming languages. I have installed ActivePerl and the Apache web server on my PC so that I can test my code easily. Then it's simply a matter of copying the code to the hosting company's web server.

Of course, initially I had problems! The code worked fine on my Apache, but not on my web site. The first thing I realised was that the location of the Perl interpreter differed on each machine. This is defined in the first line of Perl scripts, called the "shebang"! On a host's Unix or Linux server it's written like #!/usr/bin/perl, and on a Windows machine it is #!c:\perl\bin\perl. In fact for a while I found that although the HTML ran from the web server, the Perl was still running on my machine! Having got that out of the way, I then found that it still didn't work! After scratching about for a while I discovered that the problem was one of permissions - I needed to change the attributes of the script on the Linux host to allow anyone to run it. Luckily a right-click in my FTP client revealed how to do this.

After correcting a few typos, it all worked! I found I had to take a bit more care with my programming too. If the code crashes, it's no big deal, but if it gets stuck in a loop, it's easier to kill on a local PC than on a remote server. I assume web hosting companies have some protection against this? Care also needs to be taken to prevent your code compromising the hosting company's web server. This is mainly down to input validation; in my case it means validating the map references before anything else happens.
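As a sketch of the kind of check I mean (in Javascript for familiarity - my actual rules differ), this simply insists the input looks like an OS grid reference before any maths runs:

```javascript
// Accept only strings that look like an OS grid reference: two
// letters then an even number of digits, spaces ignored. A sketch
// only - real OS references also restrict which letters are valid.
function isValidGridRef(s) {
  var t = s.replace(/\s+/g, "").toUpperCase();
  var m = t.match(/^[A-Z]{2}(\d{2,10})$/);
  return m !== null && m[1].length % 2 === 0;
}
```

Anything else - including attempts to sneak script or shell characters through - is rejected before it gets anywhere near the server.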

I've had to mess around with the code a bit. There are differences between Javascript and Perl syntax, and some operators are implemented in a different way. For example, in Perl eight cubed is written 8 ** 3, but in Javascript it's Math.pow(8, 3). All Perl simple (scalar) variables start with a $; it's too easy to miss this off when converting from Javascript, and it doesn't always flag an error!

After Perl has calculated the necessary results, it's just a matter of sending these to the client's browser. The results can be sent as plain text, but it's much nicer to send them as an HTML page. In my case, I wanted to send a form with the input data and results filled in, ready for further actions. This is a bit more involved, and here I saw the reason why Perl variables start with the $ sign. You can simply use a variable name in the HTML, and its content is sent to the browser. This is very powerful, as a Perl variable can hold HTML code, so you can vary the page dynamically. So what of the changes in algorithms?

Changing from one mapping system to another often involves a transformation from one geoid to another. A geoid is simply the shape, usually an ellipsoid, which is used to represent the surface of the part of the earth being mapped. For British OS maps, the Airy 1830 ellipsoid is used, because it nicely fits Great Britain. This differs from other geoids in its shape and orientation. So to change from a GPS reading (which uses a GRS80 or WGS84 geoid) to a British map reference, a transformation is required. In the past I transformed just the main three dimensions, because of the difficulty of finding the data for everything I needed to transform. However, having stumbled on a NATO table of parameters covering most systems used for maps, I have now switched to the full 7 parameters required for a proper Helmert transformation. The added four comprise a rotational correction for each dimension and a scale factor correction. The end result means a maximum error of about 4 metres; in practice it's often much less.
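In code, the small-angle form of the 7-parameter transformation is short. This is a Javascript sketch of the standard formula, not my Perl; the parameter values for any particular pair of datums must be taken from a published table, and none are built in here:

```javascript
// 7-parameter Helmert transformation, small-angle form.
// p is a cartesian position [X, Y, Z] in metres; t holds the
// translations tx,ty,tz (metres), rotations rx,ry,rz (radians)
// and scale correction s (parts per million) from a published table.
function helmert(p, t) {
  var sc = 1 + t.s * 1e-6;  // scale factor from the ppm correction
  return [
    t.tx + sc * (p[0] - t.rz * p[1] + t.ry * p[2]),
    t.ty + sc * (t.rz * p[0] + p[1] - t.rx * p[2]),
    t.tz + sc * (-t.ry * p[0] + t.rx * p[1] + p[2])
  ];
}
```

With all seven parameters zero it is the identity, and with the rotations and scale zero it reduces to the simple three-parameter shift I used before.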

I've also changed the method I use to convert from the three-dimensional geoid to a flat map. Originally I used a general-purpose formula to convert to and from the transverse Mercator projection. I've now switched to the formulae published by the Ordnance Survey. The combination of these two changes means that I can now convert and transform and remain within three or four metres of the reference stations whose coordinates are published by the OS GPS survey and by its Irish equivalent.
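The first step of the OS formulae is just the ellipsoid constants. As an illustration, in Javascript (the semi-axis values are the commonly published Airy 1830 figures - check them against the OS's own guide before relying on them):

```javascript
// Airy 1830 ellipsoid constants as used in the Ordnance Survey
// transverse Mercator formulae. Semi-axes in metres; the values
// are the commonly published ones, quoted here for illustration.
var a = 6377563.396;                 // semi-major axis
var b = 6356256.909;                 // semi-minor axis
var e2 = 1 - (b * b) / (a * a);      // eccentricity squared
var n = (a - b) / (a + b);           // third flattening, used in the series
```

Both derived constants feed the series expansions in the OS formulae; e2 comes out at roughly 0.00667 for Airy 1830.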

The maths has become rather more involved as a result. I can't say I really understand what is going on, but it looks impressive.

Ofcom's Surveys

Ofcom have now published the results of their licensing consultation. Both the postal poll and the invited letters that are published on their web site show a wide variety of opinion. Like many government sponsored consultations, you can read the results in several ways, and so many vociferous pundits are claiming that their views are vindicated, whatever they may be.

I'm not going to try to interpret the results. Basically it all boils down to whether you believe that Ofcom has a hidden agenda. I really don't know, but presumably we will find out in due course.

Many are claiming that the RSGB has been severely harmed by the results, and that its strategy has been shown to be at odds with the majority opinion of radio amateurs. Predictions of a membership collapse abound. Whether this is true remains to be seen. It could be argued that the RSGB was on a loser whichever way it looked, as the Ofcom consultation appears to be a typical divide-and-rule affair, as with motorways and airport terminals. Now we have to wait and see what Ofcom will implement. As a pointer, they have now published details of the new lifetime Ships Radio Licence on their web site.

It also appears that Ofcom are now 'taking back' some of the NoV functions that they and the RA paid the RSGB to carry out on their behalf. There have been great cheers from some people who regard this as a blow against the RSGB. Once again, the overall effect remains to be seen. There have been predictions that Ofcom may recover costs by charging very much more for an NoV, as some other administrations do, and this is welcomed by the lobby who wish to see the end of any new communication modes ("they're not amateur radio"), or for that matter, any mode other than CW.

In any case, it is difficult to predict the future of amateur radio. It is quite obvious that some of the magic has gone, as worldwide communication is so simple and immediate these days. The advent of digital everything in the home is causing a fog of noise on the bands, and a whiff of RF can interfere with many more domestic appliances than before. The stability of modes of communication is diminishing too. While Morse and SSB are still the main modes, digital versions come and go; the pace of change means that nothing new will be around for very long. The December RadCom mentions several modes unfamiliar to me. I thought that PSK31 would be around for a while, but there are already improvements that may unseat it. With so much fragmentation I can't see where this is going, although conventional SSB and CW are still likely to be around for a long time, if only because of their ease of use.

Writing personally, I've not been using amateur radio very much for the last few years. In 2005 I have had fewer contacts from my station than ever before. As some of you might know, I became unmarried on February 14th of this year. An ironic date. I'm still living at the same address with my ex-wife; the realities of the costs of living apart make any other arrangement a quick trip to unresolvable debt. I still have one white stick aerial aloft, so short-range VHF/UHF is still possible, but everything else has been dismantled. The future has a question mark over it; I might end up with nowhere at all for aerials. One possibility is mobile/portable working, although sitting outside with a radio in freezing temperatures is unappealing! So I really don't know what I will do. However, I used to have a good time on VHF/UHF with concealed loft aerials, so all is not lost. Whether I could erect a small aerial suitable for the 40/80 ragchew bands is an interesting thought.


I'm currently sitting in my bed typing this lot into my new laptop. Since everyone else at BBRC seems to have one, I thought I might as well lash out the readies. It's a Toshiba M50-192, which is a 14 inch widescreen model. I didn't want to spend a lot, so it's a budget machine, but fairly well equipped. Along with the 14 inch TruBrite display, there's 1 gig of RAM, CD/DVD everything (including -RAM) writer, Wi-Fi, dual display, 60 gig HD, etc. No serial or parallel ports, so if I wish to drive my FT-100 from it I'll need some sort of adaptor. My printer is on my network, so no problem there.

The 1.5 GHz Celeron Mobile sounds worse than it is. It's probably faster than my mark 1 1.5 GHz Pentium 4 desktop, and all that RAM is what really matters. The Wi-Fi has exposed the problems with my router - sometimes it won't connect - and although a router reboot sorts that out, it was for this reason that I reverted to wired ethernet. I had some discussions with the network pros on Cix, and it seems that if you want a reliable wireless network, you have to avoid all the popular brands. The good stuff costs a fortune, but at a push brands such as Zyxel and Draytek are better than most.

My employer's bonus also bought me a LinkStation network storage unit. This is a small box with a large hard drive and a computer inside. It's configured by a web interface, and now provides my regular backup storage, as well as solving the problem of sharing stuff between the laptop and the main computer. I was going to build such a device, but never got round to it. Goodness knows how it works; there is a copyright reference to Sun Solaris, but nothing else. When I've had it for a while I'll open up the case and have a look. It has a few interesting features, such as a timer to power it down during the night, and USB sockets for additional hard drives and printers.

To backup stuff to the network store I use a free Microsoft application called SyncToy. This needs .NET 1.1 on the PC (also free). It is a fairly simple application for transferring data between computers, but works very well. My new laptop already had .NET on it.

I also decided to transfer my legit MS Office installation to the Toshiba. This is the full 2003 Professional suite, which cost me £17 under the Microsoft Home User Program. Well, of course it failed the online validation, so I had to use the phone method. This involves typing a 64-digit number into the phone before finding out that that doesn't work either! No problem, you then get connected to customer support in some far-off land. Guess what - Microsoft's network was down, "phone back in 20 minutes". Each time I phoned, I had to key in that long number before I got to speak to a real person. After about five attempts and four hours they got their network going. The girl I spoke to had an accent I could understand (the earlier ones were not so easy); she needed only the first four digits in order to generate a new code for me! She also detected that I was writing it down, and told me that it was pointless since it would only work once. I'm not sure that I believe this, as Office generated the same number each time I tried to validate it. Maybe it's linked to the date?

I've just checked the hosting stats for the Beechlog website. In the past year, it seems that over 1.6 gigabytes have been requested from the site! There have been over 120,000 hits and over 100,000 files requested. I might add that the server also hosts other websites, but the Beechlog site is by far the largest and must account for most of the traffic. Click on the dates on the left of the stats table and you get an amazing wealth of information, including the actual Beechlog issues, pages, referrers, etc. Amazing.

Anyway, enough waffle for now.