Top 3 Cable Tracing Technologies

Firstly, why would you need to trace network cabling? In a perfect world you wouldn’t need to, but even if a network begins life properly labelled, things have a habit of changing. Documentation and cable labelling don’t always keep up when changes are made.

Network Cable Mess

A jumble of network cables in a network cabinet.

 

When you need to re-arrange the cabling in your patch panel, can you be 100% certain that the label is correct? You can be reasonably certain if you installed and maintain the network yourself, but what if others are involved? Are they as fastidious as you are in keeping the network documentation up to date?

Before making any cable moves, it does no harm to check what the cable is actually connected to and make sure it isn't something vital. It is one thing to disconnect a single phone or workstation, quite another to disconnect the server.

A number of different technologies exist for tracing network cabling. Each has its plus points and its downsides. A brief explanation of each technology follows, outlining when it should be applied.

Tone Tracing

Tone tracing is pretty much as old as copper cabling itself, the granddaddy of all cable tracing technologies. The basic idea is that you place an electrical signal onto one end of the cable using a tone generator, then follow that signal with a tone tracer to work out where the cable runs.

It could hardly be simpler, and usually it really is that simple in practice. But network cabling in particular has some characteristics that make things a little more complex.

The design characteristics of the network cable work against you. Unshielded Twisted Pair (UTP) cabling is designed to reduce interference between the pairs of copper wire that make up the cable. CAT5, CAT5e and CAT6 cables all have four pairs of copper wire carefully twisted together to minimise interference. This presents a problem when tracing a category 5, 5e or 6 cable, because you want to maximise the signal on the cable in order to increase the strength of the signal you can detect.

The best way to minimise the damping effect of the cable twists is to place the signal on a single wire within the cable whilst the other wire of the pair is earthed. If the signal is placed across both wires of a pair, the twists will work to dampen it; placing it on a single wire avoids the dampening effect.

Tone tracers have traditionally been analog. More recently digital or asymmetric tone tracers have arrived onto the market. Asymmetric toners have a number of advantages over more traditional analog tracers.

Tone Generator and Trace Diagram

A diagram showing how a tone generator and tone tracer works.

Tone tracing is the only cable tracing technology that can be performed on a live cable. Modern asymmetric tone tracers operate at frequencies well above those used by even very modern cable standards like CAT7, so the tone signal does not interfere with the network signal.

Locating exactly where a fault lies can be very useful in deciding whether a cable run needs replacing or repairing. A fault close to the end of the cable may be repairable; a fault in the middle of the run would, in all likelihood, require a replacement. A tone tracer can be used to locate a fault, though tracing each wire until the break is found is a laborious process. A full featured cable tester or TDR tester would be much faster and consequently a much better use of your time.
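For a sense of the arithmetic a TDR performs, here is an illustrative sketch (not from any particular tester; the 0.65 velocity factor is an assumed typical figure for UTP, so check the cable datasheet): the tester times the reflection coming back from the fault and converts the delay into a distance.

```python
# Illustrative sketch of the TDR principle: a pulse reflects off a fault,
# and the round-trip delay gives the distance to it. The velocity factor
# (0.65) is an assumed typical value for UTP, not a measured one.

C = 299_792_458  # speed of light in a vacuum, metres per second

def fault_distance_m(round_trip_delay_s, velocity_factor=0.65):
    """Estimate the distance to a fault from the reflection's round-trip delay."""
    one_way_delay = round_trip_delay_s / 2   # the pulse travels out and back
    return one_way_delay * velocity_factor * C

# A reflection arriving 400 nanoseconds after the pulse suggests a fault
# roughly 39 metres down the cable.
print(round(fault_distance_m(400e-9), 1))
```

The halving of the delay is the key step: the pulse has to travel to the fault and back again, so only half the measured time corresponds to the one-way distance.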

Continuity Testing

A continuity tester provides the ideal way to locate and label your network cabling. Whilst a toner can only work one cable at a time, a continuity tester can locate up to 20 cables at a time.

Each continuity tester is supplied with at least one remote. The remote fixes onto one end of your cable and you place the tester itself on the other end. When the tester and the remote are connected to the same cable, the continuity tester will show the number of the connected remote. Additional remotes can usually be supplied for most continuity testers, or are included in the kit form of the tester. The additional remotes are numbered sequentially, allowing you to locate and label a batch of cables at a time. If you have a large number of cables to locate, this can be a real time saver and will spare you a lot of leg work.
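The batch workflow can be sketched in a few lines of Python (a hypothetical illustration; the outlet names, port names and readings are all invented): plug numbered remotes into the far ends, read each patch panel port with the tester, and record which remote answers.

```python
# Hypothetical sketch of batch identification with numbered remotes.
# Outlet names, port names and tester readings are invented.

# Which numbered remote was plugged into which far-end outlet.
remote_at_outlet = {1: "Office 101", 2: "Office 102", 3: "Server room"}

# Simulated tester readings at the patch panel: port -> remote number
# detected on that cable (None where no remote answered).
tester_reading = {"P1": 2, "P2": None, "P3": 1, "P4": 3}

# Build the label for each patch panel port from the remote that answered.
labels = {
    port: remote_at_outlet[remote]
    for port, remote in tester_reading.items()
    if remote is not None
}

print(labels)
# {'P1': 'Office 102', 'P3': 'Office 101', 'P4': 'Server room'}
```

With a one-remote toner you would repeat the plug-and-trace cycle once per cable; with a rack of numbered remotes you make one trip to the far end and then read every port in a single pass.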

A continuity tester is also capable of simple cable testing, ensuring that all of the pairs have no breaks and are connected correctly. It must be stressed, though, that low end continuity testers can be fooled into giving a positive result when the cable is incorrectly wired.

Most continuity testers have a tone generator built in, so, with the addition of a tone tracer, they can be used for tone tracing as well. Using the built in tone generator on a continuity tester saves you from carrying around an extra tool, saving space and weight in your kit bag.

Hub Blink

Most modern hubs and switches have activity lights for each port indicating the traffic level and status. A recent addition to many cable testers and outlet identifiers has been the ability to blink the lights on a port. This feature is called hub blink.

Of course, this feature is only useful if the cables you wish to locate are connected to a live network. Hub blink is completely useless if you are trying to locate bare wire cables or before the network infrastructure has been installed.

Conclusion

Which technology is the best fit for you is largely dictated by a few factors: how many cables you need to trace, and whether the cables are terminated and connected to a switch.

If you are tracing cables that are not terminated, your only realistic option is a traditional tone generator and tracer. The same applies if the cable is live: a toner is your only option.

Hub blinking is also fine so long as your switch is relatively small. If you’ve got a huge switch with hundreds of ports, you may well struggle to identify exactly which port is blinking.

If you have a lot of cables to find, then a continuity tester with multiple remotes will allow you to identify cables in batches, speeding up the process of identification and labelling.

 


My 2014 Reading Log

A list of all of the books I read in 2014 and logged in Goodreads. I read a few more technical books but didn’t log them, for whatever reason.

January

Andrew Smith: Moondust: In Search Of The Men Who Fell To Earth (Non-Fiction)

February

March

William Golding: The Spire (Fiction)

Niall Ferguson: The Pity of War: Explaining World War 1 (Non-Fiction)

April

Simon Parkes: Live at the Brixton Academy: A riotous life in the music business (Non-Fiction)

May

Christopher Priest: Inverted World (Fiction)

Philip K. Dick: VALIS (Fiction)

Lee Campbell: Introduction to Rx (Non-Fiction)

June
July

Neil Gaiman: Neverwhere (Fiction)

August
September

Fred Hoyle: A for Andromeda (Fiction)

October

Mark Ellen: Rock Stars Stole My Life! (Non-Fiction)

John Crowley: The Deep (Fiction)

Neil Gaiman: The Ocean at the End of the Lane (Fiction)

Philip K. Dick: A Maze of Death (Fiction)

November

Paul Ham: Hiroshima Nagasaki: The Real Story of the Atomic Bombings and their Aftermath (Non-Fiction)

December

A total of 14 books, 8 fiction (mostly science fiction and fantasy) and 6 non-fiction. Of the two autobiographies, I found Mark Ellen’s Rock Stars Stole My Life! to be very good, taking the reader through the rock landscape of the 1960s to the present day through the eyes of a music journalist.

The book I most enjoyed was Neil Gaiman’s The Ocean at the End of the Lane. Neil Gaiman was certainly my find of 2014. He manages to write quite original fantasy in a way that doesn’t make the book feel like fantasy.


New Aviosys IP Power 9858 Box Opening

A series of box opening photos of the new Aviosys IP Power 9858 4 port network power switch. This model will in due course replace the Aviosys IP Power 9258 series of power switches. The 9258 series is still available in the meantime, so don’t worry.

The new model supports WiFi (802.11b/g/n, with WPS for easy WiFi setup), auto reboot on ping failure, a time of day scheduler and an internal temperature sensor. Aviosys have also built apps for iOS and Android, so you can now manage your power switch on the move. Together with the 8 port Aviosys IP Power 9820, they make very handy tools for remote power management of devices. Say goodbye to travelling to a remote site just to reboot a broadband router.

 


Back to Basics

After a while things stop being new. Things that really used to excite you, stop exciting you. Things that you were passionate about, you stop being passionate about. That’s just how things work.

I wrote my very first computer program 26 years ago this month. It was in college, using a Perkin Elmer minicomputer running Berkeley BSD 4.2 on a VT220 terminal (with a really good keyboard). The program was written in Pascal, the educational programming language of the time. Every time I went near the terminal, I approached it with a sense of wonder. It felt like the possibilities were endless.

But, over time, the sense of wonder starts to wane. Once somebody starts paying you to do something, the sense of wonder starts to wane real fast. You don’t control it any more. You are likely to be producing something that somebody else wants you to produce. In a manner they want you to produce it.

I have been pondering my career recently. Such as it is. You do start pondering your career when you hit the wrong end of your forties. How can I get back that sense of wonder again?

I’ve always had a hankering after learning Lisp. I read about it even before I went to college twenty six years ago, and it has always fascinated me. Pretty well any programming concept you can think of, Lisp usually got there first.

One of my recent discoveries has been a series of books: The Little Schemer, The Seasoned Schemer and The Reasoned Schemer, which teach Scheme in a unique, accessible and fun style.

Scheme is a modern dialect of Lisp. There are lots of others including Clojure.

I think that learning a language from scratch, just for the fun of it, may just be the tonic for a mild dose of mid-career blues. Hopefully, that sense of wonder may return. I sure hope so.

I’ll let you know :)


Software the old fashioned way

I was clearing out my old bedroom, after many years of nagging by my parents, when I came across two of my old floppy disk boxes. Contained within is a small snapshot of my personal computing from 1990 through until late 1992. Everything before and after those dates doesn’t survive, I’m afraid.

The archive contains loads of backups of work I produced, now stored on GitHub, as well as public domain / shareware software, magazine cover disks and commercial software I purchased. Yes, people used to actually buy software. With real money. A PC game back in the late 1980s cost around £50 in 1980s money. According to this historic inflation calculator, that would be £117 now. Pretty close to a week’s salary for me at the time.

One of my better discoveries from the late 1980s was public domain and shareware software libraries. Back then there were a number of libraries, usually advertised in the small ads at the back of computer magazines.

This is a run down of how you’d use your typical software library:

  1. Find an advert from a suitable library and write them a nice little letter requesting they send you a catalog. Include payment as necessary;
  2. Wait for a week or two;
  3. Receive a small, photocopied catalog with lists of floppies and a brief description of the contents;
  4. Send the order form back to the library with payment, usually by cheque;
  5. Wait for another week or two;
  6. Receive a small padded envelope through the post with your selection of floppies;
  7. Explore and enjoy!

If you received your order in two weeks you were doing well. After the first order, once you had the catalog to hand, you could get an order turned around in about a week. A week was pretty quick for pretty well anything back then.

The libraries were run as small home businesses. They were the perfect second income. Everything was done by mail, all you had to do was send catalogs when requested and process orders.

One of the really nice things about shareware libraries was that you never really knew what you were going to get. Whilst you’d have an idea of what was on a disk from the description in the catalog, there’d be a lot of programs that were not described. Getting a new delivery was like a mini MS-DOS based text adventure, discovering all of the neat things on the disks.

The libraries contained lots of different things, mostly shareware applications of every kind you can think of. The most interesting to me as an aspiring programmer was the array of public domain software. Public domain software was distributed with the source code. There is no better learning tool when programming than reading other people’s code. The best code I’ve ever read was the CLIPS source for a forward chaining expert system shell, written by NASA.

Happy days :)

PS All of the floppies I’ve tried so far still work :) Not bad after 23 years.

PPS I found a letter from October 1990 ordering ten disks from the library.

Letter ordering disks

 


Early 1990s Software Development Tools for Microsoft Windows

The early 1990s were an interesting time for software developers. Many of the tools that are taken for granted today made their debut for a mass market audience.

I don’t mean that the tools were not available previously. Both Smalltalk and LISP sported what would today be considered modern development environments all the way back in the 1970s, but hardware requirements put the tools well beyond the means of regular Joe programmers. Not too many people had workstations at home, or in the office for that matter.

I spent the early 1990s giving most of my money to software development tool companies of one flavour or another.

Actor was a combination of object oriented language and programming environment for very early versions of Microsoft Windows. There is a review of Actor version 3 in InfoWorld magazine that makes interesting reading. It was somewhat similar to Smalltalk, but rather more practical for building distributable programs. Unlike Smalltalk it was not cross platform, but on the plus side programs did look like native Windows programs. It was very much ahead of its time in terms of both the language and the programming environment, and it ran on pretty modest hardware.

I gave Borland quite a lot of money too. I bought Turbo Pascal for Windows when it was released, having bought regular old Turbo Pascal v6 for DOS a year or so earlier. The floppy disks don’t have a version number on them, so I have no idea which version it is. Turbo Pascal for Windows eventually morphed into Delphi.

I bought Microsoft C version 6, which introduced a DOS based IDE, though it was still very much an old school C compiler. If you wanted to create Windows software you needed to buy the Microsoft Windows SDK at considerable extra cost.

Asymetrix Toolbook was marketed in the early 1990s as a generic Microsoft Windows development tool. There are old InfoWorld reviews here and here. Asymetrix later repositioned the product as a learning authoring tool. I rather liked the tool, though it didn’t really have the performance and flexibility I was looking for. Distributing your finished work was also not a strong point.

Microsoft Quick C for Windows version 1.0 was released in late 1991. Quick C bundled a C compiler with the Windows SDK so that you could build 16 bit Windows software. It also sported an integrated text editor, resource editor and debugger.

The first version of Visual Basic was released in 1991. I am not sure why I didn’t buy it; I imagine there was some programming language snobbery on my part. I know there are plenty of programmers of a certain age who go all glassy eyed at the mere thought of BASIC, but I’m not one of them. Visual Basic also had an integrated editor and debugger.

Both Quick C and Visual Basic are the immediate predecessors of the Visual Studio product of today.


New Aviosys IP Power 9820 Box Opening

A series of box opening photos of the newly released Aviosys IP Power 9820 8 port rack-mountable power switch which arrived in the office this morning. This new model replaces the older IP Power Switch 9258-PRO model.

The new model is higher powered and supports Wi-Fi; live charts display energy usage in watt-hours (Wh), current, voltage and temperature, and the LCD display shows the temperature, voltage, IP address and current for each port.


I’ve a feeling we’re not in Kansas any more

I was researching a follow up to my how will cloud computing change network management post and I came across something rather odd that I’d like to share with you before I’ve done the follow up.

Below are a series of graphs culled from Google Trends showing the relative search levels of various network management related keywords.

What is the most significant feature of them? What struck me is the downward trend, with varying degrees of steepness. The searches don’t just represent commercial network management tools; there are open source projects and open core products there too. I even included searches for network management protocols like SNMP and NetFlow. They all show declines.

NetCool Search Trend

NetFlow Search Trend

OpenNMS Search Trend

OpenView Search Trend

SFlow Search Trend

Syslog Search Trend

Zenoss Search Trend

IPFIX Search Trend

MRTG Search Trend

Nagios Search Trend

SNMP Search Trend

The only search not showing a decline is Icinga. That may just be because it is a relatively recent project, so it doesn’t have the history of higher search volumes it probably would have had if it were a bit older.

Icinga Search Trend


Stack Overflow Driven Development

The rise of Stack Overflow has certainly changed how many programmers go about their trade.

I have recently been learning some new client side web skills because I need them for a new project. I have noticed that the way I go about learning is quite different from the way I used to learn pre-web.

I used to have a standard technique. I’d go through back issues of magazines I’d bought (I used to have hundreds of back issues) and read any articles related to the new technology. Then I’d purchase a book about the topic, read it and start a simple starter project. Whilst doing the starter project, I’d likely pick up a couple of extra books and skim them to find techniques I needed for the project. This method worked pretty well; I’d be working idiomatically, without a manual, in anywhere from one to three months.

Using the old method, if I got stuck on something, I’d have to figure it out on my own. I remember it took three days to get a simple window to display when I was learning Windows programming in 1991. Without the internet, there was nobody you could ask when you got stuck. If you didn’t own the reference materials you needed, then you were stuck.

Fast forward twenty years and things are rather different. For starters, I don’t have a bunch of magazines sitting around. I don’t even read tech magazines any more, either in print or digitally. None of my favourite magazines survived the transition to digital.

Now when I want to learn a new tech, I head to Wikipedia first to get a basic idea. Then I start trawling Google for simple tutorials. I then read one of the new generation of short introductory books on my Kindle.

I then start my project safe in the knowledge that Google will always be there. And, of course, Google returns an awful lot of Stack Overflow pages. Whilst I would have felt very uncomfortable starting a project without a full grasp of a technology twenty years ago, now I think it would be odd not to. The main purpose of the initial reading is to get a basic understanding of the technology and, most importantly, the vocabulary. You can’t search properly if you don’t know what to search for.

Using my new approach, I’ve cut my learning time from one to three months down to one to three weeks.

The main downside to my approach is that, at the beginning at least, I may not write idiomatic code. But whilst that is a problem, software is very malleable and you can always re-write parts later on if the project is a success. The biggest challenge now seems to be getting to the point where you know a project has legs as quickly as possible. Fully understanding a tech before starting a project just delays the start, and I doubt you’ll get that time back later in increased productivity.

Of course, by far the quickest approach is to use a tech stack you already know. Unfortunately, in my case that wasn’t possible because I didn’t know a suitable web client side tech. It is a testament to the designers of Angular.js, SignalR and NancyFX that I found it pretty easy to get started. I wish everything was so well designed and documented.