Archive for the ‘Spatial Data for Network Management’ Category

Continuing the discussion of spatially enabled network data: there is a strong case for taking the additional step of distributing this data to the telecom mobile field force, sending it right to their laptops and cell phones while they are resolving an outage or engaged in routine maintenance. There are at least two great reasons for doing this:

1. Efficiency. All field tasks will be done faster and more accurately, once workers can see the precise location of network equipment, anticipate job requirements, search for the nearest spares, and contact co-workers with detailed information. It really cuts down on the need to first inspect the site, then drive back to the shop for tools and equipment.
2. Buy-In. As the field force starts actively using spatial data and recognizing its value, they naturally become motivated to maintain that data with corrections from the field. Today’s applications have good tools for ‘redlining’, or updating data from a mobile device. Once the field force buys into the process of updating network data as part of their work, the corporate data becomes more accurate, reflecting the ‘as built’ network.
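To make the efficiency point concrete, here is a minimal sketch of a “nearest spares” lookup of the kind described above. The depot names, coordinates, and inventory structure are all hypothetical; a real system would query the corporate spatial database rather than an in-memory list.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical spares inventory: (depot name, latitude, longitude)
spares = [
    ("Depot A", 45.42, -75.69),
    ("Depot B", 45.35, -75.90),
    ("Depot C", 45.50, -75.50),
]

def nearest_spare(lat, lon, depots):
    """Return the depot closest to the technician's current position."""
    return min(depots, key=lambda d: haversine_km(lat, lon, d[1], d[2]))

# A technician in the field asks for the closest depot with the needed part.
print(nearest_spare(45.40, -75.70, spares)[0])  # → Depot A
```

A query like this, answered on the technician’s device, is exactly what removes the “inspect the site, then drive back to the shop” round trip.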

Benefits return to the NOC manager and VP of network operations in the form of more timely and accurate reports on the network, to better support management decisions.

Network and spatial data have been converging for some time, and with increasing scope. As Sherlock Holmes would say, “the game is afoot” now with deals such as the recent partnership of Nokia and Yahoo (http://news.yahoo.com/s/ap/20100524/ap_on_hi_te/us_tec_yahoo_nokia).

At Superna, we are developing plug-ins that spatially reference EMC Ionix network data. Here is a one-minute video that demonstrates how our Network Discovery Engine can spatially enable the alarm notification system of EMC Ionix and report results to a mobile device.


Server virtualization has changed the way IT operates and is now well established. The promise that virtualization can deliver low-cost CPUs and storage in the Cloud, provisioned in seconds for any IT department on demand, seems almost too good to be true. I just returned from EMC World 2010, where I got to witness the “everything in the Cloud” pitch built on virtualization technologies.

Aside from the aging Counting Crows concert put on by EMC, the most interesting announcement from the show was the vPLEX product line and “Access Data Anywhere” technology. This promises to allow application data typically residing in the SAN (storage area network), which for years has been orphaned data inside a fibre channel island, to be accessible by any application, regardless of physical location.

It’s about time fibre channel data was easier to network, but a real challenge emerges: terabytes of information that never left the storage network are now flowing over network links without any predictability as to when or where they are going. Today’s management applications are poorly designed to handle a network and application workload that is free to execute anywhere in the network. I see the old adage about “location, location, location” coming into play in such a dynamic IT environment. Management software will need to know where data is flowing, where it’s coming from, where it’s going, and over which network links. This will need to happen in real time, as the network has become too complex for humans to make sense of the data without some level of automation. Management software needs to make decisions, raise alarms, and show trending based not only on devices but also on location.
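The location-aware alarming described above can be sketched very simply: tag each observed flow with its source and destination sites, aggregate per link, and flag links that exceed a budget, reporting the locations involved rather than just a device name. The link names, site names, and capacity figures here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Flow:
    src_site: str   # where the data is coming from
    dst_site: str   # where it is going
    link: str       # network link carrying it
    gbytes: float   # observed volume over the interval

# Hypothetical capacity budget per link, in gigabytes per interval
link_budget = {"ottawa-london": 500.0, "london-tokyo": 300.0}

def location_alarms(flows, budget):
    """Aggregate flow volume per link and flag links over budget,
    reporting the locations involved, not just the device."""
    totals, involved = {}, {}
    for f in flows:
        totals[f.link] = totals.get(f.link, 0.0) + f.gbytes
        involved.setdefault(f.link, set()).update({f.src_site, f.dst_site})
    return [
        (link, totals[link], sorted(involved[link]))
        for link in totals
        if totals[link] > budget.get(link, float("inf"))
    ]

flows = [
    Flow("Ottawa", "London", "ottawa-london", 400.0),
    Flow("Ottawa", "London", "ottawa-london", 250.0),
    Flow("London", "Tokyo", "london-tokyo", 100.0),
]
print(location_alarms(flows, link_budget))
# → [('ottawa-london', 650.0, ['London', 'Ottawa'])]
```

The point of the sketch is the alarm payload: it names the link *and* the locations at each end, which is what an operator needs when workloads are free to run anywhere.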

Let’s look at the example of “follow the moon computing”, where workloads in North America migrate to Europe at the end of the work day to take advantage of lower power costs, available CPU capacity, and lower latency to end users, before moving on to Asia. Terabytes of information start moving, and applications never stop running: some execute in North America while others are in flight to Europe or Asia. If something fails, where is the latest copy of the data? Where was the application actually running last? Where were the application and data going? Here again, location will be very important for answering these questions and delivering on Cloud computing’s promise of lower TCO. Geospatially enabled management systems look like a promising technology breakthrough.
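The scheduling half of follow-the-moon can be sketched in a few lines: pick the region whose local clock currently falls in the off-peak window. The region list, UTC offsets, and off-peak hours are assumptions for illustration (fixed offsets, ignoring daylight saving); a real scheduler would also weigh power price, spare capacity, and, as argued above, track where the data currently sits.

```python
from datetime import datetime, timezone, timedelta

# Hypothetical regions with fixed UTC offsets (DST ignored for simplicity)
regions = {
    "north-america": timedelta(hours=-5),
    "europe": timedelta(hours=1),
    "asia": timedelta(hours=9),
}

def off_peak_region(now_utc, regions, start=22, end=6):
    """Return a region whose local time is in the off-peak window
    [start, 24) or [0, end), i.e. cheap power and idle CPU capacity."""
    for name, offset in regions.items():
        local_hour = (now_utc + offset).hour
        if local_hour >= start or local_hour < end:
            return name
    return None  # no region is off-peak right now

now = datetime(2010, 6, 1, 3, 0, tzinfo=timezone.utc)  # 22:00 in North America
print(off_peak_region(now, regions))  # → north-america
```

Note that this only answers “where should the workload run next”; the hard failure-recovery questions in the paragraph above (where is the latest copy, where was it running last) require the management layer to record location continuously, not just at scheduling time.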
