Reality Check: What’s Next for GIS?
Anyone who undertakes a geographic information project at some point faces this question: "Will I ever be done?"
When my team started down our current path some ten years ago, our only real goal was to standardize the data in our plant maps. It only made sense that if the maps were standardized, we would be able to use the data to create common bills of materials (BOMs) and reporting to support the rebuild we were then undertaking as part of TCI.
Little did we know that our decision to move toward standardized, intelligent data would lead us down our current path of building a full-fledged geographic information system (GIS). Looking back, I think our initial decision to focus on the structure and standards of the data was one of our best. Those standards have been the foundation of everything we have been working on since.
"The more common the model is, the more powerful it becomes."
Are we done?
As you might have guessed, the answer is no. There is always more to do. Not only does technology change, but so do the ways we use our data and the resources behind it.
So if we’re not done, what’s next? Let’s first take stock. The foundation of the system we have been building has always been the standardization of all plant data and the workflow that keeps that data accurately reflecting what is actually in the plant and its relationship to each customer’s location.
With the initial creation of @MApp we laid down a standard that exists throughout our plant data, which allowed us to move the data into Oracle when the time came. Focusing on the workflow and processes surrounding the data has given the data a life cycle and inherent validity.
Over the last ten years, our maps have grown from a nice-to-have when we were operating a video plant, to a need-to-have when we were operating a telephony plant, to an absolutely critical need when serving customers with service level agreements (SLAs).
It seemed we had reached the pinnacle when we were able to automate service requests, cleanse the billing system for both residential and commercial households-passed accuracy and even identify the most probable equipment failures within the plant — all of which continue to drive huge returns on our initial investment. But we hadn’t.
HFC common model
What’s next is something we’ve seen coming for a few years. It’s a big one. The next logical step in GIS development is separating the production tools from the common model data warehouse.
That means creating an HFC common model separate from the production tools and proprietary vendor models used to design and manage the plant.
The vendor models are perfect for their intended purpose of production design and workflow management. But creating business-critical tools and empowering the system data for the enterprise requires something else. We’ve taken the first steps by defining the HFC common model; however, we believe that all MSOs should control this standard through CableLabs. The more common the model is, the more powerful it becomes.
Once the model has been put into place, vendors would need to build APIs or service-oriented architecture (SOA)-based services that would populate the HFC common model from their proprietary models. This is a critical step in that it would allow anyone to use the production tools of their choice while giving the enterprise the ability to use the plant data in an MSO-controlled and standardized model.
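To illustrate what that population step might look like, here is a minimal sketch, assuming hypothetical type and field names. CommonHfcElement, VendorDesignRecord and HfcCommonModelPublisher are illustrative only and are not part of any published CableLabs or vendor specification; the sketch simply shows a vendor-side adapter translating records from a proprietary design model into an MSO-controlled common model and publishing them through a service interface.

```typescript
// Hypothetical shape of one record in an MSO-controlled HFC common model.
// Field names are illustrative; the real standard would be defined by the MSOs.
interface CommonHfcElement {
  elementId: string;          // stable ID shared across vendor tools
  elementType: "node" | "amplifier" | "tap" | "fiber" | "coax";
  location: { latitude: number; longitude: number };
  servesPremises: string[];   // customer locations fed by this element
  attributes: Record<string, string>; // standardized key/value attributes
}

// Hypothetical shape of a record in a vendor's proprietary design model.
interface VendorDesignRecord {
  guid: string;
  featureClass: string;
  geometry: { x: number; y: number };
  customerRefs: string[];
  props: Record<string, string>;
}

// The service a vendor might expose: it accepts common-model elements
// and writes them into the enterprise data warehouse.
interface HfcCommonModelPublisher {
  publish(elements: CommonHfcElement[]): Promise<void>;
}

// Illustrative mapping from a vendor's feature classes to standard element types.
function mapFeatureClass(featureClass: string): CommonHfcElement["elementType"] {
  const mapping: Record<string, CommonHfcElement["elementType"]> = {
    FiberNode: "node",
    Amp: "amplifier",
    Tap: "tap",
    FiberSpan: "fiber",
    CoaxSpan: "coax",
  };
  return mapping[featureClass] ?? "coax";
}

// Translate one proprietary record into the common model.
function toCommonModel(record: VendorDesignRecord): CommonHfcElement {
  return {
    elementId: record.guid,
    elementType: mapFeatureClass(record.featureClass),
    location: { latitude: record.geometry.y, longitude: record.geometry.x },
    servesPremises: record.customerRefs,
    attributes: record.props,
  };
}

// A vendor export job would batch-convert its records and push them
// through the publisher, keeping the enterprise warehouse in sync.
async function syncToCommonModel(
  records: VendorDesignRecord[],
  publisher: HfcCommonModelPublisher,
): Promise<void> {
  await publisher.publish(records.map(toCommonModel));
}
```

The design point is that the vendor owns only the mapping from its proprietary model into the standard. The enterprise warehouse and every tool built on it depend only on the common model, so production tools could be swapped without touching downstream systems.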
The endgame is a set of common tools that reduce our cost of development and create new, ever-expanding ways to drive efficiency and automation into our business.
Sean Bristol is director of engineering and construction, Comcast Washington.