IT management strategies for insurance companies

An insurance company's historical IT perspective: from "green screens" to the 21st century.

This article originally appeared on the BeyeNETWORK

A historical perspective is often difficult for IT professionals to take of their own company, because they remember the blood, sweat and tears they endured to make it all happen. However, I hope that readers of this article will find that it identifies the pitfalls of homegrown IT and helps them navigate their own IT development efforts.

One thing that many insurance companies have in common is that they jumped on the technology bandwagon early, and my company was no exception. We were there with a data processing task force (you can show your age if you remember the term) in 1956, which led to our implementation of an IBM 727 in 1957. We stayed at the forefront of technology, with IBM even taking up offices in our headquarters. We were a beta site for all new mainframe upgrades through the 1970s and 1980s, including the new relational product DB2, and I was given the task of determining how we could best use a relational database. Even so, most of our data remained housed in the IMS hierarchical structure; we considered IMS fast and efficient, and we had a lot of experience with it.

Since the company is mutually held, we never feared being purchased by a larger company; instead, we were always looking for companies we could acquire. With the safety of being a mutual company, our philosophy was to do "everything" ourselves. We did not want outside vendor help. If our IT staff needed assistance, one person was sent for training, and we relied on that person's ability to educate the rest of us.

In the years leading up to the '90s and the ever-popular dot-com era, we lived in a vacuum. As recently as 1999, we were using mostly "green screen" technology, a homegrown intranet, a handful of small servers and a large mainframe. We were one of the only companies of our size that took on the Y2K effort completely in-house.

Only four years ago, all of the data in our 82 disparate source systems approached one terabyte in size, and our technology budget grew only at the annual rate of inflation. Although we had been able to save cash, we needed to catch up on all of the new technology the Web had created. We needed to make a motorcycle jump over the Grand Canyon.

How did we do it? The biggest step was to admit that we were losing business to competitors that were using some of the latest technologies. This led us to provide our field agents with a web application for sending new business quotes to prospective insurance customers, which gave us some limited web experience. Then we hired a new technology champion as a vice president. As a result, the green screens, the homegrown intranet and the scheduling programs disappeared; PCs, Internet access and Outlook replaced them. We were making our move. We implemented our first vendor product and started looking at outside software solutions, and we hired consultants to help when we didn't have the in-house expertise. We had finally reached the 21st century. The only problem was that we still didn't have all of the expertise we needed. Although we made some good decisions, we also made some bad ones. One of the better decisions was to retain some very talented people to correct our deficiencies.

My career evolved from database administrator to data architect, and my knowledge base had to grow at a very quick pace. I had been brought up in the mainframe world, and my new role required me to learn about business intelligence and server technology, including change data capture (CDC), ETL and OLAP. At the same time, the company experienced accelerated change. Our entire server capacity when we created this architecture was only 50 GB. Our first data warehouse initiative, a small subset of our claims data, pushed us to 2 TB, and with additional projects being added to the warehouse initiative, that number will exceed 8 TB before the end of the year. A new project to support transactional reporting, sourced from a common operational data store rather than the analytical reporting sourced from the warehouse, will expand the amount of data we store even further. It will also consolidate our many disparate reporting systems into one.
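For readers newer to these terms, the sketch below shows the basic idea behind CDC-driven incremental loading: rather than reloading an entire source system on every run, each run extracts only the rows that changed since the last run and merges them into the warehouse. This is purely illustrative and not our actual implementation; the table names, columns and the last_modified watermark column are all hypothetical, and SQLite simply stands in for the real source and warehouse databases.

    import sqlite3

    # Hypothetical sketch of CDC-style incremental loading. SQLite (3.24+
    # for the upsert syntax) stands in for the real source and warehouse.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE source_claims (
            claim_id INTEGER PRIMARY KEY,
            amount REAL,
            status TEXT,
            last_modified TEXT      -- the CDC "watermark" column
        );
        CREATE TABLE warehouse_claims (
            claim_id INTEGER PRIMARY KEY,
            amount REAL,
            status TEXT,
            last_modified TEXT
        );
        INSERT INTO source_claims VALUES
            (1, 1200.00, 'OPEN',   '2005-06-01'),
            (2,  450.50, 'CLOSED', '2005-06-03');
    """)

    def incremental_load(conn, watermark):
        """Extract rows modified after `watermark`, upsert them into the
        warehouse table, and return the new watermark."""
        changed = conn.execute(
            "SELECT claim_id, amount, status, last_modified "
            "FROM source_claims WHERE last_modified > ?", (watermark,)
        ).fetchall()
        conn.executemany(
            "INSERT INTO warehouse_claims VALUES (?, ?, ?, ?) "
            "ON CONFLICT(claim_id) DO UPDATE SET "
            "amount = excluded.amount, status = excluded.status, "
            "last_modified = excluded.last_modified",
            changed,
        )
        conn.commit()
        # Advance the watermark to the newest change seen in this run.
        return max((row[3] for row in changed), default=watermark)

    print(incremental_load(conn, ""))  # first run loads both rows: '2005-06-03'

In a real deployment, the extract would run against the operational source and the merge against the warehouse platform, but the pattern of a watermark, an incremental extract and a merge is the same.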

We've come a long way in the past four years. At one point, we were looking at implementing three of the biggest software packages simultaneously; we came to our senses on that decision before any real damage was done.

At any point in time, we may actively pursue the purchase of other insurance companies to enhance our book of business, so the ability to integrate newly acquired companies into our existing operations is always on the minds of our data architects.
