
Government Cloud: Blog Post

Estimating the Hidden Costs of Cost Estimation

Federal agencies are not properly equipped to estimate their future IT infrastructure costs

A recent Government Accountability Office (GAO) report found that most federal agencies, with the exception of the Department of Defense, are not properly equipped to produce accurate cost estimates for their IT infrastructure. There are many reasons for this, but the problem starts with the data being fed into most cost estimation practices and models.

For any organization, federal or commercial, the ability to credibly estimate the time and budget required to bring a project to a successful conclusion is crucial. The benefits of good estimating are well documented.

According to the GAO, federal agencies are not setting a good precedent in estimating their IT projects. Most federal agencies have weak processes that rely on expert opinion, while some employ tools such as parametric models. At the root of any process, whether parametric or based on expert opinion, agencies need access to information about the systems they are supporting or seeking to develop - and this is precisely where the process begins to break down.

Incomplete, Bad and Unattainable Data
Collecting data is not cheap, and it takes time and effort to do it properly. When the budget is tight, data collection often gets cut from programs. As a result, agencies have an incomplete view of their systems. If they do have data, it is often "dirty," meaning that poor timekeeping or project-tracking practices generated data that is effectively meaningless. In many cases, the system integrators performing the work have the data, but the agencies don't have access to it.

So, in lieu of data, agencies rely on expert opinion for the basic inputs to their estimating process. But an expert's numbers vary with the day they're having: they may give you some statistics about the applications (or not) and send you on your way. How can you be sure that data is reliable?

The short answer is that you can't. Front-load your estimation process or model with uncertain data, and any result that comes out will be equally unreliable ... garbage in, garbage out.

Shrinking the Cone of Uncertainty
Federal organizations would benefit greatly from automated software analysis and measurement systems that generate unbiased metrics about their applications. Injecting fact-based measures into the front end of an estimating process greatly narrows the Cone of Uncertainty.

The Cone of Uncertainty describes the evolution of uncertainty in a project. In the beginning, when little is known, estimates are subject to large uncertainty. As more information is learned, the uncertainty decreases.
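To put rough numbers on this idea, the multipliers commonly quoted for the Cone of Uncertainty (following Boehm and McConnell) can be sketched as a lookup table. The phase names and factors below are the commonly published ones, not figures from the GAO report, so treat this as an illustration only:

```python
# Commonly quoted cone-of-uncertainty multipliers (Boehm; popularized
# in McConnell's "Software Estimation"). A point estimate X made at a
# given milestone could plausibly land anywhere in [X * low, X * high].
CONE = {
    "initial concept":       (0.25, 4.0),
    "approved definition":   (0.50, 2.0),
    "requirements complete": (0.67, 1.5),
    "design complete":       (0.80, 1.25),
    "detailed design done":  (0.90, 1.10),
}

def estimate_range(point_estimate: float, phase: str) -> tuple:
    """Return the plausible (low, high) range around a point estimate."""
    low, high = CONE[phase]
    return (point_estimate * low, point_estimate * high)
```

For example, a 100 person-month estimate made at initial concept could plausibly turn out anywhere between 25 and 400 person-months; by design-complete, the same estimate is bounded far more tightly. Feeding measured data into the process is, in effect, a way of starting further down the cone.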

By injecting an accurate calculation of a system's size, we greatly reduce that uncertainty. Supporting the size data with measures of the system's technical and functional complexity, and with an objective assessment of its underlying structural quality, reduces it further still. An estimate with little uncertainty - that is, a high degree of confidence - is the foundation for accurately predicting a development team's productivity. This matters because planning and budgeting are, at bottom, exercises in deciding how to allocate resources and when new capabilities will be available to your clients: How many developers will I need? How long will I need them? When will they be finished?
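To make the arithmetic concrete, here is a minimal sketch of a parametric model of the kind the GAO mentions, using the published Basic COCOMO coefficients for "organic" projects. The article does not endorse any particular model, so this is illustrative only - but it shows how a measured size input (KLOC) flows directly into answers for the staffing questions above:

```python
def cocomo_basic(kloc: float) -> dict:
    """Basic COCOMO estimate for an 'organic' project.

    effort   = 2.4 * KLOC^1.05     (person-months)
    duration = 2.5 * effort^0.38   (calendar months)
    staff    = effort / duration   (average headcount)
    """
    effort = 2.4 * kloc ** 1.05
    duration = 2.5 * effort ** 0.38
    return {
        "effort_pm": round(effort, 1),
        "duration_months": round(duration, 1),
        "avg_staff": round(effort / duration, 1),
    }
```

Note what happens when the size input is a guess: `cocomo_basic(100)` predicts roughly double the effort of `cocomo_basic(50)`, so an expert's off-by-2x size opinion propagates straight into a 2x budget error - exactly the garbage-in, garbage-out problem above.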

Several sources, such as the Standish Group's CHAOS reports, document the IT industry's legacy of poor delivery. And there are many reasons why IT projects continue to fail.

We know that most IT budgets, both federal and commercial, are spent maintaining and supporting existing systems, and it is clear that the agencies that own these systems lack visibility into their complexity. Without that information, any planning and budgeting effort is handicapped. IT-intensive programs, which require the most planning to deliver systems on time and on budget, would improve if we could shed some light on these systems and arm agencies with objective, fact-based insight.

Making the Invisible Visible
Through static code analysis, you can measure your application in real time and gather unbiased metrics to share both internally and externally.

Because these metrics come straight from the product being managed and worked on, in real time, the data is consistent across all programs. This independent, unbiased data can then support program decisions about the ongoing management of the application.
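As a toy illustration of pulling metrics straight from the code itself, the sketch below uses Python's standard `ast` module to count functions, decision points, and logical lines in a source file. Commercial analysis platforms measure far more than this (structural quality, cross-technology dependencies, and so on); the point here is only that such numbers come from the artifact, not from opinion:

```python
import ast

def size_metrics(source: str) -> dict:
    """Crude size metrics derived by statically analyzing Python source."""
    tree = ast.parse(source)
    # Count function definitions and branching constructs in the syntax tree.
    functions = sum(isinstance(n, ast.FunctionDef) for n in ast.walk(tree))
    decisions = sum(isinstance(n, (ast.If, ast.For, ast.While, ast.Try))
                    for n in ast.walk(tree))
    # Logical lines: non-blank lines that aren't pure comments.
    logical = sum(1 for line in source.splitlines()
                  if line.strip() and not line.strip().startswith("#"))
    return {"logical_lines": logical,
            "functions": functions,
            "decision_points": decisions}
```

Run over an entire codebase (for example, `size_metrics(open(path).read())` for each file), this yields the same answer no matter who runs it or what day they are having.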

When a racing team is tuning a car's engine, it doesn't ask the engineer what he thinks and run the race based solely on that opinion. It fills the engine with sensors and monitors every metric it can grab. If your organization approaches software estimation the same way, you'll build a repository of useful data showing how your IT infrastructure has evolved and what it will take to bring it to the next level.

Be sure to flip over to Dan Galorath's article on data-driven estimation for more information on this topic.

More Stories By Lev Lesokhin

Lev Lesokhin is responsible for CAST's market development, strategy, thought leadership and product marketing worldwide. He has a passion for making customers successful, building the ecosystem, and advancing the state of the art in business technology. Lev comes to CAST from SAP, where he was Director, Global SME Marketing. Prior to SAP, Lev was at the Corporate Executive Board as one of the leaders of the Applications Executive Council, where he worked with the heads of applications organizations at Fortune 1000 companies to identify best management practices.
