Estimating the Hidden Costs of Cost Estimation

Federal agencies are not properly equipped to estimate their future IT infrastructure costs

A recent Government Accountability Office (GAO) report found that most federal agencies, with the exception of the Department of Defense, are not properly equipped to produce accurate cost estimates for their IT infrastructure. There are many reasons for this, but the problem starts with the data being fed into most cost estimation practices and models.

For any organization, federal or commercial, the ability to credibly estimate the time and budget required for a project to reach a successful conclusion is crucial. The many benefits of good estimating have been explained over and over.

According to the GAO, federal agencies have a poor track record of estimating their IT projects. Most rely on weak processes built around expert opinion, while some employ tools such as parametric models. At the root of any process, whether parametric or based on expert opinion, agencies need access to information about the systems they are supporting or seeking to develop - and this is precisely where the process begins to break down.

Incomplete, Bad and Unattainable Data
Collecting data is not cheap; it takes time and effort to do properly. When budgets are tight, data collection is often cut from programs, leaving agencies with an incomplete view of their systems. The data they do have is often 'dirty,' meaning poor timekeeping or project-tracking practices generated numbers that are effectively meaningless. In many cases, the system integrators performing the work have the data, but agencies don't have access to it.

So, in lieu of data, agencies rely on expert opinion to provide the basic inputs to their estimating process. But depending on the day your expert is having, you'll get some stats about the applications (or not) and be sent on your way. How can you be sure those numbers are reliable?

You can't. When you front-load your estimation process or model with uncertain data, any result that comes out will be just as unreliable ... garbage in, garbage out.
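To see how quickly input uncertainty swamps an estimate, consider a minimal sketch in Python. It assumes a simple parametric relationship (effort = a × size^b) with illustrative, uncalibrated coefficients, and compares a vague expert guess at system size against a tightly measured one:

    import random

    def simulate_effort(size_low, size_high, runs=10_000, a=3.0, b=1.12):
        """Monte Carlo: sample a size guess and push it through the model."""
        efforts = sorted(a * random.uniform(size_low, size_high) ** b
                         for _ in range(runs))  # person-months
        # Report the 10th-90th percentile spread of the resulting estimates.
        return efforts[runs // 10], efforts[9 * runs // 10]

    # Expert opinion: "somewhere between 50 and 400 KLOC"
    lo, hi = simulate_effort(50, 400)
    print(f"Guessed size  -> effort p10-p90: {lo:,.0f} to {hi:,.0f} person-months")

    # Automated measurement: 180 KLOC, plus or minus 10%
    lo, hi = simulate_effort(162, 198)
    print(f"Measured size -> effort p10-p90: {lo:,.0f} to {hi:,.0f} person-months")

The guessed input produces an effort range too wide to budget against; the measured input narrows it to something a planner can actually use.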

Shrinking the Cone of Uncertainty
Federal organizations would benefit greatly from automated software analysis and measurement systems that generate unbiased metrics of their applications. Injecting fact-based measures into the front end of an estimating process greatly narrows the Cone of Uncertainty.

The Cone of Uncertainty describes the evolution of uncertainty in a project. In the beginning, when little is known, estimates are subject to large uncertainty. As more information is learned, the uncertainty decreases.

Injecting an accurate calculation of a system's size greatly reduces that uncertainty. Supporting the size data with measures of the system's technical and functional complexity, and with an objective assessment of its underlying structural quality, reduces it further still. An estimate with little uncertainty - a high degree of confidence - is the foundation for accurately predicting a development team's productivity. This matters because planning and budgeting are ultimately exercises in deciding how to allocate resources and when new capabilities will be available to your clients: How many developers will I need? How long will I need them? When will they be finished?
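As a rough illustration of how a measured size feeds those answers, here is a sketch using COCOMO-style effort and schedule equations. The coefficients are nominal textbook values, not calibrated to any agency's history, so treat the numbers as placeholders:

    def plan(size_kloc, a=2.94, b=1.10, c=3.67, d=0.28):
        effort_pm = a * size_kloc ** b       # total effort, person-months
        duration_mo = c * effort_pm ** d     # calendar schedule, months
        team_size = effort_pm / duration_mo  # average full-time developers
        return effort_pm, duration_mo, team_size

    effort, months, devs = plan(180)  # 180 KLOC from automated measurement
    print(f"How many developers? ~{devs:.0f}")
    print(f"How long?            ~{months:.0f} months")
    print(f"Total effort:        ~{effort:.0f} person-months")

Shrink the uncertainty in the size input and the confidence intervals around all three answers shrink with it.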

Several sources, such as the Standish Group's CHAOS reports, document the IT industry's legacy of poor delivery, and there are many reasons why IT projects continue to fail.

We know that most IT budgets, both federal and commercial, are spent maintaining and supporting existing systems. Agencies that own these systems clearly suffer from a lack of visibility into their complexity, and without that information any planning and budgeting is handicapped. The IT-intensive programs that require the most planning to deliver systems on time and on budget would improve if we could shed some light into these systems and arm agencies with objective, fact-based insight.

Making the Invisible Visible
Through static code analysis, you can measure your application in real time, and gather unbiased metrics to share both internally and externally.

Because these metrics come straight from the product being managed and worked on, in real time, the data is consistent across all programs. This independent, unbiased data can then be used to support decisions about the ongoing management of the application.
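As a simple taste of what automated measurement looks like, the sketch below walks a source tree and gathers size and complexity counts using Python's standard ast module. A real program would rely on a dedicated analysis platform that covers every language in the portfolio; this just shows that the metrics come from the code itself, not from anyone's opinion:

    import ast
    from pathlib import Path

    # Branching constructs used as a rough proxy for cyclomatic complexity.
    BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try,
                    ast.With, ast.BoolOp, ast.ExceptHandler)

    def measure(root="."):
        totals = {"files": 0, "lines": 0, "functions": 0, "branches": 0}
        for path in Path(root).rglob("*.py"):
            source = path.read_text(errors="ignore")
            totals["files"] += 1
            totals["lines"] += len(source.splitlines())
            try:
                tree = ast.parse(source)
            except SyntaxError:
                continue  # skip files that don't parse
            for node in ast.walk(tree):
                if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                    totals["functions"] += 1
                elif isinstance(node, BRANCH_NODES):
                    totals["branches"] += 1
        return totals

    print(measure("."))

Run the same measurement on every program, on a schedule, and you get exactly the kind of consistent, independent baseline described above.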

When a racing team is tuning a car's engine, it doesn't ask the engineer what he thinks and run the race based solely on that opinion. It instruments the engine with sensors and monitors every metric it can capture. If your organization approaches software estimation the same way, you'll build a repository of useful data that shows how your IT infrastructure has evolved and what it will take to bring it to the next level.

Be sure to flip over to Dan Galorath's article on data-driven estimation for more information on this topic.

More Stories By Lev Lesokhin

Lev Lesokhin is responsible for CAST's market development, strategy, thought leadership and product marketing worldwide. He has a passion for making customers successful, building the ecosystem, and advancing the state of the art in business technology. Lev comes to CAST from SAP, where he was Director, Global SME Marketing. Prior to SAP, Lev was at the Corporate Executive Board as one of the leaders of the Applications Executive Council, where he worked with the heads of applications organizations at Fortune 1000 companies to identify best management practices.
