Hitachi Data Interactive: XBRL: Taking It to the Next Level

Written by Bob Schneider | Posted on March 12, 2009

I recently reviewed this blog’s content for the past few months and was not surprised that more than half was devoted to the SEC’s XBRL mandate. The final rule for interactive data has certainly been the big story, a major milestone in XBRL implementation.  But there have been other developments that, while not as historic as the final rule, are notable and significant. These include:   

  • The IBM Data Governance Council — which comprises more than 50 companies that have pioneered best practices around risk assessment and data governance — announced in December that it was exploring the use of XBRL for risk reporting. Steve Adler, Chairman of the Council, reported on the follow-up meeting in late February.
  • The Open Compliance and Ethics Group established a provisional XBRL jurisdiction in September.
  • In October, the World Intellectual Capital Initiative (WICI) published “…a comprehensive information framework and XBRL taxonomy to help companies improve communications with investors and other stakeholders about business strategy and performance.”

Thus we see significant advances for XBRL in key areas: risk reporting; the entire governance, risk, and compliance (GRC) field; and nonfinancial reporting, including Key Performance Indicators (KPIs).

At the same time, cognizance of XBRL is spreading from the narrow confines of specialists to the broader business and IT communities, as evidenced by this article in Wired. CFAI surveys have consistently recorded relatively low recognition of XBRL among its members, but I would expect the next report to show a significant jump in members’ awareness.

Following the December 2006 international XBRL conference, I wrote a post called The XBRL Moment. The meeting had been attended by the heads of arguably the three top accounting standards-setters, namely, the SEC, FASB, and IASB; the top guns at the AICPA, CFAI, and International Federation of Accountants came too. To me, the assembly of so many accounting and regulatory luminaries signified a special moment, a coming of age, for the obscure data standard with the highly forgettable acronym.      

I don’t think a single article in Wired compares. But, coupled with the other developments I’ve noted, I do sense that we’re entering a new phase for XBRL adoption that extends beyond external financial reporting to general business reporting. XBRL is, as heard every ten seconds on ESPN, taking it to the next level.

This transformation, of course, is most welcome. Nevertheless, XBRL faces the challenge common to all new technologies: managing expectations. I see some signs of XBRL being caught between the Scylla of “this changes nothing” and the Charybdis of “this changes everything.” The low-end extreme is represented in this post by veteran IR pro John Palizza, who argues:

Maybe I’m just a skeptic…but I don’t see what all the fuss is about with XBRL. It’s just rearranging the data that we already had, which to me seems like a lot of work for not much benefit. Better understanding and benefit come from working through the data.  Balanced against this we seem to have given companies yet another opportunity to spin the data.
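
For readers newly encountering the standard, it may help to see what that “rearranging” actually looks like. The sketch below is a rough, deliberately non-conformant illustration rather than a real filing: it tags a single hypothetical revenue figure with a concept, reporting period, entity identifier, and unit, using only Python’s standard library. The concept name, entity identifier, and us-gaap namespace URI here are placeholders; only the xbrli instance namespace is the genuine one.

```python
import xml.etree.ElementTree as ET

XBRLI = "http://www.xbrl.org/2003/instance"        # real xbrli instance namespace
USGAAP = "http://example.com/us-gaap-placeholder"  # placeholder, NOT the real us-gaap namespace
ET.register_namespace("xbrli", XBRLI)
ET.register_namespace("us-gaap", USGAAP)

root = ET.Element(f"{{{XBRLI}}}xbrl")

# Context: which entity and which reporting period the figure belongs to.
context = ET.SubElement(root, f"{{{XBRLI}}}context", id="FY2008")
entity = ET.SubElement(context, f"{{{XBRLI}}}entity")
identifier = ET.SubElement(entity, f"{{{XBRLI}}}identifier",
                           scheme="http://www.sec.gov/CIK")  # typical identifier scheme; assumption here
identifier.text = "0000000000"  # hypothetical filer identifier
period = ET.SubElement(context, f"{{{XBRLI}}}period")
ET.SubElement(period, f"{{{XBRLI}}}startDate").text = "2008-01-01"
ET.SubElement(period, f"{{{XBRLI}}}endDate").text = "2008-12-31"

# Unit: the figure is stated in a currency rather than, say, shares.
unit = ET.SubElement(root, f"{{{XBRLI}}}unit", id="usd")
ET.SubElement(unit, f"{{{XBRLI}}}measure").text = "iso4217:USD"

# The tagged fact: the number itself is unchanged, but it now carries a
# concept name, a context, and a unit that software can read without guessing.
fact = ET.SubElement(root, f"{{{USGAAP}}}Revenues",
                     contextRef="FY2008", unitRef="usd", decimals="0")
fact.text = "1000000"

print(ET.tostring(root, encoding="unicode"))
```

Whether that added structure amounts to “not much benefit” or to the foundation of genuinely machine-readable reporting is, of course, exactly the expectations question at issue.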

I’m reminded of that Comcast commercial where the guy who just got their service phones his brother; he thinks their calls will now ooze brotherly affection, but instead he finds they are as awkward as ever. I don’t see how XBRL can be expected to eliminate the natural inclination of management to put the best face on bad performance. In an XBRLized world, it’s still unlikely any CEO’s letter to shareholders in an annual report will begin “We lost a ton of money last year. Part of the reason was the lousy economy. But mostly it was our poorly executed strategy; studied indifference toward our customers and suppliers; and general all-around stupidity.”

At the other extreme is Mark Cuban’s statement “In fact, President Obama’s use or lack of use of XBRL for government will be a beacon as to just how much transparency we can expect from his administration.” A beacon? A useful indication, maybe. XBRL may provide a fillip to open government, but the best thing a President can do to aid transparency is to be honest and forthcoming in his own statements.

Coupled with the issue of realistic expectations is the question of realistic evaluation: How should we judge whether an XBRL project is a success? In a recent Tweet, Dominic Jones of IR Web Report asked “If XBRL is so great, why didn’t its use in bank Call Reports prevent the current banking crisis? Just asking.”

I put a similar question to Christian Dreyer in our interview in November. His reply:

The credit crisis has causes that are outside of the domain of financial reporting and reported numbers as per today’s reporting standards. XBRL is "just" an efficient vector for such data. I’d be very reluctant to use the crisis as an argument to promote XBRL, because the linkage is marginal at best. We’ll hopefully see improved reporting standards with more transparency, less Held-To-Maturity trickery, and a lot more fair value as a consequence of the crisis. At that point, we’ll see all that information using XBRL.

I think Christian’s reply is wise, and it is augmented powerfully by an answer Neal Hannon gave to a question in our interview: “Was there anything important [in the XBRL world] you believe was not reported or underreported [in 2008]?” He answered:

The biggest underreported story about XBRL in the US is the FDIC. During the recent financial crisis, the almost two years’ worth of quarterly data collected in the XBRL format has given the Treasury Department valuable insight into which financial institutions have suspect holdings. Secretary Paulson is in a much stronger position to make correct bailout allocation decisions because of the landmark work accomplished at the FDIC. I understand that since the crisis began, the FDIC has expanded the number of questions asked of over 10,000 U.S. banks each quarter. Kudos to the FDIC!

Neal’s response isn’t necessarily the last word on the matter. It can still be asked whether XBRL adoption at the FDIC for call reports should have been expected to do more to help avert the financial crisis, and whether the returns have been worth the cost.

But his answer suggests that, even if XBRL doesn’t result in persistently stable economies or totally transparent financial statements, it still can make important contributions toward those goals at reasonable expense. Instead of all-or-nothing-at-all expectations and questions — like the one I put to Christian — his reply points to a sensible, useful, and realistic approach to evaluating XBRL projects.   


