With all the attention given to Boyd’s Loop (a.k.a. Observe, Orient, Decide, Act, or OODA), the articles all seem to focus inward. Essentially, the loop focuses on the observer’s perspective and how they manage their own decision process. While it is useful for understanding your own decision loop and how to refine it, there is a better use that many are overlooking. The OODA loop can be used to represent both sides of any conflict or competition, enabling further mastery of the situation.
Consider this simple dual loop model based upon my Prime Business Decision Loop. Your outwardly facing Actions feed the Acquisition of information by external parties. Likewise, you Acquire information about the other party as they take Action in response. This is the external, or shared, perspective. Both parties have access to the same information or observations regarding external actions. Neither party has insight into the history or knowledge the other party Absorbed over a lifetime. Neither party has insight into the other’s Analysis process. Each party can only Acquire information about the Actions of the other, which occur in a shared public space.
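The dual loop can be sketched in code. This is a minimal illustration, not an implementation of the Prime Business Decision Loop itself; the class and method names are my own shorthand for Absorb, Acquire, Analyze, and Act.

```python
# A minimal sketch of the dual decision-loop model: each party's
# absorbed knowledge and analysis are private; only Actions enter
# the shared public space where the other party can Acquire them.

class Party:
    def __init__(self, name):
        self.name = name
        self.absorbed = []          # private: observations accumulated over time

    def acquire(self, observed_action):
        """Acquire information from the shared space (the other party's Action)."""
        self.absorbed.append(observed_action)

    def analyze_and_act(self):
        """Analysis is private; only the resulting Action is externally visible."""
        return f"{self.name} acts on {len(self.absorbed)} observations"

# Shared public space: each side sees only the other's Actions, never
# the other's absorbed history or analysis.
a, b = Party("A"), Party("B")
for _ in range(3):
    b.acquire(a.analyze_and_act())   # B observes A's action
    a.acquire(b.analyze_and_act())   # A observes B's action
```

Note that each `Party` holds its own `absorbed` list, which the other never reads directly; the only coupling between the two loops is the exchanged action strings.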
Undeniably, Big Data has taken off in the marketplace. It promises great value to those who master the management of the content embedded in all that data. The new value is accompanied by new risks as well. These risks are associated with the attributes of Big Data that set it apart from traditional data systems: variety, volume, veracity, and velocity.
In a previous blog entry, I asked readers whether their risk management scaled with their systems. Today I’m asking whether your risk controls are as fast as your Big Data streams and their associated decision algorithms. Put another way, has your capability to identify and react to negative trends driven by your automated decision systems kept pace with the systems themselves?
When considering the new risks that Big Data introduces, velocity stands out as carrying the most inherent risk and the greatest need for governance to mitigate it. We have created automated decision engines that act upon streaming data with incredible speed. What has not kept pace is the forethought to ensure reactions can occur just as fast.
Most systems today have an embedded Information Hysteresis Loop (act-react) that allows sufficient reaction time to respond to the unexpected. The 24-hour batch cycle gives human operators time to examine, analyze, and sometimes correct bad automated decisions. Streaming real-time decision engines do not provide that luxury. That is the new risk that needs to be managed and controlled.
Consider the case of “flash crash” events in the financial markets. Automated selling rules acted faster than human decision makers could respond. Since those events, circuit breakers have been implemented to mitigate the risk of fast automated selling. They are designed to react as fast as the decision engines controlling the selling rules.
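The core of such a breaker is simple enough to sketch. The threshold below is an illustrative assumption, not an actual exchange rule, and a real breaker would also manage halt durations and reference-price resets.

```python
# A hedged sketch of a market circuit breaker: halt trading when the
# price falls more than a threshold fraction from a reference price,
# before automated selling rules can cascade further.

def breaker_tripped(reference_price, current_price, threshold=0.07):
    """Return True if trading should halt (drop from reference exceeds threshold)."""
    drop = (reference_price - current_price) / reference_price
    return drop >= threshold

# An 8% drop against a hypothetical 7% threshold trips the breaker;
# a 5% drop does not.
halted = breaker_tripped(reference_price=100.0, current_price=92.0)
still_trading = not breaker_tripped(reference_price=100.0, current_price=95.0)
```

The essential design point is that this check runs inside the same fast loop as the selling rules, so the brake operates at machine speed rather than human speed.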
The concept of Big Data Governance controls for velocity is not unlike negative feedback and automatic gain control in electronic circuits. They are needed to prevent runaway processes, like the familiar squeal in audio systems. In the case of Big Data for complex event processing, it is important to put automated safety rules in place that limit the amount of spending, risk exposure, or resources an automated system can commit within a certain period of time. The safety rules need to be part of the design criteria and functional specifications. Otherwise your system will be running with a great accelerator and no brakes!
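Such a safety rule can be sketched as a rolling-window commitment limiter. The interface below is hypothetical; a production version would persist state, handle concurrency, and use a monotonic clock rather than caller-supplied timestamps.

```python
# A minimal sketch of an automated safety rule that caps how much an
# automated system may commit (spend, risk, resources) within a
# rolling time window -- the "brakes" for a fast decision engine.

from collections import deque

class CommitmentLimiter:
    def __init__(self, max_commit, window_seconds):
        self.max_commit = max_commit
        self.window = window_seconds
        self.events = deque()       # (timestamp, amount) pairs, oldest first

    def try_commit(self, amount, now):
        """Allow the commitment only if the rolling-window total stays under the cap."""
        # Drop events that have aged out of the window
        while self.events and now - self.events[0][0] > self.window:
            self.events.popleft()
        spent = sum(amt for _, amt in self.events)
        if spent + amount > self.max_commit:
            return False            # brake: refuse the commitment
        self.events.append((now, amount))
        return True

limiter = CommitmentLimiter(max_commit=1000, window_seconds=60)
ok1 = limiter.try_commit(600, now=0.0)    # allowed: 600 <= 1000
ok2 = limiter.try_commit(600, now=10.0)   # refused: 600 + 600 > 1000
ok3 = limiter.try_commit(300, now=20.0)   # allowed: 600 + 300 <= 1000
```

Because the check is a constant-time gate in front of each commitment, it can keep pace with a streaming decision engine rather than waiting on a batch-cycle review.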
A recent Associated Press headline touted that a study warns the US must develop cyber intelligence. Because Information Security is one of the pillars of Information Governance, the article should raise alarm bells for some businesses. Many businesses have been so intent on containing costs that, in the rush to outsource more and more of their data management, they have unwittingly exposed themselves to increased risks. They rely upon blind trust that the security offered by the vendor will be sufficient.
Funny thing about large data systems: everyone knows the benefits scale due to the application of technology, but many companies turn a blind eye to the fact that so do the risks. Complacency and arrogance (C&A) team up to teach us occasional lessons, and if the screw-up is grand enough, it may even get passed on in college texts.
In the course of a great conversation on Data Governance with Max Gano of Stakeholder Care, we began discussing the application of controls. Max had some great thinking on how they fit into the realm of Data Governance. The conversation reminded me of a RACI Waterfall that I had created a few years back. The RACI was created to show the chain of Responsibility and Accountability with regard to Data Governance. It also served to explain the importance of documenting the intention of controls and linking them to higher authority.
Governance involves the delegation of Responsibility and Accountability as shown in this RACI Waterfall. Governance is only effective when the decision rights have been granted by higher authority and are enforced by those who are responsible and accountable. Governance succeeds only when the chain of responsibility and accountability is unbroken, and the expectations are documented and published as shown below.
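The “unbroken chain” test can be made concrete. The sketch below is illustrative only; the role names are hypothetical examples, not the roles from my original RACI Waterfall.

```python
# An illustrative check of the RACI Waterfall idea: governance holds
# only if every role's authority was granted by the level above it
# and its expectations are documented.

delegation_chain = [
    {"role": "Board",        "granted_by": None,    "documented": True},
    {"role": "CEO",          "granted_by": "Board", "documented": True},
    {"role": "CDO",          "granted_by": "CEO",   "documented": True},
    {"role": "Data Steward", "granted_by": "CDO",   "documented": True},
]

def chain_is_unbroken(chain):
    """Return True if each link is granted by its predecessor and documented."""
    for prev, link in zip(chain, chain[1:]):
        if link["granted_by"] != prev["role"] or not link["documented"]:
            return False
    return chain[0]["documented"]

intact = chain_is_unbroken(delegation_chain)
```

A single missing grant or undocumented expectation anywhere in the list breaks the chain, which mirrors how one disengaged leadership level undermines every control beneath it.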
It is a simple concept really, but often overlooked when creating a Data Governance program. That is why many governance programs fail: either they have not obtained the authority to govern from the senior leadership team, or the senior leadership fails to provide the appropriate backing. When the chain of responsibility and accountability is interrupted, the effectiveness of controls and standards is undermined.
This can be avoided by ensuring that the leadership team stays engaged and supportive, that expectations are documented, and that compliance is monitored. These need to be non-negotiable elements of your program.
Recently an associate commented that perhaps we could better gain buy-in by answering the question “why now?” Adding the dimension of time was an interesting twist on spelling out the value proposition. After all, many business operations have been running fine for years with only ad-hoc data governance processes. What would be the risk of delaying a Data Governance implementation for one more year?
Over the last several years, there has been quite a bit of discussion regarding the distinction between data ownership and data stewardship. Stewardship commonly involves the daily, routine caretaking of all aspects of data systems. The roles are highly distributed. Ownership, on the other hand, is more concentrated. If you want to find the data owners, discover whose head would be on a pike following a major data-related disaster such as a financial misstatement or material loss.