
Doing Analytics & Being Analytical – Find the difference
6 May 2016

There was a time when organizations, small and big, believed that more data meant more value. Businesses wanted more information about their customers, their products and their markets, primarily so they could integrate everything and make data-driven strategic decisions. Today that notion has multiplied tenfold, and organizations are looking for an in-depth understanding of every area.

But more data does not always mean more trust. Many Big Data gurus will tell you that this is unsettling terrain that should be treated cautiously, and in practice the three Vs of data (Volume, Variety and Velocity) have been weighed down by the uncertainty of a fourth: Veracity. Analyzed correctly, Big Data can make a small company big, but only with the right approach to the data, and that starts with understanding whether the data is structured or unstructured.

In the traditional approach, data was governed as it was discovered in the enterprise. This followed the tried-and-tested Waterfall formula: data was fetched first, then governed through the phases of conception, initiation, analysis and implementation. Everything was stored in a central repository before it was used, which made the process very cumbersome. By the time organizations could make any sense of the data, it was already past its expiry date in terms of usefulness. Another problem with this approach was that a large central repository is very expensive; the cost burden was so heavy that organizations were often unable even to break even.
Apart from this, if the data to be processed was larger than the central repository, the organization either had to buy new servers or push the data in batches, which caused further processing delays and did not guarantee the accuracy of the data. The process moved at a snail's pace, and all data was automatically elevated to the highest level of governance.

Goal & Importance of Data Governance

The goal of data governance is to ensure the quality, availability, integrity, security and usability of data within the organization. The command-and-control approaches evident in traditional governance strategies have largely never worked for Big Data management. The major areas that Big Data governance needs to address are mainly:
- Big Data as an Asset
- Formalized Accountability
- Consistency in Application
- Metadata to Improve Value
- Management of Data Production
- Management of Data Usage
A better approach to Big Data management is agile data governance. Its main goal is to enable teams to maintain and develop high-quality data assets that can be streamlined at the initial stage of processing. This means profiling the data, understanding what it will be used for, and then determining the required level of governance. Reducing redundancy at this early stage makes the process much faster and ensures that only meaningful data is passed on for governance.
To make this process more refined, enterprises can set up clearance levels at each stage. These stages can primarily be defined at the executive, strategic, tactical and operational levels.
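As a minimal sketch of the profiling step described above, the snippet below assigns one of the four clearance levels to an incoming data set instead of elevating everything to the top, as the waterfall approach did. The profile fields and the routing rules are illustrative assumptions, not part of the original article.

```python
from dataclasses import dataclass
from enum import Enum


class GovernanceLevel(Enum):
    """The four clearance stages described above."""
    EXECUTIVE = 4
    STRATEGIC = 3
    TACTICAL = 2
    OPERATIONAL = 1


@dataclass
class DataProfile:
    """Minimal profile of an incoming data set (hypothetical fields)."""
    contains_pii: bool        # personally identifiable information
    is_structured: bool       # structured vs unstructured data
    business_critical: bool   # feeds strategic decision making


def required_governance(profile: DataProfile) -> GovernanceLevel:
    """Route a data set to the lowest clearance level that fits its
    profile, so only sensitive or critical data climbs the hierarchy."""
    if profile.contains_pii:
        return GovernanceLevel.EXECUTIVE
    if profile.business_critical:
        return GovernanceLevel.STRATEGIC
    if not profile.is_structured:
        return GovernanceLevel.TACTICAL
    return GovernanceLevel.OPERATIONAL
```

For example, a structured, non-critical data set without personal information would stay at the operational level, while anything with PII would be escalated to the executive level.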
The functional areas at these stages have to govern the:
- Definition of Big Data of the client
- Production of Big Data of the client
- Usage of Big Data of the client

The Definition of the Big Data includes the availability of the Big Data, infrastructure requirements, the format of the Big Data, accountability for defining it, and defining its storage. All of this has to be handled at the executive level and sorted out in the initial discussions with the client.
The Production of the Big Data includes how the Big Data is produced, where it comes from, the quality of its production, and storing information about that production. Accountability for this lies at the strategic and tactical levels, which comprise mostly Directors, VPs and senior management.
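The accountability structure described here can be sketched as a simple lookup table. The mapping below is a hedged illustration of the article's assignments (names are hypothetical, not from any real framework): definition is owned at the executive level, production at the strategic and tactical levels, and usage at the operational level.

```python
# Hypothetical mapping of the three functional areas of Big Data
# governance to the organizational levels accountable for them.
ACCOUNTABILITY = {
    "definition": ["executive"],
    "production": ["strategic", "tactical"],
    "usage": ["operational"],
}


def accountable_levels(area: str) -> list:
    """Return the levels accountable for governing a functional area,
    raising a clear error for areas outside the governance model."""
    try:
        return ACCOUNTABILITY[area]
    except KeyError:
        raise ValueError(f"unknown functional area: {area}") from None
```

In practice such a table would be the starting point for routing governance tasks, escalations and sign-offs to the right roles.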
The Usage of the Big Data includes how the Big Data is used, as well as input on storing information about how the client can utilize it optimally. Accountability for this lies mainly at the operational level, which consists of managers, data definers and individual contributors.

Conclusion

The time has come for enterprises to realize the importance of data governance, without which there can be numerous security concerns. In addition to the data veracity problem, Big Data poses huge security risks that can be catastrophic. A comprehensive data governance solution ensures that organizations protect not only their information assets but also their customers. Governed data is reliable, secure and ready to use, while ungoverned data from an ungoverned terrain has little value for Big Data analytics and operations. The total amount of information available in today's digital world is on the order of 2 trillion gigabytes. It is up to enterprises to truly leverage this opportunity and improve their decision making.