While the merits of cloud and big data have been widely accepted, adopting these in the government sector comes with its own set of challenges. Can it be the harbinger for better e-governance?
Suchitra Pyarelal | January 5, 2015
Electronic government holds the promise of speedier, more efficient and more effective governance. Better governance would mean better delivery of services, more attuned to citizens' expectations. Effective and efficient front-end service delivery calls for coordinated back-end service integration involving a great deal of data and information sharing. A high degree of collaboration and unification of data at the back-end therefore needs to be established. This leads us to the area of data and data management. When considering data emanating from within the government structure (government data), the question that immediately comes to mind relates to the volume of data and how it will be stored.
Security is another key concern. Who will be the custodian of the data, maintain its security, and ensure that the data is trusted and free of errors?
Besides, there are other pressing concerns that need to be addressed. What will happen when the data grows in size over time? This is an often-asked question. Then there is the issue of unifying and integrating data with existing and new systems in a way that it can be analysed in a meaningful format. It is important that these key issues are addressed when dealing with government data and its storage, so that we can gain valuable insights from it.
In the traditional data storage and management model, the data centre was the initial step taken by the government to provide scalable infrastructure for storing and managing data. Under this scheme the data centres effectively provided infrastructure as a service (IaaS) by unifying the server infrastructure. Hardware and software virtualisation were introduced as optimisation techniques.
Relational databases with traditional network-attached storage (NAS) and storage-area network (SAN) technologies were deployed for handling structured data. Data warehouse and data mining technologies, with extract, transform and load (ETL) tools, were used for consolidating information from different sources.
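The ETL pattern described above can be sketched in a few lines. The snippet below is illustrative only: the two departmental CSV feeds, their column names and the citizen IDs are hypothetical, and it uses an in-memory SQLite database to stand in for a departmental relational store.

```python
import csv
import io
import sqlite3

# Hypothetical feeds from two departments, using different column
# names for the same citizen identifier.
HEALTH_CSV = "citizen_id,visits\nC001,3\nC002,1\n"
TAX_CSV = "id,tax_paid\nC001,1200\nC002,800\n"

def extract(text):
    """Extract: parse a CSV feed into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows, id_field):
    """Transform: normalise the identifier column to 'citizen_id'."""
    return [{**{"citizen_id": r.pop(id_field)}, **r} for r in rows]

def load(conn, table, rows):
    """Load: create the table and insert the normalised rows."""
    cols = list(rows[0])
    conn.execute(f"CREATE TABLE {table} ({', '.join(cols)})")
    conn.executemany(
        f"INSERT INTO {table} VALUES ({', '.join('?' * len(cols))})",
        [tuple(r[c] for c in cols) for r in rows],
    )

conn = sqlite3.connect(":memory:")
load(conn, "health", transform(extract(HEALTH_CSV), "citizen_id"))
load(conn, "tax", transform(extract(TAX_CSV), "id"))

# Once both feeds share a common key, they can be joined for analysis.
merged = conn.execute(
    "SELECT h.citizen_id, h.visits, t.tax_paid "
    "FROM health h JOIN tax t ON h.citizen_id = t.citizen_id"
).fetchall()
```

The essential point is the normalisation step: only after each source's identifier is mapped to a common key can records from different departments be joined meaningfully.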
The cloud way
The government's increasing thrust on integrated citizen service delivery brought forth the need for inter-departmental information sharing, service collaboration and application interconnection. "Cloud computing" became the model for addressing these needs, providing the technology for integrating the segmented information and communication technology (ICT) infrastructure into one single logical structure.
Data residing across different sources and locations can converge and be stored in the cloud. The devices used in cloud computing are cost effective and energy efficient. Optimal resource sharing is made possible and the scalable cloud architecture allows high computing and large data storage.
As applications evolve into "software as a service" (SaaS) and diverse platforms are unified into "platform as a service" (PaaS), the move to the cloud model has become a necessity, as it provides the underlying architecture for both.
Recent initiatives of the government, such as MeghRaj (the GI Cloud), open data and the vision of the Digital India programme, mark the move towards taking data and its storage to the next level. The questions of who will maintain the data and ensure its security, and of what will happen when the data grows in size over time, are assumed to be in the process of being addressed through these initiatives. The remaining issues, however, require diving into the core problem of data management.
What about big data?
Big data refers to data sets so large and complex that traditional data management techniques cannot store and process them. It has three main features: volume (the sheer amount of data), velocity (the speed of data creation) and variety (the diversity of sources). A fourth characteristic that is also being discussed is value: the worth the data can generate through predictive analytics and forecasting.
The adoption of big data and its analysis has proven its merit in many industry and business sectors; the competitive advantage gained through big data analysis is now undisputed. With the rise of unstructured data, the tools and technologies have also moved beyond traditional database management systems.
It is important at this juncture to look into government sector big data, which, being large and heterogeneous by nature, easily fits into this category. The three features describing big data, namely volume, velocity and variety, when extended to government data translate to sheer volume, high velocity (driven by real-time transactions) and wide variety.
As government big data assumes more complexity and heterogeneity, with massive amounts of both structured and unstructured data, subjecting it to existing traditional data management strategies may not be sufficient for analysis. Applying the right analytic and pattern recognition tools to the data can yield the results and value the government desires.
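As a small illustration of the kind of pattern recognition meant here, the sketch below flags districts whose daily service-transaction counts deviate strongly from the norm. The district names and figures are entirely hypothetical; a real deployment would run such checks at scale with dedicated analytics tooling.

```python
import statistics

# Hypothetical daily service-transaction counts per district.
daily_transactions = {
    "District A": 1050,
    "District B": 980,
    "District C": 4100,
    "District D": 1010,
    "District E": 990,
}

mean = statistics.mean(daily_transactions.values())
stdev = statistics.stdev(daily_transactions.values())

# Flag any district more than one standard deviation above the mean;
# such spikes may indicate fraud, data errors, or a surge in demand.
outliers = [
    district
    for district, count in daily_transactions.items()
    if count > mean + stdev
]
```

Even this trivial statistical screen shows the principle: once departmental data is unified, simple aggregate patterns that were invisible within any single silo become easy to detect.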
Big data & the cloud
Big data in the government must first be made conducive to exploitation for value. The potential sources of data for decision making and forecasting may reside on different platforms and in different locations, and they first need to be brought to a common platform by sourcing from these disparate systems. Cloud provides this environment. The quality of the data itself will have to be checked for errors and duplication. Although the merits of cloud and big data have been widely accepted, they are not without challenges: bringing the data onto the cloud also means that the required assurance of security must be met. A holistic approach needs to be adopted for realising the potential benefits of having "big government data in the cloud".
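The data-quality step mentioned above, checking for errors and duplication before data moves to a shared platform, can be sketched minimally as follows. The records and field names are hypothetical, and a real pipeline would use far richer validation rules.

```python
# Hypothetical citizen records arriving from a departmental source.
records = [
    {"citizen_id": "C001", "name": "Asha", "district": "North"},
    {"citizen_id": "C001", "name": "Asha", "district": "North"},  # duplicate
    {"citizen_id": "C002", "name": "", "district": "South"},      # missing name
    {"citizen_id": "C003", "name": "Ravi", "district": "East"},
]

seen, clean, errors = set(), [], []
for rec in records:
    key = rec["citizen_id"]
    if key in seen:
        continue  # drop records whose key was already accepted
    seen.add(key)
    if not all(rec.values()):
        errors.append(rec)  # flag incomplete records for review
    else:
        clean.append(rec)
```

Separating the flagged records from the clean set, rather than silently dropping them, preserves an audit trail: a point that matters more in government data than almost anywhere else.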
In e-governance, the need for the government to convert its data into an intelligent form for decision making is now more important than ever. The challenge for the government is to deliver better and more effective governance while optimising costs. Cloud and big data are two enabling technologies, with cloud providing the right platform for harnessing the big data available. The right blend of the two can become the game changer in e-governance service delivery in India.