While the merits of cloud and big data have been widely accepted, adopting these in the government sector comes with its own set of challenges. Can it be the harbinger for better e-governance?
Suchitra Pyarelal | January 5, 2015
Electronic government holds the promise of speedier, more efficient and effective governance. Better governance would mean better delivery of services, more attuned to citizens' expectations. Effective and efficient front-end service delivery calls for coordinated back-end service integration involving a great deal of data and information sharing. A high degree of collaboration and unification of data at the back-end therefore needs to be established. This leads to the area of data and data management. When considering the data emanating within the government structure (the government data), the question that immediately comes to mind relates to the volume of data and how it will be stored.
Security is another key concern. Who will be the custodian of the data, maintain its security and ensure that the data is trusted and not erroneous?
Besides, there are other pressing concerns that need to be addressed. What will happen when the data grows in size over time? This is an often-asked question. Then there is the issue of unifying and integrating data with existing and new systems so that it can be analysed in a meaningful format. It is important that these key issues are addressed when dealing with government data and its storage, so that we can gain valuable insights from it.
In the traditional data storage and management model, the data centre was the initial step taken by the government to provide scalable infrastructure for storing and managing data. Under this scheme, data centres effectively provided infrastructure as a service (IaaS) by unifying the server infrastructure. Hardware and software virtualisation were introduced as optimisation techniques.
Relational databases with traditional network-attached storage (NAS) and storage-area network (SAN) technologies were deployed for handling structured data. Data warehousing and mining technologies, with extract, transform and load (ETL) tools, were used for bringing together information from different sources.
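The ETL flow mentioned above can be sketched minimally. This is only an illustration of the pattern, not any government system: the source names, record fields and toy data are all hypothetical, and an in-memory SQLite database stands in for the warehouse.

```python
# Minimal illustrative ETL sketch: pull records from two hypothetical
# departmental sources with differing schemas, normalise them into one
# common format, and load them into a single (in-memory) store.
import sqlite3

def extract():
    # Stand-ins for disparate departmental sources (hypothetical data)
    source_a = [{"citizen_id": "C1", "name": "Asha", "dept": "health"}]
    source_b = [{"id": "C2", "full_name": "Ravi", "dept": "transport"}]
    return source_a, source_b

def transform(source_a, source_b):
    # Unify the two schemas into one common (id, name, dept) record shape
    rows = [(r["citizen_id"], r["name"], r["dept"]) for r in source_a]
    rows += [(r["id"], r["full_name"], r["dept"]) for r in source_b]
    return rows

def load(rows):
    # Load the unified rows into one shared store (SQLite as a stand-in)
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE citizens (id TEXT, name TEXT, dept TEXT)")
    con.executemany("INSERT INTO citizens VALUES (?, ?, ?)", rows)
    return con

con = load(transform(*extract()))
print(con.execute("SELECT COUNT(*) FROM citizens").fetchone()[0])  # both records now queryable in one place
```

In practice the "transform" step is where data quality checks, de-duplication and format conversions happen before anything reaches the common store.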
The cloud way
The government's increasing thrust on integrated citizen service delivery brought forth the need for inter-departmental information sharing, service collaboration and application interconnection. Cloud computing became the model for addressing these needs: it provided the technology for integrating the segmented information and communication technology (ICT) infrastructure into one single logical structure.
Data residing across different sources and locations can converge and be stored in the cloud. The devices used in cloud computing are cost effective and energy efficient. Optimal resource sharing is made possible and the scalable cloud architecture allows high computing and large data storage.
As applications evolve into "software as a service" (SaaS) and diverse platforms are unified into "platform as a service" (PaaS), the move to the cloud model has become a necessity, as it provides the underlying architecture for both.
Recent initiatives of the government through MeghRaj, open data and the vision of the Digital India programme mark the move towards taking data and its storage to the next level. The questions of who will maintain the data and ensure its security, and of what will happen when the data grows in size over time, are expected to be addressed through these initiatives. The remaining issues, however, require diving into the core problem of data management.
What about big data?
Big data refers to data so large and complex that traditional data management techniques cannot store and process it. It has three main features: volume (the sheer amount of data), velocity (the speed at which data is created), and variety (the diversity of its sources). A fourth characteristic increasingly being discussed is value: the worth the data can generate through predictive analytics and forecasting.
The adoption of big data and its analysis has proven its merit in many industry and business sectors; the competitive advantage gained through big data analysis is now undisputed. With the rise of unstructured data, the tools and technologies have also moved beyond traditional database management systems.
It is important at this juncture to look at government sector data, which by its nature is large in size and heterogeneity and easily fits into this category. The three features describing big data, when extended to government data, translate to sheer volume, high velocity (driven by real-time transactions) and wide variety.
As government big data grows in complexity and heterogeneity, with massive amounts of both structured and unstructured data, existing traditional data management strategies may no longer suffice for its analysis. Applying the right analytics and pattern recognition tools to the data, however, can yield the results and value the government desires.
Big data & the cloud
Big data in the government must first be made amenable to exploitation for value. The potential sources of data for decision making and forecasting may reside on different platforms and in different locations, and they first need to be brought to a common platform by sourcing from these disparate systems. The cloud provides this environment. The quality of the data itself will have to be checked for errors and duplication. Although the merits of cloud and big data have been widely accepted, they are not without challenges: bringing the data onto the cloud also means ensuring that the required security assurances are met. A holistic approach needs to be adopted for realising the potential benefits of having the "big government data in the cloud".
In e-governance, the need for the government to convert its data into actionable intelligence for decision making is now more important than ever. The challenge for the government is to deliver better and more effective governance while optimising costs. Cloud and big data are two enabling technologies, with the cloud providing the right platform for harnessing the big data available. The right blend of the two can become the game changer in e-governance service delivery in India.