government

Jun 03, 2014
 

“Evidence-based policy development” is a term that has been in use for a number of years, yet in most cases it remains more an aspiration than a practical reality. Awareness of the value of using data to underpin policy has risen, but the change in practice has been modest. The question is whether agencies are using all the internal and external data at their disposal, exploring that data to gain valuable insights, using predictive modelling to determine how different options play out over the life of a program, and balancing budgets against optimal outcomes.

Typically, the basis of policy proposals consists of historical data, analytical models from other agencies (such as treasuries), accepted wisdom and professional opinion. While there may be valid reasons for ministers to sometimes take a path different from what the data might suggest, it is even more important in those circumstances to model the likely outcomes in order to protect the government from unintended consequences.
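As a purely illustrative sketch of what “modelling the likely outcomes” can look like, the following compares two hypothetical policy options over the life of a program with a simple Monte Carlo simulation. The option names, costs, uptake rates and volatility figures are all invented for illustration; a real exercise would draw on an agency’s own data and models.

    # A minimal, hypothetical sketch of predictive modelling for policy options:
    # simulate each option's cost and reach over the life of a program many times,
    # then compare the distributions rather than single point estimates.
    # All parameters below are invented for illustration.
    import random
    import statistics

    def simulate_option(annual_cost, cost_growth_sd, uptake_rate, years=5, runs=10_000):
        """Simulate one policy option: return (total cost, total people reached) per run."""
        results = []
        for _ in range(runs):
            cost = reach = 0.0
            for _ in range(years):
                # Costs drift from year to year; uptake varies around its expected rate.
                cost += annual_cost * (1 + random.gauss(0, cost_growth_sd))
                reach += 100_000 * max(random.gauss(uptake_rate, 0.05), 0)
            results.append((cost, reach))
        return results

    # Two hypothetical options: A is cheaper but less certain, B costs more but reaches more people.
    option_a = simulate_option(annual_cost=50e6, cost_growth_sd=0.10, uptake_rate=0.60)
    option_b = simulate_option(annual_cost=65e6, cost_growth_sd=0.05, uptake_rate=0.75)

    for name, runs in (("Option A", option_a), ("Option B", option_b)):
        costs = sorted(c for c, _ in runs)
        reach = [r for _, r in runs]
        print(f"{name}: median cost ${statistics.median(costs)/1e6:.0f}m, "
              f"95th percentile cost ${costs[int(0.95 * len(costs))]/1e6:.0f}m, "
              f"median reach {statistics.median(reach):,.0f} people")

Even a toy model of this kind makes the trade-offs explicit: decision makers see a range of likely costs and outcomes for each option rather than a single optimistic estimate.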

New generations of policy developers are likely to be far more receptive to “scientific” approaches to policy development; the imperative is to give them access to all relevant data and the tools to explore it. Another consideration is that the growing quantity of accessible data (often enabled by government ‘open data’ policies) potentially exposes governments to challenges from opposition parties, individuals, academics and advocacy groups who, based on their own analysis of the available data, may question the declared expected outcomes of policies and programs. It is therefore important for ministers and their agencies to be able to demonstrate the strong factual foundation of their policies and programs. Importantly, the inflection point is now, since all of this is already possible; the question is whether the will exists to fundamentally change the current approach to policy development.

Pro-active Compliance Management refers to a very significant shift in the approach to fraud, waste, error and debt management in the public sector. The current after-the-fact management of these issues – where money is allowed to “leak out of the system” through a “pay now, check later” approach and agencies then spend additional funds and resources to recover some of that money – is shifting to a proactive, preventative approach. Commercial organisations such as banks cannot afford the high cost of this “double whammy”, and there is a growing recognition that governments cannot either.

The drive from governments to return budgets to surplus has heightened this consciousness. The move to online systems means that analytics can be applied to transactions and inputs to verify their validity in real time, so fraud and error can be trapped at the point of interaction and effectively prevented – much as financial institutions already do at the point of transaction.
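By way of illustration only, the sketch below shows the shape of point-of-interaction scoring: each claim is scored before payment is released, and anything above a risk threshold is held for review rather than paid now and chased later. The claim fields, scoring logic and threshold are invented placeholders, not any agency’s actual rules or any product’s behaviour; in practice the score would come from a trained model.

    # A minimal sketch of "check first, pay later": score each claim at the point
    # of interaction and hold anything suspicious before money leaves the system.
    # The fields, scoring logic and threshold are placeholders, not real rules.
    from dataclasses import dataclass

    @dataclass
    class Claim:
        claimant_id: str
        amount: float
        claims_last_30_days: int
        declared_income: float

    def risk_score(claim: Claim) -> float:
        """Toy risk score in [0, 1]; a production system would use a trained model."""
        score = 0.0
        if claim.amount > 5_000:
            score += 0.4
        if claim.claims_last_30_days > 3:
            score += 0.3
        if claim.declared_income == 0 and claim.amount > 1_000:
            score += 0.3
        return min(score, 1.0)

    def process(claim: Claim, hold_threshold: float = 0.6) -> str:
        """Decide, in real time, whether to pay or route to review before payment."""
        return "HOLD_FOR_REVIEW" if risk_score(claim) >= hold_threshold else "PAY"

    print(process(Claim("C-1001", amount=8_200, claims_last_30_days=5, declared_income=0.0)))
    print(process(Claim("C-1002", amount=350, claims_last_30_days=1, declared_income=42_000.0)))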

An important point in this context is that analytical approaches to these areas are vastly superior to business rules-based approaches. The latter create rules based on an identified activity that has repeated over time; however, perpetrators are just as quick to change their behaviour to circumvent detection, and in the meantime considerable financial loss or debt has already occurred. By contrast, analytical approaches focus on detecting anomalies that suggest error or malfeasance: even where the activity is not a known or repeated pattern, it is flagged because it falls outside normal or expected behaviour. In the sphere of social services, a proactive approach also translates into better social outcomes. For instance, if the vast majority of welfare recipients are entitled to payments because of economic need, then the recovery of over-payments only exacerbates their economic hardship; it is better to have prevented the overpayment in the first place.
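To make the contrast concrete, here is a minimal sketch comparing a fixed business rule with a simple statistical anomaly check against each provider’s own billing history. The providers, amounts, rule and z-score cutoff are all invented; real programs would use far richer models. The pattern, though, is the point: the rule misses a new spike that sits under its threshold and flags a consistently high (possibly legitimate) biller, while the anomaly check does the opposite.

    # Contrast between a fixed business rule and a simple anomaly flag.
    # The rule only fires on a pattern someone has already seen; the anomaly
    # check flags anything far outside a provider's own historical behaviour.
    # Data and thresholds are invented for illustration.
    import statistics

    history = {  # weekly billing amounts per provider (hypothetical)
        "P-01": [900, 950, 1_020, 880, 940],
        "P-02": [900, 950, 1_020, 880, 9_400],             # sudden spike
        "P-03": [14_000, 14_500, 13_800, 14_200, 14_100],  # high but consistent
    }

    def rule_flag(amount: float) -> bool:
        """Business rule written after past cases: flag any bill over $10,000."""
        return amount > 10_000

    def anomaly_flag(amounts: list, z_cutoff: float = 3.0) -> bool:
        """Flag the latest bill if it sits far outside the provider's own history."""
        *past, latest = amounts
        mean, sd = statistics.mean(past), statistics.stdev(past)
        return sd > 0 and abs(latest - mean) / sd > z_cutoff

    for provider, amounts in history.items():
        print(provider,
              "rule:", rule_flag(amounts[-1]),
              "anomaly:", anomaly_flag(amounts))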

To recover the current debt carried by governments – which for the Federal Government alone runs to tens of billions of dollars – analytical approaches allow treatment strategies to be differentiated to optimise recovery, so that agencies recover the most money in the shortest time from the individuals and entities best placed to pay off their debts.
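A minimal sketch of what differentiated treatment can look like follows: debtors are ranked by expected recovery (debt owed multiplied by a modelled propensity to pay) and routed to a treatment matched to that ranking. The debt amounts, propensity scores and treatments below are invented for illustration; in practice the propensity score would come from a predictive model.

    # Differentiated debt recovery: rank by expected recovery and match the
    # treatment to the ranking. Scores and treatments are placeholders.
    debtors = [
        {"id": "D-01", "debt": 12_000, "propensity_to_pay": 0.85},
        {"id": "D-02", "debt": 40_000, "propensity_to_pay": 0.10},
        {"id": "D-03", "debt": 6_500,  "propensity_to_pay": 0.60},
        {"id": "D-04", "debt": 25_000, "propensity_to_pay": 0.55},
    ]

    def treatment(expected_recovery: float, propensity: float) -> str:
        """Lighter-touch treatments where payment is likely; hardship pathways
        where it is not, rather than costly enforcement for everyone."""
        if propensity < 0.2:
            return "hardship assessment / payment plan"
        return "SMS reminder + online payment link" if expected_recovery < 10_000 else "case officer contact"

    for d in sorted(debtors, key=lambda d: d["debt"] * d["propensity_to_pay"], reverse=True):
        expected = d["debt"] * d["propensity_to_pay"]
        print(f"{d['id']}: expected recovery ${expected:,.0f} -> {treatment(expected, d['propensity_to_pay'])}")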

Citizen-centric Service Delivery represents a fundamental shift away from payment-centric approaches towards service delivery built around a holistic understanding of individual citizens. This differentiated approach looks at all the relevant factors in an individual’s life and identifies the next best action to improve their personal, social and economic wellbeing. It is a laudable vision towards which a number of agencies are already progressing.

It is reasonable to expect that over time this will result in a more engaged and participative relationship between governments and citizens, which will in turn produce better outcomes for both. It recognises that the various functions supporting citizens in need exist in silos – within larger agencies, across governments and across layers of government – and that this does not serve citizens well. Experience to date demonstrates that with a more holistic and cross-functional approach, citizens feel more cared for and are therefore more inclined to take greater ownership of outcomes and to collaborate with agencies in what becomes a shared endeavour.

A benefit to governments is that they are able to reduce their service delivery costs while ensuring the citizen receives the most beneficial services. From a technology perspective, it is interesting to observe that many of the analytical tools developed for the commercial sector work equally well in government services. For example, a customer intelligence capability designed to use all the known data about an individual consumer to determine the next best product to sell to them can be reversed to determine the next best action to assist a citizen. The challenge for major agencies is that their systems are built around eligibility assessment and payments and have very little, if any, customer intelligence, case management or service planning capability. The need for a solid eligibility assessment and payment system remains, but these additional capabilities are required if agencies are to transition effectively to citizen-centric service delivery.
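That “reversal” can be sketched very simply: instead of scoring products against a consumer profile, candidate services are scored against what is known about a citizen’s circumstances and the highest-scoring action is recommended. The citizen profile, candidate actions and weights below are invented placeholders; a real capability would learn them from case histories and outcomes.

    # Next best action for a citizen rather than next best product for a consumer:
    # score candidate services against the individual's known circumstances and
    # recommend the highest-scoring one. All names and weights are invented.
    citizen = {
        "unemployed_months": 7,
        "has_dependants": True,
        "housing_stress": True,
        "recent_health_claim": False,
    }

    candidate_actions = {
        "job_training_referral":   {"unemployed_months": 0.08, "has_dependants": 0.1},
        "rent_assistance_review":  {"housing_stress": 0.9, "has_dependants": 0.2},
        "health_service_referral": {"recent_health_claim": 0.8},
    }

    def score(action_weights: dict, profile: dict) -> float:
        """Weighted sum of the circumstances this action addresses (bools count as 0/1)."""
        return sum(weight * float(profile.get(feature, 0))
                   for feature, weight in action_weights.items())

    ranked = sorted(candidate_actions.items(), key=lambda kv: score(kv[1], citizen), reverse=True)
    print("Next best action:", ranked[0][0])
    for name, weights in ranked:
        print(f"  {name}: {score(weights, citizen):.2f}")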

tags: analytics, business analytics, Government
May 31, 2013
 
Some things are just common sense, and having common academic standards for all states is one of those things. In a national milestone event, 45 states and the District of Columbia recently coalesced around a common set of standards for math and English/language arts. The process did not happen overnight, and there [...]
May 25, 2013
 
Perhaps no other sector has more massive volumes of text data than government.  During the SAS Government Leadership Summit earlier this week, different agencies discussed big data topics, trends and techniques.  And while diverse elements of focus were apparent, shared themes emerged that Juan Zarate, former Deputy Assistant to the […]
Oct 16, 2012
 
The annual Predictive Analytics World Government 2012 conference (PAW-G), in only its second year,  drew 470 attendees, more than double the year before! PAW-G is not just another conference where vendors outnumber the government attendees.  Government speakers from 18 government agencies participated this year as featured speakers and panelists and talked [...]
Jun 21, 2012
 
How do smart government organizations prevent fraud and improper payments? You might imagine they go after the bad guys one at a time - but technology is making it easier to identify networks of criminals with similar behaviors and even predict when certain behaviors are likely to start appearing. This fraud prevention [...]
Jun 19, 2012
 
SAS Solutions Architect David Pope recently held the audience’s attention at the SAS Government Executive Leadership Summit, where he helped demystify high-performance analytics for the government crowd in attendance.   His presentation entitled, “The Next Big Thing: High-Performance Analytics” reviewed how the analytics lifecycle can be greatly accelerated with recent technological [...]