
5 posts from September 2016

20 September 2016

Leveraging data to change cities as we know them - the social impact

By Checca Aird

Historically, the public sector has been light years behind the private sector when it comes to using technology in its operations, and even later to the party in recognising the value of the data it holds. However, recent developments encouraging government transparency and efficiency have forced officials to rethink the issue.

Big Data is one of the Government’s Eight Great Technologies intended to support UK science strengths and business capabilities, but research such as PwC’s Global Data & Analytics Survey shows that the volume, veracity and speed of public data need to improve before real insights can be drawn. The Government’s ‘Information Economy Strategy’ also reports on the growth in the data being collected and the computing power needed to realise its potential economic value.

The use of big data analytics could save the public sector between £16bn and £33bn a year – equivalent to between 2.5 per cent and 4.5 per cent of the Government’s total budget of about £700bn – according to a report by the think tank Policy Exchange.

The scale and range of Government data is overwhelming. HM Revenue and Customs interacts with over 40 million customers, about 12 million more than the Big Four high street banks combined. Departments, agencies and local authorities procure over £243 billion worth of goods and services, and according to labour market statistics from the Office for National Statistics, about six million people work in the public sector, on each of whom the Government holds data.

According to Tom Heath, Head of Research at the Open Data Institute: “Data can enable government to do existing things more cheaply, do existing things better and do new things we don’t currently do.”

Smart Cities

One of the most exciting developments in the Internet of Things (IoT) and big data analytics is the potential to create Smart Cities.

According to Wikipedia, a Smart City is an urban development vision that integrates multiple Information and Communication Technology (ICT) solutions in a secure fashion to manage a city's assets – in other words, connecting the movements of a city with its inhabitants.

As the use of social and mobile technologies increases and physical objects are embedded with sensors, cognitive systems will be able to find patterns in the vast quantities of data produced. They can then reason through patterns that emerge and learn from their interactions with us to refine their suggestions.

By 2019, the number of smartphone users in the world is expected to surpass 2.5 billion, giving people a digital key to the city right at their fingertips. Information about what is happening in the city, which experiences are relevant to them and how to get there will be delivered straight to their phones. Mobile apps will become the new standard for tracking potholes, broken street lights and inaccessible pavements.

Public officials will also have a direct communication channel to every citizen, allowing city leaders to develop bespoke plans for communities and regions and to address each person’s needs in a tailored way.

By allowing cities and their leaders to leverage the knowledge in the data being produced around them, cities can become more flexible, faster to react, less encumbered by bureaucracy and more open to sharing data and insight – perhaps even allowing citizens to have direct input into their community’s and city’s plans and to receive feedback from city leaders.

The iUrban case study, carried out by PwC, showed that opening up Government data can contribute enormously to a city’s competitiveness in many ways – by increasing democratic participation, accountability and transparency, spurring innovation, and even improving a city’s service provision.

Some of the trailblazers on the journey to smart, interconnected cities are IBM, Microsoft and Google.

Alphabet, Google’s parent company, has a subsidiary that has started a “Sidewalk project” which, if approved, will solicit bids from states or cities in need of an upgrade.

It’s well known that Alphabet already has high-tech divisions working on developments that would be well matched with a smart city – the self-driving car, Project Wing, Makani and Project Insight, to name just a few – making this project an excellent commercial offering for Google.

IBM researchers in Brazil are working on a crowdsourcing tool, Rota Acessível, that allows users to report accessibility problems and helps people with disabilities better navigate urban streets. In Uganda, UNICEF is collaborating with IBM on a social engagement tool that lets Ugandan youths communicate with their government and community leaders on the issues affecting their lives. Using IBM text analytics and machine learning, leaders can identify trending concerns or urgent matters and take action immediately where needed. These are the kinds of analytical programmes that could be game changers for developing countries regularly subject to political and economic turmoil.
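
The article does not describe how IBM’s text analytics actually works, but the underlying idea of surfacing trending concerns from a stream of citizen messages can be sketched very simply. The snippet below compares term frequencies across weekly windows; the message data, field names and growth threshold are all hypothetical, and this is an illustration of the concept rather than the IBM/UNICEF tooling.

```python
# Illustrative only: a minimal sketch of "trending concern" detection on
# citizen messages, NOT the IBM/UNICEF implementation described above.
# Message fields and the trend threshold are hypothetical.
from collections import Counter
from datetime import date

messages = [
    {"date": date(2016, 9, 5), "text": "no clean water at the borehole"},
    {"date": date(2016, 9, 6), "text": "school fees too high this term"},
    {"date": date(2016, 9, 12), "text": "water point broken again"},
    {"date": date(2016, 9, 13), "text": "still no water in our village"},
]

def weekly_term_counts(msgs):
    """Count word occurrences per ISO week."""
    counts = {}
    for m in msgs:
        week = m["date"].isocalendar()[1]
        counts.setdefault(week, Counter()).update(m["text"].lower().split())
    return counts

def trending_terms(counts, current_week, previous_week, min_growth=2.0):
    """Flag terms whose weekly count grew by at least `min_growth` times."""
    now = counts.get(current_week, Counter())
    before = counts.get(previous_week, Counter())
    return [t for t, c in now.items() if c >= min_growth * max(before.get(t, 0), 1)]

counts = weekly_term_counts(messages)
print(trending_terms(counts, current_week=37, previous_week=36))  # -> ['water']
```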


Microsoft CityNext aims to connect functions like energy, water, infrastructure, transportation, public safety, tourism, recreation, education, health and social services, and government administrations.

Several cities are already on their way to modernisation, including Auckland, New Zealand; Buenos Aires, Argentina; Hainan Province and Zhengzhou in China; Hamburg, Germany; Moscow, Russia; and Philadelphia in the United States.

As with any technological advance, there are risks and limitations. As Tesla founder Elon Musk has warned, no one wants "The Terminator" to become a reality.

The iUrban study suggests that implementing open data initiatives in the public sector is not without its challenges and requires a number of key enablers: distributed leadership, prioritisation, choosing the right scale, and involving private companies, grassroots movements and agile brokers within the municipal administration.

Some of the main concerns revolve around privacy and data protection. The pace of technological advancement is outstripping the legal system’s ability to keep up. While massive improvements could be made – such as edge analytics for healthcare, digital government platforms and citizen e-IDs – there is debate over how to implement these technologies ethically.

 

So what is stopping Government bodies from improving their constituents’ day-to-day lives on a massive scale? Is it a matter of trust? Are you comfortable with the level of data sharing and connectivity that would be required? While public bodies would need funding and the technological know-how to implement smart tech, without our trust and a willingness to share our data it won’t be possible to realise the benefits.

Join the debate at www.psrc.pwc.com, or leave your comments below.

12 September 2016

How London Market Insurers can navigate the challenges of deploying predictive analytics

In my first London Market focused blog, "How London Market insurers are responding to the perfect storm of challenges", I touched on the competitive advantages that could rise from exploiting Data Analytics.

I believe that predictive analytics is key to boosting underwriting returns in the London Market – even in very soft market conditions – most notably by combining advanced analytics capabilities and tools with internal data enhanced by third-party data and Big Data. The improved accuracy of risk analysis, segmentation and technical pricing will enhance an underwriter’s decision-making in implementing cycle management strategies, both in terms of:

  • walking away, with greater confidence, from a given risk if an appropriate premium cannot be commanded; or
  • deepening their understanding of a given risk profile, so that rate reductions compared to expiring terms could (if needed) be offered to secure highly lucrative business at still-attractive prices, or so that rate increases at renewal can be robustly supported (see the illustrative sketch after this list).
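
As a purely illustrative sketch of this decision logic (all figures, loadings and thresholds below are invented, and real London Market pricing is far richer), comparing the quoted premium against a model-derived technical premium gives an underwriter a simple rate-adequacy measure to support walk-away, negotiation or retention decisions:

```python
# Illustrative sketch only: comparing quoted premium against a modelled
# technical premium to support cycle-management decisions. All figures,
# loadings and thresholds are hypothetical.
def technical_premium(expected_loss, expense_loading=0.30, profit_loading=0.05):
    """A simple technical price: expected loss grossed up for expenses and profit."""
    return expected_loss / (1.0 - expense_loading - profit_loading)

def underwriting_view(quoted_premium, expected_loss, walk_away_ratio=0.85):
    """Return a suggested action based on the rate-adequacy ratio."""
    tech = technical_premium(expected_loss)
    adequacy = quoted_premium / tech
    if adequacy < walk_away_ratio:
        action = "walk away / require significant rate increase"
    elif adequacy < 1.0:
        action = "negotiate - below technical price but may be retainable"
    else:
        action = "write / consider a targeted rate reduction to retain"
    return adequacy, action

# Example: a risk with a modelled expected loss of 0.65m, quoted at 0.9m.
ratio, action = underwriting_view(quoted_premium=900_000, expected_loss=650_000)
print(f"rate adequacy {ratio:.2f}: {action}")
```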

Another angle relates to understanding exposures and thus aggregations better (potentially in real or near real time), which could also lead to more realistic Minimum & Deposit premiums or the ability to calculate and collect adjustment premiums earlier, improving cashflow management.

Counterarguments to this view often cite the lack of credible data volumes or sufficiently granular data, poor data quality, issues around the timeliness of data, data that is largely held in paper format, and the industry’s inertia in embracing the latest technology developments as critical barriers to widespread advances in data analytics in the London Market. The current soft market conditions also present a key challenge: how do you justify the level of investment needed today, with combined ratios nudging 100%, in the expectation of improved underwriting returns in the future?

The data conundrum can be addressed from a number of different directions:

  • The major brokers already hold a vast amount of data, and have been digitising large chunks of it as part of data analytics-as-a-service propositions (albeit not necessarily focused on enhancing insurers’ underwriting and pricing capabilities). In an ideal world, or further into the future at least, broker presentations of risks would be 100% digital, allowing insurers to capture much more granular information on exposures (such as comprehensive details of individual aircraft or marine vessels within the fleet to be insured, and the associated telemetry and geo-spatial/GPS data over prior periods of cover) and more detailed claims information at the point of underwriting.
  • Paper-based documentation and PDF versions of documents held internally – whether detailed appendices to claims files, policy slips, loss adjusters’ reports, legal reports, detailed listings of vessels insured or detailed listings of places of business for multinational insureds – can be digitised into valuable data, for example by scanning into digital document management systems and deploying text mining, “scraping” data from PDFs, or manually re-keying specific data items into structured formats (an illustrative extraction sketch follows this list).
  • Trade organisations and associations, such as the International Union of Marine Insurance, often capture detailed exposure and claims data at the global industry level, offering a much richer data set than most insurers could hope to build.
  • A myriad of diverse data vendors have been springing up at an increasing rate, providing specialised data that supplements the plethora of open data sources that are relatively underemployed in the insurance sector.
  • Real time exposure data and telemetry can be captured via next generation telematics, RFID technology deployed in conjunction with cargo containers, and the ever growing wealth of data coming from Internet of Things connections. Insureds are already taking advantage of these new technologies, and recognise that these data sources can provide a clearer understanding of risk and operational resilience, which insurers can also seek to leverage from an underwriting and pricing perspective.
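
As a toy illustration of the digitisation point above, the sketch below pulls text out of a PDF and uses a regular expression to capture vessel entries. It assumes the open-source pdfplumber library; the file name, document layout and the vessel/IMO pattern are entirely hypothetical and would need to be adapted to real documentation.

```python
# Illustrative only: extracting structured data from PDF policy documentation.
# Assumes the open-source pdfplumber library; the file name, document layout
# and the "vessel name / IMO number" pattern are entirely hypothetical.
import re
import pdfplumber

VESSEL_LINE = re.compile(r"^(?P<name>[A-Z][A-Za-z ]+)\s+IMO\s+(?P<imo>\d{7})")

def extract_vessels(pdf_path):
    """Return a list of {name, imo} dicts scraped from each page's text."""
    vessels = []
    with pdfplumber.open(pdf_path) as pdf:
        for page in pdf.pages:
            text = page.extract_text() or ""
            for line in text.splitlines():
                match = VESSEL_LINE.match(line.strip())
                if match:
                    vessels.append(match.groupdict())
    return vessels

if __name__ == "__main__":
    for vessel in extract_vessels("fleet_schedule.pdf"):  # hypothetical file
        print(vessel["name"], vessel["imo"])
```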

Technology and advanced software solutions are fundamental in capturing, storing and effectively analysing data from so many sources and in so many formats in one place. The costs of data storage have been falling rapidly, and the advent of Cloud computing has changed the perception of what is possible with gargantuan data sets. At the same time, a number of London Market players are re-examining their systems strategy, and any decisions around new platforms should ideally seek to future-proof their requirements in relation to data analytics.

Quantifying the future benefits from the deployment of data analytics in order to support a business case for data analytics initiatives is not straightforward in the pricing conditions currently faced by the London Market. For comparison purposes, in the US SME commercial lines market, insurers have been able to demonstrate improvements in class of business level loss ratios to the tune of 2-3 percentage points (and much higher improvements for specific subsets of business) through enhanced risk segmentation and keener pricing. The emerging game changer for the SME commercial lines market is the advance in analytics from descriptive (what happened) and diagnostic (why it happened) to predictive (what is likely to happen) and prescriptive (determining and ensuring the right outcome). Drawing on enriched data sets and the latest tools, this shift has not only allowed insurers to anticipate what will happen and when, but also to respond proactively. It has also fuelled the possibility for new types of coverage to be provided in the future, which rely much more on real time pricing, for example. The additional benefits of first mover (or at least early follower) advantage in offering a new product or form of coverage that is ground-breaking should also be taken into consideration in building a compelling business case.
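
To make the shift from descriptive to predictive concrete, the sketch below fits a simple claim-frequency model of the kind used for risk segmentation. It runs on synthetic data with invented rating factors and uses scikit-learn’s Poisson regression; it is a toy illustration of the technique, not a production pricing model.

```python
# Toy illustration of predictive risk segmentation: a Poisson claim-frequency
# model on synthetic data. Rating factors and figures are invented; real
# London Market pricing models draw on far richer internal and external data.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical rating factors: vessel age (years) and a 0/1 telematics flag.
vessel_age = rng.uniform(0, 30, n)
has_telematics = rng.integers(0, 2, n)
exposure = rng.uniform(0.5, 1.0, n)            # years on risk

# Synthetic "true" frequency: older vessels worse, telematics better.
true_rate = np.exp(-2.0 + 0.04 * vessel_age - 0.30 * has_telematics)
claims = rng.poisson(true_rate * exposure)

X = np.column_stack([vessel_age, has_telematics])
freq = claims / exposure

# Fit claim frequency with exposure as the sample weight (standard GLM approach).
model = PoissonRegressor(alpha=1e-4, max_iter=500).fit(X, freq, sample_weight=exposure)

# The predicted frequency for a given risk profile feeds the technical premium.
new_risk = np.array([[20.0, 1]])               # 20-year-old vessel with telematics
print("expected claims per year:", model.predict(new_risk)[0])
```

In practice the predicted frequency would be combined with a severity model and detailed exposure data to produce the technical premium discussed earlier in this post.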

The US specialty lines market, which is a more comparable barometer for the London Market, has seen much less in the way of active usage of advanced analytics techniques to date, and remains an area of significant opportunity. However, the growing volume of business derived through delegated authorities and binders in the London Market – which typically encompasses niche personal lines, commercial lines and specialty risks – means that some of the benefits achieved by US commercial lines insurers should be capable of being translated to the London Market.

Any improvements in loss ratios achieved by London Market players through the application of advanced analytics techniques would likely largely arise in the short term from either:

  • non-renewing business that can now be shown to require significant rate increases which the market will not bear;
  • securing significant rate increases on renewals now that the risk profile is better understood; or
  • writing select tranches of new business at actual premium to technical premium ratios which are very attractive, but which have not been recognised as high quality risks by competitors using less sophisticated underwriting and pricing approaches.

The first of these channels would have an adverse impact on insurers’ expense ratios, as declining premium volumes from non-renewals would leave the fixed expense base relatively more uncovered. It would then be incumbent upon insurers to pursue profitable premium growth within the second and third of these channels to offset the impact – which would have its challenges in the current soft market. To ensure that the combined ratio is not adversely impacted in the short term, insurers would potentially also need to implement strategic cost reduction initiatives, including the deployment of new technologies to cut acquisition costs and target specific areas of variable expenses.
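
To illustrate the expense-ratio drag with deliberately simple, made-up numbers: if fixed expenses stay flat while non-renewals shrink the premium base, the expense ratio rises mechanically, so loss-ratio gains have to outpace that drag before the combined ratio improves.

```python
# Hypothetical figures only: how non-renewing under-priced business can lift
# the expense ratio even as it improves the loss ratio.
def combined_ratio(premium, losses, fixed_expenses):
    loss_ratio = losses / premium
    expense_ratio = fixed_expenses / premium
    return loss_ratio, expense_ratio, loss_ratio + expense_ratio

# Before: 100m premium, 68m losses, 30m fixed expenses -> 98% combined ratio.
print(combined_ratio(premium=100e6, losses=68e6, fixed_expenses=30e6))

# After non-renewing 15m of poorly priced premium that carried 13m of losses:
# the loss ratio improves, but the fixed expense base is spread more thinly,
# so the combined ratio barely moves unless costs are also addressed.
print(combined_ratio(premium=85e6, losses=55e6, fixed_expenses=30e6))
```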

With the challenges of the soft market set to continue in the near term, the pressure on margins provides an impetus for a step change in the use and development of advanced analytics, supplemented by targeted strategic cost reduction programmes. These analytics innovations will be crucial in keeping one step ahead of the competition, especially when the market starts to harden.

08 September 2016

Embedding centralised technologies to help manage regulatory compliance data

By Sudheer Parwana

“Regulatory compliance” is a phrase my clients very rarely say with warm, fuzzy feelings and cheerful thoughts. And why would they? In recent years there has been a greater focus on ensuring organisations operate legally, safely and ethically, as a result of new social trends, technology advancements and greater globalisation. To make this happen, new regulations are being imposed at a rapid pace, and I suspect June’s Brexit vote will only add further complexity once its effects are in full flow.

To stay compliant, significant administrative effort and cooperation is required from all areas of the business. However, this can be costly, hard to monitor and quickly becomes impractical if not done effectively. To solve this challenge, I’ve been working with clients to help them embed smart technology into their operations, automate their processes and bring greater efficiency to the way compliance is integrated into their business as usual (BAU) activities.

Interestingly, although organisations are starting to invest and embed technology into their BAU operations, I’m still seeing many clients carry out their initial compliance assessment projects using basic tools like spreadsheets, emails and shared drives.

These tools tend to be used because they’re familiar, require little technical knowledge and are fast to get up and running. However, this approach can quickly become chaotic as soon as spreadsheets are copied, version control issues creep in, and time-consuming tasks are required to consolidate information and generate regular management information (MI) reports.

The data footprint becomes large very quickly (just think about all the email exchanges and spreadsheets over the lifetime of a project) making it difficult to track down information and virtually impossible to establish an audit trail. These traditional tools also mean people work in silos and end up working in different ways.

Despite the fast start, the whole process becomes increasingly inefficient and difficult to manage, and it becomes hard for senior stakeholders to get an accurate and reliable view of their compliance.

Quite a gloomy picture, right? So how do you solve this?

Well, my passion is to help clients design and build innovative web applications to bring all their information into a single structured repository, embed automation and improve their operational efficiency. In most cases, this means building personalised applications that are not readily available off the shelf.

Probably the most powerful aspect of using a centralised solution is that it allows people (and teams) to work collaboratively, so that tasks are carried out in a consistent way. This means you can quickly view real-time reports and dashboards (with no manual effort required) and interact with them to get deep insights into your data. A full audit trail is also maintained, which means any key information you need in future will not be lost in someone’s inbox.
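
As a minimal sketch of the audit-trail idea (this is illustrative only, not a description of PwC's Compliance Hub or any specific product; the record fields and statuses are invented), every change to a compliance record can be appended to an immutable log, so that "who changed what, and when" never depends on someone's inbox:

```python
# Minimal sketch of a centrally held compliance record with an append-only
# audit trail. Field names and statuses are invented for illustration.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ComplianceRecord:
    control_id: str
    status: str = "not started"
    audit_trail: list = field(default_factory=list)

    def update_status(self, new_status: str, user: str, comment: str = ""):
        """Change the status and append an audit entry recording the change."""
        self.audit_trail.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "from": self.status,
            "to": new_status,
            "comment": comment,
        })
        self.status = new_status

record = ComplianceRecord(control_id="AML-017")
record.update_status("in progress", user="asmith")
record.update_status("compliant", user="jdoe", comment="Evidence attached")

for entry in record.audit_trail:           # full history, ready for dashboards
    print(entry["timestamp"], entry["user"], entry["from"], "->", entry["to"])
```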

For example, one of my recent clients conducted a regulatory compliance review across 17 global business units. Having all the necessary information and functionality in a single tool allowed teams to work together easily, share information and quickly complete their assessments. Senior stakeholders used the dashboards throughout the project to track progress, find trends and target key areas of focus within a few button clicks (and seconds). Once they got to the important information, they were able to make informed decisions on next steps, plug any compliance gaps and document the resolutions with a full audit trail.

I wonder how long this would have taken using the traditional spreadsheet approach?

As you can see, a centralised technology can really help you streamline your operations and give you greater confidence in your business and compliance. Ultimately, using technology is not meant to fundamentally change your processes or ways of working, but to enhance and simplify the way they are executed so that they’re more automated, intelligent and easier to monitor. Done correctly, it can save countless hours, effort and sleepless nights. And that would put a smile on anyone’s face, right?

To find out more about PwC's Compliance Hub watch our short animation here.

05 September 2016

Why it’s time to break down silos in data governance

I’ve worked in data governance for 15 years and I can safely say that this is the most interesting time that I’ve experienced – and that’s partly because no two people in the world have the same view about what data governance means.

Ask a company that has recently suffered a data breach and they’ll tell you that data governance is all about preventing a data leak and avoiding the wrath of the Information Commissioner. Ask an organisation that’s been a victim of a cyber attack and they’ll say that it’s all about cyber security. To someone who has to manage large amounts of customer data, it’s about data privacy and ethics; to someone who wants to use data to make better decisions, it’s about data quality and reliability.

None of these is the wrong answer, but none go far enough. ‘We’re all in this together’ is the best way I can sum up data governance today. It’s about everything that people think it’s about, from treating data ethically to cyber security. That’s the point; it’s about all these things.

Too often, it is assumed that data governance is all about managing and minimising risk, but good data governance goes far beyond that. If you think solely in terms of threats you won’t see the opportunities.

At its best, as we explain in our recent video, data governance ensures that an organisation is making the best possible use of the data it holds and collects. It makes sure that an organisation has confidence and trust in its data, so that any decisions it makes are based on the best possible evidence. Some data may be more important than others, and organisations may have different data priorities, but “good” data ultimately drives the whole business.

And that brings me to the real problem. Too many organisations are treating data governance as a set of individual problems. Data governance is hampered by traditional silos – those with the necessary skills and resources (and a wide variety of skills and resources are required) are scattered throughout organisations, with little day-to-day contact with each other or any sort of co-ordination. The people who sit in regulatory compliance, for example, sit in different parts of the business from IT security, and sometimes at a different site altogether. They rarely speak to each other and never meet.

Why on earth not? Data quality isn’t an issue that’s unique to one part of the business, and neither is cyber security, or data privacy, or anything else to do with data governance. It’s relevant to, and affects, every part of the business every day. And that demands a holistic approach. Good data governance should be at the very heart of a business. It should be part of its DNA.

Getting there isn’t easy – it requires a change of focus and a shift in behaviour. What’s needed is someone who will bring all the elements of data governance in an organisation together, giving the various people working within data governance a structure under which they can collaborate. Increasingly that responsibility is falling to the Chief Data Officer.

Data governance has evolved beyond recognition during my career and it’s time for another fundamental change in the way organisations view and approach data governance. It is more diverse, involved and dynamic, and it needs an equally diverse, involved and dynamic team that spans the organisation so that the business can work together to achieve a common goal – trusted data. These are, indeed, interesting times.

 

01 September 2016

Data and analytics: what will be the tax function’s role?

The tax function is one of the largest consumers of data in organisations. According to a joint study we carried out with the Manufacturers Alliance for Productivity and Innovation (MAPI), the tax function typically spends more than half of its time gathering data. It’s something that tax functions excel at, even though it is becoming more challenging as tax data is housed in many locations and the volume of data is increasing. But that same survey also found that tax functions spend less than a third of their time analysing the data they’ve collected. And that’s a problem.

The volume of data available and our increasing ability to analyse and use it is having a major impact on companies and on every function within an organisation. We’re creating an environment where data is the basis for decision-making – and the major business functions are working hard to adapt to that world. Tax mustn’t be left behind.

That’s why the impact of data and analytics on the tax function is the subject of the third paper in our Tax Function of the Future thought leadership series. We look in detail at the data challenges facing tax functions and what they need to do to get themselves in shape for now and what’s to come.

The central message is that the tax function must shift its focus if it’s to harness the full potential of data analysis. That means concentrating on the analysis of the data it collects, and on creating strategic partnerships throughout the organisation.

This will take, among other things, excellent technology. But overall, spending on technology in the tax function remains stubbornly low – according to the Tax Executives Institute 2011-2012 Corporate Tax Department Survey, it amounts to less than 5% of the budget in most cases. Our MAPI survey found that 80% of tax functions recognise that better technology and integration would improve their tax effectiveness, and yet three-quarters don’t have a tax technology strategy, or any plans to develop one. Clearly, this has to change.

If they’re to truly add organisational value in a data-driven world, tax functions must also break out of the silos that many find themselves in. Tax has relevance to decisions made across the organisation and the tax function has a real opportunity here to show how it can bring value to the business – as long as the function is in line with the wider business strategy.

This means that tax functions have to be far more actively involved in data management and analytics – and in the budgeting and planning process for data projects – than they have been in the past. Historically, the function has taken a passive role in data strategy but the data challenges that the function faces, and the critical importance of data use and analytics to its future value, make active involvement absolutely critical. A coherent data strategy should include people from tax, finance, IT and other functions if a complete business case is to be made.

This blog was originally published by Mark Schofield here.