Data granularity refers to the level of detail at which data is provided; finer-grained data makes it possible to extract richer and more valuable insights. The level of granularity determines which analyses can be performed on the data and whether the results of those analyses lead to sound conclusions. The more granular the data, the more information is available for analysis, albeit at the cost of increased storage, memory and computing resources.
What is granular data?
Granular data is data at the most basic level of detail that a given data set can contain: the data being analysed is broken down into its smallest meaningful units. Working with data this detailed makes it possible to build a more precise overall picture, and it lets analysts select and reshape these small granules of information to fit their particular needs and projects.
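A minimal sketch of the idea, using hypothetical transaction-level records (the column names and figures are illustrative only): once data is kept at its smallest units, the same granules can be aggregated along whichever dimension a particular analysis needs.

```python
import pandas as pd

# Hypothetical transaction-level (granular) records; for illustration only.
transactions = pd.DataFrame({
    "bank":    ["A", "A", "B", "B", "B"],
    "country": ["DE", "FR", "DE", "DE", "FR"],
    "amount":  [100.0, 250.0, 80.0, 120.0, 300.0],
})

# The same granules can be summed by bank, by country, or both --
# an aggregated report could only ever answer one of these questions.
by_bank = transactions.groupby("bank")["amount"].sum()
by_country = transactions.groupby("country")["amount"].sum()

print(by_bank.to_dict())     # {'A': 350.0, 'B': 500.0}
print(by_country.to_dict())  # {'DE': 300.0, 'FR': 550.0}
```

Had the data arrived pre-aggregated by bank, the per-country view would be unrecoverable; granularity keeps every such view available.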
Subdividing, or granulating, data also makes it easier to combine data sets from multiple sources. For this reason, specialists across many fields prefer to work with granular data: it simplifies the development of precise strategies and supports substantial progress in their industries.
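To illustrate why combining sources is easier at the granular level, here is a small sketch with two hypothetical data sets that share a record-level key (the column names are invented, not a real regulatory schema):

```python
import pandas as pd

# Two hypothetical sources reporting at the same loan-level granularity.
exposures = pd.DataFrame({
    "loan_id":  [1, 2, 3],
    "exposure": [1000.0, 2000.0, 1500.0],
})
ratings = pd.DataFrame({
    "loan_id": [1, 2, 3],
    "rating":  ["AA", "B", "BBB"],
})

# Because both sets share a granular key, they join directly,
# without any up-front aggregation or template alignment.
combined = exposures.merge(ratings, on="loan_id")
print(combined.shape)  # (3, 3)
```

Pre-aggregated data sets, by contrast, can only be combined if their aggregation dimensions happen to match.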
Reshaping the regulatory approach to data
Many regulators struggle to process and interpret the massive volumes of data they receive, largely because of difficulties with data quality and data management. As a result, many authorities are shifting towards granular reporting. In principle, granularity reduces reporting complexity, improves the flexibility of analyses, and limits the need for ad-hoc reporting: multiple templates are no longer required, information can be re-used for alternative purposes, and the resulting datasets can feed AI and machine-learning tools for analysis.
Granular reporting involves data broken down across multiple dimensions. This approach shifts the reporting burden from financial institutions (which no longer need to aggregate data and produce redundant reports) to financial authorities. It also promotes a data-driven culture: because granular data is easily adapted to different uses, internal users make greater use of it.
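The point about one granular submission serving many uses can be sketched as follows, again with hypothetical columns: the authority, rather than the reporting institution, derives each aggregate view it needs.

```python
import pandas as pd

# One hypothetical granular submission, broken down across several
# dimensions; names and values are illustrative only.
records = pd.DataFrame({
    "institution": ["X", "X", "Y", "Y"],
    "instrument":  ["bond", "loan", "bond", "loan"],
    "value":       [10.0, 20.0, 30.0, 40.0],
})

# Each "report" is just a different aggregation of the same records.
per_instrument = records.pivot_table(index="instrument",
                                     values="value", aggfunc="sum")
per_institution = records.pivot_table(index="institution",
                                      values="value", aggfunc="sum")

print(per_instrument["value"].to_dict())   # {'bond': 40.0, 'loan': 60.0}
print(per_institution["value"].to_dict())  # {'X': 30.0, 'Y': 70.0}
```

Under template-based reporting, each of these views would have been a separate, redundant report prepared by the institution itself.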
The main challenge for granular reporting is the lack of data standardisation. Each financial authority may define the required data differently, which leads to divergent understandings of the same data points. Moreover, the increase in data volume and granularity makes the data harder to manage, as it becomes much more complex. And because granular data may be sensitive, storing it calls for a major improvement in data security.
Last but not least, granular data may not suit every report: some data points cannot be derived directly from granular data unless additional, institution-specific information is provided. Despite all these challenges, the move towards more granular and integrated reporting will likely continue.
The case for data granularity
Since the financial crisis, authorities have become more aware that identifying risks in financial systems is impossible without detailed, granular data. One example of this approach is ad-hoc stress testing, where regulators request granular reports from the banking sector. Such exercises are only possible when granular data from different sources can be combined.
As central banks, regulators and supervisory authorities deal with ever more granular data collected from diverse sources, the challenge is to manage the collection, processing and analysis of that data. Innovation in this area needs to focus on mitigating these obstacles so that comprehensive analysis remains possible. Naturally, as more granular data is collected, the technical capacity and processing power of authorities’ IT systems must grow to handle the larger volume and higher velocity of data.
Regardless, the benefits of granular data outweigh the difficulty of implementation: it allows a much more detailed analysis of the factors that may contribute to a financial crisis, makes it possible to pinpoint the source of stress during stress testing, and yields richer and more timely supervisory insights.
Learn more about RegTech driven data management