Just as numerous organizations around the world are reaping dividends from their investments in big data projects, there’s much to gain for the Bank of Canada (BOC) if it can effectively use huge amounts of “non-official data” to better monitor the economy, according to one of its analysts.
“By providing an opportunity to exploit vast quantities of digital information, big data offers fresh insight that is relevant to the monitoring of economic activity,” wrote Nii Ayi Armah of the BOC’s Canadian economic analysis group in the article Big data analysis: The next frontier, which recently appeared in the Bank of Canada Review. “…Combining big data with sophisticated analysis could provide novel insights into household behaviour, firm expectations, financial market stability and economic activity.”
Financial institutions like the BOC typically base their long-term macroeconomic projections, which help inform monetary policy decisions, on data from official statistical agencies. Unfortunately, those agencies publish their findings with a significant lag, preventing users from getting the information needed for “accurate and timely measurement” of indicators.
For instance, the gross domestic product (GDP) is a quarterly series published with a two-month lag and revised over the next four years. The consumer price index is a monthly series published three weeks after the end of the reporting month.
Such a situation, said Armah, makes it worth exploring ways of “complementing official data” with more readily available “non-official” data, such as information from social media and Web searches as well as electronic payment data from credit card and debit card transactions.
Armah, for instance, cited 1997 research which found that counting the frequency of appearance of the word “shortage” in print newspapers “can be a good predictor of inflation in the United States.”
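The core of that word-count approach is simple enough to sketch in a few lines. The snippet below is an illustrative reconstruction, not code from the cited study: it counts whole-word occurrences of a keyword across a collection of article texts, the raw signal such a predictor would be built on. The function name and sample headlines are invented for the example.

```python
import re

def keyword_count(articles, keyword="shortage"):
    """Count whole-word, case-insensitive occurrences of a keyword
    across a collection of article texts -- the raw frequency signal
    behind the 'shortage'-as-inflation-predictor idea."""
    pattern = re.compile(r"\b" + re.escape(keyword) + r"\b", re.IGNORECASE)
    return sum(len(pattern.findall(text)) for text in articles)

# Hypothetical sample articles for illustration only.
articles = [
    "A labour shortage is pushing wages higher in several provinces.",
    "Retailers report no shortage of inventory heading into the quarter.",
    "Markets were largely unchanged on Tuesday.",
]

print(keyword_count(articles))  # prints 2
```

A real indicator would track this count over time (per month of newspaper text, say) and normalize by the total volume of articles before relating it to inflation.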
Now, you would think that financial institutions such as the BOC would be all over big data by now. The BOC does use it, to a certain extent, according to Armah.
“For example, the bank’s regional offices collect and analyze data obtained from quarterly consultations with businesses across Canada to gather perspectives on such topics as demand and capacity pressures as well as views on economic activity in the future,” the analyst said.
However, despite innovations, several factors limit big data’s full potential.
“Specifically, it remains unclear how best to select, organize and aggregate unstructured data so that they provide meaningful signals about economic conditions, and what analytical tools need to be developed to integrate those signals with information from conventional data sources,” Armah said. “Assessing how representative big data samples are could prove to be problematic for standard methodologies.”
The analyst also said that much of what makes up big data today exists “in silos.”
“To unleash the full potential of big data, there is a need to first integrate the fragments of data sets so that they can be accessed easily and quickly by interested parties,” said Armah.
Lastly, there is the danger that data mining and analysis on such a scale could lead to serious privacy issues.
“By uncovering hidden connections between seemingly unrelated pieces of data, big data analysis can reveal personal information that some might deem too sensitive to share,” Armah said. “…A balanced regulatory framework is therefore necessary to effectively address concerns about privacy and use of personal information while still benefiting from the technological advances and a thriving data-driven economy.”