Data has become pervasive in organizations. It is used in every facet of operations and is a major component of the decision-making function. How data is stored and presented is also important, because companies need the ability to properly analyze the data and information they collect in order to get the most value out of it.
This is why the role of data analysis has grown significantly in Canadian companies across every industry. In most cases, it is the industry leaders that have invested effectively in methods to collect, analyze, and apply what they have learned from data.
What is data analysis?
Data analysis is the process of taking data from its raw state and applying a series of operations that allow insights to be gained. This process involves:
- Assigning values
- Removing redundancy
- Modeling data
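The three steps above can be sketched in a few lines of Python. This is a minimal illustration only, with invented survey responses and an invented scoring scale, not a prescribed toolchain:

```python
# A minimal sketch of the three steps: assign values, remove
# redundancy, and model the data. The survey responses and the
# value mapping below are invented for illustration.

raw_responses = ["good", "good", "poor", "fair", "good", "fair", "poor", "poor"]

# 1. Assign values: map each qualitative answer to a number.
scale = {"poor": 1, "fair": 2, "good": 3}
scored = [scale[r] for r in raw_responses]

# 2. Remove redundancy: collapse repeated answers into counts, so
#    each distinct answer is stored once with its frequency.
counts = {}
for r in raw_responses:
    counts[r] = counts.get(r, 0) + 1

# 3. Model the data: here, a simple summary statistic (the mean score).
mean_score = sum(scored) / len(scored)

print(counts)       # frequency of each distinct response
print(mean_score)   # average score on the 1-3 scale
```

Real analyses would substitute richer models and tooling, but the shape of the work is the same: turn raw content into values, reduce it, then summarize or model it.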
The end goal of data analysis is to uncover valuable information and insights that will allow you to develop conclusions. It is these conclusions that can then be used to support decision making at all levels of your company.
While on the surface it may seem simple, data analysis can be a very complex process. The complexity depends on the type of data you are working with and the kind of meaning you are attempting to extract.
Overall, data analysis is used to address two main problems or concerns:
- Describing, summarizing, and exploring raw data that has no inherent meaning
- Creating meaning, drawing inferences, and assigning values to data through analysis
Who uses data analysis?
There are many different approaches to data analysis and a multitude of techniques that vary with the profession doing the analysis. Whether you are looking for patterns in the data, seeking further proof points or explanations, testing a hypothesis, or describing facts and figures, data analysis is used by:
- Writers, researchers, and academics
- Businesses of all sizes
- All levels of government
- Policy developers and law makers
- Professors, teachers, and students
Types of data that are analyzed
There are a number of different types of data that companies can analyze for a variety of purposes. Here are the most common:
- Qualitative data: This form of data is often difficult to analyze because values need to be assigned to the content before analysis can take place. Content analysis is a common example. The focus is usually on a small sample size.
- Quantitative data: This is commonly data that is in a number format and is relatively easy to measure because values are already assigned. The focus is usually on a large sample size.
- Categorical data: This is data analyzed with the focus on placing it in a specific category (i.e. age group or skill level). It can be based on a nominal, ordinal, or binary level of measurement.
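As a concrete illustration of categorical data, the sketch below bins quantitative values (invented ages and group boundaries, chosen only for this example) into ordinal age groups:

```python
# Turning quantitative data (ages) into categorical data (age groups).
# The ages and group boundaries are invented for illustration.

ages = [12, 25, 37, 45, 61, 68, 19, 33]

def age_group(age):
    """Place an age into an ordinal age-group category."""
    if age < 18:
        return "under 18"
    elif age < 40:
        return "18-39"
    elif age < 65:
        return "40-64"
    return "65+"

# Count how many ages fall into each category.
groups = {}
for age in ages:
    g = age_group(age)
    groups[g] = groups.get(g, 0) + 1

print(groups)
```

The same pattern applies to any categorical analysis: define the categories and their level of measurement first, then assign each record to one.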
Key considerations in data analysis
There is no shortage of key considerations, issues, and potential problems that need to be addressed during the data analysis process to ensure that the analysis yields accurate conclusions that can be used for decision making. Here are some of the things to consider:
- Bias and objectivity
- Having the tools and skills to properly analyze
- Selecting the correct collection and analysis techniques
- The sample size of the pool of data being used to draw an accurate conclusion
- The reliability of the analysis
- Effective training of analysis participants to ensure all procedures are followed
- Data presentation
- The accuracy of the analysis