How to Collect Quantitative Data: Common Methods

Quantitative data collection methods include surveys, interviews, observations, and dataset reviews. Explore the techniques, pros, and cons of each method.

What is Quantitative Data Collection?

Quantitative data collection is a process that gathers measurable, numerical information, such as stock prices, trading volumes, interest rates, revenue figures, etc. The collected information can be analyzed statistically to identify patterns, trends, or relationships.

This differs from qualitative data collection, which focuses on gathering descriptive, non-numerical information like market sentiment or customer feedback to understand behaviors, opinions, and experiences.

Data collection methods for quantitative research often include surveys with closed-ended questions, structured interviews, observational techniques, or reviews of existing datasets. This article will further discuss these common methods in detail, helping you find the best way to gather statistically reliable insights for quantitative data analysis, especially in the financial industry.

Types of Quantitative Data Sources

Before we explore the different data collection methods, it is crucial to understand that quantitative data can be gathered from three types of sources:

  • First-Party Data Sources: Collected directly by your organization from customers or users (e.g., surveys, purchase data). Even though data from first-party sources is reliable and tailored, it requires significant resources to collect and organize.
  • Second-Party Data Sources: Shared by another organization, often as part of a partnership. It offers valuable insights without extensive data collection but may have limited scope.
  • Third-Party Data Sources: Aggregated from various sources and sold by external providers. These sources provide efficient data access and broad insights. Still, careful selection is required to find reliable providers with accurate, up-to-date information, especially for financial datasets.

Why is Quantitative Data Collection Important? 

By providing numerical evidence, quantitative data collection allows for objective analysis and reduces reliance on subjective interpretations. This data-driven approach is essential for developing comprehensive quantitative strategies.

For instance, a portfolio manager might rely on quantitative data collection tools to compile information like historical asset returns, volatility, and correlations to optimize portfolio allocation and maximize risk-adjusted returns.

Several characteristics make this possible. The objectivity of quantitative data collection ensures the data is gathered in a standardized, unbiased manner, providing reliable insights. The repeatability of quantitative methods also allows results to be validated across different studies, enabling analysts to test their accuracy.

For finance professionals, the precision of numerical data also enables detailed measurements, such as tracking market trends or evaluating investment performance. Moreover, its generalizability makes it possible to project findings from a sample to broader financial markets.

How to Select Population Representatives for Quantitative Data Collection: Probability Sampling

Probability sampling minimizes sampling bias by relying on random selection methods to choose participants or data representatives. This ensures every individual in the target population has a known, non-zero chance of being included, leading to more accurate and generalizable results. This technique is especially valuable for surveys or interviews, providing a representative sample that reflects the broader population.

The four main types of probability sampling techniques, illustrated in the brief code sketch after this list, are:

  1. Simple Random Sampling: This involves randomly selecting individuals from the entire population without particular criteria or grouping logic. It’s straightforward and effective but can be impractical for very large populations.
  2. Cluster Sampling: The population is divided into smaller groups or clusters, often based on geography or other natural divisions. A random selection of clusters is then fully surveyed. This method reduces costs and effort, especially when populations are widely dispersed.
  3. Systematic Sampling: Researchers start with a randomly chosen individual and then select every nth person (e.g., every 10th or 20th) from the population list. This method is efficient but can inadvertently introduce bias if there’s an underlying pattern in the list order.
  4. Stratified Sampling: The population is divided into distinct subgroups/strata based on specific characteristics, such as age, gender, or income level. A random sample is then drawn from each stratum. This ensures representation from all key subgroups, making it ideal for diverse populations.
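To make these techniques concrete, here is a minimal Python sketch of how the four approaches could be drawn with pandas and NumPy. The population, its income brackets, and the branch-office clusters are hypothetical placeholders invented for illustration.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)

# Hypothetical population: 10,000 investors with an income bracket and a branch office.
population = pd.DataFrame({
    "investor_id": np.arange(10_000),
    "income_bracket": rng.choice(["low", "mid", "high"], size=10_000, p=[0.5, 0.3, 0.2]),
    "branch": rng.integers(0, 50, size=10_000),
})

# 1. Simple random sampling: every individual has an equal chance of selection.
simple_random = population.sample(n=500, random_state=42)

# 2. Cluster sampling: randomly choose whole branches, then keep every member of them.
chosen_branches = rng.choice(population["branch"].unique(), size=5, replace=False)
cluster = population[population["branch"].isin(chosen_branches)]

# 3. Systematic sampling: random starting point, then every k-th record in list order.
k = len(population) // 500
start = int(rng.integers(0, k))
systematic = population.iloc[start::k]

# 4. Stratified sampling: draw the same fraction from each income bracket (stratum).
stratified = (
    population.groupby("income_bracket", group_keys=False)
    .apply(lambda stratum: stratum.sample(frac=0.05, random_state=42))
)

print(len(simple_random), len(cluster), len(systematic), len(stratified))
```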

Quantitative Data Collection Method: Observation

To collect quantitative data, analysts and researchers employ structured observation. This process involves establishing clear criteria for the behaviors or events of interest and developing standardized procedures for recording numerical data. Tools like checklists and observation schedules help ensure consistent and objective measurement, focusing on quantifiable aspects such as frequency or duration.

 In structured observations, analysts and researchers establish clear criteria for the subject of interest and develop standardized procedures for recording numerical data.
Source: Freepik
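As an illustration of how such a checklist translates into numbers, the short Python sketch below tallies event frequency and average duration from a hypothetical observation log; the sessions, event categories, and durations are invented for demonstration.

```python
import pandas as pd

# Hypothetical structured-observation log: each row is one event recorded
# against a predefined checklist category, with its measured duration.
observations = pd.DataFrame({
    "session": [1, 1, 1, 2, 2, 3],
    "event": ["price_check", "order_placed", "price_check",
              "order_placed", "price_check", "order_placed"],
    "duration_sec": [12, 45, 8, 50, 15, 40],
})

# Frequency: how often each predefined event category was observed.
frequency = observations["event"].value_counts()

# Duration: average time spent on each event category.
avg_duration = observations.groupby("event")["duration_sec"].mean()

print(frequency)
print(avg_duration)
```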

Ways to Conduct Observations for Quantitative Data Collection

Structured observations can be conducted in various settings, both in person and online. 

  • Online Observations: This can be automated using analytics tools or software. For instance, TEJ’s TCRI™ Watchdog monitors company-related news or events and then categorizes the information to provide quantifiable data such as event frequency and intensity scores. This automated approach saves researchers significant time and effort compared to manual observation.
  • In-Person Observations: This might involve observing customer behavior in a retail store, counting the number of people entering a specific area, or measuring the time spent on a particular task. 

Pros and Cons of Observations

This method’s key advantage is its objectivity, as it relies on direct measurement rather than self-reported data. It can also provide insights into actual market trends or behavior rather than stated intentions.

Still, observation can be time-consuming without the right tools and may be influenced by observer bias if not carefully structured. Also, in online settings, privacy concerns must be addressed to comply with data privacy regulations. 

Quantitative Data Collection Method: Existing Data Review

Document review is a secondary research method where researchers systematically examine pre-existing records and datasets. This involves identifying, extracting, and organizing relevant numerical information from these sources to facilitate quantitative data analysis.

Analysts can examine pre-existing records and datasets to collect quantitative data.
Source: Freepik
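A minimal sketch of this extract-and-organize workflow might look like the following; the CSV file name and column names are placeholders rather than any specific provider's schema.

```python
import pandas as pd

# Hypothetical export of an existing dataset: one row per ticker per trading day.
prices = pd.read_csv("historical_prices.csv", parse_dates=["date"])

# Extract the numerical fields needed for the analysis and coerce bad entries to NaN.
numeric_cols = ["close", "volume"]
prices[numeric_cols] = prices[numeric_cols].apply(pd.to_numeric, errors="coerce")

# Quality checks typical of secondary data: missing values and date coverage.
print(prices[numeric_cols].isna().sum())
print(prices["date"].min(), prices["date"].max())

# Organize into an analysis-ready format: monthly returns per ticker.
monthly_returns = (
    prices.set_index("date")
    .groupby("ticker")["close"]
    .resample("ME")                      # month-end frequency ("M" in older pandas)
    .last()
    .groupby(level="ticker")
    .pct_change()
)
```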

Data that Can Be Reviewed for Quantitative Data Collection

Several types of data sources can be reviewed:

  • Public Records: These include official documents maintained by government agencies, public institutions, or organizations. Examples include financial statements of publicly traded companies, census data, and government reports. These records often contain valuable quantitative data on various aspects of society, the economy, and specific industries.
  • Personal Documents: These are records created by individuals, such as personal financial statements, medical records, or academic transcripts. While access to personal documents is often restricted due to privacy concerns, they can be a valuable source of quantitative data when available and ethically obtained.
  • Physical Evidence: This category includes physical objects or artifacts that contain quantitative information. Examples include sales receipts, inventory records, or manufacturing output data. These tangible records can provide direct evidence of quantifiable activities.
  • Specialized Datasets: In addition to general data sources, many industries rely on specialized datasets curated by expert providers. Such specialized datasets provide readily accessible and reliable quantitative data for in-depth analysis and strategy development in specific sectors. For example, in the financial industry, TEJ’s comprehensive quantitative investment database includes a wide range of financial and economic data, from basic stock prices to detailed financial metrics such as company risk attributes and broker trading information.

Pros and Cons of Existing Data Review

One of the primary advantages of reviewing existing data is its cost-effectiveness. Compared to primary research methods that require significant investment in data collection, such as surveys or experiments, utilizing existing data minimizes expenses related to participant recruitment, data collection instruments, and personnel. This makes it a particularly attractive option for researchers with limited budgets.

Moreover, this method offers quick access to large volumes of information. Because the data has already been collected and compiled, researchers can quickly begin their analysis without the delays associated with primary data collection, such as designing surveys, obtaining ethical approvals, and waiting for responses.

Still, this method may present data quality issues, such as missing variables, inconsistent data collection methods, or outdated information. Thus, it is important to select a source with reliable, up-to-date data to ensure effective quantitative analysis.

Quantitative Data Collection Method: Survey or Questionnaire

Surveys and questionnaires are widely used for quantitative data collection, focusing on gathering numerical data through structured questions. These questions are typically closed-ended, using formats like multiple-choice, rating scales, or yes/no options to facilitate easy quantification and statistical analysis. This structured approach allows for efficient data aggregation and comparison across responses.

Surveys and questionnaires gather numerical data through structured questions.
Source: Freepik
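Because the responses are numeric or categorical by design, aggregation is straightforward. The sketch below uses a hypothetical 1-to-5 rating question and a yes/no question to show the kind of summary statistics this structure enables.

```python
import pandas as pd

# Hypothetical closed-ended survey: each row is one respondent,
# with a 1-5 rating scale question and a yes/no question.
responses = pd.DataFrame({
    "satisfaction_1to5": [4, 5, 3, 4, 2, 5, 4],
    "would_recommend":   ["yes", "yes", "no", "yes", "no", "yes", "yes"],
})

# Rating scale: mean score and the distribution of answers.
print(responses["satisfaction_1to5"].mean())
print(responses["satisfaction_1to5"].value_counts(normalize=True).sort_index())

# Yes/no question: proportion of respondents answering "yes".
print((responses["would_recommend"] == "yes").mean())
```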

Ways to Conduct Survey/Questionnaire for Quantitative Data Collection

Two common surveys/questionnaire methods for quantitative data collection are:

  1. Online Survey/Questionnaire:
    These are distributed electronically, often via email links, website pop-ups, or social media platforms. Online surveys are widely used due to their cost-effectiveness, speed, and flexibility, as respondents can participate at their convenience using any device. This offers broad audience reach while reducing geographic barriers. The downside is that online surveys rely on internet access, so they may exclude populations with limited connectivity or less familiarity with digital tools.
  2. Mail Survey/Questionnaire:
    This traditional method involves sending physical questionnaires through the mail, often including a cover letter and a prepaid return envelope.  While response rates can be lower and slower than online surveys, mail surveys offer a sense of anonymity and allow respondents to answer at their own pace. This can be beneficial for gathering sensitive information or reaching populations with limited internet access.

Pros and Cons of Surveys/Questionnaires

The strengths of surveys and questionnaires include their ability to collect data from a large number of respondents efficiently and their capacity to standardize data collection for easy analysis. 

However, there could be a potential response bias, where respondents may not accurately represent the target population, and the possibility of low response rates, especially with mail surveys. 

Quantitative Data Collection Method: Interview 

Interviews, when structured appropriately, can be a valuable method for collecting quantitative data. Unlike qualitative interviews that explore in-depth perspectives, quantitative interviews focus on gathering measurable data through standardized questions. These questions are typically closed-ended, similar to those in surveys, so that responses can be quantified for statistical analysis.

For effective quantitative data collection through interviews, researchers should ensure clarity in their questions, maintain consistency in the interview process, and use neutral language to avoid bias.

It is also important to note that interviews can be categorized by structure as structured, semi-structured, or unstructured. Researchers should understand these differences to conduct interviews properly for quantitative data collection:

  • In structured interviews, the interviewer adheres strictly to a predetermined set of questions, leaving no room for deviation. This ensures consistency across all participants. 
  • Semi-structured interviews offer some flexibility, allowing the interviewer to ask follow-up questions for clarification but still maintaining a core set of standardized inquiries. 
  • Unstructured interviews rely on open-ended questions and are primarily used for qualitative research due to their exploratory nature.

Quantitative interviews focus on gathering measurable data through standardized, closed-ended questions.
Source: Freepik

Ways to Conduct Interviews for Quantitative Data Collection

Several types of interviews are used for quantitative data collection:

  • Telephone Interviews:
    These involve conducting interviews over the phone. While less common today due to the rise of online methods, they can still be useful for reaching geographically dispersed populations or when face-to-face interaction is not feasible.
  • Face-to-Face Interviews: These involve direct, in-person interaction between the interviewer and the respondent. They offer the advantage of building rapport and observing non-verbal cues. Yet, they can be more time-consuming and expensive compared to other methods.
  • Computer-Assisted Personal Interviewing (CAPI): This method combines face-to-face interaction with technology. Interviewers use devices like tablets or laptops to administer the questionnaire and directly input responses into a database. This streamlines the data collection process and reduces errors associated with manual data entry.

Pros and Cons of Interviews

Interviews provide reliable and in-depth quantitative data, as interviewers have the flexibility to clarify ambiguities with the respondent in real time, ensuring more complete numerical information.

Nevertheless, this method can be expensive and time-consuming due to the need for trained interviewers. Furthermore, response bias can be a concern, particularly if participants feel pressured to answer in a socially desirable way, impacting data accuracy.

Example of Quantitative Data Collection and Its Use Case in Finance

Let’s consider an example of how quantitative data is used to evaluate investment strategies, focusing on factor investing, which involves selecting stocks based on specific elements or “factors” that have historically been associated with higher returns.

In this case, an investment firm wants to determine whether portfolios constructed based on value (undervalued stocks) and momentum (stocks with strong recent performance) outperform a general market benchmark. The entire process would involve the following steps:

Step 1: Quantitative Data Collection from Financial Datasets

To conduct this analysis, the firm collects historical financial data from a reliable data provider like TEJ. By reviewing the financial dataset, the firm can gather all the necessary information, including monthly stock returns, book-to-market ratios (a key metric for identifying value stocks), and past 12-month returns (used to identify momentum stocks) for a broad universe of stocks over 10 years, covering 120 months.
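In code, the collected dataset might be organized as one row per stock per month, as in the hypothetical sketch below; the file name and column names are placeholders, not an actual TEJ schema.

```python
import pandas as pd

# Hypothetical monthly panel assembled from the provider's dataset:
# one row per (stock, month), covering 120 months of history.
data = pd.read_csv("factor_data.csv", parse_dates=["month"])

# Columns assumed for this analysis (placeholder names):
#   monthly_return   - total return for the month
#   book_to_market   - book-to-market ratio, the value signal
#   past_12m_return  - trailing 12-month return, the momentum signal
print(data.columns.tolist())
print(data["month"].nunique())   # should equal 120 for the 10-year sample
```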

Step 2: Portfolio Construction with Collected Financial Data

Using this data, the firm constructs two distinct portfolios. The first is a value portfolio which comprises stocks with the highest book-to-market ratios, indicating that these stocks may be undervalued by the market. The second, a momentum portfolio, consists of stocks with the highest returns over the previous 12 months, suggesting strong recent performance. Both portfolios are rebalanced monthly to ensure they continue to reflect the desired factor exposures.
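A simplified sketch of this construction step is shown below. It assumes the hypothetical panel from Step 1, uses the top quintile of each signal as one possible cutoff for “highest,” and equal-weights the selected stocks; re-running the selection every month is what implements the monthly rebalancing.

```python
import pandas as pd

def top_quintile_portfolio(panel: pd.DataFrame, signal: str) -> pd.Series:
    """Each month, hold an equal-weighted portfolio of the stocks whose
    signal falls in the top 20%, which amounts to monthly rebalancing."""
    def one_month(month_slice: pd.DataFrame) -> float:
        cutoff = month_slice[signal].quantile(0.8)
        selected = month_slice[month_slice[signal] >= cutoff]
        return selected["monthly_return"].mean()   # equal-weighted portfolio return
    return panel.groupby("month").apply(one_month)

# Assuming `data` from the Step 1 sketch:
# value_returns = top_quintile_portfolio(data, "book_to_market")
# momentum_returns = top_quintile_portfolio(data, "past_12m_return")
```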

Step 3: Calculate Financial Metrics to Evaluate Performance

After constructing the portfolios, the firm calculates several key performance metrics. These include the average monthly return, which represents the average return generated by the portfolio each month; the annualized return, which is the compounded average monthly return over a year; volatility, measured by the standard deviation of returns, which indicates the degree of price fluctuation; and the Sharpe ratio, a risk-adjusted return measure where higher values indicate better performance.  
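Given a monthly return series for each portfolio, these metrics follow from standard formulas. The sketch below is one possible implementation; the 2% risk-free rate is an illustrative assumption, and transaction costs are ignored.

```python
import numpy as np
import pandas as pd

def performance_metrics(monthly_returns: pd.Series, annual_rf: float = 0.02) -> dict:
    """Compute the summary metrics described above from a monthly return series.
    The 2% risk-free rate is an illustrative assumption, not a recommendation."""
    avg_monthly = monthly_returns.mean()
    annualized_return = (1 + avg_monthly) ** 12 - 1          # compounded average
    annualized_vol = monthly_returns.std() * np.sqrt(12)     # volatility
    sharpe = (annualized_return - annual_rf) / annualized_vol
    return {
        "average_monthly_return": avg_monthly,
        "annualized_return": annualized_return,
        "volatility": annualized_vol,
        "sharpe_ratio": sharpe,
    }

# Example with a toy return series (unrelated to the hypothetical results below):
toy = pd.Series([0.012, -0.004, 0.009, 0.015, 0.002, 0.011])
print(performance_metrics(toy))
```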

Hypothetical results of this analysis might reveal the following insights:

Portfolio/Benchmark       Value Portfolio    Momentum Portfolio    Market Benchmark
Average Monthly Return    1.1%               0.9%                  0.8%
Annualized Return         13.9%              11.4%                 10%
Volatility                15%                12%                   14%
Sharpe Ratio              0.85               0.95                  0.71

Step 4: Analyze Results to Make Investment Decisions or Further Analysis

These quantitative results indicate that both the value and momentum strategies outperformed the market benchmark over the analyzed period. While the value portfolio generated higher overall returns, the momentum portfolio demonstrated better risk-adjusted performance, as indicated by its higher Sharpe ratio and lower volatility. 

As such, the investor might decide to allocate a portion of their portfolio to either the value strategy, momentum strategy, or even a combination of both. The specific allocation would depend on their individual risk tolerance and investment goals. A more risk-averse investor might favor the less volatile momentum strategy, whereas an investor seeking higher returns might allocate more to the value strategy, accepting the higher volatility.

In turn, this type of analysis enables investment firms to make data-driven decisions about portfolio construction and factor allocation. Further analysis, such as examining rolling returns, conducting regression analysis, or evaluating drawdowns, could provide even deeper insights into the performance and risk characteristics of these investment strategies.
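As an illustration, the rolling returns and drawdowns mentioned above can be derived from the same monthly return series, as in the brief sketch below.

```python
import pandas as pd

def max_drawdown(monthly_returns: pd.Series) -> float:
    """Largest peak-to-trough decline of the cumulative return path."""
    wealth = (1 + monthly_returns).cumprod()
    return (wealth / wealth.cummax() - 1).min()

def rolling_annual_return(monthly_returns: pd.Series) -> pd.Series:
    """Compounded return over each trailing 12-month window."""
    return (1 + monthly_returns).rolling(12).apply(lambda w: w.prod() - 1)

# Usage, assuming a monthly return series such as `value_returns` from Step 2:
# print(max_drawdown(value_returns))
# print(rolling_annual_return(value_returns).dropna())
```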

Important Reminder: This analysis is for reference only and does not constitute any product or investment advice.

We welcome readers interested in various trading strategies to consider purchasing relevant solutions from Quantitative Finance Solution. With our high-quality databases, you can construct a trading strategy that suits your needs.

“Taiwan stock market data, TEJ collects it all.”

The characteristics of the Taiwan stock market differ from those of European and American markets. In the first quarter of 2024 in particular, with the Taiwan stock index reaching a new high of 20,000 points driven by the rise in TSMC’s stock price, global institutional investors are paying more attention to the performance of the Taiwan stock market.

Taiwan Economical Journal (TEJ), a financial database provider established in Taiwan more than 30 years ago, serves local financial institutions and academic institutions, maintains long-term cooperation with internationally renowned data providers, and supplies high-quality financial data for five financial markets in Asia.

  • Complete Coverage: Includes all listed companies on stock markets in Taiwan, China, Hong Kong, Japan, Korea, etc. 
  • Comprehensive Analysis of Enterprises: Operational aspects, financial aspects, securities market performance, ESG sustainability, etc. 
  • High-Quality Database: TEJ data is cleaned, checked, enhanced, and integrated to ensure it meets the information needs of financial and market analysis. 

With TEJ’s assistance, you can access relevant information about major stock markets in Asia, such as securities market data, financial data, enterprise operations, boards of directors, and sustainability data, providing investors with timely and high-quality content. Additionally, TEJ offers advisory services to help solve problems in both the theory and practice of financial management.

Quantitative Data Solution for Rigorous Financial Strategies

Accurate data is essential for making informed financial decisions and developing successful investment strategies, as access to reliable and comprehensive data can be the difference between profit and loss, especially in today’s complex markets.

TEJ provides high-quality financial and economic data essential for rigorous quantitative analysis. Our datasets encompass a wide range of information, from fundamental financial metrics and company risk attributes to detailed market data. Moreover, TEJ collaborates with industry leaders like Eagle Alpha, Neudata, and Snowflake to ensure broad accessibility to our premium data for investors worldwide. 

Explore TEJ’s services today and enhance your financial strategies with our trusted quantitative data solutions.
