Example Of Data Representation Report


1. INTRODUCTION

In this paper we apply basic techniques of statistical analysis and probability theory to a real-world problem.

This report aims to build a demand function for the quantity of travel in Ruritania, using the data set for Ruritania from 1985 to 2014. We are interested in the factors that have a significant effect on the quantity of travel demanded. For this purpose we adopt a traditional Marshallian demand function, Qd1 = f(p1, p2, Y), which states that the quantity demanded of any good is a function of its own price, the prices of other goods, and the income level. The goal of the analysis is therefore to understand the nature of the association between the quantity of travel demanded and the other factors that are significant. To do this, we perform a regression analysis with the quantity of travel as the dependent variable and the other factors as independent variables.

We are given a data set of 120 observations. Each observation records the price and quantity levels of the goods at a particular date. The data are characterized by 37 variables:

PMTFH - .

QTRAV26 -
You should clearly explain each variable in this section. What does each variable mean?
According to the scatterplots of the quantity of travel against the other variables (see graphs 1, 2 and 3 in the appendix), the relationship appears to follow a log-log function. This implies constant elasticity at every level of travel quantity, with the elasticity given by the corresponding parameter. In this report we investigate how these factors influence the quantity of travel demanded.
There are no graphs in the appendix. In addition to scatterplots, I recommend that you provide frequency histograms of the distributions of the chosen variables to show that the data are suitable for forecasting.
2. ANALYTICAL PROCESS.

Before choosing only 3 variables for your research, you should have a well-grounded explanation of why the other variables are not useful.

One of the most common methods of doing this is to run a correlation matrix of LQtrav against all the variables you have. Many coefficients will not be significant, but those that are significant and strongly correlated may be used in the further research.
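One way to carry out this screening step outside PcGive is sketched below in Python. The file name ruritania.csv and the assumption that the log-transformed series are columns of one table are purely illustrative; adjust them to the actual data set.

```python
# Variable screening sketch: correlate LQtrav with every candidate regressor.
# "ruritania.csv" is a hypothetical file holding the data set's columns,
# including LQtrav; replace the name and columns with the real ones.
import pandas as pd

df = pd.read_csv("ruritania.csv")

# Pairwise correlations of LQtrav with all other variables, strongest first.
corr_with_lq = df.corr()["LQtrav"].drop("LQtrav")
print(corr_with_lq.reindex(corr_with_lq.abs().sort_values(ascending=False).index))
```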
We start with the regression y = β0 + β1x1 + β2x2 + β3x3 + e
where y = LQtrav (natural logarithm of the quantity of travel)
x1 = LPtrav (natural logarithm of the travel price)
x2 = LY (natural logarithm of income)
x3 = LPALLOTH (natural logarithm of the price of all other goods)

Running the regression in PcGive gives the following (the PcGive least squares estimator output is included in the appendix, see 'data_model 1'):

y = 10.32 - 2.570x1 + 0.6368x2 + 1.958x3
SE    (1.565)   (0.1532)   (0.1944)   (0.2348)    R² = 0.850943
To test for first-order autocorrelation we apply the Durbin-Watson test, assuming the errors follow an AR(1) process:
et = ρet-1 + νt,  E(νt) = 0,  var(νt) = σν²,  cov(νt, νs) = 0 for t ≠ s
H0: ρ = 0    H1: ρ ≠ 0

Under H0, dU < DW < 4 - dU

At α = 0.05 (k = 4 parameters, n > 100): dL = 1.552, dU = 1.675
DW = 1.24 (reported by PcGive)
Since DW = 1.24 < dL = 1.552, we reject H0: there is evidence of positive autocorrelation. Therefore, the assumptions of the multiple regression model do not hold.
When autocorrelation exists in the model, the least squares estimator is still an unbiased linear estimator, but it is not efficient, and the computed standard errors are incorrect. So without appropriate treatment this model cannot be used for hypothesis testing or forecasting.
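The estimation and the Durbin-Watson check above were run in PcGive; an equivalent sketch in Python/statsmodels is given below, reusing the hypothetical DataFrame df from the screening sketch (the column names follow the report's notation and are assumptions).

```python
# OLS of LQtrav on LPtrav, LY and LPALLOTH, followed by a Durbin-Watson check.
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

y = df["LQtrav"]
X = sm.add_constant(df[["LPtrav", "LY", "LPALLOTH"]])

model1 = sm.OLS(y, X).fit()
print(model1.summary())                 # coefficients, standard errors, R^2

dw = durbin_watson(model1.resid)        # compare with the dL/dU bounds above
print(f"Durbin-Watson statistic: {dw:.2f}")
```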

We can use the generalized least squares (GLS) method to correct for this.

yt = β0 + β1x1t + β2x2t + β3x3t + et
where et = ρet-1 + νt
(From the PcGive test report, see appendix 'Error autocorrelation coefficients in auxiliary regression (model 1)') ρ = 0.3772
et = 0.3772et-1 + νt    se(ρ) = 0.08724

Therefore we can transform the old problematic model into a new model

y* = β0* + β1x1* + β2x2* + β3x3* + νt

where

y* = yt - ρyt-1
x1* = x1t - ρx1t-1
x2* = x2t - ρx2t-1
x3* = x3t - ρx3t-1

We estimate the new multiple regression model using PcGive.

This gives the new equation (the PcGive least squares estimator output is included in the appendix, see 'data_model 2'):
y* = 6.364 - 2.621x1* + 0.6266x2* + 2.035x3*
se    (1.293)   (0.2065)   (0.2614)   (0.3291)    R² = 0.733067
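A sketch of the same quasi-differencing step (Cochrane-Orcutt style feasible GLS) in Python is given below, reusing df and model1 from the earlier sketches. Note that PcGive's auxiliary regression also includes the regressors, so the rho obtained here may differ slightly from the 0.3772 used in the report; dropping the first observation matches the fall from 120 to 119 observations.

```python
# Feasible GLS via the quasi-difference y*_t = y_t - rho * y_{t-1}.
import statsmodels.api as sm

resid = model1.resid
# Simple estimate of rho: regress the residuals on their own first lag.
rho = sm.OLS(resid[1:].values, resid[:-1].values).fit().params[0]

def quasi_diff(s, rho):
    # Quasi-difference a series and drop the first observation.
    return (s - rho * s.shift(1)).dropna()

y_star = quasi_diff(df["LQtrav"], rho)
X_star = sm.add_constant(df[["LPtrav", "LY", "LPALLOTH"]].apply(quasi_diff, rho=rho))

model2 = sm.OLS(y_star, X_star).fit()
print(f"rho = {rho:.4f}")
print(model2.summary())
```

statsmodels also offers sm.GLSAR(y, X, rho=1).iterative_fit(), which iterates the same idea until rho converges.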
We check for autocorrelation again in the new model, this time testing for AR(2) errors with the Breusch-Godfrey test.
y* = β0* + β1x1* + β2x2* + β3x3* + νt
where νt = ρ1νt-1 + ρ2νt-2 + ut
The auxiliary regression model is (more detailed information from PcGive is included in the appendix, 'Error autocorrelation coefficients in auxiliary regression (model 2)'):
νt = β0* + β1x1* + β2x2* + β3x3* + ρ1νt-1 + ρ2νt-2 + ut
H0: ρ1 = ρ2 =0

H1: at least one of the ρ≠0

For a large sample, under H0 the test statistic is
(n - p)R² ~ χ²(p), with p = 2 and α = 0.05
χ² (from PcGive) = 3.4932 [0.1744]
The p-value of the test statistic is 0.1744 > 0.05, so we do not reject H0: there is no evidence of first- or second-order serial correlation in the new model.
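The same Breusch-Godfrey check can be reproduced with statsmodels' built-in test; a sketch reusing model2 from the previous sketch:

```python
# Breusch-Godfrey LM test for serial correlation up to lag 2 in the new model.
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

lm_stat, lm_pvalue, f_stat, f_pvalue = acorr_breusch_godfrey(model2, nlags=2)
print(f"Chi^2(2) = {lm_stat:.4f}, p-value = {lm_pvalue:.4f}")  # compare with 0.05
```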
cov(yi,yj) = cov(νi, νj) = 0, where i≠j

Now we need to check the other assumptions of the general regression model.

We can use the WHITE test to check if there is heteroskedasticity.
The WHITE test auxiliary regression for heteroskedasticity using the squared regressors is
νi² = α0 + α1xi1* + α2xi2* + α3xi3* + α4xi1*² + α5xi2*² + α6xi3*² + ei

Heteroskedasticity coefficients for the square products are reported by PcGive (see appendix 'heteroskedasticity test summary').

H0: α1 = α2 = α3 = α4 = α5 = α6 = 0

H1: at least one of the αi ≠ 0

Under the null hypothesis the test statistic is
χ² = NR² ~ χ²(6), α = 0.05
χ²c.v. = 12.592
As indicated in PcGive, χ² = 6.3951 < χ²c.v.

Therefore we do not reject H0.

The auxiliary regression using squares and cross products is (heteroskedasticity coefficients for the squares and cross products are reported by PcGive, see appendix 'heteroskedasticity test summary'):
νi² = α0 + α1xi1* + α2xi2* + α3xi3* + α4xi1*² + α5xi2*² + α6xi3*² + α7xi1*xi2* + α8xi1*xi3* + α9xi2*xi3* + ei
H0: α1 = α2 = ... = α9 = 0

H1: at least one of the αi ≠ 0

Under the null hypothesis the test statistic is
χ² = NR² ~ χ²(9), α = 0.05
χ²c.v. = 16.919
As indicated in PcGive, χ² = 11.188 < χ²c.v.
Therefore we do not reject H0: the disturbance terms are homoskedastic and the standard errors of the coefficients are not biased, i.e. var(νi) = σ², so the model passes the WHITE test.
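statsmodels implements the squares-and-cross-products version of the White test directly; a sketch using model2 from above (the squares-only version would need the auxiliary regression to be built by hand):

```python
# White test for heteroskedasticity: auxiliary regression of the squared
# residuals on the regressors, their squares and their cross products.
from statsmodels.stats.diagnostic import het_white

lm_stat, lm_pvalue, f_stat, f_pvalue = het_white(model2.resid, model2.model.exog)
print(f"White LM = {lm_stat:.4f}, p-value = {lm_pvalue:.4f}")
```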

In order to test normality, we can use the Jarque-Bera test.

For a normally distributed sample, JB = (N/6)[S² + (K - 3)²/4] = 0, where S is the skewness and K the kurtosis of the residuals.
H0: JB = 0    H1: JB ≠ 0

Under the null hypothesis the test statistic is

χ² = JB ~ χ²(2), α = 0.05, and χ²c.v. = 5.99
The JB statistic calculated in PcGive is 3.4304, which is less than 5.99, so we do not reject H0: the disturbance term is normally distributed in the given sample, i.e. νi ~ N(0, σ²) (more test detail reported by PcGive is included in the appendix, 'Normality test for residuals').
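A sketch of the same normality check on the residuals of model2, using statsmodels' Jarque-Bera routine:

```python
# Jarque-Bera normality test: JB = (N/6) * (S^2 + (K - 3)^2 / 4),
# where S is the skewness and K the kurtosis of the residuals.
from statsmodels.stats.stattools import jarque_bera

jb_stat, jb_pvalue, skew, kurt = jarque_bera(model2.resid)
print(f"JB = {jb_stat:.4f}, p-value = {jb_pvalue:.4f}, "
      f"skewness = {skew:.4f}, kurtosis = {kurt:.4f}")
```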

Now we check the model specification using Ramsey's RESET test.

y* = 6.364 - 2.621x1* + 0.6266x2* + 2.035x3*

The auxiliary function is:

y* = β0* + β1x1* + β2x2* + β3x3* + γ1ŷ*² + e
H0: γ1= 0

H1: γ1≠ 0

Under H0, the test statistic follows F(1, 119 - 5), α = 0.05
Fc.v. = 3.92

Since the statistic reported by PcGive is smaller than the critical value, we do not reject H0: there is no evidence of model misspecification, and we retain the model.
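Recent statsmodels releases also include Ramsey's RESET test; a sketch applying it to model2 with the squared fitted values, as in the auxiliary regression above:

```python
# Ramsey RESET test: augment the regression with the squared fitted values
# and test whether their coefficient is zero (functional-form misspecification).
from statsmodels.stats.diagnostic import linear_reset

reset = linear_reset(model2, power=2, use_f=True)
print(reset)  # F statistic and p-value for H0: no misspecification
```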

You may also test for multicollinearity (for example, using the Variance Inflation Factor, VIF); a sketch follows below.
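A sketch of that check on the transformed regressors, reusing model2 from above:

```python
# Variance inflation factors (VIF) for the regressors of model2; values well
# above 10 are a common rule-of-thumb warning sign for multicollinearity.
from statsmodels.stats.outliers_influence import variance_inflation_factor

exog = model2.model.exog            # includes the constant in column 0
for i, name in enumerate(model2.model.exog_names):
    if name != "const":
        print(f"VIF({name}) = {variance_inflation_factor(exog, i):.2f}")
```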
3. RESULT ANALYSIS & INTERPRETATION

The regression model we have estimated satisfies all the assumptions of the multiple regression model, and there is no evidence of misspecification:
y* = 6.364 - 2.621x1* + 0.6266x2* + 2.035x3*
se    (1.293)   (0.2065)   (0.2614)   (0.3291)
R² = 0.733067    n = 119    k = 4

We conduct an F test to check whether the model is significant overall.

H0: β1 = β2 = β3 = 0

H1: at least one of the βi is not zero

Under H0, the test statistic is
F = [R²/(k - 1)] / [(1 - R²)/(n - k)] ~ F(3, 115), α = 0.05
The F statistic reported by PcGive is F(3, 115) = 105.3 [0.000]**
With a p-value of 0.000 (to 3 decimal places), which is less than α, we reject H0, accept the alternative hypothesis, and conclude that the model is significant overall.
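As a quick check, the reported F statistic can be recomputed from R², k and n:

```python
# Recompute the overall F statistic for model 2 from R^2, k and n.
r2, k, n = 0.733067, 4, 119
f_stat = (r2 / (k - 1)) / ((1 - r2) / (n - k))
print(f"F(3, {n - k}) = {f_stat:.1f}")  # about 105.3, matching the PcGive output
```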

We can also apply t-tests to check the significance of each individual coefficient.

H0: βi = 0
H1: βi ≠ 0 (i = 1, 2, 3)
Under H0, the test statistic is t = bi/se(bi) ~ t(115), α = 0.05

The results reported by PcGive are:

for β1, p-value = 0.000 < α
for β2, p-value = 0.0181 < α
for β3, p-value = 0.000 < α

Therefore we reject H0 in each case and conclude that, on the sample evidence, x1*, x2* and x3* each have a significant effect on y*.

In 2004 global warming became a better-known concern for the public around the world and was presented in the media more often than ever. Since 2004, more government policies and regulations on reducing carbon emissions have been introduced, working toward the Kyoto Protocol targets, and there have been more non-governmental campaigns and organisations promoting the reduction of greenhouse gas emissions. Since travel is an important source of carbon emissions, these developments may have changed people's preferences for travel. If this has a significant effect on the demand for travel, the coefficients might change, and the estimated function would then be unstable across subsamples. To check whether there is a structural break from 2004 onwards, we conduct a Chow test.

We assign a dummy variable D to observations occurring in or after 2004:

D = 0 if the observation occurred before 2004; D = 1 if the observation occurred in or after 2004.
E(y*) = β0 + β1x1* + β2x2* + β3x3* + λ0D + λ1(Dx1*) + λ2(Dx2*) + λ3(Dx3*), which equals β0 + β1x1* + β2x2* + β3x3* when D = 0 and (β0 + λ0) + (β1 + λ1)x1* + (β2 + λ2)x2* + (β3 + λ3)x3* when D = 1.
H0: λi = 0 (i=0,1,2,3)

H1: at least one of the λi is not zero.

Under H0, the test statistic is
F = [(RSSR - (RSS1 + RSS2))/4] / [(RSS1 + RSS2)/(N - 8)] ~ F(4, 111), α = 0.05
RSSR = 22.15    RSS1 = 14.16    RSS2 = 5.44    N = 119
(more detailed data from PcGive can be found in appendix ‘chow test statistics’)
Fc.v.= 2.45
F=3.61 > Fc.v.
Thus we reject, at the 5% level of significance, the null hypothesis that the demand equation is the same before and after 2004, and we propose that there is a structural break.
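As a check, the Chow statistic can be recomputed from the three residual sums of squares reported in the appendix:

```python
# Chow test: F = [(RSS_R - (RSS_1 + RSS_2)) / 4] / [(RSS_1 + RSS_2) / (N - 8)]
rss_r, rss_1, rss_2, n = 22.15, 14.16, 5.44, 119
f_chow = ((rss_r - (rss_1 + rss_2)) / 4) / ((rss_1 + rss_2) / (n - 8))
print(f"Chow F(4, {n - 8}) = {f_chow:.2f}")  # about 3.61 > 2.45, so reject H0
```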

The coefficient for the dummy variable D can be calculated as

λ = E(y*) for the subsample after 2004 - E(y*) for the subsample before 2004 = 8.61969 - 9.27306 = -0.6534

Therefore the modified model is

y* = 6.364 - 2.621x1* + 0.6266x2* + 2.035x3* - 0.6534D

Before interpreting the model we can transform the log-log function

The function y* = β0* + β1x1* + β2x2* + β3x3* + λD is actually
lnQt - ρlnQt-1 = β0(1 - ρ) + β1(lnPt - ρlnPt-1) + β2(lnYt - ρlnYt-1) + β3(lnÄt - ρlnÄt-1) + λD
(Q: quantity of travel demanded, P: price of travel, Y: income level, Ä: price of all other goods, D: dummy variable; from 2004 onwards D = 1, otherwise D = 0)

Therefore, removing the log function we obtain the quantity-demand equation

Qt / Qt-1^ρ = e^(λD) · e^((1-ρ)β0) · (Pt/Pt-1^ρ)^β1 · (Yt/Yt-1^ρ)^β2 · (Ät/Ät-1^ρ)^β3
Qt = Pt^β1 · Yt^β2 · Ät^β3 · e^(λD) · e^((1-ρ)β0) · [Qt-1 / (Pt-1^β1 · Yt-1^β2 · Ät-1^β3)]^ρ
Qt / (Pt^β1 · Yt^β2 · Ät^β3) = e^(λD) · e^((1-ρ)β0) · [Qt-1 / (Pt-1^β1 · Yt-1^β2 · Ät-1^β3)]^ρ
and at time t-1
Qt-1 / (Pt-1^β1 · Yt-1^β2 · Ät-1^β3) = e^(λD) · e^((1-ρ)β0) · [Qt-2 / (Pt-2^β1 · Yt-2^β2 · Ät-2^β3)]^ρ
We can substitute the last equation into the one before it, which gives
Qt / (Pt^β1 · Yt^β2 · Ät^β3) = e^(λD) · e^((1-ρ)β0) · {e^((1-ρ)β0) · [Qt-2 / (Pt-2^β1 · Yt-2^β2 · Ät-2^β3)]^ρ}^ρ
Substitutions like this can be made repeatedly until we reach time period 1 (t = n = 119), so we can simplify the equation (Q1 represents the quantity of travel demanded at time t = 1, and similarly for P1, Y1, Ä1):
Qt / (Pt^β1 · Yt^β2 · Ät^β3) = e^(λD) · e^((1-ρ)β0 · Σ(i=0..n-1) ρ^i) · [Q1 / (P1^β1 · Y1^β2 · Ä1^β3)]^(ρ^n) = e^(λD) · e^(β0(1-ρ^n)) · [Q1 / (P1^β1 · Y1^β2 · Ä1^β3)]^(ρ^n)
(note: Σ(i=0..n-1) ρ^i = (1-ρ^n)/(1-ρ))
Because, as found above, ρ = 0.3772 < 1, and n = 119 is quite large, ρ^n ≈ 0, so
Qt / (Pt^β1 · Yt^β2 · Ät^β3) = e^(λD) · e^(β0) · [Q1 / (P1^β1 · Y1^β2 · Ä1^β3)]^0 = e^(λD + β0)
Now we can rewrite the old log-log quantity function as
Qt = e^(λD + β0) · Pt^β1 · Yt^β2 · Ät^β3
where β0 = β0*/(1 - ρ) = 10.22, β1 = -2.621, β2 = 0.6266, β3 = 2.035, λ = -0.6534.
Finally we arrive at the demand function
Q = e^(-0.6534D + 10.22) · P^(-2.621) · Y^(0.6266) · Ä^(2.035)
or lnQ = 10.22 - 2.621lnP + 0.6266lnY + 2.035lnÄ - 0.6534D

This is the conducted gross demand equation for Ruritania, using data set 1985 to 2014.
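To illustrate how the fitted demand function can be used, the sketch below evaluates Q and confirms the own-price elasticity numerically; the input values for P, Y, Ä and D are purely illustrative and are not taken from the data set.

```python
# Evaluate the fitted demand function Q = exp(beta0 + lambda*D) * P^b1 * Y^b2 * A^b3.
import numpy as np

beta0, b1, b2, b3, lam = 10.22, -2.621, 0.6266, 2.035, -0.6534

def travel_demand(P, Y, A, D):
    """Predicted quantity of travel demanded; all inputs are illustrative."""
    return np.exp(beta0 + lam * D) * P**b1 * Y**b2 * A**b3

# A 1% rise in the travel price cuts predicted demand by roughly 2.6%,
# in line with the estimated price elasticity of -2.621.
q0 = travel_demand(P=1.00, Y=1.0, A=1.0, D=0)
q1 = travel_demand(P=1.01, Y=1.0, A=1.0, D=0)
print(f"Percentage change in Q: {100 * (q1 / q0 - 1):.2f}%")
```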

Since β1 + β2 + β3 = 0.0406 ≈ 0, the equation is approximately homogeneous of degree zero, which is consistent with the properties of a Marshallian demand function.
The price elasticity of demand for travel is ε(Q,P) = (∂Q/∂P)·(P/Q) = ∂lnQ/∂lnP = β1 = -2.621.
This tells us that, with the percentage changes of income and the price of other goods held constant, a 1% increase in the price of travel is estimated to reduce the quantity of travel consumed by 2.621%; the standard error of the coefficient is 0.2065, which is relatively small. Therefore, travel in Ruritania has a very elastic demand. An implication is that if the government wishes to reduce the quantity of travel in order to lower the resulting carbon emissions, it could tax travel, and this would be very effective in reducing the quantity of travel consumed in Ruritania. For firms providing travel as their product, for example bus companies, taxi companies, railway services etc., a high elasticity of demand might be bad news. Assuming the firms have price-setting power, since
markup = 1/(|price elasticity of demand| - 1), the more elastic the demand, the lower the mark-up firms can charge. This high elasticity of demand might be due to the many substitutes for travel: walking, bike riding etc. are essentially free (if we ignore the opportunity cost of time). Therefore, in many cases, when the cost of travel goes up, people can simply choose to walk at a price of zero, which makes the demand for travel relatively elastic.
The income elasticity of demand for travel is ε(Q,Y) = (∂Q/∂Y)·(Y/Q) = ∂lnQ/∂lnY = β2 = 0.6266.
With the percentage changes of the travel price and the prices of other goods held constant, a 1% increase in the income level is estimated to increase the quantity of travel consumed by 0.6266%, with a standard error of 0.2614. Travel can therefore be considered a normal good, and a necessity, in Ruritania.
The cross-price elasticity of demand for travel is ε(Qa,Pb) = (∂Qa/∂Pb)·(Pb/Qa) = ∂lnQa/∂lnPb = β3 = 2.035.
With the percentage changes of the travel price and the income level held constant, a 1% increase in the price of all other goods is estimated to increase the quantity of travel consumed by 2.035%; the standard error of the coefficient is 0.3291. Travel is considered a complement to other goods in Ruritania.
However, the variable 'price of all other goods' treats a bundle of goods together, and overall they appear complementary with travel. There is a potential problem: not all goods have the same cross-price elasticity with travel. While many goods appear to be complements of travel (we have to travel to purchase or consume them), there are a number of goods, such as telephone and Internet services, that may be substitutes for travel and might have a negative cross-price elasticity of demand with it. If we can telephone or email someone, it spares us the need to physically travel to another place and talk to that person face to face. This indicates that, if we had more detailed data, it might be a good idea to separate the goods and investigate the cross-price elasticity between travel and individual goods, or at least divide the whole bundle into smaller groups.
For the dummy variable, lnQ(after 2004) - lnQ(before 2004) = ln[Q(after 2004)/Q(before 2004)] = λ,
so the percentage difference between before and after 2004 is 100(e^λ - 1)% = -47.97%.
The dummy variable suggests that, keeping the income level and the prices of travel and other goods constant, the quantity of travel demanded after 2004 is 47.97% lower than before 2004. As suggested earlier, this structural break after 2004 might be the result of people becoming more aware of the seriousness and severity of global warming and hence changing their preferences regarding travel, consuming less of it. However, the data do not provide any information indicating the reason for this structural break, and the change may just as likely be due to other factors. Therefore, without further information, we cannot be certain whether the lower quantity of travel demanded is due to greater awareness of global warming.
The estimated intercept suggests that, when the percentage changes of all variables in the model are held constant, the quantity of travel grows at a rate of 10.22%. This is a relatively large number, and it indicates that there might be other factors influencing the demand for travel, for example technological development in transportation, such as the popularisation of commercial air travel. Technological advancement in transportation could increase the quantity of travel we consume. Furthermore, the inevitable process of globalisation would increase our demand for travel for both leisure and business. However, because the data set we used contains limited information, we cannot identify the factors driving the percentage growth in the quantity demanded when income, the price of travel and the prices of other related goods are kept constant.
When building the model, the price of travel used in the data set is only the monetary cost of travel, for example train tickets, taxi fares, the cost of operating a car etc. However, in real life we experience other costs, for example the opportunity cost of the time spent travelling. Randall (1994) argues that individuals value travel costs differently and that it is very difficult to observe the cost of travel directly; he suggests that travel costs are therefore unobservable. If the travel prices provided by this data set are in fact inaccurate, this could have serious implications, since with different costs we would obtain a different demand function. Englin and Shonkwiler (1995) incorporated the 'unobservable' travel cost and found that the value of travel time was 39.7% of the wage rate in their example.

Different types of travel would have different elasticities.

Another issue in our model is that travel can be undertaken for many different purposes, including work-related commuting, shopping, socialising, emergencies, recreation, joy rides etc. Different types of travel would have very different price, income and cross-price elasticities. The report by Litman (2013) ranked different types of travel by their user value, with emergency travel and commuting ranked highest and joy rides lowest. This suggests that travel value is highly variable and that the price sensitivity of travel demand differs across types of trip. It might therefore make more sense to consider the elasticities of different types of travel than the elasticity of aggregate travel demand.

PCGIVE OUTPUT

DATA_MODEL 1
Modelling LQtrav by OLS
The estimation sample is: 1 - 120
Coefficient Std.Error t-value t-prob Part.R^2
Constant 10.3219 1.565 6.59 0.0000 0.2727
LPtrav -2.57000 0.1532 -16.8 0.0000 0.7081
LY 0.636785 0.1944 3.28 0.0014 0.0847
LPALLOTH 1.95764 0.2348 8.34 0.0000 0.3747
sigma 0.471667 RSS 25.8065019
R^2 0.850943 F(3,116) = 220.7 [0.000]**
log-likelihood -78.0607 DW 1.24
no. of observations 120 no. of parameters 4
mean(LQtrav) 14.5466 var(LQtrav) 1.44277

Error autocorrelation coefficients in auxiliary regression (model 1):

Lag Coefficient Std.Error
1 0.37723 0.08724
RSS = 22.1976 sigma = 0.193023

DATA_MODEL 2

Modelling *LQT by OLS
The estimation sample is: 2 - 120
Coefficient Std.Error t-value t-prob Part.R^2
Constant 6.36362 1.293 4.92 0.0000 0.1739
*LP -2.62121 0.2065 -12.7 0.0000 0.5836
*LY 0.626628 0.2614 2.40 0.0181 0.0476
*LPallother 2.03503 0.3291 6.18 0.0000 0.2495
sigma 0.438902 RSS 22.1529711
R^2 0.733067 F(3,115) = 105.3 [0.000]**
log-likelihood -68.8251 DW 2.14
no. of observations 119 no. of parameters 4
mean(*LQT) 9.03148 var(*LQT) 0.697401

Error autocorrelation coefficients in auxiliary regression (model 2):

Lag Coefficient Std.Error
1 -0.071033 0.09385
2 0.15354 0.09456
RSS = 21.5027 sigma = 0.190289
Testing for error autocorrelation from lags 1 to 2 Chi^2(2) = 3.4932 [0.1744] and F-form F(2,113) = 1.7087 [0.1858]

Heteroscedasticity test summary

Heteroscedasticity coefficients:
Coefficient Std.Error t-value
*LP -0.15705 1.3135 -0.11957
*LY -1.1016 5.9411 -0.18542
*LPallother 7.1289 6.9905 1.0198
*LP^2 0.042978 0.23660 0.18165
*LY^2 0.13117 0.60183 0.21795
*LPallother^2 -1.1336 1.0559 -1.0736
RSS = 9.92098 sigma = 0.303086 effective no. of parameters = 7

Regression in deviation from mean

Testing for heteroscedasticity using squares
Chi^2(6)=6.3951 [0.3804] and F-form F(6,108)=1.0223 [0.4148]

Heteroscedasticity coefficients:

Coefficient Std.Error t-value
*LP -13.615 6.8634 -1.9836
*LY 14.787 10.087 1.4660
*LPallother 24.424 11.743 2.0800
*LP^2 -0.92036 0.57284 -1.6067
*LY^2 -1.5524 1.1476 -1.3527
*LPallother^2 -3.4616 1.9957 -1.7345
*LP**LY 2.2906 1.2562 1.8234
*LY**LPallother -1.6566 2.0914 -0.79213
*LP**LPallother 2.2535 1.5506 1.4533
RSS = 9.49869 sigma = 0.300772 effective no. of parameters = 10

Regression in deviation from mean

Testing for heteroscedasticity using squares and cross products
Chi^2(9)=11.188 [0.2630] and F-form F(9,105)=1.2107 [0.2963]

Normality test for Residuals

Observations 119 Mean 0.0000
Std.Devn. 0.43146 Skewness 0.29969
Excess Kurtosis 0.54230 Minimum -0.95847

Maximum 1.4326

Asymptotic test: Chi^2(2) = 3.2395 [0.1979]
Normality test: Chi^2(2) = 3.4304 [0.1799]

Chow test statistics

Subsample before 2004 (observation:2 – 76)
Coefficient Std.Error t-value t-prob Part.R^2
Constant 0.694620 2.476 0.280 0.7799 0.0011
*LP -2.93258 0.3187 -9.20 0.0000 0.5440
*LY 0.900787 0.3503 2.57 0.0122 0.0852
*LPallother 3.70720 0.9106 4.07 0.0001 0.1892
sigma 0.446609 RSS 14.1616171
R^2 0.682315 F(3,71) = 50.83 [0.000]**
log-likelihood -43.9097 DW 2.36
no. of observations 75 no. of parameters 4
mean(*LQT) 9.27306 var(*LQT) 0.594368

Subsample after 2004 (observation: 77 – 120)

Coefficient Std.Error t-value t-prob Part.R^2
Constant 4.88230 2.812 1.74 0.0902 0.0701
*LP -3.07509 0.3415 -9.00 0.0000 0.6697
*LY 0.226626 0.5895 0.384 0.7027 0.0037
*LPallother 3.39993 1.308 2.60 0.0130 0.1446
sigma 0.368712 RSS 5.43794843
R^2 0.795375 F(3,40) = 51.83 [0.000]**
log-likelihood -16.436 DW 2.6
no. of observations 44 no. of parameters 4
mean(*LQT) 8.61969 var(*LQT) 0.603981
References:
1. Randall, A. "A Difficulty with the Travel Cost Method." Land Economics 70 (1994): 88-96.
2. Englin, Jeffrey, and J. S. Shonkwiler. "Modeling Recreation Demand in the Presence of Unobservable Travel Costs: Toward a Travel Price Model." Journal of Environmental Economics and Management 29 (1995): 368-377.
3. Litman, Todd. Understanding Transport Demands and Elasticities: How Prices and Other Factors Affect Travel Behavior. Victoria Transport Policy Institute, March 2013.
Suggested further reading
Here is a list of literature which may be related to your research:
Allen, Michael Patrick. Understanding Regression Analysis. New York: Plenum Press, 1997.
Berry, William Dale. Understanding Regression Assumptions. Newbury Park, Calif.: Sage Publications, 1993.
Colell, Andreu, and Michael Dennis Whinston. Microeconomic Theory. New York: Oxford University Press, 1995.
Dobson, Annette J. An Introduction to Generalized Linear Models. 2nd ed. Boca Raton: Chapman & Hall/CRC, 2002.
Doornik, Jurgen A., and David F. Hendry. PcGive 12. London: Timberlake Consultants, 2007.
Kleinbaum, David G., and Lawrence L. Kupper. Applied Regression Analysis and Other Multivariable Methods. North Scituate, Mass.: Duxbury Press, 1978.
Kuntz, Masako. Ordinal Log-linear Models. Thousand Oaks, Calif.: Sage Publications, 1994.
Neeleman, D. Multicollinearity in Linear Economic Models. Tilburg: Tilburg University Press, 1973.
Nicholson, Walter. Microeconomic Theory: Basic Principles and Extensions. 2d ed. Hinsdale, Ill.: Dryden Press, 1978.
