
Math/Physics/Economics/Statistics Problems

Create a dataset with 1000 observations. Generate RootCause and OtherThing as independent, uncorrelated variables, each drawn from a normal distribution with mean 0 and variance 1. Create a set of normal error terms with mean 0 and variance 1, and let Outcome = 1 + RootCause + 3·OtherThing + errors.
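For concreteness, here is a minimal sketch of this data-generating step in Python with NumPy; the choice of language, the reuse of the problem's variable names, and the seed are all arbitrary, and any statistical package would do.

```python
# Minimal sketch of the first dataset: two independent standard normal
# regressors and a standard normal error term (seed is an arbitrary choice).
import numpy as np

rng = np.random.default_rng(12345)
n = 1000

RootCause = rng.normal(0, 1, size=n)    # independent of OtherThing by construction
OtherThing = rng.normal(0, 1, size=n)
errors = rng.normal(0, 1, size=n)

Outcome = 1 + RootCause + 3 * OtherThing + errors
```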

1. Draw a graphical representation of the data generating process (DGP) involving the variables Outcome, RootCause, and OtherThing; that is, show by drawing arrows how these three variables relate to each other in the data you generated. Are RootCause and OtherThing independent?

If you think they are independent, how would you represent graphically that they are independent in your DGP? If you think they are not independent, how would you represent graphically that they are not independent in your DGP?

2. Regress Outcome on RootCause. Report and interpret the result. Did you estimate the causal effect of RootCause on Outcome with this regression?

If you think you were able to estimate this causal effect, why do you think you were able to do so with this regression? If you think you were not able to estimate this causal effect, why do you think you were not able to do so with this regression?
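One way to run this short regression, continuing from the arrays sketched above and assuming the statsmodels package for OLS (the object names are illustrative):

```python
# Short regression: Outcome on an intercept and RootCause only.
import statsmodels.api as sm

X_short = sm.add_constant(RootCause)        # column of ones + RootCause
short = sm.OLS(Outcome, X_short).fit()
print(short.params)                         # [intercept, RootCause]
# print(short.summary()) shows the full table with standard errors.
```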

3. Regress Outcome on RootCause and OtherThing. Report and interpret the result. Did you estimate the causal effect of RootCause on Outcome?

If you think you were able to estimate this causal effect, why do you think you were able to do so with this regression? If you think you were not able to estimate this causal effect, why do you think you were not able to do so with this regression?
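The corresponding long regression, again sketched with statsmodels and building on the same arrays:

```python
# Long regression: Outcome on an intercept, RootCause, and OtherThing.
import numpy as np
import statsmodels.api as sm

X_long = sm.add_constant(np.column_stack([RootCause, OtherThing]))
long_reg = sm.OLS(Outcome, X_long).fit()
print(long_reg.params)                      # [intercept, RootCause, OtherThing]
```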

4. Compare the results of the regressions you ran in 2 and 3.

What do you see when you compare the coefficients on RootCause estimated in those two regressions? Why do we see those results in 2 and 3?
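A quick way to place the two estimated RootCause coefficients side by side, assuming the `short` and `long_reg` fits from the sketches above:

```python
# Index 1 is the RootCause column in both design matrices above.
print("RootCause, short regression:", short.params[1])
print("RootCause, long regression: ", long_reg.params[1])
```
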
Problem set 8

Clear your workspace and create a new dataset with 1000 observations. Generate the variable RootCause following a normal distribution with mean 0 and variance 1. Generate the variable OtherThing = 2·RootCause + noise, where noise follows a normal distribution with mean 0 and variance 1. Create a set of normal error terms with mean 0 and variance 1, and let Outcome = 1 + RootCause + 3·OtherThing + errors.
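A minimal sketch of this second data-generating process in Python/NumPy; the only substantive change from the earlier sketch is that OtherThing is now built from RootCause (seed again arbitrary):

```python
# Second dataset: OtherThing now depends on RootCause.
import numpy as np

rng = np.random.default_rng(54321)
n = 1000

RootCause = rng.normal(0, 1, size=n)
noise = rng.normal(0, 1, size=n)
OtherThing = 2 * RootCause + noise      # generated from RootCause plus noise
errors = rng.normal(0, 1, size=n)

Outcome = 1 + RootCause + 3 * OtherThing + errors
```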

1. Draw a graphical representation of the data generating process (DGP) involving the variables Outcome, RootCause, and OtherThing; that is, show by drawing arrows how these three variables relate to each other in the data you generated.

2. Regress Outcome on RootCause. Report and interpret the result. Did you estimate the causal effect of RootCause on Outcome with this regression?

If you think you were able to estimate this causal effect, why do you think you were able to do so with this regression? If you think you were not able to estimate this causal effect, why do you think you were not able to do so with this regression?
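One way to run this short regression on the new data, again assuming statsmodels and continuing from the arrays above:

```python
# Short regression on the second dataset: Outcome on RootCause alone.
import statsmodels.api as sm

X_short = sm.add_constant(RootCause)
short = sm.OLS(Outcome, X_short).fit()
print(short.params)                     # [intercept, RootCause]
```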

3. Regress Outcome on RootCause and OtherThing. Report and interpret the result.

Did you estimate the causal effect of RootCause on Outcome? If you think you were able to estimate this causal effect, why do you think you were able to do so with this regression? If you think you were not able to estimate this causal effect, why do you think you were not able to do so with this regression?
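The corresponding long regression on the new data, continuing the same sketch:

```python
# Long regression on the second dataset: both regressors included.
import numpy as np
import statsmodels.api as sm

X_long = sm.add_constant(np.column_stack([RootCause, OtherThing]))
long_reg = sm.OLS(Outcome, X_long).fit()
print(long_reg.params)                  # [intercept, RootCause, OtherThing]
```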

4. Compare the results of the regressions you ran in 2 and 3. What do you see when you compare the coefficients on RootCause estimated in those two regressions?

Why do we see those results in 2 and 3? In your answer, explain how those results relate to the question of whether the regression of Outcome on RootCause and OtherThing meets the conditional independence assumption.
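As before, the two estimated RootCause coefficients can be printed side by side, assuming the `short` and `long_reg` fits from the sketches above:

```python
# RootCause coefficient from the short and long regressions on the second dataset.
print("RootCause, short regression:", short.params[1])
print("RootCause, long regression: ", long_reg.params[1])
```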