# Implement Seemingly Unrelated Regression

This example shows how to include exogenous data for several seemingly unrelated regression (SUR) analyses. The response and exogenous series are random paths from a standard Gaussian distribution.

In seemingly unrelated regression (SUR), each response variable is a function of a subset of the exogenous series, but not of any endogenous variable. That is, for $j=1,...,n$ and $t=1,...,T$, the model for response $j$ at period $t$ is

`${y}_{jt}={a}_{j}+{b}_{j1}{x}_{{k}_{1}t}+{b}_{j2}{x}_{{k}_{2}t}+...+{b}_{j{k}_{j}}{x}_{{k}_{j}t}+{\epsilon }_{jt}$`

The indices of the regression coefficients and exogenous predictors indicate that:

• You can associate each response with a different subset of exogenous predictors.

• The response series might not share intercepts or regression coefficients.
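In the special case in which each response uses the full set of predictors, which is the case estimated in this example, you can write the system compactly in vector form as

`$y_{t}=a+Bx_{t}+\epsilon_{t},$`

where $y_{t}$ is the $n$-by-1 vector of responses at period $t$, $a$ is the $n$-by-1 vector of intercepts, $B$ is the $n$-by-$k$ matrix of regression coefficients for $k$ predictors, and $x_{t}$ is the $k$-by-1 vector of predictor values at period $t$. This form corresponds to the `Constant` and `Beta` properties of the model objects used below.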

SUR accommodates intra-period innovation correlation but assumes inter-period innovation independence. That is,

`$E\left(\epsilon_{it}\epsilon_{js}|X\right)=\begin{cases}0; & t\ne s\\ \sigma_{ij}; & i\ne j,\ t=s\\ \sigma_{i}^{2}>0; & i=j,\ t=s\end{cases}$`

### Simulate Data from True Model

Suppose that the true model is

`$\begin{array}{l}{y}_{1t}=1+2{x}_{1t}-1.5{x}_{2t}+0.5{x}_{3t}+0.75{x}_{4t}+{\epsilon }_{1t}\\ {y}_{2t}=-1+4{x}_{1t}+2.5{x}_{2t}-1.75{x}_{3t}-0.05{x}_{4t}+{\epsilon }_{2t}\\ {y}_{3t}=0.5-2{x}_{1t}+0.5{x}_{2t}-1.5{x}_{3t}+0.7{x}_{4t}+{\epsilon }_{3t}\end{array},$`

where ${\epsilon }_{jt}$, $j=1,...,n$ are multivariate Gaussian random variables each having mean zero and jointly having covariance matrix

`$\Sigma =\left[\begin{array}{ccc}1& 0.5& -0.05\\ 0.5& 1& 0.25\\ -0.05& 0.25& 1\end{array}\right]$`

Suppose that the paths represent different econometric measurements, for example, stock returns.

Simulate four exogenous predictor paths from the standard Gaussian distribution.

```
rng(1); % For reproducibility
n = 3;    % Number of response series
nExo = 4; % Number of exogenous series
T = 100;  % Sample size
X = randn(T,nExo);
```

`mvregress`, the workhorse of `estimate`, requires the exogenous data as a `T`-by-1 cell vector. Cell $t$ of the cell vector is a design matrix that specifies the linear relationship between the exogenous variables and each response series at period $t$. In contrast, `estimate` associates every predictor with every response, so it requires the predictor data as a matrix.
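To sketch the difference, the following code builds the kind of `T`-by-1 cell vector of design matrices that `mvregress` expects when every response has its own intercept and its own copy of the four predictors. The variable name `Design` is illustrative, not required by `mvregress`.

```
% Illustrative sketch: block design matrices for mvregress
T = 100; n = 3; nExo = 4;
rng(1);
X = randn(T,nExo);
Design = cell(T,1);
for t = 1:T
    % Each cell is an n-by-(n*(nExo + 1)) block-diagonal design matrix;
    % the block in row j holds [1 X(t,:)] for response j.
    Design{t} = kron(eye(n),[1 X(t,:)]);
end
```

With response data `Y`, `mvregress(Design,Y)` would then return the `n*(nExo + 1)` stacked intercepts and regression coefficients.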

Create a VAR model object that characterizes the true model. Then simulate a length-100 path of responses from the model.

```
aTrue = [1; -1; 0.5];
bTrue = [[2; 4; -2] [-1.5; 2.5; 0.5] [0.5; -1.75; -1.5] [0.75; -0.05; 0.7]];
InnovCov = [1 0.5 -0.05; 0.5 1 0.25; -0.05 0.25 1];
TrueMdl = varm('Beta',bTrue,'Constant',aTrue,'Covariance',InnovCov)
```
```
TrueMdl = 
  varm with properties:

     Description: "3-Dimensional VARX(0) Model with 4 Predictors"
     SeriesNames: "Y1"  "Y2"  "Y3"
       NumSeries: 3
               P: 0
        Constant: [1 -1 0.5]'
              AR: {}
           Trend: [3×1 vector of zeros]
            Beta: [3×4 matrix]
      Covariance: [3×3 matrix]
```
`Y = simulate(TrueMdl,T,'X',X);`

### SUR Using All Predictors for Each Response Series

Create a VAR model suitable for SUR using the shorthand syntax of `varm`.

`Mdl1 = varm(n,0);`

`Mdl1` is a `varm` model object template representing a three-dimensional VAR(0) model. Unlike `TrueMdl`, the regression coefficients, intercepts, and innovations covariance matrix of `Mdl1` do not have values. Therefore, `Mdl1` is suitable for estimation.

Estimate the regression coefficients using `estimate`. Extract the residuals. Display the estimated model using `summarize`.

```
[EstMdl1,~,~,E] = estimate(Mdl1,Y,'X',X);
summarize(EstMdl1)
```
```
   3-Dimensional VARX(0) Model with 4 Predictors
 
    Effective Sample Size: 100
    Number of Estimated Parameters: 15
    LogLikelihood: -412.026
    AIC: 854.052
    BIC: 893.129
 
                     Value      StandardError    TStatistic      PValue   
                   _________    _____________    __________    ___________
    Constant(1)      0.97898       0.11953         8.1902       2.6084e-16
    Constant(2)      -1.0644       0.10019        -10.623       2.3199e-26
    Constant(3)      0.45323       0.10123         4.4772       7.5611e-06
    Beta(1,1)         1.7686       0.11994         14.745       3.2948e-49
    Beta(2,1)         3.8576       0.10054          38.37      4.1502e-322
    Beta(3,1)        -2.2009       0.10158        -21.667      4.1715e-104
    Beta(1,2)        -1.5508       0.12345        -12.563       3.3861e-36
    Beta(2,2)         2.4407       0.10348         23.587      5.2666e-123
    Beta(3,2)        0.46414       0.10455         4.4395       9.0156e-06
    Beta(1,3)        0.69588       0.13491         5.1583       2.4922e-07
    Beta(2,3)        -1.7139       0.11308        -15.156       6.8911e-52
    Beta(3,3)        -1.6414       0.11425        -14.367       8.3713e-47
    Beta(1,4)        0.67036       0.12731         5.2654        1.399e-07
    Beta(2,4)      -0.056437       0.10672       -0.52885          0.59691
    Beta(3,4)        0.56581       0.10782         5.2476       1.5406e-07
 
   Innovations Covariance Matrix:
    1.3850    0.6673   -0.1591
    0.6673    0.9731    0.2165
   -0.1591    0.2165    0.9934
 
   Innovations Correlation Matrix:
    1.0000    0.5748   -0.1357
    0.5748    1.0000    0.2202
   -0.1357    0.2202    1.0000
```

`EstMdl1` is a `varm` model object containing the parameter estimates. `E` is a $T$-by-$n$ matrix of residuals.

Alternatively, in this case, you can estimate the coefficients by applying the backslash operator to `X` and `Y`. However, you must include a column of ones in `X` for the intercepts.

`coeff = ([ones(T,1) X]\Y)`
```
coeff = 5×3

    0.9790   -1.0644    0.4532
    1.7686    3.8576   -2.2009
   -1.5508    2.4407    0.4641
    0.6959   -1.7139   -1.6414
    0.6704   -0.0564    0.5658
```

`coeff` is an `(nExo + 1)`-by-`n` matrix of estimated intercepts and regression coefficients. The estimated intercepts are in the first row, and the rest of the matrix contains the estimated regression coefficients.
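A minimal, self-contained sketch of this layout, using made-up data with two responses and one predictor rather than the series in this example:

```
% Toy example of the backslash layout: 2 responses, 1 predictor
rng(0);
T = 50;
x = randn(T,1);
trueCoeff = [1 -1; 2 3];                  % row 1: intercepts, row 2: slopes
Y = [ones(T,1) x]*trueCoeff + 0.1*randn(T,2);
coeff = [ones(T,1) x]\Y;                  % (numPredictors + 1)-by-n solution
aHat = coeff(1,:)';                       % estimated intercepts
bHat = coeff(2:end,:)';                   % estimated regression coefficients
```

Because the noise standard deviation is small, `aHat` and `bHat` land close to the first and second rows of `trueCoeff`, respectively.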

Compare all estimates to their true values.

```
InterceptsTbl = table(aTrue,EstMdl1.Constant,coeff(1,:)',...
    'VariableNames',["True" "estimate" "backslash"])
```
```
InterceptsTbl=3×3 table
    True    estimate    backslash
    ____    ________    _________
      1      0.97898      0.97898
     -1      -1.0644      -1.0644
    0.5      0.45323      0.45323
```
```
cB = coeff';
cB = cB(:);
CoefficientsTbl = table(bTrue(:),EstMdl1.Beta(:),cB((n + 1):end),...
    'VariableNames',["True" "estimate" "backslash"])
```
```
CoefficientsTbl=12×3 table
     True     estimate     backslash
    _____    _________    _________
        2       1.7686       1.7686
        4       3.8576       3.8576
       -2      -2.2009      -2.2009
     -1.5      -1.5508      -1.5508
      2.5       2.4407       2.4407
      0.5      0.46414      0.46414
      0.5      0.69588      0.69588
    -1.75      -1.7139      -1.7139
     -1.5      -1.6414      -1.6414
     0.75      0.67036      0.67036
    -0.05    -0.056437    -0.056437
      0.7      0.56581      0.56581
```
```
InnovCovTbl = table(InnovCov,EstMdl1.Covariance,...
    'VariableNames',["True" "estimate"])
```
```
InnovCovTbl=3×2 table
              True                         estimate           
    _______________________    _______________________________
     1        0.5     -0.05       1.385      0.6673    -0.15914
     0.5      1        0.25       0.6673     0.97312    0.21649
    -0.05     0.25     1         -0.15914    0.21649    0.99338
```

The estimates from implementing `estimate` and the backslash operator are the same, and are fairly close to their corresponding true values.

One way to check the relationship strength between the predictors and responses is to compute the coefficient of determination (i.e., the fraction of variation explained by the predictors), which is

`$R^{2}=1-\frac{\sum_{j=1}^{n}\hat{\sigma}_{\epsilon j}^{2}}{\sum_{j=1}^{n}\hat{\sigma}_{Yj}^{2}},$`

where $\hat{\sigma}_{\epsilon j}^{2}$ is the estimated variance of residual series $j$, and $\hat{\sigma}_{Yj}^{2}$ is the estimated variance of response series $j$.

`R2 = 1 - sum(diag(cov(E)))/sum(diag(cov(Y)))`
```R2 = 0.9118 ```

The SUR model and predictor data explain approximately 91% of the variation in the response data.