\newpage
\chapter{Programming}
\section{Writing Functions}
A simple function can be constructed as follows:
\begin{verbatim}
function_name <- function(arg1, arg2, ...){
commands
output
}
\end{verbatim}
You decide on the name of the function. The function command tells R that you are writing a function. Inside the parentheses you list the input objects (arguments) required and decide what to call them. The commands go inside the braces { }.
The name of whatever output you want goes at the end of the function body. Comment lines (usually a description of what the function does, placed at the beginning) are denoted by \#.
\begin{verbatim}sf1 <- function(x){
x^2
}
\end{verbatim}
This function is called sf1. It has one argument, called x.
Whatever value is supplied for x will be squared and the result printed to the screen. The function must first be loaded into R before it can be called. We can call the function using:
\begin{verbatim}
> sf1(x = 3)
[1] 9
> sf1(3)
[1] 9
\end{verbatim}
To store the result in a variable x.sq:
\begin{verbatim}
> x.sq <- sf1(x = 3)
> x.sq <- sf1(3)
> x.sq
[1] 9
\end{verbatim}
Example
\begin{verbatim}
sf2 <- function(a1, a2, a3){
x <- sqrt(a1^2 + a2^2 + a3^2)
return(x)
}
\end{verbatim}
This function is called sf2 and has three arguments. The values supplied for a1, a2 and a3 are squared and summed, and the square root of the sum is calculated and stored in x. (There will be no output to the screen, unlike in the last example.)
The return command specifies what the function returns, here the value of x. We will not be able to view the result unless we store it in an object.
\begin{verbatim}sf2(a1=2, a2=3, a3=4)
sf2(2, 3, 4) # Can't see result.
res <- sf2(a1=2, a2=3, a3=4)
res <- sf2(2, 3, 4) # Need to use this.
res
[1] 5.385165
\end{verbatim}
We can also give some or all of the arguments default values.
\begin{verbatim}mypower <- function(x, pow=2){
x^pow
}
\end{verbatim}
If a value for the argument pow is not specified in the function call,
a value of 2 is used.
\begin{verbatim}mypower(4)
[1] 16
\end{verbatim}
If a value for "pow" is specified, that value is used.
\begin{verbatim}
mypower(4, 3)
[1] 64
mypower(pow=5, x=2)
[1] 32
\end{verbatim}
%----------------------------------------------------%
\large \begin{verbatim}
> code here
\end{verbatim}\large
\large \begin{verbatim}
> code here
\end{verbatim}\large
%---------------------------------------------------%
\subsubsection{slide234}
The TS are <equation here>
The p-values for both of these tests are 0 and so there is enough evidence to reject $H_0$ and conclude that both $\beta_0$ and $\beta_1$ are not 0, i.e. there is a significant linear relationship between x and y.
Also given are the $R^2$ and adjusted $R^2$ values. Here $R^2 = SSR/SST = 0.8813$ and so $88.13\%$ of the variation in y is being explained by x.
The final line gives the result of using the ANOVA table to assess the model fit.
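As an illustrative sketch (the data frame and variable names below, mydata, x and y, are assumed rather than taken from the slides), output of this kind is produced by fitting the model with lm() and printing its summary:
\begin{verbatim}
# Fit the simple linear regression of y on x and display the
# coefficient t tests, R-squared, adjusted R-squared and F test.
fit <- lm(y ~ x, data = mydata)
summary(fit)
\end{verbatim}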
%----------------------------------------------------%
\subsubsection{slide235}
In SLR, the ANOVA table tests <EQN>. The TS is the F value, and the critical value and p-value are found
in the F tables with $(p - 1)$ and $(n - p)$ degrees of freedom.
This output gives a p-value of 0, therefore there is enough evidence to reject $H_0$ and conclude that there is a significant linear relationship between y and x. The full ANOVA table can be accessed using:
<TABLE HERE>
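As a sketch (assuming the fitted object is called fit, as in the example above), the ANOVA table for a fitted model can be obtained with anova():
\begin{verbatim}
# ANOVA table for the fitted regression: the F value is tested on
# (p - 1) and (n - p) degrees of freedom.
anova(fit)
\end{verbatim}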
\subsubsection{slide236}
Once the model has been fitted, we must then check the residuals.
The residuals should be independent and normally distributed with mean 0 and constant variance.
A Q-Q plot checks the assumption of normality (a histogram can also be used, as in MINITAB), while a plot of the residuals versus the fitted values gives an indication of whether the assumption of constant variance holds.
<HISTOGRAM>
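A sketch of these residual checks using base R graphics, again assuming a fitted model object called fit:
\begin{verbatim}
res <- resid(fit)

# Q-Q plot (with reference line) to check normality
qqnorm(res)
qqline(res)

# Histogram of the residuals, as in MINITAB
hist(res)

# Residuals versus fitted values to check constant variance
plot(fitted(fit), res)
abline(h = 0)
\end{verbatim}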
%----------------------------------------------------%
\subsubsection{Confidence interval for a mean (sigma known)}
\large \begin{verbatim}
> # 95% confidence interval for the mean, sigma known
> xbar <- 83
> sigma <- 12
> n <- 5
> sem <- sigma/sqrt(n)         # standard error of the mean
> sem
[1] 5.366563
> xbar + sem * qnorm(0.025)    # lower 95% limit
[1] 72.48173
> xbar + sem * qnorm(0.975)    # upper 95% limit
[1] 93.51827
\end{verbatim}\large
\subsubsection{Testing the slope (II)}
You can compute a t test of the hypothesis that the true slope is zero simply by dividing the estimate by its standard error,
\begin{equation}
t = \frac{\hat{\beta}}{S.E.(\hat{\beta})}
\end{equation}
which follows a $t$ distribution on $n - 2$ degrees of freedom if the true $\beta$ is zero.
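For example, the statistic can be computed directly from the coefficient table of a fitted model (a sketch assuming a fitted object fit whose predictor is named x; the column names are those of the standard summary output):
\begin{verbatim}
coefs    <- summary(fit)$coefficients
beta.hat <- coefs["x", "Estimate"]     # slope estimate
se.beta  <- coefs["x", "Std. Error"]   # its standard error
t.stat   <- beta.hat / se.beta         # compare with t on n - 2 df
t.stat
\end{verbatim}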
%----------------------------------------------------%
\begin{itemize}
\item The standard $\chi^{2}$ test in chisq.test works with data in matrix form, like fisher.test does.
\item For a 2 by 2 table, the test is exactly equivalent to prop.test.
\end{itemize}
\large \begin{verbatim}
> chisq.test(lewitt.machin)
\end{verbatim}\large
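As a small sketch of the 2 by 2 equivalence (the counts in tab below are invented purely for illustration):
\begin{verbatim}
tab <- matrix(c(9, 4, 3, 9), nrow = 2)   # hypothetical 2 x 2 table of counts

chisq.test(tab)   # chi-squared test with Yates continuity correction
prop.test(tab)    # same test statistic and p-value for a 2 x 2 table
\end{verbatim}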