% Created 2013-11-12 Tue 08:05
\documentclass[11pt]{article}
\usepackage[utf8]{inputenc}
\usepackage[T1]{fontenc}
\usepackage{fixltx2e}
\usepackage{graphicx}
\usepackage{longtable}
\usepackage{float}
\usepackage{wrapfig}
\usepackage{soul}
\usepackage{textcomp}
\usepackage{marvosym}
\usepackage{wasysym}
\usepackage{latexsym}
\usepackage{amssymb}
\usepackage{hyperref}
\tolerance=1000
\usepackage{amsmath}
\usepackage{caption}
\usepackage{subcaption}
\usepackage[usenames,dvipsnames,svgnames,table]{xcolor}
\hypersetup{
colorlinks,%
citecolor=black,%
filecolor=black,%
linkcolor=blue,%
urlcolor=black
}
\providecommand{\alert}[1]{\textbf{#1}}
\title{Current Notes for Near Future Work.}
\author{Johannes Castner}
\date{\today}
\hypersetup{
pdfkeywords={},
pdfsubject={},
pdfcreator={Emacs Org-mode version 7.8.09}}
\begin{document}
\maketitle
\setcounter{tocdepth}{3}
\tableofcontents
\vspace*{1cm}
\newpage
\section{Different Bayes Nets for different purposes}
\label{sec-1}
\section{Some calculations}
\label{sec-2}
Here are some calculations that might help with fixing the numerical integration in my MA paper, which is currently not done as well as it could be (it has since been fixed, using Mark Newman's algorithm to obtain the weights and sample points for a Gaussian quadrature, but the code here is still interesting). Calculating the interpolating polynomial for any given set of points $x_1, \ldots, x_N$:
$$\phi_k(x)=\prod_{\substack{m=1,\ldots,N \\ m\neq k}}\frac{x-x_m}{x_k-x_m}$$
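For concreteness, here is a minimal Python sketch of this (Lagrange) basis polynomial, together with a Gaussian quadrature on $[0, 1]$ built from NumPy's \texttt{leggauss} routine, which stands in here for Newman's algorithm mentioned above; the names \texttt{phi} and \texttt{gauss\_quad} are mine:
\begin{verbatim}
import numpy as np

def phi(k, x, xs):
    """Lagrange basis polynomial phi_k evaluated at x, for nodes xs."""
    return np.prod([(x - xm) / (xs[k] - xm)
                    for m, xm in enumerate(xs) if m != k])

def gauss_quad(f, n=20):
    """Integrate f over [0, 1] by n-point Gauss-Legendre quadrature."""
    pts, wts = np.polynomial.legendre.leggauss(n)  # nodes on [-1, 1]
    x = 0.5 * (pts + 1.0)                          # map nodes to [0, 1]
    return 0.5 * np.sum(wts * f(x))

print(gauss_quad(lambda x: x ** 2))  # ~ 1/3
\end{verbatim}
By construction, $\phi_k(x_k)=1$ and $\phi_k(x_m)=0$ for $m \neq k$, which is what makes these polynomials useful as an interpolation basis.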
\section{Function determines the communication rituals.}
\label{sec-3}
\section{Diversity as a function of conviction and causal strength}
\label{sec-4}
I must write Diversity as a function of $\alpha$ and $\beta$, the causal-effect shape parameters. The more certain one is of one's propositions, or the stronger those propositions are (in causal effect), the less able one is to learn another's propositions if one were suddenly removed from one's own learning environment and thrown into that of the other. Thus, the more certain each individual is in her own convictions, the greater the Diversity should be. It is an open question whether this measure satisfies this property.
The expected causal effect (strength of proposition) exerted by one cause, $A$, on its effect, $B$, is:
$$E(\text{Causal-Effect}_{i, (A, B)}) = E(\pi_{i, (A, B)})=\frac{\alpha}{\alpha + \beta}$$
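For example (an illustrative choice of numbers, not values taken from the model), $\alpha=20, \beta=5$ and $\alpha=4, \beta=1$ both give $E(\pi_{i, (A, B)})=0.8$, yet the first belief is held with far less uncertainty.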
The associated uncertainty that one has about this causal effect can be expressed in terms of the Entropy of the Beta distribution:
\begin{equation} \label{eq:entropy}
\begin{split}
H(\alpha, \beta) &= \int_0^1 -f(\pi_{i, (A, B)}, \alpha, \beta)\ln(f(\pi_{i, (A, B)}, \alpha, \beta))\,d\pi_{i, (A, B)} \\
&= \ln(\textbf{B}(\alpha, \beta))-(\alpha-1)\psi(\alpha)-(\beta-1)\psi(\beta)+(\alpha+\beta-2)\psi(\alpha+\beta),
\end{split}
\end{equation}
where $\psi(\cdot)$ is the digamma function, defined as the logarithmic derivative of the gamma function, $\Gamma(\cdot)$:
\begin{equation}
\psi(x)=\frac{d}{dx}\ln(\Gamma(x))=\frac{\Gamma^{'}(x)}{\Gamma(x)}.
\end{equation}
The Entropy, $H(\cdot)$, of the Beta distribution, in this case representing the uncertainty about the value of a causal effect, is maximized in both of its arguments, $\alpha$ and $\beta$, when each takes on the value $1$, that is, when the Beta distribution coincides with the uniform distribution. Of course, this should not come as a surprise, as this particular distribution has often been used as an uninformative prior for some parameter, despite some serious flaws in this approach that have been pointed out in the literature (see, for example, King 1989).
The Diversity should increase as the Entropy decreases, holding $\alpha/(\alpha + \beta)$ constant, for example at $\frac{1}{2}$ (when $\alpha = \beta$); and it should increase in $\alpha-\beta$, holding the Entropy constant (the Entropy is unchanged whenever the values of $\alpha$ and $\beta$ are swapped).
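As a quick numerical check of both claims (the uniform Beta$(1,1)$ maximizes the Entropy, and swapping $\alpha$ and $\beta$ leaves it unchanged), here is a minimal Python sketch of Equation \eqref{eq:entropy}; the name \texttt{beta\_entropy} is mine, and SciPy's built-in entropy serves only as a cross-check:
\begin{verbatim}
# Entropy of Beta(alpha, beta) via the digamma formula,
# cross-checked against scipy.stats.beta.entropy.
from scipy.special import betaln, digamma
from scipy.stats import beta as beta_dist

def beta_entropy(a, b):
    return (betaln(a, b) - (a - 1) * digamma(a)
            - (b - 1) * digamma(b) + (a + b - 2) * digamma(a + b))

print(beta_entropy(1, 1))         # 0.0: the uniform case, the maximum
print(beta_entropy(5, 2))         # negative: a more concentrated belief
print(beta_entropy(2, 5))         # identical: H(a, b) == H(b, a)
print(beta_dist(5, 2).entropy())  # agrees with the formula above
\end{verbatim}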
\section{Forecasts; how they work}
\label{sec-5}
Shares outstanding will be the same for each forecast. Revenue and costs can be predicted using some causal mechanisms\ldots{} less disagreement on the cost side.

Aluminum prices $\xrightarrow{+}$ Revenues

Cost pressures: independent variable
\end{document}