
Commit

fixed EC typo
rcc02007 committed Nov 13, 2017
1 parent 3e00e7e commit 0defb9c
Showing 20 changed files with 13 additions and 6 deletions.
Binary file modified 14_stats_and_montecarlo/octave-workspace
Binary file modified 16_splines/octave-workspace
Binary file modified 17_integrals_and_derivatives/octave-workspace
Binary file added extra_credit/.README.md.swp
11 changes: 5 additions & 6 deletions extra_credit/README.md
@@ -27,22 +27,21 @@ columns with your netid on each row as such,

**Nonlinear Regression - Logistic Regression**

-![logistic regression of Challenger O-ring failure](http://www.stat.ufl.edu/~winner/cases/challenger.ppt)
+[logistic regression of Challenger O-ring failure](http://www.stat.ufl.edu/~winner/cases/challenger.ppt)

Use the Temperature and failure data from the Challenger O-rings
[challenger_oring.csv](./challenger_oring.csv). Your independent variable is temperature and your dependent
variable is failure (1=fail, 0=pass). Create a function called `cost_logistic.m` that
takes the vector `a`, the independent variable `x`, and the dependent variable `y`. Use the
-function, $\sigma(t)=\frac{1}{1+e^{-t}}$ where $t=a_{0}+a_{1}x$. Use the cost function,
+function, ![sigma](./equations/sigma.png) where ![t](./equations/t.png). Use the cost function,

-$J(a_{0},a_{1})=1/m\sum_{i=1}^{n}\left[-y_{i}\log(\sigma(t_{i}))-(1-y_{i})\log((1-\sigma(t_{i})))\right]$
+![cost](./equations/cost.png)

and gradient

-$\frac{\partial J}{\partial a_{i}}=
-1/m\sum_{k=1}^{N}\left(\sigma(t_{k})-y_{k}\right)x_{k}^{i}$
+![costgrad](./equations/costgrad.png)

-where $x_{k}^{i} is the k-th value of temperature raised to the i-th power (0, and 1)
+where ![x](./equations/x.png) is the k-th value of temperature raised to the i-th power (0 and 1)

a. edit `cost_logistic.m` so that the output is `[J,grad]` or [cost, gradient]

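For orientation only (this sketch is not part of the commit or the assignment files), one way `cost_logistic.m` could implement the sigma, cost, and gradient equations above, assuming `a = [a0; a1]` and that `x` (temperature) and `y` (failure) are column vectors of equal length:

```octave
% cost_logistic.m -- illustrative sketch, not the graded solution
% a : parameter vector [a0; a1]
% x : independent variable (temperature), column vector
% y : dependent variable (1 = fail, 0 = pass), column vector
function [J, grad] = cost_logistic(a, x, y)
  m = length(y);                      % number of observations
  t = a(1) + a(2) .* x;               % t = a0 + a1*x
  sig = 1 ./ (1 + exp(-t));           % logistic function sigma(t)
  % cost: J = (1/m) * sum( -y*log(sigma) - (1-y)*log(1-sigma) )
  J = (1/m) * sum(-y .* log(sig) - (1 - y) .* log(1 - sig));
  % gradient: dJ/da_i = (1/m) * sum( (sigma(t_k) - y_k) * x_k^i ), i = 0, 1
  grad = [(1/m) * sum((sig - y) .* x.^0);
          (1/m) * sum((sig - y) .* x.^1)];
end
```

With the temperature and failure columns read from `challenger_oring.csv`, the coefficients could then be estimated with a general-purpose minimizer, e.g. `a = fminsearch(@(a) cost_logistic(a, x, y), [0; 0])`; the zero starting guess here is arbitrary.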
Binary file added extra_credit/equations/.cost.tex.swp
Binary file added extra_credit/equations/.costgrad.tex.swp
Binary file added extra_credit/equations/.sigma.tex.swp
Binary file added extra_credit/equations/.t.tex.swp
Binary file added extra_credit/equations/.x.tex.swp
Binary file added extra_credit/equations/cost.png
2 changes: 2 additions & 0 deletions extra_credit/equations/cost.tex
@@ -0,0 +1,2 @@
J(a_{0},a_{1})=1/m\sum_{i=1}^{n}\left[-y_{i}\log(\sigma(t_{i}))-(1-y_{i})\log((1-\sigma(t_{i})))\right]

Binary file added extra_credit/equations/costgrad.png
2 changes: 2 additions & 0 deletions extra_credit/equations/costgrad.tex
@@ -0,0 +1,2 @@
\frac{\partial J}{\partial a_{i}}=1/m\sum_{k=1}^{N}\left(\sigma(t_{k})-y_{k}\right)x_{k}^{i}

Binary file added extra_credit/equations/sigma.png
2 changes: 2 additions & 0 deletions extra_credit/equations/sigma.tex
@@ -0,0 +1,2 @@
\sigma(t)=\frac{1}{1+e^{-t}}

Binary file added extra_credit/equations/t.png
1 change: 1 addition & 0 deletions extra_credit/equations/t.tex
@@ -0,0 +1 @@
t=a_{0}+a_{1}x
Binary file added extra_credit/equations/x.png
1 change: 1 addition & 0 deletions extra_credit/equations/x.tex
@@ -0,0 +1 @@
x_{k}^{i}
