This homework asks you to work through a relatively simple case of nonlinear optimization: the well-known Probit model. There are, of course, plenty of *canned* solutions to this problem; here, however, we want to hone our skills at actually implementing a nonlinear optimization problem ourselves. We will see that:

- providing gradient and/or Hessian information to an algorithm changes the speed and quality of convergence,
- different algorithms reach slightly different optima, and
- there are several ways to obtain standard errors in a likelihood estimation setting.
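To make these points concrete, here is a minimal sketch of probit maximum likelihood with an analytic gradient and Hessian-based standard errors. The homework notebook is in Julia; this illustration uses Python/SciPy instead, and the simulated data, starting values, and choice of BFGS are all assumptions for the example, not the assignment's setup. Note also that `hess_inv` below is BFGS's *approximation* to the inverse Hessian, which is one (rough) route to standard errors.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Simulate probit data (hypothetical example, not the homework's dataset)
rng = np.random.default_rng(0)
n, beta_true = 5000, np.array([0.5, -1.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(float)

def negloglik(beta):
    xb = X @ beta
    # log Phi(xb) for y=1, log Phi(-xb) for y=0; logcdf is numerically stable
    return -(y * norm.logcdf(xb) + (1 - y) * norm.logcdf(-xb)).sum()

def grad(beta):
    xb = X @ beta
    # Analytic score: lambda_i = phi/Phi(xb) if y=1, -phi/Phi(-xb) if y=0
    lam = np.where(y == 1,
                   norm.pdf(xb) / norm.cdf(xb),
                   -norm.pdf(xb) / norm.cdf(-xb))
    return -(X * lam[:, None]).sum(axis=0)

# Without a gradient, BFGS falls back on finite differences (more f-evals)
res_fd = minimize(negloglik, np.zeros(2), method="BFGS")
# With the analytic gradient: fewer function evaluations, cleaner convergence
res = minimize(negloglik, np.zeros(2), jac=grad, method="BFGS")

# Rough standard errors from BFGS's inverse-Hessian approximation
se = np.sqrt(np.diag(res.hess_inv))
print("beta_hat:", res.x, "se:", se)
```

Swapping `method="BFGS"` for `"Nelder-Mead"` (gradient-free) is one quick way to observe the two other points: the optima agree only to a few digits, and the derivative-free run needs many more function evaluations.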

Get the notebook here.

As usual, work in teams of at least two.

Submit a static HTML file.

Share a Dropbox link via Slack.

Last modified: February 13, 2024. Website built with Franklin.jl and the Julia programming language.