Class 13 Stat701 Fall 1997
Weighted Least Squares and Monte Carlo Simulation.
Today's class.
- Heteroscedasticity and Weighted Least Squares.
- Monte Carlo studies.
Plan:
- Review: unbiased and efficient.
- Summarize Ordinary Least Squares under heteroscedasticity.
- Consider options and their properties.
- Discuss Monte Carlo simulations.
- Look at simulation results.
- Data analysis example: housing prices and pollution.
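To make the OLS-versus-WLS part of the plan concrete, here is a minimal sketch in Python (not from the class notes; the model, coefficients, and variance function are made up for illustration). The error standard deviation grows with x, and WLS simply weights each observation by the reciprocal of its error variance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model for illustration: y = 2 + 3x + error,
# where the error SD is proportional to x (heteroscedasticity).
n = 200
x = rng.uniform(1, 10, n)
y = 2 + 3 * x + rng.normal(0, 0.5 * x)          # Var(error_i) = (0.5 * x_i)^2

X = np.column_stack([np.ones(n), x])

# Ordinary least squares: treat every observation equally.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Weighted least squares: weight each case by 1 / Var(error_i).
# Equivalently, rescale each row by sqrt(weight) and run OLS.
sqrt_w = 1 / (0.5 * x)
beta_wls, *_ = np.linalg.lstsq(X * sqrt_w[:, None], y * sqrt_w, rcond=None)

print("OLS estimates (intercept, slope):", beta_ols)
print("WLS estimates (intercept, slope):", beta_wls)
```

Rescaling each row by the square root of its weight and then running ordinary least squares is just a hand computation of WLS.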
What makes a good estimate?
A good estimator of a population parameter has at least two properties:
- On average it takes on the correct value, that is: UNBIASED.
- It is most concentrated around the true value: EFFICIENT.
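Both properties can be checked by simulation: draw many samples from a population with a known parameter, then see where each estimator's values center (bias) and how spread out they are (efficiency). A small sketch with an arbitrary normal population and sample size: the sample mean and sample median are both unbiased for the center, but the mean is the more efficient of the two.

```python
import numpy as np

rng = np.random.default_rng(1)

true_mu = 5.0            # the known population mean in this simulation
n, reps = 50, 10_000     # sample size and number of simulated samples

samples = rng.normal(true_mu, 2.0, size=(reps, n))
means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

# Unbiased: on average, each estimator lands on the true value.
print("average of sample means:  ", means.mean())
print("average of sample medians:", medians.mean())

# Efficient: the mean is the more concentrated of the two here.
print("variance of sample means:  ", means.var())
print("variance of sample medians:", medians.var())   # roughly pi/2 times larger
```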
Monte Carlo studies.
- A stochastic extension of scenario analysis:
  - Best case scenario
  - Typical case scenario
  - Worst case scenario
- Problem: the scenarios are not equally likely to happen.
- We would like the overall evaluation to include the probability/frequency with which each scenario happens.
- Monte Carlo refinement: let the computer randomly generate the scenarios (many of them, hopefully with frequencies in accordance with reality) and evaluate strategies over these random draws.
- Generate a world, then evaluate an action on that world.
- Potentially much more informative: for example, we can talk about the "chances" that an event happens.
- Downside: if the computer generates incongruent scenarios, then we get garbage.
- Very effective when the mechanism that generates the world is simple but evaluating an action in that world is complex, e.g. simple models for the market (the world) but a complex derivative to price (the action).
- The foundation for the analysis is the Law of Large Numbers (roughly, the colloquial "law of averages"). Applied to indicators of an event happening, it says that the proportion of times the event happens in the simulation tends to the probability that the event happens.
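As a rough sketch of such a study (not the class's actual simulation, whose results follow below), the loop here generates many heteroscedastic worlds, fits both OLS and WLS in each, and compares the slope estimates: both center on the true slope, but WLS with the correct weights is noticeably more concentrated. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def one_world(n=100):
    """Generate one heteroscedastic data set and return (OLS slope, WLS slope)."""
    x = rng.uniform(1, 10, n)
    y = 2 + 3 * x + rng.normal(0, 0.5 * x)      # true slope is 3
    X = np.column_stack([np.ones(n), x])
    b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    sqrt_w = 1 / (0.5 * x)                      # sqrt of weight, weight = 1/variance
    b_wls, *_ = np.linalg.lstsq(X * sqrt_w[:, None], y * sqrt_w, rcond=None)
    return b_ols[1], b_wls[1]

# Generate many worlds and evaluate both estimators on each one.
slopes = np.array([one_world() for _ in range(2000)])
ols_slopes, wls_slopes = slopes[:, 0], slopes[:, 1]

# Both estimators center on the true slope of 3 (unbiased) ...
print("mean OLS slope:", ols_slopes.mean(), " mean WLS slope:", wls_slopes.mean())
# ... but the WLS slopes are more concentrated (more efficient).
print("SD of OLS slopes:", ols_slopes.std(), " SD of WLS slopes:", wls_slopes.std())
```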
Simulation results.
Housing data example.
Check: does the statement "the long run probability that it rains tomorrow is 0.3" confuse you? It shouldn't: the term "long run" has nothing to do with being far into the future. It just means averaging over an increasing number of events.
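In simulation terms, the long run is just a running proportion over more and more simulated days. A tiny sketch using the 0.3 figure from the statement above (the number of days and the seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)

p_rain = 0.3                      # the probability from the statement above
days = 100_000                    # number of simulated days (arbitrary)
rain = rng.random(days) < p_rain  # indicator: True on the days it "rains"

# The running proportion of rainy days is the "long run" average.
running_prop = np.cumsum(rain) / np.arange(1, days + 1)

for k in (10, 100, 1_000, 100_000):
    print(f"after {k:>7} days: proportion of rainy days = {running_prop[k - 1]:.3f}")
```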
Richard Waterman
Mon Oct 20 22:02:16 EDT 1997