Simplicity, Inference and Modelling: Keeping it Sophisticatedly Simple by Arnold Zellner, Hugo A. Keuzenkamp, Michael McAleer


The idea that simplicity matters in science is as old as science itself, the much-cited example being Ockham's Razor. A problem with Ockham's Razor is that almost everyone seems to accept it, yet few are able to define its precise meaning or to make it operational in a non-arbitrary way. Drawing on a multidisciplinary perspective that includes philosophers, mathematicians, econometricians and economists, this monograph examines simplicity by asking six questions: What is meant by simplicity? How is simplicity measured? Is there an optimum trade-off between simplicity and goodness-of-fit? What is the relation between simplicity and empirical modelling? What is the relation between simplicity and prediction? What is the relation between simplicity and convenience?



Best econometrics books

Long Memory in Economics

When applying the statistical theory of long range dependent (LRD) processes to economics, the strong complexity of macroeconomic and financial variables, compared to standard LRD processes, becomes apparent. In order to get a better understanding of the behaviour of some economic variables, the book assembles three different strands of long memory analysis: statistical literature on the properties of, and tests for, LRD processes; mathematical literature on the stochastic processes involved; and models from economic theory providing plausible micro foundations for the occurrence of long memory in economics.

The Theory and Practice of Econometrics, Second Edition (Wiley Series in Probability and Statistics)

This broadly based graduate-level textbook covers the major models and statistical tools currently used in the practice of econometrics. It examines the classical, the decision-theoretic, and the Bayesian approaches, and contains material on single equation and simultaneous equation econometric models. Includes an extensive reference list for each topic.

The Reciprocal Modular Brain in Economics and Politics: Shaping the Rational and Moral Basis of Organization, Exchange, and Choice

The present work is an extension of my doctoral thesis completed at Stanford in the early 1970s. In one clear sense it responds to the call for consilience by Edward O. Wilson. I agree with Wilson that there is a pressing need in the sciences today for the unification of the social with the natural sciences.

Analogies and Theories: Formal Models of Reasoning

The book describes formal models of reasoning that are aimed at capturing the way that economic agents, and decision makers more generally, think about their environment and make predictions based on their past experience. The focus is on analogies (case-based reasoning) and general theories (rule-based reasoning), on the interaction between them, as well as between them and Bayesian reasoning.

Additional resources for Simplicity, Inference and Modelling: Keeping it Sophisticatedly Simple

Sample text

Several philosophers have asserted, without providing much of a supporting argument, that the trade-off problem has no objective solution. For example, Kuhn (1977) claimed that scientists differ in how much importance they assign to one virtue of a theory as opposed to another, and that this difference is just a matter of taste. One scientist may think that the most important demand on a theory is that it should make accurate predictions; another may hold that the first duty of a theory is that it be elegant and general.

Accepting or rejecting Prout’s Law (or any other) is a matter of taste – or should we call it free will? … the 3/2-power law. Whether it accepts the former, or rejects it and goes on to the latter, depends on the goodness-of-fit criterion it applies (determined by the programmer). BACON, like Kepler and Prout (and everyone else), needs a separate parameter (a ‘propensity for simplicity’) to determine what degree of approximation is acceptable. … the log function, the exponential, the sine function), but it is remarkable that with the linear function as its sole primitive, it discovers not only Kepler’s Third Law, but also Joseph Black’s law of the equilibrium temperatures of mixtures of liquids, Ohm’s law of current and resistance, Snell’s law of refraction, the law of conservation of momentum and a host of others.
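The role of such a programmer-set ‘propensity for simplicity’ can be sketched in a few lines of Python. Everything below is an illustrative assumption, not BACON's actual mechanism: a power law is accepted if its worst relative error on the data falls under a tolerance chosen in advance.

```python
import numpy as np

# Kepler's Third Law data: mean orbital distance (AU) and period (years)
# for the six classical planets.
distance = np.array([0.387, 0.723, 1.0, 1.524, 5.203, 9.537])
period = np.array([0.241, 0.615, 1.0, 1.881, 11.862, 29.457])

def accept_power_law(x, y, exponent, tolerance):
    """Simplified acceptance rule: is y ~ c * x**exponent within the
    programmer-set goodness-of-fit tolerance (the 'propensity for
    simplicity' parameter)?"""
    c = np.mean(y / x**exponent)              # constant of proportionality
    rel_error = np.abs(y - c * x**exponent) / y
    return np.max(rel_error) < tolerance

# A 5% tolerance rejects the simpler linear law but accepts Kepler's 3/2 law.
print(accept_power_law(distance, period, 1.0, 0.05))  # period proportional to distance
print(accept_power_law(distance, period, 1.5, 0.05))  # period proportional to distance^(3/2)
```

With a sufficiently loose tolerance the simpler (linear) law would also pass, which is exactly the point of the passage: what counts as an acceptable approximation is set by the programmer, not by the data alone.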

Since the likelihoods of specific curves cannot explain why simplicity is desirable, perhaps we should consider the likelihoods of families of curves. This approach requires that we ask, for example, what the probability is of obtaining the data, if (LIN) is correct. This quantity is an average over all the specific straight lines (L1, L2, …) that belong to the family:

Pr(Data | LIN) = Σi Pr(Data | Li) Pr(Li | LIN).

Some of the Li's are very near the data, so the value of Pr(Data | Li) for those straight lines will be large; however, many straight lines will be quite far away, and so the value of Pr(Data | Li) for them will be small.
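This averaging over a family can be sketched numerically in Python. The data, the Gaussian noise model, and the uniform prior Pr(Li | LIN) over slopes and intercepts are all illustrative assumptions; the point is only that the family likelihood mixes a few near-the-data lines (large Pr(Data | Li)) with many far-away ones (tiny Pr(Data | Li)).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data from a straight line plus Gaussian noise (illustrative assumption).
x = np.linspace(0.0, 1.0, 10)
y = 2.0 * x + 0.5 + rng.normal(scale=0.1, size=x.size)

def line_likelihood(slope, intercept, sigma=0.1):
    """Pr(Data | Li): Gaussian likelihood of the data under one specific line."""
    resid = y - (slope * x + intercept)
    return np.prod(np.exp(-0.5 * (resid / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi)))

# Pr(Data | LIN) = sum_i Pr(Data | Li) Pr(Li | LIN), approximated by Monte Carlo:
# draw lines from an assumed prior Pr(Li | LIN) and average their likelihoods.
n = 20_000
slopes = rng.uniform(-5, 5, n)
intercepts = rng.uniform(-5, 5, n)
family_likelihood = np.mean([line_likelihood(a, b) for a, b in zip(slopes, intercepts)])

print(family_likelihood)
```

The family likelihood comes out far below the likelihood of the best-fitting single line, because the average is dragged down by the many lines that fit badly.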

