A few comments on inference, reasoning, and the use of data

Non-Monotonic Inference

*Non-monotonicity* is a feature studied in mathematical logic: a system of inference is non-monotonic when its conclusions can be revised or retracted in response to new information

Boolean Algebra (the set-theoretic implementation of classical sentential or propositional logic) is *monotonic*

Consider, (a) A → C

Under classical logic or Boolean Algebra, the addition of new information B [which is modeled as a conjunct added to the antecedent of sentence (a)] can never defeat the inference to C: whenever (a) holds, so does

(b) (A & B) → C
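The monotonicity of (a) and (b) can be checked mechanically. A minimal sketch in Python (the helper names here are illustrative, not from any library): exhaustively test every truth-value assignment and confirm that no valuation makes (a) true while making (b) false.

```python
from itertools import product

def implies(p, q):
    """Material conditional: p -> q is false only when p is true and q is false."""
    return (not p) or q

def monotonic_entailment():
    """Return True if every valuation satisfying (a) A -> C also satisfies (b) (A & B) -> C."""
    for a, b, c in product([True, False], repeat=3):
        if implies(a, c) and not implies(a and b, c):
            return False  # a counterexample valuation would refute monotonicity
    return True

print(monotonic_entailment())  # True: strengthening the antecedent never retracts C
```

The loop finds no counterexample: the only way (b) could fail is with A and B true and C false, but that valuation already falsifies (a).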

Consider, *if you are sneezing, you have a cold*, but *if you are sneezing and seeing white spots, then you don't have a cold; you actually have a migraine* — here the new information defeats the earlier conclusion

Thus, many kinds of updated inferences and conclusions found readily in law, medicine, and science cannot be modeled (or at least face significant difficulty in being modeled) formally in classical logic or Boolean Algebra alone

Such an example is a simple but important reminder that advanced inference frameworks (Markov Learning Nets, other kinds of ANNs, AGM Belief Revision, non-monotonic logics, etc.) are required to capture such kinds of everyday reasoning
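One everyday flavor of non-monotonic reasoning, specificity-based default reasoning, can be sketched in a few lines of Python. This is a toy illustration of the sneezing example above, not a real non-monotonic logic engine; the rule representation and the `diagnose` helper are assumptions made for the sketch. A more specific rule (one matching more evidence) defeats a more general one.

```python
# Each rule is (required evidence, conclusion). The most specific
# applicable rule — the one matching the most evidence — wins.
RULES = [
    ({"sneezing"}, "cold"),
    ({"sneezing", "white_spots"}, "migraine"),
]

def diagnose(evidence):
    """Apply the most specific rule whose required evidence is present."""
    applicable = [(req, concl) for req, concl in RULES if req <= evidence]
    if not applicable:
        return None
    # Specificity defeat: prefer the rule that uses the most evidence
    return max(applicable, key=lambda rule: len(rule[0]))[1]

print(diagnose({"sneezing"}))                 # cold
print(diagnose({"sneezing", "white_spots"}))  # migraine
```

Note the non-monotonic behavior: adding the new fact `white_spots` retracts the earlier conclusion `cold`, exactly what classical logic alone cannot model.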

The Future

We usually *underestimate* the time and cost required to bring a predicted innovation or technology about (this is probably as true of our personal lives as it is on a grand scale)

But when that innovation or technology does arrive, we usually find ourselves having *underestimated* its impact, power, or effect

Putting this together:

We usually underestimate both (a) the time and cost required to bring a technology or innovation about and (b) the eventual impact, effect, and manifestation of that technology or innovation when it is finally achieved

Information Asymmetry

A notion introduced in this famous paper: wiki and pdf

Scenarios where *Information asymmetry* exists can be very concerning

Extending the concept of *information asymmetry*, we see clearly that a lack of knowledge, expertise, or understanding seriously inhibits inference and the effective use of data (both because the data is absent and because the expertise to understand and use it effectively is missing)

In particular, many decisions are made without consulting existing domains of expertise outside of one's own (enterprise AI people are largely unaware of formal epistemology, etc.)

Smoothing out available information (making information that is valuable to each party, but held by fewer than all of them, available to everyone) and creating new information are key to the *knowledge economy* in the *Age of Information*, both as a general metaphor for how the economy now works and as an indication of which commodity is ultimately valuable within it

Judgment, Analysis, and the Use of Data

Information, alone, is just information

The use of information - determining what it means and knowing what to do with it - is key: both an art and a science

*Inference* involves *judgment*, *insight*, and a degree of *wisdom* and *experience* to correctly reason to those previously mentioned ends (i.e. knowing what it means, how to use it, etc.)

Such considerations have recently sparked a backlash against Big Data - though I'd caution that Big Data itself is awesome; it's our appreciation for logic, inference, and reasoning (both human and machine) that needs to increase significantly and be applied far better

Big Data Creates False Confidence

To Rescue Democracy, Go Outside

Using Big Data Isn't Enough Anymore

Weapons of Math Destruction