# Glossary

**artificial intelligence (AI).** A machine intelligence, or the field which studies intelligent machines. "Narrow AI" displays intelligence only in a narrow domain, like chess or arithmetic. "Strong AI" or "**artificial general intelligence (AGI)**" matches or exceeds human intelligence, including the ability to solve problems in a wide variety of environments. On this website, "AI" means "AGI" unless otherwise specified.

**cognitive bias.** An obstacle to truth produced by one’s mental machinery. (Other obstacles to truth include the cost of information and computation.) Many such obstacles are common and predictable enough that they have names, like the conjunction fallacy and the affect heuristic.

**decision theory.** The study of correct decisions. A core idea is *expected utility maximization*: an agent should choose the action that maximizes their expected utility. Unsolved problems in decision theory arise from thought experiments in which idealized situations challenge the applicability of current decision theories.

**expected utility.** Expected value (in utility).

**expected value.** The average value of all the possible outcomes of an event, each outcome weighted by its probability. Suppose you’re about to roll a die, and you’ll win (in dollars) twice the number you roll, unless you roll a 6, in which case you’ll win 4 times the number you roll. The expected value of rolling the die is [2($1) + 2($2) + 2($3) + 2($4) + 2($5) + 4($6)]/6 = $9.
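The die example can be checked with a short computation (a minimal sketch; the payoff rule is exactly as stated above):

```python
# Expected value of the die game described above:
# you win $2 times the number rolled, except a 6 pays $4 times the roll.
def payoff(roll):
    return 4 * roll if roll == 6 else 2 * roll

# Each face of a fair die has probability 1/6.
expected_value = sum(payoff(roll) * (1 / 6) for roll in range(1, 7))
print(expected_value)  # → 9.0
```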

**intelligence.** Efficient cross-domain optimization. Intelligence is one’s ability to efficiently use available resources to shape the world in accordance with one’s preferences, in a wide variety of environments.

**probability theory.** The study of probability. On this website, "probability" refers to *degrees of belief or uncertainty*, so "probability theory" means *Bayesian* probability theory, which follows from the laws of logic. A core rule of probability theory is Bayes' Theorem.
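Bayes' Theorem states that P(H|E) = P(E|H)·P(H) / P(E). A minimal sketch of a belief update, using made-up numbers for a hypothetical test with 90% sensitivity, a 5% false-positive rate, and a 1% prior:

```python
# Bayes' Theorem: P(H|E) = P(E|H) * P(H) / P(E),
# where P(E) = P(E|H) * P(H) + P(E|~H) * P(~H).
def posterior(prior, likelihood, false_positive_rate):
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical numbers: 1% prior, 90% sensitivity, 5% false positives.
p = posterior(prior=0.01, likelihood=0.90, false_positive_rate=0.05)
print(round(p, 3))  # → 0.154
```

Even strong evidence leaves the posterior modest here, because the prior is low — the point of making belief updates follow the laws of probability.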

**rationality.** Systematized winning. Sometimes called "technical rationality" to distinguish it from "Hollywood rationality." "**Epistemic rationality**" is the craft of obtaining true beliefs, *i.e.* making optimal belief updates according to the laws of logic and probability theory. "**Instrumental rationality**" is the craft of achieving one's goals, *i.e.* making optimal choices in accord with the laws of decision theory.

**utility.** A numerical measure of preference or value. In a utility function, outcomes with higher utilities are preferred to outcomes with lower utilities.

**utility function.** A function that assigns utilities to outcomes. Outcomes with higher utilities are preferred to outcomes with lower utilities. Humans do not have coherent utility functions, nor would we expect them to given how they evolved, which is why they have so much trouble guessing their own utility functions.
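A utility function and the expected-utility maximization rule from the decision theory entry can be sketched together. The outcomes, utilities, and probabilities below are hypothetical, chosen only for illustration:

```python
# A utility function assigns a number to each outcome; higher is preferred.
utility = {
    "umbrella_dry": 8,
    "umbrella_sun": 5,
    "no_umbrella_wet": 0,
    "no_umbrella_sun": 10,
}

# Each action leads to outcomes with some probability
# (hypothetical: a 30% chance of rain).
actions = {
    "take_umbrella": [(0.3, "umbrella_dry"), (0.7, "umbrella_sun")],
    "leave_umbrella": [(0.3, "no_umbrella_wet"), (0.7, "no_umbrella_sun")],
}

def expected_utility(outcomes):
    # Probability-weighted average of the utilities of the possible outcomes.
    return sum(p * utility[o] for p, o in outcomes)

# Expected utility maximization: choose the action whose expected utility is highest.
best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)
```

With these numbers, leaving the umbrella has the higher expected utility (7.0 vs. 5.9), so the rule picks it; change the probabilities or utilities and the choice can flip.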