What is it about Hopper? Every once in a while an artist comes along who articulates an experience, not necessarily consciously or willingly, but with such prescience and intensity that the association becomes indelible. He never much liked the idea that his paintings could be pinned down, or that loneliness was his métier, his central theme. “The loneliness thing is overdone,” he once told his friend Brian O’Doherty, in one of the very few long interviews to which he submitted.
Why, then, do we persist in ascribing loneliness to his work? The obvious answer is that his paintings tend to be populated by people alone, or in uneasy, uncommunicative groupings of twos and threes, fastened into poses that seem indicative of distress. But there’s something else too; something about the way he contrives his city streets. What Hopper’s urban scenes replicate is one of the central experiences of being lonely: the way a feeling of separation, of being walled off or penned in, combines with a sense of near unbearable exposure.
This tension is present in even the most benign of his New York paintings, the ones that testify to a more pleasurable, more equanimous kind of solitude. Morning in a City, say, in which a naked woman stands at a window, holding just a towel, relaxed and at ease with herself, her body composed of lovely flecks of lavender and rose and pale green. The mood is peaceful, and yet the faintest tremor of unease is discernible at the far left of the painting, where the open casement gives way to the buildings beyond, lit by the flannel-pink of a morning sky.
In the tenement opposite there are three more windows, their green blinds half-drawn, their interiors rough squares of total black. If windows are to be thought analogous to eyes, as both etymology, wind-eye, and function suggest, then there exists around this blockage, this plug of paint, an uncertainty about being seen – looked over, maybe; but maybe also overlooked, as in ignored, unseen, unregarded, undesired.
In the sinister Night Windows, these worries bloom into acute disquiet. The painting centres on the upper portion of a building, with three apertures, three slits, giving onto a lighted chamber. At the first window a curtain billows outward, and in the second a woman in a pinkish slip bends over a green carpet, her haunches taut. In the third, a lamp is glowing through a layer of fabric, though what it actually looks like is a wall of flames.
There’s something odd, too, about the vantage point. It’s clearly from above – we see the floor, not the ceiling – but the windows are on at least the second storey, making it seem as if whoever’s doing the looking is hanging suspended in the air. The more likely answer is that they’re stealing a glimpse from the window of the ‘El’, the elevated train, which Hopper liked to ride at night, armed with his pads, his fabricated chalk, gazing avidly through the glass for instances of brightness, moments that fix, unfinished, in the mind’s eye. Either way, the viewer – me, I mean, or you – has been co-opted into an estranging act. Privacy has been breached, but it doesn’t make the woman any less alone, exposed in her burning chamber.
KAKISTOCRACY (n.)
Government by the least qualified or most unprincipled citizens; a form of government in which the worst people are in power.
[Origin: Greek “kakistos” or “worst”, the superlative form of “kakos” or “bad”. “Kakos” is closely related to “caco” or “defecate”.]

By DAVID Z. HAMBRICK and ALEXANDER P. BURGOYNE
NY Times: Sept 16, 2016
Are you intelligent — or rational? The question may sound redundant, but in recent years researchers have demonstrated just how distinct those two cognitive attributes actually are.
It all started in the early 1970s, when the psychologists Daniel Kahneman and Amos Tversky conducted an influential series of experiments showing that all of us, even highly intelligent people, are prone to irrationality. Across a wide range of scenarios, the experiments revealed, people tend to make decisions based on intuition rather than reason.
In one study, Professors Kahneman and Tversky had people read the following personality sketch for a woman named Linda: “Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they asked the subjects which was more probable: (A) Linda is a bank teller or (B) Linda is a bank teller and is active in the feminist movement. Eighty-five percent of the subjects chose B, even though logically speaking, A is more probable. (All feminist bank tellers are bank tellers, though some bank tellers may not be feminists.)
In the Linda problem, we fall prey to the conjunction fallacy — the belief that the co-occurrence of two events is more likely than the occurrence of one of the events. In other cases, we ignore information about the prevalence of events when judging their likelihood. We fail to consider alternative explanations. We evaluate evidence in a manner consistent with our prior beliefs. And so on. Humans, it seems, are fundamentally irrational.
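To make the underlying logic explicit, here is the conjunction rule in symbols; this is standard probability theory rather than anything stated in the article itself. For any events A and B,

\[
P(A \cap B) \;=\; P(A)\,P(B \mid A) \;\le\; P(A), \qquad \text{since } 0 \le P(B \mid A) \le 1.
\]

In the Linda problem, let A be “Linda is a bank teller” and B be “Linda is active in the feminist movement”. However plausible B sounds given her personality sketch, the joint event A-and-B can never be more probable than A alone, which is why option (A) is the logically correct answer.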
But starting in the late 1990s, researchers began to add a significant wrinkle to that view. As the psychologist Keith Stanovich and others observed, even the Kahneman and Tversky data show that some people are highly rational. In other words, there are individual differences in rationality, even if we all face cognitive challenges in being rational. So who are these more rational people? Presumably, the more intelligent people, right?
Wrong. In a series of studies, Professor Stanovich and colleagues had large samples of subjects (usually several hundred) complete judgment tests like the Linda problem, as well as an I.Q. test. The major finding was that irrationality — or what Professor Stanovich called “dysrationalia” — correlates relatively weakly with I.Q. A person with a high I.Q. is about as likely to suffer from dysrationalia as a person with a low I.Q. In a 2008 study, Professor Stanovich and colleagues gave subjects the Linda problem and found that those with a high I.Q. were, if anything, more prone to the conjunction fallacy.
Based on this evidence, Professor Stanovich and colleagues have introduced the concept of the rationality quotient, or R.Q. If an I.Q. test measures something like raw intellectual horsepower (abstract reasoning and verbal ability), a test of R.Q. would measure the propensity for reflective thought — stepping back from your own thinking and correcting its faulty tendencies.
There is also now evidence that rationality, unlike intelligence, can be improved through training. In a pair of studies published last year in Policy Insights From the Behavioral and Brain Sciences, the psychologist Carey Morewedge and colleagues had subjects (more than 200 in each study) complete a test to assess their susceptibility to various decision-making biases. Then, some of the subjects watched a video about decision-making bias, while others played an interactive computer game designed to decrease bias via simulations of real-world decision making.
In the interactive games, following each simulation, a review gave the subjects instruction on specific decision-making biases and individualized feedback on their performance. Immediately after watching the video or receiving the computer training, and then again after two months, the subjects took a different version of the decision-making test.
Professor Morewedge and colleagues found that the computer training led to statistically large and enduring decreases in decision-making bias. In other words, the subjects were considerably less biased after training, even after two months. The decreases were larger for the subjects who received the computer training than for those who received the video training (though decreases were also sizable for the latter group). While there is scant evidence that any sort of “brain training” has any real-world impact on intelligence, it may well be possible to train people to be more rational in their decision making.
It is, of course, unrealistic to think that we will ever live in a world where everyone is completely rational. But by developing tests to identify the most rational among us, and by offering training programs to decrease irrationality in the rest of us, scientific researchers can nudge society in that direction.
Terrence Malick doing what he does best. This time, no pesky actors to distract him from the breathtaking vistas.