Journalist/developer. Interactive editor @frontlinepbs. Builder of @HomicideWatch. Sinophile for fun. Past: @WBUR, @NPR, @NewsHour.
1360 stories · 43 followers

Cast Iron Pan

If you want to evenly space them, it's easiest to alternate between the Arctic and Antarctic. Some people just go to the Arctic twice, near the equinoxes so the visits are almost 6 months apart, but it's not the same.
chrisamico · 13 hours ago · Boston, MA
Sometimes XKCD hits close to home.
2 public comments
jheiss · 7 hours ago
If soap is destroying your "seasoning" then you don't actually have seasoning.
ManBehindThePlan · 10 hours ago
The obsession with cast iron care is laughable, considering that the pioneers rode with these pans over the Rockies. Of course, they actually used them, so the patina was always maintained.

Statistical diversity in US newsrooms


If a news organization wants to talk about the world in a fair way, it needs points of view from a group of people representative of that world. Otherwise, bias comes into play no matter how hard you try. Google Trends looks at how different groups are represented in major news organizations across the country.


chrisamico · 2 days ago · Boston, MA

Research Risks

The 1919 Great Boston Molasses Flood remained the deadliest confectionery containment accident until the Canadian Space Agency's 2031 orbital maple syrup delivery disaster.
chrisamico · 2 days ago · Boston, MA
4 public comments
lamontcg · 2 days ago
Seems like the people studying supernova precursor candidate stars in the Local Group, and the possibility of the neutrino flux sterilizing all life on Earth, would push astronomy more to the right...
snarfed · 2 days ago
made me https://www.google.com/search?q=define%3Amycology 😂
mooglemoogle · 2 days ago · Virginia
Two things:
1. Dentistry seems too low on the supervillain scale.
2. I like that paleontology isn’t quite all the way left.
jepler · 2 days ago · Earth, Sol system, Western spiral arm
I just knew the mouseover text would be about "molasses storage".

The Supreme Court Is Allergic To Math


The Supreme Court does not compute. Or at least some of its members would rather not. The justices, the most powerful jurists in the land, seem to have a reluctance — even an allergy — to taking math and statistics seriously.

For decades, the court has struggled with quantitative evidence of all kinds in a wide variety of cases. Sometimes justices ignore this evidence. Sometimes they misinterpret it. And sometimes they cast it aside in order to hold on to more traditional legal arguments. (And, yes, sometimes they also listen to the numbers.) Yet the world itself is becoming more computationally driven, and some of those computations will need to be adjudicated before long. Some major artificial intelligence case will likely come across the court’s desk in the next decade, for example. By voicing an unwillingness to engage with data-driven empiricism, justices — and thus the court — are at risk of making decisions without fully grappling with the evidence.

This problem was on full display earlier this month, when the Supreme Court heard arguments in Gill v. Whitford, a case that will determine the future of partisan gerrymandering — and the contours of American democracy along with it. As my colleague Galen Druke has reported, the case hinges on math: Is there a way to measure a map’s partisan bias and to create a standard for when a gerrymandered map infringes on voters’ rights?

The metric at the heart of the Wisconsin case is called the efficiency gap. To calculate it, you take the difference between each party’s “wasted” votes — votes for losing candidates and votes for winning candidates beyond what the candidate needed to win — and divide that by the total number of votes cast. It’s mathematical, yes, but quite simple, and aims to measure the extent of partisan gerrymandering.
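
As a worked illustration of that formula, here is a minimal sketch in Python. The five-district vote totals are invented for the example, and the sign convention (a positive gap means the map wastes more of party A's votes) is a choice made for this sketch, not part of the measure's definition:

```python
def wasted_votes(votes_a, votes_b):
    # Wasted votes in a two-party district: every vote for the loser,
    # plus the winner's votes beyond the bare majority needed to win.
    threshold = (votes_a + votes_b) // 2 + 1
    if votes_a > votes_b:
        return votes_a - threshold, votes_b
    return votes_a, votes_b - threshold

def efficiency_gap(districts):
    # districts: list of (party A votes, party B votes) tuples.
    # Positive result: the map wastes more of party A's votes.
    wasted_a = wasted_b = total = 0
    for a, b in districts:
        wa, wb = wasted_votes(a, b)
        wasted_a += wa
        wasted_b += wb
        total += a + b
    return (wasted_a - wasted_b) / total

# Hypothetical five-district state: party A is packed into one
# lopsided district while party B narrowly wins the other four.
districts = [(85, 15), (40, 60), (42, 58), (41, 59), (43, 57)]
print(f"Efficiency gap: {efficiency_gap(districts):.1%}")  # 31.0%
```

With these made-up numbers, party A wins one packed district and narrowly loses the rest, producing a gap of 31 percent; a map that wasted both parties' votes evenly would score zero.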

Four of the eight justices who regularly speak during oral arguments voiced anxiety about using calculations to answer questions about bias and partisanship. Some said the math was unwieldy, complicated, and newfangled. One justice called it “baloney” and argued that the difficulty the public would have in understanding the test would ultimately erode the legitimacy of the court.

Justice Neil Gorsuch balked at the multifaceted empirical approach that the Democratic team bringing the suit is proposing be used to calculate when partisan gerrymandering has gone too far, comparing the metric to a secret recipe: “It reminds me a little bit of my steak rub. I like some turmeric, I like a few other little ingredients, but I’m not going to tell you how much of each. And so what’s this court supposed to do? A pinch of this, a pinch of that?”

Justice Stephen Breyer said, “I think the hard issue in this case is are there standards manageable by a court, not by some group of social science political ex … you know, computer experts? I understand that, and I am quite sympathetic to that.”


And Chief Justice John Roberts, most of all, dismissed the modern attempts to quantify partisan gerrymandering: “It may be simply my educational background, but I can only describe it as sociological gobbledygook.” This was tough talk — justices had only uttered the g-word a few times before in the court’s 228-year history. Keep in mind that Roberts is a man with two degrees from Harvard and that this case isn’t really about sociology. (Although he did earn a rebuke from the American Sociological Association for his comments.) Roberts later added, “Predicting on the basis of the statistics that are before us has been a very hazardous enterprise.” FiveThirtyEight will apparently not be arguing any cases before the Supreme Court anytime soon.

This allergy to statistics and quantitative social science — or at least to their legal application — seems to present a perverse incentive to would-be gerrymanderers: The more complicated your process is, and therefore the more complicated the math would need to be to identify the process as unconstitutional, the less likely the court will be to find it unconstitutional.


But this trouble with math isn’t limited to this session’s blockbuster case. This term, the justices will encounter data again when they hear a case about the warrantless seizure of cell phone records. The Electronic Frontier Foundation, the Data & Society Research Institute, and empirical scholars of the Fourth Amendment, among others, have filed briefs in the case.

“This is a real problem,” Sanford Levinson, a professor of law and government at the University of Texas at Austin, told me. “Because more and more law requires genuine familiarity with the empirical world and, frankly, classical legal analysis isn’t a particularly good way of finding out how the empirical world operates.” But top-level law schools like Harvard — all nine current justices attended Harvard or Yale — emphasize exactly those traditional, classical legal skills, Levinson said.

In 1897, before he had taken his seat on the Supreme Court, Oliver Wendell Holmes delivered a famous speech at Boston University, advocating for empiricism over traditionalism: “For the rational study of the law … the man of the future is the man of statistics and the master of economics. It is revolting to have no better reason for a rule of law than that so it was laid down in the time of Henry IV.” We hadn’t made much progress in the 500 years between Henry IV and Holmes, and we haven’t made much in the 120 years between Holmes and today. “What Roberts is revealing is a professional pathology of legal education,” Levinson said. “John Roberts is very, very smart. But he has really a strong anti-intellectual streak in him.”

I reached Eric McGhee, a political scientist and research fellow at the Public Policy Institute of California who helped develop the central gerrymandering measure, a couple days after the oral argument. He wasn’t surprised that some justices were hesitant, given the large amount of analysis involved in the case, including his metric. But he did agree that the court’s numbers allergy would crop up again. “There’s a lot of the world that you can only understand through that kind of analysis,” he said. “It’s not like the fact that a complicated analysis is necessary tells you that it’s not actually happening.”

During the Gill v. Whitford oral argument, the math-skeptical justices groped for an out — a simpler legal alternative that could save them from having to fully embrace the statistical standards in their decisionmaking. “When I read all that social science stuff and the computer stuff, I said, ‘Is there a way of reducing it to something that’s manageable?’” said Justice Breyer, who is nevertheless expected to vote with the court’s liberal bloc.

It’s easy to imagine a situation where the answer for this and many other cases is, simply, “No.” The world is a complicated place.


Documentation of the court’s math problem fills pages in academic journals. “It’s one thing for the court to consider quantitative evidence and dismiss it based on its merits” — which could still happen here, as Republicans involved in the Wisconsin case have criticized the efficiency gap method — “but we see a troubling pattern whereby evidence is dismissed based on sweeping statements, gut reactions and logical fallacies,” Ryan Enos, a political scientist at Harvard, told me.

One stark example: a 1987 death penalty case called McCleskey v. Kemp. Warren McCleskey, a black man, was convicted of murdering a white police officer and was sentenced to death by the state of Georgia. In appealing his death sentence, McCleskey cited sophisticated statistical research, performed by two law professors and a statistician, that found that a defendant in Georgia was more than four times as likely to be sentenced to death if the victim in a capital case was white than if the victim was black. McCleskey argued that that discrepancy violated his 14th Amendment right to equal protection. In his majority opinion, Justice Lewis Powell wrote, “Statistics, at most, may show only a likelihood that a particular factor entered into some decisions.” McCleskey lost the case. It’s been cited as one of the worst decisions since World War II and has been called “the Dred Scott decision of our time.”


Another instance of judicial innumeracy: the Supreme Court’s decision on a Fourth Amendment case about federal searches and seizures called Elkins v. United States in 1960. In his majority opinion, Justice Potter Stewart discussed how no data existed showing that people in states that had stricter rules regarding the admission of evidence obtained in an unlawful search were less likely to be subjected to these searches. He wrote, “Since, as a practical matter, it is never easy to prove a negative, it is hardly likely that conclusive factual data could ever be assembled.”

This, however, is silly. It conflates two meanings of the word “negative.” Philosophically, sure, it’s difficult to prove that something does not exist: No matter how prevalent gray elephants are, their numbers alone can’t prove the nonexistence of polka-dotted elephants. Arithmetically, though, scientists, social and otherwise, demonstrate negatives — as in a decrease, or a difference in rate — all the time. There’s nothing special about these kinds of negatives. Some drug tends to lower blood pressure. The average lottery player will lose money. A certain voting requirement depresses turnout.
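
For a sense of how ordinary this is, here is a minimal sketch in Python, using made-up turnout numbers for a hypothetical voting requirement, that estimates exactly such a negative as a difference in rates with a confidence interval:

```python
import math

# Made-up numbers for a hypothetical voting requirement.
n_with, voted_with = 10_000, 5_400        # group subject to the requirement
n_without, voted_without = 10_000, 6_100  # comparison group

p1 = voted_with / n_with
p2 = voted_without / n_without
diff = p1 - p2  # negative: turnout is lower under the requirement

# Normal-approximation standard error and 95% confidence interval
# for a difference in two proportions.
se = math.sqrt(p1 * (1 - p1) / n_with + p2 * (1 - p2) / n_without)
lo, hi = diff - 1.96 * se, diff + 1.96 * se

print(f"Turnout difference: {diff:+.1%} (95% CI {lo:+.1%} to {hi:+.1%})")
# An interval entirely below zero demonstrates a decrease, exactly the
# kind of "negative" the opinions say can never be proved.
```

With these invented numbers, the interval runs from roughly -8.4 to -5.6 percentage points, a demonstrated decrease, no philosophy required.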

Enos and his coauthors call this the “negative effect fallacy,” a term they coined in a paper published in September. It’s just one example, they wrote, of an empirical misunderstanding that has proliferated through decades of judges’ thinking, affecting cases concerning “free speech, voting rights, and campaign finance.”

Another example of this fallacy, they wrote, came 50 years later in Arizona Free Enterprise v. Bennett, a 2011 campaign finance case. The topic was Arizona’s public campaign financing system, specifically a provision that provided matching funds to publicly financed candidates. The question was whether this system impinged on the free speech of the privately funded candidates. A group of social scientists, including Enos, found that private donations weren’t following the kind of patterns they’d expect to see if the public funding rule were affecting how donors behaved. The Supreme Court didn’t care and ultimately struck down the provision.

In his majority opinion, John Roberts echoed Stewart and repeated the fallacy, writing that “it is never easy to prove a negative.”


So what can be done?

McGhee, who helped develop the efficiency gap measure, wondered if the court should hire a trusted staff of social scientists to help the justices parse empirical arguments. Levinson, the Texas professor, felt that the problem was a lack of rigorous empirical training at most elite law schools, so the long-term solution would be a change in curriculum. Enos and his coauthors proposed “that courts alter their norms and standards regarding the consideration of statistical evidence”; judges are free to ignore statistical evidence, so perhaps nothing will change unless they take this category of evidence more seriously.

But maybe this allergy to statistical evidence is really a smoke screen — a convenient way to make a decision based on ideology while couching it in terms of practicality.

“I don’t put much stock in the claim that the Supreme Court is afraid of adjudicating partisan gerrymanders because it’s afraid of math,” Daniel Hemel, who teaches law at the University of Chicago, told me. “[Roberts] is very smart and so are the judges who would be adjudicating partisan gerrymandering claims — I’m sure he and they could wrap their minds around the math. The ‘gobbledygook’ argument seems to be masking whatever his real objection might be.”

But if the chief justice hides his true objections behind a feigned inability to grok the math, well, that’s a problem math can’t solve.





chrisamico · 3 days ago · Boston, MA
1 public comment
skorgu · 2 days ago
:(
kazriko · 2 days ago
I think in general most branches of the government have an allergy to math, statistics, and even quantifying the results of their policies.

Map of Santa Rosa fires


Using both satellite images and ground surveys, The New York Times maps the damage from the fires in Santa Rosa. Crazy. I live a couple of hours away from the area and I could still smell the smoke.

See also Nicolette Hayes’ more personal map.


chrisamico · 4 days ago · Boston, MA

Santa Rosa Fire satellite imagery


By: Brynne Morris

Santa Rosa residents forced to evacuate can now see if their homes and neighborhoods are OK. DigitalGlobe has been capturing and sharing updated imagery of the fire all week. The largest satellite company in the world has super-powerful sensors that can see through smoke at very high resolution. DigitalGlobe opened up the imagery to help first responders and the community see what is happening across such a wide area of destruction. This imagery is being processed by Robin Kraft, an earth observation expert and developer who grew up in Santa Rosa and wanted to see if his father’s house was intact. Since Wednesday, he has been updating the imagery daily to provide others with the same resource. Here is a look at Robin’s app:

In the video above, you can see imagery of the area from July 2017, and as the slider moves, it shows updated imagery from October 12 at 12:30 pm. Red indicates vegetation like trees and shrubbery, and in grey you can see burned houses and rubble. The map covers areas in and around Santa Rosa, Sonoma, and Napa.
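
The red-for-vegetation rendering is a standard false-color composite, in which the near-infrared band is displayed in the red channel. Here is a minimal sketch of how such a composite could be built in Python; the file name and the band ordering are assumptions made for illustration, not details from Robin's app:

```python
import numpy as np
import rasterio
from PIL import Image

def stretch(band):
    # Rescale a band to 0-255 using a 2nd-98th percentile contrast stretch.
    low, high = np.percentile(band, (2, 98))
    return np.clip((band - low) / (high - low) * 255, 0, 255).astype(np.uint8)

# Hypothetical GeoTIFF; band order (1=blue, 2=green, 3=red, 4=NIR) is assumed.
with rasterio.open("santa_rosa_2017-10-12.tif") as src:
    green = src.read(2).astype(np.float32)
    red = src.read(3).astype(np.float32)
    nir = src.read(4).astype(np.float32)

# False color: NIR in the red channel, so healthy vegetation shows up
# red while burned lots read as grey rubble.
composite = np.dstack([stretch(nir), stretch(red), stretch(green)])
Image.fromarray(composite).save("false_color.png")
```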

Please share this map so everyone in the area has access to the imagery — it will be updated all weekend until the fire is contained.

You can also view the original map Robin has been sharing here.

Originally published in Points of interest on Medium.

chrisamico · 6 days ago · Boston, MA
Too many familiar places in this map.