Journalist/developer. Storytelling developer @ USA Today Network. Builder of @HomicideWatch. Sinophile for fun. Past: @frontlinepbs @WBUR, @NPR, @NewsHour.
2046 stories · 45 followers

With six words, Michelle Obama rewires America’s conversation on race

In her speech at the Democratic National Convention in 2016, Michelle Obama coined one of the defining phrases of the political era: “When they go low, we go high.”

Going high did not work. Donald Trump won that election. While many of his supporters expressed discomfort with his go-low approach to politics, far more embraced it. Trump, despite his pedigree as a New York billionaire, would embarrass and attack and disparage the perceived elites, and many Americans loved him for it.

Lesson learned. In her speech Tuesday night at the 2024 Democratic convention, Obama didn’t explicitly revoke the “we go high” mantra, but she made clear that a different moment called for a different approach. It wasn’t that the former first lady went low, exactly, but she was unsparing in her disdain for and criticisms of her husband’s successor.

In one of the more memorable stretches of her speech, she equated the Democratic nominee, Vice President Kamala Harris, with the majority of Americans who never enjoyed Trump’s wealth and privilege — and the safety net that accompanies them.

Harris “understands that most of us will never be afforded the grace of failing forward,” Obama said. “We will never benefit from the affirmative action of generational wealth. If we bankrupt a business or choke in a crisis, we don’t get a second, third or fourth chance. If things don’t go our way, we don’t have the luxury of whining or cheating others to get further ahead. No.

“We don’t get to change the rules so we always win,” she continued. “If we see a mountain in front of us, we don’t expect there to be an escalator waiting to take us to the top. We put our heads down. We get to work.”

Trump’s name wasn’t used, but it didn’t need to be. That line about the escalator, a callback to Trump’s 2015 campaign launch, made the point obvious, if it wasn’t already.

But there are six words in that stretch that extend well beyond Trump. Obama used a phrase that succinctly and elegantly reframes the ongoing debate over inequality in the United States and how it might be addressed: “the affirmative action of generational wealth.”

It’s concise, centered on two familiar concepts. The first is “affirmative action,” the term used to describe programs generally focused on ensuring that non-White Americans have access to resources and institutions they might not otherwise have. And the second is “generational wealth,” the transition of economic (and social) power through families and, at times, communities.

These are descriptors of elements in American society that are in tension. If you are a recipient of generational wealth, you don’t need affirmative action to ensure you have access. If you are someone who would benefit from affirmative action, you generally are not someone with access to generational wealth. Of course, you might be, and such outlier cases are often used to criticize affirmative action programs: They often center more on demographic traits than on economic class.

The linchpin of Obama’s phrase, though, is its shortest word: “of.” She isn’t contrasting affirmative action and generational wealth as conduits to power and success, she’s overlapping them. She’s noting that generational wealth is a form of affirmative action, here in the person of Trump but certainly beyond that.

How? Because generational wealth presents opportunities to people who might otherwise not have access to them: legacy admissions at Ivy League colleges, tutors and training, vehicles and housing that make entry-level jobs or internships more feasible. These are benefits that derive from social and economic class — a form of affirmative action. This is how reframing a subject works; it presents familiar information in a new context.

The natural response, of course, is that a parent bolstering her child’s success is different from a government program that includes an effort to ensure that Black Americans have equal access. But this is the point of the word “generational.” We’re not simply considering a rich parent and the advantages they might offer. We’re focused on patterns of wealth transitioning from parent to child over and over again. And those patterns, traced backward over surprisingly few decades, very quickly bring us back to racial divisions.

There is no question that Black and White Americans did not have equal access to economic success in the 1950s or 1960s. They didn’t in later decades, either, thanks to ongoing overt discrimination (like being unable to rent an apartment) and discriminatory patterns built into lending and job systems (such as making it harder to obtain a mortgage for homes in some neighborhoods). Nearly every American has a parent or grandparent who was alive in the era of explicit discrimination — that’s two generations away. Generational wealth, then, almost necessarily means wealth rooted in an American economy where explicit discrimination existed. It also means wealth that still enjoys the sorts of systemic protections and advantages, including ones from the government, that are pilloried when focused on addressing historical inequality.

One of the central debates over race in recent years has centered on the existence or extent of racism embedded in American social and legal systems. The rise of the Black Lives Matter movement, focused on systemic racism in law enforcement, increased the number of White Americans — specifically, White Democrats — who indicated that they thought discrimination was a central cause for the lower incomes and worse housing many Black Americans experience.

The biennial General Social Survey finds that Republicans, reflecting their broad rejection of the idea of systemic racism, are much more likely to indicate that Black Americans have worse economic positions due to a lack of motivation.

Republicans reject the idea of systemic racism, in part, because they view it as an unfair and unpatriotic disparagement of the United States. It’s in part, too, because the narrative of America overcoming explicit racism during the Civil Rights movement suggests that the fight is over. Many point to Michelle Obama’s husband: How could racism exist in an America that elected a Black man as president?

It’s also in part because the Black Lives Matter movement and questions about racism in general are coded as Democratic issues and therefore subject to partisan response. Black Lives Matter led to the right embracing Blue Lives Matter. Discussions of systemic racism were met with many White Republicans viewing themselves as victims of anti-White racism (to Trump’s political benefit). Affirmative action programs became a useful target for demonstrating that sort of anti-White bias.

Michelle Obama knows this. Her line overlapping affirmative action and generational wealth wasn’t offering “affirmative action” as a pejorative term. It was, instead, contextualizing a different way in which people are boosted by circumstances that aren’t always under their control. It was a defense of affirmative action programs that noted how wealth built in an explicitly unfair economy was its own form of unearned advantage.

It was pointed at Trump, yes. But it’s a reframing that rewires the conversation about race and advantage in a striking way, in six words that will likely have more staying power, if not more success, than “we go high.”

CJI and The Invention of Pitcraft — The Fight Primer

Welcome to a new era of technology growth for the Minnesota Star Tribune

By Aron Pilhofer

Chief Product Officer / Star Tribune

August 18, 2024 at 1:24AM

If you’re reading this, you may have noticed things are starting to look a little different around here. The Minnesota Star Tribune’s website and apps received a technology reboot and I’m excited to tell you more about it. We’ve been hard at work updating our digital products to better serve you, and our recent digital reboot is just the start.

We’ve made upgrades to our platforms because we’re committed to providing the best possible experience for everyone who comes to the Minnesota Star Tribune. We think Minnesotans deserve an excellent digital news product that keeps you informed about all that’s happening in your community and the world.

Some of the major enhancements you’ll see include:

  • An improved, fully responsive design so you can more easily find the news you care about.
  • A seamless experience across all our digital properties whether you are on desktop, laptop, tablet or mobile.
  • Better performance across all platforms. Our apps will feel snappier and more up to date. Our desktop and mobile sites will load more quickly than ever before.
  • A new, modern homepage design that offers an improved reading experience with larger imagery and clearer story hierarchy. The most important and relevant stories will feel that way.
  • A greatly improved breaking news experience.
  • Reimagined article pages that better showcase our photojournalism with cleaner headlines and easier-to-read copy.
  • Rebuilt apps for iOS and Android featuring updated design, better performance and loads of new features.
  • Better accessibility for readers of all abilities.

All these improvements are designed with you in mind. We want to build the best possible experience as we execute our vision to create the leading model for local news in America — driving innovation in media to make every Minnesotan’s life better. These platform enhancements are one of the first milestones in achieving that vision.

And we’re just getting started.

We aim to offer more features to enrich the experience of consuming news from the Minnesota Star Tribune. Greater personalization, new story forms and better integration of audio and video content are just a few examples on our roadmap.

Like any technology upgrade, we’re going to launch and iterate, getting user feedback to make sure things are working well. We know we won’t get it 100% right, and we want to hear from you when we don’t. Our technology reboot is just the beginning of some great things to come.

None of this is possible without your help.

The new design and layout of our website and apps have been created through ongoing collaboration with our audience and all our teams at the Minnesota Star Tribune. We’re proud to offer the features, layout and functionality that you’ve told us matter the most. If you have feedback, or would like to participate in our ongoing enhancement process, please reach out at help.startribune.com.

The early days of OpenStreetMap | Système D.

We didn’t realise it at the time, but OpenStreetMap came from a unique moment in internet history.

It was still the time of the Old Internet. The time when you, or anyone, could upload a few pages to your own “webspace” or “homepage”, writing about what took your fancy. You didn’t have to worry about Facebook and Google as gatekeepers, or HTTPS, or your server being compromised within five minutes of turning on, or GDPR, or front-end frameworks, or any of the curses of the modern web.

And it was the brief flowering of Web 2.0… before it all went corporate. Sites like Flickr were young and independent. Every month saw the launch of some new crowdsourced knowledge base.

The origin myth of OSM is that of crusty GIS types, national mapping agencies, and cartographers sneering “you can’t map the world with volunteers”. But of course we could do it. We never doubted that we could. In 2004 it was obvious that you could build OSM.

It was so obvious that several people had the same idea at broadly the same time. Steve Coast started OpenStreetMap with Matt Amos and Tom Carden. Jo Walsh and Schuyler Erle started London Free Map (also 2004?). Nick Whitelegg started freemap (Oct 2004). I started Geowiki after hatching the idea with a bunch of friends from university (Sep 2002). We all coalesced around OSM because Steve went out and evangelised for it, speaking at endless LUGs and Dorkbots and hack weekends, whereas the rest of us essentially wanted to sit at home, hack on code or draw maps.

2004 was when the homegrown web met the participatory web, before it all got turned to shit by Facebook and Google. You wouldn’t start OSM in 2024: the nascent project would be squashed by Google, or founder under Reddit/HN users’ expectations of “why not good yet”, or flame out in some controversy over privacy or disputed borders. But we got through all of that in our early years, when the world was simpler.

What’s remarkable about OSM is not that it started, but that it thrived. Other OSM-adjacent projects from the same period didn’t. OpenGuides could have been the Time Out or Rough Guides to OSM’s A-Z, but petered out. OpenFlights too. Wikitravel got nobbled by a private buyer, re-emerged as Wikivoyage, but never really got traction.

Why did it thrive?

It didn’t, at first. In 2004 and 2005 OSM was still not much more than an idea. Events were still mainly Steve going out evangelising. The tech was an endless succession of false starts. The server was unreliable. There were 1000 users by December 2005, but they didn’t map much. It took until January 2006 before Britain’s (fairly few) motorways were mapped.

But a community was slowly building around the idea. People were talking and a common purpose was being forged. From that common purpose, a handful of people identified pieces in the jigsaw they could fit. Imi Scholz wrote JOSM (December 2005), Artem Pavlenko wrote Mapnik (winter 2006), and I wrote Potlatch (March 2007). Glue code like the openstreetmap.org website itself (rewritten in Rails in May 2007) and mod_tile/renderd (December 2007) held it all together. Add Yahoo imagery (December 2006), and OSM was finally at the stage where you could draw a road and it would appear on a map. We had a conference in 2007… and something to talk about.

Since then, it’s been incremental. OSM today looks not much like it did in 2004, but quite a lot like it did in 2007. 9th August 2004 is OpenStreetMap’s official birthday, but 0-year-olds don’t do very much. It took a few years before we learned how to walk.

Posted on Sunday 11 August 2024.

Towards Standardizing Place

Alameda & Oakland, as seen in Overture's Explore tool.

Why I’m Excited for Overture’s GERS

Last week, the Overture Maps Foundation announced the General Availability of its global maps datasets. Exiting beta for the places, buildings, divisions, and base layers is a tremendous achievement for all involved. I’ve been lucky enough to participate in Overture for the last year and a half, through my work at Precisely. I’m especially excited to help guide Overture’s Global Entity Reference System, or GERS.

It’s been hard to express my excitement, especially to non-geospatial geeks, so I’ll attempt to explain here.

Overture’s Global Entity Reference System has a real shot at standardizing how datasets and systems deal with place.

Coordinate reference systems (CRS) do a great job defining how we can describe locations as coordinate sets in our databases. Our de facto standard, WGS 84, works with all the major map platforms and databases, is the lingua franca of the GPS network, and has plenty of precision for nearly all GIS use cases [1]. As a standard for describing where a location is, WGS 84 is excellent and (almost always) interoperable. It has standardized location in our databases, datasets, and data software.

But these systems cannot standardize place.

The difference between locations and places is an important nuance. Humans build out, demarcate, and describe discrete venues and areas in space. The questions we ask only deal in coordinates because they have to; we’d rather ask questions about roads, paths, houses, stores, schools, forests, arenas, cities, and more – the places we define to organize the world.

So what do we need to standardize place? Let’s back up and understand what we need to build a standard. We need:

  1. A Source of Truth: An agreed-upon reference point to benchmark our measurements. For WGS 84, this source of truth is (0,0), the point where the Prime Meridian intersects with the Equator.
  2. Mechanisms for Disseminating the Truth: Networks and tools for taking measurements and ascertaining our relationship with the source of truth. For WGS 84, this is a complex system including GPS satellites, antennas, geolocation APIs, and geospatial software for computing and working with coordinates.
  3. A Format for Expressing Your Position Relative to the Truth: A set system for expressing our positions relative to the benchmark. For WGS 84, this is a coordinate pair (with a third metric for elevation, if you’d like), which is easily stored and shared with minimal hiccups.

These three ingredients are critical and present with every standard.

My favorite example is our current standardization of time, the trio of UTC, NTP, and UNIX Time. Each element respectively maps to the requirements: the source of truth (UTC), the mechanism for dissemination (NTP), and an expressive format (UNIX time).

This standardization, like all of them, was hard-won over decades. Hundreds of years even, if you follow UTC back to its ancestor Greenwich Mean Time and other standards that arose in response to steam trains [2].

Midtown Manhattan, as seen by Overture's places, buildings, and base layers.

To standardize place, our source of truth must be much more complex than WGS 84’s or UTC’s. Any standard must know about our geography – those roads, houses, parks, and more. We can’t use a single location to benchmark everything. We need a dataset that knows about all the places we want to associate with and ask questions of. This dataset must be open and interrogable by all. Otherwise, we stand no hope of building our second requirement, the dissemination mechanisms that allow people to situate themselves and their data against the source of truth.

But even “open” isn’t enough. It must be easy to access. If a dataset is open but requires you to download a massive file and stage it in a specialized database – unfamiliar to most – is it really accessible?

This is why OpenStreetMap isn’t sufficient to standardize place. OSM is an amazing project, focused on making the best map freely available. But its dissemination mechanism is the map itself; everything is done in service of building that final map view.

As an accessible dataset, OSM is lacking. Planet files and their extracts are unruly and intimidating. Getting one to the point where you can query it is challenging for anyone unfamiliar with the OSGeo stack. Further, OSM’s tagging system – while perfectly suited to a collaborative, global map-building project – isn’t ideal for data exploration or analytics (though some projects are helping).

Thankfully, Overture’s main product is its data, not a map. This data is staged accessibly, on AWS and Azure, in cloud-native geoparquet. Accessing the entire corpus, a single layer, or a subset of either can be done with AWS, Azure, or Overture’s Python CLI tools. If you use Google Cloud or Snowflake, CARTO staged the data on both.

Heck, you can use duckdb to query the data remotely. Here’s how you get all the places in my home ZIP code:
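
What follows is a minimal sketch of that kind of query rather than a canonical recipe: it assumes DuckDB’s httpfs extension, an Overture release tag you’d swap for the current one, and a placeholder bounding box standing in for the ZIP code. Column names follow the places schema as of the GA release, so check them against the current docs.

    -- Sketch: query Overture places directly from S3 with DuckDB.
    -- The release tag and bounding box below are placeholders.
    INSTALL httpfs;
    LOAD httpfs;
    SET s3_region = 'us-west-2';

    SELECT
      id,                              -- the GERS ID for each place
      names.primary      AS name,
      categories.primary AS category
    FROM read_parquet(
      's3://overturemaps-us-west-2/release/2024-08-20.0/theme=places/type=place/*',
      hive_partitioning = 1
    )
    WHERE bbox.xmin > -122.30 AND bbox.xmax < -122.22   -- placeholder bounding box,
      AND bbox.ymin >   37.73 AND bbox.ymax <   37.79;  -- roughly Alameda, CA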

If you’re after a smaller region, use Overture’s Explorer interface and click the “Download Visible” button.

The data is easy to access. It’s early days, but the foundations are laid for robust dissemination mechanisms to emerge: tools that quickly and easily associate your data with the source of truth that is the dataset itself.

All the places in my neighborhood.

The last requirement for a standard is the expressive format, which also happens to be my favorite part of Overture: GERS.

Every entity in Overture’s data products – each road, building, place, address, city, country, and more – has a unique GERS ID. This ID is intended to be stable and traceable, “providing a mechanism to match features across datasets, track data stability, and detect errors in the data.”

GERS IDs are 128 bits and 32 characters. They don’t encode an entity’s coordinates; rather, they serve as a pointer to the entity in Overture’s data. GERS, coupled with Overture datasets, is a potential standard for identifying the places we’ve defined in our world.
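
To make the pointer idea concrete, here is a small hedged sketch: "overture_places" stands for a hypothetical local table or view over the places layer, and the ID shown is a placeholder rather than a real identifier.

    -- A GERS ID carries no coordinates; you resolve it with a keyed lookup.
    -- 'overture_places' is a hypothetical table/view of the Overture places layer.
    SELECT id, names.primary AS name, geometry
    FROM overture_places
    WHERE id = '<32-character GERS ID>';   -- placeholder value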

The Future of GIS isn’t More Maps; It’s Column Joins

The ability to easily connect disparate datasets dramatically increases their value. More data connections create more perspectives and more answers to more questions.

There is tremendous potential value latent in data that could be connected to a place but isn’t.

Standardizing location with WGS 84 hasn’t proven sufficient. Every organization can benefit from geospatial intelligence. But the number of organizations capable of GIS is a tiny fraction of them. GIS is too complex for most situations.

People wanting information about places should need to know about WGS 84 and Web Mercator about as much as someone using UNIX timestamps needs to know about the resonant frequency of the cesium powering our atomic clocks. Most of the time, a map isn’t necessary. With a persistent key system like GERS, we can prepare, explore, and analyze geo data with column joins and SQL queries. We can visualize our findings with bar charts and other standard visualizations, which are easier to create and consume [3].
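
As a hedged sketch of that workflow, with hypothetical table and column names (my_sales and its gers_id column are inventions for illustration): once your records carry a GERS ID, enrichment is an ordinary join, no map projection or GIS tooling required.

    -- Enrich your own records with Overture attributes via a plain column join.
    -- 'my_sales' and 'overture_places' are hypothetical tables; the only
    -- requirement is that your data carries a GERS ID column to join on.
    SELECT
      s.store_id,
      s.revenue,
      p.names.primary      AS place_name,
      p.categories.primary AS category
    FROM my_sales s
    JOIN overture_places p
      ON p.id = s.gers_id;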

By delivering a data standard for places, not just locations, we can work towards a future where all one needs to access geospatial intelligence are column joins.

A standard for places makes the geospatial market much, much bigger. It will reduce the cost of data onboarding, lessen the experience needed to load data and make connections, and increase our ability to make connections by orders of magnitude. More data will be pinned down to places, broadening our perspective and understanding.

With Overture, we have the seeds to standardize place. And it’s getting better with every release. As an ecosystem builds around Overture data, and then GERS, the effects will be massive. Perhaps place will realize its potential as a standard, common join key.

  1. However, we can imagine emerging technology use cases, specifically spatial computing, which might require greater precision than WGS 84 allows. But these are still nascent and complex.

  2. Don’t get me started on Train Time, one of my favorite rabbit holes. But, in a nutshell: before trains, the fastest means of transport was a horse. And when it took a full day to get from Cincinnati to Cleveland, regional clocks disagreeing by ~30 minutes didn’t really matter. Once you start running trains on shared tracks, time synchronization really matters (if you’d prefer your trains don’t crash). Over decades, standardized time became established country by country. But that’s another post…

  3. Don’t get me wrong: I love maps. But the GIS community too often forgets that most people can’t easily read maps. Map reading has never been intuitive for most, and this has been made worse by turn-by-turn apps. Rather than improving our relationship with maps, they mediate it. GIS as an industry is too focused on maps as a final product. And it limits the size of our audience.

They Watched ‘Star Wars.’ It’s Why They’re in the Olympics.

Appeared in the July 29, 2024, print edition as '‘Star Wars’ Kicked Off Their Path to Olympics'.
