Two Degrees: Cities, Architecture and Our Changing Environments

Source: Two Degrees: Cities, Architecture and Our Changing Environments | Commonwealth Club

There were a few things in this podcast that struck me. The first was the summary of the thesis of Collapse: How Societies Choose to Fail or Succeed. Societies collapse because one of the following three things happens:

  • They don’t think there is a problem
  • They think there is a problem, but think it’s someone else’s to solve
  • They think there is a problem, know it’s theirs to solve, but take ineffective action

This describes how lots of things fail, not just civilizations. I’ve seen so many software projects fail on premises #1 and #2. It seems simple, but as a framing it’s pretty good at classifying where things are stuck.

Efficiency is not sufficient

A lot of the talk was about how we’re going to need to change the built world. We hear a lot about efficiency, which is good but not sufficient. When it comes to the efficiency of cities, dense infill near transit hubs ends up far surpassing any retrofitting of buildings. Building cities around the idea of decreased car miles is critical.

Will pipelines carry Hydrogen in the future?

One thing I did not realize is that a lot of our city-level infrastructure for the methane/natural gas network existed before natural gas was widely used. It used to carry coal gas, which is a mix of a lot of things, but notably about 70% hydrogen. This means that the city-level infrastructure could be reused to supply hydrogen gas in a future where we don’t want to be burning methane.

There was lots more in the episode, and I’ll have to listen to it a second time because it was so informative. Not everything fits in my brain going over it only once. You can listen to the whole episode on the Commonwealth Club site.

Maize that fixes its own nitrogen

For thousands of years, people from Sierra Mixe, a mountainous region in southern Mexico, have been cultivating an unusual variety of giant corn. They grow the crop on soils that are poor in nitrogen—an essential nutrient—and they barely use any additional fertilizer. And yet, their corn towers over conventional varieties, reaching heights of more than 16 feet.

A team of researchers led by Alan Bennett from UC Davis has shown that the secret of the corn’s success lies in its aerial roots—necklaces of finger-sized, rhubarb-red tubes that encircle the stem. These roots drip with a thick, clear, glistening mucus that’s loaded with bacteria. Thanks to these microbes, the corn can fertilize itself by pulling nitrogen directly from the surrounding air.

Source: The Indigenous Mexican Corn That Uses Air as Fertilizer – The Atlantic

Take 1: Holy crap this is cool. Corn is a huge staple grain, and requires a lot of off-farm inputs to grow because it takes a lot of nutrients out of the ground.

Take 2: This maize matures in 8 months, instead of 3 months for commercial corn. Interesting. Dr. Sarah Taber pointed out on Twitter that this is a really critical point. Nitrogen fixation takes a lot of energy, and that energy has to come from somewhere. Modern varieties of maize might have had this trait bred out of them for a reason, so they put their energy into sugar and maturation instead of into the ground. It may not be possible to keep this trait and have the maize mature any faster.

This is important, because the headlines for most articles on this make it sound like we’ve solved a hard problem in farm science and corn won’t need fertilizer in the future. That’s definitely not what the science says.

Take 3: The science behind verifying this is kind of amazing. You can’t tag nitrogen atoms to prove where they are coming from, so the team used five independent methods that each provide circumstantial evidence that the maize is actually doing this.

Take 4: The IP generated by this goes into the public trust. This is done under the Nagoya Protocol, to address the very real concerns of bio-piracy against indigenous peoples. Good on them!

Take 5: The URL of the Atlantic piece is https://www.theatlantic.com/science/archive/2018/08/amaizeballs/567140/. Yes, they really did go there.

Tell the Complicated Story

It turns out that one of the solutions to get us all to talk to each other is to stop simplifying the narratives we use:

After the conversation ends and the participants are separated, they each listen to audio of their conversations and report how they felt at each point. Over time, the researchers noticed a key difference between the terrible and non-terrible conversations: The better conversations looked like a constellation of feelings and points, rather than a tug of war. They were more complex.

But could that complexity be artificially induced? Was there a way to cultivate better conversations? To find out, the researchers started giving the participants something to read before they met — a short article on another polarizing issue. One version of the article laid out both sides of a given controversy, similar to a traditional news story — arguing the case in favor of gun rights, for example, followed by the case for gun control.

The alternate version contained all the same information — written in a different way. That article emphasized the complexity of the gun debate, rather than describing it as a binary issue. So the author explained many different points of view, with more nuance and compassion. It read less like a lawyer’s opening statement and more like an anthropologist’s field notes.

After reading the article, the two participants met to discuss Middle East peace — or another unrelated controversy. It turns out that the pre-conversation reading mattered: in the difficult conversations that followed, people who had read the more simplistic article tended to get stuck in negativity. But those who had read the more complex articles did not. They asked more questions, proposed higher quality ideas and left the lab more satisfied with their conversations. “They don’t solve the debate,” Coleman says, “but they do have a more nuanced understanding and more willingness to continue the conversation.” Complexity is contagious, it turns out, which is wonderful news for humanity.

Source: Complicating the Narratives – The Whole Story

The article calls for a new approach to journalism that makes sure to tell the complex story, not the simple one. It’s full of very specific ways of doing this, and of why telling a two-sided story isn’t the same thing as telling a complicated one.

One of the things that struck me most was how much this resembles the motivational interviewing / active listening that is part of Citizens’ Climate Lobby training. When we pull back from treating every interaction as high-stakes winner-take-all, and treat them more as mutual explorations, we make a lot more progress understanding each other.

8 months in with Geothermal Heating & Cooling

Last summer about this time we made a big decision. We were going to work with Dandelion (a new geothermal company in the area) and replace our fuel-oil-based forced-air heating and hot water system with a geothermal system. Instead of burning oil to heat our home, we’d use 1000 feet of water pipe, going up and down a new 500 foot well, to extract and compress heat from the earth.

How does Geothermal Heating work?

Once you get below 10 ft here in our corner of New York State, the ground temperature is about 50 degrees F. This is a giant renewable reservoir where you can either extract heat (in the winter) or dump heat (in the summer), and it really doesn’t budge the ground temp. A compressor in the furnace turns this 50 degree ground loop heat into 90 degree air in the winter, or 42 degree air in the summer.

The compressor is where all the energy is consumed. However, moving heat is much more efficient than creating it, so heat pumps have effective efficiencies of over 100%. Typically, ground source heat pumps will produce 4-5 units of heat for every 1 unit of electricity put in. For cooling it’s even better.
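To make that arithmetic concrete, here’s a tiny Python sketch of the 4-5 COP range quoted above. The 1,500 kWh electricity figure is a made-up example, not our actual usage:

```python
# A quick sketch of the "over 100% efficient" point above. COP (coefficient
# of performance) is units of heat delivered per unit of electricity consumed.
# Electric resistance heat has a COP of exactly 1; the ground source heat
# pump numbers are the 4-5 range from the post.

def heat_delivered_kwh(electricity_kwh, cop):
    """Heat delivered to the house for a given electricity input."""
    return electricity_kwh * cop

# Hypothetical example: 1,500 kWh of winter compressor electricity.
electricity = 1500
for label, cop in [("resistance", 1), ("heat pump, low", 4), ("heat pump, high", 5)]:
    print(f"{label}: {heat_delivered_kwh(electricity, cop):,.0f} kWh of heat")
```

Same electricity in, four to five times the heat out — that’s the whole trick.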

By the numbers

In the winter of 2016-2017 we spent about $2000 on fuel oil and a service contract for our old system. That was based on a fuel oil price of about $1.90/gallon as part of a really good group buy. It was also a relatively warm winter.

The new system went into place on Nov 22nd of 2017, early in the heating season. This heating season included a 14-day cold snap starting at Christmas where it was 20 degrees below average the whole time. Even with all of that, the electricity added by the furnace was around $650 for the winter (the WaterFurnace system we got has really detailed metrics that let me see its energy use). There is a harder-to-account-for hot water heating part of the equation, especially as we also got a Chevy Bolt EV this year. And a year of oil would have cost much more than the year before (both in use and price). But suffice it to say, we come out way ahead on operating costs no matter how you slice it.

Our June and July bills from Central Hudson are less than last year’s, even though it’s been a hotter summer and we’re also charging an EV. It looks like for the month of July we’ll end up spending about $32 in electricity for cooling. Here is a graph of all the current numbers in kWh used.
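For a rough sense of the winter savings, here’s the comparison as a small Python sketch, using only the round numbers quoted above (it ignores the hot water and EV complications mentioned earlier):

```python
# Rough seasonal operating-cost comparison, using the approximate figures
# from the post. The oil figure includes a service contract; the geothermal
# figure is the added furnace electricity reported by the system's metrics.

oil_winter_cost = 2000   # 2016-2017 winter: fuel oil + service contract ($)
geo_winter_cost = 650    # 2017-2018 winter: furnace electricity add ($)

winter_savings = oil_winter_cost - geo_winter_cost
savings_fraction = winter_savings / oil_winter_cost

print(f"Winter heating savings: ${winter_savings}")
print(f"Savings fraction: {savings_fraction:.0%}")
```

Even before accounting for the colder winter and higher oil prices, that’s roughly two-thirds off the heating bill.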

What else we love about the system

There are lots of qualitative things we love about the system as well. First off, it’s so much quieter. It has 2 stages for both heating and cooling, and stage 1 (the more efficient one) runs with a fan speed so low that unless you are in the room adjacent to the furnace it’s hard to know it’s running. This lower fan speed also does a much better job of pushing the heat out to the edges of the house, so the temperature of the whole house got much more consistent.

Getting rid of the fuel oil system means we no longer have a fuel oil tank of indeterminate age rusting away in the corner of our basement. There is no occasional whiff of oil smell. The primary risk of carbon monoxide and potential fires in the house is gone. And the 700 gallons of fuel oil we used the last year is no more, which is 3.5 tons of CO2 emissions not taking place (the CO2 from the increased electricity we used in the winter comes to about 0.7 tons).

We installed a whole-house humidifier along with it, so now we can keep the house comfortable in the winter without refilling portable humidifiers throughout the house.

And lastly, our screened in porch got so much nicer. The old AC compressor was right outside it, and loud. Now it’s in the basement and can’t be heard outside.

We love it

While I knew on paper that a ground source heat pump like this would be great, having never experienced one before I had this niggling concern all the way through the process in the fall. What would it actually be like?

It’s been amazing. At least once a week I have a moment about how great this new system is. The quiet, the comfort, the savings are all pretty amazing.

The Expanse Saved!

Given that my last post was about The Expanse getting cancelled, I’d be remiss not to write about the show getting saved. Amazon Studios has stepped up for Season 4 (… and possibly beyond?). The announcement happened at the International Space Development Conference where Jeff Bezos was getting an award.

Cas Anvar helped organize a panel on Science in the Expanse at ISDC, and on The Churn he tells an incredible tale of how that came to be, and how the whole evening went. (Spoilers: this episode also deconstructs Expanse episode 308, so if you aren’t current on the show you’re likely going to ruin one of the best reveals of the season.) Cas’s story is really amazing, and any fan is going to love it.

I’m really looking forward to Season 4, because book 4 is one of my favorites in the series.

If you haven’t been watching The Expanse yet, go check it out on Amazon Prime. Binge the first 4 episodes as a block before making up your mind on the show, because there is so much universe, and so many characters, to set up.

‘The Expanse’ vs Syfy

The current third season of The Expanse will be the space drama’s last one on Syfy. The cable network has decided not to renew the show for a fourth season, with the last episode slated to air in early July. Alcon Television Group, which fully finances and produces the critically praised series, plans to shop it to other buyers.

The Expanse is one of the most well reviewed sci-fi series on TV, with the current third season scoring 100% on Rotten Tomatoes (vs. 95% for Season 2 and 76% for Season 1).

The cancellation decision by Syfy is said to be linked to the nature of its agreement for the series, which only gives the cable network first-run linear rights in the U.S. That puts an extraordinary amount of emphasis on live, linear viewing, which is inherently challenging for sci-fi/genre series that tend to draw the lion’s share of their audiences from digital/streaming.

Source: ‘The Expanse’ Canceled By Syfy After Three Seasons, Will Be Shopped | Deadline

After Dark Matter was cancelled last year, I really wondered if the Expanse would suffer the same fate. The cancellation has more to do with the set of deals Syfy made a few years ago when it was trying to get back into science fiction.

They wanted to get back on the map, but because they had been out of it for so long they mostly made deals where they bought the television broadcast rights from a production company, but left the rest on the table. This let them get a full slate of shows more or less overnight, without having to foot the whole production cost for all of them. Fast forward 3 years: the sci-fi genre shows are doing really well, but mostly on the streaming/digital front, and Syfy gets none of that. So their return on investment on those shows is pretty low compared to the shows they funded fully. In an attempt to up their ROI, they have been dumping the shows they don’t own to make room for new ones they do, like Krypton.

It all makes sense in a spreadsheet, but it sucks for fans of the genre. I’m hopeful that Alcon will find a new home for The Expanse, because it really is one of the most amazing shows I’ve ever seen, and it massively rewards rewatching. There are still 8 more episodes of Season 3 yet to air, and it’s been one of the best seasons so far. So go buy and watch The Expanse on Amazon or iTunes to nudge Alcon and any potential broadcast partners that it’s worth their while to support it.

One Space or Two?

Among people who preferred one space, reading speed was about 5 wpm higher with one space. Among people who preferred two spaces, reading speed was about 9 wpm higher with two spaces. In other words, people preferred whatever they preferred, and the difference wasn’t very much anyway.

Source: One Space or Two? That Recent Study Won’t Tell You. – Mother Jones

This is a great breakdown of the bad science reporting in the recent “2 spaces” articles. A lot of science reporting in popular media is done without looking at the study in question in detail. The Mother Jones article breaks down why this one was really weird.

But the original article did demonstrate one thing well: we all have a tendency to broadcast information that confirms our biases, without looking deeply at whether it’s properly grounded. Which is why a poorly done study on a trivial but polarized issue generates a lot of internet traffic.

When algorithms surprise us

Machine learning algorithms are not like other computer programs. In the usual sort of programming, a human programmer tells the computer exactly what to do. In machine learning, the human programmer merely gives the algorithm the problem to be solved, and through trial-and-error the algorithm has to figure out how to solve it.

This often works really well – machine learning algorithms are widely used for facial recognition, language translation, financial modeling, image recognition, and ad delivery. If you’ve been online today, you’ve probably interacted with a machine learning algorithm.

But it doesn’t always work well. Sometimes the programmer will think the algorithm is doing really well, only to look closer and discover it’s solved an entirely different problem from the one the programmer intended. For example, I looked earlier at an image recognition algorithm that was supposed to recognize sheep but learned to recognize grass instead, and kept labeling empty green fields as containing sheep.

Source: Letting neural networks be weird • When algorithms surprise us

There are so many really interesting examples she has collected here, and they show us the power and danger of black boxes. In a lot of ways machine learning is just an extreme case of all software. People tend to write software on an optimistic path, and ship it after it looks like it’s doing what they intended. When it doesn’t, we call that a bug.

The difference with machine learning is that debugging it is far harder. You can’t just put in an extra if condition, because the logic to get an answer isn’t expressed that way; it’s expressed in 100,000 weights on a 4-layer convolutional network. That means QA is much harder, and machine learning is far more likely to surprise you with unexpected wrong answers on edge conditions.
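To make the sheep-vs-grass failure mode concrete, here’s a toy sketch (entirely made up, not from the article): a two-feature logistic regression where “amount of grass” happens to perfectly predict the “sheep” label in the training data, so gradient descent latches onto grass and confidently calls an empty green field a sheep.

```python
import numpy as np

# Toy example of a model learning the wrong feature. Feature 0 is "how much
# grass is in the image"; feature 1 is "how sheep-like a texture is present".
# In this contrived training set, grass perfectly predicts the label.

X = np.array([
    [1.0, 1.0],   # sheep standing in a field         -> sheep
    [1.0, 0.2],   # sheep mostly hidden in tall grass -> sheep
    [0.0, 0.0],   # empty indoor scene                -> no sheep
    [0.0, 0.9],   # woolly rug indoors                -> no sheep
])
y = np.array([1.0, 1.0, 0.0, 0.0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain batch gradient descent on logistic loss.
rng = np.random.default_rng(0)
w, b = rng.normal(size=2) * 0.01, 0.0
for _ in range(5000):
    p = sigmoid(X @ w + b)
    w -= X.T @ (p - y) / len(y)
    b -= np.mean(p - y)

# An empty green field, with no sheep anywhere in it:
empty_field = np.array([1.0, 0.0])
print(f"P(sheep | empty field) = {sigmoid(empty_field @ w + b):.2f}")
```

The trained weight on “grass” ends up far larger than the weight on “sheep texture”, and the empty field scores as a near-certain sheep. Nothing in the loss function ever told the model which feature we meant.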

Does the CAFE standard of 55 mpg seem high? It’s not the real number, and the real number is a lot more interesting.

If automakers complied with the rules solely by improving the fuel economy of their engines, new cars and light trucks on the road would average more than 50 miles per gallon by 2025 (the charts here break out standards for cars and light trucks separately). But automakers in the United States have some flexibility in meeting these standards. They can, for instance, get credit for using refrigerants in vehicle air-conditioning units that contribute less to global warming, or get credit for selling more electric vehicles.

Once those credits and testing procedures are factored in, analysts expected that new cars and light trucks sold in the United States would have averaged about 36 miles per gallon on the road by 2025 under the Obama-era rules, up from about 24.7 miles per gallon in 2016. Automakers like Tesla that sold electric vehicles also would have benefited from the credit system.

Source: How U.S. Fuel Economy Standards Compare With the Rest of the World’s – The New York Times

This is one of those areas where most reporting on the CAFE standard rollback has been terrible. You tell people the new CAFE standard is 55 mpg, and they look at their SUV and say, that’s impossible. With diesel off the table after the VW scandal, only the best hybrids today are in that 55 mpg range. How could that be the average?

But it’s not: it’s 55 mpg equivalent. You get credit for lots of other things, like EVs in the fleet and doing a better job on the refrigerant switchover. If the rules were kept in place, 2025 would see a real fleet average of around 36 mpg.
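One wrinkle worth knowing if you play with these numbers yourself: CAFE fleet averages are sales-weighted harmonic means of mpg (total miles over total gallons), not simple averages. Here’s a Python sketch with entirely made-up fleet numbers showing how a slice of vehicles rated at a high mpg-equivalent lifts the compliance figure:

```python
# CAFE-style fleet average: a sales-weighted harmonic mean of mpg.
# Intuition: for a fixed distance driven per vehicle, divide total miles
# by total gallons, rather than averaging the mpg ratings directly.
# All fleet numbers below are hypothetical.

def cafe_average(fleet):
    """fleet: list of (sales, mpg) pairs -> sales-weighted harmonic mean."""
    total_sales = sum(sales for sales, _ in fleet)
    total_gallons = sum(sales / mpg for sales, mpg in fleet)
    return total_sales / total_gallons

gas_fleet = [(900_000, 30), (100_000, 25)]
print(f"gas-only fleet: {cafe_average(gas_fleet):.1f} mpg")

# Add a slice of EVs credited at a high mpg-equivalent rating:
with_evs = gas_fleet + [(50_000, 120)]
print(f"with EV credits: {cafe_average(with_evs):.1f} mpg-e")
```

The harmonic mean is why a handful of very high mpg-e vehicles nudges the compliance number up without the rest of the fleet changing at all, and why the 55 mpg headline figure and the ~36 mpg on-road figure can both be true.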

More importantly, rolling back this standard is going to make US car companies less competitive. The rest of the world is moving in this direction, and US companies that don’t hit these marks will have a shrinking global market.

The future of scientific papers

The more sophisticated science becomes, the harder it is to communicate results. Papers today are longer than ever and full of jargon and symbols. They depend on chains of computer programs that generate data, and clean up data, and plot data, and run statistical models on data. These programs tend to be both so sloppily written and so central to the results that it’s contributed to a replication crisis, or put another way, a failure of the paper to perform its most basic task: to report what you’ve actually discovered, clearly enough that someone else can discover it for themselves.

Perhaps the paper itself is to blame. Scientific methods evolve now at the speed of software; the skill most in demand among physicists, biologists, chemists, geologists, even anthropologists and research psychologists, is facility with programming languages and “data science” packages. And yet the basic means of communicating scientific results hasn’t changed for 400 years. Papers may be posted online, but they’re still text and pictures on a page.

Source: The Scientific Paper Is Obsolete. Here’s What’s Next. – The Atlantic

The scientific paper is definitely being strained in its ability to vet ideas. The article gives a nice narrative through the invention of Mathematica and then Jupyter as the path forward. The digital notebook is an incredibly useful way to share data analysis, as long as the data sets are made easily available. The DAT project has some thoughts on making that easier.

The one gripe I’ve got with the article is that it could be clearer that Mathematica was never going to be the future here. Wolfram has tons of great ideas, and Mathematica is really great stuff; I loved using it in college 20 years ago on SGI Irix systems. But one of the critical parts of science is sharing and longevity, and doing that on top of a proprietary software platform is not a foundation for building the next 400 years of science. A driving force behind Jupyter is that, being open source all the way down, it’s reasonably future-proof.
