A guilty data mining pleasure

This may not be highbrow, but it sure is interesting to grab smart meter readings (from one’s utility or, in our case, https://www.smartmetertexas.com/) and cross-reference them with weather data from the nearest NOAA station. The near-perfect correlation between daily usage and maximum temperature won’t surprise anyone, but the weird outlier days make me wonder what additional factors I should take into account...

Also, I need to figure out what time to schedule the EV charger to best balance out our power usage (at a glance, any time outside of 4pm-midnight).

EVs & bringing back the good old days of drivability

On a bit of a lark I test drove a BMW i3 electric vehicle yesterday and it gave me a strong sense of déjà vu: in the early ’90s, my daily driver BMW E21 318i had been totalled thanks to an inattentive driver in oncoming traffic and I needed a replacement quickly (and cheaply). I ended up in a six-cylinder E21 320 (no “i” — this was the downdraft carburetor 2 liter six, which was never sold in the US) for a while. It was old and slower than the 318i but still ran fine. The amazing thing, though, was its drivability at low speeds. You could crawl in a parking lot at basically engine idle speeds in first gear (no automatic gearboxes or torque converters around these parts!), control speed finely with small throttle inputs, and it was always responsive — never jerky like in a fuel-injected car. It was quite a delight in the city. Every car I’ve owned since has been fuel-injected and, progressing from mechanical to electronic fuel injection, less and less drivable at low speeds.

Driving the i3 felt surprisingly like that old 320: the ability to control speed with precision and a really linear response. It didn’t feel OMG fast on the open road (again, like the 320)... just that I still managed to reach the speed limit way faster than I expected. I don’t know if or when EVs are going to gain mindshare with ordinary car-loving people in the US beyond the brand phenomenon of Tesla, but there are some definite benefits beyond plain economy that we’re collectively ignoring by just looking at specification sheets.

Betting on the Network

Just like last year, the "holiday" new computer season is upon us. This year's new and interesting machines are the Microsoft Surface Studio and Apple's new MacBook Pros -- I'm not going to venture a guess as to who picked the right approach to touch input (both, for all I know), but it feels like they continue to bet that form factor and efficiency are more important than outright power for these machines. They are targeted at serious users, much like the iPad Pro was last year, but with performance that is not in line with the fastest hardware you can buy. The biggest, baddest, fastest hardware is still available for those that really need it (e.g., gamers), but it looks like the more broadly targeted machines favor user experience over outright power.

I do wonder if what we are seeing is a bet by Microsoft and Apple on the network: we haven't seen it materialize in the "Pro" software we use today, but perhaps the next step is to move the heavy processing to the cloud -- just like software makers have learned to do for mobile devices. It's not a reality today, but given an always-available network connection and a near-infinity of compute power on the back end, how much horsepower do I need sitting in my lap or on my desk? Now that Microsoft is a big cloud player, they could see this as a win-win.

If this turns out to be their intent, I guess that we should start seeing hints of how they intend to offload computation from the client sooner rather than later... Something more granular and sophisticated than treating the client as the extent of the computing universe, or as a dumb terminal at the other end of the scale (à la Google).

Roger Johansson on Actor System Location Transparency

Roger Johansson -- one of the folks behind the Akka.NET actor framework -- posted today about how he'd learned that the location transparency of distributed-first systems like Akka and Akka.NET is a bit of a double-edged sword. That is so true. However, having some experience implementing a system that doesn't attempt to make local and remote message passing look identical, I have to say that it is as much (lack of) programmer education as convenience that harms us.

There is a fundamental jump between local and remote message passing that, while not as obvious as -- say -- going from object-oriented programming in Java or C# to writing (untyped) actors, should not be underestimated. On top of that, distributed systems have a tendency to violate the assumption of homogeneity that programmers are so fond of making. As Johansson states, this may come in the form of interactions with completely foreign systems, but it also comes from dealing with the products of other teams and even versions of one's own software.

Even when the APIs make the semantic and behavioral differences between local and remote communications clear, we -- as programmers -- need to give them some thought before it makes much of a difference. In that respect, the location transparency in systems like Akka is wonderful within the parameters it was intended for: scaling up to a distributed, homogeneous system.
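To make the "fundamental jump" concrete, here is a toy illustration (not Akka or Akka.NET code; all the class and function names are invented): a remote send has to serialize its message and can fail or time out, while an in-process send is essentially just a queue push. An API that refuses to hide that difference forces the caller to confront it:

```python
# Toy sketch of local vs. remote message passing. Not a real actor
# framework -- just enough to show where the semantics diverge.
import json
import queue


class LocalRef:
    """In-process mailbox: delivery is immediate and cannot fail."""

    def __init__(self):
        self.mailbox = queue.Queue()

    def tell(self, msg):
        self.mailbox.put(msg)  # any Python object, no serialization


class RemoteRef:
    """Stand-in for a network hop: the message must serialize, and the
    caller has to decide what a failed or slow delivery means."""

    def __init__(self, transport):
        self.transport = transport  # hypothetical send function

    def tell(self, msg, timeout=1.0):
        payload = json.dumps(msg)  # raises TypeError if not serializable
        ok = self.transport(payload, timeout)
        if not ok:
            raise ConnectionError("delivery not acknowledged")


local = LocalRef()
local.tell({"cmd": "ping", "reply_to": object()})  # fine in-process...

remote = RemoteRef(lambda payload, timeout: True)  # fake transport
try:
    remote.tell({"cmd": "ping", "reply_to": object()})  # ...not remotely
except TypeError:
    print("remote send rejected: message not serializable")
```

A location-transparent API papers over exactly these seams -- serialization, acknowledgment, timeouts -- which is convenient right up until the network makes them visible again.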


I've moved blogging platforms. As it turns out, automatically moving content from Typepad to just about anything else is really hard, so I'll be manually bringing over some of the blog's history as time permits.

Sorry for the ignominious end to 10+ years of posts.

UPDATE: Who am I kidding? I'm not going to find the time to move over selected posts from the old blog. They can be found on Typepad at: http://cwirving.typepad.com/

These are my opinions only. They are not those of my employer.