Heartbleed = overcomplexity + input validation failure

The Heartbleed vulnerability exists because OpenSSL's code didn't validate an input. It also exists because OpenSSL had unnecessary complexity.

OpenSSL implements a TLS heartbeat feature that allows a client to send up to 64 kilobytes of arbitrary data along with a field telling the server how much data was sent. The server then sends that same data back to confirm that the TLS/SSL connection is still alive. (Creating a new TLS/SSL connection can take significant effort.)

The problem: if the client claims to have sent more data than it actually did, the server sends back the original data plus some of its RAM. For example, suppose the client sent a 1 KB message but said it was 64 KB. In response, the server would send a 64 KB message back: the original 1 KB message plus 63 KB of data from the server's RAM, which could include sensitive, unencrypted data such as private keys and other users' session data.

How this could have been prevented:

  1. Avoid pointless complexity: don't require the client to also send the length of the arbitrary data. The server can determine that length itself.
  2. Validate all input. The server failed to ensure that the client’s description of the text length matched its actual length. (The fact that the server could detect the message’s actual length further validates my view on #1.)
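The fix boils down to a single check. Here's a toy Python model of the heartbeat exchange, my own simplification rather than OpenSSL's actual C code: a made-up wire format with a 2-byte claimed length, a buggy responder that trusts it, and a fixed responder that validates it.

```python
def heartbeat_response_buggy(request, server_memory):
    """Heartbleed-style bug: trust the client's claimed payload length."""
    claimed_len = int.from_bytes(request[:2], "big")  # 2-byte big-endian length
    payload = request[2:]
    # Model the payload sitting next to other process memory. Echoing
    # claimed_len bytes leaks that memory when claimed_len > len(payload).
    buffer = payload + server_memory
    return buffer[:claimed_len]

def heartbeat_response(request, server_memory):
    """The fix: validate the claimed length against what actually arrived."""
    claimed_len = int.from_bytes(request[:2], "big")
    payload = request[2:]
    if claimed_len != len(payload):
        # Per RFC 6520's post-Heartbleed guidance: discard malformed messages.
        raise ValueError("claimed length does not match actual payload length")
    return payload
```

With a 4-byte payload but a claimed length of 1024, the buggy responder happily returns adjacent "memory"; the fixed one refuses.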

Keep it simple! In addition to driving up creation and maintenance costs, needless complexity means more opportunities for things to break.

TxDOT Dallas District to keep speed trap speed limits?

The Dallas County Sheriff’s Office recently made hay over massive noncompliance with freeway speed limits. Drivers aren’t the problem. The problem is almost all our freeway speed limits are speed traps: they are way too low.

TxDOT recently proposed 5 mph speed limit increases on a few outlying highways. This isn't enough.

TxDOT did a lot of speed studies in 2012. If TxDOT followed its own speed zoning procedure, then these studies require a 10 mph increase on almost all Dallas-area freeways, even inside the loop. All those 60 mph limits? Almost all should be 70 mph, some 75 mph!

Why be concerned about only 5 or 10 mph? It makes a huge difference in the percent of drivers who are criminalized. Take TxDOT’s study of I-30 at Hampton Rd. The speed limit is 60, but it should be 70. Look at the difference only 10 mph makes:

Speed limit    Percentage of drivers criminalized
60 (current)   83%

(Aside: It’s a myth that raised speed limits mean everyone goes that much faster. Raised speed limits just mean fewer safe drivers play a reverse lottery.)
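To see why a mere 10 mph matters so much, here's a toy calculation with invented speeds (not the I-30 study data; the function name is mine):

```python
def percent_over_limit(speeds_mph, limit_mph):
    """Percentage of observed drivers exceeding a given speed limit."""
    over = sum(1 for s in speeds_mph if s > limit_mph)
    return 100 * over / len(speeds_mph)

# Made-up free-flow speeds for ten drivers, for illustration only.
speeds = [58, 61, 63, 64, 65, 66, 67, 68, 69, 72]
```

With these invented numbers, a 60 mph limit criminalizes 90% of drivers while a 70 mph limit criminalizes only 10%. The real study's 83% figure behaves the same way: the distribution barely moves, but the line drawn through it does.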

Here are the current speed limits at all speed check locations. Red is 60 mph, orange is 65 mph.
Current speed limits in the Dallas area

Here’s what the speed limit should be, if TxDOT follows its own rules. Orange is 65 mph, green is 70 mph, and blue is 75 mph.
new speed limits
(I think the 65 mph speeds on I-30 are anomalous and should be rechecked.)

Here’s some background:

Environmental speed limits (ESLs)

In the early 2000s, mendacious bureaucrats in the North Central Texas Council of Governments inflicted environmental speed limits on the Dallas/Ft. Worth area as part of a Clean Air Act compliance plan. Already-too-low speed limits were made even lower: all area 70 and 65 mph limits within roughly 50 miles of downtown Dallas and Fort Worth were reduced by 5 mph.

Environmental speed limit map
Map of environmental speed limits (ESLs) for Dallas-Fort Worth area

I wrote “mendacious” because even before ESLs were imposed, these bureaucrats knew that they were ineffective. Here’s why:

In the late 1990s, based on modeling done in EPA's MOBILE5 software, it was believed that capping all area speed limits at 55 mph would get the area 1.5% closer to needed emissions reductions. Yes, that's right: a whopping, practically unenforceable 10-15 mph speed limit reduction buys us just 1.5%. That was hugely unpopular in Houston and was replaced with a 5 mph reduction scheme, the same scheme that DFW got. Assuming a linear relationship, this 5 mph reduction scheme may be worth more like 0.5% of the emissions goal. Further, they found that almost all the emissions benefit came from heavy trucks, not cars.

It gets better. Before the ESLs even went into effect, they reran the models using newer software, EPA MOBILE6. Lo and behold, the newer software found that, at best, the emissions reduction was so small that it was a rounding error! Even newer software, EPA MOVES, now finds that there is probably no emissions benefit to lowered speed limits. Despite all this, the EPA-approved smog reduction plan had ESLs baked into it, and NCTCOG bureaucrats lacked the spines to do anything about it until over a decade later! Only in 2013 did we start to get public reports of ESLs going away.

To be clear: if ESLs go away, TxDOT can freely raise speed limits on ESL roads. It doesn't need to revert to the limit in effect before the ESLs. When the ESLs took effect, Texas law did not permit speed limits over 70 mph; since then, the law has been changed to allow limits up to 85 mph. Even before ESLs, many 65 mph limits were themselves posted too low.

TxDOT practices

Why is TxDOT proposing only a 5 mph speed limit increase? I don’t know, and it contradicts TxDOT’s own rules.

TxDOT's Procedures for Establishing Speed Zones require that the speed limit be the 85th percentile speed rounded to the nearest 5 mph increment (reference). If the 85th percentile speed is, say, 73 mph, the speed limit must be 75 mph. You may ask, "What is the 85th percentile speed?" It's the upper end of the "flow of traffic": 85% of drivers travel at or below the 85th percentile speed.
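Here's a rough sketch of that rule in Python. This is my own illustration: the nearest-rank percentile method and the function name are assumptions, and TxDOT's manual governs the real procedure.

```python
import math

def posted_limit_from_speeds(speeds_mph):
    """Sketch of the 85th-percentile rule: take the 85th percentile of
    observed free-flow speeds and round to the nearest 5 mph increment.
    Illustrative only, not TxDOT's official method."""
    ordered = sorted(speeds_mph)
    # Nearest-rank 85th percentile: the smallest observation at or below
    # which 85% of observations fall.
    rank = math.ceil(0.85 * len(ordered)) - 1
    p85 = ordered[rank]
    # Round to the nearest 5 mph increment (73 -> 75, 68 -> 70).
    return int(5 * round(p85 / 5))
```

So a study measuring an 85th percentile speed of 73 mph yields a 75 mph limit, exactly as in the example above.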

TxDOT's manual allows some deviation from this, such as for higher-than-average crash rates or roadway design factors. However, by definition, most roads do not have higher-than-average crash rates, and freeways are generally built to the highest design standards, so these factors cannot justify TxDOT's failure to use the 85th percentile speeds.

So what gives? I don't know. This is only a guess, but I think TxDOT's Dallas District may no longer care about serving the public, preferring to maintain speed trap speed limits. I say this for two reasons.

First, the existence of ESLs doesn't explain all our low speed limits. ESLs never applied to any roads within I-635 or I-20! ESLs applied only to roads that used to have 65 or 70 mph limits.

That means TxDOT has always been free to raise speed limits on non-ESL roads. I am sure that, for years, there has been a good case to raise almost every inner-loop freeway speed limit, none of which was bound by ESLs, by at least 10 mph. Why has TxDOT not bothered to do it, and why does it still drag its feet?

Second, TxDOT has previously shown active disregard for the motoring public. For example, in the late ’90s, Dallas District staff tried to impose arbitrary speed limits on Farm to Market roads that were 10-15 mph too low. While TxDOT’s Austin office rebuked them, the result was still a uniform speed limit that remained 5-10 mph too low.

I want TxDOT to stop using arbitrary, speed trap speed limits. Making all of us play a reverse lottery doesn’t make roads safer, doesn’t clean the air, and benefits nobody. If TxDOT wants to serve the motoring public and is interested in following its own policy, it will raise almost every Dallas-area freeway speed limit to 70 or 75 mph.

Want to see the actual speed studies? They’re all here: http://arencambre.com/txdotSpeedStudies/

Google is not linking to HTTPS versions of everyone’s sites

On the University Web Developers (UWEBD) listserv today, a conversation took off about Google linking to the HTTPS version of Florida Gulf Coast University's web site. This was a problem because FGCU's HTTPS channel was broken.

I was surprised at the misconceptions circulating in a technically astute email group. Here's my statement:

Two inaccurate things have been said about Google.

Inaccurate statement 1: Google is securing others’ sites. Dangerous misconception! Google cannot “secure” your site. If Google’s link to you uses HTTPS, that does not “secure” your site. It just means Google is linking to your site’s secure channel. “Securing” a site includes transport security (HTTPS channel) among many other things. Most importantly, YOU, the site owner, do the “securing”, not Google.

Inaccurate statement 2: Google is en masse sending users to HTTPS channels on web sites. Nope. For example, Southern Methodist University has had both HTTP and HTTPS channels for www.smu.edu for over a decade. Google links to the HTTP version.

Since late last year, Google has encrypted traffic between the user and its search site. If you visit http://google.com, Google redirects you to https://google.com. That has no bearing on whether Google's search results link to HTTPS or HTTP channels. However, it may limit site owners' view of search keywords (reference); that isn't related to the inaccurate statement.

You can still get unsecured Google search using http://www.google.com/webhp?nord=1 (note the nord=1 parameter), but only if you're not signed in. A search for Florida Gulf Coast University on the unsecured version still links to the HTTPS channel.

There are many possible reasons why Google is linking to FGCU's secure channel, but it's almost certainly not because of Google's own change.

Sitecore blogging song

This is a song about Sitecore blogging woes, with the answers to the woes at http://mikael.com/2013/11/sitecore-mvp-summit-team-7/. Sung to the tune of the Scout camp song “I met a bear”.

I have a blog
About Sitecore
I can’t find time
To write the blog

I like to write
About Sitecore
I don’t know what
To write about

I have a blog
About Sitecore
Nobody wants
To look at it

I like my blog
About Sitecore
MVPs want
To talk to me

I’ve got a blog
About Sitecore
It wastes my time
I need it short

Getting high quality graphics out of Quantum GIS

It’s hard to get print-quality graphics out of Quantum GIS (QGIS). There’s a kludgy command line method, but it doesn’t always work right (see that page’s comments). It’s stupid that you have to take a GUI-based program to the command line to get good graphics!

I asked for something straightforward 2 years ago (link), but it hasn’t gotten much traction. In the meantime, you can use the Print Composer as a workaround. Here’s how:

  1. Orient your QGIS viewport to fully include the part you want to export. It's OK if it shows a little more than what you want to export. For example, I only want the gridded part of this view:
    qgis broad view
  2. File > New Print Composer.
  3. In the button bar, click Add new map (Add new map button).
  4. With the mouse pointer, draw a large rectangle on the canvas. It'll show the view you established in step 1 and more:
    Print composer - initial view
  5. Notice how the image is offset from center. Center this graphic in its box:
    1. Click the Move item content button (Move item content button).
    2. Figure out what you want to be in the center of the exported image. With your mouse inside the rectangle you drew, drag the image until the part you want to export is centered in the rectangle. In my case, I just moved it up a bit:
      Centered viewport
  6. Since I am really only shooting for the grid in my final output, I need to zoom in. On the bottom right side, click the Item Properties tab. You should see a field named Scale. Gradually reduce the value in Scale, about 10% at a time, until what you want almost fills the frame; press Enter after each change to see its effect. After reducing Scale by about a third (a smaller Scale value zooms in the view), I finally have it looking as I want:
    Centered and zoomed in
    (I’ve filed issues to add a tool to zoom in and out of items and to better explain the scale field.)
  7. Click the Export as Image button (Export as Image button) and save the PNG using the dialog.

Voilà, you have a high-quality image! You may still need to crop it to get it just right.

If you need a higher resolution, click the Composition tab and change the DPI value to the right of Print as raster. (The DPI value is probably unrelated to Print as raster; this confusing layout has been filed as a bug (http://hub.qgis.org/issues/7973).) Export again.