Sad News, Good News

I wanted to post here something that I never, ever would want to post.  Last Tuesday (when internet connectivity was only on my iPhone and my emotions were too intense) Jessica and I drove to the hospital in Ukiah, California, and after spending the morning and part of the afternoon in the ER, learned that Jessica had had a miscarriage.  This is not the sort of thing you expect.  This is nearly the worst-case scenario.  We cried a lot there in the hospital.  I wrote notes on my iPhone about how I was feeling, but I won’t post them, as they’re far too intense for me to publish without feeling like it’s too much.  They’re also feelings that I don’t have anymore, because we’re OK.  We’re trusting the Lord that He’s used this to get our attention.  Prior to the unexpected fourth pregnancy (our first pregnancy was also a miscarriage), we had planned on no more pregnancies.  Except that now our hearts are set on having a third child that we can hold, love, and prepare for a life of intensity.

My brother, sister, and I were all born in Ukiah, CA.  It was strange and backwards to go to that same small town and discover that this expected Peterman life would not be seeing Ukiah.  After the doctor’s gentle disclosure of the diagnosis, “Fetal Demise”, we went to eat (having missed breakfast and our normal lunch) at a place where I recall eating with glee as a boy, the Mutt Hut.  Something about the place, and the honestly tasty hot dogs, brought a sense of comfort that sounds stupid as I write this.  I was with my wife, whom I love, looking forward to seeing my two healthy girls, whom we both love dearly, and eating food (which we really needed).  Ukiah has a movie theater that I remember from when I was a child.  I went there once as a teenager, too.  Ukiah now has another memory in my heart and mind: the place where Jessica and I decided we would try for a third child – a place that holds some endings, but also an important beginning.  The beginning of the plan for three Peterman kids for Randy and Jessica.

We’re doing OK.  We’re doing well.  We’re doing this on purpose.  And we’re looking forward to seeing, in heaven, this little child we didn’t get to meet on this earth.

A Few Links Around the Web

I’m a huge fan of learning new things (and I like to think that you are, too):

My buddy Dave O’Hara sent me this link: 7 Rules of Unobtrusive JavaScript.  It’s a good overview of why you should be coding unobtrusively (which I do in as many cases as possible), and it also explains some good things about namespacing and object access.
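The gist of coding unobtrusively, as a rough sketch (the RANDYP namespace and the element ids below are made up for illustration, not taken from the article): keep behavior out of the markup, hang everything off one namespace object, and fail silently when the hooks aren’t there.

    // One namespace object instead of a pile of globals.
    var RANDYP = RANDYP || {};

    RANDYP.nav = {
      init: function () {
        // Behavior is attached from script, not from onclick="" attributes in the HTML.
        var link = document.getElementById('menu-toggle'); // hypothetical element id
        if (!link) { return; } // no hook in the markup? do nothing and leave the plain HTML working
        link.onclick = function () {
          RANDYP.nav.toggle();
          return false; // keep the link from navigating
        };
      },
      toggle: function () {
        var menu = document.getElementById('menu'); // hypothetical element id
        if (menu) {
          menu.style.display = (menu.style.display === 'none') ? '' : 'none';
        }
      }
    };

    // Wire it all up after the page loads; the markup itself never mentions JavaScript.
    window.onload = RANDYP.nav.init;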

I’m trying to learn about TDD in JavaScript, which has been a bit harder than I expected: we need more of this in the JavaScript community!  Here’s a bit over at the Ajaxian on doing TDD with JsMock.
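Just to give a feel for the write-a-failing-test-first rhythm, here’s a hand-rolled sketch in plain JavaScript.  To be clear, this is not JsMock’s API (the Ajaxian piece covers that), and the slugify() helper is a made-up example:

    // Step one: a tiny assertion helper and a test that describes the behavior we want.
    function assertEqual(expected, actual, message) {
      if (expected !== actual) {
        throw new Error(message + ': expected "' + expected + '" but got "' + actual + '"');
      }
    }

    function testSlugify() {
      assertEqual('html-5-is-a-mess', slugify('HTML 5 Is A Mess'), 'slugify lower-cases and hyphenates');
    }

    // Step two: write just enough implementation to make the test pass.
    function slugify(title) {
      return title.toLowerCase().replace(/\s+/g, '-');
    }

    // Step three: run it and look for green (or, in the browser, an alert).
    try {
      testSlugify();
      alert('PASS');
    } catch (e) {
      alert('FAIL - ' + e.message);
    }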

Assumptions I’ve Seen in the HTML 5 Debate

It wasn’t until the recent flurry from the godfather of web standards, Jeffrey Zeldman, and his posts about HTML 5 (see: In Defense of Web Developers, HTML 5 Nav Ambiguity, HTML 5 Is A Mess, and so forth) that I began looking into HTML 5.  I’m busy being pragmatic with my code today: making tough choices about browser support and figuring out how I can make HTML 4.01 work consistently in the browsers that I do support.  But with the promise of change coming, I need to be on the hunt for details rather than waiting for the browsers to fully implement the spec.  If we wait for browsers to fully implement specs, I could very well be out of the web development industry before they do.  I wish that last line were a funny joke, but sadly 100% implementation is not likely in the next few years because the spec isn’t complete.  The reason the spec isn’t complete is that people on both sides of a bunch of arguments have been making assumptions.  Let’s take a look at those assumptions, shall we?

Assumption 1: The Needs of the Web Are or Are Not Going To Be The Same in ‘N’ Years

I love this assumption in either direction you take it.  It’s awesome if the web stays the same, because then we can clean up our markup so that WordPress 4.0 (or your blogging platform of choice) can have 16 elements repeated over and over.  We won’t have table-nesting issues, we won’t have DIV-itis, we’ll have semantically pure documents.  Or not.  Because as long as we’re using the Sliding Doors technique or any number of other hacks to get markup to map to CSS (note: the spec will change for this technology, too), we’ll be polluting the DOM.  And if the web will change: how will it change?  Assuming that we can create “The Perfect” markup language in HTML 5 is naive at best and possibly stupid at worst.  What did we need in HTML, JavaScript and CSS before the iPhone, Tablet PCs and the user interface in “Minority Report”?  Assuming that the human interface to data on the web and elsewhere will remain stagnant is horribly flawed.  If we ever get HTML 5 out the door, we’ll get HTML 6 out the door some decades after that (or HTML will die and we’ll move to some other markup format).  What if we’re closer to Matrix-like data input than we think, and you stop taking in the web through what you consider a browser?

Assumption 2: We Need More or Less Markup Elements

There are great arguments for newer or different markup elements.  There are great arguments for using the old ones and just styling them with CSS.  There are great arguments, but many of them rest, at some core level, on more assumptions.  Arguments for newer elements are valid for present web content if you are looking for semantic markup.  If you take the negative view (and assumption) that there will be no really powerful algorithm in your lifetime that can really, truly process semantic markup, then the argument is void and you move on.  The assumption and expectation trumps the ideal nature of semantic markup.  You will make no headway here.  If you assume that semantic markup will lead to better programs to parse the data, then what you’re really looking for is XML plus some sort of namespace and doctype information that will help computers parse the data beyond what the browser is doing.  Microformats help in this area, but are not complete enough to make all document data fully parsable.  Also, if you’re talking about web standards, I should note that data synchronization standards are diverse and incomplete in most implementations as well.

We may need less markup if we can style the div and span elements however we want.  Or so you may think.  That sort of thinking is based on the assumption that all markup is basically the same with different styling.  It’s an oversimplification for sure, but I’ve found that oversimplification makes life easier, and so I and others are often caught doing it here, there, or in other places we don’t talk about in polite company.  It’s the Internet, so we do it publicly, but on sites that anonymize our usage, of course.  Hacks make Internet Explorer bugs more bearable, Firefox bugs less annoying, Opera work like every other browser, and Safari and Chrome behave like Internet Explorer (or not).

The Rise of the Pragmatist

I like to fancy myself a pragmatist.  It doesn’t buy me anything, but I can pretend that I’m practical, which is nice when someone asks you to do something and you don’t have an unlimited amount of time or money.  I don’t have time to wait for all of HTML 5 to be implemented.  I don’t have time for the web to catch up with desktop software of the ’80s and ’90s that had option/select/text-entry controls which let the user type in any text but also choose from some pre-populated options.  I don’t have time for the web to allow my DIV-itis to look much prettier when presented in a hierarchical, tree-like structure.  I need markup that works now.  HTML 5 will be a step in the right direction, but it will be slowly implemented in ways we don’t know yet, and we’ll see what happens.  If the Canvas element replaces Adobe Flash and SVG replaces VML, I’ll cheer.  If I don’t ever get to use either of those technologies because their implementers never get off their collective backsides within the walls of the various browser vendors, then I’ll be practical about what I do.  I’m going to make several assumptions about the future – as the above statements reveal.  I believe that the web will change through JavaScript and CSS libraries and hacks.  I believe that HTML standards will come, and I think that, just like HTML 3, HTML 4 will one day be old.  I’m going to assume that practical software development is all about satisfying the demands of your boss, your client, or yourself with what you can do now.

But you wouldn’t be a pragmatist if you didn’t also believe that you could practically bring about resolution and change for HTML 5, HTML 6 or HTML 7.  Practical change.  Good change.  The things you discover you needed from HTML 5 that its purveyors could not foresee.  Keep pushing for better web standards, but don’t keep fighting for better web standards.  The fighting causes delay, the fighting isn’t pragmatic, and the fighting doesn’t clean up the web, it worsens it.  Data wants to be free, do your part to continue to free it.  I’m going to start by reading through the proposed HTML 5 specs to see what I’ll be up against in Internet Explorer 9.5 – whenever that comes out.

Pre-Road Trip Jitters

We’re leaving on what can only be described as a crazy road trip.  We’ll be hitting 8 states: Colorado (we live on the eastern side of the state and have to drive through it going west, over the Rockies), Utah (all the way across), Nevada (all the way across), California (all the way across), Oregon (all the way up), Washington (the southern half), Idaho (all the way across the lower half), back through Utah and into Wyoming on I-80, and then south into Denver.  In two and a half weeks.

I’m anxious, but mostly because I hate getting up at 2:00 AM – and then driving for 20 hours.  But we’re glad to see everyone we’ll see.  Maybe we’ll see you.

Tracking… Tracking… Broken… Fixed!

My trackball, which I’ve been using instead of a mouse for several years now thanks to a push from my friend Dave O’Hara and my chiropractor bill, started to go on the fritz this week.  It wouldn’t click correctly, and sometimes it would register a double click (a few times causing me real frustration).  I ordered a replacement from Amazon.  This afternoon it seemed toast: it wouldn’t click, it would just make clicking sounds but fail to actually send the correct signal.

In desperation, since the replacement hadn’t arrived, I cracked it open and blew it out with a can of compressed air.  I replaced the batteries, and now this thing is humming along and working like new.  Except I still have a new one on order.  Anyone looking for a trackball? 🙂

You Are Not Reading This Here

[as per my wife’s instructions this is not an announcement]

I am so excited I can’t tell you how excited I am.  But I CAN tell you that we’re having another baby (gender unknown), and it’s pretty darn awesome.  I just had to let you know.

Names that will cheese off people who think we should name the baby other names, or who think it should be kept a secret:

Girl: Charlotte Rose Peterman – nickname: Charlie, though Unkle Kurt has said he’ll call her Lotty.

Boy: Eric Matthew Peterman – nickname: Eric, though Unkle Kurt has said he’ll call him Eric.

We’re excited and hope you’ll celebrate with us!

Go Native: JSON vs. eval is evil

In the world of browser performance you can find yourself looking for the little things to make big differences, or even a lot of little things to make a bigger difference together.  I’ve been researching one particular change that is coming down the pike: native JSON handling.  John Resig wrote about the need for native JSON support in the browser in 2007, and it’s finally here.  The difference it makes between Firefox 3.0 and 3.5 is major, the difference between Internet Explorer 7 and 8 is important, and the safety that native support brings for preventing cross-site scripting (XSS) is critical.

I’ve created two tests that you can try for yourself: the eval test and the JSON test.  The tests loop 20 times to give you a broader test range and then report the average time.  There are notes in the test pages to clarify a few observations, but I’ll put them here just for the sake of a single source.  The test pulls in 1600 JSON objects and either evaluates them using the JavaScript eval function (eval(/*JSON String*/);) or parses them with the native JSON parser (JSON.parse(/*JSON String*/)).  For consistency’s sake I used the data from John Resig’s test, which I have copied onto my server to reduce the load on his server and not steal bandwidth.  My tests were run locally to keep network latency from influencing the results, but you can see that over the Internet, even on a broadband connection, the performance only gets worse.
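The shape of each test pass is roughly this (a sketch, not the exact code from the test pages; the tiny two-object jsonString below is just a stand-in for the 1600-object payload):

    // In the real tests jsonString holds the serialized array of 1600 objects.
    var jsonString = '[{"id": 1, "name": "example"}, {"id": 2, "name": "example"}]';
    var passes = 20;
    var evalTotal = 0, parseTotal = 0;

    for (var i = 0; i < passes; i++) {
      var start = new Date().getTime();
      var viaEval = eval('(' + jsonString + ')'); // the old way: run the string as JavaScript
      evalTotal += new Date().getTime() - start;

      start = new Date().getTime();
      var viaParse = JSON.parse(jsonString);      // the new way: native, and it never executes code
      parseTotal += new Date().getTime() - start;
    }

    alert('eval average: ' + (evalTotal / passes) + 'ms, JSON.parse average: ' + (parseTotal / passes) + 'ms');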

Firefox 3.5 has JavaScript tracing enabled, and the typical test results will show a much slower first pass with subsequent passes being much, much faster.  You should never assume that the user will be getting the exact same data back the way this test does, so the slower first-pass performance is the one to expect.

Internet Explorer 8’s eval test appears to be almost as fast as its JSON test; its eval execution is pretty fast already.  However, the JSON.parse() code appears to be much safer to use and is thus preferable.

The final results are based on the averages (which are much more consistent than comparing the ‘best’ numbers): eval is roughly 500 milliseconds (or 500%) slower in Firefox 3.5 on the first pass, and nearly the same speed in Internet Explorer 8, with an average of 10 seconds slower across 20 passes.  So for performance that is either identical or much faster, plus greater safety against XSS, it’s a no-brainer to make native JSON support the preferred way of dealing with JSON data instead of eval.
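In practice that means something like this sketch (parseJson() is a hypothetical helper name; older browsers without native JSON should really get Douglas Crockford’s json2.js shim rather than the raw eval fallback shown here):

    function parseJson(jsonString) {
      if (window.JSON && typeof JSON.parse === 'function') {
        // Native parser: faster in Firefox 3.5 / IE8, and it rejects anything that isn't valid JSON.
        return JSON.parse(jsonString);
      }
      // Last-resort fallback for older browsers: eval will happily execute any script in the string,
      // so only do this with data you already trust (or include json2.js and skip this branch).
      return eval('(' + jsonString + ')');
    }

    // Usage, e.g. with an XMLHttpRequest response body:
    // var data = parseJson(xhr.responseText);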

I do want to note that I was impressed by Internet Explorer 8’s eval speed (it was much faster than I had expected) and generally disappointed with Firefox’s, but since eval is an evil function to use anyway, that’s not all bad.

King Corn: A Movie You Should Watch

Uncle Ben, in the first film of the 2000s Spider-Man franchise, tells a young Peter Parker, “With great power comes great responsibility.”  Watching King Corn (website) tonight over Netflix’s instant-watching service was sobering.  It isn’t the most entertaining movie you will ever watch.  Comparing it to Spider-Man might be cruel, because one is for fantasy and fun and the other is for education, presenting reality in film form.  Despite my saying that it isn’t ‘fun’ to watch, you should watch it, because the contents of the film are disturbing.

If you think that government spending is out of hand: watch this film.

If you think that Americans are nutritionally screwed up and need to eat better: watch this film.

If you think that you’ve got everything together and your life is all roses: watch this film.

I am allergic to corn grain and corn syrup, though corn oil does seem to be OK in moderation, so for me corn is just not a great thing to eat.  After watching how corn syrup is made, I’m pretty sure I wouldn’t want to eat corn syrup even if I were able to.  I’m convicted, once again, that I should be careful what I put into this body, but don’t take my word for it.  Watch this film, read Michael Pollan’s book “In Defense of Food” – and see where you land.  I bet it isn’t in a field of corn, or in line at McDonald’s.

Coffee with Jeremy

I should have taken a picture.  It would have lasted longer.  However, I had a great time with a brother in the Lord, Jeremy, this morning.  We talked about a lot of stuff, but I can’t tell you about it because what happens in Starbucks, stays in Starbucks.  More blogging will be coming soon.  I’m stoked about that.