The Greater Fool

Live to eat or eat to live?

And of course he doesn't just eat it...He has to eat it in my face.

I grew up in an ethnic household, and a poor one at that, so we ate anything and everything: from Spam fried rice to oxtail stew to duck tongues to ham hocks to thousand-year-old eggs. My wife is from the Midwest, where the dominant flavor profile is bland.

We have learned to compromise, but it wasn't an easy road. At one of our early meals together, we went to a Tuscan restaurant where the bread service was accompanied by extra-virgin olive oil. She was horrified! During an extended stay with her family, I learned the multitude of ways that ground beef, pasta, and Campbell's Cream of Chicken soup can be combined into a lifetime of "hot dishes."

Today, we still have some ground rules. She won't try any seafood, and I've learned to love the congealed salad. Still, the thought of a guys-with-unadventurous-wives supper club is piquing my interest.

When is null not null?

In JavaScript, that's when. Say you have an object in C# like Person with string properties for FirstName and LastName. Something like this:

public class Person
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

Those properties are, of course, nullable. In JavaScript, an instance of that object might look something like this (values invented for illustration):

    { "FirstName": "Jane", "LastName": "Yon" }

That's what the JSON object would look like. Now if you wanted to set the FirstName to null and pass that object back to some controller in the server-side code, you might send something like this:

    { "FirstName": null, "LastName": "Yon" }

Well, we came across this today. We passed the above JSON through the ModelBinder, and this is what came out on the server side:

+ FirstName = "null"
+ LastName = "Yon"

Wait, what!?! A little digging reveals that <em>null</em> in JavaScript is a special value whose typeof is "object". Somehow the default ModelBinder interprets that as a string and infers that we really wanted the string "null" instead of null. What we wanted, in fact, was undefined, the special value whose typeof is "undefined". Since JSON serialization simply omits undefined properties, passing this JSON

    { "LastName": "Yon" }

will achieve the desired effect. It's a simple concept and a subtle difference, but a potential gotcha nonetheless.
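The quirk is easy to verify in any JavaScript console. Here's a quick sketch (the property names mirror the Person example above; the values are made up):

```javascript
// typeof null is "object" -- a long-standing JavaScript quirk --
// while undefined has its own type, "undefined"
console.log(typeof null);      // "object"
console.log(typeof undefined); // "undefined"

// JSON.stringify keeps null-valued properties but drops
// undefined-valued properties from the output entirely
const withNull = { FirstName: null, LastName: "Yon" };
const withUndefined = { FirstName: undefined, LastName: "Yon" };

console.log(JSON.stringify(withNull));      // {"FirstName":null,"LastName":"Yon"}
console.log(JSON.stringify(withUndefined)); // {"LastName":"Yon"}
```

So if the model binder mangles null into the string "null", sending undefined, i.e. omitting the property, sidesteps the problem.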

What does Web-enabled really mean?

One of the projects I'm working on is to build the web version of a successful LOB client-server desktop application (née WinForms). My company recognizes the ubiquity of the Internet and the mobile paradigm, so the new version means web-enabled and running within a browser. We are not porting the application, i.e. rewriting it in HTML/JavaScript, but rather designing for a new platform. It's tempting to take the existing user experience and translate it to the browser. Certainly, tools like HTML5, jQuery, and your favorite ECMAScript library can duplicate the desktop experience in a straightforward, if not elegant, manner.

The existing desktop application is a typical narrow-focus business design that relies on training, gradual evolution and Microsoft patterns (hover tooltips, header-detail split layouts, modal dialogs, et al) to enhance usability. In the

Developers vs Project Managers

Is there a more incongruous relationship than the one that exists between developers and project managers? On one hand, you have artists who are tasked with creating something from nothing. On the other, you have a resource manager who is trying to prognosticate, with some measure of reliability, the outcome on a daily, weekly, sprint-ly basis. There is less common ground between these two people than exists in the DMZ between North and South Korea. And yet, we interoperate on a regular basis to come up with progress reports and burndown charts and other artifacts to prove that this investment was, indeed, a wise one.

Sure, you have your utility programmers who are tasked only with plumbing together existing solutions, but, in general, the good engineers are more akin to artists than to knowledge workers. We are talking about skills such as inspiration, design, and architecture. It's like asking an architect to design an office that may be used in a home or a 170-floor building and everything in between. It will be used by 2 users or 7 million. It must be completely responsive on first use and be able to accommodate being scaled to "the cloud." All of these things require more than what a blueprint and steel can offer. As software developers, we are asked to construct tools and artifacts that make all of this a reality.

Project management, as a discipline, developed as a response to the very basic nature of a project. A project is an ephemeral task that is, by definition, somewhere outside the range of routine business activity. As such, leadership will try to wrap their arms around the beast by employing someone known as a PM to ensure that goals are met while honoring preconceived constraints. Therein lies the problem. If you accept that software development is an art, then preconceived constraints are nothing more than a guess. Traditional project management patterns are designed around concepts that have measurable (and linear) workflows. Software development is not one of those concepts.

Overall, we are still in a very immature industry that hasn't ironed out all the details yet, but we should approach this challenge with new ideas rather than yesterday's retreads. If developers really are the basis for our future economy, then we should find better ways to measure them and their work. Until then, I offer you this bit of wisdom given to me by someone smarter than myself: "You should treat Project Managers like mushrooms. Keep them in the dark and feed them lots of shit."

Why the current conservatism in this country is so distasteful

Paul [has] none of the resentment that burns in Gingrich or the fakeness that defines Romney or the fascistic strains in Perry's buffoonery. He has yet to show the Obama-derangement of his peers, even though he differs with him. He has now gone through two primary elections without compromising an inch of his character or his philosophy. This kind of rigidity has its flaws, but, in the context of the Newt Romney blur, it is refreshing. He would never take $1.8 million from Freddie Mac. He would never disown Reagan, as Romney once did. He would never speak of lynching Bernanke, as Perry threatened. When he answers a question, you can see that he is genuinely listening to it and responding - rather than searching, Bachmann-like, for the one-liner to rouse the base. He is, in other words, a decent fellow, and that's an adjective I don't use lightly. We need more decency among Republicans.

I don't often blog about politics (or religion, for that matter), but occasionally I come across something thoughtful and feel the need to voice my dissent or "mega-dittos." Here is a blog post from Andrew Sullivan endorsing Ron Paul for the 2012 election. Sullivan contrasts Paul with his contemporaries, among them the current darlings of the polls. While I don't agree with his endorsement of Ron Paul as the best answer, I certainly like his summary of the current GOP platform of fear-mongering, extremist views, and a general lack of restraint and decency.

FWIW, I like Huntsman's bid for the less fascist side of conservatism.

My favorite tidbit of CS lore: Etymology of Apache Server

As the specification effort for HTTP began to take the form of complete specifications, we needed server software that could both effectively demonstrate the proposed standard protocol and serve as a test-bed for worthwhile extensions. At the time, the most popular HTTP server (httpd) was the public domain software developed by Rob McCool at the National Center for Supercomputing Applications, University of Illinois, Urbana-Champaign (NCSA). However, development had stalled after Rob left NCSA in mid-1994, and many webmasters had developed their own extensions and bug fixes that were in need of a common distribution. A group of us created a mailing list for the purpose of coordinating our changes as "patches" to the original source. In the process, we created the Apache HTTP Server Project

In case you ever wondered whether Apache was named for the helicopter or the Native American tribe: by this account, it began as "a patchy" server, a collection of patches to the NCSA httpd. Another reason to remember not to take yourself too seriously.

What we have here is a failure to communicate

I'm reading Land of Lisp in an effort to better understand functional programming patterns and idioms. Why not start with the mother of all functional (and procedural, and object-oriented) languages? That's gotten me thinking about the impedance mismatch between programming languages and natural human languages.

When computing machines were first invented, we had machine language, or ones and zeros as the CS pioneers referred to it. Then came compilers and interpreters and run-times and virtual machines, all in an effort to better match the precision of machine code with the vagary of human language. What makes the human brain remarkable is its ability to process patterns and context while evaluating expressions. It allows for a relatively small syntax to express a wide variety of ideas and concepts. That's how our natural language has evolved: take a small base, the English alphabet for instance, and build it into patterns, words, and sentences until a usable, repeatable, inferable system is accepted by all users, then share! These patterns have a lot of visual and logical helpers that give the expression meaning beyond the provided syntax.

Computers, by nature, excel at precision and accuracy, so they require a language that's very precise in order to function properly. Early languages like Lisp don't even really have a syntax to speak of, but rather an abstract prefix-notation tree. Symbols like parentheses, commas, quotes, and back-quotes rule the land, with instructions and data intermingling throughout. There are no functionally useless tokens, or syntactic sugar, here to assist readability or understanding. As languages evolve and become higher-form, they typically approximate more human-readable features. Each new language is supposed to better interface concepts with instructional code.

As machines get faster and platforms get more efficient, we find new layers of abstraction to exploit them. It's a never-ending cycle that precludes us from actually gaining any computing efficiency. We even have macros and domain-specific languages now to approximate the utility and business knowledge that our brains intrinsically and subconsciously handle for us. I propose we quit that chase. We should just adopt a human-understandable language that's as elegant and functional as Lisp and quit trying to layer language on top of framework on top of platform ad infinitum.

(defun are-you-with-me ()
  (apply #'share (cdr (list 'language *knowledge*))))