
Friday, October 24, 2003

Microsoft Thoughts

Jon Box wrote an interesting little piece on open source and the Windows OS. Also see Scoble's rant on How to Hate Microsoft and the following feedback for some enlightening views.

My own take is that:

  • Standardization is good for everybody since it allows developers and customers to shoot for a common target, in the same way that economic stability promotes liberal democracy. I'm with Jon on the question of getting the source code for the OS: why in the world would I want it? I'm certainly not going to change it, since doing so takes me outside my core competency and therefore increases the cost of my products and services. Open source is not the answer; proprietary products from organizations that implement solid, secure, standards-based code are.

  • Microsoft embracing and promoting W3C standards is a logical extension of the standards they brought to the desktop by gaining market share. In the desktop world of the 1990s, computers were disconnected and the shrink-wrapped software model predominated. In that kind of market a single platform (the Windows API) was the only way software developers could write software that took advantage of economies of scale. In the internet world, the economies of scale are tied to internet standards (HTML, XML, SOAP, etc.). Therefore Microsoft's ability to compete in the long term is directly tied to their ability to incorporate and adhere to standards in their products (such as IE). This is why the book "Breaking Windows" by David Bank was so interesting: it shows how Microsoft almost missed the boat.

  • Most of the upgrades in products such as Office (and perhaps Longhorn?) are unnecessary, except those that relate directly to W3C specifications that allow other software to integrate with these products and that better enable users to work in a connected environment (like DRM). This is because the standard software products that need to run on the desktop are now well-defined and mature (word processing, spreadsheets, games). Everything else is internet related.

  • Developer tools must move in the direction of subsuming all of the infrastructure for writing applications (navigation, caching, load balancing, personalization, security, UI widgets). As a developer I should only have to specify business rules and point the app at a datastore. Whidbey is taking the first steps in this direction with the inclusion of a set of enterprise design tools and a more robust framework for ASP.NET applications.

  • Microsoft makes money by convincing users and businesses to upgrade their products. Until a different revenue model comes to the forefront, expect Microsoft to keep producing upgrades and marketing the hell out of them (like Office System).
  • The nature of software (being essentially free to copy and distribute) makes it unlike other industries where a monopoly of the physical infrastructure truly stifles competition (the robber barons of the 1890s). So a 95% share of the desktop doesn't necessarily make a monopoly.


Just my two cents.

    Wednesday, October 22, 2003

    Is this the Tiqit (ticket)?

Heard about an interesting new mobile-style device from Tiqit. This device runs Windows XP instead of CE and includes wireless connectivity. It is sort of a Tablet PC lite, although I don't know (and doubt) whether it supports the inking layer. The interesting thing is that the screen is simply scaled down rather than losing most of its resolution, as happens with Pocket PC devices. Should be an interesting addition to the mobility space.

    How far did the Mick hit it?

Also had an interesting email discussion recently about the distance that Mickey Mantle's famous "565 foot" homerun in April of 1953 actually traveled. Along with this come other claims of prodigious Mantle blasts.

After reading these claims I went back and looked at what Adair says in his book The Physics of Baseball about how far a ball can be hit. Based on his model, he says that in order to hit an 85mph fastball 400 feet you need a bat speed of 76mph; to hit one 450 feet you need a bat speed of 86mph. He estimates that under normal conditions this is the limit to how far a ball can be hit. If, however, the fastball is 95mph, add 10 feet. If the batter pulls the ball, add another 15 feet. If it's 100 degrees, add another 20 feet. With a 10mph wind, add another 40 feet. Hit it in Colorado and add an additional 10-15 feet. Add them all up and, if everything is going your way, the ball could go 550 feet.

Since his book was published in 1990, I assume that a bat speed of 76mph was modeled on perceived norms for the time. With the increase in weightlifting it is probably not uncommon to see bat speeds of 85mph for guys like Bonds and Sosa, although I haven't seen any data on this at all (incidentally, I wonder if increased swing velocities account for the increase in broken bats that announcers and players often comment on). And every increase in bat speed pays off disproportionately, since the kinetic energy delivered to the ball grows with the square of the velocity (E = ½mv²).

Sosa's homerun in Florida in July was projected at 484 feet. Adair estimates that Mantle's homerun went between 511 and 520 feet: we know Mick's blast cleared a 55 foot wall, 460 feet from the plate, by 5 feet and then glanced off a beer sign. Assuming an angle of descent of 45 degrees, he estimates the ball would have landed about 60 feet past the sign.

    Tuesday, October 21, 2003

    Compact Framework Goodies

I'm teaching a Compact Framework course this week and was reminded that in order to get all the tools Microsoft has released, developers will want the following:

    VS .NET 2003 (of course)
    Compact Framework service pack 1
    Windows Mobile Developer Power Toys
    SDK for Pocket PC 2003 (new emulators)
Utilities for VS .NET 2003 Add-on Pack 1.1

    Code Samples
    UI Guidelines for Pocket PC
    And others found on the mobility site

    Friday, October 17, 2003

    Dusty's Failures

    I think this nicely sums up Dusty's failures.

    Thursday, October 16, 2003

    Goodbye 2003 Cubs

    Here is the best piece I've seen on the collapse of the 2003 Cubs. Very well said.

    Wednesday, October 15, 2003

    Next Year Again

Well, the collapse is complete. I guess it was inevitable, but I had a bit of hope until the last out was made. Coming into the series I said the Marlins should be the favorite since they are clearly a superior offensive team and their pitching is on a par (if you count their relievers). So the best team won, I think, but it doesn't help when you have leads in both games 6 and 7 and blow them with your aces on the mound. Sad day in Wrigleyville.

Also, some poor managing by Dusty. He elected to pitch to the number 8 hitter with the pitcher on deck. That is one situation where an intentional or semi-intentional walk is called for; being down 2 versus being down 4 makes it a completely different ballgame. He also, of course, should have pulled Wood much earlier; Dusty has a tendency to stick with starters too long.

    Smoking Gun

    Here it is, the interference in all its glory:
    http://www.thesmokinggun.com/archive/cubfan1.html

    Updater App Block Alternative

    A post on my recent article on the updater application block from Microsoft:

    "The .NET Application Updater block does provide a framework for a client-pull model of application deployment and upgrade, and is pretty good for some simple deployment scenarios.

    There's a company called Kinitos that offers a more comprehensive solution for the central management of .NET client applications. One of the main differences is that they enable a centrally initiated style of application management including:
    - SERVER-INITIATED install, upgrade, rollback, uninstall of client applications
    - Ability to target specific users, roles or groups with an upgrade
    - Peer-based delivery of application upgrade. (Once one client at a remote location has been upgraded, the upgrade will propagate through the local peer network rather than having all updates pulled from central server.)
    - Incremental upgrades where only changed portions of application are transmitted during upgrade
    - Central monitoring, tracking of installed version info and ability to interactively query client from central location as to its current status

    This is just one area where they make .NET smart client application development, deployment and management easier. Their platform also lets applications make online/offline/occasionally-connected operation virtually transparent to user.

    It's worth a look."


    Might be interesting....

    Tuesday, October 14, 2003

    Self Destruction

Hard to believe the spectacle I just witnessed. In all my years of watching baseball, that was the worst game I've ever seen. The team literally self-destructed. At least there's plenty of blame to go around: Prior for walking Castillo, the fan for reaching over the railing (I think the call could have gone either way, but certainly the fan had to know better), Gonzalez for not turning the double play, and Dusty for leaving Prior in too long and intentionally walking not one but two batters in the inning, both of whom ended up scoring. Intentionally walking the bases loaded is almost always a bad idea, and more so when you're already behind.

It'll be interesting to see how the Cubs respond. The bats seem to have gone silent and they've started swinging at a lot of first pitches. I'm confident Wood will respond well. Can't wait for tomorrow night!

    ColdFusion to ASP.NET

    Jon Box sent me a link to a nice document comparing ColdFusion to ASP.NET. Very helpful for getting the big picture.
    Migrating from ColdFusion to ASP.NET

    Monday, October 13, 2003

    Compact Framework Book Site

    Check out the site for our new book due out this month. You can download the code and get the TOC.

    Sunday, October 12, 2003

    1984 Redux?

Let's hope that the 4-0 win by the Marlins tonight is not the start of a repeat of 1984. I heard on NPR this week that Dennis Eckersley, who pitched for the Cubs in '84, said that on the flight out to San Diego (after the Cubs had trounced the Padres in the first two games) the team was brimming with overconfidence. He clearly thought that they were looking past the Padres, and we all know what happened (Steve Garvey notwithstanding). Let's hope this Cubs team has a better perspective. Of course, this team has Prior and Wood for games 6 and 7 (if needed).

    Beckett was simply too tough tonight. The Cubs just got beat by a very good pitcher who was on top of his game. Zambrano bounced back pretty well so that was encouraging. It does look as if Sammy can't hit the good fastball (94 and up) anymore. Way late most of the time as Beckett turned up the heat after the brushback pitch.

    Saturday, October 11, 2003

    Cubs up 2-1

    Great Cubs win last night 5-4 to go up 2-1 against the Marlins. I thought Dusty made the right move in leaving Remlinger in there. He looked good and was hitting his spots. Wood pitched well although I thought Leiter was correct in saying he challenged too much with 2 strikes. He was ahead in the count all night but gave up the big hits with 2 strikes.

    Both Glanville and Goodwin came through in the pinch. Goodwin especially has had several huge pinch hits in the post season and coming down the stretch. He had a very good year and is still the best bunter I've ever seen (when he played for the Royals).

Clement, who has been inconsistent all year, goes tonight. We'll see which Clement shows up.

    Will and California

    George Will had an excellent column this morning regarding California's recall election. I especially like:

    "Now this exercise in "direct democracy" -- precisely what America’s Founders devised institutions to prevent -- has ended with voters, full of self pity and indignation, removing an obviously incompetent governor. They have removed him from the office to which they re-elected him after he had made his incompetence obvious by making most of the decisions that brought the voters to a boil."

He goes on to note that Arnold is no conservative and echoes my previous post that he won't be able to get California out of its initiative-created fiscal mess because the budget is largely off limits. His take is that Arnold was elected because he vowed to repeal the car tax. Rightly so, but until Californians put some limits on themselves, even the Terminator isn't going to help them.

    Friday, October 10, 2003

    Moneyball

    Cardinal fan Jon Box brought this article to my attention.

    "Without putting too fine a point on it, the Redbirds are taking their first steps into the Moneyball realm. Luhnow should bring a different perspective to a team that has not relied as heavily on statistical models and data as some other franchises in recent years."

    The Cards appear to have seen the light and will start to apply some of the sabermetric principles I mentioned in my review of Moneyball posted earlier. Good for them. It doesn't appear the Cubs are anywhere close to such an approach as their thinking appears to be very traditionalist.

    Thursday, October 09, 2003

    Photographic Memory

I just picked up Edmund Morris' The Rise of Theodore Roosevelt. I had previously read Theodore Rex about Roosevelt's presidency; this book follows his life from his birth in 1858 to when McKinley was shot in 1901. Anyway, I was fascinated by the description of Roosevelt's reading habits in the prologue. A short excerpt:

    ""Reading with me is a disease." [quoting Rooselvelt] He succombs to it so totally...that nothing short of a thump on the back will regain his attention. Asked to summarize a book he has been leafing through with such apparent haste, he will do so in minute detail, often quoting the actual text. The President manages to get through at least one book a day even when he is busy. Owen Wister has lent him a book (of 300 odd pages) shortly before a full evening's entertainment at the White House, and been astonished to hear a complete review of it over breakfast...On evenings like this, when he has no official entertaining to do, Roosevelt will read two or three books entire."

In one stretch, from when he was inaugurated in the fall of 1901 until about the same time the following year, Roosevelt listed over 114 authors that he had read, many of them in multi-volume works. When asked how he could quote books that he hadn't read for 20 years, he thought for a second and said something to the effect that, after thinking about the book for a moment, the pages came before his mind in total.

    Gov. Arnold

I'll have to admit that I was surprised that Arnold actually won, but then again, in California anything is possible. Unfortunately, I don't think Arnold is going to be able to save California. Aside from the fact that he likely doesn't know what he's talking about, I recently read some interesting analysis of California in Fareed Zakaria's book The Future of Freedom: Illiberal Democracy at Home and Abroad.

Basically, Zakaria views California's fiscal and management troubles as largely a creation of their own making through the overuse of "direct democracy" via the ballot initiative process. Zakaria notes that since Proposition 13 in 1978, which lowered property taxes to 1975 levels and put a limit on how much taxes could grow, Californians (and the nation at large) have fallen in love with bypassing legislatures and making their own laws. In the 1990s there were 378 such referendums across the country, compared with 257 in the 1980s, 181 in the 1970s, and 88 in the 1960s. In the year 2000 alone there were 204!

Zakaria feels that citizens going over the heads of their legislatures causes more problems than it solves, since the normal give and take of politics is destroyed and politicians are instead mandated to act in a certain way. In the biggest example, 85 percent of California's budget is untouchable by Arnold and the legislature as the result of ballot initiatives (for example, Proposition 98 mandates that 40% of the budget be spent on education). As a result, California now spends very little on infrastructure (5% today as opposed to 22% in the 1950s). So what do they do? They propose yet another ballot initiative to allocate 1% of the budget to fix bridges and the like.

Zakaria also argues that since the normal give and take of the political process is bypassed, even initiatives that seem like a good idea to conservatives like me (ending affirmative action in Prop 209 and regulating immigration in Prop 187) end up having polarizing effects. He contrasts these efforts with the welfare reform package passed by Congress during the Clinton administration, which had the benefit of being crafted through compromise.

I find myself agreeing with much of what Zakaria says, and especially his call for a return to a "liberal democracy" with representative institutions as the founders intended. There is good reason to craft a system where cooler heads can debate and compromise, all the more so as the electorate becomes less educated and the issues become more complex (how can I as a citizen know that 40% is the right amount to spend on education? Why not 37%?). This book also has just the right amount of counterintuitiveness to be on the mark. Zakaria also mentions Paradise Lost by Peter Schrag and Democracy Derailed by David Broder, which likely illuminate this topic in more detail. If only there were time to read all the books....

    Cubs Rock Marlins

    Both games of the NLCS have been great games to watch for Cubs fans. Even though they took the loss in game 1, they battled back twice to tie the score and Sosa's homerun in the bottom of the 9th was one of the great moments in Cubs history.

I'll have to admit that I'm one of the harshest critics of the Cubs for picking up Gonzalez. He's never had an on-base percentage higher than .322 in a full season, so the Cubs should have known exactly what they were getting when they picked him up. Couple that with his large salary ($4.5M) and by any reasonable measure it was a poor decision. However, in the post season anybody can get hot, and right now Gonzalez is leading the charge. I'm going to try to get a ticket for game 7 at Wrigley; not likely, however.

On another front, I found some interesting statistics on the probabilities of scoring in various situations in this article. I know there are better stats out there somewhere; perhaps I'll need to download some retrosheet data and run the numbers myself sometime.

    Random .NET Stuff

Some very nice information on performance tuning of a large .NET app that Microsoft runs internally can be found here. I also ran across a nice article on generics in the CLR in the October issue of MSDN Magazine.

Looks like the gloves will come off on October 27 as Microsoft presents the next version of VS .NET and the CLR at the PDC.

    Monday, October 06, 2003

    DataFactory or MSDN DAAB?

    As some of you may know last year I wrote a .NET DataFactory class for a TechEd presentation I did. The slides and code for the talk including the DataFactory in both C# and VB can be downloaded from the Atomic site. The class is also explained in chapter 18 of my book Teach Yourself ADO.NET in 21 Days.

    In any case, this class was designed to do three things:
1) Abstract the .NET Data Provider you are using by exposing a constructor that accepts the provider name and then using reflection to instantiate the proper .NET Data Provider objects (command, connection, etc.) on the fly (sketched below)
2) Allow for database-independent coding by abstracting the SQL statements and data types into a set of XML files that the DataFactory reads
3) Cache command objects by keeping a set of static hashtables that reference command objects that have already been created; the DataFactory then clones a cached command (using ICloneable) when a new command is needed
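
To give a flavor of the first technique, here is a stripped-down sketch. This is illustrative only, not the actual DataFactory code (download that from the Atomic site); a fuller implementation would read type information from configuration rather than hardcoding it:

using System;
using System.Collections;
using System.Data;

public class DataFactorySketch
{
    // Map provider names to the corresponding connection types
    private static Hashtable connectionTypes = new Hashtable();

    static DataFactorySketch()
    {
        connectionTypes["SqlClient"] = typeof(System.Data.SqlClient.SqlConnection);
        connectionTypes["OleDb"] = typeof(System.Data.OleDb.OleDbConnection);
    }

    private Type connectionType;

    public DataFactorySketch(string provider)
    {
        connectionType = (Type)connectionTypes[provider];
        if (connectionType == null)
            throw new ArgumentException("Unknown provider: " + provider);
    }

    public IDbConnection CreateConnection(string connectionString)
    {
        // Instantiate the provider's connection object on the fly via reflection
        IDbConnection con = (IDbConnection)Activator.CreateInstance(connectionType);
        con.ConnectionString = connectionString;
        return con;
    }
}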

The DataFactory is similar to the MSDN Data Access Application Block (DAAB) first released in April of 2002 in that they both reduce the amount of code you need to write. The differences I see are:

1) The DAAB is not .NET Data Provider independent. In their model there is a class called SqlHelper that knows how to use SqlClient. If you want other providers, you write a helper class like OdbcHelper for each (as they did in the Nile 3.0 sample app). There is no common interface they all share (which wouldn't be difficult to implement).
    2) The DAAB is not database independent. It does not abstract SQL in any way
3) The DAAB caches parameters but not commands (typical DAAB usage is sketched below)
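
For comparison, a typical DAAB call looks something like this (from memory; the stored procedure and parameter names are just illustrative):

using System.Data;
using System.Data.SqlClient;
using Microsoft.ApplicationBlocks.Data;

public class OrderData
{
    public static DataSet GetOrders(string connectionString)
    {
        // SqlHelper is the DAAB's static helper class for SqlClient
        return SqlHelper.ExecuteDataset(
            connectionString,
            CommandType.StoredProcedure,
            "usp_GetOrdersByCustomer",                   // illustrative proc name
            new SqlParameter("@CustomerId", "ALFKI"));   // illustrative parameter
    }
}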

That being said, the DataFactory is more appropriate if you know that you need to work against multiple backends. It is more complex because of the XML files, and I don't believe it performs quite as well as the DAAB because of its use of reflection.

    Is next year finally here?

Well, the Cubs are going to the NLCS after their 5-1 victory on Sunday night. Kerry Wood and Mark Prior showed why the old adage about good pitching stopping good hitting is so true. 3 wins down, 8 to go. Could this be the year?

    I thought Baker's use of the bunt in the first inning with Lofton on 2nd was an example of poor managing. By bunting in that situation you almost guarantee yourself that you'll not have a big inning. And although Miller hit a double in game 4, I wouldn't hesitate to pinch hit for him in any situation, like the one in the 9th when he struck out looking with a runner on third and less than 2 outs.

So how do you know that bunting in that situation is normally a bad idea? Below is a table of Net Expected Run Values calculated by Carl Morris.

Outs/Men on Base    0      1      2      3      1,2    1,3    2,3    1,2,3
0                   .537   .907   1.138  1.349  1.515  1.762  1.957  2.399
1                   .294   .544   .720   .920   .968   1.140  1.353  1.617
2                   .114   .239   .347   .391   .486   .522   .630   .830

Anyway, to answer the question posed above: Lofton was on 2nd with nobody out, a situation that yields an expected run value of 1.138. By bunting the runner to third you move the state of the game to a runner on third with one out, which yields an expected run value of .920. So you decrease your number of expected runs (the chance for a big inning), while likely increasing your chance of scoring a single run (although I would guess not by much). But what if Grudz singles and moves Lofton to third? Now the expected run value moves to 1.762. What if he strikes out (which he did)? The NERV moves down to .720. So generally speaking, unless you're sure your pitcher is going to throw a shutout, I don't see any reason to bunt in that situation. Further, you'll notice that moving from 0 outs to 1 out decreases the NERV by as much as half a run. Outs are the most valuable resource the offense has; squandering them with bunts in the early innings is never a good idea. However, when you need a run to tie the game in the late innings, the play makes sense.
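
To make the comparison concrete, here's a quick sketch of that lookup in C# (a hypothetical helper I threw together using a few values from Morris' table above):

using System.Collections;

public class RunExpectancy
{
    // A few Net Expected Run Values from Morris' table,
    // keyed by "outs:bases occupied"
    private static Hashtable nerv = new Hashtable();

    static RunExpectancy()
    {
        nerv["0:2"] = 1.138;    // runner on 2nd, nobody out
        nerv["1:3"] = 0.920;    // runner on 3rd, one out (successful bunt)
        nerv["1:2"] = 0.720;    // runner on 2nd, one out (batter strikes out)
        nerv["0:1,3"] = 1.762;  // runners on 1st and 3rd, nobody out (single)
    }

    public static double BuntCost()
    {
        // Sacrificing the runner from 2nd to 3rd costs about 0.22 expected runs
        return (double)nerv["0:2"] - (double)nerv["1:3"];
    }
}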

Looking at it a different way, below is a table of linear weights run values (originally computed by Pete Palmer) that I found on the SABR list. In this matrix each row records a context in which a batter can perform. The columns indicate the result (a walk, single, double, etc.) while the values are the offensive contribution of that result (i.e. a grand slam is only worth 2.82 runs, since the other runners on base deserve some credit for producing the 4 runs). Using tables like these, sabermetricians can get a feel for how valuable walks and other offensive contributions are. I believe this table was generated from data from the 1990s. This kind of complete analysis can be done using retrosheet data.

Situation/Result    Walk   Single  Double  Triple  HR     Out
none on             .30    .30     .49     .70     1.00   -.20
first               .36    .41     .88     1.31    1.77   -.30
second              .20    .68     .99     1.15    1.64   -.34
third               .19    .75     .88     1.00    1.57   -.35
1 and 2             .58    .89     1.55    1.96    2.37   -.53
1 and 3             .38    .86     1.33    1.78    2.19   -.47
2 and 3             .22    1.07    1.40    1.60    2.01   -.49
loaded              1.00   1.32    1.95    2.39    2.82   -.75
ALL                 .34    .50     .82     1.11    1.49   -.30

    Friday, October 03, 2003

    Struggling with C.S. Lewis

    I just finished reading Lewis Agonistes by Lou Markos, my wife's former professor. This book was the "inspiration" for the title of this blog.

A great book and highly recommended for anyone with a desire to understand the main ideas of Lewis and apply them in this postmodern world. In the book Lou goes step by step through the modern Christian's struggle with science, the new age, evil and suffering, the arts, and heaven and hell. By adroitly pulling key passages and ideas from the Lewis canon he weaves an argument and a response to the challenges of both modernism and postmodernism in these key areas.

I like the fact that the book challenges the reader more and more as it progresses (at least it did me). I once heard a lecturer say that a scholar is lucky to have one great idea in his lifetime. Well, Lewis had several, and Markos illuminates them with the ease of someone who has deep knowledge of history, theology, literature, and Lewis.

    Updater Application Block for .NET

    Here is a link to my latest article on the Updater Application Block from Microsoft. Very interesting technology that uses BITS to allow .NET applications to be auto-updating or to create an auto-updating service.

Also see my overview of all the current application blocks. It also sounds like Microsoft will be releasing several more over the next year, so stay tuned!

    Object-relational Mapping

After having a discussion on object-relational mapping techniques I was reminded of Scott Ambler's work. Some very good principles here.

    Thursday, October 02, 2003

    The Law of Unintended Consequences

I was struck by an interesting experience when walking into a bookstore today. As I passed the shelves of books I noticed that there were sections for History, Biography, Religion, Mystery, Romance and so on. I then noticed a section called African American. While I perused the other sections I skipped the African American section, I suppose, at least subconsciously, because being white I assumed there was nothing there germane to me. However, after feeling a bit guilty I approached the section and skimmed the titles. Of course, the section was filled not only with books about the civil rights movement or slavery but with books by African American authors in all the other genres found in the rest of the store.

    So here are the questions: Why would a bookstore owner feel the need to create a section only for African American authors when authors of all other races are categorized by genre? Does creating such a section serve any purpose except allowing non-black patrons to more easily pass by some very good books and at the same time perhaps causing black patrons to miss good literature in the wider store? In answer to the first question, my guess is that the same guilt that finally caused me to skim the books in that section drove the owner of the book store to create a special African American section. And in answer to the second, the owner's actions elevated race over content (in this case of a person's writing instead of his character), which likely was exactly opposite of the intent.

    So what is the wider lesson? Like affirmative action, sometimes attempts to make concessions can end up causing a result exactly opposite to that intended (the "law of unintended consequences", see Thomas Sowell's book).

    Cretaceous Fossils

My daughter Laura and I went fossil hunting in the Smoky Hill Chalk of western Kansas in August on our way out to Colorado. Here is the website that she and I developed for her first science project of the year (FrontPage 2000, which I was fairly impressed with after having not touched FrontPage since 1997 or so).

The one-day hunt was hot but fascinating. I was most struck by the abundance of large clams (Platyceramus platinus) that were eroding from almost every rock and under our feet. Some of these clams were 3 to 4 feet in diameter. I was also impressed by how difficult it is to spot some of the fossils and how a trained eye like our guide's could see bones and teeth from a distance. My main discovery of the day was a small mosasaur tooth pictured on Laura's site.

    For more information on the types of fossils you find in western Kansas see the Oceans of Kansas web site. Hope to do it again next year.

    Generics in C# and VB

    Often you'll hear someone say something like, "one of the ways that C# and VB will diverge in the future is that C# will support generics". Well, both C# and VB will support generics in the Whidbey release. The second thing you'll likely hear is someone asking, "what is a generic?"

    The best explanation I've heard (from Don Box) is that generics are simply "types with holes" that can be filled by the caller. A simple example would be a type that compares two values and returns the larger of the two (adapted from this article):


public class Compare<ItemType> where ItemType : IComparable<ItemType>
{
    public ItemType Larger(ItemType data1, ItemType data2)
    {
        // Return whichever of the two values compares as larger
        return data1.CompareTo(data2) >= 0 ? data1 : data2;
    }
}

Compare<int> compare = new Compare<int>();
int MyInt = compare.Larger(3, 5);

The caller can then instantiate the Compare class with floats or some other data type instead of integers. While this can be done using System.Object, generics perform better for value types since boxing and unboxing are not required (see this article for an overview of how much boxing costs).
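
For instance, here is what the two approaches look like side by side (a sketch assuming the generic List<T> collection slated for Whidbey's class libraries):

using System;
using System.Collections;
using System.Collections.Generic;

class BoxingDemo
{
    static void Main()
    {
        ArrayList oldList = new ArrayList();
        oldList.Add(3);                       // boxes the int into an object
        int x = (int)oldList[0];              // cast required, unboxes

        List<int> newList = new List<int>();
        newList.Add(3);                       // stored as a raw int, no box
        int y = newList[0];                   // no cast, no unbox

        Console.WriteLine(x + y);
    }
}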

    Whidbey Rocks

    I recently attended a 3 day event at Microsoft for publishers and authors. Several of the product groups came in and gave us demos and information on the next version of Visual Studio .NET and the Windows .NET Framework.

From all the info I have thus far, Whidbey is going to be a major improvement in the lives of .NET developers everywhere. I am especially interested in the increased level of commitment to supporting enterprise developers by baking architectural patterns and WS-I standards into the product.

    There should be lots of good sessions on this (as well as the bits) and other topics (ASP.NET improvements are significant as well) at the PDC in October. I won't be able to attend but advanced copies of a book on the Compact Framework Jon Box and I wrote will be there. Be sure to get yours!

    Thought provoking article...

    An interesting article from First Things, October 1998
    In Defense of Disbelief

    Affirmative Action

    I watched "Law and Order" tonight while in Nashville teaching a class. The topic was affirmative action and a play on the Jayson Blair case at the New York Times. The characters on the show referenced the Michigan law school decisions handed down this summer. This summer, after the decision came down I skimmed the PDF on the supreme court web site and a piece on it.

    It sounds like they've left the door open for worse mischief. If they said that you can take race into account as a "plus factor" (the law school decision) but not use any quantitative means to determine how much of a plus factor it should be (the undergraduate decision), then you leave the door wide open for any subjective interpretation unless it starts to smell bad according to the whim of the justices (as with redistricting in the 1990s). In fact, the undergraduate decision says that diversity can "constitute a compelling state interest" while the 20 points given to minority students was "not narrowly tailored to achieve educational diversity." So what exactly does "narrowly tailored" mean?

    The strange thing is that the justices say contradictory things in the same paragraph. In the graduate school decision the opinion says that "attaining a diverse student body is at the heart of the law school's proper institutional mission" and that

    "enrolling a critical mass of minority students simply to assure some specified percentage of a particular group merely because of its ethnic origin would be patently unconstitutional. But the law school defines its critical mass concept by reference to the substantial, important, and laudable educational benefits that diversity is designed to produce including..."

    and then they go on to commit sociology by citing studies that probably have very little backing. Somehow diversity can be at the heart of a university's mission but selecting applicants to fulfill this diversity "merely" by ethnic origin is wrong? So it appears that if you set a quantitative goal and discriminate you are "patently unconstitutional" but if you hide the goal and then cite the "laudable educational benefits" you can discriminate all day long. Sounds to me like giving 20 points would be a lot more fair and at least people would know what they have to overcome.

    I think the simple reason that a plus factor type system could work in a law school is that they have fewer applicants and so can make decisions based on lots of personal information (this was specifically mandated in the Bakke case) where race would presumably play a smaller role in proportion. In an undergraduate program the sheer numbers of applicants means you need to apply some quantitative measure to see who gets in and who doesn't simply because you can't collect and process such detailed information on all applicants. I noticed that the court explicitly rejected the idea of this impracticality stating that "The fact that the implementation of a program capable of providing individualized consideration might present administrative challenges does not render constitutional an otherwise problematic system." So it sounds like if you have lots of applicants you must be officially race blind but if you don't (or have the resources to sort through it) you're free to consider race? This sounds ripe for abuse since colleges won't want to give up a race-based system and so will continue to use it as a plus but not document the fact.

I'm of the opinion that colleges should base their admissions on test scores and extracurricular activities but not race, gender, etc. The onus should be on the high schools to produce students with equivalent test scores. Of course, if you buy the argument that there are differences in general intelligence between racial groups that at least partially show up in SAT scores, then race-based admissions that level the scores quantitatively should be acceptable. That's a whole different story that ties into the IQ discussion...

    PreJIT on .NET

I had a question on adding PreJIT (running ngen.exe) to a Windows Installer package. See this walkthrough on MSDN for how to do it....
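
The gist of the technique is a custom action class that shells out to ngen.exe during install. A minimal sketch (not the walkthrough's actual code; MyApp.exe is a placeholder for the assembly you want to pre-compile):

using System;
using System.Collections;
using System.ComponentModel;
using System.Configuration.Install;
using System.Diagnostics;
using System.IO;
using System.Runtime.InteropServices;

[RunInstaller(true)]
public class NgenInstaller : Installer
{
    public override void Install(IDictionary stateSaver)
    {
        base.Install(stateSaver);

        // ngen.exe ships in the framework directory,
        // e.g. %WINDIR%\Microsoft.NET\Framework\v1.1.4322
        string ngen = Path.Combine(
            RuntimeEnvironment.GetRuntimeDirectory(), "ngen.exe");

        // The custom action passes the installing assembly's location in
        // the "assemblypath" parameter; MyApp.exe is a placeholder
        string assembly = Path.Combine(
            Path.GetDirectoryName(Context.Parameters["assemblypath"]),
            "MyApp.exe");

        Process p = Process.Start(ngen, "\"" + assembly + "\"");
        p.WaitForExit();
    }
}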

    Guns, Germs, and Steel

I just had to share a short synopsis I wrote almost two years ago of one of the best books I've read in the last 5 years: Guns, Germs, and Steel: The Fates of Human Societies by Jared Diamond. If you haven't read it, you should...

This Pulitzer Prize winning book from 1997 attempts to answer the question, "why was the Spaniard Francisco Pizarro able to capture the Incan emperor Atahuallpa on November 16, 1532 within minutes of their first meeting when Pizarro commanded 168 terrified soldiers and Atahuallpa 80,000?" Of course, the broader question is why Europeans so easily conquered native peoples in many parts of the globe after 1492. Diamond's answer is that Europeans inherited guns (advanced weapons), germs (diseases to which natives had no resistance), and steel (advanced technology, including writing) as a lucky accident of their environment rather than through any innate difference in ability.

    He then spends most of the book discussing what he views as the four proximate causes since 13,000 BC that allowed European civilization to develop more quickly.

1) The first is the availability of domesticable plants, which allowed people in the Fertile Crescent (who spawned the eventual European societies) to move from a hunter-gatherer lifestyle to a sedentary farming lifestyle. This also allowed individuals to move away from food production and become specialists (soldiers, chiefs, etc.), which in turn drove political organization and increased population densities, allowing new technologies to be invented more quickly. His contention is that while food production arose independently in five or six areas of the world, the Fertile Crescent had far and away the best wild candidates for domestication. As a result, farming developed first in this area.

2) The second cause is the domestication of animals, which spurred both agriculture and food production. Domestic animals also infected those farming peoples with diseases to which they eventually built resistance, and which proved decisive when the farmers came into contact with nomadic tribes. Once again the Fertile Crescent had many more animals that could be domesticated than other places. For example, the Americas contained only one large mammal that could be domesticated (the llama), while Africa, despite being known for large mammals, had none that were ever domesticated (and still doesn't) because of various traits.

3) The third cause is the axes of the various continents. For example, the Americas are situated on a north-south axis, along with Africa, while Eurasia lies east-west. A north-south axis is less desirable for the spread of populations, technology, and crops since latitudinal changes bring environmental barriers such as deserts (the Sahara) and rainforests (Brazil, the Congo) that are difficult to cross without adopting a different set of crops and technologies. As a result, the most advanced societies of the New World (the Incan and Aztec) had no knowledge of each other despite being separated by only 700 miles. Conversely, trade between Mediterranean societies acted as a positive feedback mechanism between the varying cultures.

    4) The fourth cause is simply the available land mass that is able to support various cultures. Basically, the more people there are the more quickly technologies develop. In this respect Eurasia is much larger than either Africa or the Americas (with Australia very much behind).

The author walks through each cause, providing many interesting facts and some speculation. He then devotes a chapter to each major continent and traces its history through his proximate causes, drawing on archaeological and linguistic evidence. I found the book fascinating because it provides a framework for understanding history. It's especially interesting to note where certain crops were first domesticated, even though we think of them as originating from different areas today (for example, coffee was developed in Ethiopia, not South America). Now, when you sit down to dinner you can have a lively discussion of where all the food you're eating comes from (by the way, your spouse may find this quite annoying).

    Diamond's field work has mostly been done in New Guinea so there's an overemphasis on the migrations from China through Indonesia and New Guinea finally culminating in the population of Easter Island, but the book is still well worth reading.

    Programming or Engineering?

    I was reminded that I hadn't posted anything in my field of IT yet...so...

Recently, on a trip to Seattle, I had a chance to read Professional Software Development by Steve McConnell. McConnell's name is of course instantly recognizable from his books Code Complete, Rapid Development, and the Software Project Survival Guide, which I've read previously.

In this book McConnell argues for moving software development from its current state as a craft or art to the more rigorous approach he terms software engineering (chapter 4, "Software Engineering, Not Computer Science"). In this approach software development is more systematized as developers apply proven techniques and disciplines to their projects. He organizes the book into four parts: The Software Tar Pit, Individual Professionalism, Organizational Professionalism, and Industry Professionalism.

McConnell argues quite persuasively that many of the techniques and processes (the "body of knowledge") needed to deliver software on time and on budget are well known and have been for quite some time. Organizations simply don't take advantage of this body of knowledge and treat software development only as construction or coding. I was particularly struck by McConnell's analysis of the stable core of this body of knowledge and his estimate that, as of 2003, 50% of the knowledge needed to develop software is in a state he calls the "stable core" (p. 41). In other words, the stable core is the knowledge that developers can invest in early in their careers and that doesn't change over time.

All of this leads McConnell to the most controversial aspect of the book: his analysis that software development is progressing toward professional status (chapter 6, "Novum Organum") akin to the traditional engineering, medical, and law professions. Those stable and older professions all incorporate professional education, accreditation, individual and/or organizational certification, and licensing. For my part, I'd love to see a more rigorous approach to development, and if that means licensing some small percentage of developers (McConnell estimates around 5%) then I'm all for it.

One of the most interesting aspects to me is his support for true software engineering degree programs rather than simply computer science degrees. He argues that the decrease in CS degrees issued since the mid 1980s (only now starting to rebound) is related to their lack of relevance to the day-to-day work of software developers (chapter 18, "Hard Knocks"). I'll admit that I have often thought my CS degree was poor preparation for so many of the tasks and skills needed to develop software. I, for one, would heartily like to see more software engineering degrees offered that focus not only on code creation but also on estimation, design, analysis, testing, documentation, etc. Back in the day when I received my CS degree, nobody even thought there was a difference, and so many students had no idea what real-world software development entailed as they set out on their first job. In fact, I distinctly recall thinking after reading Code Complete that the book should be required reading for every new employee in an organization, since it teaches so much that a CS curriculum leaves out.

I was also interested in his classification of organizations as "process-oriented" (NASA or Boeing) or "commitment-oriented" (Microsoft). The former emphasizes repeatable processes and standard techniques, while the latter looks for developers who "sign up" or commit to a project and do whatever it takes to get the job done. The extremes of these two types are the bureaucracy and the sweatshop: the former is so process driven that developers are bogged down in documents and meetings, while the latter requires long hours and burns people out. In my experience I see a lot more organizations following the commitment-oriented model, for two primary reasons: 1) they see how Microsoft has made it succeed, and 2) it is cheaper, since they don't pay overtime, and requires less initial investment by upper management. The problem, as I see it, is that as an organization grows it becomes more difficult to hire developers who can be so invested, especially as the median age of developers continues to rise (now around 34 years old, with the distribution moving to the right end of the curve). And secondly, organizations rarely provide the autonomy (which equals motivation) that Microsoft does to create software. Getting an organization to move from commitment to process is incredibly difficult, and this is just where many organizations fail. McConnell's inclusion of his company's professional development program (chapter 16) is a breath of fresh air. Would that all organizations treated developers as more than simply a set of programming skills and language proficiency!

In all, I'd highly recommend this book to anyone who looks around, sees projects out of control, and wonders what could be done.

    Wednesday, October 01, 2003

    Computing Offensive Value

One of the hallmarks of sabermetrics is the formulas sabermetricians use to estimate the offensive contribution of individual players. Bill James first introduced such a formula (Runs Created) in the late 1970s or early 1980s in his yearly Baseball Abstracts, which are among the founding documents of sabermetrics. Since that time others have picked up the banner, and James himself has further refined his formulas and extended them to try to calculate not only how many runs a player has produced, but how that translates into wins. In fact, last year James published Win Shares, a somewhat controversial book that catalogs these attempts.
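
For reference, the basic version of Runs Created is simple enough to compute in a couple of lines (this is a sketch of the original, simplest form of the formula; James' later versions add stolen bases and many other terms):

// Basic Runs Created: RC = (H + BB) * TB / (AB + BB)
public static double RunsCreated(int h, int bb, int tb, int ab)
{
    return (double)(h + bb) * tb / (ab + bb);
}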

Well, one of the best attempts at precision in calculating how many runs a player has produced is Extrapolated Runs (XR), developed by Jim Furtado and published in 1999.
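
Since XR is a pure linear-weights formula, it is also trivial to code up. Here is a sketch using the coefficients as I recall them from Furtado's published version (double-check them against the original before relying on this):

// Extrapolated Runs (coefficients quoted from memory)
public static double ExtrapolatedRuns(
    int singles, int doubles, int triples, int hr,
    int bb, int ibb, int hbp, int sb, int cs,
    int ab, int h, int k, int gidp, int sf, int sh)
{
    return .50 * singles + .72 * doubles + 1.04 * triples + 1.44 * hr
         + .34 * (hbp + bb - ibb) + .25 * ibb
         + .18 * sb - .32 * cs
         - .090 * (ab - h - k) - .098 * k
         - .37 * gidp + .37 * sf + .04 * sh;
}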